Just putting this out there before I dive in to fix this on a project. I just ran a coverage report in Chrome DevTools and found that 99% of the CSS in default.css is unused on my site. This goes down to 94% when editing an item. So effectively I depend on less than 10% of default.css.
@tiberiuichim
The plone/critical-css-cli tool looks like it solves the problem for anonymous users.
There's some CSS for logged in users that it probably won't get.
Was that part of your problem space and how did you address it?
@pigeonflight: No, we don't really care about the logged-in users, as they'll either have the CSS in cache or they'll go through the anonymous screens first, before getting authenticated. For our use cases, we have a limited number of authenticated users.
I'd say it's definitely possible to enhance @plone/critical-css-cli to handle authenticated requests. Internally we use critical, which in turn uses penthouse.
Penthouse supports passing cookies, and critical supports passing options through to penthouse, so I guess it's a matter of exposing this in some form in the CLI.
I did not understand all of its docs.
Is it possible to make one CSS file that takes ALL pages into account, so one is sure that everything is included? Passing the sitemap would be nice(?)
What does this mean: "the css needs all sizes (media queries) that are defined in the CSS"?
One more thing: does this work with (just) CSS files, or does it also pick up hard-coded CSS that could be in a template (`<style>`)?
The way it works is this: Puppeteer loads the passed URLs and extracts statistics from the browser engine on the CSS rules it encountered while displaying the pages at the provided dimensions. Then we take the separate CSS files generated for each of the provided URLs and merge them, removing the duplicates. As far as I can tell, it doesn't care whether a rule comes from a `<style>` tag or a CSS file: a real browser is used to extract the rules after rendering the pages, so it's not something done based on heuristics.
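The merge step described above can be sketched like this. This is an illustration of the idea, not the actual critical-css-cli code: each per-URL extraction yields a list of rules, and merging keeps the first occurrence of each rule while dropping verbatim duplicates.

```javascript
// Sketch of the merge-and-deduplicate step: perUrlRules is an array of
// rule lists, one list per extracted URL. Rules are compared after
// collapsing whitespace, so identical rules from different pages count
// as duplicates even if formatted slightly differently.
function mergeCriticalCss(perUrlRules) {
  const seen = new Set();
  const merged = [];
  for (const rules of perUrlRules) {
    for (const rule of rules) {
      const key = rule.replace(/\s+/g, ' ').trim();
      if (!seen.has(key)) {
        seen.add(key);
        merged.push(rule);
      }
    }
  }
  return merged.join('\n');
}

const home = ['body { margin: 0; }', 'h1 { font-size: 2rem; }'];
const edit = ['body { margin: 0; }', 'form { display: grid; }'];
console.log(mergeCriticalCss([home, edit]));
// body { margin: 0; }
// h1 { font-size: 2rem; }
// form { display: grid; }
```

In practice a real implementation would parse the CSS into an AST rather than compare strings, but the principle is the same.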
My personal opinion is that one shouldn't go too crazy with the critical-css tool. In the end, we're adding an extra 10-20 KB to each loaded page. For the best possible performance, and especially in the context of a CMS, where each page is dynamic and combines different addons, a custom critical.css would have to be generated for each content page, right? That's not something that is easy to fit into any infrastructure: you'd want custom async workers, per-URL CSS extraction, invalidation, etc. Crazy.
So, you can pass a few screen dimensions that fit your website's media query rules, and the matching rules will be properly extracted into the generated CSS.
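As an illustration of what that means in practice: critical accepts a `dimensions` option, an array of `{ width, height }` viewports, and extracts the rules that apply at each one. The breakpoint values below are made up; you'd use the ones from your own media queries.

```javascript
// Derive one viewport per breakpoint in your stylesheet's media queries.
// These breakpoint values are examples, not defaults of any tool.
const breakpoints = [360, 768, 1280];
const dimensions = breakpoints.map((width) => ({ width, height: 900 }));

// A critical invocation would then look roughly like:
// const { generate } = require('critical');
// generate({ src: 'https://example.org/', dimensions /* , ... */ });
console.log(dimensions.length); // 3
```

If a media query's breakpoint is not covered by any of the passed dimensions, its rules may never fire during rendering and so won't end up in the critical CSS, which is why the docs insist on covering all the sizes defined in the CSS.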