Can I trim unused CSS from default.css?

Only using 10% of default.css

Just putting this out there before I dive in to fix this on a project. I just ran a coverage report in Chrome DevTools and found that 99% of the CSS in default.css is unused on my site. This goes down to 94% when editing an item. So effectively I depend on less than 10% of default.css.

Site performance heavily affected

When I disable default.css, my Lighthouse score moves from 48 to 88. Of course, some things break.

So, my current "mission" is to isolate the 10% I actually need from default.css. Pointers and tips are welcome.

We just pushed something in Volto for a similar-ish problem. See our performance page and the PR for the impact.

@tiberiuichim
The plone/critical-css-cli tool on GitHub looks like it solves the problem for anonymous users.
There's some CSS for logged-in users that it probably won't get.
Was that part of your problem space, and how did you address it?

@pigeonflight: No, we don't really care about the logged-in users, as they'll either have the CSS in cache or they'll go through the anonymous screens first, before getting authenticated. For our use cases, we have a limited number of authenticated users.

I'd say it's definitely possible to enhance @plone/critical-css-cli to handle authenticated requests. Internally we use critical, which in turn uses penthouse.

Penthouse supports passing cookies, and critical supports passing options through to penthouse, so I guess it's a matter of exposing this in some form in the CLI.
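
Something along these lines might work (an untested sketch, not the actual CLI code; it assumes critical's penthouse passthrough option and penthouse's cookies option, and the cookie value, domain and URL are placeholders):

```js
import { generate } from 'critical';

// Untested sketch: forward a session cookie to penthouse through critical,
// so the page is rendered as an authenticated user. '__ac' is Plone's
// default auth cookie name; the value, domain and URL are placeholders.
const { css } = await generate({
  src: 'https://something.com/some-logged-in-page',
  penthouse: {
    // penthouse hands these to puppeteer's page.setCookie()
    cookies: [
      { name: '__ac', value: '<session-cookie-value>', domain: 'something.com' },
    ],
  },
});

console.log(css);
```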

I did not understand all of its docs.
Is it possible to make one CSS file that takes ALL pages into account, so one can be sure that everything is included? Passing the sitemap would be nice(?)

What does this mean: that the generated CSS needs all the sizes (media queries) that are defined in the CSS?

One more thing: does this work with (just) CSS files, or does it also pick up hard-coded CSS that could be in a template (<style>)?

You can pass multiple URLs in the command line.

critical-cli https://something.com/ https://something.com/sitemap https://something.com/special-page

The way it works is this: puppeteer loads the passed URLs and extracts statistics from the browser engine on the CSS rules it encountered while displaying the pages at the provided dimensions. Then we take the separate CSS files generated for each of the provided URLs and merge them, removing the duplicates. As far as I can tell, it doesn't care whether a rule comes from a style tag or a CSS file; since a real browser is used to extract the rules after rendering the pages, it's not something done based on heuristics.
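
To illustrate the idea with the underlying critical library directly (just a rough sketch, not the actual CLI code; it assumes postcss and postcss-discard-duplicates are installed for the dedupe step):

```js
import { generate } from 'critical';
import postcss from 'postcss';
import discardDuplicates from 'postcss-discard-duplicates';

const urls = [
  'https://something.com/',
  'https://something.com/sitemap',
  'https://something.com/special-page',
];

// One critical CSS blob per URL, rendered by a real browser engine.
const parts = [];
for (const url of urls) {
  const { css } = await generate({ src: url });
  parts.push(css);
}

// Merge the per-URL results and drop exact duplicate rules
// that several pages share.
const { css: merged } = await postcss([discardDuplicates()])
  .process(parts.join('\n'), { from: undefined });

console.log(merged);
```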

My personal opinion is that one shouldn't go too crazy with the critical-css tool. In the end, we're adding an extra 10-20 KB to each loaded page. For the best possible performance, and especially in the context of a CMS where each page is dynamic and combines different add-ons, a custom critical.css would have to be generated for each content page, right? That's not something that's easy to fit into any infrastructure: you'd want custom async workers, per-URL CSS extraction, invalidation, etc. Crazy.

So, you can pass a few screen dimensions that fit your website's media query rules, and they'll be properly extracted into the generated CSS.
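
With the critical library that would look roughly like this (a sketch; the breakpoints below are made up and should be matched to your own media queries):

```js
import { generate } from 'critical';

// Sketch: render the page at a few viewport sizes that match the site's
// media query breakpoints, so rules behind those queries end up in the
// critical CSS. The widths/heights here are placeholders.
const { css } = await generate({
  src: 'https://something.com/',
  dimensions: [
    { width: 375, height: 667 },   // phone
    { width: 768, height: 1024 },  // tablet
    { width: 1366, height: 768 },  // desktop
  ],
});

console.log(css);
```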

Based on a bit of additional research... you could consider integrating a tool like criticalcss.com

Seems like a paid service for something you could get for free. Have you tried critical-css-cli? Is there a problem with it?

If it supports logged-in scenarios it might make sense. I've already used critical-css-cli for non-logged-in scenarios.