You can pass multiple URLs on the command line:

```sh
critical-cli https://something.com/ https://something.com/sitemap https://something.com/special-page
```
Here's how it works: Puppeteer loads the passed URLs and extracts, from the browser engine, statistics on the CSS rules it encountered while rendering the pages at the provided dimensions. The separate CSS files generated for each URL are then merged, with duplicates removed. As far as I can tell, it doesn't matter whether a rule comes from a style tag or a CSS file: since a real browser renders the pages and reports what it used, the extraction isn't based on heuristics.
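For a rough idea of what that extraction step looks like, here is a minimal sketch using Puppeteer's CSS coverage API. I don't know how critical-cli is implemented internally; the function names and the naive rule-level dedupe below are my own assumptions for illustration.

```ts
import puppeteer from 'puppeteer';

// Extract the CSS the browser actually used while rendering `url` at the
// given viewport. Puppeteer's CSS coverage API reports the byte ranges of
// each stylesheet (inline or external) that were applied during the load.
async function extractUsedCss(
  url: string,
  viewport: { width: number; height: number },
): Promise<string> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setViewport(viewport);

  await page.coverage.startCSSCoverage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const coverage = await page.coverage.stopCSSCoverage();
  await browser.close();

  // Keep only the used ranges; the origin (style tag vs. stylesheet) is
  // irrelevant, exactly as described above.
  let css = '';
  for (const entry of coverage) {
    for (const { start, end } of entry.ranges) {
      css += entry.text.slice(start, end) + '\n';
    }
  }
  return css;
}

// Naive merge-and-dedupe across per-URL results: split into rules and drop
// exact duplicates. (This ignores nested blocks like @media, so treat it as
// an illustration, not as how the tool actually merges.)
function mergeCss(perUrlCss: string[]): string {
  const seen = new Set<string>();
  const out: string[] = [];
  for (const css of perUrlCss) {
    for (const rule of css.split('}')) {
      const trimmed = rule.trim();
      if (trimmed && !seen.has(trimmed)) {
        seen.add(trimmed);
        out.push(trimmed + '}');
      }
    }
  }
  return out.join('\n');
}
```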
My personal opinion is that you shouldn't go too crazy with the critical-css tool. In the end, we're adding an extra 10-20 KB to every loaded page. For the best possible performance, especially in the context of a CMS, where each page is dynamic and combines different addons, a custom critical.css would have to be generated for each content page, right? That's not something that fits easily into any infrastructure: you'd need custom async workers, per-URL CSS extraction, cache invalidation, and so on. Crazy.
So, you can pass a few screen dimensions that match your website's media query rules, and the rules for each breakpoint will be properly extracted into the generated CSS.
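Continuing the sketch from above (and reusing its assumed helpers), the multi-viewport idea boils down to extracting the used CSS for every URL at every dimension and merging the results. The breakpoints below are example values, not defaults of the tool.

```ts
// Example driver reusing the hypothetical extractUsedCss/mergeCss sketches
// from above. Pick viewports that match your own media queries.
const urls = [
  'https://something.com/',
  'https://something.com/special-page',
];
const viewports = [
  { width: 375, height: 667 },  // phone breakpoint
  { width: 768, height: 1024 }, // tablet breakpoint
  { width: 1280, height: 800 }, // desktop breakpoint
];

async function main(): Promise<void> {
  const pieces: string[] = [];
  for (const url of urls) {
    for (const viewport of viewports) {
      pieces.push(await extractUsedCss(url, viewport));
    }
  }
  // One merged, deduplicated critical.css covering all pages and breakpoints.
  console.log(mergeCss(pieces));
}

main().catch(console.error);
```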