Creating constraints and requirements from buildout config

Does anyone have code that takes a buildout config and outputs a constraints.txt and requirements.txt?

We already have the old buildout.requirements extension, which has recently started being used in the Plone coredev buildout. But this basically keeps track of which packages buildout installs, so by the time you have the list, the packages have already been installed by buildout.

What I am looking for instead is a separate tool, so you can generate a constraints.txt and requirements.txt and use those with pip install.

The constraints.txt should contain all pinned packages from the versions section, so likely several hundred, whether they are used or not.

The requirements.txt should contain all extensions and recipes, and the eggs that they install. In the simplest Plone buildout, the requirements would be something like: zc.buildout, setuptools, plone.recipe.zope2instance, Plone.

Does anyone have such code?

For the constraints I already tried something, and it seems easy enough:

from zc.buildout import buildout

configfile = "buildout.cfg"
# Let buildout read the config, including any local or remote extends files.
config = buildout.Buildout(configfile, [])
versions = config.versions
constraints_file = "constraints.txt"
with open(constraints_file, "w") as cfile:
    cfile.write("# Constraints parsed from {}\n".format(configfile))
    for package, version in sorted(versions.items()):
        cfile.write("{}=={}\n".format(package, version))

print("Wrote versions as constraints to {}.".format(constraints_file))

https://www.fourdigits.nl/blog/installing-plone-5-using-pip/ might have something for you.
I would keep setup.py to manage dependencies, so we're not pip dependent.

https://gist.github.com/jaroel/d7f57b0884ab1eb3007cdd787fd388a2 does the same thing I guess?

Thanks Roel. I have seen that link, while fixing some z3c.autoinclude problems when using pip.

I didn't check your script at the time. Looks like the same idea yes.

I'm not sure where you want to go with a requirements.txt? We only need a constraints.txt file for versions. The rest should be just "Plone" or "MyProject" with the correct dependencies, right?

I don't see how plone.recipe.zope2instance or zc.buildout is useful in a pip install environment. Maybe I'm missing something here?

In fact, in the future I would like to have a constraints.txt with all versions, and have buildout read that file instead of a [versions] section.

For requirements.txt: IMO it should contain just Plone (if needed at all) - or did I miss something?

We're hijacking the thread a bit though.

@mauritsvanrees We could parse bin/instance and extract the eggs + versions from there?
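
Something along these lines perhaps, as a rough sketch (not tested against real buildouts; it assumes the generated script puts egg directories named like name-version-pyX.Y.egg on sys.path):

import re

# Matches egg directory entries in the sys.path block of a generated script.
EGG_LINE = re.compile(r"/([^/']+)-([^-']+)-py\d\.\d+\.egg'")

def pins_from_script(path="bin/instance"):
    pins = set()
    with open(path) as script:
        for line in script:
            match = EGG_LINE.search(line)
            if match:
                pins.add("{}=={}".format(*match.groups()))
    return sorted(pins)

print("\n".join(pins_from_script()))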

The general idea is that in the future, I do not want to use buildout for installing Python packages. It uses an old hacked-up copy of easy_install for this, instead of the modern pip. But buildout not only installs Python packages; it does lots of other things as well.

So the steps are:

  1. Create a buildout config.
  2. Extract constraints and requirements from it.
  3. Install the requirements with pip in a virtualenv. This includes buildout recipes and extensions.
  4. Run buildout to create scripts (bin/backup), create a folder structure (parts/instance), install non-Python software (varnish), create files from templates (nginx config), install cron jobs, and whatever else it is that your buildout config does, except that it should not install Python packages.

If instead you no longer want to use buildout to install and configure Plone, then you can use pip for installing Python packages, but you still need some other tool for the rest.

The code is really messy but the method used by c.hostout should get you what you want. https://github.com/collective/collective.hostout/blob/master/collective/hostout/hostout.py#L754.

Roughly how it works is that it makes the hostout recipe depend on all other recipes, so they get resolved first, and then it redoes the package resolution for each buildout section and records the packages and versions that they resolved to. This gets you all the packages and versions, instead of just those that were picked. It puts them all into a .cfg file rather than a .txt file.

So this is during a buildout run where buildout installs packages, and hostout keeps track of them?

Or does hostout call buildout without installing packages?
And when a recipe like plone.recipe.zope2instance has eggs = Plone, does hostout find this egg too?
If that works, it would be interesting.

My own idea would be to read the buildout config, go over all sections, and add any eggs option to the requirements. And then allow specifying extra eggs, maybe in a section like this:

[some-non-existing-tool]
extra-eggs =
    collective.something
    ${other-section:some-option}

Such code would probably end up being cleaner, but it means more work for the person who creates the buildout config.
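
A minimal sketch of that idea, reading buildout.cfg directly with configparser (so this ignores any extends files, and extra-eggs is just the made-up option from the example above):

from configparser import RawConfigParser

parser = RawConfigParser(strict=False)
parser.read("buildout.cfg")
requirements = set()
for section in parser.sections():
    for option in ("eggs", "extra-eggs"):
        if not parser.has_option(section, option):
            continue
        for line in parser.get(section, option).splitlines():
            line = line.strip()
            # Leave ${section:option} references alone for now.
            if line and not line.startswith("${"):
                requirements.add(line)

print("\n".join(sorted(requirements)))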

Yes, exactly.

Yes, it does. It was designed to get all the eggs and their versions for the entire buildout, so that you could reproduce the buildout elsewhere. It also includes packages in development and has code that will package those up with special unique version numbers. Not sure that's useful to you. We used to use it to deploy a buildout via ssh commands. Now we use it to package a buildout up into a Docker image.
But adding a command to the fabfile to write out a .txt file would not be hard.

That won't get all the dependencies via the setup.py's etc., so it won't freeze your entire buildout.
On the other hand, it doesn't require adding something to a buildout and having to run that buildout, which, depending on how old your buildout is, might be something you are trying to avoid.
Another option is to look in all bin/... scripts and get all the versions out of there. That's what PyCharm does. It won't get you the recipe versions though.

If you want to give hostout a go, I think just adding the below to your existing buildout and rerunning buildout would get you the versions:

parts += hostout

[hostout]
recipe = collective.hostout
versionsfile=hostoutversions.cfg

To be clear, the collective.hostout recipe just collects the versions and puts the other settings in a file, ready for the bin/hostout command. The bin/hostout command does things with those settings and the versions which you probably aren't interested in, like creating a Docker image.

Actually, perhaps an even easier idea is to read .installed.cfg. Every recipe has a __buildout_signature__ that contains all the dependent eggs and versions. Not sure about development eggs, but they might be in there somewhere too.
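
Roughly like this, as an untested sketch (it assumes configparser can read your .installed.cfg and that the signature entries look like name-version-pyX.Y.egg:checksum):

import re
from configparser import RawConfigParser

# Matches one __buildout_signature__ entry after the checksum is stripped.
SIGNATURE_EGG = re.compile(r"^(?P<name>.+)-(?P<version>[^-]+)-py\d\.\d+\.egg$")

parser = RawConfigParser(strict=False)
parser.read(".installed.cfg")
pins = set()
for section in parser.sections():
    if not parser.has_option(section, "__buildout_signature__"):
        continue
    for entry in parser.get(section, "__buildout_signature__").split():
        egg = entry.split(":")[0]  # strip the checksum part
        match = SIGNATURE_EGG.match(egg)
        if match:
            pins.add("{}=={}".format(match.group("name"), match.group("version")))

print("\n".join(sorted(pins)))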

Thanks for the information, Dylan. That is useful.

In my (for the moment theoretical) use case I am after something else. I am trying to avoid letting buildout install any Python packages at all. It should just use what is there in the Python (virtual) environment. When I can parse the correct constraints.txt out of a buildout config and extract the requirements.txt (easiest case: zc.buildout, plone.recipe.zope2instance and Plone), then pip can install those, and we have everything that we need. Then buildout comes along, installs some configuration, and we are done.

I am late to the party, but Zope uses https://github.com/zopefoundation/Zope/blob/master/util.py to generate its constraints.txt and requirements.txt.

I now have an ugly Python script that reads a buildout.cfg from the current directory and outputs parsed-constraints.txt and parsed-requirements.txt. For a very simple Plone buildout config, the parsed-requirements.txt looks like this:

$ cat parsed-requirements.txt 
# Requirements parsed from buildout.cfg
Products.CMFPlone
plone.recipe.zope2instance
zc.buildout

Works for me as proof of concept.

Well, I tried that in a fresh test project:

$ bin/python buildoutconfig2pip.py 
Wrote versions as constraints to parsed-constraints.txt.
Getting distribution for 'plone.recipe.zope2instance==6.5.0'.
Got plone.recipe.zope2instance 6.5.0.
Getting distribution for 'zc.recipe.egg==2.0.7'.
Got zc.recipe.egg 2.0.7.
Getting distribution for 'waitress==1.3.0'.
Got waitress 1.3.0.
...

So when the script iterates over the sections, buildout automatically installs the recipes and their eggs. Exactly what I do not want...
So this already installs eggs:

config = buildout.Buildout("buildout.cfg", []) 
print(config["instance"])

I can instead use the raw buildout configuration:

print(config._raw["instance"])

That avoids installing eggs and works in the simple case, but in the Plone coredev buildout the parsed-requirements.txt looks like this:

${buildout:custom-eggs}
${buildout:devtool-eggs}
${buildout:test-eggs}
${instance:eggs}
${test:eggs}
Pillow
Plone
buildout.requirements
collective.recipe.omelette
collective.xmltestreport
mr.developer
plone.app.robotframework[reload,debug]
...

So now I need to parse those ${test:eggs} lines. Should be doable, though I wonder if I then just run into yet another problem. Say mr.developer checkouts. :slight_smile: Ah well, one step at a time.
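
Resolving those references against the raw config could look roughly like this (a sketch that only handles whole-line ${section:option} references, not partial interpolation):

import re

REFERENCE = re.compile(r"^\$\{(?P<section>[^:}]+):(?P<option>[^}]+)\}$")

def expand_eggs(value, raw, seen=()):
    # Yield egg names from a raw option value, following references.
    for line in value.splitlines():
        line = line.strip()
        if not line:
            continue
        match = REFERENCE.match(line)
        if match is None:
            yield line
            continue
        key = (match.group("section"), match.group("option"))
        if key in seen:
            continue  # guard against reference loops
        referenced = raw.get(key[0], {}).get(key[1], "")
        for egg in expand_eggs(referenced, raw, seen + (key,)):
            yield egg

# For example: sorted(expand_eggs(config._raw["test"].get("eggs", ""), config._raw))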

I have updated my script to handle this. There are probably corner cases going wrong, but again enough for a proof of concept.

Is there a reason not to consider refactoring buildout to make it work on top of pip instead of easy_install?
IOW, has there been research that shows that it would be a dead end?

Sounds like a plan! (with no idea how feasible it is)

But then

  • using requirements.txt/constraints.txt instead of [versions]? (I would say yes)
  • What's with mr.developer then? Triggering "editable" -e pip installs?

As an alternative, we could make an install/download-free buildout and pip install everything before buildout. But this could be cumbersome.

Years ago I started on a recipe for installing packages in a virtualenv.
And this year I started experimenting with a package that might replace buildout.
Neither got beyond the toy/experimental project stage.

Maybe one or more buildout extensions would be easier, so people can test several approaches and see what works best, before finally moving something to core buildout:

  • An extension could replace the easy_install call with a pip call. Probably check that you are in a virtualenv.
  • Another extension could exit with an error when buildout tries to install a package.
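
For the second idea, a very rough and untested sketch of such an extension (registered through a zc.buildout.extension entry point; buildout may well have more code paths that install packages than this one covers):

import zc.buildout
import zc.buildout.easy_install


def _refuse_install(specs, *args, **kwargs):
    raise zc.buildout.UserError(
        "Refusing to install {}; install it with pip in the virtualenv instead.".format(specs)
    )


def ext(buildout):
    # Replace the function buildout uses to download and install eggs.
    zc.buildout.easy_install.install = _refuse_install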

Does anyone have experience with a pip -e workflow?

I really like the mr.developer workflow. I must say I'd rather keep it working.

I lack deep pip knowledge. But a shallow exploration makes me think it should be possible to use it as a base for both buildout and mr.developer.