Does anyone have code that takes a buildout config and outputs a constraints.txt and requirements.txt?
We already have the old buildout.requirements extension, which has recently been put to use in the Plone coredev buildout. But it basically just keeps track of which packages buildout installs, so by the time you have the list, buildout has already installed the packages.
What I am looking for instead, is a separate tool, so you can generate constraints/requirements.txt and use that with pip install.
The constraints.txt should contain all pinned packages from the [versions] section, so likely several hundred, whether they are used or not.
The requirements.txt should contain all extensions and recipes, and the eggs that they install. In the simplest Plone buildout, the requirements would be something like: zc.buildout, setuptools, plone.recipe.zope2instance, Plone.
Does anyone have such code?
For the constraints I already tried, and it seems easy enough:
```python
from zc.buildout import buildout

configfile = "buildout.cfg"
# Let buildout read the config, including any local or remote extends files.
config = buildout.Buildout(configfile, [])
versions = config.versions
constraints_file = "constraints.txt"
with open(constraints_file, "w") as cfile:
    cfile.write("# Constraints parsed from {}\n".format(configfile))
    for package, version in sorted(versions.items()):
        cfile.write("{}=={}\n".format(package, version))
print("Wrote versions as constraints to {}.".format(constraints_file))
```
I'm not sure where you want to go with a requirements.txt. We only need a constraints.txt file for versions; the rest should be just "Plone" or "MyProject" with the correct dependencies, right?
I don't see how plone.recipe.zope2instance or zc.buildout is useful in a pip install environment. Maybe I'm missing something here?
The general idea is that in the future, I do not want to use buildout for installing Python packages. It uses an old hacked-up copy of easy_install for this, instead of the modern pip. But a buildout not only installs Python packages, but does lots of other things.
So the steps are:

1. Create a buildout config.
2. Extract constraints and requirements from it.
3. Install the requirements with pip in a virtualenv. This includes buildout recipes and extensions.
4. Run buildout to create scripts (bin/backup), create a folder structure (parts/instance), install non-Python software (varnish), create files from templates (nginx config), install cron jobs, and whatever else it is that your buildout config does, except that it should not install Python packages.
If instead you no longer want to use buildout to install and configure Plone, then you can use pip for installing Python packages, but you still need some other tool for the rest.
Roughly, how it works is that it makes the hostout recipe dependent on all other recipes, so they get resolved first, and then it redoes the package resolution for each buildout section and records the packages and versions that they resolved to. This gets you all the packages and versions, instead of just those that were picked. It puts them all into a .cfg file rather than a .txt.
So this is during a buildout run where buildout installs packages, and hostout keeps track of them?
Or does hostout call buildout without installing packages?
And when a recipe like plone.recipe.zope2instance has eggs = Plone, does hostout find this egg too?
If that works, it would be interesting.
My own idea would be to read the buildout config, go over all sections, and add any eggs option to the requirements. And then allow specifying extra eggs, maybe in a dedicated section.
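A minimal sketch of that idea, using only the standard library so that nothing gets installed as a side effect. Note the caveats: plain configparser does not follow extends lines, resolve ${section:option} references, or handle buildout-specific syntax like +=, so this only covers simple, self-contained configs; the function name is just illustrative.

```python
import configparser


def collect_requirements(path):
    """Collect recipe and egg names from a buildout config without
    running buildout (so nothing gets downloaded or installed)."""
    parser = configparser.ConfigParser()
    parser.read(path)
    requirements = set()
    for section in parser.sections():
        # Every part names its recipe; strip an optional entry-point
        # suffix such as ``zc.recipe.egg:scripts``.
        if parser.has_option(section, "recipe"):
            requirements.add(parser.get(section, "recipe").split(":")[0])
        # Eggs are listed one per line (or space-separated).
        if parser.has_option(section, "eggs"):
            requirements.update(parser.get(section, "eggs").split())
    return sorted(requirements)
```

For the simplest Plone buildout mentioned above, this should yield something close to plone.recipe.zope2instance and Plone; zc.buildout and setuptools would still need to be added to the list by hand.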
Yes, it does. It was designed to get all the eggs and their versions for the entire buildout so that you could reproduce the buildout elsewhere. It also includes packages in development and has code that will package those up with special unique version numbers; not sure that's useful to you. We used to use it to deploy a buildout via ssh commands. Now we use it to package a buildout up into a docker image.
But adding a command to the fabfile to write out a .txt file would not be hard.
That won't get all the dependencies via the packages' setup.py files, etc., so it won't freeze your entire buildout.
On the other hand, it doesn't require adding something to a buildout and having to run that buildout, which, depending on how old your buildout is, might be something you are trying to avoid.
Another option is to look in all the bin/... scripts and get the versions out of there. That's what PyCharm does. It won't get you the recipe versions, though.
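That bin-scripts trick can also be scripted. The sketch below assumes the usual layout of buildout-generated scripts, where a `sys.path[0:0] = [...]` block lists absolute paths to egg directories named like Plone-5.2.1-py3.8.egg; distribution names that themselves contain hyphens would need a smarter regex.

```python
import re
from pathlib import Path

# Matches egg directory names like /path/eggs/Plone-5.2.1-py3.8.egg
EGG_RE = re.compile(r"/([A-Za-z0-9_.]+)-(\d[^-/']*)-py\d[^/']*\.egg")


def versions_from_bin(bindir="bin"):
    """Harvest pinned versions from the sys.path entries that buildout
    writes into its generated scripts."""
    versions = {}
    for script in sorted(Path(bindir).iterdir()):
        if not script.is_file():
            continue
        try:
            text = script.read_text()
        except UnicodeDecodeError:
            # Skip any binary files that live in bin/ as well.
            continue
        for name, version in EGG_RE.findall(text):
            versions[name] = version
    return versions
```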
If you want to give hostout a go, I think that just adding the below to your existing buildout and rerunning buildout would get you the versions:
```ini
[buildout]
parts += hostout

[hostout]
recipe = collective.hostout
versionsfile = hostoutversions.cfg
```
To be clear, the collective.hostout recipe just collects the versions and puts the other settings in a file, ready for the bin/hostout command. The bin/hostout command does things with those settings and the versions that you probably aren't interested in, like creating a docker image.
Actually, perhaps an even easier idea is to read .installed.cfg. Every recipe has a __buildout_signature__ that contains all the dependent eggs and versions. Not sure about development eggs, but they might be in there somewhere too.
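A sketch of that .installed.cfg idea. The __buildout_signature__ format is an internal detail rather than a stable API; this assumes space-separated entries that start with name-version, such as plone.recipe.zope2instance-6.5.0-py3.8.egg.

```python
import configparser
import re

# An entry looks like plone.recipe.zope2instance-6.5.0-py3.8.egg;
# the lazy group lets distribution names themselves contain hyphens.
ENTRY_RE = re.compile(r"^(.+?)-(\d[^-]*)")


def versions_from_installed(path=".installed.cfg"):
    """Extract package pins from the __buildout_signature__ option that
    buildout records for every installed part."""
    parser = configparser.ConfigParser()
    parser.read(path)
    versions = {}
    for section in parser.sections():
        signature = parser.get(section, "__buildout_signature__", fallback="")
        for entry in signature.split():
            match = ENTRY_RE.match(entry)
            if match:
                versions[match.group(1)] = match.group(2)
    return versions
```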
Thanks for the information, Dylan. That is useful.
In my (for the moment theoretical) use case I am after something else. I am trying to avoid letting buildout install any Python packages at all. It should just use what is there in the Python (virtual) environment. When I can parse the correct constraints.txt out of a buildout config, and extract the requirements.txt (easiest case: zc.buildout, plone.recipe.zope2instance and Plone) then pip can install those, and we have everything that we need. Then buildout comes and installs some configuration and we are done.
I now have an ugly Python script that reads a buildout.cfg from the current directory and outputs parsed-constraints.txt and parsed-requirements.txt. For a very simple Plone buildout config, running it looks like this:
```
$ bin/python buildoutconfig2pip.py
Wrote versions as constraints to parsed-constraints.txt.
Getting distribution for 'plone.recipe.zope2instance==6.5.0'.
Got plone.recipe.zope2instance 6.5.0.
Getting distribution for 'zc.recipe.egg==2.0.7'.
Got zc.recipe.egg 2.0.7.
Getting distribution for 'waitress==1.3.0'.
Got waitress 1.3.0.
...
```
So when the script iterates over the sections, buildout automatically installs the recipes and their eggs. Exactly not what I want...
So simply reading the config already installs eggs.
So now I need to parse those ${test:eggs} lines. Should be doable, though I wonder if I will then just run into yet another problem, say mr.developer checkouts. Ah well, one step at a time.
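Resolving those references without running buildout is mostly string substitution. A rough sketch against a dict-of-dicts config (for example the result of reading the file with configparser); it repeats the substitution so references can point at other references, but does not detect circular references or implement buildout's default values:

```python
import re

# Matches ${section:option} references in buildout option values.
REF_RE = re.compile(r"\$\{([^:}]+):([^}]+)\}")


def resolve(value, config):
    """Expand ${section:option} references in a buildout option value.

    ``config`` is a dict of dicts mapping section -> option -> value.
    Circular references would make this loop forever; real buildout
    detects them and raises an error instead.
    """
    while REF_RE.search(value):
        value = REF_RE.sub(
            lambda match: config[match.group(1)][match.group(2)], value
        )
    return value
```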
Is there a reason not to consider refactoring buildout to make it work on top of pip instead of easy_install?
In other words, has there been research that shows it would be a dead end?
Years ago I started on a recipe for installing packages in a virtualenv.
And this year I started experimenting with a package that might replace buildout.
Both did not get beyond the toy/experimental project stage.
Maybe one or more buildout extensions would be easier, so people can test several approaches and see what works best, before finally moving something to core buildout: