Shared pip cache possible?

I know that pip has a cache of downloading packages and building packages. But as far as I understand, for each venv the packages are copied. So if I'm developing 10 addons, I'll have 10 copies of Plone. Is that right or am I missing something? Don't we have a cache like the buildout egg cache?


You mean a global cache?

Yes, I'm talking about this cache. It decreases download time but not disk space. Plone 6 occupies 328 MB in the lib folder, multiplied by the number of addons you're working on.

Honestly, I do not see any problem in having up to half a gig disk space used by a Plone instance. I have plenty of projects in my workspace and it is not what fills up my drives.

The problem also shows up in local testing. For example, in plone.namedfile we have a tox setup that uses pip and mxdev to test on Plone 5.2 and 6.0 on all supported Python 3 versions. When I run tox -p auto, this results in over 2 gigabytes of disk usage:

$ du -sh .tox
2.2G	.tox
$ du -sh .tox/*
  0B	.tox/log
347M	.tox/plone52-py36
349M	.tox/plone52-py37
344M	.tox/plone52-py38
297M	.tox/plone60-py310
297M	.tox/plone60-py37
297M	.tox/plone60-py38
297M	.tox/plone60-py39

I think this is a slightly different issue. @wesleybl develops a number of plugins, which could at least potentially use the same Python version, and so also a common set of general Plone dependencies.

Here, you use both different Python versions and different Plone versions.

While there would be some common versions, managing that with tox would take some effort. Think of library maintainers dropping support for unsupported Python versions, in your example Python 3.6. So for each env there would have to be a separate mapping, possibly managing symlinks to a central Python package store.
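The kind of mapping described above could, in principle, look like this. This is purely illustrative, with hypothetical paths and package names; it is not how tox or pip actually work today:

```shell
# Sketch only: keep one unpacked copy per package version in a central
# store, and symlink it into each env's site-packages instead of
# duplicating it (all paths and names here are hypothetical).
store=/tmp/central-store
site=/tmp/env1/site-packages

mkdir -p "$store/example_pkg-1.0" "$site"

# The env "installs" the package as a symlink into the central store:
ln -sfn "$store/example_pkg-1.0" "$site/example_pkg"

ls -l "$site"
```

Each additional env would add one more symlink, not another 300 MB copy, but something would still have to manage which store entry each env's mapping points at.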

You can certainly create an issue on our tracker ( GitHub - tox-dev/tox: Command line driven CI frontend and development task automation tool. ), but I do not see a solution in the near future, especially as disk space is cheap nowadays.

I guess you are all speaking about different things.

Yes, there is a possibility to have a shared cache directory for pip, similar to buildout_cache. Configuration - pip documentation v22.1.2 describes how to make a global, user, or project-specific config for pip. Caching is described at: Caching - pip documentation v22.1.2

I have used this for several years, and it helps a lot in saving storage and time, both download time and debug time, when some download fails to complete or is corrupt.

The second thing is using the cached packages in a virtualenv or tox context: since the packages/eggs get installed and unpacked there, there is nothing left where you could save disk space.


Except if you use a filesystem with de-duplication features (like ZFS offers if switched on).
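For illustration, enabling that on ZFS is a one-line property change. The pool/dataset name below is hypothetical, and note that ZFS dedup trades disk savings for noticeable RAM usage:

```shell
# Enable block-level de-duplication on a ZFS dataset (dataset name is
# hypothetical); identical blocks across all the venvs stored on it
# are then kept on disk only once:
zfs set dedup=on tank/workspace

# Check the current setting:
zfs get dedup tank/workspace
```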
