docker-compose on the server to manage launching the container (we will likely move to something like Ansible and Kubernetes in the future for more automation).
RelStorage that connects to an external managed Postgres
Development approach:
We pg_dump the production database to a local pg container to create a local "snapshot". Development is done against the local snapshot container.
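That snapshot step could be scripted roughly like this; the container name, database name, and dump filename below are illustrative assumptions, not the actual setup described above.

```python
"""Sketch of the snapshot workflow: pg_dump production, restore into a
local Postgres container. All names/DSNs here are made up."""
import subprocess

def snapshot_commands(prod_dsn, dump_file="snapshot.dump",
                      container="local-pg", dbname="plone"):
    # pg_dump in custom format so pg_restore can load it selectively.
    dump = ["pg_dump", "--format=custom", "--no-owner",
            "--file", dump_file, "--dbname", prod_dsn]
    # Feed the dump into pg_restore inside the local container via stdin.
    restore = ["docker", "exec", "-i", container,
               "pg_restore", "--no-owner", "--clean", "--if-exists",
               "--dbname", dbname]
    return dump, restore

def take_snapshot(prod_dsn, dump_file="snapshot.dump", **kw):
    dump, restore = snapshot_commands(prod_dsn, dump_file=dump_file, **kw)
    subprocess.run(dump, check=True)
    with open(dump_file, "rb") as f:
        subprocess.run(restore, stdin=f, check=True)
```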
I'd love to defang the local data somehow.
This used to be "a thing" when using a Data.fs; there was a very useful package: isotoma.plone.defang · PyPI
A simple way is to export the whole Plone site as a zexp and import it into the local instance. This does not work well for large databases, as it takes a lot of time.
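The zexp route goes through Zope's ObjectManager export/import machinery; a rough sketch, where the site id "Plone" and the filenames are assumptions (the export lands in the instance's var/ directory, the import reads from its import/ directory):

```python
"""Sketch of the zexp export/import route via Zope's ObjectManager API
(manage_exportObject / manage_importObject). Site id and filenames are
illustrative assumptions."""

def export_site(app, site_id="Plone"):
    # With download falsy, writes <site_id>.zexp into the instance's var/.
    app.manage_exportObject(id=site_id, download=0)

def import_site(app, zexp_name="Plone.zexp"):
    # Expects the .zexp file in the local instance's import/ directory.
    app.manage_importObject(file=zexp_name)
```

Since these methods are looked up on the app object, the helpers work against any Zope root you hand them.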
plone-uninstall (as opposed to pip-uninstall) add-ons like plone.reload or plone.restapi that for whatever reason might have security issues or might not be pip-installed in the target (production) environment
This is one of the ideas of isotoma.plone.defang: to defang the database before it is started up.
They open a dbfile via ZODB.FileStorage and ZODB.DB. This could be adapted to work on app after opening it via `from Zope2.Startup.run import make_wsgi_app` with a zope.conf.
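As a stand-alone script, that FileStorage route could look roughly like this. The loop itself is plain Python; the ZODB part is only a sketch that assumes the instance is stopped, that the Zope app lives under the root key "Application", and that `defang` is the neutralizing direction of the interface:

```python
"""Sketch: open a Data.fs directly and run fangs against the root app,
mirroring isotoma.plone.defang's approach. Assumes a stopped instance."""

def run_defang(app, fangs):
    # Assumption: ``defang`` is the method that neutralizes live data.
    for f in fangs:
        f.defang(app)

def defang_data_fs(path, fangs):
    # ZODB imports are local so run_defang stays importable anywhere.
    import transaction
    from ZODB import DB
    from ZODB.FileStorage import FileStorage
    db = DB(FileStorage(path))
    conn = db.open()
    try:
        # Zope stores the application object under this root key.
        run_defang(conn.root()["Application"], fangs)
        transaction.commit()
    finally:
        conn.close()
        db.close()
```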
Another interesting idea in isotoma's approach is the use of a (rather rudimentary) kind of interface, Fang, with two methods, fang and defang, which both get app as an argument (see isotoma.plone.defang.fangs).
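A minimal sketch of that interface idea, with an example fang that points the MailHost at a throwaway local SMTP port. The MailHost attribute names, and the assumption that defang neutralizes while fang restores, are mine, not necessarily isotoma's:

```python
"""Sketch of the Fang interface: two methods, fang and defang, each
receiving app. Direction of each method and the MailHost attributes are
illustrative assumptions."""

class MailHostFang:
    def defang(self, app):
        # Neutralize: point outgoing mail at a harmless local SMTP stub,
        # remembering the original settings so fang() can restore them.
        self._orig = (app.MailHost.smtp_host, app.MailHost.smtp_port)
        app.MailHost.smtp_host = "localhost"
        app.MailHost.smtp_port = 1025

    def fang(self, app):
        # Restore the original (production) mail settings.
        app.MailHost.smtp_host, app.MailHost.smtp_port = self._orig
```

Because every fang takes app and nothing else, new fangs (mail, user passwords, API keys, add-on removal) can be collected in a list and applied in one pass over the opened database.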