Updated: A description of how to export the dump from the container is already available in my latest comment. I am still missing a way to stop the backend in Docker on the staging site and restore the database from the dump!
What we lack is documentation of the simplest way to dump the Postgres data and feed it into a staging site.
Yes, I know this is beyond the scope of Plone and should be achievable independently of any Plone setup.
But guess what? I simply need these instructions handy, or better: distilled into make commands like make export_relstorage_dump and make import_relstorage_dump.
Any links to existing docs?
Cheers Armin
Background
During the Plone Tagung 2025 in Koblenz, @fredvd gave a short talk (not recorded) on how to convert Postgres production data into a local ZODB filestorage database on a dev machine, using zodbconvert and a specialised container fired up in parallel to do the job without touching the running production containers. Great stuff.
Update: @fredvd added a link below to his repo covering the conversion to filestorage.
@fredvd Is anything related to using the worker container for exporting/importing the relstorage dump documented yet? I wanted to learn it and add it to the troubleshooting.md in the training docs.
In my initial attempt to add a hint there, the goal was to execute classic Zope-related command-line tools inside a regular Docker container.
There, @yurj mentioned the option of using a docker exec approach, but did not explore it further and suggested continuing here in the community forum.
Here I go!
By the way: while fixing issues with the migration of the Plone Tagung 2025 recordings from the cloud recording server to a local server at rest under a different domain, I learned to work with live Postgres databases to fix some table entries from the command line. Hard stuff!
I want to come back to this challenge of dumping and restoring relstorage in a cookieplone-created docker swarm setup.
This is not a backup replacement! But it can be used for a quick restore in uncritical development situations:
Dump the Postgres database out of the container using the pg_dump command into a single file including blobs.
Reimport the dump into a staging server instance with an exactly matching setup (no filestorage, Postgres relstorage).
Create the dump
For a basic devops setup based on cookieplone 0.9.3 and template 83b50c6, for Plone 6.1.1 using docker swarm, a command like the sketch below dumps the PostgreSQL database into the root user's home directory inside the container.
NOTE: The initial part picks the Postgres container by assuming its name contains _db.
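A minimal sketch, assuming the default database name and user plone; adapt names, credentials, and paths to your stack:

```bash
# Assumption: the Postgres container name contains "_db"; database and user are both "plone".
DB_CONTAINER=$(docker ps --format '{{.Names}}' | grep _db | head -n1)
# Dump the whole database (including blobs/large objects) in custom format into /root inside the container.
docker exec "$DB_CONTAINER" pg_dump -U plone -d plone --format=custom --blobs --file=/root/plone.dump
# Copy the dump out of the container to the host.
docker cp "$DB_CONTAINER":/root/plone.dump ./plone.dump
```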
Import the dump (not tested yet)
I expect you need to
shut down the backend
replace the database
restart the backend
THIS IS THE OPEN QUESTION -> How to start/stop the backend in a dockerized production setup is unclear.
I have not found a proper description/procedure for the devops setup on how to temporarily shut down a dockerized production system and restart it only on demand after restoring the database.
The system setup is designed to restart immediately if a container goes down.
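A possible approach (untested, and an assumption on my part) could be to scale the backend service down to zero replicas in docker swarm while restoring, and scale it back up afterwards:

```bash
# Assumption: the stack is named "plone" and the backend service "plone_backend"; adapt to your deployment.
docker service scale plone_backend=0   # stop all backend containers; swarm will not restart them
# ... restore the database here ...
docker service scale plone_backend=2   # bring the backend back with the original replica count
```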
Import / Restore dump command
The command to replace the database in the Postgres container should work similarly; a sketch follows the prerequisites below.
Prerequisites:
The site should be provisioned with an identical setup except for the FQDN and be working with the default setup.
Since both setups are identical, they should share the database name and credentials.
The path to the dump file should also be the same for loading.
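A minimal sketch of the restore, assuming the same names and paths as in the dump command above; run it only while the backend is stopped:

```bash
# Assumption: database and user are both "plone"; the dump lives at ./plone.dump on the host.
DB_CONTAINER=$(docker ps --format '{{.Names}}' | grep _db | head -n1)
# Copy the dump into the container.
docker cp ./plone.dump "$DB_CONTAINER":/root/plone.dump
# Drop and recreate the database, then restore the dump into it.
docker exec "$DB_CONTAINER" dropdb -U plone plone
docker exec "$DB_CONTAINER" createdb -U plone -O plone plone
docker exec "$DB_CONTAINER" pg_restore -U plone -d plone /root/plone.dump
```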
Instead of moving the relstorage dump, I convert the database to a filestorage, move it as a tarball to the staging site, and do a forced recreation of the relstorage with the zodbconvert --clear option.
Steps:
do a zodbconvert to a filestorage in the first backend docker container
move the resulting var directory as a tarball from the backend docker container into the first backend docker container of the staging site (procedure not covered here in detail).
there are other approaches using a worker container; I left that for the next try.
on the staging site (which should be idle) untar the archive and use the reverse config with the zodbconvert --clear option to hard-overwrite the relstorage (a config sketch follows these steps).
the site is now broken
restart the server
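A minimal sketch of the import-direction zodbconvert run inside the staging backend container; the paths, database name, and credentials are assumptions based on a default cookieplone-style setup and must be adapted:

```bash
# Assumptions: filestorage untarred to /app/var, Postgres reachable as host "db",
# database and user "plone"; blob handling must match your relstorage configuration.
cat > /tmp/filestorage-to-relstorage.conf <<'EOF'
<filestorage source>
    path /app/var/filestorage/Data.fs
    blob-dir /app/var/blobstorage
</filestorage>

<relstorage destination>
    <postgresql>
        dsn dbname='plone' user='plone' host='db' password='CHANGEME'
    </postgresql>
</relstorage>
EOF
# --clear wipes the destination relstorage before copying, i.e. a hard overwrite.
zodbconvert --clear /tmp/filestorage-to-relstorage.conf
```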
Worked for me to get test content from the live site (until I discover glitches).
I used this on a small amount of content (189 MB as an uncompressed SQL dump, 33 MB as a filestorage tarball). It finishes quite fast.