POSKeyError on portal_catalog, portal_workflow

I have a database using RelStorage. Suddenly I am getting POSKeyErrors on every tool in my site. Nothing is working (even the ZMI is not working). I have no idea what happened or how I can repair it. I have read http://plonechix.blogspot.com/2009/12/definitive-guide-to-poskeyerror.html, but I do not think the solution in my case can be to delete the offending objects.

I have restored a backup - but I want to find out what happened and how I can repair it if it happens again.

Does anybody have an idea?

>>> app.site
<PloneSite at /site>
>>> app.site.portal_catalog
2019-07-08 09:16:34 WARNING relstorage POSKeyError on oid 155: no tid found; history-free adapter
2019-07-08 09:16:34 ERROR ZODB.Connection Couldn't load state for Products.CMFPlone.CatalogTool.CatalogTool 0x9b
Traceback (most recent call last):
  File "/opt/plone/buildout-cache/eggs/ZODB-5.3.0-py2.7.egg/ZODB/Connection.py", line 796, in setstate
    p, serial = self._storage.load(oid)
  File "/opt/plone/buildout-cache/eggs/perfmetrics-2.0-py2.7.egg/perfmetrics/__init__.py", line 127, in call_with_metric
    return f(*args, **kw)
  File "/opt/plone/buildout-cache/eggs/RelStorage-2.1.1-py2.7-linux-x86_64.egg/relstorage/storage.py", line 587, in load
    raise POSKeyError(oid)
POSKeyError: 0x9b
2019-07-08 09:16:34 WARNING relstorage POSKeyError on oid 155: no tid found; history-free adapter
2019-07-08 09:16:34 ERROR ZODB.Connection Couldn't load state for Products.CMFPlone.CatalogTool.CatalogTool 0x9b
Traceback (most recent call last):
  File "/opt/plone/buildout-cache/eggs/ZODB-5.3.0-py2.7.egg/ZODB/Connection.py", line 796, in setstate
    p, serial = self._storage.load(oid)
  File "/opt/plone/buildout-cache/eggs/perfmetrics-2.0-py2.7.egg/perfmetrics/__init__.py", line 127, in call_with_metric
    return f(*args, **kw)
  File "/opt/plone/buildout-cache/eggs/RelStorage-2.1.1-py2.7-linux-x86_64.egg/relstorage/storage.py", line 587, in load
    raise POSKeyError(oid)
POSKeyError: 0x9b
<Products.CMFPlone.CatalogTool.CatalogTool object at 0x7fad23683d70>
>>> app.site.portal_workflow
2019-07-08 09:18:19 WARNING relstorage POSKeyError on oid 179: no tid found; history-free adapter
2019-07-08 09:18:19 ERROR ZODB.Connection Couldn't load state for Products.CMFPlone.WorkflowTool.WorkflowTool 0xb3
Traceback (most recent call last):
  File "/opt/plone/buildout-cache/eggs/ZODB-5.3.0-py2.7.egg/ZODB/Connection.py", line 796, in setstate
    p, serial = self._storage.load(oid)
  File "/opt/plone/buildout-cache/eggs/perfmetrics-2.0-py2.7.egg/perfmetrics/__init__.py", line 133, in call_with_metric
    return f(*args, **kw)
  File "/opt/plone/buildout-cache/eggs/RelStorage-2.1.1-py2.7-linux-x86_64.egg/relstorage/storage.py", line 587, in load
    raise POSKeyError(oid)
POSKeyError: 0xb3
2019-07-08 09:18:19 WARNING relstorage POSKeyError on oid 179: no tid found; history-free adapter
2019-07-08 09:18:19 ERROR ZODB.Connection Couldn't load state for Products.CMFPlone.WorkflowTool.WorkflowTool 0xb3
Traceback (most recent call last):
  File "/opt/plone/buildout-cache/eggs/ZODB-5.3.0-py2.7.egg/ZODB/Connection.py", line 796, in setstate
    p, serial = self._storage.load(oid)
  File "/opt/plone/buildout-cache/eggs/perfmetrics-2.0-py2.7.egg/perfmetrics/__init__.py", line 127, in call_with_metric
    return f(*args, **kw)
  File "/opt/plone/buildout-cache/eggs/RelStorage-2.1.1-py2.7-linux-x86_64.egg/relstorage/storage.py", line 587, in load
    raise POSKeyError(oid)
POSKeyError: 0xb3
<Products.CMFPlone.WorkflowTool.WorkflowTool object at 0x7fad2369b500>
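For reference, the hex OIDs in the tracebacks are the same objects as the decimal OIDs in the RelStorage warnings:

>>> int('9b', 16), int('b3', 16)
(155, 179)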

In order to help you, we'd have to know a lot more about your buildout, including the versions of Plone and RelStorage, and what has changed recently in your system.

I see RelStorage 2.1.1 in your logs, so that's quite recent. Is this a new installation? Or an upgrade?

The logs above are from a test system where I installed the corrupted dump - it has different versions than production.

In production I have

Plone 5.1.2.1 (5112)
CMF 2.2.12
Zope 2.13.27
Python 2.7.13 (default, Sep 26 2018, 18:42:22) [GCC 6.3.0 20170516]
PIL 4.3.0 (Pillow)
RelStorage 2.0.0

(I plan to update production to RelStorage 2.1.1 - are there any known issues or things I should consider?)

I have not made any recent changes. About a month ago I installed plone.restapi. (I use it for reading data.)

The error occurred for the first time after packing the database. The pack finished at 00:55, and at 00:57 I saw the first POSKeyError in Sentry.

There is nothing eye-catching in the zodbpack log file:

2019-07-06 00:00:02,565 [zodbpack] INFO Opening storage (RelStorageFactory)...
2019-07-06 00:00:02,780 [zodbpack] INFO Packing storage (RelStorageFactory).
2019-07-06 00:00:02,869 [relstorage] INFO pack: analyzing transactions committed Fri Jun 28 20:00:52 2019 or before
2019-07-06 00:02:48,498 [relstorage.adapters.packundo] INFO pre_pack: analyzing references from 179449 object(s)
2019-07-06 00:03:48,878 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 6600/179449
2019-07-06 00:04:49,003 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 20700/179449
2019-07-06 00:05:50,611 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 26700/179449
2019-07-06 00:06:50,777 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 36600/179449
2019-07-06 00:07:50,806 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 49400/179449
2019-07-06 00:08:50,846 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 73600/179449
2019-07-06 00:09:50,911 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 146500/179449
2019-07-06 00:10:17,584 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 179449/179449
2019-07-06 00:12:07,390 [relstorage.adapters.packundo] INFO pre_pack: analyzing references from 709 object(s)
2019-07-06 00:12:09,667 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 709/709
2019-07-06 00:13:32,143 [relstorage.adapters.packundo] INFO pre_pack: analyzing references from 252 object(s)
2019-07-06 00:13:32,766 [relstorage.adapters.packundo] INFO pre_pack: objects analyzed: 252/252
2019-07-06 00:13:44,454 [relstorage.adapters.packundo] INFO pre_pack: filling the pack_object table
2019-07-06 00:16:55,309 [relstorage.adapters.packundo] INFO pre_pack: downloading pack_object and object_ref.
2019-07-06 00:20:22,385 [relstorage.adapters.packundo] INFO pre_pack: traversing the object graph to find reachable objects.
2019-07-06 00:44:35,029 [relstorage.adapters.packundo] INFO pre_pack: marking objects reachable: 8093958
2019-07-06 00:52:06,239 [relstorage.adapters.packundo] INFO pre_pack: finished successfully
2019-07-06 00:53:42,169 [relstorage.adapters.packundo] INFO pack: will remove 78681 object(s)
2019-07-06 00:53:43,368 [relstorage.adapters.packundo] INFO pack: removed 900 (1.1%) state(s)
2019-07-06 00:53:44,734 [relstorage.adapters.packundo] INFO pack: removed 1100 (1.4%) state(s)
2019-07-06 00:53:46,236 [relstorage.adapters.packundo] INFO pack: removed 1700 (2.2%) state(s)
2019-07-06 00:53:47,721 [relstorage.adapters.packundo] INFO pack: removed 2200 (2.8%) state(s)
2019-07-06 00:53:49,024 [relstorage.adapters.packundo] INFO pack: removed 2500 (3.2%) state(s)
2019-07-06 00:53:50,323 [relstorage.adapters.packundo] INFO pack: removed 2800 (3.6%) state(s)
2019-07-06 00:53:51,693 [relstorage.adapters.packundo] INFO pack: removed 3200 (4.1%) state(s)
2019-07-06 00:53:53,045 [relstorage.adapters.packundo] INFO pack: removed 3800 (4.8%) state(s)
2019-07-06 00:53:54,879 [relstorage.adapters.packundo] INFO pack: removed 4400 (5.6%) state(s)
2019-07-06 00:53:56,376 [relstorage.adapters.packundo] INFO pack: removed 5000 (6.4%) state(s)
2019-07-06 00:53:57,776 [relstorage.adapters.packundo] INFO pack: removed 5400 (6.9%) state(s)
2019-07-06 00:53:59,211 [relstorage.adapters.packundo] INFO pack: removed 5900 (7.5%) state(s)
2019-07-06 00:54:00,726 [relstorage.adapters.packundo] INFO pack: removed 6500 (8.3%) state(s)
2019-07-06 00:54:02,076 [relstorage.adapters.packundo] INFO pack: removed 7000 (8.9%) state(s)
2019-07-06 00:54:03,670 [relstorage.adapters.packundo] INFO pack: removed 7500 (9.5%) state(s)
2019-07-06 00:54:05,392 [relstorage.adapters.packundo] INFO pack: removed 7900 (10.0%) state(s)
2019-07-06 00:54:07,844 [relstorage.adapters.packundo] INFO pack: removed 8300 (10.5%) state(s)
2019-07-06 00:54:09,509 [relstorage.adapters.packundo] INFO pack: removed 8700 (11.1%) state(s)
2019-07-06 00:54:11,777 [relstorage.adapters.packundo] INFO pack: removed 9500 (12.1%) state(s)
2019-07-06 00:54:13,626 [relstorage.adapters.packundo] INFO pack: removed 10000 (12.7%) state(s)
2019-07-06 00:54:15,064 [relstorage.adapters.packundo] INFO pack: removed 10400 (13.2%) state(s)
2019-07-06 00:54:16,569 [relstorage.adapters.packundo] INFO pack: removed 10700 (13.6%) state(s)
2019-07-06 00:54:18,622 [relstorage.adapters.packundo] INFO pack: removed 11300 (14.4%) state(s)
2019-07-06 00:54:21,470 [relstorage.adapters.packundo] INFO pack: removed 11900 (15.1%) state(s)
2019-07-06 00:54:26,781 [relstorage.adapters.packundo] INFO pack: removed 12700 (16.1%) state(s)
2019-07-06 00:54:29,852 [relstorage.adapters.packundo] INFO pack: removed 13400 (17.0%) state(s)
2019-07-06 00:54:32,410 [relstorage.adapters.packundo] INFO pack: removed 14000 (17.8%) state(s)
2019-07-06 00:54:35,457 [relstorage.adapters.packundo] INFO pack: removed 14800 (18.8%) state(s)
2019-07-06 00:54:38,448 [relstorage.adapters.packundo] INFO pack: removed 15600 (19.8%) state(s)
2019-07-06 00:54:40,943 [relstorage.adapters.packundo] INFO pack: removed 16400 (20.8%) state(s)
2019-07-06 00:54:44,646 [relstorage.adapters.packundo] INFO pack: removed 17200 (21.9%) state(s)
2019-07-06 00:54:47,292 [relstorage.adapters.packundo] INFO pack: removed 18400 (23.4%) state(s)
2019-07-06 00:54:48,433 [relstorage.adapters.packundo] INFO pack: removed 20800 (26.4%) state(s)
2019-07-06 00:54:49,517 [relstorage.adapters.packundo] INFO pack: removed 22400 (28.5%) state(s)
2019-07-06 00:54:50,717 [relstorage.adapters.packundo] INFO pack: removed 25000 (31.8%) state(s)
2019-07-06 00:54:51,904 [relstorage.adapters.packundo] INFO pack: removed 26500 (33.7%) state(s)
2019-07-06 00:54:53,084 [relstorage.adapters.packundo] INFO pack: removed 28900 (36.7%) state(s)
2019-07-06 00:54:54,326 [relstorage.adapters.packundo] INFO pack: removed 31300 (39.8%) state(s)
2019-07-06 00:54:55,467 [relstorage.adapters.packundo] INFO pack: removed 33100 (42.1%) state(s)
2019-07-06 00:54:56,830 [relstorage.adapters.packundo] INFO pack: removed 35500 (45.1%) state(s)
2019-07-06 00:54:58,058 [relstorage.adapters.packundo] INFO pack: removed 37600 (47.8%) state(s)
2019-07-06 00:54:59,411 [relstorage.adapters.packundo] INFO pack: removed 39300 (49.9%) state(s)
2019-07-06 00:55:00,747 [relstorage.adapters.packundo] INFO pack: removed 42100 (53.5%) state(s)
2019-07-06 00:55:02,707 [relstorage.adapters.packundo] INFO pack: removed 44100 (56.0%) state(s)
2019-07-06 00:55:03,912 [relstorage.adapters.packundo] INFO pack: removed 46800 (59.5%) state(s)
2019-07-06 00:55:07,035 [relstorage.adapters.packundo] INFO pack: removed 48800 (62.0%) state(s)
2019-07-06 00:55:08,539 [relstorage.adapters.packundo] INFO pack: removed 51300 (65.2%) state(s)
2019-07-06 00:55:09,767 [relstorage.adapters.packundo] INFO pack: removed 53600 (68.1%) state(s)
2019-07-06 00:55:10,986 [relstorage.adapters.packundo] INFO pack: removed 55800 (70.9%) state(s)
2019-07-06 00:55:12,532 [relstorage.adapters.packundo] INFO pack: removed 58000 (73.7%) state(s)
2019-07-06 00:55:13,661 [relstorage.adapters.packundo] INFO pack: removed 60900 (77.4%) state(s)
2019-07-06 00:55:15,247 [relstorage.adapters.packundo] INFO pack: removed 62400 (79.3%) state(s)
2019-07-06 00:55:16,616 [relstorage.adapters.packundo] INFO pack: removed 65400 (83.1%) state(s)
2019-07-06 00:55:18,217 [relstorage.adapters.packundo] INFO pack: removed 67100 (85.3%) state(s)
2019-07-06 00:55:19,389 [relstorage.adapters.packundo] INFO pack: removed 69600 (88.5%) state(s)
2019-07-06 00:55:20,792 [relstorage.adapters.packundo] INFO pack: removed 71900 (91.4%) state(s)
2019-07-06 00:55:22,299 [relstorage.adapters.packundo] INFO pack: removed 73900 (93.9%) state(s)
2019-07-06 00:55:23,410 [relstorage.adapters.packundo] INFO pack: removed 75600 (96.1%) state(s)
2019-07-06 00:55:24,572 [relstorage.adapters.packundo] INFO pack: removed 77100 (98.0%) state(s)
2019-07-06 00:55:25,655 [relstorage.adapters.packundo] INFO pack: removed 78200 (99.4%) state(s)
2019-07-06 00:55:25,960 [relstorage.adapters.packundo] INFO pack: cleaning up
2019-07-06 00:55:55,825 [relstorage.adapters.packundo] INFO pack: finished successfully
2019-07-06 00:55:56,170 [zodbpack] INFO Packed storage (RelStorageFactory).

Thanks a lot and please let me know if you need further information.

If your versions don't match, that would be the first potential source of problems.

Are the errors happening on your production site or the test site?

The POSKeyError occurred on production. I had no idea how to fix it, so I dumped the corrupted database and installed a nightly backup to get my site running again. Then I installed the corrupted database dump on my test site to investigate the problem.

The only times this sort of problem occurred to me, we had been using Copy and Paste to clone Plone sites between ZODB mount points. It turns out this is a bad thing to do because it leaves references to the original site, and when you eventually delete the old site and try to view anything in the new site, you get POSKeyErrors.

I would guess RelStorage doesn’t have anything to do with your problem, but I would hesitate to upgrade it until you’ve figured out (and maybe fixed) whatever is causing the POSKeyErrors.

Restoring any Plone backup to a buildout that doesn’t have matching versions (except to do an upgrade) seems like it is going to cause extraneous errors that will mask whatever it is you are looking for.


I think there are scripts that will “walk” your ZODB and will output details on the objects that are apparently missing.

The only times this sort of problem occurred to me, we had been using Copy and Paste to clone Plone sites between ZODB mount points. It turns out this is a bad thing to do because it leaves references to the original site, and when you eventually delete the old site and try to view anything in the new site, you get POSKeyErrors.

I have read about this, but I only have the mount points / (RelStorage) and /temp_folder.

Restoring any Plone backup to a buildout that doesn’t have matching versions (except to do an upgrade) seems like it is going to cause extraneous errors that will mask whatever it is you are looking for.

Good point. I will downgrade my test system to Plone 5.1.2.1 and RelStorage 2.0.0.

I think there are scripts that will “walk” your ZODB and will output details on the objects that are apparently missing.

I did not find any scripts for RelStorage. zodbverify is for FileStorage (it needs the path to Data.fs as a parameter), and zc.zodbdgc says that it

 does not apply to RelStorage which has the same features built-in

That is why I am really confused that I cannot see any errors in my zodbpack log.

OK, sorry, then I don’t know what you could do to walk the ZODB and try to identify the broken objects, if those tools don’t work with RelStorage and you can’t figure out what in RelStorage provides the same functionality. Maybe you will have to file an issue in the RelStorage GitHub repo.

But I don’t think your problem was caused by RelStorage.

It’s possible the packing script would not encounter the missing objects, or would not report them; its logic may not match that of a script that tries to find broken objects.

Another thought: either you could take one of those ZODB-walking scripts and modify it yourself, or you could ask or hire another developer to do that for you. If you choose to do the latter, you could post to our Jobs section.
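Just to sketch the idea, something along these lines could be a starting point. This is an untested sketch, not a finished tool: it assumes a bin/instance debug session (where app is bound) and simply follows references from the root, reporting every OID the storage can no longer load.

from ZODB.POSException import POSKeyError
from ZODB.serialize import referencesf
from ZODB.utils import z64, oid_repr

def find_missing(storage):
    # Walk every object reachable from the root and report OIDs whose
    # state can no longer be loaded from the storage.
    seen = set()
    todo = [z64]                        # z64 is the OID of the root object
    while todo:
        oid = todo.pop()
        if oid in seen:
            continue
        seen.add(oid)
        try:
            state, _serial = storage.load(oid)
        except POSKeyError:
            print 'missing object:', oid_repr(oid)
            continue
        # queue the OIDs referenced by this object's pickled state
        for ref in referencesf(state):
            if ref not in seen:
                todo.append(ref)

find_missing(app._p_jar.db().storage)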

Maybe you will have to file an issue in the RelStorage GitHub repo.


I saw you got some good info from Jason Madden (@jamadden) there, specifically

Packing is not a consistency check

Run zodbpack --prepack and look at the resulting table of references. You can use the multi-zodb-check-refs provided by zc.zodbdgc even in a non-multi-database RelStorage setup, it's just likely to be slower than building the --prepack table.

The main table you're probably going to be interested in is object_refs, which shows what objects refer to what other objects; a query like SELECT zoid FROM object_ref WHERE to_zoid = 179 will give you the OIDs of objects that refer to the missing OID 179.
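For example, a minimal sketch of running that query from Python with psycopg2 (placeholder connection values; object_ref and to_zoid are the table and column named in Jason's answer, and 155/179 are the OIDs reported missing in the tracebacks above):

import psycopg2

# Placeholder DSN values; use the same dsn as in your zodbpack.conf.
conn = psycopg2.connect(dbname='zodb', user='user', host='localhost', password='pass')
cur = conn.cursor()

# 155 (0x9b) and 179 (0xb3) are the OIDs from the POSKeyErrors above.
for missing_oid in (155, 179):
    cur.execute('SELECT zoid FROM object_ref WHERE to_zoid = %s', (missing_oid,))
    print missing_oid, 'is referenced by:', [row[0] for row in cur.fetchall()]

conn.close()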

I saw you got some good info from Jason Madden (@jamadden) there, specifically

Yes, this helped me understand the situation. At the moment it seems that all the tools (portal_catalog, ...) have vanished, but I still have no idea why.
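A quick way to check something like this from a debug session like the one above would be a small probe (just a sketch) that forces each object in the site root to load and catches the error:

from ZODB.POSException import POSKeyError

site = app.site
for obj_id in site.objectIds():
    try:
        site._getOb(obj_id)._p_activate()   # force a load from the storage
    except POSKeyError:
        print 'broken:', obj_id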

If you want more help, you’ll probably have to provide more information, e.g. your buildout.cfg

If you want more help, you’ll probably have to provide more information, e.g. your buildout.cfg

Sure, any help is appreciated. Thanks a lot!

It is a huge Plone site with almost 95,000 objects. It uses RelStorage with PostgreSQL and memcached. We have 8 Plone instances for read/write access and 12 Plone instances for read-only access.

My buildout is split into several files because I reuse parts for other environments. I have copied the parts I think are important. Please let me know if anything is missing.

buildout_relstorage_prod.cfg

[buildout]
extends = mybase.cfg

parts += instance
         relstorage-zodbpack-conf
         zhw

eggs += RelStorage
        psycopg2
        pylibmc
        plone.app.mosaic
        raven
        Products.ZopeHealthWatcher

[instance]
<= instance_base
environment-vars +=
    MEMCACHED_SERVERS ${buildout:ldap_cache_server}
rel-storage =
    type postgresql
    cache-servers ${buildout:relstorage_cache_servers}
    poll-interval ${buildout:relstorage_poll_interval}
    read-only ${buildout:relstorage_read_only}
    keep-history ${buildout:relstorage_keep_history}
    dsn dbname=${buildout:relstorage_dbname} user=${buildout:relstorage_dbuser} host=${buildout:relstorage_dbhost} password=${buildout:relstorage_dbpassword}
    blob-dir ${buildout:blob-dir}

zope-conf-additional +=
    <product-config collective.fingerpointing>
        audit-log ${buildout:audit-log}
    </product-config>
event-log-custom =
    %import raven.contrib.zope
    <logfile>
       path ${buildout:var-dir}/log/instance.log
       level INFO
    </logfile>
    <sentry>
      dsn ${buildout:sentry-dsn}
      level WARNING
      exclude_paths node
    </sentry>

[relstorage-zodbpack-conf]
recipe = collective.recipe.template
input = inline:
    <relstorage>
        pack-gc true
        cache-servers ${buildout:relstorage_cache_servers}
        keep-history ${buildout:relstorage_keep_history}
        blob-dir ${buildout:blob-dir}
        <postgresql>
            dsn dbname=${buildout:relstorage_dbname} user=${buildout:relstorage_dbuser} host=${buildout:relstorage_dbhost} password=${buildout:relstorage_dbpassword}
        </postgresql>
    </relstorage>
output = ${buildout:directory}/zodbpack.conf

[zhw]
recipe = zc.recipe.egg
eggs = Products.ZopeHealthWatcher
scripts = zHealthWatcher

[versions]
plone.app.imagecropping = 2.1.0
plone.app.mosaic = 2.0rc8
plone.tiles = 1.8.3
plone.subrequest = 1.8.1
plone.app.tiles = 3.0.3
plone.app.standardtiles = 2.2.0
plone.app.blocks = 4.1.0
plone.app.drafts = 1.1.2
plone.formwidget.multifile = 2.0
plone.jsonserializer = 0.9.3


RelStorage = 2.0.0
Products.ZopeHealthWatcher = 0.9.0.2

mybase.cfg

[buildout]
extends = base.cfg

eggs += collective.upgrade
        collective.fingerpointing

[instance_base]
environment-vars +=
    PERSISTENT_CACHE_DIR ${buildout:blob-dir}
zserver-threads = ${buildout:zserver-threads}

[versions]
node.ext.ldap = 1.0b3
collective.fingerpointing = 1.5rc1
plone.recipe.precompiler = 0.6

base.cfg

[buildout]
parts =
    zopepy
    testrunner
    precompile
    munin

extends =
    vars.cfg
    versions.cfg
versions = versions


eggs =
    Plone
    Products.PloneHotfix20171128
    Products.PloneHotfix20151208==1.0
    Products.PloneHotfix20160419
    Products.PloneHotfix20160830
    Products.PloneHotfix20161129
    Products.PloneHotfix20170117
    Pillow
    munin.zope

extensions =
    buildout.sanitycheck

show-picked-versions = true

[instance_base]
recipe = plone.recipe.zope2instance
var = ${buildout:var-dir}
user = ${buildout:user}
http-address = ${buildout:http-address}
http-fast-listen = ${buildout:http-fast-listen}
effective-user = ${buildout:effective-user}
eggs = ${buildout:eggs}
zodb-cache-size = 100000
environment-vars =
    zope_i18n_compile_mo_files true
zope-conf-additional =
    <product-config munin.zope>
        secret notsosecret
    </product-config>

[zopepy]
recipe = zc.recipe.egg
eggs = ${buildout:eggs}
interpreter = zopepy
#extra-paths = ${zope2:location}/lib/python
scripts = zopepy


[testrunner]
recipe = collective.xmltestreport
eggs = ${buildout:eggs}
       unittest2
       robotframework-selenium2screenshots
defaults = ['--auto-color', '--auto-progress', '--xml']
environment = environmentfortests

[environmentfortests]
PLONE_CSRF_DISABLED = true

[precompile]
recipe = plone.recipe.precompiler
eggs = ${buildout:eggs}
compile-mo-files = true

[munin]
recipe = zc.recipe.egg
eggs = munin.zope
arguments = http_address='${instance:http-address}', user='${instance:user}', secret='notsosecret'


[versions]
setuptools =
zc.buildout =
Products.PloneHotfix20161129 = 1.2
Products.PloneHotfix20170117 = 1.0
Products.PloneHotfix20171128 = 1.0
pas.plugins.ldap             = 1.5.2
# b/c plone.restapi needs plone.schema 1.2.0
plone.schema = 1.2.0
# jsonschema 3.0.0 and 3.0.1 throw error while installing with buildout
jsonschema = 2.6.0
# pin six
six = 1.12.0

vars.cfg

[buildout]
buildout-cache-dir=.
eggs-directory=${buildout:buildout-cache-dir}/eggs
download-cache=${buildout:buildout-cache-dir}/downloads
index = https://pypi.python.org
user = admin:admin
http-address = 8082
http-fast-listen = off
effective-user = plone_daemon
relstorage_dbhost = localhost
relstorage_dbname = zodb
relstorage_dbuser = user
relstorage_dbpassword = pass
relstorage_cache_servers = localhost:11211
relstorage_poll_interval = 0
relstorage_read_only = false
relstorage_keep_history = false
blob-dir = ../../blobs
var-dir = ${buildout:directory}/var
secure-cookie = False
sentry-dsn = https://f55c9c80f3af4337b435ffe531db1c68:95f7e239a3b936578481db5f410e5xyz@sentry.xyz.de/6?verify_ssl=0
audit-log = ${var-dir}/log/audit.log
zserver-threads = 2
ldap_cache_server = localhost:11211

We were able to reproduce the POSKeyError