Migration: Transmogrifier and workflowhistory

I am struggling with content export/import for a small Plone installation. I want to get rid of the old installation because it is totally borked, and I have only limited knowledge of Plone internals and Python.

The best starting point for me was this blog post:

I set up Plone 5.1 with collective.jsonmigrator. The basic migration works; I just can't get the workflow history imported. I would love to get some input on this. How can I get log output from the import process to see what is failing?
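Besides the logger blueprint section at the end of the pipeline, one quick way to see more is to raise the Python log level for the migration-related loggers before running the import. A minimal sketch — the logger names below assume each package logs under its own dotted name, which is the usual convention but worth verifying against the packages' source:

```python
import logging

# Send log records to the console and raise the level for the loggers
# involved in the migration. Logger names are assumptions based on the
# convention that packages log under their dotted package name.
logging.basicConfig(level=logging.INFO)
for name in ('collective.transmogrifier', 'collective.jsonmigrator'):
    logging.getLogger(name).setLevel(logging.DEBUG)
```

Run this (e.g. from a debug prompt) before kicking off the pipeline, and blueprint-level messages should show up on the console.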

I tried the following blueprints without success:

Here is my current pipeline.cfg:

[transmogrifier]
pipeline =
    catalogsource
    typefilter
    deleteids
    pathfixer
    constructor
    uidmanipulator
    deserializer
    schemaupdater
    workflowmanager
    workflowhistory
    datesupdater
    properties
    owner
    logger

[catalogsource]
blueprint = collective.jsonmigrator.catalogsource
remote-url =
remote-username = admin
remote-password = admin
catalog-path = /Custom/portal_catalog
catalog-query =
    {'path': {'query': '/Custom', 'depth': -1},
     'modified': {'query': '2000/01/01', 'range': 'min'}}

# Only import News Items, Images, Files, and Folders.
# Everything else will be skipped because it doesn't match.
[typefilter]
blueprint = collective.transmogrifier.sections.condition
condition = python:item['_type'] in ['News Item', 'Image', 'File', 'Folder']

# jsonify puts the id we want to use in the `_id` field,
# so remove the one from the data we don't want to use.
[deleteids]
blueprint = collective.transmogrifier.sections.manipulator
delete = id

# Remove the old Plone site ID from the paths so we can import into any new
# Plone site regardless of its site ID.
[pathfixer]
blueprint = plone.app.transmogrifier.pathfixer
stripstring = /Custom

#[folders]
#blueprint = collective.transmogrifier.sections.folders

# Here's where the actual magic happens: create the content object in the site.
[constructor]
blueprint = collective.transmogrifier.sections.constructor

# Needed so that the schemaupdater can set the UUID correctly.
[uidmanipulator]
blueprint = collective.transmogrifier.sections.manipulator
keys = _uid
destination = string:plone.uuid

# If the data was contained inside an attached JSON file, stuff that data
# back into the pipeline for the next step.
[deserializer]
blueprint = transmogrify.dexterity.deserializer

# Now update the created item from the data in the dictionary that has been
# passed down the pipeline.
[schemaupdater]
blueprint = transmogrify.dexterity.schemaupdater

[workflowmanager]
blueprint = ftw.blueprints.workflowmanager
old-workflow-id = news_custom_workflow
new-workflow-id = simple_publication_workflow
state-map = python: {
    'pending': 'pending',
    'published': 'published',
    'visible': 'private'}

[workflowhistory]
blueprint = collective.jsonmigrator.workflowhistory

# Plone itself includes some nice blueprint sections that are ready to use.
# Set creation, modification and effective dates on the object.
[datesupdater]
blueprint = plone.app.transmogrifier.datesupdater

# Copy each item's Zope properties into the property sheet of the object.
[properties]
blueprint = collective.jsonmigrator.properties

[owner]
blueprint = collective.jsonmigrator.owner

# Critical to developing your pipelines is the ability to see log output of
# what has happened. Use the logger blueprint section to pick and choose
# what to output into your logs.
[logger]
blueprint = collective.transmogrifier.sections.logger
level = DEBUG
delete =
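To make the state-map step concrete, here is a hypothetical, simplified sketch of what remapping a workflow_history from the old workflow id to the new one amounts to. The function name and the entry layout are assumptions for illustration, not taken from ftw.blueprints:

```python
# Same mapping as the state-map option in the pipeline above.
STATE_MAP = {'pending': 'pending', 'published': 'published', 'visible': 'private'}

def remap_workflow_history(history, old_wf, new_wf, state_map):
    """Move entries from old_wf to new_wf, translating review states.

    Hypothetical helper: illustrates the idea only, not the blueprint's code.
    """
    entries = []
    for entry in history.get(old_wf, []):
        entry = dict(entry)  # copy so the input stays untouched
        state = entry.get('review_state')
        if state in state_map:
            entry['review_state'] = state_map[state]
        entries.append(entry)
    new_history = {k: v for k, v in history.items() if k != old_wf}
    new_history[new_wf] = entries
    return new_history

history = {'news_custom_workflow': [
    {'action': 'show', 'review_state': 'visible', 'actor': 'admin'}]}
remapped = remap_workflow_history(
    history, 'news_custom_workflow', 'simple_publication_workflow', STATE_MAP)
```

After the remap, the entry lives under simple_publication_workflow with its review_state translated to 'private', which is what the workflowhistory section then needs to find.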

What version of collective.jsonmigrator are you using? We had a similar problem, but it has been fixed in collective.jsonmigrator 1.0

I am using the master branch from GitHub.

I found someone with enough Python knowledge to check the blueprint and put some debug output in it. It turns out that ftw.blueprints.workflowmanager relies on Archetypes. I put a small hack around it and now the workflow_history imports flawlessly.

The issue lies in the check `or not IBaseObject.providedBy(obj)` here:
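For anyone hitting the same wall, a rough stand-in illustration of why that guard skips Dexterity content. The interface classes below are plain-Python mocks (the real check uses Products.Archetypes.interfaces.IBaseObject inside the blueprint), and the function names are mine, not ftw.blueprints':

```python
class IBaseObject:
    """Stand-in for the Archetypes IBaseObject interface."""
    @staticmethod
    def providedBy(obj):
        return getattr(obj, 'is_archetype', False)

class IDexterityContent:
    """Stand-in for plone.dexterity's content interface."""
    @staticmethod
    def providedBy(obj):
        return getattr(obj, 'is_dexterity', False)

def should_skip(obj):
    # The original guard: anything that is not an Archetypes object is
    # skipped, so every Dexterity item falls through and its workflow
    # history is never written.
    return obj is None or not IBaseObject.providedBy(obj)

def should_skip_fixed(obj):
    # The workaround: also accept Dexterity content.
    return obj is None or not (IBaseObject.providedBy(obj)
                               or IDexterityContent.providedBy(obj))
```

With the original check, a Dexterity object is skipped; with the loosened one it passes through and gets its history set.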

Please open an issue or pull request on the package so everybody can take advantage of your fix.

I did.