It is an important use case, but isn't it more of an integrator use case? An integrator should be comfortable with transmogrifier, which is not hard to use to convert the jsonify format. Alternatively, converting jsonify to CSV or another JSON format should not be hard using some custom Python code.
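For instance, a minimal sketch of such custom code, assuming a jsonify-style export with one JSON file per content item (the field names `_path` and `_type` follow collective.jsonify's conventions, but verify them against the actual export):

```python
import csv
import json
from pathlib import Path


def jsonify_to_csv(export_dir, csv_path, fields=("_path", "_type", "title")):
    """Flatten a directory of jsonify-style JSON files (one per item)
    into a single CSV containing only the selected fields.

    The default field names are assumptions based on collective.jsonify's
    conventions; adjust them to match the real export.
    """
    rows = []
    for json_file in sorted(Path(export_dir).rglob("*.json")):
        with open(json_file, encoding="utf-8") as fh:
            item = json.load(fh)
        # Missing fields become empty cells rather than raising KeyError.
        rows.append({f: item.get(f, "") for f in fields})
    with open(csv_path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(fields))
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

The same loop could just as easily re-serialize each item into a different JSON layout instead of CSV.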
Yes, as I already said, this add-on is built on top of plone.restapi, so the limits of migration depend solely on plone.restapi.
During development, I tried migrating Plone 4.0 data to Plone 5.0 and it worked out very well, though the limits of migration still need to be checked.
I think it would be great if it was possible to
- Install an add-on (collective.something) on both sites
- Go to the control panel, choose export, and pick a few options (like what to export, or what not to export)
- Go to the control panel on the new site and choose import. Select between a few options, like what to do on error (log, stop, continue, etc.).
- Maybe, if necessary, be able to 'make decisions as the migration runs' (there is an error migrating 'field A'; should the others be migrated, or should 'field A' be entered manually?).
Yes, that was the spirit behind this add-on.
Hi! This seems like a great project. I am new to the community, but I would like to work on this. I know Python, Django, React, etc.
Hi @tulikavijay, and welcome to the Plone community!
If you're interested in working with us for the 2018 GSoC, you'll want to get started by learning a bit about Plone, what it is for, how to use it, and how it works. You've taken the all-important first step of speaking up here in our forum. We've outlined a few useful tips on how to get started and you should work your way through those. We are happy to answer any questions you might have, especially if you can dig a bit and look for answers yourself first.
Again, welcome to Plone! We are glad you're here and we look forward to hearing more from you.
Note there's another new kid on the block:
I asked the author, Ramon Bartl, whether it's a generic Plone solution, and he answered:
senaite.sync is capable of syncing two Plone instances as well, using https://pypi.python.org/pypi/plone.jsonapi.routes. It can handle custom content types, references, and metadata, e.g. the review_state, etc. Currently @juangallostra and @Nihadness are working on senaite.sync to import huge data from a remote site into Senaite. We're all excited about this project.
Yes, this project sounds useful, as it would help establish routes for data transfer in JSON.
But can it serialize/deserialize data into Plone instances?
Is this project still under consideration for GSoC'18?
Yes it is
I just came up with a project idea around collective.transmogrifier improvements... I'm not sure how the two ideas are related or if they really overlap.
@djay @jean @hvelarde Any idea how to add the feature where the user can upload from a different site, given a URL and authentication?
How do we stream data between the sites?
Which protocol would be a good choice for streaming big data (large BLOBs) between sites?
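To make the memory concern behind these questions concrete, here is a minimal chunked-transfer sketch using only the standard library. The URL, the auth header, and the download endpoint are placeholders, not an existing Plone API; a real setup would likely target a plone.restapi file endpoint with basic auth:

```python
from urllib.request import Request, urlopen

CHUNK = 1 << 20  # 1 MiB chunks keep memory use flat regardless of BLOB size


def copy_chunks(src, dst, chunk_size=CHUNK):
    """Copy from any file-like src to dst in fixed-size chunks.

    Returns the number of bytes copied; never holds more than one
    chunk in memory at a time.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total


def stream_blob(url, dest_path, auth_header=None, chunk_size=CHUNK):
    """Download a large BLOB chunk by chunk instead of reading the
    whole response into memory. `url` and `auth_header` are assumed
    placeholders for a real remote-site endpoint and credential.
    """
    headers = {"Authorization": auth_header} if auth_header else {}
    req = Request(url, headers=headers)
    with urlopen(req) as resp, open(dest_path, "wb") as out:
        return copy_chunks(resp, out, chunk_size)
```

The same chunked loop works for uploads in the other direction; the key design point is that peak memory stays at one chunk, not one BLOB.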
This comment by @djay could be useful for understanding how it works on the surface. More about it can be found in the same thread.
senaite.sync seems like a good option, but isn't this achievable with plone.restapi?
To my knowledge, plone.restapi does the same job, though I see senaite will definitely help in learning how to import huge data from a remote site.
As I see on the idea page
"Add the ability to perform migration of data between Plone 4 and Plone 5 (optional)"
But Plone 4 doesn't even support plone.restapi, right? How are we going to accomplish this, then?
Probably not easily. There are a couple of ideas:
- ensure the CSV import works with some of the various plugins that already do CSV exports
- perhaps support the jsonify format?
But personally I think this requirement is less important. You have the option of an in-place database upgrade, which loses no data; a major upgrade is a more technical endeavour anyway, and there are more technical tools like transmogrifier available for it.
Having the following is most important:
- robust and useful import/export in both JSON and CSV
- support for both import/export without loss, and for syncing files and/or parts of metadata
- the ability to be used by non-technical people
If it can do that machine to machine as well, then that's a bonus.
And of the goals that @djay has laid out, @kakshay21, I would say this one is by far the most important: making it dead simple to use and fairly close to bomb-proof in function, so that a non-technical person has a reasonable expectation that it "just works".
Without knowing anything about what it does: could the export function in the ZMI, when one chooses XML, be used for anything?
What was the problem while importing/exporting large BLOBs?
I mean, was it because of loading/processing the BLOB in memory, or something else?
Any available error traceback would be very helpful.
If the member attribute is not exported, then how are users' memberships, or things like portal_membership, still maintained on the newly imported site?
No. And I don't understand why the must-include attributes are there either.
Go and have a look at collective.importexport and how it was designed.
It is intended to let you upload any partial set of metadata you want and match it against any primary key you want; it asks you which field you want to match on.
The current plone.importexport is inferior, and you should imagine something more flexible than this.
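A rough sketch of that matching idea, not taken from collective.importexport itself (the function name and data shapes here are illustrative only): the import updates only the columns the CSV actually provides, matched on a user-chosen key field.

```python
import csv
import io


def merge_partial_csv(existing, csv_text, key_field):
    """Update `existing` items (a list of dicts) from a partial CSV.

    Rows are matched on the user-chosen `key_field`, and only the
    columns present in the CSV are touched, so a partial metadata
    upload never clobbers fields it doesn't mention.
    """
    index = {item[key_field]: item for item in existing if key_field in item}
    updated = 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        target = index.get(row.get(key_field))
        if target is None:
            continue  # an error policy (log/stop/create) would hook in here
        for column, value in row.items():
            if column != key_field:
                target[column] = value
        updated += 1
    return updated
```

Asking the user which field is the key, as collective.importexport does, means the same tool works whether the spreadsheet identifies items by path, UID, or any custom column.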