Future of JavaScript in Plone

This deserves a new thread :slight_smile:

That's a lot to process.

But I gather what you are saying is that any plugin author will release the required npm packages on PyPI? Is that sustainable?
What happens when author A needs 1.1 of blahjs and author B needs 1.2? Do they need a shared password on PyPI?

I created Calmjs originally to avoid having to split apart a repository that contains both the Python server-side components and the JavaScript client-side components that achieve a single unified goal. Granted, an argument can be made to split it into three packages (Python only onto PyPI, JavaScript only onto npm, and a glue package that brings the two together), but I personally find it annoying to manage dependency problems within what is ultimately a single package. So no, to be very clear: I hate the idea that Python package authors cannot ship JavaScript code with their Python package, and I hate the idea of being forced to split the JavaScript onto npm. I want a single Python wheel containing all the things required to start the web server, and I am working towards that goal (until artifacts can be declared within the calmjs framework, this goal is incomplete).

Anyway, the fact that the Python package requires/contains specific client-side JavaScript code that works directly with the included Python code within that package, and ultimately targets a Python environment, means that the JavaScript code would have minimal value on npm anyway. So ultimately the Python wheel (package) that gets generated will also contain the raw JavaScript code (or whatever variants), and the tools that integrate with Calmjs (currently only calmjs.rjs) will be able to extract it from the package, along with the dependencies from npm, and generate the target artifact which gets served to the client.

As for dealing with versions, this is where the downstream packages have to make a decision. They can provide explicit version pins, which gets away from the nested dependency structure that npm implicitly introduces, at the cost of having to manually test the viability of their generated bundle. This is why I also made it much easier to declare/build/generate the test harness during the build process (where the JavaScript tests are extracted from the provided Python packages) and run them against the development sources or even the generated artifact(s). If the tests pass (and with sufficient test coverage), it really doesn't matter too much which explicit versions the upstream packages have pinned. So for instance, if some package extending nunja really needs nunjucks 3.1.0, it can just specify that in its package_json attribute and it will shadow the version defined in nunja.
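The shadowing behavior described above can be sketched in plain Python: a downstream tool parses both packages' package.json strings and lets the downstream pins win. The function name and version strings here are hypothetical illustrations, not part of the Calmjs API:

```python
import json

def shadow_dependencies(upstream_json, downstream_json):
    # Parse both package.json strings; explicit pins declared by the
    # downstream package shadow (override) the upstream ones.
    upstream = json.loads(upstream_json)
    downstream = json.loads(downstream_json)
    merged = dict(upstream.get('dependencies', {}))
    merged.update(downstream.get('dependencies', {}))
    return merged

# Hypothetical pins: nunja declares a nunjucks range, while a package
# extending it pins 3.1.0, which wins in the merged result.
upstream = '{"dependencies": {"nunjucks": "~2.4.2"}}'
downstream = '{"dependencies": {"nunjucks": "3.1.0"}}'
merged = shadow_dependencies(upstream, downstream)
# merged == {"nunjucks": "3.1.0"}
```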

Also, if this behavior is undesirable, another package can simply consume the package.json in the egg-info (via the setuptools/distutils metadata API) generated by the calmjs-supported packages and create new ways to harmonize the versions, or whatever. I am simply using and extending the core Python libraries to expose JavaScript development files, which no current Python packages/frameworks have even tried to do (putting a package.json/bower.json/webpack.json or whatever static/hard-coded configuration specific to a project's git repo/tools is really not a portable way to drive development and integration between projects).

I agree that this is all very annoying and a lot to process, because nobody really thought hard about communicating Node.js package dependencies (i.e. npm) down to/with Python package dependencies (i.e. PyPI). The approaches taken by these two package management frameworks differ in how pretty much everything is done, and putting all of these things together is a lot to take in at one time. (Spoiler: I had been thinking about how to bridge Python and JavaScript package dependencies together for about three years, on and off, before I finally grokked the problem and built calmjs, and I don't even claim to fully understand the full problem at hand; I only tackled what was blatantly obvious to me, and that alone is a huge problem already. My attempt to solve this basically works at a level that suits me, in a way that can be extended, using core Python libraries and existing pieces from other software/development ecosystems to avoid inventing completely brand new standards.)

I do like the idea of (optionally) being able to release both Python backend code and JS/CSS frontend resources in the same package, in a way compatible with both buildout/pip and npm. I've also done that myself.

The main issue IMO is to find good enough conventions to map the required metadata for both Python code and npm. CalmJs does that, but I still wonder whether we could do it with just the stdlib. E.g. define everything JS-related in package.json, but then read it in setup.py into somewhere available in the Python runtime. @metatoaster probably tried that and found issues?

There is nothing stopping anyone from having a package.json in their project root and having setup.py read that into the package_json field that Calmjs understands, if using Calmjs is permitted. Much like how most Python projects read README.rst into the long_description field (in their setup.py), one can do the same thing for reading package.json into the Calmjs package_json field (there is no need to decode it with JSON; the validation for that field by Calmjs is done on the string). As the stdlib (distutils) is really limited in what it can understand/convey, setuptools was created to rectify the situation; likewise, setuptools doesn't understand things like package.json or other Node.js conventions, so a way to convey that was needed, and that is one of the many reasons the calmjs package was created.
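A sketch of what such a setup.py could look like (the package name is made up, and this assumes Calmjs is installed so that setuptools understands the package_json keyword):

```python
# Hypothetical setup.py reading package.json into the Calmjs
# package_json field, by analogy with reading README.rst into
# long_description. No JSON decoding is needed here, since Calmjs
# validates the raw string.
from setuptools import setup, find_packages

with open('package.json') as f:
    package_json = f.read()

setup(
    name='example.package',  # hypothetical package name
    packages=find_packages(),
    setup_requires=['calmjs'],
    package_json=package_json,
)
```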

If a pure Python stdlib solution is desirable, simply commit and include the package.json in the package's egg-info (or dist-info) directory (or link the root-level copy into it). In fact, all Calmjs really does for the handling of the package_json flag for the setup method is exactly that (plus some small sanity checks), using a couple of helpers from pkg_resources (provided by setuptools) to do the heavy lifting. The code to access that from Python is simply this (the example assumes calmjs.rjs is installed in the running Python environment):

from pkg_resources import Requirement
from pkg_resources import working_set

# Locate the installed distribution in the working set, then read the
# package.json shipped in its egg-info/dist-info metadata directory.
pkg = working_set.find(Requirement.parse('calmjs.rjs'))
package_json = pkg.get_metadata('package.json')

Since the JavaScript tooling for standardized packaging is pretty lacking in features compared to what Python has to offer, going in the reverse direction (i.e. using pure Node.js-based packaging tools, whatever they are, to feed back into Python) is more problematic.


@tisto @sneridagh @tkimnguyen What do you think of the CalmJS approach?

I like the basic idea that we could use package.json to define frontend dependencies and entry points in the package, and then include package.json also in the Python package metadata to make that data accessible from Python. (And those who want to completely ignore JS tooling could just define the same info directly in setup.py.)

CalmJS adds a lot of tooling on top of that basic idea. I think I understand its reasons, but now that I've become comfortable with JS tooling, the CalmJS wrappers on top of it feel complex.

A large part of the complexity comes from my desire to expose the module/namespace structure presented within a Python environment to the accompanying JavaScript environment, so that the layout of the modules and associated tests feels a bit more natural for a stubborn Python programmer like me. That said, the tooling is designed to interoperate with existing JavaScript tools, and this is accomplished by generating the configuration files that build the tests and artifacts, a task I've found annoying to do using standard Node.js tools due to the need to manually manage all those configuration files. (I guess I should mention that I hate writing configuration files and would rather write programs to write them for me; that also goes for building the configuration files that start the tests against the specific packages I want to test.)

Example: let's suppose there is an existing framework that already provides all the standard client-side JavaScript libraries (e.g. jQuery, Underscore and, say, MathJax). We want to introduce a new $library (that I or anyone could have written) that uses some of those provided libraries, plus a bunch more that we don't already have (Handlebars + Backbone + whatever else). In the traditional way, hard-coded configuration files (such as for grunt/gulp) would be provided to bundle everything together for that $library, which may or may not be structured in a way that is compatible with the existing framework or its way of loading things. So, from what I understand, someone will have to craft a new configuration file that bridges the two together to deploy with the framework. Okay, manually building configuration files is a perfectly fine way of working with just Node.js packages, but the problem comes when we are talking about interoperability with Python packages: Python packaging tools by default do not provide a way to communicate all things Node.js to other Python packages, especially these pesky configuration files. Though, if the problems were limited to this, Calmjs wouldn't offer much, especially for a workflow that is open to git clones (to grab the grunt/gulp files, as Python wheels don't typically ship them) and Node.js tooling.

The really complicated part is allowing stubborn Python programmers (like me) to develop/ship JavaScript with their Python code as complete Python packages, but in a way that is at least open and compatible with existing JavaScript tooling and module frameworks. A few people commenting here (like @zopyx) have expressed that being forced to split JS/CSS from their Python package into a separate Node.js/npm package is undesirable, and this reason is precisely why I created calmjs, just to reiterate this point once more in a more direct way (the nunja templating engine was also the driver for this). One major thing I still need to get done to fully achieve this goal is to figure out how to declare, generate (with the Node.js tooling integration done by Calmjs) and ship complete JavaScript artifacts with Python wheels, so that only Python wheels (and pip) are required for deployment, making Node.js not a requirement on the server.


I haven't looked into CalmJS. Though, in my experience mixing JS and Python code/packaging is never a good idea.

In the end, it all comes down to the question if Plone/Python web developers need to master the JS toolchain. IMHO, it is an illusion to think you can get away without understanding the JS toolchain, as a web developer in 2017.


"Never" seems like too strong a word, no? We have many people in the larger Python webdev community doing this, for reasons that are too varied and complex to dismiss out of hand as historical baggage, I think?

Edit: Maybe I am misunderstanding, conflating packaging with distribution?

JS tools and their configuration may be complex, but that complexity is shared, documented and hopefully eventually solved by the JS developer community.

Yet, I understood that CalmJS would allow developers familiar with JS tools and practices to keep following them, while those afraid of JS tools would have a "pythonic" alternative. That sounds good to me, but the real question is: would CalmJS be easy enough for the critics of the Plone 5 resource registry and bundling tools?

Isn't this a similar issue to compiling po files to mo files during packaging? Plone does that currently with zest.releaser, which supports plugins like

Still confused. Doesn't this still require a plugin developer to release all their JS dependencies as Python packages, which is impractical? I get packaging your own custom JS code, but remaking JS packages in Python seems like a lot of work.

Thanks Asko but I'm out of my depth here, and I defer to those of you who have been trying to address the issue! "better minds" etc. :slight_smile:

Yes, I've seen how grunt/gulp work, but I really dislike how manual everything feels. I have this project called repodono.storage that implemented some extensions to the mockup structure pattern, while at the same time working on the mockup code itself because of the various fixes I had to make to it. Sure, I could just copy/paste the scripts, but the setup is so specific to that repo/directory layout that it isn't portable. I like having tools that make common modifications available via command-line arguments (like specifying the browser to run the tests in, which packages to extract the tests from, which package to test), not bound to the development source tree, with the option to easily run the tests against production.

As for dealing with npm, if I needed more dependencies than what mockup has, I would have to copy/paste its package.json (and bower.json) into my project root and extend it (rather than just lazily symlinking it). Yes, if mockup were on npm this problem might go away, but that doesn't make it easy to work with in development mode in conjunction with my extension (namely, live reloading selectively). I also like having a way to communicate this to Python packages (and avoid the nested dependency issue that npm introduces), as mentioned.

At the very least, provide a way to link a Python (server-side) package with explicit JavaScript package versions. Without this in place, everyone is forced to dig for this information in VCS rather than being able to find it in pre-installed Python packages.

The use case I had in mind was this: we have C++ libraries that provide Python bindings for working with 3D, and this includes export to WebGL. While the bulk of the JavaScript code is provided by libraries sourced from npm (like ThreeJS), that alone doesn't provide many UI controls within the browser. This is where bits of JavaScript code get added along with the relevant Python code, so they work as one cohesive unit. The fact is that the PhD students here don't exactly have a lot of time to do their own research plus learn the constantly churning Node.js tooling (plus knowing how to craft/modify the various configuration files); why would I want to force them to learn webdev tooling when they don't even want, or have time, to do webdev? If I could give them a single command-line tool that will generate/build/open their result page in their browser without them having to know all of the various Node.js tooling, wouldn't that be better (my tooling actually wraps around the Node.js tooling; for a more concrete example, please run through this example, starting from the Installing Prerequisites section - I will need to build one for webpack)? Isn't programming about automation and solving repetitive problems? I hate being forced to go along with the endless churning that the Node.js community loves to wallow in, and sure as heck those PhD students wouldn't want to take part in that.

Also, I feel that a good tool should be there to solve problems and get out of the way, and in my experience Node.js tools simply get in my way too much while not really solving the problems I have. For example the r.js optimizer; I've explained this somewhat in my previous post (how the user has to edit some config file, instead of just easily stubbing out the dependencies).

Yeah, similar, though I would also need to provide an API of some sort that would expose those pre-generated artifacts from the installed Python packages, too.

No. I make no requirements on anything, aside from the ability to ask a Python package what dependencies it needs from npm. I am sorry for the confusion, so please let me spell this out some more. If a developer wants to have a Python package (the.upstream) for only the Python stuff and a completely separate JavaScript package (upstream.js) for the client-side stuff, and wants to use Calmjs for that, they simply include that upstream.js package in the package.json declaration mechanism. Ideally, they also declare the JavaScript files (typically webpack bundles) they might require (an example, although I am currently thinking of streamlining and reworking this bit).

Now comes a developer who wants to consume the the.upstream package (as downstream). Once they have declared their dependency on the.upstream (by putting that package in install_requires), they can simply invoke calmjs npm --install downstream and, lo and behold, the required Node.js environment is now installed into the current directory, including the separately developed upstream.js package in node_modules (plus its npm dependencies). At that point, if the downstream package developer wishes, they can, with Calmjs's help, develop and use JavaScript directly within their Python source tree as a single package, in conjunction with the sources/packages acquired via npm. This also doesn't stop people from generating the required bits to build a package that would be deployed to npm from the same source tree that builds the Python package.

As I noted previously, the final generation of the artifact for inclusion with Python wheels still needs to be done, which is partly why I had hesitated to introduce Calmjs, but given how fast everything related to Node.js moves, I felt that I should at least show my hand, to get some more words and ideas on how it could work better.

@datakurre This is the problem with any of these efforts to simplify JS in Plone. The people most affected (simple integrators) can't understand the options. I don't really understand it.

The only way around this that I can see is some documentation first.

If you are installing some plugins, here are the commands you will have to run.
If you are creating a Plone plugin that has JS dependencies, here is what you have to run.
If you are creating a custom theme for one site, here is what you have to run.


Documentation is something I will have to work on, especially for creating a brand new package that extends either calmjs or nunja (only for the latter do I have an example, and only for deployment, not so much for building a brand new thing). Also, I am not doing this to simplify JS in Plone, nor really to simplify anything, but rather to provide the information needed to use the Node.js ecosystem from Python (specifically with integration with things provided by setuptools). Simplifying everything is unfeasible; I just settled for making things less painful for myself, and wrote a thing that has the option to support Webpack later, even though I've focused on RequireJS.

Another key point is this: I built Calmjs so that specific and explicit commands can be run to create the environment needed to build whatever extensions required (like my nunja.stock examples I linked previously), and also for the generation of artifacts that are to be served. This seems to be what you desire, however I still need to build the Plone integration bits.

Heck, even within Zope/Plone itself, building pure Python integration packages isn't all that easy either - the documentation for some of the internal stuff was pretty lacking too (and what existed was hard to find). While I was building pmr2.oauth (an OAuth 1.0 provider for Plone) I basically had to read through various source code and trace execution through the PAS framework to see what was actually happening before I was able to put it together.


@djay raises something that I think needs to be pointed out: Documentation Driven Development. For those who don't know or don't remember: plone.api was built starting from the documentation first. If the resource registry had been built the same way, we probably wouldn't be in the current pickle.


If you head over to the first post of this discussion, the expected workflows were already stated there, roughly of course. Turning that into proper docs should be the next step.

Although my hands are full right now (and I'm about to have a second kid in a few months), is there any disagreement with bringing this discussion to a PLIP? Anyone willing to take the lead?

Attended a quite interesting tech talk last week.


that's just awesome! I can't wait for next week's JS module loader! :wink:


that's what I thought too, at first :wink: