Logging to Logstash/Kibana?

Is there anything around that helps integrate Zope logging with Logstash? I found https://pypi.python.org/pypi/gocept.logging - any experiences? Alternatives?


We use Graylog because of its structured GELF format (e.g. exceptions with stack traces are logged properly). It has integrations for both available Python GELF libraries.
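
For reference, hooking a Python/Zope logger up to Graylog is just a matter of adding a GELF handler. A minimal sketch using graypy (host, port and logger name are made up; the handler class is `GELFUDPHandler` in graypy 2.x, older releases call it `GELFHandler`):

```python
import logging

import graypy  # one of the two Python GELF libraries (pygelf is the other)

logger = logging.getLogger('zope.example')
logger.setLevel(logging.INFO)

# Hypothetical Graylog host; 12201/udp is the usual GELF input port.
logger.addHandler(graypy.GELFUDPHandler('graylog.example.com', 12201))

logger.info('instance started')
try:
    1 / 0
except ZeroDivisionError:
    # logger.exception() attaches the traceback, which travels to Graylog
    # as part of the structured GELF message instead of raw multiline text
    logger.exception('something went wrong')
```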

I looked into the ELK stack last summer for Zope/Plone logging and monitoring (1 to 2 weeks spent), but I was disappointed. For now we'll stick with Sentry, which is also non-trivial to set up on site, but it has deep integration for backtraces/exceptions, which is the most important part of what you would want to monitor from the logs.

In the full Plone stack there are more interesting logfiles you would want to monitor, which was the reason for looking into ELK, but setting up Logstash -> Elasticsearch -> Kibana (the correct flow) involves a lot of moving and interdependent parts, which costs a lot of time. All the separate tools are open source and well documented, but the pain/time sink is in the integration. Sounds familiar? :wink:

I found out about Graylog2 only after my ELK experiments. Graylog also has Elasticsearch as one of its components, but it is a kind of distribution where everything is already connected and configuration tools are in place. If I return to logging, I'll look at Graylog2 first.

You asked specifically about Logstash though. The issue I ran into there is that you have to modify the default format Zope logs in and configure/compose a separate Logstash parser for it, because the backtrace is a multiline format that isn't possible to capture nicely with Logstash rules.

But you don't want to run Logstash on every production server: another resource-hungry Java monster just to capture log files. Elastic and others recognised this and came up with tools like Filebeat, which only monitors the files, has very limited processing options and ships the logs to a central Logstash. That Logstash then does the heavy lifting and metadata enrichment and sends the processed log entries to Elasticsearch for storage and later reporting in Kibana.

In late 2016 Elastic released new versions of almost all the tools, so maybe things have improved a bit. My main non-technical concern with the whole ELK stack is that anything that makes actually using the tools easier (security, monitoring) has to be bought as X-Pack add-ons from Elastic, but the pricing is hidden and only available on request, i.e. "we make you pay what we think you can afford or what we can wring out of your pockets".

My customer already has a working ELK stack; I am just looking for a tool to write logs into it.

Further research resulted in the idea of using python-logstash plus some magic bits taken from raven to assemble information-rich logging into Logstash.
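
A rough sketch of that direction using python-logstash, with extra fields standing in for the kind of context raven adds around exceptions (host, port and field names are placeholders, and the Logstash side needs a matching UDP input):

```python
import logging

import logstash  # pip install python-logstash

logger = logging.getLogger('plone.instance')
logger.setLevel(logging.INFO)

# Placeholder host/port: point this at the existing Logstash input.
# python-logstash also offers a TCPLogstashHandler if UDP is not an option.
logger.addHandler(logstash.LogstashHandler('logstash.example.com', 5959, version=1))

try:
    raise RuntimeError('boom')
except RuntimeError:
    # values passed via extra become separate, searchable fields in
    # Elasticsearch - roughly the context raven/Sentry gives you for free
    logger.exception('unhandled error in view', extra={
        'site': 'customer-plone',      # made-up field names, for illustration
        'zope_instance': 'instance1',
    })
```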

For now our customer will use gocept.logging.

Eventually we may implement a full-featured solution as sketched above.
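
The appeal of key=value style logging (which, as far as I can tell, is what gocept.logging aims for) is that every record stays on one line, so Logstash can pick it apart with a simple filter instead of multiline rules. A stdlib-only illustration of the idea, not gocept.logging's actual API:

```python
import logging


class KeyValueFormatter(logging.Formatter):
    """Illustrative only: render each record as one flat key=value line,
    so Logstash never has to stitch multiline events back together."""

    def format(self, record):
        pairs = {
            'timestamp': self.formatTime(record),
            'level': record.levelname,
            'logger': record.name,
            'message': record.getMessage().replace('\n', ' '),
        }
        if record.exc_info:
            # flatten the traceback onto the same line
            pairs['exception'] = self.formatException(
                record.exc_info).replace('\n', ' | ')
        return ' '.join('%s="%s"' % (k, v) for k, v in pairs.items())


handler = logging.StreamHandler()  # in Zope this would be the event log handler
handler.setFormatter(KeyValueFormatter())
logging.getLogger().addHandler(handler)
```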