
Tried Grafana briefly a year or two ago, and I wanted to like it, but similar to Kibana, it's laser-focused on realtime monitoring of current data. I wanted to use it for a high-level view of historical stuff (robot data recordings from ROS), and a lot of really basic functionality for that use case just wasn't there at all.

Even stuff as basic as panning a plot back and forth after you've zoomed in is missing; here's the four-year-old ticket for that in their issue tracker: https://github.com/grafana/grafana/issues/1387

I ended up generating Bokeh plots and had a much better time. So Grafana is great for what it's great at, but I don't recommend it for uses other than current-moment data.



It's solely focused on data analysis (none of this "dashboard" stuff), but I've found Kst[0] to be the lowest-friction way of going from plaintext time-series data to fancy, zoomable, draggable plots in whatever layout you want. It handles realtime data very well too, happily scaling with as much backlog as I could throw at it.

Bokeh is a nice kit as far as it goes, but I hit scaling problems quite early with it: as I understood it, it was supposed to send incremental updates to the client, but in fact it resent everything every time, and therefore fell over after half an hour, once the time to update exceeded the update rate. Maybe I was holding it wrong.

[0] https://kst-plot.kde.org/
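For what it's worth, Bokeh does have a mechanism for incremental updates when running under Bokeh Server: `ColumnDataSource.stream()` sends only the new rows to the client, and its `rollover` argument caps how much history the client keeps. A minimal sketch (the `push_sample` helper and the tag names are my own, not from the thread):

```python
# Sketch of incremental updates in a Bokeh server app. stream() ships
# only the appended rows; rollover drops the oldest rows past a cap,
# so the update cost stays bounded over long sessions.
from bokeh.models import ColumnDataSource

source = ColumnDataSource(data=dict(t=[], y=[]))

def push_sample(t, y, max_points=10_000):
    # Appends one (t, y) point; with rollover set, the client never
    # holds more than max_points rows.
    source.stream(dict(t=[t], y=[y]), rollover=max_points)

# In a real app you'd attach `source` to a figure's line glyph and
# call push_sample from curdoc().add_periodic_callback.
```

Whether this path was available (or working) at the time of the comment above, I can't say.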


Kst is amazing! I write CSV files with 100 series at about 10 samples per second for 8-12 hours a day, and it doesn't break a sweat plotting the whole day with any number of series. It does transformations as required, and the next day, when I have a new CSV file, I can point it at the new file and keep all the plots I configured the previous day.
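To put that workload in perspective, a quick back-of-envelope calculation from the figures above (100 series, ~10 samples/s, up to 12 hours):

```python
# Rough size of one day's log at the stated rates.
samples_per_day = 10 * 3600 * 12        # rows at the 12-hour end
values_per_day = samples_per_day * 100  # 100 series per row
# Assuming ~8 bytes of CSV text per value, the file lands in the
# hundreds-of-MB range -- large for most plotting GUIs, modest for Kst.
approx_csv_bytes = values_per_day * 8
```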

KST has been a total game changer for me commissioning plants and machines.


Can you go into more detail for how you use it?


I don't have access to a proper data acquisition system that can sample at kHz rates and I'm not willing to buy one out of pocket.

SCADA systems and HMIs are generally not set up to poll data from PLCs faster than 1 Hz.

Using a driver for the PLC communications protocol, I write all of the variables of interest to a CSV file at 10 Hz.

PLC scan rates are usually 10-100 Hz, so while I can't capture everything the PLC sees, or higher-frequency components in the signals than the PLC can measure, it is a happy middle ground between a proper DAQ and just using the HMI software.

In addition, a DAQ wouldn't be connected to all of the PLC I/O, but with this system I can easily grab all the PLC tags I want, as well as internal PLC tags that are not I/O points.
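The logging loop described above could look roughly like this. This is a sketch, not the poster's actual code: `read_plc_tags` stands in for whatever the real PLC driver returns, and the tag names are invented for illustration.

```python
# Hypothetical 10 Hz tag-to-CSV logger; a real version would replace
# read_plc_tags() with a call into the PLC protocol driver.
import csv
import time

TAGS = ["pump_speed", "tank_level", "valve_cmd"]  # assumed tag names

def read_plc_tags(tags):
    # Placeholder for the driver call that polls the PLC.
    return {t: 0.0 for t in tags}

def log_to_csv(path, period_s=0.1, n_samples=10):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp"] + TAGS)
        next_t = time.monotonic()
        for _ in range(n_samples):
            vals = read_plc_tags(TAGS)
            writer.writerow([time.time()] + [vals[t] for t in TAGS])
            f.flush()  # keep the file readable by Kst mid-run
            next_t += period_s
            time.sleep(max(0.0, next_t - time.monotonic()))
```

Scheduling against `time.monotonic()` rather than sleeping a fixed interval keeps the sample rate from drifting as the write time varies.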

Most HMI software has pretty brutal plotting capabilities as well (Citect's Process Analyst being the only one that is better than passable), so on top of getting at least 10x the resolution in the data, I get to use Kst, which is great for zooming and panning on the plots, creating sets of plots, or building different plots for specific tests.

Not all HMI software even allows plot configurations to be saved, so you can spend a lot of time just re-adding the time series to the plot and setting the scales.

The plots are used for commissioning reports and records.


Interesting and thanks! I use time historians for grid level operations, but the data coming from a single generator is somewhat limited to things like real and reactive power (among others) and the economic data is submitted in a different way.


Although we sometimes supply historians, they are not usually part of the up-front controls and commissioning contract, so I am not as familiar with them. Usually part of the reason the HMI software's plotting capabilities are such shit is so that the HMI vendor can try to up-sell a historian.


thanks for the feedback. there's definitely a lot of validity to what you're describing.

grafana has traditionally been used for 'real time dashboarding and analytics' in the IT/devops world. that's the original use case, and its sweet spot, as you allude to.

but, since the beginning, the mission of the open source grafana project has had nothing to do with IT per se. it was about democratizing metrics; helping teams understand their 'systems', by breaking down silos between databases and people.

over the last few years, interesting things have been afoot in the grafana community. we're seeing grafana used for more and more non-IT use cases; it's being deployed in the industrial and business worlds. about 10-20% of the grafana community now deals with things that have nothing to do with IT/devops.

the 'systems' are no longer limited to things like servers, switches, containers and clusters. these emerging users deal with things like temperature sensors, dollars, robots and ambulances. we are making progress in bringing grafana to these worlds, while also ideally improving it overall.

there are tangential threads in various stages of completeness (none of which solve your specific issue, admittedly): things like sql support, a general focus on ad-hoc analysis ('explore'), the upcoming abstraction for better reusing ui components within grafana ('grafana/ui'), improved support for tabular data, new panels, etc.

sorry about the four-year-old issue; i'd be lying if i said there weren't myriad things we'd like to do that don't make the cut, not for lack of desire but for lack of time and resources.

again, thanks for the feedback, please know that we're very interested in continuing to develop and improve grafana for use cases like yours!

-r

[disclosure, very biased and opinionated response. am co-founder/ceo at grafana labs. lucky enough to work with torkel and the team on making grafana better]


Thanks for the response and for a pretty cool open source project! Sorry my comment dumping on it ended up being the top of the thread here. FWIW, I definitely had a nicer time trying out Grafana than I ever have fighting with Kibana, and I definitely liked that I was able to use it with SQL based datasources rather than just Elastic.


We are at the beginning of trying out Grafana for real time monitoring of a combined indoor fish and vegetables farm (aquaponics).


I had the same problem. I also want simple binary metrics and batch-related updates, like whether batch jobs are running on time or overdue. I really want to avoid writing my own dashboard, but that seems to be the only way.


I use it for my Pi-based solar production and house monitoring and like it. It was the first time I'd ever used it, and I found it fit this project perfectly. But I was surprised that when I tried to use it for literally anything other than raw linear data input, it was completely useless. Neat product, but I'm not sure why it gets so much attention given how it really does only one very specific task... I guess it looks neat...


Actually, this largely comes down to the datasource: each one exposes different analytical functionality at the database level. Combined with the templating and variable functionality, you can do a ton of excellent analysis on all kinds of metrics.

What kind of data were you wanting to analyse?


I had a similar problem. I think Grafana or Chronograf still has its place, but based on what you've said, I'll try replacing some of our dashboards with a Bokeh-generated dashboard. The library looks very powerful. What is the best practice for updating the plots (e.g. once a minute)?


There are several strategies for live-updating bokeh plots; tbh I'm not sure which ones are best. The officially supported one is Bokeh Server: https://demo.bokeh.org/surface3d

In any case, for me it's "the robot had a problem sometime in this 30 minute window, please dig through 3GB of logs to figure out what went wrong", and doing a bunch of pandas crunching upfront and dumping out a bunch of time series plots makes that kind of task really straightforward.
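That kind of upfront crunching might look something like this. It's a sketch under assumed conditions: the log format (a CSV with a `timestamp` column), the helper name, and the downsampling rule are all mine, not the commenter's.

```python
# Cut a large log down to the suspect 30-minute window and downsample
# it so the resulting time-series plots stay light.
import pandas as pd

def window_of_interest(csv_path, start, minutes=30, rule="1s"):
    df = pd.read_csv(csv_path, parse_dates=["timestamp"],
                     index_col="timestamp")
    start = pd.Timestamp(start)
    # Label-based slicing on a DatetimeIndex is inclusive at both ends.
    window = df.loc[start : start + pd.Timedelta(minutes=minutes)]
    # Mean-downsample each numeric column to one point per `rule`.
    return window.resample(rule).mean()
```

From there, each column of the returned frame can be dumped straight into a Bokeh (or any other) time-series plot.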


Sounds like you want to look at Redash or similar BI platforms.



