Does anyone have kopano-prometheus-exporter running?
I stumbled across this in the repo: https://stash.kopano.io/projects/KC/repos/prometheus-kopano-exporter/browse
Most of the Prometheus exporters I have used are very simple: a single binary that exposes data in the format Prometheus expects, on a port that you set. Looking at this exporter, it seems like Kopano Server, Dagent, and Spooler need to be configured to somehow send data to a unix socket at /run/prometheus-kopano-exporter/exporter.sock, but I have not found any documentation on how that works. Does anyone have this running?
The project you’ve found is something we implemented for a few larger customer projects. Re-reading the readme, I must admit I missed the important bit as well, but it actually contains everything you need:
The kopano-server and the other daemons require statsclient_url to be set to the exporter socket, and statsclient_interval to be set to a lower value than the Prometheus scrape interval.
So you only need to add these two options (with the right values) to dagent.cfg (or put them into a single file and include that from the already named config files).
This is the description of the named config options (from the dagent.cfg man-page):
statsclient_url
    A HTTP URL or filesystem-local socket specification for a kopano-statsd compatible web service that ingests service statistics such as memory usage or mail processing counters.
    Example: https://my.local.org/collector.php
    Default: unix:/var/run/kopano/statsd.sock

statsclient_interval
    The time interval at which the statsd service is to be contacted, in seconds. When "statsclient_url" points to a kopano-statsd instance, the value should be 60 (for now), because its rrdtool archives are set to expect data at this rate.
    Default: 0 (submission service is deactivated)
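Putting that together, a minimal sketch of how the two options might look in dagent.cfg (the socket path is the exporter default mentioned above; the interval of 10 is just an assumption for a typical 15-second Prometheus scrape interval — any value lower than your actual scrape interval should do):

```
# send stats to the prometheus-kopano-exporter socket instead of kopano-statsd
statsclient_url = unix:/run/prometheus-kopano-exporter/exporter.sock
# push stats every 10 seconds; must be lower than the Prometheus scrape interval
statsclient_interval = 10
```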
That’s what I get for working at midnight. I’ll test it out and see how it works.
I have a customer that only has about 20 users, but they process an incredible amount of emails.
@fbartels Looks like it’s working! I’ll have to finish setting it up with the service file and get it talking to Prometheus later (I run Prometheus on a private ZeroTier network to keep it secure), but it seems to be pulling data.
I forgot, I’m not really familiar with Go; is there a better way to build the binary for this, or is installing Go and running the makefile the expected way?
@burgessja yes, that is pretty much the expected way, at least as long as there are no official builds from us. The nice thing about Go is that you can compile it on a different system (or in a container) and just copy the resulting binary to your target system.