  • I did something very similar with OpenSearch rather than Grafana, but it’s definitely possible. My setup:

    • fluent-bit installed on the webserver to scrape and parse the nginx logs, then forward them over TLS to the monitoring server
    • on the monitoring server, a second fluent-bit service collects the forwarded logs and inserts them into the correct index pattern; a filter also adds GeoIP lookups to each record (rough configs for both ends are sketched below)
    • OpenSearch & Dashboards set up to exclude known “bot” user agents from the analytics, and to do some other basic data cleanup to keep the dashboards pretty

    It works well, though admittedly it could be a bit simpler. You could use Loki instead of OpenSearch/Elasticsearch, and there are plenty of other log parsing tools out there.

    Another, much simpler option is to just run GoAccess on your log files, either periodically to generate reports or as a daemon serving a live dashboard.
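
    For example (log and output paths are placeholders; COMBINED matches nginx’s default access log format):

    ```
    # one-off: generate a static HTML report
    goaccess /var/log/nginx/access.log --log-format=COMBINED -o /var/www/html/report.html

    # live dashboard: stay resident and push updates to the page
    goaccess /var/log/nginx/access.log --log-format=COMBINED \
        -o /var/www/html/report.html --real-time-html --daemonize
    ```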

  • Not sure how to do that in Docker; I’ve run mine as a plain old PHP-FPM site for years and years. It might be something you can tweak through config files or environment variables, or it might require building a custom image. A rough sketch of the occ route is below.
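
    If it’s the standard “Antivirus for files” (files_antivirus) app, its settings are at least reachable through occ from inside the container. A sketch, assuming the stock nextcloud image, a container named nextcloud, and a clamd daemon reachable at clamav:3310 (all three names are placeholders):

    ```
    # enable the antivirus app, then point it at a clamd instance in daemon mode
    docker exec -u www-data nextcloud php occ app:enable files_antivirus
    docker exec -u www-data nextcloud php occ config:app:set files_antivirus av_mode --value=daemon
    docker exec -u www-data nextcloud php occ config:app:set files_antivirus av_host --value=clamav
    docker exec -u www-data nextcloud php occ config:app:set files_antivirus av_port --value=3310
    ```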

    ClamAV is slow and doesn’t catch the nastiest malware; its entire approach is stuck in 2008. It’s better than nothing for screening email, but for a private file store it won’t help much, considering the files will already be on your system somewhere. Most importantly, it slows file uploads down roughly 10x and substantially increases CPU load. The only good reason to use ClamAV with Nextcloud is if you’d be sued for not using it!