Help with some best practices in logging and metrics

I have started to look into logging and performance monitoring, and have been deciding between the Grafana stack (both the cloud offering and a local one), Zabbix, and Checkmk. Right now I am leaning towards Grafana, but the documentation is lacking a proper explanation.

I am running grafana-agent in flow mode (as their docs suggest over Promtail), and that part is fine.

But my main issue is that far too much data is sent, and it is just unusable. I am also a bit unsure how to separate the logs and data if, for example, I run the agent on two computers. Say I run it on a server that has its syslogs (for example /var/log/syslog), and I also run it on a Raspberry Pi (handling the UPS and such) that has its own /var/log/syslog.

When I open Explore in Grafana, there is just a filename label with the file path, and I have no idea which entry belongs to which computer. Are there any best-practice flows for this?
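One way to tell the two machines apart (a sketch in the same River syntax as the config below; the label name "host" and the value "unraid-server" are placeholders of mine) is to attach a static label to the file targets, since extra non-double-underscore keys in path_targets become labels on the log stream:

local.file_match "unraid" {
  path_targets = [{
    __address__ = "localhost",
    __path__    = "/var/log/**",
    // extra keys without a "__" prefix become stream labels
    host        = "unraid-server",
  }]
}

Repeating this on the Pi with, say, host = "raspberry-pi" would make the two syslog streams distinguishable in Explore.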

Also, any suggestions for filtering the logs so the amount of data sent is reduced? I filled up the 10K limit in days.
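A sketch of one filtering approach, assuming the loki.process component with a stage.drop block (the component name "filter" and the regex are illustrative, not from the original config): logs would be routed through it before loki.write:

loki.process "filter" {
  // drop lines matching the expression before they are shipped
  stage.drop {
    expression = ".*(DEBUG|TRACE).*"
  }
  forward_to = [loki.write.default.receiver]
}

The loki.source.file components would then set forward_to = [loki.process.filter.receiver] instead of writing directly to loki.write.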

Here is the config so far:

logging {
  level  = "error"
  format = "logfmt"
}

local.file_match "unraid" {
  path_targets = [{
    __address__ = "localhost",
    __path__    = "/var/log/**",
    __path_exclude__    = "/var/log/file.activity.log",
  }]
}

loki.source.file "files" {
    targets    = local.file_match.unraid.targets
    forward_to = [loki.write.default.receiver]
}


discovery.docker "flog_scrape" {
    host             = "unix:///var/run/docker.sock"
    refresh_interval = "5s"

    filter {
        name   = "label"
        values = ["logging=promtail"]
    }
}

discovery.relabel "flog_scrape" {
    targets = discovery.docker.flog_scrape.targets

    rule {
        source_labels = ["__meta_docker_container_name"]
        regex         = "/(.*)"
        target_label  = "container"
    }

    rule {
        source_labels = ["__meta_docker_container_log_stream"]
        target_label  = "logstream"
    }

    rule {
        source_labels = ["__meta_docker_container_label_logging_jobname"]
        target_label  = "job"
    }
}


loki.source.docker "flog_scrape" {
    // Docker targets carry no __path__, so loki.source.file cannot read them;
    // loki.source.docker reads container logs from the daemon directly
    host       = "unix:///var/run/docker.sock"
    targets    = discovery.relabel.flog_scrape.output
    forward_to = [loki.write.default.receiver]
}

loki.write "default" {
    endpoint {
        url = "https://logs-prod-025.grafana.net/loki/api/v1/push"
        basic_auth {
            username = "user"
            password = "password"
        }
    }
}
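Another way to distinguish the machines, instead of per-target labels, would be loki.write's external_labels argument (the label name and value here are placeholders of mine), which stamps every stream this agent sends:

loki.write "default" {
    // applied to all streams sent by this agent; use a unique value per machine
    external_labels = { host = "server-1" }

    endpoint {
        url = "https://logs-prod-025.grafana.net/loki/api/v1/push"
        basic_auth {
            username = "user"
            password = "password"
        }
    }
}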

Posted: 1 year ago