I have started to look into logging and performance monitoring, and have been deciding between the Grafana stack (both the cloud offering and a local install), Zabbix, and Checkmk. Right now I am leaning towards Grafana, but the documentation is lacking a proper explanation.
I am running grafana-agent in flow mode (as their docs suggest over Promtail), and that part works fine.
But my main issue is that way too much data is sent, and it is just unusable. I am also a bit unsure how to separate the logs and data if I run the agent on two computers. Say I run it on a server, which has its syslogs (for example /var/log/syslog), and also on a Raspberry Pi (handling the UPS and such) that has a /var/log/syslog of its own.
In Explore mode in Grafana there is just a filename label with the file path, and I have no idea which stream belongs to which computer. Are there any best-practice flows for this?
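From what I can tell from the docs, one option is to have each agent stamp everything it sends with a static label via external_labels on its loki.write component. A minimal sketch, where the host value is a placeholder I made up:

loki.write "default" {
  // every stream pushed through this component gets this label,
  // so each machine identifies itself
  external_labels = { host = "server01" }

  endpoint {
    url = "https://logs-prod-025.grafana.net/loki/api/v1/push"
  }
}

With host = "raspberrypi" in the Pi's config, I could then query {host="raspberrypi", filename="/var/log/syslog"} in Explore. Is that the intended way?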
Also, any suggestions for filtering the logs so that less data is sent? I filled up the 10K limit in days.
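My current thinking, from the loki.process docs, is to put a stage.drop filter in front of loki.write; the regex here is just an example of what I might throw away:

loki.process "filter" {
  // drop noisy lines before they are pushed upstream
  stage.drop {
    expression = ".*(debug|DEBUG).*"
  }

  forward_to = [loki.write.default.receiver]
}

and then point each source's forward_to at loki.process.filter.receiver instead of loki.write.default.receiver. Is that the right approach, or is there a better pattern?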
Here is my config so far:
logging {
  level  = "error"
  format = "logfmt"
}
// collect everything under /var/log except the noisy activity log
local.file_match "unraid" {
  path_targets = [{
    __address__      = "localhost",
    __path__         = "/var/log/**",
    __path_exclude__ = "/var/log/file.activity.log",
  }]
}
loki.source.file "files" {
  targets    = local.file_match.unraid.targets
  forward_to = [loki.write.default.receiver]
}
// discover running containers that carry the logging=promtail label
discovery.docker "flog_scrape" {
  host             = "unix:///var/run/docker.sock"
  refresh_interval = "5s"

  filter {
    name   = "label"
    values = ["logging=promtail"]
  }
}
discovery.relabel "flog_scrape" {
  targets = discovery.docker.flog_scrape.targets

  rule {
    source_labels = ["__meta_docker_container_name"]
    regex         = "/(.*)"
    target_label  = "container"
  }

  rule {
    source_labels = ["__meta_docker_container_log_stream"]
    target_label  = "logstream"
  }

  rule {
    source_labels = ["__meta_docker_container_label_logging_jobname"]
    target_label  = "job"
  }
}
loki.source.docker "flog_scrape" {
  // container logs are read from the Docker API, not from files on disk
  host       = "unix:///var/run/docker.sock"
  targets    = discovery.relabel.flog_scrape.output
  forward_to = [loki.write.default.receiver]
}
loki.write "default" {
  endpoint {
    url = "https://logs-prod-025.grafana.net/loki/api/v1/push"

    basic_auth {
      username = "user"
      password = "password"
    }
  }
}