Merge pull request #107 from grafana/killercoda-alerts-logs
Killercoda alerts logs
Jayclifford345 authored Aug 30, 2024
2 parents 81e639a + e80966d commit 35123e7
Showing 4 changed files with 8 additions and 10 deletions.
2 changes: 1 addition & 1 deletion grafana/alerting-loki-logs/step2.md
@@ -1,6 +1,6 @@
# Generate sample logs

-1. Download and save a python file that generates logs.
+1. Download and save a Python file that generates logs.

```bash
wget https://raw.githubusercontent.com/grafana/tutorial-environment/master/app/loki/web-server-logs-simulator.py
```{{copy}}
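As an editor's aside: the real generator is the `web-server-logs-simulator.py` downloaded above, and its output may differ. A hypothetical sketch of this kind of simulator, emitting lines in a `status=... duration=...` format compatible with the LogQL query used later, might look like:

```python
# Hypothetical sketch of a web-request log simulator. This is NOT the real
# web-server-logs-simulator.py from the tutorial; format and fields are assumed.
import random
import time

METHODS = ["GET", "POST"]
URLS = ["/", "/login", "/products"]
STATUSES = [200, 200, 200, 404, 500]  # mostly OK, occasional errors

def emit_line() -> str:
    """Return one fake web-request log line."""
    ts = time.strftime("%Y-%m-%dT%H:%M:%S")
    return (
        f"{ts} level=info method={random.choice(METHODS)} "
        f"url={random.choice(URLS)} status={random.choice(STATUSES)} "
        f"duration={random.randint(1, 500)}ms"
    )

def write_burst(path: str, n: int = 5) -> None:
    """Append n sample lines to a log file; the real simulator runs continuously."""
    with open(path, "a") as f:
        for _ in range(n):
            f.write(emit_line() + "\n")
```

Pointing `write_burst` at a file that Loki tails (e.g. `/var/log/web_requests.log`) would produce lines the later alert query can match.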
4 changes: 1 addition & 3 deletions grafana/alerting-loki-logs/step3.md
@@ -4,9 +4,7 @@ Besides being an open-source observability tool, Grafana has its own built-in al

In this step, we’ll set up a new [contact point](https://grafana.com/docs/grafana/latest/alerting/configure-notifications/manage-contact-points/integrations/webhook-notifier/). This contact point will use the _webhooks_ integration. In order to make this work, we also need an endpoint for our webhook integration to receive the alert. We will use [Webhook.site](https://webhook.site/) to quickly set up that test endpoint. This way we can make sure that our alert is actually sending a notification somewhere.

-1. In your browser, **sign in** to your Grafana Cloud account.
-
-   OSS users: To log in, navigate to [http://localhost:3000]({{TRAFFIC_HOST1_3000}}), where Grafana is running.
+1. Navigate to [http://localhost:3000]({{TRAFFIC_HOST1_3000}}), where Grafana is running.

1. In another tab, go to [Webhook.site](https://webhook.site/).

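Editor's note: if Webhook.site is unreachable, a tiny local endpoint can stand in for it. The sketch below is an assumption, not part of the tutorial; the port and URL are placeholders you would paste into the contact point instead of the Webhook.site URL.

```python
# Hypothetical local stand-in for Webhook.site: accepts Grafana webhook
# notifications over POST and prints their status. Port 8080 is an assumption.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class AlertHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Grafana's webhook contact point sends a JSON body via POST.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            payload = json.loads(body)
            print("alert status:", payload.get("status"))
        except json.JSONDecodeError:
            print("non-JSON payload:", body[:200])
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request access logging

def make_server(port: int = 8080) -> HTTPServer:
    """Build the server; call .serve_forever() and point the contact point
    URL at http://<host>:<port>/."""
    return HTTPServer(("0.0.0.0", port), AlertHandler)

# make_server().serve_forever()  # uncomment to run
```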
10 changes: 5 additions & 5 deletions grafana/alerting-loki-logs/step4.md
@@ -18,9 +18,9 @@ In this section, we define queries, expressions (used to manipulate the data), a

1. Paste the query below.

-```
-sum by (message)(count_over_time({filename="/var/log/web_requests.log"} != `status=200` | pattern `<_> <message> duration<_>` [10m]))
-```{{copy}}
+```
+sum by (message)(count_over_time({filename="/var/log/web_requests.log"} != "status=200" | pattern "<_> <message> duration<_>" [10m]))
+```{{copy}}
This query will count the number of log lines with a status code that is not 200 (OK), then sum the result set by message type using an **instant query** and the time interval indicated in brackets. It uses the LogQL pattern parser to add a new label called `message`{{copy}} that contains the level, method, url, and status from the log line.
@@ -42,7 +42,7 @@ If you’re using your own logs, modify the LogQL query to match your own log me
1. Click **Preview** to run the queries.
-It should return a single sample with the value 1 at the current timestamp. And, since `1`{{copy}} is above `0`{{copy}}, the alert condition has been met, and the alert rule state is `Firing`{{copy}}.
+It should return alert instances from log lines with a status code that is not 200 (OK) that have met the alert condition. The alert rule fires on any occurrence that exceeds the threshold of `0`{{copy}}. Since the Loki query has returned more than zero alert instances, the alert rule is `Firing`{{copy}}.
![Preview of a firing alert instances](https://grafana.com/media/docs/alerting/expression-loki-alert.png)
@@ -56,7 +56,7 @@ An [evaluation group](https://grafana.com/docs/grafana/latest/alerting/fundament
To set up the evaluation:
-1. In **Folder**, click **+ New folder** and enter a name. For example: _loki-alerts_. This folder will contain our alerts.
+1. In **Folder**, click **+ New folder** and enter a name. For example: _web-server-alerts_. This folder will contain our alerts.
1. In the **Evaluation group**, repeat the above step to create a new evaluation group. We will name it _1m-evaluation_.
2 changes: 1 addition & 1 deletion grafana/alerting-loki-logs/step5.md
@@ -1,5 +1,5 @@
# Trigger the alert rule

-Since the alert rule that you have created has been configured to always fire, once the evaluation interval has concluded, you should receive an alert notification in the Webhook endpoint.
+Since the Python script will continue to generate log data that matches the alert rule condition, you should receive an alert notification at the webhook endpoint once the evaluation interval has concluded.

![Firing alert notification details](https://grafana.com/media/docs/alerting/alerting-webhook-firing-alert.png)
