How to collect and visualize your logs with the ELK stack (Elasticsearch Logstash Kibana) | Scaleway


How to collect and visualize your logs with the ELK stack (Elasticsearch Logstash Kibana)

This page shows you how to use the ELK stack InstantApp on your C1 server. ELK stack is an environment that lets you collect and visualize your logs with:

  • Elasticsearch for search and data analytics
  • Logstash for centralized logging, log enrichment and parsing
  • Kibana to visualize data


  • You have a Scaleway account and are logged into the control panel
  • You have configured your SSH key

There are three steps to deploy the ELK stack InstantApp:

  • Create and start a new C1 server using the ELK stack InstantApp
  • Collect syslogs data with Logstash
  • Visualize your data with Kibana

Step 1 - Create and start a new C1 server using the ELK stack InstantApp

First, we need to create a new server using the ELK stack InstantApp. Click the “Create Server” button in the control panel.

You land on the server creation page where you must input information and choose an image.

After entering your server's basic information, select the ELK stack image for your server: on the ImageHub tab, select ELK stack and click the “Create Server” button.

The server will be created with a ready-to-use installation of Elasticsearch, Logstash and Kibana.

Step 2 - Collect Syslogs data with Logstash

In this tutorial we will see how to track syslog data and visualize it in Kibana.

Let’s start by creating a new configuration file to collect system logs. Open a new file in /etc/logstash/conf.d/logstash-syslog.conf and fill it with the following:

input {
  file {
    path => [ "/var/log/*.log", "/var/log/messages", "/var/log/syslog" ]
    type => "syslog"
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}

The configuration above tells Logstash to collect all files with the .log extension in /var/log, as well as /var/log/messages and /var/log/syslog.

Next, we will create a filter so that Elasticsearch does not store the whole log line in a single message field: grok parses each line into structured fields, which simplifies the analysis.

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
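To see what the grok pattern does, here is a rough Python sketch of the fields it extracts from a typical syslog line. The regexes below are illustrative approximations of the SYSLOGTIMESTAMP, SYSLOGHOST, DATA, POSINT and GREEDYDATA grok patterns (the real patterns are more permissive); the sample log line is made up.

```python
import re

# Approximate Python equivalent of the grok pattern above, for illustration only.
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "   # SYSLOGTIMESTAMP
    r"(?P<syslog_hostname>\S+) "                                   # SYSLOGHOST
    r"(?P<syslog_program>[\w./-]+)(?:\[(?P<syslog_pid>\d+)\])?: "  # DATA + optional [POSINT]
    r"(?P<syslog_message>.*)"                                      # GREEDYDATA
)

# A made-up sample line in standard syslog format.
line = "Jun  9 10:15:01 scw-elk CRON[1234]: (root) CMD (run-parts /etc/cron.hourly)"
fields = SYSLOG_RE.match(line).groupdict()
# fields["syslog_program"] is "CRON" and fields["syslog_pid"] is "1234"
```

Each named group becomes a separate field on the event, which is what makes filtering and graphing by program or host possible in Kibana later on.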

Restart Logstash to apply the changes:

service logstash restart

Step 3 - Visualize your data with Kibana

System logs are now collected and stored in Elasticsearch, and you can visualize them with Kibana. Open a browser and go to http://<your_server_public_ip>. You are asked for a login and password, which you can retrieve from the message of the day (MOTD) when you connect to your server.

Welcome on ELK stack on Scaleway' C1.
 * Kernel:           GNU/Linux 3.2.34-30 armv7l - Marvell (Proprietary)
                     - This kernel has the best performances on this hardware
                     - For mainline kernel with latest features and plenty of modules, use a 3.17 kernel instead
 * Distribution:     ELK stack (2015-06-09) on Ubuntu 14.10
 * Internal ip:
 * External ip:
 * Disk /dev/nbd0:   scw-app-elk-latest-2015-06-09_18:11 (l_ssd 50G)
 * Uptime:           09:50:11 up 17:31,  0 users,  load average: 3.23, 3.15, 3.08
 * Documentation:
 * Community:
 * Image source:
To access Kibana, open http://xxx.yyy.zzz.www/.
Login with user kibana and password -> ieshahchuemohfohxooshieshieshiojiepiengeng <-
You can hide this message on the next connection by deleting the /etc/update-motd.d/70-elk file.

You land on the Kibana homepage and are asked to configure an index pattern. Index patterns identify the Elasticsearch indices to run search and analytics against.
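Behind the scenes, Logstash's elasticsearch output writes each event into a daily index named after the event's @timestamp, which is why Kibana's default logstash-* index pattern matches everything Logstash has stored. A minimal sketch of that naming scheme (the helper function is hypothetical, the index name format is Logstash's default):

```python
from datetime import datetime

# Logstash's default index naming: one index per day, logstash-YYYY.MM.DD (UTC).
def logstash_index_for(ts: datetime) -> str:
    return "logstash-" + ts.strftime("%Y.%m.%d")

print(logstash_index_for(datetime(2015, 6, 9)))  # logstash-2015.06.09
```

Daily indices make it cheap to expire old logs: dropping a whole day of data is just deleting one index.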

To create the first index pattern, select @timestamp from the Time-field name menu and click the Create button.

On the top navigation bar, click the Discover tab.

All the collected logs are displayed here, along with a histogram representing log activity.

It is your turn now! Start playing with Kibana, and create visualizations and filters on your logs :)


The ELK stack lets you search and analyze your data with ease. From here you can go deeper and create a more complex configuration. For instance, you can use logstash-forwarder, which lets you collect logs from remote servers and ship them to Logstash.
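As a sketch of that setup: logstash-forwarder ships events over the lumberjack protocol, so the receiving Logstash needs a matching input block. The port and certificate paths below are placeholders you would adapt to your own deployment:

```
input {
  lumberjack {
    # Port and certificate paths are illustrative; use your own values.
    port => 5043
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    type => "remote-syslog"
  }
}
```

The same SSL certificate must be distributed to each logstash-forwarder client so it can authenticate the server it ships to.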

If you have any suggestions or questions about this documentation, please leave a comment.


Hi there! I am using this instantapp.

I am trying to use the beats input for filebeat, but get this error on logstash:

Couldn’t find any input plugin named ‘beats’. Are you sure this is correct? Trying to load the beats input plugin resulted in this error: no such file to load – logstash/inputs/beats

Can you help me?


Never mind, now I realized that the version you offer is too old for Filebeat. Could you please offer a more modern version of ELK?