Category Archives: Databases

Adding in Custom Indices to Elastiflow

Let’s say you have an Elastiflow Docker stack set up. Out of the box it pushes all flow data into an index named “elastiflow-<version>-<Year>.<Month>.<Day>”. What if you want to use the same ELK stack for Elastiflow AND other data sources?

This is possible, of course!

Clone the elastiflow git repo

cd into the repo
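The first two steps look like this (robcowart/elastiflow is assumed to be the upstream repo; substitute your own fork if you use one):

```shell
# Clone the Elastiflow repo and enter it
git clone https://github.com/robcowart/elastiflow.git
cd elastiflow
```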

Add a new input by creating logstash/elastiflow/conf.d/10_input_syslog.conf. For example, to bring in syslog over UDP as JSON:

input {
  udp {
    host => "0.0.0.0"
    port => 10514
    codec => "json"
    type => "rsyslog"
    tags => ["rsyslog"]
  }
}

filter { }

Modify logstash/elastiflow/conf.d/30_output_10_single.logstash.conf


output {
  if "rsyslog" in [tags]  {
    elasticsearch {
      user => "${ELASTIFLOW_ES_USER:elastic}"
      password => "${ELASTIFLOW_ES_PASSWD:changeme}"
      hosts => [ "172.10.4.1:9200" ]
      index => "logstash-%{+YYYY.MM.dd}"
      template => "${ELASTIFLOW_TEMPLATE_PATH:/etc/logstash/elastiflow/templates}/logstash.template.json"
      template_name => "logstash-1.0.0"
    }
  } else {
    elasticsearch {
      id => "output_elasticsearch_single"
      hosts => [ "${ELASTIFLOW_ES_HOST:127.0.0.1:9200}" ]
      ssl => "${ELASTIFLOW_ES_SSL_ENABLE:false}"
      ssl_certificate_verification => "${ELASTIFLOW_ES_SSL_VERIFY:false}"
      # If ssl_certificate_verification is true, uncomment cacert and set the path to the certificate.
      #cacert => "/PATH/TO/CERT"
      user => "${ELASTIFLOW_ES_USER:elastic}"
      password => "${ELASTIFLOW_ES_PASSWD:changeme}"
      index => "elastiflow-3.5.3-%{+YYYY.MM.dd}"
      template => "${ELASTIFLOW_TEMPLATE_PATH:/etc/logstash/elastiflow/templates}/elastiflow.template.json"
      template_name => "elastiflow-3.5.3"
      template_overwrite => "true"
    }
  } 
} 

Rebuild the image:

docker build --tag logstash-elastiflow-custom:1.0 .
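For the stack to actually run the rebuilt image, the logstash service in your docker-compose.yml must point at the new tag and expose the new UDP port. A minimal sketch (the service name and any other settings are assumptions — keep whatever else your compose file already defines for the service):

```yaml
services:
  logstash:
    image: logstash-elastiflow-custom:1.0   # the tag built above
    ports:
      - "10514:10514/udp"                   # expose the new syslog input
```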

Now bring up your stack, e.g. “docker-compose up -d”

Now let’s test it. We can generate a new syslog message by, say, SSHing into the syslog server. The server then logs the following:

Mar 31 08:37:37 zoobie-2-1 sshd[2625]: Accepted publickey for magplus from 172.10.4.32 port 61811 ssh2: RSA SHA256:2dui2biubddjwbdjbd
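Before heading to Kibana, you can confirm the document landed by asking Elasticsearch directly (the host and credentials here are assumptions matching the output config above):

```shell
# List logstash-* indices; a logstash-YYYY.MM.dd index should appear
# shortly after the first forwarded syslog message arrives
curl -s -u elastic:changeme 'http://172.10.4.1:9200/_cat/indices/logstash-*?v'
```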

In Kibana, go to Management and create an index pattern; you should see a new logstash index to match against. Add it, then view the index in the Discover view. A document should look like this:

node.hostname:elk.myhomenet.sys
node.ipaddr:172.10.4.1
event.type:rsyslog
event.host:syslogserver.myhomenet.sys
@version:3.5.3
facility:auth
@timestamp:Mar 31, 2020 @ 08:24:52.000
sysloghost:zoobie-2-1
severity:info
programname:sshd
procid:2575
logstash_host:syslogserver.myhomenet.sys
tags:rsyslog
message:Accepted publickey for magplus from 172.10.4.32 port 61736 ssh2: RSA SHA256:2dui2biubddjwbdjbd
_id:IJ6xanIBxE6Ab_zHIO3i
_type:_doc
_index:logstash-2020.03.31
_score:0

And there you have it!

NOTE: This example does not cover setting up syslog forwarding, which is required to get syslog into Logstash. For a good walkthrough, see the DigitalOcean tutorial on syslog and Logstash.
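For completeness, here is a minimal sketch of what that forwarding can look like on the rsyslog side, assuming rsyslog 7+ on the client and 172.10.4.1 as the Logstash host (the file path and the exact set of fields are assumptions, chosen to line up with the "json" codec on the input and the fields shown in Discover above):

```
# /etc/rsyslog.d/60-logstash.conf (path is an assumption)
# Serialize each message as one JSON object so the "json" codec
# on the Logstash input can parse it.
template(name="json-lines" type="list") {
  constant(value="{")
  constant(value="\"@timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
  constant(value="\",\"sysloghost\":\"")  property(name="hostname")
  constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
  constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
  constant(value="\",\"programname\":\"") property(name="programname")
  constant(value="\",\"procid\":\"")      property(name="procid")
  constant(value="\",\"message\":\"")     property(name="msg" format="json")
  constant(value="\"}")
}

# Forward everything over UDP to the Logstash input on port 10514
*.* action(type="omfwd" target="172.10.4.1" port="10514" protocol="udp" template="json-lines")
```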