In a previous post, I discussed setting up the ELK stack on Azure, which is a prerequisite for using Filebeat.
To ship logs from a Linux server to Elasticsearch, we can use a tool called Filebeat (not sure where that fits in "ELK"?)
On the server you want to monitor, you'll need to download the appropriate distribution for your platform (Windows and Linux are supported).
For instance, for me this was:
curl -L -O https://download.elastic.co/beats/filebeat/filebeat_1.0.1_i386.deb
sudo dpkg -i filebeat_1.0.1_i386.deb
Then we need to change a couple of settings in the Filebeat configuration: the host we're shipping logs to, and the index name, which depends on the index pattern you chose when you installed Kibana. For me, this meant adding the setting index: "logstash".
sudo vi /etc/filebeat/filebeat.yml
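The relevant parts of filebeat.yml end up looking roughly like this (a sketch of the Filebeat 1.x layout; the hostname is mine and port 8080 is the Nginx proxy we set up further down, so substitute your own values):

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/*.log   # which files to ship
output:
  elasticsearch:
    hosts: ["elk-me3u257f.cloudapp.net:8080"]
    index: "logstash"      # match the index pattern you gave Kibana
```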
Then, you can start the service:
sudo /etc/init.d/filebeat start
If you want to see Filebeat's own logs, by default they go to syslog:
sudo tail -f /var/log/syslog
However, up to this point Filebeat still can't connect: it writes directly to Elasticsearch, so you'll have to open another port through the Azure firewall – let's make this 8080. The Azure UI lets you filter by CIDR range, but I found it awkward for locking the port down to just the specific IPs of the servers shipping logs, which adds an additional problem we'll solve below.
Fronting these servers with Nginx seems to be quite popular, so we'll do that, back on our Elasticsearch server:
sudo apt-get install nginx
Edit the nginx config file:
sudo vi /etc/nginx/nginx.conf
In the “http” block, add a proxy server:
server {
    listen 8080;
    allow 173.255.224.150;
    deny all;   # without this, the allow line blocks nothing

    location / {
        proxy_pass http://localhost:9200;
    }
}
Obviously you should change the IP – this lets you restrict who can write logs. Filebeat also claims to support Basic Auth, although I haven't tried it. I notice in the forums that there's discussion of adding more robust security options (e.g. Kerberos), but for now most people use whatever Nginx offers.
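If you do want Basic Auth in front of the proxy, Nginx reads htpasswd-format files. A sketch of generating one (the username "beats" and password "s3cret" are placeholders, and the auth_basic directives go inside the server block above):

```shell
# Create an htpasswd-style entry using openssl's apr1 (MD5) scheme,
# which nginx's auth_basic module understands.
# "beats" / "s3cret" are placeholder credentials -- use your own.
entry="beats:$(openssl passwd -apr1 s3cret)"
echo "$entry" > htpasswd   # then move this to e.g. /etc/nginx/.htpasswd

# And in the nginx server block:
#   auth_basic           "Elasticsearch";
#   auth_basic_user_file /etc/nginx/.htpasswd;
```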
You should also set up SSL, but this is sufficient for testing purposes.
From the remote server, you should be able to test connectivity:
curl http://elk-me3u257f.cloudapp.net:8080
Which will give you this:
{
"name" : "Phantom Eagle",
"cluster_name" : "elasticsearch",
"version" : {
"number" : "2.1.1",
"build_hash" : "40e2c53a6b6c2972b3d13846e450e66f4375bd71",
"build_timestamp" : "2015-12-15T13:05:55Z",
"build_snapshot" : false,
"lucene_version" : "5.3.1"
},
"tagline" : "You Know, for Search"
}
Before it can ship anything, Filebeat also needs to upload some information about the structure of its logs in advance (an index template) – run this from the server being monitored:
curl -XPUT 'http://elk-me3u257f.cloudapp.net:8080/_template/filebeat?pretty' \
-d@/etc/filebeat/filebeat.template.json
Which gives this:
{
"acknowledged" : true
}
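For the curious, an index template is just a JSON document that Elasticsearch applies to any new index whose name matches a pattern. A stripped-down sketch of the shape (not the actual contents of filebeat.template.json):

```json
{
  "template": "filebeat-*",
  "mappings": {
    "_default_": {
      "properties": {
        "message": { "type": "string", "index": "analyzed" }
      }
    }
  }
}
```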
Then restart Filebeat, and everything should start working:
sudo /etc/init.d/filebeat restart
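Once the service is back up, you can check that events are actually arriving by listing the indices through the proxy (same hostname as before) – an index matching your configured name should appear and grow:

```shell
curl 'http://elk-me3u257f.cloudapp.net:8080/_cat/indices?v'
```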