The ELK stack (Elasticsearch, Logstash, Kibana) is becoming more popular as a solution for centralised logging due to its simplicity and powerful features. It's easy to set up, integrate and use.
Locally, instead of installing Elasticsearch and Kibana, we can use Docker. This lets us easily spin up and tear down both services without having to install them on the machine.
Docker is also a good candidate for production, but there are out-of-the-box solutions which deal with a lot of the complexity for you, like managing the Elasticsearch cluster; see AWS's Elasticsearch as a service.
A minimal local setup can be obtained using Docker Compose:
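The original compose file is not shown here; a minimal sketch could look like the following. The image versions and port mappings are assumptions, so pick versions that match your environment.

```yaml
# docker-compose.yml - a minimal sketch; image versions are assumptions
elasticsearch:
  image: elasticsearch:2.4
  ports:
    - "9200:9200"
    - "9300:9300"
kibana:
  image: kibana:4.6
  ports:
    - "5601:5601"
  links:
    # the official Kibana image looks for Elasticsearch at
    # http://elasticsearch:9200 by default, so a link with this name is enough
    - elasticsearch
```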
Please note the specified ports. Once this is in place, open the Docker Quickstart Terminal in the docker-compose.yml location and run:
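```shell
# downloads the images (on first run) and starts both containers
docker-compose up
```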
Running the Quickstart Terminal starts the default docker machine and assigns it an IP; this IP will be used to access Elasticsearch and Kibana.
If you are using a regular terminal instead, start the default machine and set the required environment variables by running:

docker-machine start default
eval $(docker-machine env)
Running docker-compose up will download the images and start the containers. At this point both Elasticsearch and Kibana should be available.
- Kibana: http://192.168.99.100:5601/
- Elasticsearch: http://192.168.99.100:9200/
Elasticsearch provides a REST API which can be very handy. I'm using Postman and a saved custom collection with useful requests; as an example, the following request will display a list of indices.
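The same request can also be made from a shell with curl; the IP below assumes the default docker-machine address from earlier.

```shell
# list all indices with column headers; _cat/indices is part of
# the Elasticsearch cat API
curl "http://192.168.99.100:9200/_cat/indices?v"
```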
Install NLog using NuGet:
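From the Visual Studio Package Manager Console this is:

```shell
Install-Package NLog
```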
Install NLog custom target:
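The custom target is not named here; assuming the community NLog.Targets.ElasticSearch package (installed via `Install-Package NLog.Targets.ElasticSearch`), a minimal NLog.config sketch could look like this. The uri and index name are assumptions based on the setup above.

```xml
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <!-- load the ElasticSearch target assembly -->
    <add assembly="NLog.Targets.ElasticSearch"/>
  </extensions>
  <targets>
    <!-- uri points at the docker-machine IP from earlier;
         the daily index name is an assumption -->
    <target name="elastic" xsi:type="ElasticSearch"
            uri="http://192.168.99.100:9200"
            index="logstash-${date:format=yyyy.MM.dd}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="elastic" />
  </rules>
</nlog>
```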
This setup allows you to try out Kibana and Elasticsearch and will get you thinking about the next steps:
- What is the logging strategy? Should we use content identifiers?
- How do we handle Elasticsearch cluster failover?
- What type of dashboards would help?
- Does it make sense to add Logstash into the equation?