When it comes to log management and log management solutions, there is one name that always pops up – the Elastic Stack, formerly known as the ELK Stack. But what is the Elastic Stack, and what makes it so good that millions of people prefer it over any other log management platform – even over the historical leader, Splunk?
In this ELK stack tutorial, we answer that and more – from what the ELK stack is to how to install and configure it, how to use it for analysis, common use cases, and best practices. Keep on reading to find out how ELK works, why you need it, and how you can leverage it to manage massive amounts of log data and extract valuable insights that improve your business operations.
What Is the ELK Stack?
The ELK stack is an acronym used to describe a collection of three open-source projects – Elasticsearch, Logstash, and Kibana. Elasticsearch is a full-text search and analytics engine. Logstash is a log aggregator that collects and processes data from multiple sources, converts, and ships it to various destinations, such as Elasticsearch. And finally, Kibana provides a user interface, allowing users to visualize, query, and analyze their data via graphs and charts.
Recently, however, a fourth project was added to the mix – Beats – which led to the stack being rebranded as the Elastic Stack. Beats is a family of lightweight data shippers that collect and send data from different machines and systems to the stack, in this case, to Logstash or Elasticsearch.
Although all four are independent projects run by Elastic, they were designed to complement each other and work together as an end-to-end log analysis solution.
Thus, ELK is a log management platform that works by enabling you to gather massive amounts of log data from anywhere across your infrastructure into a single place, then search, analyze and visualize it in real time. Among the most common ELK use cases, we can name monitoring, troubleshooting, web analytics, risk management, business intelligence, compliance, fraud detection and security analysis.
Want to avoid managing the ELK stack and focus instead on gathering insights from your logs?
We offer ELK as a service, in the cloud or on-prem, so you don't have to manage Elasticsearch and its infrastructure or pay expensive consultants.
A Short Recap: Why Is Log Management Important?
Competitors are always ready to grab one of your unhappy customers. And while it's easy to lose clients, it's getting harder and harder to make sure apps are available, performant, and secure at all times.
You can get the necessary information by analyzing logs. However, the architecture of the environments generating these logs has evolved into microservices, containers, and orchestration infrastructure deployed on the cloud, across clouds or in hybrid clouds. As you move more and more of your IT infrastructure to the cloud, you build highly distributed and dynamic environments that are more difficult to monitor. Data is everywhere across your infrastructure, in different formats and sometimes, even difficult to locate and manage.
That's where log management tools such as the ELK stack come in. Their main capabilities – collection, aggregation, search and analysis, monitoring and alerting, and visualization and reporting – help SREs, IT operations, and DevOps teams keep an eye on application and infrastructure performance, gather meaningful insights, and make better data-driven decisions.
Why Is the ELK Stack So Popular?
The ELK stack is popular because it fulfills a real need in the log management and analytics space. In cloud-based environments, performance isolation is extremely difficult to achieve. Specific loads, infrastructure servers, environments, and the number of active users are just a few of the factors that influence the performance of virtual machines in the cloud. ELK and similar platforms can help with such infrastructure problems, as well as with operating system logs – Linux log files, for example – and NGINX and IIS server logs.
ELK has definitely set itself apart as one of the best log management solutions by constantly improving the stack to meet customers' demands, as evidenced by the recent addition of Beats. ELK is open-source and, as one of the oldest tools available, it has gathered a large community of enthusiasts that drive innovation and new features and offer help when needed. Not to mention, with an open-source tool, you don't depend on a vendor.
And lastly, ELK is simply a powerful platform. Simple and robust, it can manage large volumes of data and scale further as data grows without any bumps in performance. Compared to Splunk, it has fewer features, but chances are you don't need all the analytical capabilities Splunk offers to do your job. You can do just as well with ELK.
On the other hand, you can upgrade from the free and open-source version to a paid one that makes ELK just as pricey. It's called Elastic Stack Features (formerly X-Pack) and extends the basic setup with additional capabilities. However, just as ELK is a great Splunk alternative, there are some great alternatives for each “Elastic Stack Features” component.
ELK Logging: How to Use the Elastic Stack for Log Management, Analysis & Analytics
Depending on the use case and environment, businesses require different logging architectures.
For small environments, the classic ELK stack architecture is more than enough: Logstash collects and processes the logs, Elasticsearch indexes and stores them, and Kibana provides the visualization layer on top.
On the other hand, when you work with massive amounts of data, you will more than likely need additional components. For example, you may want to use Apache Kafka as a buffer between your log shippers and Logstash.
A full production-grade architecture will, more than likely, have multiple Elasticsearch nodes, maybe even multiple Logstash instances, an alerting plugin, and an archiving mechanism.
That is why, before setting up your stack, you should be clear about your use case. This will influence where and how you install the stack, how you configure your Elasticsearch cluster, how you allocate resources, and more.
What Is Elasticsearch?
You've probably heard of Elasticsearch before the ELK, right? That's because Elasticsearch is the most popular search engine available today and, practically, the heart of the Elastic Stack; so much so, that people use it as a synonym for the name of the stack itself.
Elasticsearch is a free and open-source search and analytics engine based on the Apache Lucene library, first released in 2010. It's equipped with a rich and powerful HTTP RESTful API that enables you to perform fast searches in near real time. Elasticsearch is developed in Java and supports clients in many different languages, such as PHP, Python, C#, and Ruby.
In the context of using ELK as a tool for log management and analytics, Elasticsearch is in charge of indexing and storing data. You can read more about Elasticsearch in our Elasticsearch tutorial, from basic concepts to how it works, the benefits of using Elasticsearch, and use cases.
And since you're starting with the ELK, check out this presentation where our colleagues cover how to do log analysis with Elasticsearch and what you shouldn't do when working with Elasticsearch in Top 10 Elasticsearch mistakes.
When getting started with Elasticsearch, one of the first things you should dive into is the query syntax as it will be of great help along the way. Learn more about queries in our Elasticsearch cheat sheet, as well as other core Elasticsearch operations such as index creation, deletion, mapping manipulation, and more.
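To give a feel for the query syntax, here's a minimal sketch of an Elasticsearch Query DSL body. The index name ("app-logs") and field names are illustrative assumptions, not from this article; the same JSON body would be sent to POST /app-logs/_search over the REST API.

```python
import json

# A hypothetical query: full-text search for "timeout" in log messages,
# restricted to the last hour. Index and field names are made up.
query = {
    "query": {
        "bool": {
            # Scored full-text match on the log message...
            "must": [{"match": {"message": "timeout"}}],
            # ...combined with a non-scoring time-range filter.
            "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
        }
    },
    "size": 20,  # return at most 20 hits
}

print(json.dumps(query, indent=2))
```

The `bool` query is the usual way to combine full-text clauses (`must`) with exact or range conditions (`filter`) in a single request.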
Looking for a fully managed ELK solution with powerful search and filtering capabilities?
Sematext Logs enables you to query, filter, and analyze log data with fast and intuitive search to detect and fix issues before they impact your business.
Elasticsearch REST API
Since you're getting started with Elasticsearch, you should get familiar with its most common APIs – the Document API, Search API, Indices API, Cluster API, and cat API.
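As a quick orientation, here's a sketch of what a typical request to each of these APIs looks like. The index name ("app-logs") is a hypothetical example; the endpoint paths themselves are standard Elasticsearch ones.

```python
# Representative HTTP verb + path for each of the common Elasticsearch APIs.
# The "app-logs" index name is illustrative.
examples = {
    "Document API": ("PUT", "/app-logs/_doc/1"),   # index a single document
    "Search API":   ("GET", "/app-logs/_search"),  # run a query
    "Indices API":  ("PUT", "/app-logs"),          # create an index
    "Cluster API":  ("GET", "/_cluster/health"),   # check cluster status
    "cat API":      ("GET", "/_cat/indices?v"),    # human-readable index listing
}

for api, (verb, path) in examples.items():
    print(f"{verb:5}{path:24} # {api}")
```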
Last but not least, Elasticsearch functionality can be extended with plugins to better suit your needs. There are many types of plugins you may care about, such as alerting, analysis, API extension, discovery, ingest, management, mapper, security, snapshot/restore, and store plugins. You can learn how to install them here.
Now you know what Elasticsearch does, but not how you get data to Elasticsearch. Enter Logstash.
What Is Logstash?
You can't do log analysis on unstructured logs. Or rather, you can, but only at great expense of time and energy. That's why tools like Logstash are indispensable in the log management and analytics space.
Logstash is a free and open-source log aggregator and processor that works by reading data from many sources and sending it to one or more destinations for storage or stashing – in this case, when using ELK for data analytics, to Elasticsearch. However, along the way, data is processed by filtering, massaging and shaping it to reach a uniform and structured view. Logstash is equipped with ready-made inputs, filters, codecs, and outputs, to help you extract relevant, high-value data from your logs.
Similar to Elasticsearch, Logstash too has a rich library of plugins, allowing it to collect, convert, and enrich various log types, from system logs to web server logs, error logs, and app logs.
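As a sketch, a minimal Logstash pipeline tying these pieces together might look like the following. File paths, the index name, and the Elasticsearch host are assumptions for illustration; the plugins themselves (file input, grok and date filters, elasticsearch output) are standard.

```conf
input {
  file {
    # Tail an Apache-style access log (path is illustrative)
    path => "/var/log/apache2/access.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse the combined access-log format into structured fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # Use the request timestamp as the event's @timestamp
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs-%{+YYYY.MM.dd}"
  }
}
```

The daily index pattern in the output is a common convention that makes retention and archiving easier to manage.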
Learn more about how to install and use Logstash from our Logstash tutorial, where we also talk about Logstash monitoring, best practices, and walk you through a Logstash configuration example to help you understand the basics in under 5 minutes!
Logstash is one of the best and easiest-to-use logging tools, but there are other good options available too, such as Fluentd, rsyslog, syslog-ng, or Filebeat, which we discuss in our article about Logstash alternatives. Logagent is our own modern, open-source, lightweight data shipper that ships logs to Elasticsearch – you can see how Logagent compares to other log shippers.
If you want to learn more about Logstash and how it works, you might also be interested in:
- Parsing and centralizing Elasticsearch logs with Logstash
- How to: Logstash to Kafka to rsyslog
- Sending your Windows event logs to Sematext using NxLog and Logstash
- Handling multiline stack traces with Logstash
- Elasticsearch ingest node vs. Logstash performance
- Recipe: Reindexing Elasticsearch documents with Logstash
At this point, your log data is collected and stored in a single location, but you can't use it since you can't see it, much less monitor or query it. At least, not without Kibana.
What Is Kibana?
Kibana is a free and open-source analysis and visualization layer that works on top of Elasticsearch and Logstash. It's actually the preferred choice for visualizing logs stored in Elasticsearch. Kibana makes it really easy to search, analyze, and visualize large volumes of data, as well as to detect trends and patterns. The dashboard features various interactive charts and allows for customization, depending on what team in your company uses it – yes, ELK logging is useful for BizOps as well!
- Securing Elasticsearch and Kibana with Search Guard for free
- How to ship Kibana server logs to Elasticsearch
- Recipe: rsyslog + Elasticsearch + Kibana
What Is Beats?
Beats is a family of specialized, lightweight data shippers. Think of them as the opposite of Logstash, which takes more resources but is feature-rich. While Logstash has lots of built-in inputs, here you have a different Beat for each supported input. For example:
- Filebeat can read log files or syslog
- Metricbeat can fetch metrics, like CPU or memory (system metrics), or Elasticsearch query time (application-specific metrics)
- Packetbeat can capture network traffic to generate protocol-specific metrics
- Winlogbeat can read from the Windows Event Log
- Auditbeat can read audit entries from the Linux Audit Framework
- Functionbeat can run as a serverless (e.g. AWS Lambda) function to read from cloud-specific log stores (e.g. AWS Cloudwatch)
Other beats are available, too, both official and community-based. All of them share a library (libbeat) which provides common functionality: minor parsing and buffering, and support for a limited set of destinations, mainly Logstash, Elasticsearch, Kafka and Redis.
You would typically use Beats in a simple setup with just Elasticsearch and Kibana. There, you can rely on Elasticsearch's Ingest feature for parsing. A more complex setup will have Beats push data to a centralized place for buffering and parsing, which may include Kafka and Logstash.
Because of their straightforward functionality, Beats configuration tends to be pretty simple as well: you define the source and the destination for your data, and not much else. You'll find some concrete examples on how to configure Beats in the links below.
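To illustrate just how little there is to it, here's a minimal Filebeat sketch; the log paths and hosts are assumptions for illustration.

```yaml
# Minimal Filebeat configuration sketch (paths and hosts are illustrative).
filebeat.inputs:
  - type: filestream
    paths:
      - /var/log/myapp/*.log

# Ship to Logstash for heavier parsing...
output.logstash:
  hosts: ["localhost:5044"]

# ...or directly to Elasticsearch instead (only one output may be enabled).
#output.elasticsearch:
#  hosts: ["localhost:9200"]
```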
Some Beats also have modules. For example, you can use the Apache Filebeat module to tail Apache access and error logs, parse them through some predefined Ingest pipelines, index them to Elasticsearch via predefined index templates, and finally explore them through predefined Kibana dashboards. Similarly, you'll use the Apache Metricbeat module to fetch, index and explore metrics such as the number of active connections.
- Monitoring Linux Audit Logs with auditd and Auditbeat
- Tutorial: sending logs to Sematext Logs with Filebeat
- Performance benchmark between Logstash and Filebeat + Elasticsearch Ingest
ELK Stack Use Cases & Applications
You've probably caught on that the ELK stack is most commonly used as a log analysis tool for various purposes – from monitoring and troubleshooting to security and compliance, SEO, and business intelligence.
As already mentioned, make sure you have a very good understanding of your use case before setting up your ELK stack. Here are a few of the things the ELK stack is most used for:
Development and troubleshooting
While log management is great for monitoring performance and troubleshooting, you can actually leverage it earlier than that, in the development phase of the application's lifecycle.
When implementing logging in the code, developers can correlate, identify, and solve errors and exceptions in testing or staging, before pushing to production. You collect logs with ELK and ship them to a centralized location; once there, the Kibana dashboards make it easier and faster for you to monitor, analyze, and troubleshoot.
Medium, a platform that receives 25 million unique readers and thousands of published posts per week, uses the ELK stack to debug their production issues.
Application Performance Monitoring (APM)
With modern apps, it's critical to monitor performance metrics for each component in your architecture. How does the ELK fit into the picture?
Although it wasn't designed to store metrics, Elasticsearch is used by many for that specific purpose. You can ship such data to Elasticsearch or Logstash with Metricbeat, thus making ELK an alternative to other basic APM tools. If you need to look even deeper into application performance, you can use open source distributed tracing tools such as Jaeger and Zipkin or commercial ones like Sematext Tracing.
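For instance, here's a sketch of a Metricbeat configuration shipping basic system metrics straight to Elasticsearch; the host is an illustrative assumption.

```yaml
# Minimal Metricbeat configuration sketch (host is illustrative).
metricbeat.modules:
  - module: system
    metricsets: ["cpu", "memory"]  # collect CPU and memory usage
    period: 10s                    # sample every 10 seconds

output.elasticsearch:
  hosts: ["localhost:9200"]
```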
LinkedIn is one of the big names that use ELK to monitor performance. Moreover, they integrated it with Kafka to support their load in real time.
Security and Compliance
With the number of cyberattacks increasing, companies are forced to meet more and more compliance regulations such as HIPAA, FISMA, SOC, or PCI. As such, they need to have a reliable security mechanism in place.
The ELK stack brings together the data that paints a clear picture of your overall IT security, more specifically, who does what with your app or system in real time. For example, you can aggregate Linux audit logs in Elasticsearch to monitor systems for suspicious activity. This makes ELK great for SIEM.
Netflix uses ELK for security purposes, as well as for monitoring and analyzing customer service-related operations.
Interested in a solution that can ensure top-notch security for your cluster?
Sematext Logs is a hosted ELK solution featuring scheduled reports, alerting, anomaly detection to help you avoid and prevent any kind of threat.
Cloud Monitoring
Unlike conventional environments, cloud and hybrid environments are multilayered and distributed, which makes them much more difficult to manage. This raises questions such as how to access each machine, how to collect and process the data it generates, where to store that data and for how long, and how to analyze, secure, and back it up.
ELK, with all its four components, can cover all of these. Beats is installed on each machine to forward data to Logstash. Logstash normalizes it and ships it to Elasticsearch, where it is indexed and stored. And, finally, Kibana's dashboards and visualizations help you analyze, detect anomalies, troubleshoot, and prevent security-related issues.
Business Intelligence (BI)
Business intelligence refers to the process of leveraging technologies, software, applications, tools, and best practices to turn raw data into actionable insights that help make better data-driven business decisions and improve performance and collaboration.
The ELK stack is great for processing big data. It collects raw data from multiple sources such as supply chain and manufacturing data, databases, personnel records, sales, and marketing campaigns. Understanding your customers, their online behavior, and how they access your website are just a few of the benefits of using Elastic logging for this purpose.
By using the ELK stack, you can leverage web server access logs to increase the relevancy and visibility of your site. They allow you to see who's visiting your website – both people and the bots search engines use to crawl the site. Keeping an eye on bots belonging to Google, Yahoo, Baidu, Yandex, and other such platforms helps SEO specialists detect when bots crawled the site, optimize the crawl budget, monitor website errors and faulty redirects, and more.
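The kind of bot detection described above boils down to inspecting the user-agent field of each access-log entry. Here's a minimal sketch in Python; the log line and the bot signature list are illustrative, and in a real ELK setup this extraction would be done by a grok filter rather than hand-rolled code.

```python
import re

# A hypothetical access-log line in the combined format (values are made up).
line = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /blog HTTP/1.1" '
        '200 5123 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

# In the combined format, the user agent is the last quoted field.
user_agent = re.findall(r'"([^"]*)"', line)[-1]

# Crude crawler check against a few well-known bot signatures.
BOTS = ("Googlebot", "bingbot", "YandexBot", "Baiduspider")
is_bot = any(bot in user_agent for bot in BOTS)
print(is_bot)  # True for the Googlebot line above
```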
Even though ELK is mostly used for log aggregation and log analysis, one of the use cases for the stack can also be generic search – searching through a website, for example – by leveraging its inverted index.
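An inverted index simply maps each term to the documents that contain it, which is what makes lookups fast regardless of how many documents you have. Here's a toy sketch of the idea (Elasticsearch's real implementation, via Lucene, adds analysis, scoring, and much more):

```python
from collections import defaultdict

# Toy document store: doc id -> text.
docs = {
    1: "error connecting to payment service",
    2: "payment accepted",
    3: "connection timeout while connecting",
}

# Build the inverted index: term -> set of doc ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

# Search is a lookup, and multi-term search is a set intersection.
print(sorted(index["payment"]))                      # [1, 2]
print(sorted(index["connecting"] & index["error"]))  # [1]
```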
GitHub, Wikipedia, and Stack Overflow are just a few of the companies that use ELK for their search and filtering features.
Using Sematext as a Hosted ELK Solution
We offer ELK as a service to make it easier for you to focus on productive work instead of managing the stack. Just ship your logs to Sematext Logs and it will make them accessible for you in real-time via a simple and intuitive user interface.
You can see logs and errors as they stream in from however many data sources you may have. Sematext Logs handles massive amounts of data without blinking an eye. It further allows you to set up alerts on both logs and metrics. On top of that, there's anomaly detection, which also works across both logs and metrics.
Sematext Logs adds a security blanket around your logs with TLS/SSL encryption and multi-user RBAC (Role-Based Access Control) which allows you to give access to an unlimited number of users, as well as revoke their privileges when you need to.
Sematext Logs features sophisticated searching and filtering capabilities along with syslog support to make it easier for you to identify and troubleshoot issues before they affect your users, and spot opportunities to drive business growth.
What Do You Choose: Open Source ELK Stack or Commercial Tools?
As your company grows, so does the volume of data. If you're looking for a good, scalable, and affordable log management and analysis solution to help make sense of your logs, the ELK stack is the one for you. It has impressive features that can very well compete with those of commercial tools. Not to mention, you may not even need them.
Commercial tools, like Sematext Cloud, give you a fast start. You don't have to learn or worry about Elasticsearch or any of the features on top, such as access control. This way, you have more time to grow your business.
On the other hand, if ELK is closer to the core of your business, it might be worth growing and scaling your own ELK deployment, along with the operations team. To help you on this route, Sematext offers ELK training, consulting, and production support.
Nevertheless, remember that your organization's requirements and your use case dictate the kind of tool you need and, if you choose ELK, the architecture setup as well.