Introduction
Use the Elastic Stack (ELK Stack) to analyze business data and API analytics. You can use Logstash or Filebeat to process Anypoint Platform log files, insert them into an Elasticsearch database, and then analyze them with Kibana.
Elastic Stack overview
ELK stands for the three Elastic products: Elasticsearch, Logstash, and Kibana.
To understand the Elastic core products, we will use a simple architecture:

- The logs are created by an application and pushed into an AWS SQS queue.
- Logstash aggregates the logs from different sources and processes them.
- Elasticsearch stores and indexes the data so it can be searched.
- Kibana is the visualization tool that makes sense of the data.
What is Logstash?
Logstash is a data collection tool. A Logstash pipeline consists of three elements: input, filter, and output.
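For illustration, here is a minimal pipeline (not specific to this article's setup) that reads events from the console, applies no transformation, and prints each event back out:

input {
  stdin { }
}
filter {
  # No transformation here; events pass through unchanged.
}
output {
  stdout { codec => rubydebug }
}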

What is Elasticsearch?
Elasticsearch (ES) is a NoSQL database based on the Lucene search engine. ES provides RESTful APIs to search and analyze the data. It can store different data types, such as numbers, text, and geo data, whether structured or unstructured.
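For example, once documents are indexed, they can be searched over the REST API. A quick sketch, assuming Elasticsearch is running on localhost:9200 and an index named mule-sqs exists (the index used later in this article):

curl -X GET "http://localhost:9200/mule-sqs/_search?q=message:hello&pretty"

The response is a JSON document containing the matching records.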
What is Kibana?
Kibana is a data visualization tool. It helps you quickly gain insight into the data and offers capabilities such as diagrams and dashboards. Kibana reads the data stored in Elasticsearch.
Why do we need to push logs into Kibana?
MuleSoft CloudHub stores up to 100 MB of log data per application per worker, or up to 30 days of logs, whichever limit is reached first. Because of this, we cannot preserve all the logs for a long time, and searching them is laborious. So we push the logs into ELK, which can store the logs for a long time and visualize them.
How to push the logs into ELK?
Externalize the MuleSoft logs to Elasticsearch using the AWS SQS service.

Assumptions:
- You are familiar with MuleSoft and have already created a Hello World application with a MuleSoft Logger component. For the Hello World application, read here…
- An SQS queue has been created, and an API Gateway has been created in front of SQS. If you are not familiar with creating the API Gateway, refer to the AWS documentation.
Procedure:
- Configure Log4j.
- Install Logstash, Elasticsearch, and Kibana.
- Configure Logstash.
- Configure Elasticsearch.
- Configure Kibana.
- Visualize logs in Kibana.
1. Configure Log4j
Step 1. Create the application whose logs you need to push into Kibana.
Step 2. Open the log4j2.xml file of the Mule project and update the appenders as follows.

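A sketch of such an appender, using a Log4j 2 HTTP appender that posts each log event as JSON to the API Gateway endpoint fronting SQS. The appender name (sqs-appender) and the URL are placeholders; replace them with your own values:

<Appenders>
    <!-- Posts each log event as JSON to the API Gateway in front of SQS. -->
    <Http name="sqs-appender" url="https://your-api-gateway-url/log-to-sqs">
        <JsonLayout compact="true" properties="true"/>
    </Http>
</Appenders>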
Step 3. Update the appender reference for SQS (the appender name goes in the AppenderRef), as shown below.

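Assuming the appender above, the reference in the Loggers section would look like this:

<Loggers>
    <AsyncRoot level="INFO">
        <AppenderRef ref="sqs-appender"/>
    </AsyncRoot>
</Loggers>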
You have now completed the Log4j configuration needed to push the logs into Kibana.
Step 4. Deploy the project to CloudHub, and while deploying the application, disable the CloudHub logs.

Note: Once we disable the CloudHub logs, the logs will no longer appear in CloudHub; they will go to the AWS SQS service instead. If you need the logs in both places (CloudHub and SQS), update the Log4j configuration as below.

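A sketch of the dual setup, assuming the default CloudHub appender in the project's log4j2.xml is named CLOUDHUB (its attributes are omitted here) and the SQS HTTP appender from Step 2 is named sqs-appender:

<Loggers>
    <AsyncRoot level="INFO">
        <!-- Keeps logs visible in CloudHub / Runtime Manager. -->
        <AppenderRef ref="CLOUDHUB"/>
        <!-- Also sends logs to SQS via the HTTP appender. -->
        <AppenderRef ref="sqs-appender"/>
    </AsyncRoot>
</Loggers>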
2. Install Logstash, Elasticsearch, and Kibana
Logstash: The Logstash binaries are available from https://www.elastic.co/downloads. Download the Logstash installation file for your host environment (TAR.GZ, DEB, ZIP, or RPM).
Unpack the file. Do not install Logstash into a directory path that contains colon (:) characters.
Elasticsearch: The Elasticsearch Windows installation package is available from the following link.
https://www.elastic.co/guide/en/elasticsearch/reference/current/zip-windows.html
It comes with an elasticsearch-service.bat command that sets up Elasticsearch to run as a service.
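For example, to install and start the service from the Elasticsearch bin folder:

elasticsearch-service.bat install
elasticsearch-service.bat start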
Kibana: Download the .zip Windows archive for Kibana v7.13.1 from the following link.
https://artifacts.elastic.co/downloads/kibana/kibana-7.13.1-windows-x86_64.zip (Replace the version number with the latest release as needed.)
3. Configure Logstash
Step 1. Open the config folder in Logstash.
Step 2. Check whether a *.conf file is available in the config folder.

Step 3. If only the sample config file is available, ignore it and create a new file named logstash-sqs.conf.
Step 4. Update the file with the following content:
input {
  sqs {
    region => "eu-central-1"                # SQS region
    queue => "MuleSoftLogs"                 # SQS queue name
    access_key_id => "XXXXXXXXXXXXXX"       # AWS access key
    secret_access_key => "XXXXXXXXXXXXXX"   # AWS secret key
  }
}
filter {
  json {
    # Parses the incoming JSON message into fields.
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    codec => "json"
    index => "mule-sqs"
    #user => "elastic"
    #password => "changeme"
  }
}
Step 5. Go to the Logstash bin folder and find the logstash.bat file.
Step 6. Open the Logstash config file { logstash/config/logstash-sqs.conf }.
Step 7. Set the output for Elasticsearch as follows:

Note: The Elasticsearch port will be set in the next section, so set the same port here once the Elasticsearch configuration is done.
Step 8. Create the index pattern for Kibana. (Kibana requires an index pattern to access the Elasticsearch data that you want to explore. An index pattern selects the data to use and allows you to define properties of the fields. It can point to a specific index, for example your log data from yesterday, or to all indices that contain your data.)
Step 9. Run the logstash.bat file to start Logstash.
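A typical invocation from the Logstash bin folder, assuming the config file created in Step 3 lives in the config folder:

logstash.bat -f ..\config\logstash-sqs.conf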
Step 10. The default port for Logstash is 9600 (on localhost).

4. Configure Elasticsearch
Step 1. Open the Elasticsearch folder and go to the config folder.
Step 2. Open the file "elasticsearch.yml" and verify the port number. (Default port: 9200)
Note: Set the same port number in the Logstash output (Logstash -> config -> logstash-sqs.conf -> output).
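For reference, the relevant settings in elasticsearch.yml look like the following (they are commented out by default, and 9200 is used when http.port is unset):

network.host: localhost
http.port: 9200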
Step 3. Run Elasticsearch. { Elasticsearch -> bin -> elasticsearch.bat }
Step 4. Verify the Elasticsearch logs as below:

5. Configure Kibana
Step 1. Open the Kibana folder and open the kibana.yml file from the config folder.
Step 2. Verify the Kibana host and port. { Default host: localhost, Default port: 5601 }
Step 3. Verify the Elasticsearch URLs in the same file. { In the Elasticsearch config we set the host as localhost and the port as 9200; see Configure Elasticsearch, Step 2. }
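For reference, the relevant kibana.yml settings, assuming the defaults used in this article:

server.port: 5601
server.host: "localhost"
elasticsearch.hosts: ["http://localhost:9200"]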
Step 4. Run the Kibana application. { Kibana -> bin -> kibana.bat }
Step 5. Verify the Kibana logs as below:

Note: Elasticsearch should be up and running before starting Kibana. If Elasticsearch fails to run, you will not be able to run the Kibana application.
6. Visualize logs in Kibana
Step 1. Open the Kibana application and click the hamburger icon.
Step 2. Go to the Management tab -> Stack Management.
Step 3. Go to Kibana -> Index Patterns.
Step 4. Create the index pattern.

Step 5. Provide the same index name that we defined in the Logstash output.

Note: The index pattern name should match the index defined in Logstash (see Configure Logstash, Step 7). For example, with the output index mule-sqs, the pattern mule-sqs or mule-sqs* will match it.
Step 6. Configure the settings for index management as per your requirement, and create the index pattern.

Step 7. The index pattern is created successfully.
Step 8. Click the hamburger icon -> Kibana -> Discover.

Step 9. Select the valid index pattern as below:

Step 10. Run the application. Once logs are created, they will be visible in your Kibana application as below:
