Logstash logging with AWS Lambda
It's a challenge to log messages from a Lambda, given that there is no server to run agents or forwarders (Splunk, Filebeat, etc.) on. Here is a quick tutorial to set up ELK logging by writing directly to Logstash via the TCP appender and Logback. This is for a Java/Maven based Lambda.
Make sure the Lambda runs in the right subnet and has the right security group(s) to be able to reach the Logstash server and port.
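Since I am deploying with the Serverless Framework (more on that below), the VPC wiring can be declared in serverless.yml. A minimal sketch - the subnet and security group IDs are placeholders, substitute your own:

provider:
  name: aws
  runtime: java8
  vpc:
    securityGroupIds:
      - sg-0123456789abcdef0   # must allow outbound TCP to the Logstash port
    subnetIds:
      - subnet-0123456789abcdef0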
pom.xml
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.11</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version>
</dependency>
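The deployed Lambda artifact must bundle these dependencies. If you are not already producing an uber-jar, one common way is the maven-shade-plugin; a minimal sketch:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>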
logback.xml
We will use two appenders. The STDOUT appender sends the logs to CloudWatch as a fallback, in case the Logstash TCP connection does not work. Add this file to the src/main/resources folder.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <!-- Replace the destination with the logstash server: logstash port -->
        <destination>localhost:5000</destination>
        <!-- There can be multiple destinations -->
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <Pattern>
                %d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n
            </Pattern>
        </layout>
    </appender>
    <root level="DEBUG">
        <appender-ref ref="stash" />
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
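The TCP appender also supports reconnection and keep-alive settings, which are worth knowing about if Logstash restarts or the connection idles out. A sketch of the stash appender with those options - the values are examples, not recommendations:

<appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>localhost:5000</destination>
    <!-- retry the connection after 10s if it drops -->
    <reconnectionDelay>10 seconds</reconnectionDelay>
    <!-- send a keep-alive message every 5 minutes so idle connections stay open -->
    <keepAliveDuration>5 minutes</keepAliveDuration>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
</appender>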
The Handler method
In the handler method, reset the LoggerContext and reload the configuration. I know this sounds super odd, but when invoking the Lambda multiple times, I noticed logs missing from Kibana. Since AWS reuses (freezes and thaws) Lambda execution environments in ways I cannot observe, my best guess is that the Logback context is left in limbo between invocations. I even noticed some executions timing out (I had a 60s timeout) but somehow still making it to Kibana as unique invocations. If you've made it work any other way, please mention it in the comments section below. I am using the Serverless Framework.
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

private final Logger logger = LoggerFactory.getLogger(this.getClass());

@Override
public Void handleRequest(SNSEvent event, Context context) {
    // Reset the Logback context and reload logback.xml on every invocation
    LoggerContext loggerContext = (LoggerContext) LoggerFactory.getILoggerFactory();
    loggerContext.reset();
    JoranConfigurator config = new JoranConfigurator();
    config.setContext(loggerContext);
    try {
        config.doConfigure(this.getClass().getResourceAsStream("/logback.xml"));
        event.getRecords().forEach(x -> logger.info(x.getSNS().getMessage()));
    } catch (JoranException e) {
        logger.error("Cannot initialize logger context", e);
    }
finally block
Make sure the code stops the logging context before the execution environment is frozen. The TCP appender writes asynchronously on a background thread, and stopping the context flushes any events still sitting in its buffer.
    finally {
        LoggerContext loggerContext = (LoggerContext) LoggerFactory.getILoggerFactory();
        loggerContext.stop();
    }
    // handleRequest returns Void, so an explicit return is needed
    return null;
}
Testing locally with Docker-ELK
Install the ELK stack on Docker locally via docker-elk and create the default index. Logstash will listen on port 5000 for TCP input.
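Before wiring the Lambda up, it can help to verify the newline-delimited framing the TCP appender uses with plain sockets. This sketch emulates the Logstash TCP input with a local ServerSocket and reads one JSON line back; to test real connectivity, point the client Socket at your Logstash host and port 5000 instead (the class name and payload are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class TcpAppenderSmokeTest {
    public static void main(String[] args) throws Exception {
        // Stand-in for the Logstash TCP input, bound to an ephemeral port
        try (ServerSocket server = new ServerSocket(0)) {
            // One log event, written the way the TCP appender frames it:
            // a single JSON document terminated by a newline
            String json = "{\"message\":\"hello from lambda\"}\n";
            try (Socket client = new Socket("localhost", server.getLocalPort())) {
                client.getOutputStream().write(json.getBytes(StandardCharsets.UTF_8));
                client.getOutputStream().flush();
                try (Socket accepted = server.accept();
                     BufferedReader in = new BufferedReader(new InputStreamReader(
                             accepted.getInputStream(), StandardCharsets.UTF_8))) {
                    // readLine() strips the trailing newline delimiter
                    System.out.println(in.readLine());
                }
            }
        }
    }
}
```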
Logstash TCP receiver
Final change: locate $LOGSTASH_HOME/pipeline/logstash.conf and change the input to include the json_lines codec. This is needed so that each JSON log line becomes its own Logstash event, instead of ending up embedded as a string inside another message field.
input {
    tcp {
        port => 5000
        codec => json_lines
    }
}
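To see why the codec matters: LogstashEncoder writes one JSON document per line, and json_lines splits the incoming TCP stream on newlines so each document becomes its own event. A tiny illustration of that framing (the JSON payloads are made up):

```java
public class JsonLinesDemo {
    public static void main(String[] args) {
        // Two log events as the TCP appender would write them:
        // one JSON document per line
        String stream = "{\"message\":\"first\"}\n{\"message\":\"second\"}\n";

        // json_lines-style framing: split the stream on newlines
        String[] events = stream.split("\n");

        System.out.println(events.length);  // 2 separate events, not one blob
        System.out.println(events[0]);
    }
}
```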
Restart Logstash, run the Lambda, and search for the message in Kibana (port 5601 if you're using docker-elk). Here is a sample output -
@version: 1
host: 10.0.1.128
@timestamp: October 10th 2017, 23:16:55.273
message: {"@timestamp":"2017-10-10T06:16:55.018+00:00","@version":1,"message":"Testing ELK Logging with Logstash and Lambda ","logger_name":"com.test.serverless.logging.Handler","thread_name":"main","level":"INFO","level_value":20000}
port: 48,608
_id: AV8o-lTxQqZEdfFJQV0z
_type: logs
_index: logstash-2017.10.10
_score: