It's a challenge to log messages from a Lambda, given that there is no server to run agents or forwarders (Splunk, Filebeat, etc.) on. Here is a quick and easy tutorial for setting up ELK logging by writing directly to Logstash via Logback's TCP appender. This is for a Java/Maven-based Lambda.

Make sure the Lambda is running in the right subnet and has the right security group(s) to be able to reach the Logstash server and port.
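Since the post uses the Serverless Framework, the VPC wiring can be declared directly in serverless.yml. A minimal sketch, where the subnet and security group IDs are placeholders you would replace with your own:

```yaml
# serverless.yml (sketch) -- IDs below are placeholders
provider:
  name: aws
  runtime: java8
  vpc:
    securityGroupIds:
      - sg-xxxxxxxx       # must allow egress to the Logstash TCP port
    subnetIds:
      - subnet-xxxxxxxx   # subnet with a network route to the Logstash server
```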

We will use two appenders - the STDOUT appender will send the logs to CloudWatch, just in case the Logstash TCP connection does not work.

Add this file to the src/main/resources folder as logback.xml.

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
            <!-- Replace the destination with the logstash server:logstash port -->
            <!-- There can be multiple destinations -->
            <destination>your-logstash-host:5000</destination>
            <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
        </appender>
        <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
            <layout class="ch.qos.logback.classic.PatternLayout">
                <Pattern>%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n</Pattern>
            </layout>
        </appender>
        <root level="DEBUG">
            <appender-ref ref="stash" />
            <appender-ref ref="STDOUT" />
        </root>
    </configuration>

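The stash appender and its encoder come from the logstash-logback-encoder library, so the pom.xml needs dependencies along these lines (the version numbers are illustrative; pick the current releases):

```xml
<!-- pom.xml (sketch) -- versions are illustrative -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.11</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version>
</dependency>
```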
The Handler method

In the handler method, reset the LoggerContext and reload the configuration. I know this sounds super odd, but when invoking the Lambda multiple times, I noticed logs missing from Kibana. Since I do not know exactly how AWS recycles Lambda execution environments, I suspect it has something to do with the Logback context being left in limbo. I even noticed some executions timing out (I had a 60s timeout) but somehow still making it to Kibana as unique invocations. If you've made it work any other way, please mention it in the comments section below. I am using the Serverless Framework.

     private final Logger logger = LoggerFactory.getLogger(this.getClass());

     public Void handleRequest(SNSEvent event, Context context) {
         LoggerContext loggerContext = (LoggerContext) LoggerFactory.getILoggerFactory();
         JoranConfigurator config = new JoranConfigurator();
         config.setContext(loggerContext);
         try {
             // Reset the context and reload logback.xml so a recycled
             // execution environment gets a fresh, working TCP appender
             loggerContext.reset();
             config.doConfigure(getClass().getResourceAsStream("/logback.xml"));
             // Process the SNS records (logging the message here as an example)
             event.getRecords().forEach(x -> logger.info(x.getSNS().getMessage()));
         } catch (JoranException e) {
             logger.error("Cannot initialize logger context ", e);
         }
         return null;
     }

The finally block

Make sure the code stops the logging context before the JVM dies, so the TCP appender can flush any pending events.

    } finally {
        LoggerContext loggerContext = (LoggerContext) LoggerFactory.getILoggerFactory();
        loggerContext.stop();
    }

Testing locally with Docker-ELK

Install the ELK stack on Docker locally via docker-elk and create the default index. Logstash will listen on port 5000 for TCP input.

Logstash TCP receiver

As a final change, locate $LOGSTASH_HOME/pipeline/logstash.conf and change the input to include the json_lines codec. This is needed to ensure that one message does not end up embedded inside another.

input {
        tcp {
                port => 5000
                codec => json_lines
        }
}
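To see why json_lines is the right codec here: the TCP appender frames one JSON document per newline, and json_lines tells Logstash to split the stream on those newlines so each document becomes a separate event. A small self-contained sketch (a local ServerSocket stands in for the Logstash tcp input; the JSON payloads are made up):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class JsonLinesDemo {
    public static void main(String[] args) throws Exception {
        // Ephemeral-port server standing in for the Logstash tcp input
        try (ServerSocket server = new ServerSocket(0)) {
            Thread client = new Thread(() -> {
                try (Socket s = new Socket("localhost", server.getLocalPort());
                     OutputStream out = s.getOutputStream()) {
                    // One JSON document per line, like the TCP appender sends
                    out.write("{\"message\":\"first event\"}\n".getBytes(StandardCharsets.UTF_8));
                    out.write("{\"message\":\"second event\"}\n".getBytes(StandardCharsets.UTF_8));
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            client.start();
            List<String> events = new ArrayList<>();
            try (Socket conn = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    // json_lines splits the byte stream on newlines: one event each
                    events.add(line);
                }
            }
            client.join();
            System.out.println(events.size() + " events");
        }
    }
}
```

Without newline-delimited framing, the two documents would arrive as one blob and the second message would end up embedded inside the first event.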

Restart Logstash, run the Lambda, and search for the message in Kibana (port 5601 if you're using docker-elk).

Here is a sample output -

    October 10th 2017, 23:16:55.273
    {"@timestamp":"2017-10-10T06:16:55.018+00:00","@version":1,"message":"Testing ELK Logging with Logstash and Lambda ","logger_name":"com.test.serverless.logging.Handler","thread_name":"main","level":"INFO","level_value":20000}