
How to build an AWS CodePipeline to build and deploy a Lambda function written in Scala

After reading through many tutorials and playing around with tools and concepts, I finally managed to build an automated AWS-based deployment chain that deploys my Scala code into a Lambda.

This is what I have in the end: when I push my code changes to my GitHub repository, AWS CodePipeline reacts to the change and downloads the sources into an AWS CodeBuild environment. There everything is built using sbt, and the build artifacts are stored in S3. When that's done, AWS CloudFormation takes over to deploy the changes. This creates or updates a Lambda function to use my latest Scala code for event handling.

The code is just a basic HTTP request handler that takes a piece of JSON containing a "name" key and responds with a hello string using the key's value in the response body.
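
For example, the handler shown below turns this request body

{"name": "World"}

into this response body:

{"message": "Hi World. Have a nice day!"}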

I started to work my way up from this tutorial about running Scala in Lambda. But instead of reacting to an S3 event, I wanted the handler to react to an HTTP request with some JSON in the body. Luckily, Yeghishe provides some code to handle those requests in Scala.

So I ended up with this MyHandler.scala:

package example

import com.amazonaws.services.lambda.runtime.Context
import io.circe.generic.auto._
import io.github.yeghishe.lambda._

// handler example.MySimpleHandler::handler
// input "foo"
object MySimpleHandler extends App {
  def handler(name: String, context: Context): String = s"Hello $name"
}

case class Name(name: String)
case class Greeting(message: String)

// handler example.MyHandler
// input {"name": "Yeghishe"}
class MyHandler extends Handler[Name, Greeting] {
  def handler(name: Name, context: Context): Greeting = {
    logger.info(s"Name is $name")
    Greeting(s"Hi ${name.name}. Have a nice day!")
  }
}
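
To sanity-check the handler without deploying anything, it can be called directly from a small main. A minimal sketch (LocalTest is just a scratch object of mine, not part of the deployed code, and passing null for the Context works here since the handler body never touches it):

package example

// quick local smoke test for MyHandler
object LocalTest extends App {
  val greeting = new MyHandler().handler(Name("World"), null)
  println(greeting.message) // prints: Hi World. Have a nice day!
}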

For sbt to build this, there are some external library dependencies to manage. Including the AWS dependencies, my build.sbt now looks like this:

import sbt.Keys.libraryDependencies

javacOptions ++= Seq("-source", "1.8", "-target", "1.8", "-Xlint")

lazy val root = (project in file(".")).
  settings(
    name := "lambda-demo",
    version := "1.0",
    scalaVersion := "2.11.4",
    retrieveManaged := true,
    libraryDependencies += "com.amazonaws" % "aws-lambda-java-core" % "1.0.0",
    libraryDependencies += "com.amazonaws" % "aws-lambda-java-events" % "1.0.0",
    libraryDependencies += "com.amazonaws" % "aws-lambda-java-log4j" % "1.0.0",
    libraryDependencies += "io.github.yeghishe" %% "scala-aws-lambda-utils" % "0.0.2"
  )

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

This adds three Amazon libs to let the code run in the AWS environment, plus Yeghishe's lib for the request handling. I also had to add a plugins.sbt into the project folder under the root, containing just the line

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

This gives my sbt the ability to execute an assembly command, which builds a jar containing my code and all the needed libs. At this point I was able to upload my jar directly into Lambda and run it through the web interface.
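
If you prefer the command line over the web interface, the same manual roundtrip can be scripted with the AWS CLI. A sketch, assuming a Lambda function named hello-scala already exists (the name is just a placeholder):

sbt assembly
aws lambda update-function-code \
    --function-name hello-scala \
    --zip-file fileb://target/scala-2.11/lambda-demo-assembly-1.0.jar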

But I wanted a fully automated build pipeline. Following this Amazon tutorial, I created a CodePipeline with four steps. In general everything was straightforward from here, but there were still some traps for me to fall into, since the tutorial assumed JavaScript code to be executed in Lambda.

I had to change the buildspec.yml to deploy my assembled jar file correctly.

version: 0.1

phases:
  build:
    commands:
      - echo Build started on `date`
      - echo Run the test and package the code...
      - sbt compile
      - sbt assembly
  post_build:
    commands:
      - echo Build completed on `date`
      - mkdir deploy
      - mkdir deploy/lib
      - cp target/scala-2.11/lambda-demo-assembly-1.0.jar deploy/lib
      - aws cloudformation package --template-file samTemplate.yaml --s3-bucket testbucket-77 --output-template-file NewSamTemplate.yaml
artifacts:
  type: zip
  files:
    - deploy/lib/lambda-demo-assembly-1.0.jar
    - NewSamTemplate.yaml

In the tutorial, the call to "aws cloudformation" is done in the install phase, before any code is actually compiled and packaged. So the package command only puts the NewSamTemplate.yaml and all the source files into my S3 bucket. In the end I always ran into ClassNotFoundExceptions in Lambda, since it could not find my handler class. The compiled and packaged build artifacts were actually stored in another S3 bucket, which was created by AWS CodeBuild.

Moving the package command into post_build ensures that it is called after everything is built. It will then upload everything according to the CodeUri value in the samTemplate.yaml:

AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: An AWS Serverless Specification template describing your function.
Resources:
  Hello:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: example.MyHandler
      Runtime: java8
      CodeUri: deploy
      Description: ''
      MemorySize: 512
      Timeout: 15
      Role: 'arn:aws:iam::362856109184:role/service-role/lambda_basic_execution'

Here I defined the code artifact location to be the deploy folder. In the buildspec above I made sure this folder exists and that the assembled jar file ends up in a lib folder inside that deploy folder. I learned that Lambda can only access jars if they are in a lib folder in the final deploy zip.
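
So the zip that aws cloudformation package uploads contains just

lib/lambda-demo-assembly-1.0.jar

and the CodeUri in NewSamTemplate.yaml then points at that uploaded object instead of the local folder, roughly like this (the object key is a content hash; shown here as a placeholder):

CodeUri: s3://testbucket-77/0123456789abcdef0123456789abcdef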

Inside the CodeBuild environment I also needed a Docker image that has sbt and aws-shell on board: the first to build the Scala code, the second to execute the aws cloudformation command. Since I could not find a prebuilt image, I created my own and pushed it to Docker Hub. You can find it here: 7oas7er/sbt-aws-shell.
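
For reference, a minimal Dockerfile sketch of such an image. This assumes a Debian-based Java 8 base image and the sbt apt repository that was documented at the time; see 7oas7er/sbt-aws-shell for the actual image:

FROM openjdk:8

# sbt from its apt repository, aws-shell via pip
RUN apt-get update && apt-get install -y apt-transport-https python-pip && \
    echo "deb https://dl.bintray.com/sbt/debian /" > /etc/apt/sources.list.d/sbt.list && \
    apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 2EE0EA64E40A89B84B2DF73499E82A75642AC823 && \
    apt-get update && apt-get install -y sbt && \
    pip install aws-shell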
