Article 2: How to manage AWS Lambda Functions with Serverless Framework
Topics
With the instructions in Article 1, you should be able to create an AWS Lambda function and its trigger events using the Serverless Framework.
In this article, I will explain the following topics, which were not covered in Article 1, in more detail.
- Variables in serverless.yml
- External file
- stage
- serverless-external-s3-event plugin
- Other tips
Variables in serverless.yml
Since you can use variables in serverless.yml, you can keep your configuration simple and flexible.
For the details, please refer to the documentation. Here I will explain the points that are difficult and not covered there.
Types of Variables
There are several types of variables you can use in serverless.yml. Let’s take a look at them one by one.
Variables defined in serverless.yml
If a value will be used many times in serverless.yml, I recommend defining it as a variable so it is easier to modify in the future.
Variables are defined under the ‘custom’ key.
custom:
  bucket:
    module1: mycompany-module1
    module2: mycompany-module2
To use a defined value, use the self keyword as below:
${self:custom.bucket.module1}
Environment Variable
You can refer to environment variables of the shell where the ‘sls’ (or ‘serverless’) command is executed, as below:
${env:SOME_ENV}
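For example, you can pass a value from the shell environment into the function's runtime environment without hard-coding it in serverless.yml. This is a minimal sketch, reusing the `SOME_ENV` placeholder from above:

```yaml
# serverless.yml (sketch): expose a shell environment variable to the function
provider:
  environment:
    SOME_ENV: ${env:SOME_ENV}
```

This is convenient for secrets and per-machine settings, since the value never has to be committed to the repository.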
Command line arguments
You can also refer to the command-line arguments of the ‘sls’ command. The main use case is switching the stage between development, staging, and production.
The arguments you can pass to the ‘sls’ command are documented below (AWS only).
Serverless Framework Commands – AWS Lambda – Deploy
For example, to refer to the stage argument:
${opt:stage}
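To make this concrete, given a deploy command like the one in the comment below, `${opt:stage}` resolves to the value passed on the command line (a sketch):

```yaml
# Command line:
#   sls deploy --stage production
# serverless.yml (sketch):
provider:
  stage: ${opt:stage, 'dev'}  # resolves to 'production' for the command above
```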
The default value for an undefined variable
If a variable is undefined, you may want to fall back to a default value. In that case, refer to the example below (from the documentation).
custom:
  myRegion: ${opt:region, 'us-west-1'}
External file
How to include an external file
You can read external YAML files from serverless.yml.
If you need to use the same definitions (functions, names, etc.) in serverless.yml many times, you can define them in an external YAML file and refer to them from serverless.yml.
Let’s say an external YAML file is like below.
foo:
  - foo1
  - foo2
bar: something
There are a few ways to read the external file. Please refer below.
# To refer to all of an external file
${file(./external_file.yml)}
# To refer to part of an external file
${file(./external_file.yml):foo}
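To make this concrete, with the external_file.yml above, the partial reference resolves to the list under `foo`, so the two snippets below are equivalent (a sketch; `myList` is a hypothetical key):

```yaml
# Using the external file:
custom:
  myList: ${file(./external_file.yml):foo}

# Equivalent inline definition:
custom:
  myList:
    - foo1
    - foo2
```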
External file and variables
The external file can be read and referenced just like serverless.yml. However, variables used inside the external file do not seem to be resolved.
Let’s say there are files like below.
# serverless.yml
custom:
  foo_in_parent: something
functions:
  foo_func:
    handler: handler.foo
    name: foo_func
    events: ${file(./events.yml)}
Here, events.yml will be read, but the values defined in serverless.yml do not seem to be resolvable from within it.
# events.yml
- cloudwatchEvent:
    event:
      source: ${self:custom.foo_in_parent} # Not good: this is not resolved.
Stage
Serverless allows you to switch between environments such as production and staging.
You should design your stage strategy beforehand.
Method recommended by the documentation
The official documentation says the following:
- At the very least, use a dev and production stage.
- Use different AWS accounts for stages.
- In larger teams, each member should use a separate AWS account and their own stage for development.
I totally agree with the first point, but the second can be a problem.
The official documentation recommends using different AWS accounts for dev and production and defining Lambda functions with the same names in both. However, you may not want separate accounts if the project is small. I will explain that case later.
If you use different AWS accounts, it is not so difficult. You can simply use a profile whose name matches the stage name, as below:
provider:
  stage: ${opt:stage, 'dev'}
  profile: ${self:provider.stage}
If the stage name differs from the profile name, refer to the following:
custom:
  profiles:
    dev: awsProfileForDev
    production: awsProfileForProduction
provider:
  stage: ${opt:stage, 'dev'}
  profile: ${self:custom.profiles.${self:provider.stage}}
To use one account
Using a single AWS account for both dev and production is not recommended, but if you still want to, here are some tips.
# serverless.yml
# Define the resources that are used separately by dev and production in the custom section.
custom:
  bucket:
    dev: example-bucket-dev
    pro: example-bucket
provider:
  # Serverless creates a CloudFormation stack whose name includes the stage name,
  # so I suggest using a stage name that is clearly neither dev nor production,
  # such as 'default' or 'common'.
  stage: default
functions:
  # Use the dev_ prefix for dev functions.
  # I will explain another method later.
  dev_foo_func:
    handler: handler.dev_foo_func
    name: dev_foo_func
    events:
      # I will explain existingS3 later.
      - existingS3:
          bucket: ${self:custom.bucket.dev}
          # I suggest using an external file for the settings shared by dev and production.
          # I skip the explanation of s3_event_foo.yml.
          events: ${file(./yaml/s3_event_foo.yml):events}
          rules: ${file(./yaml/s3_event_foo.yml):rules}
  pro_foo_func:
    handler: handler.pro_foo_func
    name: pro_foo_func
    events:
      - existingS3:
          bucket: ${self:custom.bucket.pro}
          events: ${file(./yaml/s3_event_foo.yml):events}
          rules: ${file(./yaml/s3_event_foo.yml):rules}
For handler.py, you can set it up as below. For other languages, apply suitable names.
def dev_foo_func(event, context):
    foo_func('dev', event, context)

def pro_foo_func(event, context):
    foo_func('pro', event, context)

def foo_func(stage, event, context):
    # various processing
    pass
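The dispatch pattern above can be exercised locally without AWS. Below is a self-contained sketch; the return value and the placeholder event are my additions for illustration, not part of the original handler:

```python
# Local sketch of the stage-dispatch pattern above.
def foo_func(stage, event, context):
    # various processing; here we just record which stage was dispatched
    return {"stage": stage, "event": event}

def dev_foo_func(event, context):
    return foo_func('dev', event, context)

def pro_foo_func(event, context):
    return foo_func('pro', event, context)

# Simulate Lambda invoking the dev entry point with a placeholder event.
result = dev_foo_func({"key": "value"}, None)
print(result["stage"])  # → dev
```

Keeping the stage-specific entry points as thin wrappers means the shared logic in foo_func stays identical across stages.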
Defining many functions with the dev_ and pro_ prefixes may not seem like a good idea, so I tried resolving the prefix from the stage option instead.
functions:
  # Define the function only once.
  foo_func:
    # Resolve the handler and the name from the stage option.
    handler: handler.${self:provider.stage}_foo_func
    name: ${self:provider.stage}_foo_func
    events:
      # skipped
However, this did not work.
It is because some plugins expect the function key (foo_func) and the value of the name section (dev_foo_func or pro_foo_func) to be the same.
(The serverless-external-s3-event plugin, which I will explain later, is one of them.)
serverless-external-s3-event plugin
How to attach events to an existing S3 bucket
Due to the current Serverless specification, if you specify an S3 bucket in ‘events’, a bucket with the specified name will be created, and if a bucket with the same name already exists, an error is returned. (Refer to the document below for details.)
Serverless Framework – AWS Lambda Events – S3
This causes no problems if you manage all the infrastructure yourself.
In practice, however, you may want to attach new Lambda function events to an existing system, such as existing S3 buckets and other infrastructure.
As I mentioned, Serverless cannot handle such a case at this point. Please refer to the issues below.
- Can’t subscribe to events of existing S3 bucket · Issue #2154 · serverless/serverless
- Introduce a functionality which attach events to an existing resource · Issue #4241 · serverless/serverless
There is a plugin, serverless-external-s3-event, that works around this situation; it is mentioned in a comment on the first issue.
Install and set up
Install it with npm.
As noted below and in the plugin’s documentation, it is better to install it without -g and commit package.json.
npm install serverless-external-s3-event
The settings in serverless.yml are as below.
events:
  - existingS3:
      bucket: a-bucket-that-already-exists
      events:
        - s3:ObjectCreated:*
      rules:
        - prefix: path/to/some/dir/
        - suffix: .csv
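Note that, like other Serverless plugins, it also has to be registered in the plugins section of serverless.yml. A minimal sketch:

```yaml
# serverless.yml
plugins:
  - serverless-external-s3-event
```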
Please refer to the document for the details.
Conclusion
In this article, I explained the parts that I could not cover in the previous post.
I will write another article if I come up with other tips.