serverless-rust v0.3.8
📦 Install
Install the plugin inside your serverless project with npm.
$ npm i -D serverless-rust
💡 The -D flag adds it to your development dependencies, in npm speak.
💡 This plugin assumes you are building Rustlang lambdas targeting the AWS Lambda "provided" runtime. The AWS Lambda Rust Runtime makes this easy.
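For orientation, a handler for the "provided" runtime can be as small as the sketch below. This is a minimal sketch only: it assumes the lambda_runtime and serde_json crates and uses the older macro-based lambda_runtime interface; newer releases of the runtime expose an async API instead, so follow the runtime's own examples for the version you depend on.

// Minimal echo handler sketch (assumes the lambda_runtime and serde_json crates;
// the macro-based interface below matches older lambda_runtime releases).
use lambda_runtime::{error::HandlerError, lambda, Context};
use serde_json::Value;

// Receives the invocation payload as JSON and echoes it back.
fn handler(event: Value, _ctx: Context) -> Result<Value, HandlerError> {
    Ok(event)
}

fn main() {
    // Hands control to the Lambda runtime, which polls for invocations
    // and dispatches each one to `handler`.
    lambda!(handler);
}

With a layout like that, the handler value in serverless.yml is just your cargo package name, since the package's default bin is being built.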
Add the following to your serverless project's serverless.yml file:
service: demo
provider:
  name: aws
  runtime: rust
plugins:
  # this registers the plugin
  # with serverless
  - serverless-rust
# creates one artifact for each function
package:
  individually: true
functions:
  test:
    # handler value syntax is `{cargo-package-name}.{bin-name}`
    # or `{cargo-package-name}` for short when you are building a
    # default bin for a given package.
    handler: your-cargo-package-name
    events:
      - http:
          path: /test
          method: GET
💡 The Rust Lambda runtime requires a binary named bootstrap. This plugin renames the binary cargo builds to bootstrap for you. You do not need to do this manually in your Cargo.toml configuration file.
🎛️ customize
You can optionally adjust the default settings of the dockerized build environment using a custom section of your serverless.yml configuration:
custom:
  # this section customizes the default
  # serverless-rust plugin settings
  rust:
    # flags passed to cargo
    cargoFlags: '--features enable-awesome'
    # custom docker tag
    dockerTag: 'some-custom-tag'
    # custom docker image
    dockerImage: 'dockerUser/dockerRepo'
🥼 (experimental) local builds
While it's useful to have a build environment that matches your deployment environment, dockerized builds come with some notable tradeoffs.
Docker itself is an external dependency, and requiring it often adds friction to your build.
Depending on a docker image also limits which versions of Rust you can build with: the default image tracks stable Rust, while some users may want to try unstable versions before they stabilize. Local builds enable that.
If you wish to build lambdas locally, use the dockerless configuration setting.
custom:
  # this section allows for customization of the default
  # serverless-rust plugin settings
  rust:
    # flags passed to cargo
    cargoFlags: '--features enable-awesome'
    # experimental! when set to true, artifacts are built locally outside of docker
    dockerless: true
This builds and links your lambda as a static binary, using MUSL, outside a container; the result can be deployed into the Lambda execution environment. The aim is that in future releases this might become the default behavior.
In order to use this mode, it is expected that you install the x86_64-unknown-linux-musl target locally on all platforms with
$ rustup target add x86_64-unknown-linux-musl
On Linux platforms, you will need to install musl-tools:
$ sudo apt-get update && sudo apt-get install -y musl-tools
On macOS, you will need to install a MUSL cross-compilation toolchain:
$ brew install filosottile/musl-cross/musl-cross
Using MUSL comes with some other notable tradeoffs, one of which is the complication that arises when you depend on dynamically linked libraries.
- One example is OpenSSL bindings, which you can safely replace with rustls or work around by vendoring OpenSSL
- Other limitations are noted here.
If you find other MUSL-specific issues, please report them by opening an issue.
🎨 Per function customization
If your serverless project contains multiple functions, you may sometimes need to customize the options above at the function level. You can do this by defining a rust key with the same options inline in your function specification.
functions:
  test:
    rust:
      # function specific flags passed to cargo
      cargoFlags: '--features enable-awesome'
    # handler value syntax is `{cargo-package-name}.{bin-name}`
    # or `{cargo-package-name}` for short when you are building a
    # default bin for a given package.
    handler: your-cargo-package-name
    events:
      - http:
          path: /test
          method: GET
🤸 usage
Every serverless workflow command should work out of the box.
invoke your lambdas locally
$ npx serverless invoke local -f hello -d '{"hello":"world"}'
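To connect that -d '{"hello":"world"}' payload to your Rust code: the runtime deserializes the JSON body into your handler's event type. The sketch below is hypothetical (struct and field names are made up) and assumes serde with its derive feature plus the older macro-based lambda_runtime interface shown earlier.

// Hypothetical typed handler matching the example payload {"hello":"world"}.
use lambda_runtime::{error::HandlerError, lambda, Context};
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct Request {
    // populated from the "hello" key of the invocation payload
    hello: String,
}

#[derive(Serialize)]
struct Response {
    message: String,
}

fn handler(event: Request, _ctx: Context) -> Result<Response, HandlerError> {
    Ok(Response {
        message: format!("hello, {}", event.hello),
    })
}

fn main() {
    lambda!(handler);
}

The same deserialization applies when you invoke the deployed function in the cloud with that payload below.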
deploy your lambdas to the cloud
$ npx serverless deploy
invoke your lambdas in the cloud directly
$ npx serverless invoke -f hello -d '{"hello":"world"}'
view your lambdas' logs
$ npx serverless logs -f hello
serverless templates
^0.2.*
- a minimal echo application - https://github.com/softprops/serverless-aws-rust
- a minimal http application - https://github.com/softprops/serverless-aws-rust-http
- a minimal multi-function application - https://github.com/softprops/serverless-aws-rust-multi
- a minimal apigateway websocket application - https://github.com/softprops/serverless-aws-rust-websockets
- a minimal kinesis application - https://github.com/softprops/serverless-aws-rust-kinesis
0.1.*
Older versions targeted the Python 3.6 AWS Lambda runtime and Rust applications built with crowbar and lando.
- lando api gateway application - https://github.com/softprops/serverless-lando
- multi function lando api gateway application - https://github.com/softprops/serverless-multi-lando
- crowbar cloudwatch scheduled lambda application - https://github.com/softprops/serverless-crowbar
Doug Tangren (softprops) 2018-2019