This repo contains lambdas for use in the AWS infrastructure.
- arcimoto-lambda-global-dependencies is used by this repo as a git submodule. The pipeline for arcimoto-lambda-global-dependencies automatically generates a pull request to update this repository when there are upstream changes.
- arcimoto-lambda-utility is used during the pipeline.
- Cord Slatton - Repo Man (Authorizes changes to master branch)
- Gary Malcolm
We use the Angular git commit style.
Full (long) version:
<type>(<scope>): <subject>
<BLANK LINE>
<body>
<BLANK LINE>
<footer>
Short version
<type>(<scope>): <subject>
See https://github.com/angular/angular/blob/master/CONTRIBUTING.md#-commit-message-format for more details.
VS Code extension that helps you write the commit message, no need to remember your scopes, etc.: https://marketplace.visualstudio.com/items?itemName=Jhecht.git-angular
We use the allowed types from https://github.com/angular/angular/blob/master/CONTRIBUTING.md#-commit-message-format
We use custom scopes per repository, see the scopes below. If you need to add a new scope, that is allowed.
- arcimoto
- config
- dependencies
- docs
- alarms
- authorities
- backfill
- debug
- dependencies
- firmware
- fleets
- grafana
- hologram
- lambda-utility
- locations
- managed sessions
- notes
- pipeline
- recalls
- reef
- replicate
- root
- schema
- sheer id
- submodule
- telemetry
- tests
- userpool
- users
- utility
- vehicles
- yrisk
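As a quick illustration (not part of the repo tooling), the short-form convention `<type>(<scope>): <subject>` can be checked with a simple regex. The type and scope sets below are abbreviated samples taken from the lists above:

```python
import re

# Allowed Angular commit types (from the Angular CONTRIBUTING guide) and a
# few of the custom scopes listed above -- abbreviated for illustration.
TYPES = {"build", "ci", "docs", "feat", "fix", "perf", "refactor", "test"}
SCOPES = {"telemetry", "vehicles", "users", "pipeline", "dependencies"}

COMMIT_RE = re.compile(r"^(?P<type>\w+)\((?P<scope>[\w -]+)\): (?P<subject>.+)$")

def is_valid_short_commit(message):
    """Check the first line of a commit message against <type>(<scope>): <subject>."""
    match = COMMIT_RE.match(message.splitlines()[0])
    if not match:
        return False
    return match.group("type") in TYPES and match.group("scope") in SCOPES
```

For example, `is_valid_short_commit("fix(telemetry): handle dropped packets")` passes, while a message without the `type(scope):` prefix does not.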
The design of the Authority Manager focuses on encapsulating all trust and access mechanisms inside a strictly controlled environment within Amazon Web Services (AWS). Each component is built to perform a single focused action, and access to those actions is strictly controlled through limited entry points.
Sensitive key material never leaves the Authority Manager. A trust relationship is established between the Authority Manager and the vehicle by linking a policy restricted area of the Internet of Things (IoT) Core Shadow Profile of each registered vehicle to a list of authority principals that have been granted access to the vehicle within the Authority Manager. A corresponding vehicle access token is generated by the Authority Manager and securely distributed to the associated authority principal. Presenting a signed and trusted access token to a vehicle over Bluetooth LE allows control of the associated vehicle.
See the Authority Manager Overview in the Arcimoto wiki.
- Log in to Palantir
- Go to the vehicles page
- Use the header "Provision" link to open the context bar, OR open the context bar and select the "Provision" button
- Use the tool to provision a vehicle
Vehicles deliver telemetry data over cellular or Wi-Fi connections to an AWS-based cloud system that stores and visualizes the reported data.
See the Telemetry Overview on the Arcimoto wiki.
The majority of the AWSLambda repo code is written in Python and meant to be executed by AWS Lambda in a Python 3.8 runtime environment. Many of the functions rely on the AWS SDK for Python, boto3.
We use a Python virtual environment for each repository to keep things sane and clean.
Install virtualenv using pip:
pip install virtualenv
Create a new virtual environment for the repo (run at the repo root; the env folder should be in the .gitignore file):
python3 -m venv env
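If you want to confirm that a virtual environment is actually active, Python exposes this via `sys.prefix`; a small illustrative check:

```python
import sys

def in_virtualenv():
    """Return True when running inside a venv/virtualenv.

    Inside a virtual environment sys.prefix points at the env folder,
    while sys.base_prefix still points at the system installation.
    """
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)
```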
After creating a new virtual environment, VS Code should detect it and ask if you would like to set it as active for the repository.
Install boto3 using pip:
pip install boto3
Additionally, in order to run Lambda functions which interact with AWS services, use the AWS Command Line Interface (CLI).
Install the CLI using pip:
pip install awscli
In order to use the CLI, you will also need the proper AWS credentials, managed by the AWS Identity and Access Management (IAM) service. Have the Arcimoto AWS administrator create CLI credentials for you, or if applicable grant IAM permission and create them yourself. Then on the command line run:
aws configure
and enter the Access Key ID and Secret Access Key that you just created.
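To confirm the credentials work, you can ask AWS who you are. A minimal sketch using boto3's STS client (the import is guarded so the snippet loads even where boto3 is absent):

```python
try:
    import boto3  # AWS SDK for Python, installed above
except ImportError:
    boto3 = None  # boto3 may not be present in every environment

def whoami():
    """Return the AWS account ID for the currently configured credentials."""
    if boto3 is None:
        raise RuntimeError("boto3 is not installed")
    return boto3.client("sts").get_caller_identity()["Account"]
```

Equivalently, `aws sts get-caller-identity` on the command line performs the same check.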
To get the AWSLambda Lambda functions running locally, first obtain a local copy by cloning the repository. From the directory where you want to install your local copy, use git on the command line:
git clone --recurse-submodules https://<username>@bitbucket.org/arcimotocode1/awsshared.git
Where <username> is your Bitbucket username.
By using the --recurse-submodules flag, the command will automatically initialize and populate the git submodules after cloning the repository.
Exports of the API Gateway configurations we have implemented, along with directions for exporting/importing them to the API Gateway.
Global and common dependencies for the lambda functions.
The arcimoto global dependency is actually populated via the git submodule arcimoto-lambda-global-dependencies.
You must package the common dependency and include it in the dependencies folder. Packages can either be zipped or not:
- zipped is slightly easier to deal with, but doesn't allow you or your IDE to look at the files later without unzipping,
- unzipped allows inspection of the files at any time, including automatically by your IDE, if needed.
1. Run
pip install --target ./{PACKAGE_NAME} {PACKAGE_NAME}
where {PACKAGE_NAME} is the pip package you want to create as a common dependency. Note that pip may collect additional packages that may be required by the main package. If required, they must also be added to dependencies.json in step 4 below. The additional packages may remain in the parent package subdirectory to indicate the association.
2. Optional: Zip the folder
3. Move the folder (zipped or not) to the dependencies folder
4. Create an entry in the dependencies.json file in the common_dependencies section. Examples of common dependencies entries:
zipped:
"influxdb_extended": { "from": "dependencies/influxdb_extended_38.zip", "action": "unzip" }
unzipped:
"grafana_telemetry_points": { "from": "dependencies/grafana_json/grafana_telemetry_points.json", "to": "grafana_telemetry_points.json" }
5. Import the common dependency in your lambda to use it
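For illustration, a lambda that declared the influxdb_extended entry above as a common dependency would simply import it by name; in the AWS runtime the packaged dependency is already on the import path. The handler body here is hypothetical:

```python
# Hypothetical lambda module; influxdb_extended comes from the common
# dependency packaged above. The guard lets this sketch load outside Lambda.
try:
    import influxdb_extended
except ImportError:
    influxdb_extended = None

def lambda_handler(event, context):
    if influxdb_extended is None:
        raise RuntimeError("common dependency influxdb_extended not packaged")
    # ... use influxdb_extended here ...
    return {"statusCode": 200}
```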
Various AWS-related documentation.
This folder contains lambda functions, broken down into functional bundles.
Contains the database schemas for the authoritykeys and telemetry databases as well as all database update and rollback scripts.
Deployment tools, CI integrations and other development utilities.
Used by the arcimoto-lambda-utility package to identify resources to deploy with lambda functions.
Extracted from the certifi package's wheel file.
The global_dependencies layer builds for a specific python3 minor version (3.8) due to compatibility with psycopg2 (the postgres db interaction package). This compatibility requirement means the global_dependencies layer must be built for a specific python3 runtime. Rather than compile psycopg2 for the AWS Lambda environment ourselves, we can use awslambda-psycopg2, an open source repository that maintains compiled versions of psycopg2 for each version of python.
- Copy the psycopg2-3.x directory from the awslambda-psycopg2 repository matching your intended minor version of python3 into your AWS Lambda project dependencies folder, and rename the folder to psycopg2 before zipping it into a file called psycopg2.zip. This will then be used in any subsequent lambda creation or updates by the lambda-utility commands in this repository ONLY if they directly use the psycopg2 module as a common_dependency. However, newer, upgraded lambdas rely on our global_dependencies layer.
- Create a new layer version for the global_dependencies with a compatible runtime set to match the minor version of python3 you are upgrading to, using the lambda-utility. After the layer is created via the lambda-utility it will return a version number: in dependencies.json set the layers.global_dependencies.version property to the new layer version number. This will cause subsequent create and update lambda-utility commands to use the new layer (and the upgraded versions of python and psycopg2).
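The full schema of dependencies.json is not reproduced here, but based on the layers.global_dependencies.version property path described above, the relevant fragment would plausibly look like the following (the version number is hypothetical):

```json
{
    "layers": {
        "global_dependencies": {
            "version": 42
        }
    }
}
```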
The BitBucket pipelines file (bitbucket-pipelines.yml) implements the Continuous Integration (CI)/Continuous Deployment (CD) for the AWSLambda repository.
Lambdas with code updates, lambdas whose dependencies have updates (code updates), and lambdas whose dependencies have changed (dependencies added or removed) are included in a given pipeline run. If the global dependencies are updated/changed, all lambdas are included in the pipeline run.
See utility/ses/README.md
for more information about the SES Templates that are automatically maintained in the pipeline.
Note: the branch must be prefixed with MAINT-
This is a special pipeline for automatically generated pull requests from the arcimoto-lambda-global-dependencies
repository due to a new release. The pipeline updates all lambdas, then runs all unit tests.
Note: the branch must be prefixed with TEL-*
The tests are run automatically on Pull Request (PR) creation/update from a feature branch beginning with the prefix TEL- to the dev branch.
These tests include:
- testing the dependencies.json file for JSON validity
- unit tests for each bundle (defined in each bundle's bundle.json file)
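The JSON-validity check can be sketched as a small helper. The actual pipeline implementation is not shown in this section, so treat this as illustrative:

```python
import json

def is_valid_json_file(path):
    """Return True if the file at path parses as JSON, False otherwise."""
    try:
        with open(path) as handle:
            json.load(handle)
        return True
    except (ValueError, OSError):
        return False
```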
Any failed tests will result in a failure of the pipeline, which will prevent merging the PR to dev until the PR is updated to fix the issue(s) and the pipeline is re-run successfully for the PR.
Upon merge to dev, all lambdas included per the changed/updated checks mentioned above in Inclusion of Lambdas in a Pipeline Run are packaged and deployed to our dev AWS Lambda environment.
Upon merge to the dev branch, if the SES Template source files have changes, the corresponding SES templates for the dev environment are updated using the arcimoto-ses-utility.
Upon merge to staging from the dev branch, the pipeline releases all lambdas included in the changed/updated checks mentioned above in Inclusion of Lambdas in a Pipeline Run to our staging AWS Lambda environment, by creating a new lambda version from dev and setting the AWS Lambda alias for staging to point to the newly created version.
Upon merge to the staging branch, if the SES Template source files have changes, the corresponding SES templates for the staging environment are updated using the arcimoto-ses-utility.
Upon merge to master from the staging branch, the pipeline releases all lambdas included in the changed/updated checks mentioned above in Inclusion of Lambdas in a Pipeline Run to our prod AWS Lambda environment, by setting the AWS Lambda alias for prod to point to the version staging is pointed at.
Upon merge to the master branch, if the SES Template source files have changes, the corresponding SES templates for the prod environment are updated using the arcimoto-ses-utility.