In my project I'm using DynamoDB streams to trigger a Lambda. However, the Alias plugin seems to prevent a regular redeploy of the project: only the first deploy worked as expected.
This is the error thrown in CloudWatch during deployment (Checking Stack update progress):
An error occurred: my-project-dev - Export my-project-dev-MyTableStreamArn cannot be updated as it is in use by my-project-dev-dev.
The relevant parts of the serverless.yaml are:
```yaml
# [...]
functions:
  myLambdaFunc:
    handler: handlers/myLambdaFunc.handler
    timeout: 300
    events:
      # Run once a day (midnight UTC)
      - schedule: cron(0 0 * * ? *)
      # Also trigger if anything relevant in the database changes
      - stream:
          type: dynamodb
          arn:
            Fn::GetAtt:
              - MyTable
              - StreamArn

resources:
  # AWS CloudFormation Template
  Resources:
    MyTable:
      Type: AWS::DynamoDB::Table
      DeletionPolicy: Retain
      Properties:
        StreamSpecification:
          StreamViewType: KEYS_ONLY
        AttributeDefinitions:
          - # [...]
        KeySchema:
          - # [...]
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        GlobalSecondaryIndexes:
          - # [...]
```
I will have to spend more time figuring out what exactly causes this problem, as none of the resources are actually supposed to change during the deploy. For now, it seems I cannot properly use DynamoDB streams as Lambda triggers in combination with the Alias plugin.
I think support for DynamoDB streams is still missing in the alias plugin. The error occurs because the target of the trigger has to be changed to the current alias of the deployment, i.e. the trigger itself has to be moved to the alias stack so that it is attached to the aliased / versioned function.
Technically, the plugin has to be changed so that DynamoDB triggers are handled exactly like the Kinesis triggers (see stackOps/events). As soon as the trigger is moved to the alias stack, its target has to be set to the logical id of the alias resource, which in turn is automatically resolved to the aliased function version.
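For illustration, a rough sketch of what the moved trigger could look like in the alias stack. The logical ids and property values here are assumptions for the example, not the plugin's actual output:

```yaml
# Hypothetical excerpt of the alias stack template (logical ids are assumed,
# not necessarily what the plugin would generate)
MyLambdaFuncEventSourceMappingDynamodbMyTable:
  Type: AWS::Lambda::EventSourceMapping
  Properties:
    # Stream ARN handed over from the main stack (see the Fn::ImportValue
    # discussion below for why this exact reference is problematic)
    EventSourceArn:
      Fn::ImportValue: my-project-dev-MyTableStreamArn
    # Point the trigger at the alias resource instead of the bare function,
    # so it follows the aliased / versioned Lambda
    FunctionName:
      Ref: MyLambdaFuncAlias
    StartingPosition: TRIM_HORIZON
    BatchSize: 10
```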
To begin analyzing the issue, you can use `serverless package` and inspect the generated main and alias templates in `.serverless`. Then you can see what actually has to be moved by the plugin.
I have now checked the implementation in detail and read the DynamoDB stream documentation.
The reason for this issue is that on a subsequent deploy the DynamoDB stream ARN might (and will) change, so the value of the generated stream output variable changes too.
That fails because the alias stack currently references the output (ARN) via Fn::ImportValue, which does not allow the underlying exported value to be changed while it is imported.
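A minimal sketch of the reference chain that breaks, assuming the export name from the error message above (the exact template layout is an assumption):

```yaml
# Main stack: exports the stream ARN of MyTable
Outputs:
  MyTableStreamArn:
    Value:
      Fn::GetAtt: [MyTable, StreamArn]
    Export:
      Name: my-project-dev-MyTableStreamArn

# Alias stack (my-project-dev-dev): imports the exported value
# inside the event source mapping
EventSourceArn:
  Fn::ImportValue: my-project-dev-MyTableStreamArn
```

CloudFormation refuses to update or remove an exported output while another stack imports it, which is exactly the "Export ... cannot be updated as it is in use" error shown above.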