I want to add support for a separate AWS connection for DynamoDB in `DynamoDBToS3Operator` in `apache-airflow-providers-amazon`, via an `aws_dynamodb_conn_id` constructor argument.
Use case/motivation
Sometimes DynamoDB tables and S3 buckets live in different AWS accounts, so to access both resources you need to assume a role in another account from one of them.
That role can be specified in an AWS connection, so we need to support two connections in this operator.
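A minimal sketch of the proposed constructor behaviour, without pulling in Airflow itself: the existing `aws_conn_id` would keep serving the S3 side, while a new, optional `aws_dynamodb_conn_id` would serve the DynamoDB side and fall back to `aws_conn_id` when unset (the fallback is my assumption, not a decided design):

```python
# Hypothetical sketch of the proposed DynamoDBToS3Operator change.
# The class and fallback behaviour here are illustrative assumptions,
# not the actual Airflow implementation.

class DynamoDBToS3OperatorSketch:
    def __init__(self, *, aws_conn_id="aws_default", aws_dynamodb_conn_id=None):
        # existing argument: used to build the S3 hook
        self.aws_conn_id = aws_conn_id
        # new argument: used to build the DynamoDB hook, falling back to
        # the single-connection behaviour when not provided
        self.aws_dynamodb_conn_id = aws_dynamodb_conn_id or aws_conn_id


# Cross-account transfer: assume a role in the DynamoDB account via a
# second connection, while S3 access uses the default connection.
op = DynamoDBToS3OperatorSketch(
    aws_conn_id="aws_s3_account",
    aws_dynamodb_conn_id="aws_dynamodb_account",
)
print(op.aws_conn_id, op.aws_dynamodb_conn_id)
```

With `aws_dynamodb_conn_id` omitted, both hooks would resolve from `aws_conn_id`, preserving backwards compatibility for existing single-account DAGs.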
I think this is relevant for any AWS-to-AWS transfer operator (and we have several of those).
I wonder if we can have a generic solution that handles this, like an AwsToAwsBaseTransferOperator?
cc @o-nikolas @shubham22
@eladkal, I can see that RedshiftToS3Operator and S3ToRedshiftOperator already implement a similar pattern using a redshift_conn_id argument.
I don't see any other operators in this "transfer family".
Also, I think the semantics of two AWS connections in such an AwsToAwsBaseTransferOperator would be tricky and not at all straightforward. How would you even name them if they are not service-specific?
I'll probably continue with a change to DynamoDBToS3Operator specifically for now, but I'm open to suggestions on how to address this semantic challenge.
@eladkal Are you thinking of something like a generic "source" and "destination" connection that would be implemented in a base AWS transfer operator that all concrete transfer operators would use? Would both those connections be based on the aws connection? I suppose we could use extras to plumb anything service specific through. It's an interesting thought.
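To make the generic idea concrete, here is a hedged sketch of what a base transfer operator with service-agnostic "source" and "destination" connections could look like. The class and argument names (`AwsToAwsBaseTransferOperatorSketch`, `source_aws_conn_id`, `dest_aws_conn_id`) are my assumptions for illustration, not an agreed design:

```python
# Hypothetical sketch of a generic AWS-to-AWS transfer base class.
# All names here are illustrative assumptions; concrete transfer
# operators would map "source"/"dest" onto their specific services
# (e.g. DynamoDB as source, S3 as destination).

class AwsToAwsBaseTransferOperatorSketch:
    def __init__(self, *, source_aws_conn_id="aws_default", dest_aws_conn_id=None):
        # connection used for the service data is read from
        self.source_aws_conn_id = source_aws_conn_id
        # connection used for the service data is written to;
        # a single-account transfer only needs one connection
        self.dest_aws_conn_id = dest_aws_conn_id or source_aws_conn_id
```

Anything service-specific (region overrides, role ARNs to assume) could then be plumbed through each connection's extras, as suggested above, rather than through extra constructor arguments.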