
Multiple AWS connections support in DynamoDBToS3Operator #29422

Closed · 2 tasks done
dym-ok opened this issue Feb 8, 2023 · 4 comments · Fixed by #29452
Labels
good first issue kind:feature Feature Requests provider:amazon-aws AWS/Amazon - related issues

Comments


dym-ok commented Feb 8, 2023

Description

I want to add support for a separate AWS connection for DynamoDB in DynamoDBToS3Operator in apache-airflow-providers-amazon, via an aws_dynamodb_conn_id constructor argument.

Use case/motivation

Sometimes the DynamoDB table and the S3 bucket live in different AWS accounts, so to access both resources you need to assume a role in one account from the other.
That role can be specified in an AWS connection, so this operator needs to support two connections.
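To illustrate the behaviour this issue proposes, here is a minimal plain-Python sketch (not the real Airflow classes, and `aws_dynamodb_conn_id` is the name suggested here, not a released parameter): a separate DynamoDB connection is used when given, falling back to the existing `aws_conn_id` so single-account DAGs keep working unchanged.

```python
# Illustrative stand-ins only; the real operator lives in
# airflow.providers.amazon.aws.transfers.dynamodb_to_s3.

class FakeHook:
    """Stand-in for AwsBaseHook: just records which connection it uses."""
    def __init__(self, aws_conn_id):
        self.aws_conn_id = aws_conn_id


class DynamoDBToS3OperatorSketch:
    """Hypothetical shape of DynamoDBToS3Operator with the proposed
    aws_dynamodb_conn_id argument."""

    def __init__(self, *, aws_conn_id="aws_default", aws_dynamodb_conn_id=None):
        self.aws_conn_id = aws_conn_id
        # Fall back to aws_conn_id when no separate DynamoDB connection
        # is given, preserving the current single-connection behaviour.
        self.aws_dynamodb_conn_id = aws_dynamodb_conn_id or aws_conn_id

    @property
    def s3_hook(self):
        return FakeHook(self.aws_conn_id)

    @property
    def dynamodb_hook(self):
        return FakeHook(self.aws_dynamodb_conn_id)


# Single-account use: both hooks share one connection.
same = DynamoDBToS3OperatorSketch(aws_conn_id="aws_default")

# Cross-account use: DynamoDB is read via a connection that assumes a
# role in the other account, while S3 uses the default connection.
split = DynamoDBToS3OperatorSketch(
    aws_conn_id="aws_s3_account",
    aws_dynamodb_conn_id="aws_dynamodb_account",
)
```

The connection names above (`aws_s3_account`, `aws_dynamodb_account`) are hypothetical; in practice the cross-account role would be configured on the AWS connection itself.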

Related issues

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Code of Conduct

dym-ok added the kind:feature Feature Requests label Feb 8, 2023

boring-cyborg bot commented Feb 8, 2023

Thanks for opening your first issue here! Be sure to follow the issue template!

Taragolis added the provider:amazon-aws AWS/Amazon - related issues label Feb 8, 2023

eladkal commented Feb 8, 2023

I think this is relevant for any AWS-to-AWS transfer operator (and we have several of those).
I wonder if we can have a generic solution that handles this part, like an AwsToAwsBaseTransferOperator?
cc @o-nikolas @shubham22


dym-ok commented Feb 9, 2023

@eladkal, I can see that RedshiftToS3Operator and S3ToRedshiftOperator already implement a similar pattern via a redshift_conn_id constructor argument.
I don't see any other operators in this "transfer family".

Also, I think the semantics of two AWS connections in such an AwsToAwsBaseTransferOperator would be tricky and not at all straightforward. How would you even name them if they are not service-specific?

I'll probably continue with a change to DynamoDBToS3Operator specifically for now; however, I'm open to suggestions on how to address this semantic challenge.

o-nikolas commented

@eladkal Are you thinking of something like a generic "source" and "destination" connection that would be implemented in a base AWS transfer operator that all concrete transfer operators would use? Would both those connections be based on the aws connection? I suppose we could use extras to plumb anything service specific through. It's an interesting thought.

What do you think @Taragolis?
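The generic "source"/"destination" idea discussed above might look something like the following sketch (hypothetical: the class and argument names are illustrative, not from the provider). A base transfer operator holds two generic connection ids, and each concrete transfer maps them onto its own services.

```python
# Illustrative sketch only; names are hypothetical, not provider API.

class AwsToAwsBaseTransferOperatorSketch:
    """Hypothetical base class holding generic source and destination
    AWS connections for AWS-to-AWS transfers."""

    def __init__(self, *, source_aws_conn_id="aws_default",
                 dest_aws_conn_id=None):
        self.source_aws_conn_id = source_aws_conn_id
        # A single-account transfer only needs one connection, so the
        # destination defaults to the source.
        self.dest_aws_conn_id = dest_aws_conn_id or source_aws_conn_id


class DynamoDBToS3Sketch(AwsToAwsBaseTransferOperatorSketch):
    """A concrete transfer gives the generic names service-specific
    meaning: DynamoDB is the source, S3 the destination."""

    def source_conn(self):
        # In a real operator this would build a DynamoDB hook.
        return self.source_aws_conn_id

    def dest_conn(self):
        # In a real operator this would build an S3 hook.
        return self.dest_aws_conn_id


op = DynamoDBToS3Sketch(
    source_aws_conn_id="aws_dynamodb_account",
    dest_aws_conn_id="aws_s3_account",
)
```

Anything service-specific could then ride along in the connection's extras, as suggested above.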
