Terraform aws s3 bucket policy is planned for change every time #4948
Comments
Hi @aldarund

Short-term solution: always use a string instead of an array for single values in the policy.

Long-term solution: in the ideal case we could get IAM JSON mappings from the AWS SDK and do the (un)marshalling easily; that's being discussed in aws/aws-sdk-go#127. A similar approach was proposed in #3124, where the mappings are part of the codebase. There's also #4278, which follows the same principle as the other linked PR. |
I ran across the same issue with an aws_s3_bucket resource. Here's what the original resource looked like:

resource "aws_s3_bucket" "my-bucket" {
bucket = "my-bucket"
policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [{
"Sid": "",
"Action": ["s3:PutObject"],
"Effect": "Allow",
"Resource": ["my-resource"],
"Principal": {
"AWS": "1234567890"
}
}]
}
EOF
}

And here is the working version:

resource "aws_s3_bucket" "my-bucket" {
bucket = "my-bucket"
policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [{
"Sid": "",
"Action": "s3:PutObject",
"Effect": "Allow",
"Resource": "my-resource",
"Principal": {
"AWS": "arn:aws:iam::1234567890:root"
}
}]
}
EOF
}

Notice that Action and Resource are now plain strings instead of single-element arrays, and that the Principal is the full ARN rather than the bare account ID. |
We also noticed that we had to make each Action separate as well. |
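To illustrate what "separate" means in practice, here is a minimal sketch under the assumption that it means one statement per action; the bucket name, account ID, and resource ARN are placeholders, not values from the comments above:

resource "aws_s3_bucket" "my-bucket" {
  bucket = "my-bucket"

  # One statement per action; every "Action" and "Resource" value stays a plain string.
  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::1234567890:root"},
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    },
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::1234567890:root"},
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
EOF
}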
Quick fix: I found that the policy has to be exactly the same as what you see in the web console, so first apply your change and then copy the resulting policy from the web console back into your Terraform file. |
Specifically, for me, this occurs when specifying multiple principals in the form of:
|
Yes, I don't see a workaround when we need to specify a list of principals, e.g.:
Any workaround is welcome. For now, we won't track those buckets with terraform. |
This is making me a sad panda, but I would like to confirm the workaround by @Ehekatl |
@lra there is a workaround for that case. You need to completely remove all lists from the policy and instead split them into their own statements. For example, this:

{
"Version": "2008-10-17",
"Statement": [
{
"Sid": "",
"Effect": "Allow",
"Principal": {
"AWS": [
"arn:aws:iam::XXXXXXXXXXXX:root",
"arn:aws:iam::YYYYYYYYYYYY:root"
]
},
"Action": "s3:*",
"Resource": "arn:aws:s3:::ZZZZZZZZZ/*"
}
]
}

...should change to this:

{
"Version": "2008-10-17",
"Statement": [
{
"Sid": "",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::XXXXXXXXXXXX:root"
},
"Action": "s3:*",
"Resource": "arn:aws:s3:::ZZZZZZZZZ/*"
},
{
"Sid": "",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::YYYYYYYYYYYY:root"
},
"Action": "s3:*",
"Resource": "arn:aws:s3:::ZZZZZZZZZ/*"
}
]
} |
Hi @aldarund

This has been fixed via #8615 :) It will be released in 0.7.3.

TL;DR: we are now able to test the structure of the AWS policy for equivalence. This means we know when a string equals the single item in an array, etc. It also understands the ordering of the blocks in a policy.

Hope this helps.

Paul |
Even using the @AMeng solution on 0.7.3 does not work for me. In my case though I've narrowed it down to using Conditions in the policy. If I remove the Condition then it works as expected. |
I'm using an aws_iam_policy_document data source with an S3 bucket policy and getting the same error, and I am on 0.7.4. |
I got the same problem, but the advice from @Ehekatl helped. |
I'm having the same issue in v0.7.13 and even removing the arrays doesn't seem to be helping. |
v0.7.13 is working well for me. One thing I have changed, though, is to stop using the terrible inline heredoc policy and move to an aws_s3_bucket_policy resource with a template_file instead. |
I am also using an aws_s3_bucket_policy resource with a template_file |
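For anyone unfamiliar with that setup, a rough sketch of the aws_s3_bucket_policy-plus-template_file pattern looks like this; the template path, bucket name, and variable name are made up for illustration, not taken from the comments above:

data "template_file" "bucket_policy" {
  # policy.json.tpl is a hypothetical template holding the policy JSON,
  # with ${bucket_arn} interpolated where the bucket ARN is needed.
  template = "${file("${path.module}/policy.json.tpl")}"

  vars {
    bucket_arn = "arn:aws:s3:::my-bucket"
  }
}

resource "aws_s3_bucket_policy" "my_bucket" {
  bucket = "my-bucket"
  policy = "${data.template_file.bucket_policy.rendered}"
}

Keeping the policy in a separate resource doesn't by itself avoid the perpetual diff; the rendered JSON still has to match what AWS stores, so the string-versus-array and full-ARN advice above still applies.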
I am importing the buckets into the state file using
Then I create resources based on the state file. When I run the plan I expect an empty plan, because the resources I generated from the state file are the same, with the same configuration, but the policy always seems to change even though it is identical in the resource file and in the state file; the plan still shows that it is going to change the policy. One more thing it does: it marks the buckets with attached policies for destruction, which is strange. If I have 13 buckets and 6 of them have policies, the plan shows that 13 resources would be changed and 6 would be destroyed; those 6 appear both in the changed section and in the marked-for-destroy section. |
I'm encountering this problem as well with Terraform 0.7.13
Can this issue be re-opened, or should a new issue be created? |
@onoffleftright I am running |
This was happening to me with a bucket policy whose Principal was set to the regional ELB account ID to enable ALB access logs in eu-west-1. The AWS API accepts that, but turns it into the corresponding arn:aws:iam::<account-id>:root ARN, and that's what's saved against the bucket, so Terraform will always show a change. |
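For illustration (the bucket name, prefix, and account ID below are placeholders, not values from the comment above): writing the policy with the full root ARN of the regional ELB/ALB log-delivery account from the start keeps the stored policy identical to what Terraform sends, for example:

resource "aws_s3_bucket" "alb_logs" {
  bucket = "my-alb-logs"

  # Principal is the full root ARN of the region-specific ELB log-delivery
  # account, not the bare account ID, so AWS stores exactly what we send.
  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:root"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-alb-logs/AWSLogs/*"
    }
  ]
}
EOF
}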
If I change my ELB bucket policy to what @jurajseffer suggests, terraform no longer updates the policy on every run. |
Confirming that using the ARN instead of the account ID for an AWS Principal fixed my issues (I, too, was setting up an S3 policy for ELB logs). |
Can confirm that ARN instead of account ID worked for me as well. |
I faced a similar issue and realized the sample code in the document below is the right way to fix it. If you are working with CloudFront, you should read this:
|
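Since the link above didn't survive, here is a rough sketch of the common CloudFront pattern of referencing the origin access identity's IAM ARN instead of hand-writing the principal; the resource names and bucket are placeholders:

resource "aws_s3_bucket" "my_bucket" {
  bucket = "my-bucket"
}

resource "aws_cloudfront_origin_access_identity" "oai" {
  comment = "read access to my-bucket"
}

data "aws_iam_policy_document" "cdn_read" {
  statement {
    actions   = ["s3:GetObject"]
    resources = ["${aws_s3_bucket.my_bucket.arn}/*"]

    principals {
      type = "AWS"
      # iam_arn is the pre-generated ARN the provider exposes for use in S3 bucket policies.
      identifiers = ["${aws_cloudfront_origin_access_identity.oai.iam_arn}"]
    }
  }
}

resource "aws_s3_bucket_policy" "cdn_read" {
  bucket = "${aws_s3_bucket.my_bucket.id}"
  policy = "${data.aws_iam_policy_document.cdn_read.json}"
}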
I've had a similar problem with a valid policy in Terraform |
Due to a Terraform bug (hashicorp/terraform#4948), if Action is set to a single-element array, the S3 policy always reports that it needs to be modified in place. The workaround is to use a string.
Due to a Terraform bug (hashicorp/terraform#4948), if Principal is set to an account id, Terraform always reports the resource as if it needs changing (because the AWS API accepts the value, but turns it into an ARN).
I was having this problem with Terraform 0.11.13, and none of the suggestions above helped. Turns out my aws_iam_policy_document contained a depends_on. |
@jvelonis I'm having the same problem. Do you know why adding depends_on to an aws_iam_policy_document would introduce this problem? |
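For illustration only (the resource names below are hypothetical): a plausible explanation, consistent with how Terraform documents depends_on on data sources, is that it forces the data source's read to be deferred until apply, so the rendered JSON is treated as computed on every plan and the dependent bucket policy shows a diff each time.

resource "aws_s3_bucket" "my_bucket" {
  bucket = "my-bucket"
}

# Pattern reported to cause perpetual diffs (hypothetical names):
data "aws_iam_policy_document" "bucket" {
  # With depends_on present, this data source is read during apply rather
  # than during refresh, so its "json" output is unknown at plan time.
  depends_on = ["aws_s3_bucket.my_bucket"]

  statement {
    actions   = ["s3:GetObject"]
    resources = ["${aws_s3_bucket.my_bucket.arn}/*"]

    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::123456789012:root"]
    }
  }
}

# Removing depends_on and relying on the implicit dependency created by the
# aws_s3_bucket.my_bucket.arn interpolation avoids the deferred read.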
Using terraform 0.12 and having the issue with multiple Principals. Update: After doing |
Every time I run plan and apply, Terraform says the bucket policy has changed and needs to be applied, but it's the same policy over and over again.