Unable to add additional iam roles to cluster #155
Comments
The screenshot you provided looks like what would happen if you change `kubernetes_config_map_ignore_role_changes`.
Thanks for your answer. My concern is that if I leave it set to false, the worker roles will disappear, even though they are defined in terraform-aws-eks-cluster/auth.tf (line 148 in dc373e0).
This is why I'm more confused. I would love to manage all the authentication with Terraform, but without breaking the regular configuration.
I kind of agree this is a bug because of:
So in both cases you lose something. I have gotten around this by first reading the config map (if it exists) and merging it with the current settings. That way Terraform will manage only the roles set up in vars and will not touch any other manually added roles. Example:
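A rough sketch of that read-and-merge approach might look like the following. All names here (data source, locals, the shape of `var.map_additional_iam_roles`) are assumptions for illustration, not the module's actual internals:

```hcl
# Read the existing aws-auth ConfigMap, which may contain manually
# added roles or roles written by EKS itself.
data "kubernetes_config_map" "existing_aws_auth" {
  metadata {
    name      = "aws-auth"
    namespace = "kube-system"
  }
}

locals {
  # Roles already present in the ConfigMap.
  existing_map_roles = yamldecode(
    lookup(data.kubernetes_config_map.existing_aws_auth.data, "mapRoles", "[]")
  )

  # Merge keyed by role ARN so Terraform-managed entries win on conflict,
  # while manually added roles are preserved.
  merged_map_roles = values(merge(
    { for r in local.existing_map_roles : r.rolearn => r },
    { for r in var.map_additional_iam_roles : r.rolearn => r },
  ))
}

resource "kubernetes_config_map" "aws_auth" {
  metadata {
    name      = "aws-auth"
    namespace = "kube-system"
  }

  data = {
    mapRoles = yamlencode(local.merged_map_roles)
  }
}
```

One consequence of this design, as noted below, is that roles removed from the vars are not removed from the ConfigMap, since the merge re-reads them as "existing" entries.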
Also, I don't recommend using the new …
@sebastianmacarescu I think that's an amazing option! The only issue I can see is with deleting roles: since they'd already be in the configmap, removing them from the … would not remove them from the cluster. I definitely prefer having to remove them manually while being able to manage them with Terraform, over having to add them by hand.
If you look at the Cloud Posse EKS Terraform component (think of it, at this point in time, as a work in progress), you will see that the way we handle this is to output the managed role ARNs and then read them back to include them in future auth maps. The component uses the Cloud Posse YAML stack config module to read the state, but you can use the …
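The output-then-read-back pattern could be sketched roughly like this. Everything below (output name, backend settings, bucket/key values) is a placeholder assumption, not the actual Cloud Posse component:

```hcl
# Expose the role ARNs this configuration manages, so they can be
# read back later and re-included in the auth map.
output "managed_map_roles" {
  value = var.map_additional_iam_roles
}

# In a later apply (or another root module), read the previously
# managed roles back from state.
data "terraform_remote_state" "eks" {
  backend = "s3"
  config = {
    bucket = "my-tfstate-bucket"     # placeholder
    key    = "eks/terraform.tfstate" # placeholder
    region = "us-east-1"             # placeholder
  }
}

locals {
  # Fall back to an empty list on the first apply, before the
  # output exists in state.
  previously_managed_roles = try(
    data.terraform_remote_state.eks.outputs.managed_map_roles,
    []
  )
}
```

The advantage over reading the live ConfigMap is that only roles Terraform itself once managed are carried forward, so deletions from the vars eventually converge.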
Can we get a review on the work done? It would be really great to have this merged.
What is the issue with moving to …
Found a bug? Maybe our Slack Community can help.
Describe the Bug
When using the module to create an EKS cluster, I'm trying to add additional roles to the `aws_auth` configmap. This only happens with the roles; adding additional users works perfectly. The behavior changes depending on the `kubernetes_config_map_ignore_role_changes` config.
Expected Behavior
When adding `map_additional_iam_roles`, those roles should appear in the `aws_auth` configmap, together with the worker roles, when `kubernetes_config_map_ignore_role_changes` is set to `false`.
Steps to Reproduce
Steps to reproduce the behavior:
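(The original steps were not captured. A minimal configuration exercising the two variables discussed above might look like the following; the role ARN, username, groups, and version pin are placeholder assumptions.)

```hcl
module "eks_cluster" {
  source = "cloudposse/eks-cluster/aws"
  # version pin omitted; use the release you are testing against

  # Additional roles that should appear in the aws-auth configmap.
  map_additional_iam_roles = [
    {
      rolearn  = "arn:aws:iam::111111111111:role/example-role" # placeholder
      username = "example-role"
      groups   = ["system:masters"]
    }
  ]

  # Toggling this between false and true changes the reported behavior.
  kubernetes_config_map_ignore_role_changes = false
}
```

Applying with `kubernetes_config_map_ignore_role_changes = false` and inspecting the `aws-auth` ConfigMap (`kubectl -n kube-system get configmap aws-auth -o yaml`) should then show whether the additional roles and the worker roles coexist.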
Screenshots
This example shows what happens when I change the variable `kubernetes_config_map_ignore_role_changes` from `false` to `true`. Also, I don't see a difference in the map roles in the data block for both options, except for the quoting.
Environment (please complete the following information):
Anything that will help us triage the bug will help.