Improve output for topic changes on dryRun and apply #295
Hi @purbon, I hope you are all good! There is already work in progress on parts of this, so if possible it would be nice to get some feedback on what you think :-) |
@purbon, it would be very helpful if you replied to this. We have several quite large changes in our local pipeline now, to support Aksel's suggestions. |
Hi,
sorry guys ... the last weeks have really been super busy. I propose one thing:
do you want to do a 1h call this week? We could probably arrange and close
everything on that call. What do you think?
|
@purbon, no problem. I fully understand. Yes, a call to discuss and agree on the changes would be good. Shall we agree on time and channel via Gitter? |
yes, please.
|
👍 Gitter message sent. |
Hi @purbon, when can we have the call? |
Any update on this issue? I think it would be extremely helpful to have this feature. |
I have not gotten any feedback from @purbon, but we have started work on a PR. Right now we have entered the summer holidays; we will pick it up again in August. 🌞😎 |
Hi @purbon, hope summer has been good! Still on summer vacation? |
There have been multiple other issues (#220, #102, #109) on this subject, and also a PR that got closed. This issue proposes a solution for the most pressing issues, including the need for a refactor of the related code.
Is your feature request related to a problem? Please describe.
Functional issues with the current solution
Currently JulieOps only logs planned changes during a dryRun execution. For topics, the logging also has the following issues:
For bindings (ACLs/RBAC etc.) the logging is correct, as the actual new/changed ACLs are updated and logged. However, there is also a corresponding issue: currently there is no logging of the updates done on schemas.
In addition, the apply phase does not log the changes that have actually been performed, only a listing of all topics and all ACLs in the cluster.
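To make the functional gap concrete, here is a minimal sketch (all class and method names are my own assumptions, not the actual JulieOps API) of what per-change logging could look like: each topic change carries a one-line, human-readable summary, and the same summaries are logged as planned during dryRun and as performed after apply, instead of dumping the whole cluster state.

```java
import java.util.List;

// Hypothetical sketch: names like TopicChange and describe() are assumptions,
// not JulieOps classes. The idea is that every change is summarized
// individually, so both dryRun and apply show exactly what differs.
public class ChangeLogExample {

    // A minimal description of one planned or performed topic change.
    record TopicChange(String topic, String attribute, String before, String after) {
        String summary() {
            return String.format("Topic %s: %s changed from %s to %s",
                    topic, attribute, before, after);
        }
    }

    // dryRun logs the summaries as planned; apply logs the very same
    // summaries as performed, so the two phases stay consistent.
    static List<String> describe(List<TopicChange> changes, boolean dryRun) {
        String prefix = dryRun ? "[PLAN] " : "[DONE] ";
        return changes.stream().map(c -> prefix + c.summary()).toList();
    }

    public static void main(String[] args) {
        var changes = List.of(
                new TopicChange("orders", "retention.ms", "604800000", "86400000"));
        describe(changes, true).forEach(System.out::println);
    }
}
```

Running the sketch prints a single planned-change line rather than a listing of every topic in the cluster.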
Technical issues with the current solution
TopicManager.apply indicates that the manager will apply the supplied ExecutionPlan, but instead actions are added to the plan.
Parts of the code use BaseAction although they are not actions; they only create the bindings that will later be used by actions.
Describe the solution you'd like
Proposal for functional changes
Example of new logs for topic and schema changes:
Proposal for technical changes
Some refactoring is required to support the functional improvements:
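One possible shape for that refactoring, sketched below with hypothetical names (Action, ExecutionPlan, CreateTopicAction and their methods are my assumptions, not the real JulieOps classes): managers only build actions, and each action can both execute a change and describe it, so the plan itself can log precisely what it would do on dryRun and what it did on apply.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical refactoring sketch: an action both performs a change and
// produces the one-line summary used for dryRun and apply logging.
interface Action {
    void execute();    // perform the change against the cluster
    String describe(); // human-readable summary of the change
}

class ExecutionPlan {
    private final List<Action> actions = new ArrayList<>();
    final List<String> log = new ArrayList<>();

    void add(Action action) { actions.add(action); }

    // dryRun only describes; a real run executes and then logs the same text,
    // so the apply phase reports the changes actually performed.
    void run(boolean dryRun) {
        for (Action a : actions) {
            if (!dryRun) a.execute();
            log.add((dryRun ? "[PLAN] " : "[DONE] ") + a.describe());
        }
    }
}

class CreateTopicAction implements Action {
    private final String topic;
    CreateTopicAction(String topic) { this.topic = topic; }
    public void execute() { /* would call the Kafka admin client here */ }
    public String describe() { return "Create topic " + topic; }
}
```

With this split, a manager's role shrinks to building actions and adding them to the plan, which also resolves the naming mismatch around TopicManager.apply.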