[Dataset Quality] Provide field limit mitigation option with rollover #190330
Pinging @elastic/obs-ux-logs-team (Team:obs-ux-logs)
cc: @gbamparop @LucaWintergerst @flash1293 As discussed during the weekly sync, updating the discussion here.
The text I used, for which I need copy, is:
An alternate solution would be to display to the user the name of the custom pipeline they should create, with a copy icon next to it. Text must be presented telling them that they can copy the pipeline name and create a custom pipeline to handle this situation.
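A minimal sketch of what the user would do after copying the suggested name (the pipeline name `logs-nginx.access@custom` and the `remove` processor here are illustrative assumptions, not part of the proposed copy):

```
PUT _ingest/pipeline/logs-nginx.access@custom
{
  "description": "Custom pipeline to handle the ingestion issue",
  "processors": [
    {
      "remove": {
        "field": "cloud.project.id",
        "ignore_missing": true
      }
    }
  ]
}
```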
## Summary As part of this [issue](#190330), we need to add link to the Elasticsearch Mapping Settings Limit page here - https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-settings-limit.html Not to mix this change with my original PR, creating a separate PR here
(Backport copy of the above summary; cherry picked from commit 7d667f3.)
# Backport (#194178)

This will backport the following commits from `main` to `8.x`:

- [[Dataset Quality] Add new doc link for mapping limits (#194156)](#194156)

### Questions?

Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Achyut Jhunjhunwala <achyut.jhunjhunwala@elastic.co>
How about: "Changes not applied to new data. The component template was successfully updated with the new field limit, but the changes were not applied to the most recent backing index."
Do we have a design for this one? Not sure exactly what it will look like and what text we will need here.
@mdbirnstiehl Thanks for the copy. I would need one more piece of copy from you based on the discussion we had today. When a user fixes a field limit issue, we apply the fix on the current backing index as well. This means that the issue still exists, and now when the user opens the popup, they would see our fallback. Hence, in the call today @flash1293 had a great suggestion that we can display text similar to this when
@patpscal Can you help me and Mike here with a change in design as well? Previously we were displaying this:
@achyutjhunjhunwala Will this only apply to field limit changes? Can we be specific, or do we want to avoid mentioning the specific settings? How about something like this: If you've recently updated your
This makes more sense
@mdbirnstiehl As Patri is on PTO, I came up with a rough design to explain what I meant. For non-integrations, we would recommend a very generic custom pipeline based on
For integrations, we would recommend an explicit custom pipeline name, which is
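For reference, a sketch of the two naming conventions being contrasted above (based on the documented `@custom` ingest pipeline convention; the concrete names shown are assumptions, since the originals were cut off):

```
// Non-integration data streams: one generic custom pipeline, e.g.
PUT _ingest/pipeline/logs@custom
{
  "processors": []
}

// Integration data streams: a dataset-specific custom pipeline, e.g.
PUT _ingest/pipeline/logs-nginx.access@custom
{
  "processors": []
}
```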
@achyutjhunjhunwala would the user be taken to the page to create a pipeline when they copy the name or would there be another link for them to follow? In this scenario, would users always be creating the custom pipeline? If they are always creating the pipeline, it would make sense to make the title "Add custom ingest pipeline" and remove the edit. Do we want to give a little summary of what/why the user would add a custom pipeline? I'm not sure that would be obvious to the user. We could possibly organize it like this: Add a custom ingest pipeline to...(I'm not sure of the mechanics of it, but we could give a reason here as to why they need to create the pipeline):
@mdbirnstiehl Why does the user need to modify a pipeline? At the moment we are only offering a fix-it flow for field limit issues, and only for integrations. Other issues, like the character limit being reached or malformed values, still need to be fixed manually by the user. Editing or adding a pipeline processor is also one of the ways to do that.
@patpscal As you are back, can you help me with the UX for this pipeline panel? I just roughly built this after @LucaWintergerst's comment.
How hard would it be for us to check if the pipeline already exists?
If it exists, we display
If it does not exist, we do
This would make for a cleaner UX, as we'd have a nicer solution in the case that it's there already. If we can't easily check for the presence of the pipeline, I'd suggest
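The presence check itself is a single API call (the pipeline name here is illustrative): a GET on the pipeline returns its definition when it exists and a 404 otherwise.

```
GET _ingest/pipeline/logs-nginx.access@custom
```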
@LucaWintergerst In favour of this issue - #183992. The reason being that once that ticket is implemented, we can always point to that URL and Management will automatically take care of opening the edit or create page.
Ok, that makes sense to me
It seems the original design did not account for the intermediate "copy" step, so it links directly to the create pipeline page, hence the external link icon. We can disregard the link for this card title and use an accordion instead for the content, as in the Increase field mapping limit card. |
## Summary

Closes - #190330

This PR implements the logic to support:

- One-click increasing of the field limit for field limit issues (applicable only for integrations). For non-integrations, only text is displayed explaining how to do it manually.
- The one-click increase updates the linked custom component template as well as the last backing index.
- If the last backing index update fails for any reason, the user is given an option to trigger a rollover manually.

## Demo

Not possible, too many things to display 😆

## What's pending?

Tests

- [x] API tests
  - [x] Settings API
  - [x] Rollover API
  - [x] Apply new limit API
- [x] FTR tests
  - [x] Displaying of various issues for integrations and non-integrations
  - [x] Fix-it flow good case, without rollover
  - [x] Fix-it flow good case, with rollover
  - [x] Manual mitigation - clicking on the component template should navigate to the proper logic based on integration / non-integration
  - [x] Manual mitigation - ingest pipeline
  - [x] Link to the official documentation

## How to set up a local environment

We will be setting up 2 different data streams, one with an integration and one without. Please follow the steps in the exact order.

1. Start local ES and local Kibana.
2. Install the Nginx integration first.
3. Ingest data as per the script here - https://gist.github.com/achyutjhunjhunwala/03ea29190c6594544f584d2f0efa71e5
4. Set the limit for the 2 datasets:

   ```
   PUT logs-synth.3-default/_settings
   {
     "mapping.total_fields.limit": 36
   }

   // Set the limit for Nginx
   PUT logs-nginx.access-default/_settings
   {
     "mapping.total_fields.limit": 52
   }
   ```

5. Now uncomment line number 59 from the synthtrace script to enable the cloud.project.id field and run the scenario again.
6. Do a rollover:

   ```
   POST logs-synth.3-default/_rollover
   POST logs-nginx.access-default/_rollover
   ```

7. Get the last backing index for both datasets:

   ```
   GET _data_stream/logs-synth.3-default
   GET _data_stream/logs-nginx.access-default
   ```

8. Increase the limit by 1, but for the last backing index:

   ```
   PUT .ds-logs-synth.3-default-2024.10.10-000002/_settings
   {
     "mapping.total_fields.limit": 37
   }

   PUT .ds-logs-nginx.access-default-2024.10.10-000002/_settings
   {
     "mapping.total_fields.limit": 53
   }
   ```

9. Run the same Synthtrace scenario again.

This setup will give you 3 fields for testing:

1. cloud.availability_zone - shows the character limit issue
2. cloud.project - shows an obsolete error that happened in the past and no longer exists due to the field limit
3. cloud.project.id - a current field limit issue

---------

Co-authored-by: Marco Antonio Ghiani <marcoantonio.ghiani01@gmail.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
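The fix-it flow described in the summary can be sketched with requests like the following (a sketch only; the component template name `logs-nginx.access@custom`, the index name, and the limit value are assumptions taken from the setup steps above, not the PR's actual implementation):

```
// 1. Raise the limit in the linked custom component template
PUT _component_template/logs-nginx.access@custom
{
  "template": {
    "settings": {
      "mapping.total_fields.limit": 53
    }
  }
}

// 2. Apply the same limit to the last backing index so it takes
//    effect without waiting for a rollover
PUT .ds-logs-nginx.access-default-2024.10.10-000002/_settings
{
  "mapping.total_fields.limit": 53
}

// 3. If step 2 fails for any reason, offer the user a manual rollover
POST logs-nginx.access-default/_rollover
```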
(Backport copy of the above summary; cherry picked from commit 3ece950, with conflicts in `x-pack/test_serverless/functional/test_suites/observability/dataset_quality/degraded_field_flyout.ts`.)
📓 Summary
Add UI to allow updating the field limit.
Things to consider
Acceptance Criteria
🎨 Design
Figma