DELETE Statement Deleting Another Record #11212
Comments
@Amar1404 Can you please try 0.14.1? This was fixed there. I also tried the code below to demonstrate it:
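(The exact snippet from this comment is not preserved in this transcript. The following is a minimal sketch of such a demonstration; the table name, columns, and values are hypothetical, and the config used to allow duplicate record keys can differ across Hudi versions.)

```python
# Hypothetical demo of the fixed behavior on Hudi 0.14.1.
# Assumes `spark` is an active SparkSession with the Hudi bundle on the classpath.
spark.sql("""
  CREATE TABLE hudi_dup_demo (id INT, name STRING, ts BIGINT)
  USING hudi
  TBLPROPERTIES (primaryKey = 'id', preCombineField = 'ts')
""")

# Allow two rows to share the same record key (config name differs across versions).
spark.sql("SET hoodie.sql.insert.mode = non-strict")
spark.sql("INSERT INTO hudi_dup_demo VALUES (1, 'first', 100)")
spark.sql("INSERT INTO hudi_dup_demo VALUES (1, 'second', 200)")

# Delete using a non-key filter; on 0.14.1 only the matching row is expected to be removed.
spark.sql("DELETE FROM hudi_dup_demo WHERE name = 'second'")
spark.sql("SELECT id, name FROM hudi_dup_demo").show()
```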
Can you please check the above and let us know?
@ad1happy2go - Is there any other way to do this on Hudi 0.12.3? For example, I am trying to use the config hoodie.combine.before.delete set to false. Is there any other config that would help?
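(A hedged sketch of how that flag could be tried on 0.12.3; the table name and filter are placeholders, and whether a session-level SET is honored can depend on the Hudi version.)

```python
# Hypothetical attempt: disable pre-delete combining before running the DELETE.
# As the reply below explains, 0.12 applies deletes by record key, so this
# alone may not keep one of the duplicate rows.
spark.sql("SET hoodie.combine.before.delete = false")
spark.sql("DELETE FROM my_table WHERE name = 'second'")
```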
@ad1happy2go - Do you know of any other way to delete the duplicated records from the Hudi table without rewriting the whole table?
@Amar1404 With 0.12, deletes are always applied based on the record key. That is why both of those records are getting removed.
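(The exact workaround that resolved this is not captured in the thread. For readers on 0.12.x, one common pattern, shown here only as a hypothetical sketch with placeholder path, table, and field names, is to deduplicate just the affected partition and write it back with insert_overwrite, which replaces only the partitions present in the incoming data instead of rewriting the whole table.)

```python
# Hypothetical partition-level dedup on Hudi 0.12.x.
# Assumes `spark` is an active SparkSession with the Hudi bundle available.
from pyspark.sql import functions as F, Window

base_path = "s3://my-bucket/hudi/orders"   # placeholder table path

# Read only the partition that contains the duplicated key.
df = (spark.read.format("hudi").load(base_path)
      .where(F.col("dt") == "2024-05-01"))   # placeholder partition filter

# Keep the newest row per record key, ordered by the precombine field.
w = Window.partitionBy("order_id").orderBy(F.col("ts").desc())
deduped = (df.withColumn("_rn", F.row_number().over(w))
             .where(F.col("_rn") == 1)
             .drop("_rn", "_hoodie_commit_time", "_hoodie_commit_seqno",
                   "_hoodie_record_key", "_hoodie_partition_path", "_hoodie_file_name"))

# insert_overwrite replaces only the partitions present in `deduped`.
(deduped.write.format("hudi")
    .option("hoodie.table.name", "orders")
    .option("hoodie.datasource.write.recordkey.field", "order_id")
    .option("hoodie.datasource.write.partitionpath.field", "dt")
    .option("hoodie.datasource.write.precombine.field", "ts")
    .option("hoodie.datasource.write.operation", "insert_overwrite")
    .mode("append")
    .save(base_path))
```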
@Amar1404 Did the approach work? Do you need any other help here?
@ad1happy2go - That approach worked, thanks.
Tips before filing an issue
Have you gone through our FAQs?
Join the mailing list to engage in conversations and get faster support at dev-subscribe@hudi.apache.org.
If you have triaged this as a bug, then file an issue directly.
Describe the problem you faced
I have duplicate keys in a Hudi table because of INSERT statements. When I tried to delete one of those records using a filter on a different (non-key) column, both rows with that key were deleted.
To Reproduce
Steps to reproduce the behavior:
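(The original report did not list concrete steps. A plausible reproduction on 0.12.3, mirroring the demonstration in the comments above but showing the 0.12.3 outcome; the table name, columns, and values are hypothetical.)

```python
# Hypothetical reproduction on Hudi 0.12.3 / Spark 3.3.
spark.sql("""
  CREATE TABLE dup_repro (id INT, name STRING, ts BIGINT)
  USING hudi
  TBLPROPERTIES (primaryKey = 'id', preCombineField = 'ts')
""")

# Non-strict insert mode lets two rows share the record key id = 1.
spark.sql("SET hoodie.sql.insert.mode = non-strict")
spark.sql("INSERT INTO dup_repro VALUES (1, 'first', 100)")
spark.sql("INSERT INTO dup_repro VALUES (1, 'second', 200)")

# Filter on a non-key column: on 0.12.3 both rows with id = 1 are removed,
# because the delete is applied by record key.
spark.sql("DELETE FROM dup_repro WHERE name = 'second'")
spark.sql("SELECT id, name FROM dup_repro").show()
```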
Expected behavior
The DELETE statement should remove only the row that matches the filter condition.
Environment Description
Hudi version : 0.12.3
Spark version : 3.3
Hive version : 3
Hadoop version :
Storage (HDFS/S3/GCS..) : s3
Running on Docker? (yes/no) : no