Bug in Spark Decimal type to Exasol Decimal type conversion #44
Comments
@morazow Can I take this ticket, or is somebody already working on it? I think I am familiar with this area of the code.
Hey @jpizagno, good to hear from you! Yes sure, please feel free to work on this task. Thanks a lot! Only to make sure that we are on the same page, the Exasol decimal type mapping should be changed here. The Spark decimal type …
@morazow I think we also have to think about defaults: DecimalType.SYSTEM_DEFAULT, DecimalType.USER_DEFAULT, DecimalType.MAX_PRECISION, etc.
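For reference, a small illustrative sketch that prints the Spark defaults mentioned above (the values come from org.apache.spark.sql.types.DecimalType; the object name DecimalDefaults is made up for this example):

```scala
import org.apache.spark.sql.types.DecimalType

// Illustrative sketch only: prints the Spark decimal defaults referenced above.
object DecimalDefaults extends App {
  // SYSTEM_DEFAULT is DecimalType(38, 18) in Spark
  println(s"SYSTEM_DEFAULT = (${DecimalType.SYSTEM_DEFAULT.precision}, ${DecimalType.SYSTEM_DEFAULT.scale})")
  // USER_DEFAULT is DecimalType(10, 0) in Spark
  println(s"USER_DEFAULT   = (${DecimalType.USER_DEFAULT.precision}, ${DecimalType.USER_DEFAULT.scale})")
  // MAX_PRECISION is 38 in Spark
  println(s"MAX_PRECISION  = ${DecimalType.MAX_PRECISION}")
}
```

As far as I know, an Exasol DECIMAL supports a precision of at most 36, so a mapping that simply forwards Spark's maximum precision of 38 would also need to cap it.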
Hey @jpizagno, this looks great! Please send a pull request. I can also suggest some changes before the PR. Could you please use the …, and similar for the scale value also, on these lines. Thanks for working on this! Please send a pull request once you are finished.
Fixed in #47. |
Situation
Currently, any Spark decimal type, even one with a different precision and scale, is mapped to the default Exasol Decimal type, which is Decimal(18, 0).
Solution
The Spark Decimal type should be correctly mapped to the Exasol decimal type with the exact precision and scale values.
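A minimal sketch of the intended mapping, assuming the connector renders Spark data types as Exasol type strings; the names DecimalMapping and sparkDecimalToExasolType are hypothetical and not the connector's actual API:

```scala
import org.apache.spark.sql.types.{DataType, DecimalType}

// Hypothetical sketch: forward the precision and scale of a Spark DecimalType
// instead of always falling back to the Exasol default DECIMAL(18,0).
object DecimalMapping {
  def sparkDecimalToExasolType(dataType: DataType): String = dataType match {
    case decimal: DecimalType => s"DECIMAL(${decimal.precision},${decimal.scale})"
    case other                => throw new IllegalArgumentException(s"Not a decimal type: $other")
  }
}
```

With this, DecimalType(9, 2) would map to DECIMAL(9,2) rather than the default DECIMAL(18,0).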
Test code
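A minimal, illustrative check of the expected behaviour, reusing the hypothetical DecimalMapping sketch above (this is not the issue's original test code):

```scala
import org.apache.spark.sql.types.DecimalType

// Illustrative check: the mapping should keep the exact precision and scale
// instead of collapsing everything to DECIMAL(18,0).
object DecimalMappingCheck extends App {
  val mapped = DecimalMapping.sparkDecimalToExasolType(DecimalType(9, 2))
  assert(mapped == "DECIMAL(9,2)", s"Expected DECIMAL(9,2) but got $mapped")
  println(s"DecimalType(9, 2) -> $mapped")
}
```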