[Coral-Spark] Modify coral hive parser and coral spark writer dialect to generate spark compliant escaped string literal #473
What changes are proposed in this pull request, and why are they necessary?
Without this patch, a string literal such as `select 'I'm'` is rewritten in the Spark SQL output as `select 'I''m'`. The doubled single quote (`''`) is the ANSI SQL escaping standard, but it is not recognized by the Spark SQL parser. Spark instead uses a backslash to escape a quote inside a quoted SQL string, so the correct output is `select 'I\'m'`.

This patch adds customizations to the `SparkSqlDialect` so that the generated SQL consistently uses single quotes for the outermost level of a string literal and uses `\'` to escape internal occurrences of `'`. A `"` appears as-is, since it never needs to be escaped when the outermost quote is always a single quote.

How was this patch tested?
Added 6 unit tests to demonstrate the new behavior.
Ran regression tests against all views and the results look fine. Also tested against the view SQL that exposed this issue on Spark; the literals read back by Spark are now correct.
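The escaping rule described above can be sketched as follows. This is an illustrative standalone example, not Coral's actual implementation; the class and method names are hypothetical, and the handling of backslashes is an assumption about what a Spark-compatible quoter would also need.

```java
// Illustrative sketch of Spark-style string literal escaping; the class and
// method names here are hypothetical, not Coral's actual code.
public class SparkStringLiteralDemo {

    // Wrap a raw string value in single quotes, escaping embedded single
    // quotes (and, as an assumption, backslashes) with a backslash,
    // which is the form the Spark SQL parser accepts.
    static String quoteSparkStringLiteral(String raw) {
        StringBuilder sb = new StringBuilder("'");
        for (char c : raw.toCharArray()) {
            if (c == '\'' || c == '\\') {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.append('\'').toString();
    }

    public static void main(String[] args) {
        // ANSI form would be 'I''m'; the Spark form uses a backslash instead.
        System.out.println(quoteSparkStringLiteral("I'm"));        // prints 'I\'m'
        // Double quotes pass through unescaped: the outer quote is always single.
        System.out.println(quoteSparkStringLiteral("say \"hi\"")); // prints 'say "hi"'
    }
}
```

Note that because the outer quote is fixed to `'`, an embedded `"` never conflicts with the delimiter and can be emitted verbatim, matching the behavior described in the patch.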