Describe the bug
Spark Expectations fails on Databricks Runtime 14 and above with the following error:
SparkExpectationsMiscException: Input rules_df is not of dataframe type
To Reproduce
Steps to reproduce the behavior:
Run any Spark Expectations job on Databricks Runtime 14 or above.
Expected behavior
The Spark Expectations job should not fail on Databricks Runtime 14 and above.
Screenshots
If applicable, add screenshots to help explain your problem.
Desktop (please complete the following information):
OS: iOS
Browser: chrome
Version
Additional context
In some cases Databricks returns a DataFrame of type pyspark.sql.connect.dataframe.DataFrame instead of pyspark.sql.DataFrame. The current version of Spark Expectations strictly expects a pyspark.sql.DataFrame as input, which causes this error.
All operations that can be performed on a pyspark.sql.DataFrame can also be performed on a pyspark.sql.connect.dataframe.DataFrame.
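For reference, a type check along these lines would accept both DataFrame implementations. This is a minimal sketch under the assumption that the fix is to broaden the isinstance check; the helper name and import guard are illustrative, not the actual Spark Expectations code:

```python
from pyspark.sql import DataFrame as SparkDataFrame

# Classic and Spark Connect DataFrames are different classes; accept either.
_dataframe_types = [SparkDataFrame]
try:
    # The Spark Connect DataFrame class is only available in newer PySpark
    # versions (>= 3.4), so guard the import.
    from pyspark.sql.connect.dataframe import DataFrame as ConnectDataFrame
    _dataframe_types.append(ConnectDataFrame)
except ImportError:
    pass


def _is_spark_dataframe(obj) -> bool:
    """Return True if obj is a classic or Spark Connect DataFrame."""
    return isinstance(obj, tuple(_dataframe_types))
```

With a check like this, rules_df coming from a Databricks Runtime 14+ cluster (which uses Spark Connect) would pass validation the same way a classic pyspark.sql.DataFrame does.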