Hi Amazing Developers,
I came across an "is not a Delta table" error when trying to read one of my Delta tables in Databricks. I just wish to understand what causes this problem with respect to Delta.
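For context, the read that triggers the error is roughly the following (a simplified sketch, not our exact pipeline code; database_name and table_name are placeholders):

# Runs in a Databricks notebook, where `spark` is already defined.
# database_name / table_name are placeholders for the real names.
df = spark.table("database_name.table_name")   # raises the AnalysisException shown below
df.display()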
Error:
An error occurred while fetching the table: table_name
com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException: dbfs:/user/hive/warehouse/database_name.db/table_name doesn't exist;;
com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException: `database_name`.`table_name` is not a Delta table.
Scenario:
The tables receive updates hourly, and specific pipelines take these tables as input to perform some aggregations. A pipeline will run for 2 to 3 months without breaking, and then suddenly the above error appears. Once the error hits, almost all of our Delta tables fail, even ones that have not been used for a long time.
This only hits our development environment; production still runs healthy.
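Roughly, the hourly update and the downstream aggregation look like this (a simplified sketch under my own assumptions; table names, columns, source paths, and the write mode are placeholders rather than our exact jobs):

# Simplified sketch of the hourly flow (Databricks notebook, `spark` predefined).
# All names and paths are placeholders for the real jobs.

# Hourly job: append the latest batch of records to the managed Delta table.
incoming = spark.read.format("json").load("dbfs:/mnt/raw/events/latest/")
(incoming.write
    .format("delta")
    .mode("append")
    .saveAsTable("database_name.table_name"))

# Downstream pipeline: read the same table and aggregate.
agg = (spark.table("database_name.table_name")
       .groupBy("event_type")
       .count())
agg.write.format("delta").mode("overwrite").saveAsTable("database_name.table_name_agg")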
Delta settings:
We are using the default settings in all environments.
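If it helps with the diagnosis, I can run something like the check below against the path from the error message and share the output (a sketch; the path is copied from the error above, and `dbutils`/`spark` are the ones predefined in a Databricks notebook):

# Check whether the table's warehouse location still contains a _delta_log folder.
path = "dbfs:/user/hive/warehouse/database_name.db/table_name"
files = [f.name for f in dbutils.fs.ls(path)]
print("_delta_log present:", "_delta_log/" in files)

# What the metastore currently records for the table (provider, location, etc.).
spark.sql("DESCRIBE EXTENDED database_name.table_name").show(100, truncate=False)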
Please let me know if more info is required.
Thanks for your help.