If all the files of a Delta Lake table are intact in the file system, then you can always read the table directly using its path:
- SQL: SELECT * FROM delta.`path` (the path has to be inside backticks)
- DataFrame: spark.read.format("delta").load("path")
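Here is a minimal PySpark sketch of both approaches. The path `/data/events` is just a placeholder, and the two Delta-related configs are an assumption about how the session is set up when you are not already in a Delta-enabled environment:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("read-delta-by-path")
    # Assumed setup: these configs enable Delta Lake SQL support when the
    # delta-spark package is on the classpath; your environment may already
    # have them configured.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# DataFrame API: load the table straight from its directory.
df = spark.read.format("delta").load("/data/events")
df.show()

# SQL: note the backticks around the path inside delta.`...`.
spark.sql("SELECT * FROM delta.`/data/events`").show()
```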
All of the table's metadata (schema, properties, etc.) is present in the table's transaction log inside tableDir/_delta_log/ .
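As a quick sanity check that the log is readable, you can ask Delta to describe the table by path; this reads the schema, format, and properties from _delta_log/ without the table being registered anywhere. Continuing with the session from the sketch above (path still a placeholder):

```python
# DESCRIBE DETAIL works against a path-based Delta table and pulls its
# metadata from /data/events/_delta_log/.
spark.sql("DESCRIBE DETAIL delta.`/data/events`").show(truncate=False)

# The log directory itself contains JSON commit files (and periodic
# Parquet checkpoints), e.g. /data/events/_delta_log/00000000000000000000.json
```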
In addition, if you want to register that path as a table in the metastore, you can use the standard CREATE TABLE ... LOCATION 'path' to create an external table over that path. With that, all SQL commands using the table name should just work.
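A short sketch of that registration step, with a hypothetical table name `events` and the same placeholder path; the schema is picked up from the transaction log, so no column list is needed:

```python
# Register the existing Delta directory as an external table in the metastore.
spark.sql("""
    CREATE TABLE IF NOT EXISTS events
    USING DELTA
    LOCATION '/data/events'
""")

# After that, ordinary table-name SQL runs against the same underlying data.
spark.sql("SELECT COUNT(*) FROM events").show()
```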
TD