Tested the scenarios here as far as I understand them: spylon-kernel connects to the remote Hive metastore and reads/writes successfully once Spark SQL is configured with
spark.sql.catalogImplementation=hive, by setting the following properties when creating the cluster:
--properties hive:hive.metastore.warehouse.dir=gs://<warehouse-path>,hive:hive.metastore.uris=thrift://<remote-hive-host>:9083,spark:spark.sql.catalogImplementation=hive
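For reference, a sketch of the full cluster-creation command these properties would be passed to; the cluster name and region here are illustrative placeholders, not taken from the original setup:

```shell
# Hypothetical Dataproc cluster creation wiring Spark SQL to a remote
# Hive metastore. <warehouse-path> and <remote-hive-host> must be
# replaced with your actual GCS warehouse path and metastore host.
gcloud dataproc clusters create my-cluster \
  --region=us-central1 \
  --properties=hive:hive.metastore.warehouse.dir=gs://<warehouse-path>,hive:hive.metastore.uris=thrift://<remote-hive-host>:9083,spark:spark.sql.catalogImplementation=hive
```

The `hive:` and `spark:` prefixes route each key into the corresponding config file (hive-site.xml and spark-defaults.conf, respectively), which is why the same flag can configure both the metastore location and Spark's catalog implementation.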