Databricks cannot connect to the Hopsworks Hive metastore

When saving a dataframe from an external Databricks cluster into Hopsworks, I am getting a "database not found" error.

Code:

from pyspark.sql import functions as F

# exogenous_csv is a DataFrame previously loaded from CSV;
# parse its date column from dd/MM/yyyy strings into a proper date type
exogenous_features_df = exogenous_csv.withColumn("date", F.to_date("date", "dd/MM/yyyy"))

exogenous_fg_meta = fs.create_feature_group(
    name="exogenous_fg_new",
    version=1,
    primary_key=["store", "date"],
    description="External features that influence sales, but are not under the control of the distribution chain",
    time_travel_format=None,
    # statistics_config={"enabled": True, "histograms": True, "correlations": True})
    statistics_config=False,
)
# fs.get_feature_group("exogenous_fg_new", version=1).delete()
exogenous_fg_meta.save(exogenous_features_df)
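
For context, `fs` above is a feature store handle obtained with the hsfs client before this snippet runs. A minimal sketch of how it is typically created; the host, project name, and API key below are placeholders, not values from my setup:

from pyspark.sql import SparkSession
import hsfs

# Connect to the Hopsworks feature store from an external (Databricks) cluster.
# Host, project, and API key are placeholders for your own deployment.
connection = hsfs.connection(
    host="my.hopsworks.host",       # Hopsworks instance, placeholder
    port=443,
    project="dev",                  # project whose feature store is "dev_featurestore"
    api_key_value="MY_API_KEY",     # placeholder
    hostname_verification=False,
)
fs = connection.get_feature_store()  # default feature store of the project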

Error:

/databricks/spark/python/pyspark/sql/readwriter.py in saveAsTable(self, name, format, mode, partitionBy, **options)
1183 if format is not None:
1184 self.format(format)
→ 1185 self._jwrite.saveAsTable(name)
1186
1187 def json(self, path, mode=None, compression=None, dateFormat=None, timestampFormat=None,

/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
1302
1303 answer = self.gateway_client.send_command(command)
→ 1304 return_value = get_return_value(
1305 answer, self.gateway_client, self.target_id, self.name)
1306

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
114 # Hide where the exception came from that shows a non-Pythonic
115 # JVM exception message.
→ 116 raise converted from None
117 else:
118 raise

AnalysisException: Database 'dev_featurestore' not found

I was missing the Hive metastore information in the Spark config, which I added as below. That resolved the above issue.
spark.hadoop.hops.ssl.trustore.name /dbfs/FileStore/tables/trustStore.jks
spark.hadoop.hops.rpc.socket.factory.class.default io.hops.hadoop.shaded.org.apache.hadoop.net.HopsSSLSocketFactory
spark.hadoop.hops.ssl.hostname.verifier ALLOW_ALL
spark.hadoop.hops.ssl.keystore.name /dbfs/FileStore/tables/keyStore.jks
spark.hadoop.fs.hopsfs.impl io.hops.hopsfs.client.HopsFileSystem
spark.hadoop.hops.ssl.keystores.passwd.name /dbfs/hopsworks/dev/dev_cert.key
spark.hadoop.hops.ipc.server.ssl.enabled true
spark.master local[*]
spark.databricks.cluster.profile singleNode
spark.sql.hive.metastore.jars /hopsworks_metastore_jar/lib/

spark.hadoop.client.rpc.ssl.enabled.protocol TLSv1.2
spark.hadoop.hive.metastore.uris thrift://XXXXXXX:9083
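
With the metastore URI and certificates configured, a quick way to check the metastore connection independently of hsfs is to list the databases Spark can see; the feature store database from the error above should appear. A minimal sketch, run in a Databricks notebook:

# If the Hive metastore connection works, the Hopsworks feature store
# database (dev_featurestore in the error above) should be listed.
spark.sql("SHOW DATABASES").show(truncate=False)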

But now I am getting a new issue: "Caused by: MetaException(message: Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: Error creating the transport"

I have generated the keystore, truststore, and cert.key files from Hopsworks → Project → Settings → Export Certificates. I used the certificate information and uploaded the certificates properly. I have done this a couple of times to make sure the keystore, truststore, and cert.key files are correct, but whatever I do I get this error. Has anyone been able to connect Databricks to the community version of Hopsworks?

Py4JJavaError: An error occurred while calling o461.saveAsTable.
: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:177)
at org.apache.spark.sql.hive.client.HiveClientImpl.&lt;init&gt;(HiveClientImpl.scala:131)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.LocalHiveClient.metastore$lzycompute(LocalHiveClientImpl.scala:71)
at org.apache.spark.sql.hive.client.LocalHiveClient.metastore(LocalHiveClientImpl.scala:63)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$new$1(PoolingHiveClient.scala:106)
at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:112)
at org.apache.spark.sql.hive.client.PoolingHiveClient.&lt;init&gt;(PoolingHiveClient.scala:104)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:325)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:535)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:78)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:77)
at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:110)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:150)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:377)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:363)
at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:149)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:292)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:175)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:162)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:50)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$1(HiveSessionStateBuilder.scala:65)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:104)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:104)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:508)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:774)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:701)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
at py4j.Gateway.invoke(Gateway.java:295)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:251)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
… 43 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
… 49 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: Error creating the transport
at org.apache.thrift.transport.TSSLTransportFactory.createSSLContext(TSSLTransportFactory.java:214)
at org.apache.thrift.transport.TSSLTransportFactory.getClientSocket(TSSLTransportFactory.java:172)
at org.apache.hadoop.hive.common.auth.HiveAuthUtils.get2WayTLSClientSocket(HiveAuthUtils.java:76)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:441)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(HiveMetaStoreClient.java:258)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&lt;init&gt;(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:177)
at org.apache.spark.sql.hive.client.HiveClientImpl.&lt;init&gt;(HiveClientImpl.scala:131)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.LocalHiveClient.metastore$lzycompute(LocalHiveClientImpl.scala:71)
at org.apache.spark.sql.hive.client.LocalHiveClient.metastore(LocalHiveClientImpl.scala:63)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$new$1(PoolingHiveClient.scala:106)
at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:112)
at org.apache.spark.sql.hive.client.PoolingHiveClient.&lt;init&gt;(PoolingHiveClient.scala:104)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:325)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:535)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:78)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:77)
at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:110)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:150)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:377)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:363)
at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:149)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:292)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:175)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:162)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:50)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$1(HiveSessionStateBuilder.scala:65)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:104)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:104)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:508)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:774)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:701)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
at py4j.Gateway.invoke(Gateway.java:295)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:251)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Keystore was tampered with, or password was incorrect
at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:792)
at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:57)
at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224)
at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:71)
at java.security.KeyStore.load(KeyStore.java:1445)
at org.apache.thrift.transport.TSSLTransportFactory.createSSLContext(TSSLTransportFactory.java:190)
… 59 more
Caused by: java.security.UnrecoverableKeyException: Password verification failed
at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:790)
… 64 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:538)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(HiveMetaStoreClient.java:258)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&lt;init&gt;(SessionHiveMetaStoreClient.java:74)
… 54 more
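
The root cause at the bottom of the trace ("Keystore was tampered with, or password was incorrect" / "Password verification failed") suggests the password stored in the cert.key material does not match the uploaded keystore. A minimal sanity check I can think of, assuming the paths from the Spark config above and that keytool is available on the driver:

import subprocess

KEYSTORE = "/dbfs/FileStore/tables/keyStore.jks"  # hops.ssl.keystore.name from the config
PASSWD_FILE = "/dbfs/hopsworks/dev/dev_cert.key"  # hops.ssl.keystores.passwd.name from the config

# The cert.key file exported by Hopsworks holds the keystore password.
with open(PASSWD_FILE) as f:
    password = f.read().strip()

# keytool exits non-zero with a "password was incorrect" message
# if this password does not open the keystore.
result = subprocess.run(
    ["keytool", "-list", "-keystore", KEYSTORE, "-storepass", password],
    capture_output=True,
    text=True,
)
print(result.stdout or result.stderr)

If this check fails, re-exporting the certificates and re-uploading the keystore and cert.key files together should keep the password and keystore in sync.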

Hi @rajumaha100,
Can you make sure you are using Databricks Runtime 6.4 (LTS)? That's what the current release supports. Support for newer runtimes is coming in the next release.