Online FS creation issue. Could not get JDBC connection for the online featurestore

I can create an offline feature store, but I cannot create an online feature store. I am getting the error below. How can I resolve it?

RestAPIError: Metadata operation error: (url: https://20.72.144.33/hopsworks-api/api/project/120/featurestores/68/featuregroups). Server response:
HTTP code: 500, HTTP reason: Internal Server Error, error code: 270063, error msg: Could not get JDBC connection for the online featurestore, user msg: Problem getting secrets for the JDBC connection to the online FS

@rajumaha100 - Can you check if you still have the secret with the password for the online feature store account in your secret store? Click on your email on the top right > Settings > Secrets. You should have a secret called something like [your username]_online_featurestore.
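If you prefer to check programmatically, here is a rough sketch against the REST endpoint that the Settings page uses. Note the /users/secrets path and the response shape are assumptions about your Hopsworks version, the host is just taken from the error URL above, and the API key is a placeholder:

# Hedged sketch: list your secrets via the Hopsworks REST API.
# Assumes the /users/secrets resource exists on your version and that
# you have an API key with the secrets scope.
import requests

HOST = "https://20.72.144.33"   # host from the error URL above
API_KEY = "<your api key>"      # placeholder

resp = requests.get(
    f"{HOST}/hopsworks-api/api/users/secrets",
    headers={"Authorization": f"ApiKey {API_KEY}"},
    verify=False,  # only if the cluster uses a self-signed certificate
)
resp.raise_for_status()
print([s["name"] for s in resp.json().get("items", [])])
# expect a name like "<your username>_online_featurestore"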

@Fabio No, I do not have a "[your username]_online_featurestore" secret. How can I create it?

I created another project to see if I was missing anything in Hopsworks. With the new project I am getting a different error. Any suggestions on this new error while saving a feature group as online?

An error was encountered:
Metadata operation error: (url: https://hopsworks.glassfish.service.consul:8182/hopsworks-api/api/project/1144/featurestores/1092/featuregroups). Server response:
HTTP code: 500, HTTP reason: Internal Server Error, error code: 190023, error msg: Could not fetch topic details., user msg: Topic name: 1144_1071_card_transactions_10m_agg_1_onlinefs
Traceback (most recent call last):
File "/srv/hops/anaconda/envs/theenv/lib/python3.7/site-packages/hsfs/feature_group.py", line 632, in save
self._feature_group_engine.save(self, feature_dataframe, write_options)
File "/srv/hops/anaconda/envs/theenv/lib/python3.7/site-packages/hsfs/core/feature_group_engine.py", line 46, in save
self._feature_group_api.save(feature_group)
File "/srv/hops/anaconda/envs/theenv/lib/python3.7/site-packages/hsfs/core/feature_group_api.py", line 52, in save
data=feature_group_instance.json(),
File "/srv/hops/anaconda/envs/theenv/lib/python3.7/site-packages/hsfs/decorators.py", line 35, in if_connected
return fn(inst, *args, **kwargs)
File "/srv/hops/anaconda/envs/theenv/lib/python3.7/site-packages/hsfs/client/base.py", line 147, in _send_request
raise exceptions.RestAPIError(url, response)
hsfs.client.exceptions.RestAPIError: Metadata operation error: (url: https://hopsworks.glassfish.service.consul:8182/hopsworks-api/api/project/1144/featurestores/1092/featuregroups). Server response:
HTTP code: 500, HTTP reason: Internal Server Error, error code: 190023, error msg: Could not fetch topic details., user msg: Topic name: 1144_1071_card_transactions_10m_agg_1_onlinefs
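For context, the call path in the traceback corresponds to code roughly like the following (a minimal hsfs 2.x-style sketch; the feature group name matches the topic name in the error, while the primary key and the DataFrame are illustrative placeholders):

import hsfs

# Inside Hopsworks hsfs.connection() picks up the environment;
# external setups need host/API-key arguments.
connection = hsfs.connection()
fs = connection.get_feature_store()

fg = fs.create_feature_group(
    name="card_transactions_10m_agg",
    version=1,
    primary_key=["cc_num"],   # illustrative key, not from the thread
    online_enabled=True,      # online-enabled groups need a backing Kafka topic
)

# feature_dataframe is your Spark DataFrame of features; fg.save() registers
# the feature group and is the point where error 190023 is raised when the
# <project>_<fg>_onlinefs topic details cannot be fetched from Kafka.
fg.save(feature_dataframe)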

@rajumaha100 - Are you running on hopsworks.ai (maybe on the demo cluster?) or do you have your own installation?

In your project, check in the Kafka section whether the topic has been created. If you have your own installation, check that the Kafka and ZooKeeper services are running correctly. You can do that from the admin UI (click on your email on the top right > Admin > Grafana > Kafka Dashboard).
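If you want to verify the topic outside the UI, a quick check with the confluent-kafka admin client looks roughly like this. The broker address and the TLS material paths are assumptions about a default Hopsworks install; adjust them to your cluster:

from confluent_kafka.admin import AdminClient

# Assumed broker endpoint and two-way TLS settings for a Hopsworks cluster;
# replace with the values/certificates from your installation.
conf = {
    "bootstrap.servers": "broker.kafka.service.consul:9091",
    "security.protocol": "SSL",
    "ssl.ca.location": "/path/to/ca_chain.pem",
    "ssl.certificate.location": "/path/to/client.pem",
    "ssl.key.location": "/path/to/client_key.pem",
}

admin = AdminClient(conf)
metadata = admin.list_topics(timeout=10)

topic = "1144_1071_card_transactions_10m_agg_1_onlinefs"  # from the error above
print(topic in metadata.topics)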

I am running my own installation.
I reinstalled Hopsworks and now I can create online storage, but still with an error. I can see the topics in Kafka as well. I can see the online feature group created, but I still get this error:
Py4JError: An error occurred while calling z:org.apache.spark.sql.avro.functions.to_avro. Trace:
py4j.Py4JException: Method to_avro([class org.apache.spark.sql.Column, class java.lang.String]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:341)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:362)
at py4j.Gateway.invoke(Gateway.java:289)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:251)
at java.lang.Thread.run(Thread.java:748)
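For what it's worth, the two-argument to_avro(column, jsonFormatSchema) overload that hsfs calls here only exists in Spark >= 3.0, and it also requires the matching org.apache.spark:spark-avro package on the classpath, so this error usually points to a Spark/spark-avro version mismatch. A minimal sketch to reproduce the call, assuming Spark 3.x with spark-avro deployed:

from pyspark.sql import SparkSession
from pyspark.sql.functions import struct
from pyspark.sql.avro.functions import to_avro  # added to PySpark in Spark 3.0

spark = SparkSession.builder.getOrCreate()
print(spark.version)  # to_avro(column, jsonFormatSchema) requires >= 3.0

df = spark.createDataFrame([(1, "a")], ["id", "value"])
avro_schema = (
    '{"type": "record", "name": "rec", "fields": ['
    '{"name": "id", "type": "long"}, {"name": "value", "type": "string"}]}'
)

# This mirrors the call hsfs makes when serializing rows for the online
# feature store; on Spark 2.4 only the one-argument to_avro exists, and
# py4j raises exactly the "Method to_avro([class Column, class String])
# does not exist" error shown above.
df.select(to_avro(struct("id", "value"), avro_schema).alias("payload")).show()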

@rajumaha100 - Where is the Spark application running? On Hopsworks or on an external Spark cluster?

If you are running on Hopsworks, does your Job/Jupyter notebook have additional dependencies?