Error when trying to insert DataFrame into feature group

Hi,

I am currently trying out the 14-day demo version of Hopsworks that is managed in your environment.

I am able to create a new feature group from my local Python environment. However, when I call fg.insert(dfFeatures) with the DataFrame on the new feature group, I receive the following error message:

  File "...\Python37\site-packages\hsfs\feature_group.py", line 728, in insert
    write_options,
  File "...\Python37\site-packages\hsfs\core\feature_group_engine.py", line 83, in insert
    online_write_options = self.get_kafka_config(write_options)
  File "...\Python37\site-packages\hsfs\core\feature_group_engine.py", line 203, in get_kafka_config
    for endpoint in self._kafka_api.get_broker_endpoints()
  File "...\Python37\site-packages\hsfs\core\kafka_api.py", line 43, in get_broker_endpoints
    return _client._send_request("GET", path_params, headers=headers)["brokers"]
  File "...\Python37\site-packages\hsfs\decorators.py", line 35, in if_connected
    return fn(inst, *args, **kwargs)
  File "...\Python37\site-packages\hsfs\client\base.py", line 147, in _send_request
    raise exceptions.RestAPIError(url, response)
hsfs.client.exceptions.RestAPIError: Metadata operation error: (url: <myInstanceUrl>/hopsworks-api/api/project/1149/kafka/clusterinfo). Server response:
HTTP code: 403, HTTP reason: Forbidden, error code: 320004, error msg: No valid scope found for this invocation, user msg:
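
For reference, this is roughly the code I am running (host, project, and API key are placeholders, and the feature group definition is simplified):

```python
import hsfs
import pandas as pd

# Connect to the managed demo instance (placeholder values)
connection = hsfs.connection(
    host="<myInstanceUrl>",
    project="<myProject>",
    api_key_value="<myApiKey>",
)
fs = connection.get_feature_store()

dfFeatures = pd.DataFrame({"id": [1, 2], "feature_a": [0.1, 0.2]})

# Creating the feature group works without problems
# (simplified definition, the real one has more features)
fg = fs.create_feature_group(
    name="demo_features",
    version=1,
    primary_key=["id"],
    online_enabled=True,
)

# This call raises the RestAPIError shown above
fg.insert(dfFeatures)
```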

Can you please help me with this problem?

Kind regards
Alex

Hi Alex,

for the API key that you created and are using for the HSFS connection, which scopes did you choose at creation time?

Hi Moritz,

thank you for your feedback.

It has the following scopes:
["DATASET_CREATE", "DATASET_DELETE", "FEATURESTORE", "DATASET_VIEW", "PROJECT"]
Do I also have to enable the 'KAFKA' scope?

Kind regards
Alex

Hi Moritz,

I created a new API key and enabled all scopes there. Now it works.

Kind regards
Alex

Hey,

sorry, I missed your previous answer.
Yes, you were right: you also need the JOB and KAFKA scopes. Glad you figured it out!
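
For anyone else who runs into this: fg.insert() fetches the Kafka broker endpoints (see get_broker_endpoints in the traceback above), which is why an API key without the KAFKA scope is rejected with HTTP 403 / error code 320004. With a correctly scoped key the same client code works unchanged. A minimal sketch with placeholder names:

```python
import hsfs
import pandas as pd

# Reconnect with the recreated API key (placeholder value; the new key
# also carries the JOB and KAFKA scopes)
connection = hsfs.connection(
    host="<myInstanceUrl>",
    project="<myProject>",
    api_key_value="<newApiKey>",
)
fs = connection.get_feature_store()

dfFeatures = pd.DataFrame({"id": [1, 2], "feature_a": [0.1, 0.2]})

# Fetching the Kafka cluster info is now authorized, so the insert
# that previously failed with HTTP 403 goes through
fg = fs.get_feature_group("demo_features", version=1)  # placeholder name
fg.insert(dfFeatures)
```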