Permission denied while reading TrainingDataset from external Python environment

I have installed the hadoop jar binaries, hsfs[hive] and the pydoop Python library successfully in an external Python environment, but when using them I got a permission denied error. Can anyone help?

Feature Store: 2.4.0
My API key has the following scopes:


My Error output:

File "/app/", line 80, in fetch_dataset_csv
    df: DataFrame ="dataset_split", "train"))
  File "/usr/local/lib/python3.8/site-packages/hsfs/", line 257, in read
    return, split, read_options)
  File "/usr/local/lib/python3.8/site-packages/hsfs/core/", line 107, in read
  File "/usr/local/lib/python3.8/site-packages/hsfs/", line 106, in read
    return engine.get_instance().read(self, data_format, options, path)
  File "/usr/local/lib/python3.8/site-packages/hsfs/engine/", line 73, in read
    df_list = self._read_hopsfs(location, data_format)
  File "/usr/local/lib/python3.8/site-packages/hsfs/engine/", line 108, in _read_hopsfs
    path_list =, recursive=True)
  File "/usr/local/lib/python3.8/site-packages/pydoop/hdfs/", line 307, in ls
    dir_list = lsl(hdfs_path, user, recursive)
  File "/usr/local/lib/python3.8/site-packages/pydoop/hdfs/", line 291, in lsl
    top = next(treewalk)
  File "/usr/local/lib/python3.8/site-packages/pydoop/hdfs/", line 631, in walk
    top = self.get_path_info(top)
  File "/usr/local/lib/python3.8/site-packages/pydoop/hdfs/", line 406, in get_path_info
    return self.fs.get_path_info(path)
PermissionError: [Errno 13] Permission denied
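`PermissionError: [Errno 13]` is raised when the traversal reaches a path in HopsFS whose read bit is not set for the user your API key maps to. As an illustration (this is a sketch of POSIX-style permission logic, not the pydoop or hsfs API — the function name and arguments are made up for this example), the read check behind such an error works roughly like this:

```python
import stat

def can_read(mode: int, is_owner: bool, in_group: bool) -> bool:
    """Illustrative POSIX-style read check: owner, then group, then other."""
    if is_owner:
        # Owner read bit (the leading 'r' in e.g. rwxr-x---)
        return bool(mode & stat.S_IRUSR)
    if in_group:
        # Group read bit
        return bool(mode & stat.S_IRGRP)
    # "Other" read bit
    return bool(mode & stat.S_IROTH)

# A directory with mode rwxr-x--- (0o750) is readable by owner and group,
# but not by anyone else:
print(can_read(0o750, is_owner=False, in_group=True))   # group member: True
print(can_read(0o750, is_owner=False, in_group=False))  # other: False
```

So a useful first check is to look at the owner, group, and mode of the training-dataset directory in the Hopsworks dataset browser and compare them against the project user associated with your API key.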

Hi @Yingding,

How did you set up your Hopsworks cluster? Is it on-prem or is it in the cloud (AWS/Azure)?


@Davit_Bzhalava It is an on-prem Community Edition running on a single cluster host.