Error in Jupyter notebook of Feature Store demo on hopsworks.ai

Hi,

I have signed up for a 30-day trial account. While using the Jupyter notebook, I'm getting the following error:

Warning: The Spark session does not have enough YARN resources to start.
The code failed because of a fatal error:
Session 70 unexpectedly reached final status 'dead'. See logs:

stderr:

YARN Diagnostics:
[Tue Mar 09 07:56:07 +0000 2021] Application is added to the scheduler and is not yet activated. Queue's AM resource limit exceeded. Details : AM Partition = <DEFAULT_PARTITION>; AM Resource Request = <memory:1408, vCores:2>; Queue Resource Limit for AM = <memory:8320, vCores:3>; User AM Resource Limit of the queue = <memory:8320, vCores:3>; Queue AM Resource Usage = <memory:7296, vCores:3>; .

Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context.
b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c) Restart the kernel.
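For context on option (a): the YARN diagnostics show the application master request (memory:1408, vCores:2) would push the queue's AM usage (already memory:7296, vCores:3 of an 8320/3 limit) over its cap. On a sparkmagic/Livy setup you can sometimes fit under the limit by requesting a smaller session before it starts. A minimal sketch, run in a notebook cell before any Spark code (the exact values are illustrative, and which properties the cluster honors depends on its Livy configuration):

```python
%%configure -f
{
  "driverMemory": "1g",
  "driverCores": 1,
  "executorMemory": "1g",
  "numExecutors": 1
}
```

The `-f` flag forces the current session (if any) to be dropped and recreated with the new settings; on a shared demo cluster this mainly helps when the queue is nearly, but not completely, full.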

Any idea how we can resolve this issue?

Hi @mohit.garg

As this is a demo cluster shared with other users, we cannot guarantee compute capacity. I would suggest trying again now.