Unable to start PySpark session from Jupyter notebook in managed Hopsworks demo instance

I’m trying to follow the fraud credit card transactions tutorial. As part of the exercise I’m running some commands in a Jupyter notebook, but the PySpark application fails to start. I have tried multiple times, creating a new notebook each time, but it fails every time. When I run the imports below, the session hangs with the message that follows:
import json
from pyspark.sql.types import StructField, StructType, StringType, DoubleType, TimestampType, LongType, IntegerType

2021-10-26 19:39:37,133 INFO YarnClientImpl: Application submission is not finished, submitted application application_1634125593199_0034 is still in NEW_SAVING

The session eventually fails, and the error output suggests:

Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context.
b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c) Restart the kernel.
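Regarding suggestion (a): since the YARN application is stuck before being accepted, the cluster may simply not have enough free memory/vcores for the requested session, which is plausible on a shared demo instance. One thing worth trying is shrinking the session's resource request in the first notebook cell with sparkmagic's `%%configure` magic before any other code runs. The specific values below are illustrative guesses, not known-good settings for the demo cluster; the keys (`driverMemory`, `executorMemory`, `executorCores`, `numExecutors`) are standard Livy session-creation parameters that sparkmagic passes through.

```
%%configure -f
{
  "driverMemory": "1G",
  "executorMemory": "1G",
  "executorCores": 1,
  "numExecutors": 1
}
```

If the session still sits in NEW_SAVING even with a minimal request, the bottleneck is more likely the cluster/queue itself than the notebook's configuration, and suggestion (b), contacting the administrator, is probably the right path.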


The problem should now be fixed.