Can you provide a Dockerfile where PySpark code can be run?

As we all know, certain Hopsworks code can only run in a PySpark environment. Can you provide a Dockerfile so that we can run such code?
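For illustration, a minimal sketch of such a Dockerfile, assuming a pip-installed PySpark running in local mode (the base image, the `job.py` script name, and the package choices are assumptions for illustration, not an official or tested Hopsworks image — Hopsworks client code may additionally need libraries such as `hopsworks`/`hsfs` and network access to a cluster):

```dockerfile
# Minimal PySpark image sketch — assumptions, not an official image.
# PySpark needs a JVM, so start from a JRE base image.
FROM eclipse-temurin:11-jre-focal

# Install Python and pip alongside the JVM.
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# pip-installed pyspark bundles Spark itself, so no separate Spark download.
RUN pip3 install --no-cache-dir pyspark

WORKDIR /app
COPY . /app

# Run a PySpark script (hypothetical name) in local mode by default.
CMD ["python3", "job.py"]
```

Inside the container, a script can then create a local SparkSession with `SparkSession.builder.master("local[*]").getOrCreate()`; anything that needs cluster-side configuration (as noted in the reply below) will not be available this way.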

Hi @Tim, are you trying to run Spark externally from Hopsworks? I'm trying to understand what you want to achieve here. The PySpark environment you run in Hopsworks is already dockerized, but it needs to run on a specific worker machine in the cluster, as configuration and other resources are mounted inside the container.

Yes, we are trying to run the code in an Azure Function.