I'm interested to learn how and where on the cluster Hopsworks Enterprise runs Spark jobs. How and where is the Spark version specified? Is it configurable?
The Spark version is not configurable when running jobs inside Hopsworks; each release bundles a fixed version. Hopsworks versions 2.3, 2.4, and 2.5 all run Spark 3.1.
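As a minimal sketch of the mapping above (the `bundled_spark_version` helper and its dict are hypothetical names, not part of any Hopsworks API), you could record which Spark line ships with which Hopsworks release like this; inside a running PySpark job, `spark.version` would report the actual bundled version:

```python
# Hypothetical lookup of the Spark line bundled with each Hopsworks
# release, per the answer above. The version is fixed per release and
# not user-configurable.
HOPSWORKS_SPARK_VERSIONS = {
    "2.3": "3.1",
    "2.4": "3.1",
    "2.5": "3.1",
}

def bundled_spark_version(hopsworks_version: str) -> str:
    """Return the Spark major.minor line bundled with a Hopsworks release."""
    # Match on the major.minor prefix of the full version string,
    # e.g. "2.5.0" -> "2.5".
    major_minor = ".".join(hopsworks_version.split(".")[:2])
    try:
        return HOPSWORKS_SPARK_VERSIONS[major_minor]
    except KeyError:
        raise ValueError(f"Unknown Hopsworks version: {hopsworks_version}")

print(bundled_spark_version("2.5.0"))  # -> 3.1
```

To check at runtime instead, print `spark.version` from within the job itself.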