When running jobs, is Spark version configurable?

Hello,
I'm interested in learning how/where Hopsworks Enterprise runs Spark jobs on the cluster. How and where is the Spark version specified? Is it configurable?
Thanks

Hi @dsiegel,

The Spark version is not configurable when running jobs inside Hopsworks.
Hopsworks versions 2.3, 2.4, and 2.5 all run Spark 3.1.
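
If you want to verify which version your cluster is actually running, a minimal PySpark sketch from inside a job (assuming the standard `SparkSession` entry point):

```python
from pyspark.sql import SparkSession

# In a Hopsworks Spark job a session is typically already active;
# getOrCreate() returns it rather than building a new one.
spark = SparkSession.builder.getOrCreate()

# Prints the Spark version the cluster is running, e.g. "3.1.x".
print(spark.version)
```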

Regards,
Alex