While you can install the debugger plugin in a JupyterLab notebook, it stays deployed only as long as your notebook server is running; after you restart JupyterLab, it is no longer installed. Making the plugin permanent would require updating the base Docker image for the project. You can do that yourself in the source code, but we don't currently provide support for it.
In PySpark jobs, you can collect the logs and debug them. At the end of the application, the logs are available in a Dataset in your project.
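As a minimal sketch of collecting logs for later inspection, you can route messages through Python's standard logging module in your PySpark driver code. The logger name `my_pyspark_job` and the handler setup below are illustrative assumptions, not Hopsworks APIs; messages written to the driver's stdout/stderr this way end up in the logs collected for the job.

```python
import io
import logging

# Illustrative sketch: attach a handler so log output can be inspected.
# In a real PySpark job you would typically log to stdout/stderr, which
# the platform collects; here a StringIO stream stands in for that sink.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))

logger = logging.getLogger("my_pyspark_job")  # hypothetical logger name
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("processing started")

# Show what was captured, as it would appear in the collected logs
print(stream.getvalue().strip())
```

Using a named logger rather than bare `print` calls gives you levels and timestamps for free, which makes the collected logs much easier to search afterwards.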
While the notebook is running, the Notebook view in Hopsworks shows a green button you can click to navigate to the Spark UI, which you can use for debugging. Logs are also streamed in real time to Kibana (in the same monitoring UI), where you can search through them.