Hudi Support in PySpark Kernel

I am trying to use Hudi with Python in Spark. When I try to import the module io.hops.utils.Hops, I get an error saying io is not a package. I am using a Jupyter notebook with a PySpark kernel in the Hopsworks.ai environment.

That’s expected: io.hops.utils.Hops is a Scala package, not a Python one, so it cannot be imported from Python.

You can still use Hudi from a PySpark kernel through the Spark DataFrame APIs. You can have a look at this example notebook for inspiration: https://github.com/SirOibaf/hops-examples/blob/hudi_python/notebooks/featurestore/hudi/hudi-python.ipynb

You can also check out the Apache Hudi documentation: https://hudi.apache.org/docs/quick-start-guide.html#pyspark-example
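To make the DataFrame-API route concrete, here is a minimal sketch of upserting to and reading from a Hudi table from PySpark. The table name, path, and column names (id, ts) are illustrative assumptions, not from the thread; running it for real requires a Spark session with the hudi-spark bundle on the classpath (e.g. started with --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.3).

```python
# Hedged sketch: using Hudi from PySpark via plain DataFrame APIs,
# no Scala package needed. Names below are illustrative assumptions.

TABLE_NAME = "my_feature_group"                  # hypothetical table name
BASE_PATH = "hdfs:///tmp/hudi/my_feature_group"  # hypothetical storage path

# Hudi is configured entirely through write options, so no special
# Python bindings are required:
hudi_options = {
    "hoodie.table.name": TABLE_NAME,
    "hoodie.datasource.write.recordkey.field": "id",   # primary-key column (assumed)
    "hoodie.datasource.write.precombine.field": "ts",  # ordering/dedup column (assumed)
    "hoodie.datasource.write.operation": "upsert",     # insert/upsert/bulk_insert
    "hoodie.upsert.shuffle.parallelism": "2",
}

def write_hudi(df, path=BASE_PATH):
    """Upsert a Spark DataFrame into a Hudi table."""
    (df.write
       .format("org.apache.hudi")
       .options(**hudi_options)
       .mode("append")          # append triggers upsert with the options above
       .save(path))

def read_hudi(spark, path=BASE_PATH):
    """Read the latest snapshot of the Hudi table back as a DataFrame."""
    return spark.read.format("org.apache.hudi").load(path + "/*")
```

The key point is that Hudi behavior (record key, precombine field, operation type) is driven by string options on the writer, which works identically from Python and Scala.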

Yes, that helps. I just wanted to know: in the Scala package you can create a feature group with Hops.createFeaturegroup(...).setHudi(true).
Is the same functionality available through Python as well? I have created a feature group, but I want to enable Hudi functionality on it. How do I achieve that in Python?