Hudi Support in PySpark Kernel

I am trying to use Hudi with Python in Spark. When I try to import the module io.hops.utils.Hops, I get an error saying io is not a package. I am using a Jupyter notebook with a PySpark kernel in my environment.

That’s expected: io.hops.utils.Hops is a Scala package, not a Python one.

You can still use Hudi from a PySpark kernel through the Spark DataFrame APIs. You can have a look at this example notebook for inspiration:

You can also check out the Apache Hudi documentation:

Yes, that helps. I just wanted to know: the Scala package lets you call Hops.createfeaturegroup.setHudi(true).
Is the same functionality available through Python? I have created a feature group and want to enable Hudi on it. How do I achieve that in Python?