[ERROR] Connecting an external Kafka producer

When I try to connect to Kafka from an external producer in Python, I get a network connection error:

%3|1627231930.672|FAIL|rdkafka#producer-1| [thrd:ssl://]: ssl:// Connect to ipv4# failed: Network is unreachable (after 5ms in state CONNECT)

My config looks like this:

Do you have any idea why the connection isn’t working? Can you please help me?

Kind regards,


Try checking whether it is set as an EXTERNAL listener in the /srv/hops/kafka/config/server.properties file, under the property advertised.listeners.

The broker needs to be restarted if this property is updated.
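For orientation, an advertised.listeners entry with an EXTERNAL listener typically looks something like the fragment below. The listener names, ports, and addresses here are illustrative placeholders, not values from an actual Hopsworks installation:

```
listeners=INTERNAL://0.0.0.0:9091,EXTERNAL://0.0.0.0:9092
advertised.listeners=INTERNAL://10.0.0.5:9091,EXTERNAL://public.example.com:9092
listener.security.protocol.map=INTERNAL:SSL,EXTERNAL:SSL
```

The address advertised for the EXTERNAL listener is what external clients will try to connect to, so it must be reachable from outside the cluster’s network.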

I just checked all the files and paths from “/srv/hops” and there is no kafka folder.

The only path I could find with kafka was: [’/srv/hops/anaconda/envs/theenv/lib/python3.7/site-packages/botocore/data/kafka’]

There wasn’t a config folder.

The Kafka installation directory is only accessible via the command line; typically only the cluster administrator who set it up has access to it.

Is it a self-managed Hopsworks installation, or is it on hopsworks.ai?

It is on hopsworks.ai

I still can’t manage to find the path you referred to. I am using hopsworks.ai, I have all admin roles, and all I found was the path I linked above. When I check the Hopsworks variables in the Admin Control Panel, I still can’t find a Kafka folder. Can you please explain how to get to this path on hopsworks.ai? Or, if that is not possible, how to resolve the problem at all?

Hi @Akil_Guler

@Theo meant that you should SSH into your VM and run sudo cat /srv/hops/kafka/config/server.properties. There you can see the advertised EXTERNAL IP address.

In any case, the IP you are using falls in the private IP range, so it wouldn’t work from outside. Follow the instructions here and open the Kafka service. Then, from your producer, try using the domain name of your cluster, which has the format INSTANCE-ID.cloud.hopsworks.ai
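As a quick sanity check of the private-range point above, Python’s standard ipaddress module can tell you whether an address is private; the addresses below are only examples:

```python
import ipaddress

def is_private(ip: str) -> bool:
    """True if the address is in a private range (e.g. RFC 1918) and
    therefore unreachable from an external producer."""
    return ipaddress.ip_address(ip).is_private

print(is_private("192.168.1.10"))  # True: only reachable inside the network
print(is_private("8.8.8.8"))       # False: a public address
```

If your bootstrap address comes back as private, an external client can never reach it, and you need the public domain name instead.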

Kind regards,

First of all, thank you for your effort.

I managed to connect to the cluster via SSH and to execute sudo cat /srv/hops/kafka/config/server.properties. Then I updated the bootstrap.servers address according to the advertised external IP address. Now I get an SSL handshake error:

%3|1627828660.205|FAIL|rdkafka#producer-1| [thrd:ssl://da1b2ea0-f134-11eb-bc5d-cb0b7ed872e9.cloud.hopsworks.ai:9]: ssl://da1b2ea0-f134-11eb-bc5d-cb0b7ed872e9.cloud.hopsworks.ai:9092/bootstrap: SSL handshake failed: error:1416F086:SSL routines:tls_process_server_certificate:certificate verify failed: broker certificate could not be verified, verify that ssl.ca.location is correctly configured or root CA certificates are installed (install ca-certificates package) (after 68ms in state SSL_HANDSHAKE)

Do you have any idea where this comes from? I created the pem files according to this script, using the jks certificates I downloaded from hopsworks.ai:


echo $keyStore
echo "Generating certificate.pem"
keytool -exportcert -alias $alias -keystore $keyStore -rfc -file $outputFolder/certificate.pem -storepass $password

echo "Generating key.pem"
keytool -v -importkeystore -srckeystore $keyStore -srcalias $alias -destkeystore $outputFolder/cert_and_key.p12 -deststoretype PKCS12 -storepass $password -srcstorepass $password
openssl pkcs12 -in $outputFolder/cert_and_key.p12 -nodes -nocerts -out $outputFolder/key.pem -passin pass:$password

echo "Generating CARoot.pem"
keytool -exportcert -alias $alias -keystore $keyStore -rfc -file $outputFolder/CARoot.pem -storepass $password

I hope you can help me with this.

Kind Regards

Hi @Akil_Guler

Can you post your client config properties? Here is an example of a Python client that you can run in Hopsworks, the Kafka Python Feature Store Example, which shows the Kafka client properties that need to be provided.

@Theo thank you for your reply.

You can download the pem certificates for the Kafka client by running this code in a Jupyter Python notebook in Hopsworks:

from hops import tls, hdfs
hdfs.copy_to_hdfs(tls.get_ca_chain_location(), "Resources/", True)
hdfs.copy_to_hdfs(tls.get_client_certificate_location(), "Resources/", True)
hdfs.copy_to_hdfs(tls.get_client_key_location(), "Resources/", True)

and then, from the Resources dataset, download the three files and point the client properties at them: set ssl.ca.location to the ca_chain.pem file, ssl.certificate.location to the client.pem file, and ssl.key.location to the client_key.pem file.
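Putting the three properties together, a librdkafka-style client configuration would look roughly like this sketch. The bootstrap address and file paths are placeholders, not real values:

```python
# Sketch of the SSL-related client properties; every value below is a
# placeholder to be replaced with your own cluster address and the
# paths of the pem files downloaded from the Resources dataset.
kafka_conf = {
    "bootstrap.servers": "INSTANCE-ID.cloud.hopsworks.ai:9092",
    "security.protocol": "SSL",
    "ssl.ca.location": "/path/to/ca_chain.pem",         # verifies the broker cert
    "ssl.certificate.location": "/path/to/client.pem",  # client certificate
    "ssl.key.location": "/path/to/client_key.pem",      # client private key
}

# With confluent-kafka installed, the dict is passed directly to the client:
#   from confluent_kafka import Producer
#   producer = Producer(kafka_conf)
```

The earlier "certificate verify failed" handshake error is exactly what librdkafka reports when ssl.ca.location does not point at a CA chain that can verify the broker’s certificate.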

@Theo Thank you so much, it’s working now! :slight_smile: