Channel: SCN : Discussion List - SAP HANA and In-Memory Computing

Accessing HANA from Spark via JDBC


Hi there,

 

I am trying to access a HANA table from Spark using Spark SQL, but when building the DataFrame and calling .show() I get an error saying one of the classes (Host) in the HANA JDBC driver is not serializable.

 

This is what I do to reproduce the problem:

 

1. Start spark-shell

2. val df = sqlContext.load("jdbc", Map("url" -> "jdbc:sap://xXXXXXX:30015/?databaseName=mydb&user=SYSTEM&password=xxxxx", "dbtable" -> "SYSTEM.TEST1"));

3. df.show();
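For reference, the same setup written with the DataFrameReader API (available since Spark 1.4; `sqlContext.load` is the older, deprecated form). Hostname, credentials and table are placeholders copied from the post; passing the user and password as separate options rather than in the URL is just a stylistic choice here:

```scala
// Equivalent of the sqlContext.load(...) call above, using the
// DataFrameReader API. Requires ngdbc.jar on the driver and executor
// classpaths (e.g. spark-shell --jars ngdbc.jar).
val df = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:sap://xXXXXXX:30015/?databaseName=mydb")
  .option("driver", "com.sap.db.jdbc.Driver")
  .option("user", "SYSTEM")
  .option("password", "xxxxx")
  .option("dbtable", "SYSTEM.TEST1")
  .load()

df.show()  // an action: this is where the job runs and the exception surfaces
```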

 

I get the exception below when calling any action on the DataFrame:

 

org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: com.sap.db.jdbc.topology.Host

Serialization stack:

        - object not serializable (class: com.sap.db.jdbc.topology.Host, value: xxxxxx:30015)

        - writeObject data (class: java.util.ArrayList)

        - object (class java.util.ArrayList, [xxxxxx:30015])

        - writeObject data (class: java.util.Hashtable)

        - object (class java.util.Properties, {dburl=jdbc:sap://xxxxx:30015, user=SYSTEM, password=Saphana123, url=jdbc:sap://xxxxxxxx:30015/?system&user=SYSTEM&password=Saphana123, dbtable=SYSTEM.TEST1, hostlist=[xxxxxxx:30015]})
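The stack trace shows the mechanism: Java serialization walks the entire object graph, so even though `Properties` and `ArrayList` are themselves `Serializable`, serialization fails on the first reachable object that is not. A minimal sketch of that failure mode, using a hypothetical stand-in `Host` class for the driver's `com.sap.db.jdbc.topology.Host`:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for com.sap.db.jdbc.topology.Host: deliberately NOT Serializable.
class Host(val addr: String) {
  override def toString: String = addr
}

// Same nesting as in the stack trace: Properties -> ArrayList -> Host.
val hosts = new java.util.ArrayList[Host]()
hosts.add(new Host("xxxxxx:30015"))

val props = new java.util.Properties()   // Properties IS Serializable...
props.put("hostlist", hosts)             // ...but now holds a non-serializable Host

val out = new ObjectOutputStream(new ByteArrayOutputStream())
val failed =
  try { out.writeObject(props); false }
  catch { case _: NotSerializableException => true }

println(s"serialization failed: $failed")  // prints: serialization failed: true
```

This is exactly what happens when Spark tries to ship the JDBC connection properties (which the HANA driver has populated with its internal host list) to the executors.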

 

 

Any ideas how to fix this?

 

regards,

Miguel

