
I know that I can use the JDBC connector to create a DataFrame with this command:

val jdbcDF = sqlContext.load("jdbc",
  Map("url" -> "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456",
      "dbtable" -> "video"))

But I get this error: java.sql.SQLException: No suitable driver found for ...

I have also tried to add the JDBC jar to the Spark classpath with both of these commands, but both failed:

  • spark-shell --jars mysql-connector-java-5.0.8-bin.jar
  • SPARK_CLASSPATH=mysql-connector-java-5.0.8-bin.jar spark-shell

My Spark version is 1.3.0, and Class.forName("com.mysql.jdbc.Driver").newInstance works.
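One more thing worth trying before editing Spark's scripts (a suggestion based on general Spark 1.x behavior, not confirmed by the answers here): --jars ships the jar to executors but does not reliably put it on the driver's own classpath, which is where the JDBC data source resolves the driver. Passing the jar through both flags may help; the path below is a placeholder:

```shell
# Placeholder path; point it at wherever the connector jar actually lives.
spark-shell \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar
```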

2 Answers


This happens because the DataFrame load cannot find the MySQL Connector jar on the classpath. It can be resolved by adding the jar to the Spark classpath as follows:

Edit /spark/bin/compute-classpath.sh and append the jar:

CLASSPATH="$CLASSPATH:$ASSEMBLY_JAR:yourPathToJar/mysql-connector-java-5.0.8-bin.jar"

Save the file and restart Spark.
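How to restart depends on the deployment; for a standalone cluster, a sketch (assuming $SPARK_HOME points at the install) would be:

```shell
# Standalone deployment: stop and start the master and workers
# so the updated classpath takes effect.
$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/start-all.sh
```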


You might want to try mysql-connector-java-5.1.29-bin.jar

3 Comments

Is there some specific reason you expect this to help? If so perhaps you could edit your post to include that.
I have updated to mysql-connector-java-5.1.34-bin.jar and it worked.
mysql-connector-java-5.1.38-bin.jar works too, don't know what's wrong with 5.0.8.
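A plausible explanation for the version difference (my inference from JDBC behavior, not something the comments confirm): Connector/J 5.1.x supports JDBC 4.0 and ships a META-INF/services/java.sql.Driver entry, so DriverManager auto-discovers it from the classpath, while 5.0.8 is a JDBC 3 driver that must be registered explicitly, e.g.:

```scala
import java.sql.DriverManager

// Pre-JDBC-4 drivers (like Connector/J 5.0.8) are not auto-discovered,
// so they must be registered explicitly in the classloader that will
// open the connection before DriverManager can find them.
Class.forName("com.mysql.jdbc.Driver").newInstance()

val conn = DriverManager.getConnection(
  "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456")
```

This would also explain why Class.forName succeeded in the shell yet Spark's load still failed: Spark's JDBC code may resolve the driver in a different classloader than the one where the registration happened.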
