Hive 0.14, Spark 1.6. I am trying to connect to Hive tables from Spark programmatically. I have already placed my hive-site.xml in the Spark conf folder. Can someone please suggest a solution? Below is my code.

The stack trace:

at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(SQLContext.scala:330)
at org.apache.spark.sql.hive.Hiv

Invocation of the "BONECP" plugin to create a ConnectionPool gave an error: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. (org.datanucleus.store.
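The code from the original post did not survive extraction. The following is a hypothetical minimal sketch of the kind of program that produces this stack trace under Spark 1.6: creating a `HiveContext` and querying the Hive metastore. The class name `HiveTableQuery` and the `SHOW TABLES` query are illustrative assumptions, not the poster's actual code; it assumes `spark-hive` is on the classpath and `hive-site.xml` is in `$SPARK_HOME/conf`.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

// Hypothetical reconstruction of the Spark 1.6 program described in the question.
public class HiveTableQuery {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("HiveTableQuery");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // HiveContext picks up hive-site.xml from the classpath to locate the
        // metastore; connecting to it is where the stack trace above originates.
        HiveContext hive = new HiveContext(sc.sc());
        hive.sql("SHOW TABLES").show();

        sc.stop();
    }
}
```

For what it is worth, the DataNucleus/BoneCP message ("com.mysql.jdbc.Driver" not found in the CLASSPATH) usually means the MySQL JDBC driver jar is not visible to the Spark driver; a common remedy is to pass the connector jar explicitly, e.g. via `spark-submit --jars`, or to add it to Spark's driver classpath.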