The latest EMR 4.1.0 release ships Hive 1.0.0 and Spark 1.5.0. Hive 1.0.0 uses parquet-hadoop-bundle-1.5.0.jar, while Spark uses parquet-hadoop… I tried add jar parquet-hive-bundle-1.7.0.jar in the hive shell, but no luck: Hive still uses the old Parquet jar it bundles. However, I copied parquet-hive-bundle-1.7.0.jar to /usr/lib…
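For reference, a hedged sketch of the usual ways to point Hive at a newer Parquet jar. The /usr/lib/hive/lib path and the /home/hadoop jar location are assumptions for a typical EMR 4.x layout, not details confirmed by the question:

```shell
# Sketch only; paths are typical EMR 4.x locations and may differ on your cluster.

# 1) Session-scoped (what was tried): inside the hive shell,
#      hive> add jar /home/hadoop/parquet-hive-bundle-1.7.0.jar;
#    This appends to the classpath, so classes already loaded from the
#    bundled parquet-hadoop-bundle-1.5.0.jar still win resolution.

# 2) Pass the jar on the aux path when starting the CLI:
hive --auxpath /home/hadoop/parquet-hive-bundle-1.7.0.jar

# 3) Swap the jar under Hive's own lib directory (restart Hive services after):
sudo mv /usr/lib/hive/lib/parquet-hadoop-bundle-1.5.0.jar /tmp/
sudo cp /home/hadoop/parquet-hive-bundle-1.7.0.jar /usr/lib/hive/lib/
```

Option 3 is the bluntest but matches the observation that add jar alone did not change which Parquet classes Hive loads. These are cluster-side commands, shown as a fragment rather than a runnable script.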
I have confirmed that the user in question has write permission to /var/log/hive/user/hadoop, yet I get:

java.io.FileNotFoundException: /var/log/hive/user/hadoop/hive.log (No such file or directory)
at java.io.Fi…
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
at org.apache.hadoop.hive.ql.metadata.Hive(Hive.java:366)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.jav…
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:77)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1456)
at org.apache.hadoop…
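This error message usually points at a missing directory rather than a permissions problem: opening a file whose parent directory does not exist fails with the same "No such file or directory" text even when write permission higher up the tree is fine. A minimal local reproduction, using a hypothetical /tmp path rather than the real log directory:

```shell
# Demonstrate that a missing parent directory produces the same error
# as the hive.log failure above. The path is hypothetical.
dir=/tmp/demo-missing-dir-$$

# Parent directory absent: touch fails with "No such file or directory".
touch "$dir/hive.log" 2>/dev/null || echo "touch failed: directory does not exist"

# Creating the directory first makes the same write succeed.
mkdir -p "$dir"
touch "$dir/hive.log" && echo "hive.log created"

rm -rf "$dir"
```

So the first thing to check is whether /var/log/hive/user/hadoop actually exists (ls -ld), not only whether it is writable.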