I downloaded the spark-2.4.0-bin-without-hadoop.tgz package and installed it on my system. I want to run simple Apache Spark code in local mode, but it gives me a NoClassDefFoundError: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
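The "without-hadoop" (Hadoop-free) Spark builds do not bundle the Hadoop client jars, so classes like org.apache.hadoop.conf.Configuration are not on Spark's classpath. The documented way to use such a build is to point Spark at a separately installed Hadoop distribution via SPARK_DIST_CLASSPATH. A minimal sketch, assuming the `hadoop` CLI from your Hadoop installation is on PATH:

```shell
# conf/spark-env.sh -- tell the "Hadoop free" Spark build where to find
# the Hadoop jars. SPARK_DIST_CLASSPATH is read when Spark launches.
# Assumes the `hadoop` command (from a local Hadoop install) is on PATH.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

With this set, spark-shell or spark-submit in local mode should be able to resolve org/apache/hadoop/conf/Configuration from the Hadoop installation's jars. Alternatively, use a Spark build that ships with Hadoop (e.g. spark-2.4.0-bin-hadoop2.7.tgz) and no extra configuration is needed.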
When running a Hadoop job, I get the following error:

java.lang.Exception
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:399)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1834)
    at org.apache.hadoop.conf.