//class not found //You might assume the class really is missing. It isn't, haha — this error can be dragged in by a broken configuration.
20/04/20 23:34:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/04/20 23:34:18 WARN deploy.DependencyUtils: Local jar /home/ruoze/app/spark-2.4.5-bin-2.6.0-cdh5.16.2/spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties does not exist, skipping.
20/04/20 23:34:18 WARN deploy.SparkSubmit$$anon$2: Failed to load com.ruozedata.spark.core.homeworks.loggerControl.ControlSparkLogger.
java.lang.ClassNotFoundException: com.ruozedata.spark.core.homeworks.loggerControl.ControlSparkLogger
//Original command:
bin/spark-submit \
--class com.ruozedata.spark.core.homeworks.loggerControl.ControlSparkLogger \
--master yarn \
--deploy-mode client \
--executor-memory 1G \
--num-executors 1 \
--files /home/ruoze/data/myjars/log4j.properties \
--conf spark.driver.extraJavaOptions="-Dlog4j.configuration=log4j.properties" spark.executor.extraJavaOptions="-Dlog4j.configuration=log4j.properties" \
/home/ruoze/data/myjars/ruozedata-spark-core-1.0-SNAPSHOT.jar
//Note: the two extraJavaOptions properties were crammed into a single --conf, separated by a space — that is the bug.
//Error analysis: Inspected the jar — the class is in fact inside it. First, an older jar still runs fine, so the Spark environment is not the problem. Second, the class packaged into a different project and submitted the old way also works. Comparing against those earlier submissions, the problem is a parsing error caused by --conf: multiple properties must not be separated by spaces under a single --conf (better not to use commas either); follow the official recommendation. The DependencyUtils warning in the log confirms it: spark-submit parsed the stray spark.executor.extraJavaOptions=... token as the application jar, so the real jar was never resolved and the main class appeared to be missing.
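//A quick sanity check (a minimal sketch; the jar path and class name are taken from the command above) — list the jar's entries with the JDK jar tool to confirm the class really was packaged:
jar tf /home/ruoze/data/myjars/ruozedata-spark-core-1.0-SNAPSHOT.jar | grep ControlSparkLogger
# expected if the class is present:
# com/ruozedata/spark/core/homeworks/loggerControl/ControlSparkLogger.class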
//Official example — the standard way to pass --conf: repeat the flag once per property, and quote the whole key=value pair when the value contains spaces:
./bin/spark-submit \
--name "My app" \
--master local[4] \
--conf spark.eventLog.enabled=false \
--conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
--conf spark.hadoop.abc.def=xyz \
myApp.jar
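//Applying the official pattern to the failing submission (a sketch; paths and class name reused from the original command) — one --conf per property:
bin/spark-submit \
--class com.ruozedata.spark.core.homeworks.loggerControl.ControlSparkLogger \
--master yarn \
--deploy-mode client \
--executor-memory 1G \
--num-executors 1 \
--files /home/ruoze/data/myjars/log4j.properties \
--conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties \
--conf spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties \
/home/ruoze/data/myjars/ruozedata-spark-core-1.0-SNAPSHOT.jar
//With the properties split out, spark-submit no longer mistakes the second one for the application jar, and the main class should load normally.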