Spark installation - Error: Could not find or load main class org.apache.spark.launcher.Main

Date: 2023-06-05


Problem description

After installing Spark 2.3 and setting the following environment variables in .bashrc (using Git Bash):

1. HADOOP_HOME
2. SPARK_HOME
3. PYSPARK_PYTHON
4. JDK_HOME
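
As a rough sketch, the corresponding .bashrc entries might look like the following; all paths here are hypothetical examples (adjust them to your own installation), not values from the original question:

# hypothetical .bashrc entries for a Spark 2.3 setup under Git Bash
export HADOOP_HOME=/c/hadoop
export SPARK_HOME=/c/spark-2.3.0-bin-hadoop2.7
export PYSPARK_PYTHON=/c/Python36/python
export JDK_HOME=/c/java/jdk1.8.0
export PATH=$PATH:$SPARK_HOME/bin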

Executing $SPARK_HOME/bin/spark-submit displays the following error:

Error: Could not find or load main class org.apache.spark.launcher.Main

I did some research on Stack Overflow and other sites, but could not figure out the problem.

Execution environment

1. Windows 10 Enterprise
2. Spark version - 2.3
3. Python version - 3.6.4

Can you provide some guidance?

Recommended answer

I had that error message. It may have several root causes, but this is how I investigated and solved the problem (on Linux):

• Instead of launching spark-submit directly, try bash -x spark-submit to see which line fails.
• Repeat that process several times (since spark-submit calls nested scripts) until you find the underlying process being invoked; in my case it was something like:

                  /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp '/opt/spark-2.2.0-bin-hadoop2.7/conf/:/opt/spark-2.2.0-bin-hadoop2.7/jars/*' -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name 'Spark shell' spark-shell

So, spark-submit launches a java process but can't find the org.apache.spark.launcher.Main class using the files in /opt/spark-2.2.0-bin-hadoop2.7/jars/* (see the -cp option above). I did an ls in that jars folder and counted 4 files instead of the whole Spark distribution (~200 files). It was probably a problem during the installation process, so I reinstalled Spark, checked the jars folder, and it worked like a charm.
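
As a minimal sketch of that tracing step (assuming the standard Spark layout, where spark-submit delegates to bin/spark-class), the two runs might look like this:

# step 1: trace the wrapper; -x echoes every command as it runs,
# revealing that spark-submit execs bin/spark-class
bash -x "$SPARK_HOME/bin/spark-submit" --version
# step 2: trace the nested script the same way, until the final
# 'java -cp ...' invocation (like the one shown above) is visible
bash -x "$SPARK_HOME/bin/spark-class" org.apache.spark.deploy.SparkSubmit --version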

So, you should:

• Check the java command (the -cp option).
• Check your jars folder (does it contain at least all of the spark-*.jar files?); see the sketch below.
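
A quick way to run that second check, as a sketch (assuming $SPARK_HOME points at the installation; the ~200 figure comes from the answer above):

# a healthy Spark 2.x distribution ships on the order of ~200 jars
ls "$SPARK_HOME/jars" | wc -l
# the class from the error message is packaged in the spark-launcher jar,
# so make sure it is present
ls "$SPARK_HOME/jars"/spark-launcher*.jar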

Hope this helps.

