Multiple SparkContexts detected in the same JVM

      Date: 2023-08-23

                This article describes how to deal with the "Multiple SparkContexts detected in the same JVM" error. The question and the recommended answer below should be a useful reference for anyone running into the same problem.

                Problem description

                Following up on my last question, I have to define multiple SparkContexts in a single JVM.

                I did it in the following way (using Java):

                SparkConf conf = new SparkConf();
                conf.setAppName("Spark MultipleContest Test");
                conf.set("spark.driver.allowMultipleContexts", "true");
                conf.setMaster("local");
                

                After that, I created the following code:

                SparkContext sc = new SparkContext(conf);
                SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
                

                And later on, this code:

                JavaSparkContext ctx = new JavaSparkContext(conf);
                JavaRDD<Row> testRDD = ctx.parallelize(AllList);
                

                After executing the code, I got the following error message:

                16/01/19 15:21:08 WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
                org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
                org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
                test.MLlib.BinarryClassification.main(BinaryClassification.java:41)
                    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2083)
                    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2065)
                    at scala.Option.foreach(Option.scala:236)
                    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2065)
                    at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2151)
                    at org.apache.spark.SparkContext.<init>(SparkContext.scala:2023)
                    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
                    at test.MLlib.BinarryClassification.main(BinaryClassification.java:105)
                

                The numbers 41 and 105 are the lines where the two contexts are created in the Java code. My question is: is it possible to run multiple SparkContexts in the same JVM, and how do I do it, given that I already use the set method?
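
                As a side note, if the two contexts are not actually needed at the same time, a common workaround (a minimal sketch, not taken from the original post; it reuses conf and AllList from the snippets above) is to stop the first SparkContext before creating the second one:

                SparkContext sc = new SparkContext(conf);
                SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
                // ... do the SQL work here ...

                //Release the first context so no SparkContext is running in this JVM
                sc.stop();

                //A second context can now be created without the multiple-contexts warning
                JavaSparkContext ctx = new JavaSparkContext(conf);
                JavaRDD<Row> testRDD = ctx.parallelize(AllList);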

                Recommended answer

                Are you sure you need the JavaSparkContext as a separate context? The previous question you refer to doesn't say so. If you already have a SparkContext, you can create a new JavaSparkContext from it rather than creating a separate context:

                SparkConf conf = new SparkConf();
                conf.setAppName("Spark MultipleContest Test");
                conf.set("spark.driver.allowMultipleContexts", "true");
                conf.setMaster("local");
                
                SparkContext sc = new SparkContext(conf);
                SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
                
                //Create a Java context that wraps the same Scala SparkContext under the hood
                JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);
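
                For completeness, the JavaSparkContext obtained this way can then be used for the asker's RDD work; a minimal sketch, assuming jsc is the context created above and AllList is the List<Row> from the question:

                //jsc wraps the existing sc, so no second SparkContext is started;
                //testRDD and sqlContext now share the single context in this JVM
                JavaRDD<Row> testRDD = jsc.parallelize(AllList);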
                

                This concludes the article on "Multiple SparkContexts detected in the same JVM"; hopefully the recommended answer above helps.
