
I am new to Spark, and I am trying to run this code in PySpark:

from pyspark import SparkConf, SparkContext
import collections

conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
sc = SparkContext(conf = conf)

but it gives me this error message:

Using Python version 3.5.2 (default, Jul  5 2016 11:41:13)
SparkSession available as 'spark'.
>>> from pyspark import SparkConf, SparkContext
>>> import collections
>>> conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
>>> sc = SparkContext(conf = conf)



Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\spark\python\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
    callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at C:\spark\bin\..\python\pyspark\shell.py:43
>>>

I have Spark 2.1.1 and Python 3.5.2. I searched and found that the problem is with sc: it could not be created. But I cannot tell why. Does anyone have any help here?
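For context, the traceback shows what is going on: PySpark allows only one active SparkContext per process, and the pyspark shell already created one at startup (shell.py:43 in the message), so your own `SparkContext(conf=conf)` call hits the guard in `_ensure_initialized`. A minimal plain-Python sketch of that behavior (`FakeSparkContext` is a hypothetical stand-in for illustration, not the real class):

```python
# Hypothetical stand-in for pyspark's SparkContext (NOT the real class):
# the real guard lives in SparkContext._ensure_initialized in pyspark/context.py.
class FakeSparkContext:
    _active = None  # process-wide singleton slot

    def __init__(self, app_name):
        if FakeSparkContext._active is not None:
            raise ValueError(
                "Cannot run multiple SparkContexts at once; existing "
                "SparkContext(app=%s)" % FakeSparkContext._active.app_name
            )
        self.app_name = app_name
        FakeSparkContext._active = self  # register this context as the active one

    @classmethod
    def getOrCreate(cls, app_name="app"):
        # Reuse the active context instead of raising -- this is why the
        # getOrCreate answers work inside the pyspark shell.
        if cls._active is None:
            cls(app_name)  # the constructor registers itself in _active
        return cls._active

    def stop(self):
        FakeSparkContext._active = None  # free the slot for a new context


shell_sc = FakeSparkContext("PySparkShell")  # what the pyspark shell does at startup
try:
    FakeSparkContext("RatingsHistogram")     # the question's second constructor call
except ValueError as exc:
    print(exc)                               # Cannot run multiple SparkContexts at once; ...

sc = FakeSparkContext.getOrCreate()          # safely reuses the shell's context
print(sc is shell_sc)                        # True
```

So inside the pyspark shell you either reuse the existing context (`getOrCreate`) or stop it first (`sc.stop()`); constructing a second one directly always raises.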

  • You are running your code inside of pyspark2, which creates a SparkSession for you already. Don't use the pyspark shell if you are creating your own SparkContext. Save your code into a Python file and submit it via spark-submit. Commented Sep 21, 2017 at 19:57
  • @arun Post it as an answer. Commented Sep 21, 2017 at 19:59
  • It runs in the Windows shell, but not in Jupyter Notebook or Canopy. I will track down the problem from there. Thank you a lot @arun. Commented Sep 21, 2017 at 20:19

4 Answers


You can try this:

sc = SparkContext.getOrCreate()


You can try:

sc = SparkContext.getOrCreate(conf=conf)


Your previous session is still active. You can run:

sc.stop()


It can also run through JupyterLab, but since your previous session is still running and local mode cannot run two sessions at a time, you have to use:

sc = SparkContext.getOrCreate(conf=conf)
