py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

I am currently using JRE 1.8.0_181, Python 3.6.4, and Spark 2.3.2.

I am trying to run the following code in Python:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('Basics').getOrCreate()

It fails with the following error:

spark = SparkSession.builder.appName('Basics').getOrCreate()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\sql\session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\context.py", line 349, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "C:\Tools\Anaconda3\lib\site-packages\py4j\java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

Does anyone have any idea what the underlying problem might be here?

Any help or feedback is appreciated. Thanks!


As outlined in "pyspark error does not exist in the jvm error when initializing SparkContext", adding a PYTHONPATH environment variable with the value

%SPARK_HOME%\python;%SPARK_HOME%\python\lib\py4j-<version>-src.zip;%PYTHONPATH%

(just check which py4j version sits in your spark/python/lib folder) resolved the issue.
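
If you prefer to do the same fix-up from inside Python instead of through Windows environment variables, here is a minimal sketch along the same lines. The Spark install path below is an assumption; point SPARK_HOME at wherever your Spark 2.3.2 actually lives, and the glob avoids hard-coding the py4j version:

import glob
import os
import sys

# Assumed install location -- adjust to your own Spark 2.3.2 directory.
os.environ["SPARK_HOME"] = r"C:\spark-2.3.2-bin-hadoop2.7"

spark_python = os.path.join(os.environ["SPARK_HOME"], "python")

# Pick up whichever py4j zip ships with this Spark rather than hard-coding a version.
py4j_zip = glob.glob(os.path.join(spark_python, "lib", "py4j-*-src.zip"))[0]

# Prepend both paths so `import pyspark` resolves to Spark's bundled copy,
# keeping the Python side in sync with the JVM side.
sys.path[:0] = [spark_python, py4j_zip]

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('Basics').getOrCreate()
print(spark.version)

The findspark package automates the same sys.path adjustment: pip install findspark, then run import findspark; findspark.init() before importing pyspark.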