Apache Spark: Error while starting PySpark
On a CentOS machine with Python v2.6.6 and Apache Spark v1.2.1,
I am getting the following error when trying to run ./pyspark.
It seems to be a Python issue that I am not able to figure out.
15/06/18 08:11:16 INFO spark.SparkContext: Stopped SparkContext
Traceback (most recent call last):
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/shell.py", line 45, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/context.py", line 105, in __init__
    conf, jsc)
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/context.py", line 157, in _do_init
    self._accumulatorServer = accumulators._start_update_server()
  File "/usr/lib/spark_1.2.1/spark-1.2.1-bin-hadoop2.4/python/pyspark/accumulators.py", line 269, in _start_update_server
    server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
  File "/usr/lib64/python2.6/SocketServer.py", line 402, in __init__
    self.server_bind()
  File "/usr/lib64/python2.6/SocketServer.py", line 413, in server_bind
    self.socket.bind(self.server_address)
  File "<string>", line 1, in bind
socket.gaierror: [Errno -2] Name or service not known
>>> 15/06/18 08:11:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/06/18 08:11:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
From the logs it looks like PySpark is unable to resolve the host localhost. Please check the /etc/hosts file; if there is no entry for localhost, adding one should resolve the issue (a quick check to confirm the diagnosis is shown below, followed by an example entry).
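To verify that name resolution is really the problem, you can run a minimal check with only the Python standard library, in the same environment that launches PySpark:

import socket

# Raises socket.gaierror: [Errno -2] Name or service not known
# (the same error as in the traceback) if "localhost" does not resolve.
print(socket.gethostbyname("localhost"))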
An example /etc/hosts entry:
[IP] [hostname] localhost
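On most Linux systems the standard loopback entry, which is all PySpark needs here, looks like:

127.0.0.1   localhost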
In case you are not able to change the hosts entry on the server, edit python/pyspark/accumulators.py, line 269, as below:
server = AccumulatorServer(("[server hostname from hosts file]", 0), _UpdateRequestHandler)
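Alternatively (a workaround sketch, not part of the original answer): the numeric loopback address needs no name lookup at all, so hard-coding it on that same line also sidesteps the resolution error:

server = AccumulatorServer(("127.0.0.1", 0), _UpdateRequestHandler)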