Why does Spark not run locally, even though the documentation says it should?


The aim is to get started with Spark by executing the examples and investigating the output.

I have cloned the Apache Spark repository and built it following the instructions in the README. Running ./bin/spark-shell results in:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
16/11/10 08:47:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:505)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:490)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
16/11/10 08:47:48 ERROR SparkContext: Error stopping SparkContext after init error.
java.lang.NullPointerException
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1764)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:591)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2309)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:843)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:835)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:835)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:433)
  at sun.nio.ch.Net.bind(Net.java:425)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
  at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:505)
  at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:490)
  at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
  at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
  at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
  at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
  at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
  at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
  at java.lang.Thread.run(Thread.java:745)
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_92)
Type in expressions to have them evaluated.
Type :help for more information.
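The error message itself suggests a workaround: pin the driver services to an address and port that can actually be bound, or raise the retry budget. As a speculative experiment (spark.driver.host and spark.port.maxRetries are standard Spark properties; the values here are only illustrative), one could launch the shell as:

./bin/spark-shell --conf spark.driver.host=127.0.0.1 --conf spark.port.maxRetries=32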

Running one of the examples fails as well:

scala> sc.parallelize(1 to 1000).count()
<console>:18: error: not found: value sc
       sc.parallelize(1 to 1000).count()
       ^
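For reference, once the driver binds successfully, spark-shell pre-defines the SparkContext as sc, so the same expression should evaluate; a sketch of the expected session:

scala> sc.parallelize(1 to 1000).count()
res0: Long = 1000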

Try the following two directions:

1) Help Spark find the machine's IP:

If your hostname isn't included in /etc/hosts, add it to /etc/hosts:

127.0.0.1      your_hostname 

Or set the environment variable SPARK_LOCAL_IP="127.0.0.1", as sketched below.
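A minimal sketch of both variants, assuming a Linux/macOS shell run from the cloned Spark directory (the /etc/hosts edit needs root):

# Map the machine's hostname to the loopback address.
echo "127.0.0.1   $(hostname)" | sudo tee -a /etc/hosts

# Or sidestep hostname resolution for this session only.
export SPARK_LOCAL_IP="127.0.0.1"
./bin/spark-shell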

2) If one exists, kill any old Spark process that may still be holding the driver ports:
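A hedged way to look for and stop a leftover driver process (names depend on how Spark was launched, so check the jps output before killing anything):

# List running JVMs; look for SparkSubmit, Master, or Worker entries.
jps -l

# Stop the offending PID, or match Spark classes by name (use with care).
kill <pid>
pkill -f org.apache.spark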

