java - How to pass parameters / properties to Spark jobs with spark-submit


I am running a Spark job implemented in Java using spark-submit. I would like to pass parameters to the job, e.g. a time-start and a time-end parameter, to parametrize the Spark application.

What I tried was using the

--conf key=value 

option of the spark-submit script, but when I try to read the parameter in the Spark job with

sparkContext.getConf().get("key")

I get an exception:

Exception in thread "main" java.util.NoSuchElementException: key

Furthermore, when I use sparkContext.getConf().toDebugString(), I don't see the value in the output.
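For reference, a minimal sketch of the attempted approach (the class and app name are placeholders; the key is the custom property described above):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("MyApp");
        JavaSparkContext sparkContext = new JavaSparkContext(conf);

        // submitted with: spark-submit --conf key=value ... app.jar
        // this lookup fails with java.util.NoSuchElementException: key,
        // because the arbitrary "key" never makes it into the SparkConf
        String value = sparkContext.getConf().get("key");
        System.out.println(value);
    }
}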

Further notice: since I want to submit the Spark job via the Spark REST service, I cannot use OS environment variables or the like.

Is there a possibility to implement this?

Since you want to use your own custom properties, you need to place them after application.jar in spark-submit (like in the Spark example below, [application-arguments] should be your properties). --conf should be reserved for Spark configuration properties.

--conf: Arbitrary Spark configuration property in key=value format. For values that contain spaces, wrap “key=value” in quotes (as shown).

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf <key>=<value> \
  ... # other options
  <application-jar> \
  [application-arguments]   <-- here go our app arguments
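As a concrete illustration (the class, master and jar names are placeholders; the two key=value pairs at the end are the custom properties from the question and arrive in the application's main(String[] args)):

./bin/spark-submit \
  --class com.example.MyApp \
  --master local[2] \
  app.jar \
  time-start=2017-01-01 time-end=2017-01-02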

So when you do: spark-submit .... app.jar key=value, then in the main method args[0] will be key=value.

public static void main(String[] args) {
    String firstArg = args[0]; // e.g. key=value
}

But if you want to use key-value pairs, you need to parse the app arguments somehow; a minimal sketch is shown below.
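For example, a simple hand-rolled parser can split each key=value argument into a map (a minimal sketch; the class name and the time-start/time-end keys are just illustrative):

import java.util.HashMap;
import java.util.Map;

public class MyApp {
    public static void main(String[] args) {
        // collect all key=value arguments into a map
        Map<String, String> params = new HashMap<>();
        for (String arg : args) {
            String[] parts = arg.split("=", 2); // split only on the first '='
            if (parts.length == 2) {
                params.put(parts[0], parts[1]);
            }
        }

        String timeStart = params.get("time-start"); // e.g. "2017-01-01"
        String timeEnd = params.get("time-end");     // e.g. "2017-01-02"
        // ... set up SparkConf / JavaSparkContext and use the values here
    }
}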

You can check the Apache Commons CLI library or an alternative.
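A rough sketch with Commons CLI might look like this (assuming commons-cli 1.3+ on the classpath; the class and option names are only illustrative):

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.DefaultParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;

public class MyApp {
    public static void main(String[] args) throws ParseException {
        // declare the expected application arguments
        Options options = new Options();
        options.addOption("s", "time-start", true, "start of the time window");
        options.addOption("e", "time-end", true, "end of the time window");

        // parse whatever spark-submit passed after the application jar
        CommandLine cmd = new DefaultParser().parse(options, args);
        String timeStart = cmd.getOptionValue("time-start");
        String timeEnd = cmd.getOptionValue("time-end");
        // ... use the values to configure the Spark job
    }
}

With this style, the arguments after the application jar would be passed as --time-start 2017-01-01 --time-end 2017-01-02 instead of key=value pairs.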

