Can't connect to Spark with sparklyr
I am trying to connect to Spark with sparklyr, but I am getting the following error:
Error in start_shell(master = master, spark_home = spark_home, spark_version = version, ...): Failed to launch Spark shell. Ports file does not exist.
    Path: c:\users\k\appdata\local\rstudio\spark\cache\spark-1.6.2-bin-hadoop2.6\bin\spark-submit.cmd
    Parameters: --class, sparklyr.Backend, --packages, "com.databricks:spark-csv_2.11:1.3.0", "com.amazonaws:aws-java-sdk-pom:1.10.34", "c:\users\k\documents\r\r-3.2.4revised\library\sparklyr\java\sparklyr-1.6-2.10.jar", c:\users\k\appdata\local\temp\rtmpajvihg\file12543ce11e0f.out

Error occurred during initialization of VM
Could not reserve enough space for 1048576KB object heap
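For context, a minimal connection attempt along these lines produces the spark-submit call shown in the error. This is only a sketch: the master, version, and driver-memory values are illustrative, and the lowered driver memory reflects the fact that the second error says the JVM could not reserve the default 1 GB (1048576 KB) heap, which often happens with 32-bit Java or a memory-constrained machine.

```r
library(sparklyr)

# Illustrative config: lower the driver memory so the JVM does not try to
# reserve the full default 1 GB heap at startup.
config <- spark_config()
config[["sparklyr.shell.driver-memory"]] <- "512M"

# Connect to a local Spark 1.6.2 installation (matching the cached install above).
sc <- spark_connect(master  = "local",
                    version = "1.6.2",
                    config  = config)
```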
Any suggestions? Thanks!