scala - Spark: convert a single column into an array


How can I convert a single column into an array in Spark 2.0.1?

+---+-----+
| id| dist|
+---+-----+
|1.0|  2.0|
|2.0|  4.0|
|3.0|  6.0|
|4.0|  8.0|
+---+-----+

It should return Array(1.0, 2.0, 3.0, 4.0).

Answer

import scala.collection.JavaConverters._

df.select("id").collectAsList.asScala.toArray

This fails with:

java.lang.RuntimeException: Unsupported array type: [Lorg.apache.spark.sql.Row;

Why use JavaConverters just to turn a Java list back into a Scala collection? You only need to collect the Dataset and map the resulting array of Rows to an array of Doubles:

df.select("id").collect.map(_.getDouble(0))
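As an alternative sketch (assuming Spark 2.x and that `spark.implicits._` is in scope to provide the `Double` encoder), the column can also be read as a typed `Dataset[Double]`, which avoids handling `Row` objects entirely. The `SparkSession` setup and sample data below are illustrative, not from the original question:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder
  .appName("single-column-to-array")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Sample data mirroring the DataFrame in the question.
val df = Seq((1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)).toDF("id", "dist")

// Select the column as a typed Dataset[Double], then collect it to the driver.
val ids: Array[Double] = df.select("id").as[Double].collect()
// ids: Array(1.0, 2.0, 3.0, 4.0)
```

With `as[Double]`, the conversion from `Row` happens inside Spark via the encoder, so no `map(_.getDouble(0))` step is needed after `collect()`.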
