Get elements of a struct-type Row field by name in Spark Scala


In a DataFrame in Apache Spark (I'm using the Scala interface), when iterating over Row objects, is there a way to extract struct values by name?

I am using the code below to extract values by name, but I'm facing a problem with how to read the struct value.

If the values had been of type String, I would have done this:

 val resultDF = joinedDF.rdd.map { row =>
   val id = row.getAs[Long]("id")
   val values = row.getAs[String]("slotsize")
   val feilds = row.getAs[String](values)
   (id, values, feilds)
 }.toDF("id", "values", "feilds")

But in my case, values has the schema below:

 v1: struct (nullable = true)
  |    |-- level1: string (nullable = true)
  |    |-- level2: string (nullable = true)
  |    |-- level3: string (nullable = true)
  |    |-- level4: string (nullable = true)
  |    |-- level5: string (nullable = true)

What should I replace the following line with to make the code work, given that the value has the structure above?

  row.getAs[String](values)

You can access the struct's elements by first extracting the nested Row (structs are modeled as Rows in Spark) from the top-level Row, like this:

 val level1 = row.getAs[Row]("struct").getAs[String]("level1")
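
Applied to the schema in the question, a minimal sketch could look like the following. It assumes the struct column is named v1 as in the printed schema, that the id and slotsize columns exist as in the original snippet, and that a SparkSession named spark is in scope to provide the toDF implicits:

 import org.apache.spark.sql.Row
 import spark.implicits._   // assumes a SparkSession named `spark` is in scope

 val resultDF = joinedDF.rdd.map { row =>
   val id     = row.getAs[Long]("id")          // plain columns are read as before
   val values = row.getAs[String]("slotsize")
   val v1     = row.getAs[Row]("v1")           // the struct column comes back as a nested Row
   val level1 = v1.getAs[String]("level1")     // its fields are then read by name
   (id, values, level1)
 }.toDF("id", "values", "level1")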
