scala - Spark SQL: Implement an AND condition inside a CASE statement


I am aware of how to implement a simple CASE-WHEN-THEN clause in Spark SQL using Scala; I am using version 1.6.2. However, I need to specify an AND condition on multiple columns inside the CASE-WHEN clause. How can I achieve this in Spark using Scala?

Thanks in advance for your time and help!

Here's the SQL query I have:

    SELECT sd.standardizationId,
           CASE WHEN sd.numberOfShares = 0
                 AND ISNULL(sd.derivatives, 0) = 0
                 AND sd.holdingTypeId NOT IN (3, 10)
                THEN 8
                ELSE holdingTypeId
           END AS holdingTypeId
    FROM sd;
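
One way to run this is to pass the whole statement to the SQL API as a string. Below is a minimal sketch, assuming the table is registered with the SQLContext as "sd"; note that COALESCE stands in for T-SQL's two-argument ISNULL, which Spark SQL does not provide:

    import org.apache.spark.sql.DataFrame

    // Minimal sketch: run the CASE/AND query as a plain SQL string
    // against a SQLContext named sqlContext with a registered table "sd".
    // COALESCE(derivatives, 0) replaces T-SQL's ISNULL(derivatives, 0).
    val result: DataFrame = sqlContext.sql(
      """SELECT sd.standardizationId,
        |       CASE WHEN sd.numberOfShares = 0
        |             AND COALESCE(sd.derivatives, 0) = 0
        |             AND sd.holdingTypeId NOT IN (3, 10)
        |            THEN 8
        |            ELSE sd.holdingTypeId
        |       END AS holdingTypeId
        |FROM sd""".stripMargin)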

An alternative option, if you want to avoid using the full string expression, is the following:

    import org.apache.spark.sql.Column
    import org.apache.spark.sql.functions._

    val sd = sqlContext.table("sd")

    // Combine the three predicates with && inside a single when();
    // coalesce(derivatives, 0) mirrors ISNULL(derivatives, 0).
    val conditionedColumn: Column = when(
      (sd("numberOfShares") === 0) &&
      (coalesce(sd("derivatives"), lit(0)) === 0) &&
      (!sd("holdingTypeId").isin(Seq(3, 10): _*)),
      8
    ).otherwise(sd("holdingTypeId")).as("holdingTypeId")

    val result = sd.select(sd("standardizationId"), conditionedColumn)
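
For comparison, the "full string expression" route can also be written against the DataFrame directly with selectExpr. This is a sketch under the same assumptions (table and column names as above, COALESCE in place of ISNULL):

    // Sketch: embed the CASE/AND logic in a selectExpr string
    // instead of composing Column objects.
    val resultExpr = sd.selectExpr(
      "standardizationId",
      """CASE WHEN numberOfShares = 0
              AND COALESCE(derivatives, 0) = 0
              AND holdingTypeId NOT IN (3, 10)
             THEN 8
             ELSE holdingTypeId
         END AS holdingTypeId""")

Both approaches produce the same result; the Column-based version avoids embedding SQL text and catches misspelled function names at compile time.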
