In SQL, when we need to check multiple conditions against a column value, we use a CASE statement. Spark SQL DataFrames offer the same functionality through the when() function, chained once per conditional check, with otherwise() supplying the default; no CASE keyword is required. Note that when() and otherwise() come from org.apache.spark.sql.functions, which must be imported in spark-shell. So let's see an example of checking multiple conditions to replicate a SQL CASE statement in Spark SQL.

scala> import org.apache.spark.sql.functions._

scala> df_pres.select($"pres_name", $"pres_dob", $"pres_bs",
     |   when($"pres_bs" === "Virginia", "VA")
     |   .when($"pres_bs" === "Massachusetts", "MA")
     |   .when($"pres_bs" === "Ohio", "OH")
     |   .otherwise("Others").alias("state_abbr")).show()
+--------------------+----------+--------------------+----------+
|           pres_name|  pres_dob|             pres_bs|state_abbr|
+--------------------+----------+--------------------+----------+
|   George Washington|1732-02-22|            Virginia|        VA|
|          John Adams|1735-10-30|       Massachusetts|        MA|
|    Thomas Jefferson|1743-04-13|            Virginia|        VA|
|       James Madison|1751-03-16|            Virginia|        VA|
|        James Monroe|1758-04-28|            Virginia|        VA|
|         John Quincy...
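
As a cross-check, the same mapping can also be written as a literal SQL CASE expression via the expr() function, which likewise lives in org.apache.spark.sql.functions. This is a minimal sketch against the same df_pres DataFrame; it should produce an identical state_abbr column:

scala> df_pres.select($"pres_name", $"pres_dob", $"pres_bs",
     |   expr("""CASE pres_bs WHEN 'Virginia'      THEN 'VA'
     |                        WHEN 'Massachusetts' THEN 'MA'
     |                        WHEN 'Ohio'          THEN 'OH'
     |                        ELSE 'Others' END""").alias("state_abbr")).show()

The expr() form is convenient when porting existing SQL, while the chained when()/otherwise() form composes better with other Column expressions and catches typos in function names at compile time.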