
I want to convert the following query to Spark SQL using the Scala API:

select ag.part_id name
from sample c
join testing ag
  on c.part = ag.part
 and concat(c.firstname, c.lastname) not like 'Dummy%'

Any ideas?

Thanks in advance

3 Answers


Maybe this would work:

import org.apache.spark.sql.functions._

val c = sqlContext.table("sample")
val ag = sqlContext.table("testing")

val fullnameCol = concat(c("firstname"), c("lastname"))

val resultDF = c
  .join(ag, (c("part") === ag("part")) && !fullnameCol.like("Dummy%"))
  .select(ag("part_id").as("name"))
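For reference, since `sample` and `testing` are already registered as tables (they are read with `sqlContext.table` above), the original query can also be run verbatim through the SQL parser. A sketch, using the same `sqlContext` as above:

```scala
// Sketch: run the original SQL as-is; "not like" is handled by the parser.
val sqlResult = sqlContext.sql(
  """select ag.part_id name
    |from sample c
    |join testing ag
    |  on c.part = ag.part
    | and concat(c.firstname, c.lastname) not like 'Dummy%'""".stripMargin)
```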

For more information about the functions used above, see the Spark SQL `org.apache.spark.sql.functions` API documentation.


4 Comments

Hi, I am getting the error below: `<console>:42: error: value like is not a member of org.apache.spark.sql.DataFrame`
@praveen Can you share your code? Maybe you are calling the `like` function on the wrong object.
Hi, thanks, I got it working. I had made a mistake. Thank you :)
@praveen: Good to know you succeeded! If you don't mind, please consider marking the answer as correct if it helped. Cheers

Do you mean this:

df.filter("filed1 not like 'Dummy%'").show

or, using the Column API (`! like` is not valid in a SQL expression string):

df.filter(!df("filed1").like("Dummy%")).show
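The string form is parsed by Spark's SQL expression parser, which is why standard `NOT LIKE` works there. Equivalently, the fragment can be wrapped with `expr` and negated with the `not` function (a sketch; both `expr` and `not` live in `org.apache.spark.sql.functions`):

```scala
import org.apache.spark.sql.functions.{expr, not}

// not(...) negates a Column predicate, same as the ! operator.
df.filter(not(expr("filed1 like 'Dummy%'"))).show()
```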



Use it like this:

df.filter(!'col1.like("%COND%")).show
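Note that the `'col1` symbol syntax only compiles with the SQL implicits in scope. A minimal sketch, assuming the same `sqlContext` as in the accepted answer:

```scala
import sqlContext.implicits._ // brings the 'col1 / $"col1" column syntax into scope

df.filter(!'col1.like("%COND%")).show()
```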

Comments
