
I want to assign a Spark SQL function to a variable. For example:

    val func = org.apache.spark.sql.functions.max(_)

This is the usual way of creating a partially applied function, but when I do this I get the following error:

  Cannot resolve overloaded method max

I searched online but couldn't find a solution. Does anyone have an idea?

1 Answer


Note that max is an overloaded function: org.apache.spark.sql.functions defines both max(e: Column): Column and max(columnName: String): Column.

So either:

import org.apache.spark.sql.Column

scala> val func: Column => Column = org.apache.spark.sql.functions.max
func: org.apache.spark.sql.Column => org.apache.spark.sql.Column = <function1>

or

scala> val func: String => Column = org.apache.spark.sql.functions.max
func: String => org.apache.spark.sql.Column = <function1>
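
Either alias is then an ordinary function value. A minimal sketch of how it could be used, assuming a local SparkSession and a hypothetical DataFrame with a numeric "value" column:

    import org.apache.spark.sql.{Column, SparkSession}
    import org.apache.spark.sql.functions.{col, max}

    val spark = SparkSession.builder().master("local[*]").appName("max-alias").getOrCreate()
    import spark.implicits._

    // hypothetical sample data with a numeric "value" column
    val df = Seq(1, 5, 3).toDF("value")

    // alias the Column => Column overload of max
    val func: Column => Column = max

    // use it like any other Column => Column function
    df.agg(func(col("value"))).show()   // single row: max(value) = 5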

Out of interest, why do you want to do this?


4 Comments

Thank you! Why does the type annotation make it work? How is it related to max being an overloaded function?
There are two definitions of max, Column => Column and String => Column. Declaring the type tells the compiler which one we are aliasing.
OK, I see. And what about the _ parameter? Doesn't it need to be provided?
It doesn't have to be provided in this case.
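
To make the point about the underscore concrete, here is a sketch of the two equivalent spellings: an expected type on the val (which triggers eta expansion of the matching overload), or an underscore with an explicit type ascription.

    import org.apache.spark.sql.Column
    import org.apache.spark.sql.functions.max

    // expected type on the val: the compiler picks the Column => Column overload
    val byAnnotation: Column => Column = max

    // underscore with a type ascription resolves the overload the same way
    val byUnderscore = max(_: Column)

    // both are plain function values of type Column => Column
    val check: Column => Column = byUnderscore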
