
The whole Scala project for the UDF is here:

Flink_SQL_Client_UDF/Scala_fixed/

The steps I used to register the UDF are:

①mvn scala:compile package
②cp table_api-1.0-SNAPSHOT.jar $FLINK_HOME/lib
③add the following line to $FLINK_HOME/conf/flink-conf.yaml:
flink.execution.jars: $FLINK_HOME/lib/table_api-1.0-SNAPSHOT.jar
④create temporary function scalaupper as 'ScalaUpper';
⑤CREATE TABLE orders (
    order_uid  BIGINT,
    product_name String,
    price      DECIMAL(32, 2),
    order_time TIMESTAMP(3)
) WITH (
    'connector' = 'datagen'
);
⑥select scalaupper(product_name) from orders;
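For reference, the `ScalaUpper` class registered by name in step ④ would presumably look something like the minimal sketch below (the actual class in the `Scala_fixed` project may differ — only the class name and the fact that it is a scalar UDF come from the question):

```scala
import org.apache.flink.table.functions.ScalarFunction

// Minimal scalar UDF sketch: Flink discovers the eval method by reflection.
// The compiled class must be on the classpath of the SQL client / cluster,
// otherwise resolving it later fails with ClassNotFoundException.
class ScalaUpper extends ScalarFunction {
  def eval(s: String): String =
    if (s == null) null else s.toUpperCase
}
```

Note that this compiles only with the `flink-table` dependencies on the classpath, which is exactly the dependency the failing setup is missing at runtime.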

Then I got this error:

java.lang.ClassNotFoundException: ScalaUpper

Need your help, thanks!

1 Answer

@needhelp. Thanks for the detailed steps to reproduce the problem. I think you can solve this by using the -j option of the SQL client[1] to add the jar to the Java classpath; it works in my local environment. I can't find any mention of 'flink.execution.jars' in the configuration documentation[2], so I'm not sure that option works for the SQL client at all.
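Concretely, with the jar path from your steps, the client would be started like this (`-j`/`--jar` is the documented SQL client option for adding user jars):

```sh
# start the SQL client with the UDF jar on its classpath
$FLINK_HOME/bin/sql-client.sh embedded -j $FLINK_HOME/lib/table_api-1.0-SNAPSHOT.jar
```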

When a function is registered in the table environment, the function catalog only does a simple validation and adds the <identifier, class name> pair to a map; it does not load the class into the runtime. The class is loaded only when a job actually invokes the function — which is why step ④ succeeds and the ClassNotFoundException only appears when the query in step ⑥ runs.
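The behavior can be modeled with a simplified sketch (this is an illustration of the register-lazily/load-on-invoke pattern, not Flink's actual catalog code — `CatalogModel`, `registerFunction`, and `invoke` are made-up names):

```scala
import scala.collection.mutable

// Simplified model of a function catalog that stores only the
// <identifier, class name> pair at registration time.
object CatalogModel {
  private val functions = mutable.Map.empty[String, String]

  // Registration never touches the classloader, so it succeeds
  // even if the class is not on the classpath.
  def registerFunction(identifier: String, className: String): Unit =
    functions(identifier) = className

  // The class is resolved only at invocation time; a missing jar
  // surfaces here as ClassNotFoundException, not at registration.
  def invoke(identifier: String): Class[_] =
    Class.forName(functions(identifier))
}
```

In this model, `registerFunction("scalaupper", "ScalaUpper")` succeeds even without the jar, and `invoke("scalaupper")` is the call that throws — mirroring why the error shows up only at SELECT time.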

[1]https://ci.apache.org/projects/flink/flink-docs-master/dev/table/sqlClient.html#configuration [2]https://ci.apache.org/projects/flink/flink-docs-master/deployment/config.html

