
I would like to create a UDF inside a specific dataset in BigQuery. I've tried using create_table and other API methods from the Python client, but none seem to work.

Is there any method or way to upload a UDF (stored in a file and read in as a string) into BigQuery?

1 Answer


This is my UDF:

$ cat test.udf
CREATE OR REPLACE FUNCTION dataset_name.addFourAndDivideAny(x ANY TYPE, y ANY TYPE) AS ((x + 4) / y);

This is my upload command, run on Cloud Shell (it will ask you to authorise):

bq query --nouse_legacy_sql  < test.udf

In Python:

from google.cloud import bigquery

client = bigquery.Client()

# Read the UDF definition from the file; join lines with spaces so a
# multi-line file still produces valid SQL.
with open('test.udf', 'r') as file:
    QUERY = file.read().replace('\n', ' ')

# Run the DDL statement and wait for it to finish.
query_job = client.query(QUERY)
query_job.result()

My UDF is on a single line in the file, but a multi-line file works as well; just make sure the QUERY string is constructed properly (for example, join the lines with spaces rather than stripping the newlines).
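In fact, BigQuery accepts multi-line SQL as-is, so you can also pass the file contents through unchanged. A minimal sketch, assuming the same test.udf file and default credentials:

from google.cloud import bigquery

client = bigquery.Client()

# Pass the file contents through unchanged; BigQuery is happy with
# multi-line SQL, so no newline handling is needed.
with open('test.udf', 'r') as file:
    ddl = file.read()

# Run the CREATE FUNCTION statement and block until it completes.
client.query(ddl).result()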


3 Comments

This works perfectly using bq in bash, but I was wondering if this could also be achieved in Python?
@Julian - I have added the Python way as well.
Yes, I did it the same way. I was creating a job_config and calling methods like create_table and some others instead of just querying what I needed. Thanks so much :)
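For completeness, the Python client library also exposes a routines API (client.create_routine), which is the DDL-free counterpart of what the question was attempting with create_table. Below is a minimal sketch for the same UDF; the project and dataset names are placeholders, and the ANY TYPE parameters are expressed via the argument kind:

from google.cloud import bigquery

client = bigquery.Client()

# Fully qualified routine ID: project.dataset.function_name (placeholders).
routine_id = "your-project.dataset_name.addFourAndDivideAny"

routine = bigquery.Routine(
    routine_id,
    type_="SCALAR_FUNCTION",
    language="SQL",
    body="(x + 4) / y",
    arguments=[
        # ANY TYPE parameters are declared by kind rather than a fixed data type.
        bigquery.RoutineArgument(name="x", kind="ANY_TYPE"),
        bigquery.RoutineArgument(name="y", kind="ANY_TYPE"),
    ],
)

client.create_routine(routine)

Note that, unlike CREATE OR REPLACE FUNCTION, create_routine fails with a conflict error if the function already exists.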
