3

We have deployed our ML pipeline (built with SDK v2) to a batch endpoint using PipelineComponentBatchDeployment. The pipeline takes multiple arguments (all of type str). How can we trigger or invoke the endpoint with different arguments from within ADF?

Previously, with the Azure ML SDK v1, we used to publish the pipelines, and in ADF we had a "Machine Learning Execute Pipeline" activity that would invoke the published pipeline with different arguments.

But now (with SDK v2) we have batch endpoints instead.

We also checked the pipeline component in the studio, but could not see its arguments there.

Is there a way to see what arguments the pipeline takes, like in SDK v1?

How do we use ADF to invoke the endpoint with different inputs?

Thanks,

2
  • How did you create this component? Please provide the code you used to create it. Commented Mar 15, 2024 at 3:31
  • I have given the solution. If you want to send string literals or custom inputs to your endpoint, you may try writing a custom scoring script and using that for the deployment. You can check learn.microsoft.com/en-us/azure/machine-learning/… Commented Mar 18, 2024 at 12:21

2 Answers

1

I followed this:

https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-batch-azure-data-factory?view=azureml-api-2&tabs=mi

Also, it's possible to send inputs that are plain str values using literal inputs: https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/machine-learning/how-to-access-data-batch-endpoints-jobs.md#create-jobs-with-literal-inputs
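For reference, a literal input in the REST request body would look roughly like this. This is only a sketch based on the schema described on that page; the input name score_mode and its value are made-up examples, not the asker's actual pipeline arguments:

```python
import json

def literal_input_body(name: str, value: str) -> dict:
    # REST payload for a batch job with one literal (string) input,
    # following the literal-inputs schema in the linked doc.
    return {
        "properties": {
            "InputData": {
                name: {
                    "JobInputType": "Literal",
                    "Value": value,
                }
            }
        }
    }

# Hypothetical string argument for a deployed pipeline component.
print(json.dumps(literal_input_body("score_mode", "append"), indent=2))
```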


1 Comment

The GitHub link is broken. Any updated link or references?
1

As mentioned in the given documentation, to invoke batch endpoints from Azure Data Factory, you need to use a Web activity. Follow the steps below:

First, you need to have a token for authorization. Run the following commands in Azure CLI and get the token:

az login
az account get-access-token --resource https://ml.azure.com --query "accessToken" --output tsv


You will get the token; copy it to an editor and prefix it with the word Bearer. Make sure there is a single space between Bearer and the token.
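In code, the header format described above amounts to the following (a minimal Python sketch; the token value printed here is a placeholder, not a real token):

```python
def auth_header(access_token: str) -> dict:
    # The Authorization value is the word "Bearer", exactly one space,
    # then the raw token returned by `az account get-access-token`.
    return {"Authorization": f"Bearer {access_token}"}

# Placeholder token for illustration only.
print(auth_header("eyJ0eXAiOiJKV1Qi...")["Authorization"])
```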

For more information about authorization, refer to this documentation.

Now, open Data Factory and add a Web activity. In the Parameters tab, create a new parameter called aad_token of type SecureString and paste the Bearer-prefixed token into the Default value field.


Next, configure details like below in the Settings tab.

URL: https://xyx.centralindia.inference.ml.azure.com/jobs

Method: POST

Body:

{
    "properties": {
        "InputData": {
            "credit_dataset": {
                "JobInputType": "UriFolder",
                "Uri": "https://jgsml4897076647.blob.core.windows.net/azureml-blobstore-f218745d-63e0-4947-bdaf-493fef6422ae/modelDataCollector/jgsml-cyxkp/credit-defaults-model-1/model_inputs/"
            }
        }
    }
}

Here, I have set the input type to UriFolder and passed the blob storage folder path as the data URI.
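Outside ADF, the same request can be assembled (and, with a valid token, smoke-tested) from a short script. This is only a sketch: the endpoint URL, token, and storage path below are placeholders you would replace with your own values:

```python
import json

# Placeholders: copy the real job-submission URL (it ends in /jobs) from
# your batch endpoint's details page, and the token from
# `az account get-access-token`.
url = "https://<endpoint>.<region>.inference.ml.azure.com/jobs"
headers = {
    "Authorization": "Bearer <token>",
    "Content-Type": "application/json",
}

# Same shape as the Web activity body above: one UriFolder input.
body = {
    "properties": {
        "InputData": {
            "credit_dataset": {
                "JobInputType": "UriFolder",
                "Uri": "https://<account>.blob.core.windows.net/<container>/<path>/",
            }
        }
    }
}

payload = json.dumps(body)

# To actually submit (requires network access and a valid token):
# import requests
# response = requests.post(url, headers=headers, data=payload)
# print(response.status_code, response.json())
```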

Usually, the batch endpoints take a list of paths as input, as defined in the scoring script.

See this documentation for a sample scoring script that takes a list of paths.
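As a rough illustration (not the actual script from that documentation), a batch scoring entry script follows the init()/run(mini_batch) pattern, where run receives a list of file paths for each mini-batch. The "scoring" below is a placeholder that just counts rows:

```python
import csv
import os

def init():
    # Called once per worker before scoring starts; in a real deployment
    # you would load your model here (e.g. from the AZUREML_MODEL_DIR path).
    global model
    model = None  # placeholder

def run(mini_batch):
    # `mini_batch` is a list of file paths. Return one result per file;
    # batch endpoints expect a list (or DataFrame) as the output.
    results = []
    for file_path in mini_batch:
        with open(file_path, newline="") as f:
            rows = list(csv.reader(f))
        # Placeholder "scoring": report the number of data rows per file.
        results.append(f"{os.path.basename(file_path)}: {len(rows) - 1} rows")
    return results
```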

In the Headers section, set Authorization to the pipeline parameter you created (aad_token).


Next, run the activity, and it will start the batch job.


Regarding passing literal strings: this is not supported, according to the documentation provided for Data Factory:

endpoint_input_type: The type of the input data you are providing. Currently, batch endpoints support folders (UriFolder) and files (UriFile). Defaults to UriFolder.
