As described in the documentation, you invoke batch endpoints from Azure Data Factory with a Web activity. Follow the steps below:
First, you need a token for authorization. Run the following commands in the Azure CLI to get one:
az login
az account get-access-token --resource https://ml.azure.com --query "accessToken" --output tsv

You will get the token; copy it into any editor and prepend the word Bearer to it.

Make sure there is a single space between Bearer and the token.
For more information about authorization, refer to this documentation.
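As a small sketch of the step above (the token value here is a placeholder, not a real token), the header value is simply the literal word Bearer, one space, then the raw token returned by the CLI:

```python
# Placeholder token for illustration; use the real output of
# `az account get-access-token` instead.
raw_token = "eyJ0eXAiOiJKV1QiLCJhbGciOi..."

# The Authorization value is the word "Bearer", one space, then the token.
auth_value = "Bearer " + raw_token

print(auth_value[:20])
```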
Now, open Data Factory and add a Web activity. In the Parameters tab of the pipeline, create a new parameter named aad_token of type SecureString and paste the Bearer-prefixed token into its default value.

Next, configure the Settings tab as shown below.
URL: https://xyx.centralindia.inference.ml.azure.com/jobs
Method: POST
Body:
{
    "properties": {
        "InputData": {
            "credit_dataset": {
                "JobInputType": "UriFolder",
                "Uri": "https://jgsml4897076647.blob.core.windows.net/azureml-blobstore-f218745d-63e0-4947-bdaf-493fef6422ae/modelDataCollector/jgsml-cyxkp/credit-defaults-model-1/model_inputs/"
            }
        }
    }
}
Here, I have set the input type to UriFolder and passed the blob folder URL as the data path.
Batch endpoints usually take a list of file paths as input, as defined in the scoring script.
See this documentation for a sample scoring script that accepts a list of paths.
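The request body above can also be assembled programmatically. This Python sketch uses the example input name and blob URL from this answer (swap in your own values) and produces the same JSON:

```python
import json

# Example values from above; replace with your own input name and blob folder URL.
input_name = "credit_dataset"
input_uri = ("https://jgsml4897076647.blob.core.windows.net/"
             "azureml-blobstore-f218745d-63e0-4947-bdaf-493fef6422ae/"
             "modelDataCollector/jgsml-cyxkp/credit-defaults-model-1/model_inputs/")

body = {
    "properties": {
        "InputData": {
            input_name: {
                "JobInputType": "UriFolder",
                "Uri": input_uri,
            }
        }
    }
}

print(json.dumps(body, indent=2))
```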
In the Headers section, add an Authorization header and set its value to the pipeline parameter you created, e.g. @pipeline().parameters.aad_token.

Next, run the activity; it will start the batch job.
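For reference, the call the Web activity makes can be sketched in Python with the standard library. The endpoint URL, token, and input values below are placeholders taken from this answer, and the helper name build_invoke_request is my own; submitting the job for real requires a valid token and endpoint:

```python
import json
import urllib.request

def build_invoke_request(endpoint_url: str, bearer_token: str,
                         input_name: str, input_uri: str) -> urllib.request.Request:
    """Build the POST request the Web activity sends to <endpoint>/jobs.

    All arguments are placeholders to be replaced with your own endpoint URL,
    Bearer-prefixed token, input name, and blob folder URL.
    """
    body = {
        "properties": {
            "InputData": {
                input_name: {"JobInputType": "UriFolder", "Uri": input_uri}
            }
        }
    }
    return urllib.request.Request(
        url=endpoint_url.rstrip("/") + "/jobs",
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": bearer_token,
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_invoke_request(
    "https://xyx.centralindia.inference.ml.azure.com",
    "Bearer <token>",
    "credit_dataset",
    "https://jgsml4897076647.blob.core.windows.net/.../model_inputs/",
)
# urllib.request.urlopen(req) would submit the job; not executed here.
```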

Regarding passing literal strings: according to the Data Factory documentation provided, batch endpoints do not support them. The documentation states:
endpoint_input_type: The type of the input data you are providing. Currently, batch endpoints support folders (UriFolder) and files (UriFile). Defaults to UriFolder.