
I am trying to use the Databricks CLI in PowerShell, and I need to pass a JSON string as a parameter.

I have two variables: $job_id, equal to 10, and $parameterValue, equal to some string.

I have tried four different combinations, but I keep getting this error: Error: JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
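For reference, that error message comes from Python's json module (which the CLI apparently uses to parse the parameter), and it is exactly what you get when the receiving program sees the payload with its double quotes stripped away. A minimal reproduction:

```python
import json

# The payload as the receiving program sees it once the shell has
# stripped the double quotes around the property name and value.
try:
    json.loads('{parameterName:some_string}')
except json.JSONDecodeError as e:
    print(e)  # Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
```

So the question is really: which quoting form gets the literal `"` characters through PowerShell to the external program intact?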

Code used:

databricks jobs run-now --job-id $job_id --notebook-params "{""parameterName"":""$parameterValue""}"
databricks jobs run-now --job-id $job_id --notebook-params "{`"parameterName`":`"$parameterValue`"}"
databricks jobs run-now --job-id $job_id --notebook-params '{"parameterName":"$parameterValue"}'
  • The first won't work because escaping " is done with the backtick character. The third won't work because $parameterValue will not be expanded to its value when enclosed in single quotes. That leaves the second; try $("{`"parameterName`":`"$parameterValue`"}") or "{`"parameterName`":`"$($parameterValue)`"}". Commented Oct 26, 2021 at 10:17
  • "Expecting property name enclosed in double quotes" … I doubt that is the error message for all cases, because some clearly satisfy this requirement. Do you also get other errors? Have you tried a simple literal string without expansion, to narrow down the issue? Commented Oct 26, 2021 at 10:22
  • The sad reality as of PowerShell 7.1 is that an extra, manual layer of \-escaping of embedded " characters is required in arguments passed to external programs. This may get fixed in 7.2, which may require opt-in. See this answer to the linked duplicate for details. Commented Oct 26, 2021 at 19:06

2 Answers


It is bad practice to build your own serialized strings by hand; instead, use the intended methods and cmdlets (such as ConvertTo-Json):

$job_id = 10
$parameterName = 'Some String'
$Data = @{
    job_id = $job_id
    parameterName = $parameterName
}
$Json = ConvertTo-Json $Data
databricks jobs run-now --job-id $job_id --notebook-params $Json

or compressed:

databricks jobs run-now --job-id $job_id --notebook-params (ConvertTo-Json -Compress @{ parameterName = $parameterName })
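For comparison, the same serialize-don't-hand-build idea in Python, where json.dumps plays roughly the role of ConvertTo-Json -Compress (the variable name here just mirrors the example above):

```python
import json

parameterName = 'Some String'

# json.dumps handles quoting and escaping for you, including values
# that contain spaces or embedded quote characters.
payload = json.dumps({'parameterName': parameterName})
print(payload)  # {"parameterName": "Some String"}
```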

2 Comments

Thank you, however there is still an error due to the space in the string: Error: Got unexpected extra argument (String })
This has to do with the command itself; something like databricks jobs run-now --job-id $job_id --notebook-params "$Json" or databricks jobs run-now --job-id $job_id --notebook-params "$(ConvertTo-Json -Compress @{ parameterName = $parameterName })" might just work

OK, here is what I did:

  1. Write Python code to print the passed argument:
if __name__ == '__main__':
    import sys
    print(sys.argv[1])

  2. Tested it with different setups. For example:

input: $s="{`"parameterName`":`"$parameterValue`"}"

powershell output: {"parameterName":"some_string"}

python output: {parameterName:some_string}

So Python completely ignored the double quotes, even though they were escaped.

I had to:

  1. Escape ( \ ) the escape character ( ` ):

powershell input: $s="{\`"parameterName\`":\`"$parameterValue\`"}"

powershell output: {\"parameterName\":\"some_string\"}

python output: {"parameterName":"some_string"}
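A quick sanity check that the string the Python side now receives is valid JSON:

```python
import json

# The argument as finally received by the Python script, parsed back into a dict.
parsed = json.loads('{"parameterName":"some_string"}')
print(parsed)  # {'parameterName': 'some_string'}
```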



The only problem I have not figured out yet is when the $parameterValue variable includes a space; then it does not work:

powershell input:

$parameterValue='some string'
$s="{\`"parameterName\`":\`"$parameterValue\`"}"

powershell output: {\"parameterName\":\"some string\"}

python output: {"parameterName":"some
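The truncation is consistent with the argument being split at the space: by this point the \" sequences produce literal quote characters on the command line, so nothing protects the space inside the value. A rough illustration using Python's shlex as a stand-in for the native command-line splitter (an assumption; Windows/PowerShell argument parsing differs in detail, but the splitting behavior is analogous):

```python
import shlex

# The raw text as it lands on the command line: each \" yields a literal
# quote character rather than an argument delimiter, so the embedded
# space splits the value into two separate arguments.
cmdline_fragment = r'{\"parameterName\":\"some string\"}'

args = shlex.split(cmdline_fragment)
print(args)  # ['{"parameterName":"some', 'string"}'] -- sys.argv[1] is only the first fragment
```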

2 Comments

The primary problem - the unexpected need for \-escaping - is on the PowerShell side (see the linked duplicate). The problem with spaces must be unrelated, as something like "{\`"parameterName\`": \`"some string\`"}" does get seen by an external program as verbatim {"parameterName": "some string"}, i.e. as valid JSON, up to at least PowerShell 7.1
@romanzdk, did you ever find a solution for the space in JSON values?
