
I have opened the new thread as requested. I believe you have the script I shared in the previous thread. Please consider that script and see the requirement below.

function Add-Entity()
{
 [CmdletBinding()]

 param
 (
 $table, 
 [string] $partitionKey, 
 [string] $RowKey, 
 [string] $Label_Value,
 [string] $Lable_cost 
 )

 $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $rowKey 
 $entity.Properties.Add("Label_Value",$Label_Value)
 $entity.Properties.Add("Lable_cost",$Lable_cost)
 $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
}

$tableName = "TestTable"
$subscriptionName = "Tech Enabled Solutions"
$resourceGroupName = "abc"
$storageAccountName = "defghi"
$location = "North Central US, South Central US"

# Get the storage key for the storage account
$StorageAccountKey = "12345678"

# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

# Get a reference to the table
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
$csv = Import-CSV "d:\a\1\s\DeploymentScripts\TestTable.csv"

ForEach ($line in $csv)
{
 Add-Entity -Table $table -partitionKey $line.partitionkey -rowKey $line.RowKey -Label_Value $line.Label_Value -Lable_cost $line.Lable_cost

}

My assumption is that if I can pass a file as a parameter to the script, it will read that file and insert or delete that file's data in Azure Storage. The problem is that I am currently inserting data using this command in the PowerShell script:

$csv = Import-CSV "d:\a\1\s\DeploymentScripts\TestTable.csv"

As per the current script, the file is hard-coded to TestTable.csv.

If I want to pass a different file, say TestTable2.csv, to the PowerShell script, I can't write a second script and keep it in the VSTS repo, as there are many CSV files I need to deploy into Azure Storage. So how can I pass different files at run time to the one script I am currently running? How should I implement the script and pass the parameters?

One more doubt: how can I deploy multiple CSV files into Table Storage when each CSV file differs in its rows and columns? Each file can have extra rows and extra columns; some CSV files have 3 fields, while others have 5, 6, 7, or more. How can I automate or change the above script to deploy multiple CSV files with one PowerShell script, given that not every CSV file has the same columns? I hope you understand my requirement. Please help me out.

    Please share those scripts in this post as it will be hard to find it otherwise. A full example of what you tried and where you failed would be useful. Commented Oct 19, 2017 at 7:45
  • function Add-Entity() { [CmdletBinding()] param ( $table, [string] $partitionKey, [string] $RowKey, [string] $Label_Value, [string] $Lable_cost ) $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $rowKey $entity.Properties.Add("Label_Value",$Label_Value) $entity.Properties.Add("Label_Value",$Lable_cost) $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity)) } Commented Oct 19, 2017 at 8:06
  • Add that to your question as correctly formatted code. It's really hard to read that. Commented Oct 19, 2017 at 8:07
  • $tableName = "TestTable" $subscriptionName = "Tech Enabled Solutions" $resourceGroupName = "abc" $storageAccountName = "efghigh" $location = "North Central US" # Get the storage key for the storage account $StorageAccountKey = "12345678" # Get a storage context $ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey # Get a reference to the table $table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore $csv = Import-CSV "d:\a\1\s\DeploymentScripts\TestTable.csv" Commented Oct 19, 2017 at 8:08
  • ForEach ($line in $csv) { Add-Entity -Table $table -partitionKey $line.partitionkey -rowKey $line.RowKey -Label_Value $line.Label_Value Lable_cost $line.Lable_cost } This is my complete code..So please help pal. Commented Oct 19, 2017 at 8:09

1 Answer


Regarding passing the file path as a parameter, you can define global parameters for the script:

param(
    [string]$filepath
)
function Add-Entity()
{
 [CmdletBinding()]

 param
 (
 $table, 
 [string] $partitionKey, 
 [string] $RowKey, 
 [string] $Label_Value,
 [string] $Lable_cost 
 )

 $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $rowKey 
 $entity.Properties.Add("Label_Value",$Label_Value)
 $entity.Properties.Add("Lable_cost",$Lable_cost)
 $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
}

$tableName = "TestTable"
$subscriptionName = "Tech Enabled Solutions"
$resourceGroupName = "abc"
$storageAccountName = "defghi"
$location = "North Central US, South Central US"

# Get the storage key for the storage account
$StorageAccountKey = "12345678"

# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

# Get a reference to the table
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
$csv = Import-CSV $filepath

ForEach ($line in $csv)
{
 Add-Entity -Table $table -partitionKey $line.partitionkey -rowKey $line.RowKey -Label_Value $line.Label_Value -Lable_cost $line.Lable_cost

}

Then you can pass the parameter in the Script Arguments input box of the Azure PowerShell task: -filepath $(build.sourcesdirectory)\DeploymentScripts\TestTable.csv.
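For local testing, the same parameter can be passed when invoking the script directly. The script name InsertEntity.ps1 is taken from the comments below; the paths are just illustrative:

```
# Deploy one CSV file (illustrative path; adjust to your repo layout)
.\InsertEntity.ps1 -filepath "D:\a\1\s\DeploymentScripts\TestTable.csv"

# A different file can be deployed with the same script; no second copy is needed
.\InsertEntity.ps1 -filepath "D:\a\1\s\DeploymentScripts\TestTable2.csv"
```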

Regarding multiple files with different columns, you can define an array-of-objects parameter whose items carry the file path and column list as properties, then iterate over the array. For example:

param(
    [object[]]$fileObj
)

foreach ($fo in $fileObj) {
    Write-Host $fo.filepath
    $cArray = $fo.Cols.split(",")
    foreach ($c in $cArray) {
        Write-Host $c
        # TODO: add column to table
    }
    # TODO: insert data into the cloud table for the current file
}

Arguments:

-fileObj @(@{"filepath"="$(build.sourcesdirectory)\DeploymentScripts\TestTable.csv";"Cols"='c1,c2'},@{"filepath"="$(build.sourcesdirectory)\DeploymentScripts\TestTable2.csv";"Cols"='c3,c2'})

Update:

foreach ($fo in $fileObj) {
    Write-Host $fo.filepath
    $csv = Import-CSV $fo.filepath
    $cArray = $fo.Cols.split(",")
    foreach ($line in $csv) {
        Write-Host "$($line.partitionkey), $($line.rowKey)"
        $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
        foreach ($c in $cArray) {
            Write-Host "$c,$($line.$c)"
            $entity.Properties.Add($c, $line.$c)
        }
        $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
    }
}

Argument:

@(@{"filepath"="data.csv";"Cols"="Col1,Col2,Col3"},@{"filepath"="data2.csv";"Cols"="Col1,Col6,Col7,Col8"})

Sample CSV data:

data.csv:

partitionkey,rowKey,Col1,Col2,Col3
p1,r1,one,two,three
p2,r2,example1,example2,example3

data2.csv:

partitionkey,rowKey,Col1,Col6,Col7,Col8
p1,r1,one,two,three,four
p2,r2,example1,example2,example3,example4

Update:

param(
    [object[]]$fileObj
)

$storageAccountName = "XXX"

$tableName="XXX"

# Get the storage key for the storage account
$StorageAccountKey = "XXX"

# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore

foreach ($fo in $fileObj) {
    Write-Host $fo.filepath
    $csv = Import-CSV $fo.filepath
    $cArray = $fo.Cols.split(",")
    foreach ($line in $csv) {
        Write-Host "$($line.partitionkey), $($line.rowKey)"
        $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
        foreach ($c in $cArray) {
            Write-Host "$c,$($line.$c)"
            $entity.Properties.Add($c, $line.$c)
        }
        $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
    }
}

Update:

param(
    [object[]]$fileObj
)

$storageAccountName = "XXX"

$tableName="XXX"

# Get the storage key for the storage account
$StorageAccountKey = "XXX"

# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey



foreach ($fo in $fileObj) {
    $table = Get-AzureStorageTable -Name $fo.tableName -Context $ctx -ErrorAction Ignore
    Write-Host $fo.filepath
    $csv = Import-CSV $fo.filepath
    $cArray = $fo.Cols.split(",")
    foreach ($line in $csv) {
        Write-Host "$($line.partitionkey), $($line.rowKey)"
        $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
        foreach ($c in $cArray) {
            Write-Host "$c,$($line.$c)"
            $entity.Properties.Add($c, $line.$c)
        }
        $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
    }
}

Arguments: @(@{"filepath"="data.csv";"Cols"="Col1,Col2,Col3";"tableName"="table1"},@{"filepath"="data2.csv";"Cols"="Col1,Col6,Col7,Col8";"tableName"="table2"})


34 Comments

I am running the below script only along with the parameters you shared but throwing error:2017-10-24T09:45:01.8482070Z ##[command]. 'd:\a\1\s\DeploymentScripts\InsertEntity.ps1' -fileObj @(@{"filepath"="d:\a\1\s\DeploymentScripts\TestTable.csv";"Cols"='partitionkey,rowkey,Lable_value'},@{"filepath"="d:\a\1\s\DeploymentScripts\TestTable2.csv";"Cols"='partitionkey,rowkey,Lable_value,Lable_cost'}) 2017-10-24T09:45:02.4312066Z d:\a\1\s\DeploymentScripts\TestTable.csv 2017-10-24T09:45:02.4372069Z partitionkey 2017-10-24T09:45:02.4382085Z rowkey 2017-10-24T09:45:02.4382085Z Lable_value
2017-10-24T09:45:02.4392062Z d:\a\1\s\DeploymentScripts\TestTable2.csv 2017-10-24T09:45:02.4392062Z partitionkey 2017-10-24T09:45:02.4392062Z rowkey 2017-10-24T09:45:02.4402069Z Lable_value 2017-10-24T09:45:02.4413537Z Lable_cost 2017-10-24T09:45:03.6140905Z ##[error]Import-CSV : Could not find file 'D:\a\1\s\filepath'. At D:\a\1\s\DeploymentScripts\InsertEntity.ps1:33 char:8 + $csv = Import-CSV filepath + CategoryInfo : OpenError: (:) [Import-Csv], FileNotFoundException + FullyQualifiedErrorId : FileOpenFailure,Microsoft.PowerShell.Commands.ImportCsvCommand
@PDBRPRAVEEN Sorry for the mistake in the first part of my code; I've updated it (added the $). What's your current script?
Hi MSFT, I would like to deploy multiple CSV files with different columns in each file. Please suggest how many scripts I need to execute and kindly share them, because the whole post is a bit confusing. As I asked, I understand that we can deploy by passing $filename as a parameter, but what about the columns defined in the script?
Currently I have added 4 columns in TestTable1.csv, say partitionkey,rowkey,Lable_value,Lable_cost, but how do I define the columns of other CSV files in this script, say partitionkey,rowkey,price,validity? There could be many columns, which we can't all define in the script. How can I add all the columns of the different CSV files into Table Storage?
