
I currently have a yaml Azure Pipeline that generates Excel files as artifacts and I would like to upload them to a SharePoint folder. I would like to achieve this using Power Automate.

I understand how to run my Power Automate flow when the pipeline finishes. My issue is about understanding how to extract the Excel files from Azure Pipeline artifacts using Power Automate to then push them to SharePoint.

2 Answers


As far as I have tested, there is no action in Power Automate to retrieve pipeline artifacts. Since Azure DevOps is itself an automation tool, we can upload the Excel files to a SharePoint folder directly during a pipeline agent job.

We can use the third-party extension Upload files to SharePoint Online - Visual Studio Marketplace to upload the .xls files from the working directory on the pipeline agent to the target SharePoint folder; see the extension's Marketplace page for details on how the task authenticates to SharePoint via an app registration in Azure AD.

Here is my sample pipeline for your reference, together with the API permission settings of the app registration that the pipeline task uses to authenticate to SharePoint.


variables:
- group: ARM
- name: CurrentTime
  value : $[ format('{0:yyyy}-{0:MM}-{0:dd} {0:HH}:{0:mm}:{0:ss}', pipeline.startTime) ]

pool:
  vmImage: windows-latest

jobs:
- job: Prep
  steps:
  - checkout: none
  - powershell: |
      Write-Host "Current Time is $(currentTime)"
      Write-Host "Tenant Id is $(TenantId)"
      Write-Host "Client Id is $(ClientId)"
    displayName: List variables
  - powershell: |
      $data = [PSCustomObject]@{
        TimeStamp = "$(CurrentTime)"
        ProjectName = "$(System.TeamProject)"
        PipelineName = "$(Build.DefinitionName)"
        DefinitionId = "$(System.DefinitionId)"
        BuildId = "$(Build.BuildId)"
      }
      $data | Export-Csv -Path "BuildData.csv" -NoTypeInformation
      Write-Host "Files in $(System.DefaultWorkingDirectory)"
      tree $(System.DefaultWorkingDirectory) /F /A
    displayName: Generate csv file
  - powershell: |
      Install-Module ImportExcel -Scope CurrentUser -Force
      Import-Csv -Path "BuildData.csv" | Export-Excel -Path "$(Build.DefinitionName)-$(Build.BuildId)-1.xls" -WorksheetName "BuildInfo"
      Import-Csv -Path "BuildData.csv" | Export-Excel -Path "$(Build.DefinitionName)-$(Build.BuildId)-2.xls" -WorksheetName "BuildInfo"
      Write-Host "Files in $(System.DefaultWorkingDirectory)"
      tree $(System.DefaultWorkingDirectory) /F /A
    displayName: Generate xls file
  - task: az-pipelines-2-sharepoint@0
    displayName: Upload the xls files from System.DefaultWorkingDirectory to SharePoint site
    inputs:
      tenantId: '$(TenantId)'
      clientId: '$(ClientId)'
      clientSecret: '$(ClientSecret)'
      driveId: 'https://$(SharePointOnline).sharepoint.com/sites/AzureDevOpsTeamSite/Shared%20Documents/'
      targetFolder: 'AzureDevOpsExcel'
      sourceFolder: '$(System.DefaultWorkingDirectory)'
      contents: '**/*.xls'
      conflictBehaviour: 'replace'
      cleanTargetFolder: false
      flattenFolders: false
      failOnEmptySource: false



3 Comments

Hi @Batesias, Have you got a chance to check my answer below to upload xls files to SharePoint folder during pipeline agent job? Hope it can resolve your query and benefit other users. Cheers!
Hi, yes I had already seen this extension, however I'm trying the Power Automate route because I can't currently add extensions in my Azure DevOps organization.
Same issue for me - I was not able to add the extension, so I built a workaround (separate answer incoming)

As an alternative to the Azure Pipelines solution with the third-party extension (which requires permission to install), I have found a way to do this with Power Automate only:

[Image: a Power Automate flow with 5 elements]

Explanation:

1. When a build completes (trigger)

  • Configure trigger as needed including 'Organization Name' (of your Azure DevOps organization) and 'Project Name' (of the project in your organization)
  • You can also configure a 'Filter By Result' so that the flow only gets triggered if for example the build succeeded
  • In addition, you can 'Filter By Definition Id' so that the flow only gets triggered for certain Pipelines (How to get Definition Id of build)

2. Get build artifact info (Azure DevOps HTTP request)

  • Organization name: see 1.
  • Method: GET
  • Relative URL: concat('https://dev.azure.com/**ORGANIZATION**/**PROJECT**/_apis/build/builds/', triggerOutputs()?['body/id'], '/artifacts')

3. Parse build artifact info (Data Operation Parse JSON)

  • Content: 'Body' variable produced by previous step
  • Schema: You can receive the schema by simply opening the URL of the previous step in a browser, copy and paste the displayed JSON into the field
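For reference, the artifacts endpoint returns JSON roughly of this shape (field values here are illustrative, not real); pasting a sample like this into the Parse JSON action's generate-from-sample option produces the schema:

```json
{
  "count": 1,
  "value": [
    {
      "id": 1,
      "name": "drop",
      "resource": {
        "type": "Container",
        "downloadUrl": "https://dev.azure.com/myorg/_apis/resources/Containers/1?itemPath=drop&%24format=zip"
      }
    }
  ]
}
```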

4. Download artifact (Azure DevOps HTTP request)

  • Organization name: see 1.
  • Method: GET
  • Relative URL: body('Parse_build_artifact_info')['value'][0]['resource']['downloadUrl']

5. Create file (OneDrive OR SharePoint possible)

  • Folder Path: Select OneDrive/SharePoint folder from context menu
  • File Name: Set as needed, I added a time stamp: concat(utcNow(),'_filename.zip')
  • File Content: 'Body' variable produced by previous step
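For sanity-checking, the expressions in steps 2 and 4 can be reproduced outside Power Automate. This Python sketch builds the same artifacts URL and pulls downloadUrl out of a hand-written sample payload (the organization, project, build id, and payload values are illustrative):

```python
import json

# Illustrative values; substitute your own organization/project/build id.
ORG = "myorg"
PROJECT = "myproject"
BUILD_ID = 1234

# Step 2: the build-artifacts endpoint the Azure DevOps HTTP request calls.
artifacts_url = (
    f"https://dev.azure.com/{ORG}/{PROJECT}"
    f"/_apis/build/builds/{BUILD_ID}/artifacts"
)
print(artifacts_url)

# Step 3: a response shaped like the one the Parse JSON action receives
# (values are made up for illustration).
sample_response = json.loads("""
{
  "count": 1,
  "value": [
    {
      "id": 1,
      "name": "drop",
      "resource": {
        "type": "Container",
        "downloadUrl": "https://dev.azure.com/myorg/_apis/resources/Containers/1?itemPath=drop&%24format=zip"
      }
    }
  ]
}
""")

# Step 4: the equivalent of
# body('Parse_build_artifact_info')['value'][0]['resource']['downloadUrl']
download_url = sample_response["value"][0]["resource"]["downloadUrl"]
print(download_url)
```

Note that downloadUrl points at a zip of the whole artifact, which is why step 5 saves the file with a .zip name.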
