Logic App: Parse CSV Sent From AZCopy To Azure Blob
Description:
In this post, I will set up a RHEL 7 server to upload CSVs generated by an application to an Azure blob. Once a CSV hits the blob, a Logic App will trigger, parse it, and make decisions based on its contents.
To Resolve:
- If you haven’t already, download and install Azure Storage Explorer, then create a blob container named myTestBlob.
- Generate SAS per walkthrough
- Download the tool to the server and move it to the directory our application writes to:

```bash
# -D - dumps response headers to stdout so the redirect Location is visible
curl -s -D - https://aka.ms/downloadazcopy-v10-linux | grep ^Location
curl -o azcopy_v10.tar.gz https://azcopyvnext.azureedge.net/release20200501/azcopy_linux_amd64_10.4.3.tar.gz
tar -xf azcopy_v10.tar.gz --strip-components=1
mv azcopy /csv
```
- Test uploading a CSV from the server to the Azure blob:

```bash
azcopy copy "248.csv" "https://sasURL" --recursive=false
# failed: the log says 'access denied'
sudo azcopy copy "248.csv" "https://sasURL" --recursive=false
# worked!
```
- Build a Logic App that will parse the CSV:
- When a blob is added or modified (properties only)
- Container: myTestBlob
- Number of blobs to return: 10
- Interval: 1
- Frequency: Minute
- Get Blob content:
- Blob: List of Files id
- Infer content type: Yes
- Compose:
- Inputs: File Content
- Compose 2:
- Inputs: split(outputs('Compose'),',')
- Initialize Variable:
- Name = stringArray
- Type = Array
- Value = Outputs (from Compose 2)
- Initialize Variable2:
- Name = Action
- Type = String
- Value = variables('stringArray')[8]
- Initialize Variable3:
- Name = PermName
- Type = String
- Value = variables('stringArray')[12]
- Initialize Variable4:
- Name = PermName2
- Type = String
- Value = variables('stringArray')[14]
- Delay:
- Count = 1
- Unit = Minute
- Create Blob:
- Folder path: /myTestBlob/processed
- Blob Name: List of Files Displayname
- Blob Content: File Content
- Delete Blob:
- Blob: List of Files id
- The Logic App grabs the 9th, 13th, and 15th comma-separated values (zero-based indexes 8, 12, and 14) from the uploaded CSV and stores them in variables. It then copies the blob to a different folder and deletes the original blob.
- Next I would do something with these variables, such as calling an Azure Automation runbook based on their contents (I haven’t gotten there yet).
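The indexing above can be sanity-checked locally before wiring it into the Logic App. A minimal sketch — the sample line and its field values are made up for illustration, and awk stands in for the split() expression:

```shell
# Stand-in for the blob's file content; the real CSV's fields are app-specific
line='f0,f1,f2,f3,f4,f5,f6,f7,AddPerm,f9,f10,f11,alice,f13,bob'

# awk fields are 1-based, so Logic App indexes 8, 12, 14 map to $9, $13, $15
echo "$line" | awk -F',' '{print "Action="$9, "PermName="$13, "PermName2="$15}'
# prints: Action=AddPerm PermName=alice PermName2=bob
```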
- Back on the RHEL box, set a script to upload every minute:
- Script:

```bash
#!/bin/bash
for filename in /csv/*.csv; do
  echo "copying $filename to blob"
  /csv/azcopy copy "$filename" "https://sasURL" --recursive=false
  echo "moving $filename to destination"
  base=${filename##*/}
  mv "$filename" "/csv/processed/$base"
done
```
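One caveat with the loop above: it moves the CSV to processed/ even when the upload fails. A more defensive sketch (same assumed paths and SAS URL) only moves files that azcopy accepted:

```shell
#!/bin/bash
# Move a CSV to processed/ only when its upload succeeds
for filename in /csv/*.csv; do
  if /csv/azcopy copy "$filename" "https://sasURL" --recursive=false; then
    mv "$filename" "/csv/processed/${filename##*/}"
  else
    echo "upload of $filename failed; leaving it for the next run" >&2
  fi
done
```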
- Run it every minute via cron (crontab -e):

```bash
* * * * * /root/script.sh
```
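Since cron fires every minute regardless of whether the previous run finished, a slow upload could overlap the next one. A common guard is to wrap the job in flock (the lock-file path here is my assumption):

```shell
* * * * * flock -n /var/lock/csv-upload.lock /root/script.sh
```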