Using Azure Data Factory For SFTP Transfers
Description
So currently I use Hybrid Workers to SFTP files from my organization to another, since I can set a public static IP on the workers and have it whitelisted by the receiving party. Another option I'm looking into is Azure Data Factory (ADF). Here is an example using Data Factory to transfer a file from a storage account to an SFTP server.
To Resolve:
- In the Azure portal, create a data factory.
- Go to Datasets and create two binary datasets (a Python SDK sketch of this step follows the list):
  - Source ("binary"): type Binary, location Azure File Storage; select any file.
  - Destination ("binary2"): type Binary, location SFTP; enter the connection details and select the folder the file should land in.
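For reference, the same linked services and datasets can be defined in Python with the `azure-mgmt-datafactory` SDK instead of the portal. This is a minimal sketch, not a tested deployment: the subscription, resource group, factory name, hostnames, paths, and credentials are all placeholders, and it assumes the connection-string form of the Azure File Storage linked service.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureFileStorageLinkedService, AzureFileStorageLocation, BinaryDataset,
    DatasetResource, LinkedServiceReference, LinkedServiceResource,
    SecureString, SftpLocation, SftpServerLinkedService,
)

# Placeholder names -- substitute your own subscription/resource group/factory.
SUB, RG, DF = "<subscription-id>", "my-rg", "my-data-factory"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Linked service for the Azure File Storage source.
adf.linked_services.create_or_update(RG, DF, "FileStorageLS", LinkedServiceResource(
    properties=AzureFileStorageLinkedService(
        connection_string=SecureString(value="<storage-connection-string>"))))

# Linked service for the SFTP destination (basic auth shown; key auth also works).
adf.linked_services.create_or_update(RG, DF, "SftpLS", LinkedServiceResource(
    properties=SftpServerLinkedService(
        host="sftp.partner.example.com", port=22,
        authentication_type="Basic", user_name="svc_transfer",
        password=SecureString(value="<password>"))))

# Source dataset: a binary file in the file share ("binary" in the portal steps).
adf.datasets.create_or_update(RG, DF, "binary", DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="FileStorageLS"),
        location=AzureFileStorageLocation(
            folder_path="outbound", file_name="example.txt"))))

# Destination dataset: the landing folder on the SFTP server ("binary2").
adf.datasets.create_or_update(RG, DF, "binary2", DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="SftpLS"),
        location=SftpLocation(folder_path="inbound"))))
```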
- Create a pipeline:
  - Add a Copy Data activity.
  - Source: binary
  - Destination (sink): binary2
- Click Publish All.
- On the pipeline, select 'run'. Notice that the file selected in the source dataset is transferred to the destination; the transfer authenticates with the credentials specified in the SFTP connection. (The same pipeline, scripted in Python, is sketched below.)
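Continuing the sketch above (same `adf` client and placeholder names), the pipeline and run look roughly like this. Note that `create_or_update` writes straight to the live service, so there is no separate Publish All step outside the portal:

```python
from azure.mgmt.datafactory.models import (
    BinarySink, BinarySource, CopyActivity, DatasetReference, PipelineResource,
)

# Copy Data activity wiring the "binary" source dataset to the "binary2" sink.
copy = CopyActivity(
    name="CopyFileToSftp",
    inputs=[DatasetReference(type="DatasetReference", reference_name="binary")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="binary2")],
    source=BinarySource(),
    sink=BinarySink(),
)

adf.pipelines.create_or_update(RG, DF, "SftpTransferPipeline",
                               PipelineResource(activities=[copy]))

# Trigger a run (equivalent to selecting 'run' in the portal) and check status.
run = adf.pipelines.create_run(RG, DF, "SftpTransferPipeline")
print(adf.pipeline_runs.get(RG, DF, run.run_id).status)
```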
- To my knowledge, ADF uses 4 different outbound IP addresses, so the receiving side would need to whitelist all of them rather than a single static IP.
- I would like to look into this further, but I'm currently leaning more toward a NAT Gateway with Function Apps, since I already use Function Apps for almost everything (a Python sketch of that transfer follows this list).
- On my to-do list: update this to use Python instead of the portal if we go with ADF (as sketched above).
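For comparison, the Function App route would just do the SFTP push itself in Python. A minimal sketch with paramiko follows; the hostname, credentials, and paths are hypothetical:

```python
import paramiko

# Hypothetical connection details -- in a Function App these would come from
# app settings or Key Vault, and outbound traffic would egress through the
# NAT Gateway's static public IP, which the partner whitelists.
HOST, PORT = "sftp.partner.example.com", 22

transport = paramiko.Transport((HOST, PORT))
transport.connect(username="svc_transfer", password="<password>")
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # Upload a local file to the partner's landing folder.
    sftp.put("example.txt", "/inbound/example.txt")
finally:
    sftp.close()
    transport.close()
```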