Created by Erik Vigorito
Small Python project that shows how to upload, list, and download files from Azure Blob Storage using a storage account created in the Azure portal.
This project was completed as part of a data engineering course. The repo demonstrates a simple data engineering pattern:
- Create an Azure Storage account and a blob container.
- Use the Azure Storage Blob SDK for Python to:
  - Upload a local CSV file to Azure Blob Storage.
  - List blobs in a container.
  - Download the blob back to the local machine to verify the round trip.
The project is based on the official Azure Blob Storage Python quickstart and client library documentation.
- Python 3.14.0
- Azure Storage account (Blob Storage)
- Azure Storage Blob SDK for Python (azure-storage-blob)
- Visual Studio Code
- azure_blob_demo.py – Main script that connects to Azure Blob Storage, creates/uses a container, uploads a CSV, lists blobs, and downloads the file.
- test_small.csv – Small example CSV used to test the upload/list/download pipeline.
- downloaded_test_small.csv – File downloaded from Azure (created when the script runs).
- 01_azure_account.png – Screenshot of the storage account overview page in the Azure portal.
- 02_pip_install.png – Screenshot of installing the Azure Blob SDK in the terminal.
- 03_script_run.png – Screenshot of a successful script run in the terminal.
- 04_azure_portal.png – Screenshot of the container and blob in the Azure portal.
- Azure subscription with a storage account and a container created in the portal.
- Python 3.14.0 installed on your machine.
- pip available in your PATH.
- Visual Studio Code (or any editor).
- Clone this repository:
git clone https://github.com/equicktruth-ctrl/azure-blob-uploader.git
cd azure-blob-uploader
- Install dependencies:
python -m pip install --upgrade pip
python -m pip install azure-storage-blob
- Configure the script:
  - Open azure_blob_demo.py.
  - Set these variables at the top of the file:
    account_name = "YOUR_STORAGE_ACCOUNT_NAME"
    account_key = "YOUR_STORAGE_ACCOUNT_KEY"
    sample_file = "test_small.csv"
  - The account_name and account_key come from the Azure portal under Storage account → Security + networking → Access keys.
  - Make sure test_small.csv exists in the project folder (or change sample_file to another CSV file name that exists locally).
From the project folder:
python azure_blob_demo.py
The script will:
- Connect to Azure Blob Storage using the account name and key.
- Create or reuse a container named annotation-data.
- Upload test_small.csv (or your sample_file) to the container as a blob.
- List blobs in the container and print their names.
- Download the blob as downloaded_test_small.csv to verify the content.
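The flow above can be sketched roughly as follows. This is a minimal illustration using the azure-storage-blob v12 API, not the exact contents of azure_blob_demo.py; the connection_string helper and the placeholder constants are assumptions for the sketch.

```python
# Sketch of the upload/list/download round trip with azure-storage-blob.
# Placeholder constants mirror the README; the real script may differ.

ACCOUNT_NAME = "YOUR_STORAGE_ACCOUNT_NAME"
ACCOUNT_KEY = "YOUR_STORAGE_ACCOUNT_KEY"
CONTAINER = "annotation-data"
SAMPLE_FILE = "test_small.csv"


def connection_string(name: str, key: str) -> str:
    # Standard account connection string built from the portal's name + key.
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={name};AccountKey={key};"
        "EndpointSuffix=core.windows.net"
    )


def run_demo() -> None:
    # Imported here so the sketch is readable without the SDK installed.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(
        connection_string(ACCOUNT_NAME, ACCOUNT_KEY)
    )

    # Create the container, or reuse it if it already exists.
    container = service.get_container_client(CONTAINER)
    if not container.exists():
        container.create_container()

    # Upload the local CSV as a blob (overwrite allows repeat runs).
    with open(SAMPLE_FILE, "rb") as data:
        container.upload_blob(name=SAMPLE_FILE, data=data, overwrite=True)

    # List blobs in the container and print their names.
    for blob in container.list_blobs():
        print(f"File in container: {blob.name}")

    # Download the blob back to verify the round trip.
    with open(f"downloaded_{SAMPLE_FILE}", "wb") as out:
        out.write(container.download_blob(SAMPLE_FILE).readall())


# run_demo()  # requires a real storage account name and key
```

Calling run_demo() needs valid credentials and the test_small.csv file present, so it is left commented out here.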
Expected console output is similar to:
- “Container 'annotation-data' already exists. Using existing container.”
- “File 'test_small.csv' uploaded to Azure Blob Storage.”
- “File in container: test_small.csv”
- “File downloaded as 'downloaded_test_small.csv'.”
- “Azure Blob demo complete—upload, list, download successful!”
- 01_azure_account.png – Storage account overview page after successful deployment.
- 02_pip_install.png – Installing azure-storage-blob with pip.
- 03_script_run.png – Terminal output from running azure_blob_demo.py.
- 04_azure_portal.png – Azure portal view of the annotation-data container and uploaded blob.
These images document the full workflow from Azure setup to running the Python script.
- The script currently uses an account key for simplicity; in production, use Azure AD and managed identities for more secure authentication.
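A keyless variant could look like the sketch below, which swaps the account key for DefaultAzureCredential from the azure-identity package. The blob_account_url helper is illustrative, and the credential chain (environment variables, managed identity, Azure CLI login) only works in an environment where one of those is configured.

```python
# Sketch: authenticating with Azure AD via DefaultAzureCredential instead of
# an account key. Requires the azure-identity package; names are illustrative.

def blob_account_url(account_name: str) -> str:
    # Blob endpoint for a storage account in the public Azure cloud.
    return f"https://{account_name}.blob.core.windows.net"


def make_service_client(account_name: str):
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient

    # DefaultAzureCredential tries environment variables, managed identity,
    # Azure CLI login, etc., in order -- no secret in source code.
    return BlobServiceClient(
        account_url=blob_account_url(account_name),
        credential=DefaultAzureCredential(),
    )
```

The caller identity also needs an RBAC role such as Storage Blob Data Contributor on the account for blob operations to succeed.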
- As a next step, this uploader could be integrated into a larger data pipeline (e.g., reading CSVs from blob storage into a data warehouse or notebook for analysis).
Erik Vigorito