Deploy Data Pipeline in AWS using Serverless Framework
Preparation (on macOS, as an example)
Serverless Framework Code
Deploy changes into AWS
Preparation (on macOS, as an example)
S3
Create an S3 bucket for deployment if needed
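If you want to create the deployment bucket from the command line, a typical AWS CLI invocation looks like the following. The bucket name is a placeholder, not a value from this guide; the region matches the one used later in the .env example.

```shell
# Create an S3 bucket for Serverless deployment artifacts.
# "my-deploy-bucket" is an example name - replace it with your own
# (bucket names are globally unique).
aws s3 mb s3://my-deploy-bucket --region ap-southeast-2

# Confirm the bucket was created
aws s3 ls | grep my-deploy-bucket
```

Requires the AWS CLI to be installed and configured with credentials (`aws configure`).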
Python
# 1. Install or update Python to version 3.8
# 2. cd to the directory where requirements.txt is located
# 3. Optional: activate your virtualenv
# 4. Run the following command to install the required Python packages
% pip3 install -r requirements.txt
# Create a new Serverless service/project
% serverless create --template aws-python3 --path {project_folder}
Serverless Framework Code
Open your project in VS Code
% open -a "Visual Studio Code" {project_folder}
Modify the Python file
# Rename the Python file
% mv handler.py data_pipeline.py
# Copy the code into your Python file from data_pipeline.py and save it
# Install boto3 if you do not have it
sudo apt install python3-pip -y
sudo pip3 install boto3
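The guide's actual data_pipeline.py is not reproduced here. As a hedged sketch only, a minimal Lambda handler using boto3 might look like the following; the bucket names and the copy-on-S3-event logic are illustrative assumptions, not the guide's real pipeline.

```python
import json

# Assumed placeholder bucket names - replace with your own.
SOURCE_BUCKET = "my-source-bucket"
DEST_BUCKET = "my-dest-bucket"


def build_dest_key(source_key, stage="dev"):
    """Prefix the object key with the deployment stage."""
    return f"{stage}/{source_key}"


def handler(event, context):
    # boto3 is imported lazily so the module can be loaded
    # (e.g. by unit tests) without AWS access.
    import boto3

    s3 = boto3.client("s3")
    copied = []
    # Iterate over S3 event records and copy each object
    # to the destination bucket under a stage prefix.
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        dest_key = build_dest_key(key)
        s3.copy_object(
            Bucket=DEST_BUCKET,
            CopySource={"Bucket": SOURCE_BUCKET, "Key": key},
            Key=dest_key,
        )
        copied.append(dest_key)
    return {"statusCode": 200, "body": json.dumps(copied)}
```

The handler name (`data_pipeline.handler`) is what you would reference from `functions` in serverless.yml.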
# Create requirements.txt for external packages
touch requirements.txt
echo "<library>" >> requirements.txt
# Install the needed libraries
sudo pip3 install -r requirements.txt
Create a .env file and add environment variables if needed
APPLICATION=my-data-pipeline
STAGE=dev
REGION=ap-southeast-2
# This will be automatically added into the lambda runtime if you list it in serverless.yml:custom.dotenv.include
TZ_LOCAL=Australia/Sydney
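For reference, here is a sketch of how these variables could be wired up in serverless.yml with the serverless-dotenv-plugin, which is the usual way to get a `custom.dotenv.include` key like the one mentioned above. The service and function names are assumptions for illustration.

```yaml
# Sketch of a serverless.yml that loads the .env file above.
# Variables listed under custom.dotenv.include are injected
# into the Lambda runtime environment.
service: my-data-pipeline

plugins:
  - serverless-dotenv-plugin

custom:
  dotenv:
    include:
      - TZ_LOCAL

provider:
  name: aws
  runtime: python3.8
  stage: ${env:STAGE, 'dev'}
  region: ${env:REGION, 'ap-southeast-2'}

functions:
  pipeline:
    handler: data_pipeline.handler
```

The plugin must be installed first, e.g. `npm install --save-dev serverless-dotenv-plugin`.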