A demo combining Amazon Kinesis Data Streams, Amazon Data Firehose and
Amazon Managed Service for Apache Flink running a Zeppelin notebook.
Storm Library for Terraform
This repository is a member of the SLT | Storm Library for Terraform,
a collection of Terraform modules for Amazon Web Services. The focus
of these modules, maintained in separate GitHub repositories, is on
building examples, demos, and showcases on AWS. The audience of the
library is learners and presenters alike - people who want to know
or show what a certain service, pattern, or solution looks like, or how it "feels".
Only if your account has AWS Lake Formation enabled:
Check the value of the DEPLOYMENT_ROLE environment variable as shown
in the summary of the Apply workflow you executed to install
the Storm Library for Terraform. It starts with slt-0 and ends with
-deployment.
In AWS Lake Formation, open Administrative roles and tasks. In the
Database creators section, click Grant.
In the Grant permissions panel that opens, in the IAM users and roles
input, select the deployment role you fetched before.
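If you prefer the AWS CLI over the console, the same grant can be expressed as a JSON payload for aws lakeformation grant-permissions. The sketch below only builds that payload; the account id and role name are placeholders, not values from this repository.

```python
import json

def grant_database_creator(role_arn: str) -> str:
    """Build a --cli-input-json payload granting the deployment role
    the CREATE_DATABASE permission on the Lake Formation catalog."""
    payload = {
        "Principal": {"DataLakePrincipalIdentifier": role_arn},
        "Resource": {"Catalog": {}},
        "Permissions": ["CREATE_DATABASE"],
    }
    return json.dumps(payload, indent=2)

# Placeholder account id and role name; substitute your DEPLOYMENT_ROLE.
print(grant_database_creator(
    "arn:aws:iam::111122223333:role/slt-0-example-deployment"))
```

Saved to a file, the payload can be applied with aws lakeformation grant-permissions --cli-input-json file://grant.json.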
Deployment of this member should take less than 5 minutes on GitHub-hosted resources.
Architecture
Explore this demo
In your GitHub Apply workflow run, click on Summary. If you scroll
down, you will see the apply / check summary.
Take a look at the ENVIRONMENT variables. Note the DEPLOYMENT_NAME
environment variable. Its value should match the pattern
slt-(some number)-log-analytics-demo-(your github owner).
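The naming pattern described above can be checked mechanically. The regular expression below is a sketch derived from that description, not an official specification of the name format.

```python
import re

# Pattern described above: slt-(some number)-log-analytics-demo-(github owner).
DEPLOYMENT_NAME_RE = re.compile(r"^slt-\d+-log-analytics-demo-[A-Za-z0-9-]+$")

def looks_like_deployment_name(name: str) -> bool:
    """Return True if a string matches the expected DEPLOYMENT_NAME shape."""
    return DEPLOYMENT_NAME_RE.fullmatch(name) is not None

# "octocat" is a placeholder GitHub owner.
print(looks_like_deployment_name("slt-0-log-analytics-demo-octocat"))  # → True
```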
From the same page, download the artifact ending in .zeppelin to
your download folder and unzip it if necessary
In the AWS Console, on the Amazon Managed Service for Apache Flink
service page, click on Studio notebooks in the left-hand side menu
In the list of Studio notebooks in the center, click on the
log-analytics-demo notebook that has been created
In the view that opens, click on the Run button. You will be asked to
confirm. Do so by clicking Run once more.
A message will appear in your center view, informing you that the service
is "Starting Studio notebook (your studio notebook)...". Wait until the
startup process has completed. It may take a few minutes.
You will be notified in the AWS Console as soon as your notebook has
successfully started. Then, click Open in Apache Zeppelin.
A new browser tab will open, welcoming you to Apache Zeppelin.
On that browser page, click Import note. A panel titled Import
New Note will pop up. Click on the area saying Select JSON File/IPYNB File
and choose your unzipped Zeppelin notebook from your download folder.
You will notice a new notebook link called Log Analytics Demo
on the refreshed Zeppelin browser page. Click on the link.
The Log Analytics Demo notebook will open. Next to its title, find
a small triangle (like a "play" button) that enables you to run
all paragraphs one after the other. Click on that play button.
The paragraphs will run one after the other. Execution will take
a minute or two.
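The actual paragraphs ship inside the downloaded notebook artifact. Purely for orientation, a Zeppelin paragraph that aggregates status codes with Flink SQL might look like the following sketch; the table and column names are invented, not taken from this demo.

```sql
%flink.ssql(type=update)
-- Hypothetical table and column names; the real ones come with the notebook.
SELECT status_code,
       COUNT(*) AS hits
FROM access_log_stream
GROUP BY status_code;
```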
Check the last paragraph. Don't worry if it says No Data available
for a minute or so. A pie chart will then show up and update
every ten seconds, showing the percentages of HTTP status codes
sent from an instance running Faker.
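The demo's traffic source runs Faker. As a rough, stdlib-only sketch of what such synthetic access-log events could look like, the snippet below invents its own field names and status-code distribution; it does not reproduce the demo's actual generator.

```python
import random

# Invented status distribution; the real generator uses the Faker library.
STATUS_CODES = [200, 200, 200, 301, 404, 500]

def fake_log_event(rng: random.Random) -> dict:
    """Produce one synthetic access-log event with an HTTP status code."""
    return {
        "host": f"192.0.2.{rng.randint(1, 254)}",  # TEST-NET-1 documentation range
        "method": rng.choice(["GET", "POST"]),
        "path": rng.choice(["/", "/index.html", "/api/items"]),
        "status": rng.choice(STATUS_CODES),
    }

rng = random.Random(42)  # fixed seed for reproducibility
events = [fake_log_event(rng) for _ in range(1000)]
ok_share = sum(e["status"] == 200 for e in events) / len(events)
print(f"share of 200s: {ok_share:.0%}")  # roughly 50% with this distribution
```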
object({
  actor      = string # GitHub actor (deployer) of the deployment
  catalog_id = string # SLT catalog id of this module
  deployment = string # slt-<catalog_id>--
  ref        = string # Git reference of the deployment
  ref_name   = string # Git ref_name (branch) of the deployment
  repo       = string # GitHub short repository name (without owner) of the deployment
  repository = string # GitHub full repository name (including owner) of the deployment
  sha        = string # Git (full-length, 40 char) commit SHA of the deployment
  short_name = string # slt-<catalog_id>-
  time       = string # Timestamp of the deployment
})