This repository contains automation scripts designed to integrate with and support Eramba. These scripts are intentionally simple, reusable, and easy to adapt to different environments and vendors.
The goal of this repository is to provide practical automation examples that can be reused, adjusted, or extended to fit specific Eramba use cases such as evidence collection, account validation, integrations with third-party systems, and control testing.
There is no magic here: these are straightforward scripts meant to run, produce evidence, and stop.
Scripts are organized using the following structure:
```
Vendor/
└── Automation_Goal/
    ├── composer.json
    ├── run.php
    └── README.md
```
- **Vendor**: represents the external system being automated (e.g. AWS, Google, Calendly).
- **Automation_Goal**: describes what the automation does (e.g. user reconciliation, log collection, access review).
Every automation directory must include:
- **Composer configuration**
  A `composer.json` file defining any required dependencies.
- **Single execution script**
  One PHP script that:
  - Connects to the target system
  - Collects data
  - Performs validation or comparison
  - Outputs evidence (usually JSON)
  - Ends execution
- **Local README.md**
  A README that clearly explains:
  - Required environment variables
  - How to create accounts, roles, or permissions on the target system
  - Any assumptions or limitations of the script
No frameworks. No background jobs. One run, one output.
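To make that shape concrete, here is a trimmed-down sketch of what a `run.php` could look like. The `EXAMPLE_TOKEN` variable and the stubbed `collectUsers()` data are hypothetical, for illustration only — a real script would make an authenticated call to its vendor's API:

```php
<?php
// Illustrative single-run skeleton. EXAMPLE_TOKEN and the stubbed
// user data are hypothetical; a real script calls the vendor API.

// 1. Connect: read credentials from the environment
//    (falling back to a demo value so this sketch runs standalone).
$token = getenv('EXAMPLE_TOKEN') ?: 'demo-token';

// 2. Collect: fetch data from the target system (stubbed here;
//    normally an authenticated HTTP request).
function collectUsers(string $token): array
{
    return [['email' => 'alice@example.com', 'active' => true]];
}
$users = collectUsers($token);

// 3. Validate: flag anything that needs attention.
$inactive = array_values(array_filter($users, fn($u) => !$u['active']));

// 4. Output evidence as JSON, then stop.
echo json_encode([
    'collected_at' => date('c'),
    'total_users'  => count($users),
    'inactive'     => $inactive,
], JSON_PRETTY_PRINT) . PHP_EOL;
```

The script runs top to bottom, prints its evidence, and exits — nothing to daemonize, schedule, or clean up.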
The fastest way to test any automation is to clone the repository and run the script locally.
Make sure you first download all necessary libraries by running:

```
composer install
```
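For reference, a minimal `composer.json` for an automation that makes HTTP calls might look like the following. The `guzzlehttp/guzzle` dependency is only an illustration — each script declares whatever it actually needs:

```json
{
    "require": {
        "php": ">=8.0",
        "guzzlehttp/guzzle": "^7.0"
    }
}
```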
You will need to define environment variables so the script can authenticate and run properly.
Examples:
```
export SCRIPT_TOKEN_KEY='keyhere'
```

or

```
export SCRIPT_TOKEN_KEY="$HOME/key.file"
```

Once variables are set, you can execute the script directly:

```
php run.php
```

This mirrors how the script will behave when executed inside Eramba.
All scripts in this repository were created using ChatGPT, following a strict two-step approach:
1. **Account & permission instructions**
   An LLM prompt is used to generate clear, actionable instructions explaining how to:
   - Create service accounts
   - Assign roles and permissions
   - Prepare the target system securely
2. **Script generation**
   A second LLM prompt is used to generate:
   - A single-run PHP script
   - A Composer configuration
   - A ZIP-ready structure suitable for Eramba automations
The exact LLM instructions used for this process are included in this repository for transparency and reuse.
- Scripts are intentionally minimal and opinionated.
- They are meant to be adjusted, not treated as black boxes.
- If something breaks, fix the script — don’t over-engineer it.
If you are looking for polished SaaS integrations, this repo is not that.
If you want clear, auditable, and reproducible automations, you are in the right place.