ALIProWeb – PS/ALI Automation Platform

Overview

ALIProWeb is a production SaaS platform that processes and delivers subscriber PS/ALI (911 location) data to the Washington State ALI DBMS as part of a public safety data pipeline.

  • Built and operated independently since 2019
  • Supports real-world 911 location update workflows
  • Integrates cloud-native processing with business systems
  • Designed for correctness, reliability, and controlled execution

The system runs end-to-end in a production environment. The architecture emphasizes managed services and event-driven execution, resulting in an efficient, reliable, and operationally lightweight platform.


Scope

This repo focuses on architecture, workflows, and system design.

It does not include proprietary code, but reflects how the production system is structured and operated.


What is PS/ALI?

PS/ALI (Private Switch / Automatic Location Identification) data links a telephone number to a physical location.

During a 911 call, dispatchers rely on this data to determine where the caller is and route assistance.
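The link between a telephone number and a location can be pictured as a simple record. This is an illustrative sketch only; the field names below are assumptions for explanation, not the actual NENA record layout used in production.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PsAliRecord:
    """Hypothetical shape of a PS/ALI entry: one telephone number
    tied to one dispatchable location."""
    telephone_number: str  # 10-digit TN, e.g. "2065550100"
    house_number: str
    street_name: str
    community: str         # city / MSAG community name
    state: str
    location_info: str     # floor / room / suite detail for dispatchers

record = PsAliRecord("2065550100", "400", "BROAD ST", "SEATTLE", "WA", "FL 5 RM 12")
```

During a 911 call from that TN, the dispatcher sees the associated street address plus the station-level detail in `location_info`.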


Why This Matters

This system sits in a public safety data pipeline.

While it is not a dispatch system, it directly affects the quality and availability of location data used during emergency response.

That drives different priorities:

  • correctness of transformations
  • reliability of processing
  • traceability of changes
  • safe handling of malformed input

Downstream systems assume this data is accurate.
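One concrete consequence of these priorities: malformed rows are quarantined rather than passed downstream. The sketch below illustrates that pattern under assumed field names (`tn`, `street_name`); it is not the production validator.

```python
import re

def validate_tn(tn: str) -> bool:
    """Accept only a bare 10-digit NANP telephone number."""
    return bool(re.fullmatch(r"[2-9]\d{9}", tn))

def partition_rows(rows):
    """Split input into accepted and rejected rows so that malformed
    data is quarantined for review instead of being submitted."""
    accepted, rejected = [], []
    for row in rows:
        if validate_tn(row.get("tn", "")) and row.get("street_name"):
            accepted.append(row)
        else:
            rejected.append(row)  # held back, never sent downstream
    return accepted, rejected

ok, bad = partition_rows([
    {"tn": "2065550100", "street_name": "BROAD ST"},
    {"tn": "555-0100", "street_name": "BROAD ST"},  # malformed TN
])
```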


Production Context

ALIProWeb has been running in production since 2019, supporting workflows tied to 911 infrastructure.

It processes customer data and feeds systems relied on during emergency situations. The client base is primarily public sector entities.


Standards Alignment (NENA)

The platform operates within NENA-defined standards.

This affects:

  • how location data is structured and validated
  • how inputs are normalized
  • how updates are handled
  • how downstream systems consume data

Why Standards Matter

This is not a flexible data model:

  • formats are externally defined
  • input is often inconsistent
  • compliance and normalization must coexist
  • interoperability depends on precision

Much of the system complexity comes from these constraints.
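A small example of the normalization constraint: free-form street suffixes must be mapped to the abbreviations the downstream ALI DBMS expects (in the style of USPS Publication 28). The mapping and function below are hypothetical illustrations, not the production rules.

```python
# Assumed subset of suffix abbreviations for illustration only.
SUFFIXES = {
    "AVENUE": "AVE", "AV": "AVE",
    "STREET": "ST",
    "ROAD": "RD",
    "BOULEVARD": "BLVD",
}

def normalize_street(street: str) -> str:
    """Uppercase the street name and abbreviate a trailing suffix."""
    parts = street.upper().split()
    if parts and parts[-1] in SUFFIXES:
        parts[-1] = SUFFIXES[parts[-1]]
    return " ".join(parts)

normalize_street("Martin Luther King Jr Boulevard")  # "MARTIN LUTHER KING JR BLVD"
```

Inconsistent customer input ("Avenue", "Av", "AVE") must collapse to one compliant form, which is where normalization and compliance have to coexist.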


Architecture Overview

At a high level, the system ingests data, stages it durably, processes it through a controlled workflow, delivers validated output to downstream ALI systems, and reports status to the end user.

Key Characteristics

  • Deterministic processing pipeline
  • Hybrid execution model (serverless + containers)
  • Integration with business systems
  • Controlled operational layer for replay and diagnostics

Architecture at a Glance

[ Customer Portal / Web API ]
            │
            ▼
[ Object Storage / Ingestion Boundary ]
            │
            ▼
[ Processing Layer ]
      │
      ├───────────────┬───────────────────────┐
      ▼               ▼                       ▼
[ Serverless ]   [ EC2 + Containers ]   [ Systems Manager ]
      │               │                       │
      └──────┬────────┴──────────────┬────────┘
             ▼                       ▼
      [ Business Integrations ]   [ Control Plane ]
             │
             ▼
[ Downstream ALI DBMS ]

Key Capabilities

  • Customer ingestion via portal and API
  • Durable staging and replayable ingestion
  • Data normalization and transformation
  • Workflow tracking through business systems
  • Controlled delivery to ALI DBMS
  • Centralized configuration and secrets
  • Remote operational control
  • Hybrid compute model
  • Infrastructure managed via CloudFormation

Core Architectural Concepts

Durable Ingestion Boundary

All inbound data is written to object storage before processing.

This enables replay, auditability, and separation of intake from execution.
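The write-before-process pattern can be sketched as follows. This is a minimal illustration with an in-memory dict standing in for object storage; the real boundary uses S3, and the content-addressed key is an assumption for the example.

```python
import hashlib
import json

class IngestionBoundary:
    """Every inbound payload is persisted before any processing runs,
    so any payload can be replayed later from the stored original."""

    def __init__(self):
        self.store = {}  # stand-in for object storage

    def ingest(self, payload: bytes) -> str:
        key = hashlib.sha256(payload).hexdigest()  # content-addressed key
        self.store[key] = payload                  # durable write happens first
        return key

    def replay(self, key: str, process):
        """Re-run any processing function against the original bytes."""
        return process(self.store[key])

boundary = IngestionBoundary()
key = boundary.ingest(b'{"tn": "2065550100"}')
result = boundary.replay(key, json.loads)
```

Because intake and execution are separated, a processing bug can be fixed and the affected payloads replayed without asking customers to resubmit.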


Distributed Workflow Processing

Processing is staged:

  1. ingestion
  2. validation
  3. transformation
  4. tracking
  5. submission
  6. reconciliation

Each stage is observable and recoverable.
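The staged, recoverable flow can be sketched as a checkpointing loop. The handler and checkpoint shapes below are assumptions for illustration; only the stage list comes from the document.

```python
STAGES = ["ingestion", "validation", "transformation",
          "tracking", "submission", "reconciliation"]

def run_pipeline(record, handlers, checkpoint):
    """Run each stage in order, recording the last completed stage so a
    failed record resumes where it stopped instead of restarting."""
    last = checkpoint.get("last_done")
    start = STAGES.index(last) + 1 if last else 0
    for stage in STAGES[start:]:
        record = handlers[stage](record)
        checkpoint["last_done"] = stage  # observable, recoverable progress
    return record

# Usage with no-op handlers for each stage:
handlers = {stage: (lambda rec: rec) for stage in STAGES}
checkpoint = {}
out = run_pipeline({"tn": "2065550100"}, handlers, checkpoint)
```

If submission fails, `checkpoint["last_done"]` reads `"tracking"`, and a re-run picks up at submission rather than re-ingesting.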


Hybrid Compute Model

Workloads run across multiple environments:

Serverless

  • event-driven
  • short-lived
  • scalable

EC2 + Containers

  • long-running
  • batch processing
  • persistent integrations

Systems Manager

  • operator-triggered workflows
  • diagnostics
  • recovery

Configuration & Secrets

Configuration and secrets are centrally managed:

  • API credentials
  • environment-specific config
  • separation from code
  • access controlled via IAM

Operational Control

Operational actions run through controlled interfaces rather than direct infrastructure access.

This includes:

  • replay
  • retries
  • diagnostics
  • maintenance
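The shape of such a controlled interface is an allow-list of approved actions rather than arbitrary access. This is a conceptual sketch; in production this role is played by Systems Manager documents, and the names here are hypothetical.

```python
ALLOWED_ACTIONS = {"replay", "retry", "diagnostics", "maintenance"}

def run_operation(action, target, actions):
    """Dispatch an operational action through a narrow, auditable
    interface. 'actions' maps approved names to callables."""
    if action not in ALLOWED_ACTIONS:
        raise PermissionError(f"action {action!r} is not an approved operation")
    return actions[action](target)
```

An operator can retry a batch (`run_operation("retry", "batch-42", ...)`), but an unapproved action fails closed before touching any infrastructure.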

Infrastructure as Code

The platform is defined using CloudFormation.

  • environments are reproducible
  • changes are versioned
  • infrastructure is not manually configured
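For flavor, a CloudFormation template takes this shape. The fragment below is an illustrative stand-in, not a resource from the real stack; versioning is shown because it supports the replay/audit goals above.

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Illustrative fragment only - not part of the production stack
Resources:
  IngestionBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled   # retains prior object versions for replay and audit
```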

Workload Decision Framework

Workload Type              Execution Model
Event-driven ingestion     Serverless
Lightweight validation     Serverless
Short transformations      Serverless
Long-running jobs          EC2 + Containers
Batch processing           EC2 + Containers
Persistent integrations    EC2 + Containers
Retry / replay             Systems Manager
Diagnostics                Systems Manager
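The table above can be encoded as a lookup. The categories come from the table; the function itself is an illustration of the routing decision, not production code.

```python
# Decision table mapping workload type to execution model.
EXECUTION_MODEL = {
    "event-driven ingestion": "Serverless",
    "lightweight validation": "Serverless",
    "short transformations": "Serverless",
    "long-running jobs": "EC2 + Containers",
    "batch processing": "EC2 + Containers",
    "persistent integrations": "EC2 + Containers",
    "retry / replay": "Systems Manager",
    "diagnostics": "Systems Manager",
}

def execution_model(workload_type: str) -> str:
    """Return the execution environment for a given workload type."""
    return EXECUTION_MODEL[workload_type.lower()]
```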

Repository Structure

docs/           → architecture and design documentation  
examples/       → sample payloads  
adr/            → design decisions  
supplemental/   → supporting materials  

Design Principles

  • separation of concerns
  • auditability
  • replayability
  • least-privilege security
  • operational simplicity

Source Code

The production application code is maintained in AWS CodeCommit.

This repo focuses on architecture and system design.


Ownership

The system was designed and implemented end-to-end by a single engineer.

This includes:

  • architecture
  • workflows
  • infrastructure (CloudFormation)
  • integrations
  • operational tooling
  • configuration and secrets
  • lifecycle management

Integration Map

The integration map shows how the workflow core connects to identity, state, business systems, and platform control.

Highlights

  • runtime execution across multiple environments
  • supporting services for identity, state, and configuration
  • integration with CRM and accounting systems
  • delivery into the ALI DBMS
