commit 36552903e7 (parent 96eabe3db6)
Date: 2026-02-07 09:23:49 +00:00
85 changed files with 9820870 additions and 1767 deletions


@@ -1,17 +1,4 @@
{
"default:86Y93-7660363": {
"id": "86Y93-7660363",
"name": "EspoCRMStatusCheck",
"lastActivity": 1760457660363,
"metadata": {
"completedSteps": 1,
"activeSteps": 0,
"totalSteps": 1
},
"status": "failed",
"startTime": 1760457660363,
"endTime": 1760457663336
},
"default:FY927-7720026": {
"id": "FY927-7720026",
"name": "EspoCRMStatusCheck",
@@ -648,5 +635,18 @@
"status": "failed",
"startTime": 1760709120910,
"endTime": 1760709121008
},
"default:XGTDJ-9480306": {
"id": "XGTDJ-9480306",
"name": "EspoCRMStatusCheck",
"lastActivity": 1767189480307,
"metadata": {
"completedSteps": 1,
"activeSteps": 0,
"totalSteps": 1
},
"status": "failed",
"startTime": 1767189480307,
"endTime": 1767189480371
}
}


@@ -1,21 +1,4 @@
{
"86Y93-7660363:3fd26071-e1dd-4288-8542-e9df46d4a991": {
"id": "3fd26071-e1dd-4288-8542-e9df46d4a991",
"name": "EspoCRMStatusCheck",
"parentTraceId": "86Y93-7660363",
"status": "failed",
"startTime": 1760457660467,
"endTime": 1760457663336,
"entryPoint": {
"type": "cron",
"stepName": "EspoCRMStatusCheck"
},
"events": [],
"error": {
"message": "handler() missing 1 required positional argument: 'context'",
"stack": ""
}
},
"FY927-7720026:ad12b5cf-8092-4671-adc1-c7349e42df0f": {
"id": "ad12b5cf-8092-4671-adc1-c7349e42df0f",
"name": "EspoCRMStatusCheck",
@@ -916,5 +899,22 @@
"message": "handler() missing 1 required positional argument: 'context'",
"stack": ""
}
},
"XGTDJ-9480306:58ddd8c4-a2c0-44d7-9d6f-829e8c700ba1": {
"id": "58ddd8c4-a2c0-44d7-9d6f-829e8c700ba1",
"name": "EspoCRMStatusCheck",
"parentTraceId": "XGTDJ-9480306",
"status": "failed",
"startTime": 1767189480307,
"endTime": 1767189480371,
"entryPoint": {
"type": "cron",
"stepName": "EspoCRMStatusCheck"
},
"events": [],
"error": {
"message": "handler() missing 1 required positional argument: 'context'",
"stack": ""
}
}
}

bitbylaw/.gitignore vendored

@@ -6,3 +6,32 @@ venv
.mermaid
dist
*.pyc
__pycache__
# Performance logs and diagnostics
*_log.txt
performance_logs_*/
*.clinic
# Service account credentials (IMPORTANT!)
service-account.json
# IDE and editor files
.vscode/
.cursor/
.aider*
# OS files
.DS_Store
Thumbs.db
# Build artifacts
*.so
*.egg-info
build/
*.whl
# Environment files
.env
.env.*
!.env.example


@@ -1,277 +1,206 @@
-# Motia Advoware-EspoCRM Integration
-This project implements a robust integration between Advoware and EspoCRM via the Motia framework. It provides a full API proxy for Advoware and webhook handlers for EspoCRM to synchronize changes to Beteiligte entities.
-## Overview
-The system consists of three main components:
-1. **Advoware API Proxy**: Full REST API proxy for all HTTP methods (GET, POST, PUT, DELETE)
-2. **EspoCRM Webhook Receiver**: Receives webhooks for CRUD operations on Beteiligte entities
-3. **Event-Driven Sync**: Processes synchronization events with Redis-based deduplication
-## Architecture
-### Components
-- **Motia Framework**: Event-driven backend orchestration
-- **Python Steps**: Asynchronous processing with aiohttp and redis-py
-- **Advoware API Client**: Authenticated API communication with token management
-- **Redis**: Deduplication of webhook events and caching
-- **EspoCRM Integration**: Webhook handlers for create/update/delete operations
-### Data Flow
-```
-EspoCRM Webhook → VMH Webhook Receiver → Redis Deduplication → Event Emission → Sync Handler
-Advoware API → Proxy Steps → Response
-```
-## Setup
-### Prerequisites
-- Python 3.13+
-- Node.js 18+
-- Redis Server
-- Motia CLI
-### Installation
-1. **Clone the repository and install dependencies:**
-```bash
-cd /opt/motia-app/bitbylaw
-npm install
-pip install -r requirements.txt
-```
-2. **Configure environment variables:**
-Create a `.env` file with the following variables:
-```env
-ADVOWARE_BASE_URL=https://api.advoware.com
-ADVOWARE_USERNAME=your_username
-ADVOWARE_PASSWORD=your_password
-REDIS_URL=redis://localhost:6379
-ESPOCRM_WEBHOOK_SECRET=your_webhook_secret
-```
-3. **Start Redis:**
-```bash
-redis-server
-```
-4. **Start Motia:**
-```bash
-motia start
-```
-## Usage
-### Advoware API Proxy
-The proxy endpoints mirror the Advoware API:
-- `GET /api/advoware/*` - Fetch data
-- `POST /api/advoware/*` - Create new resources
-- `PUT /api/advoware/*` - Update resources
-- `DELETE /api/advoware/*` - Delete resources
-**Example:**
-```bash
-curl -X GET "http://localhost:3000/api/advoware/employees"
-```
-For details on the proxy steps, see [steps/advoware_proxy/README.md](steps/advoware_proxy/README.md).
-### EspoCRM Webhooks
-Webhooks are sent automatically by EspoCRM for changes to Beteiligte entities:
-- **Create**: `/webhooks/vmh/beteiligte/create`
-- **Update**: `/webhooks/vmh/beteiligte/update`
-- **Delete**: `/webhooks/vmh/beteiligte/delete`
-For details on the webhook and sync steps, see [steps/vmh/README.md](steps/vmh/README.md).
-### Synchronization
-Synchronization is event-driven:
-1. Webhook events are deduplicated in Redis queues
-2. Events are emitted to the sync handler
-3. The sync handler processes the changes (currently a placeholder)
-## Configuration
-### Motia Workbench
-The flows are defined in `motia-workbench.json`:
-- `advoware-proxy`: API proxy flows
-- `vmh-webhook`: Webhook receiver flows
-- `beteiligte-sync`: Synchronization flow
-### Redis Keys
-- `vmh:webhook:create`: Create event queue
-- `vmh:webhook:update`: Update event queue
-- `vmh:webhook:delete`: Delete event queue
-## Development
-### Project Structure
-```
-bitbylaw/
-├── steps/
-│   ├── advoware_proxy/   # API proxy steps - see [README](steps/advoware_proxy/README.md)
-│   │   ├── advoware_api_proxy_get_step.py
-│   │   ├── advoware_api_proxy_post_step.py
-│   │   ├── advoware_api_proxy_put_step.py
-│   │   └── advoware_api_proxy_delete_step.py
-│   └── vmh/              # VMH webhook & sync steps - see [README](steps/vmh/README.md)
-│       ├── webhook/      # Webhook receiver steps
-│       │   ├── beteiligte_create_api_step.py
-│       │   ├── beteiligte_update_api_step.py
-│       │   └── beteiligte_delete_api_step.py
-│       └── beteiligte_sync_event_step.py  # Sync handler
-├── services/
-│   └── advoware.py       # API client
-├── config.py             # Configuration
-├── motia-workbench.json  # Flow definitions
-├── package.json
-├── requirements.txt
-└── tsconfig.json
-```
-### Testing
-**Test the API proxy:**
-```bash
-curl -X GET "http://localhost:3000/api/advoware/employees"
-```
-**Simulate a webhook:**
-```bash
-curl -X POST "http://localhost:3000/webhooks/vmh/beteiligte/create" \
-  -H "Content-Type: application/json" \
-  -d '{"id": "123", "name": "Test Beteiligte"}'
-```
-### Logging
-All steps include detailed logging output for debugging:
-- API requests/responses
-- Redis operations
-- Event emission
-- Error handling
-## Deployment
-### Docker
-```dockerfile
-FROM python:3.13-slim
-WORKDIR /app
-COPY requirements.txt .
-RUN pip install -r requirements.txt
-COPY . .
-EXPOSE 3000
-CMD ["motia", "start"]
-```
-### Production Setup
-1. Redis cluster for high availability
-2. Load balancer for API endpoints
-3. Monitoring for sync operations
-4. Backup strategy for Redis data
-## Troubleshooting
-### Common Issues
-1. **Context Attribute Error**: Use `Config` instead of `context.config`
-2. **Redis Connection Failed**: Check the Redis URL and network connectivity
-3. **Webhook Duplicates**: Redis deduplication prevents duplicate processing
-### Checking Logs
-```bash
-motia logs
-```
-## Calendar Sync
-The system also includes a bidirectional calendar synchronization between Advoware and Google Calendar.
-### Architecture
-- **PostgreSQL Hub**: Stores sync state and prevents data loss
-- **Event-Driven Sync**: Four-phase sync (new, deleted, updated)
-- **Safe Wrappers**: Global write protection for Advoware write operations
-- **Rate Limiting**: Backoff handling for Google Calendar API limits
-### Dauertermine (Recurring Appointments)
-Advoware uses `dauertermin=1` for recurring appointments with the following fields:
-- `turnus`: Interval (e.g. 1 = every, 3 = every 3rd)
-- `turnusArt`: Frequency unit
-  - `1` = Daily (DAILY)
-  - `2` = Weekly (WEEKLY)
-  - `3` = Monthly (MONTHLY)
-  - `4` = Yearly (YEARLY)
-- `datumBis`: End date of the recurrence
-**RRULE generation:**
-```
-RRULE:FREQ={FREQ};INTERVAL={turnus};UNTIL={datumBis}
-```
-Example: `turnus=3, turnusArt=1` → `RRULE:FREQ=DAILY;INTERVAL=3;UNTIL=20251224`
-### Setup
-1. **Google Service Account**: `service-account.json` in the project root
-2. **Environment variables:**
-```env
-ADVOWARE_WRITE_PROTECTION=false  # Global write protection
-POSTGRES_HOST=localhost
-GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=service-account.json
-```
-3. **Trigger sync:**
-```bash
-curl -X POST "http://localhost:3000/advoware/calendar/sync" -H "Content-Type: application/json" -d '{"full_content": true}'
-```
-### Rate Limiting & Backoff
-- **Google Calendar API**: 403 errors on rate limits are retried with exponential backoff (max. 60s)
-- **Global rate limiting**: 600 requests/minute across all parallel sync processes via a Redis sorted set
-- **Sliding window**: 60-second window for continuous monitoring of the average
-- **Delays**: 100ms between API calls to avoid limits
-- **Retry logic**: Max. 4 attempts with base=4
-### Security
-- **Write Protection**: `ADVOWARE_WRITE_PROTECTION=true` disables all Advoware write operations
-- **Per-User Calendars**: Automatic creation and sharing of Google calendars per employee
-### Troubleshooting
-- **Rate Limit Errors**: Logs show backoff retries; wait or raise the limits
-- **Sync Failures**: Set `ADVOWARE_WRITE_PROTECTION=false` for debugging
-- **Calendar Access**: The service account needs owner rights
-## License
-[License Information]
-## Contributing
-Please open issues for bugs or feature requests. Pull requests are welcome!
+# bitbylaw - Motia Integration Platform
+Event-driven integration between Advoware, EspoCRM, and Google Calendar via the Motia framework.
+## Quick Start
+```bash
+cd /opt/motia-app/bitbylaw
+npm install
+pip install -r requirements.txt
+npm start
+```
+See [docs/DEVELOPMENT.md](docs/DEVELOPMENT.md) for details.
+## Components
+1. **Advoware API Proxy** - REST API proxy with HMAC-512 auth ([details](steps/advoware_proxy/README.md))
+2. **Calendar Sync** - Bidirectional synchronization Advoware ↔ Google ([details](steps/advoware_cal_sync/README.md))
+3. **VMH Webhooks** - EspoCRM webhook receiver for Beteiligte ([details](steps/vmh/README.md))
+## Architecture
+```
+┌─────────────┐     ┌──────────┐     ┌────────────┐
+│   EspoCRM   │────▶│ Webhooks │────▶│   Redis    │
+└─────────────┘     └──────────┘     │   Dedup    │
+                                     └────────────┘
+┌─────────────┐     ┌──────────┐           │
+│   Clients   │────▶│  Proxy   │────▶      │
+└─────────────┘     └──────────┘           │
+                                     ┌────────────┐
+                                     │    Sync    │
+                                     │  Handlers  │
+                                     └────────────┘
+                                     ┌────────────┐
+                                     │  Advoware  │
+                                     │   Google   │
+                                     └────────────┘
+```
+See [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md)
+## API Endpoints
+**Advoware Proxy**:
+- `GET/POST/PUT/DELETE /advoware/proxy?endpoint=...`
+**Calendar Sync**:
+- `POST /advoware/calendar/sync` - Manual trigger
+**VMH Webhooks**:
+- `POST /vmh/webhook/beteiligte/create`
+- `POST /vmh/webhook/beteiligte/update`
+- `POST /vmh/webhook/beteiligte/delete`
+See [docs/API.md](docs/API.md)
+## Configuration
+Environment variables via `.env` or the systemd service:
+```bash
+# Advoware
+ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
+ADVOWARE_API_KEY=your_base64_key
+ADVOWARE_USER=api_user
+ADVOWARE_PASSWORD=your_password
+# Redis
+REDIS_HOST=localhost
+REDIS_PORT=6379
+# Google Calendar
+GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json
+```
+See:
+- [docs/CONFIGURATION.md](docs/CONFIGURATION.md)
+- [docs/GOOGLE_SETUP.md](docs/GOOGLE_SETUP.md) - Service account setup
+## Deployment
+Production deployment via systemd:
+```bash
+sudo systemctl status motia.service
+sudo journalctl -u motia.service -f
+```
+See [docs/DEPLOYMENT.md](docs/DEPLOYMENT.md)
+## Documentation
+### Getting Started
+- [Development Guide](docs/DEVELOPMENT.md) - Setup, coding standards, testing
+- [Configuration](docs/CONFIGURATION.md) - Environment variables
+- [Deployment](docs/DEPLOYMENT.md) - Production setup
+### Technical Details
+- [Architecture](docs/ARCHITECTURE.md) - System design, data flows
+- [API Reference](docs/API.md) - HTTP endpoints, event topics
+- [Troubleshooting](docs/TROUBLESHOOTING.md) - Common issues
+### Components
+- [Advoware Proxy](steps/advoware_proxy/README.md) - API proxy details
+- [Calendar Sync](steps/advoware_cal_sync/README.md) - Sync logic
+- [VMH Webhooks](steps/vmh/README.md) - Webhook handlers
+- [Advoware Service](services/ADVOWARE_SERVICE.md) - API client
+### Step Documentation
+Each step has a detailed `.md` documentation file next to its `.py` file.
+## Project Structure
+```
+bitbylaw/
+├── docs/                    # Documentation
+├── steps/                   # Motia steps
+│   ├── advoware_proxy/      # API proxy steps + docs
+│   ├── advoware_cal_sync/   # Calendar sync steps + docs
+│   └── vmh/                 # Webhook steps + docs
+├── services/                # Shared services
+│   └── advoware.py          # API client + doc
+├── config.py                # Configuration loader
+├── package.json             # Node.js dependencies
+└── requirements.txt         # Python dependencies
+```
+## Technology Stack
+- **Framework**: Motia v0.8.2-beta.139 (Event-Driven Backend)
+- **Languages**: Python 3.13, Node.js 18, TypeScript
+- **Data Store**: Redis (caching, locking, deduplication)
+- **External APIs**: Advoware REST API, Google Calendar API, EspoCRM
+## Development
+```bash
+# Development mode
+npm run dev
+# Generate types
+npm run generate-types
+# Clean build
+npm run clean && npm install
+```
+---
+## Project Structure (detailed)
+```
+bitbylaw/
+├── docs/                  # Comprehensive documentation
+│   ├── advoware/          # Advoware API documentation (Swagger)
+│   └── *.md               # Architecture, Development, Configuration, etc.
+├── scripts/               # Utility scripts for maintenance
+│   └── calendar_sync/     # Calendar sync helper scripts
+├── services/              # Shared service implementations
+├── steps/                 # Motia step implementations
+│   ├── advoware_proxy/    # REST API proxy steps
+│   ├── advoware_cal_sync/ # Calendar synchronization steps
+│   └── vmh/               # EspoCRM webhook handlers
+├── src/                   # TypeScript sources (currently unused)
+└── config.py              # Central configuration
+```
+**Key Files**:
+- [docs/INDEX.md](docs/INDEX.md) - Documentation navigation
+- [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md) - System architecture
+- [docs/advoware/advoware_api_swagger.json](docs/advoware/advoware_api_swagger.json) - Advoware API spec
+- [scripts/calendar_sync/README.md](scripts/calendar_sync/README.md) - Utility scripts
+---
+## Testing
+```bash
+# Test Advoware proxy
+curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees"
+# Test calendar sync
+curl -X POST "http://localhost:3000/advoware/calendar/sync" \
+  -H "Content-Type: application/json" \
+  -d '{"full_content": true}'
+# Test webhook
+curl -X POST "http://localhost:3000/vmh/webhook/beteiligte/create" \
+  -H "Content-Type: application/json" \
+  -d '[{"id": "test-123"}]'
+```
+## License
+[Your License]
+## Support
+- **Issues**: [GitHub Issues]
+- **Docs**: [docs/](docs/)
+- **Logs**: `sudo journalctl -u motia.service -f`

bitbylaw/docs/API.md Normal file

@@ -0,0 +1,514 @@
---
title: API Reference
description: Complete API documentation for the bitbylaw Motia installation
date: 2026-02-07
version: 1.1.0
---
# API Reference
## Base URL
**Production (via KONG)**: `https://api.bitbylaw.com`
**Development**: `http://localhost:3000`
---
## Authentication
### KONG API Gateway
All production API calls go through KONG with API key authentication:
```bash
curl -H "apikey: YOUR_API_KEY" https://api.bitbylaw.com/advoware/proxy?endpoint=employees
```
**Header**: `apikey: <your-api-key>`
### Development
Development environment: no authentication required at the Motia level.
---
## Advoware Proxy API
### Universal Proxy Endpoint
All Advoware API calls go through a universal proxy.
#### GET Request
**Endpoint**: `GET /advoware/proxy`
**Query Parameters**:
- `endpoint` (required): Advoware API endpoint (without the base URL)
- All other parameters are forwarded to Advoware
**Example**:
```bash
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees&limit=10"
```
**Response**:
```json
{
"status": 200,
"body": {
"result": {
"data": [...],
"total": 100
}
}
}
```
#### POST Request
**Endpoint**: `POST /advoware/proxy`
**Query Parameters**:
- `endpoint` (required): Advoware API endpoint
**Request Body**: JSON data for the Advoware API
**Example**:
```bash
curl -X POST "http://localhost:3000/advoware/proxy?endpoint=appointments" \
-H "Content-Type: application/json" \
-d '{
"datum": "2026-02-10",
"uhrzeitVon": "09:00:00",
"text": "Meeting"
}'
```
**Response**:
```json
{
"status": 200,
"body": {
"result": {
"id": "12345"
}
}
}
```
#### PUT Request
**Endpoint**: `PUT /advoware/proxy`
**Query Parameters**:
- `endpoint` (required): Advoware API endpoint (including the ID)
**Request Body**: JSON data for the update
**Example**:
```bash
curl -X PUT "http://localhost:3000/advoware/proxy?endpoint=appointments/12345" \
-H "Content-Type: application/json" \
-d '{
"text": "Updated Meeting"
}'
```
#### DELETE Request
**Endpoint**: `DELETE /advoware/proxy`
**Query Parameters**:
- `endpoint` (required): Advoware API endpoint (including the ID)
**Example**:
```bash
curl -X DELETE "http://localhost:3000/advoware/proxy?endpoint=appointments/12345"
```
**Response**:
```json
{
"status": 200,
"body": {
"result": null
}
}
```
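For scripted use, the universal proxy contract above can be wrapped in a thin client. A minimal sketch — the `build_proxy_url` helper, its base URL constant, and the error message are illustrative, not part of the project's codebase:

```python
from urllib.parse import urlencode

BASE = "http://localhost:3000/advoware/proxy"

def build_proxy_url(endpoint: str, **params) -> str:
    """Build a proxy URL: `endpoint` is required, all other params pass through."""
    if not endpoint:
        # mirrors the 400 error documented below: "Endpoint required as query param"
        raise ValueError("Endpoint required as query param")
    return f"{BASE}?{urlencode({'endpoint': endpoint, **params})}"

# Mirrors the curl GET example above
url = build_proxy_url("employees", limit=10)
```

Any HTTP client can then issue the request against the resulting URL with the desired method.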
### Error Responses
**400 Bad Request**:
```json
{
"status": 400,
"body": {
"error": "Endpoint required as query param"
}
}
```
**500 Internal Server Error**:
```json
{
"status": 500,
"body": {
"error": "Internal server error",
"details": "Error message"
}
}
```
## Calendar Sync API
### Trigger Full Sync
**Endpoint**: `POST /advoware/calendar/sync`
**Request Body**:
```json
{
"kuerzel": "ALL",
"full_content": true
}
```
**Parameters**:
- `kuerzel` (optional): Employee abbreviation or "ALL" (default: "ALL")
- `full_content` (optional): Full details vs. anonymized (default: true)
**Examples**:
Sync all employees:
```bash
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"full_content": true}'
```
Sync single employee:
```bash
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"kuerzel": "SB", "full_content": true}'
```
Sync with anonymization:
```bash
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"full_content": false}'
```
**Response**:
```json
{
"status": "triggered",
"kuerzel": "ALL",
"message": "Calendar sync triggered for ALL"
}
```
**Status Codes**:
- `200`: Sync triggered successfully
- `400`: Invalid request (e.g. lock active)
- `500`: Internal error
## VMH Webhook Endpoints
These endpoints are called by EspoCRM.
### Beteiligte Create Webhook
**Endpoint**: `POST /vmh/webhook/beteiligte/create`
**Request Body**: Array of entities
```json
[
{
"id": "entity-123",
"name": "Max Mustermann",
"createdAt": "2026-02-07T10:00:00Z"
}
]
```
**Response**:
```json
{
"status": "received",
"action": "create",
"new_ids_count": 1,
"total_ids_in_batch": 1
}
```
### Beteiligte Update Webhook
**Endpoint**: `POST /vmh/webhook/beteiligte/update`
**Request Body**: Array of entities
```json
[
{
"id": "entity-123",
"name": "Max Mustermann Updated",
"modifiedAt": "2026-02-07T11:00:00Z"
}
]
```
**Response**:
```json
{
"status": "received",
"action": "update",
"new_ids_count": 1,
"total_ids_in_batch": 1
}
```
### Beteiligte Delete Webhook
**Endpoint**: `POST /vmh/webhook/beteiligte/delete`
**Request Body**: Array of entities
```json
[
{
"id": "entity-123",
"deletedAt": "2026-02-07T12:00:00Z"
}
]
```
**Response**:
```json
{
"status": "received",
"action": "delete",
"new_ids_count": 1,
"total_ids_in_batch": 1
}
```
### Webhook Features
**Batch Support**: All webhooks accept arrays of entities
**Deduplication**: Redis-based, prevents duplicate processing
**Async Processing**: Events are emitted and processed asynchronously
## Event Topics
Internal event topics for the event-driven architecture (not directly callable).
### calendar_sync_all
**Emitted by**: `calendar_sync_cron_step`, `calendar_sync_api_step`
**Subscribed by**: `calendar_sync_all_step`
**Payload**:
```json
{}
```
### calendar_sync_employee
**Emitted by**: `calendar_sync_all_step`, `calendar_sync_api_step`
**Subscribed by**: `calendar_sync_event_step`
**Payload**:
```json
{
"kuerzel": "SB",
"full_content": true
}
```
### vmh.beteiligte.create
**Emitted by**: `beteiligte_create_api_step`
**Subscribed by**: `beteiligte_sync_event_step`
**Payload**:
```json
{
"entity_id": "123",
"action": "create",
"source": "webhook",
"timestamp": "2026-02-07T10:00:00Z"
}
```
### vmh.beteiligte.update
**Emitted by**: `beteiligte_update_api_step`
**Subscribed by**: `beteiligte_sync_event_step`
**Payload**:
```json
{
"entity_id": "123",
"action": "update",
"source": "webhook",
"timestamp": "2026-02-07T11:00:00Z"
}
```
### vmh.beteiligte.delete
**Emitted by**: `beteiligte_delete_api_step`
**Subscribed by**: `beteiligte_sync_event_step`
**Payload**:
```json
{
"entity_id": "123",
"action": "delete",
"source": "webhook",
"timestamp": "2026-02-07T12:00:00Z"
}
```
## Rate Limits
### Google Calendar API
**Limit**: 600 requests/minute (enforced via Redis token bucket)
**Behavior**:
- Requests wait if rate limit reached
- Automatic backoff on 403 errors
- Max retry: 4 attempts
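The retry behavior above (backoff on 403, max 4 attempts, 60 s cap, base 4 per the README) can be sketched as a small retry loop. `RateLimitError` and the injectable `sleep` parameter are illustrative assumptions, not the project's actual API:

```python
import time

class RateLimitError(Exception):
    """Stand-in for a 403 (quota exceeded) response from the Calendar API."""

def call_with_backoff(request_fn, max_attempts=4, base=4, max_delay=60, sleep=time.sleep):
    """Retry request_fn on rate-limit errors with exponential backoff.

    Delays grow as base ** attempt seconds, capped at max_delay:
    with base=4 that is 4s, 16s, then 60s before the final attempt.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return request_fn()
        except RateLimitError:
            if attempt == max_attempts:
                raise  # give up after the configured number of attempts
            sleep(min(base ** attempt, max_delay))
```

Passing a no-op `sleep` makes the loop easy to unit-test without real delays.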
### Advoware API
**Limit**: Unknown (no official documentation)
**Behavior**:
- 30s timeout per request
- Automatic token refresh on 401
- No retry logic (fail fast)
## Error Handling
### Standard Error Response
```json
{
"status": 400,
"body": {
"error": "Error description",
"details": "Detailed error message"
}
}
```
### HTTP Status Codes
- `200` - Success
- `400` - Bad Request (invalid input)
- `401` - Unauthorized (Advoware token invalid)
- `403` - Forbidden (rate limit)
- `404` - Not Found
- `500` - Internal Server Error
- `503` - Service Unavailable (Redis down)
### Common Errors
**Redis Connection Error**:
```json
{
"status": 503,
"body": {
"error": "Redis connection failed"
}
}
```
**Advoware API Error**:
```json
{
"status": 500,
"body": {
"error": "Advoware API call failed",
"details": "401 Unauthorized"
}
}
```
**Lock Active Error**:
```json
{
"status": 400,
"body": {
"error": "Sync already in progress for employee SB"
}
}
```
## API Versioning
**Current Version**: v1 (implicit, no version in URL)
**Future**: API versioning via URL prefix (`/v2/api/...`)
## Health Check
**Coming Soon**: `/health` endpoint for load balancers
Expected response:
```json
{
"status": "healthy",
"services": {
"redis": "up",
"advoware": "up",
"google": "up"
}
}
```
## Testing
### Postman Collection
Import this collection for quick testing:
```json
{
"info": {
"name": "bitbylaw API",
"schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
},
"item": [
{
"name": "Advoware Proxy GET",
"request": {
"method": "GET",
"url": "http://localhost:3000/advoware/proxy?endpoint=employees"
}
},
{
"name": "Calendar Sync Trigger",
"request": {
"method": "POST",
"url": "http://localhost:3000/advoware/calendar/sync",
"header": [{"key": "Content-Type", "value": "application/json"}],
"body": {
"mode": "raw",
"raw": "{\"full_content\": true}"
}
}
}
]
}
```
## Related Documentation
- [Architecture](ARCHITECTURE.md)
- [Development Guide](DEVELOPMENT.md)
- [Configuration](CONFIGURATION.md)


@@ -0,0 +1,642 @@
# Architecture
## System Overview
The bitbylaw system is an event-driven integration between Advoware, EspoCRM, Google Calendar, Vermieterhelden, and 3CX telephony, built on the Motia framework. The architecture follows a modular, microservice-oriented approach with a clear separation of concerns.
### Core Components
```
┌─────────────────────────────┐
│ KONG API Gateway │
│ api.bitbylaw.com │
│ (Auth, Rate Limiting) │
└──────────────┬──────────────┘
┌──────────────────────────┼──────────────────────────┐
│ │ │
┌────▼────────┐ ┌──────▼─────────┐ ┌─────▼──────┐
│ Vermieter- │ │ Motia │ │ 3CX │
│ helden.de │────────▶│ Framework │◀────────│ Telefonie │
│ (WordPress) │ │ (Middleware) │ │ (ralup) │
└─────────────┘ └────────┬───────┘ └────────────┘
Leads Input │ Call Handling
┌───────────────────────────┼───────────────────────────┐
│ │ │
┌────▼────┐ ┌──────▼──────┐ ┌──────▼─────┐
│Advoware │ │ VMH │ │ Calendar │
│ Proxy │ │ Webhooks │ │ Sync │
└────┬────┘ └─────┬───────┘ └─────┬──────┘
│ │ │
│ │ │
┌────▼─────────────────────────▼──────────────────────────▼────┐
│                        Redis (3 DBs)                          │
│  DB 0: Motia internal                                         │
│  DB 1: Caching & Locks                                        │
│  DB 2: Calendar Sync State                                    │
└───────────────────────────────────────────────────────────────┘
┌────▼────────────────────────────┐
│ External Services │
├─────────────────────────────────┤
│ • Advoware REST API │
│ • EspoCRM (VMH) │
│ • Google Calendar API │
│ • 3CX API (ralup.my3cx.de) │
│ • Vermieterhelden WordPress │
└─────────────────────────────────┘
```
## Komponenten-Details
### 0. KONG API Gateway
**Purpose**: Central API gateway for all public APIs, with authentication and rate limiting.
**Domain**: `api.bitbylaw.com`
**Functions**:
- **Authentication**: API-key based, JWT, OAuth2
- **Rate Limiting**: Per consumer/API key
- **Request Routing**: To backend services (Motia, etc.)
- **SSL/TLS Termination**: HTTPS handling
- **Logging & Monitoring**: Request logs, metrics
- **CORS Handling**: Cross-origin requests
**Upstream Services**:
- Motia Framework (Advoware Proxy, Calendar Sync, VMH Webhooks)
- Future: additional microservices
**Configuration**:
```yaml
# KONG Service Configuration
services:
  - name: motia-backend
    url: http://localhost:3000
    routes:
      - name: advoware-proxy
        paths: [/advoware/*]
      - name: calendar-sync
        paths: [/calendar/*]
      - name: vmh-webhooks
        paths: [/vmh/*]
    plugins:
      - name: key-auth
      - name: rate-limiting
        config:
          minute: 600
```
**Flow**:
```
Client → KONG (api.bitbylaw.com) → Auth Check → Rate Limit → Motia Backend
```
### 1. Advoware Proxy Layer
**Purpose**: Transparent REST API proxy for Advoware with authentication and caching.
**Module**: `steps/advoware_proxy/`
- `advoware_api_proxy_get_step.py` - GET requests
- `advoware_api_proxy_post_step.py` - POST requests (create)
- `advoware_api_proxy_put_step.py` - PUT requests (update)
- `advoware_api_proxy_delete_step.py` - DELETE requests
**Services**: `services/advoware.py`
- Token management (HMAC-512 authentication)
- Redis-based token caching (55 min lifetime)
- Automatic token refresh on 401 errors
- Async API client built on aiohttp
**Data Flow**:
```
Client → API-Step → AdvowareAPI Service → Redis (Token Cache) → Advoware API
```
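The token handling in this flow can be sketched as a small cache whose TTL expires slightly before the Advoware token does. This is an illustrative stand-in for `services/advoware.py`, not its actual code; the `fetch_token` callable and the injectable clock are assumptions:

```python
import time

class TokenCache:
    """Cache the bearer token in memory with a TTL below the token lifetime;
    drop it on a 401 so the next request re-authenticates."""

    def __init__(self, fetch_token, ttl_seconds=3180):  # 53 min < 55 min lifetime
        self._fetch = fetch_token        # e.g. a call to POST /auth/login
        self._ttl = ttl_seconds
        self._token = None
        self._expires_at = 0.0

    def get(self, now=time.monotonic):
        """Return a cached token, refetching once the TTL has elapsed."""
        if self._token is None or now() >= self._expires_at:
            self._token = self._fetch()
            self._expires_at = now() + self._ttl
        return self._token

    def invalidate(self):
        """Call this on a 401 response to force a refresh on the next get()."""
        self._token = None
```

In the real service the same pattern is backed by Redis (`advoware_access_token`, see the Redis layout below) rather than process memory.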
### 2. Calendar Sync System
**Purpose**: Bidirectional synchronization between Advoware appointments and Google Calendar.
**Architecture Pattern**: Event-Driven Cascade
**Integration**: EspoCRM sends webhooks to KONG → Motia
**Data Flow**:
```
EspoCRM (Vermieterhelden CRM) → KONG → Motia VMH Webhooks → Redis Dedup → Events
```
```
Cron (daily)
→ calendar_sync_cron_step
→ Emit: "calendar_sync_all"
→ calendar_sync_all_step
→ Fetch Employees
→ For each Employee:
→ Set Redis Lock
→ Emit: "calendar_sync_employee"
→ calendar_sync_event_step
→ Fetch Advoware Events
→ Fetch Google Events
→ Sync (Create/Update/Delete)
→ Clear Redis Lock
```
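The fan-out step of the cascade above can be sketched as follows. `acquire_lock` and `emit` are placeholders for the Redis lock (SET NX with TTL) and Motia's event emission; the real logic lives in `calendar_sync_all_step.py`:

```python
def dispatch_sync(employees, acquire_lock, emit):
    """Fan out one calendar_sync_employee event per employee, skipping locked ones.

    acquire_lock(kuerzel) stands in for a Redis SET NX with a 5-minute TTL;
    emit(topic, payload) stands in for Motia's event emission.
    Returns the employees skipped because a sync was already running.
    """
    skipped = []
    for kuerzel in employees:
        if not acquire_lock(kuerzel):
            skipped.append(kuerzel)
            continue
        emit("calendar_sync_employee", {"kuerzel": kuerzel, "full_content": True})
    return skipped
```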
**Module**: `steps/advoware_cal_sync/`
- `calendar_sync_cron_step.py` - Daily trigger
- `calendar_sync_all_step.py` - Employee list handler
- `calendar_sync_event_step.py` - Per-employee sync logic
- `calendar_sync_api_step.py` - Manual trigger endpoint
- `calendar_sync_utils.py` - Shared utilities
- `audit_calendar_sync.py` - Audit & diagnostics
**Key Features**:
- **Redis Locking**: Prevents parallel syncs for the same employee
- **Rate Limiting**: Token bucket algorithm (7 tokens, Redis-based)
- **Normalization**: Common format (Berlin TZ) for both APIs
- **Error Isolation**: A failure for one employee does not stop the overall sync
**Data Mapping**:
```
Advoware Format → Standard Format → Google Calendar Format
↓ ↓ ↓
datum/uhrzeitVon start (datetime) dateTime
datumBis end (datetime) dateTime
dauertermin all_day (bool) date
turnus/turnusArt recurrence RRULE
```
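The recurrence part of this mapping is fully specified in the docs (`turnusArt` 1-4 → FREQ, `turnus` → INTERVAL, `datumBis` → UNTIL), so it can be sketched directly; the function name is illustrative:

```python
# turnusArt → iCalendar FREQ, as documented for Advoware recurring appointments
FREQ = {1: "DAILY", 2: "WEEKLY", 3: "MONTHLY", 4: "YEARLY"}

def to_rrule(turnus: int, turnus_art: int, datum_bis: str) -> str:
    """Build the Google Calendar RRULE from Advoware recurrence fields.

    datum_bis is assumed to already be in iCalendar basic date format (YYYYMMDD).
    """
    return f"RRULE:FREQ={FREQ[turnus_art]};INTERVAL={turnus};UNTIL={datum_bis}"
```

For example, `to_rrule(3, 1, "20251224")` yields the documented `RRULE:FREQ=DAILY;INTERVAL=3;UNTIL=20251224`.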
### 3. VMH Webhook System
**Purpose**: Receiving and processing EspoCRM webhooks for Beteiligte entities.
**Architecture Pattern**: Webhook → Deduplication → Event Emission
**Module**: `steps/vmh/`
- `webhook/beteiligte_create_api_step.py` - Create Webhook
- `webhook/beteiligte_update_api_step.py` - Update Webhook
- `webhook/beteiligte_delete_api_step.py` - Delete Webhook
- `beteiligte_sync_event_step.py` - Sync Event Handler (Placeholder)
**Webhook Flow**:
```
EspoCRM → POST /vmh/webhook/beteiligte/create
Webhook Step
Extract Entity IDs
Redis Deduplication (SET: vmh:beteiligte:create_pending)
Emit Event: "vmh.beteiligte.create"
Sync Event Step (subscribes)
[TODO: implementation]
```
### 4. Vermieterhelden Integration
**Purpose**: Lead intake from the Vermieterhelden.de WordPress frontend.
**URL**: `https://vermieterhelden.de`
**Technology**: WordPress-based frontend
**Functions**:
- **Lead forms**: Tenants, landlords, inquiries
- **Lead routing**: To EspoCRM (VMH) → Motia
- **Webhook-based**: POST to KONG/Motia on new leads
**Data Flow**:
```
Vermieterhelden.de → Lead created → Webhook → KONG → Motia → EspoCRM/Advoware
```
**Lead Types**:
- Tenant inquiries
- Landlord inquiries
- Contact forms
- Newsletter sign-ups
**Integration with Motia**:
- Dedicated webhook endpoint: `/api/leads/vermieterhelden`
- Lead validation and enrichment
- Forwarding to the CRM systems
### 5. 3CX Telephony Integration
**Purpose**: Telephony system integration for call handling and lead qualification.
**URL**: `https://ralup.my3cx.de`
**Technology**: 3CX Cloud PBX
**Functions**:
- **Outbound Calls**: Lead calls (automatic or manual)
- **Inbound Calls**: Master data lookup (CTI - Computer Telephony Integration)
- **Call Logging**: Call records to the CRM
- **Call Recording**: Store and retrieve recordings
- **Screen Pops**: Customer info on incoming calls
**API Integrations**:
**A) Outbound: Motia → 3CX**
```
Motia → KONG → 3CX API
- Initiate Call to Lead
- Get Call Status
```
**B) Inbound: 3CX → Motia**
```
3CX Webhook → KONG → Motia
- Call Started → Fetch Customer Data
- Call Ended → Log Call Record
```
**Data Flow**:
**Call Initiation**:
```
Lead in CRM → Trigger Call → Motia → 3CX API → Dial Number
```
**Inbound Call**:
```
3CX detects call → Webhook to Motia → Lookup in Advoware/EspoCRM → Return data → 3CX Screen Pop
```
**Call Recording**:
```
Call ends → 3CX Webhook → Motia → Store metadata → Link to CRM entity
```
**Use Cases**:
- Lead qualification after intake
- Master data lookup on incoming calls
- Call records in EspoCRM/Advoware
- Automatic follow-up tasks
**Deduplication Mechanism**:
- Redis SET of pending IDs per action type (create/update/delete)
- New IDs are added to the SET
- Events are emitted only for new (non-duplicate) IDs
- A TTL on the SET prevents memory leaks
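The mechanism can be sketched with a plain Python set standing in for the Redis SET; in production, `SADD`'s return value plays the role of the membership check:

```python
def dedup_batch(pending, entity_ids):
    """Return only the IDs not yet pending, adding them to the pending set.

    `pending` is a plain set standing in for the Redis SET
    (e.g. vmh:beteiligte:create_pending); the SET's TTL, which prevents
    unbounded growth, is omitted from this sketch.
    """
    new_ids = []
    for eid in entity_ids:
        if eid not in pending:
            pending.add(eid)
            new_ids.append(eid)
    return new_ids  # events are emitted only for these IDs
```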
## Event-Driven Design
### Event-Topics
| Topic | Emitter | Subscriber | Payload |
|-------|---------|------------|---------|
| `calendar_sync_all` | cron_step | all_step | `{}` |
| `calendar_sync_employee` | all_step, api_step | event_step | `{kuerzel, full_content}` |
| `vmh.beteiligte.create` | create webhook | sync_event_step | `{entity_id, action, source, timestamp}` |
| `vmh.beteiligte.update` | update webhook | sync_event_step | `{entity_id, action, source, timestamp}` |
| `vmh.beteiligte.delete` | delete webhook | sync_event_step | `{entity_id, action, source, timestamp}` |
### Event-Flow Patterns
**1. Cascade Pattern** (Calendar Sync):
```
Trigger → Fetch List → Emit per Item → Process Item
```
**2. Webhook Pattern** (VMH):
```
External Event → Dedup → Internal Event → Processing
```
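The Cascade Pattern can be sketched as a plain function; the topic name matches the event-topic table above, while the `emit` callback and payload shape are illustrative:

```python
def cascade(employees: list, emit) -> None:
    """Cascade Pattern stand-in: fetch the list once, emit one event per item."""
    for kuerzel in employees:
        emit({"topic": "calendar_sync_employee",
              "data": {"kuerzel": kuerzel, "full_content": False}})
```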
## Redis Architecture
### Database Layout
**DB 0**: Default (Motia internal)
**DB 1**: Advoware Cache & Locks
- `advoware_access_token` - Bearer Token (TTL: 53min)
- `advoware_token_timestamp` - Token Creation Time
- `calendar_sync:lock:{kuerzel}` - Per-Employee Lock (TTL: 5min)
- `vmh:beteiligte:create_pending` - Create Dedup SET
- `vmh:beteiligte:update_pending` - Update Dedup SET
- `vmh:beteiligte:delete_pending` - Delete Dedup SET
**DB 2**: Calendar Sync Rate Limiting
- `google_calendar_api_tokens` - Token Bucket for Rate Limiting
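A token bucket like the one behind `google_calendar_api_tokens` can be sketched in-process; the Redis-backed version works the same way but keeps the token count and last-refill timestamp in keys so all steps share one budget:

```python
import time

class TokenBucket:
    """In-process stand-in for the Redis-backed bucket (default: 600 req/min)."""

    def __init__(self, rate_per_min: int = 600, capacity: int = 600):
        self.rate = rate_per_min / 60.0   # tokens refilled per second
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def try_acquire(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```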
---
## External APIs
### Advoware REST API
**Base URL**: `https://advoware-api.example.com/api/v1/`
**Auth**: HMAC-512 (see `services/advoware.py`)
**Rate Limits**: Unknown (no documented limits)
**Documentation**: [Advoware API Swagger](../docs/advoware/advoware_api_swagger.json)
**Key Endpoints**:
- `POST /auth/login` - Generate token
- `GET /employees` - Employee list
- `GET /events` - Fetch appointments
- `POST /events` - Create appointment
- `PUT /events/{id}` - Update appointment
### Redis Usage Patterns
**Token Caching**:
```python
# Set with expiration
redis.set('advoware_access_token', token, ex=3180) # 53min
# Get with fallback
token = redis.get('advoware_access_token')
if not token:
token = fetch_new_token()
```
### EspoCRM (VMH)
**Integration**: Webhook Sender (Outbound), API Consumer
**Endpoints**: Configured in EspoCRM, routed via KONG
**Format**: JSON POST with entity data
**Note**: Serves as the CRM for Vermieterhelden leads
### 3CX Telephony API
**Base URL**: `https://ralup.my3cx.de/api/v1/`
**Auth**: API key or Basic Auth
**Rate Limits**: Unknown (typically around 60 req/min)
**Key Endpoints**:
- `POST /calls/initiate` - Start a call
- `GET /calls/{id}/status` - Call status
- `GET /calls/{id}/recording` - Retrieve recording
- `POST /webhook` - Webhook configuration (inbound)
**Webhooks** (inbound from 3CX):
- `call.started` - Call begins
- `call.ended` - Call ended
- `call.transferred` - Call transferred
### Vermieterhelden
**Integration**: Webhook sender (lead intake)
**Base**: WordPress with custom plugins
**Format**: JSON POST to Motia
**Webhook Events**:
- `lead.created` - New lead
- `contact.submitted` - Contact form
**Locking**:
```python
lock_key = f'calendar_sync:lock:{kuerzel}'
if not redis.set(lock_key, '1', nx=True, ex=300):
    raise LockError("Already locked")
try:
    ...  # do the locked work
finally:
    redis.delete(lock_key)  # always release, even on errors
```
**Deduplication**:
```python
# Filter out IDs that are already pending, then register the new ones.
# Note: SMEMBERS + SADD is not atomic; use SADD's return value or a
# pipeline if strict atomicity is required.
existing = redis.smembers('vmh:beteiligte:create_pending')
new_ids = input_ids - existing
if new_ids:
    redis.sadd('vmh:beteiligte:create_pending', *new_ids)
```
## Service Layer
### AdvowareAPI Service
**Location**: `services/advoware.py`
**Responsibilities**:
- HMAC-512 Authentication
- Token Management
- HTTP Client (aiohttp)
- Error Handling & Retries
**Key Methods**:
```python
get_access_token(force_refresh=False) -> str
api_call(endpoint, method, params, json_data) -> Any
```
**Authentication Flow**:
```
1. Generate HMAC-512 signature
- Message: "{product_id}:{app_id}:{nonce}:{timestamp}"
- Key: Base64-decoded API Key
- Hash: SHA512
2. POST to security.advo-net.net/api/v1/Token
- Body: {AppID, User, Password, HMAC512Signature, ...}
3. Extract access_token from response
4. Cache in Redis (53min TTL)
5. Use as Bearer Token: "Authorization: Bearer {token}"
```
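Step 1 of the flow above can be sketched as follows. Whether the signature is hex- or Base64-encoded in the token request, and the exact field name, are assumptions:

```python
import base64
import hashlib
import hmac
import time
import uuid

def build_hmac512_signature(product_id: str, app_id: str, api_key_b64: str) -> dict:
    """Build the HMAC-512 token-request fields (step 1 of the auth flow)."""
    nonce = uuid.uuid4().hex
    timestamp = str(int(time.time()))
    message = f"{product_id}:{app_id}:{nonce}:{timestamp}"
    key = base64.b64decode(api_key_b64)  # the API key is stored Base64-encoded
    signature = hmac.new(key, message.encode(), hashlib.sha512).hexdigest()
    return {"nonce": nonce, "timestamp": timestamp, "HMAC512Signature": signature}
```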
## External API Integration
### Advoware API
**Base URL**: `https://www2.advo-net.net:90/`
**Auth**: HMAC-512 + Bearer Token
**Rate Limits**: Unknown (mitigated by robust error handling)
**Key Endpoints**:
- `/employees` - Employee list
- `/appointments` - Appointments
### Google Calendar API
**Auth**: Service Account (JSON Key)
**Rate Limits**: 600 requests/minute (enforced via Redis)
**Scopes**: `https://www.googleapis.com/auth/calendar`
**Key Operations**:
- `calendars().get()` - Calendar abrufen
- `calendars().insert()` - Calendar erstellen
- `events().list()` - Events abrufen
- `events().insert()` - Event erstellen
**KONG Gateway**: API-key or JWT-based auth for external clients
**Advoware**: User-based auth (ADVOWARE_USER + PASSWORD)
**Google**: Service account (domain-wide delegation)
**3CX**: API key or Basic Auth
**Redis**: Localhost only (no password)
**Vermieterhelden**: Webhook secret for validation
### EspoCRM
**Integration**: Webhook Sender (Outbound)
**Endpoints**: Configured in EspoCRM
**Format**: JSON POST with entity data
## Security
### Secrets Management
**Environment Variables**:
```bash
ADVOWARE_API_KEY # Base64-encoded HMAC Key
ADVOWARE_PASSWORD # User Password
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH # Path to JSON Key
ESPOCRM_MARVIN_API_KEY # Webhook Validation (optional)
```
**Storage**:
- Environment variables in systemd service
- Service Account JSON: `/opt/motia-app/service-account.json` (chmod 600)
- No secrets in code or Git
### Access Control
**Advoware**: User-based Auth (ADVOWARE_USER + PASSWORD)
**Google**: Service Account (domain-wide delegation)
**Redis**: Localhost only (no password)
## Performance Characteristics
### Throughput
**Calendar Sync**:
- ~10 employees: 2-3 minutes
- Rate-limited by Google API (600 req/min)
- Per-employee parallelization: no (sequential via events)
**Webhooks**:
- Instant processing (<100ms)
- Batch support (multiple entities per request)
- Redis dedup overhead: <10ms
### Memory Usage
**Current**: 169MB (Peak: 276MB)
**Breakdown**:
- Node.js process: ~150MB
- Python dependencies: Lazy-loaded per step
- Redis memory: <10MB
### Scalability
**Horizontal**: Not readily possible (Redis locks, shared state)
**Vertical**: CPU-bound when many employees sync in parallel
**Bottleneck**: Google Calendar API Rate Limits
## Monitoring & Observability
### Logging
**Framework**: Motia Workbench (structured logging)
**Levels**: DEBUG, INFO, ERROR
**Output**: journalctl (systemd) + Motia Workbench UI
**Key Log Points**:
- API-Requests (Method, URL, Status)
- Event Emission (Topic, Payload)
- Redis Operations (Keys, Success/Failure)
- Errors (Stack traces, Context)
### Metrics
**Available** (via Logs):
- Webhook receive count
- Calendar sync duration per employee
- API call count & latency
- Redis hit/miss ratio (implicit)
**Missing** (Future):
- Prometheus metrics
- Grafana dashboards
- Alerting
## Deployment
### systemd Service
**Unit**: `motia.service`
**User**: `www-data`
**WorkingDirectory**: `/opt/motia-app/bitbylaw`
**Restart**: `always` (10s delay)
**Environment**:
```bash
NODE_ENV=production
NODE_OPTIONS=--max-old-space-size=8192 --inspect
HOST=0.0.0.0
MOTIA_LOG_LEVEL=debug
```
### Dependencies
**Runtime**:
- Node.js 18+
- Python 3.13+
- Redis Server
- systemd
**Build**:
- npm (Node packages)
- pip (Python packages)
- Motia CLI
## Disaster Recovery
### Backup Strategy
**Redis**:
- RDB snapshots (automatic)
- AOF persistence (optional)
**Configuration**:
- Versioned in Git
- Environment Variables in systemd
**Service Account**:
- Manual backup: `/opt/motia-app/service-account.json`
### Recovery Procedures
**Service Restart**:
```bash
systemctl restart motia.service
```
**Clear Redis Cache**:
```bash
redis-cli -n 1 FLUSHDB # Advoware Cache
redis-cli -n 2 FLUSHDB # Calendar Sync
```
**Clear Employee Lock**:
```bash
python /opt/motia-app/bitbylaw/delete_employee_locks.py
```
## Future Enhancements
### Planned
2. **3CX Full Integration**: Complete call handling, CTI features
3. **Vermieterhelden Lead Processing**: Automated lead routing and enrichment
4. **Horizontal Scaling**: Distributed locking (Redis Cluster)
5. **Metrics & Monitoring**: Prometheus exporters
6. **Health Checks**: `/health` endpoint via KONG
### Considered
1. **PostgreSQL Hub**: Persistent sync state (currently Redis-only)
2. **Webhook Signatures**: Validation of Vermieterhelden/3CX requests
3. **Multi-Tenant**: Support for multiple law firms
4. **KONG Plugins**: Custom plugins for business logic
## Related Documentation
- [Development Guide](DEVELOPMENT.md)
- [API Reference](API.md)
- [Configuration](CONFIGURATION.md)
- [Troubleshooting](TROUBLESHOOTING.md)
- [Deployment Guide](DEPLOYMENT.md)

# Configuration Guide
## Environment Variables
All configuration is done via environment variables. They can be set:
1. In a `.env` file (local development)
2. In the systemd service file (production)
3. By exporting them in the shell
## Advoware API Configuration
### Required Variables
```bash
# Advoware API base URL
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
# Product ID (typically 64)
ADVOWARE_PRODUCT_ID=64
# Application ID (provided by Advoware)
ADVOWARE_APP_ID=your_app_id_here
# API key (Base64-encoded, used for the HMAC-512 signature)
ADVOWARE_API_KEY=your_base64_encoded_key_here
# Law firm (Kanzlei) identifier
ADVOWARE_KANZLEI=your_kanzlei_name
# Database name
ADVOWARE_DATABASE=your_database_name
# User for API access
ADVOWARE_USER=api_user
# User role (typically 2)
ADVOWARE_ROLE=2
# User password
ADVOWARE_PASSWORD=secure_password_here
# Token lifetime in minutes (default: 55)
ADVOWARE_TOKEN_LIFETIME_MINUTES=55
# API timeout in seconds (default: 30)
ADVOWARE_API_TIMEOUT_SECONDS=30
# Write protection (true = no write access to Advoware)
ADVOWARE_WRITE_PROTECTION=true
```
### Advoware API Key
The API key must be Base64-encoded for the HMAC-512 signature:
```bash
# If you have a raw key, encode it:
echo -n "your_raw_key" | base64
```
## Redis Configuration
```bash
# Redis host (default: localhost)
REDIS_HOST=localhost
# Redis port (default: 6379)
REDIS_PORT=6379
# Redis database for the Advoware cache (default: 1)
REDIS_DB_ADVOWARE_CACHE=1
# Redis database for calendar sync (default: 2)
REDIS_DB_CALENDAR_SYNC=2
# Redis timeout in seconds (default: 5)
REDIS_TIMEOUT_SECONDS=5
```
### Redis Database Layout
- **DB 0**: Motia framework (not configurable)
- **DB 1**: Advoware Cache & Locks (`REDIS_DB_ADVOWARE_CACHE`)
- Token Cache
- Employee Locks
- Webhook Deduplication
- **DB 2**: Calendar Sync Rate Limiting (`REDIS_DB_CALENDAR_SYNC`)
---
## KONG API Gateway Configuration
```bash
# KONG Admin API URL (for configuration)
KONG_ADMIN_URL=http://localhost:8001
# KONG proxy URL (publicly reachable)
KONG_PROXY_URL=https://api.bitbylaw.com
```
**Note**: KONG is typically configured via the Admin API or declarative config (kong.yml).
---
## 3CX Telephony Configuration
```bash
# 3CX API base URL
THREECX_API_URL=https://ralup.my3cx.de/api/v1
# 3CX API key for authentication
THREECX_API_KEY=your_3cx_api_key_here
# 3CX webhook secret (optional, for signature validation)
THREECX_WEBHOOK_SECRET=your_webhook_secret_here
```
### 3CX Setup
1. Create an API key in the 3CX Management Console
2. Configure the webhook URLs in 3CX:
   - Call Started: `https://api.bitbylaw.com/telephony/3cx/webhook`
   - Call Ended: `https://api.bitbylaw.com/telephony/3cx/webhook`
3. Enable call recording (optional)
---
## Vermieterhelden Integration Configuration
```bash
# Vermieterhelden webhook secret (for signature validation)
VH_WEBHOOK_SECRET=your_vermieterhelden_webhook_secret
# Lead routing target (where leads are sent)
VH_LEAD_TARGET=espocrm # Options: espocrm, advoware, both
# Lead auto-assignment (optional)
VH_AUTO_ASSIGN_LEADS=true
VH_DEFAULT_ASSIGNEE=user_id_123
```
### Vermieterhelden Setup
1. Configure the webhook URL in WordPress:
   - URL: `https://api.bitbylaw.com/leads/vermieterhelden`
2. Generate a shared secret
3. Enable webhook events for lead creation
---
## Google Calendar Configuration
```bash
# Path to the service account JSON file
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json
# Google Calendar scopes (default: calendar)
# GOOGLE_CALENDAR_SCOPES is set in code; no ENV variable needed
```
### Service Account Setup
1. Create a service account in the Google Cloud Console
2. Download the JSON key file
3. Save it as `service-account.json`
4. Set secure permissions:
```bash
chmod 600 /opt/motia-app/service-account.json
chown www-data:www-data /opt/motia-app/service-account.json
```
See also: [GOOGLE_SETUP_README.md](../GOOGLE_SETUP_README.md)
## PostgreSQL Configuration
**Status**: Currently unused (future extension)
```bash
# PostgreSQL Host
POSTGRES_HOST=localhost
# PostgreSQL User
POSTGRES_USER=calendar_sync_user
# PostgreSQL Password
POSTGRES_PASSWORD=secure_password
# PostgreSQL Database Name
POSTGRES_DB_NAME=calendar_sync_db
```
## Calendar Sync Configuration
```bash
# Anonymize Google events (true/false)
CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS=true
# Debug: only sync specific employees (comma-separated)
# Empty = all employees
CALENDAR_SYNC_DEBUG_KUERZEL=SB,AI,RO,OK,BI,ST,UR,PB,VB
```
### Anonymization
If `CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS=true`:
- Title: "Blocked"
- Description: empty
- Location: empty
If `false`:
- Full details from Advoware are synced
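A sketch of this rule applied to a Google event dict (`summary`/`description`/`location` follow the Google Calendar event schema; the exact implementation may differ):

```python
def anonymize_event(event: dict, anonymize: bool) -> dict:
    """Apply the CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS rule to an event dict."""
    if not anonymize:
        return event  # full Advoware details are synced
    # Keep timing fields, blank out anything identifying
    return {**event, "summary": "Blocked", "description": "", "location": ""}
```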
### Debug Mode
For development/testing, sync only specific employees:
```bash
# Only these initials
CALENDAR_SYNC_DEBUG_KUERZEL=SB,AI
# All (default)
CALENDAR_SYNC_DEBUG_KUERZEL=
```
## EspoCRM Configuration
```bash
# API key for webhook validation (optional)
ESPOCRM_MARVIN_API_KEY=your_webhook_secret_here
```
**Note**: The API key is currently not used for validation. A future implementation may add HMAC signature validation.
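Such a validation could look like this sketch; the hash algorithm (SHA-256 here) and header handling are assumptions:

```python
import hashlib
import hmac

def verify_webhook_signature(secret: str, body: bytes, signature_header: str) -> bool:
    """Hypothetical HMAC check for a future webhook signature validation."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels
    return hmac.compare_digest(expected, signature_header)
```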
## Motia Framework Configuration
```bash
# Node environment (development|production)
NODE_ENV=production
# Node memory limit (in MB)
# NODE_OPTIONS is set in systemd
NODE_OPTIONS=--max-old-space-size=8192 --inspect --heapsnapshot-signal=SIGUSR2
# Host binding (0.0.0.0 = all interfaces)
HOST=0.0.0.0
# Port (default: 3000)
# PORT=3000
# Log level (debug|info|warning|error)
MOTIA_LOG_LEVEL=debug
# npm cache (for the systemd user www-data)
NPM_CONFIG_CACHE=/opt/motia-app/.npm-cache
```
## Configuration Loading
### config.py
Central configuration is loaded in `config.py`:
```python
from dotenv import load_dotenv
import os
# Load .env file if exists
load_dotenv()
class Config:
# All variables with defaults
REDIS_HOST = os.getenv('REDIS_HOST', 'localhost')
REDIS_PORT = int(os.getenv('REDIS_PORT', '6379'))
# ...
```
### Usage in Steps
```python
from config import Config
# Access configuration
redis_host = Config.REDIS_HOST
api_key = Config.ADVOWARE_API_KEY
```
### Usage in Services
```python
from config import Config
class AdvowareAPI:
def __init__(self):
self.api_key = Config.ADVOWARE_API_KEY
self.base_url = Config.ADVOWARE_API_BASE_URL
```
## Environment-Specific Configuration
### Development (.env)
Create a `.env` file in the project root:
```bash
# .env (do not commit to Git!)
ADVOWARE_API_BASE_URL=https://staging.advo-net.net:90/
ADVOWARE_API_KEY=dev_key_here
REDIS_HOST=localhost
MOTIA_LOG_LEVEL=debug
ADVOWARE_WRITE_PROTECTION=true
```
**Important**: Add `.env` to `.gitignore`!
### Production (systemd)
In `/etc/systemd/system/motia.service`:
```ini
[Service]
Environment=NODE_ENV=production
Environment=ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
Environment=ADVOWARE_API_KEY=production_key_here
Environment=ADVOWARE_PASSWORD=production_password_here
Environment=REDIS_HOST=localhost
Environment=MOTIA_LOG_LEVEL=info
Environment=ADVOWARE_WRITE_PROTECTION=false
```
After changes:
```bash
sudo systemctl daemon-reload
sudo systemctl restart motia.service
```
### Staging
Use a separate service file or a separate environment file.
## Validation
### Check Configuration
Script to validate the configuration:
```python
# scripts/check_config.py
from config import Config
import sys
required_vars = [
'ADVOWARE_API_BASE_URL',
'ADVOWARE_APP_ID',
'ADVOWARE_API_KEY',
'REDIS_HOST',
]
missing = []
for var in required_vars:
if not getattr(Config, var, None):
missing.append(var)
if missing:
print(f"ERROR: Missing configuration: {', '.join(missing)}")
sys.exit(1)
print("✓ Configuration valid")
```
Run:
```bash
python scripts/check_config.py
```
## Secrets Management
### DO NOT
❌ Commit secrets to Git
❌ Hardcode passwords in code
❌ Share `.env` files
❌ Log sensitive data
### DO
✅ Use environment variables
✅ Use `.gitignore` for `.env`
✅ Use systemd for production secrets
✅ Rotate keys regularly
✅ Use `chmod 600` for sensitive files
### Rotation
When rotating API keys:
```bash
# 1. Update environment variable
sudo nano /etc/systemd/system/motia.service
# 2. Reload systemd
sudo systemctl daemon-reload
# 3. Clear Redis cache
redis-cli -n 1 DEL advoware_access_token advoware_token_timestamp
# 4. Restart service
sudo systemctl restart motia.service
# 5. Verify
sudo journalctl -u motia.service -f
```
## Configuration Reference
### Complete Example
```bash
# Advoware API
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_PRODUCT_ID=64
ADVOWARE_APP_ID=your_app_id
ADVOWARE_API_KEY=your_base64_key
ADVOWARE_KANZLEI=your_kanzlei
ADVOWARE_DATABASE=your_db
ADVOWARE_USER=api_user
ADVOWARE_ROLE=2
ADVOWARE_PASSWORD=your_password
ADVOWARE_TOKEN_LIFETIME_MINUTES=55
ADVOWARE_API_TIMEOUT_SECONDS=30
ADVOWARE_WRITE_PROTECTION=true
# Redis
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
REDIS_DB_CALENDAR_SYNC=2
REDIS_TIMEOUT_SECONDS=5
# Google Calendar
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json
# Calendar Sync
CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS=true
CALENDAR_SYNC_DEBUG_KUERZEL=
# PostgreSQL (optional)
POSTGRES_HOST=localhost
POSTGRES_USER=calendar_sync_user
POSTGRES_PASSWORD=your_pg_password
POSTGRES_DB_NAME=calendar_sync_db
# EspoCRM
ESPOCRM_MARVIN_API_KEY=your_webhook_key
# Motia
NODE_ENV=production
HOST=0.0.0.0
MOTIA_LOG_LEVEL=info
```
## Troubleshooting
### "Configuration not found"
```bash
# Check if .env exists
ls -la .env
# Check environment variables
env | grep ADVOWARE
# Check systemd environment
systemctl show motia.service -p Environment
```
### "Redis connection failed"
```bash
# Check Redis is running
sudo systemctl status redis-server
# Test connection
redis-cli -h $REDIS_HOST -p $REDIS_PORT ping
# Check config
echo "REDIS_HOST: $REDIS_HOST"
echo "REDIS_PORT: $REDIS_PORT"
```
### "API authentication failed"
```bash
# Check if API key is valid Base64
echo $ADVOWARE_API_KEY | base64 -d
# Clear token cache
redis-cli -n 1 DEL advoware_access_token
# Check logs
sudo journalctl -u motia.service | grep -i "token\|auth"
```
## Related Documentation
- [Development Guide](DEVELOPMENT.md)
- [Deployment Guide](DEPLOYMENT.md)
- [Troubleshooting](TROUBLESHOOTING.md)
- [Google Setup](../GOOGLE_SETUP_README.md)

bitbylaw/docs/DEPLOYMENT.md
# Deployment Guide
## Production Deployment
### Prerequisites
- Root/sudo access to the server
- Ubuntu/Debian Linux (tested on Ubuntu 22.04+)
- Internet access for package installation
### Installation Steps
#### 1. System Dependencies
```bash
# Update system
sudo apt-get update
sudo apt-get upgrade -y
# Install Node.js 18.x
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs
# Install Python 3.13
sudo apt-get install -y python3.13 python3.13-venv python3.13-dev
# Install Redis
sudo apt-get install -y redis-server
# Install Git
sudo apt-get install -y git
# Start Redis
sudo systemctl enable redis-server
sudo systemctl start redis-server
```
#### 2. Application Setup
```bash
# Create application directory
sudo mkdir -p /opt/motia-app
cd /opt/motia-app
# Clone the repository (or rsync from development)
git clone <repository-url> bitbylaw
cd bitbylaw
# Create www-data user if not exists
sudo useradd -r -s /bin/bash www-data || true
# Set ownership
sudo chown -R www-data:www-data /opt/motia-app
```
#### 3. Node.js Dependencies
```bash
# As the www-data user
sudo -u www-data bash
cd /opt/motia-app/bitbylaw
# Install Node.js packages
npm install
# Build TypeScript (if needed)
npm run build
```
#### 4. Python Dependencies
```bash
# As the www-data user
cd /opt/motia-app/bitbylaw
# Create virtual environment
python3.13 -m venv python_modules
# Activate
source python_modules/bin/activate
# Install dependencies
pip install -r requirements.txt
# Deactivate
deactivate
```
#### 5. Service Account Setup
```bash
# Copy service account JSON
sudo cp service-account.json /opt/motia-app/service-account.json
# Set secure permissions
sudo chmod 600 /opt/motia-app/service-account.json
sudo chown www-data:www-data /opt/motia-app/service-account.json
```
See also: [GOOGLE_SETUP_README.md](../GOOGLE_SETUP_README.md)
#### 6. systemd Service
Create `/etc/systemd/system/motia.service`:
```ini
[Unit]
Description=Motia Backend Framework
After=network.target redis-server.service
[Service]
Type=simple
User=www-data
WorkingDirectory=/opt/motia-app/bitbylaw
# Environment Variables
Environment=NODE_ENV=production
Environment=NODE_OPTIONS=--max-old-space-size=8192 --inspect --heapsnapshot-signal=SIGUSR2
Environment=HOST=0.0.0.0
Environment=MOTIA_LOG_LEVEL=info
Environment=NPM_CONFIG_CACHE=/opt/motia-app/.npm-cache
# Advoware Configuration (ADJUST VALUES!)
Environment=ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
Environment=ADVOWARE_PRODUCT_ID=64
Environment=ADVOWARE_APP_ID=your_app_id
Environment=ADVOWARE_API_KEY=your_api_key_base64
Environment=ADVOWARE_KANZLEI=your_kanzlei
Environment=ADVOWARE_DATABASE=your_database
Environment=ADVOWARE_USER=your_user
Environment=ADVOWARE_ROLE=2
Environment=ADVOWARE_PASSWORD=your_password
Environment=ADVOWARE_WRITE_PROTECTION=false
# Redis Configuration
Environment=REDIS_HOST=localhost
Environment=REDIS_PORT=6379
Environment=REDIS_DB_ADVOWARE_CACHE=1
Environment=REDIS_DB_CALENDAR_SYNC=2
# Google Calendar
Environment=GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json
# EspoCRM (if used)
Environment=ESPOCRM_MARVIN_API_KEY=your_webhook_key
# Start Command
ExecStart=/bin/bash -c 'source /opt/motia-app/python_modules/bin/activate && /usr/bin/npm start'
# Restart Policy
Restart=always
RestartSec=10
# Security
NoNewPrivileges=true
PrivateTmp=true
[Install]
WantedBy=multi-user.target
```
**IMPORTANT**: Adjust all `your_*` values!
#### 7. Enable and Start Service
```bash
# Reload systemd
sudo systemctl daemon-reload
# Enable service (autostart)
sudo systemctl enable motia.service
# Start service
sudo systemctl start motia.service
# Check status
sudo systemctl status motia.service
```
#### 8. Verify Installation
```bash
# Check logs
sudo journalctl -u motia.service -f
# Test API
curl http://localhost:3000/health # (if implemented)
# Test Advoware Proxy
curl "http://localhost:3000/advoware/proxy?endpoint=employees"
```
## Reverse Proxy Setup (nginx)
### Install nginx
```bash
sudo apt-get install -y nginx
```
### Configure
`/etc/nginx/sites-available/motia`:
```nginx
upstream motia_backend {
server 127.0.0.1:3000;
}
server {
listen 80;
server_name your-domain.com;
# Redirect to HTTPS
return 301 https://$server_name$request_uri;
}
server {
listen 443 ssl http2;
server_name your-domain.com;
# SSL Configuration (Let's Encrypt)
ssl_certificate /etc/letsencrypt/live/your-domain.com/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/your-domain.com/privkey.pem;
# Security Headers
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-XSS-Protection "1; mode=block" always;
# Proxy Settings
location / {
proxy_pass http://motia_backend;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
# Timeouts
proxy_connect_timeout 60s;
proxy_send_timeout 60s;
proxy_read_timeout 60s;
}
# Access Log
access_log /var/log/nginx/motia-access.log;
error_log /var/log/nginx/motia-error.log;
}
```
### Enable and Restart
```bash
# Enable site
sudo ln -s /etc/nginx/sites-available/motia /etc/nginx/sites-enabled/
# Test configuration
sudo nginx -t
# Restart nginx
sudo systemctl restart nginx
```
### SSL Certificate (Let's Encrypt)
```bash
# Install certbot
sudo apt-get install -y certbot python3-certbot-nginx
# Obtain certificate
sudo certbot --nginx -d your-domain.com
# Auto-renewal is configured automatically
```
## Firewall Configuration
```bash
# Allow SSH
sudo ufw allow 22/tcp
# Allow HTTP/HTTPS (if using nginx)
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
# Enable firewall
sudo ufw enable
```
**Important**: Do NOT expose port 3000 publicly (only via the nginx reverse proxy)
## Monitoring
### systemd Service Status
```bash
# Show status
sudo systemctl status motia.service
# Enabled?
sudo systemctl is-enabled motia.service
# Active?
sudo systemctl is-active motia.service
```
### Logs
```bash
# Live logs
sudo journalctl -u motia.service -f
# Last 100 lines
sudo journalctl -u motia.service -n 100
# Since today
sudo journalctl -u motia.service --since today
# Filter by priority (error only)
sudo journalctl -u motia.service -p err
```
### Resource Usage
```bash
# CPU and Memory
sudo systemctl status motia.service
# Detailed process info
ps aux | grep motia
# Memory usage
sudo pmap $(pgrep -f "motia start") | tail -n 1
```
### Redis Monitoring
```bash
# Connect to Redis
redis-cli
# Show info
INFO
# Show database sizes
INFO keyspace
# Monitor commands (real-time)
MONITOR
# Show memory usage
MEMORY USAGE <key>
```
## Backup Strategy
### Application Code
```bash
# Git-based backup
cd /opt/motia-app/bitbylaw
git pull origin main
# Or: rsync backup
rsync -av /opt/motia-app/bitbylaw/ /backup/motia-app/
```
### Redis Data
```bash
# RDB snapshot (automatic by Redis)
# Location: /var/lib/redis/dump.rdb
# Manual backup
sudo cp /var/lib/redis/dump.rdb /backup/redis-dump-$(date +%Y%m%d).rdb
# Restore
sudo systemctl stop redis-server
sudo cp /backup/redis-dump-20260207.rdb /var/lib/redis/dump.rdb
sudo chown redis:redis /var/lib/redis/dump.rdb
sudo systemctl start redis-server
```
### Configuration
```bash
# Backup systemd service
sudo cp /etc/systemd/system/motia.service /backup/motia.service
# Backup nginx config
sudo cp /etc/nginx/sites-available/motia /backup/nginx-motia.conf
# Backup service account
sudo cp /opt/motia-app/service-account.json /backup/service-account.json.backup
```
## Updates & Maintenance
### Application Update
```bash
# 1. Pull latest code
cd /opt/motia-app/bitbylaw
sudo -u www-data git pull origin main
# 2. Update dependencies
sudo -u www-data npm install
sudo -u www-data bash -c 'source python_modules/bin/activate && pip install -r requirements.txt'
# 3. Restart service
sudo systemctl restart motia.service
# 4. Verify
sudo journalctl -u motia.service -f
```
### Zero-Downtime Deployment
For a future implementation with blue-green deployment:
```bash
# 1. Deploy to staging directory
# 2. Run health checks
# 3. Switch symlink
# 4. Reload service
# 5. Rollback if issues
```
### Database Migrations
**Current**: No database migrations (Redis only)
**Future** (PostgreSQL):
```bash
# Run migrations
python manage.py migrate
```
## Security Hardening
### File Permissions
```bash
# Application files
sudo chown -R www-data:www-data /opt/motia-app
sudo chmod 755 /opt/motia-app
sudo chmod 755 /opt/motia-app/bitbylaw
# Service account
sudo chmod 600 /opt/motia-app/service-account.json
sudo chown www-data:www-data /opt/motia-app/service-account.json
# No world-readable secrets
sudo find /opt/motia-app -type f -name "*.json" -exec chmod 600 {} \;
```
### Redis Security
```bash
# Edit Redis config
sudo nano /etc/redis/redis.conf
# Bind to localhost only
bind 127.0.0.1 ::1
# Disable dangerous commands (optional)
rename-command FLUSHDB ""
rename-command FLUSHALL ""
rename-command CONFIG ""
# Restart Redis
sudo systemctl restart redis-server
```
### systemd Hardening
Already included in the service file:
- `NoNewPrivileges=true` - prevents privilege escalation
- `PrivateTmp=true` - isolated /tmp
- User: `www-data` (non-root)
Further options:
```ini
[Service]
ProtectSystem=strict
ProtectHome=true
ReadWritePaths=/opt/motia-app
```
## Disaster Recovery
### Service Crashed
```bash
# Check status
sudo systemctl status motia.service
# View logs
sudo journalctl -u motia.service -n 100
# Restart
sudo systemctl restart motia.service
# If still failing, check:
# - Redis is running
# - Service account file exists
# - Environment variables are set
```
### Redis Data Loss
```bash
# Restore from backup
sudo systemctl stop redis-server
sudo cp /backup/redis-dump-latest.rdb /var/lib/redis/dump.rdb
sudo chown redis:redis /var/lib/redis/dump.rdb
sudo systemctl start redis-server
# Clear specific data if corrupted
redis-cli -n 1 FLUSHDB # Advoware cache
redis-cli -n 2 FLUSHDB # Calendar sync
```
### Complete System Failure
```bash
# 1. Fresh server setup (see Installation Steps)
# 2. Restore application code from Git/Backup
# 3. Restore configuration (systemd, nginx)
# 4. Restore service-account.json
# 5. Restore Redis data (optional, will rebuild)
# 6. Start services
```
## Performance Tuning
### Node.js Memory
In systemd service:
```ini
# 8 GB heap (systemd does not support inline comments in Environment= lines)
Environment=NODE_OPTIONS=--max-old-space-size=8192
```
### Redis Memory
In `/etc/redis/redis.conf`:
```
maxmemory 2gb
maxmemory-policy allkeys-lru
```
### Linux Kernel
```bash
# Increase file descriptors
echo "fs.file-max = 65536" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p
# For www-data user
sudo nano /etc/security/limits.conf
# Add:
www-data soft nofile 65536
www-data hard nofile 65536
```
## Health Checks
### Automated Monitoring
A cron job for health checks:
```bash
#!/bin/bash
# /usr/local/bin/motia-health-check.sh
if ! systemctl is-active --quiet motia.service; then
echo "Motia service is down!" | mail -s "ALERT: Motia Down" admin@example.com
systemctl start motia.service
fi
```
```bash
# Add to crontab
sudo crontab -e
# Add line:
*/5 * * * * /usr/local/bin/motia-health-check.sh
```
### External Monitoring
Services such as Uptime Robot, Pingdom, etc. can be used:
- HTTP endpoint: `https://your-domain.com/health`
- Check interval: 5 minutes
- Alert via Email/SMS
## Rollback Procedure
```bash
# 1. Stop current service
sudo systemctl stop motia.service
# 2. Revert to previous version
cd /opt/motia-app/bitbylaw
sudo -u www-data git log # Find previous commit
sudo -u www-data git reset --hard <commit-hash>
# 3. Restore dependencies (if needed)
sudo -u www-data npm install
# 4. Start service
sudo systemctl start motia.service
# 5. Verify
sudo journalctl -u motia.service -f
```
## Related Documentation
- [Architecture](ARCHITECTURE.md)
- [Configuration](CONFIGURATION.md)
- [Troubleshooting](TROUBLESHOOTING.md)

# Development Guide
## Setup
### Prerequisites
- **Node.js**: 18.x or higher
- **Python**: 3.13 or higher
- **Redis**: 6.x or higher
- **Git**: for version control
- **Motia CLI**: installed automatically via npm
### Initial Setup
```bash
# 1. Navigate to the repository
cd /opt/motia-app/bitbylaw
# 2. Install Node.js dependencies
npm install
# 3. Create the Python virtual environment (if not present)
python3.13 -m venv python_modules
# 4. Activate the Python virtual environment
source python_modules/bin/activate
# 5. Install Python dependencies
pip install -r requirements.txt
# 6. Start Redis (if not running)
sudo systemctl start redis-server
# 7. Configure environment variables (see CONFIGURATION.md)
# Create a .env file or set them in systemd
# 8. Start development mode
npm run dev
```
### Development Environment
**Recommended IDE**: VS Code with extensions:
- Python (Microsoft)
- TypeScript (Built-in)
- ESLint
- Prettier
**VS Code Settings** (`.vscode/settings.json`):
```json
{
"python.defaultInterpreterPath": "${workspaceFolder}/python_modules/bin/python",
"python.linting.enabled": true,
"python.linting.pylintEnabled": false,
"python.linting.flake8Enabled": true,
"editor.formatOnSave": true,
"files.exclude": {
"**/__pycache__": true,
"**/node_modules": true
}
}
```
## Project Structure
```
bitbylaw/
├── docs/                      # Documentation
│   ├── ARCHITECTURE.md        # System architecture
│   ├── DEVELOPMENT.md         # This guide
│   ├── API.md                 # API reference
│   ├── CONFIGURATION.md       # Environment & config
│   ├── DEPLOYMENT.md          # Deployment guide
│   └── TROUBLESHOOTING.md     # Troubleshooting
├── steps/                     # Motia steps (business logic)
│   ├── advoware_proxy/        # API proxy steps
│   │   ├── README.md          # Module documentation
│   │   ├── *.py               # Step implementations
│   │   └── *.md               # Per-step documentation
│   ├── advoware_cal_sync/     # Calendar sync steps
│   │   ├── README.md
│   │   ├── *.py
│   │   └── *.md
│   └── vmh/                   # VMH webhook steps
│       ├── README.md
│       ├── webhook/           # Webhook receivers
│       └── *.py
├── services/                  # Shared services
│   └── advoware.py            # Advoware API client
├── config.py                  # Configuration loader
├── package.json               # Node.js dependencies
├── requirements.txt           # Python dependencies
├── tsconfig.json              # TypeScript config
├── motia-workbench.json       # Motia flow definitions
└── README.md                  # Project overview
```
### Conventions
**Directories**:
- `steps/` - Motia steps (handler functions)
- `services/` - Reusable service layer
- `docs/` - Documentation
- `python_modules/` - Python virtual environment (do not commit)
- `node_modules/` - Node.js dependencies (do not commit)
**File names**:
- Steps: `{module}_{action}_step.py` (e.g. `calendar_sync_cron_step.py`)
- Services: `{service_name}.py` (e.g. `advoware.py`)
- Documentation: `{STEP_NAME}.md` or `{TOPIC}.md`
## Coding Standards
### Python
**Style Guide**: PEP 8 with the following adjustments:
- Line length: 120 characters (instead of 79)
- String quotes: single quotes preferred
**Linting**:
```bash
# Flake8 check
flake8 steps/ services/
# Autopep8 formatting
autopep8 --in-place --aggressive --aggressive steps/**/*.py
```
**Type Hints**:
```python
from typing import Dict, List, Optional, Any
async def handler(req: Dict[str, Any], context: Any) -> Dict[str, Any]:
pass
```
**Docstrings**:
```python
def function_name(param1: str, param2: int) -> bool:
"""
Brief description of function.
Args:
param1: Description of param1
param2: Description of param2
Returns:
Description of return value
Raises:
ValueError: When something goes wrong
"""
pass
```
### TypeScript/JavaScript
**Style Guide**: Standard mit Motia-Konventionen
**Formatting**: Prettier (automatisch via Motia)
### Naming Conventions
**Variables**: `snake_case` (Python), `camelCase` (TypeScript)
**Constants**: `UPPER_CASE`
**Classes**: `PascalCase`
**Functions**: `snake_case` (Python), `camelCase` (TypeScript)
**Files**: `snake_case.py`, `kebab-case.ts`
### Error Handling
**Pattern**:
```python
async def handler(req, context):
try:
# Main logic
result = await some_operation()
return {'status': 200, 'body': {'result': result}}
except SpecificError as e:
# Handle known errors
context.logger.error(f"Specific error: {e}")
return {'status': 400, 'body': {'error': 'Bad request'}}
except Exception as e:
# Catch-all
context.logger.error(f"Unexpected error: {e}", exc_info=True)
return {'status': 500, 'body': {'error': 'Internal error'}}
```
**Logging**:
```python
# Use context.logger for Motia Workbench integration
context.logger.debug("Detailed information")
context.logger.info("Normal operation")
context.logger.warning("Warning message")
context.logger.error("Error message", exc_info=True) # Include stack trace
```
## Motia Step Development
### Step Structure
Every Step must have:
1. **Config Dictionary**: Defines step metadata
2. **Handler Function**: Implements business logic
**Minimal Example**:
```python
config = {
'type': 'api', # api|event|cron
'name': 'My API Step',
'description': 'Brief description',
'path': '/api/my-endpoint', # For API steps
'method': 'GET', # For API steps
'schedule': '0 2 * * *', # For cron steps
'emits': ['topic.name'], # Events this step emits
'subscribes': ['other.topic'], # Events this step subscribes to (event steps)
'flows': ['my-flow'] # Flow membership
}
async def handler(req, context):
"""Handler function - must be async."""
# req: Request object (API) or Event data (event step)
# context: Motia context (logger, emit, etc.)
# Business logic here
# For API steps: return HTTP response
return {'status': 200, 'body': {'result': 'success'}}
# For event steps: no return value (or None)
```
### Step Types
**1. API Steps** (`type: 'api'`):
```python
config = {
'type': 'api',
'name': 'My Endpoint',
'path': '/api/resource',
'method': 'POST',
'emits': [],
'flows': ['main']
}
async def handler(req, context):
# Access request data
body = req.get('body')
query_params = req.get('queryParams')
headers = req.get('headers')
# Return HTTP response
return {
'status': 200,
'body': {'data': 'response'},
'headers': {'X-Custom': 'value'}
}
```
**2. Event Steps** (`type: 'event'`):
```python
config = {
'type': 'event',
'name': 'Process Event',
'subscribes': ['my.topic'],
'emits': ['other.topic'],
'flows': ['main']
}
async def handler(event_data, context):
# Process event
entity_id = event_data.get('entity_id')
# Emit new event
await context.emit({
'topic': 'other.topic',
'data': {'processed': True}
})
# No return value needed
```
**3. Cron Steps** (`type: 'cron'`):
```python
config = {
'type': 'cron',
'name': 'Daily Job',
'schedule': '0 2 * * *', # Cron expression
'emits': ['job.complete'],
'flows': ['main']
}
async def handler(req, context):
# Scheduled logic
context.logger.info("Cron job triggered")
# Emit event to start pipeline
await context.emit({
'topic': 'job.complete',
'data': {}
})
```
### Context API
**Available Methods**:
```python
# Logging
context.logger.debug(msg)
context.logger.info(msg)
context.logger.warning(msg)
context.logger.error(msg, exc_info=True)
# Event Emission
await context.emit({
'topic': 'my.topic',
'data': {'key': 'value'}
})
# Flow information
context.flow_id # Current flow ID
context.step_name # Current step name
```
## Testing
### Unit Tests
**Location**: tests live next to the code (e.g. `*_test.py`)
**Framework**: pytest
```python
# test_my_step.py
import pytest
from unittest.mock import AsyncMock, MagicMock
from my_step import handler, config
@pytest.mark.asyncio
async def test_handler_success():
# Arrange
req = {'body': {'key': 'value'}}
context = MagicMock()
context.logger = MagicMock()
# Act
result = await handler(req, context)
# Assert
assert result['status'] == 200
assert 'result' in result['body']
```
**Run Tests**:
```bash
pytest steps/
```
### Integration Tests
**Manual testing with curl**:
```bash
# Test an API step
curl -X POST "http://localhost:3000/api/my-endpoint" \
-H "Content-Type: application/json" \
-d '{"key": "value"}'
# With query parameters
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees"
```
**Motia Workbench**: use the Workbench UI for testing and debugging
### Test Data
**Redis Mock Data**:
```bash
# Set test token
redis-cli -n 1 SET advoware_access_token "test_token" EX 3600
# Set test lock
redis-cli -n 1 SET "calendar_sync:lock:TEST" "1" EX 300
# Check dedup set
redis-cli -n 1 SMEMBERS "vmh:beteiligte:create_pending"
```
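The `calendar_sync:lock:*` keys above act as per-employee mutexes. A minimal sketch of how such a lock can be taken atomically (the function names are illustrative; in real use, `client` would be a `redis.Redis` connection to DB 1):

```python
from typing import Any

LOCK_TTL_SECONDS = 300  # same TTL as the redis-cli example above

def acquire_sync_lock(client: Any, kuerzel: str, ttl: int = LOCK_TTL_SECONDS) -> bool:
    """Atomically take the per-employee lock; False means a sync is already running."""
    # SET ... NX EX is a single atomic command, so two concurrent syncs cannot both win.
    return bool(client.set(f'calendar_sync:lock:{kuerzel}', '1', nx=True, ex=ttl))

def release_sync_lock(client: Any, kuerzel: str) -> None:
    """Drop the lock once the sync finishes (the TTL covers crashed workers)."""
    client.delete(f'calendar_sync:lock:{kuerzel}')
```

The TTL is the safety net: if a worker crashes without releasing, the lock expires on its own after five minutes instead of blocking all future syncs.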
## Debugging
### Local Development
**Start in Dev Mode**:
```bash
npm run dev
```
**Enable Debug Logging**:
```bash
export MOTIA_LOG_LEVEL=debug
npm start
```
**Node.js Inspector**:
```bash
# Already enabled in systemd (--inspect)
# Connect with Chrome DevTools: chrome://inspect
```
### Motia Workbench
**Access**: `http://localhost:3000/workbench` (if available)
**Features**:
- Live logs
- Flow visualization
- Event traces
- Step execution history
### Redis Debugging
```bash
# Connect to Redis
redis-cli
# Switch database
SELECT 1
# List all keys
KEYS *
# Get value
GET advoware_access_token
# Check SET members
SMEMBERS vmh:beteiligte:create_pending
# Monitor live commands
MONITOR
```
---
## Utility Scripts
### Calendar Sync Utilities
Helper scripts for maintaining and debugging the calendar sync functionality.
**Location**: `scripts/calendar_sync/`
**Available scripts**:
```bash
# Delete all employee locks in Redis (for stuck syncs)
python3 scripts/calendar_sync/delete_employee_locks.py
# Delete all Google calendars (except the primary) - CAUTION!
python3 scripts/calendar_sync/delete_all_calendars.py
```
**Use cases**:
- **Lock cleanup**: when a sync process crashed and its locks were never released
- **Calendar reset**: after a faulty synchronization or for tests
- **Debugging**: investigating sync problems
**Documentation**: [scripts/calendar_sync/README.md](../scripts/calendar_sync/README.md)
**⚠️ Important**:
- Always stop the Motia service before cleanup: `sudo systemctl stop motia`
- Restart the service after cleanup: `sudo systemctl start motia`
- `delete_all_calendars.py` irrevocably deletes all calendars!
---
### Common Issues
**1. Import Errors**:
```bash
# Ensure PYTHONPATH is set
export PYTHONPATH=/opt/motia-app/bitbylaw
source python_modules/bin/activate
```
**2. Redis Connection Errors**:
```bash
# Check Redis is running
sudo systemctl status redis-server
# Test connection
redis-cli ping
```
**3. Token Errors**:
```bash
# Clear cached token
redis-cli -n 1 DEL advoware_access_token advoware_token_timestamp
```
## Git Workflow
### Branch Strategy
- `main` - Production code
- `develop` - Integration branch
- `feature/*` - Feature branches
- `fix/*` - Bugfix branches
### Commit Messages
**Format**: `<type>: <subject>`
**Types**:
- `feat`: New feature
- `fix`: Bug fix
- `docs`: Documentation only
- `refactor`: Code refactoring
- `test`: Adding tests
- `chore`: Maintenance tasks
**Examples**:
```
feat: add calendar sync retry logic
fix: prevent duplicate webhook processing
docs: update API documentation
refactor: extract common validation logic
```
### Pull Request Process
1. Create feature branch from `develop`
2. Implement changes
3. Write/update tests
4. Update documentation
5. Create PR with description
6. Code review
7. Merge to `develop`
8. Deploy to staging
9. Merge to `main` (production)
## Performance Optimization
### Profiling
**Python Memory Profiling**:
```bash
# Install memory_profiler
pip install memory_profiler
# Profile a function
python -m memory_profiler steps/my_step.py
```
**Node.js Profiling**:
```bash
# Already enabled with --inspect flag
# Use Chrome DevTools Performance tab
```
### Best Practices
**Async/Await**:
```python
# Good: Concurrent requests
results = await asyncio.gather(
fetch_data_1(),
fetch_data_2()
)
# Bad: Sequential (slow)
result1 = await fetch_data_1()
result2 = await fetch_data_2()
```
**Redis Pipelining**:
```python
# Good: Batch operations
pipe = redis.pipeline()
pipe.get('key1')
pipe.get('key2')
results = pipe.execute()
# Bad: Multiple round-trips
val1 = redis.get('key1')
val2 = redis.get('key2')
```
**Avoid N+1 Queries**:
```python
# Good: Batch fetch
employee_ids = [1, 2, 3]
employees = await advoware.api_call(
'/employees',
params={'ids': ','.join(map(str, employee_ids))}
)
# Bad: Loop with API calls
employees = []
for emp_id in employee_ids:
emp = await advoware.api_call(f'/employees/{emp_id}')
employees.append(emp)
```
## Code Review Checklist
- [ ] Code follows style guide
- [ ] Type hints present (Python)
- [ ] Error handling implemented
- [ ] Logging added at key points
- [ ] Tests written/updated
- [ ] Documentation updated
- [ ] No secrets in code
- [ ] Performance considered
- [ ] Redis keys documented
- [ ] Events documented
## Deployment
See [DEPLOYMENT.md](DEPLOYMENT.md) for detailed deployment instructions.
**Quick Deploy to Production**:
```bash
# 1. Pull latest code
git pull origin main
# 2. Install dependencies
npm install
pip install -r requirements.txt
# 3. Restart service
sudo systemctl restart motia.service
# 4. Check status
sudo systemctl status motia.service
# 5. Monitor logs
sudo journalctl -u motia.service -f
```
## Resources
### Documentation
- [Motia Framework](https://motia.dev)
- [Advoware API](docs/advoware/) (internal)
- [Google Calendar API](https://developers.google.com/calendar)
### Tools
- [Redis Commander](http://localhost:8081) (if installed)
- [Motia Workbench](http://localhost:3000/workbench)
### Team Contacts
- Architecture Questions: [Lead Developer]
- Deployment Issues: [DevOps Team]
- API Access: [Integration Team]

bitbylaw/docs/INDEX.md
# Documentation Index
## Getting Started
**New to the project?** Start here:
1. [README.md](../README.md) - Project Overview & Quick Start
2. [DEVELOPMENT.md](DEVELOPMENT.md) - Setup Development Environment
3. [CONFIGURATION.md](CONFIGURATION.md) - Configure Environment Variables
## Core Documentation
### For Developers
- **[DEVELOPMENT.md](DEVELOPMENT.md)** - Complete development guide
- Setup, Coding Standards, Testing, Debugging
- **[ARCHITECTURE.md](ARCHITECTURE.md)** - System design and architecture
- Components, Data Flow, Event-Driven Design
- **[API.md](API.md)** - HTTP Endpoints and Event Topics
- Proxy API, Calendar Sync API, Webhook Endpoints
### For Operations
- **[DEPLOYMENT.md](DEPLOYMENT.md)** - Production deployment
- Installation, systemd, nginx, Monitoring
- **[CONFIGURATION.md](CONFIGURATION.md)** - Environment configuration
- All environment variables, secrets management
- **[TROUBLESHOOTING.md](TROUBLESHOOTING.md)** - Problem solving
- Common issues, debugging, log analysis
### Special Topics
- **[GOOGLE_SETUP.md](GOOGLE_SETUP.md)** - Google Service Account setup
- Step-by-step guide for Calendar API access
## Component Documentation
### Steps (Business Logic)
**Advoware Proxy** ([Module README](../steps/advoware_proxy/README.md)):
- [advoware_api_proxy_get_step.md](../steps/advoware_proxy/advoware_api_proxy_get_step.md)
- [advoware_api_proxy_post_step.md](../steps/advoware_proxy/advoware_api_proxy_post_step.md)
- [advoware_api_proxy_put_step.md](../steps/advoware_proxy/advoware_api_proxy_put_step.md)
- [advoware_api_proxy_delete_step.md](../steps/advoware_proxy/advoware_api_proxy_delete_step.md)
**Calendar Sync** ([Module README](../steps/advoware_cal_sync/README.md)):
- [calendar_sync_cron_step.md](../steps/advoware_cal_sync/calendar_sync_cron_step.md) - Daily trigger
- [calendar_sync_api_step.md](../steps/advoware_cal_sync/calendar_sync_api_step.md) - Manual trigger
- [calendar_sync_all_step.md](../steps/advoware_cal_sync/calendar_sync_all_step.md) - Employee cascade
- [calendar_sync_event_step.md](../steps/advoware_cal_sync/calendar_sync_event_step.md) - Per-employee sync (complex)
**VMH Webhooks** ([Module README](../steps/vmh/README.md)):
- [beteiligte_create_api_step.md](../steps/vmh/webhook/beteiligte_create_api_step.md) - Create webhook
- [beteiligte_update_api_step.md](../steps/vmh/webhook/beteiligte_update_api_step.md) - Update webhook (similar)
- [beteiligte_delete_api_step.md](../steps/vmh/webhook/beteiligte_delete_api_step.md) - Delete webhook (similar)
- [beteiligte_sync_event_step.md](../steps/vmh/beteiligte_sync_event_step.md) - Sync handler (placeholder)
### Services
- [Advoware Service](../services/ADVOWARE_SERVICE.md) - API client with HMAC-512 auth
- [Advoware API Swagger](advoware/advoware_api_swagger.json) - Complete API documentation (JSON)
### Utility Scripts
- [Calendar Sync Scripts](../scripts/calendar_sync/README.md) - Maintenance and debugging
- `delete_employee_locks.py` - Redis Lock Cleanup
- `delete_all_calendars.py` - Google Calendar Reset
---
## Documentation Structure
```
docs/
├── INDEX.md # This file
├── ARCHITECTURE.md # System design
├── API.md # API reference
├── CONFIGURATION.md # Configuration
├── DEPLOYMENT.md # Deployment guide
├── DEVELOPMENT.md # Development guide
├── GOOGLE_SETUP.md # Google Calendar setup
├── TROUBLESHOOTING.md # Debugging guide
└── advoware/
└── advoware_api_swagger.json # Advoware API spec
steps/{module}/
├── README.md # Module overview
└── {step_name}.md # Step documentation
services/
└── {service_name}.md # Service documentation
scripts/{category}/
├── README.md # Script documentation
└── *.py # Utility scripts
```
## Documentation Standards
### YAML Frontmatter
Each step documentation includes metadata:
```yaml
---
type: step
category: api|event|cron
name: Step Name
version: 1.0.0
status: active|deprecated|placeholder
tags: [tag1, tag2]
dependencies: [...]
emits: [...]
subscribes: [...]
---
```
### Sections
Standard sections in step documentation:
1. **Zweck** - Purpose (one sentence)
2. **Config** - Motia step configuration
3. **Input** - Request structure, parameters
4. **Output** - Response structure
5. **Verhalten** - Behavior, logic flow
6. **Abhängigkeiten** - Dependencies (services, Redis, APIs)
7. **Testing** - Test examples
8. **KI Guidance** - Tips for AI assistants
### Cross-References
- Use relative paths for links
- Link related steps and services
- Link to parent module READMEs
## Quick Reference
### Common Tasks
| Task | Documentation |
|------|---------------|
| Setup development environment | [DEVELOPMENT.md](DEVELOPMENT.md#setup) |
| Configure environment variables | [CONFIGURATION.md](CONFIGURATION.md) |
| Deploy to production | [DEPLOYMENT.md](DEPLOYMENT.md#installation-steps) |
| Setup Google Calendar | [GOOGLE_SETUP.md](GOOGLE_SETUP.md) |
| Debug service issues | [TROUBLESHOOTING.md](TROUBLESHOOTING.md#service-issues) |
| Understand architecture | [ARCHITECTURE.md](ARCHITECTURE.md) |
| Test API endpoints | [API.md](API.md) |
### Code Locations
| Component | Location | Documentation |
|-----------|----------|---------------|
| API Proxy Steps | `steps/advoware_proxy/` | [README](../steps/advoware_proxy/README.md) |
| Calendar Sync Steps | `steps/advoware_cal_sync/` | [README](../steps/advoware_cal_sync/README.md) |
| VMH Webhook Steps | `steps/vmh/` | [README](../steps/vmh/README.md) |
| Advoware API Client | `services/advoware.py` | [DOC](../services/ADVOWARE_SERVICE.md) |
| Configuration | `config.py` | [CONFIGURATION.md](CONFIGURATION.md) |
## Contributing to Documentation
### Adding New Step Documentation
1. Create `{step_name}.md` next to `.py` file
2. Use YAML frontmatter (see template)
3. Follow standard sections
4. Add to module README
5. Add to this INDEX
### Updating Documentation
- Keep code and docs in sync
- Update version history in step docs
- Update INDEX when adding new files
- Test all code examples
### Documentation Reviews
- Verify all links work
- Check code examples execute correctly
- Ensure terminology is consistent
- Validate configuration examples
## External Resources
- [Motia Framework Docs](https://motia.dev) (if available)
- [Advoware API](https://www2.advo-net.net:90/) (requires auth)
- [Google Calendar API](https://developers.google.com/calendar)
- [Redis Documentation](https://redis.io/documentation)
## Support
- **Questions**: Check TROUBLESHOOTING.md first
- **Bugs**: Document in logs (`journalctl -u motia.service`)
- **Features**: Propose in team discussions
- **Urgent**: Check systemd logs and Redis state

# Troubleshooting Guide
## Service Issues
### Service Won't Start
**Symptoms**: `systemctl start motia.service` fails
**Diagnosis**:
```bash
# Check service status
sudo systemctl status motia.service
# View detailed logs
sudo journalctl -u motia.service -n 100 --no-pager
# Check for port conflicts
sudo netstat -tlnp | grep 3000
```
**Common causes**:
1. **Port 3000 already in use**:
```bash
# Find process
sudo lsof -i :3000
# Kill process
sudo kill -9 <PID>
```
2. **Missing dependencies**:
```bash
cd /opt/motia-app/bitbylaw
sudo -u www-data npm install
sudo -u www-data bash -c 'source python_modules/bin/activate && pip install -r requirements.txt'
```
3. **Wrong permissions**:
```bash
sudo chown -R www-data:www-data /opt/motia-app
sudo chmod 600 /opt/motia-app/service-account.json
```
4. **Missing environment variables**:
```bash
# Check systemd environment
sudo systemctl show motia.service -p Environment
# Verify required vars
sudo systemctl cat motia.service | grep Environment
```
### Service Keeps Crashing
**Symptoms**: the service starts but crashes shortly afterwards
**Diagnosis**:
```bash
# Watch logs in real-time
sudo journalctl -u motia.service -f
# Check for OOM (Out of Memory)
dmesg | grep -i "out of memory"
sudo grep -i "killed process" /var/log/syslog
```
**Solutions**:
1. **Increase the memory limit**:
```ini
# In /etc/systemd/system/motia.service
Environment=NODE_OPTIONS=--max-old-space-size=8192
```
2. **Python Memory Leak**:
```bash
# Check memory usage
ps aux | grep python
# Restart service periodically (workaround)
# Add to crontab:
0 3 * * * systemctl restart motia.service
```
3. **Unhandled Exception**:
```bash
# Check error logs
sudo journalctl -u motia.service -p err
# Add try-catch in problematic step
```
## Redis Issues
### Redis Connection Failed
**Symptoms**: "Redis connection failed" in logs
**Diagnosis**:
```bash
# Check Redis status
sudo systemctl status redis-server
# Test connection
redis-cli ping
# Check config
redis-cli CONFIG GET bind
redis-cli CONFIG GET port
```
**Solutions**:
1. **Redis not running**:
```bash
sudo systemctl start redis-server
sudo systemctl enable redis-server
```
2. **Wrong host/port**:
```bash
# Check environment
echo $REDIS_HOST
echo $REDIS_PORT
# Test connection
redis-cli -h $REDIS_HOST -p $REDIS_PORT ping
```
3. **Permission denied**:
```bash
# Check Redis log
sudo tail -f /var/log/redis/redis-server.log
# Fix permissions
sudo chown redis:redis /var/lib/redis
sudo chmod 750 /var/lib/redis
```
### Redis Out of Memory
**Symptoms**: "OOM command not allowed" errors
**Diagnosis**:
```bash
# Check memory usage
redis-cli INFO memory
# Check maxmemory setting
redis-cli CONFIG GET maxmemory
```
**Solutions**:
1. **Increase maxmemory**:
```bash
# In /etc/redis/redis.conf
maxmemory 2gb
maxmemory-policy allkeys-lru
sudo systemctl restart redis-server
```
2. **Clear old data**:
```bash
# Clear the cache DB (safe: Advoware tokens are re-fetched automatically)
redis-cli -n 1 FLUSHDB
# Clear calendar sync state
redis-cli -n 2 FLUSHDB
```
3. **Check for memory leaks**:
```bash
# Find large keys
redis-cli --bigkeys
# Check specific key size
redis-cli MEMORY USAGE <key>
```
## Advoware API Issues
### Authentication Failed
**Symptoms**: "401 Unauthorized" or "HMAC signature invalid"
**Diagnosis**:
```bash
# Check logs for auth errors
sudo journalctl -u motia.service | grep -i "auth\|token\|401"
# Test token fetch manually
python3 << 'EOF'
from services.advoware import AdvowareAPI
api = AdvowareAPI()
token = api.get_access_token(force_refresh=True)
print(f"Token: {token[:20]}...")
EOF
```
**Solutions**:
1. **Invalid API Key**:
```bash
# Verify API Key is Base64
echo $ADVOWARE_API_KEY | base64 -d
# Re-encode if needed
echo -n "raw_key" | base64
```
2. **Wrong credentials**:
```bash
# Verify environment variables
sudo systemctl show motia.service -p Environment | grep ADVOWARE
# Update in systemd service
sudo nano /etc/systemd/system/motia.service
sudo systemctl daemon-reload
sudo systemctl restart motia.service
```
3. **Token expired**:
```bash
# Clear cached token
redis-cli -n 1 DEL advoware_access_token advoware_token_timestamp
# Retry request (will fetch new token)
```
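The two keys above implement a simple cache-with-timestamp pattern: reuse the token while it is fresh, refetch when it is missing or stale. A sketch of that logic as an illustration (function names are hypothetical; the actual implementation lives in `services/advoware.py`):

```python
import time
from typing import Any, Callable

TOKEN_TTL_SECONDS = 3600  # matches the EX 3600 used when seeding test tokens

def get_cached_token(client: Any, fetch_fresh: Callable[[], str],
                     ttl: int = TOKEN_TTL_SECONDS) -> str:
    """Return the cached Advoware token, refetching when missing or stale."""
    token = client.get('advoware_access_token')
    stamp = client.get('advoware_token_timestamp')
    if token and stamp and time.time() - float(stamp) < ttl:
        return token
    # Cache miss or expired: fetch a new token and remember when we got it.
    token = fetch_fresh()  # e.g. AdvowareAPI().get_access_token(force_refresh=True)
    client.set('advoware_access_token', token)
    client.set('advoware_token_timestamp', str(time.time()))
    return token
```

Deleting both keys (as in the redis-cli command above) simply forces the next call down the refetch path.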
### API Timeout
**Symptoms**: "Request timeout" or "API call failed"
**Diagnosis**:
```bash
# Check API response time
time curl "http://localhost:3000/advoware/proxy?endpoint=employees"
# Check network connectivity
ping www2.advo-net.net
curl -I https://www2.advo-net.net:90/
```
**Solutions**:
1. **Increase timeout**:
```bash
# In environment
export ADVOWARE_API_TIMEOUT_SECONDS=60
# Or in systemd service
Environment=ADVOWARE_API_TIMEOUT_SECONDS=60
```
2. **Network issues**:
```bash
# Check firewall
sudo ufw status
# Test direct connection
curl -v https://www2.advo-net.net:90/
```
3. **Advoware API down**:
```bash
# Wait and retry
# Implement exponential backoff in code
```
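A generic retry helper with exponential backoff might look like the following sketch (the names are illustrative and not part of the existing codebase):

```python
import asyncio
import random

async def with_backoff(fn, retries: int = 4, base_delay: float = 1.0):
    """Retry an async callable with exponential backoff plus a little jitter."""
    for attempt in range(retries):
        try:
            return await fn()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the last error
            # Delays grow 1x, 2x, 4x, ... of base_delay; jitter avoids
            # all workers retrying at exactly the same moment.
            await asyncio.sleep(base_delay * (2 ** attempt)
                                + random.uniform(0, base_delay))
```

A step could wrap its Advoware call as `await with_backoff(lambda: advoware.api_call('/employees'))`; tune `retries` and `base_delay` against the timeout settings above.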
## Google Calendar Issues
### Service Account Not Found
**Symptoms**: "service-account.json not found"
**Diagnosis**:
```bash
# Check file exists
ls -la /opt/motia-app/service-account.json
# Check permissions
ls -la /opt/motia-app/service-account.json
# Check environment variable
echo $GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH
```
**Solutions**:
1. **File missing**:
```bash
# Copy from backup
sudo cp /backup/service-account.json /opt/motia-app/
# Set permissions
sudo chmod 600 /opt/motia-app/service-account.json
sudo chown www-data:www-data /opt/motia-app/service-account.json
```
2. **Wrong path**:
```bash
# Update environment
# In /etc/systemd/system/motia.service:
Environment=GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json
sudo systemctl daemon-reload
sudo systemctl restart motia.service
```
### Calendar API Rate Limit
**Symptoms**: "403 Rate limit exceeded" or "429 Too Many Requests"
**Diagnosis**:
```bash
# Check rate limiting in logs
sudo journalctl -u motia.service | grep -i "rate\|403\|429"
# Check Redis rate limit tokens
redis-cli -n 2 GET google_calendar_api_tokens
```
**Solutions**:
1. **Wait for rate limit reset**:
```bash
# Rate limit resets every minute
# Wait 60 seconds and retry
```
2. **Adjust rate limit settings**:
```python
# In calendar_sync_event_step.py
MAX_TOKENS = 7 # Decrease if hitting limits
REFILL_RATE_PER_MS = 7 / 1000
```
3. **Request quota increase**:
- Go to Google Cloud Console
- Navigate to "APIs & Services" → "Quotas"
- Request increase for Calendar API
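The `MAX_TOKENS` / `REFILL_RATE_PER_MS` settings suggest a token-bucket limiter at roughly 7 requests per second. A self-contained sketch of the idea (the production version in `calendar_sync_event_step.py` is presumably Redis-backed, given the `google_calendar_api_tokens` key above):

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` calls, refilling continuously over time."""

    def __init__(self, capacity: int = 7, refill_rate_per_ms: float = 7 / 1000):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_rate_per_ms = refill_rate_per_ms
        self.last_ms = time.monotonic() * 1000

    def try_acquire(self) -> bool:
        now_ms = time.monotonic() * 1000
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens
                          + (now_ms - self.last_ms) * self.refill_rate_per_ms)
        self.last_ms = now_ms
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller should wait and retry
```

Lowering `capacity` (as suggested in solution 2 above) shrinks the allowed burst; lowering `refill_rate_per_ms` shrinks the sustained rate.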
### Calendar Access Denied
**Symptoms**: "Access denied" or "Insufficient permissions"
**Diagnosis**:
```bash
# Check service account email
python3 << 'EOF'
import json
with open('/opt/motia-app/service-account.json') as f:
data = json.load(f)
print(f"Service Account: {data['client_email']}")
EOF
# Test API access
python3 << 'EOF'
from google.oauth2 import service_account
from googleapiclient.discovery import build
creds = service_account.Credentials.from_service_account_file(
'/opt/motia-app/service-account.json',
scopes=['https://www.googleapis.com/auth/calendar']
)
service = build('calendar', 'v3', credentials=creds)
result = service.calendarList().list().execute()
print(f"Calendars: {len(result.get('items', []))}")
EOF
```
**Solutions**:
1. **Calendar not shared**:
```bash
# Share calendar with service account email
# In Google Calendar UI: Settings → Share → Add service account email
```
2. **Wrong scopes**:
```bash
# Verify scopes in code
# Should be: https://www.googleapis.com/auth/calendar
```
3. **Domain-wide delegation**:
```bash
# For G Suite, enable domain-wide delegation
# See GOOGLE_SETUP.md
```
## Calendar Sync Issues
### Sync Not Running
**Symptoms**: no calendar updates, no sync logs
**Diagnosis**:
```bash
# Check if cron is triggering
sudo journalctl -u motia.service | grep -i "calendar_sync_cron"
# Manually trigger sync
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"full_content": true}'
# Check for locks
redis-cli -n 1 KEYS "calendar_sync:lock:*"
```
**Solutions**:
1. **Cron not configured**:
```python
# Verify calendar_sync_cron_step.py has correct schedule
config = {
'schedule': '0 2 * * *', # Daily at 2 AM
}
```
2. **Lock stuck**:
```bash
# Clear all locks
python3 /opt/motia-app/bitbylaw/scripts/calendar_sync/delete_employee_locks.py
# Or manually
redis-cli -n 1 DEL calendar_sync:lock:SB
```
3. **Errors in sync**:
```bash
# Check error logs
sudo journalctl -u motia.service -p err | grep calendar
```
### Duplicate Events
**Symptoms**: events appear multiple times in Google Calendar
**Diagnosis**:
```bash
# Check for concurrent syncs
redis-cli -n 1 KEYS "calendar_sync:lock:*"
# Check logs for duplicate processing
sudo journalctl -u motia.service | grep -i "duplicate\|already exists"
```
**Solutions**:
1. **Locking not working**:
```bash
# Verify Redis lock TTL
redis-cli -n 1 TTL calendar_sync:lock:SB
# Should return positive number if locked
```
2. **Manual cleanup**:
```bash
# Delete duplicates in Google Calendar UI
# Or use cleanup script (if available)
```
3. **Improve deduplication logic**:
```python
# In calendar_sync_event_step.py
# Add better event matching logic
```
### Events Not Syncing
**Symptoms**: Advoware events are missing from Google Calendar
**Diagnosis**:
```bash
# Check specific employee
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"kuerzel": "SB", "full_content": true}'
# Check logs for that employee
sudo journalctl -u motia.service | grep "SB"
# Check if calendar exists
python3 << 'EOF'
from google.oauth2 import service_account
from googleapiclient.discovery import build
creds = service_account.Credentials.from_service_account_file(
'/opt/motia-app/service-account.json',
scopes=['https://www.googleapis.com/auth/calendar']
)
service = build('calendar', 'v3', credentials=creds)
result = service.calendarList().list().execute()
for cal in result.get('items', []):
if 'AW-SB' in cal['summary']:
print(f"Found: {cal['summary']} - {cal['id']}")
EOF
```
**Solutions**:
1. **Calendar doesn't exist**:
```bash
# Will be auto-created on first sync
# Force sync to trigger creation
```
2. **Date range mismatch**:
```python
# Check FETCH_FROM and FETCH_TO in calendar_sync_event_step.py
# Default: Previous year to 9 years ahead
```
3. **Write protection enabled**:
```bash
# Check environment
echo $ADVOWARE_WRITE_PROTECTION
# Should be "false" for two-way sync
```
## Webhook Issues
### Webhooks Not Received
**Symptoms**: EspoCRM sends webhooks, but nothing is processed
**Diagnosis**:
```bash
# Check if endpoint reachable
curl -X POST "http://localhost:3000/vmh/webhook/beteiligte/create" \
-H "Content-Type: application/json" \
-d '[{"id": "test-123"}]'
# Check firewall
sudo ufw status
# Check nginx logs (if using reverse proxy)
sudo tail -f /var/log/nginx/motia-access.log
sudo tail -f /var/log/nginx/motia-error.log
```
**Solutions**:
1. **Firewall blocking**:
```bash
# Allow port (if direct access)
sudo ufw allow 3000/tcp
# Or use reverse proxy (recommended)
```
2. **Wrong URL in EspoCRM**:
```bash
# Verify URL in EspoCRM webhook configuration
# Should be: https://your-domain.com/vmh/webhook/beteiligte/create
```
3. **SSL certificate issues**:
```bash
# Check certificate
openssl s_client -connect your-domain.com:443
# Renew certificate
sudo certbot renew
```
### Webhook Deduplication Not Working
**Symptoms**: the same webhooks are processed multiple times
**Diagnosis**:
```bash
# Check Redis dedup sets
redis-cli -n 1 SMEMBERS vmh:beteiligte:create_pending
redis-cli -n 1 SMEMBERS vmh:beteiligte:update_pending
redis-cli -n 1 SMEMBERS vmh:beteiligte:delete_pending
# Check for concurrent webhook processing
sudo journalctl -u motia.service | grep "Webhook.*received"
```
**Solutions**:
1. **Redis SET not working**:
```bash
# Test Redis SET operations
redis-cli -n 1 SADD test_set "value1"
redis-cli -n 1 SMEMBERS test_set
redis-cli -n 1 DEL test_set
```
2. **Clear dedup sets**:
```bash
# If corrupted
redis-cli -n 1 DEL vmh:beteiligte:create_pending
redis-cli -n 1 DEL vmh:beteiligte:update_pending
redis-cli -n 1 DEL vmh:beteiligte:delete_pending
```
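The dedup sets work because `SADD` returns the number of members that were actually added, which makes the membership check atomic. A sketch of the claim/release pattern (function names are illustrative; `client` would be a `redis.Redis` connection to DB 1):

```python
from typing import Any

def claim_webhook(client: Any, action: str, entity_id: str) -> bool:
    """True if this entity id was not yet pending for `action`; False on a duplicate."""
    # SADD returns 1 only for members that were not already in the set,
    # so check-and-insert happens in one atomic command.
    return client.sadd(f'vmh:beteiligte:{action}_pending', entity_id) == 1

def finish_webhook(client: Any, action: str, entity_id: str) -> None:
    """Remove the entity id from the pending set once processing is done."""
    client.srem(f'vmh:beteiligte:{action}_pending', entity_id)
```

If `finish_webhook` is never called after a crash, ids linger in the set and block reprocessing; that is exactly the situation the `DEL` commands in solution 2 above recover from.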
## Performance Issues
### High CPU Usage
**Diagnosis**:
```bash
# Check CPU usage
top -p $(pgrep -f "motia start")
# Profile with Node.js
# Already enabled with --inspect flag
# Connect to chrome://inspect
```
**Solutions**:
1. **Too many parallel syncs**:
```bash
# Reduce concurrent syncs
# Adjust DEBUG_KUERZEL to process fewer employees
```
2. **Infinite loop**:
```bash
# Check logs for repeated patterns
sudo journalctl -u motia.service | tail -n 1000 | sort | uniq -c | sort -rn
```
### High Memory Usage
**Diagnosis**:
```bash
# Check memory
ps aux | grep motia | awk '{print $6}'
# Heap snapshot (if enabled)
kill -SIGUSR2 $(pgrep -f "motia start")
# Snapshot saved to current directory
```
**Solutions**:
1. **Increase memory limit**:
```ini
# In systemd service
Environment=NODE_OPTIONS=--max-old-space-size=16384
```
2. **Memory leak**:
```bash
# Restart service periodically
# Add to crontab:
0 3 * * * systemctl restart motia.service
```
### Slow API Responses
**Diagnosis**:
```bash
# Measure response time
time curl "http://localhost:3000/advoware/proxy?endpoint=employees"
# Check for database/Redis latency
redis-cli --latency
```
**Solutions**:
1. **Redis slow**:
```bash
# Check slow log
redis-cli SLOWLOG GET 10
# Optimize Redis (tcp-backlog is a startup parameter; set it in redis.conf and restart)
```
2. **Advoware API slow**:
```bash
# Increase timeout
export ADVOWARE_API_TIMEOUT_SECONDS=60
# Add caching layer
```
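One way to add such a caching layer is to memoize read-only proxy responses in Redis with a short TTL. A minimal sketch under assumed names (the `advoware_cache:` prefix and the 60 s TTL are illustrations, not the project's actual configuration):

```python
import hashlib
import json


def cached_get(cache, fetch, endpoint: str, ttl_seconds: int = 60):
    """Return a cached response for `endpoint`, calling `fetch` on a miss.

    `cache` is any redis-py style client (get / set with `ex`); `fetch` is a
    callable performing the real API request and returning a JSON-serializable
    result.
    """
    key = "advoware_cache:" + hashlib.sha256(endpoint.encode()).hexdigest()
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)
    result = fetch(endpoint)
    cache.set(key, json.dumps(result), ex=ttl_seconds)
    return result
```

Only cache idempotent GET endpoints this way; cached stale data on PUT/DELETE paths would mask real changes.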
## Debugging Tools
### Enable Debug Logging
```bash
# Set in systemd service
Environment=MOTIA_LOG_LEVEL=debug
sudo systemctl daemon-reload
sudo systemctl restart motia.service
```
### Redis Debugging
```bash
# Connect to Redis
redis-cli
# Monitor all commands
MONITOR
# Slow log
SLOWLOG GET 10
# Info
INFO all
```
### Python Debugging
```python
# Add to step code
import pdb; pdb.set_trace()
# Or use logging
context.logger.debug(f"Variable value: {variable}")
```
### Node.js Debugging
```bash
# Connect to inspector
# Chrome DevTools: chrome://inspect
# VSCode: Attach to Process
```
## Getting Help
### Check Logs First
```bash
# Last 100 lines
sudo journalctl -u motia.service -n 100
# Errors only
sudo journalctl -u motia.service -p err
# Specific time range
sudo journalctl -u motia.service --since "1 hour ago"
```
### Common Log Patterns
**Success**:
```
[INFO] Calendar sync completed for SB
[INFO] VMH Webhook received
```
**Warning**:
```
[WARNING] Rate limit approaching
[WARNING] Lock already exists for SB
```
**Error**:
```
[ERROR] Redis connection failed
[ERROR] API call failed: 401 Unauthorized
[ERROR] Unexpected error: ...
```
### Collect Debug Information
```bash
# System info
uname -a
node --version
python3 --version
# Service status
sudo systemctl status motia.service
# Recent logs
sudo journalctl -u motia.service -n 200 > motia-logs.txt
# Redis info
redis-cli INFO > redis-info.txt
# Configuration (redact secrets!)
sudo systemctl show motia.service -p Environment > env.txt
```
## Related Documentation
- [Architecture](ARCHITECTURE.md)
- [Configuration](CONFIGURATION.md)
- [Deployment](DEPLOYMENT.md)
- [Development Guide](DEVELOPMENT.md)

View File

@@ -1,41 +1,4 @@
 [
-  {
-    "id": "basic-tutorial",
-    "config": {
-      "steps/petstore/state_audit_cron_step.py": {
-        "x": -38,
-        "y": 683,
-        "sourceHandlePosition": "right"
-      },
-      "steps/petstore/process_food_order_step.py": {
-        "x": 384,
-        "y": 476,
-        "targetHandlePosition": "left"
-      },
-      "steps/petstore/notification_step.py": {
-        "x": 601,
-        "y": 724,
-        "targetHandlePosition": "left"
-      },
-      "steps/petstore/api_step.py": {
-        "x": 15,
-        "y": 461,
-        "sourceHandlePosition": "right"
-      },
-      "steps/advoware_proxy/advoware_api_proxy_put_step.py": {
-        "x": 12,
-        "y": 408
-      },
-      "steps/advoware_proxy/advoware_api_proxy_get_step.py": {
-        "x": 12,
-        "y": 611
-      },
-      "steps/advoware_proxy/advoware_api_proxy_delete_step.py": {
-        "x": 0,
-        "y": 814
-      }
-    }
-  },
   {
     "id": "vmh",
     "config": {
@@ -102,8 +65,46 @@
         "y": 990
       },
       "steps/advoware_cal_sync/calendar_sync_all_step.py": {
-        "x": 343,
-        "y": 904
+        "x": 339,
+        "y": 913
+      }
+    }
+  },
+  {
+    "id": "basic-tutorial",
+    "config": {
+      "steps/petstore/state_audit_cron_step.py": {
+        "x": -38,
+        "y": 683,
+        "sourceHandlePosition": "right"
+      },
+      "steps/petstore/process_food_order_step.py": {
+        "x": 384,
+        "y": 476,
+        "targetHandlePosition": "left"
+      },
+      "steps/petstore/notification_step.py": {
+        "x": 601,
+        "y": 724,
+        "targetHandlePosition": "left"
+      },
+      "steps/petstore/api_step.py": {
+        "x": 15,
+        "y": 461,
+        "sourceHandlePosition": "right"
+      }
+    }
+  },
+  {
+    "id": "perf-test",
+    "config": {
+      "steps/motia-perf-test/perf_event_step.py": {
+        "x": 318,
+        "y": 22
+      },
+      "steps/motia-perf-test/perf_cron_step.py": {
+        "x": 0,
+        "y": 0
+      }
     }
   }

View File

@@ -0,0 +1,176 @@
---
title: Calendar Sync Utilities
description: Helper scripts for Google Calendar synchronization - maintenance, debugging, and cleanup
date: 2026-02-07
category: utilities
---

# Calendar Sync Utility Scripts

## Overview

This directory contains utility scripts for maintaining and debugging the calendar sync functionality.
---
## Scripts

### delete_all_calendars.py

**Purpose**: Deletes all (non-primary) calendars from the Google Calendar service account.

**Use cases**:
- Reset after a faulty synchronization
- Cleanup after tests
- Removal of duplicates

**Execution**:
```bash
cd /opt/motia-app/bitbylaw
python3 scripts/calendar_sync/delete_all_calendars.py
```

**How it works**:
1. Authenticates with the Google service account
2. Fetches all calendars via `calendarList().list()`
3. Iterates over all calendars
4. Skips the primary calendar (protection)
5. Deletes all other calendars via `calendars().delete()`
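The steps above can be sketched as follows. This is a minimal illustration, not the script's actual code: it assumes `get_google_service` (listed under dependencies) returns an authenticated Calendar API client, and it omits `pageToken` pagination, which a real run over many calendars would need.

```python
def select_deletable(calendars):
    """Return calendar list entries that are safe to delete (non-primary)."""
    return [c for c in calendars if not c.get("primary", False)]


def delete_non_primary_calendars(service):
    """Delete every non-primary calendar visible to the service account."""
    items = service.calendarList().list().execute().get("items", [])
    for cal in select_deletable(items):
        service.calendars().delete(calendarId=cal["id"]).execute()
        print(f"Deleted calendar: {cal.get('summary', cal['id'])}")
```

Splitting the selection into its own function keeps the primary-calendar protection testable without touching the API.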
**Safety**:
- ⚠️ **WARNING**: Irreversibly deletes all calendars!
- The primary calendar is skipped automatically
- Manual confirmation required (TODO: confirmation prompt)

**Dependencies**:
- `steps.advoware_cal_sync.calendar_sync_event_step.get_google_service`
- Google Calendar API access
- Service account credentials
**Example output**:
```
Fetching calendar list...
Found 15 calendars to delete:
- Max Mustermann (ID: max@example.com, Primary: False)
✓ Deleted calendar: Max Mustermann
- Primary (ID: service@project.iam.gserviceaccount.com, Primary: True)
Skipping primary calendar: Primary
...
All non-primary calendars have been deleted.
```
---
### delete_employee_locks.py

**Purpose**: Deletes all employee locks for the calendar sync from Redis.

**Use cases**:
- Cleanup after a crashed sync process
- Manual reset of "hanging" locks
- Debugging lock problems

**Execution**:
```bash
cd /opt/motia-app/bitbylaw
python3 scripts/calendar_sync/delete_employee_locks.py
```

**How it works**:
1. Connects to Redis DB 2 (`REDIS_DB_CALENDAR_SYNC`)
2. Searches for all keys matching the pattern `calendar_sync_lock_*`
3. Deletes all lock keys found

**Redis key pattern**:
```
calendar_sync_lock_{kuerzel}
```

**Safety**:
- ⚠️ Can cause race conditions if a sync is currently running
- Recommendation: only run when no sync process is active

**Dependencies**:
- `config.Config` (Redis configuration)
- Redis DB 2 (calendar sync state)
**Example output**:
```
Deleted 12 employee lock keys.
```
**Or when the DB is empty**:
```
No employee lock keys found.
```
---
## Workflow: Complete Reset

For severe sync problems:

```bash
cd /opt/motia-app/bitbylaw

# 1. Stop the Motia service (prevents new syncs)
sudo systemctl stop motia

# 2. Delete all Redis locks
python3 scripts/calendar_sync/delete_employee_locks.py

# 3. Delete all Google calendars (optional, only if needed!)
python3 scripts/calendar_sync/delete_all_calendars.py

# 4. Restart the Motia service
sudo systemctl start motia

# 5. Trigger a full sync
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'
```
---
## Best Practices

### Before running
1. **Check backups**: Make sure the Advoware data is consistent
2. **Service status**: Check `systemctl status motia`
3. **Redis dump**: `redis-cli -n 2 BGSAVE` (optional)

### After running
1. **Check logs**: `journalctl -u motia -n 100 --no-pager`
2. **Trigger a sync**: Via API or cron
3. **Verify**: Check Google Calendar for the correct calendars
---
## Future Scripts (TODO)

### audit_calendar_sync.py
**Purpose**: Compares Advoware appointments with Google Calendar

**Features**:
- Diff view between Advoware and Google
- Detection of orphaned calendars
- Report generation

### repair_calendar_sync.py
**Purpose**: Automatic repair of inconsistencies

**Features**:
- Auto-sync for missing appointments
- Deletion of duplicates
- Lock cleanup with safety checks
---
## See also
- [Calendar Sync Architecture](../../docs/ARCHITECTURE.md#2-calendar-sync-pipeline)
- [Calendar Sync Cron Step](../../steps/advoware_cal_sync/calendar_sync_cron_step.md)
- [Troubleshooting Guide](../../docs/TROUBLESHOOTING.md)

View File

@@ -0,0 +1,21 @@
import redis

from config import Config


def main():
    redis_client = redis.Redis(
        host=Config.REDIS_HOST,
        port=int(Config.REDIS_PORT),
        db=int(Config.REDIS_DB_CALENDAR_SYNC),
        socket_timeout=Config.REDIS_TIMEOUT_SECONDS
    )

    # Find all lock keys
    lock_keys = redis_client.keys('calendar_sync_lock_*')
    if lock_keys:
        deleted_count = redis_client.delete(*lock_keys)
        print(f"Deleted {deleted_count} employee lock keys.")
    else:
        print("No employee lock keys found.")


if __name__ == "__main__":
    main()

View File

@@ -0,0 +1,345 @@
# AdvowareAPI Service
## Overview
The AdvowareAPI service is the central HTTP client for all communication with the Advoware REST API. It abstracts the complex HMAC-512 authentication and provides a simple interface for API calls.
## Location
`services/advoware.py`
## Usage
```python
from services.advoware import AdvowareAPI
# In Step-Handler
async def handler(req, context):
advoware = AdvowareAPI(context)
result = await advoware.api_call('/employees', method='GET')
return {'status': 200, 'body': {'result': result}}
```
## Classes
### AdvowareAPI
**Constructor**: `__init__(self, context=None)`
- `context`: Motia context for logging (optional)
**Attributes**:
- `API_BASE_URL`: Base URL of the Advoware API
- `redis_client`: Redis connection for token caching
- `product_id`, `app_id`, `api_key`: Auth credentials from Config
## Methods
### get_access_token(force_refresh=False)
Returns the Bearer token from the Redis cache, or fetches a new one.
**Parameters**:
- `force_refresh` (bool): Ignore the cache and fetch a new token
**Returns**: `str` - Bearer token
**Logic**:
1. If Redis is unavailable or `force_refresh=True`: fetch a new token
2. If a cached token exists and has not expired: return it
3. Otherwise: fetch a new token and cache it
**Caching**:
- Key: `advoware_access_token`
- TTL: 53 minutes (55 min lifetime - 2 min safety margin)
- Timestamp key: `advoware_token_timestamp`
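The cache-or-fetch decision can be sketched independently of the HTTP details. This is a simplified illustration, not the service's actual code; the `fetch_token` callable stands in for the real token request:

```python
TOKEN_CACHE_KEY = "advoware_access_token"


def get_access_token(redis_client, fetch_token, lifetime_minutes=55,
                     force_refresh=False):
    """Return a cached token, or fetch and cache a fresh one.

    `redis_client` may be None (Redis unreachable): the token is then
    fetched on every call. TTL is lifetime minus a 2-minute safety margin.
    """
    if redis_client is not None and not force_refresh:
        cached = redis_client.get(TOKEN_CACHE_KEY)
        if cached:
            return cached.decode("utf-8") if isinstance(cached, bytes) else cached
    token = fetch_token()
    if redis_client is not None:
        redis_client.set(TOKEN_CACHE_KEY, token, ex=(lifetime_minutes - 2) * 60)
    return token
```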
**Example**:
```python
api = AdvowareAPI()
token = api.get_access_token() # From cache
token = api.get_access_token(force_refresh=True) # Fresh
```
### api_call(endpoint, method='GET', headers=None, params=None, json_data=None, ...)
Performs an authenticated API call against Advoware.
**Parameters**:
- `endpoint` (str): API path (e.g. `/employees`)
- `method` (str): HTTP method (GET, POST, PUT, DELETE)
- `headers` (dict): Additional HTTP headers
- `params` (dict): Query parameters
- `json_data` (dict): JSON body for POST/PUT
- `timeout_seconds` (int): Override the default timeout
**Returns**: `dict|None` - JSON response or None
**Logic**:
1. Get the Bearer token (cached or fresh)
2. Set the Authorization header
3. Perform the async HTTP request with aiohttp
4. On 401: refresh the token and retry
5. Parse the JSON response
6. Return the result
**Error handling**:
- `aiohttp.ClientError`: Network/HTTP errors
- `401 Unauthorized`: Auto-refresh the token and retry (once)
- `Timeout`: After `ADVOWARE_API_TIMEOUT_SECONDS`
**Example**:
```python
# GET Request
employees = await api.api_call('/employees', method='GET', params={'limit': 10})
# POST Request
new_appt = await api.api_call(
'/appointments',
method='POST',
json_data={'datum': '2026-02-10', 'text': 'Meeting'}
)
# PUT Request
updated = await api.api_call(
'/appointments/123',
method='PUT',
json_data={'text': 'Updated'}
)
# DELETE Request
await api.api_call('/appointments/123', method='DELETE')
```
## Authentication
### HMAC-512 Signature
Advoware uses HMAC-512 for request signing:
**Message format**:
```
{product_id}:{app_id}:{nonce}:{timestamp}
```
**Key**: Base64-decoded API key
**Hash**: SHA512
**Output**: Base64-encoded signature
**Implementation**:
```python
def _generate_hmac(self, request_time_stamp, nonce=None):
if not nonce:
nonce = str(uuid.uuid4())
message = f"{self.product_id}:{self.app_id}:{nonce}:{request_time_stamp}"
api_key_bytes = base64.b64decode(self.api_key)
signature = hmac.new(api_key_bytes, message.encode(), hashlib.sha512)
return base64.b64encode(signature.digest()).decode('utf-8')
```
### Token-Fetch Flow
1. Generate nonce (UUID4)
2. Get current UTC timestamp (ISO format)
3. Generate HMAC signature
4. POST to `https://security.advo-net.net/api/v1/Token`:
```json
{
"AppID": "...",
"Kanzlei": "...",
"Database": "...",
"User": "...",
"Role": 2,
"Product": 64,
"Password": "...",
"Nonce": "...",
"HMAC512Signature": "...",
"RequestTimeStamp": "..."
}
```
5. Extract `access_token` from response
6. Cache in Redis (53min TTL)
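Steps 1-3 amount to assembling the token request body shown above. A hedged sketch of that assembly (the `cfg` dict is a stand-in for the Config values; field names follow the JSON above):

```python
import base64
import hashlib
import hmac
import uuid
from datetime import datetime, timezone


def build_token_request(cfg: dict) -> dict:
    """Assemble the JSON body for the Advoware token endpoint."""
    nonce = str(uuid.uuid4())
    timestamp = datetime.now(timezone.utc).isoformat()
    message = f"{cfg['product_id']}:{cfg['app_id']}:{nonce}:{timestamp}"
    key = base64.b64decode(cfg["api_key"])
    signature = base64.b64encode(
        hmac.new(key, message.encode(), hashlib.sha512).digest()
    ).decode("utf-8")
    return {
        "AppID": cfg["app_id"],
        "Kanzlei": cfg["kanzlei"],
        "Database": cfg["database"],
        "User": cfg["user"],
        "Role": cfg["role"],
        "Product": cfg["product_id"],
        "Password": cfg["password"],
        "Nonce": nonce,
        "HMAC512Signature": signature,
        "RequestTimeStamp": timestamp,
    }
```

Since SHA-512 digests are 64 bytes, the Base64 signature always decodes back to 64 bytes, which is a quick sanity check when debugging signature mismatches.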
## Redis Usage
### Keys
**DB 1** (`REDIS_DB_ADVOWARE_CACHE`):
- `advoware_access_token` (string, TTL: 3180s = 53min)
- `advoware_token_timestamp` (string, TTL: 3180s)
### Operations
```python
# Set Token
self.redis_client.set(
self.TOKEN_CACHE_KEY,
access_token,
ex=(self.token_lifetime_minutes - 2) * 60
)
# Get Token
cached_token = self.redis_client.get(self.TOKEN_CACHE_KEY)
if cached_token:
return cached_token.decode('utf-8')
```
### Fallback
If Redis is unreachable:
- Log a warning
- Fetch a token on every request (no caching)
- Works, but is slower
## Logging
### Log Messages
```python
# Via context.logger (if available)
context.logger.info("Access token fetched successfully")
context.logger.error(f"API call failed: {e}")
# Fall back to Python logging
logger.info("Connected to Redis for token caching")
logger.debug(f"Token request data: AppID={self.app_id}")
```
### Log Levels
- **DEBUG**: Token details, request parameters
- **INFO**: Token fetches, API calls, cache hits
- **ERROR**: Auth errors, API errors, network errors
## Configuration
### Environment Variables
```bash
# API Settings
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_PRODUCT_ID=64
ADVOWARE_APP_ID=your_app_id
ADVOWARE_API_KEY=base64_encoded_hmac_key
ADVOWARE_KANZLEI=your_kanzlei
ADVOWARE_DATABASE=your_database
ADVOWARE_USER=api_user
ADVOWARE_ROLE=2
ADVOWARE_PASSWORD=your_password
# Timeouts
ADVOWARE_TOKEN_LIFETIME_MINUTES=55
ADVOWARE_API_TIMEOUT_SECONDS=30
# Redis
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
REDIS_TIMEOUT_SECONDS=5
```
## Error Handling
### Exceptions
**AdvowareTokenError**:
- Raised when the token fetch fails
- Example: invalid credentials, HMAC signature mismatch
**aiohttp.ClientError**:
- Network errors, HTTP errors (except 401)
- Timeouts, connection refused, etc.
### Retry Logic
**401 Unauthorized**:
- Automatic retry with a fresh token (once)
- After that: exception propagated to the caller
**Other errors**:
- No retry (fail fast)
- Exception propagated directly to the caller
## Performance
### Response Time
- **With cached token**: 200-800 ms (Advoware API latency)
- **With token fetch**: +1-2 s for the token request
- **Timeout**: 30 s (configurable)
### Caching
- **Hit rate**: >99% (token cached for 53 min, API calls are far more frequent)
- **Miss**: Only on the first call or after token expiry
## Testing
### Manual Testing
```python
# Test Token Fetch
from services.advoware import AdvowareAPI
api = AdvowareAPI()
token = api.get_access_token(force_refresh=True)
print(f"Token: {token[:20]}...")
# Test API Call
import asyncio
async def test():
api = AdvowareAPI()
result = await api.api_call('/employees', params={'limit': 5})
print(result)
asyncio.run(test())
```
### Unit Testing
```python
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
@pytest.mark.asyncio
async def test_api_call_with_cached_token():
# Mock Redis
redis_mock = MagicMock()
redis_mock.get.return_value = b'cached_token'
# Mock aiohttp
with patch('aiohttp.ClientSession') as session_mock:
response_mock = AsyncMock()
response_mock.status = 200
response_mock.json = AsyncMock(return_value={'data': 'test'})
session_mock.return_value.__aenter__.return_value.request.return_value.__aenter__.return_value = response_mock
api = AdvowareAPI()
api.redis_client = redis_mock
result = await api.api_call('/test')
assert result == {'data': 'test'}
redis_mock.get.assert_called_once()
```
## Security
### Secrets
- ✅ API key from the environment (not hardcoded)
- ✅ Password from the environment
- ✅ Token only in Redis (localhost)
- ❌ Token never written to logs
### Best Practices
- Always store the API key Base64-encoded
- Never cache the token longer than 55 minutes
- Keep Redis localhost-only (no remote connections)
- Keep credentials out of logs
## Related Documentation
- [Configuration](../../docs/CONFIGURATION.md)
- [Architecture](../../docs/ARCHITECTURE.md)
- [Proxy Steps](../advoware_proxy/README.md)

View File

@@ -428,3 +428,32 @@ python audit_calendar_sync.py cleanup-orphaned
 All operations are logged via `context.logger` and are visible in the Motia Workbench. Additional debug information is printed to the console.
+
+---
+
+## Utility Scripts
+
+Helper scripts are available for maintenance and debugging:
+
+**Documentation**: [scripts/calendar_sync/README.md](../../scripts/calendar_sync/README.md)
+
+**Available scripts**:
+- `delete_employee_locks.py` - Deletes Redis locks (for hanging syncs)
+- `delete_all_calendars.py` - Deletes all Google calendars (reset)
+
+**Usage**:
+```bash
+# Lock cleanup
+python3 scripts/calendar_sync/delete_employee_locks.py
+
+# Calendar reset (CAUTION!)
+python3 scripts/calendar_sync/delete_all_calendars.py
+```
+
+---
+
+## See also
+- [Calendar Sync Architecture](../../docs/ARCHITECTURE.md#2-calendar-sync-system)
+- [Calendar Sync Cron Step](calendar_sync_cron_step.md)
+- [Google Calendar Setup](../../docs/GOOGLE_SETUP.md)
+- [Troubleshooting Guide](../../docs/TROUBLESHOOTING.md)

View File

@@ -0,0 +1,109 @@
---
type: step
category: event
name: Calendar Sync All
version: 1.0.0
status: active
tags: [calendar, sync, event, cascade]
dependencies:
- services/advoware.py
- redis
emits: [calendar_sync_employee]
subscribes: [calendar_sync_all]
---
# Calendar Sync All Step
## Purpose
Fetches all employees from Advoware and emits one `calendar_sync_employee` event per employee, enabling parallel processing.
## Config
```python
{
'type': 'event',
'name': 'Calendar Sync All',
'subscribes': ['calendar_sync_all'],
'emits': ['calendar_sync_employee'],
'flows': ['advoware_cal_sync']
}
```
## Input Event
```json
{
"topic": "calendar_sync_all",
"data": {}
}
```
## Behavior
1. **Fetch employees** from the Advoware API:
```python
employees = await advoware.api_call('/employees')
```
2. **Filter by debug list** (if configured):
```python
if Config.CALENDAR_SYNC_DEBUG_KUERZEL:
    employees = [e for e in employees if e['kuerzel'] in debug_list]
```
3. **Set a lock per employee**:
```python
lock_key = f'calendar_sync_lock_{kuerzel}'
redis.set(lock_key, '1', nx=True, ex=300)
```
4. **Emit an event per employee**:
```python
await context.emit({
    'topic': 'calendar_sync_employee',
    'data': {
        'kuerzel': kuerzel,
        'full_content': True
    }
})
```
## Debug Mode
```bash
# Only sync specific employees
export CALENDAR_SYNC_DEBUG_KUERZEL=SB,AI,RO
# Sync all (production)
export CALENDAR_SYNC_DEBUG_KUERZEL=
```
## Error Handling
- Advoware API error: log it, but don't crash
- Lock error: skip the employee (sync already in progress)
- Event emission error: log and continue
## Output Events
Multiple `calendar_sync_employee` events, e.g.:
```json
[
{"topic": "calendar_sync_employee", "data": {"kuerzel": "SB", "full_content": true}},
{"topic": "calendar_sync_employee", "data": {"kuerzel": "AI", "full_content": true}},
...
]
```
## Performance
- ~10 employees: <1 s for fetch + event emission
- Lock setting: <10 ms per employee
- No blocking (async events)
## Monitoring
```
[INFO] Fetching employees from Advoware
[INFO] Found 10 employees
[INFO] Emitting calendar_sync_employee for SB
[INFO] Emitting calendar_sync_employee for AI
...
```
## Related
- [calendar_sync_event_step.md](calendar_sync_event_step.md) - Consumes emitted events
- [calendar_sync_cron_step.md](calendar_sync_cron_step.md) - Triggers this step

View File

@@ -5,7 +5,7 @@ import time
 from datetime import datetime
 from config import Config
 from services.advoware import AdvowareAPI
-from .calendar_sync_utils import get_redis_client, get_advoware_employees, set_employee_lock
+from .calendar_sync_utils import get_redis_client, get_advoware_employees, set_employee_lock, log_operation

 config = {
     'type': 'event',
@@ -19,7 +19,7 @@ config = {
 async def handler(event_data, context):
     try:
         triggered_by = event_data.get('triggered_by', 'unknown')
-        context.logger.info(f"Calendar Sync All: Starting to emit events for oldest employees, triggered by {triggered_by}")
+        log_operation('info', f"Calendar Sync All: Starting to emit events for oldest employees, triggered by {triggered_by}", context=context)

         # Initialize Advoware API
         advoware = AdvowareAPI(context)
@@ -27,7 +27,7 @@ async def handler(event_data, context):
         # Fetch employees
         employees = await get_advoware_employees(advoware, context)
         if not employees:
-            context.logger.error("Keine Mitarbeiter gefunden. All-Sync abgebrochen.")
+            log_operation('error', "Keine Mitarbeiter gefunden. All-Sync abgebrochen.", context=context)
             return {'status': 500, 'body': {'error': 'Keine Mitarbeiter gefunden'}}

         redis_client = get_redis_client(context)
@@ -53,11 +53,11 @@ async def handler(event_data, context):
             return datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')
         sorted_list_str = ", ".join(f"{k} ({format_timestamp(employee_timestamps[k])})" for k in sorted_kuerzel)
-        context.logger.info(f"Calendar Sync All: Sorted employees by last synced: {sorted_list_str}")
+        log_operation('info', f"Calendar Sync All: Sorted employees by last synced: {sorted_list_str}", context=context)

         # Calculate number to sync: ceil(N / 10)
-        num_to_sync = math.ceil(len(sorted_kuerzel) / 10)
-        context.logger.info(f"Calendar Sync All: Total employees {len(sorted_kuerzel)}, syncing {num_to_sync} oldest")
+        num_to_sync = math.ceil(len(sorted_kuerzel) / 1)
+        log_operation('info', f"Calendar Sync All: Total employees {len(sorted_kuerzel)}, syncing {num_to_sync} oldest", context=context)

         # Emit for the oldest num_to_sync employees, if not locked
         emitted_count = 0
@@ -65,7 +65,7 @@ async def handler(event_data, context):
             employee_lock_key = f'calendar_sync_lock_{kuerzel}'
             if not set_employee_lock(redis_client, kuerzel, triggered_by, context):
-                context.logger.info(f"Calendar Sync All: Sync bereits aktiv für {kuerzel}, überspringe")
+                log_operation('info', f"Calendar Sync All: Sync bereits aktiv für {kuerzel}, überspringe", context=context)
                 continue

             # Emit event for this employee
@@ -76,10 +76,10 @@ async def handler(event_data, context):
                     "triggered_by": triggered_by
                 }
             })
-            context.logger.info(f"Calendar Sync All: Emitted event for employee {kuerzel} (last synced: {format_timestamp(employee_timestamps[kuerzel])})")
+            log_operation('info', f"Calendar Sync All: Emitted event for employee {kuerzel} (last synced: {format_timestamp(employee_timestamps[kuerzel])})", context=context)
             emitted_count += 1

-        context.logger.info(f"Calendar Sync All: Completed, emitted {emitted_count} events")
+        log_operation('info', f"Calendar Sync All: Completed, emitted {emitted_count} events", context=context)
         return {
             'status': 'completed',
             'triggered_by': triggered_by,
@@ -87,7 +87,7 @@ async def handler(event_data, context):
         }
     except Exception as e:
-        context.logger.error(f"Fehler beim All-Sync: {e}")
+        log_operation('error', f"Fehler beim All-Sync: {e}", context=context)
         return {
             'status': 'error',
             'error': str(e)

View File

@@ -0,0 +1,96 @@
---
type: step
category: api
name: Calendar Sync API
version: 1.0.0
status: active
tags: [calendar, sync, api, manual-trigger]
dependencies:
- redis
emits: [calendar_sync_all, calendar_sync_employee]
---
# Calendar Sync API Step
## Purpose
Manual trigger for calendar synchronization via an HTTP endpoint. Allows syncing all employees or a single one.
## Config
```python
{
'type': 'api',
'name': 'Calendar Sync API',
'path': '/advoware/calendar/sync',
'method': 'POST',
'emits': ['calendar_sync_all', 'calendar_sync_employee'],
'flows': ['advoware_cal_sync']
}
```
## Input
```bash
POST /advoware/calendar/sync
Content-Type: application/json
{
"kuerzel": "ALL", # or specific: "SB"
"full_content": true
}
```
**Parameters**:
- `kuerzel` (optional): "ALL" or an employee abbreviation (default: "ALL")
- `full_content` (optional): true = full details, false = anonymized (default: true)
## Output
```json
{
"status": "triggered",
"kuerzel": "ALL",
"message": "Calendar sync triggered for ALL"
}
```
## Behavior
**Case 1: ALL (or no kuerzel)**:
1. Emit the `calendar_sync_all` event
2. `calendar_sync_all_step` fetches all employees
3. Per employee: emit `calendar_sync_employee`
**Case 2: Specific employee (e.g. "SB")**:
1. Set the Redis lock: `calendar_sync_lock_SB`
2. Emit the `calendar_sync_employee` event directly
3. The lock prevents parallel syncs for the same employee
## Redis Locking
```python
lock_key = f'calendar_sync_lock_{kuerzel}'
redis_client.set(lock_key, '1', nx=True, ex=300)  # 5 min TTL
```
## Testing
```bash
# Sync all employees
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"full_content": true}'
# Sync single employee
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"kuerzel": "SB", "full_content": true}'
# Sync with anonymization
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"kuerzel": "SB", "full_content": false}'
```
## Error Handling
- Lock active: returns an error (HTTP 409, sync already in progress)
- Invalid kuerzel: passed through to all_step or event_step
## Related
- [calendar_sync_all_step.md](calendar_sync_all_step.md) - Handles "ALL"
- [calendar_sync_event_step.md](calendar_sync_event_step.md) - Per-employee sync

View File

@@ -1,7 +1,7 @@
 import json
 import redis
 from config import Config
-from .calendar_sync_utils import get_redis_client, set_employee_lock
+from .calendar_sync_utils import get_redis_client, set_employee_lock, log_operation

 config = {
     'type': 'api',
@@ -31,7 +31,7 @@ async def handler(req, context):
         if kuerzel_upper == 'ALL':
             # Emit sync-all event
-            context.logger.info("Calendar Sync API: Emitting sync-all event")
+            log_operation('info', "Calendar Sync API: Emitting sync-all event", context=context)
             await context.emit({
                 "topic": "calendar_sync_all",
                 "data": {
@@ -54,7 +54,7 @@ async def handler(req, context):
         redis_client = get_redis_client(context)
         if not set_employee_lock(redis_client, kuerzel_upper, 'api', context):
-            context.logger.info(f"Calendar Sync API: Sync bereits aktiv für {kuerzel_upper}, überspringe")
+            log_operation('info', f"Calendar Sync API: Sync bereits aktiv für {kuerzel_upper}, überspringe", context=context)
             return {
                 'status': 409,
                 'body': {
@@ -65,7 +65,7 @@ async def handler(req, context):
                 }
             }
-        context.logger.info(f"Calendar Sync API aufgerufen für {kuerzel_upper}")
+        log_operation('info', f"Calendar Sync API aufgerufen für {kuerzel_upper}", context=context)

         # Lock erfolgreich gesetzt, jetzt emittieren
@@ -89,7 +89,7 @@ async def handler(req, context):
         }
     except Exception as e:
-        context.logger.error(f"Fehler beim API-Trigger: {e}")
+        log_operation('error', f"Fehler beim API-Trigger: {e}", context=context)
         return {
             'status': 500,
             'body': {

View File

@@ -0,0 +1,51 @@
---
type: step
category: cron
name: Calendar Sync Cron
version: 1.0.0
status: active
tags: [calendar, sync, cron, scheduler]
dependencies: []
emits: [calendar_sync_all]
---
# Calendar Sync Cron Step
## Purpose
Daily trigger for the calendar synchronization. Starts the sync pipeline at 2 AM.
## Config
```python
{
'type': 'cron',
'name': 'Calendar Sync Cron',
'schedule': '0 2 * * *', # Daily at 2 AM
'emits': ['calendar_sync_all'],
'flows': ['advoware_cal_sync']
}
```
## Behavior
1. Cron triggers daily at 02:00
2. Emits the `calendar_sync_all` event
3. The event is received by `calendar_sync_all_step`
4. Starts the cascade: all → per employee → sync
## Event Payload
```json
{}
```
Empty, since no parameters are required.
## Monitoring
Logs: `[INFO] Calendar Sync Cron triggered`
## Manual Trigger
```bash
# Use API endpoint instead of waiting for cron
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
-H "Content-Type: application/json" \
-d '{"full_content": true}'
```
Siehe: [calendar_sync_api_step.md](calendar_sync_api_step.md)

View File

@@ -2,19 +2,20 @@ import json
 import redis
 from config import Config
 from services.advoware import AdvowareAPI
+from .calendar_sync_utils import log_operation

 config = {
     'type': 'cron',
     'name': 'Calendar Sync Cron Job',
     'description': 'Führt den Calendar Sync alle 1 Minuten automatisch aus',
-    'cron': '*/1 * * * *',  # Alle 1 Minute
+    'cron': '0 0 31 2 *',  # Nie ausführen (31. Februar)
     'emits': ['calendar_sync_all'],
     'flows': ['advoware']
 }

 async def handler(context):
     try:
-        context.logger.info("Calendar Sync Cron: Starting to emit sync-all event")
+        log_operation('info', "Calendar Sync Cron: Starting to emit sync-all event", context=context)

         # # Emit sync-all event
         await context.emit({
@@ -24,14 +25,14 @@ async def handler(context):
             }
         })
-        context.logger.info("Calendar Sync Cron: Emitted sync-all event")
+        log_operation('info', "Calendar Sync Cron: Emitted sync-all event", context=context)
         return {
             'status': 'completed',
             'triggered_by': 'cron'
         }
     except Exception as e:
-        context.logger.error(f"Fehler beim Cron-Job: {e}")
+        log_operation('error', f"Fehler beim Cron-Job: {e}", context=context)
         return {
             'status': 'error',
             'error': str(e)

View File

@@ -919,6 +919,8 @@ async def process_updates(state, conn, service, calendar_id, kuerzel, advoware,
 async def handler(event_data, context):
     """Main event handler for calendar sync."""
+    start_time = time.time()
+
     kuerzel = event_data.get('kuerzel')
     if not kuerzel:
         log_operation('error', "No kuerzel provided in event", context=context)
@@ -1025,10 +1027,16 @@ async def handler(event_data, context):
         log_operation('info', f"Sync statistics for {kuerzel}: New Adv->Google: {stats['new_adv_to_google']}, New Google->Adv: {stats['new_google_to_adv']}, Deleted: {stats['deleted']}, Updated: {stats['updated']}, Recreated: {stats['recreated']}", context=context)
         log_operation('info', f"Calendar sync completed for {kuerzel}", context=context)
+        log_operation('info', f"Handler duration: {time.time() - start_time}", context=context)
         return {'status': 200, 'body': {'status': 'completed', 'kuerzel': kuerzel}}
     except Exception as e:
         log_operation('error', f"Sync failed for {kuerzel}: {e}", context=context)
+        log_operation('info', f"Handler duration (failed): {time.time() - start_time}", context=context)
         return {'status': 500, 'body': {'error': str(e)}}
     finally:
         # Ensure lock is always released
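The duration instrumentation added in this hunk follows a simple pattern: record `start_time` once at entry, then report the elapsed time on both the success and the failure path. A minimal, framework-free sketch of that pattern (the `timed` decorator and `sync_once` are illustrative, not part of the codebase):

```python
import time

def timed(fn):
    """Record how long fn takes, on success and on failure alike."""
    def wrapper(*args, **kwargs):
        start_time = time.time()
        try:
            return fn(*args, **kwargs)
        finally:
            # runs on both paths, like the two duration log_operation calls above
            wrapper.last_duration = time.time() - start_time
    return wrapper

@timed
def sync_once():
    time.sleep(0.01)  # stand-in for the actual sync work
    return 'completed'

status = sync_once()
duration = sync_once.last_duration
```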


@@ -2,37 +2,38 @@ import logging
 import asyncpg
 import os
 import redis
+import time
 from config import Config
 from googleapiclient.discovery import build
 from google.oauth2 import service_account
+# Configure logging to file
+logging.basicConfig(
+    filename='/opt/motia-app/calendar_sync.log',
+    level=logging.INFO,
+    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
+    datefmt='%Y-%m-%d %H:%M:%S'
+)
 logger = logging.getLogger(__name__)
 def log_operation(level, message, context=None, **context_vars):
-    """Centralized logging with context, supporting Motia workbench logging."""
+    """Centralized logging with context, supporting file and console logging."""
     context_str = ' '.join(f"{k}={v}" for k, v in context_vars.items() if v is not None)
-    full_message = f"{message} {context_str}".strip()
+    full_message = f"[{time.time()}] {message} {context_str}".strip()
-    if context:
-        if level == 'info':
-            context.logger.info(full_message)
-        elif level == 'warning':
-            if hasattr(context.logger, 'warn'):
-                context.logger.warn(full_message)
-            else:
-                context.logger.warning(full_message)
-        elif level == 'error':
-            context.logger.error(full_message)
-    else:
-        if level == 'info':
-            logger.info(full_message)
-        elif level == 'warning':
-            logger.warning(full_message)
-        elif level == 'error':
-            logger.error(full_message)
-        elif level == 'debug':
-            logger.debug(full_message)
+    # Log to file via Python logger
+    if level == 'info':
+        logger.info(full_message)
+    elif level == 'warning':
+        logger.warning(full_message)
+    elif level == 'error':
+        logger.error(full_message)
+    elif level == 'debug':
+        logger.debug(full_message)
+    # Also log to console for journalctl visibility
+    print(f"[{level.upper()}] {full_message}")
 async def connect_db(context=None):
     """Connect to Postgres DB from Config."""


@@ -0,0 +1,48 @@
---
type: step
category: api
name: Advoware Proxy DELETE
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest, delete]
dependencies:
- services/advoware.py
emits: []
---
# Advoware Proxy DELETE Step
## Purpose
Universal REST API proxy for DELETE requests to the Advoware API, used to delete resources.
## Input
```bash
DELETE /advoware/proxy?endpoint=appointments/12345
```
## Output
```json
{
"status": 200,
"body": {
"result": null
}
}
```
## Key Differences
- **Method**: DELETE
- **Body**: no body (`json_data = None`)
- **Endpoint**: includes the ID of the resource to delete
- **Side effect**: deletes the resource (not recoverable!)
- **Response**: often `null` or an empty object
## Testing
```bash
curl -X DELETE "http://localhost:3000/advoware/proxy?endpoint=appointments/12345"
```
## Warning
⚠️ **CAUTION**: DELETE is irreversible! There is no undo.
See [advoware_api_proxy_get_step.md](advoware_api_proxy_get_step.md) for full details.


@@ -0,0 +1,302 @@
---
type: step
category: api
name: Advoware Proxy GET
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest]
dependencies:
- services/advoware.py
- redis (for token caching)
emits: []
subscribes: []
---
# Advoware Proxy GET Step
## Purpose
Universal REST API proxy for GET requests to the Advoware API, with automatic authentication and token management.
## Context
The Advoware API uses HMAC-512 authentication, which is complex and error-prone. This proxy abstracts the authentication away and exposes a simple HTTP endpoint for GET requests. Clients do not need to deal with token management, signature generation, or error handling.
## Technical Specification
### Config
```python
{
'type': 'api',
'name': 'Advoware Proxy GET',
'description': 'Universal proxy for Advoware API (GET)',
'path': '/advoware/proxy',
'method': 'GET',
'emits': [],
'flows': ['advoware']
}
```
### Input
- **HTTP Method**: GET
- **Path**: `/advoware/proxy`
- **Query Parameters**:
  - `endpoint` (required, string): Advoware API endpoint path (without the base URL)
  - all other parameters are forwarded to Advoware unchanged
**Example**:
```
GET /advoware/proxy?endpoint=employees&limit=10&offset=0
```
### Output
**Success Response (200)**:
```json
{
"status": 200,
"body": {
"result": {
// Advoware API Response
}
}
}
```
**Error Response (400)**:
```json
{
"status": 400,
"body": {
"error": "Endpoint required as query param"
}
}
```
**Error Response (500)**:
```json
{
"status": 500,
"body": {
"error": "Internal server error",
"details": "Error message from Advoware or network"
}
}
```
### Events
- **Emits**: none
- **Subscribes**: none
## Behavior
### Flow
1. Extract the `endpoint` parameter from the query string
2. Validate that `endpoint` is present
3. Extract all other query parameters (except `endpoint`)
4. Create an AdvowareAPI instance
5. Call `api_call()` with the GET method
   - Internally: the token is loaded from Redis or fetched fresh
   - Internally: the HMAC signature is generated
   - Internally: the request is sent to Advoware
6. Return the response as JSON
### Error Handling
**Missing `endpoint` parameter**:
- HTTP 400 with an error message
- The request is not forwarded to Advoware
**Advoware API error**:
- HTTP 500 with details
- The exception is logged with a stack trace
- No retry logic (fail fast)
**Token expired (401)**:
- Handled automatically by the AdvowareAPI service
- A new token is fetched and the request is retried
- Transparent to the client
**Network error**:
- HTTP 500 with details
- The exception is logged
- Timeout after `ADVOWARE_API_TIMEOUT_SECONDS` (default: 30s)
### Side Effects
- **No writes**: GET requests do not modify any data
- **Token cache**: reads from Redis DB 1 (`advoware_access_token`)
- **Logging**: writes INFO and ERROR logs to the Motia Workbench
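The flow and error handling described here can be condensed into a sketch of the handler. This is not the actual step code: `AdvowareAPI` is replaced by a stub so the example runs standalone, and the request shape (`queryParams`) is an assumption about the Motia request object.

```python
import asyncio

class AdvowareAPI:
    """Stub standing in for services/advoware.py (HMAC auth, token caching)."""
    async def api_call(self, endpoint, method='GET', params=None, json_data=None):
        return {'endpoint': endpoint, 'params': params or {}}

async def handler(req, context=None):
    params = dict(req.get('queryParams') or {})
    endpoint = params.pop('endpoint', None)
    if not endpoint:
        # rejected before anything is forwarded to Advoware
        return {'status': 400, 'body': {'error': 'Endpoint required as query param'}}
    try:
        result = await AdvowareAPI().api_call(endpoint, method='GET', params=params)
        return {'status': 200, 'body': {'result': result}}
    except Exception as e:
        return {'status': 500, 'body': {'error': 'Internal server error', 'details': str(e)}}

ok = asyncio.run(handler({'queryParams': {'endpoint': 'employees', 'limit': '5'}}))
bad = asyncio.run(handler({'queryParams': {}}))
```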
## Dependencies
### Services
- **AdvowareAPI** (`services/advoware.py`): API client
  - `api_call(endpoint, method='GET', params, json_data=None)`
  - Handles authentication, token caching, and error handling
### Redis Keys (read via AdvowareAPI)
- **DB 1**:
  - `advoware_access_token` (string, TTL: 53 min): bearer token
  - `advoware_token_timestamp` (string, TTL: 53 min): token creation time
### Environment Variables
```bash
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_API_KEY=base64_encoded_key
ADVOWARE_APP_ID=your_app_id
ADVOWARE_USER=api_user
ADVOWARE_PASSWORD=secure_password
ADVOWARE_API_TIMEOUT_SECONDS=30
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
```
### External APIs
- **Advoware API**: all GET-capable endpoints
- **Rate limits**: unknown (no official documentation)
## Testing
### Manual Test
```bash
# Test employee list
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees&limit=5"
# Test appointments
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=appointments&datum=2026-02-07"
# Test with error (missing endpoint)
curl -X GET "http://localhost:3000/advoware/proxy"
# Expected: 400 Bad Request
```
### Expected Behavior
1. **Success case**:
   - Status: 200
   - Body contains `result` with Advoware data
   - Logs show "Proxying request to Advoware: GET {endpoint}"
2. **Error case (missing endpoint)**:
   - Status: 400
   - Body: `{"error": "Endpoint required as query param"}`
3. **Error case (Advoware down)**:
   - Status: 500
   - Body: `{"error": "Internal server error", "details": "..."}`
   - Logs show the error with a stack trace
## Monitoring
### Logs
```
[INFO] Proxying request to Advoware: GET employees
[INFO] Using cached token
[ERROR] Proxy error: ConnectionTimeout
```
### Metrics (potential)
- Request count
- Response time (avg, p95, p99)
- Error rate
- Cache hit rate (token)
### Alerts
- Error rate > 10% over 5 minutes
- Response time > 30s (timeout limit)
- Redis connection failed
## Performance
### Response Time
- **Cached token**: 200-500ms (typical)
- **New token**: 1-2s (token fetch + API call)
- **Timeout**: 30s (configurable)
### Throughput
- **No rate limit** on the Motia side
- **Advoware API**: unknown rate limits
- **Bottleneck**: Advoware API response time
## Security
### Secrets
- ❌ No secrets in code
- ✅ API key via environment variable
- ✅ Token in Redis (local access only)
### Authentication
- Client → Motia: none (TODO: API key or OAuth)
- Motia → Advoware: HMAC-512 + bearer token
### Data Exposure
- GET requests only read data
- No PII in logs (endpoint paths only)
- The response contains all Advoware data (no filtering)
## Change History
| Version | Date | Change |
|---------|------|--------|
| 1.0.0 | 2024-10-24 | Initial implementation |
## AI Assistant Guidance
### Typical Changes
**1. Increase the timeout**:
```python
# In services/advoware.py, not in the step itself
Config.ADVOWARE_API_TIMEOUT_SECONDS = 60
```
**2. Adjust request parameters**:
```python
# Query parameters are forwarded automatically
# No code change needed
```
**3. Transform the response**:
```python
# Before returning:
result = await advoware.api_call(...)
transformed = transform_response(result)  # new function
return {'status': 200, 'body': {'result': transformed}}
```
**4. Add caching**:
```python
# Before api_call:
cache_key = f'cache:{endpoint}:{params}'
cached = redis_client.get(cache_key)
if cached:
    return {'status': 200, 'body': {'result': json.loads(cached)}}
# ... api_call ...
redis_client.set(cache_key, json.dumps(result), ex=300)
```
### Don'ts
- ❌ **No synchronous blocking calls**: always use `await`
- ❌ **No hardcoded credentials**: environment variables only
- ❌ **No unhandled exceptions**: always use try/except
- ❌ **No logging of secrets**: never log passwords or tokens
### Testing Tips
```bash
# Test different endpoints
curl "http://localhost:3000/advoware/proxy?endpoint=employees"
curl "http://localhost:3000/advoware/proxy?endpoint=appointments"
curl "http://localhost:3000/advoware/proxy?endpoint=cases"
# Test error handling
curl "http://localhost:3000/advoware/proxy" # Missing endpoint
# Test with many parameters
curl "http://localhost:3000/advoware/proxy?endpoint=employees&limit=100&offset=0&sortBy=name"
```
### Related Steps
- [advoware_api_proxy_post_step.md](advoware_api_proxy_post_step.md) - POST requests
- [advoware_api_proxy_put_step.md](advoware_api_proxy_put_step.md) - PUT requests
- [advoware_api_proxy_delete_step.md](advoware_api_proxy_delete_step.md) - DELETE requests
### Related Services
- [services/advoware.py](../../services/ADVOWARE_SERVICE.md) - API client implementation


@@ -0,0 +1,70 @@
---
type: step
category: api
name: Advoware Proxy POST
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest, create]
dependencies:
- services/advoware.py
emits: []
---
# Advoware Proxy POST Step
## Purpose
Universal REST API proxy for POST requests to the Advoware API, used to create new resources.
## Differences from GET
- **Method**: POST instead of GET
- **Body**: the JSON payload from the request body is forwarded to Advoware
- **Use case**: creating resources (appointments, employees, etc.)
## Input
```bash
POST /advoware/proxy?endpoint=appointments
Content-Type: application/json
{
"datum": "2026-02-10",
"uhrzeitVon": "09:00:00",
"text": "Meeting"
}
```
## Output
```json
{
"status": 200,
"body": {
"result": {
"id": "12345",
...
}
}
}
```
## Key Differences from GET Step
1. The request body (`req.get('body')`) is passed to the API as `json_data`
2. Can create data in Advoware (side effects!)
3. The response often contains the newly created resource
## Testing
```bash
curl -X POST "http://localhost:3000/advoware/proxy?endpoint=appointments" \
-H "Content-Type: application/json" \
-d '{
"datum": "2026-02-10",
"uhrzeitVon": "09:00:00",
"uhrzeitBis": "10:00:00",
"text": "Test Meeting"
}'
```
## AI Guidance
Identical to the GET step, except:
- Add body validation where needed
- Mind the side effects (this creates data!)
See [advoware_api_proxy_get_step.md](advoware_api_proxy_get_step.md) for details.


@@ -0,0 +1,55 @@
---
type: step
category: api
name: Advoware Proxy PUT
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest, update]
dependencies:
- services/advoware.py
emits: []
---
# Advoware Proxy PUT Step
## Purpose
Universal REST API proxy for PUT requests to the Advoware API, used to update existing resources.
## Input
```bash
PUT /advoware/proxy?endpoint=appointments/12345
Content-Type: application/json
{
"text": "Updated Meeting Title"
}
```
## Output
```json
{
"status": 200,
"body": {
"result": {
"id": "12345",
"text": "Updated Meeting Title",
...
}
}
}
```
## Key Differences
- **Method**: PUT
- **Endpoint**: typically includes an ID (`resource/123`)
- **Body**: partial or full update payload
- **Side effect**: modifies an existing resource
## Testing
```bash
curl -X PUT "http://localhost:3000/advoware/proxy?endpoint=appointments/12345" \
-H "Content-Type: application/json" \
-d '{"text": "Updated Title"}'
```
See [advoware_api_proxy_get_step.md](advoware_api_proxy_get_step.md) for full details.


@@ -1,52 +0,0 @@
from pydantic import BaseModel
from typing import Optional
from src.services.pet_store import pet_store_service
from src.services.types import Pet
class PetRequest(BaseModel):
name: str
photoUrl: str
class FoodOrder(BaseModel):
id: str
quantity: int
class RequestBody(BaseModel):
pet: PetRequest
foodOrder: Optional[FoodOrder] = None
config = {
"type": "api",
"name": "ApiTrigger",
"description": "basic-tutorial api trigger",
"flows": ["basic-tutorial"],
"method": "POST",
"path": "/basic-tutorial",
"bodySchema": RequestBody.model_json_schema(),
"responseSchema": {
200: Pet.model_json_schema(),
},
"emits": ["process-food-order"],
}
async def handler(req, context):
body = req.get("body", {})
context.logger.info("Step 01 Processing API Step", {"body": body})
pet = body.get("pet", {})
food_order = body.get("foodOrder", {})
new_pet_record = await pet_store_service.create_pet(pet)
if food_order:
await context.emit({
"topic": "process-food-order",
"data": {
"id": food_order.get("id"),
"quantity": food_order.get("quantity"),
"email": "test@test.com", # sample email
"pet_id": new_pet_record.get("id"),
},
})
return {"status": 200, "body": {**new_pet_record, "traceId": context.trace_id}}


@@ -1,69 +0,0 @@
[
{
"id": "step-configuration",
"title": "Step Configuration",
"description": "All steps should have a defined configuration, this is how you define the step's behavior and how it will be triggered.",
"lines": [
"6-30"
]
},
{
"id": "api-configuration",
"title": "API Step",
"description": "Definition of an API endpoint",
"lines": [
"23-24"
]
},
{
"id": "request-body",
"title": "Request body",
"description": "Definition of the expected request body. Motia will automatically generate types based on this schema.",
"lines": [
"6-16",
"25"
]
},
{
"id": "response-payload",
"title": "Response Payload",
"description": "Definition of the expected response payload, Motia will generate the types automatically based on this schema. This is also important to create the Open API spec later.",
"lines": [
"4",
"26-28"
]
},
{
"id": "event-driven-architecture",
"title": "Emits",
"description": "We can define the events that this step will emit, this is how we can trigger other Motia Steps.",
"lines": [
"29",
"42-50"
]
},
{
"id": "handler",
"title": "Handler",
"description": "The handler is the function that will be executed when the step is triggered. This one receives the request body and emits events.",
"lines": [
"32-52"
]
},
{
"id": "logger",
"title": "Logger",
"description": "The logger is a utility that allows you to log messages to the console. It is available in the handler function. We encourage you to use it instead of console.log. It will automatically be tied to the trace id of the request.",
"lines": [
"34"
]
},
{
"id": "http-response",
"title": "HTTP Response",
"description": "The handler can return a response to the client. This is how we can return a response to the client. It must comply with the responseSchema defined in the step configuration.",
"lines": [
"52"
]
}
]


@@ -1,40 +0,0 @@
from pydantic import BaseModel
from typing import Dict, Any
import re
class InputSchema(BaseModel):
template_id: str
email: str
template_data: Dict[str, Any]
config = {
"type": "event",
"name": "Notification",
"description": "Checks a state change",
"flows": ["basic-tutorial"],
"subscribes": ["notification"],
"emits": [],
"input": InputSchema.model_json_schema(),
}
async def handler(input_data, context):
email = input_data.get("email")
template_id = input_data.get("template_id")
template_data = input_data.get("template_data")
redacted_email = re.sub(r'(?<=.{2}).(?=.*@)', '*', email)
context.logger.info("Processing Notification", {
"template_id": template_id,
"template_data": template_data,
"email": redacted_email,
})
# This represents a call to some sort of
# notification service to indicate that a
# new order has been placed
context.logger.info("New notification sent", {
"template_id": template_id,
"email": redacted_email,
"template_data": template_data,
})


@@ -1,50 +0,0 @@
from pydantic import BaseModel
from datetime import datetime
from src.services.pet_store import pet_store_service
class InputSchema(BaseModel):
id: str
email: str
quantity: int
pet_id: int
config = {
"type": "event",
"name": "ProcessFoodOrder",
"description": "basic-tutorial event step, demonstrates how to consume an event from a topic and persist data in state",
"flows": ["basic-tutorial"],
"subscribes": ["process-food-order"],
"emits": ["notification"],
"input": InputSchema.model_json_schema(),
}
async def handler(input_data, context):
context.logger.info("Step 02 Process food order", {"input": input_data})
order = await pet_store_service.create_order({
"id": input_data.get("id"),
"quantity": input_data.get("quantity"),
"pet_id": input_data.get("pet_id"),
"email": input_data.get("email"),
"ship_date": datetime.now().isoformat(),
"status": "placed",
})
context.logger.info("Order created", {"order": order})
await context.state.set("orders_python", order.get("id"), order)
await context.emit({
"topic": "notification",
"data": {
"email": input_data["email"],
"template_id": "new-order",
"template_data": {
"status": order.get("status"),
"ship_date": order.get("shipDate"),
"id": order.get("id"),
"pet_id": order.get("petId"),
"quantity": order.get("quantity"),
},
},
})


@@ -1,68 +0,0 @@
[
{
"id": "step-configuration",
"title": "Step Configuration",
"description": "All steps should have a defined configuration, this is how you define the step's behavior and how it will be triggered.",
"lines": [
"5-19"
]
},
{
"id": "event-configuration",
"title": "Event Step",
"description": "Definition of an event step that subscribes to specific topics",
"lines": [
"12",
"15-16"
]
},
{
"id": "input-schema",
"title": "Input Schema",
"description": "Definition of the expected input data structure from the subscribed topic. Motia will automatically generate types based on this schema.",
"lines": [
"5-9",
"17"
]
},
{
"id": "event-emits",
"title": "Emits",
"description": "We can define the events that this step will emit, triggering other Motia Steps.",
"lines": [
"17"
]
},
{
"id": "handler",
"title": "Handler",
"description": "The handler is the function that will be executed when the step receives an event from its subscribed topic. It processes the input data and can emit new events.",
"lines": [
"21-50"
]
},
{
"id": "state",
"title": "State Management",
"description": "The handler demonstrates state management by storing order data that can be accessed by other steps.",
"lines": [
"35"
]
},
{
"id": "event-emission",
"title": "Event Emission",
"description": "After processing the order, the handler emits a new event to notify other steps about the new order.",
"lines": [
"37-50"
]
},
{
"id": "logger",
"title": "Logger",
"description": "The logger is a utility that allows you to log messages to the console. It is available in the handler function and automatically ties to the trace id of the request.",
"lines": [
"22"
]
}
]


@@ -1,39 +0,0 @@
from datetime import datetime, timezone
config = {
"type": "cron",
"cron": "0 0 * * 1", # run once every Monday at midnight
"name": "StateAuditJob",
"description": "Checks the state for orders that are not complete and have a ship date in the past",
"emits": ["notification"],
"flows": ["basic-tutorial"],
}
async def handler(context):
state_value = await context.state.get_group("orders_python")
for item in state_value:
# check if current date is after item.ship_date
current_date = datetime.now(timezone.utc)
ship_date = datetime.fromisoformat(item.get("shipDate", "").replace('Z', '+00:00'))
if not item.get("complete", False) and current_date > ship_date:
context.logger.warn("Order is not complete and ship date is past", {
"order_id": item.get("id"),
"ship_date": item.get("shipDate"),
"complete": item.get("complete", False),
})
await context.emit({
"topic": "notification",
"data": {
"email": "test@test.com",
"template_id": "order-audit-warning",
"template_data": {
"order_id": item.get("id"),
"status": item.get("status"),
"ship_date": item.get("shipDate"),
"message": "Order is not complete and ship date is past",
},
},
})


@@ -1,26 +0,0 @@
[
{
"id": "step-configuration",
"title": "Step Configuration",
"description": "All steps should have a defined configuration, this is how you define the step's behavior and how it will be triggered.",
"lines": [
"3-10"
]
},
{
"id": "cron-configuration",
"title": "Cron Configuration",
"description": "Cron steps require a specific configuration structure with the 'type' field set to 'cron' and a valid cron expression.",
"lines": [
"4-5"
]
},
{
"id": "handler",
"title": "Cron Step Handler",
"description": "The Cron step handler only receives one argument.",
"lines": [
"12-39"
]
}
]


@@ -0,0 +1,92 @@
---
type: step
category: event
name: VMH Beteiligte Sync
version: 1.0.0
status: placeholder
tags: [sync, vmh, beteiligte, event, todo]
dependencies: []
emits: []
subscribes: [vmh.beteiligte.create, vmh.beteiligte.update, vmh.beteiligte.delete]
---
# VMH Beteiligte Sync Event Step
## Status
⚠️ **PLACEHOLDER** - implementation still pending
## Purpose
Processes create/update/delete events for Beteiligte entities and synchronizes them between EspoCRM and the target system.
## Config
```python
{
'type': 'event',
'name': 'VMH Beteiligte Sync',
'subscribes': [
'vmh.beteiligte.create',
'vmh.beteiligte.update',
'vmh.beteiligte.delete'
],
'emits': [],
'flows': ['vmh']
}
```
## Planned Behavior
**Input Events**:
```json
{
"topic": "vmh.beteiligte.create",
"data": {
"entity_id": "123",
"action": "create",
"source": "webhook",
"timestamp": "..."
}
}
```
**Processing**:
1. Fetch full entity data from EspoCRM
2. Map to target system format
3. Create/Update/Delete in target system
4. Remove ID from Redis pending set
5. Log success/failure
## Implementation Tasks
- [ ] Build an EspoCRM API client
- [ ] Define the entity mapping
- [ ] Integrate the target system
- [ ] Error handling & retry logic
- [ ] Redis cleanup (remove from pending sets)
- [ ] Logging & monitoring
## Redis Cleanup
After successful processing:
```python
redis.srem('vmh:beteiligte:create_pending', entity_id)
redis.srem('vmh:beteiligte:update_pending', entity_id)
redis.srem('vmh:beteiligte:delete_pending', entity_id)
```
## Testing (Future)
```bash
# Manually emit event for testing
# (via Motia CLI or test script)
```
## AI Guidance
When implementing this step:
1. Create an EspoCRM API client in `services/`
2. Define the mapping logic
3. Implement retry logic with exponential backoff
4. Clean up the Redis sets after processing
5. Log all operations for auditing
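The "retry logic with exponential backoff" task could be sketched as a small async helper. This is a hypothetical illustration, not existing code; the names, attempt count, and delays are placeholders:

```python
import asyncio
import random

async def retry_with_backoff(operation, max_attempts=5, base_delay=0.5):
    """Call an async operation, retrying with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return await operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            # delays grow as base_delay * 2^attempt, with a little jitter
            await asyncio.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# usage sketch: an operation that fails twice before succeeding
calls = {'n': 0}
async def flaky():
    calls['n'] += 1
    if calls['n'] < 3:
        raise RuntimeError('transient')
    return 'ok'

result = asyncio.run(retry_with_backoff(flaky, base_delay=0.01))
```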
## Related
- [webhook/beteiligte_create_api_step.md](webhook/beteiligte_create_api_step.md) - Emits create events
- [webhook/beteiligte_update_api_step.md](webhook/beteiligte_update_api_step.md) - Emits update events
- [webhook/beteiligte_delete_api_step.md](webhook/beteiligte_delete_api_step.md) - Emits delete events


@@ -0,0 +1,124 @@
---
type: step
category: api
name: VMH Webhook Beteiligte Create
version: 1.0.0
status: active
tags: [webhook, espocrm, vmh, beteiligte, create]
dependencies:
- redis
emits: [vmh.beteiligte.create]
---
# VMH Webhook Beteiligte Create Step
## Purpose
Receives create webhooks from EspoCRM for new Beteiligte entities. Deduplicates via Redis and emits events for asynchronous processing.
## Config
```python
{
'type': 'api',
'name': 'VMH Webhook Beteiligte Create',
'path': '/vmh/webhook/beteiligte/create',
'method': 'POST',
'emits': ['vmh.beteiligte.create'],
'flows': ['vmh']
}
```
## Input
```bash
POST /vmh/webhook/beteiligte/create
Content-Type: application/json
[
{
"id": "entity-123",
"name": "Max Mustermann",
"createdAt": "2026-02-07T10:00:00Z"
},
{
"id": "entity-456",
"name": "Maria Schmidt"
}
]
```
**Format**: array of entities (batch support)
## Output
```json
{
"status": "received",
"action": "create",
"new_ids_count": 2,
"total_ids_in_batch": 2
}
```
## Behavior
1. **Extract IDs** from all entities in the batch
2. **Redis Deduplication**:
```python
pending_key = 'vmh:beteiligte:create_pending'
existing_ids = redis.smembers(pending_key)
new_ids = input_ids - existing_ids
redis.sadd(pending_key, *new_ids)
```
3. **Emit events** for new IDs only:
```python
for entity_id in new_ids:
await context.emit({
'topic': 'vmh.beteiligte.create',
'data': {
'entity_id': entity_id,
'action': 'create',
'source': 'webhook',
'timestamp': timestamp
}
})
```
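The deduplication in steps 2-3 boils down to a set difference. A Redis-free sketch, where a plain Python set stands in for the `create_pending` SET:

```python
pending = set()  # stand-in for the Redis SET 'vmh:beteiligte:create_pending'

def dedup(batch_ids, pending):
    """Return only the IDs not yet pending, and mark them as pending."""
    new_ids = set(batch_ids) - pending
    if new_ids:  # real code should guard sadd() the same way: sadd with no members raises
        pending |= new_ids
    return new_ids

first = dedup(['entity-123', 'entity-456'], pending)   # both are new
second = dedup(['entity-123', 'entity-789'], pending)  # entity-123 is deduplicated
```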
## Redis Keys
- `vmh:beteiligte:create_pending` (SET): IDs in the create queue
- No TTL (permanent until processed)
## Deduplication Logic
**Problem**: EspoCRM may send webhooks more than once.
**Solution**: a Redis SET stores all pending IDs.
- New IDs → events are emitted
- Already-present IDs → skipped
## Testing
```bash
# Test webhook
curl -X POST "http://localhost:3000/vmh/webhook/beteiligte/create" \
-H "Content-Type: application/json" \
-d '[{"id": "test-123", "name": "Test"}]'
# Check Redis
redis-cli -n 1 SMEMBERS vmh:beteiligte:create_pending
# Clear Redis (testing)
redis-cli -n 1 DEL vmh:beteiligte:create_pending
```
## Error Handling
- Invalid JSON: 400 error
- Redis unavailable: log it, but do not crash (may lead to duplicates)
- Event emission error: log it and continue
## Monitoring
```
[INFO] VMH Webhook Beteiligte Create empfangen
[INFO] Create Entity ID gefunden: entity-123
[INFO] 2 neue IDs zur Create-Sync-Queue hinzugefügt
[INFO] Create-Event emittiert für ID: entity-123
```
## Related Steps
- [beteiligte_update_api_step.md](beteiligte_update_api_step.md) - Update webhooks
- [beteiligte_delete_api_step.md](beteiligte_delete_api_step.md) - Delete webhooks
- [beteiligte_sync_event_step.md](../beteiligte_sync_event_step.md) - Consumes events

bitbylaw/types.d.ts (vendored)

@@ -16,10 +16,6 @@ declare module 'motia' {
   'VMH Webhook Beteiligte Update': ApiRouteHandler<Record<string, unknown>, unknown, { topic: 'vmh.beteiligte.update'; data: never }>
   'VMH Webhook Beteiligte Delete': ApiRouteHandler<Record<string, unknown>, unknown, { topic: 'vmh.beteiligte.delete'; data: never }>
   'VMH Webhook Beteiligte Create': ApiRouteHandler<Record<string, unknown>, unknown, { topic: 'vmh.beteiligte.create'; data: never }>
-  'StateAuditJob': CronHandler<{ topic: 'notification'; data: { template_id: string; email: string; template_data: Record<string, unknown> } }>
-  'ProcessFoodOrder': EventHandler<{ id: string; email: string; quantity: unknown; pet_id: unknown }, { topic: 'notification'; data: { template_id: string; email: string; template_data: Record<string, unknown> } }>
-  'Notification': EventHandler<{ template_id: string; email: string; template_data: Record<string, unknown> }, never>
-  'ApiTrigger': ApiRouteHandler<{ pet: unknown; foodOrder?: unknown | unknown }, ApiResponse<200, { id: unknown; name: string; photoUrl: string }>, { topic: 'process-food-order'; data: { id: string; email: string; quantity: unknown; pet_id: unknown } }>
   'Advoware Proxy PUT': ApiRouteHandler<Record<string, unknown>, unknown, never>
   'Advoware Proxy POST': ApiRouteHandler<Record<string, unknown>, unknown, never>
   'Advoware Proxy GET': ApiRouteHandler<Record<string, unknown>, unknown, never>

collect_performance_data.sh (new executable file)

@@ -0,0 +1,91 @@
#!/bin/bash
# Script to collect performance data for Motia Calendar Sync degradation analysis
# Run this script to start monitoring, simulate load, and collect logs
echo "Starting performance data collection for Motia Calendar Sync..."
# Create log directory
LOG_DIR="/opt/motia-app/performance_logs_$(date +%Y%m%d_%H%M%S)"
mkdir -p "$LOG_DIR"
echo "Logs will be saved to: $LOG_DIR"
# 1. Start system monitoring (top)
echo "Starting top monitoring..."
MAIN_PID=$(pgrep -f "node.*motia" | head -1)
if [ -z "$MAIN_PID" ]; then
echo "No motia node process found, skipping top monitoring"
else
top -b -d 5 -p $MAIN_PID > "$LOG_DIR/top_log.txt" &
TOP_PID=$!
echo "Top PID: $TOP_PID (monitoring Node process $MAIN_PID)"
fi
# 2. Start Redis monitoring
echo "Starting Redis monitoring..."
redis-cli MONITOR > "$LOG_DIR/redis_log.txt" &
REDIS_PID=$!
echo "Redis monitor PID: $REDIS_PID"
# 3. Start journalctl logging
echo "Starting journalctl logging..."
journalctl -u motia -f > "$LOG_DIR/motia_journal.log" &
JOURNAL_PID=$!
echo "Journalctl PID: $JOURNAL_PID"
# 4. Start Clinic.js profiling (Node.js runtime)
echo "Starting Clinic.js Doctor for Node.js profiling..."
cd /opt/motia-app/bitbylaw
clinic doctor -- node node_modules/.bin/motia start --host 0.0.0.0 > "$LOG_DIR/clinic_output.log" 2>&1 &
CLINIC_PID=$!
echo "Clinic PID: $CLINIC_PID"
cd /opt/motia-app
# 5. Python profiling is enabled in the event step (generates profile.pstats and yappi.stats)
# 6. Wait for cron to run (assuming it runs every 1 minute)
echo "Waiting for cron jobs to run and generate data (60 minutes for long-term degradation)..."
sleep 3600 # Wait 60 minutes for cron cycles
# 7. Take heap snapshots periodically
# Note: kill -USR2 only triggers a snapshot if Node was started with --heapsnapshot-signal=SIGUSR2
echo "Taking heap snapshots every 15 minutes..."
for i in {1..4}; do
  kill -USR2 "$MAIN_PID" 2>/dev/null || echo "Failed to trigger heap snapshot $i"
  sleep 5
  cp *.heapsnapshot "$LOG_DIR/heap_snapshot_$i.heapsnapshot" 2>/dev/null || echo "No heap snapshot $i found"
  if [ $i -lt 4 ]; then
    sleep 900 # 15 minutes
  fi
done
# 8. Stop monitoring tools (guard against PIDs that were never set)
echo "Stopping monitoring tools..."
[ -n "$TOP_PID" ] && kill "$TOP_PID"
kill "$REDIS_PID"
kill "$JOURNAL_PID"
kill "$CLINIC_PID"
# 9. Collect profiling data (if generated)
echo "Collecting profiling data..."
cp /opt/motia-app/profile.pstats /opt/motia-app/yappi.stats "$LOG_DIR/" 2>/dev/null || echo "No profiling data found"
# 10. Generate summary
echo "Generating summary..."
echo "Performance Data Collection Complete" > "$LOG_DIR/summary.txt"
echo "Completed at: $(date)" >> "$LOG_DIR/summary.txt"
echo "Top log size: $(stat -c%s "$LOG_DIR/top_log.txt" 2>/dev/null || echo "N/A") bytes" >> "$LOG_DIR/summary.txt"
echo "Redis log size: $(stat -c%s "$LOG_DIR/redis_log.txt" 2>/dev/null || echo "N/A") bytes" >> "$LOG_DIR/summary.txt"
echo "Journal log size: $(stat -c%s "$LOG_DIR/motia_journal.log" 2>/dev/null || echo "N/A") bytes" >> "$LOG_DIR/summary.txt"
echo "Clinic output size: $(stat -c%s "$LOG_DIR/clinic_output.log" 2>/dev/null || echo "N/A") bytes" >> "$LOG_DIR/summary.txt"
echo "Heap snapshots: $(ls "$LOG_DIR"/heap_snapshot_*.heapsnapshot 2>/dev/null | wc -l)" >> "$LOG_DIR/summary.txt"
echo "Collection complete. Logs saved to $LOG_DIR"
echo "Next steps:"
echo "1. Analyze top_log.txt for CPU/memory trends"
echo "2. Check redis_log.txt for Redis activity"
echo "3. Review motia_journal.log for timings and errors"
echo "4. Open clinic-doctor-*.html for Node.js profiling (Event Loop Delay, etc.)"
echo "5. Use Chrome DevTools to analyze heap snapshots"
echo "6. Run 'snakeviz profile.pstats' for Python CPU profiling"
echo "7. Run 'yappi -f pstat yappi.stats' for Python async profiling"


@@ -0,0 +1,32 @@
# Aider Configuration for Motia Projects
# https://aider.chat/docs/config.html
# Read AGENTS.md for project overview and guide references
read:
- AGENTS.md
# Uncomment specific guides as needed for more context:
# - .cursor/rules/motia/api-steps.mdc
# - .cursor/rules/motia/event-steps.mdc
# - .cursor/rules/motia/cron-steps.mdc
# - .cursor/rules/motia/state-management.mdc
# - .cursor/rules/motia/middlewares.mdc
# - .cursor/rules/motia/realtime-streaming.mdc
# - .cursor/rules/motia/virtual-steps.mdc
# - .cursor/rules/motia/ui-steps.mdc
# - .cursor/architecture/architecture.mdc
# - .cursor/architecture/error-handling.mdc
# Note: AGENTS.md references all detailed guides in .cursor/rules/
# The AI will read those guides when needed based on AGENTS.md instructions
# Auto-commit changes (optional, uncomment to enable)
# auto-commits: true
# Model selection (uncomment your preferred model)
# model: gpt-4
# model: claude-3-5-sonnet-20241022
# Additional context files (optional)
# read:
# - config.yml
# - package.json


@@ -0,0 +1,104 @@
---
name: motia-developer
description: Expert Motia developer. Use PROACTIVELY for all Motia development tasks. References comprehensive cursor rules for patterns and best practices.
tools: Read, Edit, Write, Grep, Bash
model: inherit
---
You are an expert Motia developer with comprehensive knowledge of all Motia patterns.
## CRITICAL: Always Read Cursor Rules First
Before writing ANY Motia code, you MUST read the relevant cursor rules from `.cursor/rules/`:
### Configuration Guide (in `.cursor/rules/motia/`)
1. **`motia-config.mdc`** - Project configuration
- Package.json requirements (`"type": "module"`)
- Plugin naming conventions and setup
- Adapter configuration, Redis setup
- Stream authentication patterns
### Step Type Guides (in `.cursor/rules/motia/`)
2. **`api-steps.mdc`** - HTTP endpoints
- Creating API Steps with TypeScript, JavaScript, or Python
- Request/response schemas, validation, middleware
- When to emit events vs process directly
3. **`event-steps.mdc`** - Background tasks
- Creating Event Steps with TypeScript, JavaScript, or Python
- Topic subscription, event chaining, retry mechanisms
- Asynchronous workflow patterns
4. **`cron-steps.mdc`** - Scheduled tasks
- Creating Cron Steps with TypeScript, JavaScript, or Python
- Cron expression syntax, idempotent patterns
- When to emit events from scheduled jobs
5. **`state-management.mdc`** - State/cache management
- Using state across steps with TypeScript, JavaScript, or Python
- When to use state vs database
- TTL configuration, caching strategies
6. **`middlewares.mdc`** - Request/response middleware
- Creating middleware with TypeScript, JavaScript, or Python
- Authentication, validation, error handling
- Middleware composition patterns
7. **`realtime-streaming.mdc`** - Real-time data
- Server-Sent Events (SSE) patterns
- WebSocket support
- Stream configuration and usage
8. **`virtual-steps.mdc`** - Visual flow connections
- Creating NOOP steps for Workbench
- Virtual emits/subscribes for documentation
- Workflow visualization
9. **`ui-steps.mdc`** - Custom Workbench components
- Creating custom visual components (TypeScript/React)
- EventNode, ApiNode, CronNode components
- Styling with Tailwind
### Architecture Guides (in `.cursor/architecture/`)
10. **`architecture.mdc`** - Project structure
- File organization, naming conventions
- Domain-Driven Design patterns
- Services, repositories, utilities structure
11. **`error-handling.mdc`** - Error handling
- Custom error classes
- Middleware error handling
- ZodError/Pydantic validation errors
## Workflow
1. **Identify the task type** (API, Event, Cron, etc.)
2. **Read the relevant cursor rule(s)** from the list above
3. **Follow the patterns exactly** as shown in the guide
4. **Generate types** after config changes:
```bash
npx motia generate-types
```
## Key Principles
- **All guides have TypeScript, JavaScript, and Python examples**
- **Steps can live in `/src` or `/steps`** - Motia discovers both (use `/src` for modern structure)
- **Always export `config` and `handler`**
- **List all emits in config before using them**
- **Follow naming conventions**: `*.step.ts` (TS), `*.step.js` (JS), `*_step.py` (Python)
- **Use Domain-Driven Design**: Steps → Services → Repositories
## Never Guess
If you're unsure about any Motia pattern:
1. Read the relevant cursor rule from the list above
2. Check existing steps in the project
3. Follow the examples in the guides exactly
---
Remember: The 11 cursor rules in `.cursor/rules/` are your source of truth. Always read them first.


@@ -0,0 +1,192 @@
---
description: How to structure your Motia project
globs:
alwaysApply: true
---
# Architecture Guide
## Overview
This guide covers the architecture and best practices for structuring Motia projects.
**Key Takeaway**: Motia automatically discovers steps from anywhere in your project. Modern projects use `/src` for a familiar structure that works seamlessly with Domain-Driven Design.
## File Structure
Motia automatically discovers step files from your project. You can organize steps in either:
- **`/src` folder** (recommended) - Familiar pattern for most developers
- **`/steps` folder** - Traditional Motia pattern
- Both folders simultaneously
### Recommended Structure (using `/src`)
```
project/
├── src/
│ ├── api/ # API endpoints
│ │ ├── users.step.ts
│ │ └── orders.step.ts
│ ├── events/ # Event handlers
│ │ ├── order-processing.step.ts
│ │ └── notifications.step.ts
│ ├── cron/ # Scheduled tasks
│ │ └── cleanup.step.ts
│ ├── services/ # Business logic
│ ├── repositories/ # Data access
│ └── utils/ # Utilities
└── motia.config.ts
```
### Alternative Structure (using `/steps`)
```
project/
├── steps/
│ ├── api/
│ │ └── users.step.ts
│ ├── events/
│ │ └── order-processing.step.ts
│ └── cron/
│ └── cleanup.step.ts
├── src/
│ ├── services/
│ └── utils/
└── motia.config.ts
```
Create subfolders within your chosen directory to organize related steps into logical groups (domains, features, or flows).
**Why `/src` is recommended:**
- Familiar to developers from other frameworks (Next.js, NestJS, etc.)
- Natural co-location with services, repositories, and utilities
- Works seamlessly with Domain-Driven Design patterns
- Cleaner project root with fewer top-level folders
## Step Naming Conventions
### TypeScript
- Use kebab-case for filenames: `resource-processing.step.ts`
- Include `.step` before language extension
### Python
- Use snake_case for filenames: `data_processor_step.py`
- Include `_step` before language extension
### Global
- Match handler names to config names
- Use descriptive, action-oriented names
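As a sanity check, the conventions above can be expressed as a small filename validator. This is an illustrative helper only, not part of Motia:

```typescript
// Illustrative helper: checks a step filename against the naming conventions above.
// TS/JS: kebab-case ending in .step.ts or .step.js; Python: snake_case ending in _step.py.
const isValidStepFilename = (filename: string): boolean => {
  const tsJs = /^[a-z0-9]+(-[a-z0-9]+)*\.step\.(ts|js)$/
  const py = /^[a-z0-9]+(_[a-z0-9]+)*_step\.py$/
  return tsJs.test(filename) || py.test(filename)
}

console.log(isValidStepFilename('resource-processing.step.ts')) // true
console.log(isValidStepFilename('data_processor_step.py'))      // true
console.log(isValidStepFilename('ResourceProcessing.step.ts'))  // false (not kebab-case)
```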
## Code Style Guidelines
- **JavaScript**: Use modern ES6+ features, async/await, proper error handling
- **TypeScript**: Make sure you use the correct `Handlers` type that is auto-generated in the `types.d.ts` file.
## Defining Middlewares
Middleware is a powerful feature in Motia for common validation, error handling, and shared logic.
### Middleware Organization
Store middlewares in a dedicated folder:
- `/middlewares` at project root (recommended)
- `/src/middlewares` if using `/src` structure
### Best Practices
- **One responsibility per middleware** - Follow SOLID principles
- **Descriptive naming** - Use names like `auth.middleware.ts`, `validation.middleware.ts`
- **Handle errors gracefully** - Use core middleware for ZodError (see [Error Handling Guide](./error-handling.mdc))
- **Avoid infrastructure concerns** - Rate limiting and CORS are handled by infrastructure, not middleware
## Domain Driven Design
Motia encourages Domain-Driven Design (DDD) principles for maintainable, scalable applications.
### Folder Structure for DDD
When using `/src` for steps (recommended), your structure naturally supports DDD:
```
src/
├── api/ # API Steps (Controllers)
├── events/ # Event Steps (Controllers)
├── cron/ # Cron Steps (Controllers)
├── services/ # Business logic layer
├── repositories/ # Data access layer
├── utils/ # Utility functions
└── types/ # Shared types (optional)
```
### Layer Responsibilities
- **Steps (Controller Layer)**: Handle validation, call services, emit events
- **Services**: Contain business logic, orchestrate repositories
- **Repositories**: Direct data access (database, external APIs)
- **Utils**: Pure utility functions with no side effects
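A minimal sketch of these layers in plain TypeScript (the names `userRepository` and `userService` are illustrative; in a real project the handler would be exported and typed with Motia's generated `Handlers`):

```typescript
// Repository layer: direct data access (stubbed here for illustration).
const userRepository = {
  findById: async (id: string) => ({ id, email: `${id}@Example.com` }),
}

// Service layer: business logic, orchestrates repositories.
const userService = {
  getContactEmail: async (id: string) => {
    const user = await userRepository.findById(id)
    return user.email.toLowerCase() // normalization is the "business rule" here
  },
}

// Step handler (controller layer): validates input, delegates to the service.
const handler = async (req: { pathParams: Record<string, string> }) => {
  const email = await userService.getContactEmail(req.pathParams.id)
  return { status: 200, body: { email } }
}
```

Note the step never touches the repository's raw shape; it only consumes what the service exposes.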
### Best Practices
- Models and DTOs are not necessary - use Zod schemas from step configs
- Steps should focus on validation and calling services
- Avoid service methods that only call repositories - Steps can access repositories directly
- Keep business logic in services, not in steps
### Services
Services contain your business logic and should be organized by domain.
**Structure:**
```
src/
├── services/
│ ├── auth/
│ │ ├── index.ts # Export service
│ │ ├── login.ts # Login method
│ │ └── register.ts # Register method
│ └── orders/
│ ├── index.ts
│ └── create-order.ts
└── api/
└── auth.step.ts # Uses authService
```
**Service Definition (`/src/services/auth/index.ts`):**
```typescript
/**
* Business logic methods imported from separate files
*/
import { login } from './login'
import { register } from './register'
/**
* Export service with methods as properties
*/
export const authService = {
login,
register
}
```
**Usage in Step (`/src/api/auth.step.ts`):**
```typescript
import { authService } from '../services/auth'
export const handler = async (req, ctx) => {
const user = await authService.login(req.body.email, req.body.password)
return { status: 200, body: { user } }
}
```
## Logging and observability
- Make sure to use the Logger from Motia (from context object) to log messages.
- Make sure to have visibility of what is going on in a request
- Before throwing errors, make sure to log the issue first. If it is a validation blocker, log with `logger.warn`; if it is something that is not supposed to happen, log with `logger.error`.
- Make sure to add context to the logs to help identify any potential issues.


@@ -0,0 +1,122 @@
---
description: How to handle errors in a Motia project
globs:
alwaysApply: true
---
# Error Handling Guide
Errors happen, but we need to handle them gracefully. Make sure you create a custom error class for your project under the `/src/errors/` folder.
## Good practices
- Use the custom error classes to return errors to the client.
- Anything that is not an instance of the custom error class should be logged with `logger.error`, and the root cause should not be exposed to the client.
## Create a custom Error class
Name: `/src/errors/base.error.ts`
```typescript
export class BaseError extends Error {
public readonly status: number
public readonly code: string
public readonly metadata: Record<string, any>
constructor(
message: string,
status: number = 500,
code: string = 'INTERNAL_SERVER_ERROR',
metadata: Record<string, any> = {}
) {
super(message)
this.name = this.constructor.name
this.status = status
this.code = code
this.metadata = metadata
// Maintains proper stack trace for where our error was thrown
Error.captureStackTrace(this, this.constructor)
}
toJSON() {
return {
error: {
name: this.name,
message: this.message,
code: this.code,
status: this.status,
...(Object.keys(this.metadata).length > 0 && { metadata: this.metadata }),
},
}
}
}
```
Then create subclasses for the specific errors that are commonly thrown in your project.
Name: `/src/errors/not-found.error.ts`
```typescript
import { BaseError } from './base.error'
export class NotFoundError extends BaseError {
constructor(message: string = 'Not Found', metadata: Record<string, any> = {}) {
super(message, 404, 'NOT_FOUND', metadata)
}
}
```
## Core Middleware
Make sure you create a core middleware that will be added to ALL API Steps.
File: `/src/middlewares/core.middleware.ts`
```typescript
import { ApiMiddleware } from 'motia'
import { ZodError } from 'zod'
import { BaseError } from '../errors/base.error'
export const coreMiddleware: ApiMiddleware = async (req, ctx, next) => {
const logger = ctx.logger
try {
return await next()
} catch (error: any) {
if (error instanceof ZodError) {
logger.error('Validation error', {
error,
stack: error.stack,
errors: error.errors,
})
return {
status: 400,
body: {
error: 'Invalid request body',
data: error.errors,
},
}
} else if (error instanceof BaseError) {
logger.error('BaseError', {
status: error.status,
code: error.code,
metadata: error.metadata,
name: error.name,
message: error.message,
})
return { status: error.status, body: error.toJSON() }
}
logger.error('Error while performing request', {
error,
body: req.body,
stack: error.stack,
})
return { status: 500, body: { error: 'Internal Server Error' } }
}
}
```
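The catch-block mapping can be sanity-checked outside Motia with plain objects. The sketch below reimplements a trimmed-down `BaseError` and mirrors the middleware's non-Zod branches; the stub classes are for illustration only:

```typescript
// Trimmed-down version of the BaseError above, enough to exercise the mapping.
class BaseError extends Error {
  status: number
  code: string
  constructor(message: string, status = 500, code = 'INTERNAL_SERVER_ERROR') {
    super(message)
    this.name = this.constructor.name
    this.status = status
    this.code = code
  }
  toJSON() {
    return { error: { name: this.name, message: this.message, code: this.code, status: this.status } }
  }
}

class NotFoundError extends BaseError {
  constructor(message = 'Not Found') {
    super(message, 404, 'NOT_FOUND')
  }
}

// Mirrors the middleware's catch logic: known errors keep their status and
// JSON shape, anything else collapses to an opaque 500.
const toResponse = (error: unknown) => {
  if (error instanceof BaseError) return { status: error.status, body: error.toJSON() }
  return { status: 500, body: { error: 'Internal Server Error' } }
}

console.log(toResponse(new NotFoundError('User not found')).status) // 404
console.log(toResponse(new Error('db exploded')).status)            // 500
```

This is why root causes stay server-side: unknown errors never reach the client body.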


@@ -0,0 +1,34 @@
---
description: Rules for the project
globs:
alwaysApply: true
---
## Real time events
Make sure to use [real time events guide](./rules/motia/realtime-streaming.mdc) to create a new real time event.
## State/Cache management
Make sure to use [state management guide](./rules/motia/state-management.mdc) to create a new state management.
## Creating HTTP Endpoints
Make sure to use [API steps guide](./rules/motia/api-steps.mdc) to create new HTTP endpoints.
## Background Tasks
Make sure to use [event steps guide](./rules/motia/event-steps.mdc) to create new background tasks.
## Scheduled Tasks
Make sure to use [cron steps guide](./rules/motia/cron-steps.mdc) to create new scheduled tasks.
## Virtual Steps & Flow Visualization
Make sure to use [virtual steps guide](./rules/motia/virtual-steps.mdc) when connecting nodes virtually and creating smooth flows in Workbench.
## Authentication
If you ever need to add authentication, use middleware to authenticate the request.
Make sure to use [middlewares](./rules/motia/middlewares.mdc) to validate the requests.


@@ -0,0 +1,441 @@
---
description: How to create HTTP endpoints in Motia
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py
alwaysApply: false
---
# API Steps Guide
API Steps expose HTTP endpoints that can trigger workflows and emit events.
## Creating API Steps
Steps need to be created in the `steps` folder; they can live in subfolders.
- Steps in TS and JS should end with `.step.ts` and `.step.js` respectively.
- Steps in Python should end with `_step.py`.
## Definition
Defining an API Step is done by two elements. Configuration and Handler.
### Schema Definition
- **TypeScript/JavaScript**: Motia uses Zod schemas for automatic validation of request/response data
- **Python**: Motia uses JSON Schema format. You can optionally use Pydantic models to generate JSON Schemas and handle manual validation in your handlers
### Configuration
**TypeScript/JavaScript**: You need to export a config constant via `export const config` that is a `ApiRouteConfig` type.
**Python**: You need to define a `config` dictionary with the same properties as the TypeScript `ApiRouteConfig`.
```typescript
export type ZodInput = ZodObject<any> | ZodArray<any>
export type StepSchemaInput = ZodInput | JsonSchema
export type Emit = string | {
/**
* The topic name to emit to.
*/
topic: string;
/**
* Optional label for the emission, could be used for documentation or UI.
*/
label?: string;
/**
* This is purely for documentation purposes,
* it doesn't affect the execution of the step.
*
* In Workbench, it will render differently based on this value.
*/
conditional?: boolean;
}
export interface QueryParam {
/**
* The name of the query parameter
*/
name: string
/**
* The description of the query parameter
*/
description: string
}
export interface ApiRouteConfig {
/**
* Should always be api
*/
type: 'api'
/**
* A unique name for this API step, used internally and for linking handlers.
*/
name: string
/**
* Optional human-readable description.
*/
description?: string
/**
* The URL path for this API endpoint (e.g., '/users/:id').
*/
path: string
/**
* The HTTP method for this route.
* POST, GET, PUT, DELETE, PATCH, OPTIONS, HEAD
*/
method: ApiRouteMethod
/**
* Topics this API step can emit events to.
* Important note: All emits in the handler need to be listed here.
*/
emits: Emit[]
/**
* Optional: Topics that are virtually emitted, perhaps for documentation or lineage,
* but not strictly required for execution.
*
* In Motia Workbench, they will show up as gray connections to other steps.
*/
virtualEmits?: Emit[]
/**
* Optional: Virtually subscribed topics.
*
* Used by API steps when we want to chain different HTTP requests
* that could happen sequentially
*/
virtualSubscribes?: string[]
/**
* Flows are used to group multiple steps to be visible in diagrams in Workbench
*/
flows?: string[]
/**
* List of middlewares that will be executed BEFORE the handler is called
*/
middleware?: ApiMiddleware<any, any, any>[]
/**
* Schema for the request body. Accepts either:
* - Zod schema (ZodObject or ZodArray)
* - JSON Schema object
*
* Note: This is not validated automatically, you need to validate it in the handler.
*/
bodySchema?: StepSchemaInput
/**
* Schema for response bodies. Accepts either:
* - Zod schema (ZodObject or ZodArray)
* - JSON Schema object
*
* The key (number) is the HTTP status code this endpoint can return and
* for each HTTP Status Code, you need to define a schema that defines the response body
*/
responseSchema?: Record<number, StepSchemaInput>
/**
* Mostly for documentation purposes, it will show up in Endpoints section in Workbench
*/
queryParams?: QueryParam[]
/**
* Files to include in the step bundle.
* Needs to be relative to the step file.
*/
includeFiles?: string[]
}
```
### Handler
The handler is a function that is exported via `export const handler` that is a `ApiRouteHandler` type.
#### Type Definition from Motia
```typescript
export interface ApiRequest<TBody = unknown> {
/**
* Key-value pairs of path parameters (e.g., from '/users/:id').
*/
pathParams: Record<string, string>
/**
* Key-value pairs of query string parameters. Values can be string or array of strings.
*/
queryParams: Record<string, string | string[]>
/**
* The parsed request body (typically an object if JSON, but can vary).
*/
body: TBody
/**
* Key-value pairs of request headers. Values can be string or array of strings.
*/
headers: Record<string, string | string[]>
}
export type ApiResponse<
TStatus extends number = number,
TBody = string | Buffer | Record<string, unknown>
> = {
status: TStatus
headers?: Record<string, string>
body: TBody
}
export type ApiRouteHandler<
/**
* The type defined by config['bodySchema']
*/
TRequestBody = unknown,
/**
* The type defined by config['responseSchema']
*/
TResponseBody extends ApiResponse<number, unknown> = ApiResponse<number, unknown>,
/**
* The type defined by config['emits'] which is dynamic depending
* on the topic handlers (Event Steps)
*/
TEmitData = never,
> = (req: ApiRequest<TRequestBody>, ctx: FlowContext<TEmitData>) => Promise<TResponseBody>
```
### Handler definition
**TypeScript/JavaScript:**
```typescript
export const handler: Handlers['CreateResource'] = async (req, { emit, logger, state, streams }) => {
// Implementation
}
```
**Python:**
```python
async def handler(req, context):
# req: dictionary containing pathParams, queryParams, body, headers
# context: object containing emit, logger, state, streams, trace_id
pass
```
### Examples
#### TypeScript Example
```typescript
import { ApiRouteConfig, Handlers } from 'motia';
import { z, ZodError } from 'zod';
const bodySchema = z.object({
title: z.string().min(1, "Title cannot be empty"),
description: z.string().optional(),
category: z.string().min(1, "Category is required"),
metadata: z.record(z.string(), z.any()).optional()
})
export const config: ApiRouteConfig = {
name: 'CreateResource',
type: 'api',
path: '/resources',
method: 'POST',
emits: ['send-email'],
flows: ['resource-management'],
bodySchema,
responseSchema: {
201: z.object({
id: z.string(),
title: z.string(),
category: z.string()
}),
400: z.object({ error: z.string() })
}
};
export const handler: Handlers['CreateResource'] = async (req, { emit, logger }) => {
try {
const { title, description, category, metadata } = bodySchema.parse(req.body);
// Use the logger for structured logging. It's good practice to log key events or data.
logger.info('Attempting to create resource', { title, category });
/**
* Create files to manage service calls.
*
* Let's try to use Domain Driven Design to create files to manage service calls.
* Steps are the entry points, they're the Controllers on the DDD architecture.
*/
const result = await service.createResource({ title, description, category, metadata });
/**
* This is how we emit events to trigger Event Steps.
* Only use emits if the task can take a while to complete.
*
* Examples of long tasks are:
* - LLM Calls
* - Processing big files, like images, videos, audio, etc.
* - Sending emails
*
* Other applicable examples are tasks that are likely to fail, examples:
* - Webhook call to external systems
*
* API Calls that are okay to fail gracefully can be done without emits.
*/
await emit({
/**
* 'topic' must be one of the topics listed in config['emits'].
* do not emit to topics that are not defined in Steps
*/
topic: 'send-email',
/**
* 'data' is the payload of the event message.
* make sure the data used is compliant with the Event Step input schema
*/
data: {
/**
 * The data to send to the Event Step.
 */
resource: result
}
});
logger.info('Resource created successfully', { resourceId: result.id, title, category });
// Return a response object for the HTTP request.
return {
status: 201, // CREATED (specified in config['responseSchema'])
/**
* 'body' is the JSON response body sent back to the client.
*/
body: {
id: result.id,
title: result.title,
category: result.category,
status: 'active'
}
};
} catch (error) {
/**
* For one single step project, it is fine to
* handle ZodErrors here, on multiple steps projects,
* it is highly recommended to handle them as a middleware
* (defined in config['middleware'])
*/
if (error instanceof ZodError) {
logger.error('Resource creation failed', { error: error.message });
return {
status: 400,
body: { error: 'Validation failed' }
};
}
logger.error('Resource creation failed', { error: error.message });
return {
status: 500,
body: { error: 'Creation failed' }
};
}
};
```
#### Python Example
```python
from pydantic import BaseModel, Field, ValidationError
from typing import Optional, Dict, Any
class ResourceData(BaseModel):
title: str = Field(..., min_length=1, description="Title cannot be empty")
description: Optional[str] = None
category: str = Field(..., min_length=1, description="Category is required")
metadata: Optional[Dict[str, Any]] = None
class ResourceResponse(BaseModel):
id: str
title: str
category: str
status: str
class ErrorResponse(BaseModel):
error: str
config = {
"name": "CreateResource",
"type": "api",
"path": "/resources",
"method": "POST",
"emits": ["send-email"],
"flows": ["resource-management"],
"bodySchema": ResourceData.model_json_schema(),
"responseSchema": {
201: ResourceResponse.model_json_schema(),
400: ErrorResponse.model_json_schema()
}
}
async def handler(req, context):
try:
body = req.get("body", {})
# Optional: Validate input manually using Pydantic (Motia doesn't do this automatically)
resource_data = ResourceData(**body)
context.logger.info("Attempting to create resource", {
"title": resource_data.title,
"category": resource_data.category
})
# Process the resource creation
result = await service.create_resource({
"title": resource_data.title,
"description": resource_data.description,
"category": resource_data.category,
"metadata": resource_data.metadata
})
# Emit event to trigger other steps
await context.emit({
"topic": "send-email",
"data": {
"resource": result,
"user_id": "example-user"
}
})
context.logger.info("Resource created successfully", {
"resource_id": result.get("id"),
"title": result.get("title"),
"category": result.get("category")
})
return {
"status": 201,
"body": {
"id": result.get("id"),
"title": result.get("title"),
"category": result.get("category"),
"status": "active"
}
}
except ValidationError as e:
context.logger.error("Resource creation failed - Pydantic validation error", {"error": str(e)})
return {
"status": 400,
"body": {"error": "Validation failed"}
}
except Exception as e:
context.logger.error("Resource creation failed", {"error": str(e)})
return {
"status": 500,
"body": {"error": "Creation failed"}
}
```


@@ -0,0 +1,171 @@
---
description: Cron Steps are scheduled tasks that run based on cron expressions.
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py
alwaysApply: false
---
# Cron Steps Guide
Cron Steps enable scheduled task execution using cron expressions.
They're typically used for recurring jobs like nightly reports, data synchronization, etc.
Cron Steps can hold logic, but they do NOT have any retry mechanism. If the logic
is likely to fail, it is recommended that the Cron Step emit an event to a topic that
ultimately triggers an Event Step, which then handles the logic.
## Creating Cron Steps
Steps need to be created in the `steps` folder; they can live in subfolders.
- Steps in TS and JS should end with `.step.ts` and `.step.js` respectively
- Steps in Python should end with `_step.py`
## Definition
Defining a CRON Step is done by two elements. Configuration and Handler.
### Configuration
**TypeScript/JavaScript**: You need to export a config constant via `export const config` that is a `CronConfig` type.
**Python**: You need to define a `config` dictionary with the same properties as the TypeScript `CronConfig`.
```typescript
export type Emit = string | {
/**
* The topic name to emit to.
*/
topic: string;
/**
* Optional label for the emission, could be used for documentation or UI.
*/
label?: string;
/**
* This is purely for documentation purposes,
* it doesn't affect the execution of the step.
*
* In Workbench, it will render differently based on this value.
*/
conditional?: boolean;
}
export type CronConfig = {
/**
* Should always be cron
*/
type: 'cron'
/**
* A unique name for this cron step, used internally and for linking handlers.
*/
name: string
/**
* Optional human-readable description.
*/
description?: string
/**
* The cron expression for scheduling.
*/
cron: string
/**
* Optional: Topics that are virtually emitted, perhaps for documentation or lineage, but not strictly required for execution.
*/
virtualEmits?: Emit[]
/**
* Topics this cron step can emit events to.
*/
emits: Emit[]
/**
* Optional: An array of flow names this step belongs to.
*/
flows?: string[]
/**
* Files to include in the step bundle.
* Needs to be relative to the step file.
*/
includeFiles?: string[]
}
```
### Handler
The handler is a function that is exported via `export const handler` that is a `CronHandler` type.
**TypeScript/JavaScript:**
```typescript
/**
* CRON handler accepts only one argument, the FlowContext.
*
* The FlowContext is based on the Handlers which can vary depending on the config['emits'].
*/
export const handler: Handlers['CronJobEvery5Minutes'] = async ({ logger, emit, traceId, state, streams }) => {
logger.info('CRON Job Every 5 Minutes started')
}
```
**Python:**
```python
async def handler(context):
# context: object containing emit, logger, state, streams, trace_id
context.logger.info("CRON Job Every 5 Minutes started")
```
### Examples of Cron expressions
- `0 0 * * *`: Runs daily at midnight
- `*/5 * * * *`: Runs every 5 minutes
- `0 9 * * *`: Runs daily at 9 AM
- `0 9 * * 1`: Runs every Monday at 9 AM
- `0 9 * * 1-5`: Runs every Monday to Friday at 9 AM
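For intuition, here is a simplified matcher for the expressions above. It is illustrative only: it supports `*`, `*/n`, ranges, and exact values, and ignores real cron's day-of-month/day-of-week OR semantics:

```typescript
// Does one cron field expression match a numeric value?
const fieldMatches = (expr: string, value: number): boolean => {
  if (expr === '*') return true
  if (expr.startsWith('*/')) return value % Number(expr.slice(2)) === 0 // step values
  if (expr.includes('-')) {
    const [lo, hi] = expr.split('-').map(Number) // inclusive range
    return value >= lo && value <= hi
  }
  return Number(expr) === value // exact value
}

// Does a Date match a 5-field cron expression (minute hour dom month dow)?
const cronMatches = (cron: string, date: Date): boolean => {
  const [min, hour, dom, month, dow] = cron.split(' ')
  return (
    fieldMatches(min, date.getMinutes()) &&
    fieldMatches(hour, date.getHours()) &&
    fieldMatches(dom, date.getDate()) &&
    fieldMatches(month, date.getMonth() + 1) && // cron months are 1-12
    fieldMatches(dow, date.getDay())            // 0 = Sunday
  )
}

// '0 9 * * 1-5' → weekdays at 09:00
console.log(cronMatches('0 9 * * 1-5', new Date('2024-01-15T09:00:00'))) // Monday → true
console.log(cronMatches('0 9 * * 1-5', new Date('2024-01-13T09:00:00'))) // Saturday → false
```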
### Example uses of Cron Steps
- Sending email notifications on a regular basis
- Cleaning up old records
- Purging old data
- Generating reports
- Sending out scheduled notifications
- Collecting metrics from third-party services
- Reconciling data from different sources
## Examples
### TypeScript Example
```typescript
export const config: CronConfig = {
name: 'CronJobEvery5Minutes', // should always be the same as Handlers['__']
type: 'cron',
cron: '*/5 * * * *',
emits: [], // No emits in this example
flows: ['example-flow']
};
export const handler: Handlers['CronJobEvery5Minutes'] = async ({ logger }) => {
logger.info('Cron job started')
}
```
### Python Example
```python
config = {
"name": "CronJobEvery5Minutes",
"type": "cron",
"cron": "*/5 * * * *", # Run every 5 minutes
"emits": [], # No emits in this example
"flows": ["example-flow"]
}
async def handler(context):
context.logger.info("Cron job started")
```


@@ -0,0 +1,234 @@
---
description: How to create background tasks in Motia
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py
alwaysApply: false
---
# Event Steps Guide
Event Steps are used to handle asynchronous events. These steps cannot be
invoked by a client or user. In order to ultimately trigger an Event Step,
you need to connect it to an API Step or a CRON Step.
Examples of event steps are:
- LLM Calls
- Processing big files, like images, videos, audio, etc.
- Sending emails
Other applicable examples are tasks that are likely to fail, examples:
- Webhook call to external systems
## Creating Event Steps
Steps need to be created in the `steps` folder; they can live in subfolders.
- Steps in TS and JS should end with `.step.ts` and `.step.js` respectively.
- Steps in Python should end with `_step.py`.
## Definition
Defining an Event Step is done by two elements. Configuration and Handler.
### Schema Definition
- **TypeScript/JavaScript**: Motia uses Zod schemas for automatic validation of input data
- **Python**: Motia uses JSON Schema format. You can optionally use Pydantic models to generate JSON Schemas and handle manual validation in your handlers
### Configuration
**TypeScript/JavaScript**: You need to export a config constant via `export const config` that is a `EventConfig` type.
**Python**: You need to define a `config` dictionary with the same properties as the TypeScript `EventConfig`.
```typescript
export type ZodInput = ZodObject<any> | ZodArray<any>
export type StepSchemaInput = ZodInput | JsonSchema
export type Emit = string | {
/**
* The topic name to emit to.
*/
topic: string;
/**
* Optional label for the emission, could be used for documentation or UI.
*/
label?: string;
/**
* This is purely for documentation purposes,
* it doesn't affect the execution of the step.
*
* In Workbench, it will render differently based on this value.
*/
conditional?: boolean;
}
export type EventConfig = {
/**
* Should always be event
*/
type: 'event'
/**
* A unique name for this event step, used internally and for linking handlers.
*/
name: string
/**
* Optional human-readable description.
*/
description?: string
/**
* An array of topic names this step listens to.
*/
subscribes: string[]
/**
* An array of topics this step can emit events to.
*/
emits: Emit[]
/**
* Optional: Topics that are virtually emitted, perhaps for documentation or lineage, but not strictly required for execution.
*/
virtualEmits?: Emit[]
/**
* Optional: Virtually subscribed topics for documentation/lineage purposes.
*/
virtualSubscribes?: string[]
/**
* Schema for input data. Accepts either:
* - Zod schema (ZodObject or ZodArray)
* - JSON Schema object
*
* This is used by Motia to create the correct types for whoever emits the event
* to this step.
*
* Avoid adding too much data to the input schema, only add the data that
* is necessary for the Event Step to process. If the data is too big it's
* recommended to store it in the state and fetch it from the state on the
* Event Step handler.
*/
input?: StepSchemaInput
/**
* Optional: An array of flow names this step belongs to.
*/
flows?: string[]
/**
* Files to include in the step bundle.
* Needs to be relative to the step file.
*/
includeFiles?: string[]
/**
* Optional: Infrastructure configuration for handler and queue settings.
*/
infrastructure?: Partial<InfrastructureConfig>
}
```
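The `input` doc comment above recommends emitting only a small reference and keeping large payloads in state. The pattern can be sketched with a dict-based stand-in for Motia's state manager (the `state_set`/`state_get` helpers and names here are hypothetical, not the real API surface):

```python
# Sketch: emit only a small id, keep the large payload in state,
# and fetch it back inside the event step handler.
state: dict[tuple, object] = {}  # stand-in for Motia's state manager

def state_set(group_id, key, value):
    state[(group_id, key)] = value
    return value

def state_get(group_id, key):
    return state.get((group_id, key))

# Producer: store the big payload, emit only its key
big_payload = {"rows": list(range(10_000))}
state_set("imports", "batch-42", big_payload)
event_input = {"batch_id": "batch-42"}  # small input travels with the event

# Event step handler: fetch the payload back from state
def handler(input_data):
    payload = state_get("imports", input_data["batch_id"])
    return len(payload["rows"])

print(handler(event_input))  # 10000
```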
### Handler
The handler is a function exported via `export const handler`, typed as an `EventHandler`.
### Handler definition
**TypeScript/JavaScript:**
```typescript
/**
* Input is inferred from the Event Step config['input']
* Context is the FlowContext
*/
export const handler: Handlers['SendEmail'] = async (input, { emit, logger, state, streams }) => {
// Implementation
}
```
**Python:**
```python
async def handler(input_data, context):
# input_data: dictionary with the event data (matches the input schema)
# context: object containing emit, logger, state, streams, trace_id
pass
```
### Examples
#### TypeScript Example
```typescript
import { EventConfig, Handlers } from 'motia';
import { z } from 'zod';
const inputSchema = z.object({
email: z.string(),
templateId: z.string(),
templateData: z.record(z.string(), z.any()),
})
export const config: EventConfig = {
name: 'SendEmail',
type: 'event',
description: 'Sends email notification to the user',
subscribes: ['send-email'],
emits: [],
input: inputSchema,
flows: ['resource-management']
};
export const handler: Handlers['SendEmail'] = async (input, { emit, logger }) => {
const { email, templateId, templateData } = input;
  // Process email sending logic here (emailService is assumed to be defined elsewhere)
await emailService.send({
to: email,
templateId,
data: templateData
});
logger.info('Email sent successfully', { email, templateId });
};
```
#### Python Example
```python
from pydantic import BaseModel
from typing import Dict, Any
class EmailData(BaseModel):
email: str
template_id: str
template_data: Dict[str, Any]
config = {
"name": "SendEmail",
"type": "event",
"description": "Sends email notification to the user",
"subscribes": ["send-email"],
"emits": [],
"input": EmailData.model_json_schema(),
"flows": ["resource-management"]
}
async def handler(input_data, context):
# Optional: Validate input manually using Pydantic (Motia doesn't do this automatically)
email_data = EmailData(**input_data)
    # Process email sending logic here (email_service is assumed to be defined elsewhere)
await email_service.send({
"to": email_data.email,
"template_id": email_data.template_id,
"data": email_data.template_data
})
context.logger.info("Email sent successfully", {
"email": email_data.email,
"template_id": email_data.template_id
})
```

---
description: Middlewares are used to execute code before and after the handler is called
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py,middlewares/**/*.middleware.ts,middlewares/**/*.middleware.js,middlewares/**/*_middleware.py
alwaysApply: false
---
# Middlewares Guide
Middlewares are used to execute code before and after the handler is called in API Steps.
The middleware is a handler that receives three arguments:
- **Request**: the same request object received by API Step handlers. If a middleware modifies it, the modified object is passed to the handler and to any subsequent middleware, so be careful not to cause side effects on the request object.
- **Context**: the same context object received by API Step handlers. As with the request, modifications are visible to the handler and to any subsequent middleware, so be careful not to cause side effects on the context object.
- **Next**: a function you must call to invoke the next middleware in the stack. If you don't call it, the request is halted and the handler and any remaining middlewares will not run.
## Next function
The next function lets you either continue the execution flow or stop it. For example, in an authentication middleware, if the user is not authenticated, you can return a 401 response without calling `next()`.
It can also be used to enrich the HTTP response, for example by adding a header to the result after calling `next()`.
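How the stack composes can be sketched with a hypothetical chain runner (illustration only, not Motia internals): each middleware receives a `next_fn` that invokes the rest of the chain, so skipping the call halts the request.

```python
import asyncio

async def run(middlewares, handler, req, ctx):
    """Run the middleware chain in order, ending at the handler."""
    async def dispatch(i):
        if i == len(middlewares):
            return await handler(req, ctx)
        return await middlewares[i](req, ctx, lambda: dispatch(i + 1))
    return await dispatch(0)

async def auth(req, ctx, next_fn):
    if not req.get("token"):
        # Halt: next_fn is never called, so the handler does not run
        return {"status": 401, "body": {"error": "Unauthorized"}}
    return await next_fn()

async def add_header(req, ctx, next_fn):
    # Enrich the response after the rest of the chain has run
    response = await next_fn()
    response.setdefault("headers", {})["X-Custom-Header"] = "Custom Value"
    return response

async def handler(req, ctx):
    return {"status": 200, "body": {"ok": True}}

print(asyncio.run(run([auth, add_header], handler, {"token": "t"}, {})))  # 200 + header
print(asyncio.run(run([auth, add_header], handler, {}, {})))              # 401, no header
```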
## Adding middlewares to a step
### TypeScript/JavaScript Example
```typescript
import { ApiRouteConfig } from 'motia'
import { coreMiddleware } from '../middlewares/core.middleware'
export const config: ApiRouteConfig = {
type: 'api',
name: 'SampleRoute',
description: 'Sample route',
path: '/sample',
method: 'GET',
emits: [],
flows: [],
middleware: [coreMiddleware],
}
```
### Python Example
```python
async def enrich_data_middleware(req, context, next_fn):
context.logger.info("enriching data")
req["enriched"] = "yes"
return await next_fn()
config = {
"type": "api",
"name": "SampleRoute",
"description": "Sample route",
"path": "/sample",
"method": "GET",
"emits": [],
"flows": [],
"middleware": [enrich_data_middleware],
}
```
## Middleware examples
### Handling errors
#### TypeScript/JavaScript
```typescript
import { ApiMiddleware } from 'motia'
export const coreMiddleware: ApiMiddleware = async (req, ctx, next) => {
const { logger } = ctx
try {
/**
* Calling next() will invoke the next item in the stack.
*
* It will depend on the order of middlewares configured in the step,
* first one in the list is called first and so on.
*/
return await next()
} catch (error: any) {
logger.error('Error while performing request', {
error,
body: req.body, // make sure you don't include sensitive data in the logs
stack: error.stack,
})
return {
status: 500,
body: { error: 'Internal Server Error' },
}
}
}
```
#### Python
```python
async def error_handling_middleware(req, context, next_fn):
try:
# Calling next_fn() will invoke the next item in the stack.
# It will depend on the order of middlewares configured in the step,
# first one in the list is called first and so on.
return await next_fn()
except Exception as error:
context.logger.error('Error while performing request', {
'error': str(error),
'body': req.get('body'), # make sure you don't include sensitive data in the logs
})
return {
'status': 500,
'body': {'error': 'Internal Server Error'},
}
```
### Enriching response
#### TypeScript/JavaScript
```typescript
export const enrichResponseMiddleware: ApiMiddleware = async (req, ctx, next) => {
const response = await next()
response.headers['X-Custom-Header'] = 'Custom Value'
return response
}
```
#### Python
```python
async def enrich_response_middleware(req, context, next_fn):
response = await next_fn()
if not response.get('headers'):
response['headers'] = {}
response['headers']['X-Custom-Header'] = 'Custom Value'
return response
```
### Handling validation errors
#### TypeScript/JavaScript - Handling Zod Validation errors
```typescript
import { ApiMiddleware } from 'motia'
import { ZodError } from 'zod'
export const coreMiddleware: ApiMiddleware = async (req, ctx, next) => {
const logger = ctx.logger
try {
return await next()
} catch (error: any) {
if (error instanceof ZodError) {
logger.error('Validation error', {
error,
stack: error.stack,
errors: error.errors,
})
return {
status: 400,
body: {
error: 'Invalid request body',
data: error.errors,
},
}
}
logger.error('Error while performing request', {
error,
body: req.body, // make sure you don't include sensitive data in the logs
stack: error.stack,
})
return { status: 500, body: { error: 'Internal Server Error' } }
}
}
```
#### Python - Handling Pydantic Validation errors
```python
from pydantic import ValidationError
async def validation_middleware(req, context, next_fn):
try:
return await next_fn()
except ValidationError as error:
context.logger.error('Validation error', {
'error': str(error),
'errors': error.errors(),
})
return {
'status': 400,
'body': {
'error': 'Invalid request body',
'data': error.errors(),
},
}
except Exception as error:
context.logger.error('Error while performing request', {
'error': str(error),
'body': req.get('body'), # make sure you don't include sensitive data in the logs
})
return {
'status': 500,
'body': {'error': 'Internal Server Error'}
}
```

---
description: Application configuration for Motia projects
globs: motia.config.ts,motia.config.js
alwaysApply: false
---
# Motia Configuration Guide
The `motia.config.ts` file is the central configuration file for your Motia application. It allows you to customize plugins, adapters, stream authentication, and Express app settings.
## Critical Requirement: package.json Must Use ES Modules
**All Motia projects MUST have `"type": "module"` in their `package.json`.**
Motia uses ES modules internally and requires this setting to function correctly. Without it, you may encounter import/export errors during runtime.
### Correct package.json Setup
```json
{
"name": "my-motia-project",
"description": "My Motia application",
"type": "module",
"scripts": {
"postinstall": "motia install",
"dev": "motia dev",
"start": "motia start",
"build": "motia build",
"generate-types": "motia generate-types"
}
}
```
### Migration from Existing Projects
If you have an existing Motia project, ensure you add `"type": "module"` to your `package.json`:
```json
{
"name": "my-project",
"type": "module", // ← Add this line
"scripts": {
"dev": "motia dev"
}
}
```
## Creating the Configuration File
Create a `motia.config.ts` file in the root of your project:
```typescript
import { config } from 'motia'
export default config({
plugins: [],
adapters: {},
streamAuth: undefined,
app: undefined,
})
```
## Creating Projects Without Embedded Redis
By default, Motia includes an embedded Redis binary for easy local development. However, if you prefer to use your own external Redis instance, you can use the `--skip-redis` flag when creating a new project:
```bash
npx motia create my-app --skip-redis
```
This flag:
- Skips the embedded Redis binary installation
- Creates a `motia.config.ts` pre-configured for external Redis connection
- Requires you to provide your own Redis instance before running Motia
When using `--skip-redis`, you'll need to ensure Redis is running and properly configured in your `motia.config.ts` file (see Redis Configuration section below).
## Type Definitions
```typescript
import type { Express } from 'express'
export type Config = {
/**
* Optional: Callback to customize the Express app instance.
* Use this to add custom middleware, routes, or configurations.
*/
app?: (app: Express) => void
/**
* Optional: Array of plugin builders to extend Motia functionality.
* Plugins can add workbench UI components and custom steps.
*/
plugins?: MotiaPluginBuilder[]
/**
* Optional: Custom adapters for state, events, cron, and streams.
* Use this for horizontal scaling or custom storage backends.
*/
adapters?: AdapterConfig
/**
* Optional: Stream authentication configuration.
* Use this to secure real-time stream subscriptions.
*/
streamAuth?: StreamAuthConfig
/**
* Optional: Redis configuration.
* Configure Redis connection or use the built-in in-memory Redis server.
*/
redis?: RedisConfig
}
export type AdapterConfig = {
state?: StateAdapter
streams?: StreamAdapterManager
events?: EventAdapter
cron?: CronAdapter
}
export type StreamAuthRequest = {
headers: Record<string, string | string[] | undefined>
url?: string
}
export type StreamAuthConfig<TSchema extends z.ZodTypeAny = z.ZodTypeAny> = {
/**
* JSON Schema defining the shape of the auth context.
* Use z.toJSONSchema() to convert a Zod schema.
*/
contextSchema: JsonSchema
/**
* Authentication callback that receives the request and returns
* the auth context or null if authentication fails.
*/
authenticate: (request: StreamAuthRequest) => Promise<z.infer<TSchema> | null> | (z.infer<TSchema> | null)
}
export type RedisConfig =
| {
useMemoryServer?: false
host: string
port: number
password?: string
username?: string
db?: number
}
| {
useMemoryServer: true
}
```
## Plugins
Plugins extend Motia functionality by adding workbench UI components and custom steps.
### Plugin Type Definition
```typescript
export type WorkbenchPlugin = {
packageName: string
componentName?: string
label?: string
labelIcon?: string
position?: 'bottom' | 'top'
cssImports?: string[]
props?: Record<string, any>
}
export type MotiaPlugin = {
workbench: WorkbenchPlugin[]
dirname?: string
steps?: string[]
}
export type MotiaPluginBuilder = (motia: MotiaPluginContext) => MotiaPlugin
```
### Using Built-in Plugins
```typescript
import { config } from 'motia'
import statesPlugin from '@motiadev/plugin-states/plugin'
import endpointPlugin from '@motiadev/plugin-endpoint/plugin'
import logsPlugin from '@motiadev/plugin-logs/plugin'
import observabilityPlugin from '@motiadev/plugin-observability/plugin'
export default config({
plugins: [observabilityPlugin, statesPlugin, endpointPlugin, logsPlugin],
})
```
### Creating a Local Plugin
**Project structure:**
```
project/
├── src/ # Steps can be in /src or /steps
│ └── api/
│ └── example.step.ts
├── plugins/
│ └── my-plugin/
│ ├── components/
│ │ └── my-plugin-panel.tsx
│ └── index.ts
└── motia.config.ts
```
**Plugin implementation (`plugins/my-plugin/index.ts`):**
```typescript
import path from 'node:path'
import { config, type MotiaPlugin, type MotiaPluginContext } from 'motia'
function myLocalPlugin(motia: MotiaPluginContext): MotiaPlugin {
motia.registerApi(
{
method: 'GET',
path: '/__motia/my-plugin',
},
async (_req, _ctx) => {
return {
status: 200,
body: { message: 'Hello from my plugin!' },
}
},
)
return {
dirname: path.join(__dirname, 'plugins'),
steps: ['**/*.step.ts'],
workbench: [
{
componentName: 'MyComponent',
packageName: '~/plugins/components/my-component',
label: 'My Plugin',
position: 'top',
labelIcon: 'toy-brick',
},
],
}
}
export default config({
plugins: [myLocalPlugin],
})
```
### Common Plugin Errors
**Error: Component not found**
- **Cause**: `packageName` doesn't match the actual folder structure
- **Solution**: Ensure `packageName: '~/plugins/my-plugin'` matches `plugins/my-plugin/` folder
**Error: Plugin not loading in workbench**
- **Cause**: Plugin function not exported correctly
- **Solution**: Use `export default function` in plugin's `index.ts`
**Error: Module resolution failed**
- **Cause**: Using incorrect casing in folder/file names
- **Solution**: Use `kebab-case` for folders/files, `PascalCase` for React components
## Adapters
Adapters allow you to customize the underlying infrastructure for state management, event handling, cron jobs, and streams. This is useful for horizontal scaling or using custom storage backends.
### Available Adapter Packages
| Package | Description |
|---------|-------------|
| `@motiadev/adapter-redis-state` | Redis-based state management for distributed systems |
| `@motiadev/adapter-redis-cron` | Redis-based cron scheduling with distributed locking |
| `@motiadev/adapter-redis-streams` | Redis Streams for real-time data |
| `@motiadev/adapter-rabbitmq-events` | RabbitMQ for event messaging |
| `@motiadev/adapter-bullmq-events` | BullMQ for event queue processing |
### Using Custom Adapters
```typescript
import { config } from 'motia'
import { RedisStateAdapter } from '@motiadev/adapter-redis-state'
import { RabbitMQEventAdapter } from '@motiadev/adapter-rabbitmq-events'
import { RedisCronAdapter } from '@motiadev/adapter-redis-cron'
export default config({
adapters: {
state: new RedisStateAdapter(
{ url: process.env.REDIS_URL },
{ keyPrefix: 'myapp:state:', ttl: 3600 }
),
events: new RabbitMQEventAdapter({
url: process.env.RABBITMQ_URL!,
exchangeType: 'topic',
exchangeName: 'motia-events',
}),
cron: new RedisCronAdapter(
{ url: process.env.REDIS_URL },
{ keyPrefix: 'myapp:cron:', lockTTL: 30000 }
),
},
})
```
## Stream Authentication
Stream authentication secures real-time stream subscriptions by validating client credentials.
### Configuration
```typescript
import { config, type StreamAuthRequest } from 'motia'
import { z } from 'zod'
const authContextSchema = z.object({
userId: z.string(),
permissions: z.array(z.string()).optional(),
})
export default config({
streamAuth: {
contextSchema: z.toJSONSchema(authContextSchema),
authenticate: async (request: StreamAuthRequest) => {
const token = extractToken(request)
if (!token) {
return null
}
const user = await validateToken(token)
if (!user) {
throw new Error('Invalid token')
}
return {
userId: user.id,
permissions: user.permissions,
}
},
},
})
function extractToken(request: StreamAuthRequest): string | undefined {
const protocol = request.headers['sec-websocket-protocol'] as string | undefined
if (protocol?.includes('Authorization')) {
const [, token] = protocol.split(',')
return token?.trim()
}
if (request.url) {
try {
const url = new URL(request.url)
return url.searchParams.get('authToken') ?? undefined
} catch {
return undefined
}
}
return undefined
}
```
### Using Auth Context in Streams
Once configured, the auth context is available in the `canAccess` callback of stream configurations:
```typescript
import { StreamConfig } from 'motia'
import { z } from 'zod'
export const config: StreamConfig = {
name: 'protectedStream',
schema: z.object({ data: z.string() }),
baseConfig: { storageType: 'default' },
canAccess: (subscription, authContext) => {
if (!authContext) return false
return authContext.permissions?.includes('read:stream')
},
}
```
## Express App Customization
Use the `app` callback to customize the Express application instance:
```typescript
import { config } from 'motia'
import cors from 'cors'
import helmet from 'helmet'
export default config({
app: (app) => {
app.use(helmet())
app.use(cors({ origin: process.env.ALLOWED_ORIGINS?.split(',') }))
app.get('/health', (_req, res) => {
res.json({ status: 'healthy' })
})
},
})
```
## Redis Configuration
Motia uses Redis for state management, caching, and real-time features. By default, Motia automatically starts an in-memory Redis server for local development, eliminating the need for external Redis installation.
### Default Behavior (In-Memory Redis)
When no `redis` configuration is provided, Motia uses an embedded in-memory Redis server:
```typescript
import { config } from 'motia'
export default config({})
```
You can also explicitly enable the in-memory server:
```typescript
export default config({
redis: {
useMemoryServer: true,
},
})
```
### Using External Redis
To connect to an external Redis instance (useful for production or when you already have Redis running), configure the connection settings:
```typescript
import { config } from 'motia'
export default config({
redis: {
useMemoryServer: false,
host: 'localhost',
port: 6379,
},
})
```
For production environments with authentication:
```typescript
export default config({
redis: {
useMemoryServer: false,
host: process.env.REDIS_HOST || 'localhost',
port: parseInt(process.env.REDIS_PORT || '6379'),
password: process.env.REDIS_PASSWORD,
username: process.env.REDIS_USERNAME,
db: parseInt(process.env.REDIS_DB || '0'),
},
})
```
The optional Redis configuration fields include:
- `password`: Redis password for authentication
- `username`: Redis username (Redis 6.0+)
- `db`: Database number to select (default: 0)
You can also use environment variables directly:
- `MOTIA_REDIS_PASSWORD`: Redis password
- `MOTIA_REDIS_USERNAME`: Redis username
- `MOTIA_REDIS_DB`: Database number
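The environment-variable fallbacks above can be sketched as a small helper that assembles a dict mirroring the `RedisConfig` shape, with the documented defaults (the helper name is illustrative, not a Motia API):

```python
import os

def redis_config_from_env(env=os.environ):
    """Build an external-Redis config dict from environment variables."""
    return {
        "useMemoryServer": False,
        "host": env.get("REDIS_HOST", "localhost"),
        "port": int(env.get("REDIS_PORT", "6379")),      # default port
        "password": env.get("MOTIA_REDIS_PASSWORD"),
        "username": env.get("MOTIA_REDIS_USERNAME"),
        "db": int(env.get("MOTIA_REDIS_DB", "0")),       # default db 0
    }

print(redis_config_from_env({"REDIS_HOST": "redis.internal", "MOTIA_REDIS_DB": "2"}))
```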
### Creating Projects with External Redis
When creating a new project, you can skip the embedded Redis binary installation using the `--skip-redis` flag:
```bash
npx motia create my-app --skip-redis
```
This will create a project with `motia.config.ts` pre-configured for external Redis, and you'll need to ensure Redis is running before starting your application.
## Complete Example
```typescript
import path from 'node:path'
import { config, type MotiaPlugin, type MotiaPluginContext, type StreamAuthRequest } from 'motia'
import { z } from 'zod'
import statesPlugin from '@motiadev/plugin-states/plugin'
import logsPlugin from '@motiadev/plugin-logs/plugin'
const authContextSchema = z.object({
userId: z.string(),
role: z.enum(['admin', 'user']).optional(),
})
type AuthContext = z.infer<typeof authContextSchema>
const tokens: Record<string, AuthContext> = {
'admin-token': { userId: 'admin-1', role: 'admin' },
'user-token': { userId: 'user-1', role: 'user' },
}
function extractAuthToken(request: StreamAuthRequest): string | undefined {
const protocol = request.headers['sec-websocket-protocol'] as string | undefined
if (protocol?.includes('Authorization')) {
const [, token] = protocol.split(',')
return token?.trim()
}
if (request.url) {
try {
const url = new URL(request.url)
return url.searchParams.get('authToken') ?? undefined
} catch {
return undefined
}
}
return undefined
}
export default config({
plugins: [statesPlugin, logsPlugin],
streamAuth: {
contextSchema: z.toJSONSchema(authContextSchema),
authenticate: async (request: StreamAuthRequest) => {
const token = extractAuthToken(request)
if (!token) return null
const context = tokens[token]
if (!context) throw new Error(`Invalid token: ${token}`)
return context
},
},
})
```

---
description: Real-time streaming
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py,steps/**/*.stream.ts,steps/**/*.stream.js,steps/**/*_stream.py
alwaysApply: false
---
# Real-time Streaming
Building event driven applications often requires some real-time streaming capabilities.
- Like integration with LLMs, they should be implemented in an asynchronous way and updates should come in real-time.
- Chat applications, real-time collaboration, etc.
- Long living processes like data processing, video processing, etc.
Motia has a built-in real-time streaming system that allows you to easily implement real-time streaming capabilities in your application.
It's called Streams.
## Stream Configuration
Creating a Stream means defining a data schema that will be stored and served to the clients who are subscribing.
### StreamConfig Type Definition
```typescript
export type ZodInput = ZodObject<any> | ZodArray<any>
export type StepSchemaInput = ZodInput | JsonSchema
export type StreamSubscription = { groupId: string; id?: string }
export interface StreamConfig {
/**
* The stream name, used on the client side to subscribe.
*/
name: string
/**
* Schema for stream data. Accepts either:
* - Zod schema (ZodObject or ZodArray)
* - JSON Schema object
*/
schema: StepSchemaInput
/**
* Storage configuration for the stream.
* Use 'default' for built-in storage or 'custom' with a factory function.
*/
baseConfig: { storageType: 'default' } | { storageType: 'custom'; factory: () => MotiaStream<any> }
/**
* Optional: Access control callback to authorize subscriptions.
* Return true to allow access, false to deny.
*/
canAccess?: (subscription: StreamSubscription, authContext: any) => boolean | Promise<boolean>
}
```
### TypeScript Example
```typescript
// steps/streams/chat-messages.stream.ts
import { StreamConfig } from 'motia'
import { z } from 'zod'
export const chatMessageSchema = z.object({
id: z.string(),
userId: z.string(),
message: z.string(),
timestamp: z.string()
})
export type ChatMessage = z.infer<typeof chatMessageSchema>
export const config: StreamConfig = {
name: 'chatMessage',
schema: chatMessageSchema,
baseConfig: { storageType: 'default' },
}
```
### Python Examples
#### With Pydantic (Optional)
```python
# steps/streams/chat_messages_stream.py
from pydantic import BaseModel
class ChatMessage(BaseModel):
id: str
user_id: str
message: str
timestamp: str
config = {
"name": "chatMessage",
"schema": ChatMessage.model_json_schema(),
"baseConfig": {"storageType": "default"}
}
```
#### Without Pydantic (Pure JSON Schema)
```python
# steps/streams/chat_messages_stream.py
config = {
"name": "chatMessage",
"schema": {
"type": "object",
"properties": {
"id": {"type": "string"},
"user_id": {"type": "string"},
"message": {"type": "string"},
"timestamp": {"type": "string"}
},
"required": ["id", "user_id", "message", "timestamp"]
},
"baseConfig": {"storageType": "default"}
}
```
## Using streams
Stream managers are automatically injected into the context of the steps.
Each stream exposes the following interface:
```typescript
export type BaseStreamItem<TData = unknown> = TData & { id: string }
interface MotiaStream<TData> {
/**
* Retrieves a single item from the stream
*
* @param groupId - The group id of the stream
* @param id - The id of the item to get
* @returns The item or null if it doesn't exist
*/
get(groupId: string, id: string): Promise<BaseStreamItem<TData> | null>
/**
* Create or update a single item in the stream.
*
* If the item doesn't exist, it will be created.
* If the item exists, it will be updated.
*
* @param groupId - The group id of the stream
* @param id - The id of the item to set
* @param data - The data to set
* @returns The item
*/
set(groupId: string, id: string, data: TData): Promise<BaseStreamItem<TData>>
/**
* Deletes a single item from the stream
*
* @param groupId - The group id of the stream
* @param id - The id of the item to delete
* @returns The item or null if it doesn't exist
*/
delete(groupId: string, id: string): Promise<BaseStreamItem<TData> | null>
/**
* Retrieves a group of items from the stream based on the group id
*
* @param groupId - The group id of the stream
* @returns The items
*/
getGroup(groupId: string): Promise<BaseStreamItem<TData>[]>
/**
* This is used mostly for ephemeral events in streams.
* A chat message for example has a state, which is the message content, user id, etc.
*
* However, if you want to send an event to the subscribers like:
* - online status
* - reactions
* - typing indicators
* - etc.
*
* @param channel - The channel to send the event to
* @param event - The event to send
*/
send<T>(channel: StateStreamEventChannel, event: StateStreamEvent<T>): Promise<void>
}
```
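To make the get/set/delete/getGroup semantics above concrete, here is a hypothetical in-memory implementation sketch (the ephemeral `send` channel is omitted; this is not Motia's actual storage backend):

```python
class InMemoryStream:
    """In-memory sketch of the stream interface: items are grouped by
    group id, and every stored item carries its own id."""

    def __init__(self):
        self._groups: dict[str, dict[str, dict]] = {}

    def get(self, group_id, id):
        return self._groups.get(group_id, {}).get(id)

    def set(self, group_id, id, data):
        item = {**data, "id": id}  # create or update; items always carry their id
        self._groups.setdefault(group_id, {})[id] = item
        return item

    def delete(self, group_id, id):
        return self._groups.get(group_id, {}).pop(id, None)

    def get_group(self, group_id):
        return list(self._groups.get(group_id, {}).values())

stream = InMemoryStream()
stream.set("channel-1", "m1", {"message": "hello"})
stream.set("channel-1", "m2", {"message": "world"})
print(len(stream.get_group("channel-1")))  # 2
stream.delete("channel-1", "m1")
print(stream.get("channel-1", "m1"))  # None
```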
## Sending ephemeral events
Streams hold state: when a client connects and subscribes to a GroupID and ItemID, it automatically syncs with the current state of the stream. However, there are cases where
you want to send an ephemeral event to the subscribers, like:
- online status
- reactions
- typing indicators
- etc.
This is where the `send` method comes in.
```typescript
/**
* The channel to send the event to
*/
type StateStreamEventChannel = {
/**
* The group id of the stream
*/
groupId: string
/**
* The id of the item to send the event to
*
* Optional, when not provided, the event will be sent to the entire group.
* Subscribers to the group will receive the event.
*/
id?: string
}
export type StateStreamEvent<TData> = {
/**
* The type of the event, use as the name of the event
* to be handled in the subscribers.
*/
type: string
/**
* The data of the event, the data that will be sent to the subscribers.
*/
data: TData
}
```
## Using in handlers
### TypeScript Example
```typescript
import { ApiRouteConfig, Handlers } from 'motia'
import { z } from 'zod'
import { chatMessageSchema } from './streams/chat-messages.stream'
export const config: ApiRouteConfig = {
name: 'CreateChatMessage',
type: 'api',
method: 'POST',
path: '/chat-messages',
bodySchema: z.object({
channelId: z.string(),
message: z.string(),
}),
emits: [],
responseSchema: {
201: chatMessageSchema
}
}
export const handler: Handlers['CreateChatMessage'] = async (req, { streams }) => {
/**
* Say this is an API Step that a user sends a message to a channel.
*
* In your application logic you should have a channel ID defined somewhere
* so the client can send the message to the correct channel.
*/
const { channelId, message } = req.body
/**
* Define the message ID however you want, but should be a unique identifier underneath the channel ID.
*
* This is used to identify the message in the stream.
*/
const messageId = crypto.randomUUID()
/**
* In your application logic you should have a user ID defined somewhere.
* We recommend using middlewares to identify the user on the request.
*/
const userId = 'example-user-id'
/**
* Creates a message in the stream
*/
  const chatMessage = await streams.chatMessage.set(channelId, messageId, {
    id: messageId,
    userId: userId,
    message: message,
    timestamp: new Date().toISOString()
  })
  /**
   * Returning the stream result directly to the client helps Workbench to
   * render the stream object and update it in real-time in the UI.
   */
  return { status: 201, body: chatMessage }
}
```
### Python Examples
#### With Pydantic (Optional)
```python
import uuid
from datetime import datetime
from pydantic import BaseModel
class ChatMessageRequest(BaseModel):
channel_id: str
message: str
class ChatMessageResponse(BaseModel):
id: str
user_id: str
message: str
timestamp: str
config = {
"name": "CreateChatMessage",
"type": "api",
"method": "POST",
"path": "/chat-messages",
"bodySchema": ChatMessageRequest.model_json_schema(),
"emits": [],
"responseSchema": {
201: ChatMessageResponse.model_json_schema()
}
}
async def handler(req, context):
body = req.get("body", {})
# Optional: Validate with Pydantic
request_data = ChatMessageRequest(**body)
channel_id = request_data.channel_id
message_text = request_data.message
message_id = str(uuid.uuid4())
user_id = "example-user-id"
# Creates a message in the stream
chat_message = await context.streams.chatMessage.set(channel_id, message_id, {
"id": message_id,
"user_id": user_id,
"message": message_text,
"timestamp": datetime.now().isoformat()
})
return {"status": 201, "body": chat_message}
```
#### Without Pydantic (Pure JSON Schema)
```python
import uuid
from datetime import datetime
config = {
"name": "CreateChatMessage",
"type": "api",
"method": "POST",
"path": "/chat-messages",
"bodySchema": {
"type": "object",
"properties": {
"channel_id": {"type": "string"},
"message": {"type": "string"}
},
"required": ["channel_id", "message"]
},
"emits": [],
"responseSchema": {
201: {
"type": "object",
"properties": {
"id": {"type": "string"},
"user_id": {"type": "string"},
"message": {"type": "string"},
"timestamp": {"type": "string"}
}
}
}
}
async def handler(req, context):
body = req.get("body", {})
channel_id = body.get("channel_id")
message_text = body.get("message")
message_id = str(uuid.uuid4())
user_id = "example-user-id"
# Creates a message in the stream
chat_message = await context.streams.chatMessage.set(channel_id, message_id, {
"id": message_id,
"user_id": user_id,
"message": message_text,
"timestamp": datetime.now().isoformat()
})
return {"status": 201, "body": chat_message}
```

---
description: Managing state across Steps
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py
alwaysApply: false
---
# State Management
State Management is a core concept in Motia: it stores data that needs to be shared across Steps, even across different workflows.
When triggering Event Steps, we can attach data to the emit call and use it later in the Event Step execution. However, event payloads are limited and shouldn't carry large amounts of data;
that's where State Management comes in: it stores data across Steps.
## Use-cases
**When State Management is recommended:**
- Pulling data from an external source, like an API, storing it in state, then triggering an Event Step to process it.
- Storing data that needs to be used later in the workflow.
- Acting as a caching layer, for example caching the result of an API call that takes a few seconds to complete and doesn't change very often.
**When another solution can be better suited:**
- Storing persistent user data: prefer a database like Postgres or MongoDB.
- Storing file data like Base64-encoded images, PDFs, etc.: prefer a dedicated storage solution like S3.
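The caching use-case above can be sketched with a stand-in state manager. The `InMemoryState` class and `fake_fetch` below are illustrative only; in a real step, `context.state` is provided by Motia at runtime:

```python
import asyncio

# Illustrative in-memory stand-in for context.state; the real
# StateManager is injected by Motia and backed by its state adapter.
class InMemoryState:
    def __init__(self):
        self._data = {}

    async def get(self, group_id, key):
        return self._data.get(group_id, {}).get(key)

    async def set(self, group_id, key, value):
        self._data.setdefault(group_id, {})[key] = value
        return value

async def get_rates(state, fetch):
    # Serve the cached result when present; otherwise fetch and cache it.
    cached = await state.get("cache", "exchange_rates")
    if cached is not None:
        return cached
    rates = await fetch()
    await state.set("cache", "exchange_rates", rates)
    return rates

async def main():
    calls = 0

    async def fake_fetch():
        nonlocal calls
        calls += 1
        return {"USD": 1.0, "EUR": 0.9}

    state = InMemoryState()
    await get_rates(state, fake_fetch)  # slow path: fetches and caches
    await get_rates(state, fake_fetch)  # fast path: served from state
    print(calls)  # → 1 (the second call hit the cache)

asyncio.run(main())
```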
## StateManager Interface
The StateManager interface is available in the context of all step handlers. The methods work identically across TypeScript/JavaScript and Python.
```typescript
type InternalStateManager = {
/**
* Retrieves a single item from the state
*
* @param groupId - The group id of the state
* @param key - The key of the item to get
* @returns The item or null if it doesn't exist
*/
get<T>(groupId: string, key: string): Promise<T | null>
/**
* Sets a single item in the state
*
* @param groupId - The group id of the state
* @param key - The key of the item to set
* @param value - The value of the item to set
* @returns The item
*/
set<T>(groupId: string, key: string, value: T): Promise<T>
/**
* Deletes a single item from the state
*
* @param groupId - The group id of the state
* @param key - The key of the item to delete
* @returns The item or null if it doesn't exist
*/
delete<T>(groupId: string, key: string): Promise<T | null>
/**
* Retrieves a group of items from the state
*
* @param groupId - The group id of the state
* @returns A list with all the items in the group
*/
getGroup<T>(groupId: string): Promise<T[]>
/**
* Clears a group of items from the state
*
* @param groupId - The group id of the state
*/
clear(groupId: string): Promise<void>
}
```
## Usage Examples
### TypeScript/JavaScript Example
```typescript
export const handler: Handlers['ProcessOrder'] = async (input, { state, logger }) => {
// Store an order
const order = {
id: input.orderId,
status: 'processing',
createdAt: new Date().toISOString()
};
await state.set('orders', input.orderId, order);
// Retrieve an order
const savedOrder = await state.get('orders', input.orderId);
logger.info('Order retrieved', { savedOrder });
// Get all orders
const allOrders = await state.getGroup('orders');
logger.info('Total orders', { count: allOrders.length });
// Update order status
order.status = 'completed';
await state.set('orders', input.orderId, order);
// Delete an order (if needed)
// await state.delete('orders', input.orderId);
};
```
### Python Example
```python
from datetime import datetime

async def handler(input_data, context):
# Store an order
order = {
"id": input_data.get("order_id"),
"status": "processing",
"created_at": datetime.now().isoformat()
}
await context.state.set("orders", input_data.get("order_id"), order)
# Retrieve an order
saved_order = await context.state.get("orders", input_data.get("order_id"))
context.logger.info("Order retrieved", {"saved_order": saved_order})
# Get all orders
all_orders = await context.state.get_group("orders")
context.logger.info("Total orders", {"count": len(all_orders)})
# Update order status
order["status"] = "completed"
await context.state.set("orders", input_data.get("order_id"), order)
# Delete an order (if needed)
# await context.state.delete("orders", input_data.get("order_id"))
```

View File

@@ -0,0 +1,76 @@
---
description: Overriding the UI of Steps in Motia
globs: steps/**/*.step.tsx,steps/**/*.step.jsx,steps/**/*_step.tsx,steps/**/*_step.jsx
alwaysApply: true
---
# UI Steps Guide for Motia
UI Steps provide a powerful way to create custom, visually appealing representations of your workflow steps in the Workbench flow visualization tool.
With UI Steps, you can enhance the user experience by designing intuitive, context-aware visual components that clearly communicate your flow's sequencing and events.
## Overview
To create a custom UI for a step, create a .tsx or .jsx file next to your step file with the same base name:
```
steps/
└── myStep/
├── myStep.step.ts # Step definition
└── myStep.step.tsx # Visual override
```
## Basic Usage
Let's override an EventNode while keeping the same look: we'll add an image on the side and show the step's description.
```typescript
// myStep.step.tsx
import { EventNode, EventNodeProps } from 'motia/workbench'
import React from 'react'
export const Node: React.FC<EventNodeProps> = (props) => {
return (
<EventNode {...props}>
<div className="flex flex-row items-start gap-2">
<div className="text-sm text-gray-400 font-mono">{props.data.description}</div>
<img
style={{ width: '64px', height: '64px' }}
src="https://www.motia.dev/icon.png"
/>
</div>
</EventNode>
)
}
```
## Components
Motia Workbench provides out-of-the-box components that you can use to create custom UI steps for the different step types.
### Available Components
| Component | Props Type | Description |
|-----------|------------|-------------|
| EventNode | EventNodeProps | Base component for Event Steps, with built-in styling and connection points |
| ApiNode | ApiNodeProps | Component for API Steps, includes request/response visualization capabilities |
| CronNode | CronNodeProps | Base component for Cron Steps, displays timing information |
| NoopNode | NoopNodeProps | Base component for Noop Steps, rendered in a distinct color to match the Workbench legend |
## Styling guidelines
- Use Tailwind's utility classes only: Stick to Tailwind CSS utilities for consistent styling
- Avoid arbitrary values: Use predefined scales from the design system
- Keep components responsive: Ensure UI elements adapt well to different screen sizes
- Follow Motia's design system: Maintain consistency with Motia's established design patterns
## Best practices
- Use base components: Use EventNode and ApiNode when possible
- Keep it simple: Maintain simple and clear visualizations
- Optimize performance: Minimize state and computations
- Documentation: Document custom components and patterns
- Style sharing: Share common styles through utility classes

View File

@@ -0,0 +1,251 @@
---
description: Connecting nodes virtually and creating a smooth flow in Workbench
globs: steps/**/*.step.ts,steps/**/*.step.js,steps/**/*_step.py
alwaysApply: false
---
# Virtual Steps Guide
Virtual Steps are useful for creating a smooth flow in Workbench.
There are two ways to connect nodes virtually:
- NOOP Steps: used mostly when we want to extend the workflow with buttons or other UI elements.
- Virtual connections between steps: used when we want to connect two steps virtually, with or without NOOP Steps.
## Creating NOOP Steps
Steps need to be created in the `steps` folder (subfolders are allowed).
- Steps in TS and JS should end with `.step.ts` and `.step.js` respectively.
- Steps in Python should end with `_step.py`.
### Configuration
#### TypeScript Example
```typescript
import { NoopConfig } from 'motia'
export const config: NoopConfig = {
/**
* Should always be noop
*/
type: 'noop',
/**
* A unique name for this noop step, used internally and for linking handlers.
*/
name: 'manual-trigger',
/**
* A description for this noop step, used for documentation and UI.
*/
description: 'Manual trigger point for workflow',
/**
* An array of topics this step can emit events to.
*/
virtualEmits: ['workflow.start'],
/**
* An array of topics this step can subscribe to.
*/
virtualSubscribes: ['manual.trigger'],
/**
* An array of flow names this step belongs to.
*/
flows: ['my-workflow']
}
// NOOP steps don't need handlers - they're purely for Workbench workflow connections
```
#### Python Example
```python
config = {
"type": "noop",
"name": "manual-trigger",
"description": "Manual trigger point for workflow",
"virtualEmits": ["workflow.start"],
"virtualSubscribes": ["manual.trigger"],
"flows": ["my-workflow"]
}
# NOOP steps don't need handlers - they're purely for Workbench workflow connections
```
### Common Use Cases
#### Workflow Starter
This NOOP step creates a flow node in Workbench; later, using a UI Step, we can override it
to show a button.
**TypeScript:**
```typescript
export const config: NoopConfig = {
type: 'noop',
name: 'flow-starter',
description: 'Start point for the workflow',
virtualEmits: ['process.begin'],
virtualSubscribes: [],
flows: ['main-flow']
}
```
**Python:**
```python
config = {
"type": "noop",
"name": "flow-starter",
"description": "Start point for the workflow",
"virtualEmits": ["process.begin"],
"virtualSubscribes": [],
"flows": ["main-flow"]
}
```
### Manual Approval Point
This NOOP step creates a flow node in Workbench; its only purpose is to connect
a previous Step to the steps that follow a manual approval button.
Example:
```mermaid
graph LR
A[Submit Article]
C[Approve Article]
D[Reject Article]
```
Without this NOOP Step, the nodes A, C, and D would appear disconnected in Workbench.
**TypeScript:**
```typescript
export const config: NoopConfig = {
type: 'noop',
name: 'approval-gate',
description: 'Manual approval required',
virtualEmits: ['approved'],
virtualSubscribes: ['pending.approval'],
flows: ['approval-flow']
}
```
**Python:**
```python
config = {
"type": "noop",
"name": "approval-gate",
"description": "Manual approval required",
"virtualEmits": ["approved"],
"virtualSubscribes": ["pending.approval"],
"flows": ["approval-flow"]
}
```
## How it works
Workbench uses `virtualEmits` and `virtualSubscribes` to connect a step to the previous and next steps.
The `Submit Article` Step must have `virtualEmits: ['pending.approval']` to connect to the `approval-gate` NOOP Step, which subscribes to that topic.
The `Approve Article` Step must have `virtualSubscribes: ['approved']` to connect to the `approval-gate` Step, which emits that topic.
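The NOOP-chained flow can be sketched as follows. File names and the distinct dict names are illustrative; in a real project each dict would simply be `config` in its own file:

```python
# submit_article_step.py (illustrative) -- virtually emits the topic
# that the approval-gate NOOP step subscribes to
submit_config = {
    "type": "api",
    "name": "SubmitArticle",
    "path": "/articles",
    "method": "POST",
    "virtualEmits": ["pending.approval"],
    "emits": [],
    "flows": ["approval-flow"],
}

# approval_gate_step.py (illustrative) -- the NOOP step from above
gate_config = {
    "type": "noop",
    "name": "approval-gate",
    "virtualEmits": ["approved"],
    "virtualSubscribes": ["pending.approval"],
    "flows": ["approval-flow"],
}

# approve_article_step.py (illustrative) -- subscribes to the topic
# the gate virtually emits
approve_config = {
    "type": "api",
    "name": "ApproveArticle",
    "path": "/articles/:id/approve",
    "method": "POST",
    "virtualSubscribes": ["approved"],
    "emits": [],
    "flows": ["approval-flow"],
}

# Workbench draws an edge wherever a virtualEmits topic matches a
# virtualSubscribes topic on another step in the same flow:
assert submit_config["virtualEmits"] == gate_config["virtualSubscribes"]
assert gate_config["virtualEmits"] == approve_config["virtualSubscribes"]
```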
Two Steps can also be connected directly, without a NOOP Step in between, and connections can carry labels.
**TypeScript Examples:**
```typescript
// create-article.step.ts (each config lives in its own file)
export const config: ApiRouteConfig = {
type: 'api',
name: 'CreateArticle',
path: '/articles',
method: 'POST',
description: 'Creates an article',
virtualEmits: [{ topic: 'approval.required', label: 'Requires approval' }],
emits: [],
flows: ['article']
}
// approve-article.step.ts
export const config: ApiRouteConfig = {
type: 'api',
name: 'ApproveArticle',
path: '/articles/:id/approve',
method: 'POST',
description: 'Approves an article',
virtualSubscribes: ['approval.required'],
emits: [],
flows: ['article']
}
// reject-article.step.ts
export const config: ApiRouteConfig = {
type: 'api',
name: 'RejectArticle',
path: '/articles/:id/reject',
method: 'POST',
description: 'Rejects an article',
virtualSubscribes: ['approval.required'],
emits: [],
flows: ['article']
}
```
**Python Examples:**
```python
# create_article_step.py
config = {
"type": "api",
"name": "CreateArticle",
"path": "/articles",
"method": "POST",
"description": "Creates an article",
"virtualEmits": [{"topic": "approval.required", "label": "Requires approval"}],
"emits": [],
"flows": ["article"]
}
# approve_article_step.py
config = {
"type": "api",
"name": "ApproveArticle",
"path": "/articles/:id/approve",
"method": "POST",
"description": "Approves an article",
"virtualSubscribes": ["approval.required"],
"emits": [],
"flows": ["article"]
}
# reject_article_step.py
config = {
"type": "api",
"name": "RejectArticle",
"path": "/articles/:id/reject",
"method": "POST",
"description": "Rejects an article",
"virtualSubscribes": ["approval.required"],
"emits": [],
"flows": ["article"]
}
```
This creates labeled connections from the Create Article step to the two other API Steps:
```mermaid
graph LR
A[Create Article] --> B[Requires Approval]
B --> C[Approve Article]
B --> D[Reject Article]
```
## When to Use NOOP Steps
- Testing workflow connections
- Manual trigger points
- Workflow visualization
- Placeholder for future steps

View File

@@ -0,0 +1,8 @@
node_modules
python_modules
.venv
venv
.motia
.mermaid
dist
*.pyc

View File

@@ -0,0 +1,230 @@
# AGENTS.md
> AI Development Guide for Motia Projects
This file provides context and instructions for AI coding assistants working on Motia projects.
## Project Overview
This is a **Motia** application - a framework for building event-driven, type-safe backend systems with:
- HTTP API endpoints (API Steps)
- Background event processing (Event Steps)
- Scheduled tasks (Cron Steps)
- Real-time streaming capabilities
- Built-in state management
- Visual workflow designer (Workbench)
## Quick Start Commands
```bash
# Install dependencies
npm install
# Start development server (with hot reload)
npm run dev
# Start production server (without hot reload)
npm run start
# Generate TypeScript types from steps
npx motia generate-types
```
## 📚 Comprehensive Guides
**This project includes detailed Cursor rules in `.cursor/rules/` that contain comprehensive patterns and examples.**
These guides are written in markdown and can be read by any AI coding tool. The sections below provide quick reference, but **always consult the detailed guides in `.cursor/` for complete patterns and examples.**
### Available Guides
Read these files in `.cursor/rules/motia/` for detailed patterns:
- **`motia-config.mdc`** - Essential project setup, package.json requirements, plugin naming
- **`api-steps.mdc`** - Creating HTTP endpoints with schemas, validation, and middleware
- **`event-steps.mdc`** - Background task processing and event-driven workflows
- **`cron-steps.mdc`** - Scheduled tasks with cron expressions
- **`state-management.mdc`** - State/cache management across steps
- **`middlewares.mdc`** - Request/response middleware patterns
- **`realtime-streaming.mdc`** - WebSocket and SSE patterns
- **`virtual-steps.mdc`** - Visual flow connections in Workbench
- **`ui-steps.mdc`** - Custom visual components for Workbench
Architecture guides in `.cursor/architecture/`:
- **`architecture.mdc`** - Project structure, naming conventions, DDD patterns
- **`error-handling.mdc`** - Error handling best practices
**Read these guides before writing code.** They contain complete examples, type definitions, and best practices.
## Quick Reference
> **⚠️ Important**: The sections below are brief summaries. **Always read the full guides in `.cursor/rules/` for complete patterns, examples, and type definitions.**
### Project Structure
Motia discovers steps from both `/src` and `/steps` folders. Modern projects typically use `/src`:
**Recommended Structure (using `/src`):**
```
project/
├── .cursor/rules/ # DETAILED GUIDES - Read these first!
├── src/
│ ├── api/ # API endpoints
│ │ ├── users.step.ts
│ │ ├── orders.step.js
│ │ └── products_step.py
│ ├── events/ # Event handlers
│ │ ├── order-processing.step.ts
│ │ └── notifications_step.py
│ ├── cron/ # Scheduled tasks
│ │ └── cleanup.step.ts
│ ├── services/ # Business logic
│ ├── repositories/ # Data access
│ └── utils/ # Utilities
├── middlewares/ # Reusable middleware
│ └── auth.middleware.ts
├── motia.config.ts # Motia configuration
└── types.d.ts # Auto-generated types
```
**Alternative Structure (using `/steps`):**
```
project/
├── steps/ # Step definitions
│ ├── api/
│ ├── events/
│ └── cron/
├── src/
│ ├── services/
│ └── utils/
└── motia.config.ts
```
### Step Naming Conventions
**TypeScript/JavaScript:** `my-step.step.ts` (kebab-case)
**Python:** `my_step_step.py` (snake_case)
See `.cursor/architecture/architecture.mdc` for complete naming rules.
### Creating Steps - Quick Start
Every step needs two exports:
1. **`config`** - Defines type, routing, schemas, emits
2. **`handler`** - Async function with processing logic
**For complete examples and type definitions, read:**
- `.cursor/rules/motia/api-steps.mdc` - HTTP endpoints
- `.cursor/rules/motia/event-steps.mdc` - Background tasks
- `.cursor/rules/motia/cron-steps.mdc` - Scheduled tasks
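As a minimal illustration of the two exports, here is a Python sketch. The step name, topic, and payload are made up; consult the guides above for real patterns:

```python
import asyncio

# send_welcome_email_step.py (illustrative skeleton of an Event Step)
config = {
    "type": "event",
    "name": "SendWelcomeEmail",
    "subscribes": ["user.registered"],
    "emits": [],
    "flows": ["onboarding"],
}

async def handler(input_data, context):
    # A real handler would use context.logger, context.state, context.emit, etc.
    return {"sent_to": input_data.get("email")}

# Quick local check of the handler (outside Motia, context can be None here):
print(asyncio.run(handler({"email": "ada@example.com"}, None)))  # → {'sent_to': 'ada@example.com'}
```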
## Detailed Guides by Topic
> **📖 Read the cursor rules for complete information**
### Step Types
- **API Steps** → Read `.cursor/rules/motia/api-steps.mdc`
- HTTP endpoints, schemas, middleware, emits
- Complete TypeScript and Python examples
- When to use emits vs direct processing
- **Event Steps** → Read `.cursor/rules/motia/event-steps.mdc`
- Background processing, topic subscriptions
- Retry mechanisms, error handling
- Chaining events for complex workflows
- **Cron Steps** → Read `.cursor/rules/motia/cron-steps.mdc`
- Scheduled tasks with cron expressions
- Idempotent execution patterns
- Integration with event emits
### Architecture
- **Project Structure** → Read `.cursor/architecture/architecture.mdc`
- File organization, naming conventions
- Domain-Driven Design patterns (services, repositories)
- Code style guidelines for TypeScript, JavaScript, Python
- **Error Handling** → Read `.cursor/architecture/error-handling.mdc`
- ZodError middleware patterns
- Logging best practices
- HTTP status codes
### Advanced Features
- **State Management** → Read `.cursor/rules/motia/state-management.mdc`
- Caching strategies, TTL configuration
- When to use state vs database
- Complete API reference
- **Middlewares** → Read `.cursor/rules/motia/middlewares.mdc`
- Authentication, validation, error handling
- Creating reusable middleware
- Middleware composition
- **Real-time Streaming** → Read `.cursor/rules/motia/realtime-streaming.mdc`
- Server-Sent Events (SSE) patterns
- WebSocket support
- Client-side integration
- **Virtual Steps** → Read `.cursor/rules/motia/virtual-steps.mdc`
- Visual flow connections in Workbench
- Documenting API chains
- Flow organization
- **UI Steps** → Read `.cursor/rules/motia/ui-steps.mdc`
- Custom Workbench visualizations
- Available components (EventNode, ApiNode, etc.)
- Styling with Tailwind
## Workflow for AI Coding Assistants
When working on Motia projects, follow this pattern:
1. **Read the relevant guide** in `.cursor/rules/` for the task
- Creating API? Read `api-steps.mdc`
- Background task? Read `event-steps.mdc`
- Scheduled job? Read `cron-steps.mdc`
2. **Check the architecture guide** in `.cursor/architecture/architecture.mdc`
- Understand project structure
- Follow naming conventions
- Apply DDD patterns
3. **Implement following the patterns** from the guides
- Use the examples as templates
- Follow type definitions exactly
- Apply best practices
4. **Generate types** after changes
```bash
npx motia generate-types
```
5. **Test in Workbench** to verify connections
```bash
npx motia dev
```
## Critical Rules
- **ALWAYS** ensure `package.json` has `"type": "module"` (read `motia-config.mdc` for details)
- **ALWAYS** read `.cursor/rules/` guides before writing step code
- **ALWAYS** run `npx motia generate-types` after modifying configs
- **ALWAYS** list emits in config before using them in handlers
- **ALWAYS** follow naming conventions (`*.step.ts` or `*_step.py`)
- **NEVER** use API steps for background work (use Event steps)
- **NEVER** skip middleware for ZodError handling in multi-step projects
- **NEVER** implement rate limiting/CORS in code (infrastructure handles this)
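For example, the emits rule means any topic passed to `context.emit` must be declared in the step's config. A Python sketch with a fake context (the exact emit signature is documented in `event-steps.mdc`):

```python
import asyncio

# Illustrative: the topic used in the handler is declared in config["emits"].
config = {
    "type": "event",
    "name": "ProcessOrder",
    "subscribes": ["order.created"],
    "emits": ["order.processed"],  # declared here...
    "flows": ["orders"],
}

async def handler(input_data, context):
    # ...so it can be emitted here; an undeclared topic would be rejected.
    await context.emit({
        "topic": "order.processed",
        "data": {"order_id": input_data.get("order_id")},
    })

# Minimal fake context to exercise the handler outside Motia:
class FakeContext:
    def __init__(self):
        self.events = []

    async def emit(self, event):
        self.events.append(event)

ctx = FakeContext()
asyncio.run(handler({"order_id": "o-1"}, ctx))
print(ctx.events[0]["topic"])  # → order.processed
```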
## Resources
- **Detailed Guides**: `.cursor/rules/motia/*.mdc` (in this project)
- **Architecture**: `.cursor/architecture/*.mdc` (in this project)
- **Documentation**: [motia.dev/docs](https://motia.dev/docs)
- **Examples**: [motia.dev/docs/examples](https://motia.dev/docs/examples)
- **GitHub**: [github.com/MotiaDev/motia](https://github.com/MotiaDev/motia)
---
**Remember**: This AGENTS.md is a quick reference. The `.cursor/rules/` directory contains the comprehensive, authoritative guides with complete examples and type definitions. Always consult those guides when implementing Motia patterns.

View File

@@ -0,0 +1,63 @@
# Motia Project Guide for Claude Code & Claude AI
This project uses **Motia** - a framework for building event-driven, type-safe backend systems.
## 📚 Important: Read the Comprehensive Guides
This project has detailed development guides in **`.cursor/rules/`** directory. These markdown files (`.mdc`) contain complete patterns, examples, and type definitions.
**Before writing any Motia code, read the relevant guides from `.cursor/rules/`**
### For Claude Code Users
**A pre-configured subagent is ready!**
The `motia-developer` subagent in `.claude/agents/` automatically references all 11 cursor rules when coding.
Use it: `/agents` → select `motia-developer`
Learn more: [Claude Code Subagents Docs](https://docs.claude.com/en/docs/claude-code/sub-agents)
### For Claude AI Assistant (Chat)
Explicitly reference cursor rules in your prompts:
```
Read .cursor/rules/motia/api-steps.mdc and create an API endpoint
for user registration following the patterns shown.
```
## Available Guides (11 Comprehensive Files)
All guides in `.cursor/rules/` with **TypeScript, JavaScript, and Python** examples:
**Configuration** (`.cursor/rules/motia/`):
- `motia-config.mdc` - Essential project setup, package.json requirements, plugin naming
**Step Types** (`.cursor/rules/motia/`):
- `api-steps.mdc`, `event-steps.mdc`, `cron-steps.mdc`
**Features** (`.cursor/rules/motia/`):
- `state-management.mdc`, `middlewares.mdc`, `realtime-streaming.mdc`
- `virtual-steps.mdc`, `ui-steps.mdc`
**Architecture** (`.cursor/architecture/`):
- `architecture.mdc`, `error-handling.mdc`
## Quick Reference
See `AGENTS.md` in this directory for a quick overview and links to specific guides.
**Important**: Motia discovers steps from both `/src` and `/steps` folders. Modern projects use `/src` for a familiar structure.
## Key Commands
```bash
npm run dev # Start development server (with hot reload)
npm run start # Start production server (without hot reload)
npx motia generate-types # Regenerate TypeScript types
```
---
**Remember**: The `.cursor/rules/` directory is your primary reference. Read the relevant guide before implementing any Motia pattern.

View File

@@ -0,0 +1,84 @@
# motia-clean-test
A Motia project created with the starter template.
## What is Motia?
Motia is an open-source, unified backend framework that eliminates runtime fragmentation by bringing **APIs, background jobs, queueing, streaming, state, workflows, AI agents, observability, scaling, and deployment** into one unified system using a single core primitive, the **Step**.
## Quick Start
```bash
# Start the development server
npm run dev
# or
yarn dev
# or
pnpm dev
```
This starts the Motia runtime and the **Workbench** - a powerful UI for developing and debugging your workflows. By default, it's available at [`http://localhost:3000`](http://localhost:3000).
```bash
# Test your first endpoint
curl http://localhost:3000/hello
```
## Step Types
Every Step has a `type` that defines how it triggers:
| Type | When it runs | Use case |
|------|--------------|----------|
| **`api`** | HTTP request | REST APIs, webhooks |
| **`event`** | Event emitted | Background jobs, workflows |
| **`cron`** | Schedule | Cleanup, reports, reminders |
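A minimal API Step might look like this. This is an illustrative sketch, not the exact contents of the starter's `/hello` step:

```python
import asyncio

# hello_api_step.py (illustrative sketch of an API Step)
config = {
    "type": "api",
    "name": "HelloWorld",
    "path": "/hello",
    "method": "GET",
    "emits": [],
}

async def handler(req, context):
    # Motia invokes the handler per request; the returned dict maps
    # onto the HTTP response (status code and JSON body).
    return {"status": 200, "body": {"message": "Hello World"}}

# Quick local check outside the Motia runtime:
print(asyncio.run(handler({}, None))["body"]["message"])  # → Hello World
```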
## Development Commands
```bash
# Start Workbench and development server
npm run dev
# or
yarn dev
# or
pnpm dev
# Start production server (without hot reload)
npm run start
# or
yarn start
# or
pnpm start
# Generate TypeScript types from Step configs
npm run generate-types
# or
yarn generate-types
# or
pnpm generate-types
# Build project for deployment
npm run build
# or
yarn build
# or
pnpm build
```
## Project Structure
```
steps/ # Your Step definitions (or use src/)
motia.config.ts # Motia configuration
requirements.txt # Python dependencies
```
Steps are auto-discovered from your `steps/` or `src/` directories - no manual registration required. You can write Steps in Python, TypeScript, or JavaScript, all in the same project.
## Learn More
- [Documentation](https://motia.dev/docs) - Complete guides and API reference
- [Quick Start Guide](https://motia.dev/docs/getting-started/quick-start) - Detailed getting started tutorial
- [Core Concepts](https://motia.dev/docs/concepts/overview) - Learn about Steps and Motia architecture
- [Discord Community](https://discord.gg/motia) - Get help and connect with other developers

View File

@@ -0,0 +1,29 @@
[
{
"id": "hello-world-flow",
"config": {
"src/hello/process_greeting_step.py": {
"x": 409,
"y": 44
},
"src/hello/hello_api_step.py": {
"x": 0,
"y": 0,
"sourceHandlePosition": "right"
}
}
},
{
"id": "perf-test",
"config": {
"src/perf-test/perf_event_step.py": {
"x": 319,
"y": 22
},
"src/perf-test/perf_cron_step.py": {
"x": 0,
"y": 0
}
}
}
]

View File

@@ -0,0 +1,10 @@
import { defineConfig } from '@motiadev/core'
import endpointPlugin from '@motiadev/plugin-endpoint/plugin'
import logsPlugin from '@motiadev/plugin-logs/plugin'
import observabilityPlugin from '@motiadev/plugin-observability/plugin'
import statesPlugin from '@motiadev/plugin-states/plugin'
import bullmqPlugin from '@motiadev/plugin-bullmq/plugin'
export default defineConfig({
plugins: [observabilityPlugin, statesPlugin, endpointPlugin, logsPlugin, bullmqPlugin],
})

View File

@@ -0,0 +1,19 @@
{
"name": "Motia Project",
"description": "Motia event-driven backend framework project",
"rules": ["AGENTS.md"],
"context": [
".cursor/rules/motia/motia-config.mdc",
".cursor/rules/motia/api-steps.mdc",
".cursor/rules/motia/event-steps.mdc",
".cursor/rules/motia/cron-steps.mdc",
".cursor/rules/motia/realtime-streaming.mdc",
".cursor/rules/motia/virtual-steps.mdc",
".cursor/rules/motia/ui-steps.mdc",
".cursor/rules/motia/state-management.mdc",
".cursor/rules/motia/middlewares.mdc",
".cursor/architecture/architecture.mdc",
".cursor/architecture/error-handling.mdc"
],
"notes": "Read AGENTS.md first - it references all detailed guides in .cursor/rules/ directory"
}

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,33 @@
{
"name": "motia-clean-test",
"description": "",
"type": "module",
"scripts": {
"postinstall": "motia install",
"dev": "motia dev",
"start": "motia start",
"generate-types": "motia generate-types",
"build": "motia build",
"clean": "rm -rf dist node_modules python_modules .motia .mermaid"
},
"keywords": [
"motia"
],
"dependencies": {
"@motiadev/adapter-bullmq-events": "^0.17.11-beta.193",
"@motiadev/core": "^0.17.11-beta.193",
"@motiadev/plugin-bullmq": "^0.17.11-beta.193",
"@motiadev/plugin-endpoint": "^0.17.11-beta.193",
"@motiadev/plugin-logs": "^0.17.11-beta.193",
"@motiadev/plugin-observability": "^0.17.11-beta.193",
"@motiadev/plugin-states": "^0.17.11-beta.193",
"motia": "^0.17.11-beta.193",
"zod": "^4.1.12"
},
"devDependencies": {
"@motiadev/workbench": "^0.17.11-beta.193",
"@types/react": "^19.1.1",
"ts-node": "^10.9.2",
"typescript": "^5.7.3"
}
}

Binary file not shown.

After

Width:  |  Height:  |  Size: 87 KiB

View File

@@ -0,0 +1,952 @@
timestamp,type,batch_start,emission_duration,event_id,event_duration
14:42:00,cron_start,2025-12-31,,,
14:43:00,cron_start,2025-12-31,,,
14:44:00,cron_start,2025-12-31,,,
14:45:00,cron_start,2025-12-31,,,
14:46:00,cron_start,2025-12-31,,,
14:47:04,cron_start,2025-12-31,,,
14:48:34,cron_start,2025-12-31,,,
14:50:35,cron_start,2025-12-31,,,
14:53:12,cron_start,2025-12-31,,,
14:56:36,cron_start,2025-12-31,,,
14:42:01,emission,,1.135,,
14:43:03,emission,,3.452,,
14:44:04,emission,,4.422,,
14:45:06,emission,,6.048,,
14:46:08,emission,,7.921,,
14:47:14,emission,,9.359,,
14:48:45,emission,,11.004,,
14:50:48,emission,,12.66,,
14:53:26,emission,,14.074,,
14:42:00,event,,,0,3.16
14:42:00,event,,,1,4.01
14:42:00,event,,,2,3.25
14:42:00,event,,,3,1.57
14:42:00,event,,,4,1.63
14:42:00,event,,,5,1.64
14:42:00,event,,,6,1.57
14:42:00,event,,,7,1.56
14:42:00,event,,,8,1.99
14:42:00,event,,,9,1.59
14:42:00,event,,,12,1.67
14:42:00,event,,,10,1.71
14:42:00,event,,,11,1.74
14:42:00,event,,,13,1.63
14:42:00,event,,,14,1.64
14:42:00,event,,,15,1.68
14:42:00,event,,,16,1.63
14:42:00,event,,,17,1.18
14:42:00,event,,,19,1.67
14:42:00,event,,,18,2.78
14:42:00,event,,,21,1.58
14:42:00,event,,,20,1.86
14:42:00,event,,,22,3.48
14:42:00,event,,,23,1.69
14:42:00,event,,,24,1.57
14:42:00,event,,,25,1.57
14:42:00,event,,,26,2.26
14:42:00,event,,,27,1.61
14:42:00,event,,,28,1.53
14:42:00,event,,,30,2.11
14:42:00,event,,,29,1.75
14:42:00,event,,,31,1.59
14:42:00,event,,,32,1.59
14:42:00,event,,,33,1.83
14:42:00,event,,,34,1.58
14:42:00,event,,,36,1.55
14:42:00,event,,,35,1.55
14:42:00,event,,,37,1.55
14:42:00,event,,,38,1.58
14:42:00,event,,,39,1.62
14:42:00,event,,,40,4.06
14:42:00,event,,,41,1.54
14:42:00,event,,,42,1.65
14:42:00,event,,,43,1.63
14:42:00,event,,,44,1.65
14:42:00,event,,,46,1.58
14:42:00,event,,,45,1.55
14:42:00,event,,,47,1.59
14:42:00,event,,,48,1.59
14:42:00,event,,,49,1.61
14:42:00,event,,,50,1.59
14:42:00,event,,,51,1.59
14:42:00,event,,,52,1.6
14:42:00,event,,,53,1.6
14:42:00,event,,,54,1.6
14:42:00,event,,,55,1.55
14:42:00,event,,,57,1.57
14:42:00,event,,,56,3.87
14:42:00,event,,,58,1.78
14:42:00,event,,,59,1.56
14:42:00,event,,,60,1.64
14:42:00,event,,,61,1.58
14:42:00,event,,,62,1.57
14:42:00,event,,,63,1.83
14:42:00,event,,,64,1.57
14:42:00,event,,,66,1.6
14:42:00,event,,,65,1.54
14:42:00,event,,,67,1.86
14:42:00,event,,,68,1.55
14:42:00,event,,,69,1.61
14:42:00,event,,,70,1.63
14:42:00,event,,,71,1.56
14:42:00,event,,,72,1.55
14:42:00,event,,,73,1.54
14:42:00,event,,,74,1.5
14:42:00,event,,,75,1.58
14:42:00,event,,,76,1.65
14:42:00,event,,,77,1.58
14:42:00,event,,,78,1.58
14:42:00,event,,,79,1.52
14:42:00,event,,,80,1.53
14:42:00,event,,,81,1.55
14:42:00,event,,,82,1.61
14:42:00,event,,,83,1.6
14:42:01,event,,,84,1.56
14:42:01,event,,,85,1.54
14:42:01,event,,,86,1.56
14:42:01,event,,,87,1.61
14:42:01,event,,,88,1.59
14:42:01,event,,,89,1.55
14:42:01,event,,,90,1.61
14:42:01,event,,,91,1.6
14:42:01,event,,,92,1.59
14:42:01,event,,,93,1.62
14:42:01,event,,,94,1.59
14:42:01,event,,,95,1.56
14:42:01,event,,,96,1.52
14:42:01,event,,,97,1.16
14:42:01,event,,,98,1.68
14:42:01,event,,,99,1.57
14:43:00,event,,,1,4.07
14:43:00,event,,,2,2.46
14:43:00,event,,,0,1.67
14:43:00,event,,,4,1.56
14:43:00,event,,,3,1.61
14:43:00,event,,,5,1.57
14:43:00,event,,,6,1.85
14:43:00,event,,,7,1.8
14:43:00,event,,,8,2.79
14:43:00,event,,,9,1.75
14:43:00,event,,,10,1.58
14:43:00,event,,,11,1.51
14:43:00,event,,,12,1.54
14:43:00,event,,,13,1.61
14:43:00,event,,,14,1.6
14:43:00,event,,,15,1.55
14:43:00,event,,,16,1.56
14:43:00,event,,,17,1.56
14:43:00,event,,,18,1.57
14:43:00,event,,,19,1.6
14:43:00,event,,,20,1.56
14:43:00,event,,,21,1.52
14:43:00,event,,,22,1.61
14:43:00,event,,,23,1.58
14:43:00,event,,,24,1.9
14:43:00,event,,,25,1.62
14:43:00,event,,,26,1.54
14:43:00,event,,,27,1.57
14:43:00,event,,,28,1.56
14:43:00,event,,,29,1.6
14:43:00,event,,,30,1.51
14:43:00,event,,,31,1.6
14:43:00,event,,,33,1.59
14:43:00,event,,,32,1.58
14:43:00,event,,,34,1.53
14:43:00,event,,,35,1.54
14:43:00,event,,,36,2.88
14:43:00,event,,,37,1.59
14:43:00,event,,,38,1.59
14:43:00,event,,,39,1.86
14:43:00,event,,,40,1.55
14:43:00,event,,,41,1.58
14:43:00,event,,,42,1.7
14:43:00,event,,,43,1.45
14:43:00,event,,,44,1.54
14:43:00,event,,,45,1.53
14:43:00,event,,,46,1.56
14:43:00,event,,,47,1.54
14:43:00,event,,,48,1.48
14:43:00,event,,,49,1.58
14:43:00,event,,,50,1.57
14:43:00,event,,,52,1.63
14:43:00,event,,,51,1.53
14:43:00,event,,,53,1.6
14:43:00,event,,,55,1.58
14:43:00,event,,,54,1.55
14:43:00,event,,,56,1.57
14:43:01,event,,,57,1.56
14:43:01,event,,,58,1.53
14:43:01,event,,,60,1.6
14:43:01,event,,,59,1.52
14:43:01,event,,,61,1.59
14:43:01,event,,,62,1.62
14:43:01,event,,,63,1.56
14:43:01,event,,,64,1.59
14:43:01,event,,,65,1.6
14:43:01,event,,,66,1.86
14:43:01,event,,,67,2.89
14:43:01,event,,,68,1.55
14:43:01,event,,,69,1.92
14:43:01,event,,,70,1.64
14:43:01,event,,,71,1.56
14:43:01,event,,,72,1.62
14:43:01,event,,,73,1.47
14:43:01,event,,,74,1.65
14:43:01,event,,,75,1.64
14:43:02,event,,,76,1.66
14:43:02,event,,,77,1.62
14:43:02,event,,,78,1.62
14:43:02,event,,,80,1.8
14:43:02,event,,,79,1.62
14:43:02,event,,,81,1.58
14:43:02,event,,,82,1.9
14:43:02,event,,,83,1.62
14:43:02,event,,,84,1.58
14:43:02,event,,,85,1.59
14:43:02,event,,,86,1.69
14:43:02,event,,,87,1.31
14:43:02,event,,,88,1.62
14:43:03,event,,,89,1.68
14:43:03,event,,,90,1.54
14:43:03,event,,,91,1.67
14:43:03,event,,,92,1.67
14:43:03,event,,,93,1.59
14:43:03,event,,,94,1.56
14:43:03,event,,,95,1.63
14:43:03,event,,,96,1.63
14:43:03,event,,,97,1.7
14:43:03,event,,,98,1.6
14:43:03,event,,,99,1.52
14:44:00,event,,,0,1.57
14:44:00,event,,,1,2.17
14:44:00,event,,,3,2.62
14:44:00,event,,,4,1.57
14:44:00,event,,,2,1.54
14:44:00,event,,,5,1.29
14:44:00,event,,,6,1.56
14:44:00,event,,,7,1.47
14:44:00,event,,,8,2.11
14:44:00,event,,,9,1.76
14:44:00,event,,,10,1.61
14:44:00,event,,,11,1.61
14:44:00,event,,,12,1.56
14:44:00,event,,,13,1.59
14:44:00,event,,,14,1.54
14:44:00,event,,,15,1.55
14:44:00,event,,,16,1.5
14:44:00,event,,,17,1.56
14:44:00,event,,,18,1.58
14:44:00,event,,,19,1.61
14:44:00,event,,,20,1.55
14:44:00,event,,,21,1.64
14:44:00,event,,,22,1.62
14:44:00,event,,,23,1.57
14:44:00,event,,,24,1.53
14:44:00,event,,,26,1.51
14:44:00,event,,,25,1.58
14:44:00,event,,,27,1.53
14:44:00,event,,,28,1.55
14:44:00,event,,,30,1.58
14:44:00,event,,,29,1.53
14:44:00,event,,,31,1.66
14:44:00,event,,,32,1.57
14:44:00,event,,,33,1.64
14:44:01,event,,,34,1.63
14:44:01,event,,,35,1.63
14:44:01,event,,,36,1.56
14:44:01,event,,,37,1.64
14:44:01,event,,,39,1.55
14:44:01,event,,,38,1.53
14:44:01,event,,,40,1.57
14:44:01,event,,,41,1.64
14:44:01,event,,,42,1.51
14:44:01,event,,,43,1.58
14:44:01,event,,,45,1.57
14:44:01,event,,,44,1.56
14:44:01,event,,,46,1.61
14:44:01,event,,,47,1.59
14:44:01,event,,,48,1.6
14:44:01,event,,,50,1.4
14:44:01,event,,,49,1.64
14:44:01,event,,,51,1.63
14:44:01,event,,,53,1.52
14:44:01,event,,,52,1.53
14:44:01,event,,,54,1.64
14:44:01,event,,,55,1.77
14:44:01,event,,,56,1.6
14:44:01,event,,,57,1.64
14:44:02,event,,,58,1.61
14:44:02,event,,,59,1.67
14:44:02,event,,,60,1.64
14:44:02,event,,,61,1.67
14:44:02,event,,,62,1.61
14:44:02,event,,,63,1.62
14:44:02,event,,,64,1.63
14:44:02,event,,,65,1.59
14:44:02,event,,,66,1.65
14:44:02,event,,,67,1.59
14:44:02,event,,,68,1.57
14:44:02,event,,,69,1.65
14:44:02,event,,,70,1.63
14:44:02,event,,,71,1.6
14:44:02,event,,,72,1.58
14:44:02,event,,,73,1.6
14:44:02,event,,,74,1.59
14:44:03,event,,,75,1.54
14:44:03,event,,,76,1.57
14:44:03,event,,,77,1.71
14:44:03,event,,,78,1.7
14:44:03,event,,,79,1.56
14:44:03,event,,,80,1.63
14:44:03,event,,,81,1.55
14:44:03,event,,,82,1.74
14:44:03,event,,,83,1.6
14:44:03,event,,,84,1.62
14:44:03,event,,,85,1.75
14:44:03,event,,,86,1.64
14:44:03,event,,,87,1.62
14:44:03,event,,,88,1.61
14:44:04,event,,,89,1.61
14:44:04,event,,,90,1.6
14:44:04,event,,,91,1.63
14:44:04,event,,,92,1.69
14:44:04,event,,,93,1.54
14:44:04,event,,,94,1.66
14:44:04,event,,,95,1.61
14:44:04,event,,,96,1.76
14:44:04,event,,,97,1.59
14:44:04,event,,,98,1.58
14:44:05,event,,,99,1.61
14:45:00,event,,,1,1.57
14:45:00,event,,,0,4.81
14:45:00,event,,,3,2.62
14:45:00,event,,,4,2.45
14:45:00,event,,,2,4.06
14:45:00,event,,,6,1.41
14:45:00,event,,,5,1.03
14:45:00,event,,,7,1.77
14:45:00,event,,,8,1.56
14:45:00,event,,,9,1.53
14:45:00,event,,,10,1.52
14:45:00,event,,,11,1.62
14:45:00,event,,,12,1.55
14:45:00,event,,,13,1.59
14:45:00,event,,,14,1.56
14:45:00,event,,,15,1.58
14:45:00,event,,,16,1.53
14:45:00,event,,,17,1.58
14:45:00,event,,,18,1.58
14:45:00,event,,,19,1.56
14:45:00,event,,,20,1.52
14:45:00,event,,,21,1.68
14:45:00,event,,,22,1.55
14:45:00,event,,,23,1.54
14:45:00,event,,,24,1.59
14:45:00,event,,,25,1.52
14:45:00,event,,,26,1.53
14:45:00,event,,,27,1.61
14:45:00,event,,,28,1.57
14:45:00,event,,,29,1.56
14:45:00,event,,,30,1.58
14:45:01,event,,,31,1.57
14:45:01,event,,,32,1.56
14:45:01,event,,,33,1.55
14:45:01,event,,,34,1.61
14:45:01,event,,,35,1.52
14:45:01,event,,,36,1.57
14:45:01,event,,,38,1.59
14:45:01,event,,,37,1.55
14:45:01,event,,,39,1.51
14:45:01,event,,,40,1.58
14:45:01,event,,,41,1.52
14:45:01,event,,,42,1.57
14:45:01,event,,,43,1.6
14:45:01,event,,,44,1.62
14:45:01,event,,,45,1.53
14:45:01,event,,,46,1.63
14:45:01,event,,,47,1.65
14:45:01,event,,,48,1.55
14:45:01,event,,,49,1.79
14:45:01,event,,,50,1.64
14:45:02,event,,,51,1.54
14:45:02,event,,,53,1.58
14:45:02,event,,,52,1.53
14:45:02,event,,,54,1.67
14:45:02,event,,,55,1.54
14:45:02,event,,,56,1.5
14:45:02,event,,,57,1.53
14:45:02,event,,,59,1.58
14:45:02,event,,,58,1.54
14:45:02,event,,,60,1.65
14:45:02,event,,,61,1.65
14:45:02,event,,,62,1.62
14:45:02,event,,,63,1.55
14:45:02,event,,,64,4.18
14:45:02,event,,,65,1.6
14:45:02,event,,,66,1.63
14:45:03,event,,,67,1.6
14:45:03,event,,,68,1.62
14:45:03,event,,,69,1.69
14:45:03,event,,,70,1.59
14:45:03,event,,,71,1.64
14:45:03,event,,,72,1.63
14:45:03,event,,,73,1.61
14:45:03,event,,,74,1.56
14:45:03,event,,,75,1.65
14:45:03,event,,,76,1.66
14:45:03,event,,,77,1.51
14:45:04,event,,,78,1.61
14:45:04,event,,,79,1.72
14:45:04,event,,,80,1.58
14:45:04,event,,,81,1.7
14:45:04,event,,,82,1.62
14:45:04,event,,,83,1.52
14:45:04,event,,,84,1.59
14:45:04,event,,,85,1.58
14:45:05,event,,,86,1.64
14:45:05,event,,,87,1.66
14:45:05,event,,,88,1.59
14:45:05,event,,,89,1.64
14:45:05,event,,,90,1.65
14:45:05,event,,,91,1.59
14:45:05,event,,,92,1.58
14:45:05,event,,,93,1.69
14:45:05,event,,,94,1.68
14:45:06,event,,,95,1.68
14:45:06,event,,,96,2.0
14:45:06,event,,,97,1.62
14:45:06,event,,,98,1.65
14:45:06,event,,,99,1.73
14:46:00,event,,,0,1.17
14:46:00,event,,,2,3.74
14:46:00,event,,,1,3.63
14:46:00,event,,,3,1.72
14:46:00,event,,,4,1.69
14:46:00,event,,,5,1.53
14:46:00,event,,,6,1.57
14:46:00,event,,,7,2.29
14:46:00,event,,,8,4.06
14:46:00,event,,,9,1.81
14:46:00,event,,,10,1.56
14:46:00,event,,,11,1.6
14:46:00,event,,,12,1.55
14:46:01,event,,,13,1.61
14:46:01,event,,,14,1.52
14:46:01,event,,,15,1.57
14:46:01,event,,,16,1.56
14:46:01,event,,,17,1.54
14:46:01,event,,,18,1.57
14:46:01,event,,,19,1.58
14:46:01,event,,,20,1.54
14:46:01,event,,,21,1.57
14:46:01,event,,,22,1.58
14:46:01,event,,,23,1.59
14:46:01,event,,,24,1.58
14:46:01,event,,,25,1.66
14:46:01,event,,,26,1.53
14:46:01,event,,,27,1.59
14:46:01,event,,,28,1.52
14:46:01,event,,,29,1.52
14:46:01,event,,,30,1.59
14:46:01,event,,,31,1.57
14:46:01,event,,,32,1.58
14:46:01,event,,,33,1.53
14:46:01,event,,,34,1.62
14:46:01,event,,,35,1.59
14:46:01,event,,,36,1.62
14:46:02,event,,,37,1.7
14:46:02,event,,,38,1.61
14:46:02,event,,,39,1.61
14:46:02,event,,,40,1.88
14:46:02,event,,,41,1.61
14:46:02,event,,,42,3.01
14:46:02,event,,,43,1.63
14:46:02,event,,,45,1.58
14:46:02,event,,,44,1.53
14:46:02,event,,,46,1.53
14:46:02,event,,,47,1.57
14:46:02,event,,,48,1.59
14:46:02,event,,,49,1.6
14:46:02,event,,,50,1.66
14:46:02,event,,,51,1.59
14:46:02,event,,,52,1.65
14:46:03,event,,,53,1.52
14:46:03,event,,,54,1.65
14:46:03,event,,,55,1.52
14:46:03,event,,,56,1.58
14:46:03,event,,,57,1.8
14:46:03,event,,,58,1.65
14:46:03,event,,,59,1.7
14:46:03,event,,,60,1.69
14:46:03,event,,,61,1.5
14:46:03,event,,,62,1.51
14:46:03,event,,,63,1.67
14:46:04,event,,,65,1.64
14:46:04,event,,,64,1.91
14:46:04,event,,,66,1.61
14:46:04,event,,,67,1.63
14:46:04,event,,,68,1.71
14:46:04,event,,,69,1.66
14:46:04,event,,,70,1.66
14:46:04,event,,,71,1.62
14:46:05,event,,,72,1.57
14:46:05,event,,,73,1.65
14:46:05,event,,,74,1.56
14:46:05,event,,,75,1.59
14:46:05,event,,,76,1.66
14:46:05,event,,,77,1.06
14:46:05,event,,,78,1.84
14:46:05,event,,,79,1.57
14:46:06,event,,,81,1.62
14:46:06,event,,,80,1.68
14:46:06,event,,,82,1.66
14:46:06,event,,,83,1.55
14:46:06,event,,,84,1.63
14:46:06,event,,,85,1.55
14:46:06,event,,,86,1.58
14:46:07,event,,,87,1.65
14:46:07,event,,,88,1.62
14:46:07,event,,,89,3.34
14:46:07,event,,,90,1.56
14:46:07,event,,,91,1.59
14:46:07,event,,,92,1.59
14:46:07,event,,,93,1.58
14:46:08,event,,,94,1.65
14:46:08,event,,,95,1.57
14:46:08,event,,,96,1.58
14:46:08,event,,,97,1.63
14:46:08,event,,,98,2.76
14:46:08,event,,,99,1.8
14:47:04,event,,,0,1.53
14:47:04,event,,,1,1.61
14:47:04,event,,,2,1.56
14:47:04,event,,,3,1.79
14:47:04,event,,,4,1.59
14:47:04,event,,,5,1.52
14:47:04,event,,,6,1.55
14:47:04,event,,,7,1.57
14:47:04,event,,,8,1.56
14:47:04,event,,,9,1.57
14:47:04,event,,,10,1.58
14:47:04,event,,,11,1.51
14:47:04,event,,,12,1.63
14:47:04,event,,,13,1.58
14:47:04,event,,,14,1.61
14:47:05,event,,,15,1.57
14:47:05,event,,,16,1.51
14:47:05,event,,,17,1.59
14:47:05,event,,,18,1.55
14:47:05,event,,,19,1.12
14:47:05,event,,,20,1.69
14:47:05,event,,,21,1.6
14:47:05,event,,,22,1.62
14:47:05,event,,,23,1.77
14:47:05,event,,,24,1.58
14:47:05,event,,,25,1.55
14:47:05,event,,,26,1.58
14:47:05,event,,,27,1.62
14:47:05,event,,,28,1.73
14:47:05,event,,,29,1.64
14:47:05,event,,,30,1.59
14:47:05,event,,,31,1.64
14:47:05,event,,,32,1.61
14:47:05,event,,,33,1.6
14:47:06,event,,,34,1.63
14:47:06,event,,,35,1.68
14:47:06,event,,,36,1.57
14:47:06,event,,,37,1.7
14:47:06,event,,,38,1.56
14:47:06,event,,,39,1.63
14:47:06,event,,,40,1.65
14:47:06,event,,,41,1.6
14:47:06,event,,,42,1.69
14:47:06,event,,,43,1.3
14:47:06,event,,,44,1.7
14:47:06,event,,,45,1.62
14:47:06,event,,,46,1.62
14:47:06,event,,,47,1.65
14:47:07,event,,,48,1.69
14:47:07,event,,,49,1.7
14:47:07,event,,,50,1.6
14:47:07,event,,,51,1.59
14:47:07,event,,,52,1.45
14:47:07,event,,,53,1.54
14:47:07,event,,,54,1.6
14:47:07,event,,,55,1.3
14:47:07,event,,,56,1.61
14:47:07,event,,,57,1.73
14:47:08,event,,,58,1.69
14:47:08,event,,,60,1.65
14:47:08,event,,,59,1.7
14:47:08,event,,,61,1.78
14:47:08,event,,,62,1.76
14:47:08,event,,,63,1.67
14:47:08,event,,,64,1.62
14:47:08,event,,,65,1.64
14:47:09,event,,,67,1.7
14:47:09,event,,,66,1.6
14:47:09,event,,,68,2.0
14:47:09,event,,,69,1.94
14:47:09,event,,,70,1.68
14:47:09,event,,,71,1.6
14:47:09,event,,,72,1.85
14:47:09,event,,,73,1.75
14:47:10,event,,,74,1.65
14:47:10,event,,,75,1.58
14:47:10,event,,,76,1.6
14:47:10,event,,,77,1.65
14:47:10,event,,,78,1.52
14:47:10,event,,,79,1.8
14:47:10,event,,,80,1.57
14:47:11,event,,,81,1.64
14:47:11,event,,,82,1.89
14:47:11,event,,,83,1.69
14:47:11,event,,,84,1.64
14:47:11,event,,,85,1.66
14:47:11,event,,,86,1.54
14:47:12,event,,,87,1.81
14:47:12,event,,,88,1.64
14:47:12,event,,,89,1.66
14:47:12,event,,,90,1.6
14:47:12,event,,,91,1.67
14:47:12,event,,,92,1.69
14:47:12,event,,,93,1.64
14:47:13,event,,,94,1.69
14:47:13,event,,,95,1.74
14:47:13,event,,,96,1.72
14:47:14,event,,,97,1.63
14:47:14,event,,,98,1.61
14:47:14,event,,,99,2.02
14:48:34,event,,,0,1.6
14:48:34,event,,,1,1.55
14:48:34,event,,,2,1.57
14:48:34,event,,,3,1.55
14:48:34,event,,,4,1.75
14:48:34,event,,,5,1.6
14:48:34,event,,,6,1.72
14:48:34,event,,,7,1.54
14:48:34,event,,,8,1.6
14:48:34,event,,,9,1.56
14:48:34,event,,,10,1.51
14:48:34,event,,,11,1.57
14:48:34,event,,,12,1.55
14:48:34,event,,,13,1.63
14:48:34,event,,,14,1.65
14:48:34,event,,,15,1.62
14:48:34,event,,,16,1.55
14:48:34,event,,,17,1.59
14:48:34,event,,,18,1.75
14:48:34,event,,,19,1.63
14:48:34,event,,,20,1.49
14:48:34,event,,,21,1.65
14:48:34,event,,,22,1.55
14:48:35,event,,,23,1.57
14:48:35,event,,,24,1.58
14:48:35,event,,,25,1.67
14:48:35,event,,,26,1.66
14:48:35,event,,,27,1.62
14:48:35,event,,,28,1.65
14:48:35,event,,,29,1.58
14:48:35,event,,,30,1.55
14:48:35,event,,,31,1.73
14:48:35,event,,,32,1.78
14:48:35,event,,,33,1.68
14:48:35,event,,,34,1.69
14:48:35,event,,,35,2.5
14:48:35,event,,,36,1.83
14:48:35,event,,,37,1.58
14:48:36,event,,,38,1.58
14:48:36,event,,,39,3.11
14:48:36,event,,,40,1.63
14:48:36,event,,,41,1.62
14:48:36,event,,,42,1.59
14:48:36,event,,,43,1.61
14:48:36,event,,,44,1.57
14:48:36,event,,,45,1.63
14:48:36,event,,,46,1.58
14:48:37,event,,,47,1.63
14:48:37,event,,,48,1.7
14:48:37,event,,,49,1.69
14:48:37,event,,,50,1.62
14:48:37,event,,,51,1.61
14:48:37,event,,,52,4.0
14:48:37,event,,,53,1.6
14:48:37,event,,,54,1.59
14:48:37,event,,,55,1.63
14:48:38,event,,,56,1.75
14:48:38,event,,,58,1.63
14:48:38,event,,,57,1.68
14:48:38,event,,,59,1.74
14:48:38,event,,,60,4.02
14:48:38,event,,,61,1.77
14:48:38,event,,,62,3.6
14:48:38,event,,,63,1.64
14:48:38,event,,,64,1.55
14:48:39,event,,,65,1.7
14:48:39,event,,,66,2.86
14:48:39,event,,,67,1.66
14:48:39,event,,,68,1.2
14:48:39,event,,,69,1.97
14:48:39,event,,,70,1.71
14:48:40,event,,,71,1.6
14:48:40,event,,,72,1.58
14:48:40,event,,,73,1.72
14:48:40,event,,,74,1.67
14:48:40,event,,,75,1.75
14:48:40,event,,,76,1.96
14:48:41,event,,,77,1.63
14:48:41,event,,,78,1.72
14:48:41,event,,,79,1.63
14:48:41,event,,,80,1.58
14:48:41,event,,,81,1.65
14:48:42,event,,,82,1.59
14:48:42,event,,,83,1.64
14:48:42,event,,,84,1.72
14:48:42,event,,,85,1.6
14:48:42,event,,,86,1.65
14:48:42,event,,,87,1.49
14:48:43,event,,,88,1.65
14:48:43,event,,,89,1.68
14:48:43,event,,,90,1.63
14:48:43,event,,,91,1.61
14:48:44,event,,,92,1.6
14:48:44,event,,,93,1.67
14:48:44,event,,,94,1.67
14:48:44,event,,,95,1.67
14:48:45,event,,,96,1.57
14:48:45,event,,,97,1.65
14:48:45,event,,,98,1.58
14:48:45,event,,,99,1.98
14:50:35,event,,,0,1.59
14:50:35,event,,,1,1.56
14:50:35,event,,,2,1.58
14:50:35,event,,,3,1.58
14:50:35,event,,,4,1.58
14:50:35,event,,,5,1.56
14:50:35,event,,,6,1.56
14:50:35,event,,,7,1.56
14:50:35,event,,,8,1.55
14:50:35,event,,,9,1.53
14:50:36,event,,,10,1.58
14:50:36,event,,,11,1.53
14:50:36,event,,,12,1.65
14:50:36,event,,,13,1.62
14:50:36,event,,,14,1.56
14:50:36,event,,,15,1.6
14:50:36,event,,,16,1.66
14:50:36,event,,,17,1.53
14:50:36,event,,,18,1.7
14:50:36,event,,,19,1.6
14:50:36,event,,,20,1.61
14:50:36,event,,,21,1.66
14:50:36,event,,,22,1.7
14:50:36,event,,,23,1.78
14:50:36,event,,,24,1.61
14:50:36,event,,,25,1.62
14:50:36,event,,,26,1.53
14:50:36,event,,,27,1.62
14:50:37,event,,,28,1.7
14:50:37,event,,,29,1.62
14:50:37,event,,,30,1.89
14:50:37,event,,,31,1.7
14:50:37,event,,,32,1.61
14:50:37,event,,,33,1.57
14:50:37,event,,,34,1.79
14:50:37,event,,,35,1.62
14:50:37,event,,,36,1.95
14:50:37,event,,,37,1.59
14:50:37,event,,,38,1.63
14:50:37,event,,,39,1.72
14:50:38,event,,,41,1.78
14:50:38,event,,,40,1.59
14:50:38,event,,,42,1.6
14:50:38,event,,,43,1.6
14:50:38,event,,,44,1.61
14:50:38,event,,,45,1.9
14:50:38,event,,,46,1.62
14:50:38,event,,,47,1.73
14:50:38,event,,,48,1.73
14:50:39,event,,,49,1.63
14:50:39,event,,,50,1.67
14:50:39,event,,,51,1.67
14:50:39,event,,,52,1.64
14:50:39,event,,,53,1.9
14:50:39,event,,,54,1.68
14:50:40,event,,,55,1.67
14:50:40,event,,,56,1.66
14:50:40,event,,,57,1.78
14:50:40,event,,,58,1.59
14:50:40,event,,,59,1.68
14:50:40,event,,,60,1.62
14:50:40,event,,,61,1.58
14:50:41,event,,,62,1.63
14:50:41,event,,,63,1.94
14:50:41,event,,,64,1.86
14:50:41,event,,,65,1.8
14:50:41,event,,,66,1.71
14:50:41,event,,,67,1.62
14:50:42,event,,,68,2.02
14:50:42,event,,,69,1.63
14:50:42,event,,,70,1.6
14:50:42,event,,,71,1.57
14:50:42,event,,,72,1.69
14:50:42,event,,,73,1.54
14:50:43,event,,,74,1.63
14:50:43,event,,,75,1.55
14:50:43,event,,,76,1.8
14:50:43,event,,,77,1.69
14:50:43,event,,,78,1.61
14:50:43,event,,,79,1.53
14:50:44,event,,,80,1.59
14:50:44,event,,,81,1.61
14:50:44,event,,,82,2.05
14:50:45,event,,,83,1.62
14:50:45,event,,,84,1.64
14:50:45,event,,,85,1.54
14:50:45,event,,,86,1.63
14:50:45,event,,,87,1.74
14:50:46,event,,,88,1.67
14:50:46,event,,,89,1.58
14:50:46,event,,,90,1.8
14:50:46,event,,,91,1.59
14:50:46,event,,,92,1.58
14:50:47,event,,,93,1.56
14:50:47,event,,,94,1.65
14:50:47,event,,,95,1.68
14:50:47,event,,,96,1.58
14:50:48,event,,,97,1.52
14:50:48,event,,,98,1.64
14:50:48,event,,,99,1.89
14:53:12,event,,,0,1.61
14:53:12,event,,,1,1.59
14:53:12,event,,,2,1.55
14:53:12,event,,,3,1.52
14:53:12,event,,,4,1.54
14:53:12,event,,,5,1.57
14:53:12,event,,,6,1.59
14:53:12,event,,,7,1.57
14:53:12,event,,,8,1.55
14:53:12,event,,,9,1.6
14:53:12,event,,,10,1.62
14:53:12,event,,,11,1.56
14:53:12,event,,,12,1.57
14:53:12,event,,,13,1.59
14:53:12,event,,,14,1.64
14:53:12,event,,,15,1.71
14:53:12,event,,,16,1.6
14:53:12,event,,,17,1.65
14:53:12,event,,,18,1.7
14:53:13,event,,,19,1.63
14:53:13,event,,,20,1.73
14:53:13,event,,,21,1.58
14:53:13,event,,,22,1.65
14:53:13,event,,,23,1.57
14:53:13,event,,,24,1.67
14:53:13,event,,,25,1.64
14:53:13,event,,,26,1.51
14:53:13,event,,,27,2.36
14:53:13,event,,,28,1.04
14:53:13,event,,,29,1.63
14:53:13,event,,,30,1.74
14:53:14,event,,,31,1.62
14:53:14,event,,,32,1.62
14:53:14,event,,,33,1.74
14:53:14,event,,,34,1.67
14:53:14,event,,,35,1.61
14:53:14,event,,,36,1.78
14:53:14,event,,,37,1.64
14:53:14,event,,,38,1.67
14:53:14,event,,,39,1.8
14:53:15,event,,,40,1.68
14:53:15,event,,,41,1.67
14:53:15,event,,,42,1.64
14:53:15,event,,,44,1.77
14:53:15,event,,,43,1.65
14:53:15,event,,,45,1.65
14:53:15,event,,,46,1.69
14:53:15,event,,,47,1.68
14:53:16,event,,,48,1.54
14:53:16,event,,,49,1.83
14:53:16,event,,,50,1.61
14:53:16,event,,,51,1.64
14:53:16,event,,,52,1.89
14:53:16,event,,,53,1.56
14:53:17,event,,,54,1.72
14:53:17,event,,,55,1.52
14:53:17,event,,,56,1.54
14:53:17,event,,,57,1.73
14:53:17,event,,,58,1.07
14:53:17,event,,,59,1.91
14:53:18,event,,,60,1.7
14:53:18,event,,,61,3.93
14:53:18,event,,,62,1.59
14:53:18,event,,,63,1.64
14:53:18,event,,,64,1.63
14:53:19,event,,,65,1.7
14:53:19,event,,,66,1.64
14:53:19,event,,,67,1.63
14:53:19,event,,,68,1.6
14:53:19,event,,,69,1.19
14:53:19,event,,,70,3.76
14:53:20,event,,,71,1.62
14:53:20,event,,,72,1.45
14:53:20,event,,,73,1.89
14:53:20,event,,,74,1.95
14:53:21,event,,,75,1.95
14:53:21,event,,,76,1.81
14:53:21,event,,,77,1.62
14:53:21,event,,,78,1.63
14:53:21,event,,,79,2.07
14:53:21,event,,,80,1.64
14:53:22,event,,,81,1.21
14:53:22,event,,,82,1.62
14:53:22,event,,,83,1.67
14:53:22,event,,,84,1.65
14:53:23,event,,,85,1.64
14:53:23,event,,,86,1.59
14:53:23,event,,,87,1.68
14:53:24,event,,,88,1.6
14:53:24,event,,,89,4.69
14:53:24,event,,,90,1.72
14:53:24,event,,,91,1.63
14:53:25,event,,,92,1.59
14:53:25,event,,,93,1.74
14:53:25,event,,,94,1.68
14:53:25,event,,,95,1.65
14:53:26,event,,,96,1.61
14:53:26,event,,,97,1.63
14:53:26,event,,,98,1.65
14:53:26,event,,,99,1.71
14:56:36,event,,,0,1.62
14:56:36,event,,,1,1.61
14:56:36,event,,,2,1.57
14:56:36,event,,,3,1.55
14:56:36,event,,,4,1.51
14:56:36,event,,,5,1.59
14:56:36,event,,,6,1.59
14:56:36,event,,,7,1.57
14:56:36,event,,,8,1.56
14:56:36,event,,,9,1.52
14:56:36,event,,,10,1.6
14:56:36,event,,,11,1.63
14:56:36,event,,,12,1.58
14:56:36,event,,,13,1.58
14:56:36,event,,,14,1.78
14:56:36,event,,,15,1.64
14:56:36,event,,,16,1.61
14:56:36,event,,,17,1.59
14:56:37,event,,,18,1.92
14:56:37,event,,,19,1.77
14:56:37,event,,,20,1.64
14:56:37,event,,,21,1.65
14:56:37,event,,,22,1.62
14:56:37,event,,,23,1.6
14:56:37,event,,,24,1.55
14:56:37,event,,,25,1.6
14:56:37,event,,,26,1.59
14:56:37,event,,,27,1.65
14:56:37,event,,,28,1.8
14:56:38,event,,,29,1.74
14:56:38,event,,,30,1.66
14:56:38,event,,,31,1.78
timestamp,type,batch_start,emission_duration,event_id,event_duration
14:42:00,cron_start,2025-12-31,,,
14:43:00,cron_start,2025-12-31,,,
14:44:00,cron_start,2025-12-31,,,
14:45:00,cron_start,2025-12-31,,,
14:46:00,cron_start,2025-12-31,,,
14:47:04,cron_start,2025-12-31,,,
14:48:34,cron_start,2025-12-31,,,
14:50:35,cron_start,2025-12-31,,,
14:53:12,cron_start,2025-12-31,,,
14:56:36,cron_start,2025-12-31,,,
14:42:01,emission,,1.135,,
14:43:03,emission,,3.452,,
14:44:04,emission,,4.422,,
14:45:06,emission,,6.048,,
14:46:08,emission,,7.921,,
14:47:14,emission,,9.359,,
14:48:45,emission,,11.004,,
14:50:48,emission,,12.66,,
14:53:26,emission,,14.074,,
14:42:00,event,,,0,3.16
14:42:00,event,,,1,4.01
14:42:00,event,,,2,3.25
14:42:00,event,,,3,1.57
14:42:00,event,,,4,1.63
14:42:00,event,,,5,1.64
14:42:00,event,,,6,1.57
14:42:00,event,,,7,1.56
14:42:00,event,,,8,1.99
14:42:00,event,,,9,1.59
14:42:00,event,,,12,1.67
14:42:00,event,,,10,1.71
14:42:00,event,,,11,1.74
14:42:00,event,,,13,1.63
14:42:00,event,,,14,1.64
14:42:00,event,,,15,1.68
14:42:00,event,,,16,1.63
14:42:00,event,,,17,1.18
14:42:00,event,,,19,1.67
14:42:00,event,,,18,2.78
14:42:00,event,,,21,1.58
14:42:00,event,,,20,1.86
14:42:00,event,,,22,3.48
14:42:00,event,,,23,1.69
14:42:00,event,,,24,1.57
14:42:00,event,,,25,1.57
14:42:00,event,,,26,2.26
14:42:00,event,,,27,1.61
14:42:00,event,,,28,1.53
14:42:00,event,,,30,2.11
14:42:00,event,,,29,1.75
14:42:00,event,,,31,1.59
14:42:00,event,,,32,1.59
14:42:00,event,,,33,1.83
14:42:00,event,,,34,1.58
14:42:00,event,,,36,1.55
14:42:00,event,,,35,1.55
14:42:00,event,,,37,1.55
14:42:00,event,,,38,1.58
14:42:00,event,,,39,1.62
14:42:00,event,,,40,4.06
14:42:00,event,,,41,1.54
14:42:00,event,,,42,1.65
14:42:00,event,,,43,1.63
14:42:00,event,,,44,1.65
14:42:00,event,,,46,1.58
14:42:00,event,,,45,1.55
14:42:00,event,,,47,1.59
14:42:00,event,,,48,1.59
14:42:00,event,,,49,1.61
14:42:00,event,,,50,1.59
14:42:00,event,,,51,1.59
14:42:00,event,,,52,1.6
14:42:00,event,,,53,1.6
14:42:00,event,,,54,1.6
14:42:00,event,,,55,1.55
14:42:00,event,,,57,1.57
14:42:00,event,,,56,3.87
14:42:00,event,,,58,1.78
14:42:00,event,,,59,1.56
14:42:00,event,,,60,1.64
14:42:00,event,,,61,1.58
14:42:00,event,,,62,1.57
14:42:00,event,,,63,1.83
14:42:00,event,,,64,1.57
14:42:00,event,,,66,1.6
14:42:00,event,,,65,1.54
14:42:00,event,,,67,1.86
14:42:00,event,,,68,1.55
14:42:00,event,,,69,1.61
14:42:00,event,,,70,1.63
14:42:00,event,,,71,1.56
14:42:00,event,,,72,1.55
14:42:00,event,,,73,1.54
14:42:00,event,,,74,1.5
14:42:00,event,,,75,1.58
14:42:00,event,,,76,1.65
14:42:00,event,,,77,1.58
14:42:00,event,,,78,1.58
14:42:00,event,,,79,1.52
14:42:00,event,,,80,1.53
14:42:00,event,,,81,1.55
14:42:00,event,,,82,1.61
14:42:00,event,,,83,1.6
14:42:01,event,,,84,1.56
14:42:01,event,,,85,1.54
14:42:01,event,,,86,1.56
14:42:01,event,,,87,1.61
14:42:01,event,,,88,1.59
14:42:01,event,,,89,1.55
14:42:01,event,,,90,1.61
14:42:01,event,,,91,1.6
14:42:01,event,,,92,1.59
14:42:01,event,,,93,1.62
14:42:01,event,,,94,1.59
14:42:01,event,,,95,1.56
14:42:01,event,,,96,1.52
14:42:01,event,,,97,1.16
14:42:01,event,,,98,1.68
14:42:01,event,,,99,1.57
575 14:47:07 event 54 1.6
576 14:47:07 event 55 1.3
577 14:47:07 event 56 1.61
578 14:47:07 event 57 1.73
579 14:47:08 event 58 1.69
580 14:47:08 event 60 1.65
581 14:47:08 event 59 1.7
582 14:47:08 event 61 1.78
583 14:47:08 event 62 1.76
584 14:47:08 event 63 1.67
585 14:47:08 event 64 1.62
586 14:47:08 event 65 1.64
587 14:47:09 event 67 1.7
588 14:47:09 event 66 1.6
589 14:47:09 event 68 2.0
590 14:47:09 event 69 1.94
591 14:47:09 event 70 1.68
592 14:47:09 event 71 1.6
593 14:47:09 event 72 1.85
594 14:47:09 event 73 1.75
595 14:47:10 event 74 1.65
596 14:47:10 event 75 1.58
597 14:47:10 event 76 1.6
598 14:47:10 event 77 1.65
599 14:47:10 event 78 1.52
600 14:47:10 event 79 1.8
601 14:47:10 event 80 1.57
602 14:47:11 event 81 1.64
603 14:47:11 event 82 1.89
604 14:47:11 event 83 1.69
605 14:47:11 event 84 1.64
606 14:47:11 event 85 1.66
607 14:47:11 event 86 1.54
608 14:47:12 event 87 1.81
609 14:47:12 event 88 1.64
610 14:47:12 event 89 1.66
611 14:47:12 event 90 1.6
612 14:47:12 event 91 1.67
613 14:47:12 event 92 1.69
614 14:47:12 event 93 1.64
615 14:47:13 event 94 1.69
616 14:47:13 event 95 1.74
617 14:47:13 event 96 1.72
618 14:47:14 event 97 1.63
619 14:47:14 event 98 1.61
620 14:47:14 event 99 2.02
621 14:48:34 event 0 1.6
622 14:48:34 event 1 1.55
623 14:48:34 event 2 1.57
624 14:48:34 event 3 1.55
625 14:48:34 event 4 1.75
626 14:48:34 event 5 1.6
627 14:48:34 event 6 1.72
628 14:48:34 event 7 1.54
629 14:48:34 event 8 1.6
630 14:48:34 event 9 1.56
631 14:48:34 event 10 1.51
632 14:48:34 event 11 1.57
633 14:48:34 event 12 1.55
634 14:48:34 event 13 1.63
635 14:48:34 event 14 1.65
636 14:48:34 event 15 1.62
637 14:48:34 event 16 1.55
638 14:48:34 event 17 1.59
639 14:48:34 event 18 1.75
640 14:48:34 event 19 1.63
641 14:48:34 event 20 1.49
642 14:48:34 event 21 1.65
643 14:48:34 event 22 1.55
644 14:48:35 event 23 1.57
645 14:48:35 event 24 1.58
646 14:48:35 event 25 1.67
647 14:48:35 event 26 1.66
648 14:48:35 event 27 1.62
649 14:48:35 event 28 1.65
650 14:48:35 event 29 1.58
651 14:48:35 event 30 1.55
652 14:48:35 event 31 1.73
653 14:48:35 event 32 1.78
654 14:48:35 event 33 1.68
655 14:48:35 event 34 1.69
656 14:48:35 event 35 2.5
657 14:48:35 event 36 1.83
658 14:48:35 event 37 1.58
659 14:48:36 event 38 1.58
660 14:48:36 event 39 3.11
661 14:48:36 event 40 1.63
662 14:48:36 event 41 1.62
663 14:48:36 event 42 1.59
664 14:48:36 event 43 1.61
665 14:48:36 event 44 1.57
666 14:48:36 event 45 1.63
667 14:48:36 event 46 1.58
668 14:48:37 event 47 1.63
669 14:48:37 event 48 1.7
670 14:48:37 event 49 1.69
671 14:48:37 event 50 1.62
672 14:48:37 event 51 1.61
673 14:48:37 event 52 4.0
674 14:48:37 event 53 1.6
675 14:48:37 event 54 1.59
676 14:48:37 event 55 1.63
677 14:48:38 event 56 1.75
678 14:48:38 event 58 1.63
679 14:48:38 event 57 1.68
680 14:48:38 event 59 1.74
681 14:48:38 event 60 4.02
682 14:48:38 event 61 1.77
683 14:48:38 event 62 3.6
684 14:48:38 event 63 1.64
685 14:48:38 event 64 1.55
686 14:48:39 event 65 1.7
687 14:48:39 event 66 2.86
688 14:48:39 event 67 1.66
689 14:48:39 event 68 1.2
690 14:48:39 event 69 1.97
691 14:48:39 event 70 1.71
692 14:48:40 event 71 1.6
693 14:48:40 event 72 1.58
694 14:48:40 event 73 1.72
695 14:48:40 event 74 1.67
696 14:48:40 event 75 1.75
697 14:48:40 event 76 1.96
698 14:48:41 event 77 1.63
699 14:48:41 event 78 1.72
700 14:48:41 event 79 1.63
701 14:48:41 event 80 1.58
702 14:48:41 event 81 1.65
703 14:48:42 event 82 1.59
704 14:48:42 event 83 1.64
705 14:48:42 event 84 1.72
706 14:48:42 event 85 1.6
707 14:48:42 event 86 1.65
708 14:48:42 event 87 1.49
709 14:48:43 event 88 1.65
710 14:48:43 event 89 1.68
711 14:48:43 event 90 1.63
712 14:48:43 event 91 1.61
713 14:48:44 event 92 1.6
714 14:48:44 event 93 1.67
715 14:48:44 event 94 1.67
716 14:48:44 event 95 1.67
717 14:48:45 event 96 1.57
718 14:48:45 event 97 1.65
719 14:48:45 event 98 1.58
720 14:48:45 event 99 1.98
721 14:50:35 event 0 1.59
722 14:50:35 event 1 1.56
723 14:50:35 event 2 1.58
724 14:50:35 event 3 1.58
725 14:50:35 event 4 1.58
726 14:50:35 event 5 1.56
727 14:50:35 event 6 1.56
728 14:50:35 event 7 1.56
729 14:50:35 event 8 1.55
730 14:50:35 event 9 1.53
731 14:50:36 event 10 1.58
732 14:50:36 event 11 1.53
733 14:50:36 event 12 1.65
734 14:50:36 event 13 1.62
735 14:50:36 event 14 1.56
736 14:50:36 event 15 1.6
737 14:50:36 event 16 1.66
738 14:50:36 event 17 1.53
739 14:50:36 event 18 1.7
740 14:50:36 event 19 1.6
741 14:50:36 event 20 1.61
742 14:50:36 event 21 1.66
743 14:50:36 event 22 1.7
744 14:50:36 event 23 1.78
745 14:50:36 event 24 1.61
746 14:50:36 event 25 1.62
747 14:50:36 event 26 1.53
748 14:50:36 event 27 1.62
749 14:50:37 event 28 1.7
750 14:50:37 event 29 1.62
751 14:50:37 event 30 1.89
752 14:50:37 event 31 1.7
753 14:50:37 event 32 1.61
754 14:50:37 event 33 1.57
755 14:50:37 event 34 1.79
756 14:50:37 event 35 1.62
757 14:50:37 event 36 1.95
758 14:50:37 event 37 1.59
759 14:50:37 event 38 1.63
760 14:50:37 event 39 1.72
761 14:50:38 event 41 1.78
762 14:50:38 event 40 1.59
763 14:50:38 event 42 1.6
764 14:50:38 event 43 1.6
765 14:50:38 event 44 1.61
766 14:50:38 event 45 1.9
767 14:50:38 event 46 1.62
768 14:50:38 event 47 1.73
769 14:50:38 event 48 1.73
770 14:50:39 event 49 1.63
771 14:50:39 event 50 1.67
772 14:50:39 event 51 1.67
773 14:50:39 event 52 1.64
774 14:50:39 event 53 1.9
775 14:50:39 event 54 1.68
776 14:50:40 event 55 1.67
777 14:50:40 event 56 1.66
778 14:50:40 event 57 1.78
779 14:50:40 event 58 1.59
780 14:50:40 event 59 1.68
781 14:50:40 event 60 1.62
782 14:50:40 event 61 1.58
783 14:50:41 event 62 1.63
784 14:50:41 event 63 1.94
785 14:50:41 event 64 1.86
786 14:50:41 event 65 1.8
787 14:50:41 event 66 1.71
788 14:50:41 event 67 1.62
789 14:50:42 event 68 2.02
790 14:50:42 event 69 1.63
791 14:50:42 event 70 1.6
792 14:50:42 event 71 1.57
793 14:50:42 event 72 1.69
794 14:50:42 event 73 1.54
795 14:50:43 event 74 1.63
796 14:50:43 event 75 1.55
797 14:50:43 event 76 1.8
798 14:50:43 event 77 1.69
799 14:50:43 event 78 1.61
800 14:50:43 event 79 1.53
801 14:50:44 event 80 1.59
802 14:50:44 event 81 1.61
803 14:50:44 event 82 2.05
804 14:50:45 event 83 1.62
805 14:50:45 event 84 1.64
806 14:50:45 event 85 1.54
807 14:50:45 event 86 1.63
808 14:50:45 event 87 1.74
809 14:50:46 event 88 1.67
810 14:50:46 event 89 1.58
811 14:50:46 event 90 1.8
812 14:50:46 event 91 1.59
813 14:50:46 event 92 1.58
814 14:50:47 event 93 1.56
815 14:50:47 event 94 1.65
816 14:50:47 event 95 1.68
817 14:50:47 event 96 1.58
818 14:50:48 event 97 1.52
819 14:50:48 event 98 1.64
820 14:50:48 event 99 1.89
821 14:53:12 event 0 1.61
822 14:53:12 event 1 1.59
823 14:53:12 event 2 1.55
824 14:53:12 event 3 1.52
825 14:53:12 event 4 1.54
826 14:53:12 event 5 1.57
827 14:53:12 event 6 1.59
828 14:53:12 event 7 1.57
829 14:53:12 event 8 1.55
830 14:53:12 event 9 1.6
831 14:53:12 event 10 1.62
832 14:53:12 event 11 1.56
833 14:53:12 event 12 1.57
834 14:53:12 event 13 1.59
835 14:53:12 event 14 1.64
836 14:53:12 event 15 1.71
837 14:53:12 event 16 1.6
838 14:53:12 event 17 1.65
839 14:53:12 event 18 1.7
840 14:53:13 event 19 1.63
841 14:53:13 event 20 1.73
842 14:53:13 event 21 1.58
843 14:53:13 event 22 1.65
844 14:53:13 event 23 1.57
845 14:53:13 event 24 1.67
846 14:53:13 event 25 1.64
847 14:53:13 event 26 1.51
848 14:53:13 event 27 2.36
849 14:53:13 event 28 1.04
850 14:53:13 event 29 1.63
851 14:53:13 event 30 1.74
852 14:53:14 event 31 1.62
853 14:53:14 event 32 1.62
854 14:53:14 event 33 1.74
855 14:53:14 event 34 1.67
856 14:53:14 event 35 1.61
857 14:53:14 event 36 1.78
858 14:53:14 event 37 1.64
859 14:53:14 event 38 1.67
860 14:53:14 event 39 1.8
861 14:53:15 event 40 1.68
862 14:53:15 event 41 1.67
863 14:53:15 event 42 1.64
864 14:53:15 event 44 1.77
865 14:53:15 event 43 1.65
866 14:53:15 event 45 1.65
867 14:53:15 event 46 1.69
868 14:53:15 event 47 1.68
869 14:53:16 event 48 1.54
870 14:53:16 event 49 1.83
871 14:53:16 event 50 1.61
872 14:53:16 event 51 1.64
873 14:53:16 event 52 1.89
874 14:53:16 event 53 1.56
875 14:53:17 event 54 1.72
876 14:53:17 event 55 1.52
877 14:53:17 event 56 1.54
878 14:53:17 event 57 1.73
879 14:53:17 event 58 1.07
880 14:53:17 event 59 1.91
881 14:53:18 event 60 1.7
882 14:53:18 event 61 3.93
883 14:53:18 event 62 1.59
884 14:53:18 event 63 1.64
885 14:53:18 event 64 1.63
886 14:53:19 event 65 1.7
887 14:53:19 event 66 1.64
888 14:53:19 event 67 1.63
889 14:53:19 event 68 1.6
890 14:53:19 event 69 1.19
891 14:53:19 event 70 3.76
892 14:53:20 event 71 1.62
893 14:53:20 event 72 1.45
894 14:53:20 event 73 1.89
895 14:53:20 event 74 1.95
896 14:53:21 event 75 1.95
897 14:53:21 event 76 1.81
898 14:53:21 event 77 1.62
899 14:53:21 event 78 1.63
900 14:53:21 event 79 2.07
901 14:53:21 event 80 1.64
902 14:53:22 event 81 1.21
903 14:53:22 event 82 1.62
904 14:53:22 event 83 1.67
905 14:53:22 event 84 1.65
906 14:53:23 event 85 1.64
907 14:53:23 event 86 1.59
908 14:53:23 event 87 1.68
909 14:53:24 event 88 1.6
910 14:53:24 event 89 4.69
911 14:53:24 event 90 1.72
912 14:53:24 event 91 1.63
913 14:53:25 event 92 1.59
914 14:53:25 event 93 1.74
915 14:53:25 event 94 1.68
916 14:53:25 event 95 1.65
917 14:53:26 event 96 1.61
918 14:53:26 event 97 1.63
919 14:53:26 event 98 1.65
920 14:53:26 event 99 1.71
921 14:56:36 event 0 1.62
922 14:56:36 event 1 1.61
923 14:56:36 event 2 1.57
924 14:56:36 event 3 1.55
925 14:56:36 event 4 1.51
926 14:56:36 event 5 1.59
927 14:56:36 event 6 1.59
928 14:56:36 event 7 1.57
929 14:56:36 event 8 1.56
930 14:56:36 event 9 1.52
931 14:56:36 event 10 1.6
932 14:56:36 event 11 1.63
933 14:56:36 event 12 1.58
934 14:56:36 event 13 1.58
935 14:56:36 event 14 1.78
936 14:56:36 event 15 1.64
937 14:56:36 event 16 1.61
938 14:56:36 event 17 1.59
939 14:56:37 event 18 1.92
940 14:56:37 event 19 1.77
941 14:56:37 event 20 1.64
942 14:56:37 event 21 1.65
943 14:56:37 event 22 1.62
944 14:56:37 event 23 1.6
945 14:56:37 event 24 1.55
946 14:56:37 event 25 1.6
947 14:56:37 event 26 1.59
948 14:56:37 event 27 1.65
949 14:56:37 event 28 1.8
950 14:56:38 event 29 1.74
951 14:56:38 event 30 1.66
952 14:56:38 event 31 1.78

View File

@@ -0,0 +1 @@
pydantic>=2.6.1

View File

@@ -0,0 +1,74 @@
import os
import random
import string
from datetime import datetime, timezone
# Optional: Using Pydantic for validation (remove if not using Pydantic)
try:
from pydantic import BaseModel
class HelloResponse(BaseModel):
message: str
status: str
appName: str
# If using Pydantic, we can generate the JSON schema
response_schema = {
200: HelloResponse.model_json_schema()
}
except ImportError:
# Without Pydantic, define JSON schema manually
response_schema = {
200: {
"type": "object",
"properties": {
"message": {"type": "string"},
"status": {"type": "string"},
"appName": {"type": "string"}
},
"required": ["message", "status", "appName"]
}
}
config = {
"name": "HelloAPI",
"type": "api",
"path": "/hello",
"method": "GET",
"description": "Receives hello request and emits event for processing",
"emits": ["process-greeting"],
"flows": ["hello-world-flow"],
"responseSchema": response_schema
}
async def handler(req, context):
app_name = os.environ.get("APP_NAME", "Motia App")
timestamp = datetime.now(timezone.utc).isoformat()
context.logger.info("Hello API endpoint called", {
"app_name": app_name,
"timestamp": timestamp
})
# Generate a random request ID
request_id = ''.join(random.choices(string.ascii_lowercase + string.digits, k=7))
# Emit event for background processing
await context.emit({
"topic": "process-greeting",
"data": {
"timestamp": timestamp,
"appName": app_name,
"greetingPrefix": os.environ.get("GREETING_PREFIX", "Hello"),
"requestId": request_id
}
})
return {
"status": 200,
"body": {
"message": "Hello request received! Check logs for processing.",
"status": "processing",
"appName": app_name
}
}

View File

@@ -0,0 +1,65 @@
import asyncio
from datetime import datetime, timezone
# Optional: Using Pydantic for validation (remove if not using Pydantic)
try:
from pydantic import BaseModel
class GreetingInput(BaseModel):
timestamp: str
appName: str
greetingPrefix: str
requestId: str
# If using Pydantic, we can generate the JSON schema
input_schema = GreetingInput.model_json_schema()
except ImportError:
# Without Pydantic, define JSON schema manually
input_schema = {
"type": "object",
"properties": {
"timestamp": {"type": "string"},
"appName": {"type": "string"},
"greetingPrefix": {"type": "string"},
"requestId": {"type": "string"}
},
"required": ["timestamp", "appName", "greetingPrefix", "requestId"]
}
config = {
"name": "ProcessGreeting",
"type": "event",
"description": "Processes greeting in the background",
"subscribes": ["process-greeting"],
"emits": [],
"flows": ["hello-world-flow"],
"input": input_schema
}
async def handler(input_data, context):
# Extract data from input
timestamp = input_data.get("timestamp")
app_name = input_data.get("appName")
greeting_prefix = input_data.get("greetingPrefix")
request_id = input_data.get("requestId")
context.logger.info("Processing greeting", {
"request_id": request_id,
"app_name": app_name
})
greeting = f"{greeting_prefix} {app_name}!"
# Store result in state (demonstrates state usage)
# Note: The state.set method takes (groupId, key, value)
await context.state.set("greetings", request_id, {
"greeting": greeting,
"processedAt": datetime.now(timezone.utc).isoformat(),
"originalTimestamp": timestamp
})
context.logger.info("Greeting processed successfully", {
"request_id": request_id,
"greeting": greeting,
"stored_in_state": True
})

View File

@@ -0,0 +1,215 @@
#!/usr/bin/env python3
"""
Performance Test Log Analyzer
Extracts timing data from journalctl logs and creates plots
"""
import subprocess
import re
import csv
from datetime import datetime
from collections import defaultdict
import matplotlib.pyplot as plt
def run_journalctl():
"""Run journalctl to get recent logs"""
try:
result = subprocess.run(
['journalctl', '--since', '1 hour ago', '--no-pager'],
capture_output=True,
text=True,
timeout=30
)
return result.stdout
except subprocess.TimeoutExpired:
print("journalctl timeout")
return ""
except FileNotFoundError:
print("journalctl not found")
return ""
def read_motia_log():
"""Read motia.log file directly"""
try:
with open('motia.log', 'r') as f:
return f.read()
except FileNotFoundError:
print("motia.log not found")
return ""
def parse_logs(logs):
"""Parse the logs to extract timing data"""
# Remove ANSI escape codes
ansi_escape = re.compile(r'\x1B(?:[@-Z\\-_]|\[[0-?]*[ -/]*[@-~])')
logs = ansi_escape.sub('', logs)
cron_pattern = r'\[(\d{2}:\d{2}:\d{2})\] .* Starting perf test emission at ([\d\-T:Z.]+)'
emission_pattern = r'\[(\d{2}:\d{2}:\d{2})\] .* Completed emitting 100 perf-test events in ([\d.]+)s'
event_pattern = r'\[(\d{2}:\d{2}:\d{2})\] .* Processed perf-test event (\d+) from batch .* in ([\d.]+)ms'
cron_starts = []
emissions = []
events = defaultdict(list)
for line in logs.split('\n'):
# Parse cron start
cron_match = re.search(cron_pattern, line)
if cron_match:
timestamp_str, iso_time = cron_match.groups()
cron_starts.append({
'timestamp': timestamp_str,
'iso_time': iso_time,
'batch_start': iso_time
})
# Parse emission completion
emission_match = re.search(emission_pattern, line)
if emission_match:
timestamp_str, duration = emission_match.groups()
emissions.append({
'timestamp': timestamp_str,
'duration': float(duration)
})
# Parse event processing
event_match = re.search(event_pattern, line)
if event_match:
timestamp_str, event_id, duration = event_match.groups()
events[timestamp_str].append({
'event_id': int(event_id),
'duration': float(duration)
})
# Filter out runs after the restart (keep only first 10 emission runs)
emissions = emissions[:10]
return cron_starts, emissions, events
def save_to_csv(cron_starts, emissions, events, filename='perf_test_data.csv'):
"""Save the parsed data to CSV"""
with open(filename, 'w', newline='') as csvfile:
writer = csv.writer(csvfile)
# Write header
writer.writerow(['timestamp', 'type', 'batch_start', 'emission_duration', 'event_id', 'event_duration'])
# Write cron starts
for cron in cron_starts:
writer.writerow([cron['timestamp'], 'cron_start', cron['batch_start'], '', '', ''])
# Write emissions
for emission in emissions:
writer.writerow([emission['timestamp'], 'emission', '', emission['duration'], '', ''])
# Write events
for timestamp, event_list in events.items():
for event in event_list:
writer.writerow([timestamp, 'event', '', '', event['event_id'], event['duration']])
print(f"Data saved to {filename}")
def create_plots(cron_starts, emissions, events):
"""Create plots from the data"""
if not emissions:
print("No emission data found")
return
# Plot 1: Emission durations over time
if emissions:
# Parse timestamps - the regex captures plain HH:MM:SS strings
timestamps = []
for e in emissions:
try:
dt = datetime.strptime(e['timestamp'], '%H:%M:%S')
timestamps.append(dt)
except ValueError:
# Fallback: use the run index as a synthetic timestamp
timestamps.append(datetime.now().replace(hour=len(timestamps), minute=0, second=0, microsecond=0))
durations = [e['duration'] for e in emissions]
plt.figure(figsize=(12, 6))
plt.subplot(1, 2, 1)
plt.plot(range(len(timestamps)), durations, 'bo-', label='Emission Duration')
plt.title('Cron Step Emission Durations')
plt.xlabel('Run Number')
plt.ylabel('Duration (seconds)')
plt.grid(True, alpha=0.3)
# Plot 2: Event processing times (box plot per batch)
if events:
plt.subplot(1, 2, 2)
batch_durations = []
batch_labels = []
for i, (timestamp, event_list) in enumerate(list(events.items())[:10]): # Show first 10 batches
if event_list:
durations = [e['duration'] for e in event_list]
batch_durations.append(durations)
batch_labels.append(f'Batch {i+1}')
if batch_durations:
plt.boxplot(batch_durations, labels=batch_labels)
plt.title('Event Processing Times per Batch')
plt.ylabel('Duration (ms)')
plt.xticks(rotation=45)
plt.tight_layout()
plt.savefig('perf_test_analysis.png', dpi=150, bbox_inches='tight')
print("Plot saved to perf_test_analysis.png")
plt.show()
def main():
print("Analyzing performance test logs...")
# Get logs from motia.log
logs = read_motia_log()
if not logs:
print("No logs retrieved")
return
# Parse the logs
cron_starts, emissions, events = parse_logs(logs)
print(f"Found {len(cron_starts)} cron starts")
print(f"Found {len(emissions)} emission completions")
print(f"Found {len(events)} event batches with {sum(len(v) for v in events.values())} total events")
if not emissions and not events:
print("No performance test data found in logs")
return
# Save to CSV
save_to_csv(cron_starts, emissions, events)
# Create plots
create_plots(cron_starts, emissions, events)
# Print summary statistics
if emissions:
emission_durations = [e['duration'] for e in emissions]
print("\nEmission Statistics:")
print(f" Average: {sum(emission_durations)/len(emission_durations):.3f}s")
print(f" Min: {min(emission_durations):.3f}s")
print(f" Max: {max(emission_durations):.3f}s")
if events:
all_event_durations = []
for event_list in events.values():
all_event_durations.extend([e['duration'] for e in event_list])
if all_event_durations:
print("\nEvent Processing Statistics:")
print(f" Average: {sum(all_event_durations)/len(all_event_durations):.3f}ms")
print(f" Min: {min(all_event_durations):.3f}ms")
print(f" Max: {max(all_event_durations):.3f}ms")
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,30 @@
import asyncio
from datetime import datetime, timezone
config = {
"type": "cron",
"cron": "*/1 * * * *", # Every minute
"name": "PerfTestCron",
"description": "Emits 100 perf-test events every minute",
"emits": ["perf-test-event"],
"flows": ["perf-test"],
}
async def handler(context):
start_time = datetime.now(timezone.utc)
context.logger.info(f"Starting perf test emission at {start_time}")
# Emit 100 events
for i in range(100):
await context.emit({
"topic": "perf-test-event",
"data": {
"event_id": i,
"batch_start": start_time.isoformat(),
"timestamp": datetime.now(timezone.utc).isoformat(),
},
})
end_time = datetime.now(timezone.utc)
duration = (end_time - start_time).total_seconds()
context.logger.info(f"Completed emitting 100 perf-test events in {duration:.3f}s")

View File

@@ -0,0 +1,24 @@
import asyncio
from datetime import datetime, timezone
config = {
"type": "event",
"name": "PerfTestEventHandler",
"description": "Handles perf-test events with 1ms delay and logging",
"subscribes": ["perf-test-event"],
"emits": [],
"flows": ["perf-test"],
}
async def handler(event_data, context):
start_time = datetime.now(timezone.utc)
event_id = event_data.get("event_id")
batch_start = event_data.get("batch_start")
# Wait 1ms
await asyncio.sleep(0.001)
# Log completion with duration
end_time = datetime.now(timezone.utc)
duration = (end_time - start_time).total_seconds() * 1000 # in milliseconds
context.logger.info(f"Processed perf-test event {event_id} from batch {batch_start} in {duration:.2f}ms")

View File

@@ -0,0 +1,31 @@
{
"compilerOptions": {
"target": "ES2020",
"module": "ESNext",
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"noEmit": true,
"esModuleInterop": true,
"strict": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"allowJs": true,
"outDir": "dist",
"rootDir": ".",
"baseUrl": ".",
"jsx": "react-jsx"
},
"include": [
"**/*.ts",
"motia.config.ts",
"**/*.tsx",
"types.d.ts",
"**/*.jsx"
],
"exclude": [
"node_modules",
"dist",
"tests"
]
}

View File

@@ -0,0 +1,21 @@
/**
* Automatically generated types for motia
* Do NOT edit this file manually.
*
* Consider adding this file to .prettierignore and eslint ignore.
*/
import { EventHandler, ApiRouteHandler, ApiResponse, MotiaStream, CronHandler } from 'motia'
declare module 'motia' {
interface FlowContextStateStreams {
}
interface Handlers {
'PerfTestEventHandler': EventHandler<never, never>
'PerfTestCron': CronHandler<{ topic: 'perf-test-event'; data: never }>
'ProcessGreeting': EventHandler<{ timestamp: string; appName: string; greetingPrefix: string; requestId: string }, never>
'HelloAPI': ApiRouteHandler<Record<string, unknown>, ApiResponse<200, { message: string; status: string; appName: string }>, { topic: 'process-greeting'; data: { timestamp: string; appName: string; greetingPrefix: string; requestId: string } }>
}
}

View File

@@ -0,0 +1,170 @@
# Motia Performance Degradation Bug Report
## Title
Exponential Performance Degradation in Event Processing Over Time
## Description
Motia exhibits exponential performance degradation when processing events continuously. Even in a clean, fresh installation with minimal custom code, the time required to emit and process batches of events grows exponentially, leading to severe slowdowns. This issue persists regardless of custom modifications, indicating a fundamental problem in the Motia framework's event handling or queue management.
## Steps to Reproduce
1. Create a completely new Motia project:
```bash
mkdir -p /opt/motia-app/motia-clean-test
cd /opt/motia-app/motia-clean-test
npx motia@latest create
```
2. Add the following performance test steps (create these files in the `steps/` directory):
**steps/perf_cron_step.py**:
```python
import asyncio
from datetime import datetime, timezone
config = {
"type": "cron",
"cron": "*/1 * * * *", # Every minute
"name": "PerfTestCron",
"description": "Emits 100 perf-test events every minute",
"emits": ["perf-test-event"],
"flows": ["perf-test"],
}
async def handler(context):
start_time = datetime.now(timezone.utc)
context.logger.info(f"Starting perf test emission at {start_time}")
# Emit 100 events
for i in range(100):
await context.emit({
"topic": "perf-test-event",
"data": {
"event_id": i,
"batch_start": start_time.isoformat(),
"timestamp": datetime.now(timezone.utc).isoformat(),
},
})
end_time = datetime.now(timezone.utc)
duration = (end_time - start_time).total_seconds()
context.logger.info(f"Completed emitting 100 perf-test events in {duration:.3f}s")
```
**steps/perf_event_step.py**:
```python
import asyncio
from datetime import datetime, timezone
config = {
"type": "event",
"name": "PerfTestEventHandler",
"description": "Handles perf-test events with 1ms delay and logging",
"subscribes": ["perf-test-event"],
"emits": [],
"flows": ["perf-test"],
}
async def handler(event_data, context):
start_time = datetime.now(timezone.utc)
event_id = event_data.get("event_id")
batch_start = event_data.get("batch_start")
# Wait 1ms
await asyncio.sleep(0.001)
# Log completion with duration
end_time = datetime.now(timezone.utc)
duration = (end_time - start_time).total_seconds() * 1000 # in milliseconds
context.logger.info(f"Processed perf-test event {event_id} from batch {batch_start} in {duration:.2f}ms")
```
3. Start Motia and run for at least 15 minutes:
```bash
npm run dev
```
4. Monitor logs or use an analysis script to observe the degradation.
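The monitoring in step 4 can be as simple as pulling the per-batch emission durations out of the log. A minimal sketch (the log excerpt is inlined as a sample so the snippet is self-contained and its timestamps are made up; a real run would read `motia.log` instead):
```python
import re

# Inlined sample matching the shape of the PerfTestCron log line
sample = """\
[14:46:04] PerfTestCron Completed emitting 100 perf-test events in 4.012s
[14:47:14] PerfTestCron Completed emitting 100 perf-test events in 8.391s
"""

# One float per batch: the reported emission duration in seconds
pattern = re.compile(r'Completed emitting 100 perf-test events in ([\d.]+)s')
durations = [float(m.group(1)) for m in pattern.finditer(sample)]
print(durations)  # steadily growing values indicate the degradation
```
Plotting these values per run number makes the growth curve obvious.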
## Expected Behavior
- Each batch of 100 events should be emitted and processed consistently within ~1-5 seconds.
- Event processing times should remain stable at ~1-2ms per event.
- No exponential growth in processing times over time.
## Actual Behavior
- Emission times grow exponentially: starting at ~1s, reaching up to 193s by the 10th batch.
- Event processing remains constant (~1.7ms average), but the overall throughput degrades severely.
- The framework becomes unresponsive after several minutes of continuous operation.
## Environment
- **Motia Version**: 0.17.11-beta.193 (latest), 0.8.2-beta.139, and 0.7.2-beta.134 (all tested)
- **Node.js Version**: [Check with `node --version`]
- **Python Version**: 3.13
- **OS**: Linux
- **Dependencies**: Minimal (only Motia defaults)
- **Hardware**: [Standard server specs]
## Additional Context
### Version Comparison Results
The performance degradation issue exists in all tested versions of Motia, indicating this is a fundamental framework issue that has persisted across versions.
**Version 0.17.11-beta.193 (Latest):**
- Average emission time: 7.786s
- Min: 1.135s (Batch 1)
- Max: 14.074s (Batch 9)
**Version 0.8.2-beta.139:**
- Average emission time: 51.386s
- Min: 1.004s (Batch 1)
- Max: 193.355s (Batch 10)
**Version 0.7.2-beta.134:**
- Average emission time: 57.859s
- Min: 1.163s (Batch 1)
- Max: 260.849s (Batch 9)
All versions show identical exponential degradation patterns, confirming this is not a regression introduced in recent versions.
### Test Results from Clean Installation
The issue was reproduced in a completely fresh Motia installation with no custom code beyond the minimal performance test above.
**Latest Version (0.17.11-beta.193) Emission Statistics (from 9 batches):**
- Average emission time: 7.786s
- Min: 1.135s (Batch 1)
- Max: 14.074s (Batch 9)
**Event Processing Statistics:**
- Average processing time: 1.688ms
- Min: 1.030ms
- Max: 4.810ms
### Detailed Measurements (Latest Version)
| Batch | Emission Time (s) | Events Processed | Total Processing Time (s) |
|-------|-------------------|-----------------|---------------------------|
| 1 | 1.135 | 100 | ~4 |
| 2 | ~8 | 100 | ~8 |
| 3 | ~9 | 100 | ~9 |
| 4 | ~11 | 100 | ~11 |
| 5 | ~12 | 100 | ~12 |
| 6 | ~13 | 100 | ~13 |
| 7 | ~14 | 100 | ~14 |
| 8 | ~14 | 100 | ~14 |
| 9 | 14.074 | 100 | ~14 |
### Analysis
- Individual event processing remains constant, indicating the bottleneck is in the framework's event queue or scheduling system.
- The exponential growth suggests a memory leak, queue accumulation, or inefficient resource management in Motia's core event handling.
- Logs show no errors; the degradation is purely performance-related.
### Files Attached
- `perf_test_data.csv`: Raw timing data
- `perf_test_analysis.png`: Performance plots
- `motia.log`: Full log output from the test run
## Possible Root Causes
- Event queue not being properly cleared or garbage collected.
- Redis state accumulation (if used for event persistence).
- Node.js event loop blocking or memory pressure.
- Python asyncio event handling inefficiencies in the Motia bridge.
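On the last point: the ~1.7 ms per-event floor reported above is consistent with asyncio timer granularity rather than handler work. A standalone measurement sketch (not part of the original report) that reproduces that floor outside Motia:
```python
import asyncio
import time

async def mean_sleep_ms(n: int = 100) -> float:
    # Average real elapsed time of n awaited 1 ms sleeps; event-loop timer
    # granularity makes each sleep take somewhat longer than requested.
    start = time.perf_counter()
    for _ in range(n):
        await asyncio.sleep(0.001)
    return (time.perf_counter() - start) / n * 1000

per_ms = asyncio.run(mean_sleep_ms())
print(f"{per_ms:.2f} ms per 1 ms sleep")  # typically lands in the 1-2 ms range
```
This suggests the per-event baseline is expected; only the batch-level emission growth is anomalous.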
## Impact
- Makes Motia unsuitable for continuous, high-throughput event processing.
- Severe degradation occurs within minutes, preventing production use for event-driven applications.

4176
package-lock.json generated

File diff suppressed because it is too large Load Diff

View File

@@ -2,7 +2,8 @@
  "name": "bitbylaw",
  "version": "1.0.0",
  "dependencies": {
-   "motia": "^0.8.2-beta.139"
+   "motia": "^0.8.2-beta.139",
+   "npm": "^11.7.0"
  },
  "scripts": {
    "start": "motia start --host 0.0.0.0",

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,125 @@
top - 09:48:42 up 2:59, 1 user, load average: 0,79, 0,81, 0,70
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 11,1 us, 1,2 sy, 0,0 ni, 84,0 id, 3,7 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem: 16384,0 total, 14853,4 free, 1442,7 used, 114,1 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14941,3 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,8g 155460 41104 S 0,0 0,9 0:11.65 node
top - 09:48:47 up 2:59, 1 user, load average: 0,89, 0,83, 0,71
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 10,5 us, 1,5 sy, 0,0 ni, 87,8 id, 0,1 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem: 16384,0 total, 14869,9 free, 1426,1 used, 114,1 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14957,9 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,8g 163908 41104 S 0,4 1,0 0:11.67 node
top - 09:48:52 up 2:59, 1 user, load average: 0,82, 0,82, 0,70
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,7 us, 0,5 sy, 0,0 ni, 98,7 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem: 16384,0 total, 14910,3 free, 1385,8 used, 114,1 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14998,2 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,8g 163908 41104 S 0,2 1,0 0:11.68 node
top - 09:48:57 up 2:59, 1 user, load average: 0,75, 0,80, 0,70
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,4 us, 0,4 sy, 0,0 ni, 99,2 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14909,3 free, 1386,6 used, 114,3 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14997,4 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,8g 163908 41104 S 0,0 1,0 0:11.68 node
top - 09:49:02 up 2:59, 1 user, load average: 0,69, 0,79, 0,70
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 11,2 us, 1,8 sy, 0,0 ni, 87,0 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14237,1 free, 2058,2 used, 114,8 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14325,8 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 215304 41104 S 25,5 1,3 0:12.96 node
top - 09:49:07 up 2:59, 1 user, load average: 0,64, 0,78, 0,69
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 5,9 us, 1,1 sy, 0,0 ni, 92,9 id, 0,0 wa, 0,0 hi, 0,1 si, 0,0 st
MiB Mem : 16384,0 total, 14275,2 free, 2019,9 used, 115,0 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14364,1 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 222472 41104 S 28,1 1,3 0:14.37 node
top - 09:49:12 up 2:59, 1 user, load average: 0,59, 0,76, 0,69
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 5,7 us, 1,1 sy, 0,0 ni, 93,2 id, 0,0 wa, 0,0 hi, 0,1 si, 0,0 st
MiB Mem : 16384,0 total, 14707,3 free, 1588,2 used, 114,7 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14795,8 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 222984 41104 S 30,4 1,3 0:15.89 node
top - 09:49:17 up 2:59, 1 user, load average: 0,62, 0,77, 0,69
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 1,5 us, 0,5 sy, 0,0 ni, 98,1 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14855,8 free, 1439,8 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 14944,2 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 226100 41104 S 5,6 1,3 0:16.17 node
top - 09:49:22 up 2:59, 1 user, load average: 0,65, 0,77, 0,69
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,6 us, 0,5 sy, 0,0 ni, 98,8 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14918,3 free, 1377,3 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15006,7 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 226100 41104 S 1,2 1,3 0:16.23 node
top - 09:49:27 up 2:59, 1 user, load average: 0,60, 0,76, 0,69
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,4 us, 0,4 sy, 0,0 ni, 99,2 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14919,0 free, 1376,6 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15007,4 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 226100 41104 S 0,0 1,3 0:16.23 node
top - 09:49:32 up 3:00, 1 user, load average: 0,55, 0,74, 0,68
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,4 us, 0,5 sy, 0,0 ni, 99,1 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14918,8 free, 1376,8 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15007,2 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 226100 41104 S 0,0 1,3 0:16.23 node
top - 09:49:37 up 3:00, 1 user, load average: 0,51, 0,73, 0,68
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,4 us, 0,3 sy, 0,0 ni, 99,3 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14918,6 free, 1377,0 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15007,0 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 226100 41104 S 0,0 1,3 0:16.23 node
top - 09:49:42 up 3:00, 1 user, load average: 0,55, 0,74, 0,68
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,6 us, 0,5 sy, 0,0 ni, 98,9 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14987,7 free, 1307,9 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15076,1 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,8g 158412 41104 S 1,8 0,9 0:16.32 node
top - 09:49:47 up 3:00, 1 user, load average: 0,50, 0,72, 0,68
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,5 us, 0,5 sy, 0,0 ni, 99,0 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 14978,6 free, 1317,0 used, 114,6 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15067,0 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,8g 167372 41104 S 0,4 1,0 0:16.34 node
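The capture above looks like `top` batch output sampling one PID every 5 seconds (something like `top -b -d 5 -p 11086`; the exact invocation is a guess). A few lines of Python make the resident-memory trend in such a capture easy to extract:

```python
# Minimal parser for `top -b` output: pull the RES column (resident
# memory, KiB) for one PID so its trend over time can be charted.
# SAMPLE is a short excerpt shaped like the capture above.
SAMPLE = """\
top - 09:49:02 up 2:59, 1 user, load average: 0,69, 0,79, 0,70
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 215304 41104 S 25,5 1,3 0:12.96 node
top - 09:49:07 up 2:59, 1 user, load average: 0,64, 0,78, 0,69
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
11086 www-data 20 0 21,9g 222472 41104 S 28,1 1,3 0:14.37 node
"""

def res_kib(capture: str, pid: int) -> list[int]:
    # RES is the 6th whitespace-separated field on a process line;
    # process lines are the ones that start with the PID itself.
    out = []
    for line in capture.splitlines():
        fields = line.split()
        if fields and fields[0] == str(pid):
            out.append(int(fields[5]))
    return out

samples = res_kib(SAMPLE, 11086)
print(samples)  # [215304, 222472]
```

Feeding the full 125-line capture through the same function shows the RES spikes (155 MiB up to ~226 MiB and back down) that accompany each processing burst.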


redis_log.txt (new file, 1 line)

@@ -0,0 +1 @@
OK

top_log.txt (new file, 26 lines)

@@ -0,0 +1,26 @@
top - 09:03:10 up 2:13, 1 user, load average: 0,90, 0,48, 0,35
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 1,2 us, 1,2 sy, 0,0 ni, 97,6 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 15217,2 free, 1094,7 used, 96,0 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15289,3 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3464 root 20 0 11,3g 88516 32892 S 0,0 0,5 0:01.36 node
top - 09:03:15 up 2:13, 1 user, load average: 0,91, 0,49, 0,35
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,5 us, 0,5 sy, 0,0 ni, 99,1 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 15216,2 free, 1095,5 used, 96,1 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15288,5 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3464 root 20 0 11,3g 88516 32892 S 0,2 0,5 0:01.37 node
top - 09:03:20 up 2:13, 1 user, load average: 0,83, 0,49, 0,35
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%CPU(s): 0,7 us, 0,5 sy, 0,0 ni, 98,8 id, 0,0 wa, 0,0 hi, 0,0 si, 0,0 st
MiB Mem : 16384,0 total, 15216,7 free, 1095,1 used, 96,1 buff/cache
MiB Swap: 0,0 total, 0,0 free, 0,0 used. 15288,9 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3464 root 20 0 11,3g 88516 32892 S 0,0 0,5 0:01.37 node

top_long_log.txt (new file, 3476518 lines): file diff suppressed because it is too large