cleanup

31  bitbylaw/.gitignore  (vendored)
@@ -5,4 +5,33 @@ venv
 .motia
 .mermaid
 dist
 *.pyc
+__pycache__
+
+# Performance logs and diagnostics
+*_log.txt
+performance_logs_*/
+*.clinic
+
+# Service account credentials (WICHTIG!)
+service-account.json
+
+# IDE and editor files
+.vscode/
+.cursor/
+.aider*
+
+# OS files
+.DS_Store
+Thumbs.db
+
+# Build artifacts
+*.so
+*.egg-info
+build/
+*.whl
+
+# Environment files
+.env
+.env.*
+!.env.example
@@ -1,277 +1,206 @@

# Motia Advoware-EspoCRM Integration
# bitbylaw - Motia Integration Platform

This project implements a robust integration between Advoware and EspoCRM via the Motia framework. It provides a complete API proxy for Advoware and webhook handlers for EspoCRM to synchronize changes to Beteiligte entities.
Event-driven integration between Advoware, EspoCRM, and Google Calendar via the Motia framework.

## Overview
## Quick Start

The system consists of three main components:

```bash
cd /opt/motia-app/bitbylaw
npm install
pip install -r requirements.txt
npm start
```

1. **Advoware API Proxy**: Complete REST API proxy for all HTTP methods (GET, POST, PUT, DELETE)
2. **EspoCRM Webhook Receiver**: Receives webhooks for CRUD operations on Beteiligte entities
3. **Event-Driven Sync**: Processes synchronization events with Redis-based deduplication

See [docs/DEVELOPMENT.md](docs/DEVELOPMENT.md) for details.

## Components

1. **Advoware API Proxy** - REST API proxy with HMAC-512 auth ([details](steps/advoware_proxy/README.md))
2. **Calendar Sync** - Bidirectional synchronization Advoware ↔ Google ([details](steps/advoware_cal_sync/README.md))
3. **VMH Webhooks** - EspoCRM webhook receiver for Beteiligte ([details](steps/vmh/README.md))

## Architecture

### Components

- **Motia Framework**: Event-driven backend orchestration
- **Python Steps**: Asynchronous processing with aiohttp and redis-py
- **Advoware API Client**: Authenticated API communication with token management
- **Redis**: Deduplication of webhook events and caching
- **EspoCRM Integration**: Webhook handlers for create/update/delete operations

### Data Flow

```
EspoCRM Webhook → VMH Webhook Receiver → Redis Deduplication → Event Emission → Sync Handler
Advoware API → Proxy Steps → Response

┌─────────────┐     ┌──────────┐     ┌────────────┐
│   EspoCRM   │────▶│ Webhooks │────▶│   Redis    │
└─────────────┘     └──────────┘     │   Dedup    │
                                     └────────────┘
┌─────────────┐     ┌──────────┐           │
│   Clients   │────▶│  Proxy   │────▶      │
└─────────────┘     └──────────┘           │
                                           ▼
                                     ┌────────────┐
                                     │    Sync    │
                                     │  Handlers  │
                                     └────────────┘
                                           │
                                           ▼
                                     ┌────────────┐
                                     │  Advoware  │
                                     │   Google   │
                                     └────────────┘
```

## Setup
See: [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md)

### Prerequisites
## API Endpoints

- Python 3.13+
- Node.js 18+
- Redis server
- Motia CLI

**Advoware Proxy**:
- `GET/POST/PUT/DELETE /advoware/proxy?endpoint=...`

### Installation

**Calendar Sync**:
- `POST /advoware/calendar/sync` - Manual trigger

1. **Clone the repository and install dependencies:**
```bash
cd /opt/motia-app/bitbylaw
npm install
pip install -r requirements.txt
```

**VMH Webhooks**:
- `POST /vmh/webhook/beteiligte/create`
- `POST /vmh/webhook/beteiligte/update`
- `POST /vmh/webhook/beteiligte/delete`

2. **Configure environment variables:**
Create a `.env` file with the following variables:
```env
ADVOWARE_BASE_URL=https://api.advoware.com
ADVOWARE_USERNAME=your_username
ADVOWARE_PASSWORD=your_password
REDIS_URL=redis://localhost:6379
ESPOCRM_WEBHOOK_SECRET=your_webhook_secret
```
See: [docs/API.md](docs/API.md)

3. **Start Redis:**
```bash
redis-server
```
## Configuration

4. **Start Motia:**
```bash
motia start
```
Environment variables via `.env` or the systemd service:

## Usage

### Advoware API Proxy

The proxy endpoints mirror the Advoware API:

- `GET /api/advoware/*` - Fetch data
- `POST /api/advoware/*` - Create new resources
- `PUT /api/advoware/*` - Update resources
- `DELETE /api/advoware/*` - Delete resources

**Example:**
```bash
curl -X GET "http://localhost:3000/api/advoware/employees"
```

```env
# Advoware
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_API_KEY=your_base64_key
ADVOWARE_USER=api_user
ADVOWARE_PASSWORD=your_password

# Redis
REDIS_HOST=localhost
REDIS_PORT=6379

# Google Calendar
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json
```

For detailed information on the proxy steps, see [steps/advoware_proxy/README.md](steps/advoware_proxy/README.md).

### EspoCRM Webhooks

EspoCRM sends webhooks automatically for changes to Beteiligte entities:

- **Create**: `/webhooks/vmh/beteiligte/create`
- **Update**: `/webhooks/vmh/beteiligte/update`
- **Delete**: `/webhooks/vmh/beteiligte/delete`

For detailed information on the webhook and sync steps, see [steps/vmh/README.md](steps/vmh/README.md).

### Synchronization

Synchronization is event-driven:

1. Webhook events are deduplicated in Redis queues
2. Events are emitted to the sync handler
3. The sync handler processes the changes (currently a placeholder)

## Configuration

### Motia Workbench

The flows are defined in `motia-workbench.json`:

- `advoware-proxy`: API proxy flows
- `vmh-webhook`: Webhook receiver flows
- `beteiligte-sync`: Synchronization flow

### Redis Keys

- `vmh:webhook:create`: Create-event queue
- `vmh:webhook:update`: Update-event queue
- `vmh:webhook:delete`: Delete-event queue

## Development

### Project Structure

```
bitbylaw/
├── steps/
│   ├── advoware_proxy/   # API Proxy Steps - see [README](steps/advoware_proxy/README.md)
│   │   ├── advoware_api_proxy_get_step.py
│   │   ├── advoware_api_proxy_post_step.py
│   │   ├── advoware_api_proxy_put_step.py
│   │   └── advoware_api_proxy_delete_step.py
│   └── vmh/              # VMH Webhook & Sync Steps - see [README](steps/vmh/README.md)
│       ├── webhook/      # Webhook Receiver Steps
│       │   ├── beteiligte_create_api_step.py
│       │   ├── beteiligte_update_api_step.py
│       │   └── beteiligte_delete_api_step.py
│       └── beteiligte_sync_event_step.py  # Sync Handler
├── services/
│   └── advoware.py       # API Client
├── config.py             # Configuration
├── motia-workbench.json  # Flow Definitions
├── package.json
├── requirements.txt
└── tsconfig.json
```

### Testing

**Testing the API proxy:**
```bash
curl -X GET "http://localhost:3000/api/advoware/employees"
```

**Simulating a webhook:**
```bash
curl -X POST "http://localhost:3000/webhooks/vmh/beteiligte/create" \
  -H "Content-Type: application/json" \
  -d '{"id": "123", "name": "Test Beteiligte"}'
```

### Logging

All steps emit detailed log output for debugging:

- API requests/responses
- Redis operations
- Event emission
- Error handling

See:
- [docs/CONFIGURATION.md](docs/CONFIGURATION.md)
- [docs/GOOGLE_SETUP.md](docs/GOOGLE_SETUP.md) - Service account setup

## Deployment

### Docker

```dockerfile
FROM python:3.13-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .
EXPOSE 3000

CMD ["motia", "start"]
```

### Production Setup

1. Redis cluster for high availability
2. Load balancer for the API endpoints
3. Monitoring for sync operations
4. Backup strategy for Redis data

## Troubleshooting

### Common Issues

1. **Context Attribute Error**: Use `Config` instead of `context.config`
2. **Redis Connection Failed**: Check the Redis URL and network connectivity
3. **Webhook Duplicates**: Redis deduplication prevents duplicate processing

### Checking Logs
Production deployment via systemd:

```bash
motia logs
sudo systemctl status motia.service
sudo journalctl -u motia.service -f
```

## Calendar Sync
See: [docs/DEPLOYMENT.md](docs/DEPLOYMENT.md)

The system also includes bidirectional calendar synchronization between Advoware and Google Calendar.
## Documentation

### Architecture
### Getting Started
- [Development Guide](docs/DEVELOPMENT.md) - Setup, coding standards, testing
- [Configuration](docs/CONFIGURATION.md) - Environment variables
- [Deployment](docs/DEPLOYMENT.md) - Production setup

- **PostgreSQL Hub**: Stores sync state and prevents data loss
- **Event-Driven Sync**: Four-phase sync (new, deleted, updated)
- **Safe Wrappers**: Global write protection for Advoware write operations
- **Rate Limiting**: Backoff handling for Google Calendar API limits
### Technical Details
- [Architecture](docs/ARCHITECTURE.md) - System design, data flows
- [API Reference](docs/API.md) - HTTP endpoints, event topics
- [Troubleshooting](docs/TROUBLESHOOTING.md) - Common issues

### Dauertermine (Recurring Appointments)
### Components
- [Advoware Proxy](steps/advoware_proxy/README.md) - API proxy details
- [Calendar Sync](steps/advoware_cal_sync/README.md) - Sync logic
- [VMH Webhooks](steps/vmh/README.md) - Webhook handlers
- [Advoware Service](services/ADVOWARE_SERVICE.md) - API client

Advoware uses `dauertermin=1` for recurring appointments, with the following fields:
### Step Documentation
Each step has detailed `.md` documentation alongside its `.py` file.

- `turnus`: Interval (e.g. 1 = every, 3 = every 3rd)
- `turnusArt`: Frequency unit
  - `1` = Daily (DAILY)
  - `2` = Weekly (WEEKLY)
  - `3` = Monthly (MONTHLY)
  - `4` = Yearly (YEARLY)
- `datumBis`: End date of the recurrence

**RRULE generation:**
```
RRULE:FREQ={FREQ};INTERVAL={turnus};UNTIL={datumBis}
```

Example: `turnus=3, turnusArt=1` → `RRULE:FREQ=DAILY;INTERVAL=3;UNTIL=20251224`

## Project Structure

```
bitbylaw/
├── docs/                  # Documentation
├── steps/                 # Motia Steps
│   ├── advoware_proxy/    # API Proxy Steps + Docs
│   ├── advoware_cal_sync/ # Calendar Sync Steps + Docs
│   └── vmh/               # Webhook Steps + Docs
├── services/              # Shared Services
│   └── advoware.py        # API Client + Doc
├── config.py              # Configuration Loader
├── package.json           # Node.js Dependencies
└── requirements.txt       # Python Dependencies
```
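The `turnus`/`turnusArt` mapping described above can be sketched in Python. This is an illustrative helper following the field table, not actual project code; the function name and argument handling are assumptions:

```python
# Sketch: map Advoware recurring-appointment fields (dauertermin=1)
# to an RRULE string, following the field table above. Illustrative only.
FREQ_BY_TURNUS_ART = {1: "DAILY", 2: "WEEKLY", 3: "MONTHLY", 4: "YEARLY"}

def build_rrule(turnus: int, turnus_art: int, datum_bis: str) -> str:
    """turnus = interval, turnusArt = frequency unit, datumBis = end date (YYYYMMDD)."""
    freq = FREQ_BY_TURNUS_ART[turnus_art]
    return f"RRULE:FREQ={freq};INTERVAL={turnus};UNTIL={datum_bis}"

# The documented example: turnus=3, turnusArt=1
print(build_rrule(3, 1, "20251224"))  # RRULE:FREQ=DAILY;INTERVAL=3;UNTIL=20251224
```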
## Technology Stack

- **Framework**: Motia v0.8.2-beta.139 (event-driven backend)
- **Languages**: Python 3.13, Node.js 18, TypeScript
- **Data Store**: Redis (caching, locking, deduplication)
- **External APIs**: Advoware REST API, Google Calendar API, EspoCRM

### Setup

1. **Google service account**: `service-account.json` in the project root
2. **Environment variables**:
   ```env
   ADVOWARE_WRITE_PROTECTION=false  # Global write protection
   POSTGRES_HOST=localhost
   GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=service-account.json
   ```
3. **Trigger sync**:
   ```bash
   curl -X POST "http://localhost:3000/advoware/calendar/sync" -H "Content-Type: application/json" -d '{"full_content": true}'
   ```

## Development

```bash
# Development mode
npm run dev

# Generate types
npm run generate-types

# Clean build
npm run clean && npm install
```

### Rate Limiting & Backoff

- **Google Calendar API**: 403 rate-limit errors are retried with exponential backoff (max. 60s)
- **Global rate limiting**: 600 requests/minute across all parallel sync processes, via a Redis sorted set
- **Sliding window**: a 60-second window for continuously tracking the average request rate
- **Delays**: 100ms between API calls to avoid hitting the limits
- **Retry logic**: max. 4 attempts with base=4
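The global sliding-window limiter described above might be sketched as follows. This is a minimal illustration against a redis-py-style client; the key name and return convention are assumptions, not the project's actual code:

```python
import time
import uuid

# Sketch: sliding-window rate limiting over a Redis sorted set
# (600 requests per 60s window, shared by all sync processes).
# `r` is a redis-py-style client; the key name "ratelimit:google" is assumed.
def acquire_slot(r, key: str = "ratelimit:google",
                 limit: int = 600, window_s: int = 60) -> bool:
    now = time.time()
    # Evict entries that fell out of the 60-second window
    r.zremrangebyscore(key, 0, now - window_s)
    if r.zcard(key) >= limit:
        return False  # over the limit; caller should wait and retry
    # Record this request under a unique member, scored by timestamp
    r.zadd(key, {str(uuid.uuid4()): now})
    r.expire(key, window_s)
    return True
```

Because every sync process talks to the same sorted set, the 600/minute budget is enforced globally rather than per process.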
### Security

- **Write Protection**: `ADVOWARE_WRITE_PROTECTION=true` disables all Advoware write operations
- **Per-User Calendars**: Automatic creation and sharing of a Google calendar per employee

---

### Troubleshooting

- **Rate limit errors**: Logs show backoff retries; wait or raise the limits
- **Sync failures**: Set `ADVOWARE_WRITE_PROTECTION=false` for debugging
- **Calendar access**: The service account must have owner rights

## Project Structure

```
bitbylaw/
├── docs/                  # Comprehensive documentation
│   ├── advoware/          # Advoware API documentation (Swagger)
│   └── *.md               # Architecture, Development, Configuration, etc.
├── scripts/               # Utility scripts for maintenance
│   └── calendar_sync/     # Calendar sync helper scripts
├── services/              # Shared service implementations
├── steps/                 # Motia step implementations
│   ├── advoware_proxy/    # REST API proxy steps
│   ├── advoware_cal_sync/ # Calendar synchronization steps
│   └── vmh/               # EspoCRM webhook handlers
├── src/                   # TypeScript sources (currently unused)
└── config.py              # Central configuration
```

**Key Files**:
- [docs/INDEX.md](docs/INDEX.md) - Documentation navigation
- [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md) - System architecture
- [docs/advoware/advoware_api_swagger.json](docs/advoware/advoware_api_swagger.json) - Advoware API spec
- [scripts/calendar_sync/README.md](scripts/calendar_sync/README.md) - Utility scripts

## License

[License Information]

---

## Contributing

Please open issues for bugs or feature requests. Pull requests are welcome!

## Testing

```bash
# Test Advoware Proxy
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees"

# Test Calendar Sync
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'

# Test Webhook
curl -X POST "http://localhost:3000/vmh/webhook/beteiligte/create" \
  -H "Content-Type: application/json" \
  -d '[{"id": "test-123"}]'
```

## License

[Your License]

## Support

- **Issues**: [GitHub Issues]
- **Docs**: [docs/](docs/)
- **Logs**: `sudo journalctl -u motia.service -f`
514  bitbylaw/docs/API.md  (new file)
@@ -0,0 +1,514 @@
---
title: API Reference
description: Complete API documentation for the bitbylaw Motia installation
date: 2026-02-07
version: 1.1.0
---

# API Reference

## Base URL

**Production (via KONG)**: `https://api.bitbylaw.com`
**Development**: `http://localhost:3000`

---

## Authentication

### KONG API Gateway

All production API calls go through KONG with API-key authentication:

```bash
curl -H "apikey: YOUR_API_KEY" "https://api.bitbylaw.com/advoware/proxy?endpoint=employees"
```

**Header**: `apikey: <your-api-key>`

### Development

No authentication is required at the Motia level in the development environment.

---

## Advoware Proxy API

### Universal Proxy Endpoint

All Advoware API calls go through a universal proxy.

#### GET Request

**Endpoint**: `GET /advoware/proxy`

**Query Parameters**:
- `endpoint` (required): Advoware API endpoint (without the base URL)
- All other parameters are forwarded to Advoware

**Example**:
```bash
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees&limit=10"
```

**Response**:
```json
{
  "status": 200,
  "body": {
    "result": {
      "data": [...],
      "total": 100
    }
  }
}
```

#### POST Request

**Endpoint**: `POST /advoware/proxy`

**Query Parameters**:
- `endpoint` (required): Advoware API endpoint

**Request Body**: JSON payload for the Advoware API

**Example**:
```bash
curl -X POST "http://localhost:3000/advoware/proxy?endpoint=appointments" \
  -H "Content-Type: application/json" \
  -d '{
    "datum": "2026-02-10",
    "uhrzeitVon": "09:00:00",
    "text": "Meeting"
  }'
```

**Response**:
```json
{
  "status": 200,
  "body": {
    "result": {
      "id": "12345"
    }
  }
}
```

#### PUT Request

**Endpoint**: `PUT /advoware/proxy`

**Query Parameters**:
- `endpoint` (required): Advoware API endpoint (including the ID)

**Request Body**: JSON payload for the update

**Example**:
```bash
curl -X PUT "http://localhost:3000/advoware/proxy?endpoint=appointments/12345" \
  -H "Content-Type: application/json" \
  -d '{
    "text": "Updated Meeting"
  }'
```

#### DELETE Request

**Endpoint**: `DELETE /advoware/proxy`

**Query Parameters**:
- `endpoint` (required): Advoware API endpoint (including the ID)

**Example**:
```bash
curl -X DELETE "http://localhost:3000/advoware/proxy?endpoint=appointments/12345"
```

**Response**:
```json
{
  "status": 200,
  "body": {
    "result": null
  }
}
```

### Error Responses

**400 Bad Request**:
```json
{
  "status": 400,
  "body": {
    "error": "Endpoint required as query param"
  }
}
```

**500 Internal Server Error**:
```json
{
  "status": 500,
  "body": {
    "error": "Internal server error",
    "details": "Error message"
  }
}
```
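For scripting against the proxy from Python rather than curl, the query-parameter convention above can be wrapped in a small helper. This is a stdlib-only sketch; the function names are made up for illustration:

```python
import json
import urllib.parse
import urllib.request

# Sketch: build and call universal-proxy URLs following the convention above
# (`endpoint` as a query parameter, everything else forwarded to Advoware).
def proxy_url(base: str, endpoint: str, **params) -> str:
    query = {"endpoint": endpoint, **params}
    return f"{base}/advoware/proxy?" + urllib.parse.urlencode(query)

def advoware_get(base: str, endpoint: str, **params) -> dict:
    with urllib.request.urlopen(proxy_url(base, endpoint, **params), timeout=30) as resp:
        return json.loads(resp.read())

# e.g. advoware_get("http://localhost:3000", "employees", limit=10)
```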

## Calendar Sync API

### Trigger Full Sync

**Endpoint**: `POST /advoware/calendar/sync`

**Request Body**:
```json
{
  "kuerzel": "ALL",
  "full_content": true
}
```

**Parameters**:
- `kuerzel` (optional): employee short code or "ALL" (default: "ALL")
- `full_content` (optional): full details vs. anonymized (default: true)

**Examples**:

Sync all employees:
```bash
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'
```

Sync a single employee:
```bash
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"kuerzel": "SB", "full_content": true}'
```

Sync with anonymization:
```bash
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": false}'
```

**Response**:
```json
{
  "status": "triggered",
  "kuerzel": "ALL",
  "message": "Calendar sync triggered for ALL"
}
```

**Status Codes**:
- `200`: Sync triggered successfully
- `400`: Invalid request (e.g. a lock is active)
- `500`: Internal error

## VMH Webhook Endpoints

These endpoints are called by EspoCRM.

### Beteiligte Create Webhook

**Endpoint**: `POST /vmh/webhook/beteiligte/create`

**Request Body**: array of entities
```json
[
  {
    "id": "entity-123",
    "name": "Max Mustermann",
    "createdAt": "2026-02-07T10:00:00Z"
  }
]
```

**Response**:
```json
{
  "status": "received",
  "action": "create",
  "new_ids_count": 1,
  "total_ids_in_batch": 1
}
```

### Beteiligte Update Webhook

**Endpoint**: `POST /vmh/webhook/beteiligte/update`

**Request Body**: array of entities
```json
[
  {
    "id": "entity-123",
    "name": "Max Mustermann Updated",
    "modifiedAt": "2026-02-07T11:00:00Z"
  }
]
```

**Response**:
```json
{
  "status": "received",
  "action": "update",
  "new_ids_count": 1,
  "total_ids_in_batch": 1
}
```

### Beteiligte Delete Webhook

**Endpoint**: `POST /vmh/webhook/beteiligte/delete`

**Request Body**: array of entities
```json
[
  {
    "id": "entity-123",
    "deletedAt": "2026-02-07T12:00:00Z"
  }
]
```

**Response**:
```json
{
  "status": "received",
  "action": "delete",
  "new_ids_count": 1,
  "total_ids_in_batch": 1
}
```

### Webhook Features

**Batch Support**: All webhooks accept arrays of entities

**Deduplication**: Redis-based; prevents duplicate processing

**Async Processing**: Events are emitted and processed asynchronously

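The Redis-based deduplication above can be sketched with one set per action. This is illustrative only; the key scheme follows the `vmh:webhook:<action>` keys listed in the README, and everything else (function name, TTL) is assumed:

```python
# Sketch: Redis-backed webhook deduplication. `r` is a redis-py-style
# client; key naming follows the README's "vmh:webhook:<action>" scheme.
def dedupe_ids(r, action: str, ids: list, ttl_s: int = 3600) -> list:
    key = f"vmh:webhook:{action}"
    new_ids = []
    for entity_id in ids:
        # SADD returns 1 only if the member was not already in the set
        if r.sadd(key, entity_id):
            new_ids.append(entity_id)
    r.expire(key, ttl_s)
    return new_ids

# A webhook handler would then report len(new_ids) as new_ids_count
# and len(ids) as total_ids_in_batch.
```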
## Event Topics

Internal event topics for the event-driven architecture (not directly callable).

### calendar_sync_all

**Emitted by**: `calendar_sync_cron_step`, `calendar_sync_api_step`
**Subscribed by**: `calendar_sync_all_step`

**Payload**:
```json
{}
```

### calendar_sync_employee

**Emitted by**: `calendar_sync_all_step`, `calendar_sync_api_step`
**Subscribed by**: `calendar_sync_event_step`

**Payload**:
```json
{
  "kuerzel": "SB",
  "full_content": true
}
```

### vmh.beteiligte.create

**Emitted by**: `beteiligte_create_api_step`
**Subscribed by**: `beteiligte_sync_event_step`

**Payload**:
```json
{
  "entity_id": "123",
  "action": "create",
  "source": "webhook",
  "timestamp": "2026-02-07T10:00:00Z"
}
```

### vmh.beteiligte.update

**Emitted by**: `beteiligte_update_api_step`
**Subscribed by**: `beteiligte_sync_event_step`

**Payload**:
```json
{
  "entity_id": "123",
  "action": "update",
  "source": "webhook",
  "timestamp": "2026-02-07T11:00:00Z"
}
```

### vmh.beteiligte.delete

**Emitted by**: `beteiligte_delete_api_step`
**Subscribed by**: `beteiligte_sync_event_step`

**Payload**:
```json
{
  "entity_id": "123",
  "action": "delete",
  "source": "webhook",
  "timestamp": "2026-02-07T12:00:00Z"
}
```

## Rate Limits

### Google Calendar API

**Limit**: 600 requests/minute (enforced via a Redis token bucket)

**Behavior**:
- Requests wait if the rate limit is reached
- Automatic backoff on 403 errors
- Max retry: 4 attempts

### Advoware API

**Limit**: Unknown (no official documentation)

**Behavior**:
- 30s timeout per request
- Automatic token refresh on 401
- No retry logic (fail fast)

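The retry behavior above (automatic backoff on 403, max 4 attempts) can be sketched as follows. The base-4, 60s-cap schedule comes from the README's rate-limiting notes; the exception type and call shape are assumptions:

```python
import time

class RateLimitError(Exception):
    """Stand-in for a Google Calendar 403 rate-limit error (assumed type)."""

# Sketch: exponential backoff, max 4 attempts, delays of base**attempt
# seconds capped at 60s, per the limits described above.
def with_backoff(call, max_attempts: int = 4, base: int = 4, cap_s: int = 60,
                 sleep=time.sleep):
    for attempt in range(max_attempts):
        try:
            return call()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            sleep(min(base ** (attempt + 1), cap_s))
```

The injected `sleep` parameter keeps the helper testable; production code would use the default `time.sleep`.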
## Error Handling

### Standard Error Response

```json
{
  "status": 400,
  "body": {
    "error": "Error description",
    "details": "Detailed error message"
  }
}
```

### HTTP Status Codes

- `200` - Success
- `400` - Bad Request (invalid input)
- `401` - Unauthorized (Advoware token invalid)
- `403` - Forbidden (rate limit)
- `404` - Not Found
- `500` - Internal Server Error
- `503` - Service Unavailable (Redis down)

### Common Errors

**Redis Connection Error**:
```json
{
  "status": 503,
  "body": {
    "error": "Redis connection failed"
  }
}
```

**Advoware API Error**:
```json
{
  "status": 500,
  "body": {
    "error": "Advoware API call failed",
    "details": "401 Unauthorized"
  }
}
```

**Lock Active Error**:
```json
{
  "status": 400,
  "body": {
    "error": "Sync already in progress for employee SB"
  }
}
```

## API Versioning

**Current Version**: v1 (implicit, no version in the URL)

**Future**: API versioning via URL prefix (`/v2/api/...`)

## Health Check

**Coming Soon**: `/health` endpoint for load balancers

Expected response:
```json
{
  "status": "healthy",
  "services": {
    "redis": "up",
    "advoware": "up",
    "google": "up"
  }
}
```

## Testing

### Postman Collection

Import this collection for quick testing:

```json
{
  "info": {
    "name": "bitbylaw API",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "Advoware Proxy GET",
      "request": {
        "method": "GET",
        "url": "http://localhost:3000/advoware/proxy?endpoint=employees"
      }
    },
    {
      "name": "Calendar Sync Trigger",
      "request": {
        "method": "POST",
        "url": "http://localhost:3000/advoware/calendar/sync",
        "header": [{"key": "Content-Type", "value": "application/json"}],
        "body": {
          "mode": "raw",
          "raw": "{\"full_content\": true}"
        }
      }
    }
  ]
}
```

## Related Documentation

- [Architecture](ARCHITECTURE.md)
- [Development Guide](DEVELOPMENT.md)
- [Configuration](CONFIGURATION.md)
642  bitbylaw/docs/ARCHITECTURE.md  (new file)
@@ -0,0 +1,642 @@
# Architecture

## System Overview

The bitbylaw system is an event-driven integration between Advoware, EspoCRM, Google Calendar, Vermieterhelden, and 3CX telephony, built on the Motia framework. The architecture follows a modular, microservice-oriented approach with a clear separation of concerns.

### Core Components

```
                       ┌─────────────────────────────┐
                       │      KONG API Gateway       │
                       │      api.bitbylaw.com       │
                       │   (Auth, Rate Limiting)     │
                       └──────────────┬──────────────┘
                                      │
           ┌──────────────────────────┼──────────────────────────┐
           │                          │                          │
    ┌──────▼──────┐          ┌────────▼───────┐          ┌──────▼─────┐
    │ Vermieter-  │          │     Motia      │          │    3CX     │
    │ helden.de   │─────────▶│   Framework    │◀─────────│ Telephony  │
    │ (WordPress) │          │  (Middleware)  │          │  (ralup)   │
    └─────────────┘          └────────┬───────┘          └────────────┘
      Leads Input                     │                   Call Handling
                                      │
          ┌───────────────────────────┼───────────────────────────┐
          │                           │                           │
     ┌────▼────┐               ┌──────▼──────┐              ┌─────▼──────┐
     │Advoware │               │     VMH     │              │  Calendar  │
     │  Proxy  │               │  Webhooks   │              │    Sync    │
     └────┬────┘               └──────┬──────┘              └─────┬──────┘
          │                           │                           │
     ┌────▼───────────────────────────▼───────────────────────────▼────┐
     │                         Redis (3 DBs)                           │
     │   DB 1: Caching & Locks                                         │
     │   DB 2: Calendar Sync State                                     │
     └────┬────────────────────────────────────────────────────────────┘
          │
     ┌────▼────────────────────────────┐
     │       External Services         │
     ├─────────────────────────────────┤
     │ • Advoware REST API             │
     │ • EspoCRM (VMH)                 │
     │ • Google Calendar API           │
     │ • 3CX API (ralup.my3cx.de)      │
     │ • Vermieterhelden WordPress     │
     └─────────────────────────────────┘
```

## Component Details

### 0. KONG API Gateway

**Purpose**: Central API gateway for all public APIs, with authentication and rate limiting.

**Domain**: `api.bitbylaw.com`

**Functions**:
- **Authentication**: API-key based, JWT, OAuth2
- **Rate Limiting**: Per consumer/API key
- **Request Routing**: To backend services (Motia, etc.)
- **SSL/TLS Termination**: HTTPS handling
- **Logging & Monitoring**: Request logs, metrics
- **CORS Handling**: Cross-origin requests

**Upstream Services**:
- Motia Framework (Advoware Proxy, Calendar Sync, VMH Webhooks)
- Future: additional microservices

**Configuration**:
```yaml
# KONG service configuration
services:
  - name: motia-backend
    url: http://localhost:3000
    routes:
      - name: advoware-proxy
        paths: [/advoware/*]
      - name: calendar-sync
        paths: [/calendar/*]
      - name: vmh-webhooks
        paths: [/vmh/*]
    plugins:
      - name: key-auth
      - name: rate-limiting
        config:
          minute: 600
```

**Flow**:
```
Client → KONG (api.bitbylaw.com) → Auth Check → Rate Limit → Motia Backend
```

### 1. Advoware Proxy Layer

**Purpose**: Transparent REST API proxy for Advoware, with authentication and caching.

**Module**: `steps/advoware_proxy/`
- `advoware_api_proxy_get_step.py` - GET requests
- `advoware_api_proxy_post_step.py` - POST requests (create)
- `advoware_api_proxy_put_step.py` - PUT requests (update)
- `advoware_api_proxy_delete_step.py` - DELETE requests

**Services**: `services/advoware.py`
- Token management (HMAC-512 authentication)
- Redis-based token caching (55 min lifetime)
- Automatic token refresh on 401 errors
- Async API client built on aiohttp

**Datenfluss**:
|
||||
```
|
||||
Client → API-Step → AdvowareAPI Service → Redis (Token Cache) → Advoware API
|
||||
```
|
||||
|
||||

### 2. Calendar Sync System

**Purpose**: Bidirectional synchronization between Advoware appointments and Google Calendar.

**Architecture Pattern**: Event-driven cascade

**Integration**: EspoCRM sends webhooks to KONG → Motia

**Data Flow**:
```
EspoCRM (Vermieterhelden CRM) → KONG → Motia VMH Webhooks → Redis Dedup → Events
```
```
Cron (daily)
  → calendar_sync_cron_step
    → Emit: "calendar_sync_all"
  → calendar_sync_all_step
    → Fetch employees
    → For each employee:
      → Set Redis lock
      → Emit: "calendar_sync_employee"
  → calendar_sync_event_step
    → Fetch Advoware events
    → Fetch Google events
    → Sync (create/update/delete)
    → Clear Redis lock
```

**Module**: `steps/advoware_cal_sync/`
- `calendar_sync_cron_step.py` - Daily trigger
- `calendar_sync_all_step.py` - Employee list handler
- `calendar_sync_event_step.py` - Per-employee sync logic
- `calendar_sync_api_step.py` - Manual trigger endpoint
- `calendar_sync_utils.py` - Shared utilities
- `audit_calendar_sync.py` - Audit & diagnostics

**Key Features**:
- **Redis Locking**: Prevents parallel syncs for the same employee
- **Rate Limiting**: Token bucket algorithm (7 tokens, Redis-based)
- **Normalization**: Common format (Berlin timezone) for both APIs
- **Error Isolation**: A failure for one employee does not stop the overall sync
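
The Redis-based token bucket can be illustrated with a minimal in-memory sketch. The bucket size of 7 comes from the feature list above; the refill rate is an assumption for the demo, and the production version keeps this state in Redis rather than in process memory:

```python
import time

class TokenBucket:
    """In-memory token bucket; the production version keeps this state in Redis."""

    def __init__(self, capacity: float, refill_per_sec: float):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = capacity
        self.updated = time.monotonic()

    def try_acquire(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.updated) * self.refill_per_sec)
        self.updated = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

bucket = TokenBucket(capacity=7, refill_per_sec=0.0)  # refill disabled for the demo
granted = [bucket.try_acquire() for _ in range(9)]
print(granted.count(True))  # 7: requests beyond the bucket size are rejected
```

Callers that receive `False` back off and retry, which is what keeps the per-employee sync under the Google API quota.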

**Data Mapping**:
```
Advoware Format     → Standard Format     → Google Calendar Format
       ↓                     ↓                      ↓
datum/uhrzeitVon      start (datetime)      dateTime
datumBis              end (datetime)        dateTime
dauertermin           all_day (bool)        date
turnus/turnusArt      recurrence            RRULE
```
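
The mapping above can be sketched as a function. The payload shapes and the `uhrzeitBis` field are assumptions for illustration (the table only names `uhrzeitVon`/`datumBis`); `zoneinfo` supplies the Berlin timezone used for normalization:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

BERLIN = ZoneInfo("Europe/Berlin")

def to_google_event(advo: dict) -> dict:
    """Map an Advoware appointment (field names from the table above)
    to a Google Calendar event body. Payload shapes are assumed."""
    if advo.get("dauertermin"):
        # All-day appointments map to Google's date-only fields
        return {"start": {"date": advo["datum"]}, "end": {"date": advo["datumBis"]}}
    start = datetime.fromisoformat(f"{advo['datum']}T{advo['uhrzeitVon']}").replace(tzinfo=BERLIN)
    end = datetime.fromisoformat(f"{advo['datumBis']}T{advo['uhrzeitBis']}").replace(tzinfo=BERLIN)
    return {
        "start": {"dateTime": start.isoformat(), "timeZone": "Europe/Berlin"},
        "end": {"dateTime": end.isoformat(), "timeZone": "Europe/Berlin"},
    }

event = to_google_event({"datum": "2025-01-15", "uhrzeitVon": "09:30",
                         "datumBis": "2025-01-15", "uhrzeitBis": "10:30"})
print(event["start"]["dateTime"])  # 2025-01-15T09:30:00+01:00
```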

### 3. VMH Webhook System

**Purpose**: Receives and processes EspoCRM webhooks for Beteiligte entities.

**Architecture Pattern**: Webhook → Deduplication → Event Emission

**Module**: `steps/vmh/`
- `webhook/beteiligte_create_api_step.py` - Create webhook
- `webhook/beteiligte_update_api_step.py` - Update webhook
- `webhook/beteiligte_delete_api_step.py` - Delete webhook
- `beteiligte_sync_event_step.py` - Sync event handler (placeholder)

**Webhook Flow**:
```
EspoCRM → POST /vmh/webhook/beteiligte/create
    ↓
Webhook Step
    ↓
Extract Entity IDs
    ↓
Redis Deduplication (SET: vmh:beteiligte:create_pending)
    ↓
Emit Event: "vmh.beteiligte.create"
    ↓
Sync Event Step (subscribes)
    ↓
[TODO: implementation]
```

### 4. Vermieterhelden Integration

**Purpose**: Lead intake from the Vermieterhelden.de WordPress frontend.

**URL**: `https://vermieterhelden.de`

**Technology**: WordPress-based frontend

**Features**:
- **Lead forms**: Tenants, landlords, inquiries
- **Lead routing**: To EspoCRM (VMH) → Motia
- **Webhook-based**: POST to KONG/Motia on new leads

**Data Flow**:
```
Vermieterhelden.de → Lead created → Webhook → KONG → Motia → EspoCRM/Advoware
```

**Lead Types**:
- Tenant inquiries
- Landlord inquiries
- Contact forms
- Newsletter signups

**Integration with Motia**:
- Dedicated webhook endpoint: `/api/leads/vermieterhelden`
- Lead validation and enrichment
- Forwarding to CRM systems

### 5. 3CX Telephony Integration

**Purpose**: Telephony system integration for call handling and lead qualification.

**URL**: `https://ralup.my3cx.de`

**Technology**: 3CX Cloud PBX

**Features**:
- **Outbound calls**: Lead calls (automatic or manual)
- **Inbound calls**: Customer data lookup (CTI - Computer Telephony Integration)
- **Call logging**: Call records pushed to the CRM
- **Call recording**: Store and retrieve recordings
- **Screen pops**: Customer info on incoming calls

**API Integrations**:

**A) Outbound: Motia → 3CX**
```
Motia → KONG → 3CX API
- Initiate Call to Lead
- Get Call Status
```

**B) Inbound: 3CX → Motia**
```
3CX Webhook → KONG → Motia
- Call Started → Fetch Customer Data
- Call Ended → Log Call Record
```

**Data Flows**:

**Call Initiation**:
```
Lead in CRM → Trigger Call → Motia → 3CX API → Dial Number
```

**Inbound Call**:
```
3CX detects call → Webhook to Motia → Lookup in Advoware/EspoCRM → Return data → 3CX Screen Pop
```

**Call Recording**:
```
Call ends → 3CX Webhook → Motia → Store metadata → Link to CRM entity
```

**Use Cases**:
- Lead qualification on intake
- Customer data lookup on incoming calls
- Call records in EspoCRM/Advoware
- Automatic follow-up tasks

**Deduplication Mechanism** (VMH webhooks):
- One Redis SET of pending IDs per action type (create/update/delete)
- New IDs are added to the SET
- Events are emitted only for new (non-duplicate) IDs
- A TTL on the SET prevents memory leaks

## Event-Driven Design

### Event Topics

| Topic | Emitter | Subscriber | Payload |
|-------|---------|------------|---------|
| `calendar_sync_all` | cron_step | all_step | `{}` |
| `calendar_sync_employee` | all_step, api_step | event_step | `{kuerzel, full_content}` |
| `vmh.beteiligte.create` | create webhook | sync_event_step | `{entity_id, action, source, timestamp}` |
| `vmh.beteiligte.update` | update webhook | sync_event_step | `{entity_id, action, source, timestamp}` |
| `vmh.beteiligte.delete` | delete webhook | sync_event_step | `{entity_id, action, source, timestamp}` |

### Event-Flow Patterns

**1. Cascade Pattern** (calendar sync):
```
Trigger → Fetch List → Emit per Item → Process Item
```

**2. Webhook Pattern** (VMH):
```
External Event → Dedup → Internal Event → Processing
```

## Redis Architecture

### Database Layout

**DB 0**: Default (Motia internal)

**DB 1**: Advoware cache & locks
- `advoware_access_token` - Bearer token (TTL: 53 min)
- `advoware_token_timestamp` - Token creation time
- `calendar_sync:lock:{kuerzel}` - Per-employee lock (TTL: 5 min)
- `vmh:beteiligte:create_pending` - Create dedup SET
- `vmh:beteiligte:update_pending` - Update dedup SET
- `vmh:beteiligte:delete_pending` - Delete dedup SET

**DB 2**: Calendar sync rate limiting
- `google_calendar_api_tokens` - Token bucket for rate limiting

---

## External APIs

### Advoware REST API

**Base URL**: `https://www2.advo-net.net:90/` (see `ADVOWARE_API_BASE_URL`)
**Auth**: HMAC-512 (see `services/advoware.py`)
**Rate Limits**: Unknown (no limits documented)
**Documentation**: [Advoware API Swagger](../docs/advoware/advoware_api_swagger.json)

**Key Endpoints**:
- `POST /auth/login` - Generate token
- `GET /employees` - Employee list
- `GET /events` - Fetch appointments
- `POST /events` - Create appointment
- `PUT /events/{id}` - Update appointment

### Redis Usage Patterns

**Token Caching**:
```python
# Set with expiration (3180 s = 53 min)
redis.set('advoware_access_token', token, ex=3180)

# Get with fallback
token = redis.get('advoware_access_token')
if not token:
    token = fetch_new_token()
```
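
The get-with-fallback pattern above can be wrapped as a single function. A self-contained sketch, with a minimal in-memory TTL cache standing in for Redis DB 1 so it runs without a server (the 3180 s TTL is the documented 53-minute lifetime):

```python
import time

class TTLCache:
    """In-memory stand-in for the Redis token cache (DB 1)."""

    def __init__(self):
        self.data = {}

    def set(self, key, value, ex):
        self.data[key] = (value, time.monotonic() + ex)

    def get(self, key):
        item = self.data.get(key)
        if item is None:
            return None
        value, expires = item
        if time.monotonic() >= expires:
            del self.data[key]  # expired, behave like Redis key expiry
            return None
        return value

def get_access_token(cache, fetch_new_token, ttl=3180):
    """Get-or-fetch with the 53-minute TTL used for the Advoware token."""
    token = cache.get('advoware_access_token')
    if token is None:
        token = fetch_new_token()
        cache.set('advoware_access_token', token, ex=ttl)
    return token

calls = []
fetch = lambda: calls.append(1) or "tok-1"
cache = TTLCache()
print(get_access_token(cache, fetch))  # tok-1 (fetched)
print(get_access_token(cache, fetch))  # tok-1 (served from cache)
print(len(calls))  # 1: the second call never hit the upstream API
```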

### EspoCRM (VMH)

**Integration**: Webhook sender (outbound), API consumer
**Endpoints**: Configured in EspoCRM, routed via KONG
**Format**: JSON POST with entity data
**Note**: Serves as the CRM for Vermieterhelden leads

### 3CX Telephony API

**Base URL**: `https://ralup.my3cx.de/api/v1/`
**Auth**: API key or Basic Auth
**Rate Limits**: Unknown (typically 60 req/min)

**Key Endpoints**:
- `POST /calls/initiate` - Start a call
- `GET /calls/{id}/status` - Call status
- `GET /calls/{id}/recording` - Retrieve recording
- `POST /webhook` - Webhook configuration (inbound)

**Webhooks** (inbound from 3CX):
- `call.started` - Call begins
- `call.ended` - Call ended
- `call.transferred` - Call transferred

### Vermieterhelden

**Integration**: Webhook sender (lead intake)
**Base**: WordPress with custom plugins
**Format**: JSON POST to Motia

**Webhook Events**:
- `lead.created` - New lead
- `contact.submitted` - Contact form

**Locking** (Redis usage pattern):
```python
lock_key = f'calendar_sync:lock:{kuerzel}'
if not redis.set(lock_key, '1', nx=True, ex=300):
    raise LockError("Already locked")

# Always release
redis.delete(lock_key)
```
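
The lock pattern above is easier to use safely as a context manager that releases in a `finally` block. A minimal self-contained sketch, with an in-memory stand-in for the Redis client so it runs without a server (`employee_lock` and `FakeRedis` are illustrative names, not project code):

```python
from contextlib import contextmanager

class FakeRedis:
    """In-memory stand-in for redis.Redis, supporting just set(nx=...) and delete."""

    def __init__(self):
        self.store = {}

    def set(self, key, value, nx=False, ex=None):
        if nx and key in self.store:
            return None  # lock already held
        self.store[key] = value
        return True

    def delete(self, key):
        self.store.pop(key, None)

@contextmanager
def employee_lock(r, kuerzel, ttl=300):
    key = f"calendar_sync:lock:{kuerzel}"
    if not r.set(key, "1", nx=True, ex=ttl):
        raise RuntimeError(f"Sync for {kuerzel} already running")
    try:
        yield
    finally:
        r.delete(key)  # released even if the sync raises

r = FakeRedis()
with employee_lock(r, "SB"):
    pass  # per-employee sync work goes here
print("calendar_sync:lock:SB" in r.store)  # False: the lock was released
```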

**Deduplication**:
```python
# Check & add (note: SMEMBERS followed by SADD is not atomic;
# SADD's return value or a Lua script would make it so)
existing = redis.smembers('vmh:beteiligte:create_pending')
new_ids = input_ids - existing
if new_ids:
    redis.sadd('vmh:beteiligte:create_pending', *new_ids)
```
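
An atomic variant relies on SADD's return value (the number of members actually added), which closes the read-then-write gap at the cost of one round trip per ID. Sketched against a minimal in-memory stand-in so it runs without a server:

```python
class FakeRedisSets:
    """In-memory stand-in for the Redis SADD command."""

    def __init__(self):
        self.data = {}

    def sadd(self, key: str, member: str) -> int:
        s = self.data.setdefault(key, set())
        if member in s:
            return 0  # already pending: duplicate
        s.add(member)
        return 1  # newly added

def dedupe(r, input_ids):
    # Keep only IDs that SADD actually added; each check-and-add is atomic
    return [i for i in input_ids if r.sadd('vmh:beteiligte:create_pending', i)]

r = FakeRedisSets()
first = dedupe(r, ['a', 'b', 'a'])
second = dedupe(r, ['b', 'c'])
print(first, second)  # ['a', 'b'] ['c']
```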

## Service Layer

### AdvowareAPI Service

**Location**: `services/advoware.py`

**Responsibilities**:
- HMAC-512 authentication
- Token management
- HTTP client (aiohttp)
- Error handling & retries

**Key Methods**:
```python
get_access_token(force_refresh=False) -> str
api_call(endpoint, method, params, json_data) -> Any
```

**Authentication Flow**:
```
1. Generate HMAC-512 signature
   - Message: "{product_id}:{app_id}:{nonce}:{timestamp}"
   - Key: Base64-decoded API key
   - Hash: SHA512

2. POST to security.advo-net.net/api/v1/Token
   - Body: {AppID, User, Password, HMAC512Signature, ...}

3. Extract access_token from response

4. Cache in Redis (53 min TTL)

5. Use as Bearer token: "Authorization: Bearer {token}"
```
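
Step 1 can be sketched with the standard library. The message layout and key handling follow the description above; the hex digest encoding and all field values are assumptions for illustration (check `services/advoware.py` for the exact format):

```python
import base64
import hashlib
import hmac
import time
import uuid

def hmac512_signature(api_key_b64: str, product_id: str, app_id: str) -> str:
    """Build the HMAC-512 signature described above (hex encoding assumed)."""
    nonce = uuid.uuid4().hex
    timestamp = str(int(time.time()))
    message = f"{product_id}:{app_id}:{nonce}:{timestamp}"
    key = base64.b64decode(api_key_b64)  # the API key is stored Base64-encoded
    return hmac.new(key, message.encode(), hashlib.sha512).hexdigest()

sig = hmac512_signature(base64.b64encode(b"demo-key").decode(), "64", "my-app")
print(len(sig))  # a SHA-512 hex digest is always 128 characters
```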

## External API Integration

### Advoware API

**Base URL**: `https://www2.advo-net.net:90/`
**Auth**: HMAC-512 + Bearer token
**Rate Limits**: Unknown (robust error handling)

**Key Endpoints**:
- `/employees` - Employee list
- `/appointments` - Appointments

### Google Calendar API

**Auth**: Service account (JSON key)
**Rate Limits**: 600 requests/minute (enforced via Redis)
**Scopes**: `https://www.googleapis.com/auth/calendar`

**Key Operations**:
- `calendars().get()` - Fetch calendar
- `calendars().insert()` - Create calendar
- `events().list()` - Fetch events
- `events().insert()` - Create event

**Access Control Overview**:
- **KONG Gateway**: API key or JWT-based auth for external clients
- **Advoware**: User-based auth (ADVOWARE_USER + PASSWORD)
- **Google**: Service account (domain-wide delegation)
- **3CX**: API key or Basic Auth
- **Redis**: Localhost only (no password)
- **Vermieterhelden**: Webhook secret for validation

## Security

### Secrets Management

**Environment Variables**:
```bash
ADVOWARE_API_KEY                      # Base64-encoded HMAC key
ADVOWARE_PASSWORD                     # User password
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH  # Path to JSON key
ESPOCRM_MARVIN_API_KEY                # Webhook validation (optional)
```

**Storage**:
- Environment variables in the systemd service
- Service account JSON: `/opt/motia-app/service-account.json` (chmod 600)
- No secrets in code or Git

### Access Control

**Advoware**: User-based auth (ADVOWARE_USER + PASSWORD)
**Google**: Service account (domain-wide delegation)
**Redis**: Localhost only (no password)

## Performance Characteristics

### Throughput

**Calendar Sync**:
- ~10 employees: 2-3 minutes
- Rate-limited by the Google API (600 req/min)
- Per-employee parallelization: no (sequential via events)

**Webhooks**:
- Near-instant processing (<100 ms)
- Batch support (multiple entities per request)
- Redis dedup overhead: <10 ms

### Memory Usage

**Current**: 169 MB (peak: 276 MB)
**Breakdown**:
- Node.js process: ~150 MB
- Python dependencies: lazy-loaded per step
- Redis memory: <10 MB

### Scalability

**Horizontal**: Not readily possible (Redis locks, shared state)
**Vertical**: CPU-bound with many parallel employees
**Bottleneck**: Google Calendar API rate limits

## Monitoring & Observability

### Logging

**Framework**: Motia Workbench (structured logging)
**Levels**: DEBUG, INFO, ERROR
**Output**: journalctl (systemd) + Motia Workbench UI

**Key Log Points**:
- API requests (method, URL, status)
- Event emission (topic, payload)
- Redis operations (keys, success/failure)
- Errors (stack traces, context)

### Metrics

**Available** (via logs):
- Webhook receive count
- Calendar sync duration per employee
- API call count & latency
- Redis hit/miss ratio (implicit)

**Missing** (future):
- Prometheus metrics
- Grafana dashboards
- Alerting

## Deployment

### systemd Service

**Unit**: `motia.service`
**User**: `www-data`
**WorkingDirectory**: `/opt/motia-app/bitbylaw`
**Restart**: `always` (10 s delay)

**Environment**:
```bash
NODE_ENV=production
NODE_OPTIONS=--max-old-space-size=8192 --inspect
HOST=0.0.0.0
MOTIA_LOG_LEVEL=debug
```

### Dependencies

**Runtime**:
- Node.js 18+
- Python 3.13+
- Redis server
- systemd

**Build**:
- npm (Node packages)
- pip (Python packages)
- Motia CLI

## Disaster Recovery

### Backup Strategy

**Redis**:
- RDB snapshots (automatic)
- AOF persistence (optional)

**Configuration**:
- Versioned in Git
- Environment variables in systemd

**Service Account**:
- Manual backup: `/opt/motia-app/service-account.json`

### Recovery Procedures

**Service Restart**:
```bash
systemctl restart motia.service
```

**Clear Redis Cache**:
```bash
redis-cli -n 1 FLUSHDB  # Advoware cache
redis-cli -n 2 FLUSHDB  # Calendar sync
```

**Clear Employee Locks**:
```bash
python /opt/motia-app/bitbylaw/delete_employee_locks.py
```

## Future Enhancements

### Planned

1. **3CX Full Integration**: Complete call handling, CTI features
2. **Vermieterhelden Lead Processing**: Automated lead routing and enrichment
3. **Horizontal Scaling**: Distributed locking (Redis Cluster)
4. **Metrics & Monitoring**: Prometheus exporters
5. **Health Checks**: `/health` endpoint via KONG

### Considered

1. **PostgreSQL Hub**: Persistent sync state (currently Redis-only)
2. **Webhook Signatures**: Validation of Vermieterhelden/3CX/EspoCRM requests
3. **Multi-Tenant**: Support for multiple law firms
4. **KONG Plugins**: Custom plugins for business logic

## Related Documentation

- [Development Guide](DEVELOPMENT.md)
- [API Reference](API.md)
- [Configuration](CONFIGURATION.md)
- [Troubleshooting](TROUBLESHOOTING.md)
- [Deployment Guide](DEPLOYMENT.md)
509
bitbylaw/docs/CONFIGURATION.md
Normal file
@@ -0,0 +1,509 @@
# Configuration Guide

## Environment Variables

All configuration is done via environment variables. They can be set:
1. In a `.env` file (local development)
2. In the systemd service file (production)
3. Exported in the shell

## Advoware API Configuration

### Required Variables

```bash
# Advoware API base URL
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/

# Product ID (typically 64)
ADVOWARE_PRODUCT_ID=64

# Application ID (provided by Advoware)
ADVOWARE_APP_ID=your_app_id_here

# API key (Base64-encoded, for the HMAC-512 signature)
ADVOWARE_API_KEY=your_base64_encoded_key_here

# Law firm identifier
ADVOWARE_KANZLEI=your_kanzlei_name

# Database name
ADVOWARE_DATABASE=your_database_name

# User for API access
ADVOWARE_USER=api_user

# User role (typically 2)
ADVOWARE_ROLE=2

# User password
ADVOWARE_PASSWORD=secure_password_here

# Token lifetime in minutes (default: 55)
ADVOWARE_TOKEN_LIFETIME_MINUTES=55

# API timeout in seconds (default: 30)
ADVOWARE_API_TIMEOUT_SECONDS=30

# Write protection (true = no write access to Advoware)
ADVOWARE_WRITE_PROTECTION=true
```

### Advoware API Key

The API key must be Base64-encoded for the HMAC-512 signature:

```bash
# If you have a raw key, encode it:
echo -n "your_raw_key" | base64
```

## Redis Configuration

```bash
# Redis host (default: localhost)
REDIS_HOST=localhost

# Redis port (default: 6379)
REDIS_PORT=6379

# Redis database for the Advoware cache (default: 1)
REDIS_DB_ADVOWARE_CACHE=1

# Redis database for calendar sync (default: 2)
REDIS_DB_CALENDAR_SYNC=2

# Redis timeout in seconds (default: 5)
REDIS_TIMEOUT_SECONDS=5
```

### Redis Database Layout

- **DB 0**: Motia framework (not configurable)
- **DB 1**: Advoware cache & locks (`REDIS_DB_ADVOWARE_CACHE`)
  - Token cache
  - Employee locks
  - Webhook deduplication
- **DB 2**: Calendar sync rate limiting (`REDIS_DB_CALENDAR_SYNC`)

---

## KONG API Gateway Configuration

```bash
# KONG Admin API URL (for configuration)
KONG_ADMIN_URL=http://localhost:8001

# KONG proxy URL (publicly reachable)
KONG_PROXY_URL=https://api.bitbylaw.com
```

**Note**: KONG is typically configured via the Admin API or declarative config (kong.yml).

---

## 3CX Telephony Configuration

```bash
# 3CX API base URL
THREECX_API_URL=https://ralup.my3cx.de/api/v1

# 3CX API key for authentication
THREECX_API_KEY=your_3cx_api_key_here

# 3CX webhook secret (optional, for signature validation)
THREECX_WEBHOOK_SECRET=your_webhook_secret_here
```

### 3CX Setup

1. Create an API key in the 3CX Management Console
2. Configure webhook URLs in 3CX:
   - Call started: `https://api.bitbylaw.com/telephony/3cx/webhook`
   - Call ended: `https://api.bitbylaw.com/telephony/3cx/webhook`
3. Enable call recording (optional)

---

## Vermieterhelden Integration Configuration

```bash
# Vermieterhelden webhook secret (for signature validation)
VH_WEBHOOK_SECRET=your_vermieterhelden_webhook_secret

# Lead routing target (where leads are sent)
VH_LEAD_TARGET=espocrm  # Options: espocrm, advoware, both

# Lead auto-assignment (optional)
VH_AUTO_ASSIGN_LEADS=true
VH_DEFAULT_ASSIGNEE=user_id_123
```

### Vermieterhelden Setup

1. Configure the webhook URL in WordPress:
   - URL: `https://api.bitbylaw.com/leads/vermieterhelden`
2. Generate a shared secret
3. Enable webhook events for lead creation

---

## Google Calendar Configuration

```bash
# Path to the service account JSON file
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json

# Google Calendar scopes (default: calendar)
# GOOGLE_CALENDAR_SCOPES is set in code; no env variable needed
```

### Service Account Setup

1. Create a service account in the Google Cloud Console
2. Download the JSON key file
3. Save it as `service-account.json`
4. Set secure permissions:

```bash
chmod 600 /opt/motia-app/service-account.json
chown www-data:www-data /opt/motia-app/service-account.json
```

See also: [GOOGLE_SETUP_README.md](../GOOGLE_SETUP_README.md)

## PostgreSQL Configuration

**Status**: Currently unused (future extension)

```bash
# PostgreSQL host
POSTGRES_HOST=localhost

# PostgreSQL user
POSTGRES_USER=calendar_sync_user

# PostgreSQL password
POSTGRES_PASSWORD=secure_password

# PostgreSQL database name
POSTGRES_DB_NAME=calendar_sync_db
```

## Calendar Sync Configuration

```bash
# Anonymize Google events (true/false)
CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS=true

# Debug: sync only specific employees (comma-separated initials)
# Empty = all employees
CALENDAR_SYNC_DEBUG_KUERZEL=SB,AI,RO,OK,BI,ST,UR,PB,VB
```

### Anonymization

With `CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS=true`:
- Title: "Blocked"
- Description: empty
- Location: empty

With `false`:
- Full details from Advoware are synchronized

### Debug Mode

For development/testing, sync only specific employees:

```bash
# Only these initials
CALENDAR_SYNC_DEBUG_KUERZEL=SB,AI

# All (default)
CALENDAR_SYNC_DEBUG_KUERZEL=
```

## EspoCRM Configuration

```bash
# API key for webhook validation (optional)
ESPOCRM_MARVIN_API_KEY=your_webhook_secret_here
```

**Note**: The API key is currently not used for validation. A future implementation may add HMAC signature validation.
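
Should HMAC signature validation be added later, it can be done with the standard library alone. A minimal sketch; the SHA-256 algorithm, hex encoding, and header handling are assumptions that would need to match whatever EspoCRM sends:

```python
import hashlib
import hmac

def valid_signature(secret: str, body: bytes, signature_hex: str) -> bool:
    """Verify an HMAC-SHA256 webhook signature (scheme assumed)."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(expected, signature_hex)

secret = "your_webhook_secret_here"
body = b'{"entityId": "abc123"}'
sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
ok = valid_signature(secret, body, sig)
tampered = valid_signature(secret, b'{"entityId": "evil"}', sig)
print(ok, tampered)  # True False
```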

## Motia Framework Configuration

```bash
# Node environment (development|production)
NODE_ENV=production

# Node memory limit (in MB)
# NODE_OPTIONS is set in systemd
NODE_OPTIONS=--max-old-space-size=8192 --inspect --heapsnapshot-signal=SIGUSR2

# Host binding (0.0.0.0 = all interfaces)
HOST=0.0.0.0

# Port (default: 3000)
# PORT=3000

# Log level (debug|info|warning|error)
MOTIA_LOG_LEVEL=debug

# npm cache (for the systemd user www-data)
NPM_CONFIG_CACHE=/opt/motia-app/.npm-cache
```

## Configuration Loading

### config.py

Central configuration is loaded in `config.py`:

```python
import os

from dotenv import load_dotenv

# Load the .env file if it exists
load_dotenv()

class Config:
    # All variables with defaults
    REDIS_HOST = os.getenv('REDIS_HOST', 'localhost')
    REDIS_PORT = int(os.getenv('REDIS_PORT', '6379'))
    # ...
```

### Usage in Steps

```python
from config import Config

# Access configuration
redis_host = Config.REDIS_HOST
api_key = Config.ADVOWARE_API_KEY
```

### Usage in Services

```python
from config import Config

class AdvowareAPI:
    def __init__(self):
        self.api_key = Config.ADVOWARE_API_KEY
        self.base_url = Config.ADVOWARE_API_BASE_URL
```

## Environment-Specific Configuration

### Development (.env)

Create a `.env` file in the project root:

```bash
# .env (do not commit to Git!)
ADVOWARE_API_BASE_URL=https://staging.advo-net.net:90/
ADVOWARE_API_KEY=dev_key_here
REDIS_HOST=localhost
MOTIA_LOG_LEVEL=debug
ADVOWARE_WRITE_PROTECTION=true
```

**Important**: Add `.env` to `.gitignore`!

### Production (systemd)

In `/etc/systemd/system/motia.service`:

```ini
[Service]
Environment=NODE_ENV=production
Environment=ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
Environment=ADVOWARE_API_KEY=production_key_here
Environment=ADVOWARE_PASSWORD=production_password_here
Environment=REDIS_HOST=localhost
Environment=MOTIA_LOG_LEVEL=info
Environment=ADVOWARE_WRITE_PROTECTION=false
```

After changes:
```bash
sudo systemctl daemon-reload
sudo systemctl restart motia.service
```

### Staging

Use a dedicated service file or a separate environment file.

## Validation

### Check Configuration

Script to validate the configuration:

```python
# scripts/check_config.py
import sys

from config import Config

required_vars = [
    'ADVOWARE_API_BASE_URL',
    'ADVOWARE_APP_ID',
    'ADVOWARE_API_KEY',
    'REDIS_HOST',
]

missing = []
for var in required_vars:
    if not getattr(Config, var, None):
        missing.append(var)

if missing:
    print(f"ERROR: Missing configuration: {', '.join(missing)}")
    sys.exit(1)

print("✓ Configuration valid")
```

Run:
```bash
python scripts/check_config.py
```

## Secrets Management

### DO NOT

❌ Commit secrets to Git
❌ Hardcode passwords in code
❌ Share `.env` files
❌ Log sensitive data

### DO

✅ Use environment variables
✅ Use `.gitignore` for `.env`
✅ Use systemd for production secrets
✅ Rotate keys regularly
✅ Use `chmod 600` for sensitive files

### Rotation

When rotating API keys:

```bash
# 1. Update the environment variable
sudo nano /etc/systemd/system/motia.service

# 2. Reload systemd
sudo systemctl daemon-reload

# 3. Clear the Redis cache
redis-cli -n 1 DEL advoware_access_token advoware_token_timestamp

# 4. Restart the service
sudo systemctl restart motia.service

# 5. Verify
sudo journalctl -u motia.service -f
```

## Configuration Reference

### Complete Example

```bash
# Advoware API
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_PRODUCT_ID=64
ADVOWARE_APP_ID=your_app_id
ADVOWARE_API_KEY=your_base64_key
ADVOWARE_KANZLEI=your_kanzlei
ADVOWARE_DATABASE=your_db
ADVOWARE_USER=api_user
ADVOWARE_ROLE=2
ADVOWARE_PASSWORD=your_password
ADVOWARE_TOKEN_LIFETIME_MINUTES=55
ADVOWARE_API_TIMEOUT_SECONDS=30
ADVOWARE_WRITE_PROTECTION=true

# Redis
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
REDIS_DB_CALENDAR_SYNC=2
REDIS_TIMEOUT_SECONDS=5

# Google Calendar
GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json

# Calendar Sync
CALENDAR_SYNC_ANONYMIZE_GOOGLE_EVENTS=true
CALENDAR_SYNC_DEBUG_KUERZEL=

# PostgreSQL (optional)
POSTGRES_HOST=localhost
POSTGRES_USER=calendar_sync_user
POSTGRES_PASSWORD=your_pg_password
POSTGRES_DB_NAME=calendar_sync_db

# EspoCRM
ESPOCRM_MARVIN_API_KEY=your_webhook_key

# Motia
NODE_ENV=production
HOST=0.0.0.0
MOTIA_LOG_LEVEL=info
```

## Troubleshooting

### "Configuration not found"

```bash
# Check if .env exists
ls -la .env

# Check environment variables
env | grep ADVOWARE

# Check the systemd environment
systemctl show motia.service -p Environment
```

### "Redis connection failed"

```bash
# Check that Redis is running
sudo systemctl status redis-server

# Test the connection
redis-cli -h $REDIS_HOST -p $REDIS_PORT ping

# Check the config
echo "REDIS_HOST: $REDIS_HOST"
echo "REDIS_PORT: $REDIS_PORT"
```

### "API authentication failed"

```bash
# Check that the API key is valid Base64
echo $ADVOWARE_API_KEY | base64 -d

# Clear the token cache
redis-cli -n 1 DEL advoware_access_token

# Check the logs
sudo journalctl -u motia.service | grep -i "token\|auth"
```

## Related Documentation

- [Development Guide](DEVELOPMENT.md)
- [Deployment Guide](DEPLOYMENT.md)
- [Troubleshooting](TROUBLESHOOTING.md)
- [Google Setup](../GOOGLE_SETUP_README.md)
624
bitbylaw/docs/DEPLOYMENT.md
Normal file
@@ -0,0 +1,624 @@
# Deployment Guide

## Production Deployment

### Prerequisites

- Root/sudo access to the server
- Ubuntu/Debian Linux (tested on Ubuntu 22.04+)
- Internet access for package installation
### Installation Steps
|
||||
|
||||
#### 1. System Dependencies
|
||||
|
||||
```bash
|
||||
# Update system
|
||||
sudo apt-get update
|
||||
sudo apt-get upgrade -y
|
||||
|
||||
# Install Node.js 18.x
|
||||
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
|
||||
sudo apt-get install -y nodejs
|
||||
|
||||
# Install Python 3.13
|
||||
sudo apt-get install -y python3.13 python3.13-venv python3.13-dev
|
||||
|
||||
# Install Redis
|
||||
sudo apt-get install -y redis-server
|
||||
|
||||
# Install Git
|
||||
sudo apt-get install -y git
|
||||
|
||||
# Start Redis
|
||||
sudo systemctl enable redis-server
|
||||
sudo systemctl start redis-server
|
||||
```

#### 2. Application Setup

```bash
# Create application directory
sudo mkdir -p /opt/motia-app
cd /opt/motia-app

# Clone repository (or rsync from development)
git clone <repository-url> bitbylaw
cd bitbylaw

# Create www-data user if it does not exist
sudo useradd -r -s /bin/bash www-data || true

# Set ownership
sudo chown -R www-data:www-data /opt/motia-app
```

#### 3. Node.js Dependencies

```bash
# As the www-data user
sudo -u www-data bash
cd /opt/motia-app/bitbylaw

# Install Node.js packages
npm install

# Build TypeScript (if needed)
npm run build
```

#### 4. Python Dependencies

```bash
# As the www-data user
cd /opt/motia-app/bitbylaw

# Create virtual environment
python3.13 -m venv python_modules

# Activate
source python_modules/bin/activate

# Install dependencies
pip install -r requirements.txt

# Deactivate
deactivate
```

#### 5. Service Account Setup

```bash
# Copy service account JSON
sudo cp service-account.json /opt/motia-app/service-account.json

# Set secure permissions
sudo chmod 600 /opt/motia-app/service-account.json
sudo chown www-data:www-data /opt/motia-app/service-account.json
```

See also: [GOOGLE_SETUP_README.md](../GOOGLE_SETUP_README.md)
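Before starting the service, it can help to verify that the copied file parses and contains the standard Google service-account fields. A hedged sketch — the field names are Google's standard keys, and this check is not part of the project code:

```python
# Sanity-check the service-account file (sketch; not part of the project code).
import json

def check_service_account(path: str = '/opt/motia-app/service-account.json') -> str:
    """Return the client_email if the file parses and has the standard keys."""
    with open(path) as f:
        data = json.load(f)
    for field in ('type', 'client_email', 'private_key'):
        if field not in data:
            raise ValueError(f'missing field: {field}')
    return data['client_email']
```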

#### 6. systemd Service

Create `/etc/systemd/system/motia.service`:

```ini
[Unit]
Description=Motia Backend Framework
After=network.target redis-server.service

[Service]
Type=simple
User=www-data
WorkingDirectory=/opt/motia-app/bitbylaw

# Environment Variables
Environment=NODE_ENV=production
Environment=NODE_OPTIONS=--max-old-space-size=8192 --inspect --heapsnapshot-signal=SIGUSR2
Environment=HOST=0.0.0.0
Environment=MOTIA_LOG_LEVEL=info
Environment=NPM_CONFIG_CACHE=/opt/motia-app/.npm-cache

# Advoware Configuration (ADJUST VALUES!)
Environment=ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
Environment=ADVOWARE_PRODUCT_ID=64
Environment=ADVOWARE_APP_ID=your_app_id
Environment=ADVOWARE_API_KEY=your_api_key_base64
Environment=ADVOWARE_KANZLEI=your_kanzlei
Environment=ADVOWARE_DATABASE=your_database
Environment=ADVOWARE_USER=your_user
Environment=ADVOWARE_ROLE=2
Environment=ADVOWARE_PASSWORD=your_password
Environment=ADVOWARE_WRITE_PROTECTION=false

# Redis Configuration
Environment=REDIS_HOST=localhost
Environment=REDIS_PORT=6379
Environment=REDIS_DB_ADVOWARE_CACHE=1
Environment=REDIS_DB_CALENDAR_SYNC=2

# Google Calendar
Environment=GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json

# EspoCRM (if used)
Environment=ESPOCRM_MARVIN_API_KEY=your_webhook_key

# Start Command
ExecStart=/bin/bash -c 'source /opt/motia-app/python_modules/bin/activate && /usr/bin/npm start'

# Restart Policy
Restart=always
RestartSec=10

# Security
NoNewPrivileges=true
PrivateTmp=true

[Install]
WantedBy=multi-user.target
```

**IMPORTANT**: Adjust all `your_*` values!

#### 7. Enable and Start Service

```bash
# Reload systemd
sudo systemctl daemon-reload

# Enable service (autostart)
sudo systemctl enable motia.service

# Start service
sudo systemctl start motia.service

# Check status
sudo systemctl status motia.service
```

#### 8. Verify Installation

```bash
# Check logs
sudo journalctl -u motia.service -f

# Test API
curl http://localhost:3000/health  # (if implemented)

# Test Advoware Proxy
curl "http://localhost:3000/advoware/proxy?endpoint=employees"
```

## Reverse Proxy Setup (nginx)

### Install nginx

```bash
sudo apt-get install -y nginx
```

### Configure

`/etc/nginx/sites-available/motia`:

```nginx
upstream motia_backend {
    server 127.0.0.1:3000;
}

server {
    listen 80;
    server_name your-domain.com;

    # Redirect to HTTPS
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl http2;
    server_name your-domain.com;

    # SSL Configuration (Let's Encrypt)
    ssl_certificate /etc/letsencrypt/live/your-domain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/your-domain.com/privkey.pem;

    # Security Headers
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-XSS-Protection "1; mode=block" always;

    # Proxy Settings
    location / {
        proxy_pass http://motia_backend;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Timeouts
        proxy_connect_timeout 60s;
        proxy_send_timeout 60s;
        proxy_read_timeout 60s;
    }

    # Access Log
    access_log /var/log/nginx/motia-access.log;
    error_log /var/log/nginx/motia-error.log;
}
```

### Enable and Restart

```bash
# Enable site
sudo ln -s /etc/nginx/sites-available/motia /etc/nginx/sites-enabled/

# Test configuration
sudo nginx -t

# Restart nginx
sudo systemctl restart nginx
```

### SSL Certificate (Let's Encrypt)

```bash
# Install certbot
sudo apt-get install -y certbot python3-certbot-nginx

# Obtain certificate
sudo certbot --nginx -d your-domain.com

# Auto-renewal is configured automatically
```

## Firewall Configuration

```bash
# Allow SSH
sudo ufw allow 22/tcp

# Allow HTTP/HTTPS (if using nginx)
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp

# Enable firewall
sudo ufw enable
```

**Important**: Do NOT expose port 3000 publicly (only via the nginx reverse proxy).

## Monitoring

### systemd Service Status

```bash
# Show status
sudo systemctl status motia.service

# Is it enabled?
sudo systemctl is-enabled motia.service

# Is it active?
sudo systemctl is-active motia.service
```

### Logs

```bash
# Live logs
sudo journalctl -u motia.service -f

# Last 100 lines
sudo journalctl -u motia.service -n 100

# Since today
sudo journalctl -u motia.service --since today

# Filter by priority (errors only)
sudo journalctl -u motia.service -p err
```

### Resource Usage

```bash
# CPU and memory
sudo systemctl status motia.service

# Detailed process info
ps aux | grep motia

# Memory usage
sudo pmap $(pgrep -f "motia start") | tail -n 1
```

### Redis Monitoring

```bash
# Connect to Redis
redis-cli

# Show server info
INFO

# Show database sizes
INFO keyspace

# Monitor commands (real-time)
MONITOR

# Show memory usage of a key
MEMORY USAGE <key>
```

## Backup Strategy

### Application Code

```bash
# Git-based backup
cd /opt/motia-app/bitbylaw
git pull origin main

# Or: rsync backup
rsync -av /opt/motia-app/bitbylaw/ /backup/motia-app/
```

### Redis Data

```bash
# RDB snapshot (automatic by Redis)
# Location: /var/lib/redis/dump.rdb

# Manual backup
sudo cp /var/lib/redis/dump.rdb /backup/redis-dump-$(date +%Y%m%d).rdb

# Restore
sudo systemctl stop redis-server
sudo cp /backup/redis-dump-20260207.rdb /var/lib/redis/dump.rdb
sudo chown redis:redis /var/lib/redis/dump.rdb
sudo systemctl start redis-server
```

### Configuration

```bash
# Backup systemd service
sudo cp /etc/systemd/system/motia.service /backup/motia.service

# Backup nginx config
sudo cp /etc/nginx/sites-available/motia /backup/nginx-motia.conf

# Backup service account
sudo cp /opt/motia-app/service-account.json /backup/service-account.json.backup
```

## Updates & Maintenance

### Application Update

```bash
# 1. Pull latest code
cd /opt/motia-app/bitbylaw
sudo -u www-data git pull origin main

# 2. Update dependencies
sudo -u www-data npm install
sudo -u www-data bash -c 'source python_modules/bin/activate && pip install -r requirements.txt'

# 3. Restart service
sudo systemctl restart motia.service

# 4. Verify
sudo journalctl -u motia.service -f
```

### Zero-Downtime Deployment

For a future blue-green deployment implementation:

```bash
# 1. Deploy to staging directory
# 2. Run health checks
# 3. Switch symlink
# 4. Reload service
# 5. Rollback if issues
```
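The symlink switch in step 3 could be sketched as follows. This is a sketch only: the `releases/` and `current` directory layout is an assumption, not the current `/opt/motia-app/bitbylaw` setup.

```python
# Hypothetical blue-green switch; the releases/ + current layout is assumed.
import os

def switch_release(releases_dir: str, current_link: str, new_release: str) -> str:
    """Atomically repoint the 'current' symlink at the new release directory."""
    target = os.path.join(releases_dir, new_release)
    if not os.path.isdir(target):
        raise FileNotFoundError(f'release not found: {target}')
    tmp_link = current_link + '.tmp'
    if os.path.lexists(tmp_link):
        os.remove(tmp_link)
    os.symlink(target, tmp_link)
    # os.replace (rename) is atomic on POSIX: readers never see a missing link
    os.replace(tmp_link, current_link)
    return os.readlink(current_link)
```

After the switch, restarting `motia.service` picks up the new release; if the health checks in step 2 fail, pointing the symlink back at the previous release is the rollback.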

### Database Migrations

**Currently**: no database migrations (Redis only)

**Future** (PostgreSQL):
```bash
# Run migrations
python manage.py migrate
```

## Security Hardening

### File Permissions

```bash
# Application files
sudo chown -R www-data:www-data /opt/motia-app
sudo chmod 755 /opt/motia-app
sudo chmod 755 /opt/motia-app/bitbylaw

# Service account
sudo chmod 600 /opt/motia-app/service-account.json
sudo chown www-data:www-data /opt/motia-app/service-account.json

# No world-readable secrets
sudo find /opt/motia-app -type f -name "*.json" -exec chmod 600 {} \;
```

### Redis Security

```bash
# Edit Redis config
sudo nano /etc/redis/redis.conf

# Bind to localhost only
bind 127.0.0.1 ::1

# Disable dangerous commands (optional)
rename-command FLUSHDB ""
rename-command FLUSHALL ""
rename-command CONFIG ""

# Restart Redis
sudo systemctl restart redis-server
```

### systemd Hardening

Already included in the service file:
- `NoNewPrivileges=true` - prevents privilege escalation
- `PrivateTmp=true` - isolated /tmp
- User: `www-data` (non-root)

Additional options:
```ini
[Service]
ProtectSystem=strict
ProtectHome=true
ReadWritePaths=/opt/motia-app
```

## Disaster Recovery

### Service Crashed

```bash
# Check status
sudo systemctl status motia.service

# View logs
sudo journalctl -u motia.service -n 100

# Restart
sudo systemctl restart motia.service

# If still failing, check:
# - Redis is running
# - Service account file exists
# - Environment variables are set
```

### Redis Data Loss

```bash
# Restore from backup
sudo systemctl stop redis-server
sudo cp /backup/redis-dump-latest.rdb /var/lib/redis/dump.rdb
sudo chown redis:redis /var/lib/redis/dump.rdb
sudo systemctl start redis-server

# Clear specific data if corrupted
redis-cli -n 1 FLUSHDB  # Advoware cache
redis-cli -n 2 FLUSHDB  # Calendar sync
```

### Complete System Failure

```bash
# 1. Fresh server setup (see Installation Steps)
# 2. Restore application code from Git/backup
# 3. Restore configuration (systemd, nginx)
# 4. Restore service-account.json
# 5. Restore Redis data (optional, will rebuild)
# 6. Start services
```

## Performance Tuning

### Node.js Memory

In the systemd service:
```ini
Environment=NODE_OPTIONS=--max-old-space-size=8192  # 8GB
```

### Redis Memory

In `/etc/redis/redis.conf`:
```
maxmemory 2gb
maxmemory-policy allkeys-lru
```

### Linux Kernel

```bash
# Increase file descriptors
echo "fs.file-max = 65536" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

# For the www-data user
sudo nano /etc/security/limits.conf
# Add:
www-data soft nofile 65536
www-data hard nofile 65536
```

## Health Checks

### Automated Monitoring

Cron job for health checks:

```bash
#!/bin/bash
# /usr/local/bin/motia-health-check.sh
if ! systemctl is-active --quiet motia.service; then
    echo "Motia service is down!" | mail -s "ALERT: Motia Down" admin@example.com
    systemctl start motia.service
fi
```

```bash
# Add to crontab
sudo crontab -e
# Add this line:
*/5 * * * * /usr/local/bin/motia-health-check.sh
```

### External Monitoring

Services such as Uptime Robot, Pingdom, etc. can be used:
- HTTP endpoint: `https://your-domain.com/health`
- Check interval: 5 minutes
- Alerts via email/SMS
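The `/health` endpoint is marked elsewhere in this guide as not necessarily implemented yet. A minimal Motia API step serving it might look like this — a sketch following the step structure documented in DEVELOPMENT.md, not shipped code (the `monitoring` flow name is an assumption):

```python
# Sketch of a /health API step; not part of the current codebase.
config = {
    'type': 'api',
    'name': 'Health Check',
    'description': 'Liveness endpoint for external monitoring',
    'path': '/health',
    'method': 'GET',
    'emits': [],
    'flows': ['monitoring']   # assumed flow name
}

async def handler(req, context):
    # Report basic liveness; a fuller version could also ping Redis here.
    return {'status': 200, 'body': {'status': 'ok'}}
```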

## Rollback Procedure

```bash
# 1. Stop current service
sudo systemctl stop motia.service

# 2. Revert to previous version
cd /opt/motia-app/bitbylaw
sudo -u www-data git log  # Find previous commit
sudo -u www-data git reset --hard <commit-hash>

# 3. Restore dependencies (if needed)
sudo -u www-data npm install

# 4. Start service
sudo systemctl start motia.service

# 5. Verify
sudo journalctl -u motia.service -f
```

## Related Documentation

- [Architecture](ARCHITECTURE.md)
- [Configuration](CONFIGURATION.md)
- [Troubleshooting](TROUBLESHOOTING.md)

656
bitbylaw/docs/DEVELOPMENT.md
Normal file
@@ -0,0 +1,656 @@

# Development Guide

## Setup

### Prerequisites

- **Node.js**: 18.x or higher
- **Python**: 3.13 or higher
- **Redis**: 6.x or higher
- **Git**: for version control
- **Motia CLI**: installed automatically via npm

### Initial Setup

```bash
# 1. Navigate to the repository
cd /opt/motia-app/bitbylaw

# 2. Install Node.js dependencies
npm install

# 3. Create the Python virtual environment (if not present)
python3.13 -m venv python_modules

# 4. Activate the virtual environment
source python_modules/bin/activate

# 5. Install Python dependencies
pip install -r requirements.txt

# 6. Start Redis (if not running)
sudo systemctl start redis-server

# 7. Configure environment variables (see CONFIGURATION.md)
# Create a .env file or set them in systemd

# 8. Start development mode
npm run dev
```

### Development Environment

**Recommended IDE**: VS Code with extensions:
- Python (Microsoft)
- TypeScript (built-in)
- ESLint
- Prettier

**VS Code Settings** (`.vscode/settings.json`):
```json
{
    "python.defaultInterpreterPath": "${workspaceFolder}/python_modules/bin/python",
    "python.linting.enabled": true,
    "python.linting.pylintEnabled": false,
    "python.linting.flake8Enabled": true,
    "editor.formatOnSave": true,
    "files.exclude": {
        "**/__pycache__": true,
        "**/node_modules": true
    }
}
```

## Project Structure

```
bitbylaw/
├── docs/                     # Documentation
│   ├── ARCHITECTURE.md       # System architecture
│   ├── DEVELOPMENT.md        # This guide
│   ├── API.md                # API reference
│   ├── CONFIGURATION.md      # Environment & config
│   ├── DEPLOYMENT.md         # Deployment guide
│   └── TROUBLESHOOTING.md    # Troubleshooting
├── steps/                    # Motia steps (business logic)
│   ├── advoware_proxy/       # API proxy steps
│   │   ├── README.md         # Module documentation
│   │   ├── *.py              # Step implementations
│   │   └── *.md              # Per-step documentation
│   ├── advoware_cal_sync/    # Calendar sync steps
│   │   ├── README.md
│   │   ├── *.py
│   │   └── *.md
│   └── vmh/                  # VMH webhook steps
│       ├── README.md
│       ├── webhook/          # Webhook receiver
│       └── *.py
├── services/                 # Shared services
│   └── advoware.py           # Advoware API client
├── config.py                 # Configuration loader
├── package.json              # Node.js dependencies
├── requirements.txt          # Python dependencies
├── tsconfig.json             # TypeScript config
├── motia-workbench.json      # Motia flow definitions
└── README.md                 # Project overview
```

### Conventions

**Directories**:
- `steps/` - Motia steps (handler functions)
- `services/` - reusable service layer
- `docs/` - documentation
- `python_modules/` - Python virtual environment (do not commit)
- `node_modules/` - Node.js dependencies (do not commit)

**File names**:
- Steps: `{module}_{action}_step.py` (e.g. `calendar_sync_cron_step.py`)
- Services: `{service_name}.py` (e.g. `advoware.py`)
- Documentation: `{STEP_NAME}.md` or `{TOPIC}.md`

## Coding Standards

### Python

**Style Guide**: PEP 8 with the following adjustments:
- Line length: 120 characters (instead of 79)
- String quotes: single quotes preferred

**Linting**:
```bash
# Flake8 check
flake8 steps/ services/

# Autopep8 formatting
autopep8 --in-place --aggressive --aggressive steps/**/*.py
```

**Type Hints**:
```python
from typing import Dict, List, Optional, Any

async def handler(req: Dict[str, Any], context: Any) -> Dict[str, Any]:
    pass
```

**Docstrings**:
```python
def function_name(param1: str, param2: int) -> bool:
    """
    Brief description of the function.

    Args:
        param1: Description of param1
        param2: Description of param2

    Returns:
        Description of the return value

    Raises:
        ValueError: When something goes wrong
    """
    pass
```

### TypeScript/JavaScript

**Style Guide**: Standard with Motia conventions

**Formatting**: Prettier (automatic via Motia)

### Naming Conventions

- **Variables**: `snake_case` (Python), `camelCase` (TypeScript)
- **Constants**: `UPPER_CASE`
- **Classes**: `PascalCase`
- **Functions**: `snake_case` (Python), `camelCase` (TypeScript)
- **Files**: `snake_case.py`, `kebab-case.ts`

### Error Handling

**Pattern**:
```python
async def handler(req, context):
    try:
        # Main logic
        result = await some_operation()
        return {'status': 200, 'body': {'result': result}}

    except SpecificError as e:
        # Handle known errors
        context.logger.error(f"Specific error: {e}")
        return {'status': 400, 'body': {'error': 'Bad request'}}

    except Exception as e:
        # Catch-all
        context.logger.error(f"Unexpected error: {e}", exc_info=True)
        return {'status': 500, 'body': {'error': 'Internal error'}}
```

**Logging**:
```python
# Use context.logger for Motia Workbench integration
context.logger.debug("Detailed information")
context.logger.info("Normal operation")
context.logger.warning("Warning message")
context.logger.error("Error message", exc_info=True)  # Include stack trace
```

## Motia Step Development

### Step Structure

Every step must have:
1. **Config dictionary**: defines the step metadata
2. **Handler function**: implements the business logic

**Minimal Example**:
```python
config = {
    'type': 'api',                  # api|event|cron
    'name': 'My API Step',
    'description': 'Brief description',
    'path': '/api/my-endpoint',     # For API steps
    'method': 'GET',                # For API steps
    'schedule': '0 2 * * *',        # For cron steps
    'emits': ['topic.name'],        # Events this step emits
    'subscribes': ['other.topic'],  # Events this step subscribes to (event steps)
    'flows': ['my-flow']            # Flow membership
}

async def handler(req, context):
    """Handler function - must be async."""
    # req: request object (API) or event data (event step)
    # context: Motia context (logger, emit, etc.)

    # Business logic here

    # For API steps: return an HTTP response
    return {'status': 200, 'body': {'result': 'success'}}

    # For event steps: no return value (or None)
```

### Step Types

**1. API Steps** (`type: 'api'`):
```python
config = {
    'type': 'api',
    'name': 'My Endpoint',
    'path': '/api/resource',
    'method': 'POST',
    'emits': [],
    'flows': ['main']
}

async def handler(req, context):
    # Access request data
    body = req.get('body')
    query_params = req.get('queryParams')
    headers = req.get('headers')

    # Return HTTP response
    return {
        'status': 200,
        'body': {'data': 'response'},
        'headers': {'X-Custom': 'value'}
    }
```

**2. Event Steps** (`type: 'event'`):
```python
config = {
    'type': 'event',
    'name': 'Process Event',
    'subscribes': ['my.topic'],
    'emits': ['other.topic'],
    'flows': ['main']
}

async def handler(event_data, context):
    # Process event
    entity_id = event_data.get('entity_id')

    # Emit new event
    await context.emit({
        'topic': 'other.topic',
        'data': {'processed': True}
    })

    # No return value needed
```

**3. Cron Steps** (`type: 'cron'`):
```python
config = {
    'type': 'cron',
    'name': 'Daily Job',
    'schedule': '0 2 * * *',  # Cron expression
    'emits': ['job.complete'],
    'flows': ['main']
}

async def handler(req, context):
    # Scheduled logic
    context.logger.info("Cron job triggered")

    # Emit event to start pipeline
    await context.emit({
        'topic': 'job.complete',
        'data': {}
    })
```

### Context API

**Available Methods**:
```python
# Logging
context.logger.debug(msg)
context.logger.info(msg)
context.logger.warning(msg)
context.logger.error(msg, exc_info=True)

# Event emission
await context.emit({
    'topic': 'my.topic',
    'data': {'key': 'value'}
})

# Flow information
context.flow_id    # Current flow ID
context.step_name  # Current step name
```

## Testing

### Unit Tests

**Location**: tests live next to the code (e.g. `*_test.py`)

**Framework**: pytest

```python
# test_my_step.py
import pytest
from unittest.mock import AsyncMock, MagicMock
from my_step import handler, config

@pytest.mark.asyncio
async def test_handler_success():
    # Arrange
    req = {'body': {'key': 'value'}}
    context = MagicMock()
    context.logger = MagicMock()

    # Act
    result = await handler(req, context)

    # Assert
    assert result['status'] == 200
    assert 'result' in result['body']
```

**Run Tests**:
```bash
pytest steps/
```

### Integration Tests

**Manual testing with curl**:

```bash
# Test an API step
curl -X POST "http://localhost:3000/api/my-endpoint" \
  -H "Content-Type: application/json" \
  -d '{"key": "value"}'

# With query parameters
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees"
```

**Motia Workbench**: use the Workbench UI for testing and debugging

### Test Data

**Redis Mock Data**:
```bash
# Set test token
redis-cli -n 1 SET advoware_access_token "test_token" EX 3600

# Set test lock
redis-cli -n 1 SET "calendar_sync:lock:TEST" "1" EX 300

# Check dedup set
redis-cli -n 1 SMEMBERS "vmh:beteiligte:create_pending"
```

## Debugging

### Local Development

**Start in Dev Mode**:
```bash
npm run dev
```

**Enable Debug Logging**:
```bash
export MOTIA_LOG_LEVEL=debug
npm start
```

**Node.js Inspector**:
```bash
# Already enabled in systemd (--inspect)
# Connect with Chrome DevTools: chrome://inspect
```

### Motia Workbench

**Access**: `http://localhost:3000/workbench` (if available)

**Features**:
- Live logs
- Flow visualization
- Event traces
- Step execution history

### Redis Debugging

```bash
# Connect to Redis
redis-cli

# Switch database
SELECT 1

# List all keys
KEYS *

# Get value
GET advoware_access_token

# Check SET members
SMEMBERS vmh:beteiligte:create_pending

# Monitor live commands
MONITOR
```

---

## Utility Scripts

### Calendar Sync Utilities

Helper scripts for maintaining and debugging the calendar-sync functionality.

**Location**: `scripts/calendar_sync/`

**Available scripts**:

```bash
# Delete all employee locks in Redis (for stuck syncs)
python3 scripts/calendar_sync/delete_employee_locks.py

# Delete all Google calendars (except the primary one) - CAUTION!
python3 scripts/calendar_sync/delete_all_calendars.py
```

**Use Cases**:
- **Lock cleanup**: when a sync process crashed and its locks were not cleaned up
- **Calendar reset**: after a faulty synchronization or for tests
- **Debugging**: investigating sync problems

**Documentation**: [scripts/calendar_sync/README.md](../scripts/calendar_sync/README.md)

**⚠️ Important**:
- Always stop the Motia service before cleanup: `sudo systemctl stop motia`
- Restart the service after cleanup: `sudo systemctl start motia`
- `delete_all_calendars.py` irreversibly deletes all calendars!
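The lock cleanup can be sketched as a SCAN-and-delete over the lock keys. The `calendar_sync:lock:*` pattern matches the test data shown earlier in this guide; the real `delete_employee_locks.py` may differ in detail:

```python
# Sketch of the lock cleanup; the actual script may differ.
def delete_locks(r, pattern: str = 'calendar_sync:lock:*') -> int:
    """r: a redis.Redis client. Uses SCAN (non-blocking), not KEYS."""
    deleted = 0
    for key in r.scan_iter(match=pattern):
        deleted += r.delete(key)
    return deleted

# Usage against the calendar-sync database (db from REDIS_DB_CALENDAR_SYNC):
#   import redis
#   r = redis.Redis(host='localhost', port=6379, db=2)
#   print(f'{delete_locks(r)} locks deleted')
```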

---

### Common Issues

**1. Import Errors**:
```bash
# Ensure PYTHONPATH is set
export PYTHONPATH=/opt/motia-app/bitbylaw
source python_modules/bin/activate
```

**2. Redis Connection Errors**:
```bash
# Check Redis is running
sudo systemctl status redis-server

# Test connection
redis-cli ping
```

**3. Token Errors**:
```bash
# Clear cached token
redis-cli -n 1 DEL advoware_access_token advoware_token_timestamp
```

## Git Workflow

### Branch Strategy

- `main` - Production code
- `develop` - Integration branch
- `feature/*` - Feature branches
- `fix/*` - Bugfix branches

### Commit Messages

**Format**: `<type>: <subject>`

**Types**:
- `feat`: New feature
- `fix`: Bug fix
- `docs`: Documentation only
- `refactor`: Code refactoring
- `test`: Adding tests
- `chore`: Maintenance tasks

**Examples**:
```
feat: add calendar sync retry logic
fix: prevent duplicate webhook processing
docs: update API documentation
refactor: extract common validation logic
```

### Pull Request Process

1. Create feature branch from `develop`
2. Implement changes
3. Write/update tests
4. Update documentation
5. Create PR with description
6. Code review
7. Merge to `develop`
8. Deploy to staging
9. Merge to `main` (production)

## Performance Optimization

### Profiling

**Python Memory Profiling**:
```bash
# Install memory_profiler
pip install memory_profiler

# Profile a function
python -m memory_profiler steps/my_step.py
```

**Node.js Profiling**:
```bash
# Already enabled with --inspect flag
# Use Chrome DevTools Performance tab
```

### Best Practices

**Async/Await**:
```python
# Good: Concurrent requests
results = await asyncio.gather(
    fetch_data_1(),
    fetch_data_2()
)

# Bad: Sequential (slow)
result1 = await fetch_data_1()
result2 = await fetch_data_2()
```

**Redis Pipelining**:
```python
# Good: Batch operations
pipe = redis.pipeline()
pipe.get('key1')
pipe.get('key2')
results = pipe.execute()

# Bad: Multiple round-trips
val1 = redis.get('key1')
val2 = redis.get('key2')
```

**Avoid N+1 Queries**:
```python
# Good: Batch fetch
employee_ids = [1, 2, 3]
employees = await advoware.api_call(
    '/employees',
    params={'ids': ','.join(map(str, employee_ids))}
)

# Bad: Loop with API calls
employees = []
for emp_id in employee_ids:
    emp = await advoware.api_call(f'/employees/{emp_id}')
    employees.append(emp)
```
|
||||
|
||||
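The gather-vs-sequential difference is easy to demonstrate with a self-contained timing experiment (`fetch` here is a placeholder for an I/O-bound API call):

```python
import asyncio
import time

async def fetch(name: str, delay: float) -> str:
    # stand-in for an I/O-bound API call
    await asyncio.sleep(delay)
    return name

async def main():
    t0 = time.monotonic()
    await asyncio.gather(fetch("a", 0.1), fetch("b", 0.1))
    t_concurrent = time.monotonic() - t0

    t0 = time.monotonic()
    await fetch("a", 0.1)
    await fetch("b", 0.1)
    t_sequential = time.monotonic() - t0

    print(f"gather: {t_concurrent:.2f}s, sequential: {t_sequential:.2f}s")
    return t_concurrent, t_sequential

t_c, t_s = asyncio.run(main())
```

With two 0.1 s waits, `gather` finishes in roughly 0.1 s while the sequential version needs roughly 0.2 s.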
## Code Review Checklist

- [ ] Code follows style guide
- [ ] Type hints present (Python)
- [ ] Error handling implemented
- [ ] Logging added at key points
- [ ] Tests written/updated
- [ ] Documentation updated
- [ ] No secrets in code
- [ ] Performance considered
- [ ] Redis keys documented
- [ ] Events documented

## Deployment

See [DEPLOYMENT.md](DEPLOYMENT.md) for detailed deployment instructions.

**Quick Deploy to Production**:
```bash
# 1. Pull latest code
git pull origin main

# 2. Install dependencies
npm install
pip install -r requirements.txt

# 3. Restart service
sudo systemctl restart motia.service

# 4. Check status
sudo systemctl status motia.service

# 5. Monitor logs
sudo journalctl -u motia.service -f
```

## Resources

### Documentation
- [Motia Framework](https://motia.dev)
- [Advoware API](docs/advoware/) (internal)
- [Google Calendar API](https://developers.google.com/calendar)

### Tools
- [Redis Commander](http://localhost:8081) (if installed)
- [Motia Workbench](http://localhost:3000/workbench)

### Team Contacts
- Architecture Questions: [Lead Developer]
- Deployment Issues: [DevOps Team]
- API Access: [Integration Team]
92
bitbylaw/docs/GOOGLE_SETUP.md
Normal file
@@ -0,0 +1,92 @@
# Google Service Account Setup for Advoware Calendar Sync

## Overview
This calendar sync authenticates **exclusively via Google service accounts**. No OAuth, no browser interaction - ideal for server environments.

## Prerequisites
- Google Cloud Console access
- Permission to create service accounts
- (Optional) Google Workspace admin access for domain-wide delegation

## Step 1: Open the Google Cloud Console
1. Go to: https://console.cloud.google.com/
2. Sign in with your Google account
3. Select an existing project or create a new one

## Step 2: Enable the Google Calendar API
1. Click "APIs & Services" → "Library"
2. Search for "Google Calendar API"
3. Click "Google Calendar API" → "Enable"

## Step 3: Create a Service Account
1. Go to "IAM & Admin" → "Service Accounts"
2. Click "+ Create Service Account"
3. Basic details:
   - **Service account name**: `advoware-calendar-sync`
   - **Description**: `Service account for Advoware-Google Calendar synchronization`
   - **Email**: generated automatically
4. Click "Create and Continue"

## Step 4: Assign Permissions
1. **Assign a role**: choose one of the following:
   - For full access: `Editor`
   - For restricted access: `Calendar API Admin` (if available)
2. Click "Done"

## Step 5: Create and Install the JSON Key
1. Click the newly created service account
2. Open the "Keys" tab
3. Click "Add Key" → "Create new key"
4. Select "JSON" as the key type
5. Click "Create"
6. The JSON file downloads automatically
7. **Rename the file to: `service-account.json`**
8. **Copy the file to: `/opt/motia-app/service-account.json`**
9. **Set restrictive permissions:**
```bash
chmod 600 /opt/motia-app/service-account.json
```

## Step 6: Domain-wide Delegation (Google Workspace only)
If you use Google Workspace and want to access other users' calendars:

1. Go back to the service account
2. Enable "Google Workspace Domain-wide Delegation"
3. Note the service account's "Unique ID"
4. Go to the Google Admin Console: https://admin.google.com/
5. "Security" → "API Controls"
6. "Domain-wide Delegation" → "Manage API clients"
7. Add the Unique ID
8. Scope: `https://www.googleapis.com/auth/calendar`

## Step 7: Testing
After setup, you can test the calendar sync:

```bash
# Sync full appointment details
curl -X POST http://localhost:3000/api/flows/advoware_cal_sync \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'

# Sync only "blocked" appointments (fewer details)
curl -X POST http://localhost:3000/api/flows/advoware_cal_sync \
  -H "Content-Type: application/json" \
  -d '{"full_content": false}'
```

## Important Notes
- ✅ **No browser required** - runs entirely server-side
- ✅ **Automatic** - set up once, runs indefinitely
- ✅ **Secure** - service accounts support granular permissions
- ✅ **Scalable** - well suited for production environments

## Troubleshooting
- **"service-account.json not found"**: Check the path `/opt/motia-app/service-account.json`
- **"Access denied"**: Check the service account's permissions
- **"API not enabled"**: Make sure the Calendar API is enabled
- **"Invalid credentials"**: Check the service-account.json file

## Security
- Keep the `service-account.json` file safe and never commit it to version control
- Prefer IAM roles over JSON keys in GCP environments
- Rotate service account keys regularly
194
bitbylaw/docs/INDEX.md
Normal file
@@ -0,0 +1,194 @@
# Documentation Index

## Getting Started

**New to the project?** Start here:

1. [README.md](../README.md) - Project Overview & Quick Start
2. [DEVELOPMENT.md](DEVELOPMENT.md) - Setup Development Environment
3. [CONFIGURATION.md](CONFIGURATION.md) - Configure Environment Variables

## Core Documentation

### For Developers
- **[DEVELOPMENT.md](DEVELOPMENT.md)** - Complete development guide
  - Setup, Coding Standards, Testing, Debugging
- **[ARCHITECTURE.md](ARCHITECTURE.md)** - System design and architecture
  - Components, Data Flow, Event-Driven Design
- **[API.md](API.md)** - HTTP Endpoints and Event Topics
  - Proxy API, Calendar Sync API, Webhook Endpoints

### For Operations
- **[DEPLOYMENT.md](DEPLOYMENT.md)** - Production deployment
  - Installation, systemd, nginx, Monitoring
- **[CONFIGURATION.md](CONFIGURATION.md)** - Environment configuration
  - All environment variables, secrets management
- **[TROUBLESHOOTING.md](TROUBLESHOOTING.md)** - Problem solving
  - Common issues, debugging, log analysis

### Special Topics
- **[GOOGLE_SETUP.md](GOOGLE_SETUP.md)** - Google Service Account setup
  - Step-by-step guide for Calendar API access

## Component Documentation

### Steps (Business Logic)

**Advoware Proxy** ([Module README](../steps/advoware_proxy/README.md)):
- [advoware_api_proxy_get_step.md](../steps/advoware_proxy/advoware_api_proxy_get_step.md)
- [advoware_api_proxy_post_step.md](../steps/advoware_proxy/advoware_api_proxy_post_step.md)
- [advoware_api_proxy_put_step.md](../steps/advoware_proxy/advoware_api_proxy_put_step.md)
- [advoware_api_proxy_delete_step.md](../steps/advoware_proxy/advoware_api_proxy_delete_step.md)

**Calendar Sync** ([Module README](../steps/advoware_cal_sync/README.md)):
- [calendar_sync_cron_step.md](../steps/advoware_cal_sync/calendar_sync_cron_step.md) - Daily trigger
- [calendar_sync_api_step.md](../steps/advoware_cal_sync/calendar_sync_api_step.md) - Manual trigger
- [calendar_sync_all_step.md](../steps/advoware_cal_sync/calendar_sync_all_step.md) - Employee cascade
- [calendar_sync_event_step.md](../steps/advoware_cal_sync/calendar_sync_event_step.md) - Per-employee sync (complex)

**VMH Webhooks** ([Module README](../steps/vmh/README.md)):
- [beteiligte_create_api_step.md](../steps/vmh/webhook/beteiligte_create_api_step.md) - Create webhook
- [beteiligte_update_api_step.md](../steps/vmh/webhook/beteiligte_update_api_step.md) - Update webhook (similar)
- [beteiligte_delete_api_step.md](../steps/vmh/webhook/beteiligte_delete_api_step.md) - Delete webhook (similar)
- [beteiligte_sync_event_step.md](../steps/vmh/beteiligte_sync_event_step.md) - Sync handler (placeholder)

### Services

- [Advoware Service](../services/ADVOWARE_SERVICE.md) - API client with HMAC-512 auth
- [Advoware API Swagger](advoware/advoware_api_swagger.json) - Complete API documentation (JSON)

### Utility Scripts

- [Calendar Sync Scripts](../scripts/calendar_sync/README.md) - Maintenance and debugging
  - `delete_employee_locks.py` - Redis lock cleanup
  - `delete_all_calendars.py` - Google Calendar reset

---

## Documentation Structure

```
docs/
├── INDEX.md                      # This file
├── ARCHITECTURE.md               # System design
├── API.md                        # API reference
├── CONFIGURATION.md              # Configuration
├── DEPLOYMENT.md                 # Deployment guide
├── DEVELOPMENT.md                # Development guide
├── GOOGLE_SETUP.md               # Google Calendar setup
├── TROUBLESHOOTING.md            # Debugging guide
└── advoware/
    └── advoware_api_swagger.json # Advoware API spec

steps/{module}/
├── README.md                     # Module overview
└── {step_name}.md                # Step documentation

services/
└── {service_name}.md             # Service documentation

scripts/{category}/
├── README.md                     # Script documentation
└── *.py                          # Utility scripts
```

## Documentation Standards

### YAML Frontmatter

Each step documentation includes metadata:

```yaml
---
type: step
category: api|event|cron
name: Step Name
version: 1.0.0
status: active|deprecated|placeholder
tags: [tag1, tag2]
dependencies: [...]
emits: [...]
subscribes: [...]
---
```
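A documentation review can check this metadata automatically. The sketch below extracts flat `key: value` pairs from a leading `---` block and reports missing required keys; a real validator would use PyYAML, and the set of required keys is an assumption based on the template above:

```python
REQUIRED_KEYS = {"type", "category", "name", "version", "status"}

def parse_frontmatter(text: str) -> dict:
    """Extract flat `key: value` pairs from a leading `---` block.
    (Demo-only parser; nested YAML needs PyYAML.)"""
    lines = text.strip().splitlines()
    if not lines or lines[0] != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line == "---":
            break
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

doc = """---
type: step
category: cron
name: Calendar Sync Cron
version: 1.0.0
status: active
---
# Body
"""
meta = parse_frontmatter(doc)
missing = REQUIRED_KEYS - meta.keys()
print(sorted(missing))  # []
```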
### Sections

Standard sections in step documentation:

1. **Zweck** - Purpose (one sentence)
2. **Config** - Motia step configuration
3. **Input** - Request structure, parameters
4. **Output** - Response structure
5. **Verhalten** - Behavior, logic flow
6. **Abhängigkeiten** - Dependencies (services, Redis, APIs)
7. **Testing** - Test examples
8. **KI Guidance** - Tips for AI assistants

### Cross-References

- Use relative paths for links
- Link related steps and services
- Link to parent module READMEs

## Quick Reference

### Common Tasks

| Task | Documentation |
|------|---------------|
| Setup development environment | [DEVELOPMENT.md](DEVELOPMENT.md#setup) |
| Configure environment variables | [CONFIGURATION.md](CONFIGURATION.md) |
| Deploy to production | [DEPLOYMENT.md](DEPLOYMENT.md#installation-steps) |
| Setup Google Calendar | [GOOGLE_SETUP.md](GOOGLE_SETUP.md) |
| Debug service issues | [TROUBLESHOOTING.md](TROUBLESHOOTING.md#service-issues) |
| Understand architecture | [ARCHITECTURE.md](ARCHITECTURE.md) |
| Test API endpoints | [API.md](API.md) |

### Code Locations

| Component | Location | Documentation |
|-----------|----------|---------------|
| API Proxy Steps | `steps/advoware_proxy/` | [README](../steps/advoware_proxy/README.md) |
| Calendar Sync Steps | `steps/advoware_cal_sync/` | [README](../steps/advoware_cal_sync/README.md) |
| VMH Webhook Steps | `steps/vmh/` | [README](../steps/vmh/README.md) |
| Advoware API Client | `services/advoware.py` | [DOC](../services/ADVOWARE_SERVICE.md) |
| Configuration | `config.py` | [CONFIGURATION.md](CONFIGURATION.md) |

## Contributing to Documentation

### Adding New Step Documentation

1. Create `{step_name}.md` next to the `.py` file
2. Use YAML frontmatter (see template)
3. Follow the standard sections
4. Add it to the module README
5. Add it to this INDEX

### Updating Documentation

- Keep code and docs in sync
- Update version history in step docs
- Update INDEX when adding new files
- Test all code examples

### Documentation Reviews

- Verify all links work
- Check code examples execute correctly
- Ensure terminology is consistent
- Validate configuration examples

## External Resources

- [Motia Framework Docs](https://motia.dev) (if available)
- [Advoware API](https://www2.advo-net.net:90/) (requires auth)
- [Google Calendar API](https://developers.google.com/calendar)
- [Redis Documentation](https://redis.io/documentation)

## Support

- **Questions**: Check TROUBLESHOOTING.md first
- **Bugs**: Document in logs (`journalctl -u motia.service`)
- **Features**: Propose in team discussions
- **Urgent**: Check systemd logs and Redis state
800
bitbylaw/docs/TROUBLESHOOTING.md
Normal file
@@ -0,0 +1,800 @@
# Troubleshooting Guide

## Service Issues

### Service Won't Start

**Symptoms**: `systemctl start motia.service` fails

**Diagnosis**:
```bash
# Check service status
sudo systemctl status motia.service

# View detailed logs
sudo journalctl -u motia.service -n 100 --no-pager

# Check for port conflicts
sudo netstat -tlnp | grep 3000
```

**Common causes**:

1. **Port 3000 already in use**:
   ```bash
   # Find the process
   sudo lsof -i :3000

   # Kill the process
   sudo kill -9 <PID>
   ```

2. **Missing dependencies**:
   ```bash
   cd /opt/motia-app/bitbylaw
   sudo -u www-data npm install
   sudo -u www-data bash -c 'source python_modules/bin/activate && pip install -r requirements.txt'
   ```

3. **Wrong permissions**:
   ```bash
   sudo chown -R www-data:www-data /opt/motia-app
   sudo chmod 600 /opt/motia-app/service-account.json
   ```

4. **Missing environment variables**:
   ```bash
   # Check systemd environment
   sudo systemctl show motia.service -p Environment

   # Verify required vars
   sudo systemctl cat motia.service | grep Environment
   ```
### Service Keeps Crashing

**Symptoms**: The service starts but crashes shortly afterwards

**Diagnosis**:
```bash
# Watch logs in real-time
sudo journalctl -u motia.service -f

# Check for OOM (Out of Memory)
dmesg | grep -i "out of memory"
sudo grep -i "killed process" /var/log/syslog
```

**Solutions**:

1. **Increase the memory limit**:
   ```ini
   # In /etc/systemd/system/motia.service
   Environment=NODE_OPTIONS=--max-old-space-size=8192
   ```

2. **Python memory leak**:
   ```bash
   # Check memory usage
   ps aux | grep python

   # Restart the service periodically (workaround)
   # Add to crontab:
   0 3 * * * systemctl restart motia.service
   ```

3. **Unhandled exception**:
   ```bash
   # Check error logs
   sudo journalctl -u motia.service -p err

   # Add try-except handling in the problematic step
   ```
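A defensive step handler logs failures with context and returns an error response instead of letting the exception kill the process. A minimal sketch, assuming a dict-in/dict-out handler shape (names like `handler` and `divisor` are hypothetical, not the Motia API):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("my_step")

def handler(payload: dict) -> dict:
    try:
        result = 100 / payload["divisor"]  # the risky work
        return {"status": 200, "result": result}
    except (KeyError, ZeroDivisionError) as exc:
        # Log with context so journalctl shows what failed,
        # then answer with an error instead of crashing the service.
        logger.error("step failed for payload %r: %s", payload, exc)
        return {"status": 500, "error": str(exc)}

print(handler({"divisor": 4}))
print(handler({"divisor": 0}))
```

Catching only the exceptions you expect (rather than a bare `except`) keeps genuine programming errors visible in `journalctl -u motia.service -p err`.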
## Redis Issues

### Redis Connection Failed

**Symptoms**: "Redis connection failed" in logs

**Diagnosis**:
```bash
# Check Redis status
sudo systemctl status redis-server

# Test connection
redis-cli ping

# Check config
redis-cli CONFIG GET bind
redis-cli CONFIG GET port
```

**Solutions**:

1. **Redis not running**:
   ```bash
   sudo systemctl start redis-server
   sudo systemctl enable redis-server
   ```

2. **Wrong host/port**:
   ```bash
   # Check environment
   echo $REDIS_HOST
   echo $REDIS_PORT

   # Test connection
   redis-cli -h $REDIS_HOST -p $REDIS_PORT ping
   ```

3. **Permission denied**:
   ```bash
   # Check Redis log
   sudo tail -f /var/log/redis/redis-server.log

   # Fix permissions
   sudo chown redis:redis /var/lib/redis
   sudo chmod 750 /var/lib/redis
   ```

### Redis Out of Memory

**Symptoms**: "OOM command not allowed" errors

**Diagnosis**:
```bash
# Check memory usage
redis-cli INFO memory

# Check maxmemory setting
redis-cli CONFIG GET maxmemory
```

**Solutions**:

1. **Increase maxmemory**:
   ```bash
   # In /etc/redis/redis.conf
   maxmemory 2gb
   maxmemory-policy allkeys-lru

   sudo systemctl restart redis-server
   ```

2. **Clear old data**:
   ```bash
   # Clear cache (safe: Advoware tokens are re-fetched on demand)
   redis-cli -n 1 FLUSHDB

   # Clear calendar sync state
   redis-cli -n 2 FLUSHDB
   ```

3. **Check for memory leaks**:
   ```bash
   # Find large keys
   redis-cli --bigkeys

   # Check specific key size
   redis-cli MEMORY USAGE <key>
   ```
## Advoware API Issues

### Authentication Failed

**Symptoms**: "401 Unauthorized" or "HMAC signature invalid"

**Diagnosis**:
```bash
# Check logs for auth errors
sudo journalctl -u motia.service | grep -i "auth\|token\|401"

# Test token fetch manually
python3 << 'EOF'
from services.advoware import AdvowareAPI
api = AdvowareAPI()
token = api.get_access_token(force_refresh=True)
print(f"Token: {token[:20]}...")
EOF
```

**Solutions**:

1. **Invalid API key**:
   ```bash
   # Verify the API key is Base64
   echo $ADVOWARE_API_KEY | base64 -d

   # Re-encode if needed
   echo -n "raw_key" | base64
   ```

2. **Wrong credentials**:
   ```bash
   # Verify environment variables
   sudo systemctl show motia.service -p Environment | grep ADVOWARE

   # Update in systemd service
   sudo nano /etc/systemd/system/motia.service
   sudo systemctl daemon-reload
   sudo systemctl restart motia.service
   ```

3. **Token expired**:
   ```bash
   # Clear cached token
   redis-cli -n 1 DEL advoware_access_token advoware_token_timestamp

   # Retry the request (a new token will be fetched)
   ```
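For background: the client signs requests with HMAC-512 (see the Advoware service docs). The exact message layout (field order, separators, timestamp handling) is dictated by the Advoware API and `services/advoware.py`, so the message built here is purely illustrative; only the HMAC-SHA512-over-a-Base64-key mechanics are shown:

```python
import base64
import hashlib
import hmac
import time

def sign_request(api_key_b64: str, method: str, path: str, body: str = "") -> str:
    """Illustrative HMAC-SHA512 request signature; the real message
    format is defined by the Advoware API, not by this sketch."""
    secret = base64.b64decode(api_key_b64)
    message = f"{method}\n{path}\n{body}\n{int(time.time())}"
    digest = hmac.new(secret, message.encode(), hashlib.sha512).digest()
    return base64.b64encode(digest).decode()

key = base64.b64encode(b"raw_key").decode()  # matches the re-encode step above
sig = sign_request(key, "GET", "/employees")
print(len(sig))  # a SHA-512 digest encodes to 88 Base64 characters
```

If signatures are rejected, a key that was stored raw instead of Base64-encoded is a common cause, which is what the `base64 -d` check above detects.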
### API Timeout

**Symptoms**: "Request timeout" or "API call failed"

**Diagnosis**:
```bash
# Check API response time
time curl "http://localhost:3000/advoware/proxy?endpoint=employees"

# Check network connectivity
ping www2.advo-net.net
curl -I https://www2.advo-net.net:90/
```

**Solutions**:

1. **Increase timeout**:
   ```bash
   # In environment
   export ADVOWARE_API_TIMEOUT_SECONDS=60

   # Or in systemd service
   Environment=ADVOWARE_API_TIMEOUT_SECONDS=60
   ```

2. **Network issues**:
   ```bash
   # Check firewall
   sudo ufw status

   # Test direct connection
   curl -v https://www2.advo-net.net:90/
   ```

3. **Advoware API down**:
   ```bash
   # Wait and retry
   # Implement exponential backoff in code
   ```
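Exponential backoff can be sketched as a small wrapper: retry on timeout, double the delay each attempt, add jitter to avoid thundering-herd retries, and re-raise once attempts are exhausted. This is a generic pattern, not code from the repo:

```python
import random
import time

def call_with_backoff(func, max_attempts: int = 5, base_delay: float = 0.5):
    """Retry `func` on TimeoutError with exponential backoff + jitter;
    re-raise the last error once attempts are exhausted."""
    for attempt in range(max_attempts):
        try:
            return func()
        except TimeoutError:
            if attempt == max_attempts - 1:
                raise
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Demo: fail twice, then succeed
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise TimeoutError("Advoware API timeout")
    return {"ok": True}

print(call_with_backoff(flaky, base_delay=0.01))  # {'ok': True}
```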
## Google Calendar Issues

### Service Account Not Found

**Symptoms**: "service-account.json not found"

**Diagnosis**:
```bash
# Check the file exists and its permissions
ls -la /opt/motia-app/service-account.json

# Check environment variable
echo $GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH
```

**Solutions**:

1. **File missing**:
   ```bash
   # Copy from backup
   sudo cp /backup/service-account.json /opt/motia-app/

   # Set permissions
   sudo chmod 600 /opt/motia-app/service-account.json
   sudo chown www-data:www-data /opt/motia-app/service-account.json
   ```

2. **Wrong path**:
   ```bash
   # Update environment
   # In /etc/systemd/system/motia.service:
   Environment=GOOGLE_CALENDAR_SERVICE_ACCOUNT_PATH=/opt/motia-app/service-account.json

   sudo systemctl daemon-reload
   sudo systemctl restart motia.service
   ```

### Calendar API Rate Limit

**Symptoms**: "403 Rate limit exceeded" or "429 Too Many Requests"

**Diagnosis**:
```bash
# Check rate limiting in logs
sudo journalctl -u motia.service | grep -i "rate\|403\|429"

# Check Redis rate limit tokens
redis-cli -n 2 GET google_calendar_api_tokens
```

**Solutions**:

1. **Wait for rate limit reset**:
   ```bash
   # Rate limit resets every minute
   # Wait 60 seconds and retry
   ```

2. **Adjust rate limit settings**:
   ```python
   # In calendar_sync_event_step.py
   MAX_TOKENS = 7  # Decrease if hitting limits
   REFILL_RATE_PER_MS = 7 / 1000
   ```

3. **Request a quota increase**:
   - Go to the Google Cloud Console
   - Navigate to "APIs & Services" → "Quotas"
   - Request an increase for the Calendar API
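The two constants above describe a token bucket: up to `MAX_TOKENS` requests may burst, and tokens refill continuously at `REFILL_RATE_PER_MS`. A minimal sketch of that mechanism (the in-memory clock is for the demo; the real implementation keeps its state in Redis DB 2):

```python
import time

MAX_TOKENS = 7
REFILL_RATE_PER_MS = 7 / 1000  # 7 tokens per second

class TokenBucket:
    """Allow a request only when a token is available; tokens refill
    continuously up to MAX_TOKENS."""
    def __init__(self, max_tokens=MAX_TOKENS, refill_per_ms=REFILL_RATE_PER_MS,
                 now_ms=lambda: time.monotonic() * 1000):
        self.max = max_tokens
        self.refill = refill_per_ms
        self.now_ms = now_ms
        self.tokens = float(max_tokens)
        self.last = now_ms()

    def try_acquire(self) -> bool:
        now = self.now_ms()
        self.tokens = min(self.max, self.tokens + (now - self.last) * self.refill)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Demo with a fake clock: a burst of MAX_TOKENS succeeds, the next call is throttled
clock = {"ms": 0.0}
bucket = TokenBucket(now_ms=lambda: clock["ms"])
burst = [bucket.try_acquire() for _ in range(MAX_TOKENS)]
print(burst, bucket.try_acquire())  # seven True values, then False
clock["ms"] += 1000                  # one second later the bucket is full again
print(bucket.try_acquire())          # True
```

Lowering `MAX_TOKENS` shrinks the allowed burst, which is why it helps when the Calendar API returns 403/429.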
### Calendar Access Denied

**Symptoms**: "Access denied" or "Insufficient permissions"

**Diagnosis**:
```bash
# Check service account email
python3 << 'EOF'
import json
with open('/opt/motia-app/service-account.json') as f:
    data = json.load(f)
print(f"Service Account: {data['client_email']}")
EOF

# Test API access
python3 << 'EOF'
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    '/opt/motia-app/service-account.json',
    scopes=['https://www.googleapis.com/auth/calendar']
)
service = build('calendar', 'v3', credentials=creds)
result = service.calendarList().list().execute()
print(f"Calendars: {len(result.get('items', []))}")
EOF
```

**Solutions**:

1. **Calendar not shared**:
   ```bash
   # Share the calendar with the service account email
   # In the Google Calendar UI: Settings → Share → add the service account email
   ```

2. **Wrong scopes**:
   ```bash
   # Verify scopes in code
   # Should be: https://www.googleapis.com/auth/calendar
   ```

3. **Domain-wide delegation**:
   ```bash
   # For Google Workspace, enable domain-wide delegation
   # See GOOGLE_SETUP.md
   ```
## Calendar Sync Issues

### Sync Not Running

**Symptoms**: No calendar updates, no sync logs

**Diagnosis**:
```bash
# Check if cron is triggering
sudo journalctl -u motia.service | grep -i "calendar_sync_cron"

# Manually trigger sync
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'

# Check for locks
redis-cli -n 1 KEYS "calendar_sync:lock:*"
```

**Solutions**:

1. **Cron not configured**:
   ```python
   # Verify calendar_sync_cron_step.py has the correct schedule
   config = {
       'schedule': '0 2 * * *',  # Daily at 2 AM
   }
   ```

2. **Lock stuck**:
   ```bash
   # Clear all locks
   python /opt/motia-app/bitbylaw/delete_employee_locks.py

   # Or manually
   redis-cli -n 1 DEL calendar_sync:lock:SB
   ```

3. **Errors in sync**:
   ```bash
   # Check error logs
   sudo journalctl -u motia.service -p err | grep calendar
   ```
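The `calendar_sync:lock:<kuerzel>` keys above follow the standard Redis `SET ... NX EX` lock pattern: only the first caller gets the lock, and the TTL guarantees a crashed sync cannot hold it forever. A sketch, demoed against an in-memory stub since no live Redis is assumed (the TTL value is illustrative; check `calendar_sync_event_step.py` for the real one):

```python
LOCK_TTL_SECONDS = 3600  # assumed TTL for the demo

def acquire_lock(client, kuerzel: str, ttl: int = LOCK_TTL_SECONDS) -> bool:
    """SET key value NX EX ttl -> truthy only for the first caller."""
    return bool(client.set(f"calendar_sync:lock:{kuerzel}", "1", nx=True, ex=ttl))

def release_lock(client, kuerzel: str) -> None:
    client.delete(f"calendar_sync:lock:{kuerzel}")

class FakeRedis:
    """Just enough of the redis-py interface for the demo (no expiry)."""
    def __init__(self): self._d = {}
    def set(self, key, value, nx=False, ex=None):
        if nx and key in self._d:
            return None
        self._d[key] = value
        return True
    def delete(self, key): self._d.pop(key, None)

r = FakeRedis()
print(acquire_lock(r, "SB"))  # True  - first sync wins
print(acquire_lock(r, "SB"))  # False - concurrent sync is skipped
release_lock(r, "SB")
print(acquire_lock(r, "SB"))  # True  - lock is free again
```

This is also why a stuck lock blocks syncs until its TTL expires or it is deleted by hand, as shown above.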
### Duplicate Events

**Symptoms**: Events appear multiple times in Google Calendar

**Diagnosis**:
```bash
# Check for concurrent syncs
redis-cli -n 1 KEYS "calendar_sync:lock:*"

# Check logs for duplicate processing
sudo journalctl -u motia.service | grep -i "duplicate\|already exists"
```

**Solutions**:

1. **Locking not working**:
   ```bash
   # Verify Redis lock TTL
   redis-cli -n 1 TTL calendar_sync:lock:SB

   # Should return a positive number if locked
   ```

2. **Manual cleanup**:
   ```bash
   # Delete duplicates in the Google Calendar UI
   # Or use a cleanup script (if available)
   ```

3. **Improve deduplication logic**:
   ```python
   # In calendar_sync_event_step.py
   # Add better event matching logic
   ```
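One common way to make the matching robust is to tag each Google event with its Advoware appointment ID in `extendedProperties.private` (a real Calendar API field) and index on it before inserting. Whether this sync writes that property, and the `advoware_id` / `id` field names, are assumptions for the sketch:

```python
def index_by_advoware_id(google_events: list) -> dict:
    """Map Advoware appointment IDs -> Google event IDs using a
    private extended property written at insert time."""
    index = {}
    for ev in google_events:
        adv_id = ev.get("extendedProperties", {}).get("private", {}).get("advoware_id")
        if adv_id:
            index[adv_id] = ev["id"]
    return index

existing = [
    {"id": "g1", "extendedProperties": {"private": {"advoware_id": "A-100"}}},
    {"id": "g2"},  # untagged event (e.g. created manually) is ignored
]
index = index_by_advoware_id(existing)

# Appointments coming from Advoware (field name assumed)
incoming = [{"id": "A-100"}, {"id": "A-101"}]
to_insert = [a for a in incoming if a["id"] not in index]
print([a["id"] for a in to_insert])  # only A-101; A-100 would be updated, not duplicated
```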
### Events Not Syncing

**Symptoms**: Advoware events are missing from Google Calendar

**Diagnosis**:
```bash
# Check a specific employee
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"kuerzel": "SB", "full_content": true}'

# Check logs for that employee
sudo journalctl -u motia.service | grep "SB"

# Check if the calendar exists
python3 << 'EOF'
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    '/opt/motia-app/service-account.json',
    scopes=['https://www.googleapis.com/auth/calendar']
)
service = build('calendar', 'v3', credentials=creds)
result = service.calendarList().list().execute()
for cal in result.get('items', []):
    if 'AW-SB' in cal['summary']:
        print(f"Found: {cal['summary']} - {cal['id']}")
EOF
```

**Solutions**:

1. **Calendar doesn't exist**:
   ```bash
   # Will be auto-created on first sync
   # Force a sync to trigger creation
   ```

2. **Date range mismatch**:
   ```python
   # Check FETCH_FROM and FETCH_TO in calendar_sync_event_step.py
   # Default: previous year to 9 years ahead
   ```

3. **Write protection enabled**:
   ```bash
   # Check environment
   echo $ADVOWARE_WRITE_PROTECTION

   # Should be "false" for two-way sync
   ```
## Webhook Issues

### Webhooks Not Received

**Symptoms**: EspoCRM sends webhooks, but nothing is processed

**Diagnosis**:
```bash
# Check if the endpoint is reachable
curl -X POST "http://localhost:3000/vmh/webhook/beteiligte/create" \
  -H "Content-Type: application/json" \
  -d '[{"id": "test-123"}]'

# Check firewall
sudo ufw status

# Check nginx logs (if using a reverse proxy)
sudo tail -f /var/log/nginx/motia-access.log
sudo tail -f /var/log/nginx/motia-error.log
```

**Solutions**:

1. **Firewall blocking**:
   ```bash
   # Allow the port (if direct access)
   sudo ufw allow 3000/tcp

   # Or use a reverse proxy (recommended)
   ```

2. **Wrong URL in EspoCRM**:
   ```bash
   # Verify the URL in the EspoCRM webhook configuration
   # Should be: https://your-domain.com/vmh/webhook/beteiligte/create
   ```

3. **SSL certificate issues**:
   ```bash
   # Check certificate
   openssl s_client -connect your-domain.com:443

   # Renew certificate
   sudo certbot renew
   ```

### Webhook Deduplication Not Working

**Symptoms**: The same webhooks are processed multiple times

**Diagnosis**:
```bash
# Check Redis dedup sets
redis-cli -n 1 SMEMBERS vmh:beteiligte:create_pending
redis-cli -n 1 SMEMBERS vmh:beteiligte:update_pending
redis-cli -n 1 SMEMBERS vmh:beteiligte:delete_pending

# Check for concurrent webhook processing
sudo journalctl -u motia.service | grep "Webhook.*received"
```

**Solutions**:

1. **Redis SET not working**:
   ```bash
   # Test Redis SET operations
   redis-cli -n 1 SADD test_set "value1"
   redis-cli -n 1 SMEMBERS test_set
   redis-cli -n 1 DEL test_set
   ```

2. **Clear dedup sets**:
   ```bash
   # If corrupted
   redis-cli -n 1 DEL vmh:beteiligte:create_pending
   redis-cli -n 1 DEL vmh:beteiligte:update_pending
   redis-cli -n 1 DEL vmh:beteiligte:delete_pending
   ```
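The dedup sets above rely on `SADD` returning 1 only when the member is new, so the first webhook for an ID wins and duplicate deliveries are dropped. A sketch of that check, demoed against an in-memory stub (the helper name `should_process` is hypothetical; the set names come from the commands above):

```python
def should_process(client, entity_id: str,
                   pending_set: str = "vmh:beteiligte:create_pending") -> bool:
    """SADD returns 1 only for a new member, so only the first
    delivery of an ID passes this gate."""
    return client.sadd(pending_set, entity_id) == 1

class FakeRedis:
    """Just enough of the redis-py set interface for the demo."""
    def __init__(self): self._sets = {}
    def sadd(self, key, member):
        s = self._sets.setdefault(key, set())
        if member in s:
            return 0
        s.add(member)
        return 1
    def srem(self, key, member):
        self._sets.get(key, set()).discard(member)

r = FakeRedis()
print(should_process(r, "test-123"))  # True  - first delivery
print(should_process(r, "test-123"))  # False - duplicate dropped
r.srem("vmh:beteiligte:create_pending", "test-123")  # after successful sync
print(should_process(r, "test-123"))  # True  - a later webhook is processed again
```

This also explains the "clear dedup sets" fix: a stale member left in the set silently swallows every later webhook for that ID.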
## Performance Issues
|
||||
|
||||
### High CPU Usage
|
||||
|
||||
**Diagnose**:
|
||||
```bash
|
||||
# Check CPU usage
|
||||
top -p $(pgrep -f "motia start")
|
||||
|
||||
# Profile with Node.js
|
||||
# Already enabled with --inspect flag
|
||||
# Connect to chrome://inspect
|
||||
```
|
||||
|
||||
**Solutions**:
|
||||
|
||||
1. **Too many parallel syncs**:
|
||||
```bash
|
||||
# Reduce concurrent syncs
|
||||
# Adjust DEBUG_KUERZEL to process fewer employees
|
||||
```
|
||||
|
||||
2. **Infinite loop**:
|
||||
```bash
|
||||
# Check logs for repeated patterns
|
||||
sudo journalctl -u motia.service | tail -n 1000 | sort | uniq -c | sort -rn
|
||||
```
|
||||
|
||||

### High Memory Usage

**Diagnosis**:
```bash
# Check memory (RSS in KB)
ps aux | grep motia | awk '{print $6}'

# Heap snapshot (if enabled)
kill -SIGUSR2 $(pgrep -f "motia start")
# Snapshot is saved to the current directory
```

**Solutions**:

1. **Increase memory limit**:
   ```ini
   # In the systemd service
   Environment=NODE_OPTIONS=--max-old-space-size=16384
   ```

2. **Memory leak**:
   ```bash
   # Restart the service periodically; add to crontab:
   0 3 * * * systemctl restart motia.service
   ```

### Slow API Responses

**Diagnosis**:
```bash
# Measure response time
time curl "http://localhost:3000/advoware/proxy?endpoint=employees"

# Check for database/Redis latency
redis-cli --latency
```

**Solutions**:

1. **Redis slow**:
   ```bash
   # Check the slow log
   redis-cli SLOWLOG GET 10

   # Optimize Redis
   redis-cli CONFIG SET tcp-backlog 511
   ```

2. **Advoware API slow**:
   ```bash
   # Increase the timeout
   export ADVOWARE_API_TIMEOUT_SECONDS=60

   # Consider adding a caching layer
   ```

## Debugging Tools

### Enable Debug Logging

```bash
# Set in the systemd service
Environment=MOTIA_LOG_LEVEL=debug

sudo systemctl daemon-reload
sudo systemctl restart motia.service
```

### Redis Debugging

```bash
# Connect to Redis
redis-cli

# Monitor all commands
MONITOR

# Slow log
SLOWLOG GET 10

# Info
INFO all
```

### Python Debugging

```python
# Add to step code
import pdb; pdb.set_trace()

# Or use logging
context.logger.debug(f"Variable value: {variable}")
```

### Node.js Debugging

```bash
# Connect to the inspector
# Chrome DevTools: chrome://inspect
# VSCode: Attach to Process
```
## Getting Help

### Check Logs First

```bash
# Last 100 lines
sudo journalctl -u motia.service -n 100

# Errors only
sudo journalctl -u motia.service -p err

# Specific time range
sudo journalctl -u motia.service --since "1 hour ago"
```

### Common Log Patterns

**Success**:
```
[INFO] Calendar sync completed for SB
[INFO] VMH Webhook received
```

**Warning**:
```
[WARNING] Rate limit approaching
[WARNING] Lock already exists for SB
```

**Error**:
```
[ERROR] Redis connection failed
[ERROR] API call failed: 401 Unauthorized
[ERROR] Unexpected error: ...
```

### Collect Debug Information

```bash
# System info
uname -a
node --version
python3 --version

# Service status
sudo systemctl status motia.service

# Recent logs
sudo journalctl -u motia.service -n 200 > motia-logs.txt

# Redis info
redis-cli INFO > redis-info.txt

# Configuration (redact secrets!)
sudo systemctl show motia.service -p Environment > env.txt
```

## Related Documentation

- [Architecture](ARCHITECTURE.md)
- [Configuration](CONFIGURATION.md)
- [Deployment](DEPLOYMENT.md)
- [Development Guide](DEVELOPMENT.md)

```diff
@@ -1,41 +1,4 @@
 [
-  {
-    "id": "basic-tutorial",
-    "config": {
-      "steps/petstore/state_audit_cron_step.py": {
-        "x": -38,
-        "y": 683,
-        "sourceHandlePosition": "right"
-      },
-      "steps/petstore/process_food_order_step.py": {
-        "x": 384,
-        "y": 476,
-        "targetHandlePosition": "left"
-      },
-      "steps/petstore/notification_step.py": {
-        "x": 601,
-        "y": 724,
-        "targetHandlePosition": "left"
-      },
-      "steps/petstore/api_step.py": {
-        "x": 15,
-        "y": 461,
-        "sourceHandlePosition": "right"
-      },
-      "steps/advoware_proxy/advoware_api_proxy_put_step.py": {
-        "x": 12,
-        "y": 408
-      },
-      "steps/advoware_proxy/advoware_api_proxy_get_step.py": {
-        "x": 12,
-        "y": 611
-      },
-      "steps/advoware_proxy/advoware_api_proxy_delete_step.py": {
-        "x": 0,
-        "y": 814
-      }
-    }
-  },
   {
     "id": "vmh",
     "config": {
@@ -102,8 +65,46 @@
         "y": 990
       },
       "steps/advoware_cal_sync/calendar_sync_all_step.py": {
-        "x": 343,
-        "y": 904
+        "x": 339,
+        "y": 913
       }
     }
   },
+  {
+    "id": "basic-tutorial",
+    "config": {
+      "steps/petstore/state_audit_cron_step.py": {
+        "x": -38,
+        "y": 683,
+        "sourceHandlePosition": "right"
+      },
+      "steps/petstore/process_food_order_step.py": {
+        "x": 384,
+        "y": 476,
+        "targetHandlePosition": "left"
+      },
+      "steps/petstore/notification_step.py": {
+        "x": 601,
+        "y": 724,
+        "targetHandlePosition": "left"
+      },
+      "steps/petstore/api_step.py": {
+        "x": 15,
+        "y": 461,
+        "sourceHandlePosition": "right"
+      }
+    }
+  },
+  {
+    "id": "perf-test",
+    "config": {
+      "steps/motia-perf-test/perf_event_step.py": {
+        "x": 318,
+        "y": 22
+      },
+      "steps/motia-perf-test/perf_cron_step.py": {
+        "x": 0,
+        "y": 0
+      }
+    }
+  }
```

bitbylaw/scripts/calendar_sync/README.md (176 lines, new file)

@@ -0,0 +1,176 @@

---
title: Calendar Sync Utilities
description: Helper scripts for Google Calendar synchronization - maintenance, debugging and cleanup
date: 2026-02-07
category: utilities
---

# Calendar Sync Utility Scripts

## Overview

This directory contains utility scripts for maintaining and debugging the calendar sync functionality.

---

## Scripts

### delete_all_calendars.py

**Purpose**: Deletes all (non-primary) calendars from the Google Calendar service account.

**Use cases**:
- Reset after a faulty synchronization
- Cleanup after tests
- Removing duplicates

**Usage**:
```bash
cd /opt/motia-app/bitbylaw
python3 scripts/calendar_sync/delete_all_calendars.py
```

**How it works**:
1. Authenticates with the Google service account
2. Fetches all calendars via `calendarList().list()`
3. Iterates over all calendars
4. Skips the primary calendar (safety guard)
5. Deletes every other calendar via `calendars().delete()`

**Safety**:
- ⚠️ **WARNING**: Irreversibly deletes all calendars!
- The primary calendar is skipped automatically
- Manual confirmation required (TODO: confirmation prompt)

**Dependencies**:
- `steps.advoware_cal_sync.calendar_sync_event_step.get_google_service`
- Google Calendar API access
- Service account credentials

**Example output**:
```
Fetching calendar list...
Found 15 calendars to delete:
- Max Mustermann (ID: max@example.com, Primary: False)
✓ Deleted calendar: Max Mustermann
- Primary (ID: service@project.iam.gserviceaccount.com, Primary: True)
Skipping primary calendar: Primary
...
All non-primary calendars have been deleted.
```
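
Based on the steps above, the core of the script presumably looks like this (a sketch only: the function name is made up, while the client calls follow the documented `calendarList().list()` / `calendars().delete()` chain; pagination is omitted for brevity):

```python
def delete_non_primary_calendars(service):
    """Delete every calendar except the primary one.

    `service` is assumed to be a Google Calendar API resource, e.g. the
    object returned by get_google_service() in calendar_sync_event_step.
    """
    items = service.calendarList().list().execute().get("items", [])
    deleted = []
    for cal in items:
        if cal.get("primary"):
            # Safety guard: never delete the service account's primary calendar
            print(f"Skipping primary calendar: {cal.get('summary')}")
            continue
        service.calendars().delete(calendarId=cal["id"]).execute()
        print(f"Deleted calendar: {cal.get('summary')}")
        deleted.append(cal["id"])
    return deleted
```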

---

### delete_employee_locks.py

**Purpose**: Deletes all employee locks for calendar sync from Redis.

**Use cases**:
- Cleanup after a crashed sync process
- Manual reset of "hanging" locks
- Debugging lock problems

**Usage**:
```bash
cd /opt/motia-app/bitbylaw
python3 scripts/calendar_sync/delete_employee_locks.py
```

**How it works**:
1. Connects to Redis DB 2 (`REDIS_DB_CALENDAR_SYNC`)
2. Finds all keys matching the pattern `calendar_sync_lock_*`
3. Deletes every matching lock key

**Redis key pattern**:
```
calendar_sync_lock_{employee_id}
```

**Safety**:
- ⚠️ Can cause race conditions while a sync is running
- Recommendation: run only when no sync process is active

**Dependencies**:
- `config.Config` (Redis configuration)
- Redis DB 2 (calendar sync state)

**Example output**:
```
Deleted 12 employee lock keys.
```

**Or, with an empty DB**:
```
No employee lock keys found.
```

---

## Workflow: Full Reset

For serious sync problems:

```bash
cd /opt/motia-app/bitbylaw

# 1. Stop the Motia service (prevents new syncs)
sudo systemctl stop motia

# 2. Delete all Redis locks
python3 scripts/calendar_sync/delete_employee_locks.py

# 3. Delete all Google calendars (optional, only if needed!)
python3 scripts/calendar_sync/delete_all_calendars.py

# 4. Restart the Motia service
sudo systemctl start motia

# 5. Trigger a full sync
curl -X POST http://localhost:3000/api/calendar/sync/all
```

---

## Best Practices

### Before running

1. **Check backups**: Make sure the Advoware data is consistent
2. **Service status**: Check `systemctl status motia`
3. **Redis dump**: `redis-cli -n 2 BGSAVE` (optional)

### After running

1. **Check logs**: `journalctl -u motia -n 100 --no-pager`
2. **Trigger a sync**: Via API or cron
3. **Verify**: Check Google Calendar for the expected calendars

---

## Future Scripts (TODO)

### audit_calendar_sync.py

**Purpose**: Compares Advoware appointments with Google Calendar

**Features**:
- Diff view between Advoware and Google
- Detection of orphaned calendars
- Report generation

### repair_calendar_sync.py

**Purpose**: Automatic repair of inconsistencies

**Features**:
- Auto-sync for missing appointments
- Deletion of duplicates
- Lock cleanup with safety checks

---

## See also

- [Calendar Sync Architecture](../../docs/ARCHITECTURE.md#2-calendar-sync-pipeline)
- [Calendar Sync Cron Step](../../steps/advoware_cal_sync/calendar_sync_cron_step.md)
- [Troubleshooting Guide](../../docs/TROUBLESHOOTING.md)

bitbylaw/scripts/calendar_sync/delete_employee_locks.py (21 lines, new file)

@@ -0,0 +1,21 @@

```python
import redis
from config import Config


def main():
    redis_client = redis.Redis(
        host=Config.REDIS_HOST,
        port=int(Config.REDIS_PORT),
        db=int(Config.REDIS_DB_CALENDAR_SYNC),
        socket_timeout=Config.REDIS_TIMEOUT_SECONDS
    )

    # Find all lock keys
    lock_keys = redis_client.keys('calendar_sync_lock_*')
    if lock_keys:
        deleted_count = redis_client.delete(*lock_keys)
        print(f"Deleted {deleted_count} employee lock keys.")
    else:
        print("No employee lock keys found.")


if __name__ == "__main__":
    main()
```

bitbylaw/services/ADVOWARE_SERVICE.md (345 lines, new file)

@@ -0,0 +1,345 @@

# AdvowareAPI Service

## Overview

The AdvowareAPI service is the central HTTP client for all communication with the Advoware REST API. It abstracts the complex HMAC-512 authentication and offers a simple interface for API calls.

## Location

`services/advoware.py`

## Usage

```python
from services.advoware import AdvowareAPI

# In a step handler
async def handler(req, context):
    advoware = AdvowareAPI(context)
    result = await advoware.api_call('/employees', method='GET')
    return {'status': 200, 'body': {'result': result}}
```

## Classes

### AdvowareAPI

**Constructor**: `__init__(self, context=None)`
- `context`: Motia context for logging (optional)

**Attributes**:
- `API_BASE_URL`: Base URL of the Advoware API
- `redis_client`: Redis connection for token caching
- `product_id`, `app_id`, `api_key`: Auth credentials from Config

## Methods

### get_access_token(force_refresh=False)

Returns the Bearer token from the Redis cache, or fetches a new one.

**Parameters**:
- `force_refresh` (bool): Ignore the cache and fetch a fresh token

**Returns**: `str` - Bearer token

**Logic**:
1. If there is no Redis, or `force_refresh=True`: fetch a new token
2. If a cached token exists and has not expired: return the cached one
3. Otherwise: fetch a new token and cache it

**Caching**:
- Key: `advoware_access_token`
- TTL: 53 minutes (55 min lifetime - 2 min safety margin)
- Timestamp key: `advoware_token_timestamp`

**Example**:
```python
api = AdvowareAPI()
token = api.get_access_token()  # From cache
token = api.get_access_token(force_refresh=True)  # Fresh
```

### api_call(endpoint, method='GET', headers=None, params=None, json_data=None, ...)

Performs an authenticated API call to Advoware.

**Parameters**:
- `endpoint` (str): API path (e.g. `/employees`)
- `method` (str): HTTP method (GET, POST, PUT, DELETE)
- `headers` (dict): Additional HTTP headers
- `params` (dict): Query parameters
- `json_data` (dict): JSON body for POST/PUT
- `timeout_seconds` (int): Override for the default timeout

**Returns**: `dict|None` - JSON response, or None

**Logic**:
1. Get a Bearer token (cached or fresh)
2. Set the Authorization header
3. Async HTTP request with aiohttp
4. On 401: refresh the token and retry
5. Parse the JSON response
6. Return the result

**Error handling**:
- `aiohttp.ClientError`: Network/HTTP errors
- `401 Unauthorized`: Auto-refresh the token and retry (once)
- `Timeout`: After `ADVOWARE_API_TIMEOUT_SECONDS`

**Example**:
```python
# GET request
employees = await api.api_call('/employees', method='GET', params={'limit': 10})

# POST request
new_appt = await api.api_call(
    '/appointments',
    method='POST',
    json_data={'datum': '2026-02-10', 'text': 'Meeting'}
)

# PUT request
updated = await api.api_call(
    '/appointments/123',
    method='PUT',
    json_data={'text': 'Updated'}
)

# DELETE request
await api.api_call('/appointments/123', method='DELETE')
```

## Authentication

### HMAC-512 Signature

Advoware uses HMAC-512 to sign requests:

**Message format**:
```
{product_id}:{app_id}:{nonce}:{timestamp}
```

**Key**: Base64-decoded API key

**Hash**: SHA512

**Output**: Base64-encoded signature

**Implementation**:
```python
def _generate_hmac(self, request_time_stamp, nonce=None):
    if not nonce:
        nonce = str(uuid.uuid4())
    message = f"{self.product_id}:{self.app_id}:{nonce}:{request_time_stamp}"
    api_key_bytes = base64.b64decode(self.api_key)
    signature = hmac.new(api_key_bytes, message.encode(), hashlib.sha512)
    return base64.b64encode(signature.digest()).decode('utf-8')
```

### Token-Fetch Flow

1. Generate a nonce (UUID4)
2. Get the current UTC timestamp (ISO format)
3. Generate the HMAC signature
4. POST to `https://security.advo-net.net/api/v1/Token`:
   ```json
   {
     "AppID": "...",
     "Kanzlei": "...",
     "Database": "...",
     "User": "...",
     "Role": 2,
     "Product": 64,
     "Password": "...",
     "Nonce": "...",
     "HMAC512Signature": "...",
     "RequestTimeStamp": "..."
   }
   ```
5. Extract `access_token` from the response
6. Cache it in Redis (53 min TTL)
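
The flow above can be reproduced with the standard library alone. This sketch builds the token request body from steps 1-4 (only the HMAC inputs and field names come from the document; the helper function itself is illustrative):

```python
import base64
import hashlib
import hmac
import uuid
from datetime import datetime, timezone


def build_token_request(product_id, app_id, api_key_b64,
                        kanzlei, database, user, password):
    """Assemble the token request body described in the flow above (sketch)."""
    nonce = str(uuid.uuid4())                       # step 1
    ts = datetime.now(timezone.utc).isoformat()     # step 2
    # Step 3: HMAC-SHA512 over "{product_id}:{app_id}:{nonce}:{timestamp}"
    message = f"{product_id}:{app_id}:{nonce}:{ts}"
    key = base64.b64decode(api_key_b64)
    sig = base64.b64encode(
        hmac.new(key, message.encode(), hashlib.sha512).digest()
    ).decode('utf-8')
    # Step 4: request body as documented
    return {
        "AppID": app_id, "Kanzlei": kanzlei, "Database": database,
        "User": user, "Role": 2, "Product": int(product_id),
        "Password": password, "Nonce": nonce,
        "HMAC512Signature": sig, "RequestTimeStamp": ts,
    }
```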

## Redis Usage

### Keys

**DB 1** (`REDIS_DB_ADVOWARE_CACHE`):
- `advoware_access_token` (string, TTL: 3180s = 53 min)
- `advoware_token_timestamp` (string, TTL: 3180s)

### Operations

```python
# Set token
self.redis_client.set(
    self.TOKEN_CACHE_KEY,
    access_token,
    ex=(self.token_lifetime_minutes - 2) * 60
)

# Get token
cached_token = self.redis_client.get(self.TOKEN_CACHE_KEY)
if cached_token:
    return cached_token.decode('utf-8')
```

### Fallback

If Redis is unreachable:
- Log a warning
- Fetch a token on every request (no caching)
- Still works, but slower
## Logging

### Log Messages

```python
# Via context.logger (when present)
context.logger.info("Access token fetched successfully")
context.logger.error(f"API call failed: {e}")

# Fallback to Python logging
logger.info("Connected to Redis for token caching")
logger.debug(f"Token request data: AppID={self.app_id}")
```

### Log Levels

- **DEBUG**: Token details, request parameters
- **INFO**: Token fetches, API calls, cache hits
- **ERROR**: Auth errors, API errors, network errors

## Configuration

### Environment Variables

```bash
# API settings
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_PRODUCT_ID=64
ADVOWARE_APP_ID=your_app_id
ADVOWARE_API_KEY=base64_encoded_hmac_key
ADVOWARE_KANZLEI=your_kanzlei
ADVOWARE_DATABASE=your_database
ADVOWARE_USER=api_user
ADVOWARE_ROLE=2
ADVOWARE_PASSWORD=your_password

# Timeouts
ADVOWARE_TOKEN_LIFETIME_MINUTES=55
ADVOWARE_API_TIMEOUT_SECONDS=30

# Redis
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
REDIS_TIMEOUT_SECONDS=5
```

## Error Handling

### Exceptions

**AdvowareTokenError**:
- Raised when the token fetch fails
- Examples: invalid credentials, HMAC signature mismatch

**aiohttp.ClientError**:
- Network errors, HTTP errors (except 401)
- Timeouts, connection refused, etc.

### Retry Logic

**401 Unauthorized**:
- Automatic retry with a fresh token (once)
- After that: exception propagated to the caller

**Other errors**:
- No retry (fail fast)
- Exception propagated directly to the caller

## Performance

### Response Time

- **With cached token**: 200-800 ms (Advoware API latency)
- **With token fetch**: +1-2 s for the token request
- **Timeout**: 30 s (configurable)

### Caching

- **Hit rate**: >99% (token cached for 53 min, API calls are far more frequent)
- **Miss**: Only on the first call or after token expiry

## Testing

### Manual Testing

```python
# Test token fetch
from services.advoware import AdvowareAPI
api = AdvowareAPI()
token = api.get_access_token(force_refresh=True)
print(f"Token: {token[:20]}...")

# Test API call
import asyncio

async def test():
    api = AdvowareAPI()
    result = await api.api_call('/employees', params={'limit': 5})
    print(result)

asyncio.run(test())
```

### Unit Testing

```python
from unittest.mock import AsyncMock, MagicMock, patch
import pytest


@pytest.mark.asyncio
async def test_api_call_with_cached_token():
    # Mock Redis
    redis_mock = MagicMock()
    redis_mock.get.return_value = b'cached_token'

    # Mock aiohttp
    with patch('aiohttp.ClientSession') as session_mock:
        response_mock = AsyncMock()
        response_mock.status = 200
        response_mock.json = AsyncMock(return_value={'data': 'test'})
        session_mock.return_value.__aenter__.return_value.request.return_value.__aenter__.return_value = response_mock

        api = AdvowareAPI()
        api.redis_client = redis_mock
        result = await api.api_call('/test')

        assert result == {'data': 'test'}
        redis_mock.get.assert_called_once()
```

## Security

### Secrets

- ✅ API key from the environment (not hardcoded)
- ✅ Password from the environment
- ✅ Token only in Redis (localhost)
- ❌ Token never written to logs

### Best Practices

- Always store the API key Base64-encoded
- Never cache the token longer than 55 min
- Keep Redis localhost-only (no remote connections)
- Keep credentials out of logs

## Related Documentation

- [Configuration](../../docs/CONFIGURATION.md)
- [Architecture](../../docs/ARCHITECTURE.md)
- [Proxy Steps](../advoware_proxy/README.md)

@@ -428,3 +428,32 @@ python audit_calendar_sync.py cleanup-orphaned

All operations are logged via `context.logger` and visible in the Motia Workbench. Additional debug information is printed to the console.

---

## Utility Scripts

Helper scripts are available for maintenance and debugging:

**Documentation**: [scripts/calendar_sync/README.md](../../scripts/calendar_sync/README.md)

**Available scripts**:
- `delete_employee_locks.py` - Deletes Redis locks (for hanging syncs)
- `delete_all_calendars.py` - Deletes all Google calendars (reset)

**Usage**:
```bash
# Lock cleanup
python3 scripts/calendar_sync/delete_employee_locks.py

# Calendar reset (CAUTION!)
python3 scripts/calendar_sync/delete_all_calendars.py
```

---

## See also

- [Calendar Sync Architecture](../../docs/ARCHITECTURE.md#2-calendar-sync-system)
- [Calendar Sync Cron Step](calendar_sync_cron_step.md)
- [Google Calendar Setup](../../docs/GOOGLE_SETUP.md)
- [Troubleshooting Guide](../../docs/TROUBLESHOOTING.md)

bitbylaw/steps/advoware_cal_sync/calendar_sync_all_step.md (109 lines, new file)

@@ -0,0 +1,109 @@

---
type: step
category: event
name: Calendar Sync All
version: 1.0.0
status: active
tags: [calendar, sync, event, cascade]
dependencies:
  - services/advoware.py
  - redis
emits: [calendar_sync_employee]
subscribes: [calendar_sync_all]
---

# Calendar Sync All Step

## Purpose

Fetches all employees from Advoware and emits one `calendar_sync_employee` event per employee, enabling parallel processing.

## Config

```python
{
    'type': 'event',
    'name': 'Calendar Sync All',
    'subscribes': ['calendar_sync_all'],
    'emits': ['calendar_sync_employee'],
    'flows': ['advoware_cal_sync']
}
```

## Input Event

```json
{
  "topic": "calendar_sync_all",
  "data": {}
}
```

## Behavior

1. **Fetch employees** from the Advoware API:
   ```python
   employees = await advoware.api_call('/employees')
   ```

2. **Filter by debug list** (if configured):
   ```python
   if Config.CALENDAR_SYNC_DEBUG_KUERZEL:
       employees = [e for e in employees if e['kuerzel'] in debug_list]
   ```

3. **Set a lock per employee**:
   ```python
   lock_key = f'calendar_sync:lock:{kuerzel}'
   redis.set(lock_key, '1', nx=True, ex=300)
   ```

4. **Emit an event per employee**:
   ```python
   await context.emit({
       'topic': 'calendar_sync_employee',
       'data': {
           'kuerzel': kuerzel,
           'full_content': True
       }
   })
   ```

## Debug Mode

```bash
# Only sync specific employees
export CALENDAR_SYNC_DEBUG_KUERZEL=SB,AI,RO

# Sync all (production)
export CALENDAR_SYNC_DEBUG_KUERZEL=
```

## Error Handling

- Advoware API errors: log, but do not crash
- Lock errors: skip the employee (sync already in progress)
- Event emission errors: log and continue

## Output Events

Multiple `calendar_sync_employee` events, e.g.:

```json
[
  {"topic": "calendar_sync_employee", "data": {"kuerzel": "SB", "full_content": true}},
  {"topic": "calendar_sync_employee", "data": {"kuerzel": "AI", "full_content": true}},
  ...
]
```

## Performance

- ~10 employees: <1 s for fetch + event emission
- Lock setting: <10 ms per employee
- Non-blocking (async events)

## Monitoring

```
[INFO] Fetching employees from Advoware
[INFO] Found 10 employees
[INFO] Emitting calendar_sync_employee for SB
[INFO] Emitting calendar_sync_employee for AI
...
```

## Related

- [calendar_sync_event_step.md](calendar_sync_event_step.md) - Consumes the emitted events
- [calendar_sync_cron_step.md](calendar_sync_cron_step.md) - Triggers this step

```diff
@@ -5,7 +5,7 @@ import time
 from datetime import datetime
 from config import Config
 from services.advoware import AdvowareAPI
-from .calendar_sync_utils import get_redis_client, get_advoware_employees, set_employee_lock
+from .calendar_sync_utils import get_redis_client, get_advoware_employees, set_employee_lock, log_operation

 config = {
     'type': 'event',
@@ -19,7 +19,7 @@ config = {
 async def handler(event_data, context):
     try:
         triggered_by = event_data.get('triggered_by', 'unknown')
-        context.logger.info(f"Calendar Sync All: Starting to emit events for oldest employees, triggered by {triggered_by}")
+        log_operation('info', f"Calendar Sync All: Starting to emit events for oldest employees, triggered by {triggered_by}", context=context)

         # Initialize Advoware API
         advoware = AdvowareAPI(context)
@@ -27,7 +27,7 @@ async def handler(event_data, context):
         # Fetch employees
         employees = await get_advoware_employees(advoware, context)
         if not employees:
-            context.logger.error("Keine Mitarbeiter gefunden. All-Sync abgebrochen.")
+            log_operation('error', "Keine Mitarbeiter gefunden. All-Sync abgebrochen.", context=context)
             return {'status': 500, 'body': {'error': 'Keine Mitarbeiter gefunden'}}

         redis_client = get_redis_client(context)
@@ -53,11 +53,11 @@ async def handler(event_data, context):
             return datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')

         sorted_list_str = ", ".join(f"{k} ({format_timestamp(employee_timestamps[k])})" for k in sorted_kuerzel)
-        context.logger.info(f"Calendar Sync All: Sorted employees by last synced: {sorted_list_str}")
+        log_operation('info', f"Calendar Sync All: Sorted employees by last synced: {sorted_list_str}", context=context)

         # Calculate number to sync: ceil(N / 10)
-        num_to_sync = math.ceil(len(sorted_kuerzel) / 10)
-        context.logger.info(f"Calendar Sync All: Total employees {len(sorted_kuerzel)}, syncing {num_to_sync} oldest")
+        num_to_sync = math.ceil(len(sorted_kuerzel) / 1)
+        log_operation('info', f"Calendar Sync All: Total employees {len(sorted_kuerzel)}, syncing {num_to_sync} oldest", context=context)

         # Emit for the oldest num_to_sync employees, if not locked
         emitted_count = 0
@@ -65,7 +65,7 @@ async def handler(event_data, context):
             employee_lock_key = f'calendar_sync_lock_{kuerzel}'

             if not set_employee_lock(redis_client, kuerzel, triggered_by, context):
-                context.logger.info(f"Calendar Sync All: Sync bereits aktiv für {kuerzel}, überspringe")
+                log_operation('info', f"Calendar Sync All: Sync bereits aktiv für {kuerzel}, überspringe", context=context)
                 continue

             # Emit event for this employee
@@ -76,10 +76,10 @@ async def handler(event_data, context):
                     "triggered_by": triggered_by
                 }
             })
-            context.logger.info(f"Calendar Sync All: Emitted event for employee {kuerzel} (last synced: {format_timestamp(employee_timestamps[kuerzel])})")
+            log_operation('info', f"Calendar Sync All: Emitted event for employee {kuerzel} (last synced: {format_timestamp(employee_timestamps[kuerzel])})", context=context)
             emitted_count += 1

-        context.logger.info(f"Calendar Sync All: Completed, emitted {emitted_count} events")
+        log_operation('info', f"Calendar Sync All: Completed, emitted {emitted_count} events", context=context)
         return {
             'status': 'completed',
             'triggered_by': triggered_by,
@@ -87,7 +87,7 @@ async def handler(event_data, context):
         }

     except Exception as e:
-        context.logger.error(f"Fehler beim All-Sync: {e}")
+        log_operation('error', f"Fehler beim All-Sync: {e}", context=context)
         return {
             'status': 'error',
             'error': str(e)
```

bitbylaw/steps/advoware_cal_sync/calendar_sync_api_step.md (96 lines, new file)

@@ -0,0 +1,96 @@

---
type: step
category: api
name: Calendar Sync API
version: 1.0.0
status: active
tags: [calendar, sync, api, manual-trigger]
dependencies:
  - redis
emits: [calendar_sync_all, calendar_sync_employee]
---

# Calendar Sync API Step

## Purpose

Manual trigger for calendar synchronization via an HTTP endpoint. Allows syncing all employees or a single one.

## Config

```python
{
    'type': 'api',
    'name': 'Calendar Sync API',
    'path': '/advoware/calendar/sync',
    'method': 'POST',
    'emits': ['calendar_sync_all', 'calendar_sync_employee'],
    'flows': ['advoware_cal_sync']
}
```

## Input

```bash
POST /advoware/calendar/sync
Content-Type: application/json

{
  "kuerzel": "ALL",       # or a specific employee: "SB"
  "full_content": true
}
```

**Parameters**:
- `kuerzel` (optional): "ALL" or an employee abbreviation (default: "ALL")
- `full_content` (optional): true = full details, false = anonymized (default: true)

## Output

```json
{
  "status": "triggered",
  "kuerzel": "ALL",
  "message": "Calendar sync triggered for ALL"
}
```

## Behavior

**Case 1: ALL (or no kuerzel)**:
1. Emit a `calendar_sync_all` event
2. `calendar_sync_all_step` fetches all employees
3. Per employee: emit `calendar_sync_employee`

**Case 2: Specific employee (e.g. "SB")**:
1. Set a Redis lock: `calendar_sync:lock:SB`
2. Emit the `calendar_sync_employee` event directly
3. The lock prevents parallel syncs for the same employee

## Redis Locking

```python
lock_key = f'calendar_sync:lock:{kuerzel}'
redis_client.set(lock_key, '1', nx=True, ex=300)  # 5 min TTL
```
|
||||
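The `SET NX EX` pattern above can be sketched end to end (acquire, reject duplicates, release). The following in-memory stand-in mirrors the Redis semantics purely for illustration; `InMemoryLock` and its methods are hypothetical, not part of this codebase:

```python
import time

class InMemoryLock:
    """Toy stand-in for the Redis SET NX EX locking pattern used by the sync steps."""

    def __init__(self):
        self._store = {}  # key -> expiry timestamp

    def acquire(self, key: str, ttl: int = 300) -> bool:
        now = time.time()
        # An expired entry counts as absent (Redis enforces this via EX).
        if key in self._store and self._store[key] > now:
            return False  # lock already held -> caller should reject the sync
        self._store[key] = now + ttl
        return True

    def release(self, key: str) -> None:
        self._store.pop(key, None)

locks = InMemoryLock()
assert locks.acquire('calendar_sync:lock:SB') is True
assert locks.acquire('calendar_sync:lock:SB') is False  # second sync is rejected
locks.release('calendar_sync:lock:SB')
assert locks.acquire('calendar_sync:lock:SB') is True
```

The TTL matters: if a sync crashes without releasing the lock, the 5-minute expiry ensures the employee is not blocked forever.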

## Testing
```bash
# Sync all employees
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'

# Sync single employee
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"kuerzel": "SB", "full_content": true}'

# Sync with anonymization
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"kuerzel": "SB", "full_content": false}'
```

## Error Handling
- Lock active: waits or returns an error (HTTP 409)
- Invalid kuerzel: passed through to the all-step or event-step

## Related
- [calendar_sync_all_step.md](calendar_sync_all_step.md) - Handles "ALL"
- [calendar_sync_event_step.md](calendar_sync_event_step.md) - Per-employee sync
@@ -1,7 +1,7 @@
 import json
 import redis
 from config import Config
-from .calendar_sync_utils import get_redis_client, set_employee_lock
+from .calendar_sync_utils import get_redis_client, set_employee_lock, log_operation

 config = {
     'type': 'api',
@@ -31,7 +31,7 @@ async def handler(req, context):

     if kuerzel_upper == 'ALL':
         # Emit sync-all event
-        context.logger.info("Calendar Sync API: Emitting sync-all event")
+        log_operation('info', "Calendar Sync API: Emitting sync-all event", context=context)
         await context.emit({
             "topic": "calendar_sync_all",
             "data": {
@@ -54,7 +54,7 @@ async def handler(req, context):
     redis_client = get_redis_client(context)

     if not set_employee_lock(redis_client, kuerzel_upper, 'api', context):
-        context.logger.info(f"Calendar Sync API: Sync bereits aktiv für {kuerzel_upper}, überspringe")
+        log_operation('info', f"Calendar Sync API: Sync bereits aktiv für {kuerzel_upper}, überspringe", context=context)
         return {
             'status': 409,
             'body': {
@@ -65,7 +65,7 @@ async def handler(req, context):
         }
     }

-    context.logger.info(f"Calendar Sync API aufgerufen für {kuerzel_upper}")
+    log_operation('info', f"Calendar Sync API aufgerufen für {kuerzel_upper}", context=context)

     # Lock erfolgreich gesetzt, jetzt emittieren

@@ -89,7 +89,7 @@ async def handler(req, context):
         }

     except Exception as e:
-        context.logger.error(f"Fehler beim API-Trigger: {e}")
+        log_operation('error', f"Fehler beim API-Trigger: {e}", context=context)
         return {
             'status': 500,
             'body': {
bitbylaw/steps/advoware_cal_sync/calendar_sync_cron_step.md | 51 (new file)
@@ -0,0 +1,51 @@
---
type: step
category: cron
name: Calendar Sync Cron
version: 1.0.0
status: active
tags: [calendar, sync, cron, scheduler]
dependencies: []
emits: [calendar_sync_all]
---

# Calendar Sync Cron Step

## Purpose
Daily trigger for the calendar synchronization. Starts the sync pipeline at 2 a.m.

## Config
```python
{
    'type': 'cron',
    'name': 'Calendar Sync Cron',
    'schedule': '0 2 * * *',  # Daily at 2 AM
    'emits': ['calendar_sync_all'],
    'flows': ['advoware_cal_sync']
}
```

## Behavior
1. Cron fires daily at 02:00
2. Emits the `calendar_sync_all` event
3. The event is consumed by `calendar_sync_all_step`
4. Starts the cascade: All → per employee → sync
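A five-field cron expression such as `0 2 * * *` reads as minute, hour, day-of-month, month, day-of-week. A tiny parser makes the "daily at 02:00" reading explicit; this is illustrative only (Motia's scheduler interprets the expression internally), and `describe_cron` is a hypothetical helper:

```python
def describe_cron(expr: str) -> dict:
    """Split a 5-field cron expression into named fields."""
    fields = expr.split()
    if len(fields) != 5:
        raise ValueError(f"expected 5 cron fields, got {len(fields)}")
    names = ['minute', 'hour', 'day_of_month', 'month', 'day_of_week']
    return dict(zip(names, fields))

parsed = describe_cron('0 2 * * *')
assert parsed['minute'] == '0' and parsed['hour'] == '2'  # fires daily at 02:00
```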

## Event Payload
```json
{}
```
Empty, since no parameters are needed.

## Monitoring
Logs: `[INFO] Calendar Sync Cron triggered`

## Manual Trigger
```bash
# Use API endpoint instead of waiting for cron
curl -X POST "http://localhost:3000/advoware/calendar/sync" \
  -H "Content-Type: application/json" \
  -d '{"full_content": true}'
```

See: [calendar_sync_api_step.md](calendar_sync_api_step.md)
@@ -2,19 +2,20 @@ import json
 import redis
 from config import Config
 from services.advoware import AdvowareAPI
+from .calendar_sync_utils import log_operation

 config = {
     'type': 'cron',
     'name': 'Calendar Sync Cron Job',
     'description': 'Führt den Calendar Sync alle 1 Minuten automatisch aus',
-    'cron': '*/1 * * * *',  # Alle 1 Minute
+    'cron': '0 0 31 2 *',  # Nie ausführen (31. Februar)
     'emits': ['calendar_sync_all'],
     'flows': ['advoware']
 }

 async def handler(context):
     try:
-        context.logger.info("Calendar Sync Cron: Starting to emit sync-all event")
+        log_operation('info', "Calendar Sync Cron: Starting to emit sync-all event", context=context)

         # # Emit sync-all event
         await context.emit({
@@ -24,14 +25,14 @@ async def handler(context):
             }
         })

-        context.logger.info("Calendar Sync Cron: Emitted sync-all event")
+        log_operation('info', "Calendar Sync Cron: Emitted sync-all event", context=context)
         return {
             'status': 'completed',
             'triggered_by': 'cron'
         }

     except Exception as e:
-        context.logger.error(f"Fehler beim Cron-Job: {e}")
+        log_operation('error', f"Fehler beim Cron-Job: {e}", context=context)
         return {
             'status': 'error',
             'error': str(e)
@@ -919,6 +919,8 @@ async def process_updates(state, conn, service, calendar_id, kuerzel, advoware,

 async def handler(event_data, context):
     """Main event handler for calendar sync."""
+    start_time = time.time()
+
     kuerzel = event_data.get('kuerzel')
     if not kuerzel:
         log_operation('error', "No kuerzel provided in event", context=context)
@@ -1025,10 +1027,16 @@ async def handler(event_data, context):
         log_operation('info', f"Sync statistics for {kuerzel}: New Adv->Google: {stats['new_adv_to_google']}, New Google->Adv: {stats['new_google_to_adv']}, Deleted: {stats['deleted']}, Updated: {stats['updated']}, Recreated: {stats['recreated']}", context=context)

         log_operation('info', f"Calendar sync completed for {kuerzel}", context=context)

+        log_operation('info', f"Handler duration: {time.time() - start_time}", context=context)
+
         return {'status': 200, 'body': {'status': 'completed', 'kuerzel': kuerzel}}

     except Exception as e:
         log_operation('error', f"Sync failed for {kuerzel}: {e}", context=context)

+        log_operation('info', f"Handler duration (failed): {time.time() - start_time}", context=context)
+
         return {'status': 500, 'body': {'error': str(e)}}
     finally:
         # Ensure lock is always released
@@ -2,37 +2,38 @@ import logging
 import asyncpg
 import os
 import redis
 import time
 from config import Config
 from googleapiclient.discovery import build
 from google.oauth2 import service_account

 # Configure logging to file
 logging.basicConfig(
     filename='/opt/motia-app/calendar_sync.log',
     level=logging.INFO,
     format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
     datefmt='%Y-%m-%d %H:%M:%S'
 )

 logger = logging.getLogger(__name__)

 def log_operation(level, message, context=None, **context_vars):
-    """Centralized logging with context, supporting Motia workbench logging."""
+    """Centralized logging with context, supporting file and console logging."""
     context_str = ' '.join(f"{k}={v}" for k, v in context_vars.items() if v is not None)
-    full_message = f"{message} {context_str}".strip()
-    if context:
-        if level == 'info':
-            context.logger.info(full_message)
-        elif level == 'warning':
-            if hasattr(context.logger, 'warn'):
-                context.logger.warn(full_message)
-            else:
-                context.logger.warning(full_message)
-        elif level == 'error':
-            context.logger.error(full_message)
-        # elif level == 'debug':
-        #     context.logger.debug(full_message)dddd
-    else:
-        if level == 'info':
-            logger.info(full_message)
-        elif level == 'warning':
-            logger.warning(full_message)
-        elif level == 'error':
-            logger.error(full_message)
-        elif level == 'debug':
-            logger.debug(full_message)
+    full_message = f"[{time.time()}] {message} {context_str}".strip()
+
+    # Log to file via Python logger
+    if level == 'info':
+        logger.info(full_message)
+    elif level == 'warning':
+        logger.warning(full_message)
+    elif level == 'error':
+        logger.error(full_message)
+    elif level == 'debug':
+        logger.debug(full_message)
+
+    # Also log to console for journalctl visibility
+    print(f"[{level.upper()}] {full_message}")

 async def connect_db(context=None):
     """Connect to Postgres DB from Config."""
@@ -0,0 +1,48 @@
---
type: step
category: api
name: Advoware Proxy DELETE
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest, delete]
dependencies:
  - services/advoware.py
emits: []
---

# Advoware Proxy DELETE Step

## Purpose
Universal REST API proxy for DELETE requests against the Advoware API, used to delete resources.

## Input
```bash
DELETE /advoware/proxy?endpoint=appointments/12345
```

## Output
```json
{
    "status": 200,
    "body": {
        "result": null
    }
}
```

## Key Differences
- **Method**: DELETE
- **Body**: none (`json_data = None`)
- **Endpoint**: includes the ID of the resource to delete
- **Side effect**: deletes the resource (not recoverable!)
- **Response**: often `null` or an empty object

## Testing
```bash
curl -X DELETE "http://localhost:3000/advoware/proxy?endpoint=appointments/12345"
```

## Warning
⚠️ **CAUTION**: DELETE is irreversible! There is no undo.

See [advoware_api_proxy_get_step.md](advoware_api_proxy_get_step.md) for full details.
bitbylaw/steps/advoware_proxy/advoware_api_proxy_get_step.md | 302 (new file)
@@ -0,0 +1,302 @@
---
type: step
category: api
name: Advoware Proxy GET
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest]
dependencies:
  - services/advoware.py
  - redis (for token caching)
emits: []
subscribes: []
---

# Advoware Proxy GET Step

## Purpose
Universal REST API proxy for GET requests against the Advoware API, with automatic authentication and token management.

## Context
The Advoware API uses HMAC-512 authentication, which is complex and error-prone. This proxy abstracts the authentication away and exposes a simple HTTP endpoint for GET requests. Clients do not have to deal with token management, signature generation, or error handling.
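To illustrate the complexity the proxy hides, here is a hypothetical sketch of HMAC-SHA512 request signing. The message layout (`method|path|timestamp`), the header names, and the base64-decoded key are assumptions for illustration only, not the documented Advoware scheme:

```python
import base64
import hashlib
import hmac
import time

def sign_request(api_key_b64: str, method: str, path: str) -> dict:
    """Build hypothetical HMAC-SHA512 auth headers for one request.

    Message layout and header names are illustrative assumptions,
    not the actual Advoware signing scheme.
    """
    timestamp = str(int(time.time()))
    key = base64.b64decode(api_key_b64)
    message = f"{method}|{path}|{timestamp}".encode()
    signature = hmac.new(key, message, hashlib.sha512).hexdigest()
    return {'X-Signature': signature, 'X-Timestamp': timestamp}

headers = sign_request(base64.b64encode(b'secret').decode(), 'GET', '/employees')
assert len(headers['X-Signature']) == 128  # SHA-512 hex digest length
```

Every client would otherwise have to get each of these details exactly right per request, which is why centralizing them in the proxy pays off.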

## Technical Specification

### Config
```python
{
    'type': 'api',
    'name': 'Advoware Proxy GET',
    'description': 'Universal proxy for Advoware API (GET)',
    'path': '/advoware/proxy',
    'method': 'GET',
    'emits': [],
    'flows': ['advoware']
}
```

### Input
- **HTTP Method**: GET
- **Path**: `/advoware/proxy`
- **Query Parameters**:
  - `endpoint` (required, string): Advoware API endpoint path (without the base URL)
  - All other parameters are forwarded to Advoware

**Example**:
```
GET /advoware/proxy?endpoint=employees&limit=10&offset=0
```

### Output

**Success Response (200)**:
```json
{
    "status": 200,
    "body": {
        "result": {
            // Advoware API response
        }
    }
}
```

**Error Response (400)**:
```json
{
    "status": 400,
    "body": {
        "error": "Endpoint required as query param"
    }
}
```

**Error Response (500)**:
```json
{
    "status": 500,
    "body": {
        "error": "Internal server error",
        "details": "Error message from Advoware or network"
    }
}
```

### Events
- **Emits**: none
- **Subscribes**: none

## Behavior

### Flow
1. Extract the `endpoint` parameter from the query string
2. Validate that `endpoint` is present
3. Extract all other query parameters (except `endpoint`)
4. Create an AdvowareAPI instance
5. Call `api_call()` with the GET method
   - Internally: the token is loaded from Redis or fetched fresh
   - Internally: the HMAC signature is generated
   - Internally: the request is sent to Advoware
6. Return the response as JSON
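Steps 1-3 of the flow above amount to a small pure function: split the query into the target endpoint and the pass-through parameters, or fail fast with a 400. The function name and return shape below are illustrative, not the actual step implementation:

```python
def parse_proxy_query(query: dict):
    """Split a proxy query into (endpoint, forwarded_params) or a 400 error.

    Mirrors steps 1-3 of the flow described above; illustrative sketch only.
    """
    endpoint = query.get('endpoint')
    if not endpoint:
        # Step 2: reject early, nothing is forwarded to Advoware.
        return None, {'status': 400, 'body': {'error': 'Endpoint required as query param'}}
    # Step 3: everything except `endpoint` is passed through unchanged.
    params = {k: v for k, v in query.items() if k != 'endpoint'}
    return (endpoint, params), None

ok, err = parse_proxy_query({'endpoint': 'employees', 'limit': '10', 'offset': '0'})
assert ok == ('employees', {'limit': '10', 'offset': '0'}) and err is None

ok, err = parse_proxy_query({'limit': '10'})
assert ok is None and err['status'] == 400
```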

### Error Handling

**Missing `endpoint` parameter**:
- HTTP 400 with an error message
- The request is not forwarded to Advoware

**Advoware API error**:
- HTTP 500 with details
- The exception is logged with a stack trace
- No retry logic (fail fast)

**Token expired (401)**:
- Handled automatically by the AdvowareAPI service
- A new token is fetched and the request is retried
- Transparent to the client

**Network error**:
- HTTP 500 with details
- The exception is logged
- Timeout after `ADVOWARE_API_TIMEOUT_SECONDS` (default: 30s)

### Side Effects
- **No writes**: GET requests do not modify any data
- **Token cache**: reads from Redis DB 1 (`advoware_access_token`)
- **Logging**: writes INFO and ERROR logs to the Motia Workbench

## Dependencies

### Services
- **AdvowareAPI** (`services/advoware.py`): API client
  - `api_call(endpoint, method='GET', params, json_data=None)`
  - Handles authentication, token caching, and error handling

### Redis Keys (read via AdvowareAPI)
- **DB 1**:
  - `advoware_access_token` (string, TTL: 53 min): bearer token
  - `advoware_token_timestamp` (string, TTL: 53 min): token creation time
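The get-or-fetch pattern behind these keys can be sketched as follows. `fetch_token` is a hypothetical stand-in for the real Advoware login call, and the in-memory dict stands in for Redis DB 1; only the 53-minute TTL is taken from the keys documented above:

```python
class TokenCache:
    """Illustrative sketch of the cached-token lookup described above."""

    def __init__(self, fetch_token, ttl_seconds=53 * 60):
        self._fetch = fetch_token
        self._ttl = ttl_seconds
        self._cache = {}  # stand-in for Redis DB 1

    def get(self, now: float) -> str:
        entry = self._cache.get('advoware_access_token')
        if entry and entry['expires_at'] > now:
            return entry['value']  # cache hit: no login round-trip
        # Cache miss or expired TTL: fetch a fresh token and store it.
        token = self._fetch()
        self._cache['advoware_access_token'] = {'value': token, 'expires_at': now + self._ttl}
        return token

calls = []
cache = TokenCache(fetch_token=lambda: calls.append(1) or f"token-{len(calls)}")
assert cache.get(now=0.0) == 'token-1'
assert cache.get(now=100.0) == 'token-1'   # still cached
assert cache.get(now=4000.0) == 'token-2'  # TTL (3180 s) expired -> refetch
```

This is why the cached-token path in the Performance section is so much faster than the new-token path: a cache hit skips the login round-trip entirely.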
### Environment Variables
```bash
ADVOWARE_API_BASE_URL=https://www2.advo-net.net:90/
ADVOWARE_API_KEY=base64_encoded_key
ADVOWARE_APP_ID=your_app_id
ADVOWARE_USER=api_user
ADVOWARE_PASSWORD=secure_password
ADVOWARE_API_TIMEOUT_SECONDS=30
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
```

### External APIs
- **Advoware API**: all GET-capable endpoints
- **Rate limits**: unknown (no official documentation)

## Testing

### Manual Test
```bash
# Test employee list
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=employees&limit=5"

# Test appointments
curl -X GET "http://localhost:3000/advoware/proxy?endpoint=appointments&datum=2026-02-07"

# Test with error (missing endpoint)
curl -X GET "http://localhost:3000/advoware/proxy"
# Expected: 400 Bad Request
```

### Expected Behavior
1. **Success case**:
   - Status: 200
   - Body contains `result` with Advoware data
   - Logs show "Proxying request to Advoware: GET {endpoint}"

2. **Error case (missing endpoint)**:
   - Status: 400
   - Body: `{"error": "Endpoint required as query param"}`

3. **Error case (Advoware down)**:
   - Status: 500
   - Body: `{"error": "Internal server error", "details": "..."}`
   - Logs show the error with a stack trace

## Monitoring

### Logs
```
[INFO] Proxying request to Advoware: GET employees
[INFO] Using cached token
[ERROR] Proxy error: ConnectionTimeout
```

### Metrics (potential)
- Request count
- Response time (avg, p95, p99)
- Error rate
- Cache hit rate (token)

### Alerts
- Error rate > 10% over 5 minutes
- Response time > 30s (timeout limit)
- Redis connection failed

## Performance

### Response Time
- **Cached token**: 200-500ms (typical)
- **New token**: 1-2s (token fetch + API call)
- **Timeout**: 30s (configurable)

### Throughput
- **No rate limit** on the Motia side
- **Advoware API**: unknown rate limits
- **Bottleneck**: Advoware API response time

## Security

### Secrets
- ❌ No secrets in code
- ✅ API key via environment variable
- ✅ Token in Redis (local access only)

### Authentication
- Client → Motia: none (TODO: API key or OAuth)
- Motia → Advoware: HMAC-512 + bearer token

### Data Exposure
- GET requests only read data
- No PII in logs (only endpoint paths)
- The response contains all Advoware data (no filtering)

## Change History

| Version | Date | Change |
|---------|------|--------|
| 1.0.0 | 2024-10-24 | Initial implementation |

## AI Assistant Guidance

### Typical Changes

**1. Increase the timeout**:
```python
# In services/advoware.py, not in the step
Config.ADVOWARE_API_TIMEOUT_SECONDS = 60
```

**2. Adjust request parameters**:
```python
# Query parameters are forwarded automatically
# No code change needed
```

**3. Response transformation**:
```python
# Before returning:
result = await advoware.api_call(...)
transformed = transform_response(result)  # new function
return {'status': 200, 'body': {'result': transformed}}
```

**4. Add caching**:
```python
# Before api_call:
cache_key = f'cache:{endpoint}:{params}'
cached = redis_client.get(cache_key)
if cached:
    return {'status': 200, 'body': {'result': json.loads(cached)}}
# ... api_call ...
redis_client.set(cache_key, json.dumps(result), ex=300)
```

### Don'ts
- ❌ **No synchronous blocking calls**: always use `await`
- ❌ **No hardcoded credentials**: environment variables only
- ❌ **No unhandled exceptions**: always use try/except
- ❌ **No logging of secrets**: never log passwords or tokens

### Testing Tips
```bash
# Test with different endpoints
curl "http://localhost:3000/advoware/proxy?endpoint=employees"
curl "http://localhost:3000/advoware/proxy?endpoint=appointments"
curl "http://localhost:3000/advoware/proxy?endpoint=cases"

# Test error handling
curl "http://localhost:3000/advoware/proxy"  # Missing endpoint

# Test with many parameters
curl "http://localhost:3000/advoware/proxy?endpoint=employees&limit=100&offset=0&sortBy=name"
```

### Related Steps
- [advoware_api_proxy_post_step.md](advoware_api_proxy_post_step.md) - POST requests
- [advoware_api_proxy_put_step.md](advoware_api_proxy_put_step.md) - PUT requests
- [advoware_api_proxy_delete_step.md](advoware_api_proxy_delete_step.md) - DELETE requests

### Related Services
- [services/advoware.py](../../services/ADVOWARE_SERVICE.md) - API client implementation
@@ -0,0 +1,70 @@
---
type: step
category: api
name: Advoware Proxy POST
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest, create]
dependencies:
  - services/advoware.py
emits: []
---

# Advoware Proxy POST Step

## Purpose
Universal REST API proxy for POST requests against the Advoware API, used to create new resources.

## Differences from GET
- **Method**: POST instead of GET
- **Body**: the JSON payload from the request body is forwarded to Advoware
- **Use**: creating resources (appointments, employees, etc.)

## Input
```bash
POST /advoware/proxy?endpoint=appointments
Content-Type: application/json

{
    "datum": "2026-02-10",
    "uhrzeitVon": "09:00:00",
    "text": "Meeting"
}
```

## Output
```json
{
    "status": 200,
    "body": {
        "result": {
            "id": "12345",
            ...
        }
    }
}
```

## Key Differences from GET Step
1. The request body (`req.get('body')`) is passed to the API as `json_data`
2. Can create data in Advoware (side effects!)
3. The response often contains the newly created resource

## Testing
```bash
curl -X POST "http://localhost:3000/advoware/proxy?endpoint=appointments" \
  -H "Content-Type: application/json" \
  -d '{
    "datum": "2026-02-10",
    "uhrzeitVon": "09:00:00",
    "uhrzeitBis": "10:00:00",
    "text": "Test Meeting"
  }'
```

## AI Guidance
Identical to the GET step, except:
- Add body validation where needed
- Mind the side effects (this creates data!)
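Body validation, where needed, can be as simple as checking for required fields before forwarding the payload. The field names come from the examples above; treating them all as mandatory is an assumption for illustration, not documented Advoware behavior:

```python
def validate_appointment_body(body: dict) -> list:
    """Return a list of validation errors for an appointment payload.

    Required fields are taken from the examples above; which fields
    Advoware actually mandates is an assumption here.
    """
    errors = []
    for field in ('datum', 'uhrzeitVon', 'text'):
        if not body.get(field):
            errors.append(f"missing required field: {field}")
    return errors

ok = {'datum': '2026-02-10', 'uhrzeitVon': '09:00:00', 'text': 'Meeting'}
assert validate_appointment_body(ok) == []
assert validate_appointment_body({'datum': '2026-02-10'}) == [
    'missing required field: uhrzeitVon',
    'missing required field: text',
]
```

Returning all errors at once (rather than failing on the first) gives the client a complete picture in a single 400 response.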

See [advoware_api_proxy_get_step.md](advoware_api_proxy_get_step.md) for details.

bitbylaw/steps/advoware_proxy/advoware_api_proxy_put_step.md | 55 (new file)
@@ -0,0 +1,55 @@
---
type: step
category: api
name: Advoware Proxy PUT
version: 1.0.0
status: active
tags: [advoware, proxy, api, rest, update]
dependencies:
  - services/advoware.py
emits: []
---

# Advoware Proxy PUT Step

## Purpose
Universal REST API proxy for PUT requests against the Advoware API, used to update existing resources.

## Input
```bash
PUT /advoware/proxy?endpoint=appointments/12345
Content-Type: application/json

{
    "text": "Updated Meeting Title"
}
```

## Output
```json
{
    "status": 200,
    "body": {
        "result": {
            "id": "12345",
            "text": "Updated Meeting Title",
            ...
        }
    }
}
```

## Key Differences
- **Method**: PUT
- **Endpoint**: typically includes an ID (`resource/123`)
- **Body**: partial or full update payload
- **Side effect**: modifies an existing resource

## Testing
```bash
curl -X PUT "http://localhost:3000/advoware/proxy?endpoint=appointments/12345" \
  -H "Content-Type: application/json" \
  -d '{"text": "Updated Title"}'
```

See [advoware_api_proxy_get_step.md](advoware_api_proxy_get_step.md) for full details.
@@ -1,52 +0,0 @@
from pydantic import BaseModel
from typing import Optional
from src.services.pet_store import pet_store_service
from src.services.types import Pet

class PetRequest(BaseModel):
    name: str
    photoUrl: str

class FoodOrder(BaseModel):
    id: str
    quantity: int

class RequestBody(BaseModel):
    pet: PetRequest
    foodOrder: Optional[FoodOrder] = None

config = {
    "type": "api",
    "name": "ApiTrigger",
    "description": "basic-tutorial api trigger",
    "flows": ["basic-tutorial"],
    "method": "POST",
    "path": "/basic-tutorial",
    "bodySchema": RequestBody.model_json_schema(),
    "responseSchema": {
        200: Pet.model_json_schema(),
    },
    "emits": ["process-food-order"],
}

async def handler(req, context):
    body = req.get("body", {})
    context.logger.info("Step 01 – Processing API Step", {"body": body})

    pet = body.get("pet", {})
    food_order = body.get("foodOrder", {})

    new_pet_record = await pet_store_service.create_pet(pet)

    if food_order:
        await context.emit({
            "topic": "process-food-order",
            "data": {
                "id": food_order.get("id"),
                "quantity": food_order.get("quantity"),
                "email": "test@test.com",  # sample email
                "pet_id": new_pet_record.get("id"),
            },
        })

    return {"status": 200, "body": {**new_pet_record, "traceId": context.trace_id}}
@@ -1,69 +0,0 @@
[
  {
    "id": "step-configuration",
    "title": "Step Configuration",
    "description": "All steps should have a defined configuration, this is how you define the step's behavior and how it will be triggered.",
    "lines": ["6-30"]
  },
  {
    "id": "api-configuration",
    "title": "API Step",
    "description": "Definition of an API endpoint",
    "lines": ["23-24"]
  },
  {
    "id": "request-body",
    "title": "Request body",
    "description": "Definition of the expected request body. Motia will automatically generate types based on this schema.",
    "lines": ["6-16", "25"]
  },
  {
    "id": "response-payload",
    "title": "Response Payload",
    "description": "Definition of the expected response payload, Motia will generate the types automatically based on this schema. This is also important to create the Open API spec later.",
    "lines": ["4", "26-28"]
  },
  {
    "id": "event-driven-architecture",
    "title": "Emits",
    "description": "We can define the events that this step will emit, this is how we can trigger other Motia Steps.",
    "lines": ["29", "42-50"]
  },
  {
    "id": "handler",
    "title": "Handler",
    "description": "The handler is the function that will be executed when the step is triggered. This one receives the request body and emits events.",
    "lines": ["32-52"]
  },
  {
    "id": "logger",
    "title": "Logger",
    "description": "The logger is a utility that allows you to log messages to the console. It is available in the handler function. We encourage you to use it instead of console.log. It will automatically be tied to the trace id of the request.",
    "lines": ["34"]
  },
  {
    "id": "http-response",
    "title": "HTTP Response",
    "description": "The handler can return a response to the client. This is how we can return a response to the client. It must comply with the responseSchema defined in the step configuration.",
    "lines": ["52"]
  }
]
@@ -1,40 +0,0 @@
from pydantic import BaseModel
from typing import Dict, Any
import re

class InputSchema(BaseModel):
    template_id: str
    email: str
    template_data: Dict[str, Any]

config = {
    "type": "event",
    "name": "Notification",
    "description": "Checks a state change",
    "flows": ["basic-tutorial"],
    "subscribes": ["notification"],
    "emits": [],
    "input": InputSchema.model_json_schema(),
}

async def handler(input_data, context):
    email = input_data.get("email")
    template_id = input_data.get("template_id")
    template_data = input_data.get("template_data")

    redacted_email = re.sub(r'(?<=.{2}).(?=.*@)', '*', email)

    context.logger.info("Processing Notification", {
        "template_id": template_id,
        "template_data": template_data,
        "email": redacted_email,
    })

    # This represents a call to some sort of
    # notification service to indicate that a
    # new order has been placed
    context.logger.info("New notification sent", {
        "template_id": template_id,
        "email": redacted_email,
        "template_data": template_data,
    })
@@ -1,50 +0,0 @@
from pydantic import BaseModel
from datetime import datetime
from src.services.pet_store import pet_store_service

class InputSchema(BaseModel):
    id: str
    email: str
    quantity: int
    pet_id: int

config = {
    "type": "event",
    "name": "ProcessFoodOrder",
    "description": "basic-tutorial event step, demonstrates how to consume an event from a topic and persist data in state",
    "flows": ["basic-tutorial"],
    "subscribes": ["process-food-order"],
    "emits": ["notification"],
    "input": InputSchema.model_json_schema(),
}

async def handler(input_data, context):
    context.logger.info("Step 02 – Process food order", {"input": input_data})

    order = await pet_store_service.create_order({
        "id": input_data.get("id"),
        "quantity": input_data.get("quantity"),
        "pet_id": input_data.get("pet_id"),
        "email": input_data.get("email"),
        "ship_date": datetime.now().isoformat(),
        "status": "placed",
    })

    context.logger.info("Order created", {"order": order})

    await context.state.set("orders_python", order.get("id"), order)

    await context.emit({
        "topic": "notification",
        "data": {
            "email": input_data["email"],
            "template_id": "new-order",
            "template_data": {
                "status": order.get("status"),
                "ship_date": order.get("shipDate"),
                "id": order.get("id"),
                "pet_id": order.get("petId"),
                "quantity": order.get("quantity"),
            },
        },
    })
@@ -1,68 +0,0 @@
[
  {
    "id": "step-configuration",
    "title": "Step Configuration",
    "description": "All steps should have a defined configuration, this is how you define the step's behavior and how it will be triggered.",
    "lines": ["5-19"]
  },
  {
    "id": "event-configuration",
    "title": "Event Step",
    "description": "Definition of an event step that subscribes to specific topics",
    "lines": ["12", "15-16"]
  },
  {
    "id": "input-schema",
    "title": "Input Schema",
    "description": "Definition of the expected input data structure from the subscribed topic. Motia will automatically generate types based on this schema.",
    "lines": ["5-9", "17"]
  },
  {
    "id": "event-emits",
    "title": "Emits",
    "description": "We can define the events that this step will emit, triggering other Motia Steps.",
    "lines": ["17"]
  },
  {
    "id": "handler",
    "title": "Handler",
    "description": "The handler is the function that will be executed when the step receives an event from its subscribed topic. It processes the input data and can emit new events.",
    "lines": ["21-50"]
  },
  {
    "id": "state",
    "title": "State Management",
    "description": "The handler demonstrates state management by storing order data that can be accessed by other steps.",
    "lines": ["35"]
  },
  {
    "id": "event-emission",
    "title": "Event Emission",
    "description": "After processing the order, the handler emits a new event to notify other steps about the new order.",
    "lines": ["37-50"]
  },
  {
    "id": "logger",
    "title": "Logger",
    "description": "The logger is a utility that allows you to log messages to the console. It is available in the handler function and automatically ties to the trace id of the request.",
    "lines": ["22"]
  }
]
@@ -1,39 +0,0 @@
from datetime import datetime, timezone


config = {
    "type": "cron",
    "cron": "0 0 * * 1",  # run once every Monday at midnight
    "name": "StateAuditJob",
    "description": "Checks the state for orders that are not complete and have a ship date in the past",
    "emits": ["notification"],
    "flows": ["basic-tutorial"],
}


async def handler(context):
    state_value = await context.state.get_group("orders_python")

    for item in state_value:
        # check if current date is after item.ship_date
        current_date = datetime.now(timezone.utc)
        ship_date = datetime.fromisoformat(item.get("shipDate", "").replace('Z', '+00:00'))

        if not item.get("complete", False) and current_date > ship_date:
            context.logger.warn("Order is not complete and ship date is past", {
                "order_id": item.get("id"),
                "ship_date": item.get("shipDate"),
                "complete": item.get("complete", False),
            })

            await context.emit({
                "topic": "notification",
                "data": {
                    "email": "test@test.com",
                    "template_id": "order-audit-warning",
                    "template_data": {
                        "order_id": item.get("id"),
                        "status": item.get("status"),
                        "ship_date": item.get("shipDate"),
                        "message": "Order is not complete and ship date is past",
                    },
                },
            })
@@ -1,26 +0,0 @@
[
  {
    "id": "step-configuration",
    "title": "Step Configuration",
    "description": "All steps should have a defined configuration, this is how you define the step's behavior and how it will be triggered.",
    "lines": ["3-10"]
  },
  {
    "id": "cron-configuration",
    "title": "Cron Configuration",
    "description": "Cron steps require a specific configuration structure with the 'type' field set to 'cron' and a valid cron expression.",
    "lines": ["4-5"]
  },
  {
    "id": "handler",
    "title": "Cron Step Handler",
    "description": "The Cron step handler only receives one argument.",
    "lines": ["12-39"]
  }
]
92 bitbylaw/steps/vmh/beteiligte_sync_event_step.md Normal file
@@ -0,0 +1,92 @@
---
type: step
category: event
name: VMH Beteiligte Sync
version: 1.0.0
status: placeholder
tags: [sync, vmh, beteiligte, event, todo]
dependencies: []
emits: []
subscribes: [vmh.beteiligte.create, vmh.beteiligte.update, vmh.beteiligte.delete]
---

# VMH Beteiligte Sync Event Step

## Status
⚠️ **PLACEHOLDER** - implementation still pending

## Purpose
Processes create/update/delete events for Beteiligte entities and synchronizes them between EspoCRM and the target system.

## Config
```python
{
    'type': 'event',
    'name': 'VMH Beteiligte Sync',
    'subscribes': [
        'vmh.beteiligte.create',
        'vmh.beteiligte.update',
        'vmh.beteiligte.delete'
    ],
    'emits': [],
    'flows': ['vmh']
}
```

## Planned Behavior

**Input events**:
```json
{
  "topic": "vmh.beteiligte.create",
  "data": {
    "entity_id": "123",
    "action": "create",
    "source": "webhook",
    "timestamp": "..."
  }
}
```

**Processing**:
1. Fetch full entity data from EspoCRM
2. Map to target system format
3. Create/Update/Delete in target system
4. Remove ID from Redis pending set
5. Log success/failure
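Since this step is still a placeholder, the processing above can only be sketched. All names here (`espocrm_fetch`, `map_to_target`, `target_upsert`, the field names in the mapping) are hypothetical stand-ins, not existing project code:

```python
import asyncio

processed = []  # records what would be written to the target system


async def espocrm_fetch(entity_id):
    # Step 1 (stub): fetch the full entity from EspoCRM.
    return {"id": entity_id, "name": "Max Mustermann"}


def map_to_target(entity):
    # Step 2 (assumed field names): map EspoCRM fields to the target format.
    return {"external_id": entity["id"], "display_name": entity["name"]}


async def target_upsert(payload):
    # Step 3 (stub): create/update in the target system.
    processed.append(payload)


async def handler(input_data):
    entity_id = input_data["entity_id"]
    entity = await espocrm_fetch(entity_id)   # 1. fetch full entity data
    payload = map_to_target(entity)           # 2. map to target format
    await target_upsert(payload)              # 3. write to target system
    # 4. here: redis.srem('vmh:beteiligte:create_pending', entity_id)
    return payload                            # 5. caller logs success/failure


result = asyncio.run(handler({"entity_id": "123", "action": "create"}))
```

The real implementation would replace the stubs with the EspoCRM API client and target-system integration listed in the tasks below.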
## Implementation Tasks

- [ ] Build an EspoCRM API client
- [ ] Define the entity mapping
- [ ] Integrate the target system
- [ ] Error handling & retry logic
- [ ] Redis cleanup (remove from pending sets)
- [ ] Logging & monitoring

## Redis Cleanup
After successful processing:
```python
redis.srem('vmh:beteiligte:create_pending', entity_id)
redis.srem('vmh:beteiligte:update_pending', entity_id)
redis.srem('vmh:beteiligte:delete_pending', entity_id)
```
## Testing (Future)
```bash
# Manually emit event for testing
# (via Motia CLI or test script)
```

## AI Guidance
When implementing this step:
1. Create an EspoCRM API client in `services/`
2. Define the mapping logic
3. Implement retry logic with exponential backoff
4. Clean up the Redis sets after processing
5. Log all operations for auditing
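Point 3 could look roughly like the following; `with_retry` and `flaky_sync` are illustrative names, not existing project code:

```python
import asyncio
import random


async def with_retry(op, attempts=5, base_delay=0.01):
    """Retry an async operation with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return await op()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the caller
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            await asyncio.sleep(delay)


calls = {"n": 0}


async def flaky_sync():
    # Fails twice, then succeeds - simulates a transient target-system error.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "synced"


result = asyncio.run(with_retry(flaky_sync))
```

In production the `base_delay` would be on the order of seconds, and only transient errors (timeouts, 5xx) should be retried.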
## Related
- [webhook/beteiligte_create_api_step.md](webhook/beteiligte_create_api_step.md) - Emits create events
- [webhook/beteiligte_update_api_step.md](webhook/beteiligte_update_api_step.md) - Emits update events
- [webhook/beteiligte_delete_api_step.md](webhook/beteiligte_delete_api_step.md) - Emits delete events
124 bitbylaw/steps/vmh/webhook/beteiligte_create_api_step.md Normal file
@@ -0,0 +1,124 @@
|
---
type: step
category: api
name: VMH Webhook Beteiligte Create
version: 1.0.0
status: active
tags: [webhook, espocrm, vmh, beteiligte, create]
dependencies:
  - redis
emits: [vmh.beteiligte.create]
---

# VMH Webhook Beteiligte Create Step

## Purpose
Receives create webhooks from EspoCRM for new Beteiligte entities. Deduplicates them via Redis and emits events for asynchronous processing.

## Config
```python
{
    'type': 'api',
    'name': 'VMH Webhook Beteiligte Create',
    'path': '/vmh/webhook/beteiligte/create',
    'method': 'POST',
    'emits': ['vmh.beteiligte.create'],
    'flows': ['vmh']
}
```

## Input
```bash
POST /vmh/webhook/beteiligte/create
Content-Type: application/json

[
  {
    "id": "entity-123",
    "name": "Max Mustermann",
    "createdAt": "2026-02-07T10:00:00Z"
  },
  {
    "id": "entity-456",
    "name": "Maria Schmidt"
  }
]
```

**Format**: array of entities (batch support)

## Output
```json
{
  "status": "received",
  "action": "create",
  "new_ids_count": 2,
  "total_ids_in_batch": 2
}
```

## Behavior

1. **Extract IDs** from all entities in the batch
2. **Redis deduplication**:
   ```python
   pending_key = 'vmh:beteiligte:create_pending'
   existing_ids = redis.smembers(pending_key)
   new_ids = input_ids - existing_ids
   redis.sadd(pending_key, *new_ids)
   ```
3. **Emit events** only for the new IDs:
   ```python
   for entity_id in new_ids:
       await context.emit({
           'topic': 'vmh.beteiligte.create',
           'data': {
               'entity_id': entity_id,
               'action': 'create',
               'source': 'webhook',
               'timestamp': timestamp
           }
       })
   ```

## Redis Keys
- `vmh:beteiligte:create_pending` (SET): IDs in the create queue
- No TTL (kept until processed)

## Deduplication Logic
**Problem**: EspoCRM may deliver the same webhook multiple times.
**Solution**: a Redis SET stores all pending IDs.
- New IDs → events are emitted
- IDs already present → skipped
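The same set-difference deduplication can be exercised without a running Redis; here a plain Python set stands in for the `SMEMBERS`/`SADD` pair (a sketch, not the step's actual code):

```python
def dedup(pending: set, batch: list) -> set:
    """Return only the IDs not yet pending, and mark them as pending.

    `pending` stands in for the Redis SET vmh:beteiligte:create_pending;
    with real Redis this would be SMEMBERS followed by SADD.
    """
    input_ids = {entity["id"] for entity in batch}
    new_ids = input_ids - pending   # skip IDs already queued
    pending |= new_ids              # enqueue the rest
    return new_ids


pending = set()
first = dedup(pending, [{"id": "entity-123"}, {"id": "entity-456"}])
# Redelivery of an already-queued entity: nothing new is emitted.
second = dedup(pending, [{"id": "entity-123"}])
```

`first` contains both IDs, `second` is empty, which is exactly the behavior the webhook step relies on for duplicate deliveries.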
## Testing
```bash
# Test webhook
curl -X POST "http://localhost:3000/vmh/webhook/beteiligte/create" \
  -H "Content-Type: application/json" \
  -d '[{"id": "test-123", "name": "Test"}]'

# Check Redis
redis-cli -n 1 SMEMBERS vmh:beteiligte:create_pending

# Clear Redis (testing)
redis-cli -n 1 DEL vmh:beteiligte:create_pending
```

## Error Handling
- Invalid JSON: 400 error
- Redis unavailable: log, but do not crash (may lead to duplicate events)
- Event emission error: log and continue

## Monitoring
```
[INFO] VMH Webhook Beteiligte Create empfangen
[INFO] Create Entity ID gefunden: entity-123
[INFO] 2 neue IDs zur Create-Sync-Queue hinzugefügt
[INFO] Create-Event emittiert für ID: entity-123
```

## Related Steps
- [beteiligte_update_api_step.md](beteiligte_update_api_step.md) - Update webhooks
- [beteiligte_delete_api_step.md](beteiligte_delete_api_step.md) - Delete webhooks
- [beteiligte_sync_event_step.md](../beteiligte_sync_event_step.md) - Consumes events
4 bitbylaw/types.d.ts vendored
@@ -16,10 +16,6 @@ declare module 'motia' {
  'VMH Webhook Beteiligte Update': ApiRouteHandler<Record<string, unknown>, unknown, { topic: 'vmh.beteiligte.update'; data: never }>
  'VMH Webhook Beteiligte Delete': ApiRouteHandler<Record<string, unknown>, unknown, { topic: 'vmh.beteiligte.delete'; data: never }>
  'VMH Webhook Beteiligte Create': ApiRouteHandler<Record<string, unknown>, unknown, { topic: 'vmh.beteiligte.create'; data: never }>
  'StateAuditJob': CronHandler<{ topic: 'notification'; data: { template_id: string; email: string; template_data: Record<string, unknown> } }>
  'ProcessFoodOrder': EventHandler<{ id: string; email: string; quantity: unknown; pet_id: unknown }, { topic: 'notification'; data: { template_id: string; email: string; template_data: Record<string, unknown> } }>
  'Notification': EventHandler<{ template_id: string; email: string; template_data: Record<string, unknown> }, never>
  'ApiTrigger': ApiRouteHandler<{ pet: unknown; foodOrder?: unknown | unknown }, ApiResponse<200, { id: unknown; name: string; photoUrl: string }>, { topic: 'process-food-order'; data: { id: string; email: string; quantity: unknown; pet_id: unknown } }>
  'Advoware Proxy PUT': ApiRouteHandler<Record<string, unknown>, unknown, never>
  'Advoware Proxy POST': ApiRouteHandler<Record<string, unknown>, unknown, never>
  'Advoware Proxy GET': ApiRouteHandler<Record<string, unknown>, unknown, never>