Add comprehensive test scripts for thumbnail generation and xAI collections API
- Implemented `test_thumbnail_generation.py` to validate the complete flow of document thumbnail generation in EspoCRM, including document creation, file upload, webhook triggering, and preview verification.
- Created `test_xai_collections_api.py` to test critical operations of the xAI Collections API, covering file uploads, collection CRUD operations, document management, and response validation.
- Both scripts include detailed logging for success and error states, ensuring robust testing and easier debugging.
@@ -1,320 +0,0 @@
# Complete Migration Analysis

## Motia v0.17 → Motia III v1.0-RC

**Date:** March 1, 2026
**Status:** 🎉 **100% COMPLETE - ALL PHASES FINISHED!** 🎉

---

## ✅ MIGRATED - Production-Ready

### 1. Steps (21 of 21 Steps - 100% Complete!)
#### Phase 1: Advoware Proxy (4 Steps)
- ✅ [`advoware_api_proxy_get_step.py`](steps/advoware_proxy/advoware_api_proxy_get_step.py) - GET proxy
- ✅ [`advoware_api_proxy_post_step.py`](steps/advoware_proxy/advoware_api_proxy_post_step.py) - POST proxy
- ✅ [`advoware_api_proxy_put_step.py`](steps/advoware_proxy/advoware_api_proxy_put_step.py) - PUT proxy
- ✅ [`advoware_api_proxy_delete_step.py`](steps/advoware_proxy/advoware_api_proxy_delete_step.py) - DELETE proxy

#### Phase 2: VMH Webhooks (6 Steps)
- ✅ [`beteiligte_create_api_step.py`](steps/vmh/webhook/beteiligte_create_api_step.py) - POST /vmh/webhook/beteiligte/create
- ✅ [`beteiligte_update_api_step.py`](steps/vmh/webhook/beteiligte_update_api_step.py) - POST /vmh/webhook/beteiligte/update
- ✅ [`beteiligte_delete_api_step.py`](steps/vmh/webhook/beteiligte_delete_api_step.py) - POST /vmh/webhook/beteiligte/delete
- ✅ [`bankverbindungen_create_api_step.py`](steps/vmh/webhook/bankverbindungen_create_api_step.py) - POST /vmh/webhook/bankverbindungen/create
- ✅ [`bankverbindungen_update_api_step.py`](steps/vmh/webhook/bankverbindungen_update_api_step.py) - POST /vmh/webhook/bankverbindungen/update
- ✅ [`bankverbindungen_delete_api_step.py`](steps/vmh/webhook/bankverbindungen_delete_api_step.py) - POST /vmh/webhook/bankverbindungen/delete
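As an illustration of what these webhook steps do (batch support and deduplication, per the architecture notes in this document), here is a hedged sketch using a stand-in context object; the real steps use the motia SDK types documented later in this file, and all payload field names here are assumptions:

```python
import asyncio

class FakeCtx:
    """Stand-in for motia's FlowContext, only so the sketch runs standalone."""
    def __init__(self):
        self.enqueued = []
    async def enqueue(self, event):
        self.enqueued.append(event)

async def beteiligte_create_handler(body, ctx) -> dict:
    """Validate the EspoCRM webhook payload and hand each entity
    off to the queue for the sync event step."""
    entities = body if isinstance(body, list) else [body]  # batch & single support
    seen = set()
    for entity in entities:
        entity_id = entity.get("id")
        if not entity_id or entity_id in seen:
            continue  # deduplication within one webhook call
        seen.add(entity_id)
        await ctx.enqueue({"topic": "vmh.beteiligte.create",
                           "data": {"entity_id": entity_id}})
    return {"status": 200, "body": {"queued": len(seen)}}

ctx = FakeCtx()
asyncio.run(beteiligte_create_handler([{"id": "a1"}, {"id": "a1"}, {"id": "b2"}], ctx))
```

The duplicate `a1` is dropped before anything reaches the queue, so a retried webhook delivery does not double-sync an entity.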

#### Phase 3: VMH Sync Handlers (3 Steps)
- ✅ [`beteiligte_sync_event_step.py`](steps/vmh/beteiligte_sync_event_step.py) - subscriber for queue events (with Kommunikation integration!)
- ✅ [`bankverbindungen_sync_event_step.py`](steps/vmh/bankverbindungen_sync_event_step.py) - subscriber for queue events
- ✅ [`beteiligte_sync_cron_step.py`](steps/vmh/beteiligte_sync_cron_step.py) - cron job every 15 min.
---

### 2. Services (11 modules, 100% complete)

#### Core APIs
- ✅ [`advoware.py`](services/advoware.py) (310 lines) - Advoware API client with token auth
- ✅ [`advoware_service.py`](services/advoware_service.py) (179 lines) - high-level Advoware service
- ✅ [`espocrm.py`](services/espocrm.py) (293 lines) - EspoCRM API client
#### Mapper & Sync Utils
- ✅ [`espocrm_mapper.py`](services/espocrm_mapper.py) (663 lines) - Beteiligte mapping
- ✅ [`bankverbindungen_mapper.py`](services/bankverbindungen_mapper.py) (141 lines) - Bankverbindungen mapping
- ✅ [`beteiligte_sync_utils.py`](services/beteiligte_sync_utils.py) (663 lines) - distributed locking, retry logic
- ✅ [`notification_utils.py`](services/notification_utils.py) (200 lines) - in-app notifications

#### Phase 4: Kommunikation Sync
- ✅ [`kommunikation_mapper.py`](services/kommunikation_mapper.py) (334 lines) - email/phone mapping with Base64 markers
- ✅ [`kommunikation_sync_utils.py`](services/kommunikation_sync_utils.py) (999 lines) - bidirectional sync with 3-way diffing

#### Phase 5: Adressen Sync (2 modules)
- ✅ [`adressen_mapper.py`](services/adressen_mapper.py) (267 lines) - Adressen mapping
- ✅ [`adressen_sync.py`](services/adressen_sync.py) (697 lines) - Adressen sync with READ-ONLY detection
#### Phase 6: Google Calendar Sync (4 Steps + Utils)
- ✅ [`calendar_sync_cron_step.py`](steps/advoware_cal_sync/calendar_sync_cron_step.py) - cron trigger every 15 min.
- ✅ [`calendar_sync_all_step.py`](steps/advoware_cal_sync/calendar_sync_all_step.py) - bulk sync with Redis prioritization
- ✅ [`calendar_sync_event_step.py`](steps/advoware_cal_sync/calendar_sync_event_step.py) - **1053 lines!** Main sync handler
- ✅ [`calendar_sync_api_step.py`](steps/advoware_cal_sync/calendar_sync_api_step.py) - HTTP API for manual triggers

---

### 3. Queue Events (9 Topics - 100% Complete!)
#### VMH Beteiligte
- ✅ `vmh.beteiligte.create` - Webhook → Sync Handler
- ✅ `vmh.beteiligte.update` - Webhook → Sync Handler
- ✅ `vmh.beteiligte.delete` - Webhook → Sync Handler
- ✅ `vmh.beteiligte.sync_check` - Cron → Sync Handler
#### VMH Bankverbindungen
- ✅ `vmh.bankverbindungen.create` - Webhook → Sync Handler
- ✅ `vmh.bankverbindungen.update` - Webhook → Sync Handler
- ✅ `vmh.bankverbindungen.delete` - Webhook → Sync Handler
#### Calendar Sync
- ✅ `calendar_sync_all` - Cron/API → All Step → Employee Events
- ✅ `calendar_sync_employee` - All/API → Event Step (Main Sync Logic)
---

### 4. HTTP Endpoints (14 Endpoints, 100% complete)

#### Advoware Proxy (4 Endpoints)
- ✅ `GET /advoware/proxy?path=...` - Advoware API proxy
- ✅ `POST /advoware/proxy?path=...` - Advoware API proxy
- ✅ `PUT /advoware/proxy?path=...` - Advoware API proxy
- ✅ `DELETE /advoware/proxy?path=...` - Advoware API proxy

#### VMH Webhooks - Beteiligte (3 Endpoints)
- ✅ `POST /vmh/webhook/beteiligte/create` - EspoCRM webhook handler
- ✅ `POST /vmh/webhook/beteiligte/update` - EspoCRM webhook handler
- ✅ `POST /vmh/webhook/beteiligte/delete` - EspoCRM webhook handler

#### VMH Webhooks - Bankverbindungen (3 Endpoints)
- ✅ `POST /vmh/webhook/bankverbindungen/create` - EspoCRM webhook handler
- ✅ `POST /vmh/webhook/bankverbindungen/update` - EspoCRM webhook handler
- ✅ `POST /vmh/webhook/bankverbindungen/delete` - EspoCRM webhook handler

#### Calendar Sync (1 Endpoint)
- ✅ `POST /advoware/calendar/sync` - manual calendar sync trigger (kuerzel or "ALL")
#### Example Ticketing (6 Endpoints - Demo)
- ✅ `POST /tickets` - Create Ticket
- ✅ `GET /tickets` - List Tickets
- ✅ `POST /tickets/{id}/triage` - Triage
- ✅ `POST /tickets/{id}/escalate` - Escalate
- ✅ `POST /tickets/{id}/notify` - Notify Customer
- ✅ Cron: SLA Monitor
---

### 5. Cron Jobs (2 Jobs - 100% Complete!)

- ✅ **VMH Beteiligte Sync Cron** (every 15 min.)
  - Finds entities with status `pending_sync`, `dirty`, `failed`
  - Auto-reset for `permanently_failed` after 24h
  - Finds `clean` entities not synced for > 24h
  - Emits `vmh.beteiligte.sync_check` events
- ✅ **Calendar Sync Cron** (every 15 min.)
  - Emits `calendar_sync_all` events
  - Triggers bulk sync for all or prioritized employees
  - Redis-based prioritization (oldest first)

---

### 6. Dependencies (pyproject.toml updated)
```toml
dependencies = [
    "asyncpg>=0.29.0",                    # ✅ NEW for Calendar Sync (PostgreSQL)
    "google-api-python-client>=2.100.0",  # ✅ NEW for Calendar Sync
    "google-auth>=2.23.0",                # ✅ NEW for Calendar Sync
    "backoff>=2.2.1",                     # ✅ NEW for Calendar Sync (retry logic)
]
```

---

## ❌ NOT MIGRATED → ALL MIGRATED! 🎉

~~### Phase 6: Google Calendar Sync (4 Steps)~~

**Status:** ✅ **FULLY MIGRATED!** (March 1, 2026)

- ✅ `calendar_sync_cron_step.py` - cron trigger (every 15 min.)
- ✅ `calendar_sync_all_step.py` - bulk sync handler
- ✅ `calendar_sync_event_step.py` - queue event handler (**1053 lines!**)
- ✅ `calendar_sync_api_step.py` - HTTP API for manual triggers
- ✅ `calendar_sync_utils.py` - helper functions

**Dependencies (ALL installed):**
- ✅ `google-api-python-client` - Google Calendar API
- ✅ `google-auth` - Google OAuth2
- ✅ `asyncpg` - PostgreSQL connection
- ✅ `backoff` - retry/backoff logic

**Migration completed in:** ~4 hours (instead of the estimated 3-5 days)
---

### Root-Level Steps (Test/Specialized Logic)

**Status:** Deliberately NOT migrated (not part of the core functionality)

- ❌ `/opt/motia-iii/old-motia/steps/crm-bbl-vmh-reset-nextcall_step.py` (96 lines)
  - **Purpose:** CVmhErstgespraech status-check cron job
  - **Reason:** specialized business logic, not part of the core sync infrastructure
  - **Status:** can be migrated later if needed

- ❌ `/opt/motia-iii/old-motia/steps/event_step.py` (test/demo)
- ❌ `/opt/motia-iii/old-motia/steps/hello_step.py` (test/demo)
---

## 📊 Migration Statistics

| Category | Migrated | Open | Total | Status |
|----------|----------|------|-------|--------|
| **Steps** | 21 | 0 | 21 | **100%** ✅ |
| **Service Modules** | 11 | 0 | 11 | **100%** ✅ |
| **Queue Events** | 9 | 0 | 9 | **100%** ✅ |
| **HTTP Endpoints** | 14 | 0 | 14 | **100%** ✅ |
| **Cron Jobs** | 2 | 0 | 2 | **100%** ✅ |
| **Code (lines)** | ~9,000 | 0 | ~9,000 | **100%** ✅ |

---

## 🎯 Functionality Matrix

| Feature | Old-Motia | Motia III | Status |
|---------|-----------|-----------|--------|
| **Advoware Proxy API** | ✅ | ✅ | ✅ COMPLETE |
| **VMH Beteiligte Sync** | ✅ | ✅ | ✅ COMPLETE |
| **VMH Bankverbindungen Sync** | ✅ | ✅ | ✅ COMPLETE |
| **Kommunikation Sync (Email/Phone)** | ✅ | ✅ | ✅ COMPLETE |
| **Adressen Sync** | ✅ | ✅ | ✅ COMPLETE |
| **EspoCRM Webhooks** | ✅ | ✅ | ✅ COMPLETE |
| **Distributed Locking** | ✅ | ✅ | ✅ COMPLETE |
| **Retry Logic & Backoff** | ✅ | ✅ | ✅ COMPLETE |
| **Notifications** | ✅ | ✅ | ✅ COMPLETE |
| **Sync Validation** | ✅ | ✅ | ✅ COMPLETE |
| **Cron-based Auto-Retry** | ✅ | ✅ | ✅ COMPLETE |
| **Google Calendar Sync** | ✅ | ✅ | ✅ **COMPLETE** |

---

## 🏆 Migration successfully completed!

**All 21 production steps, 11 service modules, 9 queue events, 14 HTTP endpoints, and 2 cron jobs were migrated successfully!**

---

## 🔄 Sync Architecture Overview

```
┌─────────────────┐
│   EspoCRM API   │
└────────┬────────┘
         │ Webhooks
         ▼
┌─────────────────────────────────────┐
│  VMH Webhook Steps (6 Endpoints)    │
│  • Batch & Single Entity Support    │
│  • Deduplication                    │
└────────┬────────────────────────────┘
         │ Emits Queue Events
         ▼
┌─────────────────────────────────────┐
│  Queue System (Redis/Builtin)       │
│  • vmh.beteiligte.*                 │
│  • vmh.bankverbindungen.*           │
└────────┬────────────────────────────┘
         │
         ▼
┌─────────────────────────────────────┐
│  Sync Event Handlers (3 Steps)      │
│  • Distributed Locking (Redis)      │
│  • Retry Logic & Backoff            │
│  • Conflict Resolution              │
└────────┬────────────────────────────┘
         │
         ├──► Stammdaten Sync
         │    (espocrm_mapper.py)
         │
         ├──► Kommunikation Sync ✅ NEW!
         │    (kommunikation_sync_utils.py)
         │    • 3-Way Diffing
         │    • Bidirectional
         │    • Slot-Management
         │
         └──► Adressen Sync ✅ NEW!
              (adressen_sync.py)
              • CREATE/UPDATE/DELETE
              • READ-ONLY Detection
         │
         ▼
┌─────────────────────────────────────┐
│  Advoware API (advoware.py)         │
│  • Token-based Auth                 │
│  • HMAC Signing                     │
└─────────────────────────────────────┘

┌──────────────────┐
│ Cron Job (15min) │
└────────┬─────────┘
         │
         ▼ Emits sync_check Events
┌─────────────────────────┐
│  Auto-Retry & Cleanup   │
│  • pending_sync         │
│  • dirty                │
│  • failed → retry       │
│  • permanently_failed   │
│    → auto-reset (24h)   │
└─────────────────────────┘
```
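The "Distributed Locking (Redis)" box above boils down to an atomic `SET NX EX` with a TTL, so a crashed worker can never hold a lock forever. A minimal sketch of the pattern, using an in-memory stand-in for Redis so it runs without a server (the real handlers use a `redis` client; key names follow the `sync_lock:` convention from this document):

```python
import time

class FakeRedis:
    """In-memory stand-in for redis.Redis, only for illustration."""
    def __init__(self):
        self._data = {}  # key -> (value, expires_at)

    def set(self, key, value, nx=False, ex=None):
        now = time.monotonic()
        entry = self._data.get(key)
        if nx and entry is not None and entry[1] > now:
            return None  # key still live: NX write fails
        ttl = ex if ex is not None else float("inf")
        self._data[key] = (value, now + ttl)
        return True

    def delete(self, key):
        self._data.pop(key, None)

LOCK_TTL_SECONDS = 900  # 15 minutes: lock expires even if the worker dies

def acquire_sync_lock(r, entity_id: str) -> bool:
    # SET NX EX is a single atomic operation: exactly one worker wins
    return bool(r.set(f"sync_lock:document:{entity_id}", "locked",
                      nx=True, ex=LOCK_TTL_SECONDS))

def release_sync_lock(r, entity_id: str) -> None:
    r.delete(f"sync_lock:document:{entity_id}")
```

A second worker attempting the same entity simply gets `False` and skips the sync instead of racing.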
---

## ✅ CONCLUSION

**The entire core functionality (including Google Calendar) has been successfully migrated!**

### Production-Ready Features:
1. ✅ Full Advoware ↔ EspoCRM synchronization
2. ✅ Bidirectional communication data (email/phone)
3. ✅ Bidirectional addresses
4. ✅ Webhook-based event processing
5. ✅ Automatic retry system
6. ✅ Distributed locking
7. ✅ Conflict detection & resolution
### Code Quality:
- ✅ No compile errors
- ✅ Motia III API used correctly
- ✅ All dependencies present
- ✅ Type hints (Pydantic models)
- ✅ Error handling & logging
### Deployment:
- ✅ All steps registered
- ✅ Queue system configured
- ✅ Cron jobs active
- ✅ Redis integration

**The system is ready for production! 🚀**
@@ -1,276 +0,0 @@
# Motia Migration Status

**🎉 MIGRATION 100% COMPLETE**

> 📋 Detailed analysis: [MIGRATION_COMPLETE_ANALYSIS.md](MIGRATION_COMPLETE_ANALYSIS.md)

## Quick Stats
- ✅ **21 of 21 steps** migrated (100%)
- ✅ **11 of 11 service modules** migrated (100%)
- ✅ **~9,000 lines of code** migrated (100%)
- ✅ **14 HTTP endpoints** active
- ✅ **9 queue events** configured
- ✅ **2 cron jobs** (VMH: every 15 min., Calendar: every 15 min.)
---

## Overview
Migrating from **old-motia v0.17** (Node.js + Python hybrid) to **Motia III v1.0-RC** (pure Python).
## Old System Analysis

### Location
- Old system: `/opt/motia-iii/old-motia/`
- Old project dir: `/opt/motia-iii/old-motia/bitbylaw/`
### Steps Found in Old System

#### Root Steps (`/opt/motia-iii/old-motia/steps/`)
1. `crm-bbl-vmh-reset-nextcall_step.py`
2. `event_step.py`
3. `hello_step.py`
#### BitByLaw Steps (`/opt/motia-iii/old-motia/bitbylaw/steps/`)

**Advoware Calendar Sync** (`advoware_cal_sync/`):
- `calendar_sync_all_step.py`
- `calendar_sync_api_step.py`
- `calendar_sync_cron_step.py`
- `calendar_sync_event_step.py`
- `audit_calendar_sync.py`
- `calendar_sync_utils.py` (utility module)
**Advoware Proxy** (`advoware_proxy/`):
- `advoware_api_proxy_get_step.py`
- `advoware_api_proxy_post_step.py`
- `advoware_api_proxy_put_step.py`
- `advoware_api_proxy_delete_step.py`
**VMH Integration** (`vmh/`):
- `beteiligte_sync_cron_step.py`
- `beteiligte_sync_event_step.py`
- `bankverbindungen_sync_event_step.py`
- `webhook/bankverbindungen_create_api_step.py`
- `webhook/bankverbindungen_update_api_step.py`
- `webhook/bankverbindungen_delete_api_step.py`
- `webhook/beteiligte_create_api_step.py`
- `webhook/beteiligte_update_api_step.py`
- `webhook/beteiligte_delete_api_step.py`
### Supporting Services/Modules

From `/opt/motia-iii/old-motia/bitbylaw/`:
- `services/advoware.py` - Advoware API wrapper
- `config.py` - Configuration module
- Dependencies: PostgreSQL, Redis, Google Calendar API

## Migration Changes Required

### Key Structural Changes

#### 1. Config Format
```python
# OLD
config = {
    "type": "api",  # or "event", "cron"
    "name": "StepName",
    "path": "/endpoint",
    "method": "GET",
    "cron": "0 5 * * *",
    "subscribes": ["topic"],
    "emits": ["other-topic"]
}

# NEW
from motia import http, queue, cron

config = {
    "name": "StepName",
    "flows": ["flow-name"],
    "triggers": [
        http("GET", "/endpoint")
        # or queue("topic", input=schema)
        # or cron("0 0 5 * * *")  # 6-field!
    ],
    "enqueues": ["other-topic"]
}
```
#### 2. Handler Signature
```python
# OLD - API
async def handler(req, context):
    body = req.get('body', {})
    await context.emit({"topic": "x", "data": {...}})
    return {"status": 200, "body": {...}}

# NEW - API
from motia import ApiRequest, ApiResponse, FlowContext

async def handler(request: ApiRequest, ctx: FlowContext) -> ApiResponse:
    body = request.body
    await ctx.enqueue({"topic": "x", "data": {...}})
    return ApiResponse(status=200, body={...})

# OLD - Event/Queue
async def handler(data, context):
    context.logger.info(data['field'])

# NEW - Queue
async def handler(input_data: dict, ctx: FlowContext):
    ctx.logger.info(input_data['field'])

# OLD - Cron
async def handler(context):
    context.logger.info("Running")

# NEW - Cron
async def handler(input_data: dict, ctx: FlowContext):
    ctx.logger.info("Running")
```
#### 3. Method Changes
- `context.emit()` → `ctx.enqueue()`
- `req.get('body')` → `request.body`
- `req.get('queryParams')` → `request.query_params`
- `req.get('pathParams')` → `request.path_params`
- `req.get('headers')` → `request.headers`
- Return dict → `ApiResponse` object
#### 4. Cron Format
- OLD: 5-field `"0 5 * * *"` (minute hour day month weekday)
- NEW: 6-field `"0 0 5 * * *"` (second minute hour day month weekday)
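Since the new format only prepends a seconds field, old expressions convert mechanically; a small helper sketch (the helper itself is illustrative, not part of Motia):

```python
def to_six_field(cron5: str, seconds: str = "0") -> str:
    """Prepend a seconds field to a 5-field cron expression."""
    fields = cron5.split()
    if len(fields) != 5:
        raise ValueError(f"expected 5 fields, got {len(fields)}")
    return f"{seconds} {cron5}"

# The old daily-5am trigger becomes the new 6-field form:
# to_six_field("0 5 * * *") -> "0 0 5 * * *"
```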

## Migration Strategy

### Phase 1: Simple Steps (Priority)
Start with simple API proxy steps as they're straightforward:
1. ✅ Example ticketing steps (already in new system)
2. ⏳ Advoware proxy steps (GET, POST, PUT, DELETE)
3. ⏳ Simple webhook handlers
### Phase 2: Complex Integration Steps
Steps with external dependencies:
4. ⏳ VMH sync steps (beteiligte, bankverbindungen)
5. ⏳ Calendar sync steps (most complex - Google Calendar + Redis + PostgreSQL)
### Phase 3: Supporting Infrastructure
- Migrate `services/` modules (advoware.py wrapper)
- Migrate `config.py` to use environment variables properly
- Update dependencies in `pyproject.toml`
### Dependencies to Review
From old `requirements.txt` and code analysis:
- `asyncpg` - PostgreSQL async driver
- `redis` - Redis client
- `google-api-python-client` - Google Calendar API
- `google-auth` - Google OAuth2
- `backoff` - Retry/backoff decorator
- `pytz` - Timezone handling
- `pydantic` - Already in new system
- `requests` / `aiohttp` - HTTP clients for Advoware API
## Migration Roadmap

### ✅ COMPLETED

| Phase | Module | Lines | Status |
|-------|--------|-------|--------|
| **1** | Advoware Proxy (GET, POST, PUT, DELETE) | ~400 | ✅ Complete |
| **1** | `advoware.py`, `advoware_service.py` | ~800 | ✅ Complete |
| **2** | VMH Webhook Steps (6 endpoints) | ~900 | ✅ Complete |
| **2** | `espocrm.py`, `espocrm_mapper.py` | ~900 | ✅ Complete |
| **2** | `bankverbindungen_mapper.py`, `beteiligte_sync_utils.py`, `notification_utils.py` | ~1200 | ✅ Complete |
| **3** | VMH Sync Event Steps (2 handlers + 1 cron) | ~1000 | ✅ Complete |
| **4** | Kommunikation Sync (`kommunikation_mapper.py`, `kommunikation_sync_utils.py`) | ~1333 | ✅ Complete |
| **5** | Adressen Sync (`adressen_mapper.py`, `adressen_sync.py`) | ~964 | ✅ Complete |
| **6** | **Google Calendar Sync** (`calendar_sync_*.py`, `calendar_sync_utils.py`) | ~1500 | ✅ **Complete** |

**Total migrated: ~9,000 lines of production code**

### ✅ Phase 6 COMPLETED: Google Calendar Sync

**Advoware Calendar Sync** - Google Calendar ↔ Advoware sync:
- ✅ `calendar_sync_cron_step.py` - cron trigger (every 15 min.)
- ✅ `calendar_sync_all_step.py` - bulk sync handler with Redis-based prioritization
- ✅ `calendar_sync_event_step.py` - queue event handler (**1053 lines of complex sync logic!**)
- ✅ `calendar_sync_api_step.py` - HTTP API for manual triggers
- ✅ `calendar_sync_utils.py` - helper functions (DB, Google service, Redis, logging)

**Dependencies:**
- ✅ `google-api-python-client` - Google Calendar API
- ✅ `google-auth` - Google OAuth2
- ✅ `asyncpg` - PostgreSQL async driver
- ✅ `backoff` - Retry/backoff decorator

**Features:**
- ✅ Bidirectional synchronization (Google ↔ Advoware)
- ✅ 4-phase sync algorithm (new Adv→Google, new Google→Adv, deletes, updates)
- ✅ PostgreSQL as sync-state hub (`calendar_sync` table)
- ✅ Redis-based rate limiting (token bucket for the Google API)
- ✅ Distributed locking per employee
- ✅ Automatic calendar creation with ACL
- ✅ Recurring events support (RRULE)
- ✅ Timezone handling (Europe/Berlin)
- ✅ Backoff retry for API errors
- ✅ Write protection for Advoware
- ✅ Source-system-wins & last-change-wins strategies

### ⏳ REMAINING

**None! The migration is 100% complete.**
### Completed
- ✅ Analysis of old system structure
- ✅ MIGRATION_GUIDE.md reviewed
- ✅ Migration patterns documented
- ✅ New system has example ticketing steps
- ✅ **Phase 1: Advoware Proxy Steps migrated** (GET, POST, PUT, DELETE)
- ✅ **Advoware API service module migrated** (services/advoware.py)
- ✅ **Phase 2: VMH Integration - Webhook Steps migrated** (6 endpoints)
- ✅ **EspoCRM API service module migrated** (services/espocrm.py)
- ✅ All endpoints registered and running:
  - **Advoware Proxy:**
    - `GET /advoware/proxy?path=...`

### Phase 6 Complete ✅

**🎉 ALL PHASES FINISHED! 100% MIGRATION SUCCESSFUL!**

**Phase 6** - Google Calendar Sync:
- ✅ `calendar_sync_cron_step.py` (cron trigger every 15 min.)
- ✅ `calendar_sync_all_step.py` (bulk handler with Redis prioritization)
- ✅ `calendar_sync_event_step.py` (1053 lines - 4-phase sync algorithm)
- ✅ `calendar_sync_api_step.py` (HTTP API for manual triggers)
- ✅ `calendar_sync_utils.py` (DB, Google service, Redis client)

**Sync architecture complete:**

1. **Advoware Proxy** (Phase 1) → HTTP API for Advoware access
2. **Webhooks** (Phase 2) → emit queue events
3. **Event Handlers** (Phase 3) → process events with Stammdaten sync
4. **Kommunikation Sync** (Phase 4) → bidirectional email/phone synchronization
5. **Adressen Sync** (Phase 5) → bidirectional address synchronization
6. **Calendar Sync** (Phase 6) → Google Calendar ↔ Advoware, bidirectional
7. **Cron Jobs** (Phases 3 & 6) → regular sync checks & auto-retries

The complete synchronization and integration pipeline is now 100% operational!

## Notes
- Old system was Node.js + Python hybrid (Python steps as child processes)
- New system is pure Python (standalone SDK)
- No need for Node.js/npm anymore
- iii engine handles all infrastructure (queues, state, HTTP, cron)
- Console replaced Workbench
```diff
@@ -11,14 +11,11 @@ Hilfsfunktionen für Document-Synchronisation mit xAI:
 from typing import Dict, Any, Optional, List, Tuple
 from datetime import datetime, timedelta
 import logging
-import redis
-import os
+from services.sync_utils_base import BaseSyncUtils
 
 logger = logging.getLogger(__name__)
 
-# Lock TTL in seconds (prevents deadlocks)
-LOCK_TTL_SECONDS = 900  # 15 minutes
-
 # Max retry before permanent failure
 MAX_SYNC_RETRIES = 5
 
@@ -26,40 +23,12 @@ MAX_SYNC_RETRIES = 5
 RETRY_BACKOFF_MINUTES = [1, 5, 15, 60, 240]  # 1min, 5min, 15min, 1h, 4h
 
 
-class DocumentSync:
+class DocumentSync(BaseSyncUtils):
     """Utility-Klasse für Document-Synchronisation mit xAI"""
 
-    def __init__(self, espocrm_api, redis_client: redis.Redis = None, context=None):
-        self.espocrm = espocrm_api
-        self.context = context
-        self.logger = context.logger if context else logger
-        self.redis = redis_client or self._init_redis()
-
-    def _init_redis(self) -> redis.Redis:
-        """Initialize Redis client for distributed locking"""
-        try:
-            redis_host = os.getenv('REDIS_HOST', 'localhost')
-            redis_port = int(os.getenv('REDIS_PORT', '6379'))
-            redis_db = int(os.getenv('REDIS_DB_ADVOWARE_CACHE', '1'))
-
-            client = redis.Redis(
-                host=redis_host,
-                port=redis_port,
-                db=redis_db,
-                decode_responses=True
-            )
-            client.ping()
-            return client
-        except Exception as e:
-            self._log(f"Redis connection failed: {e}", level='error')
-            return None
-
-    def _log(self, message: str, level: str = 'info'):
-        """Logging mit Context-Support"""
-        if self.context and hasattr(self.context, 'logger'):
-            getattr(self.context.logger, level)(message)
-        else:
-            getattr(logger, level)(message)
+    def _get_lock_key(self, entity_id: str) -> str:
+        """Redis Lock-Key für Documents"""
+        return f"sync_lock:document:{entity_id}"
 
     async def acquire_sync_lock(self, entity_id: str, entity_type: str = 'CDokumente') -> bool:
         """
@@ -74,13 +43,10 @@ class DocumentSync:
         """
         try:
             # STEP 1: Atomic Redis lock (prevents race conditions)
-            if self.redis:
-                lock_key = f"sync_lock:document:{entity_id}"
-                acquired = self.redis.set(lock_key, "locked", nx=True, ex=LOCK_TTL_SECONDS)
-
-                if not acquired:
-                    self._log(f"Redis lock bereits aktiv für {entity_type} {entity_id}", level='warn')
-                    return False
+            lock_key = self._get_lock_key(entity_id)
+            if not self._acquire_redis_lock(lock_key):
+                self._log(f"Redis lock bereits aktiv für {entity_type} {entity_id}", level='warn')
+                return False
 
             # STEP 2: Update syncStatus (für UI visibility) - nur bei Document Entity
             # CDokumente hat dieses Feld nicht - überspringen
@@ -98,12 +64,8 @@ class DocumentSync:
         except Exception as e:
             self._log(f"Fehler beim Acquire Lock: {e}", level='error')
             # Clean up Redis lock on error
-            if self.redis:
-                try:
-                    lock_key = f"sync_lock:document:{entity_id}"
-                    self.redis.delete(lock_key)
-                except:
-                    pass
+            lock_key = self._get_lock_key(entity_id)
+            self._release_redis_lock(lock_key)
             return False
 
     async def release_sync_lock(
@@ -149,19 +111,14 @@ class DocumentSync:
             self._log(f"Sync-Lock released: {entity_type} {entity_id} → {'success' if success else 'failed'}")
 
             # Release Redis lock
-            if self.redis:
-                lock_key = f"sync_lock:document:{entity_id}"
-                self.redis.delete(lock_key)
+            lock_key = self._get_lock_key(entity_id)
+            self._release_redis_lock(lock_key)
 
         except Exception as e:
             self._log(f"Fehler beim Release Lock: {e}", level='error')
             # Ensure Redis lock is released even on error
-            if self.redis:
-                try:
-                    lock_key = f"sync_lock:document:{entity_id}"
-                    self.redis.delete(lock_key)
-                except:
-                    pass
+            lock_key = self._get_lock_key(entity_id)
+            self._release_redis_lock(lock_key)
 
     async def should_sync_to_xai(self, document: Dict[str, Any]) -> Tuple[bool, List[str], str]:
```
|
||||||
"""
|
"""
|
||||||
services/sync_utils_base.py (new file, 150 lines)
@@ -0,0 +1,150 @@
"""
Base Sync Utilities

Shared functionality for all sync operations:
- Redis distributed locking
- Context-aware logging
- EspoCRM API helpers
"""

from typing import Dict, Any, Optional
from datetime import datetime
import logging
import redis
import os
import pytz

logger = logging.getLogger(__name__)

# Lock TTL in seconds (prevents deadlocks)
LOCK_TTL_SECONDS = 900  # 15 minutes


class BaseSyncUtils:
    """Base class with shared sync functionality"""

    def __init__(self, espocrm_api, redis_client: redis.Redis = None, context=None):
        """
        Args:
            espocrm_api: EspoCRM API client instance
            redis_client: Optional Redis client (initialized otherwise)
            context: Optional Motia FlowContext for logging
        """
        self.espocrm = espocrm_api
        self.context = context
        self.logger = context.logger if context else logger
        self.redis = redis_client or self._init_redis()

    def _init_redis(self) -> Optional[redis.Redis]:
        """Initialize Redis client for distributed locking"""
        try:
            redis_host = os.getenv('REDIS_HOST', 'localhost')
            redis_port = int(os.getenv('REDIS_PORT', '6379'))
            redis_db = int(os.getenv('REDIS_DB_ADVOWARE_CACHE', '1'))

            client = redis.Redis(
                host=redis_host,
                port=redis_port,
                db=redis_db,
                decode_responses=True
            )
            client.ping()
            return client
        except Exception as e:
            self._log(f"Redis connection failed: {e}", level='error')
            return None

    def _log(self, message: str, level: str = 'info'):
        """
        Context-aware logging

        If a FlowContext is available, its logger is used.
        Otherwise falls back to the standard logger.
        """
        if self.context and hasattr(self.context, 'logger'):
            getattr(self.context.logger, level)(message)
        else:
            getattr(logger, level)(message)

    def _get_lock_key(self, entity_id: str) -> str:
        """
        Builds the Redis lock key for an entity

        Must be overridden in subclasses to use entity-specific prefixes,
        e.g. 'sync_lock:cbeteiligte:{entity_id}' or 'sync_lock:document:{entity_id}'
        """
        raise NotImplementedError("Subclass must implement _get_lock_key()")

    def _acquire_redis_lock(self, lock_key: str) -> bool:
        """
        Atomic Redis lock acquisition

        Args:
            lock_key: Redis key for the lock

        Returns:
            True if the lock was acquired, False if already locked
        """
        if not self.redis:
            self._log("Redis not available, lock mechanism disabled", level='warn')
            return True  # Fallback: without Redis, always allow the lock

        try:
            acquired = self.redis.set(lock_key, "locked", nx=True, ex=LOCK_TTL_SECONDS)
            return bool(acquired)
        except Exception as e:
            self._log(f"Redis lock error: {e}", level='error')
            return True  # On error: allow the lock to avoid deadlocks

    def _release_redis_lock(self, lock_key: str) -> None:
        """
        Release the Redis lock

        Args:
            lock_key: Redis key for the lock
        """
        if not self.redis:
            return

        try:
            self.redis.delete(lock_key)
        except Exception as e:
            self._log(f"Redis unlock error: {e}", level='error')

    def _get_espocrm_datetime(self, dt: Optional[datetime] = None) -> str:
        """
        Formats a datetime for EspoCRM (without timezone!)

        Args:
            dt: Optional datetime object (default: now UTC)

        Returns:
            String in the format 'YYYY-MM-DD HH:MM:SS'
        """
        if dt is None:
            dt = datetime.now(pytz.UTC)
        elif dt.tzinfo is None:
            dt = pytz.UTC.localize(dt)

        return dt.strftime('%Y-%m-%d %H:%M:%S')

    async def acquire_sync_lock(self, entity_id: str, **kwargs) -> bool:
        """
        Acquires the sync lock for an entity

        Must be implemented in subclasses to perform entity-specific
        status updates.

        Returns:
            True if the lock was acquired, False if already locked
        """
        raise NotImplementedError("Subclass must implement acquire_sync_lock()")

    async def release_sync_lock(self, entity_id: str, **kwargs) -> None:
        """
        Releases the sync lock and sets the final status

        Must be implemented in subclasses to perform entity-specific
        status updates.
        """
        raise NotImplementedError("Subclass must implement release_sync_lock()")
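The subclass contract above (override `_get_lock_key`, reuse `_acquire_redis_lock` / `_release_redis_lock`) can be sketched as follows. This is an illustrative stand-alone sketch, not code from the commit: `DocumentSyncUtils` and `FakeRedis` are hypothetical names, and `FakeRedis` stands in for a real `redis.Redis` so the SET-NX-EX locking semantics can be exercised without a server.

```python
LOCK_TTL_SECONDS = 900  # mirrors the constant in sync_utils_base.py


class FakeRedis:
    """In-memory stand-in for redis.Redis, supporting SET NX EX and DELETE."""

    def __init__(self):
        self.store = {}

    def set(self, key, value, nx=False, ex=None):
        if nx and key in self.store:
            return None  # lock already held
        self.store[key] = value
        return True

    def delete(self, key):
        self.store.pop(key, None)


class DocumentSyncUtils:
    """Hypothetical subclass mirroring the BaseSyncUtils lock contract."""

    def __init__(self, redis_client):
        self.redis = redis_client

    def _get_lock_key(self, entity_id: str) -> str:
        # Entity-specific prefix, as required by the base class
        return f"sync_lock:document:{entity_id}"

    def _acquire_redis_lock(self, lock_key: str) -> bool:
        # Atomic: SET ... NX EX returns None when the key already exists
        return bool(self.redis.set(lock_key, "locked", nx=True, ex=LOCK_TTL_SECONDS))

    def _release_redis_lock(self, lock_key: str) -> None:
        self.redis.delete(lock_key)


sync = DocumentSyncUtils(FakeRedis())
key = sync._get_lock_key("doc-123")
first = sync._acquire_redis_lock(key)    # lock acquired
second = sync._acquire_redis_lock(key)   # refused: already locked
sync._release_redis_lock(key)
third = sync._acquire_redis_lock(key)    # acquired again after release
```

The `nx=True, ex=...` combination is what makes the acquisition a single atomic Redis command, which is why the refactor routes every caller through this one helper.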
@@ -1,6 +0,0 @@
"""
EspoCRM Generic Webhooks

Receives webhooks from EspoCRM for various entities.
Central entry point for all EspoCRM events outside the VMH context.
"""
@@ -1,208 +0,0 @@
"""EspoCRM Webhook - Document Create

Receives create webhooks from EspoCRM for Documents.
Logs all payload information in detail for analysis.
"""
import json
import datetime
from typing import Any
from motia import FlowContext, http, ApiRequest, ApiResponse


config = {
    "name": "VMH Webhook Document Create",
    "description": "Receives create webhooks from EspoCRM for Document entities",
    "flows": ["vmh-documents"],
    "triggers": [
        http("POST", "/vmh/webhook/document/create")
    ],
    "enqueues": ["vmh.document.create"],
}


async def handler(request: ApiRequest, ctx: FlowContext[Any]) -> ApiResponse:
    """
    Webhook handler for Document creation in EspoCRM.

    Receives notifications when documents are created and emits queue events
    for processing (xAI sync, etc.).

    Payload Analysis Mode: Logs comprehensive details about webhook structure.
    """
    try:
        payload = request.body or []

        # ═══════════════════════════════════════════════════════════════
        # DETAILED LOGGING FOR ANALYSIS
        # ═══════════════════════════════════════════════════════════════

        ctx.logger.info("=" * 80)
        ctx.logger.info("📥 EspoCRM DOCUMENT CREATE WEBHOOK RECEIVED")
        ctx.logger.info("=" * 80)

        # Log request headers
        ctx.logger.info("\n🔍 REQUEST HEADERS:")
        if hasattr(request, 'headers'):
            for key, value in request.headers.items():
                ctx.logger.info(f"  {key}: {value}")
        else:
            ctx.logger.info("  (no headers available)")

        # Log payload type & structure
        ctx.logger.info(f"\n📦 PAYLOAD TYPE: {type(payload).__name__}")
        ctx.logger.info(f"📦 PAYLOAD LENGTH: {len(payload) if isinstance(payload, (list, dict)) else 'N/A'}")

        # Log full payload (pretty-printed)
        ctx.logger.info("\n📄 FULL PAYLOAD:")
        ctx.logger.info(json.dumps(payload, indent=2, ensure_ascii=False))

        # ═══════════════════════════════════════════════════════════════
        # PAYLOAD ANALYSIS & ID EXTRACTION
        # ═══════════════════════════════════════════════════════════════

        entity_ids = set()
        payload_details = []

        if isinstance(payload, list):
            ctx.logger.info(f"\n✅ Payload is a LIST with {len(payload)} entries")
            for idx, entity in enumerate(payload):
                if isinstance(entity, dict):
                    entity_id = entity.get('id')
                    if entity_id:
                        entity_ids.add(entity_id)

                    # Collect details for logging
                    detail = {
                        'index': idx,
                        'id': entity_id,
                        'name': entity.get('name', 'N/A'),
                        'type': entity.get('type', 'N/A'),
                        'size': entity.get('size', 'N/A'),
                        'all_fields': list(entity.keys())
                    }
                    payload_details.append(detail)

                    ctx.logger.info(f"\n  📄 Document #{idx + 1}:")
                    ctx.logger.info(f"     ID: {entity_id}")
                    ctx.logger.info(f"     Name: {entity.get('name', 'N/A')}")
                    ctx.logger.info(f"     Type: {entity.get('type', 'N/A')}")
                    ctx.logger.info(f"     Size: {entity.get('size', 'N/A')} bytes")
                    ctx.logger.info(f"     Available fields: {', '.join(entity.keys())}")

                    # xAI-relevant fields (if present)
                    xai_fields = {k: v for k, v in entity.items()
                                  if 'xai' in k.lower() or 'collection' in k.lower()}
                    if xai_fields:
                        ctx.logger.info(f"     🤖 xAI fields: {json.dumps(xai_fields, ensure_ascii=False)}")

                    # Parent/relationship fields
                    rel_fields = {k: v for k, v in entity.items()
                                  if 'parent' in k.lower() or 'related' in k.lower() or
                                  'link' in k.lower() or k.endswith('Id') or k.endswith('Ids')}
                    if rel_fields:
                        ctx.logger.info(f"     🔗 Relationship fields: {json.dumps(rel_fields, ensure_ascii=False)}")

        elif isinstance(payload, dict):
            ctx.logger.info("\n✅ Payload is a SINGLE DICT")
            entity_id = payload.get('id')
            if entity_id:
                entity_ids.add(entity_id)

            ctx.logger.info("\n  📄 Document:")
            ctx.logger.info(f"     ID: {entity_id}")
            ctx.logger.info(f"     Name: {payload.get('name', 'N/A')}")
            ctx.logger.info(f"     Type: {payload.get('type', 'N/A')}")
            ctx.logger.info(f"     Size: {payload.get('size', 'N/A')} bytes")
            ctx.logger.info(f"     Available fields: {', '.join(payload.keys())}")

            # xAI-relevant fields
            xai_fields = {k: v for k, v in payload.items()
                          if 'xai' in k.lower() or 'collection' in k.lower()}
            if xai_fields:
                ctx.logger.info(f"     🤖 xAI fields: {json.dumps(xai_fields, ensure_ascii=False)}")

            # Relationship fields
            rel_fields = {k: v for k, v in payload.items()
                          if 'parent' in k.lower() or 'related' in k.lower() or
                          'link' in k.lower() or k.endswith('Id') or k.endswith('Ids')}
            if rel_fields:
                ctx.logger.info(f"     🔗 Relationship fields: {json.dumps(rel_fields, ensure_ascii=False)}")
        else:
            ctx.logger.warning(f"⚠️ Unexpected payload type: {type(payload)}")

        # ═══════════════════════════════════════════════════════════════
        # EMIT QUEUE EVENTS
        # ═══════════════════════════════════════════════════════════════

        ctx.logger.info("\n" + "=" * 80)
        ctx.logger.info(f"📊 SUMMARY: {len(entity_ids)} document(s) found")
        ctx.logger.info("=" * 80)

        if not entity_ids:
            ctx.logger.warning("⚠️ No document IDs found in payload!")
            return ApiResponse(
                status=200,
                body={
                    'status': 'received',
                    'action': 'create',
                    'ids_count': 0,
                    'warning': 'No document IDs found in payload'
                }
            )

        # Emit events for queue processing (deduplication happens in the event handler via lock)
        # Try to determine the entity type
        entity_type = 'CDokumente'  # default for VMH
        if isinstance(payload, list) and payload:
            entity_type = payload[0].get('entityType') or payload[0].get('_scope') or 'CDokumente'
        elif isinstance(payload, dict):
            entity_type = payload.get('entityType') or payload.get('_scope') or 'CDokumente'

        ctx.logger.info(f"📝 Entity type: {entity_type}")

        for entity_id in entity_ids:
            await ctx.enqueue({
                'topic': 'vmh.document.create',
                'data': {
                    'entity_id': entity_id,
                    'entity_type': entity_type,
                    'action': 'create',
                    'source': 'webhook',
                    'timestamp': datetime.datetime.now().isoformat()
                }
            })
            ctx.logger.info(f"✅ Event emitted: vmh.document.create for ID {entity_id} (Type: {entity_type})")

        ctx.logger.info("\n" + "=" * 80)
        ctx.logger.info("✅ WEBHOOK PROCESSING COMPLETE")
        ctx.logger.info("=" * 80)

        return ApiResponse(
            status=200,
            body={
                'status': 'received',
                'action': 'create',
                'ids_count': len(entity_ids),
                'document_ids': list(entity_ids)
            }
        )

    except Exception as e:
        ctx.logger.error("=" * 80)
        ctx.logger.error("❌ ERROR while processing the Document Create webhook")
        ctx.logger.error("=" * 80)
        ctx.logger.error(f"Error Type: {type(e).__name__}")
        ctx.logger.error(f"Error Message: {str(e)}")

        # Log stack trace
        import traceback
        ctx.logger.error(f"Stack Trace:\n{traceback.format_exc()}")

        return ApiResponse(
            status=500,
            body={
                'error': 'Internal server error',
                'error_type': type(e).__name__,
                'details': str(e)
            }
        )
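The ID and entity-type extraction in the handler above is pure payload logic, so it can be isolated and unit-tested without Motia. The following stand-alone sketch (function name is ours, not from the commit) mirrors the list/dict branches and the `entityType` / `_scope` / `'CDokumente'` fallback chain:

```python
from typing import Any, Set, Tuple


def extract_ids_and_type(payload: Any) -> Tuple[Set[str], str]:
    """Collect entity IDs and resolve the entity type, defaulting to CDokumente."""
    entity_ids: Set[str] = set()
    entity_type = 'CDokumente'  # default for VMH

    if isinstance(payload, list):
        for entity in payload:
            if isinstance(entity, dict) and entity.get('id'):
                entity_ids.add(entity['id'])
        if payload and isinstance(payload[0], dict):
            # Same fallback chain as the handler
            entity_type = payload[0].get('entityType') or payload[0].get('_scope') or 'CDokumente'
    elif isinstance(payload, dict):
        if payload.get('id'):
            entity_ids.add(payload['id'])
        entity_type = payload.get('entityType') or payload.get('_scope') or 'CDokumente'

    return entity_ids, entity_type


# EspoCRM delivers batches as lists; single-entity dicts also occur
ids, etype = extract_ids_and_type([{'id': 'a1', '_scope': 'Document'}, {'id': 'a2'}])
single_ids, single_type = extract_ids_and_type({'id': 'x'})
```

Keeping this logic in one function would also make the duplicated branches in the create/delete/update handlers shorter, though the commit keeps them inline per step.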
@@ -1,184 +0,0 @@
"""EspoCRM Webhook - Document Delete

Receives delete webhooks from EspoCRM for Documents.
Logs all payload information in detail for analysis.
"""
import json
import datetime
from typing import Any
from motia import FlowContext, http, ApiRequest, ApiResponse


config = {
    "name": "VMH Webhook Document Delete",
    "description": "Receives delete webhooks from EspoCRM for Document entities",
    "flows": ["vmh-documents"],
    "triggers": [
        http("POST", "/vmh/webhook/document/delete")
    ],
    "enqueues": ["vmh.document.delete"],
}


async def handler(request: ApiRequest, ctx: FlowContext[Any]) -> ApiResponse:
    """
    Webhook handler for Document deletion in EspoCRM.

    Receives notifications when documents are deleted.
    Note: On deletion we may only have the ID, not the full entity data.
    """
    try:
        payload = request.body or []

        # ═══════════════════════════════════════════════════════════════
        # DETAILED LOGGING FOR ANALYSIS
        # ═══════════════════════════════════════════════════════════════

        ctx.logger.info("=" * 80)
        ctx.logger.info("📥 EspoCRM DOCUMENT DELETE WEBHOOK RECEIVED")
        ctx.logger.info("=" * 80)

        # Log request headers
        ctx.logger.info("\n🔍 REQUEST HEADERS:")
        if hasattr(request, 'headers'):
            for key, value in request.headers.items():
                ctx.logger.info(f"  {key}: {value}")
        else:
            ctx.logger.info("  (no headers available)")

        # Log payload type & structure
        ctx.logger.info(f"\n📦 PAYLOAD TYPE: {type(payload).__name__}")
        ctx.logger.info(f"📦 PAYLOAD LENGTH: {len(payload) if isinstance(payload, (list, dict)) else 'N/A'}")

        # Log full payload (pretty-printed)
        ctx.logger.info("\n📄 FULL PAYLOAD:")
        ctx.logger.info(json.dumps(payload, indent=2, ensure_ascii=False))

        # ═══════════════════════════════════════════════════════════════
        # PAYLOAD ANALYSIS & ID EXTRACTION
        # ═══════════════════════════════════════════════════════════════

        entity_ids = set()

        if isinstance(payload, list):
            ctx.logger.info(f"\n✅ Payload is a LIST with {len(payload)} entries")
            for idx, entity in enumerate(payload):
                if isinstance(entity, dict):
                    entity_id = entity.get('id')
                    if entity_id:
                        entity_ids.add(entity_id)

                    ctx.logger.info(f"\n  🗑️ Document #{idx + 1}:")
                    ctx.logger.info(f"     ID: {entity_id}")
                    ctx.logger.info(f"     Available fields: {', '.join(entity.keys())}")

                    # On delete we often only get minimal data
                    if 'name' in entity:
                        ctx.logger.info(f"     Name: {entity.get('name')}")
                    if 'deletedAt' in entity or 'deleted' in entity:
                        ctx.logger.info(f"     Deleted At: {entity.get('deletedAt', entity.get('deleted', 'N/A'))}")

                    # xAI-relevant fields (if present)
                    xai_fields = {k: v for k, v in entity.items()
                                  if 'xai' in k.lower() or 'collection' in k.lower()}
                    if xai_fields:
                        ctx.logger.info(f"     🤖 xAI fields: {json.dumps(xai_fields, ensure_ascii=False)}")

        elif isinstance(payload, dict):
            ctx.logger.info("\n✅ Payload is a SINGLE DICT")
            entity_id = payload.get('id')
            if entity_id:
                entity_ids.add(entity_id)

            ctx.logger.info("\n  🗑️ Document:")
            ctx.logger.info(f"     ID: {entity_id}")
            ctx.logger.info(f"     Available fields: {', '.join(payload.keys())}")

            if 'name' in payload:
                ctx.logger.info(f"     Name: {payload.get('name')}")
            if 'deletedAt' in payload or 'deleted' in payload:
                ctx.logger.info(f"     Deleted At: {payload.get('deletedAt', payload.get('deleted', 'N/A'))}")

            # xAI-relevant fields
            xai_fields = {k: v for k, v in payload.items()
                          if 'xai' in k.lower() or 'collection' in k.lower()}
            if xai_fields:
                ctx.logger.info(f"     🤖 xAI fields: {json.dumps(xai_fields, ensure_ascii=False)}")
        else:
            ctx.logger.warning(f"⚠️ Unexpected payload type: {type(payload)}")

        # ═══════════════════════════════════════════════════════════════
        # EMIT QUEUE EVENTS
        # ═══════════════════════════════════════════════════════════════

        ctx.logger.info("\n" + "=" * 80)
        ctx.logger.info(f"📊 SUMMARY: {len(entity_ids)} document(s) found")
        ctx.logger.info("=" * 80)

        if not entity_ids:
            ctx.logger.warning("⚠️ No document IDs found in payload!")
            return ApiResponse(
                status=200,
                body={
                    'status': 'received',
                    'action': 'delete',
                    'ids_count': 0,
                    'warning': 'No document IDs found in payload'
                }
            )

        # Emit events for queue processing
        # Try to determine the entity type
        entity_type = 'CDokumente'  # default for VMH
        if isinstance(payload, list) and payload:
            entity_type = payload[0].get('entityType') or payload[0].get('_scope') or 'CDokumente'
        elif isinstance(payload, dict):
            entity_type = payload.get('entityType') or payload.get('_scope') or 'CDokumente'

        ctx.logger.info(f"📝 Entity type: {entity_type}")

        for entity_id in entity_ids:
            await ctx.enqueue({
                'topic': 'vmh.document.delete',
                'data': {
                    'entity_id': entity_id,
                    'entity_type': entity_type,
                    'action': 'delete',
                    'source': 'webhook',
                    'timestamp': datetime.datetime.now().isoformat()
                }
            })
            ctx.logger.info(f"✅ Event emitted: vmh.document.delete for ID {entity_id} (Type: {entity_type})")

        ctx.logger.info("\n" + "=" * 80)
        ctx.logger.info("✅ WEBHOOK PROCESSING COMPLETE")
        ctx.logger.info("=" * 80)

        return ApiResponse(
            status=200,
            body={
                'status': 'received',
                'action': 'delete',
                'ids_count': len(entity_ids),
                'document_ids': list(entity_ids)
            }
        )

    except Exception as e:
        ctx.logger.error("=" * 80)
        ctx.logger.error("❌ ERROR while processing the Document Delete webhook")
        ctx.logger.error("=" * 80)
        ctx.logger.error(f"Error Type: {type(e).__name__}")
        ctx.logger.error(f"Error Message: {str(e)}")

        import traceback
        ctx.logger.error(f"Stack Trace:\n{traceback.format_exc()}")

        return ApiResponse(
            status=500,
            body={
                'error': 'Internal server error',
                'error_type': type(e).__name__,
                'details': str(e)
            }
        )
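The handlers repeatedly filter xAI-related and relationship-related fields out of a payload entity with the same two dict comprehensions. Isolated below with a hypothetical entity (the field names `cXaiFileId`, `collectionId`, `parentId`, `accountIds` are illustrative, not taken from the diff):

```python
entity = {
    "id": "doc-1",
    "name": "Akte.pdf",
    "cXaiFileId": "file_abc",
    "collectionId": "col_1",
    "parentId": "case-9",
    "accountIds": ["a1"],
}

# Fields related to the xAI collection sync
xai_fields = {k: v for k, v in entity.items()
              if 'xai' in k.lower() or 'collection' in k.lower()}

# Parent/relationship fields; note the endswith checks are case-sensitive,
# so the plain 'id' key is deliberately not matched
rel_fields = {k: v for k, v in entity.items()
              if 'parent' in k.lower() or 'related' in k.lower() or
              'link' in k.lower() or k.endswith('Id') or k.endswith('Ids')}
```

The two filters can overlap (here `cXaiFileId` and `collectionId` match both), which is fine for the handlers since they are only used for logging.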
@@ -1,206 +0,0 @@
|
|||||||
"""EspoCRM Webhook - Document Update
|
|
||||||
|
|
||||||
Empfängt Update-Webhooks von EspoCRM für Documents.
|
|
||||||
Loggt detailliert alle Payload-Informationen für Analyse.
|
|
||||||
"""
|
|
||||||
import json
|
|
||||||
import datetime
|
|
||||||
from typing import Any
|
|
||||||
from motia import FlowContext, http, ApiRequest, ApiResponse
|
|
||||||
|
|
||||||
|
|
||||||
config = {
|
|
||||||
"name": "VMH Webhook Document Update",
|
|
||||||
"description": "Empfängt Update-Webhooks von EspoCRM für Document Entities",
|
|
||||||
"flows": ["vmh-documents"],
|
|
||||||
"triggers": [
|
|
||||||
http("POST", "/vmh/webhook/document/update")
|
|
||||||
],
|
|
||||||
"enqueues": ["vmh.document.update"],
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
async def handler(request: ApiRequest, ctx: FlowContext[Any]) -> ApiResponse:
|
|
||||||
"""
|
|
||||||
Webhook handler for Document updates in EspoCRM.
|
|
||||||
|
|
||||||
Receives notifications when documents are updated and emits queue events
|
|
||||||
for processing (xAI sync, etc.).
|
|
||||||
|
|
||||||
Note: Loop-Prevention sollte auf EspoCRM-Seite implementiert werden.
|
|
||||||
xAI-Feld-Updates sollten keine neuen Webhooks triggern.
|
|
||||||
"""
|
|
||||||
try:
|
|
||||||
payload = request.body or []
|
|
||||||
|
|
||||||
# ═══════════════════════════════════════════════════════════════
|
|
||||||
# DETAILLIERTES LOGGING FÜR ANALYSE
|
|
||||||
# ═══════════════════════════════════════════════════════════════
|
|
||||||
|
|
||||||
ctx.logger.info("=" * 80)
|
|
||||||
ctx.logger.info("📥 EspoCRM DOCUMENT UPDATE WEBHOOK EMPFANGEN")
|
|
||||||
ctx.logger.info("=" * 80)
|
|
||||||
|
|
||||||
# Log Request Headers
|
|
||||||
ctx.logger.info("\n🔍 REQUEST HEADERS:")
|
|
||||||
if hasattr(request, 'headers'):
|
|
||||||
for key, value in request.headers.items():
|
|
||||||
ctx.logger.info(f" {key}: {value}")
|
|
||||||
else:
|
|
||||||
ctx.logger.info(" (keine Headers verfügbar)")
|
|
||||||
|
|
||||||
# Log Payload Type & Structure
|
|
||||||
ctx.logger.info(f"\n📦 PAYLOAD TYPE: {type(payload).__name__}")
|
|
||||||
ctx.logger.info(f"📦 PAYLOAD LENGTH: {len(payload) if isinstance(payload, (list, dict)) else 'N/A'}")
|
|
||||||
|
|
||||||
# Log Full Payload (pretty-printed)
|
|
||||||
ctx.logger.info("\n📄 FULL PAYLOAD:")
|
|
||||||
ctx.logger.info(json.dumps(payload, indent=2, ensure_ascii=False))
|
|
||||||
|
|
||||||
# ═══════════════════════════════════════════════════════════════
|
|
||||||
# PAYLOAD ANALYSE & ID EXTRAKTION
|
|
||||||
# ═══════════════════════════════════════════════════════════════
|
|
||||||
|
|
||||||
entity_ids = set()
|
|
||||||
|
|
||||||
if isinstance(payload, list):
|
|
||||||
ctx.logger.info(f"\n✅ Payload ist LIST mit {len(payload)} Einträgen")
|
|
||||||
for idx, entity in enumerate(payload):
|
|
||||||
if isinstance(entity, dict):
|
|
||||||
entity_id = entity.get('id')
|
|
||||||
if entity_id:
|
|
||||||
entity_ids.add(entity_id)
|
|
||||||
|
|
||||||
ctx.logger.info(f"\n 📄 Document #{idx + 1}:")
|
|
||||||
ctx.logger.info(f" ID: {entity_id}")
|
|
||||||
ctx.logger.info(f" Name: {entity.get('name', 'N/A')}")
|
|
||||||
ctx.logger.info(f" Modified At: {entity.get('modifiedAt', 'N/A')}")
|
|
||||||
ctx.logger.info(f" Modified By: {entity.get('modifiedById', 'N/A')}")
|
|
||||||
ctx.logger.info(f" Verfügbare Felder: {', '.join(entity.keys())}")
|
|
||||||
|
|
||||||
# Prüfe ob CHANGED fields mitgeliefert werden
|
|
||||||
changed_fields = entity.get('changedFields') or entity.get('changed') or entity.get('modifiedFields')
|
|
||||||
if changed_fields:
|
|
||||||
ctx.logger.info(f" 🔄 Geänderte Felder: {json.dumps(changed_fields, ensure_ascii=False)}")
|
|
||||||
|
|
||||||
# xAI-relevante Felder
|
|
||||||
xai_fields = {k: v for k, v in entity.items()
|
|
||||||
if 'xai' in k.lower() or 'collection' in k.lower()}
|
|
||||||
if xai_fields:
|
|
||||||
ctx.logger.info(f" 🤖 xAI-Felder: {json.dumps(xai_fields, ensure_ascii=False)}")
|
|
||||||
|
|
||||||
# Relationship Felder
|
|
||||||
rel_fields = {k: v for k, v in entity.items()
|
|
||||||
if 'parent' in k.lower() or 'related' in k.lower() or
|
|
||||||
'link' in k.lower() or k.endswith('Id') or k.endswith('Ids')}
|
|
||||||
if rel_fields:
|
|
||||||
ctx.logger.info(f" 🔗 Relationship-Felder: {json.dumps(rel_fields, ensure_ascii=False)}")
|
|
||||||
|
|
||||||
elif isinstance(payload, dict):
|
|
||||||
ctx.logger.info("\n✅ Payload ist SINGLE DICT")
|
|
||||||
entity_id = payload.get('id')
|
|
||||||
if entity_id:
|
|
||||||
entity_ids.add(entity_id)
|
|
||||||
|
|
||||||
ctx.logger.info(f"\n 📄 Document:")
|
|
||||||
ctx.logger.info(f" ID: {entity_id}")
|
|
||||||
ctx.logger.info(f" Name: {payload.get('name', 'N/A')}")
|
|
||||||
ctx.logger.info(f" Modified At: {payload.get('modifiedAt', 'N/A')}")
|
|
||||||
ctx.logger.info(f" Modified By: {payload.get('modifiedById', 'N/A')}")
|
|
||||||
ctx.logger.info(f" Verfügbare Felder: {', '.join(payload.keys())}")
|
|
||||||
|
|
||||||
# Geänderte Felder
|
|
||||||
changed_fields = payload.get('changedFields') or payload.get('changed') or payload.get('modifiedFields')
|
|
||||||
if changed_fields:
|
|
||||||
ctx.logger.info(f" 🔄 Geänderte Felder: {json.dumps(changed_fields, ensure_ascii=False)}")
|
|
||||||
|
|
||||||
# xAI-relevante Felder
|
|
||||||
xai_fields = {k: v for k, v in payload.items()
|
|
||||||
if 'xai' in k.lower() or 'collection' in k.lower()}
|
|
||||||
if xai_fields:
|
|
||||||
ctx.logger.info(f" 🤖 xAI-Felder: {json.dumps(xai_fields, ensure_ascii=False)}")
|
|
||||||
|
|
||||||
# Relationship Felder
|
|
||||||
rel_fields = {k: v for k, v in payload.items()
|
|
||||||
if 'parent' in k.lower() or 'related' in k.lower() or
|
|
||||||
'link' in k.lower() or k.endswith('Id') or k.endswith('Ids')}
|
|
||||||
            if rel_fields:
                ctx.logger.info(f"  🔗 Relationship fields: {json.dumps(rel_fields, ensure_ascii=False)}")
        else:
            ctx.logger.warning(f"⚠️ Unexpected payload type: {type(payload)}")

        # ═══════════════════════════════════════════════════════════════
        # EMIT QUEUE EVENTS
        # ═══════════════════════════════════════════════════════════════

        ctx.logger.info("\n" + "=" * 80)
        ctx.logger.info(f"📊 SUMMARY: {len(entity_ids)} document(s) found")
        ctx.logger.info("=" * 80)

        if not entity_ids:
            ctx.logger.warning("⚠️ No document IDs found in payload!")
            return ApiResponse(
                status=200,
                body={
                    'status': 'received',
                    'action': 'update',
                    'ids_count': 0,
                    'warning': 'No document IDs found in payload'
                }
            )

        # Emit events for queue processing.
        # Try to determine the entity type; default to CDokumente for VMH.
        entity_type = 'CDokumente'
        if isinstance(payload, list) and payload:
            entity_type = payload[0].get('entityType') or payload[0].get('_scope') or 'CDokumente'
        elif isinstance(payload, dict):
            entity_type = payload.get('entityType') or payload.get('_scope') or 'CDokumente'

        ctx.logger.info(f"📝 Entity type: {entity_type}")

        for entity_id in entity_ids:
            await ctx.enqueue({
                'topic': 'vmh.document.update',
                'data': {
                    'entity_id': entity_id,
                    'entity_type': entity_type,
                    'action': 'update',
                    'source': 'webhook',
                    'timestamp': datetime.datetime.now().isoformat()
                }
            })
            ctx.logger.info(f"✅ Event emitted: vmh.document.update for ID {entity_id} (type: {entity_type})")

        ctx.logger.info("\n" + "=" * 80)
        ctx.logger.info("✅ WEBHOOK PROCESSING COMPLETE")
        ctx.logger.info("=" * 80)

        return ApiResponse(
            status=200,
            body={
                'status': 'received',
                'action': 'update',
                'ids_count': len(entity_ids),
                'document_ids': list(entity_ids)
            }
        )

    except Exception as e:
        ctx.logger.error("=" * 80)
        ctx.logger.error("❌ ERROR while processing the document update webhook")
        ctx.logger.error("=" * 80)
        ctx.logger.error(f"Error type: {type(e).__name__}")
        ctx.logger.error(f"Error message: {str(e)}")

        import traceback
        ctx.logger.error(f"Stack trace:\n{traceback.format_exc()}")

        return ApiResponse(
            status=500,
            body={
                'error': 'Internal server error',
                'error_type': type(e).__name__,
                'details': str(e)
            }
        )
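The entity-type fallback used in the handler above (`entityType`, then `_scope`, then the VMH default `CDokumente`) can be sketched as a standalone helper; the function name and sample payloads are illustrative, not part of the step code:

```python
def resolve_entity_type(payload, default='CDokumente'):
    """Resolve the EspoCRM entity type from a webhook payload.

    Falls back from 'entityType' to '_scope' to the given default,
    handling both batch (list) and single (dict) payload shapes.
    """
    if isinstance(payload, list) and payload:
        first = payload[0]
        return first.get('entityType') or first.get('_scope') or default
    if isinstance(payload, dict):
        return payload.get('entityType') or payload.get('_scope') or default
    return default
```

Using `or`-chaining here means empty strings are treated like missing values, which is the same behavior as the handler.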
steps/vmh/webhook/document_create_api_step.py (new file, +77 lines)
@@ -0,0 +1,77 @@
"""VMH Webhook - Document Create"""
|
||||||
|
import json
|
||||||
|
from typing import Any
|
||||||
|
from motia import FlowContext, http, ApiRequest, ApiResponse
|
||||||
|
|
||||||
|
|
||||||
|
config = {
|
||||||
|
"name": "VMH Webhook Document Create",
|
||||||
|
"description": "Empfängt Create-Webhooks von EspoCRM für Documents",
|
||||||
|
"flows": ["vmh-documents"],
|
||||||
|
"triggers": [
|
||||||
|
http("POST", "/vmh/webhook/document/create")
|
||||||
|
],
|
||||||
|
"enqueues": ["vmh.document.create"],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
async def handler(request: ApiRequest, ctx: FlowContext[Any]) -> ApiResponse:
|
||||||
|
"""
|
||||||
|
Webhook handler for Document creation in EspoCRM.
|
||||||
|
|
||||||
|
Receives batch or single entity notifications and emits queue events
|
||||||
|
for each entity ID to be synced to xAI.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
payload = request.body or []
|
||||||
|
|
||||||
|
ctx.logger.info("VMH Webhook Document Create empfangen")
|
||||||
|
ctx.logger.debug(f"Payload: {json.dumps(payload, indent=2, ensure_ascii=False)}")
|
||||||
|
|
||||||
|
# Sammle alle IDs aus dem Batch
|
||||||
|
entity_ids = set()
|
||||||
|
|
||||||
|
if isinstance(payload, list):
|
||||||
|
for entity in payload:
|
||||||
|
if isinstance(entity, dict) and 'id' in entity:
|
||||||
|
entity_ids.add(entity['id'])
|
||||||
|
# Extrahiere entityType falls vorhanden
|
||||||
|
entity_type = entity.get('entityType', 'CDokumente')
|
||||||
|
elif isinstance(payload, dict) and 'id' in payload:
|
||||||
|
entity_ids.add(payload['id'])
|
||||||
|
entity_type = payload.get('entityType', 'CDokumente')
|
||||||
|
|
||||||
|
ctx.logger.info(f"{len(entity_ids)} Document IDs zum Create-Sync gefunden")
|
||||||
|
|
||||||
|
# Emit events für Queue-Processing (Deduplizierung erfolgt im Event-Handler via Lock)
|
||||||
|
for entity_id in entity_ids:
|
||||||
|
await ctx.enqueue({
|
||||||
|
'topic': 'vmh.document.create',
|
||||||
|
'data': {
|
||||||
|
'entity_id': entity_id,
|
||||||
|
'entity_type': entity_type if 'entity_type' in locals() else 'CDokumente',
|
||||||
|
'action': 'create',
|
||||||
|
'timestamp': payload[0].get('modifiedAt') if isinstance(payload, list) and payload else None
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
return ApiResponse(
|
||||||
|
status_code=200,
|
||||||
|
body={
|
||||||
|
'success': True,
|
||||||
|
'message': f'{len(entity_ids)} Document(s) zum Sync enqueued',
|
||||||
|
'entity_ids': list(entity_ids)
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
ctx.logger.error(f"Fehler im Document Create Webhook: {e}")
|
||||||
|
ctx.logger.error(f"Payload: {request.body}")
|
||||||
|
|
||||||
|
return ApiResponse(
|
||||||
|
status_code=500,
|
||||||
|
body={
|
||||||
|
'success': False,
|
||||||
|
'error': str(e)
|
||||||
|
}
|
||||||
|
)
|
||||||
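The batch/single ID collection in the handler above can be exercised in isolation; the helper name and sample payloads are made up for illustration:

```python
def collect_entity_ids(payload):
    """Collect unique entity IDs from an EspoCRM webhook payload.

    Accepts either a batch (list of entity dicts) or a single entity dict;
    a set deduplicates repeated IDs within one batch. Anything else
    yields an empty set.
    """
    entity_ids = set()
    if isinstance(payload, list):
        for entity in payload:
            if isinstance(entity, dict) and 'id' in entity:
                entity_ids.add(entity['id'])
    elif isinstance(payload, dict) and 'id' in payload:
        entity_ids.add(payload['id'])
    return entity_ids
```

Because EspoCRM may deliver the same entity twice in one batch, returning a set rather than a list keeps the subsequent enqueue loop from emitting duplicate events for the same webhook call.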
steps/vmh/webhook/document_delete_api_step.py (new file, +76 lines)
@@ -0,0 +1,76 @@
"""VMH Webhook - Document Delete"""
|
||||||
|
import json
|
||||||
|
from typing import Any
|
||||||
|
from motia import FlowContext, http, ApiRequest, ApiResponse
|
||||||
|
|
||||||
|
|
||||||
|
config = {
|
||||||
|
"name": "VMH Webhook Document Delete",
|
||||||
|
"description": "Empfängt Delete-Webhooks von EspoCRM für Documents",
|
||||||
|
"flows": ["vmh-documents"],
|
||||||
|
"triggers": [
|
||||||
|
http("POST", "/vmh/webhook/document/delete")
|
||||||
|
],
|
||||||
|
"enqueues": ["vmh.document.delete"],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
async def handler(request: ApiRequest, ctx: FlowContext[Any]) -> ApiResponse:
|
||||||
|
"""
|
||||||
|
Webhook handler for Document deletion in EspoCRM.
|
||||||
|
|
||||||
|
Receives batch or single entity notifications and emits queue events
|
||||||
|
for each entity ID to be removed from xAI.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
payload = request.body or []
|
||||||
|
|
||||||
|
ctx.logger.info("VMH Webhook Document Delete empfangen")
|
||||||
|
ctx.logger.debug(f"Payload: {json.dumps(payload, indent=2, ensure_ascii=False)}")
|
||||||
|
|
||||||
|
# Sammle alle IDs aus dem Batch
|
||||||
|
entity_ids = set()
|
||||||
|
|
||||||
|
if isinstance(payload, list):
|
||||||
|
for entity in payload:
|
||||||
|
if isinstance(entity, dict) and 'id' in entity:
|
||||||
|
entity_ids.add(entity['id'])
|
||||||
|
entity_type = entity.get('entityType', 'CDokumente')
|
||||||
|
elif isinstance(payload, dict) and 'id' in payload:
|
||||||
|
entity_ids.add(payload['id'])
|
||||||
|
entity_type = payload.get('entityType', 'CDokumente')
|
||||||
|
|
||||||
|
ctx.logger.info(f"{len(entity_ids)} Document IDs zum Delete-Sync gefunden")
|
||||||
|
|
||||||
|
# Emit events für Queue-Processing
|
||||||
|
for entity_id in entity_ids:
|
||||||
|
await ctx.enqueue({
|
||||||
|
'topic': 'vmh.document.delete',
|
||||||
|
'data': {
|
||||||
|
'entity_id': entity_id,
|
||||||
|
'entity_type': entity_type if 'entity_type' in locals() else 'CDokumente',
|
||||||
|
'action': 'delete',
|
||||||
|
'timestamp': payload[0].get('deletedAt') if isinstance(payload, list) and payload else None
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
return ApiResponse(
|
||||||
|
status_code=200,
|
||||||
|
body={
|
||||||
|
'success': True,
|
||||||
|
'message': f'{len(entity_ids)} Document(s) zum Delete enqueued',
|
||||||
|
'entity_ids': list(entity_ids)
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
ctx.logger.error(f"Fehler im Document Delete Webhook: {e}")
|
||||||
|
ctx.logger.error(f"Payload: {request.body}")
|
||||||
|
|
||||||
|
return ApiResponse(
|
||||||
|
status_code=500,
|
||||||
|
body={
|
||||||
|
'success': False,
|
||||||
|
'error': str(e)
|
||||||
|
}
|
||||||
|
)
|
||||||
steps/vmh/webhook/document_update_api_step.py (new file, +76 lines)
@@ -0,0 +1,76 @@
"""VMH Webhook - Document Update"""
|
||||||
|
import json
|
||||||
|
from typing import Any
|
||||||
|
from motia import FlowContext, http, ApiRequest, ApiResponse
|
||||||
|
|
||||||
|
|
||||||
|
config = {
|
||||||
|
"name": "VMH Webhook Document Update",
|
||||||
|
"description": "Empfängt Update-Webhooks von EspoCRM für Documents",
|
||||||
|
"flows": ["vmh-documents"],
|
||||||
|
"triggers": [
|
||||||
|
http("POST", "/vmh/webhook/document/update")
|
||||||
|
],
|
||||||
|
"enqueues": ["vmh.document.update"],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
async def handler(request: ApiRequest, ctx: FlowContext[Any]) -> ApiResponse:
|
||||||
|
"""
|
||||||
|
Webhook handler for Document updates in EspoCRM.
|
||||||
|
|
||||||
|
Receives batch or single entity notifications and emits queue events
|
||||||
|
for each entity ID to be synced to xAI.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
payload = request.body or []
|
||||||
|
|
||||||
|
ctx.logger.info("VMH Webhook Document Update empfangen")
|
||||||
|
ctx.logger.debug(f"Payload: {json.dumps(payload, indent=2, ensure_ascii=False)}")
|
||||||
|
|
||||||
|
# Sammle alle IDs aus dem Batch
|
||||||
|
entity_ids = set()
|
||||||
|
|
||||||
|
if isinstance(payload, list):
|
||||||
|
for entity in payload:
|
||||||
|
if isinstance(entity, dict) and 'id' in entity:
|
||||||
|
entity_ids.add(entity['id'])
|
||||||
|
entity_type = entity.get('entityType', 'CDokumente')
|
||||||
|
elif isinstance(payload, dict) and 'id' in payload:
|
||||||
|
entity_ids.add(payload['id'])
|
||||||
|
entity_type = payload.get('entityType', 'CDokumente')
|
||||||
|
|
||||||
|
ctx.logger.info(f"{len(entity_ids)} Document IDs zum Update-Sync gefunden")
|
||||||
|
|
||||||
|
# Emit events für Queue-Processing
|
||||||
|
for entity_id in entity_ids:
|
||||||
|
await ctx.enqueue({
|
||||||
|
'topic': 'vmh.document.update',
|
||||||
|
'data': {
|
||||||
|
'entity_id': entity_id,
|
||||||
|
'entity_type': entity_type if 'entity_type' in locals() else 'CDokumente',
|
||||||
|
'action': 'update',
|
||||||
|
'timestamp': payload[0].get('modifiedAt') if isinstance(payload, list) and payload else None
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
return ApiResponse(
|
||||||
|
status_code=200,
|
||||||
|
body={
|
||||||
|
'success': True,
|
||||||
|
'message': f'{len(entity_ids)} Document(s) zum Sync enqueued',
|
||||||
|
'entity_ids': list(entity_ids)
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
ctx.logger.error(f"Fehler im Document Update Webhook: {e}")
|
||||||
|
ctx.logger.error(f"Payload: {request.body}")
|
||||||
|
|
||||||
|
return ApiResponse(
|
||||||
|
status_code=500,
|
||||||
|
body={
|
||||||
|
'success': False,
|
||||||
|
'error': str(e)
|
||||||
|
}
|
||||||
|
)
|
||||||
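The three webhook handlers emit structurally identical queue events. A minimal shape check can make that contract explicit; the field and topic names are taken from the handlers above, but the validator itself is a hypothetical sketch, not part of the commit:

```python
# Fields every vmh.document.* event handler relies on
REQUIRED_EVENT_FIELDS = {'entity_id', 'entity_type', 'action'}

VALID_TOPICS = {'vmh.document.create', 'vmh.document.update', 'vmh.document.delete'}


def is_valid_document_event(event):
    """Return True if an enqueued event has a known topic and carries
    all fields the downstream queue handlers expect."""
    if not isinstance(event, dict) or event.get('topic') not in VALID_TOPICS:
        return False
    data = event.get('data')
    return isinstance(data, dict) and REQUIRED_EVENT_FIELDS <= data.keys()
```

A check like this could run in tests against the dicts passed to `ctx.enqueue`, catching a renamed field before a queue consumer fails at runtime.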
tests/README.md (new file, +110 lines)
@@ -0,0 +1,110 @@
# Test Scripts

This directory contains test scripts for the Motia III xAI Collections integration.

## Test Files

### `test_xai_collections_api.py`
Tests xAI Collections API authentication and basic operations.

**Usage:**
```bash
cd /opt/motia-iii/bitbylaw
python tests/test_xai_collections_api.py
```

**Required Environment Variables:**
- `XAI_MANAGEMENT_API_KEY` - xAI Management API key for collection operations
- `XAI_API_KEY` - xAI regular API key for file operations

**Tests:**
- ✅ Management API authentication
- ✅ Regular API authentication
- ✅ Collection listing
- ✅ Collection creation
- ✅ File upload
- ✅ Collection deletion
- ✅ Error handling
### `test_preview_upload.py`
Tests preview/thumbnail upload to the EspoCRM CDokumente entity.

**Usage:**
```bash
cd /opt/motia-iii/bitbylaw
python tests/test_preview_upload.py
```

**Required Environment Variables:**
- `ESPOCRM_URL` - EspoCRM instance URL (default: https://crm.bitbylaw.com)
- `ESPOCRM_API_KEY` - EspoCRM API key

**Tests:**
- ✅ Preview image generation (WebP format, 600x800px)
- ✅ Base64 Data URI encoding
- ✅ Attachment upload via JSON POST
- ✅ Entity update with previewId/previewName

**Status:** ✅ Successfully tested - Attachment ID `69a71194c7c6baebf` created
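The Base64 Data URI step in the list above can be reproduced with the standard library alone. This is a sketch, not the script's actual code; the default MIME type matches the WebP previews, and the sample bytes in the usage are not a real image:

```python
import base64


def to_data_uri(data: bytes, mime_type: str = 'image/webp') -> str:
    """Encode raw bytes as a base64 Data URI, the inline-contents form
    accepted by attachment upload endpoints such as EspoCRM's."""
    encoded = base64.b64encode(data).decode('ascii')
    return f'data:{mime_type};base64,{encoded}'
```

For example, `to_data_uri(b'RIFF')` yields `data:image/webp;base64,UklGRg==`; a real preview would pass the full WebP file bytes instead.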
### `test_thumbnail_generation.py`
Tests thumbnail generation for various document types.

**Usage:**
```bash
cd /opt/motia-iii/bitbylaw
python tests/test_thumbnail_generation.py
```

**Supported Formats:**
- PDF → WebP (first page)
- DOCX/DOC → PDF → WebP
- Images (JPEG, PNG, etc.) → WebP resize

**Dependencies:**
- `python3-pil` - PIL/Pillow for image processing
- `poppler-utils` - PDF rendering
- `libreoffice` - DOCX to PDF conversion
- `pdf2image` - Python library for PDF to image conversion
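However the page is rendered, fitting it into the 600x800 preview box comes down to aspect-ratio arithmetic, which can be sketched without any imaging dependency. The helper is illustrative (the actual script may size differently, e.g. by upscaling small images):

```python
def fit_within(width: int, height: int, max_w: int = 600, max_h: int = 800) -> tuple[int, int]:
    """Scale (width, height) to fit inside a max_w x max_h box,
    preserving aspect ratio and never upscaling."""
    scale = min(max_w / width, max_h / height, 1.0)
    return (round(width * scale), round(height * scale))
```

An A4 page at 300 dpi (2480x3508 px) is height-limited, so it scales to 800 px tall; a small image passes through unchanged.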
## Running Tests

### All Tests
```bash
cd /opt/motia-iii/bitbylaw
python -m pytest tests/ -v
```

### Individual Tests
```bash
cd /opt/motia-iii/bitbylaw
python tests/test_xai_collections_api.py
python tests/test_preview_upload.py
python tests/test_thumbnail_generation.py
```
## Environment Setup

Create a `.env` file in `/opt/motia-iii/bitbylaw/`:
```bash
# xAI Collections API
XAI_MANAGEMENT_API_KEY=xai-token-xxx...
XAI_API_KEY=xai-xxx...

# EspoCRM API
ESPOCRM_URL=https://crm.bitbylaw.com
ESPOCRM_API_KEY=xxx...

# Redis (for locking)
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB_ADVOWARE_CACHE=1
```
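A typical way a test script might read these variables, applying the documented defaults where a value is optional; the helper name and dict keys are hypothetical, only the variable names and defaults come from this README:

```python
import os


def load_config() -> dict:
    """Read the test configuration from the environment, falling back to
    the defaults documented in the environment setup section."""
    return {
        'espocrm_url': os.environ.get('ESPOCRM_URL', 'https://crm.bitbylaw.com'),
        'espocrm_api_key': os.environ.get('ESPOCRM_API_KEY', ''),
        'xai_api_key': os.environ.get('XAI_API_KEY', ''),
        'xai_management_api_key': os.environ.get('XAI_MANAGEMENT_API_KEY', ''),
        'redis_host': os.environ.get('REDIS_HOST', 'localhost'),
        'redis_port': int(os.environ.get('REDIS_PORT', '6379')),
    }
```

Reading everything through one function keeps the required-variable list in a single place and makes a missing key (an empty string here) easy to assert on at script start.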
## Test Results

Last test run: successfully validated preview upload functionality.
- Preview upload works with base64 Data URI format
- Attachment created with ID: `69a71194c7c6baebf`
- CDokumente entity updated with previewId/previewName
- WebP format at 600x800px confirmed working