License Server - Admin Panel Integration

LIZENZSERVER.md (148)
@@ -459,30 +459,144 @@ Cache-Keys:
## Implementation Roadmap

### Phase 1: Foundation (Weeks 1-2) ✅ DONE

- [x] Docker setup for all services
- [x] Base database schema
- [x] JWT authentication
- [x] Basic API endpoints

### Phase 2: Core Features (Weeks 3-4) ✅ DONE

- [x] License validation logic
- [x] Device management
- [x] Heartbeat system
- [x] Admin API

### Phase 3: Advanced Features (Weeks 5-6) 🚧 IN PROGRESS

- [x] Offline token system
- [x] Anomaly detection (basic)
- [ ] Analytics service (detailed)
- [ ] Monitoring setup (Prometheus)

### Phase 4: Optimization (Weeks 7-8) 📋 PLANNED

- [x] Caching layer (Redis implemented)
- [ ] Performance tuning
- [ ] Load testing
- [ ] Documentation

## Current Implementation Status (as of 2025-06-18)

### ✅ Completed Components:

#### 1. **Microservices Architecture**

- **Auth Service** (port 5001): JWT token generation and validation
- **License API Service** (port 5002): license validation, activation, heartbeat
- **Docker Compose**: complete setup with Redis, RabbitMQ, PostgreSQL
- **Network**: shared `v2_network` for service-to-service communication

#### 2. **Database Extensions**

All new tables have been implemented:

- `license_tokens` - offline validation with token management
- `license_heartbeats` - partitioned table for heartbeat tracking
- `activation_events` - complete activation history
- `anomaly_detections` - anomaly tracking with severity levels
- `api_clients` & `api_rate_limits` - API key management
- `feature_flags` - feature toggle system
- `active_sessions` - session management for concurrent-use prevention

#### 3. **Repository Pattern & Services**

- `BaseRepository`: abstract base class for DB operations
- `LicenseRepository`: license-specific operations
- `CacheRepository`: Redis integration for performance
- `EventBus`: RabbitMQ-based event system for loose coupling

#### 4. **API Endpoints (Implemented)**

##### Public API:

- `POST /api/v1/license/validate` - online license validation
- `POST /api/v1/license/activate` - license activation on a new device
- `POST /api/v1/license/heartbeat` - heartbeat for active sessions
- `POST /api/v1/license/offline-token` - offline token generation
- `POST /api/v1/license/validate-offline` - offline token validation
##### Auth API:

- `POST /api/v1/auth/token` - generate an access token
- `POST /api/v1/auth/refresh` - refresh a token
- `POST /api/v1/auth/verify` - verify a token
- `POST /api/v1/auth/api-key` - create an API key (admin)

#### 5. **Admin Panel Integration**

New "Lizenzserver" menu item with the following subpages:

- **Live Monitor** (`/lizenzserver/monitor`):
  - real-time statistics (active licenses, validations/min)
  - top 10 active licenses
  - current anomalies
  - validation timeline with Chart.js
- **Analytics** (`/lizenzserver/analytics`): placeholder for detailed analyses
- **Anomalies** (`/lizenzserver/anomalies`):
  - anomaly list with filtering
  - anomaly resolution with audit log
- **Configuration** (`/lizenzserver/config`):
  - feature flag management
  - API client administration
  - rate limit configuration

#### 6. **Integration into the Customers & Licenses Overview**

- New "Server Status" column shows:
  - 💚 Online (active heartbeats within the last 5 minutes)
  - ⏱️ X min (last activity)
  - 💤 Offline (inactive for more than 1 hour)
  - ⚠️ number of unresolved anomalies
### 🚧 In Development:

1. **Analytics Service** (port 5003)
   - basic structure in place
   - detailed implementation outstanding

2. **Admin API Service** (port 5004)
   - structure prepared
   - implementation pending

### 📋 Still To Be Implemented:

1. **Monitoring & Observability**
   - Prometheus integration
   - Grafana dashboards
   - alert rules

2. **Extended Anomaly Detection**
   - machine-learning-based pattern recognition
   - geo-location anomalies
   - automatic actions on critical anomalies

3. **Performance Optimizations**
   - connection pooling
   - query optimization
   - batch processing for heartbeats

4. **Additional Features**
   - WebSocket for live updates
   - bulk operations
   - export functions
   - API documentation (OpenAPI/Swagger)
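In contrast to the ML-based detection planned above, the basic anomaly detection that already exists flags simple threshold violations; one of the configured thresholds is `ANOMALY_MULTIPLE_IPS_THRESHOLD = 5` in `config.py`. A self-contained sketch of that single check, with an assumed record shape (the real service reads validation events from the database):

```python
from collections import defaultdict

MULTIPLE_IPS_THRESHOLD = 5  # mirrors ANOMALY_MULTIPLE_IPS_THRESHOLD in config.py

def detect_multiple_ip_anomalies(validations: list[dict]) -> list[dict]:
    """Return one 'multiple_ips' anomaly per license seen from too many distinct IPs.

    Each validation record is assumed to carry 'license_id' and 'ip_address'.
    """
    ips_per_license: dict[str, set[str]] = defaultdict(set)
    for v in validations:
        ips_per_license[v["license_id"]].add(v["ip_address"])
    return [
        {"license_id": lic, "anomaly_type": "multiple_ips",
         "severity": "high" if len(ips) >= 2 * MULTIPLE_IPS_THRESHOLD else "medium",
         "details": {"distinct_ips": len(ips)}}
        for lic, ips in ips_per_license.items()
        if len(ips) >= MULTIPLE_IPS_THRESHOLD
    ]
```

The output rows map directly onto the `anomaly_detections` table (`anomaly_type`, `severity`, `details`).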
### 🔧 Technical Details:

- **Python version**: 3.11
- **Flask version**: 3.0.0
- **PostgreSQL**: 15 with UUID extension
- **Redis**: 7-alpine for caching
- **RabbitMQ**: 3-management for the event bus
- **JWT**: PyJWT 2.8.0
- **Psycopg2**: 2.9.9 for PostgreSQL

### 📝 Next Steps:

1. Fully implement the Analytics Service
2. Set up Prometheus monitoring
3. Run load tests
4. Create API documentation with Swagger
5. Prepare the Kubernetes deployment

## Testing Strategy

### Unit Tests
lizenzserver/config.py (new file, 89 lines)
@@ -0,0 +1,89 @@
```python
import os
from datetime import timedelta


class Config:
    """Base configuration with sensible defaults"""

    # Database
    DATABASE_URL = os.getenv('DATABASE_URL', 'postgresql://admin:adminpass@localhost:5432/v2')

    # Redis
    REDIS_URL = os.getenv('REDIS_URL', 'redis://localhost:6379')

    # RabbitMQ
    RABBITMQ_URL = os.getenv('RABBITMQ_URL', 'amqp://guest:guest@localhost:5672')

    # JWT
    JWT_SECRET = os.getenv('JWT_SECRET', 'change-this-in-production')
    JWT_ALGORITHM = 'HS256'
    JWT_ACCESS_TOKEN_EXPIRES = timedelta(hours=1)
    JWT_REFRESH_TOKEN_EXPIRES = timedelta(days=30)

    # API rate limiting
    DEFAULT_RATE_LIMIT_PER_MINUTE = 60
    DEFAULT_RATE_LIMIT_PER_HOUR = 1000
    DEFAULT_RATE_LIMIT_PER_DAY = 10000

    # Offline tokens
    MAX_OFFLINE_TOKEN_DURATION_HOURS = 72
    DEFAULT_OFFLINE_TOKEN_DURATION_HOURS = 24

    # Heartbeat settings
    HEARTBEAT_INTERVAL_SECONDS = 300  # 5 minutes
    HEARTBEAT_TIMEOUT_SECONDS = 900   # 15 minutes

    # Session settings
    MAX_CONCURRENT_SESSIONS = 1
    SESSION_TIMEOUT_MINUTES = 30

    # Cache TTL
    CACHE_TTL_VALIDATION = 300      # 5 minutes
    CACHE_TTL_LICENSE_STATUS = 60   # 1 minute
    CACHE_TTL_DEVICE_LIST = 300     # 5 minutes

    # Anomaly detection thresholds
    ANOMALY_RAPID_HARDWARE_CHANGE_MINUTES = 10
    ANOMALY_MULTIPLE_IPS_THRESHOLD = 5
    ANOMALY_GEO_DISTANCE_KM = 1000

    # Logging
    LOG_LEVEL = os.getenv('LOG_LEVEL', 'INFO')
    LOG_FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'

    # Service ports
    AUTH_SERVICE_PORT = int(os.getenv('PORT', 5001))
    LICENSE_API_PORT = int(os.getenv('PORT', 5002))
    ANALYTICS_SERVICE_PORT = int(os.getenv('PORT', 5003))
    ADMIN_API_PORT = int(os.getenv('PORT', 5004))


class DevelopmentConfig(Config):
    """Development configuration"""
    DEBUG = True
    TESTING = False


class ProductionConfig(Config):
    """Production configuration"""
    DEBUG = False
    TESTING = False

    # Override with production values
    JWT_SECRET = os.environ['JWT_SECRET']  # Required in production


class TestingConfig(Config):
    """Testing configuration"""
    DEBUG = True
    TESTING = True
    DATABASE_URL = 'postgresql://admin:adminpass@localhost:5432/v2_test'


# Configuration dictionary
config = {
    'development': DevelopmentConfig,
    'production': ProductionConfig,
    'testing': TestingConfig,
    'default': DevelopmentConfig
}


def get_config():
    """Get configuration based on environment"""
    env = os.getenv('FLASK_ENV', 'development')
    return config.get(env, config['default'])
```
lizenzserver/docker-compose.yaml (new file, 123 lines)
@@ -0,0 +1,123 @@
```yaml
version: '3.8'

services:
  license-auth:
    build: ./services/auth
    container_name: license-auth
    environment:
      - JWT_SECRET=${JWT_SECRET:-your-secret-key-change-in-production}
      - DATABASE_URL=postgresql://admin:adminpass@postgres:5432/v2
      - REDIS_URL=redis://redis:6379
      - PORT=5001
    ports:
      - "5001:5001"
    depends_on:
      - postgres
      - redis
    networks:
      - v2_network
    restart: unless-stopped

  license-api:
    build: ./services/license_api
    container_name: license-api
    environment:
      - DATABASE_URL=postgresql://admin:adminpass@postgres:5432/v2
      - REDIS_URL=redis://redis:6379
      - RABBITMQ_URL=amqp://guest:guest@rabbitmq:5672
      - JWT_SECRET=${JWT_SECRET:-your-secret-key-change-in-production}
      - PORT=5002
    ports:
      - "5002:5002"
    depends_on:
      - postgres
      - redis
      - rabbitmq
    networks:
      - v2_network
    restart: unless-stopped

  license-analytics:
    build: ./services/analytics
    container_name: license-analytics
    environment:
      - DATABASE_URL=postgresql://admin:adminpass@postgres:5432/v2
      - REDIS_URL=redis://redis:6379
      - RABBITMQ_URL=amqp://guest:guest@rabbitmq:5672
      - PORT=5003
    ports:
      - "5003:5003"
    depends_on:
      - postgres
      - redis
      - rabbitmq
    networks:
      - v2_network
    restart: unless-stopped

  license-admin-api:
    build: ./services/admin_api
    container_name: license-admin-api
    environment:
      - DATABASE_URL=postgresql://admin:adminpass@postgres:5432/v2
      - REDIS_URL=redis://redis:6379
      - RABBITMQ_URL=amqp://guest:guest@rabbitmq:5672
      - JWT_SECRET=${JWT_SECRET:-your-secret-key-change-in-production}
      - PORT=5004
    ports:
      - "5004:5004"
    depends_on:
      - postgres
      - redis
      - rabbitmq
    networks:
      - v2_network
    restart: unless-stopped

  postgres:
    image: postgres:15-alpine
    container_name: license-postgres
    environment:
      - POSTGRES_DB=v2
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=adminpass
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./init.sql:/docker-entrypoint-initdb.d/init.sql
    networks:
      - v2_network
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    container_name: license-redis
    command: redis-server --appendonly yes
    volumes:
      - redis_data:/data
    networks:
      - v2_network
    restart: unless-stopped

  rabbitmq:
    image: rabbitmq:3-management-alpine
    container_name: license-rabbitmq
    environment:
      - RABBITMQ_DEFAULT_USER=guest
      - RABBITMQ_DEFAULT_PASS=guest
    ports:
      - "5672:5672"
      - "15672:15672"
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq
    networks:
      - v2_network
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:
  rabbitmq_data:

networks:
  v2_network:
    external: true
```
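Because `v2_network` is declared `external: true`, Compose will not create it; it has to exist before the first start. A one-time setup, assuming a default Docker installation:

```shell
# Create the shared network once; all license-server services join it.
docker network create v2_network

# Then bring up the full stack (services, PostgreSQL, Redis, RabbitMQ).
docker compose up -d
```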
lizenzserver/events/event_bus.py (new file, 188 lines)
@@ -0,0 +1,188 @@
```python
import json
import logging
import threading
import uuid
from typing import Dict, Any, Callable, List
from datetime import datetime
from collections import defaultdict

import pika
from pika.exceptions import AMQPConnectionError

logger = logging.getLogger(__name__)


class Event:
    """Base event class"""

    def __init__(self, event_type: str, data: Dict[str, Any], source: str = "unknown"):
        self.id = self._generate_id()
        self.type = event_type
        self.data = data
        self.source = source
        self.timestamp = datetime.utcnow().isoformat()

    def _generate_id(self) -> str:
        return str(uuid.uuid4())

    def to_dict(self) -> Dict[str, Any]:
        return {
            "id": self.id,
            "type": self.type,
            "data": self.data,
            "source": self.source,
            "timestamp": self.timestamp
        }

    def to_json(self) -> str:
        return json.dumps(self.to_dict())


class EventBus:
    """Event bus for pub/sub pattern with RabbitMQ backend"""

    def __init__(self, rabbitmq_url: str):
        self.rabbitmq_url = rabbitmq_url
        self.connection = None
        self.channel = None
        self.exchange_name = "license_events"
        self.local_handlers: Dict[str, List[Callable]] = defaultdict(list)
        self._connect()

    def _connect(self):
        """Establish connection to RabbitMQ"""
        try:
            parameters = pika.URLParameters(self.rabbitmq_url)
            self.connection = pika.BlockingConnection(parameters)
            self.channel = self.connection.channel()

            # Declare exchange
            self.channel.exchange_declare(
                exchange=self.exchange_name,
                exchange_type='topic',
                durable=True
            )

            logger.info("Connected to RabbitMQ")
        except AMQPConnectionError as e:
            logger.error(f"Failed to connect to RabbitMQ: {e}")
            # Fall back to local-only event handling
            self.connection = None
            self.channel = None

    def publish(self, event: Event):
        """Publish an event"""
        try:
            # Publish to RabbitMQ if connected
            if self.channel and not self.channel.is_closed:
                self.channel.basic_publish(
                    exchange=self.exchange_name,
                    routing_key=event.type,
                    body=event.to_json(),
                    properties=pika.BasicProperties(
                        delivery_mode=2,  # Make message persistent
                        content_type='application/json'
                    )
                )
                logger.debug(f"Published event: {event.type}")

            # Also handle local subscribers
            self._handle_local_event(event)

        except Exception as e:
            logger.error(f"Error publishing event: {e}")
            # Ensure local handlers still get called
            self._handle_local_event(event)

    def subscribe(self, event_type: str, handler: Callable):
        """Subscribe to an event type locally"""
        self.local_handlers[event_type].append(handler)
        logger.debug(f"Subscribed to {event_type}")

    def subscribe_queue(self, event_types: List[str], queue_name: str, handler: Callable):
        """Subscribe to events via a RabbitMQ queue"""
        if not self.channel:
            logger.warning("RabbitMQ not connected, falling back to local subscription")
            for event_type in event_types:
                self.subscribe(event_type, handler)
            return

        try:
            # Declare queue
            self.channel.queue_declare(queue=queue_name, durable=True)

            # Bind queue to exchange for each event type
            for event_type in event_types:
                self.channel.queue_bind(
                    exchange=self.exchange_name,
                    queue=queue_name,
                    routing_key=event_type
                )

            # Set up consumer
            def callback(ch, method, properties, body):
                try:
                    event_data = json.loads(body)
                    event = Event(
                        event_type=event_data['type'],
                        data=event_data['data'],
                        source=event_data['source']
                    )
                    handler(event)
                    ch.basic_ack(delivery_tag=method.delivery_tag)
                except Exception as e:
                    logger.error(f"Error handling event: {e}")
                    ch.basic_nack(delivery_tag=method.delivery_tag, requeue=True)

            self.channel.basic_consume(queue=queue_name, on_message_callback=callback)

            # Start consuming in a separate thread
            consumer_thread = threading.Thread(target=self.channel.start_consuming)
            consumer_thread.daemon = True
            consumer_thread.start()

            logger.info(f"Started consuming from queue: {queue_name}")

        except Exception as e:
            logger.error(f"Error setting up queue subscription: {e}")

    def _handle_local_event(self, event: Event):
        """Handle event with local subscribers"""
        handlers = self.local_handlers.get(event.type, [])
        for handler in handlers:
            try:
                handler(event)
            except Exception as e:
                logger.error(f"Error in event handler: {e}")

    def close(self):
        """Close RabbitMQ connection"""
        if self.connection and not self.connection.is_closed:
            self.connection.close()
            logger.info("Closed RabbitMQ connection")


# Event types
class EventTypes:
    """Centralized event type definitions"""

    # License events
    LICENSE_VALIDATED = "license.validated"
    LICENSE_VALIDATION_FAILED = "license.validation.failed"
    LICENSE_ACTIVATED = "license.activated"
    LICENSE_DEACTIVATED = "license.deactivated"
    LICENSE_TRANSFERRED = "license.transferred"
    LICENSE_EXPIRED = "license.expired"

    # Device events
    DEVICE_ADDED = "device.added"
    DEVICE_REMOVED = "device.removed"
    DEVICE_BLOCKED = "device.blocked"

    # Anomaly events
    ANOMALY_DETECTED = "anomaly.detected"
    ANOMALY_RESOLVED = "anomaly.resolved"

    # Session events
    SESSION_STARTED = "session.started"
    SESSION_ENDED = "session.ended"
    SESSION_EXPIRED = "session.expired"

    # System events
    RATE_LIMIT_EXCEEDED = "system.rate_limit_exceeded"
    API_ERROR = "system.api_error"
```
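The event types above are dotted strings because they double as routing keys on a RabbitMQ topic exchange: a consumer bound with `license.*` receives every single-word license event, and `license.#` also catches `license.validation.failed`. A self-contained sketch of that matching rule (`*` matches exactly one word, `#` matches zero or more), for intuition only; the real matching happens inside RabbitMQ:

```python
def topic_matches(pattern: str, routing_key: str) -> bool:
    """AMQP topic semantics: '*' matches one dot-separated word, '#' zero or more."""
    def match(p: list[str], k: list[str]) -> bool:
        if not p:
            return not k
        if p[0] == "#":
            # '#' can swallow zero words, or one word and keep going
            return match(p[1:], k) or (bool(k) and match(p, k[1:]))
        if not k:
            return False
        if p[0] == "*" or p[0] == k[0]:
            return match(p[1:], k[1:])
        return False
    return match(pattern.split("."), routing_key.split("."))
```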
lizenzserver/init.sql (new file, 177 lines)
@@ -0,0 +1,177 @@
```sql
-- License Server Database Schema
-- Following best practices: snake_case for DB fields, clear naming conventions

-- Enable UUID extension
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- License tokens for offline validation
CREATE TABLE IF NOT EXISTS license_tokens (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id UUID REFERENCES licenses(id) ON DELETE CASCADE,
    token VARCHAR(512) NOT NULL UNIQUE,
    hardware_id VARCHAR(255) NOT NULL,
    valid_until TIMESTAMP NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    last_validated TIMESTAMP,
    validation_count INTEGER DEFAULT 0
);

-- IF NOT EXISTS added so the script stays idempotent, matching the CREATE TABLE statements
CREATE INDEX IF NOT EXISTS idx_token ON license_tokens(token);
CREATE INDEX IF NOT EXISTS idx_hardware ON license_tokens(hardware_id);
CREATE INDEX IF NOT EXISTS idx_valid_until ON license_tokens(valid_until);

-- Heartbeat tracking with partitioning support
CREATE TABLE IF NOT EXISTS license_heartbeats (
    id BIGSERIAL,
    license_id UUID REFERENCES licenses(id) ON DELETE CASCADE,
    hardware_id VARCHAR(255) NOT NULL,
    ip_address INET,
    user_agent VARCHAR(500),
    app_version VARCHAR(50),
    timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    session_data JSONB,
    PRIMARY KEY (id, timestamp)
) PARTITION BY RANGE (timestamp);

-- Create partitions for the current and next month
CREATE TABLE IF NOT EXISTS license_heartbeats_2025_01 PARTITION OF license_heartbeats
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');

CREATE TABLE IF NOT EXISTS license_heartbeats_2025_02 PARTITION OF license_heartbeats
    FOR VALUES FROM ('2025-02-01') TO ('2025-03-01');

CREATE INDEX IF NOT EXISTS idx_heartbeat_license_time ON license_heartbeats(license_id, timestamp DESC);
CREATE INDEX IF NOT EXISTS idx_heartbeat_hardware_time ON license_heartbeats(hardware_id, timestamp DESC);

-- Activation events tracking
CREATE TABLE IF NOT EXISTS activation_events (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id UUID REFERENCES licenses(id) ON DELETE CASCADE,
    event_type VARCHAR(50) NOT NULL CHECK (event_type IN ('activation', 'deactivation', 'reactivation', 'transfer')),
    hardware_id VARCHAR(255),
    previous_hardware_id VARCHAR(255),
    ip_address INET,
    user_agent VARCHAR(500),
    success BOOLEAN DEFAULT true,
    error_message TEXT,
    metadata JSONB,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_license_events ON activation_events(license_id, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_event_type ON activation_events(event_type, created_at DESC);

-- API rate limiting
CREATE TABLE IF NOT EXISTS api_rate_limits (
    id SERIAL PRIMARY KEY,
    api_key VARCHAR(255) NOT NULL UNIQUE,
    requests_per_minute INTEGER DEFAULT 60,
    requests_per_hour INTEGER DEFAULT 1000,
    requests_per_day INTEGER DEFAULT 10000,
    burst_size INTEGER DEFAULT 100,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Anomaly detection
CREATE TABLE IF NOT EXISTS anomaly_detections (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id UUID REFERENCES licenses(id),
    anomaly_type VARCHAR(100) NOT NULL CHECK (anomaly_type IN ('multiple_ips', 'rapid_hardware_change', 'suspicious_pattern', 'concurrent_use', 'geo_anomaly')),
    severity VARCHAR(20) NOT NULL CHECK (severity IN ('low', 'medium', 'high', 'critical')),
    details JSONB NOT NULL,
    detected_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    resolved BOOLEAN DEFAULT false,
    resolved_at TIMESTAMP,
    resolved_by VARCHAR(255),
    action_taken TEXT
);

CREATE INDEX IF NOT EXISTS idx_unresolved ON anomaly_detections(resolved, severity, detected_at DESC);
CREATE INDEX IF NOT EXISTS idx_license_anomalies ON anomaly_detections(license_id, detected_at DESC);

-- API clients for authentication
CREATE TABLE IF NOT EXISTS api_clients (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    client_name VARCHAR(255) NOT NULL,
    api_key VARCHAR(255) NOT NULL UNIQUE,
    secret_key VARCHAR(255) NOT NULL,
    is_active BOOLEAN DEFAULT true,
    allowed_endpoints TEXT[],
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Feature flags for gradual rollout
CREATE TABLE IF NOT EXISTS feature_flags (
    id SERIAL PRIMARY KEY,
    feature_name VARCHAR(100) NOT NULL UNIQUE,
    is_enabled BOOLEAN DEFAULT false,
    rollout_percentage INTEGER DEFAULT 0 CHECK (rollout_percentage >= 0 AND rollout_percentage <= 100),
    whitelist_license_ids UUID[],
    blacklist_license_ids UUID[],
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Insert default feature flags
INSERT INTO feature_flags (feature_name, is_enabled, rollout_percentage) VALUES
    ('anomaly_detection', true, 100),
    ('offline_tokens', true, 100),
    ('advanced_analytics', false, 0),
    ('geo_restriction', false, 0)
ON CONFLICT (feature_name) DO NOTHING;

-- Session management for concurrent use tracking
CREATE TABLE IF NOT EXISTS active_sessions (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id UUID REFERENCES licenses(id) ON DELETE CASCADE,
    hardware_id VARCHAR(255) NOT NULL,
    session_token VARCHAR(512) NOT NULL UNIQUE,
    ip_address INET,
    started_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    last_seen TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    expires_at TIMESTAMP NOT NULL
);

CREATE INDEX IF NOT EXISTS idx_session_license ON active_sessions(license_id);
CREATE INDEX IF NOT EXISTS idx_session_expires ON active_sessions(expires_at);

-- Update trigger for updated_at columns
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = CURRENT_TIMESTAMP;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE OR REPLACE TRIGGER update_api_rate_limits_updated_at BEFORE UPDATE ON api_rate_limits
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

CREATE OR REPLACE TRIGGER update_api_clients_updated_at BEFORE UPDATE ON api_clients
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

CREATE OR REPLACE TRIGGER update_feature_flags_updated_at BEFORE UPDATE ON feature_flags
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

-- Function to automatically create monthly partitions for heartbeats
CREATE OR REPLACE FUNCTION create_monthly_partition()
RETURNS void AS $$
DECLARE
    start_date date;
    end_date date;
    partition_name text;
BEGIN
    start_date := date_trunc('month', CURRENT_DATE + interval '1 month');
    end_date := start_date + interval '1 month';
    partition_name := 'license_heartbeats_' || to_char(start_date, 'YYYY_MM');

    EXECUTE format('CREATE TABLE IF NOT EXISTS %I PARTITION OF license_heartbeats FOR VALUES FROM (%L) TO (%L)',
                   partition_name, start_date, end_date);
END;
$$ LANGUAGE plpgsql;

-- Create a scheduled job to create partitions (requires pg_cron extension)
-- This is a placeholder - actual scheduling depends on your PostgreSQL setup
-- SELECT cron.schedule('create-partitions', '0 0 1 * *', 'SELECT create_monthly_partition();');
```
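The `active_sessions` table together with `MAX_CONCURRENT_SESSIONS = 1` from `config.py` implements the concurrent-use prevention mentioned in the overview: a new session is refused while another unexpired session for the same license exists. A minimal in-memory sketch of that rule (the service itself would query the table above; names here are illustrative):

```python
from datetime import datetime, timedelta

class SessionRegistry:
    """Concurrent-use prevention: at most `max_sessions` live sessions per license."""

    def __init__(self, max_sessions: int = 1, timeout_minutes: int = 30):
        self.max_sessions = max_sessions
        self.timeout = timedelta(minutes=timeout_minutes)
        self._sessions: dict[str, list[datetime]] = {}  # license_id -> session expiry times

    def try_start(self, license_id: str, now: datetime) -> bool:
        # Drop expired sessions first, then apply the concurrency limit.
        live = [exp for exp in self._sessions.get(license_id, []) if exp > now]
        if len(live) >= self.max_sessions:
            self._sessions[license_id] = live
            return False
        live.append(now + self.timeout)
        self._sessions[license_id] = live
        return True
```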
lizenzserver/models/__init__.py (new file, 127 lines)
@@ -0,0 +1,127 @@
```python
from datetime import datetime
from typing import Optional, List, Dict, Any
from dataclasses import dataclass, field
from enum import Enum


class EventType(Enum):
    """License event types"""
    ACTIVATION = "activation"
    DEACTIVATION = "deactivation"
    REACTIVATION = "reactivation"
    TRANSFER = "transfer"


class AnomalyType(Enum):
    """Anomaly detection types"""
    MULTIPLE_IPS = "multiple_ips"
    RAPID_HARDWARE_CHANGE = "rapid_hardware_change"
    SUSPICIOUS_PATTERN = "suspicious_pattern"
    CONCURRENT_USE = "concurrent_use"
    GEO_ANOMALY = "geo_anomaly"


class Severity(Enum):
    """Anomaly severity levels"""
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"
    CRITICAL = "critical"


@dataclass
class License:
    """License domain model"""
    id: str
    license_key: str
    customer_id: str
    max_devices: int
    is_active: bool
    is_test: bool
    created_at: datetime
    updated_at: datetime
    expires_at: Optional[datetime] = None
    features: List[str] = field(default_factory=list)
    metadata: Dict[str, Any] = field(default_factory=dict)


@dataclass
class LicenseToken:
    """Offline validation token"""
    id: str
    license_id: str
    token: str
    hardware_id: str
    valid_until: datetime
    created_at: datetime
    last_validated: Optional[datetime] = None
    validation_count: int = 0


@dataclass
class Heartbeat:
    """License heartbeat"""
    id: int
    license_id: str
    hardware_id: str
    ip_address: Optional[str]
    user_agent: Optional[str]
    app_version: Optional[str]
    timestamp: datetime
    session_data: Dict[str, Any] = field(default_factory=dict)


@dataclass
class ActivationEvent:
    """License activation event"""
    id: str
    license_id: str
    event_type: EventType
    hardware_id: Optional[str]
    previous_hardware_id: Optional[str]
    ip_address: Optional[str]
    user_agent: Optional[str]
    success: bool
    error_message: Optional[str]
    created_at: datetime
    # metadata moved after created_at: dataclass fields without defaults
    # must precede fields with defaults, otherwise class creation raises TypeError
    metadata: Dict[str, Any] = field(default_factory=dict)


@dataclass
class AnomalyDetection:
    """Detected anomaly"""
    id: str
    license_id: str
    anomaly_type: AnomalyType
    severity: Severity
    details: Dict[str, Any]
    detected_at: datetime
    resolved: bool = False
    resolved_at: Optional[datetime] = None
    resolved_by: Optional[str] = None
    action_taken: Optional[str] = None


@dataclass
class Session:
    """Active session"""
    id: str
    license_id: str
    hardware_id: str
    session_token: str
    ip_address: Optional[str]
    started_at: datetime
    last_seen: datetime
    expires_at: datetime


@dataclass
class ValidationRequest:
    """License validation request"""
    license_key: str
    hardware_id: str
    app_version: Optional[str] = None
    ip_address: Optional[str] = None
    user_agent: Optional[str] = None


@dataclass
class ValidationResponse:
    """License validation response"""
    valid: bool
```
|
license_id: Optional[str] = None
|
||||||
|
token: Optional[str] = None
|
||||||
|
expires_at: Optional[datetime] = None
|
||||||
|
features: List[str] = field(default_factory=list)
|
||||||
|
limits: Dict[str, Any] = field(default_factory=dict)
|
||||||
|
error: Optional[str] = None
|
||||||
|
error_code: Optional[str] = None
|
||||||
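As a quick illustration of how the dataclass models above are meant to be consumed (the class is re-declared here in trimmed form so the snippet stands alone), a validation result can be built and serialized with `dataclasses.asdict`:

```python
from dataclasses import dataclass, field, asdict
from typing import Optional, List

# Trimmed stand-in for the ValidationResponse model above (illustration only).
@dataclass
class ValidationResponse:
    valid: bool
    license_id: Optional[str] = None
    features: List[str] = field(default_factory=list)
    error: Optional[str] = None

ok = ValidationResponse(valid=True, license_id="lic-123", features=["pro"])
print(asdict(ok))  # {'valid': True, 'license_id': 'lic-123', 'features': ['pro'], 'error': None}
```

Since the API services return JSON, `asdict` output can be passed straight to `jsonify` once the `datetime` fields are ISO-formatted.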
lizenzserver/repositories/base.py (94 lines, new file)
@@ -0,0 +1,94 @@
from abc import ABC
from typing import Optional, List, Dict, Any
import psycopg2
from psycopg2.extras import RealDictCursor
from contextlib import contextmanager
import logging

logger = logging.getLogger(__name__)


class BaseRepository(ABC):
    """Base repository with common database operations"""

    def __init__(self, db_url: str):
        self.db_url = db_url

    @contextmanager
    def get_db_connection(self):
        """Get database connection with automatic cleanup"""
        conn = None
        try:
            conn = psycopg2.connect(self.db_url)
            yield conn
        except Exception as e:
            if conn:
                conn.rollback()
            logger.error(f"Database error: {e}")
            raise
        finally:
            if conn:
                conn.close()

    @contextmanager
    def get_db_cursor(self, conn):
        """Get database cursor with dict results"""
        cursor = None
        try:
            cursor = conn.cursor(cursor_factory=RealDictCursor)
            yield cursor
        finally:
            if cursor:
                cursor.close()

    def execute_query(self, query: str, params: tuple = None) -> List[Dict[str, Any]]:
        """Execute SELECT query and return results"""
        with self.get_db_connection() as conn:
            with self.get_db_cursor(conn) as cursor:
                cursor.execute(query, params)
                return cursor.fetchall()

    def execute_one(self, query: str, params: tuple = None) -> Optional[Dict[str, Any]]:
        """Execute query and return single result"""
        with self.get_db_connection() as conn:
            with self.get_db_cursor(conn) as cursor:
                cursor.execute(query, params)
                return cursor.fetchone()

    def execute_insert(self, query: str, params: tuple = None) -> Optional[str]:
        """Execute INSERT query and return ID"""
        # Append RETURNING id only if the caller's query does not already contain one
        if "returning" not in query.lower():
            query = query + " RETURNING id"
        with self.get_db_connection() as conn:
            with self.get_db_cursor(conn) as cursor:
                cursor.execute(query, params)
                result = cursor.fetchone()
                conn.commit()
                return result['id'] if result else None

    def execute_update(self, query: str, params: tuple = None) -> int:
        """Execute UPDATE query and return affected rows"""
        with self.get_db_connection() as conn:
            with self.get_db_cursor(conn) as cursor:
                cursor.execute(query, params)
                affected = cursor.rowcount
                conn.commit()
                return affected

    def execute_delete(self, query: str, params: tuple = None) -> int:
        """Execute DELETE query and return affected rows"""
        with self.get_db_connection() as conn:
            with self.get_db_cursor(conn) as cursor:
                cursor.execute(query, params)
                affected = cursor.rowcount
                conn.commit()
                return affected

    def execute_batch(self, queries: List[tuple]) -> None:
        """Execute multiple (query, params) pairs in one transaction"""
        with self.get_db_connection() as conn:
            with self.get_db_cursor(conn) as cursor:
                try:
                    for query, params in queries:
                        cursor.execute(query, params)
                    conn.commit()
                except Exception:
                    conn.rollback()
                    raise
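The rollback-on-error behaviour of `get_db_connection` can be exercised without PostgreSQL by substituting a dummy connection object. The names below (`FakeConn`, the free-standing `get_db_connection`) are illustrative stand-ins, not part of the repository:

```python
from contextlib import contextmanager

class FakeConn:
    """Stand-in for a psycopg2 connection, recording what was called."""
    def __init__(self):
        self.rolled_back = False
        self.closed = False
    def rollback(self):
        self.rolled_back = True
    def close(self):
        self.closed = True

@contextmanager
def get_db_connection(conn):
    # Same shape as BaseRepository.get_db_connection:
    # rollback on error, re-raise, always close.
    try:
        yield conn
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()

conn = FakeConn()
try:
    with get_db_connection(conn):
        raise RuntimeError("query failed")
except RuntimeError:
    pass

print(conn.rolled_back, conn.closed)  # True True
```

The `finally` clause guarantees the connection is returned even when the caller's block raises, which is the property the repository relies on.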
lizenzserver/repositories/cache_repo.py (178 lines, new file)
@@ -0,0 +1,178 @@
import redis
import json
import logging
import time
from typing import Optional, Any, Dict, List
from datetime import timedelta

logger = logging.getLogger(__name__)


class CacheRepository:
    """Redis cache repository"""

    def __init__(self, redis_url: str):
        self.redis_url = redis_url
        self._connect()

    def _connect(self):
        """Connect to Redis"""
        try:
            self.redis = redis.from_url(self.redis_url, decode_responses=True)
            self.redis.ping()
            logger.info("Connected to Redis")
        except Exception as e:
            logger.error(f"Failed to connect to Redis: {e}")
            self.redis = None

    def _make_key(self, prefix: str, *args) -> str:
        """Create cache key"""
        parts = [prefix] + [str(arg) for arg in args]
        return ":".join(parts)

    def get(self, key: str) -> Optional[Any]:
        """Get value from cache"""
        if not self.redis:
            return None

        try:
            value = self.redis.get(key)
            if value:
                return json.loads(value)
            return None
        except Exception as e:
            logger.error(f"Cache get error: {e}")
            return None

    def set(self, key: str, value: Any, ttl: int = 300) -> bool:
        """Set value in cache with TTL in seconds"""
        if not self.redis:
            return False

        try:
            json_value = json.dumps(value)
            return self.redis.setex(key, ttl, json_value)
        except Exception as e:
            logger.error(f"Cache set error: {e}")
            return False

    def delete(self, key: str) -> bool:
        """Delete key from cache"""
        if not self.redis:
            return False

        try:
            return bool(self.redis.delete(key))
        except Exception as e:
            logger.error(f"Cache delete error: {e}")
            return False

    def delete_pattern(self, pattern: str) -> int:
        """Delete all keys matching pattern"""
        if not self.redis:
            return 0

        try:
            keys = self.redis.keys(pattern)
            if keys:
                return self.redis.delete(*keys)
            return 0
        except Exception as e:
            logger.error(f"Cache delete pattern error: {e}")
            return 0

    # License-specific cache methods

    def get_license_validation(self, license_key: str, hardware_id: str) -> Optional[Dict[str, Any]]:
        """Get cached license validation result"""
        key = self._make_key("license:validation", license_key, hardware_id)
        return self.get(key)

    def set_license_validation(self, license_key: str, hardware_id: str,
                               result: Dict[str, Any], ttl: int = 300) -> bool:
        """Cache license validation result"""
        key = self._make_key("license:validation", license_key, hardware_id)
        return self.set(key, result, ttl)

    def get_license_status(self, license_id: str) -> Optional[Dict[str, Any]]:
        """Get cached license status"""
        key = self._make_key("license:status", license_id)
        return self.get(key)

    def set_license_status(self, license_id: str, status: Dict[str, Any],
                           ttl: int = 60) -> bool:
        """Cache license status"""
        key = self._make_key("license:status", license_id)
        return self.set(key, status, ttl)

    def get_device_list(self, license_id: str) -> Optional[List[Dict[str, Any]]]:
        """Get cached device list"""
        key = self._make_key("license:devices", license_id)
        return self.get(key)

    def set_device_list(self, license_id: str, devices: List[Dict[str, Any]],
                        ttl: int = 300) -> bool:
        """Cache device list"""
        key = self._make_key("license:devices", license_id)
        return self.set(key, devices, ttl)

    def invalidate_license_cache(self, license_id: str) -> None:
        """Invalidate all cache entries for a license"""
        # NOTE: validation entries are keyed by license_key, not license_id,
        # so the first pattern only matches if both identifiers coincide.
        patterns = [
            f"license:validation:*:{license_id}",
            f"license:status:{license_id}",
            f"license:devices:{license_id}"
        ]

        for pattern in patterns:
            self.delete_pattern(pattern)

    # Rate limiting methods

    def check_rate_limit(self, key: str, limit: int, window: int) -> tuple[bool, int]:
        """Check if rate limit is exceeded

        Returns: (is_allowed, current_count)
        """
        if not self.redis:
            return True, 0

        try:
            pipe = self.redis.pipeline()
            now = int(time.time())
            window_start = now - window

            # Remove old entries
            pipe.zremrangebyscore(key, 0, window_start)

            # Count requests in current window
            pipe.zcard(key)

            # Add current request
            pipe.zadd(key, {str(now): now})

            # Set expiry
            pipe.expire(key, window + 1)

            results = pipe.execute()
            current_count = results[1]

            return current_count < limit, current_count + 1

        except Exception as e:
            logger.error(f"Rate limit check error: {e}")
            return True, 0

    def increment_counter(self, key: str, window: int = 3600) -> int:
        """Increment counter with expiry"""
        if not self.redis:
            return 0

        try:
            pipe = self.redis.pipeline()
            pipe.incr(key)
            pipe.expire(key, window)
            results = pipe.execute()
            return results[0]
        except Exception as e:
            logger.error(f"Counter increment error: {e}")
            return 0
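The sorted-set logic in `check_rate_limit` maps onto a plain sliding window. This Redis-free stand-in (illustrative only, using a list in place of a sorted set) runs the same prune-count-record sequence:

```python
import time

def check_rate_limit(window_entries, limit, window, now=None):
    """Sliding-window check: prune old timestamps, count, then record the request."""
    now = int(time.time()) if now is None else now
    window_start = now - window
    # Drop entries outside the window (ZREMRANGEBYSCORE equivalent)
    window_entries[:] = [t for t in window_entries if t > window_start]
    current = len(window_entries)  # ZCARD equivalent
    window_entries.append(now)     # ZADD equivalent
    return current < limit, current + 1

entries = []
results = [check_rate_limit(entries, limit=3, window=60, now=100 + i) for i in range(5)]
print([allowed for allowed, _ in results])  # [True, True, True, False, False]
```

As in the Redis version, the request is recorded even when it is rejected, so a client hammering the endpoint keeps its window full.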
lizenzserver/repositories/license_repo.py (228 lines, new file)
@@ -0,0 +1,228 @@
import json
import logging
import secrets
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta

from .base import BaseRepository
from ..models import License, LicenseToken, ActivationEvent, EventType

logger = logging.getLogger(__name__)


class LicenseRepository(BaseRepository):
    """Repository for license-related database operations"""

    def get_license_by_key(self, license_key: str) -> Optional[Dict[str, Any]]:
        """Get license by key"""
        query = """
            SELECT l.*, c.name as customer_name, c.email as customer_email
            FROM licenses l
            JOIN customers c ON l.customer_id = c.id
            WHERE l.license_key = %s
        """
        return self.execute_one(query, (license_key,))

    def get_license_by_id(self, license_id: str) -> Optional[Dict[str, Any]]:
        """Get license by ID"""
        query = """
            SELECT l.*, c.name as customer_name, c.email as customer_email
            FROM licenses l
            JOIN customers c ON l.customer_id = c.id
            WHERE l.id = %s
        """
        return self.execute_one(query, (license_id,))

    def get_active_devices(self, license_id: str) -> List[Dict[str, Any]]:
        """Get active devices for a license"""
        query = """
            SELECT DISTINCT ON (hardware_id)
                hardware_id,
                ip_address,
                user_agent,
                app_version,
                timestamp as last_seen
            FROM license_heartbeats
            WHERE license_id = %s
              AND timestamp > NOW() - INTERVAL '15 minutes'
            ORDER BY hardware_id, timestamp DESC
        """
        return self.execute_query(query, (license_id,))

    def get_device_count(self, license_id: str) -> int:
        """Get count of active devices"""
        query = """
            SELECT COUNT(DISTINCT hardware_id) as device_count
            FROM license_heartbeats
            WHERE license_id = %s
              AND timestamp > NOW() - INTERVAL '15 minutes'
        """
        result = self.execute_one(query, (license_id,))
        return result['device_count'] if result else 0

    def create_license_token(self, license_id: str, hardware_id: str,
                             valid_hours: int = 24) -> Optional[str]:
        """Create offline validation token"""
        token = secrets.token_urlsafe(64)
        valid_until = datetime.utcnow() + timedelta(hours=valid_hours)

        # execute_insert appends RETURNING id itself, so the query omits it
        query = """
            INSERT INTO license_tokens (license_id, token, hardware_id, valid_until)
            VALUES (%s, %s, %s, %s)
        """

        result = self.execute_insert(query, (license_id, token, hardware_id, valid_until))
        return token if result else None

    def validate_token(self, token: str) -> Optional[Dict[str, Any]]:
        """Validate offline token"""
        query = """
            SELECT lt.*, l.license_key, l.is_active, l.expires_at
            FROM license_tokens lt
            JOIN licenses l ON lt.license_id = l.id
            WHERE lt.token = %s
              AND lt.valid_until > NOW()
              AND l.is_active = true
        """

        result = self.execute_one(query, (token,))

        if result:
            # Update validation count and timestamp
            update_query = """
                UPDATE license_tokens
                SET validation_count = validation_count + 1,
                    last_validated = NOW()
                WHERE token = %s
            """
            self.execute_update(update_query, (token,))

        return result

    def record_heartbeat(self, license_id: str, hardware_id: str,
                         ip_address: str = None, user_agent: str = None,
                         app_version: str = None, session_data: Dict = None) -> None:
        """Record license heartbeat"""
        query = """
            INSERT INTO license_heartbeats
            (license_id, hardware_id, ip_address, user_agent, app_version, session_data)
            VALUES (%s, %s, %s, %s, %s, %s)
        """

        session_json = json.dumps(session_data) if session_data else None

        self.execute_insert(query, (
            license_id, hardware_id, ip_address,
            user_agent, app_version, session_json
        ))

    def record_activation_event(self, license_id: str, event_type: EventType,
                                hardware_id: str = None, previous_hardware_id: str = None,
                                ip_address: str = None, user_agent: str = None,
                                success: bool = True, error_message: str = None,
                                metadata: Dict = None) -> str:
        """Record activation event"""
        # execute_insert appends RETURNING id itself, so the query omits it
        query = """
            INSERT INTO activation_events
            (license_id, event_type, hardware_id, previous_hardware_id,
             ip_address, user_agent, success, error_message, metadata)
            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
        """

        metadata_json = json.dumps(metadata) if metadata else None

        return self.execute_insert(query, (
            license_id, event_type.value, hardware_id, previous_hardware_id,
            ip_address, user_agent, success, error_message, metadata_json
        ))

    def get_recent_activations(self, license_id: str, hours: int = 24) -> List[Dict[str, Any]]:
        """Get recent activation events"""
        # INTERVAL '%s hours' cannot be parameterized; make_interval can
        query = """
            SELECT * FROM activation_events
            WHERE license_id = %s
              AND created_at > NOW() - make_interval(hours => %s)
            ORDER BY created_at DESC
        """
        return self.execute_query(query, (license_id, hours))

    def check_hardware_id_exists(self, license_id: str, hardware_id: str) -> bool:
        """Check if hardware ID is already registered"""
        query = """
            SELECT 1 FROM activation_events
            WHERE license_id = %s
              AND hardware_id = %s
              AND event_type IN ('activation', 'reactivation')
              AND success = true
            LIMIT 1
        """
        result = self.execute_one(query, (license_id, hardware_id))
        return result is not None

    def deactivate_device(self, license_id: str, hardware_id: str) -> bool:
        """Deactivate a device"""
        # Record deactivation event
        self.record_activation_event(
            license_id=license_id,
            event_type=EventType.DEACTIVATION,
            hardware_id=hardware_id,
            success=True
        )

        # Remove any active tokens for this device
        query = """
            DELETE FROM license_tokens
            WHERE license_id = %s AND hardware_id = %s
        """
        affected = self.execute_delete(query, (license_id, hardware_id))

        return affected > 0

    def transfer_license(self, license_id: str, from_hardware_id: str,
                         to_hardware_id: str, ip_address: str = None) -> bool:
        """Transfer license from one device to another"""
        try:
            # Deactivate old device
            self.deactivate_device(license_id, from_hardware_id)

            # Record transfer event
            self.record_activation_event(
                license_id=license_id,
                event_type=EventType.TRANSFER,
                hardware_id=to_hardware_id,
                previous_hardware_id=from_hardware_id,
                ip_address=ip_address,
                success=True
            )

            return True

        except Exception as e:
            logger.error(f"License transfer failed: {e}")
            return False

    def get_license_usage_stats(self, license_id: str, days: int = 30) -> Dict[str, Any]:
        """Get usage statistics for a license"""
        # make_interval is used so the day count can be passed as a parameter
        query = """
            WITH daily_stats AS (
                SELECT
                    DATE(timestamp) as date,
                    COUNT(*) as validations,
                    COUNT(DISTINCT hardware_id) as unique_devices,
                    COUNT(DISTINCT ip_address) as unique_ips
                FROM license_heartbeats
                WHERE license_id = %s
                  AND timestamp > NOW() - make_interval(days => %s)
                GROUP BY DATE(timestamp)
            )
            SELECT
                COUNT(*) as total_days,
                SUM(validations) as total_validations,
                AVG(validations) as avg_daily_validations,
                MAX(unique_devices) as max_devices,
                MAX(unique_ips) as max_ips
            FROM daily_stats
        """

        return self.execute_one(query, (license_id, days)) or {}
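The token generation in `create_license_token` boils down to an unguessable URL-safe secret plus an expiry timestamp; a database-free sketch of that computation (the helper name `make_offline_token` is illustrative, and timezone-aware UTC is used here where the repository uses `utcnow`):

```python
import secrets
from datetime import datetime, timedelta, timezone

def make_offline_token(valid_hours: int = 24):
    """Mirror of the token/expiry computation in create_license_token (no DB insert)."""
    token = secrets.token_urlsafe(64)  # 64 random bytes -> 86 URL-safe characters
    valid_until = datetime.now(timezone.utc) + timedelta(hours=valid_hours)
    return token, valid_until

token, valid_until = make_offline_token()
print(len(token), valid_until > datetime.now(timezone.utc))
```

`secrets.token_urlsafe` draws from the OS CSPRNG, so the token's entropy, not the database, is what makes offline validation safe to cache on the client.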
lizenzserver/services/auth/Dockerfile (25 lines, new file)
@@ -0,0 +1,25 @@
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Create non-root user
RUN useradd -m -u 1000 appuser && chown -R appuser:appuser /app
USER appuser

# Expose port
EXPOSE 5001

# Run with gunicorn
CMD ["gunicorn", "--bind", "0.0.0.0:5001", "--workers", "4", "--timeout", "120", "app:app"]
274
lizenzserver/services/auth/app.py
Normale Datei
274
lizenzserver/services/auth/app.py
Normale Datei
@@ -0,0 +1,274 @@
|
|||||||
|
import os
|
||||||
|
import sys
|
||||||
|
from flask import Flask, request, jsonify
|
||||||
|
from flask_cors import CORS
|
||||||
|
import jwt
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
import logging
|
||||||
|
from functools import wraps
|
||||||
|
|
||||||
|
# Add parent directory to path for imports
|
||||||
|
sys.path.append(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))
|
||||||
|
|
||||||
|
from config import get_config
|
||||||
|
from repositories.base import BaseRepository
|
||||||
|
|
||||||
|
# Configure logging
|
||||||
|
logging.basicConfig(level=logging.INFO)
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
# Initialize Flask app
|
||||||
|
app = Flask(__name__)
|
||||||
|
config = get_config()
|
||||||
|
app.config.from_object(config)
|
||||||
|
CORS(app)
|
||||||
|
|
||||||
|
# Initialize repository
|
||||||
|
db_repo = BaseRepository(config.DATABASE_URL)
|
||||||
|
|
||||||
|
def create_token(payload: dict, expires_delta: timedelta) -> str:
|
||||||
|
"""Create JWT token"""
|
||||||
|
to_encode = payload.copy()
|
||||||
|
expire = datetime.utcnow() + expires_delta
|
||||||
|
to_encode.update({"exp": expire, "iat": datetime.utcnow()})
|
||||||
|
|
||||||
|
return jwt.encode(
|
||||||
|
to_encode,
|
||||||
|
config.JWT_SECRET,
|
||||||
|
algorithm=config.JWT_ALGORITHM
|
||||||
|
)
|
||||||
|
|
||||||
|
def decode_token(token: str) -> dict:
|
||||||
|
"""Decode and validate JWT token"""
|
||||||
|
try:
|
||||||
|
payload = jwt.decode(
|
||||||
|
token,
|
||||||
|
config.JWT_SECRET,
|
||||||
|
algorithms=[config.JWT_ALGORITHM]
|
||||||
|
)
|
||||||
|
return payload
|
||||||
|
except jwt.ExpiredSignatureError:
|
||||||
|
raise ValueError("Token has expired")
|
||||||
|
except jwt.InvalidTokenError:
|
||||||
|
raise ValueError("Invalid token")
|
||||||
|
|
||||||
|
def require_api_key(f):
|
||||||
|
"""Decorator to require API key"""
|
||||||
|
@wraps(f)
|
||||||
|
def decorated_function(*args, **kwargs):
|
||||||
|
api_key = request.headers.get('X-API-Key')
|
||||||
|
|
||||||
|
if not api_key:
|
||||||
|
return jsonify({"error": "Missing API key"}), 401
|
||||||
|
|
||||||
|
# Validate API key
|
||||||
|
query = """
|
||||||
|
SELECT id, client_name, allowed_endpoints
|
||||||
|
FROM api_clients
|
||||||
|
WHERE api_key = %s AND is_active = true
|
||||||
|
"""
|
||||||
|
client = db_repo.execute_one(query, (api_key,))
|
||||||
|
|
||||||
|
if not client:
|
||||||
|
return jsonify({"error": "Invalid API key"}), 401
|
||||||
|
|
||||||
|
# Check if endpoint is allowed
|
||||||
|
endpoint = request.endpoint
|
||||||
|
allowed = client.get('allowed_endpoints', [])
|
||||||
|
if allowed and endpoint not in allowed:
|
||||||
|
return jsonify({"error": "Endpoint not allowed"}), 403
|
||||||
|
|
||||||
|
# Add client info to request
|
||||||
|
request.api_client = client
|
||||||
|
|
||||||
|
return f(*args, **kwargs)
|
||||||
|
|
||||||
|
return decorated_function
|
||||||
|
|
||||||
|
@app.route('/health', methods=['GET'])
|
||||||
|
def health_check():
|
||||||
|
"""Health check endpoint"""
|
||||||
|
return jsonify({
|
||||||
|
"status": "healthy",
|
||||||
|
"service": "auth",
|
||||||
|
"timestamp": datetime.utcnow().isoformat()
|
||||||
|
})
|
||||||
|
|
||||||
|
@app.route('/api/v1/auth/token', methods=['POST'])
|
||||||
|
@require_api_key
|
||||||
|
def create_access_token():
|
||||||
|
"""Create access token for license validation"""
|
||||||
|
data = request.get_json()
|
||||||
|
|
||||||
|
if not data or 'license_id' not in data:
|
||||||
|
return jsonify({"error": "Missing license_id"}), 400
|
||||||
|
|
||||||
|
license_id = data['license_id']
|
||||||
|
hardware_id = data.get('hardware_id')
|
||||||
|
|
||||||
|
# Verify license exists and is active
|
||||||
|
query = """
|
||||||
|
SELECT id, is_active, max_devices
|
||||||
|
FROM licenses
|
||||||
|
WHERE id = %s
|
||||||
|
"""
|
||||||
|
license = db_repo.execute_one(query, (license_id,))
|
||||||
|
|
||||||
|
if not license:
|
||||||
|
return jsonify({"error": "License not found"}), 404
|
||||||
|
|
||||||
|
if not license['is_active']:
|
||||||
|
return jsonify({"error": "License is not active"}), 403
|
||||||
|
|
||||||
|
# Create token payload
|
||||||
|
payload = {
|
||||||
|
"sub": license_id,
|
||||||
|
"hwid": hardware_id,
|
||||||
|
"client_id": request.api_client['id'],
|
||||||
|
"type": "access"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Add features and limits based on license
|
||||||
|
payload["features"] = data.get('features', [])
|
||||||
|
payload["limits"] = {
|
||||||
|
"api_calls": config.DEFAULT_RATE_LIMIT_PER_HOUR,
|
||||||
|
"concurrent_sessions": config.MAX_CONCURRENT_SESSIONS
|
||||||
|
}
|
||||||
|
|
||||||
|
# Create tokens
|
||||||
|
access_token = create_token(payload, config.JWT_ACCESS_TOKEN_EXPIRES)
|
||||||
|
|
||||||
|
# Create refresh token
|
||||||
|
refresh_payload = {
|
||||||
|
"sub": license_id,
|
||||||
|
"client_id": request.api_client['id'],
|
||||||
|
"type": "refresh"
|
||||||
|
}
|
||||||
|
refresh_token = create_token(refresh_payload, config.JWT_REFRESH_TOKEN_EXPIRES)
|
||||||
|
|
||||||
|
return jsonify({
|
||||||
|
"access_token": access_token,
|
||||||
|
"refresh_token": refresh_token,
|
||||||
|
"token_type": "Bearer",
|
||||||
|
"expires_in": int(config.JWT_ACCESS_TOKEN_EXPIRES.total_seconds())
|
||||||
|
})
|
||||||
|
|
||||||
|
@app.route('/api/v1/auth/refresh', methods=['POST'])
|
||||||
|
def refresh_access_token():
|
||||||
|
"""Refresh access token"""
|
||||||
|
data = request.get_json()
|
||||||
|
|
||||||
|
if not data or 'refresh_token' not in data:
|
||||||
|
return jsonify({"error": "Missing refresh_token"}), 400
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Decode refresh token
|
||||||
|
payload = decode_token(data['refresh_token'])
|
||||||
|
|
||||||
|
if payload.get('type') != 'refresh':
|
||||||
|
return jsonify({"error": "Invalid token type"}), 400
|
||||||
|
|
||||||
|
license_id = payload['sub']
|
||||||
|
|
||||||
|
# Verify license still active
|
||||||
|
query = "SELECT is_active FROM licenses WHERE id = %s"
|
||||||
|
license = db_repo.execute_one(query, (license_id,))
|
||||||
|
|
||||||
|
if not license or not license['is_active']:
|
||||||
|
return jsonify({"error": "License is not active"}), 403
|
||||||
|
|
||||||
|
# Create new access token
|
||||||
|
access_payload = {
|
||||||
|
"sub": license_id,
|
||||||
|
"client_id": payload['client_id'],
|
||||||
|
"type": "access"
|
||||||
|
}
|
||||||
|
|
||||||
|
access_token = create_token(access_payload, config.JWT_ACCESS_TOKEN_EXPIRES)
|
||||||
|
|
||||||
|
return jsonify({
|
||||||
|
"access_token": access_token,
|
||||||
|
"token_type": "Bearer",
|
||||||
|
"expires_in": int(config.JWT_ACCESS_TOKEN_EXPIRES.total_seconds())
|
||||||
|
})
|
||||||
|
|
||||||
|
except ValueError as e:
|
||||||
|
return jsonify({"error": str(e)}), 401
|
||||||
|
|
||||||
|
@app.route('/api/v1/auth/verify', methods=['POST'])
|
||||||
|
def verify_token():
|
||||||
|
"""Verify token validity"""
|
||||||
|
auth_header = request.headers.get('Authorization')
|
||||||
|
|
||||||
|
if not auth_header or not auth_header.startswith('Bearer '):
|
||||||
|
        return jsonify({"error": "Missing or invalid authorization header"}), 401

    token = auth_header.split(' ')[1]

    try:
        payload = decode_token(token)
        return jsonify({
            "valid": True,
            "license_id": payload['sub'],
            "expires_at": datetime.fromtimestamp(payload['exp']).isoformat()
        })
    except ValueError as e:
        return jsonify({
            "valid": False,
            "error": str(e)
        }), 401


@app.route('/api/v1/auth/api-key', methods=['POST'])
def create_api_key():
    """Create new API key (admin only)"""
    # This endpoint should be protected by admin authentication
    # For now, we'll use a simple secret header
    admin_secret = request.headers.get('X-Admin-Secret')

    if admin_secret != os.getenv('ADMIN_SECRET', 'change-this-admin-secret'):
        return jsonify({"error": "Unauthorized"}), 401

    data = request.get_json()

    if not data or 'client_name' not in data:
        return jsonify({"error": "Missing client_name"}), 400

    import secrets
    api_key = f"sk_{secrets.token_urlsafe(32)}"
    secret_key = secrets.token_urlsafe(64)

    query = """
        INSERT INTO api_clients (client_name, api_key, secret_key, allowed_endpoints)
        VALUES (%s, %s, %s, %s)
        RETURNING id
    """

    allowed_endpoints = data.get('allowed_endpoints', [])
    client_id = db_repo.execute_insert(
        query,
        (data['client_name'], api_key, secret_key, allowed_endpoints)
    )

    if not client_id:
        return jsonify({"error": "Failed to create API key"}), 500

    return jsonify({
        "client_id": client_id,
        "api_key": api_key,
        "secret_key": secret_key,
        "client_name": data['client_name']
    }), 201


@app.errorhandler(404)
def not_found(error):
    return jsonify({"error": "Not found"}), 404


@app.errorhandler(500)
def internal_error(error):
    logger.error(f"Internal error: {error}")
    return jsonify({"error": "Internal server error"}), 500


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5001, debug=True)
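For reference, the key material produced by `create_api_key` can be exercised in isolation. A minimal self-contained sketch — `generate_api_key` and `looks_like_api_key` are illustrative helpers, not part of the service:

```python
import secrets

def generate_api_key() -> tuple:
    """Generate an API key / secret pair in the same shape as create_api_key()."""
    api_key = f"sk_{secrets.token_urlsafe(32)}"   # 'sk_' prefix + ~43 url-safe chars
    secret_key = secrets.token_urlsafe(64)        # longer secret, kept server-side
    return api_key, secret_key

def looks_like_api_key(key: str) -> bool:
    """Mirror the prefix check the license API's require_api_key decorator applies."""
    return key.startswith("sk_")

api_key, secret_key = generate_api_key()
```

The prefix check is only a cheap format gate; as the service's own comment notes, production keys are meant to be validated against the `api_clients` table.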
lizenzserver/services/auth/requirements.txt (new file, 8 lines)
@@ -0,0 +1,8 @@
flask==3.0.0
flask-cors==4.0.0
pyjwt==2.8.0
psycopg2-binary==2.9.9
redis==5.0.1
python-dotenv==1.0.0
gunicorn==21.2.0
marshmallow==3.20.1
lizenzserver/services/license_api/Dockerfile (new file, 25 lines)
@@ -0,0 +1,25 @@
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Create non-root user
RUN useradd -m -u 1000 appuser && chown -R appuser:appuser /app
USER appuser

# Expose port
EXPOSE 5002

# Run with gunicorn
CMD ["gunicorn", "--bind", "0.0.0.0:5002", "--workers", "4", "--timeout", "120", "app:app"]
lizenzserver/services/license_api/app.py (new file, 409 lines)
@@ -0,0 +1,409 @@
import os
import sys
from flask import Flask, request, jsonify
from flask_cors import CORS
import jwt
from datetime import datetime, timedelta
import logging
from functools import wraps
from marshmallow import Schema, fields, ValidationError

# Add parent directory to path for imports
sys.path.append(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))

from config import get_config
from repositories.license_repo import LicenseRepository
from repositories.cache_repo import CacheRepository
from events.event_bus import EventBus, Event, EventTypes
from models import EventType, ValidationRequest, ValidationResponse

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Initialize Flask app
app = Flask(__name__)
config = get_config()
app.config.from_object(config)
CORS(app)

# Initialize dependencies
license_repo = LicenseRepository(config.DATABASE_URL)
cache_repo = CacheRepository(config.REDIS_URL)
event_bus = EventBus(config.RABBITMQ_URL)

# Validation schemas
class ValidateSchema(Schema):
    license_key = fields.Str(required=True)
    hardware_id = fields.Str(required=True)
    app_version = fields.Str()

class ActivateSchema(Schema):
    license_key = fields.Str(required=True)
    hardware_id = fields.Str(required=True)
    device_name = fields.Str()
    os_info = fields.Dict()

class HeartbeatSchema(Schema):
    session_data = fields.Dict()

class OfflineTokenSchema(Schema):
    duration_hours = fields.Int(missing=24, validate=lambda x: 0 < x <= 72)

def require_api_key(f):
    """Decorator to require API key"""
    @wraps(f)
    def decorated_function(*args, **kwargs):
        api_key = request.headers.get('X-API-Key')

        if not api_key:
            return jsonify({"error": "Missing API key"}), 401

        # For now, accept any API key starting with 'sk_'
        # In production, validate against database
        if not api_key.startswith('sk_'):
            return jsonify({"error": "Invalid API key"}), 401

        return f(*args, **kwargs)

    return decorated_function

def require_auth_token(f):
    """Decorator to require JWT token"""
    @wraps(f)
    def decorated_function(*args, **kwargs):
        auth_header = request.headers.get('Authorization')

        if not auth_header or not auth_header.startswith('Bearer '):
            return jsonify({"error": "Missing or invalid authorization header"}), 401

        token = auth_header.split(' ')[1]

        try:
            payload = jwt.decode(
                token,
                config.JWT_SECRET,
                algorithms=[config.JWT_ALGORITHM]
            )
            request.token_payload = payload
            return f(*args, **kwargs)
        except jwt.ExpiredSignatureError:
            return jsonify({"error": "Token has expired"}), 401
        except jwt.InvalidTokenError:
            return jsonify({"error": "Invalid token"}), 401

    return decorated_function

def get_client_ip():
    """Get client IP address"""
    if request.headers.get('X-Forwarded-For'):
        return request.headers.get('X-Forwarded-For').split(',')[0]
    return request.remote_addr

@app.route('/health', methods=['GET'])
def health_check():
    """Health check endpoint"""
    return jsonify({
        "status": "healthy",
        "service": "license-api",
        "timestamp": datetime.utcnow().isoformat()
    })

@app.route('/api/v1/license/validate', methods=['POST'])
@require_api_key
def validate_license():
    """Validate license key with hardware ID"""
    schema = ValidateSchema()

    try:
        data = schema.load(request.get_json())
    except ValidationError as e:
        return jsonify({"error": "Invalid request", "details": e.messages}), 400

    license_key = data['license_key']
    hardware_id = data['hardware_id']
    app_version = data.get('app_version')

    # Check cache first
    cached_result = cache_repo.get_license_validation(license_key, hardware_id)
    if cached_result:
        logger.info(f"Cache hit for license validation: {license_key[:8]}...")
        return jsonify(cached_result)

    # Get license from database
    license = license_repo.get_license_by_key(license_key)

    if not license:
        event_bus.publish(Event(
            EventTypes.LICENSE_VALIDATION_FAILED,
            {"license_key": license_key, "reason": "not_found"},
            "license-api"
        ))
        return jsonify({
            "valid": False,
            "error": "License not found",
            "error_code": "LICENSE_NOT_FOUND"
        }), 404

    # Check if license is active
    if not license['is_active']:
        event_bus.publish(Event(
            EventTypes.LICENSE_VALIDATION_FAILED,
            {"license_id": license['id'], "reason": "inactive"},
            "license-api"
        ))
        return jsonify({
            "valid": False,
            "error": "License is not active",
            "error_code": "LICENSE_INACTIVE"
        }), 403

    # Check expiration
    if license['expires_at'] and datetime.utcnow() > license['expires_at']:
        event_bus.publish(Event(
            EventTypes.LICENSE_EXPIRED,
            {"license_id": license['id']},
            "license-api"
        ))
        return jsonify({
            "valid": False,
            "error": "License has expired",
            "error_code": "LICENSE_EXPIRED"
        }), 403

    # Check device limit
    device_count = license_repo.get_device_count(license['id'])
    if device_count >= license['max_devices']:
        # Check if this device is already registered
        if not license_repo.check_hardware_id_exists(license['id'], hardware_id):
            return jsonify({
                "valid": False,
                "error": "Device limit exceeded",
                "error_code": "DEVICE_LIMIT_EXCEEDED",
                "current_devices": device_count,
                "max_devices": license['max_devices']
            }), 403

    # Record heartbeat
    license_repo.record_heartbeat(
        license_id=license['id'],
        hardware_id=hardware_id,
        ip_address=get_client_ip(),
        user_agent=request.headers.get('User-Agent'),
        app_version=app_version
    )

    # Create response
    response = {
        "valid": True,
        "license_id": license['id'],
        "expires_at": license['expires_at'].isoformat() if license['expires_at'] else None,
        "features": license.get('features', []),
        "limits": {
            "max_devices": license['max_devices'],
            "current_devices": device_count
        }
    }

    # Cache the result
    cache_repo.set_license_validation(
        license_key,
        hardware_id,
        response,
        config.CACHE_TTL_VALIDATION
    )

    # Publish success event
    event_bus.publish(Event(
        EventTypes.LICENSE_VALIDATED,
        {
            "license_id": license['id'],
            "hardware_id": hardware_id,
            "ip_address": get_client_ip()
        },
        "license-api"
    ))

    return jsonify(response)

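The validate endpoint follows a cache-aside pattern: check Redis, fall back to the database, then populate the cache for subsequent calls. A minimal stand-alone sketch of that flow, with a dict-backed stand-in for `cache_repo` (all names here are illustrative, not the service's actual classes):

```python
import time

class TTLCache:
    """Dict-backed stand-in for the Redis cache used by the validation endpoint."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:   # expired entry counts as a miss
            del self._store[key]
            return None
        return value

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

def validate_with_cache(cache, license_key, hardware_id, compute):
    """Cache-aside: return the cached result if fresh, else compute and store it."""
    key = f"validation:{license_key}:{hardware_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached
    result = compute(license_key, hardware_id)   # expensive DB-backed validation
    cache.set(key, result, ttl_seconds=300)      # assumed TTL; the service reads it from config
    return result
```

With a TTL on the cached entry, a revoked license is served from cache for at most the TTL window — which is why the activate endpoint below explicitly invalidates the cache.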
@app.route('/api/v1/license/activate', methods=['POST'])
@require_api_key
def activate_license():
    """Activate license on a new device"""
    schema = ActivateSchema()

    try:
        data = schema.load(request.get_json())
    except ValidationError as e:
        return jsonify({"error": "Invalid request", "details": e.messages}), 400

    license_key = data['license_key']
    hardware_id = data['hardware_id']
    device_name = data.get('device_name')
    os_info = data.get('os_info', {})

    # Get license
    license = license_repo.get_license_by_key(license_key)

    if not license:
        return jsonify({
            "error": "License not found",
            "error_code": "LICENSE_NOT_FOUND"
        }), 404

    if not license['is_active']:
        return jsonify({
            "error": "License is not active",
            "error_code": "LICENSE_INACTIVE"
        }), 403

    # Check if already activated on this device
    if license_repo.check_hardware_id_exists(license['id'], hardware_id):
        return jsonify({
            "error": "License already activated on this device",
            "error_code": "ALREADY_ACTIVATED"
        }), 400

    # Check device limit
    device_count = license_repo.get_device_count(license['id'])
    if device_count >= license['max_devices']:
        return jsonify({
            "error": "Device limit exceeded",
            "error_code": "DEVICE_LIMIT_EXCEEDED",
            "current_devices": device_count,
            "max_devices": license['max_devices']
        }), 403

    # Record activation
    license_repo.record_activation_event(
        license_id=license['id'],
        event_type=EventType.ACTIVATION,
        hardware_id=hardware_id,
        ip_address=get_client_ip(),
        user_agent=request.headers.get('User-Agent'),
        success=True,
        metadata={
            "device_name": device_name,
            "os_info": os_info
        }
    )

    # Invalidate cache
    cache_repo.invalidate_license_cache(license['id'])

    # Publish event
    event_bus.publish(Event(
        EventTypes.LICENSE_ACTIVATED,
        {
            "license_id": license['id'],
            "hardware_id": hardware_id,
            "device_name": device_name
        },
        "license-api"
    ))

    return jsonify({
        "success": True,
        "license_id": license['id'],
        "message": "License activated successfully"
    }), 201


@app.route('/api/v1/license/heartbeat', methods=['POST'])
@require_auth_token
def heartbeat():
    """Record license heartbeat"""
    schema = HeartbeatSchema()

    try:
        data = schema.load(request.get_json() or {})
    except ValidationError as e:
        return jsonify({"error": "Invalid request", "details": e.messages}), 400

    license_id = request.token_payload['sub']
    hardware_id = request.token_payload.get('hwid')

    # Record heartbeat
    license_repo.record_heartbeat(
        license_id=license_id,
        hardware_id=hardware_id,
        ip_address=get_client_ip(),
        user_agent=request.headers.get('User-Agent'),
        session_data=data.get('session_data', {})
    )

    return jsonify({
        "success": True,
        "timestamp": datetime.utcnow().isoformat()
    })


@app.route('/api/v1/license/offline-token', methods=['POST'])
@require_auth_token
def create_offline_token():
    """Create offline validation token"""
    schema = OfflineTokenSchema()

    try:
        data = schema.load(request.get_json() or {})
    except ValidationError as e:
        return jsonify({"error": "Invalid request", "details": e.messages}), 400

    license_id = request.token_payload['sub']
    hardware_id = request.token_payload.get('hwid')
    duration_hours = data['duration_hours']

    if not hardware_id:
        return jsonify({"error": "Hardware ID required"}), 400

    # Create offline token
    token = license_repo.create_license_token(
        license_id=license_id,
        hardware_id=hardware_id,
        valid_hours=duration_hours
    )

    if not token:
        return jsonify({"error": "Failed to create token"}), 500

    valid_until = datetime.utcnow() + timedelta(hours=duration_hours)

    return jsonify({
        "token": token,
        "valid_until": valid_until.isoformat(),
        "duration_hours": duration_hours
    })


@app.route('/api/v1/license/validate-offline', methods=['POST'])
def validate_offline_token():
    """Validate offline token"""
    data = request.get_json()

    if not data or 'token' not in data:
        return jsonify({"error": "Missing token"}), 400

    # Validate token
    result = license_repo.validate_token(data['token'])

    if not result:
        return jsonify({
            "valid": False,
            "error": "Invalid or expired token"
        }), 401

    return jsonify({
        "valid": True,
        "license_id": result['license_id'],
        "hardware_id": result['hardware_id'],
        "expires_at": result['valid_until'].isoformat()
    })


@app.errorhandler(404)
def not_found(error):
    return jsonify({"error": "Not found"}), 404


@app.errorhandler(500)
def internal_error(error):
    logger.error(f"Internal error: {error}")
    return jsonify({"error": "Internal server error"}), 500


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5002, debug=True)
lizenzserver/services/license_api/requirements.txt (new file, 10 lines)
@@ -0,0 +1,10 @@
flask==3.0.0
flask-cors==4.0.0
pyjwt==2.8.0
psycopg2-binary==2.9.9
redis==5.0.1
pika==1.3.2
python-dotenv==1.0.0
gunicorn==21.2.0
marshmallow==3.20.1
requests==2.31.0
@@ -351,3 +351,217 @@ BEGIN
        UPDATE sessions SET active = is_active;
    END IF;
END $$;

-- ===================== LICENSE SERVER TABLES =====================
-- Following best practices: snake_case for DB fields, clear naming conventions

-- Enable UUID extension
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- License tokens for offline validation
CREATE TABLE IF NOT EXISTS license_tokens (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id INTEGER REFERENCES licenses(id) ON DELETE CASCADE,
    token VARCHAR(512) NOT NULL UNIQUE,
    hardware_id VARCHAR(255) NOT NULL,
    valid_until TIMESTAMP NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    last_validated TIMESTAMP,
    validation_count INTEGER DEFAULT 0
);

CREATE INDEX IF NOT EXISTS idx_token ON license_tokens(token);
CREATE INDEX IF NOT EXISTS idx_hardware ON license_tokens(hardware_id);
CREATE INDEX IF NOT EXISTS idx_valid_until ON license_tokens(valid_until);

-- Heartbeat tracking with partitioning support
CREATE TABLE IF NOT EXISTS license_heartbeats (
    id BIGSERIAL,
    license_id INTEGER REFERENCES licenses(id) ON DELETE CASCADE,
    hardware_id VARCHAR(255) NOT NULL,
    ip_address INET,
    user_agent VARCHAR(500),
    app_version VARCHAR(50),
    timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    session_data JSONB,
    PRIMARY KEY (id, timestamp)
) PARTITION BY RANGE (timestamp);
-- Create partitions for the current and next month
CREATE TABLE IF NOT EXISTS license_heartbeats_2025_06 PARTITION OF license_heartbeats
    FOR VALUES FROM ('2025-06-01') TO ('2025-07-01');

CREATE TABLE IF NOT EXISTS license_heartbeats_2025_07 PARTITION OF license_heartbeats
    FOR VALUES FROM ('2025-07-01') TO ('2025-08-01');

CREATE INDEX IF NOT EXISTS idx_heartbeat_license_time ON license_heartbeats(license_id, timestamp DESC);
CREATE INDEX IF NOT EXISTS idx_heartbeat_hardware_time ON license_heartbeats(hardware_id, timestamp DESC);

-- Activation events tracking
CREATE TABLE IF NOT EXISTS activation_events (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id INTEGER REFERENCES licenses(id) ON DELETE CASCADE,
    event_type VARCHAR(50) NOT NULL CHECK (event_type IN ('activation', 'deactivation', 'reactivation', 'transfer')),
    hardware_id VARCHAR(255),
    previous_hardware_id VARCHAR(255),
    ip_address INET,
    user_agent VARCHAR(500),
    success BOOLEAN DEFAULT true,
    error_message TEXT,
    metadata JSONB,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_license_events ON activation_events(license_id, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_event_type ON activation_events(event_type, created_at DESC);

-- API rate limiting
CREATE TABLE IF NOT EXISTS api_rate_limits (
    id SERIAL PRIMARY KEY,
    api_key VARCHAR(255) NOT NULL UNIQUE,
    requests_per_minute INTEGER DEFAULT 60,
    requests_per_hour INTEGER DEFAULT 1000,
    requests_per_day INTEGER DEFAULT 10000,
    burst_size INTEGER DEFAULT 100,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Anomaly detection
CREATE TABLE IF NOT EXISTS anomaly_detections (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id INTEGER REFERENCES licenses(id),
    anomaly_type VARCHAR(100) NOT NULL CHECK (anomaly_type IN ('multiple_ips', 'rapid_hardware_change', 'suspicious_pattern', 'concurrent_use', 'geo_anomaly')),
    severity VARCHAR(20) NOT NULL CHECK (severity IN ('low', 'medium', 'high', 'critical')),
    details JSONB NOT NULL,
    detected_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    resolved BOOLEAN DEFAULT false,
    resolved_at TIMESTAMP,
    resolved_by VARCHAR(255),
    action_taken TEXT
);

CREATE INDEX IF NOT EXISTS idx_unresolved ON anomaly_detections(resolved, severity, detected_at DESC);
CREATE INDEX IF NOT EXISTS idx_license_anomalies ON anomaly_detections(license_id, detected_at DESC);

-- API clients for authentication
CREATE TABLE IF NOT EXISTS api_clients (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    client_name VARCHAR(255) NOT NULL,
    api_key VARCHAR(255) NOT NULL UNIQUE,
    secret_key VARCHAR(255) NOT NULL,
    is_active BOOLEAN DEFAULT true,
    allowed_endpoints TEXT[],
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Feature flags for gradual rollout
CREATE TABLE IF NOT EXISTS feature_flags (
    id SERIAL PRIMARY KEY,
    feature_name VARCHAR(100) NOT NULL UNIQUE,
    is_enabled BOOLEAN DEFAULT false,
    rollout_percentage INTEGER DEFAULT 0 CHECK (rollout_percentage >= 0 AND rollout_percentage <= 100),
    whitelist_license_ids INTEGER[],
    blacklist_license_ids INTEGER[],
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Insert default feature flags
INSERT INTO feature_flags (feature_name, is_enabled, rollout_percentage) VALUES
    ('anomaly_detection', true, 100),
    ('offline_tokens', true, 100),
    ('advanced_analytics', false, 0),
    ('geo_restriction', false, 0)
ON CONFLICT (feature_name) DO NOTHING;

-- Session management for concurrent use tracking
CREATE TABLE IF NOT EXISTS active_sessions (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    license_id INTEGER REFERENCES licenses(id) ON DELETE CASCADE,
    hardware_id VARCHAR(255) NOT NULL,
    session_token VARCHAR(512) NOT NULL UNIQUE,
    ip_address INET,
    started_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    last_seen TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    expires_at TIMESTAMP NOT NULL
);

CREATE INDEX IF NOT EXISTS idx_session_license ON active_sessions(license_id);
CREATE INDEX IF NOT EXISTS idx_session_expires ON active_sessions(expires_at);

-- Update trigger for updated_at columns
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = CURRENT_TIMESTAMP;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER update_api_rate_limits_updated_at BEFORE UPDATE ON api_rate_limits
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

CREATE TRIGGER update_api_clients_updated_at BEFORE UPDATE ON api_clients
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

CREATE TRIGGER update_feature_flags_updated_at BEFORE UPDATE ON feature_flags
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

-- Function to automatically create monthly partitions for heartbeats
CREATE OR REPLACE FUNCTION create_monthly_partition()
RETURNS void AS $$
DECLARE
    start_date date;
    end_date date;
    partition_name text;
BEGIN
    start_date := date_trunc('month', CURRENT_DATE + interval '1 month');
    end_date := start_date + interval '1 month';
    partition_name := 'license_heartbeats_' || to_char(start_date, 'YYYY_MM');

    EXECUTE format('CREATE TABLE IF NOT EXISTS %I PARTITION OF license_heartbeats FOR VALUES FROM (%L) TO (%L)',
                   partition_name, start_date, end_date);
END;
$$ LANGUAGE plpgsql;

-- Migration: Add max_devices column to licenses if it doesn't exist
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM information_schema.columns
                   WHERE table_name = 'licenses' AND column_name = 'max_devices') THEN
        ALTER TABLE licenses ADD COLUMN max_devices INTEGER DEFAULT 3 CHECK (max_devices >= 1);
    END IF;
END $$;

-- Migration: Add expires_at column to licenses if it doesn't exist
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM information_schema.columns
                   WHERE table_name = 'licenses' AND column_name = 'expires_at') THEN
        ALTER TABLE licenses ADD COLUMN expires_at TIMESTAMP;
        -- Set expires_at based on valid_until for existing licenses
        UPDATE licenses SET expires_at = valid_until::timestamp WHERE expires_at IS NULL;
    END IF;
END $$;

-- Migration: Add features column to licenses if it doesn't exist
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM information_schema.columns
                   WHERE table_name = 'licenses' AND column_name = 'features') THEN
        ALTER TABLE licenses ADD COLUMN features TEXT[] DEFAULT '{}';
    END IF;
END $$;

-- Migration: Add updated_at column to licenses if it doesn't exist
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM information_schema.columns
                   WHERE table_name = 'licenses' AND column_name = 'updated_at') THEN
        ALTER TABLE licenses ADD COLUMN updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP;
        CREATE TRIGGER update_licenses_updated_at BEFORE UPDATE ON licenses
            FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();
    END IF;
END $$;
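`create_monthly_partition()` derives the partition name and bounds from the first day of the month after the current date. The same computation in Python, as a hedged cross-check (the function name is ours, not part of the schema):

```python
from datetime import date

def next_month_partition(today: date):
    """Compute the partition name and bounds that create_monthly_partition() would use."""
    # first day of the month after `today`
    if today.month == 12:
        start = date(today.year + 1, 1, 1)
    else:
        start = date(today.year, today.month + 1, 1)
    # first day of the month after `start` (exclusive upper bound of the range)
    if start.month == 12:
        end = date(start.year + 1, 1, 1)
    else:
        end = date(start.year, start.month + 1, 1)
    name = f"license_heartbeats_{start:%Y_%m}"   # matches to_char(start_date, 'YYYY_MM')
    return name, start, end
```

Something (pg_cron, a scheduler, or a cron job calling the SQL function) still has to invoke the partition creation monthly; otherwise inserts fail once the last pre-created range ends.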
@@ -554,3 +554,421 @@ def clear_attempts():
        conn.close()

    return redirect(url_for('admin.blocked_ips'))


# ===================== LICENSE SERVER MONITORING ROUTES =====================

@admin_bp.route("/lizenzserver/monitor")
@login_required
def license_monitor():
    """License server live monitoring dashboard"""
    try:
        conn = get_connection()
        cur = conn.cursor()

        # Get current statistics:
        # active validations in the last 5 minutes
        cur.execute("""
            SELECT COUNT(DISTINCT license_id) as active_licenses,
                   COUNT(*) as total_validations,
                   COUNT(DISTINCT hardware_id) as unique_devices,
                   COUNT(DISTINCT ip_address) as unique_ips
            FROM license_heartbeats
            WHERE timestamp > NOW() - INTERVAL '5 minutes'
        """)
        live_stats = cur.fetchone()

        # Get validation rate (per minute)
        cur.execute("""
            SELECT DATE_TRUNC('minute', timestamp) as minute,
                   COUNT(*) as validations
            FROM license_heartbeats
            WHERE timestamp > NOW() - INTERVAL '10 minutes'
            GROUP BY minute
            ORDER BY minute DESC
            LIMIT 10
        """)
        validation_rates = cur.fetchall()

        # Get top active licenses
        cur.execute("""
            SELECT l.id, l.license_key, c.name as customer_name,
                   COUNT(DISTINCT lh.hardware_id) as device_count,
                   COUNT(*) as validation_count,
                   MAX(lh.timestamp) as last_seen
            FROM licenses l
            JOIN customers c ON l.customer_id = c.id
            JOIN license_heartbeats lh ON l.id = lh.license_id
            WHERE lh.timestamp > NOW() - INTERVAL '15 minutes'
            GROUP BY l.id, l.license_key, c.name
            ORDER BY validation_count DESC
            LIMIT 10
        """)
        top_licenses = cur.fetchall()

        # Get recent anomalies
        cur.execute("""
            SELECT ad.*, l.license_key, c.name as customer_name
            FROM anomaly_detections ad
            LEFT JOIN licenses l ON ad.license_id = l.id
            LEFT JOIN customers c ON l.customer_id = c.id
            WHERE ad.resolved = false
            ORDER BY ad.detected_at DESC
            LIMIT 10
        """)
        recent_anomalies = cur.fetchall()

        # Get geographic distribution
        cur.execute("""
            SELECT ip_address, COUNT(*) as count
            FROM license_heartbeats
            WHERE timestamp > NOW() - INTERVAL '1 hour'
              AND ip_address IS NOT NULL
            GROUP BY ip_address
            ORDER BY count DESC
            LIMIT 20
        """)
        geo_distribution = cur.fetchall()

        return render_template('license_monitor.html',
                               live_stats=live_stats,
                               validation_rates=validation_rates,
                               top_licenses=top_licenses,
                               recent_anomalies=recent_anomalies,
                               geo_distribution=geo_distribution
                               )

    except Exception as e:
        flash(f'Fehler beim Laden der Monitoring-Daten: {str(e)}', 'error')
        return render_template('license_monitor.html')
    finally:
        if 'cur' in locals():
            cur.close()
        if 'conn' in locals():
            conn.close()


@admin_bp.route("/lizenzserver/analytics")
@login_required
def license_analytics():
    """License usage analytics"""
    try:
        conn = get_connection()
        cur = conn.cursor()

        # Time range from query params
        days = int(request.args.get('days', 30))

        # Usage trends over time
        cur.execute("""
            SELECT DATE(timestamp) as date,
                   COUNT(DISTINCT license_id) as unique_licenses,
                   COUNT(DISTINCT hardware_id) as unique_devices,
                   COUNT(*) as total_validations
            FROM license_heartbeats
            WHERE timestamp > NOW() - INTERVAL '%s days'
            GROUP BY date
            ORDER BY date
        """, (days,))
        usage_trends = cur.fetchall()

        # License performance metrics
        cur.execute("""
            SELECT l.id, l.license_key, c.name as customer_name,
                   COUNT(DISTINCT lh.hardware_id) as device_count,
                   l.max_devices,
                   COUNT(*) as total_validations,
                   COUNT(DISTINCT DATE(lh.timestamp)) as active_days,
                   MIN(lh.timestamp) as first_seen,
                   MAX(lh.timestamp) as last_seen
            FROM licenses l
            JOIN customers c ON l.customer_id = c.id
            LEFT JOIN license_heartbeats lh ON l.id = lh.license_id
            WHERE lh.timestamp > NOW() - INTERVAL '%s days'
            GROUP BY l.id, l.license_key, c.name, l.max_devices
            ORDER BY total_validations DESC
        """, (days,))
        license_metrics = cur.fetchall()

        # Device distribution
        cur.execute("""
            SELECT l.max_devices as limit,
                   COUNT(*) as license_count,
                   AVG(device_count) as avg_usage
            FROM licenses l
            LEFT JOIN (
                SELECT license_id, COUNT(DISTINCT hardware_id) as device_count
                FROM license_heartbeats
                WHERE timestamp > NOW() - INTERVAL '30 days'
                GROUP BY license_id
            ) usage ON l.id = usage.license_id
            WHERE l.is_active = true
            GROUP BY l.max_devices
            ORDER BY l.max_devices
        """)
        device_distribution = cur.fetchall()

        # Revenue analysis
        cur.execute("""
            SELECT l.license_type,
                   COUNT(DISTINCT l.id) as license_count,
                   COUNT(DISTINCT CASE WHEN lh.license_id IS NOT NULL THEN l.id END) as active_licenses,
                   COUNT(DISTINCT lh.hardware_id) as total_devices
            FROM licenses l
            LEFT JOIN license_heartbeats lh ON l.id = lh.license_id
                AND lh.timestamp > NOW() - INTERVAL '%s days'
            GROUP BY l.license_type
        """, (days,))
        revenue_analysis = cur.fetchall()

        return render_template('license_analytics.html',
                               days=days,
                               usage_trends=usage_trends,
|
||||||
|
license_metrics=license_metrics,
|
||||||
|
device_distribution=device_distribution,
|
||||||
|
revenue_analysis=revenue_analysis
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
flash(f'Fehler beim Laden der Analytics-Daten: {str(e)}', 'error')
|
||||||
|
return render_template('license_analytics.html', days=30)
|
||||||
|
finally:
|
||||||
|
if 'cur' in locals():
|
||||||
|
cur.close()
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.close()
|
||||||
|
|
||||||
|
|
||||||
|
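A note on the `INTERVAL '%s days'` pattern used in the analytics queries: psycopg2 substitutes placeholders on the client side across the whole query string, so an integer parameter happens to produce valid SQL even inside the quoted literal, but any parameter whose adapted literal carries its own quotes breaks the statement. Keeping the placeholder outside the literal (`INTERVAL '1 day' * %s`) avoids the trap. A minimal simulation of that substitution step (plain string formatting, no database required; `render` is an illustrative stand-in, not the psycopg2 API):

```python
def render(query: str, params: tuple) -> str:
    """Simulate client-side parameter substitution: each value is
    adapted to a SQL literal, then %s-formatted into the query string."""
    def adapt(v):
        if isinstance(v, str):
            return "'" + v.replace("'", "''") + "'"
        return str(v)
    return query % tuple(adapt(p) for p in params)

# An integer inside a quoted literal happens to yield valid SQL ...
ok = render("WHERE ts > NOW() - INTERVAL '%s days'", (30,))
# ... but a string parameter gains extra quotes and breaks the literal.
bad = render("WHERE ts > NOW() - INTERVAL '%s'", ("30 days",))
# Keeping the placeholder outside the literal is safe either way:
safe = render("WHERE ts > NOW() - INTERVAL '1 day' * %s", (30,))

print(ok)    # WHERE ts > NOW() - INTERVAL '30 days'
print(bad)   # WHERE ts > NOW() - INTERVAL ''30 days''
print(safe)  # WHERE ts > NOW() - INTERVAL '1 day' * 30
```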
@admin_bp.route("/lizenzserver/anomalies")
|
||||||
|
@login_required
|
||||||
|
def license_anomalies():
|
||||||
|
"""Anomaly detection and management"""
|
||||||
|
try:
|
||||||
|
conn = get_connection()
|
||||||
|
cur = conn.cursor()
|
||||||
|
|
||||||
|
# Filter parameters
|
||||||
|
severity = request.args.get('severity', 'all')
|
||||||
|
resolved = request.args.get('resolved', 'false')
|
||||||
|
|
||||||
|
# Build query
|
||||||
|
query = """
|
||||||
|
SELECT ad.*, l.license_key, c.name as customer_name, c.email
|
||||||
|
FROM anomaly_detections ad
|
||||||
|
LEFT JOIN licenses l ON ad.license_id = l.id
|
||||||
|
LEFT JOIN customers c ON l.customer_id = c.id
|
||||||
|
WHERE 1=1
|
||||||
|
"""
|
||||||
|
params = []
|
||||||
|
|
||||||
|
if severity != 'all':
|
||||||
|
query += " AND ad.severity = %s"
|
||||||
|
params.append(severity)
|
||||||
|
|
||||||
|
if resolved == 'false':
|
||||||
|
query += " AND ad.resolved = false"
|
||||||
|
elif resolved == 'true':
|
||||||
|
query += " AND ad.resolved = true"
|
||||||
|
|
||||||
|
query += " ORDER BY ad.detected_at DESC LIMIT 100"
|
||||||
|
|
||||||
|
cur.execute(query, params)
|
||||||
|
anomalies = cur.fetchall()
|
||||||
|
|
||||||
|
# Get anomaly statistics
|
||||||
|
cur.execute("""
|
||||||
|
SELECT anomaly_type, severity, COUNT(*) as count
|
||||||
|
FROM anomaly_detections
|
||||||
|
WHERE resolved = false
|
||||||
|
GROUP BY anomaly_type, severity
|
||||||
|
ORDER BY count DESC
|
||||||
|
""")
|
||||||
|
anomaly_stats = cur.fetchall()
|
||||||
|
|
||||||
|
return render_template('license_anomalies.html',
|
||||||
|
anomalies=anomalies,
|
||||||
|
anomaly_stats=anomaly_stats,
|
||||||
|
severity=severity,
|
||||||
|
resolved=resolved
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
flash(f'Fehler beim Laden der Anomalie-Daten: {str(e)}', 'error')
|
||||||
|
return render_template('license_anomalies.html')
|
||||||
|
finally:
|
||||||
|
if 'cur' in locals():
|
||||||
|
cur.close()
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.close()
|
||||||
|
|
||||||
|
|
||||||
|
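The anomaly route assembles its WHERE clause by appending fixed SQL fragments and collecting placeholder values in a parallel `params` list. That construction can be isolated in a pure helper, which also makes it easy to test without a database (a sketch; `build_anomaly_query` is illustrative and not part of the admin panel):

```python
def build_anomaly_query(severity: str = 'all', resolved: str = 'false'):
    """Return (sql, params) for the anomaly listing, mirroring the
    concatenation logic of the license_anomalies() route."""
    sql = ("SELECT ad.*, l.license_key, c.name, c.email "
           "FROM anomaly_detections ad "
           "LEFT JOIN licenses l ON ad.license_id = l.id "
           "LEFT JOIN customers c ON l.customer_id = c.id "
           "WHERE 1=1")
    params = []
    if severity != 'all':
        # user-supplied value goes through a placeholder, never into the SQL
        sql += " AND ad.severity = %s"
        params.append(severity)
    if resolved == 'false':
        sql += " AND ad.resolved = false"
    elif resolved == 'true':
        sql += " AND ad.resolved = true"
    sql += " ORDER BY ad.detected_at DESC LIMIT 100"
    return sql, params

sql, params = build_anomaly_query(severity='critical')
print(params)  # ['critical']
```

Only the two fixed `true`/`false` fragments are concatenated directly; everything user-controlled stays parameterized.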
@admin_bp.route("/lizenzserver/anomaly/<anomaly_id>/resolve", methods=["POST"])
|
||||||
|
@login_required
|
||||||
|
def resolve_anomaly(anomaly_id):
|
||||||
|
"""Resolve an anomaly"""
|
||||||
|
try:
|
||||||
|
conn = get_connection()
|
||||||
|
cur = conn.cursor()
|
||||||
|
|
||||||
|
action_taken = request.form.get('action_taken', '')
|
||||||
|
|
||||||
|
cur.execute("""
|
||||||
|
UPDATE anomaly_detections
|
||||||
|
SET resolved = true,
|
||||||
|
resolved_at = NOW(),
|
||||||
|
resolved_by = %s,
|
||||||
|
action_taken = %s
|
||||||
|
WHERE id = %s
|
||||||
|
""", (session.get('username'), action_taken, str(anomaly_id)))
|
||||||
|
|
||||||
|
conn.commit()
|
||||||
|
|
||||||
|
flash('Anomalie wurde als behoben markiert', 'success')
|
||||||
|
log_audit('RESOLVE_ANOMALY', 'license_server', entity_id=str(anomaly_id),
|
||||||
|
additional_info=f"Action: {action_taken}")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.rollback()
|
||||||
|
flash(f'Fehler beim Beheben der Anomalie: {str(e)}', 'error')
|
||||||
|
finally:
|
||||||
|
if 'cur' in locals():
|
||||||
|
cur.close()
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.close()
|
||||||
|
|
||||||
|
return redirect(url_for('admin.license_anomalies'))
|
||||||
|
|
||||||
|
|
||||||
|
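Every route above repeats the same teardown: `try`, then `finally` with `if 'cur' in locals(): cur.close()` and the matching connection close, plus a rollback in the error branch. A context manager can centralize that boilerplate. The sketch below is a possible refactoring, not the panel's current code; the demo uses a fake connection object so it runs standalone, with `get_connection` assumed to be the panel's factory:

```python
from contextlib import contextmanager

@contextmanager
def db_cursor(get_connection):
    """Yield a cursor; commit on success, roll back on error,
    and always close both cursor and connection."""
    conn = get_connection()
    cur = conn.cursor()
    try:
        yield cur
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        cur.close()
        conn.close()

# Standalone demo with a minimal fake connection:
class FakeConn:
    def __init__(self): self.events = []
    def cursor(self): return self
    def commit(self): self.events.append('commit')
    def rollback(self): self.events.append('rollback')
    def close(self): self.events.append('close')

conn = FakeConn()
with db_cursor(lambda: conn) as cur:
    pass
print(conn.events)  # ['commit', 'close', 'close']
```

A route body then shrinks to `with db_cursor(get_connection) as cur: ...`, and the `if ... in locals()` guards disappear.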
@admin_bp.route("/lizenzserver/config")
|
||||||
|
@login_required
|
||||||
|
def license_config():
|
||||||
|
"""License server configuration"""
|
||||||
|
try:
|
||||||
|
conn = get_connection()
|
||||||
|
cur = conn.cursor()
|
||||||
|
|
||||||
|
# Get feature flags
|
||||||
|
cur.execute("""
|
||||||
|
SELECT * FROM feature_flags
|
||||||
|
ORDER BY feature_name
|
||||||
|
""")
|
||||||
|
feature_flags = cur.fetchall()
|
||||||
|
|
||||||
|
# Get API clients
|
||||||
|
cur.execute("""
|
||||||
|
SELECT id, client_name, api_key, is_active, created_at
|
||||||
|
FROM api_clients
|
||||||
|
ORDER BY created_at DESC
|
||||||
|
""")
|
||||||
|
api_clients = cur.fetchall()
|
||||||
|
|
||||||
|
# Get rate limits
|
||||||
|
cur.execute("""
|
||||||
|
SELECT * FROM api_rate_limits
|
||||||
|
ORDER BY api_key
|
||||||
|
""")
|
||||||
|
rate_limits = cur.fetchall()
|
||||||
|
|
||||||
|
return render_template('license_config.html',
|
||||||
|
feature_flags=feature_flags,
|
||||||
|
api_clients=api_clients,
|
||||||
|
rate_limits=rate_limits
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
flash(f'Fehler beim Laden der Konfiguration: {str(e)}', 'error')
|
||||||
|
return render_template('license_config.html')
|
||||||
|
finally:
|
||||||
|
if 'cur' in locals():
|
||||||
|
cur.close()
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.close()
|
||||||
|
|
||||||
|
|
||||||
|
@admin_bp.route("/lizenzserver/config/feature-flag/<int:flag_id>", methods=["POST"])
|
||||||
|
@login_required
|
||||||
|
def update_feature_flag(flag_id):
|
||||||
|
"""Update feature flag settings"""
|
||||||
|
try:
|
||||||
|
conn = get_connection()
|
||||||
|
cur = conn.cursor()
|
||||||
|
|
||||||
|
is_enabled = request.form.get('is_enabled') == 'on'
|
||||||
|
rollout_percentage = int(request.form.get('rollout_percentage', 0))
|
||||||
|
|
||||||
|
cur.execute("""
|
||||||
|
UPDATE feature_flags
|
||||||
|
SET is_enabled = %s,
|
||||||
|
rollout_percentage = %s,
|
||||||
|
updated_at = NOW()
|
||||||
|
WHERE id = %s
|
||||||
|
""", (is_enabled, rollout_percentage, flag_id))
|
||||||
|
|
||||||
|
conn.commit()
|
||||||
|
|
||||||
|
flash('Feature Flag wurde aktualisiert', 'success')
|
||||||
|
log_audit('UPDATE_FEATURE_FLAG', 'license_server', entity_id=flag_id)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.rollback()
|
||||||
|
flash(f'Fehler beim Aktualisieren: {str(e)}', 'error')
|
||||||
|
finally:
|
||||||
|
if 'cur' in locals():
|
||||||
|
cur.close()
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.close()
|
||||||
|
|
||||||
|
return redirect(url_for('admin.license_config'))
|
||||||
|
|
||||||
|
|
||||||
|
@admin_bp.route("/api/admin/lizenzserver/live-stats")
|
||||||
|
@login_required
|
||||||
|
def license_live_stats():
|
||||||
|
"""API endpoint for live statistics (for AJAX updates)"""
|
||||||
|
try:
|
||||||
|
conn = get_connection()
|
||||||
|
cur = conn.cursor()
|
||||||
|
|
||||||
|
# Get real-time stats
|
||||||
|
cur.execute("""
|
||||||
|
SELECT COUNT(DISTINCT license_id) as active_licenses,
|
||||||
|
COUNT(*) as validations_per_minute,
|
||||||
|
COUNT(DISTINCT hardware_id) as active_devices
|
||||||
|
FROM license_heartbeats
|
||||||
|
WHERE timestamp > NOW() - INTERVAL '1 minute'
|
||||||
|
""")
|
||||||
|
stats = cur.fetchone()
|
||||||
|
|
||||||
|
# Get latest validations
|
||||||
|
cur.execute("""
|
||||||
|
SELECT l.license_key, lh.hardware_id, lh.ip_address, lh.timestamp
|
||||||
|
FROM license_heartbeats lh
|
||||||
|
JOIN licenses l ON lh.license_id = l.id
|
||||||
|
ORDER BY lh.timestamp DESC
|
||||||
|
LIMIT 5
|
||||||
|
""")
|
||||||
|
latest_validations = cur.fetchall()
|
||||||
|
|
||||||
|
return jsonify({
|
||||||
|
'active_licenses': stats[0] or 0,
|
||||||
|
'validations_per_minute': stats[1] or 0,
|
||||||
|
'active_devices': stats[2] or 0,
|
||||||
|
'latest_validations': [
|
||||||
|
{
|
||||||
|
'license_key': v[0][:8] + '...',
|
||||||
|
'hardware_id': v[1][:8] + '...',
|
||||||
|
'ip_address': v[2] or 'Unknown',
|
||||||
|
'timestamp': v[3].strftime('%H:%M:%S')
|
||||||
|
} for v in latest_validations
|
||||||
|
]
|
||||||
|
})
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
return jsonify({'error': str(e)}), 500
|
||||||
|
finally:
|
||||||
|
if 'cur' in locals():
|
||||||
|
cur.close()
|
||||||
|
if 'conn' in locals():
|
||||||
|
conn.close()
|
||||||
|
|||||||
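The live-stats endpoint truncates license keys and hardware IDs with `v[0][:8] + '...'` before they reach the browser. A small helper makes that masking rule explicit and guards against short or missing values (illustrative only; the endpoint above inlines the slice directly):

```python
def mask(value, keep: int = 8) -> str:
    """Expose only the first `keep` characters of an identifier."""
    if value is None:
        return 'Unknown'
    # append the ellipsis only when something was actually cut off
    return value[:keep] + '...' if len(value) > keep else value

print(mask('LS-2025-AB12-CD34'))  # LS-2025-...
print(mask('short'))              # short
```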
```diff
@@ -338,6 +338,11 @@ def api_customer_licenses(customer_id):
             WHEN l.is_active = false THEN 'inaktiv'
             ELSE 'aktiv'
         END as status,
+        -- License Server Status
+        (SELECT COUNT(*) FROM license_heartbeats lh WHERE lh.license_id = l.id AND lh.timestamp > NOW() - INTERVAL '5 minutes') as recent_heartbeats,
+        (SELECT MAX(timestamp) FROM license_heartbeats lh WHERE lh.license_id = l.id) as last_heartbeat,
+        (SELECT COUNT(DISTINCT hardware_id) FROM license_heartbeats lh WHERE lh.license_id = l.id AND lh.timestamp > NOW() - INTERVAL '15 minutes') as active_server_devices,
+        (SELECT COUNT(*) FROM anomaly_detections ad WHERE ad.license_id = l.id AND ad.resolved = false) as unresolved_anomalies,
         l.domain_count,
         l.ipv4_count,
         l.phone_count,
@@ -408,14 +413,19 @@ def api_customer_licenses(customer_id):
             'active_sessions': row[9],
             'registered_devices': row[10],
             'status': row[11],
-            'domain_count': row[12],
-            'ipv4_count': row[13],
-            'phone_count': row[14],
-            'active_devices': row[15],
-            'actual_domain_count': row[16],
-            'actual_ipv4_count': row[17],
-            'actual_phone_count': row[18],
-            'resources': resources
+            'domain_count': row[16],
+            'ipv4_count': row[17],
+            'phone_count': row[18],
+            'active_devices': row[19],
+            'actual_domain_count': row[20],
+            'actual_ipv4_count': row[21],
+            'actual_phone_count': row[22],
+            'resources': resources,
+            # License Server Data
+            'recent_heartbeats': row[12],
+            'last_heartbeat': row[13].strftime('%Y-%m-%d %H:%M:%S') if row[13] else None,
+            'active_server_devices': row[14],
+            'unresolved_anomalies': row[15]
         })

         return jsonify({
```
```diff
@@ -433,6 +433,38 @@
                 <span>Sicherheit</span>
             </a>
         </li>
+        <li class="nav-item {% if request.endpoint in ['admin.license_monitor', 'admin.license_analytics', 'admin.license_anomalies', 'admin.license_config'] %}has-active-child{% endif %}">
+            <a class="nav-link has-submenu" href="#">
+                <i class="bi bi-graph-up"></i>
+                <span>Lizenzserver</span>
+            </a>
+            <ul class="sidebar-submenu">
+                <li class="nav-item">
+                    <a class="nav-link {% if request.endpoint == 'admin.license_monitor' %}active{% endif %}" href="{{ url_for('admin.license_monitor') }}">
+                        <i class="bi bi-speedometer2"></i>
+                        <span>Live Monitor</span>
+                    </a>
+                </li>
+                <li class="nav-item">
+                    <a class="nav-link {% if request.endpoint == 'admin.license_analytics' %}active{% endif %}" href="{{ url_for('admin.license_analytics') }}">
+                        <i class="bi bi-bar-chart"></i>
+                        <span>Analytics</span>
+                    </a>
+                </li>
+                <li class="nav-item">
+                    <a class="nav-link {% if request.endpoint == 'admin.license_anomalies' %}active{% endif %}" href="{{ url_for('admin.license_anomalies') }}">
+                        <i class="bi bi-exclamation-triangle"></i>
+                        <span>Anomalien</span>
+                    </a>
+                </li>
+                <li class="nav-item">
+                    <a class="nav-link {% if request.endpoint == 'admin.license_config' %}active{% endif %}" href="{{ url_for('admin.license_config') }}">
+                        <i class="bi bi-gear"></i>
+                        <span>Konfiguration</span>
+                    </a>
+                </li>
+            </ul>
+        </li>
     </ul>
 </aside>
```
```diff
@@ -365,6 +365,7 @@ function updateLicenseView(customerId, licenses) {
         <th>Gültig von</th>
         <th>Gültig bis</th>
         <th>Status</th>
+        <th>Server Status</th>
         <th>Ressourcen</th>
         <th>Aktionen</th>
     </tr>
@@ -378,6 +379,26 @@ function updateLicenseView(customerId, licenses) {

     const typeClass = license.license_type === 'full' ? 'bg-primary' : 'bg-secondary';

+    // License Server Status
+    let serverStatusHtml = '';
+    if (license.recent_heartbeats > 0) {
+        serverStatusHtml = `<span class="badge bg-success" title="Aktiv - ${license.active_server_devices} Geräte">💚 Online</span>`;
+        if (license.unresolved_anomalies > 0) {
+            serverStatusHtml += `<br><span class="badge bg-danger" title="${license.unresolved_anomalies} ungelöste Anomalien">⚠️ ${license.unresolved_anomalies}</span>`;
+        }
+    } else if (license.last_heartbeat) {
+        const lastSeen = new Date(license.last_heartbeat);
+        const minutesAgo = Math.floor((new Date() - lastSeen) / 60000);
+        if (minutesAgo < 60) {
+            serverStatusHtml = `<span class="badge bg-warning" title="Zuletzt vor ${minutesAgo} Min">⏱️ ${minutesAgo} Min</span>`;
+        } else {
+            const hoursAgo = Math.floor(minutesAgo / 60);
+            serverStatusHtml = `<span class="badge bg-secondary" title="Zuletzt vor ${hoursAgo}h">💤 Offline</span>`;
+        }
+    } else {
+        serverStatusHtml = `<span class="badge bg-secondary">-</span>`;
+    }
+
     // Erstelle Ressourcen-HTML mit Details
     let resourcesHtml = '';
     const actualDomainCount = license.actual_domain_count || 0;
@@ -425,6 +446,7 @@ function updateLicenseView(customerId, licenses) {
     <td>${license.valid_from || '-'}</td>
     <td>${license.valid_until || '-'}</td>
     <td><span class="badge ${statusClass}">${license.status}</span></td>
+    <td>${serverStatusHtml}</td>
     <td class="resources-cell">
         ${resourcesHtml || '<span class="text-muted">-</span>'}
     </td>
```
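The JavaScript above maps heartbeat recency to one of four status badges: online, seen N minutes ago, offline, or never seen. The decision table can be stated compactly; this Python sketch mirrors the branch structure for reference (`server_status` and its return labels are illustrative, not code from the panel):

```python
from typing import Optional

def server_status(recent_heartbeats: int,
                  minutes_since_last: Optional[int]) -> str:
    """Mirror of the badge logic in updateLicenseView()."""
    if recent_heartbeats > 0:
        return 'online'            # heartbeat within the last 5 minutes
    if minutes_since_last is None:
        return 'unknown'           # license never sent a heartbeat
    if minutes_since_last < 60:
        return f'{minutes_since_last} min ago'
    return 'offline'               # silent for an hour or more

print(server_status(3, 0))      # online
print(server_status(0, 12))     # 12 min ago
print(server_status(0, 300))    # offline
print(server_status(0, None))   # unknown
```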
v2_adminpanel/templates/license_monitor.html (new file, 319 lines)
{% extends "base.html" %}
|
||||||
|
|
||||||
|
{% block title %}Lizenzserver Monitor{% endblock %}
|
||||||
|
|
||||||
|
{% block extra_css %}
|
||||||
|
<style>
|
||||||
|
.stat-card {
|
||||||
|
background: white;
|
||||||
|
border-radius: 8px;
|
||||||
|
padding: 1.5rem;
|
||||||
|
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
|
||||||
|
text-align: center;
|
||||||
|
transition: transform 0.2s;
|
||||||
|
}
|
||||||
|
|
||||||
|
.stat-card:hover {
|
||||||
|
transform: translateY(-2px);
|
||||||
|
box-shadow: 0 4px 8px rgba(0,0,0,0.15);
|
||||||
|
}
|
||||||
|
|
||||||
|
.stat-number {
|
||||||
|
font-size: 2.5rem;
|
||||||
|
font-weight: bold;
|
||||||
|
color: var(--status-active);
|
||||||
|
}
|
||||||
|
|
||||||
|
.stat-label {
|
||||||
|
color: #6c757d;
|
||||||
|
font-size: 0.9rem;
|
||||||
|
text-transform: uppercase;
|
||||||
|
letter-spacing: 0.5px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.live-indicator {
|
||||||
|
display: inline-block;
|
||||||
|
width: 8px;
|
||||||
|
height: 8px;
|
||||||
|
background: var(--status-active);
|
||||||
|
border-radius: 50%;
|
||||||
|
animation: pulse 2s infinite;
|
||||||
|
margin-right: 5px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.validation-timeline {
|
||||||
|
height: 300px;
|
||||||
|
background: #f8f9fa;
|
||||||
|
border-radius: 8px;
|
||||||
|
padding: 1rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.anomaly-alert {
|
||||||
|
padding: 1rem;
|
||||||
|
border-left: 4px solid var(--status-danger);
|
||||||
|
background: #fff5f5;
|
||||||
|
margin-bottom: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.device-badge {
|
||||||
|
display: inline-block;
|
||||||
|
padding: 0.25rem 0.5rem;
|
||||||
|
background: #e9ecef;
|
||||||
|
border-radius: 4px;
|
||||||
|
font-size: 0.85rem;
|
||||||
|
margin-right: 0.5rem;
|
||||||
|
}
|
||||||
|
</style>
|
||||||
|
{% endblock %}
|
||||||
|
|
||||||
|
{% block content %}
|
||||||
|
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||||
|
<h1><i class="bi bi-speedometer2"></i> Lizenzserver Live Monitor</h1>
|
||||||
|
<div>
|
||||||
|
<span class="live-indicator"></span>
|
||||||
|
<span class="text-muted">Live-Daten</span>
|
||||||
|
<button class="btn btn-sm btn-outline-secondary ms-3" onclick="toggleAutoRefresh()">
|
||||||
|
<i class="bi bi-arrow-clockwise"></i> Auto-Refresh: <span id="refresh-status">AN</span>
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Live Statistics Cards -->
|
||||||
|
<div class="row mb-4">
|
||||||
|
<div class="col-md-3 mb-3">
|
||||||
|
<div class="stat-card">
|
||||||
|
<div class="stat-number" id="active-licenses">
|
||||||
|
{{ live_stats[0] if live_stats else 0 }}
|
||||||
|
</div>
|
||||||
|
<div class="stat-label">Aktive Lizenzen</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="col-md-3 mb-3">
|
||||||
|
<div class="stat-card">
|
||||||
|
<div class="stat-number" id="total-validations">
|
||||||
|
{{ live_stats[1] if live_stats else 0 }}
|
||||||
|
</div>
|
||||||
|
<div class="stat-label">Validierungen (5 Min)</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="col-md-3 mb-3">
|
||||||
|
<div class="stat-card">
|
||||||
|
<div class="stat-number" id="unique-devices">
|
||||||
|
{{ live_stats[2] if live_stats else 0 }}
|
||||||
|
</div>
|
||||||
|
<div class="stat-label">Aktive Geräte</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="col-md-3 mb-3">
|
||||||
|
<div class="stat-card">
|
||||||
|
<div class="stat-number" id="unique-ips">
|
||||||
|
{{ live_stats[3] if live_stats else 0 }}
|
||||||
|
</div>
|
||||||
|
<div class="stat-label">Unique IPs</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="row">
|
||||||
|
<!-- Validation Timeline -->
|
||||||
|
<div class="col-md-8 mb-4">
|
||||||
|
<div class="card">
|
||||||
|
<div class="card-header">
|
||||||
|
<h5 class="mb-0">Validierungen pro Minute</h5>
|
||||||
|
</div>
|
||||||
|
<div class="card-body">
|
||||||
|
<canvas id="validationChart" height="100"></canvas>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Recent Anomalies -->
|
||||||
|
<div class="col-md-4 mb-4">
|
||||||
|
<div class="card">
|
||||||
|
<div class="card-header d-flex justify-content-between align-items-center">
|
||||||
|
<h5 class="mb-0">Aktuelle Anomalien</h5>
|
||||||
|
<a href="{{ url_for('admin.license_anomalies') }}" class="btn btn-sm btn-outline-primary">
|
||||||
|
Alle anzeigen
|
||||||
|
</a>
|
||||||
|
</div>
|
||||||
|
<div class="card-body" style="max-height: 400px; overflow-y: auto;">
|
||||||
|
{% if recent_anomalies %}
|
||||||
|
{% for anomaly in recent_anomalies %}
|
||||||
|
<div class="anomaly-alert">
|
||||||
|
<div class="d-flex justify-content-between">
|
||||||
|
<span class="badge badge-{{ 'danger' if anomaly['severity'] == 'critical' else anomaly['severity'] }}">
|
||||||
|
{{ anomaly['severity'].upper() }}
|
||||||
|
</span>
|
||||||
|
<small class="text-muted">{{ anomaly['detected_at'].strftime('%H:%M') }}</small>
|
||||||
|
</div>
|
||||||
|
<div class="mt-2">
|
||||||
|
<strong>{{ anomaly['anomaly_type'].replace('_', ' ').title() }}</strong><br>
|
||||||
|
<small>Lizenz: {{ anomaly['license_key'][:8] }}...</small>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
{% endfor %}
|
||||||
|
{% else %}
|
||||||
|
<p class="text-muted text-center">Keine aktiven Anomalien</p>
|
||||||
|
{% endif %}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Top Active Licenses -->
|
||||||
|
<div class="card mb-4">
|
||||||
|
<div class="card-header">
|
||||||
|
<h5 class="mb-0">Top Aktive Lizenzen (letzte 15 Min)</h5>
|
||||||
|
</div>
|
||||||
|
<div class="card-body">
|
||||||
|
<div class="table-responsive">
|
||||||
|
<table class="table table-hover">
|
||||||
|
<thead>
|
||||||
|
<tr>
|
||||||
|
<th>Lizenzschlüssel</th>
|
||||||
|
<th>Kunde</th>
|
||||||
|
<th>Geräte</th>
|
||||||
|
<th>Validierungen</th>
|
||||||
|
<th>Zuletzt gesehen</th>
|
||||||
|
<th>Status</th>
|
||||||
|
</tr>
|
||||||
|
</thead>
|
||||||
|
<tbody id="top-licenses-tbody">
|
||||||
|
{% for license in top_licenses %}
|
||||||
|
<tr>
|
||||||
|
<td>
|
||||||
|
<code>{{ license['license_key'][:12] }}...</code>
|
||||||
|
</td>
|
||||||
|
<td>{{ license['customer_name'] }}</td>
|
||||||
|
<td>
|
||||||
|
<span class="device-badge">
|
||||||
|
<i class="bi bi-laptop"></i> {{ license['device_count'] }}
|
||||||
|
</span>
|
||||||
|
</td>
|
||||||
|
<td>{{ license['validation_count'] }}</td>
|
||||||
|
<td>{{ license['last_seen'].strftime('%H:%M:%S') }}</td>
|
||||||
|
<td>
|
||||||
|
<span class="badge bg-success">Aktiv</span>
|
||||||
|
</td>
|
||||||
|
</tr>
|
||||||
|
{% endfor %}
|
||||||
|
</tbody>
|
||||||
|
</table>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Latest Validations Stream -->
|
||||||
|
<div class="card">
|
||||||
|
<div class="card-header">
|
||||||
|
<h5 class="mb-0">Letzte Validierungen (Live-Stream)</h5>
|
||||||
|
</div>
|
||||||
|
<div class="card-body">
|
||||||
|
<div id="validation-stream" style="max-height: 300px; overflow-y: auto;">
|
||||||
|
<!-- Will be populated by JavaScript -->
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{% endblock %}
|
||||||
|
|
||||||
|
{% block extra_js %}
|
||||||
|
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
|
||||||
|
<script>
|
||||||
|
let autoRefresh = true;
|
||||||
|
let refreshInterval;
|
||||||
|
let validationChart;
|
||||||
|
|
||||||
|
// Initialize validation chart
|
||||||
|
const ctx = document.getElementById('validationChart').getContext('2d');
|
||||||
|
validationChart = new Chart(ctx, {
|
||||||
|
type: 'line',
|
||||||
|
data: {
|
||||||
|
labels: [],
|
||||||
|
datasets: [{
|
||||||
|
label: 'Validierungen',
|
||||||
|
data: [],
|
||||||
|
borderColor: 'rgb(40, 167, 69)',
|
||||||
|
backgroundColor: 'rgba(40, 167, 69, 0.1)',
|
||||||
|
tension: 0.1
|
||||||
|
}]
|
||||||
|
},
|
||||||
|
options: {
|
||||||
|
responsive: true,
|
||||||
|
maintainAspectRatio: false,
|
||||||
|
scales: {
|
||||||
|
y: {
|
||||||
|
beginAtZero: true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Update chart with validation rates
|
||||||
|
{% if validation_rates %}
|
||||||
|
const rates = {{ validation_rates|tojson }};
|
||||||
|
validationChart.data.labels = rates.map(r => new Date(r[0]).toLocaleTimeString('de-DE', {hour: '2-digit', minute: '2-digit'})).reverse();
|
||||||
|
validationChart.data.datasets[0].data = rates.map(r => r[1]).reverse();
|
||||||
|
validationChart.update();
|
||||||
|
{% endif %}
|
||||||
|
|
||||||
|
// Fetch live statistics
|
||||||
|
function fetchLiveStats() {
|
||||||
|
fetch('{{ url_for("admin.license_live_stats") }}')
|
||||||
|
.then(response => response.json())
|
||||||
|
.then(data => {
|
||||||
|
// Update statistics
|
||||||
|
document.getElementById('active-licenses').textContent = data.active_licenses;
|
||||||
|
document.getElementById('total-validations').textContent = data.validations_per_minute;
|
||||||
|
document.getElementById('unique-devices').textContent = data.active_devices;
|
||||||
|
|
||||||
|
// Update validation stream
|
||||||
|
const stream = document.getElementById('validation-stream');
|
||||||
|
const newEntries = data.latest_validations.map(v =>
|
||||||
|
`<div class="d-flex justify-content-between border-bottom py-2">
|
||||||
|
<span>
|
||||||
|
<code>${v.license_key}</code> |
|
||||||
|
<span class="text-muted">${v.hardware_id}</span>
|
||||||
|
</span>
|
||||||
|
<span>
|
||||||
|
<span class="badge bg-secondary">${v.ip_address}</span>
|
||||||
|
<span class="text-muted ms-2">${v.timestamp}</span>
|
||||||
|
</span>
|
||||||
|
</div>`
|
||||||
|
).join('');
|
||||||
|
|
||||||
|
if (newEntries) {
|
||||||
|
stream.innerHTML = newEntries + stream.innerHTML;
|
||||||
|
// Keep only last 20 entries
|
||||||
|
const entries = stream.querySelectorAll('div');
|
||||||
|
if (entries.length > 20) {
|
||||||
|
for (let i = 20; i < entries.length; i++) {
|
||||||
|
entries[i].remove();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
.catch(error => console.error('Error fetching live stats:', error));
|
||||||
|
}
|
||||||
|
|
||||||
|
// Toggle auto-refresh
|
||||||
|
function toggleAutoRefresh() {
|
||||||
|
autoRefresh = !autoRefresh;
|
||||||
|
document.getElementById('refresh-status').textContent = autoRefresh ? 'AN' : 'AUS';
|
||||||
|
|
||||||
|
if (autoRefresh) {
|
||||||
|
refreshInterval = setInterval(fetchLiveStats, 5000);
|
||||||
|
} else {
|
||||||
|
clearInterval(refreshInterval);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Start auto-refresh
|
||||||
|
if (autoRefresh) {
|
||||||
|
refreshInterval = setInterval(fetchLiveStats, 5000);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Initial fetch
|
||||||
|
fetchLiveStats();
|
||||||
|
</script>
|
||||||
|
{% endblock %}
|
||||||