Add JWT authentication with user/password login, role-based access control, and Bearer token support while maintaining legacy API key compatibility for transition; update README with comprehensive JWT security guidelines, installation instructions for Python/SSH, development startup scripts (.env, run_dev.sh/ps1), and migrate all API examples to JWT authentication; update Ansible inventory SSH key paths from Docker container paths to local user paths

This commit is contained in:
Bruno Charest 2025-12-14 17:33:34 -05:00
parent 5a512d39b5
commit 0030fcc101
177 changed files with 22810 additions and 289 deletions

README.md

@@ -17,7 +17,8 @@ A modern, professional application for automated homelab management
- **Complete Endpoints**: Full management of hosts, tasks, and logs
- **Pydantic Validation**: Automatic data validation
- **WebSocket Support**: Real-time communication
- **JWT Authentication**: User/password login with roles and Bearer tokens
- **API Key Compatibility (legacy)**: A transition mode for older integrations
- **Interactive Documentation**: Swagger UI and ReDoc
- **CORS Support**: Compatible with modern web applications
@@ -121,6 +122,18 @@ The following JSON files are used by the application to store its state
- Ansible (to run the playbooks)
- A modern browser (Chrome, Firefox, Safari, Edge)
### Installing Python
```bash
# On Debian/Ubuntu
sudo apt install python3.12
sudo apt install python3-pip
sudo apt install python3.12-venv

# On Windows
scoop install python
```
### Installing Ansible (optional but recommended)
```bash
# On Debian/Ubuntu
...
```

@@ -143,17 +156,87 @@ brew install ansible
2. **Install the Python dependencies**
```bash
# On Windows with WSL (Ubuntu)
wsl -d Ubuntu

# Go to the project root (from the WSL side)
cd /mnt/c/dev/git/python/homelab-automation-api-v2

# Create and activate a Python virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install the backend dependencies
pip install -r app/requirements.txt
```
3. **Install and configure SSH on the clients**
```bash
# Install SSH and sshpass
sudo apt install ssh
sudo apt install sshpass

# Edit the SSH configuration in /etc/ssh/sshd_config
PermitRootLogin yes
PubkeyAuthentication yes

# Restart the SSH service on Debian
sudo service ssh restart

# Restart the SSH service on Alpine Linux
sudo rc-service sshd restart
```
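The Ansible inventories in this commit expect a dedicated key at `~/.ssh/id_automation_ansible` for the `automation` user. A minimal sketch for creating and distributing it; the client hostname is just one example taken from the inventory:

```bash
# Generate a dedicated ed25519 key at the path the inventories reference
ssh-keygen -t ed25519 -f ~/.ssh/id_automation_ansible -C "automation"

# Copy the public key to a client (example host from the inventory)
ssh-copy-id -i ~/.ssh/id_automation_ansible.pub automation@dev.lab.home
```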
4. **Start the backend server (recommended)**
```bash
# From the project root (homelab-automation-api-v2)
cd homelab-automation-api-v2
python -m uvicorn app.app_optimized:app --host 0.0.0.0 --port 8000 --reload
```

Or directly as a Python module (explicit module path):
```bash
python -m app.app_optimized
```

> The `app_optimized` module lives in the `app/` folder, so it must be referenced as `app.app_optimized` when running from the project root.
#### Using a `.env` file and the startup scripts

To avoid exporting environment variables by hand (API, SSH, notifications, JWT, database, etc.), you can use a `.env` file at the project root together with the provided scripts:

1. **Create a `.env` file from the Docker example** (a minimal sketch follows this step)
```bash
# From the project root
cp docker/.env.example .env
# Then edit .env with your values (API_KEY, SSH_USER, NTFY_*, JWT_SECRET_KEY, etc.)
```
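A minimal sketch of such a `.env` file; only variable names mentioned in this README are used, `NTFY_TOPIC` is an assumed `NTFY_*` name, and every value is a placeholder:

```bash
API_KEY=dev-key-12345                       # legacy API key mode only
SSH_USER=automation                         # matches ansible_user in the inventories
JWT_SECRET_KEY=change-me-long-random-value  # see the JWT security notes below
JWT_EXPIRE_MINUTES=240
NTFY_TOPIC=homelab-alerts                   # assumed NTFY_* variable name
```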
2. **Start in development on Linux/macOS**

The `run_dev.sh` script automatically loads the variables from the `.env` file and then starts uvicorn:
```bash
chmod +x run_dev.sh
./run_dev.sh
```

3. **Start in development on Windows (PowerShell)**

The `run_dev.ps1` script also reads the `.env` file and sets the environment variables for the session before starting uvicorn:
```powershell
# From the project root
.\run_dev.ps1
```

Both scripts use the same underlying uvicorn command:
```bash
python -m uvicorn app.app_optimized:app --host 0.0.0.0 --port 8000 --reload
```

4. **Open the full frontend dashboard** (the interface from image 1)

@@ -184,11 +267,55 @@ brew install ansible
### REST API

#### Authentication

The application now uses **modern JWT authentication** with username/password and role support.

- **Recommended mode (JWT)**
  - Log in with `POST /api/auth/login/json` (or `/api/auth/login` with form data)
  - Retrieve an `access_token` (JWT) and send it in the `Authorization: Bearer <token>` header
  - Fetch the current user's info with `GET /api/auth/me`
- **Legacy mode (API key)**
  - Still available via `API_KEY` and the `X-API-Key` header, but **disabled as soon as a user is created**.
  - Use it only while transitioning from older versions.
```bash
# 1) Create the first admin user (if no user database exists yet)
curl -X POST "http://localhost:8000/api/auth/setup" \
  -H "Content-Type: application/json" \
  -d '{
    "username": "admin",
    "password": "motdepasse-securise",
    "display_name": "Administrateur"
  }'

# 2) Log in and retrieve the JWT token
TOKEN=$(curl -s -X POST "http://localhost:8000/api/auth/login/json" \
  -H "Content-Type: application/json" \
  -d '{"username": "admin", "password": "motdepasse-securise"}' | jq -r .access_token)

# 3) Use the token to call the API
curl -H "Authorization: Bearer $TOKEN" http://localhost:8000/api/hosts
```
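To verify that the token works and inspect the authenticated user, call the `GET /api/auth/me` endpoint mentioned above:

```bash
curl -H "Authorization: Bearer $TOKEN" http://localhost:8000/api/auth/me
```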
##### 🔐 JWT security / best practices

- **Strong secret key**:
  - Set `JWT_SECRET_KEY` in `.env` to a long, random value (at least 32 characters); see the sketch after this list.
  - Never commit the real key to the Git repository.
- **Reasonable lifetime**:
  - Use `JWT_EXPIRE_MINUTES` to limit token validity (e.g. 60-240 min in production, 1440 min in dev/demo).
  - The shorter the lifetime, the smaller the impact of a stolen token.
- **Client-side storage**:
  - The frontend stores the token in `localStorage` to keep the integration simple.
  - For a deployment exposed to the Internet, consider HttpOnly cookies plus mandatory TLS to harden security.
- **Key rotation**:
  - If a leak is suspected, change `JWT_SECRET_KEY` and restart the API: all previously issued tokens become invalid.
  - Plan a periodic key change (e.g. every 6-12 months) for sensitive environments.
- **Network protection**:
  - Always expose the API behind HTTPS (a reverse proxy such as Nginx/Traefik, certbot, etc.).
  - Restrict IPs/permissions at the firewall level where possible.
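A quick way to generate such a secret; this sketch assumes `openssl` is available and appends to the `.env` file described earlier:

```bash
# Generate a 64-hex-character random secret for JWT_SECRET_KEY
echo "JWT_SECRET_KEY=$(openssl rand -hex 32)" >> .env
```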
#### Main Endpoints

**Hosts**

@@ -243,28 +370,30 @@ curl -H "X-API-Key: dev-key-12345" http://localhost:8000/api/hosts
- `POST /api/notifications/send` - Sends a custom notification
- `POST /api/notifications/toggle` - Enables/disables notifications

#### Ansible usage examples (with JWT)

The examples below assume you already have a JWT token in the `$TOKEN` shell variable.
**List the available playbooks:**
```bash
curl -H "Authorization: Bearer $TOKEN" http://localhost:8000/api/ansible/playbooks
```

**View the Ansible inventory:**
```bash
curl -H "Authorization: Bearer $TOKEN" http://localhost:8000/api/ansible/inventory
```

**Run a playbook (e.g. an upgrade on the "lab" group):**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{"playbook": "vm-upgrade.yml", "target": "lab"}' \
  http://localhost:8000/api/ansible/execute
```

**Create an Ansible task via the tasks API:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{"action": "upgrade", "group": "proxmox"}' \
  http://localhost:8000/api/tasks
```

@@ -272,12 +401,12 @@ curl -X POST -H "X-API-Key: dev-key-12345" -H "Content-Type: application/json" \
**Run an ad-hoc command:**
```bash
# Check disk space on all hosts
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{"target": "all", "command": "df -h /", "module": "shell"}' \
  http://localhost:8000/api/ansible/adhoc

# Restart a service with sudo
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{"target": "web-servers", "command": "systemctl restart nginx", "become": true}' \
  http://localhost:8000/api/ansible/adhoc
```

@@ -286,7 +415,7 @@ curl -X POST -H "X-API-Key: dev-key-12345" -H "Content-Type: application/json" \
**Create a daily schedule:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{
    "name": "Backup quotidien",
    "playbook": "backup-config.yml",
    ...
```

@@ -300,7 +429,7 @@ curl -X POST -H "X-API-Key: dev-key-12345" -H "Content-Type: application/json" \
**Create a weekly schedule (Monday and Friday):**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{
    "name": "Health check bi-hebdo",
    "playbook": "health-check.yml",
    ...
```

@@ -313,7 +442,7 @@ curl -X POST -H "X-API-Key: dev-key-12345" -H "Content-Type: application/json" \
**Create a schedule with a cron expression:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{
    "name": "Maintenance mensuelle",
    "playbook": "vm-upgrade.yml",
    ...
```

@@ -326,13 +455,13 @@ curl -X POST -H "X-API-Key: dev-key-12345" -H "Content-Type: application/json" \
**Run a schedule immediately:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" \
  http://localhost:8000/api/schedules/{schedule_id}/run
```

**View the run history:**
```bash
curl -H "Authorization: Bearer $TOKEN" \
  http://localhost:8000/api/schedules/{schedule_id}/runs
```

@@ -340,13 +469,13 @@ curl -H "X-API-Key: dev-key-12345" \
**Test the ntfy configuration:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" \
  "http://localhost:8000/api/notifications/test?message=Hello%20from%20Homelab"
```

**Send a custom notification:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{
    "topic": "homelab-alerts",
    "message": "Serveur redémarré avec succès",
    ...
```

@@ -359,7 +488,7 @@ curl -X POST -H "X-API-Key: dev-key-12345" -H "Content-Type: application/json" \
**Temporarily disable notifications:**
```bash
curl -X POST -H "Authorization: Bearer $TOKEN" \
  "http://localhost:8000/api/notifications/toggle?enabled=false"
```

@@ -0,0 +1,46 @@
"""Add users table for authentication
Revision ID: 0004_add_users
Revises: 0003_add_notification_type
Create Date: 2025-12-09
"""
from __future__ import annotations
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "0004_add_users"
down_revision = "0003_add_notification_type"
branch_labels = None
depends_on = None
def upgrade() -> None:
    op.create_table(
        "users",
        sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
        sa.Column("username", sa.String(50), nullable=False, unique=True),
        sa.Column("email", sa.String(255), nullable=True, unique=True),
        sa.Column("hashed_password", sa.String(255), nullable=False),
        sa.Column("role", sa.String(20), nullable=False, server_default=sa.text("'admin'")),
        sa.Column("is_active", sa.Boolean(), nullable=False, server_default=sa.text("1")),
        sa.Column("is_superuser", sa.Boolean(), nullable=False, server_default=sa.text("0")),
        sa.Column("display_name", sa.String(100), nullable=True),
        sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
        sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
        sa.Column("last_login", sa.DateTime(timezone=True), nullable=True),
        sa.Column("password_changed_at", sa.DateTime(timezone=True), nullable=True),
        sa.Column("deleted_at", sa.DateTime(timezone=True), nullable=True),
    )

    # Create indexes on username and email for fast lookups
    op.create_index("idx_users_username", "users", ["username"])
    op.create_index("idx_users_email", "users", ["email"])


def downgrade() -> None:
    op.drop_index("idx_users_email", table_name="users")
    op.drop_index("idx_users_username", table_name="users")
    op.drop_table("users")
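These migrations are applied with Alembic's standard CLI; a minimal sketch, assuming an `alembic.ini` is configured for the project database:

```bash
# Apply all pending migrations up to the latest revision
alembic upgrade head

# Roll back a single revision if something goes wrong
alembic downgrade -1
```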

@@ -0,0 +1,86 @@
"""Add host_metrics table for builtin playbooks data collection
Revision ID: 0005
Revises: 0004_add_users_table
Create Date: 2024-12-11
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = '0005_add_host_metrics'
down_revision: Union[str, None] = '0004_add_users'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    op.create_table(
        'host_metrics',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('host_id', sa.String(), nullable=False),
        sa.Column('metric_type', sa.String(50), nullable=False),
        # CPU metrics
        sa.Column('cpu_count', sa.Integer(), nullable=True),
        sa.Column('cpu_model', sa.String(200), nullable=True),
        sa.Column('cpu_load_1m', sa.Float(), nullable=True),
        sa.Column('cpu_load_5m', sa.Float(), nullable=True),
        sa.Column('cpu_load_15m', sa.Float(), nullable=True),
        sa.Column('cpu_usage_percent', sa.Float(), nullable=True),
        sa.Column('cpu_temperature', sa.Float(), nullable=True),
        # Memory metrics
        sa.Column('memory_total_mb', sa.Integer(), nullable=True),
        sa.Column('memory_used_mb', sa.Integer(), nullable=True),
        sa.Column('memory_free_mb', sa.Integer(), nullable=True),
        sa.Column('memory_usage_percent', sa.Float(), nullable=True),
        sa.Column('swap_total_mb', sa.Integer(), nullable=True),
        sa.Column('swap_used_mb', sa.Integer(), nullable=True),
        sa.Column('swap_usage_percent', sa.Float(), nullable=True),
        # Disk metrics
        sa.Column('disk_info', sa.JSON(), nullable=True),
        sa.Column('disk_root_total_gb', sa.Float(), nullable=True),
        sa.Column('disk_root_used_gb', sa.Float(), nullable=True),
        sa.Column('disk_root_usage_percent', sa.Float(), nullable=True),
        # System info
        sa.Column('os_name', sa.String(100), nullable=True),
        sa.Column('os_version', sa.String(100), nullable=True),
        sa.Column('kernel_version', sa.String(100), nullable=True),
        sa.Column('hostname', sa.String(200), nullable=True),
        sa.Column('uptime_seconds', sa.Integer(), nullable=True),
        sa.Column('uptime_human', sa.String(100), nullable=True),
        # Network info
        sa.Column('network_info', sa.JSON(), nullable=True),
        # Raw data and metadata
        sa.Column('raw_data', sa.JSON(), nullable=True),
        sa.Column('collection_source', sa.String(100), nullable=True),
        sa.Column('collection_duration_ms', sa.Integer(), nullable=True),
        sa.Column('error_message', sa.Text(), nullable=True),
        # Timestamps
        sa.Column('collected_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
        sa.ForeignKeyConstraint(['host_id'], ['hosts.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )

    # Create indexes
    op.create_index('idx_host_metrics_host_id', 'host_metrics', ['host_id'])
    op.create_index('idx_host_metrics_collected_at', 'host_metrics', ['collected_at'])
    op.create_index('idx_host_metrics_metric_type', 'host_metrics', ['metric_type'])


def downgrade() -> None:
    op.drop_index('idx_host_metrics_metric_type', table_name='host_metrics')
    op.drop_index('idx_host_metrics_collected_at', table_name='host_metrics')
    op.drop_index('idx_host_metrics_host_id', table_name='host_metrics')
    op.drop_table('host_metrics')

@@ -0,0 +1,42 @@
"""Add detailed CPU/disk fields to host_metrics
Revision ID: 0006_add_host_metrics_details
Revises: 0005_add_host_metrics
Create Date: 2025-12-12
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = '0006_add_host_metrics_details'
down_revision: Union[str, None] = '0005_add_host_metrics'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    op.add_column('host_metrics', sa.Column('cpu_cores', sa.Integer(), nullable=True))
    op.add_column('host_metrics', sa.Column('cpu_threads', sa.Integer(), nullable=True))
    op.add_column('host_metrics', sa.Column('cpu_threads_per_core', sa.Integer(), nullable=True))
    op.add_column('host_metrics', sa.Column('cpu_sockets', sa.Integer(), nullable=True))
    op.add_column('host_metrics', sa.Column('cpu_mhz', sa.Float(), nullable=True))
    op.add_column('host_metrics', sa.Column('cpu_max_mhz', sa.Float(), nullable=True))
    op.add_column('host_metrics', sa.Column('cpu_min_mhz', sa.Float(), nullable=True))
    op.add_column('host_metrics', sa.Column('disk_devices', sa.JSON(), nullable=True))


def downgrade() -> None:
    op.drop_column('host_metrics', 'disk_devices')
    op.drop_column('host_metrics', 'cpu_min_mhz')
    op.drop_column('host_metrics', 'cpu_max_mhz')
    op.drop_column('host_metrics', 'cpu_mhz')
    op.drop_column('host_metrics', 'cpu_sockets')
    op.drop_column('host_metrics', 'cpu_threads_per_core')
    op.drop_column('host_metrics', 'cpu_threads')
    op.drop_column('host_metrics', 'cpu_cores')

@@ -0,0 +1,47 @@
"""Add alerts table
Revision ID: 0007_add_alerts_table
Revises: 0006_add_host_metrics_details
Create Date: 2025-12-12
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = '0007_add_alerts_table'
down_revision: Union[str, None] = '0006_add_host_metrics_details'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    op.create_table(
        'alerts',
        sa.Column('id', sa.Integer(), primary_key=True, autoincrement=True),
        sa.Column('user_id', sa.Integer(), sa.ForeignKey('users.id', ondelete='SET NULL'), nullable=True),
        sa.Column('category', sa.String(length=50), nullable=False),
        sa.Column('level', sa.String(length=20), nullable=True),
        sa.Column('title', sa.String(length=255), nullable=True),
        sa.Column('message', sa.Text(), nullable=False),
        sa.Column('source', sa.String(length=50), nullable=True),
        sa.Column('details', sa.JSON(), nullable=True),
        sa.Column('read_at', sa.DateTime(timezone=True), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('CURRENT_TIMESTAMP'), nullable=False),
    )

    op.create_index('idx_alerts_created_at', 'alerts', ['created_at'])
    op.create_index('idx_alerts_user_id', 'alerts', ['user_id'])
    op.create_index('idx_alerts_category', 'alerts', ['category'])
    op.create_index('idx_alerts_read_at', 'alerts', ['read_at'])


def downgrade() -> None:
    op.drop_index('idx_alerts_read_at', table_name='alerts')
    op.drop_index('idx_alerts_category', table_name='alerts')
    op.drop_index('idx_alerts_user_id', table_name='alerts')
    op.drop_index('idx_alerts_created_at', table_name='alerts')
    op.drop_table('alerts')

@@ -0,0 +1,28 @@
"""Add LVM/ZFS metrics fields to host_metrics
Revision ID: 0008_add_lvm_zfs_metrics
Revises: 0007_add_alerts_table
Create Date: 2025-12-12
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = '0008_add_lvm_zfs_metrics'
down_revision: Union[str, None] = '0007_add_alerts_table'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    op.add_column('host_metrics', sa.Column('lvm_info', sa.JSON(), nullable=True))
    op.add_column('host_metrics', sa.Column('zfs_info', sa.JSON(), nullable=True))


def downgrade() -> None:
    op.drop_column('host_metrics', 'zfs_info')
    op.drop_column('host_metrics', 'lvm_info')

@@ -0,0 +1,34 @@
"""Add app_settings table
Revision ID: 0009_add_app_settings_table
Revises: 0008_add_lvm_zfs_metrics
Create Date: 2025-12-13
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = '0009_add_app_settings_table'
down_revision: Union[str, None] = '0008_add_lvm_zfs_metrics'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
    op.create_table(
        'app_settings',
        sa.Column('key', sa.String(length=100), primary_key=True, nullable=False),
        sa.Column('value', sa.Text(), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')),
        sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')),
    )
    op.create_index('idx_app_settings_updated_at', 'app_settings', ['updated_at'], unique=False)


def downgrade() -> None:
    op.drop_index('idx_app_settings_updated_at', table_name='app_settings')
    op.drop_table('app_settings')

@@ -0,0 +1,61 @@
"""Remove foreign key constraints from logs table.
Revision ID: 0010
Revises: 0009_add_app_settings_table
Create Date: 2025-12-14
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '0010_remove_logs_foreign_keys'
down_revision = '0009_add_app_settings_table'
branch_labels = None
depends_on = None
def upgrade():
    """Remove FK constraints from logs table by recreating it."""
    # SQLite doesn't support ALTER TABLE DROP CONSTRAINT,
    # so we need to recreate the table without the FK constraints.

    # Create the new table without FK constraints
    op.create_table(
        'logs_new',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('level', sa.String(), nullable=False),
        sa.Column('source', sa.String(), nullable=True),
        sa.Column('message', sa.Text(), nullable=False),
        sa.Column('details', sa.JSON(), nullable=True),
        sa.Column('host_id', sa.String(), nullable=True),
        sa.Column('task_id', sa.String(), nullable=True),
        sa.Column('schedule_id', sa.String(), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
        sa.PrimaryKeyConstraint('id')
    )

    # Copy data from the old table
    op.execute('''
        INSERT INTO logs_new (id, level, source, message, details, host_id, task_id, schedule_id, created_at)
        SELECT id, level, source, message, details, host_id, task_id, schedule_id, created_at FROM logs
    ''')

    # Drop the old table
    op.drop_table('logs')

    # Rename the new table
    op.rename_table('logs_new', 'logs')

    # Recreate the indexes
    op.create_index('idx_logs_created_at', 'logs', ['created_at'])
    op.create_index('idx_logs_level', 'logs', ['level'])
    op.create_index('idx_logs_source', 'logs', ['source'])


def downgrade():
    """Restore FK constraints (not recommended)."""
    # Restoring would require recreating the table with the FK constraints;
    # for simplicity, downgrade is a no-op.
    pass

@@ -1,4 +1,5 @@
ansible_port: 22
ansible_user: automation
# ansible_ssh_private_key_file: /app/ssh_keys/id_automation_ansible
ansible_ssh_private_key_file: ~/.ssh/id_automation_ansible

@@ -1,4 +1,5 @@
ansible_port: 22
ansible_user: automation
# ansible_ssh_private_key_file: /app/ssh_keys/id_automation_ansible
ansible_ssh_private_key_file: ~/.ssh/id_automation_ansible
ansible_python_interpreter: /usr/bin/python3

@@ -1,4 +1,5 @@
ansible_port: 22
ansible_user: automation
# ansible_ssh_private_key_file: /app/ssh_keys/id_automation_ansible
ansible_ssh_private_key_file: ~/.ssh/id_automation_ansible
ansible_python_interpreter: /usr/bin/python3

@@ -1,4 +1,5 @@
ansible_port: 22
ansible_user: automation
# ansible_ssh_private_key_file: /app/ssh_keys/id_automation_ansible
ansible_ssh_private_key_file: ~/.ssh/id_automation_ansible
ansible_python_interpreter: /usr/bin/python3

@@ -1,4 +1,5 @@
ansible_port: 22
ansible_user: automation
# ansible_ssh_private_key_file: /app/ssh_keys/id_automation_ansible
ansible_ssh_private_key_file: ~/.ssh/id_automation_ansible
ansible_python_interpreter: /usr/bin/python3

@@ -12,15 +12,31 @@ all:
        raspi.8gb.home: null
    env_lab:
      hosts:
        dev.lab.home: null
        media.labb.home: null
    env_prod:
      hosts:
        ali2v.truenas.home: null
        automate.prod.home: null
        dev.prod.home: null
        hp.truenas.home: null
        jump.point.home: null
    env_test:
      hosts: {}
    env_test2:
      hosts: {}
    role_docker:
      hosts:
        dev.lab.home: null
    role_lab_servers:
      hosts:
        dev.lab.home: null
        media.labb.home: null
    role_prod_servers:
      hosts:
        automate.prod.home: null
        dev.prod.home: null
        jump.point.home: null
    role_proxmox:
      hosts:
        ali2v.xeon.home: null
@@ -28,25 +44,22 @@ all:
        hp2.i7.home: null
        hp3.i5.home: null
        mimi.pc.home: null
    role_sbc:
      hosts:
        orangepi.pc.home: null
        raspi.4gb.home: null
        raspi.8gb.home: null
    role_test:
      hosts: {}
    role_test2:
      hosts: {}
    role_truenas:
      hosts:
        ali2v.truenas.home:
          ansible_python_interpreter: /usr/bin/python3
        hp.truenas.home: null
    local_test:
      hosts:
        localhost:
          ansible_connection: local
          ansible_python_interpreter: '{{ ansible_playbook_python }}'

@@ -12,17 +12,32 @@ all:
        raspi.8gb.home: null
    env_lab:
      hosts:
        dev.lab.home: null
        media.labb.home: null
    env_prod:
      hosts:
        ali2v.truenas.home: null
        automate.prod.home: null
        dev.prod.home: null
        hp.truenas.home: null
        jump.point.home: null
    env_test:
      hosts: {}
    env_test2:
      hosts:
        vivobook.local: null
    role_docker:
      hosts:
        dev.lab.home: null
    role_lab_servers:
      hosts:
        dev.lab.home: null
        media.labb.home: null
    role_prod_servers:
      hosts:
        automate.prod.home: null
        dev.prod.home: null
        jump.point.home: null
    role_proxmox:
      hosts:
        ali2v.xeon.home: null
@@ -30,26 +45,23 @@ all:
        hp2.i7.home: null
        hp3.i5.home: null
        mimi.pc.home: null
    role_sbc:
      hosts:
        orangepi.pc.home: null
        raspi.4gb.home: null
        raspi.8gb.home: null
    role_test:
      hosts:
        vivobook.local: null
    role_test2:
      hosts: {}
    local_test:
      hosts:
        localhost:
          ansible_connection: local
          ansible_python_interpreter: "{{ ansible_playbook_python }}"
    role_truenas:
      hosts:
        ali2v.truenas.home:
          ansible_python_interpreter: /usr/bin/python3
        hp.truenas.home: null

@@ -0,0 +1,50 @@
---
# Builtin Playbook: collect CPU information
- name: Collect CPU Information
  hosts: all
  become: false
  gather_facts: true
  vars:
    _builtin_playbook: true
    _builtin_id: collect_cpu_info
    _collect_metrics: true
  tasks:
    - name: Get CPU load averages
      ansible.builtin.shell: cat /proc/loadavg | awk '{print $1, $2, $3}'
      register: cpu_load
      changed_when: false

    - name: Get CPU temperature
      ansible.builtin.shell: |
        if [ -f /sys/class/thermal/thermal_zone0/temp ]; then
          cat /sys/class/thermal/thermal_zone0/temp | awk '{printf "%.1f", $1/1000}'
        else
          echo "null"
        fi
      register: cpu_temp
      changed_when: false
      ignore_errors: true

    - name: Get CPU usage percentage
      ansible.builtin.shell: |
        top -bn1 | grep "Cpu(s)" | awk '{print 100 - $8}' 2>/dev/null || echo "0"
      register: cpu_usage
      changed_when: false
      ignore_errors: true

    - name: Build metrics output
      ansible.builtin.set_fact:
        metrics_output:
          host: "{{ inventory_hostname }}"
          data:
            cpu_count: "{{ ansible_processor_vcpus | default(ansible_processor_count, true) | default(1) }}"
            cpu_model: "{{ ansible_processor[2] | default('Unknown', true) if (ansible_processor is defined and ansible_processor | length > 2) else 'Unknown' }}"
            cpu_load_1m: "{{ cpu_load.stdout.split()[0] | default('0', true) | float }}"
            cpu_load_5m: "{{ cpu_load.stdout.split()[1] | default('0', true) | float }}"
            cpu_load_15m: "{{ cpu_load.stdout.split()[2] | default('0', true) | float }}"
            cpu_usage_percent: "{{ cpu_usage.stdout | default('0', true) | float }}"
            cpu_temperature: "{{ cpu_temp.stdout if (cpu_temp.stdout is defined and cpu_temp.stdout != 'null') else '' }}"

    - name: Output metrics
      ansible.builtin.debug:
        msg: "METRICS_JSON_START:{{ metrics_output | to_json }}:METRICS_JSON_END"
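These builtin playbooks can also be run by hand with `ansible-playbook`; a minimal sketch, where both file paths are placeholders since the diff does not show the playbooks' filenames, and `lab` is a group from the inventories above:

```bash
# Placeholder paths; limit the run to the "lab" group from the inventory
ansible-playbook -i inventory.yml collect_cpu_info.yml --limit lab
```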

@@ -0,0 +1,39 @@
---
# Builtin Playbook: collect disk usage
- name: Collect Disk Usage Information
  hosts: all
  become: false
  gather_facts: false
  vars:
    _builtin_playbook: true
    _builtin_id: collect_disk_usage
    _collect_metrics: true
  tasks:
    - name: Get disk usage for all mount points
      ansible.builtin.shell: |
        df -BG --output=target,size,used,avail,pcent -x tmpfs -x devtmpfs -x squashfs 2>/dev/null | tail -n +2 | awk '{
          gsub("G",""); gsub("%","");
          printf "{\"mount\":\"%s\",\"total_gb\":%s,\"used_gb\":%s,\"free_gb\":%s,\"usage_percent\":%s}\n", $1, $2, $3, $4, $5
        }' | paste -sd "," | awk '{print "["$0"]"}' || echo '[]'
      register: disk_info
      changed_when: false

    - name: Get root partition info
      ansible.builtin.shell: |
        df -BG / 2>/dev/null | tail -1 | awk '{gsub("G",""); gsub("%",""); printf "{\"total_gb\":%s,\"used_gb\":%s,\"usage_percent\":%s}", $2, $3, $5}' || echo '{"total_gb":0,"used_gb":0,"usage_percent":0}'
      register: disk_root
      changed_when: false

    - name: Build metrics output
      ansible.builtin.set_fact:
        metrics_output:
          host: "{{ inventory_hostname }}"
          data:
            disk_info: "{{ disk_info.stdout | default('[]', true) | from_json }}"
            disk_root_total_gb: "{{ (disk_root.stdout | default('{\"total_gb\":0,\"used_gb\":0,\"usage_percent\":0}', true) | from_json).total_gb | float }}"
            disk_root_used_gb: "{{ (disk_root.stdout | default('{\"total_gb\":0,\"used_gb\":0,\"usage_percent\":0}', true) | from_json).used_gb | float }}"
            disk_root_usage_percent: "{{ (disk_root.stdout | default('{\"total_gb\":0,\"used_gb\":0,\"usage_percent\":0}', true) | from_json).usage_percent | float }}"

    - name: Output metrics
      ansible.builtin.debug:
        msg: "METRICS_JSON_START:{{ metrics_output | to_json }}:METRICS_JSON_END"

@@ -0,0 +1,73 @@
---
# Builtin Playbook: collect memory information
- name: Collect Memory Information
  hosts: all
  become: false
  gather_facts: false
  vars:
    _builtin_playbook: true
    _builtin_id: collect_memory_info
    _collect_metrics: true
  tasks:
    - name: Get memory info from /proc/meminfo (Linux)
      ansible.builtin.shell: |
        awk '/MemTotal/{total=$2} /MemFree/{free=$2} /MemAvailable/{avail=$2} /SwapTotal/{stotal=$2} /SwapFree/{sfree=$2} END{
          used=total-avail;
          usage=used/total*100;
          sused=stotal-sfree;
          susage=(stotal>0)?sused/stotal*100:0;
          printf "{\"total_mb\":%d,\"used_mb\":%d,\"free_mb\":%d,\"usage_percent\":%.1f,\"swap_total_mb\":%d,\"swap_used_mb\":%d,\"swap_usage_percent\":%.1f}",
            total/1024, used/1024, avail/1024, usage, stotal/1024, sused/1024, susage
        }' /proc/meminfo 2>/dev/null || echo ''
      register: memory_info_linux
      changed_when: false
      ignore_errors: true

    - name: Get memory info using sysctl (FreeBSD/TrueNAS)
      ansible.builtin.shell: |
        total=$(sysctl -n hw.physmem 2>/dev/null || echo 0)
        free=$(sysctl -n vm.stats.vm.v_free_count 2>/dev/null || echo 0)
        pagesize=$(sysctl -n hw.pagesize 2>/dev/null || echo 4096)
        swap_total=$(swapinfo -k 2>/dev/null | awk 'NR==2 {print $2}' || echo 0)
        swap_used=$(swapinfo -k 2>/dev/null | awk 'NR==2 {print $3}' || echo 0)
        total_mb=$((total / 1024 / 1024))
        free_mb=$((free * pagesize / 1024 / 1024))
        used_mb=$((total_mb - free_mb))
        usage=$(awk -v used=$used_mb -v total=$total_mb 'BEGIN {if(total>0) printf "%.1f", used/total*100; else print "0.0"}')
        swap_total_mb=$((swap_total / 1024))
        swap_used_mb=$((swap_used / 1024))
        swap_usage=$(awk -v used=$swap_used_mb -v total=$swap_total_mb 'BEGIN {if(total>0) printf "%.1f", used/total*100; else print "0.0"}')
        printf '{"total_mb":%d,"used_mb":%d,"free_mb":%d,"usage_percent":%s,"swap_total_mb":%d,"swap_used_mb":%d,"swap_usage_percent":%s}' \
          $total_mb $used_mb $free_mb $usage $swap_total_mb $swap_used_mb $swap_usage
      register: memory_info_bsd
      changed_when: false
      ignore_errors: true
      when: memory_info_linux.stdout == ''

    - name: Set memory info variable
      ansible.builtin.set_fact:
        memory_info_json: "{{ memory_info_linux.stdout if memory_info_linux.stdout != '' else (memory_info_bsd.stdout | default('{\"total_mb\":0,\"used_mb\":0,\"free_mb\":0,\"usage_percent\":0.0,\"swap_total_mb\":0,\"swap_used_mb\":0,\"swap_usage_percent\":0.0}')) }}"

    - name: Parse memory info JSON
      ansible.builtin.set_fact:
        memory_info_parsed: "{{ memory_info_json if (memory_info_json is mapping) else (memory_info_json | from_json) }}"

    - name: Build metrics output
      ansible.builtin.set_fact:
        metrics_output:
          host: "{{ inventory_hostname }}"
          data:
            memory_total_mb: "{{ memory_info_parsed.total_mb | default(0, true) | int }}"
            memory_used_mb: "{{ memory_info_parsed.used_mb | default(0, true) | int }}"
            memory_free_mb: "{{ memory_info_parsed.free_mb | default(0, true) | int }}"
            memory_usage_percent: "{{ memory_info_parsed.usage_percent | default(0.0, true) | float }}"
            swap_total_mb: "{{ memory_info_parsed.swap_total_mb | default(0, true) | int }}"
            swap_used_mb: "{{ memory_info_parsed.swap_used_mb | default(0, true) | int }}"
            swap_usage_percent: "{{ memory_info_parsed.swap_usage_percent | default(0.0, true) | float }}"

    - name: Output metrics
      ansible.builtin.debug:
        msg: "METRICS_JSON_START:{{ metrics_output | to_json }}:METRICS_JSON_END"

@@ -0,0 +1,49 @@
---
# Builtin Playbook: collect network information
- name: Collect Network Information
  hosts: all
  become: false
  gather_facts: false
  vars:
    _builtin_playbook: true
    _builtin_id: collect_network_info
    _collect_metrics: true
  tasks:
    - name: Get network interfaces info
      ansible.builtin.shell: |
        ip -j addr show 2>/dev/null | python3 -c "
        import sys, json
        try:
            data = json.load(sys.stdin)
            result = []
            for iface in data:
                if iface.get('ifname') not in ['lo']:
                    info = {
                        'name': iface.get('ifname'),
                        'state': iface.get('operstate', 'unknown'),
                        'mtu': iface.get('mtu')
                    }
                    for addr in iface.get('addr_info', []):
                        if addr.get('family') == 'inet':
                            info['ip_address'] = addr.get('local')
                            info['prefix'] = addr.get('prefixlen')
                    if iface.get('address'):
                        info['mac_address'] = iface.get('address')
                    result.append(info)
            print(json.dumps(result))
        except Exception:
            print('[]')
        " 2>/dev/null || echo "[]"
      register: network_info
      changed_when: false

    - name: Build metrics output
      ansible.builtin.set_fact:
        metrics_output:
          host: "{{ inventory_hostname }}"
          data:
            network_info: "{{ network_info.stdout | from_json }}"

    - name: Output metrics
      ansible.builtin.debug:
        msg: "METRICS_JSON_START:{{ metrics_output | to_json }}:METRICS_JSON_END"

@@ -0,0 +1,396 @@
---
# Builtin Playbook: collect complete system information
# This playbook collects CPU, memory, disk, OS, and network data in a single run
# Results are formatted as JSON so the application can parse them
- name: Collect Complete System Information
  hosts: all
  become: false
  gather_facts: true
  vars:
    _builtin_playbook: true
    _builtin_id: collect_system_info
    _collect_metrics: true
  tasks:
    - name: Gather additional facts
      ansible.builtin.setup:
        gather_subset:
          - hardware
          - network
          - virtual

    - name: Get CPU load averages
      ansible.builtin.shell: cat /proc/loadavg | awk '{print $1, $2, $3}'
      register: cpu_load
      changed_when: false
      ignore_errors: true

    - name: Parse CPU load parts safely
      ansible.builtin.set_fact:
        cpu_load_parts: "{{ cpu_load.stdout.split() if (cpu_load is defined and cpu_load.stdout is defined) else [] }}"

    - name: Get CPU temperature (if available)
      ansible.builtin.shell: |
        temp=""
        # 1) Try the standard kernel sensors (/sys/class/thermal)
        for z in /sys/class/thermal/thermal_zone*/temp; do
          if [ -f "$z" ]; then
            raw=$(cat "$z" 2>/dev/null || echo "")
            if [ -n "$raw" ]; then
              temp=$(awk -v v="$raw" 'BEGIN { if (v != "" && v != "0") printf "%.1f", v/1000; }')
              if [ -n "$temp" ]; then
                break
              fi
            fi
          fi
        done
        # 2) Raspberry Pi specific: vcgencmd measure_temp
        if [ -z "$temp" ] && command -v vcgencmd >/dev/null 2>&1; then
          raw=$(vcgencmd measure_temp 2>/dev/null | sed 's/[^0-9\.]*//g')
          if [ -n "$raw" ]; then
            temp=$raw
          fi
        fi
        # 3) Use lm-sensors if available
        if [ -z "$temp" ] && command -v sensors >/dev/null 2>&1; then
          raw=$(sensors 2>/dev/null | awk '/^Package id 0:|^Tctl:|^CPU Temp:|^temp1:/{gsub("+","",$2); gsub("°C","",$2); print $2; exit}')
          if [ -n "$raw" ]; then
            temp=$raw
          fi
        fi
        if [ -z "$temp" ]; then
          echo "null"
        else
          printf "%.1f" "$temp"
        fi
      register: cpu_temp
      changed_when: false
      ignore_errors: true

    - name: Get memory info from /proc/meminfo (Linux)
      ansible.builtin.shell: |
        awk '/MemTotal/{total=$2} /MemFree/{free=$2} /MemAvailable/{avail=$2} /Buffers/{buf=$2} /^Cached/{cache=$2} /SwapTotal/{stotal=$2} /SwapFree/{sfree=$2} END{
          used=total-avail;
          usage=used/total*100;
          sused=stotal-sfree;
          susage=(stotal>0)?sused/stotal*100:0;
          printf "{\"total_mb\":%d,\"used_mb\":%d,\"free_mb\":%d,\"usage_percent\":%.1f,\"swap_total_mb\":%d,\"swap_used_mb\":%d,\"swap_usage_percent\":%.1f}",
            total/1024, used/1024, avail/1024, usage, stotal/1024, sused/1024, susage
        }' /proc/meminfo 2>/dev/null || echo ''
      register: memory_info_linux
      changed_when: false
      ignore_errors: true

    - name: Get memory info using sysctl (FreeBSD/TrueNAS)
      ansible.builtin.shell: |
        total=$(sysctl -n hw.physmem 2>/dev/null || echo 0)
        free=$(sysctl -n vm.stats.vm.v_free_count 2>/dev/null || echo 0)
        pagesize=$(sysctl -n hw.pagesize 2>/dev/null || echo 4096)
        swap_total=$(swapinfo -k 2>/dev/null | awk 'NR==2 {print $2}' || echo 0)
        swap_used=$(swapinfo -k 2>/dev/null | awk 'NR==2 {print $3}' || echo 0)
        total_mb=$((total / 1024 / 1024))
        free_mb=$((free * pagesize / 1024 / 1024))
        used_mb=$((total_mb - free_mb))
        usage=$(awk -v used=$used_mb -v total=$total_mb 'BEGIN {if(total>0) printf "%.1f", used/total*100; else print "0.0"}')
        swap_total_mb=$((swap_total / 1024))
        swap_used_mb=$((swap_used / 1024))
        swap_usage=$(awk -v used=$swap_used_mb -v total=$swap_total_mb 'BEGIN {if(total>0) printf "%.1f", used/total*100; else print "0.0"}')
        printf '{"total_mb":%d,"used_mb":%d,"free_mb":%d,"usage_percent":%s,"swap_total_mb":%d,"swap_used_mb":%d,"swap_usage_percent":%s}' \
          $total_mb $used_mb $free_mb $usage $swap_total_mb $swap_used_mb $swap_usage
      register: memory_info_bsd
      changed_when: false
      ignore_errors: true
      when: memory_info_linux.stdout == ''

    - name: Set memory info variable
      ansible.builtin.set_fact:
        memory_info_json: "{{ memory_info_linux.stdout if memory_info_linux.stdout != '' else (memory_info_bsd.stdout | default('{\"total_mb\":0,\"used_mb\":0,\"free_mb\":0,\"usage_percent\":0.0,\"swap_total_mb\":0,\"swap_used_mb\":0,\"swap_usage_percent\":0.0}')) }}"

    - name: Parse memory info JSON
      ansible.builtin.set_fact:
        memory_info_parsed: "{{ memory_info_json if (memory_info_json is mapping) else (memory_info_json | from_json) }}"

    - name: Get disk usage for all mount points
      ansible.builtin.shell: |
        if df --help 2>/dev/null | grep -q -- '--output'; then
          df -BG --output=source,fstype,target,size,used,avail,pcent -x tmpfs -x devtmpfs -x squashfs 2>/dev/null | tail -n +2 | awk '{
            gsub("G",""); gsub("%","");
            dev=$1; fs=$2; mnt=$3;
            gsub(/\\/,"\\\\",dev); gsub(/\"/,"\\\"",dev);
            gsub(/\\/,"\\\\",fs); gsub(/\"/,"\\\"",fs);
            gsub(/\\/,"\\\\",mnt); gsub(/\"/,"\\\"",mnt);
            printf "{\"device\":\"%s\",\"filesystem\":\"%s\",\"mount\":\"%s\",\"total_gb\":%s,\"used_gb\":%s,\"free_gb\":%s,\"usage_percent\":%s}\n", dev, fs, mnt, $4, $5, $6, $7
          }' | paste -sd "," | awk '{print "["$0"]"}'
        else
          echo "[]"
        fi
      register: disk_info
      changed_when: false
      ignore_errors: true

    - name: Parse disk usage JSON safely
      block:
        - name: Parse disk usage JSON
          ansible.builtin.set_fact:
            disk_info_parsed: "{{ ((disk_info.stdout | default('[]', true) | trim) if ((disk_info.stdout | default('', true) | trim) | regex_search('^\\[')) else '[]') | from_json }}"
      rescue:
        - name: Fallback disk_info_parsed
          ansible.builtin.set_fact:
            disk_info_parsed: []

    - name: Get disk devices layout (lsblk JSON if available)
      ansible.builtin.shell: |
        if command -v lsblk >/dev/null 2>&1; then
          tmp="/tmp/lsblk_${$}.json"
          if lsblk -J -b -o NAME,TYPE,SIZE,FSTYPE,MOUNTPOINT,MODEL,SERIAL,UUID >"$tmp" 2>/dev/null; then
            cat "$tmp"
          else
            echo '{"blockdevices":[]}'
          fi
          rm -f "$tmp" >/dev/null 2>&1 || true
        else
          echo '{"blockdevices":[]}'
        fi
      register: disk_devices
      changed_when: false
      ignore_errors: true

    - name: Parse disk devices JSON safely
      block:
        - name: Parse disk devices JSON
          ansible.builtin.set_fact:
            disk_devices_parsed: "{{ ((((disk_devices.stdout | default('{\"blockdevices\":[]}', true) | trim) if ((disk_devices.stdout | default('', true) | trim) | regex_search('^\\{')) else '{\"blockdevices\":[]}') | from_json).blockdevices) | default([], true) }}"
      rescue:
        - name: Fallback disk_devices_parsed
          ansible.builtin.set_fact:
            disk_devices_parsed: []

    - name: Get LVM info (pvs/vgs/lvs) as JSON (best-effort)
      ansible.builtin.shell: |
        if command -v pvs >/dev/null 2>&1 && command -v vgs >/dev/null 2>&1 && command -v lvs >/dev/null 2>&1; then
          pvs --reportformat json 2>/dev/null || echo '{"report":[]}'
          echo '---'
          vgs --reportformat json 2>/dev/null || echo '{"report":[]}'
          echo '---'
          lvs --reportformat json 2>/dev/null || echo '{"report":[]}'
        else
          echo '{"report":[]}'
          echo '---'
          echo '{"report":[]}'
          echo '---'
          echo '{"report":[]}'
        fi
      register: lvm_info
      changed_when: false
      ignore_errors: true

    - name: Parse LVM JSON safely
      block:
        - name: Parse LVM JSON
          ansible.builtin.set_fact:
            _lvm_chunks: "{{ (lvm_info.stdout | default('{\"report\":[]}\n---\n{\"report\":[]}\n---\n{\"report\":[]}', true)).split('---') }}"
            lvm_info_parsed:
              pvs: "{{ (_lvm_chunks[0] | trim | from_json).report[0].pv | default([], true) if (_lvm_chunks | length) > 0 else [] }}"
              vgs: "{{ (_lvm_chunks[1] | trim | from_json).report[0].vg | default([], true) if (_lvm_chunks | length) > 1 else [] }}"
              lvs: "{{ (_lvm_chunks[2] | trim | from_json).report[0].lv | default([], true) if (_lvm_chunks | length) > 2 else [] }}"
      rescue:
        - name: Fallback lvm_info_parsed
          ansible.builtin.set_fact:
            lvm_info_parsed:
              pvs: []
              vgs: []
              lvs: []

    - name: Get ZFS pools info as JSON (best-effort)
      ansible.builtin.shell: |
        if command -v zpool >/dev/null 2>&1; then
          zpool list -H -o name,size,alloc,free,cap 2>/dev/null | python3 -c 'import sys,json; rows=[]; [rows.append({"name":p[0],"size":p[1],"alloc":p[2],"free":p[3],"cap":p[4]}) for p in (l.strip().split() for l in sys.stdin) if len(p)>=5]; print(json.dumps(rows))' 2>/dev/null || echo '[]'
        else
          echo '[]'
        fi
      register: zfs_pools
      changed_when: false
      ignore_errors: true

    - name: Get ZFS datasets info as JSON (best-effort)
      ansible.builtin.shell: |
        if command -v zfs >/dev/null 2>&1; then
          zfs list -H -o name,used,avail,refer,mountpoint 2>/dev/null | python3 -c 'import sys,json; rows=[]; [rows.append({"name":p[0],"used":p[1],"avail":p[2],"refer":p[3],"mountpoint":p[4]}) for p in ((line.rstrip("\\n").split("\\t") if "\\t" in line else line.strip().split()) for line in sys.stdin) if len(p)>=5]; print(json.dumps(rows))' 2>/dev/null || echo '[]'
        else
          echo '[]'
        fi
      register: zfs_datasets
      changed_when: false
      ignore_errors: true

    - name: Parse ZFS JSON safely
      block:
        - name: Parse ZFS JSON
          ansible.builtin.set_fact:
            zfs_info_parsed:
              pools: "{{ ((zfs_pools.stdout | default('[]', true) | trim) if ((zfs_pools.stdout | default('', true) | trim) | regex_search('^\\[')) else '[]') | from_json }}"
              datasets: "{{ ((zfs_datasets.stdout | default('[]', true) | trim) if ((zfs_datasets.stdout | default('', true) | trim) | regex_search('^\\[')) else '[]') | from_json }}"
      rescue:
        - name: Fallback zfs_info_parsed
          ansible.builtin.set_fact:
            zfs_info_parsed:
              pools: []
              datasets: []

    - name: Get root partition info specifically
      ansible.builtin.shell: |
        if df -BG / >/dev/null 2>&1; then
          df -BG / 2>/dev/null | tail -1 | awk '{gsub("G",""); gsub("%",""); printf "{\"total_gb\":%s,\"used_gb\":%s,\"usage_percent\":%s}", $2, $3, $5}'
        elif df -kP / >/dev/null 2>&1; then
          df -kP / 2>/dev/null | tail -1 | awk '{total_gb=$2/1024/1024; used_gb=$3/1024/1024; gsub("%",""); printf "{\"total_gb\":%.1f,\"used_gb\":%.1f,\"usage_percent\":%s}", total_gb, used_gb, $5}'
        else
          echo '{"total_gb":0,"used_gb":0,"usage_percent":0}'
        fi
      register: disk_root
      changed_when: false
      ignore_errors: true

    - name: Parse root partition JSON safely
      block:
        - name: Parse root partition JSON
          ansible.builtin.set_fact:
            disk_root_parsed: "{{ ((disk_root.stdout | default('{\"total_gb\":0,\"used_gb\":0,\"usage_percent\":0}', true) | trim) if ((disk_root.stdout | default('', true) | trim) | regex_search('^\\{')) else '{\"total_gb\":0,\"used_gb\":0,\"usage_percent\":0}') | from_json }}"
      rescue:
        - name: Fallback disk_root_parsed
          ansible.builtin.set_fact:
            disk_root_parsed:
              total_gb: 0
              used_gb: 0
              usage_percent: 0

    - name: Get CPU current frequency in MHz (best-effort)
      ansible.builtin.shell: |
        (awk -F: '/cpu MHz/{gsub(/ /, "", $2); print $2; exit}' /proc/cpuinfo 2>/dev/null || echo '')
      register: cpu_mhz
      changed_when: false
      ignore_errors: true

    - name: Get CPU max frequency in MHz (best-effort)
      ansible.builtin.shell: |
        if [ -f /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq ]; then
          awk '{printf "%.1f", $1/1000}' /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq 2>/dev/null
        else
          echo ""
        fi
      register: cpu_max_mhz
      changed_when: false
      ignore_errors: true

    - name: Get CPU min frequency in MHz (best-effort)
      ansible.builtin.shell: |
        if [ -f /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_min_freq ]; then
          awk '{printf "%.1f", $1/1000}' /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_min_freq 2>/dev/null
        else
          echo ""
        fi
      register: cpu_min_mhz
      changed_when: false
      ignore_errors: true

    - name: Get system uptime in seconds
      ansible.builtin.shell: cat /proc/uptime | awk '{print int($1)}'
      register: uptime_seconds
      changed_when: false
      ignore_errors: true

    - name: Get human-readable uptime
      ansible.builtin.shell: uptime -p 2>/dev/null || uptime | sed 's/.*up //' | sed 's/,.*//'
      register: uptime_human
      changed_when: false
      ignore_errors: true

    - name: Get network interfaces info
      ansible.builtin.shell: |
        ip -j addr show 2>/dev/null | python3 -c "
        import sys, json
        try:
            data = json.load(sys.stdin)
            result = []
            for iface in data:
                if iface.get('ifname') not in ['lo']:
                    info = {'name': iface.get('ifname'), 'state': iface.get('operstate', 'unknown')}
                    for addr in iface.get('addr_info', []):
                        if addr.get('family') == 'inet':
                            info['ip_address'] = addr.get('local')
                    if iface.get('address'):
                        info['mac_address'] = iface.get('address')
                    result.append(info)
            print(json.dumps(result))
        except Exception:
            print('[]')
        " 2>/dev/null || echo "[]"
      register: network_info
      changed_when: false
      ignore_errors: true

    - name: Parse network info JSON safely
      block:
        - name: Parse network info JSON
          ansible.builtin.set_fact:
            network_info_parsed: "{{ ((network_info.stdout | default('[]', true) | trim) if ((network_info.stdout | default('', true) | trim) | regex_search('^\\[')) else '[]') | from_json }}"
      rescue:
        - name: Fallback network_info_parsed
          ansible.builtin.set_fact:
            network_info_parsed: []

    - name: Build metrics JSON output
      ansible.builtin.set_fact:
        metrics_output:
          host: "{{ inventory_hostname }}"
          data:
            # CPU info
            cpu_count: "{{ ansible_processor_vcpus | default(ansible_processor_count, true) | default(1) }}"
            cpu_model: "{{ ansible_processor[2] | default('Unknown', true) if (ansible_processor is defined and ansible_processor | length > 2) else 'Unknown' }}"
            cpu_sockets: "{{ ansible_processor_count | default(1, true) | int }}"
            cpu_threads_per_core: "{{ ansible_processor_threads_per_core | default('', true) }}"
            cpu_cores: "{{ (ansible_processor_cores | default(0, true) | int) * (ansible_processor_count | default(1, true) | int) if (ansible_processor_cores is defined and ansible_processor_cores | int > 0) else '' }}"
            cpu_threads: "{{ ansible_processor_vcpus | default('', true) }}"
            cpu_mhz: "{{ cpu_mhz.stdout | default('', true) }}"
            cpu_max_mhz: "{{ cpu_max_mhz.stdout | default('', true) }}"
            cpu_min_mhz: "{{ cpu_min_mhz.stdout | default('', true) }}"
            cpu_load_1m: "{{ (cpu_load_parts[0] if (cpu_load_parts | length > 0) else '0') | float }}"
            cpu_load_5m: "{{ (cpu_load_parts[1] if (cpu_load_parts | length > 1) else '0') | float }}"
            cpu_load_15m: "{{ (cpu_load_parts[2] if (cpu_load_parts | length > 2) else '0') | float }}"
            cpu_temperature: "{{ cpu_temp.stdout if (cpu_temp.stdout is defined and cpu_temp.stdout != 'null') else '' }}"
            # Memory info
            memory_total_mb: "{{ memory_info_parsed.total_mb | default(0, true) | int }}"
            memory_used_mb: "{{ memory_info_parsed.used_mb | default(0, true) | int }}"
            memory_free_mb: "{{ memory_info_parsed.free_mb | default(0, true) | int }}"
            memory_usage_percent: "{{ memory_info_parsed.usage_percent | default(0, true) | float }}"
            swap_total_mb: "{{ memory_info_parsed.swap_total_mb | default(0, true) | int }}"
            swap_used_mb: "{{ memory_info_parsed.swap_used_mb | default(0, true) | int }}"
            swap_usage_percent: "{{ memory_info_parsed.swap_usage_percent | default(0, true) | float }}"
            # Disk info
            disk_info: "{{ disk_info_parsed | default([], true) }}"
            disk_devices: "{{ disk_devices_parsed | default([], true) }}"
            disk_root_total_gb: "{{ disk_root_parsed.total_gb | default(0, true) | float }}"
            disk_root_used_gb: "{{ disk_root_parsed.used_gb | default(0, true) | float }}"
            disk_root_usage_percent: "{{ disk_root_parsed.usage_percent | default(0, true) | float }}"
            # Storage stacks
            lvm_info: "{{ lvm_info_parsed | default({'pvs': [], 'vgs': [], 'lvs': []}, true) }}"
            zfs_info: "{{ zfs_info_parsed | default({'pools': [], 'datasets': []}, true) }}"
            # System info
            os_name: "{{ ansible_distribution | default('Unknown', true) }}"
            os_version: "{{ ansible_distribution_version | default('', true) }}"
            kernel_version: "{{ ansible_kernel | default('', true) }}"
            hostname: "{{ ansible_hostname | default(inventory_hostname, true) }}"
            uptime_seconds: "{{ uptime_seconds.stdout | default('0', true) | int }}"
            uptime_human: "{{ uptime_human.stdout | default('unknown', true) }}"
            # Network info
            network_info: "{{ network_info_parsed | default([], true) }}"

    - name: Output metrics in parseable format
      ansible.builtin.debug:
        msg: "METRICS_JSON_START:{{ metrics_output | to_json }}:METRICS_JSON_END"
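Each builtin playbook prints its result between `METRICS_JSON_START:` and `:METRICS_JSON_END` markers so the application can recover the metrics from the Ansible output. A minimal extraction sketch; `jq` is assumed to be installed and `ansible-output.log` is a placeholder capture of a run:

```bash
# Pull the JSON payload out of the marker line and pretty-print it
sed -n 's/.*METRICS_JSON_START:\(.*\):METRICS_JSON_END.*/\1/p' ansible-output.log | jq .
```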

@@ -0,0 +1,115 @@
---
- name: Install base tools required by builtin metrics collection
hosts: all:!localhost:!hp.truenas.home:!ali2v.truenas.home
become: true
gather_facts: true
vars:
_builtin_playbook: true
_builtin_id: install_base_tools
_collect_metrics: false
base_packages_debian:
- coreutils
- util-linux
- gawk
- grep
- python3
- iproute2
- procps
optional_packages_debian:
- lvm2
- lm-sensors
- zfsutils-linux
optional_packages_rpi_debian:
- libraspberrypi-bin
tasks:
- name: Detect Raspberry Pi (Debian)
ansible.builtin.set_fact:
is_raspberry_pi: >-
{{
(ansible_distribution | default('')) in ['Raspbian']
or ((ansible_lsb | default({})).id | default('')) in ['Raspbian']
or ('raspberry' in (ansible_machine | default('') | lower))
}}
when: ansible_os_family == "Debian"
- name: Install base packages (Debian/Ubuntu)
ansible.builtin.apt:
name: "{{ base_packages_debian }}"
state: present
update_cache: true
when: ansible_os_family == "Debian"
register: apt_base
- name: Install optional packages (Debian/Ubuntu)
ansible.builtin.apt:
name: "{{ optional_packages_debian }}"
state: present
update_cache: false
when: ansible_os_family == "Debian"
register: apt_optional
ignore_errors: true
- name: Install Raspberry Pi optional packages (Debian/Ubuntu)
ansible.builtin.apt:
name: "{{ optional_packages_rpi_debian }}"
state: present
update_cache: false
when:
- ansible_os_family == "Debian"
- is_raspberry_pi | default(false)
register: apt_optional_rpi
ignore_errors: true
- name: Install base packages (RedHat)
ansible.builtin.yum:
name:
- coreutils
- util-linux
- gawk
- grep
- python3
- iproute
- procps-ng
state: present
when: ansible_os_family == "RedHat"
register: yum_base
- name: Install optional packages (RedHat)
ansible.builtin.yum:
name:
- lvm2
- lm_sensors
state: present
when: ansible_os_family == "RedHat"
register: yum_optional
ignore_errors: true
- name: Install packages (Alpine)
ansible.builtin.apk:
name:
- coreutils
- util-linux
- gawk
- grep
- python3
- iproute2
- procps
- lvm2
- lm-sensors
state: present
update_cache: true
when: ansible_os_family == "Alpine"
register: apk_base
- name: Install base packages (FreeBSD)
ansible.builtin.package:
name:
- gawk
- python3
state: present
when: ansible_os_family == "FreeBSD"
register: pkg_base
- name: Output installation summary
ansible.builtin.debug:
msg: "BASE_TOOLS_INSTALL_RESULT host={{ inventory_hostname }} os_family={{ ansible_os_family }}"
View File
@ -1,5 +1,5 @@
---
-- name: Install jq on target host
+- name: Install base packages on target host
  hosts: all
  become: true
  gather_facts: true
17
app/__init__.py Normal file
View File
@ -0,0 +1,17 @@
"""
Homelab Automation API - Application Package.
Ce package contient tous les composants de l'API:
- core/: Configuration, constantes, exceptions, dépendances
- models/: Modèles SQLAlchemy
- schemas/: Schémas Pydantic
- crud/: Repositories pour les opérations DB
- services/: Services métier
- routes/: Endpoints API
- utils/: Utilitaires
"""
from app.factory import create_app
__all__ = ["create_app"]
__version__ = "1.0.0"
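A minimal sketch of consuming this package entry point, assuming uvicorn is installed in the environment:

```python
# run_app.py - illustrative launcher using the package factory
import uvicorn

from app import create_app

app = create_app()

if __name__ == "__main__":
    # Mirrors the defaults in app.core.config.Settings (host 0.0.0.0, port 8008)
    uvicorn.run(app, host="0.0.0.0", port=8008)
```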
View File
@ -29,24 +29,28 @@ from croniter import croniter
import pytz
from fastapi import FastAPI, HTTPException, Depends, Request, Form, WebSocket, WebSocketDisconnect
-from fastapi.responses import HTMLResponse, JSONResponse, FileResponse
+from fastapi.responses import HTMLResponse, JSONResponse, FileResponse, Response
from fastapi.security import APIKeyHeader
from fastapi.templating import Jinja2Templates
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
+from html.parser import HTMLParser
+from io import BytesIO
+from xml.sax.saxutils import escape as _xml_escape
+import hashlib
from pydantic import BaseModel, Field, field_validator, ConfigDict
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine, async_sessionmaker
from sqlalchemy import select
-from models.database import get_db, async_session_maker # type: ignore
+from app.models.database import get_db, async_session_maker # type: ignore
-from crud.host import HostRepository # type: ignore
+from app.crud.host import HostRepository # type: ignore
-from crud.bootstrap_status import BootstrapStatusRepository # type: ignore
+from app.crud.bootstrap_status import BootstrapStatusRepository # type: ignore
-from crud.log import LogRepository # type: ignore
+from app.crud.log import LogRepository # type: ignore
-from crud.task import TaskRepository # type: ignore
+from app.crud.task import TaskRepository # type: ignore
-from crud.schedule import ScheduleRepository # type: ignore
+from app.crud.schedule import ScheduleRepository # type: ignore
-from crud.schedule_run import ScheduleRunRepository # type: ignore
+from app.crud.schedule_run import ScheduleRunRepository # type: ignore
-from models.database import init_db # type: ignore
+from app.models.database import init_db # type: ignore
-from services.notification_service import notification_service, send_notification # type: ignore
+from app.services.notification_service import notification_service, send_notification # type: ignore
-from schemas.notification import NotificationRequest, NotificationResponse # type: ignore
+from app.schemas.notification import NotificationRequest, NotificationResponse # type: ignore
BASE_DIR = Path(__file__).resolve().parent

@ -98,6 +102,452 @@ ACTION_PLAYBOOK_MAP = {
# Gestionnaire de clés API
api_key_header = APIKeyHeader(name="X-API-Key", auto_error=False)
class _HelpHtmlToMarkdownParser(HTMLParser):
def __init__(self) -> None:
super().__init__(convert_charrefs=True)
self._in_help = False
self._help_depth = 0
self._lines: list[str] = []
self._buf: list[str] = []
self._list_stack: list[str] = []
self._in_pre = False
self._span_class: str = ""
self._in_toc_link = False
def _flush(self) -> None:
txt = "".join(self._buf).strip()
self._buf = []
if txt:
self._lines.append(txt)
def handle_starttag(self, tag: str, attrs: list[tuple[str, Optional[str]]]) -> None:
attrs_dict = {k: (v or "") for k, v in attrs}
if tag == "section" and attrs_dict.get("id") == "page-help":
self._in_help = True
self._help_depth = 1
return
if not self._in_help:
return
if tag == "section":
self._help_depth += 1
if tag in {"h1", "h2", "h3", "h4"}:
self._flush()
if tag in {"p", "div"}:
self._flush()
if tag in {"ul", "ol"}:
self._flush()
self._list_stack.append(tag)
if tag == "li":
self._flush()
if tag == "pre":
self._flush()
self._in_pre = True
self._lines.append("```")
if tag == "code":
self._buf.append("`")
if tag == "span":
self._span_class = attrs_dict.get("class") or ""
if "help-code" in self._span_class:
self._buf.append("`")
if tag == "a":
cls = attrs_dict.get("class") or ""
if "help-toc-item" in cls:
self._flush()
self._in_toc_link = True
def handle_endtag(self, tag: str) -> None:
if not self._in_help:
return
if tag == "a" and self._in_toc_link:
txt = "".join(self._buf).strip()
self._buf = []
if txt:
self._lines.append(f"- {txt}")
self._lines.append("")
self._in_toc_link = False
return
if tag == "section":
self._help_depth -= 1
if self._help_depth <= 0:
self._flush()
self._in_help = False
return
if tag in {"h1", "h2", "h3", "h4"}:
txt = "".join(self._buf).strip()
self._buf = []
if txt:
level = {"h1": "#", "h2": "##", "h3": "###", "h4": "####"}[tag]
self._lines.append(f"{level} {txt}")
self._lines.append("")
return
if tag == "li":
txt = "".join(self._buf).strip()
self._buf = []
if txt:
if self._list_stack and self._list_stack[-1] == "ol":
self._lines.append(f"1. {txt}")
else:
self._lines.append(f"- {txt}")
return
if tag in {"p", "div"}:
self._flush()
self._lines.append("")
return
if tag in {"ul", "ol"}:
self._flush()
if self._list_stack:
self._list_stack.pop()
self._lines.append("")
return
if tag == "pre":
self._flush()
self._in_pre = False
self._lines.append("```")
self._lines.append("")
return
if tag == "code":
self._buf.append("`")
return
if tag == "span" and "help-code" in (self._span_class or ""):
self._buf.append("`")
self._span_class = ""
return
def handle_data(self, data: str) -> None:
if not self._in_help:
return
if not data:
return
txt = data
if not self._in_pre:
txt = re.sub(r"\s+", " ", txt)
self._buf.append(txt)
def markdown(self) -> str:
out = "\n".join(line.rstrip() for line in self._lines)
out = re.sub(r"\n{3,}", "\n\n", out).strip() + "\n"
return out
def _build_help_markdown() -> str:
html_path = BASE_DIR / "index.html"
html = html_path.read_text(encoding="utf-8")
parser = _HelpHtmlToMarkdownParser()
parser.feed(html)
md = parser.markdown()
if len(md.strip()) < 200:
raise HTTPException(status_code=500, detail="Extraction du contenu d'aide insuffisante")
return md
def _extract_leading_emojis(text: str) -> tuple[str, str]:
if not text:
return "", ""
m = re.match(
r"^(?P<emoji>[\U0001F600-\U0001F64F\U0001F300-\U0001F5FF\U0001F680-\U0001F6FF\U0001F700-\U0001F77F\U0001F780-\U0001F7FF\U0001F800-\U0001F8FF\U0001F900-\U0001F9FF\U0001FA00-\U0001FA6F\U0001FA70-\U0001FAFF\U0001F1E0-\U0001F1FF\u2600-\u26FF\u2700-\u27BF\u200D\uFE0E\uFE0F]+)\s*(?P<rest>.*)$",
text,
)
if not m:
return "", text
emoji = (m.group("emoji") or "").strip()
rest = (m.group("rest") or "").lstrip()
if not emoji:
return "", text
return emoji, rest
_LAST_EMOJI_FONT_PATH = ""
def _emoji_to_png_bytes(emoji: str, px: int = 64) -> bytes:
try:
from PIL import Image, ImageDraw, ImageFont
except ModuleNotFoundError:
raise HTTPException(
status_code=500,
detail="Génération PDF avec emojis indisponible: dépendance 'pillow' manquante (installer pillow dans l'environnement runtime).",
)
font_paths = [
r"C:\\Windows\\Fonts\\seguiemj.ttf",
"/usr/share/fonts/truetype/noto/NotoColorEmoji.ttf",
"/usr/share/fonts/truetype/noto/NotoEmoji-Regular.ttf",
"/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf",
]
    emoji_soft_fallback = {
        "❤️\u200d\U0001FA79": "❤️",
        "⚙️": "⚙",
        "🛠️": "🛠",
        "🏗️": "🏗",
        "⚡️": "⚡",
    }
global _LAST_EMOJI_FONT_PATH
_LAST_EMOJI_FONT_PATH = ""
    def _load_font():
        # 'global' doit être redéclaré ici: la déclaration de la fonction
        # englobante ne s'applique pas aux fonctions imbriquées
        global _LAST_EMOJI_FONT_PATH
        for fp in font_paths:
try:
try:
_LAST_EMOJI_FONT_PATH = fp
return ImageFont.truetype(fp, size=int(px * 0.75), embedded_color=True)
except TypeError:
_LAST_EMOJI_FONT_PATH = fp
return ImageFont.truetype(fp, size=int(px * 0.75))
except Exception:
continue
_LAST_EMOJI_FONT_PATH = "(default)"
return ImageFont.load_default()
font = _load_font()
pad = max(2, int(px * 0.18))
canvas = px + (2 * pad)
img = Image.new("RGBA", (canvas, canvas), (255, 255, 255, 0))
draw = ImageDraw.Draw(img)
bbox = draw.textbbox((0, 0), emoji, font=font)
w = bbox[2] - bbox[0]
h = bbox[3] - bbox[1]
x = ((canvas - w) // 2) - bbox[0]
y = ((canvas - h) // 2) - bbox[1]
draw_kwargs = {"font": font, "fill": (0, 0, 0, 255)}
try:
draw.text((x, y), emoji, embedded_color=True, **draw_kwargs)
except TypeError:
draw.text((x, y), emoji, **draw_kwargs)
try:
alpha = img.getchannel("A")
nonzero = alpha.getbbox() is not None
except Exception:
nonzero = True
if (not nonzero) and emoji in emoji_soft_fallback:
fallback_emoji = emoji_soft_fallback[emoji]
img = Image.new("RGBA", (canvas, canvas), (255, 255, 255, 0))
draw = ImageDraw.Draw(img)
bbox = draw.textbbox((0, 0), fallback_emoji, font=font)
w = bbox[2] - bbox[0]
h = bbox[3] - bbox[1]
x = ((canvas - w) // 2) - bbox[0]
y = ((canvas - h) // 2) - bbox[1]
try:
draw.text((x, y), fallback_emoji, embedded_color=True, **draw_kwargs)
except TypeError:
draw.text((x, y), fallback_emoji, **draw_kwargs)
out = BytesIO()
img.save(out, format="PNG")
return out.getvalue()
def _markdown_to_pdf_bytes(markdown: str) -> bytes:
try:
from reportlab.lib import colors
from reportlab.lib.pagesizes import A4
from reportlab.lib.styles import ParagraphStyle, getSampleStyleSheet
from reportlab.lib.units import cm
from reportlab.platypus import Paragraph, Preformatted, SimpleDocTemplate, Spacer, Table, TableStyle, Image
from reportlab.pdfbase import pdfmetrics
from reportlab.pdfbase.ttfonts import TTFont
except ModuleNotFoundError as e:
if "reportlab" in str(e).lower():
raise HTTPException(
status_code=500,
detail="Génération PDF indisponible: dépendance 'reportlab' manquante (installer reportlab dans l'environnement runtime).",
)
raise
styles = getSampleStyleSheet()
mono_font_name = "Courier"
mono_font_paths = [
r"C:\\Windows\\Fonts\\consola.ttf",
r"C:\\Windows\\Fonts\\lucon.ttf",
"/usr/share/fonts/truetype/dejavu/DejaVuSansMono.ttf",
"/usr/share/fonts/truetype/liberation/LiberationMono-Regular.ttf",
]
for fp in mono_font_paths:
try:
pdfmetrics.registerFont(TTFont("MonoUnicode", fp))
mono_font_name = "MonoUnicode"
break
except Exception:
continue
normal = styles["BodyText"]
h1 = styles["Heading1"]
h2 = styles["Heading2"]
h3 = styles["Heading3"]
code_style = ParagraphStyle(
"CodeBlock",
parent=styles.get("Code", normal),
fontName=mono_font_name,
fontSize=9,
leading=11,
backColor=colors.whitesmoke,
)
story: list[Any] = []
in_code = False
code_lines: list[str] = []
lines = (markdown or "").splitlines()
i = 0
while i < len(lines):
line = lines[i]
if line.strip().startswith("```"):
if not in_code:
in_code = True
code_lines = []
else:
in_code = False
story.append(Preformatted("\n".join(code_lines), code_style))
story.append(Spacer(1, 0.35 * cm))
i += 1
continue
if in_code:
code_lines.append(line)
i += 1
continue
if not line.strip():
story.append(Spacer(1, 0.2 * cm))
i += 1
continue
if line.startswith("### "):
raw = line[4:].strip()
emoji, rest = _extract_leading_emojis(raw)
if emoji:
png = _emoji_to_png_bytes(emoji, px=56)
img = Image(BytesIO(png), width=0.55 * cm, height=0.55 * cm)
tbl = Table([[img, Paragraph(_xml_escape(rest), h3)]], colWidths=[0.7 * cm, None])
tbl.setStyle(
TableStyle(
[
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("LEFTPADDING", (0, 0), (-1, -1), 0),
("RIGHTPADDING", (0, 0), (-1, -1), 6),
("TOPPADDING", (0, 0), (-1, -1), 1),
("BOTTOMPADDING", (0, 0), (-1, -1), 1),
]
)
)
story.append(tbl)
else:
story.append(Paragraph(_xml_escape(raw), h3))
story.append(Spacer(1, 0.2 * cm))
i += 1
continue
if line.startswith("## "):
raw = line[3:].strip()
emoji, rest = _extract_leading_emojis(raw)
if emoji:
png = _emoji_to_png_bytes(emoji, px=64)
img = Image(BytesIO(png), width=0.65 * cm, height=0.65 * cm)
tbl = Table([[img, Paragraph(_xml_escape(rest), h2)]], colWidths=[0.8 * cm, None])
tbl.setStyle(
TableStyle(
[
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("LEFTPADDING", (0, 0), (-1, -1), 0),
("RIGHTPADDING", (0, 0), (-1, -1), 6),
("TOPPADDING", (0, 0), (-1, -1), 1),
("BOTTOMPADDING", (0, 0), (-1, -1), 1),
]
)
)
story.append(tbl)
else:
story.append(Paragraph(_xml_escape(raw), h2))
story.append(Spacer(1, 0.25 * cm))
i += 1
continue
if line.startswith("# "):
raw = line[2:].strip()
emoji, rest = _extract_leading_emojis(raw)
if emoji:
png = _emoji_to_png_bytes(emoji, px=72)
img = Image(BytesIO(png), width=0.8 * cm, height=0.8 * cm)
tbl = Table([[img, Paragraph(_xml_escape(rest), h1)]], colWidths=[1.0 * cm, None])
tbl.setStyle(
TableStyle(
[
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("LEFTPADDING", (0, 0), (-1, -1), 0),
("RIGHTPADDING", (0, 0), (-1, -1), 8),
("TOPPADDING", (0, 0), (-1, -1), 1),
("BOTTOMPADDING", (0, 0), (-1, -1), 1),
]
)
)
story.append(tbl)
else:
story.append(Paragraph(_xml_escape(raw), h1))
story.append(Spacer(1, 0.3 * cm))
i += 1
continue
if line.lstrip().startswith(("- ", "* ")):
while i < len(lines) and lines[i].lstrip().startswith(("- ", "* ")):
item_text = lines[i].lstrip()[2:].strip()
emoji, rest = _extract_leading_emojis(item_text)
if emoji:
png = _emoji_to_png_bytes(emoji, px=52)
img = Image(BytesIO(png), width=0.5 * cm, height=0.5 * cm)
tbl = Table([[img, Paragraph(_xml_escape(rest), normal)]], colWidths=[0.75 * cm, None])
tbl.setStyle(
TableStyle(
[
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("LEFTPADDING", (0, 0), (-1, -1), 12),
("RIGHTPADDING", (0, 0), (-1, -1), 6),
("TOPPADDING", (0, 0), (-1, -1), 1),
("BOTTOMPADDING", (0, 0), (-1, -1), 2),
]
)
)
story.append(tbl)
else:
story.append(Paragraph(_xml_escape(f"{item_text}"), normal))
story.append(Spacer(1, 0.12 * cm))
i += 1
story.append(Spacer(1, 0.15 * cm))
continue
story.append(Paragraph(_xml_escape(line.strip()), normal))
story.append(Spacer(1, 0.2 * cm))
i += 1
buffer = BytesIO()
doc = SimpleDocTemplate(
buffer,
pagesize=A4,
leftMargin=2 * cm,
rightMargin=2 * cm,
topMargin=2 * cm,
bottomMargin=2 * cm,
title="Homelab Automation Dashboard — Centre d'Aide",
)
doc.build(story)
return buffer.getvalue()
# Modèles Pydantic améliorés # Modèles Pydantic améliorés
class CommandResult(BaseModel): class CommandResult(BaseModel):
status: str status: str
@ -1132,7 +1582,7 @@ class AdHocHistoryService:
        pass

    async def _get_commands_logs(self, session: AsyncSession) -> List["Log"]:
-        from models.log import Log
+        from app.models.log import Log
        stmt = (
            select(Log)
            .where(Log.source == "adhoc_history")
@ -1142,7 +1592,7 @@ class AdHocHistoryService:
        return result.scalars().all()

    async def _get_categories_logs(self, session: AsyncSession) -> List["Log"]:
-        from models.log import Log
+        from app.models.log import Log
        stmt = (
            select(Log)
            .where(Log.source == "adhoc_category")
@ -1161,8 +1611,8 @@
        description: str | None = None,
    ) -> AdHocHistoryEntry:
        """Ajoute ou met à jour une commande dans l'historique (stockée dans logs.details)."""
-        from models.log import Log
-        from crud.log import LogRepository
+        from app.models.log import Log
+        from app.crud.log import LogRepository

        async with async_session_maker() as session:
            repo = LogRepository(session)
@ -1299,7 +1749,7 @@
        Si aucune catégorie n'est présente, les catégories par défaut sont créées.
        """
-        from crud.log import LogRepository
+        from app.crud.log import LogRepository

        async with async_session_maker() as session:
            logs = await self._get_categories_logs(session)
@ -1344,7 +1794,7 @@
        icon: str = "fa-folder",
    ) -> AdHocHistoryCategory:
        """Ajoute une nouvelle catégorie en BD (ou renvoie l'existante)."""
-        from crud.log import LogRepository
+        from app.crud.log import LogRepository

        async with async_session_maker() as session:
            logs = await self._get_categories_logs(session)
@ -1376,7 +1826,7 @@
    async def delete_command(self, command_id: str) -> bool:
        """Supprime une commande de l'historique (ligne dans logs)."""
-        from models.log import Log
+        from app.models.log import Log

        async with async_session_maker() as session:
            stmt = select(Log).where(Log.source == "adhoc_history")
@ -1404,7 +1854,7 @@
        description: str | None = None,
    ) -> bool:
        """Met à jour la catégorie d'une commande dans l'historique."""
-        from models.log import Log
+        from app.models.log import Log

        async with async_session_maker() as session:
            stmt = select(Log).where(Log.source == "adhoc_history")
@ -1431,7 +1881,7 @@
        icon: str,
    ) -> bool:
        """Met à jour une catégorie existante et les commandes associées."""
-        from models.log import Log
+        from app.models.log import Log

        async with async_session_maker() as session:
            # Mettre à jour la catégorie elle-même
@ -1471,7 +1921,7 @@
        if category_name == "default":
            return False
-        from models.log import Log
+        from app.models.log import Log

        async with async_session_maker() as session:
            # Trouver la catégorie
@ -3645,6 +4095,50 @@ async def verify_api_key(api_key: str = Depends(api_key_header)) -> bool:
        raise HTTPException(status_code=401, detail="Clé API invalide ou manquante")
    return True
@app.get("/api/help/documentation.md")
async def download_help_markdown(api_key_valid: bool = Depends(verify_api_key)):
"""Télécharge la documentation du centre d'aide en format Markdown."""
markdown = _build_help_markdown()
digest = hashlib.sha256(markdown.encode("utf-8")).hexdigest()[:16]
etag = f'W/"md-{digest}"'
filename = f"homelab-documentation-{digest}.md"
return Response(
content=markdown,
media_type="text/markdown; charset=utf-8",
headers={
"Content-Disposition": f"attachment; filename={filename}",
"Cache-Control": "no-store, no-cache, must-revalidate, max-age=0",
"Pragma": "no-cache",
"Expires": "0",
"ETag": etag,
"X-Help-Doc-Generator": "app_optimized._build_help_markdown",
},
)
@app.get("/api/help/documentation.pdf")
async def download_help_pdf(api_key_valid: bool = Depends(verify_api_key)):
"""Télécharge la documentation du centre d'aide en format PDF."""
markdown = _build_help_markdown()
pdf_bytes = _markdown_to_pdf_bytes(markdown)
digest = hashlib.sha256(bytes(pdf_bytes)).hexdigest()[:16]
etag = f'W/"pdf-{digest}"'
filename = f"homelab-documentation-{digest}.pdf"
emoji_font = globals().get("_LAST_EMOJI_FONT_PATH", "")
return Response(
content=pdf_bytes,
media_type="application/pdf",
headers={
"Content-Disposition": f"attachment; filename={filename}",
"Cache-Control": "no-store, no-cache, must-revalidate, max-age=0",
"Pragma": "no-cache",
"Expires": "0",
"ETag": etag,
"X-Help-Doc-Generator": "app_optimized._markdown_to_pdf_bytes",
"X-Help-Emoji-Font": emoji_font,
},
)
# Routes API
@app.get("/", response_class=HTMLResponse)
async def root(request: Request):

@ -5875,7 +6369,7 @@ async def execute_ansible_task(
    # Mettre à jour la BD avec le statut final
    try:
        async with async_session_maker() as session:
-            from crud.task import TaskRepository
+            from app.crud.task import TaskRepository
            repo = TaskRepository(session)
            db_task = await repo.get(task_id)
            if db_task:
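Client-side, the two new help-documentation endpoints are called like any other protected route. An illustrative download, shown with the legacy X-API-Key header (a JWT Bearer header works wherever the combined dependency from app/core/dependencies.py is wired in); base URL and credentials are placeholders:

```python
import requests

BASE_URL = "http://localhost:8000"
headers = {"X-API-Key": "<api-key>"}  # or: {"Authorization": "Bearer <jwt>"}

resp = requests.get(f"{BASE_URL}/api/help/documentation.pdf", headers=headers, timeout=60)
resp.raise_for_status()
with open("homelab-documentation.pdf", "wb") as fh:
    fh.write(resp.content)
print(resp.headers.get("ETag"), len(resp.content), "bytes")
```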
43
app/core/__init__.py Normal file
View File
@ -0,0 +1,43 @@
"""
Core module - Configuration, constantes, exceptions et dépendances.
"""
from app.core.config import settings
from app.core.constants import (
HostStatus,
TaskStatus,
LogLevel,
ScheduleStatus,
NotificationType,
ACTION_PLAYBOOK_MAP,
)
from app.core.exceptions import (
HomelabException,
HostNotFoundException,
TaskNotFoundException,
ScheduleNotFoundException,
PlaybookNotFoundException,
GroupNotFoundException,
ValidationException,
AnsibleExecutionException,
BootstrapException,
)
__all__ = [
"settings",
"HostStatus",
"TaskStatus",
"LogLevel",
"ScheduleStatus",
"NotificationType",
"ACTION_PLAYBOOK_MAP",
"HomelabException",
"HostNotFoundException",
"TaskNotFoundException",
"ScheduleNotFoundException",
"PlaybookNotFoundException",
"GroupNotFoundException",
"ValidationException",
"AnsibleExecutionException",
"BootstrapException",
]
123
app/core/config.py Normal file
View File
@ -0,0 +1,123 @@
"""
Configuration centralisée de l'application.
Toutes les variables d'environnement et paramètres sont centralisés ici.
"""
import os
from pathlib import Path
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class Settings(BaseSettings):
"""Configuration de l'application Homelab Automation."""
# === Chemins ===
base_dir: Path = Field(default_factory=lambda: Path(__file__).resolve().parent.parent)
logs_dir: Path = Field(default_factory=lambda: Path(os.environ.get("LOGS_DIR", "/logs")))
@property
def ansible_dir(self) -> Path:
"""Répertoire Ansible (relatif à base_dir.parent)"""
return self.base_dir.parent / "ansible"
@property
def tasks_logs_dir(self) -> Path:
"""Répertoire des logs de tâches markdown"""
return Path(os.environ.get("DIR_LOGS_TASKS", str(self.base_dir.parent / "tasks_logs")))
@property
def db_path(self) -> Path:
"""Chemin de la base de données SQLite"""
return self.logs_dir / "homelab.db"
# === SSH ===
ssh_key_path: str = Field(
default_factory=lambda: os.environ.get("SSH_KEY_PATH", str(Path.home() / ".ssh" / "id_rsa"))
)
ssh_user: str = Field(default_factory=lambda: os.environ.get("SSH_USER", "automation"))
ssh_remote_user: str = Field(default_factory=lambda: os.environ.get("SSH_REMOTE_USER", "root"))
# === API ===
api_key: str = Field(default_factory=lambda: os.environ.get("API_KEY", "dev-key-12345"))
api_title: str = "Homelab Automation Dashboard API"
api_version: str = "1.0.0"
api_description: str = "API REST moderne pour la gestion automatique d'homelab"
# === JWT Authentication ===
jwt_secret_key: str = Field(
default_factory=lambda: os.environ.get("JWT_SECRET_KEY", "dev-secret-key-change-in-production")
)
jwt_expire_minutes: int = Field(
default_factory=lambda: int(os.environ.get("JWT_EXPIRE_MINUTES", "1440"))
)
jwt_algorithm: str = "HS256"
# === Database ===
database_url: Optional[str] = Field(default=None)
@property
def async_database_url(self) -> str:
"""URL de connexion async pour SQLAlchemy"""
if self.database_url:
return self.database_url
return f"sqlite+aiosqlite:///{self.db_path}"
# === CORS ===
cors_origins: list = Field(default=["*"])
cors_allow_credentials: bool = True
cors_allow_methods: list = Field(default=["*"])
cors_allow_headers: list = Field(default=["*"])
# === Notifications ntfy ===
ntfy_enabled: bool = Field(
default_factory=lambda: os.environ.get("NTFY_ENABLED", "true").lower() == "true"
)
ntfy_base_url: str = Field(
default_factory=lambda: os.environ.get("NTFY_BASE_URL", "https://ntfy.sh")
)
ntfy_default_topic: str = Field(
default_factory=lambda: os.environ.get("NTFY_TOPIC", "homelab-automation")
)
ntfy_timeout: int = Field(
default_factory=lambda: int(os.environ.get("NTFY_TIMEOUT", "10"))
)
ntfy_username: Optional[str] = Field(
default_factory=lambda: os.environ.get("NTFY_USERNAME")
)
ntfy_password: Optional[str] = Field(
default_factory=lambda: os.environ.get("NTFY_PASSWORD")
)
ntfy_token: Optional[str] = Field(
default_factory=lambda: os.environ.get("NTFY_TOKEN")
)
# === Scheduler ===
scheduler_timezone: str = Field(
default_factory=lambda: os.environ.get("SCHEDULER_TIMEZONE", "America/Montreal")
)
scheduler_misfire_grace_time: int = 300
# === Cache ===
hosts_cache_ttl: int = 60 # secondes
inventory_cache_ttl: int = 60 # secondes
logs_index_rebuild_interval: int = 60 # secondes
# === Server ===
host: str = "0.0.0.0"
port: int = 8008
reload: bool = Field(
default_factory=lambda: os.environ.get("RELOAD", "true").lower() == "true"
)
log_level: str = "info"
class Config:
env_file = ".env"
env_file_encoding = "utf-8"
extra = "ignore"
# Instance singleton de la configuration
settings = Settings()
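Since `settings` is a module-level singleton, environment variables (or the .env file) must be in place before the first import. A small sketch with illustrative values:

```python
import os

# Illustrative overrides; in practice these live in .env
os.environ["JWT_SECRET_KEY"] = "a-long-random-secret"
os.environ["JWT_EXPIRE_MINUTES"] = "60"

from app.core.config import settings  # instantiated on first import

print(settings.jwt_expire_minutes)   # 60
print(settings.async_database_url)   # sqlite+aiosqlite:////logs/homelab.db by default
```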
150
app/core/constants.py Normal file
View File
@ -0,0 +1,150 @@
"""
Constantes et énumérations de l'application.
Centralise toutes les valeurs constantes pour éviter les magic strings.
"""
from enum import Enum
from typing import Dict
class HostStatus(str, Enum):
"""Statuts possibles d'un hôte."""
ONLINE = "online"
OFFLINE = "offline"
WARNING = "warning"
UNKNOWN = "unknown"
class TaskStatus(str, Enum):
"""Statuts possibles d'une tâche."""
PENDING = "pending"
RUNNING = "running"
COMPLETED = "completed"
FAILED = "failed"
CANCELLED = "cancelled"
class LogLevel(str, Enum):
"""Niveaux de log."""
DEBUG = "DEBUG"
INFO = "INFO"
WARN = "WARN"
WARNING = "WARNING"
ERROR = "ERROR"
class ScheduleStatus(str, Enum):
"""Statuts possibles d'un schedule."""
NEVER = "never"
RUNNING = "running"
SUCCESS = "success"
FAILED = "failed"
CANCELED = "canceled"
class ScheduleType(str, Enum):
"""Types de schedule."""
ONCE = "once"
RECURRING = "recurring"
class RecurrenceType(str, Enum):
"""Types de récurrence."""
DAILY = "daily"
WEEKLY = "weekly"
MONTHLY = "monthly"
CUSTOM = "custom"
class TargetType(str, Enum):
"""Types de cible pour les schedules."""
GROUP = "group"
HOST = "host"
class NotificationType(str, Enum):
"""Types de notification pour les schedules."""
NONE = "none"
ALL = "all"
ERRORS = "errors"
class SourceType(str, Enum):
"""Types de source pour les tâches."""
SCHEDULED = "scheduled"
MANUAL = "manual"
ADHOC = "adhoc"
class AnsibleModule(str, Enum):
"""Modules Ansible courants pour les commandes ad-hoc."""
SHELL = "shell"
COMMAND = "command"
RAW = "raw"
PING = "ping"
SETUP = "setup"
class GroupType(str, Enum):
"""Types de groupes Ansible."""
ENV = "env"
ROLE = "role"
# === Mapping Actions → Playbooks ===
ACTION_PLAYBOOK_MAP: Dict[str, str] = {
"upgrade": "vm-upgrade.yml",
"reboot": "vm-reboot.yml",
"health-check": "health-check.yml",
"backup": "backup-config.yml",
"bootstrap": "bootstrap-host.yml",
}
# === Noms lisibles des actions ===
ACTION_DISPLAY_NAMES: Dict[str, str] = {
"upgrade": "Mise à jour système",
"reboot": "Redémarrage système",
"health-check": "Vérification de santé",
"backup": "Sauvegarde",
"deploy": "Déploiement",
"rollback": "Rollback",
"maintenance": "Maintenance",
"bootstrap": "Bootstrap Ansible",
}
# === Actions valides ===
VALID_ACTIONS = list(ACTION_DISPLAY_NAMES.keys())
# === Emojis pour les statuts de tâches ===
TASK_STATUS_EMOJIS: Dict[str, str] = {
    TaskStatus.COMPLETED: "✅",
    TaskStatus.FAILED: "❌",
    TaskStatus.RUNNING: "🔄",
    TaskStatus.PENDING: "⏳",
    TaskStatus.CANCELLED: "🚫",
}
# === Labels pour les types de source ===
SOURCE_TYPE_LABELS: Dict[str, str] = {
SourceType.SCHEDULED: "Planifié",
SourceType.MANUAL: "Manuel",
SourceType.ADHOC: "Ad-hoc",
}
# === Catégories ad-hoc par défaut ===
DEFAULT_ADHOC_CATEGORIES = [
{"name": "default", "description": "Commandes générales", "color": "#7c3aed", "icon": "fa-terminal"},
{"name": "diagnostic", "description": "Commandes de diagnostic", "color": "#10b981", "icon": "fa-stethoscope"},
{"name": "maintenance", "description": "Commandes de maintenance", "color": "#f59e0b", "icon": "fa-wrench"},
{"name": "deployment", "description": "Commandes de déploiement", "color": "#3b82f6", "icon": "fa-rocket"},
]
# === Timeouts par défaut ===
DEFAULT_ADHOC_TIMEOUT = 60
DEFAULT_SCHEDULE_TIMEOUT = 3600
SSH_CONNECT_TIMEOUT = 10
# === Limites de pagination ===
DEFAULT_PAGE_LIMIT = 50
MAX_PAGE_LIMIT = 1000
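A quick illustration of how these constants are meant to be consumed by the rest of the app:

```python
from app.core.constants import (
    ACTION_PLAYBOOK_MAP,
    VALID_ACTIONS,
    TaskStatus,
)

action = "upgrade"
if action in VALID_ACTIONS:
    # Not every valid action maps to a playbook ("deploy", "rollback", ...)
    playbook = ACTION_PLAYBOOK_MAP.get(action)
    print(playbook)  # vm-upgrade.yml

status = TaskStatus("completed")   # str-based enum: comparable to plain strings
print(status == "completed")       # True
```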
212
app/core/dependencies.py Normal file
View File
@ -0,0 +1,212 @@
"""
Dépendances FastAPI pour l'injection de dépendances.
Centralise toutes les dépendances communes utilisées dans les routes.
"""
from typing import AsyncGenerator, Optional
from fastapi import Depends, HTTPException, status
from fastapi.security import APIKeyHeader, OAuth2PasswordBearer
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.config import settings
from app.core.exceptions import AuthenticationException
from app.models.database import get_db as get_db_session
# === Schémas de sécurité ===
api_key_header = APIKeyHeader(name="X-API-Key", auto_error=False)
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/auth/login", auto_error=False)
# === Dépendance: Session de base de données ===
async def get_db() -> AsyncGenerator[AsyncSession, None]:
"""
Fournit une session de base de données async.
Utilisée comme dépendance dans les endpoints:
```python
@app.get("/api/example")
async def example(db: AsyncSession = Depends(get_db)):
...
```
"""
async for session in get_db_session():
yield session
# === Dépendance: Vérification clé API ===
async def verify_api_key(
api_key: Optional[str] = Depends(api_key_header),
token: Optional[str] = Depends(oauth2_scheme),
) -> bool:
"""
Vérifie l'authentification par clé API ou JWT Bearer token.
Raises:
HTTPException 401 si non authentifié
Returns:
True si authentifié
"""
# Vérifier la clé API
if api_key and api_key == settings.api_key:
return True
# Vérifier le token JWT
if token:
try:
from app.services.auth_service import decode_token
token_data = decode_token(token)
if token_data:
return True
except Exception:
pass
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Authentification requise (clé API ou JWT)",
headers={"WWW-Authenticate": "Bearer"},
)
# === Dépendance: Vérification JWT (optionnelle) ===
async def get_current_user_optional(
token: Optional[str] = Depends(oauth2_scheme),
api_key: Optional[str] = Depends(api_key_header),
) -> Optional[dict]:
"""
Vérifie l'authentification par JWT ou clé API (optionnel).
Returns:
Dictionnaire utilisateur si authentifié, None sinon
"""
# Vérifier d'abord la clé API (compatibilité legacy)
if api_key and api_key == settings.api_key:
return {"type": "api_key", "authenticated": True}
# Vérifier le token JWT
if token:
try:
from app.services.auth_service import decode_token
token_data = decode_token(token)
if token_data:
return {
"type": "jwt",
"authenticated": True,
"user_id": token_data.user_id,
"username": token_data.username,
"role": token_data.role,
}
except Exception:
pass
return None
async def get_current_user(
user: Optional[dict] = Depends(get_current_user_optional)
) -> dict:
"""
Vérifie l'authentification par JWT ou clé API (obligatoire).
Raises:
HTTPException 401 si non authentifié
Returns:
Dictionnaire utilisateur
"""
if not user:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Authentification requise",
headers={"WWW-Authenticate": "Bearer"},
)
return user
# === Dépendance: Vérification rôle admin ===
async def require_admin(
user: dict = Depends(get_current_user)
) -> dict:
"""
Vérifie que l'utilisateur est admin.
Raises:
HTTPException 403 si l'utilisateur n'est pas admin
Returns:
Dictionnaire utilisateur
"""
# La clé API a tous les droits
if user.get("type") == "api_key":
return user
    # Vérifier le rôle extrait du JWT (placé au premier niveau par get_current_user_optional)
    role = user.get("role", "viewer")
if role != "admin":
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail="Droits administrateur requis"
)
return user
# === Dépendance combinée: Auth + DB ===
class AuthenticatedDB:
"""Conteneur pour la session DB et l'info utilisateur."""
def __init__(self, db: AsyncSession, user: dict):
self.db = db
self.user = user
async def get_authenticated_db(
db: AsyncSession = Depends(get_db),
user: dict = Depends(get_current_user)
) -> AuthenticatedDB:
"""
Fournit une session DB authentifiée.
Returns:
AuthenticatedDB avec session et info utilisateur
"""
return AuthenticatedDB(db=db, user=user)
# === Dépendance: Pagination ===
class PaginationParams:
"""Paramètres de pagination communs."""
def __init__(
self,
limit: int = 50,
offset: int = 0
):
from app.core.constants import MAX_PAGE_LIMIT
self.limit = min(limit, MAX_PAGE_LIMIT)
self.offset = max(offset, 0)
def get_pagination(
limit: int = 50,
offset: int = 0
) -> PaginationParams:
"""
Extrait les paramètres de pagination des query params.
Returns:
PaginationParams avec limit et offset validés
"""
return PaginationParams(limit=limit, offset=offset)
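Wiring these dependencies into a route looks like the following sketch; the endpoints themselves are invented for illustration:

```python
from fastapi import APIRouter, Depends

from app.core.dependencies import (
    PaginationParams,
    get_current_user,
    get_pagination,
    require_admin,
)

router = APIRouter()

@router.get("/api/demo")
async def list_demo(
    user: dict = Depends(get_current_user),
    page: PaginationParams = Depends(get_pagination),
):
    # Reachable with a legacy API key or a valid JWT
    return {"auth_type": user["type"], "limit": page.limit, "offset": page.offset}

@router.delete("/api/demo/{item_id}")
async def delete_demo(item_id: str, user: dict = Depends(require_admin)):
    # Restricted to the admin role (the API key keeps full rights)
    return {"deleted": item_id}
```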
258
app/core/exceptions.py Normal file
View File
@ -0,0 +1,258 @@
"""
Exceptions personnalisées de l'application.
Centralise la gestion des erreurs avec des exceptions typées.
"""
from typing import Any, Dict, Optional
class HomelabException(Exception):
"""Exception de base pour l'application Homelab."""
def __init__(
self,
message: str,
status_code: int = 500,
details: Optional[Dict[str, Any]] = None
):
self.message = message
self.status_code = status_code
self.details = details or {}
super().__init__(self.message)
def to_dict(self) -> Dict[str, Any]:
"""Convertit l'exception en dictionnaire pour la réponse API."""
return {
"error": self.__class__.__name__,
"message": self.message,
"details": self.details,
}
# === Exceptions 404 Not Found ===
class NotFoundException(HomelabException):
"""Exception de base pour les ressources non trouvées."""
def __init__(self, resource_type: str, identifier: str):
super().__init__(
message=f"{resource_type} '{identifier}' non trouvé(e)",
status_code=404,
details={"resource_type": resource_type, "identifier": identifier}
)
class HostNotFoundException(NotFoundException):
"""Hôte non trouvé."""
def __init__(self, identifier: str):
super().__init__("Hôte", identifier)
class TaskNotFoundException(NotFoundException):
"""Tâche non trouvée."""
def __init__(self, identifier: str):
super().__init__("Tâche", identifier)
class ScheduleNotFoundException(NotFoundException):
"""Schedule non trouvé."""
def __init__(self, identifier: str):
super().__init__("Schedule", identifier)
class PlaybookNotFoundException(NotFoundException):
"""Playbook non trouvé."""
def __init__(self, identifier: str):
super().__init__("Playbook", identifier)
class GroupNotFoundException(NotFoundException):
"""Groupe non trouvé."""
def __init__(self, identifier: str):
super().__init__("Groupe", identifier)
class LogNotFoundException(NotFoundException):
"""Log non trouvé."""
def __init__(self, identifier: str):
super().__init__("Log", identifier)
# === Exceptions 400 Bad Request ===
class ValidationException(HomelabException):
"""Erreur de validation des données."""
def __init__(self, message: str, field: Optional[str] = None):
details = {"field": field} if field else {}
super().__init__(
message=message,
status_code=400,
details=details
)
class DuplicateResourceException(HomelabException):
"""Ressource déjà existante."""
def __init__(self, resource_type: str, identifier: str):
super().__init__(
message=f"{resource_type} '{identifier}' existe déjà",
status_code=400,
details={"resource_type": resource_type, "identifier": identifier}
)
class InvalidOperationException(HomelabException):
"""Opération non valide dans l'état actuel."""
def __init__(self, message: str, current_state: Optional[str] = None):
details = {"current_state": current_state} if current_state else {}
super().__init__(
message=message,
status_code=400,
details=details
)
class IncompatiblePlaybookException(HomelabException):
"""Playbook incompatible avec la cible."""
def __init__(self, playbook: str, target: str, playbook_hosts: str):
super().__init__(
message=f"Le playbook '{playbook}' (hosts: {playbook_hosts}) n'est pas compatible avec la cible '{target}'",
status_code=400,
details={
"playbook": playbook,
"target": target,
"playbook_hosts": playbook_hosts,
}
)
# === Exceptions 401/403 Auth ===
class AuthenticationException(HomelabException):
"""Erreur d'authentification."""
def __init__(self, message: str = "Authentification requise"):
super().__init__(message=message, status_code=401)
class AuthorizationException(HomelabException):
"""Erreur d'autorisation."""
def __init__(self, message: str = "Accès non autorisé"):
super().__init__(message=message, status_code=403)
# === Exceptions 500 Server Error ===
class AnsibleExecutionException(HomelabException):
"""Erreur lors de l'exécution Ansible."""
def __init__(
self,
message: str,
return_code: int = -1,
stdout: str = "",
stderr: str = ""
):
super().__init__(
message=message,
status_code=500,
details={
"return_code": return_code,
"stdout": stdout,
"stderr": stderr,
}
)
class BootstrapException(HomelabException):
"""Erreur lors du bootstrap d'un hôte."""
def __init__(
self,
host: str,
message: str,
return_code: int = -1,
stdout: str = "",
stderr: str = ""
):
super().__init__(
message=f"Échec bootstrap pour {host}: {message}",
status_code=500,
details={
"host": host,
"return_code": return_code,
"stdout": stdout,
"stderr": stderr,
}
)
class SSHConnectionException(HomelabException):
"""Erreur de connexion SSH."""
def __init__(self, host: str, message: str):
super().__init__(
message=f"Connexion SSH échouée vers {host}: {message}",
status_code=500,
details={"host": host}
)
class DatabaseException(HomelabException):
"""Erreur de base de données."""
def __init__(self, message: str, operation: Optional[str] = None):
details = {"operation": operation} if operation else {}
super().__init__(
message=f"Erreur base de données: {message}",
status_code=500,
details=details
)
class SchedulerException(HomelabException):
"""Erreur du service de planification."""
def __init__(self, message: str, schedule_id: Optional[str] = None):
details = {"schedule_id": schedule_id} if schedule_id else {}
super().__init__(
message=f"Erreur scheduler: {message}",
status_code=500,
details=details
)
class NotificationException(HomelabException):
"""Erreur lors de l'envoi de notification."""
def __init__(self, message: str, topic: Optional[str] = None):
details = {"topic": topic} if topic else {}
super().__init__(
message=f"Erreur notification: {message}",
status_code=500,
details=details
)
class FileOperationException(HomelabException):
"""Erreur lors d'une opération sur fichier."""
def __init__(self, message: str, path: Optional[str] = None):
details = {"path": path} if path else {}
super().__init__(
message=message,
status_code=500,
details=details
)
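These exceptions carry their own HTTP status and a to_dict() payload, so a single FastAPI handler can surface every subclass consistently; whether the app registers one exactly like this elsewhere is an assumption:

```python
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

from app.core.exceptions import HomelabException

app = FastAPI()

@app.exception_handler(HomelabException)
async def homelab_exception_handler(request: Request, exc: HomelabException):
    # One handler covers 404s, validation errors, Ansible failures, etc.
    return JSONResponse(status_code=exc.status_code, content=exc.to_dict())
```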
View File
@ -3,6 +3,9 @@ from .bootstrap_status import BootstrapStatusRepository
from .task import TaskRepository
from .schedule import ScheduleRepository
from .log import LogRepository
+from .user import UserRepository
+from .alert import AlertRepository
+from .app_setting import AppSettingRepository

__all__ = [
    "HostRepository",
@ -10,4 +13,7 @@ __all__ = [
    "TaskRepository",
    "ScheduleRepository",
    "LogRepository",
+    "AlertRepository",
+    "UserRepository",
+    "AppSettingRepository",
]
76
app/crud/alert.py Normal file
View File
@ -0,0 +1,76 @@
from __future__ import annotations
from datetime import datetime, timezone
from typing import Optional
from sqlalchemy import select, func, update
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.alert import Alert
class AlertRepository:
def __init__(self, session: AsyncSession):
self.session = session
async def list(
self,
limit: int = 100,
offset: int = 0,
unread_only: bool = False,
category: Optional[str] = None,
user_id: Optional[int] = None,
) -> list[Alert]:
stmt = select(Alert).order_by(Alert.created_at.desc()).offset(offset).limit(limit)
if user_id is not None:
stmt = stmt.where(Alert.user_id == user_id)
if unread_only:
stmt = stmt.where(Alert.read_at.is_(None))
if category:
stmt = stmt.where(Alert.category == category)
result = await self.session.execute(stmt)
return result.scalars().all()
async def get(self, alert_id: str) -> Optional[Alert]:
stmt = select(Alert).where(Alert.id == alert_id)
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def count_unread(self, user_id: Optional[int] = None) -> int:
stmt = select(func.count()).select_from(Alert).where(Alert.read_at.is_(None))
if user_id is not None:
stmt = stmt.where(Alert.user_id == user_id)
result = await self.session.execute(stmt)
return int(result.scalar() or 0)
async def create(self, **fields) -> Alert:
alert = Alert(**fields)
self.session.add(alert)
await self.session.flush()
return alert
async def mark_as_read(self, alert_id: str) -> bool:
stmt = select(Alert).where(Alert.id == alert_id)
result = await self.session.execute(stmt)
alert = result.scalar_one_or_none()
if not alert:
return False
if alert.read_at is None:
alert.read_at = datetime.now(timezone.utc)
await self.session.flush()
return True
async def mark_all_as_read(self, user_id: Optional[int] = None) -> int:
stmt = update(Alert).where(Alert.read_at.is_(None)).values(read_at=datetime.now(timezone.utc))
if user_id is not None:
stmt = stmt.where(Alert.user_id == user_id)
result = await self.session.execute(stmt)
return int(result.rowcount or 0)
async def delete(self, alert_id: str) -> bool:
alert = await self.get(alert_id)
if alert:
await self.session.delete(alert)
await self.session.flush()
return True
return False
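Typical use of the repository inside an async session, sketched with the session factory used elsewhere in this commit; the alert field values (and the `message` field name) are illustrative:

```python
from app.crud.alert import AlertRepository
from app.models.database import async_session_maker

async def create_and_count(user_id: int) -> int:
    async with async_session_maker() as session:
        repo = AlertRepository(session)
        await repo.create(
            user_id=user_id,
            category="system",                 # illustrative field values
            message="Disk usage above 90%",    # field name assumed on the Alert model
        )
        count = await repo.count_unread(user_id=user_id)
        await session.commit()
        return count
```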
41
app/crud/app_setting.py Normal file
View File
@ -0,0 +1,41 @@
from __future__ import annotations
from typing import Optional
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.app_setting import AppSetting
class AppSettingRepository:
def __init__(self, session: AsyncSession):
self.session = session
async def get(self, key: str) -> Optional[AppSetting]:
stmt = select(AppSetting).where(AppSetting.key == key)
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def get_value(self, key: str, default: Optional[str] = None) -> Optional[str]:
row = await self.get(key)
if row is None:
return default
return row.value if row.value is not None else default
async def set_value(self, key: str, value: Optional[str]) -> AppSetting:
row = await self.get(key)
if row is None:
row = AppSetting(key=key, value=value)
self.session.add(row)
await self.session.flush()
return row
row.value = value
await self.session.flush()
return row
# Alias for compatibility
async def set(self, key: str, value: Optional[str]) -> AppSetting:
"""Alias for set_value."""
return await self.set_value(key, value)
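Usage is a plain get/set key-value store; a short sketch (the key name is illustrative):

```python
from app.crud.app_setting import AppSettingRepository
from app.models.database import async_session_maker

async def toggle_maintenance(flag: bool) -> str | None:
    async with async_session_maker() as session:
        repo = AppSettingRepository(session)
        await repo.set_value("maintenance_mode", "1" if flag else "0")
        await session.commit()
        return await repo.get_value("maintenance_mode", default="0")
```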
View File
@ -5,7 +5,7 @@ from typing import Optional
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
-from models.bootstrap_status import BootstrapStatus
+from app.models.bootstrap_status import BootstrapStatus

class BootstrapStatusRepository:
View File
@ -7,7 +7,7 @@ from sqlalchemy import select, update
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload
-from models.host import Host
+from app.models.host import Host

class HostRepository:
121
app/crud/host_metrics.py Normal file
View File
@ -0,0 +1,121 @@
from __future__ import annotations
from datetime import datetime, timedelta
from typing import Optional, List, Dict, Any
from sqlalchemy import select, delete, func
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.host_metrics import HostMetrics
class HostMetricsRepository:
"""Repository pour gérer les métriques des hôtes"""
def __init__(self, session: AsyncSession):
self.session = session
async def create(self, **fields) -> HostMetrics:
"""Crée une nouvelle entrée de métriques"""
metrics = HostMetrics(**fields)
self.session.add(metrics)
await self.session.flush()
return metrics
async def get(self, metrics_id: int) -> Optional[HostMetrics]:
"""Récupère une entrée de métriques par son ID"""
stmt = select(HostMetrics).where(HostMetrics.id == metrics_id)
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def get_latest_for_host(self, host_id: str, metric_type: str = None) -> Optional[HostMetrics]:
"""Récupère les dernières métriques pour un hôte"""
stmt = select(HostMetrics).where(HostMetrics.host_id == host_id)
if metric_type:
stmt = stmt.where(HostMetrics.metric_type == metric_type)
stmt = stmt.order_by(HostMetrics.collected_at.desc()).limit(1)
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def list_for_host(
self,
host_id: str,
metric_type: str = None,
limit: int = 100,
offset: int = 0
) -> List[HostMetrics]:
"""Liste les métriques pour un hôte avec pagination"""
stmt = select(HostMetrics).where(HostMetrics.host_id == host_id)
if metric_type:
stmt = stmt.where(HostMetrics.metric_type == metric_type)
stmt = stmt.order_by(HostMetrics.collected_at.desc()).offset(offset).limit(limit)
result = await self.session.execute(stmt)
return list(result.scalars().all())
async def get_all_latest(self, metric_type: str = "system_info") -> Dict[str, HostMetrics]:
"""Récupère les dernières métriques pour tous les hôtes
Returns:
Dict mapping host_id to latest HostMetrics
"""
# Sous-requête pour obtenir la date max par host_id
subq = (
select(
HostMetrics.host_id,
func.max(HostMetrics.collected_at).label("max_collected")
)
.where(HostMetrics.metric_type == metric_type)
.group_by(HostMetrics.host_id)
.subquery()
)
# Jointure pour récupérer les enregistrements complets
stmt = (
select(HostMetrics)
.join(
subq,
(HostMetrics.host_id == subq.c.host_id) &
(HostMetrics.collected_at == subq.c.max_collected)
)
.where(HostMetrics.metric_type == metric_type)
)
result = await self.session.execute(stmt)
metrics_list = result.scalars().all()
return {m.host_id: m for m in metrics_list}
async def cleanup_old_metrics(self, days_to_keep: int = 30) -> int:
"""Supprime les métriques plus anciennes que le nombre de jours spécifié
Returns:
Nombre d'entrées supprimées
"""
cutoff_date = datetime.utcnow() - timedelta(days=days_to_keep)
stmt = delete(HostMetrics).where(HostMetrics.collected_at < cutoff_date)
result = await self.session.execute(stmt)
return result.rowcount
async def get_metrics_history(
self,
host_id: str,
metric_type: str = "system_info",
hours: int = 24
) -> List[HostMetrics]:
"""Récupère l'historique des métriques pour les dernières heures"""
cutoff = datetime.utcnow() - timedelta(hours=hours)
stmt = (
select(HostMetrics)
.where(HostMetrics.host_id == host_id)
.where(HostMetrics.metric_type == metric_type)
.where(HostMetrics.collected_at >= cutoff)
.order_by(HostMetrics.collected_at.asc())
)
result = await self.session.execute(stmt)
return list(result.scalars().all())
async def count_for_host(self, host_id: str) -> int:
"""Compte le nombre d'entrées de métriques pour un hôte"""
stmt = select(func.count(HostMetrics.id)).where(HostMetrics.host_id == host_id)
result = await self.session.execute(stmt)
return result.scalar() or 0
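get_all_latest() implements the classic greatest-per-group pattern: a MAX(collected_at) subquery grouped by host, joined back to the table to fetch the full rows. Reading it back, sketched:

```python
from app.crud.host_metrics import HostMetricsRepository
from app.models.database import async_session_maker

async def latest_metrics_overview() -> dict[str, str]:
    async with async_session_maker() as session:
        repo = HostMetricsRepository(session)
        latest = await repo.get_all_latest(metric_type="system_info")
        # One freshest row per host; timestamps come from collected_at
        return {host_id: m.collected_at.isoformat() for host_id, m in latest.items()}
```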
View File
@ -5,7 +5,7 @@ from typing import Optional
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
-from models.log import Log
+from app.models.log import Log

class LogRepository:
View File
@ -7,8 +7,8 @@ from sqlalchemy import select, update
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload
-from models.schedule import Schedule
-from models.schedule_run import ScheduleRun
+from app.models.schedule import Schedule
+from app.models.schedule_run import ScheduleRun

class ScheduleRepository:
@ -22,6 +22,17 @@ class ScheduleRepository:
        result = await self.session.execute(stmt)
        return result.scalars().all()
async def list_active(self, limit: int = 100) -> list[Schedule]:
"""Liste les schedules actifs (enabled=True et non supprimés)."""
stmt = (
select(Schedule)
.where(Schedule.enabled == True, Schedule.deleted_at.is_(None))
.order_by(Schedule.created_at.desc())
.limit(limit)
)
result = await self.session.execute(stmt)
return result.scalars().all()
    async def get(self, schedule_id: str, include_deleted: bool = False) -> Optional[Schedule]:
        stmt = select(Schedule).where(Schedule.id == schedule_id).options(
            selectinload(Schedule.runs)
View File
@ -5,7 +5,7 @@ from typing import Optional
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
-from models.schedule_run import ScheduleRun
+from app.models.schedule_run import ScheduleRun

class ScheduleRunRepository:
View File
@ -6,7 +6,7 @@ from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload
-from models.task import Task
+from app.models.task import Task

class TaskRepository:
143
app/crud/user.py Normal file
View File
@ -0,0 +1,143 @@
"""User repository for CRUD operations."""
from __future__ import annotations
from datetime import datetime, timezone
from typing import Optional
from sqlalchemy import func, select, update
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.user import User
class UserRepository:
"""Repository for User CRUD operations."""
def __init__(self, session: AsyncSession):
self.session = session
async def count(self, include_deleted: bool = False) -> int:
"""Count total users."""
stmt = select(func.count(User.id))
if not include_deleted:
stmt = stmt.where(User.deleted_at.is_(None))
result = await self.session.execute(stmt)
return result.scalar() or 0
async def list(
self,
limit: int = 100,
offset: int = 0,
include_deleted: bool = False
) -> list[User]:
"""List all users with pagination."""
stmt = select(User).order_by(User.created_at.desc()).offset(offset).limit(limit)
if not include_deleted:
stmt = stmt.where(User.deleted_at.is_(None))
result = await self.session.execute(stmt)
return list(result.scalars().all())
async def get(self, user_id: int, include_deleted: bool = False) -> Optional[User]:
"""Get user by ID."""
stmt = select(User).where(User.id == user_id)
if not include_deleted:
stmt = stmt.where(User.deleted_at.is_(None))
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def get_by_username(
self,
username: str,
include_deleted: bool = False
) -> Optional[User]:
"""Get user by username."""
stmt = select(User).where(User.username == username)
if not include_deleted:
stmt = stmt.where(User.deleted_at.is_(None))
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def get_by_email(
self,
email: str,
include_deleted: bool = False
) -> Optional[User]:
"""Get user by email."""
stmt = select(User).where(User.email == email)
if not include_deleted:
stmt = stmt.where(User.deleted_at.is_(None))
result = await self.session.execute(stmt)
return result.scalar_one_or_none()
async def create(
self,
*,
username: str,
hashed_password: str,
email: Optional[str] = None,
display_name: Optional[str] = None,
role: str = "admin",
is_active: bool = True,
is_superuser: bool = False,
) -> User:
"""Create a new user."""
user = User(
username=username,
hashed_password=hashed_password,
email=email,
display_name=display_name,
role=role,
is_active=is_active,
is_superuser=is_superuser,
password_changed_at=datetime.now(timezone.utc),
)
self.session.add(user)
await self.session.flush()
return user
async def update(self, user: User, **fields) -> User:
"""Update user fields."""
for key, value in fields.items():
if value is not None:
setattr(user, key, value)
await self.session.flush()
return user
async def update_password(self, user: User, hashed_password: str) -> User:
"""Update user password and timestamp."""
user.hashed_password = hashed_password
user.password_changed_at = datetime.now(timezone.utc)
await self.session.flush()
return user
async def update_last_login(self, user: User) -> User:
"""Update last login timestamp."""
user.last_login = datetime.now(timezone.utc)
await self.session.flush()
return user
async def soft_delete(self, user_id: int) -> bool:
"""Soft delete a user."""
stmt = (
update(User)
.where(User.id == user_id, User.deleted_at.is_(None))
.values(deleted_at=datetime.now(timezone.utc), is_active=False)
)
result = await self.session.execute(stmt)
return result.rowcount > 0
async def hard_delete(self, user_id: int) -> bool:
"""Permanently delete a user (use with caution)."""
user = await self.get(user_id, include_deleted=True)
if user:
await self.session.delete(user)
await self.session.flush()
return True
return False
async def exists_any(self) -> bool:
"""Check if any user exists (for initial setup check)."""
stmt = select(func.count(User.id)).where(User.deleted_at.is_(None))
result = await self.session.execute(stmt)
count = result.scalar() or 0
return count > 0
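A sketch of bootstrapping a first admin account with this repository; the `hash_password` helper is an assumption, placed in app.services.auth_service alongside the `decode_token` used by the dependencies:

```python
from app.crud.user import UserRepository
from app.models.database import async_session_maker
from app.services.auth_service import hash_password  # assumed helper

async def ensure_first_admin(username: str, password: str) -> None:
    async with async_session_maker() as session:
        repo = UserRepository(session)
        if not await repo.exists_any():
            await repo.create(
                username=username,
                hashed_password=hash_password(password),
                role="admin",
                is_superuser=True,
            )
            await session.commit()
```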
218
app/factory.py Normal file
View File
@ -0,0 +1,218 @@
"""
Factory pour créer l'application FastAPI.
Ce module contient la fonction create_app() qui configure et retourne
une instance FastAPI prête à l'emploi.
"""
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.responses import HTMLResponse
from app.core.config import settings
from app.models.database import init_db, async_session_maker
from app.routes import api_router
from app.routes.websocket import router as ws_router
def create_app() -> FastAPI:
"""Crée et configure l'application FastAPI.
Returns:
Instance FastAPI configurée avec tous les routers et middleware
"""
app = FastAPI(
title=settings.api_title,
description=settings.api_description,
version=settings.api_version,
docs_url="/docs",
redoc_url="/redoc",
)
# Configuration CORS
app.add_middleware(
CORSMiddleware,
allow_origins=settings.cors_origins,
allow_credentials=settings.cors_allow_credentials,
allow_methods=settings.cors_allow_methods,
allow_headers=settings.cors_allow_headers,
)
# Monter les fichiers statiques (main.js, etc.)
app.mount("/static", StaticFiles(directory=settings.base_dir, html=False), name="static")
# Inclure les routers API
app.include_router(api_router, prefix="/api")
# Inclure le router WebSocket (sans préfixe /api)
app.include_router(ws_router)
# Routes racine
@app.get("/", response_class=HTMLResponse)
async def root():
"""Page d'accueil - redirige vers le dashboard."""
# Essayer de servir index.html
index_path = settings.base_dir / "index.html"
if index_path.exists():
return index_path.read_text(encoding='utf-8')
return """
<!DOCTYPE html>
<html>
<head>
<title>Homelab Automation</title>
<meta http-equiv="refresh" content="0; url=/docs">
</head>
<body>
<p>Redirecting to <a href="/docs">API Documentation</a>...</p>
</body>
</html>
"""
@app.get("/api", response_class=HTMLResponse)
async def api_home():
"""Page d'accueil de l'API."""
return """
<!DOCTYPE html>
<html>
<head>
<title>Homelab Automation API</title>
<style>
body { font-family: system-ui, sans-serif; max-width: 800px; margin: 50px auto; padding: 20px; }
h1 { color: #1a1a2e; }
.links { margin-top: 30px; }
.links a { display: inline-block; margin-right: 20px; padding: 10px 20px;
background: #7c3aed; color: white; text-decoration: none; border-radius: 5px; }
.links a:hover { background: #6d28d9; }
</style>
</head>
<body>
<h1>🏠 Homelab Automation API</h1>
<p>API REST pour la gestion automatisée de votre homelab avec Ansible.</p>
<div class="links">
<a href="/docs">📚 Documentation Swagger</a>
<a href="/redoc">📖 Documentation ReDoc</a>
<a href="/">🖥 Dashboard</a>
</div>
</body>
</html>
"""
# Événements de démarrage et d'arrêt
@app.on_event("startup")
async def startup_event():
"""Événement de démarrage de l'application."""
# Démarrer la capture des logs console en premier
from app.services import console_log_service
console_log_service.start_capture()
print("🚀 Homelab Automation Dashboard démarré")
# Afficher les paramètres d'environnement
print("\n📋 Configuration:")
print(f" BASE_DIR: {settings.base_dir}")
print(f" ANSIBLE_DIR: {settings.ansible_dir}")
print(f" TASKS_LOGS_DIR: {settings.tasks_logs_dir}")
print(f" DATABASE_URL: {settings.async_database_url}")
print(f" SSH_KEY_PATH: {settings.ssh_key_path}")
print(f" SSH_USER: {settings.ssh_user}")
print(f" SSH_REMOTE_USER: {settings.ssh_remote_user}")
print(f" API_KEY: {'*' * 8}...{settings.api_key[-4:] if len(settings.api_key) > 4 else '****'}")
print(f" NTFY_ENABLED: {settings.ntfy_enabled}")
print(f" NTFY_BASE_URL: {settings.ntfy_base_url}")
print(f" NTFY_TOPIC: {settings.ntfy_default_topic}")
print()
# Validation des chemins critiques
validation_ok = True
if not settings.ansible_dir.exists():
print(f"⚠️ ANSIBLE_DIR n'existe pas: {settings.ansible_dir}")
validation_ok = False
else:
print(f"✅ ANSIBLE_DIR OK: {settings.ansible_dir}")
inventory_path = settings.ansible_dir / "inventory" / "hosts.yml"
if not inventory_path.exists():
print(f"⚠️ Inventaire Ansible non trouvé: {inventory_path}")
else:
print(f"✅ Inventaire Ansible OK: {inventory_path}")
playbooks_dir = settings.ansible_dir / "playbooks"
if not playbooks_dir.exists():
print(f"⚠️ Dossier playbooks non trouvé: {playbooks_dir}")
else:
playbook_count = len(list(playbooks_dir.glob("*.yml")))
print(f"✅ Dossier playbooks OK: {playbook_count} playbook(s) trouvé(s)")
print()
# Initialiser la base de données
await init_db()
print("📦 Base de données SQLite initialisée")
# Charger les services
from app.services import (
bootstrap_status_service,
scheduler_service,
notification_service,
)
from app.crud.log import LogRepository
# Charger les statuts bootstrap depuis la BD
await bootstrap_status_service.load_from_db()
# Démarrer le scheduler
await scheduler_service.start_async()
# Afficher l'état du service de notification
ntfy_status = "activé" if notification_service.enabled else "désactivé"
print(f"🔔 Service de notification ntfy: {ntfy_status} ({notification_service.config.base_url})")
# Log de démarrage en base
async with async_session_maker() as session:
repo = LogRepository(session)
await repo.create(
level="INFO",
message="Application démarrée - Services initialisés (BD)",
source="system",
)
await session.commit()
# Notification ntfy au démarrage
startup_notif = notification_service.templates.app_started()
await notification_service.send(
message=startup_notif.message,
topic=startup_notif.topic,
title=startup_notif.title,
priority=startup_notif.priority,
tags=startup_notif.tags,
)
@app.on_event("shutdown")
async def shutdown_event():
"""Événement d'arrêt de l'application."""
print("👋 Arrêt de l'application...")
from app.services import scheduler_service, notification_service
# Arrêter la capture des logs console
from app.services import console_log_service
console_log_service.stop_capture()
# Arrêter le scheduler
scheduler_service.shutdown()
# Notification ntfy à l'arrêt
shutdown_notif = notification_service.templates.app_stopped()
await notification_service.send(
message=shutdown_notif.message,
topic=shutdown_notif.topic,
title=shutdown_notif.title,
priority=shutdown_notif.priority,
tags=shutdown_notif.tags,
)
# Fermer le client HTTP
await notification_service.close()
print("✅ Services arrêtés proprement")
return app
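
A minimal entrypoint sketch built on this factory, mirroring the `python -m uvicorn app.app_optimized:app` convention from the README; the actual `app_optimized` module may wire things differently.

```python
# Hypothetical entrypoint using create_app(); a sketch, not the shipped module.
import uvicorn

from app.factory import create_app

app = create_app()

if __name__ == "__main__":
    # Equivalent to: python -m uvicorn app.app_optimized:app --host 0.0.0.0 --port 8000
    uvicorn.run(app, host="0.0.0.0", port=8000)
```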

index.html
@@ -2548,6 +2548,197 @@
</style>
</head>
<body>
<!-- Login Screen - shown when not authenticated -->
<div id="login-screen" class="fixed inset-0 bg-black z-[100] flex items-center justify-center hidden">
<div class="max-w-md w-full mx-4">
<!-- Login Form -->
<div id="login-form-container" class="glass-card p-8">
<div class="text-center mb-8">
<div class="w-16 h-16 bg-gradient-to-br from-purple-600 to-blue-600 rounded-2xl flex items-center justify-center mx-auto mb-4">
<i class="fas fa-server text-white text-2xl"></i>
</div>
<h1 class="text-2xl font-bold gradient-text mb-2">Homelab Dashboard</h1>
<p class="text-gray-400">Connectez-vous pour continuer</p>
</div>
<form id="login-form" onsubmit="handleLogin(event)" class="space-y-6">
<div>
<label for="login-username" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-user mr-2"></i>Nom d'utilisateur
</label>
<input
type="text"
id="login-username"
name="username"
required
autocomplete="username"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors"
placeholder="Entrez votre nom d'utilisateur"
>
</div>
<div>
<label for="login-password" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-lock mr-2"></i>Mot de passe
</label>
<div class="relative">
<input
type="password"
id="login-password"
name="password"
required
autocomplete="current-password"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors pr-12"
placeholder="Entrez votre mot de passe"
>
<button
type="button"
onclick="togglePasswordVisibility('login-password', this)"
class="absolute right-3 top-1/2 -translate-y-1/2 text-gray-500 hover:text-gray-300"
>
<i class="fas fa-eye"></i>
</button>
</div>
</div>
<div id="login-error" class="hidden text-red-400 text-sm bg-red-900/20 border border-red-800 rounded-lg p-3">
<i class="fas fa-exclamation-circle mr-2"></i>
<span id="login-error-text"></span>
</div>
<button
type="submit"
id="login-submit-btn"
class="w-full btn-primary py-3 flex items-center justify-center gap-2"
>
<i class="fas fa-sign-in-alt"></i>
<span>Se connecter</span>
</button>
</form>
</div>
<!-- Setup Form (first user creation) -->
<div id="setup-form-container" class="glass-card p-8 hidden">
<div class="text-center mb-8">
<div class="w-16 h-16 bg-gradient-to-br from-green-600 to-emerald-600 rounded-2xl flex items-center justify-center mx-auto mb-4">
<i class="fas fa-user-plus text-white text-2xl"></i>
</div>
<h1 class="text-2xl font-bold gradient-text mb-2">Configuration Initiale</h1>
<p class="text-gray-400">Créez votre compte administrateur</p>
</div>
<form id="setup-form" onsubmit="handleSetup(event)" class="space-y-5">
<div>
<label for="setup-username" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-user mr-2"></i>Nom d'utilisateur *
</label>
<input
type="text"
id="setup-username"
name="username"
required
minlength="3"
maxlength="50"
autocomplete="username"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors"
placeholder="admin"
>
</div>
<div>
<label for="setup-password" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-lock mr-2"></i>Mot de passe *
</label>
<div class="relative">
<input
type="password"
id="setup-password"
name="password"
required
minlength="6"
autocomplete="new-password"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors pr-12"
placeholder="Minimum 6 caractères"
>
<button
type="button"
onclick="togglePasswordVisibility('setup-password', this)"
class="absolute right-3 top-1/2 -translate-y-1/2 text-gray-500 hover:text-gray-300"
>
<i class="fas fa-eye"></i>
</button>
</div>
</div>
<div>
<label for="setup-password-confirm" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-lock mr-2"></i>Confirmer le mot de passe *
</label>
<input
type="password"
id="setup-password-confirm"
name="password_confirm"
required
minlength="6"
autocomplete="new-password"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors"
placeholder="Confirmez votre mot de passe"
>
</div>
<div>
<label for="setup-email" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-envelope mr-2"></i>Email (optionnel)
</label>
<input
type="email"
id="setup-email"
name="email"
autocomplete="email"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors"
placeholder="admin@example.com"
>
</div>
<div>
<label for="setup-display-name" class="block text-sm font-medium text-gray-300 mb-2">
<i class="fas fa-id-card mr-2"></i>Nom d'affichage (optionnel)
</label>
<input
type="text"
id="setup-display-name"
name="display_name"
maxlength="100"
class="w-full px-4 py-3 bg-gray-800 border border-gray-700 rounded-lg text-white placeholder-gray-500 focus:border-purple-500 focus:ring-1 focus:ring-purple-500 transition-colors"
placeholder="Administrateur"
>
</div>
<div id="setup-error" class="hidden text-red-400 text-sm bg-red-900/20 border border-red-800 rounded-lg p-3">
<i class="fas fa-exclamation-circle mr-2"></i>
<span id="setup-error-text"></span>
</div>
<button
type="submit"
id="setup-submit-btn"
class="w-full btn-primary py-3 flex items-center justify-center gap-2"
>
<i class="fas fa-user-plus"></i>
<span>Créer le compte</span>
</button>
</form>
</div>
<p class="text-center text-gray-500 text-sm mt-6">
Homelab Automation Dashboard v1.0
</p>
</div>
</div>
<!-- Main Content Wrapper -->
<div id="main-content">
<!-- Loading indicator - will be hidden when JS loads -->
<div id="page-loading" style="position:fixed;inset:0;background:#0a0a0a;display:flex;flex-direction:column;align-items:center;justify-content:center;z-index:9999;">
<div style="width:50px;height:50px;border:3px solid #333;border-top-color:#7c3aed;border-radius:50%;animation:spin 1s linear infinite;"></div>
@@ -2610,16 +2801,28 @@
<i class="fas fa-file-alt"></i>
<span>Logs</span>
</button>
<button type="button" data-page="alerts" class="mobile-nav-link" onclick="mobileNavigateTo('alerts'); return false;">
<i class="fas fa-bell"></i>
<span>Alertes</span>
</button>
<button type="button" data-page="configuration" class="mobile-nav-link" onclick="mobileNavigateTo('configuration'); return false;">
<i class="fas fa-sliders-h"></i>
<span>Configuration</span>
</button>
<button type="button" data-page="help" class="mobile-nav-link" onclick="mobileNavigateTo('help'); return false;"> <button type="button" data-page="help" class="mobile-nav-link" onclick="mobileNavigateTo('help'); return false;">
<i class="fas fa-question-circle"></i> <i class="fas fa-question-circle"></i>
<span>Aide</span> <span>Aide</span>
</button> </button>
</nav> </nav>
<div class="p-4 border-t border-gray-700"> <div class="p-4 border-t border-gray-700 space-y-3">
<button type="button" id="mobile-theme-toggle" onclick="toggleTheme()" class="w-full flex items-center justify-center gap-2 p-3 bg-gray-800 rounded-lg hover:bg-gray-700 transition-colors"> <button type="button" id="mobile-theme-toggle" onclick="toggleTheme()" class="w-full flex items-center justify-center gap-2 p-3 bg-gray-800 rounded-lg hover:bg-gray-700 transition-colors">
<i class="fas fa-moon text-gray-300" id="mobile-theme-icon"></i> <i class="fas fa-moon text-gray-300" id="mobile-theme-icon"></i>
<span class="text-sm text-gray-300" id="mobile-theme-label">Thème sombre</span> <span class="text-sm text-gray-300" id="mobile-theme-label">Thème sombre</span>
</button> </button>
<button type="button" onclick="handleLogout(); closeMobileNav();" class="w-full flex items-center justify-center gap-2 p-3 bg-red-900/30 border border-red-800 rounded-lg hover:bg-red-900/50 transition-colors">
<i class="fas fa-sign-out-alt text-red-400"></i>
<span class="text-sm text-red-400">Déconnexion</span>
</button>
</div>
</aside>
@@ -2644,10 +2847,40 @@
<a href="#" data-page="tasks" class="nav-link text-gray-300 hover:text-white transition-colors">Tasks</a>
<a href="#" data-page="schedules" class="nav-link text-gray-300 hover:text-white transition-colors">Schedules</a>
<a href="#" data-page="logs" class="nav-link text-gray-300 hover:text-white transition-colors">Logs</a>
<a href="#" data-page="alerts" class="nav-link text-gray-300 hover:text-white transition-colors">Alertes</a>
<a href="#" data-page="configuration" class="nav-link text-gray-300 hover:text-white transition-colors">Configuration</a>
<a href="#" data-page="help" class="nav-link text-gray-300 hover:text-white transition-colors">Aide</a> <a href="#" data-page="help" class="nav-link text-gray-300 hover:text-white transition-colors">Aide</a>
<button id="theme-toggle" class="p-2 rounded-lg bg-gray-800 hover:bg-gray-700 transition-colors touch-target"> <button id="theme-toggle" class="p-2 rounded-lg bg-gray-800 hover:bg-gray-700 transition-colors touch-target">
<i class="fas fa-moon text-gray-300"></i> <i class="fas fa-moon text-gray-300"></i>
</button> </button>
<button id="alerts-button" class="relative p-2 rounded-lg bg-gray-800 hover:bg-gray-700 transition-colors touch-target" onclick="navigateTo('alerts')" title="Alertes">
<i class="fas fa-bell text-gray-300"></i>
<span id="alerts-badge" class="hidden absolute -top-1 -right-1 min-w-[18px] h-[18px] px-1 bg-red-600 text-white text-[10px] leading-[18px] rounded-full text-center"></span>
</button>
<!-- User Menu -->
<div class="relative group">
<button class="flex items-center gap-2 p-2 rounded-lg bg-gray-800 hover:bg-gray-700 transition-colors">
<i class="fas fa-user-circle text-gray-300"></i>
<span id="current-user-name" class="text-sm text-gray-300 hidden lg:inline"></span>
<i class="fas fa-chevron-down text-xs text-gray-500"></i>
</button>
<div class="absolute right-0 mt-2 w-48 bg-gray-800 border border-gray-700 rounded-lg shadow-xl opacity-0 invisible group-hover:opacity-100 group-hover:visible transition-all z-50">
<div class="p-3 border-b border-gray-700">
<p class="text-sm font-medium text-white" id="user-menu-name">Utilisateur</p>
<p class="text-xs text-gray-400" id="current-user-role">Admin</p>
</div>
<div class="p-2">
<button type="button" onclick="navigateTo('configuration')" class="w-full flex items-center gap-2 px-3 py-2 text-sm text-gray-200 hover:bg-gray-700 rounded-lg transition-colors">
<i class="fas fa-sliders-h text-gray-300"></i>
<span>Configuration</span>
</button>
<button onclick="handleLogout()" class="w-full flex items-center gap-2 px-3 py-2 text-sm text-red-400 hover:bg-gray-700 rounded-lg transition-colors">
<i class="fas fa-sign-out-alt"></i>
<span>Déconnexion</span>
</button>
</div>
</div>
</div>
</div>
<!-- Mobile Menu Button -->
@@ -2721,8 +2954,65 @@
<!-- Mobile: Quick Actions first, then schedules -->
<div class="grid grid-cols-1 lg:grid-cols-3 gap-4 sm:gap-8">
<!-- Console Ad-Hoc Widget + Quick Actions - Stacked on right column -->
<div class="flex flex-col gap-4 sm:gap-6 order-1 lg:order-2">
<!-- Console Ad-Hoc Ansible Widget -->
<div class="glass-card p-4 sm:p-6 fade-in">
<div class="flex items-center justify-between mb-4">
<h3 class="text-base sm:text-lg font-semibold flex items-center">
<i class="fas fa-terminal text-purple-400 mr-2"></i>
Console Ad-Hoc
</h3>
<div class="flex items-center gap-2">
<span id="adhoc-widget-count" class="text-xs text-gray-500 hidden sm:inline"></span>
</div>
</div>
<!-- Bouton principal Console -->
<button type="button" onclick="dashboard.showAdHocConsole()"
class="w-full p-3 sm:p-4 bg-gradient-to-r from-purple-600 to-purple-700 rounded-lg hover:from-purple-500 hover:to-purple-600 transition-all flex items-center justify-center mb-4 group">
<i class="fas fa-terminal mr-2 sm:mr-3 text-sm sm:text-base group-hover:animate-pulse"></i>
<span class="text-sm sm:text-base font-medium">Console Ad-Hoc</span>
</button>
<!-- Mini stats -->
<div class="grid grid-cols-3 gap-2 mb-4">
<div class="text-center p-2 bg-gray-800/50 rounded">
<div class="text-sm sm:text-base font-bold text-green-400" id="adhoc-widget-success">0</div>
<div class="text-[9px] sm:text-xs text-gray-500">Succès</div>
</div>
<div class="text-center p-2 bg-gray-800/50 rounded">
<div class="text-sm sm:text-base font-bold text-red-400" id="adhoc-widget-failed">0</div>
<div class="text-[9px] sm:text-xs text-gray-500">Échecs</div>
</div>
<div class="text-center p-2 bg-gray-800/50 rounded">
<div class="text-sm sm:text-base font-bold text-purple-400" id="adhoc-widget-total">0</div>
<div class="text-[9px] sm:text-xs text-gray-500">Total</div>
</div>
</div>
<!-- Historique récent -->
<div class="border-t border-gray-700/50 pt-3">
<div class="flex items-center justify-between mb-2">
<span class="text-xs text-gray-400 uppercase tracking-wide">
<i class="fas fa-history mr-1"></i>Dernières exécutions
</span>
</div>
<div id="adhoc-widget-history" class="space-y-2 max-h-48 overflow-y-auto">
<p class="text-xs text-gray-500 text-center py-3">
<i class="fas fa-spinner fa-spin mr-1"></i>Chargement...
</p>
</div>
<button type="button" onclick="dashboard.loadMoreAdhocHistory()"
id="adhoc-widget-load-more"
class="w-full mt-3 p-2 text-xs text-purple-400 hover:text-purple-300 hover:bg-purple-900/20 rounded transition-colors hidden">
<i class="fas fa-chevron-down mr-1"></i>Charger plus
</button>
</div>
</div>
<!-- Quick Actions - Now below Ad-Hoc widget -->
<div class="glass-card p-4 sm:p-6 fade-in">
<h3 class="text-lg sm:text-xl font-semibold mb-4 sm:mb-6">Actions Rapides</h3> <h3 class="text-lg sm:text-xl font-semibold mb-4 sm:mb-6">Actions Rapides</h3>
<!-- Mobile: 2x2 grid, Desktop: vertical stack --> <!-- Mobile: 2x2 grid, Desktop: vertical stack -->
<div class="grid grid-cols-2 lg:grid-cols-1 gap-2 sm:gap-4"> <div class="grid grid-cols-2 lg:grid-cols-1 gap-2 sm:gap-4">
@@ -2744,6 +3034,7 @@
</button>
</div>
</div>
</div>
<!-- Schedules Widget -->
<div class="lg:col-span-2 order-2 lg:order-1">
@@ -3266,6 +3557,104 @@
</section>
<!-- END PAGE: SCHEDULES -->
<section id="page-alerts" class="page-section">
<div class="pt-20 sm:pt-24 pb-8 sm:pb-16 min-h-screen bg-gradient-to-b from-gray-900 to-black">
<div class="max-w-7xl mx-auto px-4 sm:px-6">
<div class="text-center mb-6 sm:mb-12">
<h1 class="text-2xl sm:text-3xl md:text-4xl font-bold mb-2 sm:mb-4 gradient-text">
<i class="fas fa-bell mr-2 sm:mr-3"></i>Centre d'Alertes
</h1>
<p class="text-sm sm:text-base text-gray-400 px-4">Tous les messages reçus (toasts) avec statut lu/non-lu, date et catégorie</p>
</div>
<div class="glass-card p-4 sm:p-6 fade-in">
<div class="flex flex-col sm:flex-row sm:items-center justify-between gap-3 mb-4 sm:mb-6">
<h3 class="text-lg sm:text-xl font-semibold">Messages</h3>
<div class="flex gap-2 sm:gap-3">
<button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-gray-700 rounded-lg hover:bg-gray-600 transition-colors text-sm touch-target" onclick="dashboard.refreshAlerts()">
<i class="fas fa-sync-alt mr-1 sm:mr-2"></i>
<span class="hidden sm:inline">Rafraîchir</span>
</button>
<button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-purple-600 rounded-lg hover:bg-purple-500 transition-colors text-sm touch-target" onclick="dashboard.markAllAlertsRead()">
<i class="fas fa-check-double mr-1 sm:mr-2"></i>
<span class="hidden sm:inline">Tout marquer lu</span>
</button>
</div>
</div>
<div id="alerts-container" class="space-y-2 max-h-[600px] overflow-y-auto">
</div>
</div>
</div>
</div>
</section>
<section id="page-configuration" class="page-section">
<div class="pt-20 sm:pt-24 pb-8 sm:pb-16 min-h-screen bg-gradient-to-b from-gray-900 to-black">
<div class="max-w-7xl mx-auto px-4 sm:px-6">
<div class="text-center mb-6 sm:mb-12">
<h1 class="text-2xl sm:text-3xl md:text-4xl font-bold mb-2 sm:mb-4 gradient-text">
<i class="fas fa-sliders-h mr-2 sm:mr-3"></i>Configuration
</h1>
<p class="text-sm sm:text-base text-gray-400 px-4">Paramètres et outils d'administration</p>
</div>
<div class="glass-card p-4 sm:p-6 fade-in">
<div class="flex flex-col sm:flex-row sm:items-center justify-between gap-3 mb-4 sm:mb-6">
<h3 class="text-lg sm:text-xl font-semibold">Outils de base</h3>
<div class="flex gap-2 sm:gap-3">
<button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-purple-600 rounded-lg hover:bg-purple-500 transition-colors text-sm touch-target" onclick="dashboard.installBaseToolsAllHosts()">
<i class="fas fa-tools mr-1 sm:mr-2"></i>
<span class="hidden sm:inline">Installer sur tous les hôtes</span>
<span class="sm:hidden">Installer</span>
</button>
</div>
</div>
<div class="p-3 bg-gray-800/40 border border-gray-700 rounded-lg text-sm text-gray-300">
<div class="flex gap-2">
<i class="fas fa-info-circle text-blue-400 mt-0.5"></i>
<div>
<p class="font-medium">Ce bouton exécute un builtin playbook d'installation</p>
<p class="text-gray-400">Installe notamment: coreutils, util-linux (lsblk), gawk/grep, python3, iproute2/procps, et optionnels: lvm2, lm-sensors, zfsutils-linux.</p>
</div>
</div>
</div>
</div>
<div class="glass-card p-4 sm:p-6 mt-6 fade-in">
<div class="flex flex-col sm:flex-row sm:items-center justify-between gap-3 mb-4 sm:mb-6">
<h3 class="text-lg sm:text-xl font-semibold">Collecte des métriques</h3>
<div class="flex gap-2 sm:gap-3">
<button type="button" id="metrics-collection-save" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-purple-600 rounded-lg hover:bg-purple-500 transition-colors text-sm touch-target">
<i class="fas fa-save mr-1 sm:mr-2"></i>
<span class="hidden sm:inline">Sauvegarder</span>
<span class="sm:hidden">OK</span>
</button>
</div>
</div>
<div class="space-y-3">
<div>
<label class="block text-sm text-gray-400 mb-1">Période de collecte</label>
<select id="metrics-collection-interval" class="w-full px-4 py-2 bg-gray-800 border border-gray-700 rounded-lg">
<option value="off">off</option>
<option value="5min">5 min</option>
<option value="15min">15 min</option>
<option value="30min">30 min</option>
<option value="1h">1 h</option>
<option value="6h">6 h</option>
<option value="12h">12 h</option>
<option value="24h">24 h</option>
</select>
<p class="text-xs text-gray-500 mt-1" id="metrics-collection-current">Chargement…</p>
</div>
</div>
</div>
</div>
</div>
</section>
<!-- ==================== PAGE: LOGS ==================== -->
<section id="page-logs" class="page-section">
<div id="logs" class="pt-20 sm:pt-24 pb-8 sm:pb-16 min-h-screen bg-gradient-to-b from-black to-gray-900">
@@ -3280,6 +3669,17 @@
<div class="glass-card p-4 sm:p-6 fade-in">
<div class="flex flex-col sm:flex-row sm:items-center justify-between gap-3 mb-4 sm:mb-6">
<h3 class="text-lg sm:text-xl font-semibold">Logs Récentes</h3>
<div class="flex flex-col sm:flex-row gap-2 sm:gap-3">
<div class="flex gap-2">
<button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-gray-800 border border-gray-700 rounded-lg hover:bg-gray-700 transition-colors text-sm touch-target" onclick="dashboard.setLogsView('server')">
<i class="fas fa-terminal mr-1 sm:mr-2"></i>
<span class="hidden sm:inline">Console</span>
</button>
<button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-gray-800 border border-gray-700 rounded-lg hover:bg-gray-700 transition-colors text-sm touch-target" onclick="dashboard.setLogsView('db')">
<i class="fas fa-database mr-1 sm:mr-2"></i>
<span class="hidden sm:inline">BD</span>
</button>
</div>
<div class="flex gap-2 sm:gap-3"> <div class="flex gap-2 sm:gap-3">
<button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-gray-700 rounded-lg hover:bg-gray-600 transition-colors text-sm touch-target" onclick="dashboard.clearLogs()"> <button type="button" class="flex-1 sm:flex-none px-3 sm:px-4 py-2 bg-gray-700 rounded-lg hover:bg-gray-600 transition-colors text-sm touch-target" onclick="dashboard.clearLogs()">
<i class="fas fa-trash mr-1 sm:mr-2"></i> <i class="fas fa-trash mr-1 sm:mr-2"></i>
@ -3291,6 +3691,7 @@
</button> </button>
</div> </div>
</div> </div>
</div>
<div id="logs-container" class="max-h-[400px] sm:max-h-[600px] overflow-y-auto text-xs sm:text-sm"> <div id="logs-container" class="max-h-[400px] sm:max-h-[600px] overflow-y-auto text-xs sm:text-sm">
<!-- Logs will be populated by JavaScript --> <!-- Logs will be populated by JavaScript -->
@@ -3308,12 +3709,20 @@
<!-- Header -->
<div class="text-center mb-6 sm:mb-12">
<h1 class="text-2xl sm:text-3xl md:text-4xl font-bold mb-2 sm:mb-4 gradient-text">
🚀 Guide d'Utilisation
</h1>
<p class="text-sm sm:text-base text-gray-400 max-w-2xl mx-auto px-4">
Bienvenue dans le guide officiel de votre <strong class="text-purple-400">Homelab Automation Dashboard</strong> !
Découvrez comment gérer et automatiser efficacement votre infrastructure grâce à cette solution puissante et centralisée.
</p>
<div class="mt-4 flex flex-col sm:flex-row gap-2 sm:gap-3 justify-center px-4">
<button type="button" class="px-4 py-2 bg-gray-800 border border-gray-700 rounded-lg hover:bg-gray-700 transition-colors text-sm touch-target" onclick="dashboard.downloadHelpDocumentation('md')">
<i class="fas fa-file-alt mr-2"></i>Télécharger (.md)
</button>
<button type="button" class="px-4 py-2 bg-gray-800 border border-gray-700 rounded-lg hover:bg-gray-700 transition-colors text-sm touch-target" onclick="dashboard.downloadHelpDocumentation('pdf')">
<i class="fas fa-file-pdf mr-2"></i>Télécharger (.pdf)
</button>
</div>
</div>
<!-- Layout avec Table des Matières -->
@@ -3324,31 +3733,31 @@
<div class="help-toc-title">Table des Matières</div>
<nav>
<a href="#help-quickstart" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-quickstart')">
<span class="mr-2">⚡️</span>Démarrage Rapide
</a>
<a href="#help-indicators" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-indicators')">
<span class="mr-2">❤️‍🩹</span>Indicateurs de Santé
</a>
<a href="#help-architecture" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-architecture')">
<span class="mr-2">🏗️</span>Architecture
</a>
<a href="#help-features" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-features')">
<span class="mr-2">⚙️</span>Fonctionnalités
</a>
<a href="#help-notifications" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-notifications')">
<span class="mr-2">🔔</span>Notifications
</a>
<a href="#help-playbooks" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-playbooks')">
<span class="mr-2">📖</span>Playbooks Ansible
</a>
<a href="#help-api" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-api')">
<span class="mr-2">🔗</span>Référence API
</a>
<a href="#help-troubleshooting" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-troubleshooting')">
<span class="mr-2">🛠️</span>Dépannage
</a>
<a href="#help-shortcuts" class="help-toc-item" onclick="scrollToHelpSection(event, 'help-shortcuts')">
<span class="mr-2">⌨️</span>Raccourcis & Astuces
</a>
</nav>
</div>
@@ -3360,7 +3769,7 @@
<!-- Quick Start -->
<div id="help-quickstart" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">⚡️</span>
Démarrage Rapide
</h2>
<div class="grid grid-cols-1 md:grid-cols-3 gap-6">
@@ -3394,7 +3803,7 @@
<!-- Indicateurs de Santé -->
<div id="help-indicators" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">❤️‍🩹</span>
Indicateurs de Santé des Hosts
</h2>
<p class="text-gray-400 mb-6">
@@ -3571,7 +3980,7 @@
<!-- Architecture -->
<div id="help-architecture" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">🏗️</span>
Architecture de la Solution
</h2>
<div class="grid grid-cols-1 lg:grid-cols-2 gap-8">
@@ -3625,8 +4034,8 @@ homelab-automation/
<!-- Fonctionnalités -->
<div id="help-features" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">⚙️</span>
Fonctionnalités Détaillées par Section
</h2>
<!-- Accordéon -->
@@ -3763,14 +4172,56 @@ homelab-automation/
</div>
</div>
</div>
<!-- Alertes -->
<div class="accordion-item" onclick="toggleAccordion(this)">
<div class="accordion-header">
<span class="flex items-center gap-3">
<i class="fas fa-bell text-red-400"></i>
<strong>Alertes</strong>
</span>
<i class="fas fa-chevron-down accordion-icon text-gray-400"></i>
</div>
<div class="accordion-content">
<div class="p-4">
<p class="text-gray-400 mb-4">Centre de messages pour les événements importants.</p>
<ul class="help-list text-sm">
<li><strong>Suivi:</strong> Consultez les alertes récentes (succès/échec, changements d'état).</li>
<li><strong>Lecture:</strong> Les alertes peuvent être marquées comme lues pour garder une boîte de réception propre.</li>
<li><strong>Notifications:</strong> Certaines alertes peuvent déclencher des notifications ntfy (si activé).</li>
</ul>
</div>
</div>
</div>
<!-- Configuration -->
<div class="accordion-item" onclick="toggleAccordion(this)">
<div class="accordion-header">
<span class="flex items-center gap-3">
<i class="fas fa-cog text-cyan-400"></i>
<strong>Configuration</strong>
</span>
<i class="fas fa-chevron-down accordion-icon text-gray-400"></i>
</div>
<div class="accordion-content">
<div class="p-4">
<p class="text-gray-400 mb-4">Paramètres de l'application et intégrations.</p>
<ul class="help-list text-sm">
<li><strong>Paramètres applicatifs:</strong> Options persistées (ex: collecte des métriques).</li>
<li><strong>Notifications:</strong> Configuration et test du service ntfy.</li>
<li><strong>Sécurité:</strong> Gestion du compte utilisateur (mot de passe) via l'écran utilisateur.</li>
</ul>
</div>
</div>
</div>
</div>
</div>
<!-- Notifications -->
<div id="help-notifications" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">🔔</span>
Système de Notifications (ntfy)
</h2>
<p class="text-gray-400 mb-6">
Restez informé de l'état de votre infrastructure grâce au système de notifications intégré basé sur <strong>ntfy</strong>.
@@ -3795,10 +4246,10 @@ homelab-automation/
Vous recevez des notifications pour les événements critiques :
</p>
<ul class="help-list text-sm">
<li><span class="mr-2">✅</span>Succès des Backups</li>
<li><span class="mr-2">❌</span>Échecs de Tâches</li>
<li><span class="mr-2">⚠️</span>Changements de Santé Host</li>
<li><span class="mr-2">🛠️</span>Fin de Bootstrap</li>
</ul>
</div>
</div>
@@ -3807,7 +4258,7 @@ homelab-automation/
<!-- Playbooks Ansible -->
<div id="help-playbooks" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">📖</span>
Playbooks Ansible Disponibles
</h2>
<div class="grid grid-cols-1 md:grid-cols-2 gap-6">
@@ -3857,11 +4308,11 @@ homelab-automation/
<!-- API Reference -->
<div id="help-api" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">🔗</span>
Référence API
</h2>
<p class="text-gray-400 mb-6">
L'API REST est accessible sur le port configuré. Authentification via header <span class="help-code">Authorization: Bearer &lt;token&gt;</span>.
</p>
<div class="overflow-x-auto">
<table class="w-full text-sm">
@@ -3911,7 +4362,7 @@ homelab-automation/
<!-- Troubleshooting -->
<div id="help-troubleshooting" class="glass-card p-8 mb-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">🛠️</span>
Dépannage
</h2>
<div class="space-y-4">
@@ -3969,7 +4420,7 @@ homelab-automation/
<!-- Keyboard Shortcuts -->
<div id="help-shortcuts" class="glass-card p-8 fade-in help-section-anchor">
<h2 class="help-section-title">
<span class="text-2xl mr-2">⌨️</span>
Raccourcis & Astuces
</h2>
<div class="grid grid-cols-1 md:grid-cols-2 gap-6">
@@ -4549,5 +5000,106 @@ homelab-automation/
})();
</script>
</div><!-- End main-content -->
<!-- Authentication handlers -->
<script>
// Toggle password visibility
function togglePasswordVisibility(inputId, btn) {
const input = document.getElementById(inputId);
const icon = btn.querySelector('i');
if (input.type === 'password') {
input.type = 'text';
icon.classList.remove('fa-eye');
icon.classList.add('fa-eye-slash');
} else {
input.type = 'password';
icon.classList.remove('fa-eye-slash');
icon.classList.add('fa-eye');
}
}
// Handle login form submission
async function handleLogin(event) {
event.preventDefault();
const username = document.getElementById('login-username').value;
const password = document.getElementById('login-password').value;
const errorEl = document.getElementById('login-error');
const errorText = document.getElementById('login-error-text');
const submitBtn = document.getElementById('login-submit-btn');
// Reset error state
errorEl.classList.add('hidden');
// Show loading state
submitBtn.disabled = true;
submitBtn.innerHTML = '<i class="fas fa-spinner fa-spin"></i><span>Connexion...</span>';
try {
const success = await dashboard.login(username, password);
if (!success) {
errorText.textContent = 'Nom d\'utilisateur ou mot de passe incorrect';
errorEl.classList.remove('hidden');
}
} catch (error) {
errorText.textContent = error.message || 'Erreur de connexion';
errorEl.classList.remove('hidden');
} finally {
submitBtn.disabled = false;
submitBtn.innerHTML = '<i class="fas fa-sign-in-alt"></i><span>Se connecter</span>';
}
}
// Handle setup form submission
async function handleSetup(event) {
event.preventDefault();
const username = document.getElementById('setup-username').value;
const password = document.getElementById('setup-password').value;
const passwordConfirm = document.getElementById('setup-password-confirm').value;
const email = document.getElementById('setup-email').value || null;
const displayName = document.getElementById('setup-display-name').value || null;
const errorEl = document.getElementById('setup-error');
const errorText = document.getElementById('setup-error-text');
const submitBtn = document.getElementById('setup-submit-btn');
// Reset error state
errorEl.classList.add('hidden');
// Validate password confirmation
if (password !== passwordConfirm) {
errorText.textContent = 'Les mots de passe ne correspondent pas';
errorEl.classList.remove('hidden');
return;
}
// Show loading state
submitBtn.disabled = true;
submitBtn.innerHTML = '<i class="fas fa-spinner fa-spin"></i><span>Création...</span>';
try {
const success = await dashboard.setupAdmin(username, password, email, displayName);
if (!success) {
errorText.textContent = 'Erreur lors de la création du compte';
errorEl.classList.remove('hidden');
}
} catch (error) {
errorText.textContent = error.message || 'Erreur de configuration';
errorEl.classList.remove('hidden');
} finally {
submitBtn.disabled = false;
submitBtn.innerHTML = '<i class="fas fa-user-plus"></i><span>Créer le compte</span>';
}
}
// Logout function (exposed globally)
function handleLogout() {
if (dashboard) {
dashboard.logout();
}
}
</script>
</body>
</html>
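
From an API client, the resulting login flow looks roughly like the sketch below. The `/api/auth/login` endpoint name and the `access_token` response field are assumptions based on the `auth` router prefix; this diff does not show the route bodies.

```python
# Hedged client-side sketch of the JWT flow (endpoint and payload assumed).
import httpx

BASE = "http://localhost:8000"

resp = httpx.post(f"{BASE}/api/auth/login",
                  json={"username": "admin", "password": "change-me"})
resp.raise_for_status()
token = resp.json()["access_token"]  # assumed response field

# Authenticated request using the Bearer token
hosts = httpx.get(f"{BASE}/api/hosts",
                  headers={"Authorization": f"Bearer {token}"})
print(hosts.json())
```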

(File diff suppressed because it is too large)

app/models/__init__.py
@@ -5,6 +5,10 @@ from .task import Task
from .schedule import Schedule
from .schedule_run import ScheduleRun
from .log import Log
from .user import User, UserRole
from .host_metrics import HostMetrics
from .alert import Alert
from .app_setting import AppSetting
__all__ = [
"Base",
@@ -14,4 +18,9 @@ __all__ = [
"Schedule",
"ScheduleRun",
"Log",
"Alert",
"User",
"UserRole",
"HostMetrics",
"AppSetting",
]

app/models/alert.py (new file, 36 lines)
@@ -0,0 +1,36 @@
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import DateTime, ForeignKey, Integer, JSON, String, Text, Index
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy.sql import func
from .database import Base
class Alert(Base):
__tablename__ = "alerts"
__table_args__ = (
Index("idx_alerts_created_at", "created_at"),
Index("idx_alerts_user_id", "user_id"),
Index("idx_alerts_category", "category"),
Index("idx_alerts_read_at", "read_at"),
)
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
user_id: Mapped[Optional[int]] = mapped_column(Integer, ForeignKey("users.id", ondelete="SET NULL"))
category: Mapped[str] = mapped_column(String(50), nullable=False)
level: Mapped[Optional[str]] = mapped_column(String(20))
title: Mapped[Optional[str]] = mapped_column(String(255))
message: Mapped[str] = mapped_column(Text, nullable=False)
source: Mapped[Optional[str]] = mapped_column(String(50))
details: Mapped[Optional[dict]] = mapped_column(JSON)
read_at: Mapped[Optional[datetime]] = mapped_column(DateTime(timezone=True), nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now())
def __repr__(self) -> str:
return f"<Alert id={self.id} category={self.category} user_id={self.user_id} read={self.read_at is not None}>"

app/models/app_setting.py (new file, 23 lines)
@@ -0,0 +1,23 @@
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import DateTime, String, Text, Index
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy.sql import func
from .database import Base
class AppSetting(Base):
__tablename__ = "app_settings"
__table_args__ = (
Index("idx_app_settings_updated_at", "updated_at"),
)
key: Mapped[str] = mapped_column(String(100), primary_key=True)
value: Mapped[Optional[str]] = mapped_column(Text)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now())
updated_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now())

app/models/database.py
@@ -112,7 +112,7 @@ async def get_db() -> AsyncGenerator[AsyncSession, None]:
async def init_db() -> None:
"""Create all tables (mostly for dev/tests; migrations should be handled by Alembic)."""
from . import host, task, schedule, schedule_run, log, alert, app_setting  # noqa: F401
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.create_all)

app/models/host.py
@@ -27,7 +27,9 @@ class Host(Base):
bootstrap_statuses: Mapped[List["BootstrapStatus"]] = relationship(
"BootstrapStatus", back_populates="host", cascade="all, delete-orphan"
)
- logs: Mapped[List["Log"]] = relationship("Log", back_populates="host")
metrics: Mapped[List["HostMetrics"]] = relationship(
"HostMetrics", back_populates="host", cascade="all, delete-orphan"
)
def __repr__(self) -> str:  # pragma: no cover - debug helper
return f"<Host id={self.id} name={self.name} ip={self.ip_address}>"

app/models/host_metrics.py (new file, 85 lines)
@@ -0,0 +1,85 @@
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import DateTime, Float, ForeignKey, Integer, JSON, String, Text, Index
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
from .database import Base
class HostMetrics(Base):
"""Stocke les métriques collectées par les builtin playbooks pour chaque hôte."""
__tablename__ = "host_metrics"
__table_args__ = (
Index("idx_host_metrics_host_id", "host_id"),
Index("idx_host_metrics_collected_at", "collected_at"),
Index("idx_host_metrics_metric_type", "metric_type"),
)
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
host_id: Mapped[str] = mapped_column(String, ForeignKey("hosts.id", ondelete="CASCADE"), nullable=False)
metric_type: Mapped[str] = mapped_column(String(50), nullable=False) # 'system_info', 'disk_usage', 'memory', etc.
# Métriques CPU
cpu_count: Mapped[Optional[int]] = mapped_column(Integer)
cpu_model: Mapped[Optional[str]] = mapped_column(String(200))
cpu_cores: Mapped[Optional[int]] = mapped_column(Integer)
cpu_threads: Mapped[Optional[int]] = mapped_column(Integer)
cpu_threads_per_core: Mapped[Optional[int]] = mapped_column(Integer)
cpu_sockets: Mapped[Optional[int]] = mapped_column(Integer)
cpu_mhz: Mapped[Optional[float]] = mapped_column(Float)
cpu_max_mhz: Mapped[Optional[float]] = mapped_column(Float)
cpu_min_mhz: Mapped[Optional[float]] = mapped_column(Float)
cpu_load_1m: Mapped[Optional[float]] = mapped_column(Float)
cpu_load_5m: Mapped[Optional[float]] = mapped_column(Float)
cpu_load_15m: Mapped[Optional[float]] = mapped_column(Float)
cpu_usage_percent: Mapped[Optional[float]] = mapped_column(Float)
cpu_temperature: Mapped[Optional[float]] = mapped_column(Float)
# Métriques mémoire
memory_total_mb: Mapped[Optional[int]] = mapped_column(Integer)
memory_used_mb: Mapped[Optional[int]] = mapped_column(Integer)
memory_free_mb: Mapped[Optional[int]] = mapped_column(Integer)
memory_usage_percent: Mapped[Optional[float]] = mapped_column(Float)
swap_total_mb: Mapped[Optional[int]] = mapped_column(Integer)
swap_used_mb: Mapped[Optional[int]] = mapped_column(Integer)
swap_usage_percent: Mapped[Optional[float]] = mapped_column(Float)
# Métriques disque (stockées en JSON pour flexibilité - plusieurs disques)
disk_info: Mapped[Optional[object]] = mapped_column(JSON) # Liste des points de montage avec usage
disk_devices: Mapped[Optional[object]] = mapped_column(JSON) # Liste des disques + partitions (layout)
disk_root_total_gb: Mapped[Optional[float]] = mapped_column(Float)
disk_root_used_gb: Mapped[Optional[float]] = mapped_column(Float)
disk_root_usage_percent: Mapped[Optional[float]] = mapped_column(Float)
# Storage stacks (JSON)
lvm_info: Mapped[Optional[object]] = mapped_column(JSON)
zfs_info: Mapped[Optional[object]] = mapped_column(JSON)
# Informations système
os_name: Mapped[Optional[str]] = mapped_column(String(100))
os_version: Mapped[Optional[str]] = mapped_column(String(100))
kernel_version: Mapped[Optional[str]] = mapped_column(String(100))
hostname: Mapped[Optional[str]] = mapped_column(String(200))
uptime_seconds: Mapped[Optional[int]] = mapped_column(Integer)
uptime_human: Mapped[Optional[str]] = mapped_column(String(100))
# Réseau (stocké en JSON pour flexibilité)
network_info: Mapped[Optional[dict]] = mapped_column(JSON)
# Données brutes et métadonnées
raw_data: Mapped[Optional[dict]] = mapped_column(JSON) # Données brutes du playbook
collection_source: Mapped[Optional[str]] = mapped_column(String(100)) # Nom du builtin playbook
collection_duration_ms: Mapped[Optional[int]] = mapped_column(Integer)
error_message: Mapped[Optional[str]] = mapped_column(Text)
collected_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now())
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now())
host: Mapped["Host"] = relationship("Host", back_populates="metrics")
def __repr__(self) -> str:
return f"<HostMetrics id={self.id} host_id={self.host_id} type={self.metric_type}>"

app/models/log.py
@@ -23,14 +23,11 @@ class Log(Base):
source: Mapped[Optional[str]] = mapped_column(String)
message: Mapped[str] = mapped_column(Text, nullable=False)
details: Mapped[Optional[dict]] = mapped_column(JSON)
host_id: Mapped[Optional[str]] = mapped_column(String, nullable=True)
task_id: Mapped[Optional[str]] = mapped_column(String, nullable=True)
schedule_id: Mapped[Optional[str]] = mapped_column(String, nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now())
- host: Mapped[Optional["Host"]] = relationship("Host", back_populates="logs")
- task: Mapped[Optional["Task"]] = relationship("Task", back_populates="logs")
- schedule: Mapped[Optional["Schedule"]] = relationship("Schedule", back_populates="logs")
def __repr__(self) -> str:  # pragma: no cover - debug helper
return f"<Log id={self.id} level={self.level} source={self.source}>"

app/models/schedule.py
@@ -48,7 +48,6 @@ class Schedule(Base):
runs: Mapped[List["ScheduleRun"]] = relationship(
"ScheduleRun", back_populates="schedule", cascade="all, delete-orphan"
)
- logs: Mapped[List["Log"]] = relationship("Log", back_populates="schedule")
def __repr__(self) -> str:  # pragma: no cover - debug helper
return f"<Schedule id={self.id} name={self.name} target={self.target}>"

app/models/task.py
@@ -25,7 +25,6 @@ class Task(Base):
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, server_default=func.now())
schedule_runs: Mapped[list["ScheduleRun"]] = relationship("ScheduleRun", back_populates="task")
- logs: Mapped[list["Log"]] = relationship("Log", back_populates="task")
def __repr__(self) -> str:  # pragma: no cover - debug helper
return f"<Task id={self.id} action={self.action} target={self.target} status={self.status}>"

app/models/user.py (new file, 83 lines)
@@ -0,0 +1,83 @@
"""User model for authentication and authorization.
Designed for single-user now but prepared for multi-user with roles in the future.
"""
from __future__ import annotations
from datetime import datetime
from enum import Enum
from typing import Optional
from sqlalchemy import Boolean, DateTime, String, Text, text
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy.sql import func
from .database import Base
class UserRole(str, Enum):
"""User roles for authorization.
Current implementation: single admin user.
Future: can be extended with more granular roles.
"""
ADMIN = "admin"
OPERATOR = "operator" # Future: can execute tasks but not manage users
VIEWER = "viewer" # Future: read-only access
class User(Base):
"""User model for authentication.
Fields prepared for future multi-user support:
- role: determines access level
- is_active: allows disabling users without deletion
- last_login: track user activity
- password_changed_at: for password rotation policies
"""
__tablename__ = "users"
id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
username: Mapped[str] = mapped_column(String(50), unique=True, nullable=False, index=True)
email: Mapped[Optional[str]] = mapped_column(String(255), unique=True, nullable=True)
hashed_password: Mapped[str] = mapped_column(String(255), nullable=False)
# Role-based access control (prepared for future)
role: Mapped[str] = mapped_column(
String(20),
nullable=False,
server_default=text("'admin'") # Default to admin for single-user setup
)
# Account status
is_active: Mapped[bool] = mapped_column(Boolean, nullable=False, server_default=text("1"))
is_superuser: Mapped[bool] = mapped_column(Boolean, nullable=False, server_default=text("0"))
# Display name (optional)
display_name: Mapped[Optional[str]] = mapped_column(String(100), nullable=True)
# Timestamps
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True),
nullable=False,
server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True),
nullable=False,
server_default=func.now(),
onupdate=func.now()
)
last_login: Mapped[Optional[datetime]] = mapped_column(DateTime(timezone=True), nullable=True)
password_changed_at: Mapped[Optional[datetime]] = mapped_column(DateTime(timezone=True), nullable=True)
# Soft delete support
deleted_at: Mapped[Optional[datetime]] = mapped_column(DateTime(timezone=True), nullable=True)
def __repr__(self) -> str:
return f"<User id={self.id} username={self.username} role={self.role}>"
@property
def is_admin(self) -> bool:
"""Check if user has admin privileges."""
return self.role == UserRole.ADMIN.value or self.is_superuser
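
A minimal sketch of how the `is_admin` helper behaves, using a transient (non-persisted) instance with a placeholder hash; the values are illustrative only:

```python
from app.models.user import User, UserRole

# Transient instance, never flushed to the DB; hashed_password is a placeholder.
user = User(username="alice", hashed_password="<bcrypt-hash>", role=UserRole.ADMIN.value)
print(user.is_admin)   # True: role matches UserRole.ADMIN

user.role = UserRole.VIEWER.value
user.is_superuser = True
print(user.is_admin)   # Still True: is_superuser overrides the role check
```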

View File

@ -1,6 +1,8 @@
fastapi>=0.115.0
uvicorn[standard]>=0.32.0
pydantic>=2.12.0
pydantic[email]>=2.12.0
pydantic-settings>=2.0.0
python-multipart>=0.0.9
PyYAML>=6.0.2
websockets>=14.0
@ -16,3 +18,8 @@ alembic>=1.12.0
aiosqlite>=0.19.0
pytest>=7.0.0
pytest-asyncio>=0.21.0
# Authentication
python-jose[cryptography]>=3.3.0
passlib[bcrypt]>=1.7.4
reportlab>=4.0.0
pillow>=10.0.0
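
A sketch of what the two new auth dependencies are used for; the HS256 algorithm, secret, and expiry below are illustrative assumptions, not the app's actual settings:

```python
from datetime import datetime, timedelta, timezone

from jose import jwt                      # python-jose[cryptography]
from passlib.context import CryptContext  # passlib[bcrypt]

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

hashed = pwd_context.hash("s3cret")            # stored in users.hashed_password
assert pwd_context.verify("s3cret", hashed)    # login-time check

claims = {"sub": "alice", "exp": datetime.now(timezone.utc) + timedelta(minutes=30)}
token = jwt.encode(claims, "change-me", algorithm="HS256")
print(jwt.decode(token, "change-me", algorithms=["HS256"])["sub"])  # alice
```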

64
app/routes/__init__.py Normal file
View File

@ -0,0 +1,64 @@
"""
Routes API pour l'application Homelab Automation.
Ce module agrège tous les routers FastAPI pour une inclusion
dans l'application principale.
"""
from fastapi import APIRouter
from app.routes.auth import router as auth_router
from app.routes.hosts import router as hosts_router
from app.routes.groups import router as groups_router
from app.routes.tasks import router as tasks_router
from app.routes.logs import router as logs_router
from app.routes.ansible import router as ansible_router
from app.routes.playbooks import router as playbooks_router
from app.routes.schedules import router as schedules_router
from app.routes.adhoc import router as adhoc_router
from app.routes.bootstrap import router as bootstrap_router
from app.routes.health import router as health_router
from app.routes.notifications import router as notifications_router
from app.routes.help import router as help_router
from app.routes.metrics import router as metrics_router
from app.routes.builtin_playbooks import router as builtin_playbooks_router
from app.routes.server import router as server_router
from app.routes.alerts import router as alerts_router
# Router principal qui agrège tous les sous-routers
api_router = APIRouter()
# Inclure tous les routers avec leurs préfixes
api_router.include_router(auth_router, prefix="/auth", tags=["Auth"])
api_router.include_router(hosts_router, prefix="/hosts", tags=["Hosts"])
api_router.include_router(groups_router, prefix="/groups", tags=["Groups"])
api_router.include_router(tasks_router, prefix="/tasks", tags=["Tasks"])
api_router.include_router(logs_router, prefix="/logs", tags=["Logs"])
api_router.include_router(ansible_router, prefix="/ansible", tags=["Ansible"])
api_router.include_router(playbooks_router, prefix="/playbooks", tags=["Playbooks"])
api_router.include_router(schedules_router, prefix="/schedules", tags=["Schedules"])
api_router.include_router(adhoc_router, prefix="/adhoc", tags=["Ad-hoc"])
api_router.include_router(bootstrap_router, prefix="/bootstrap", tags=["Bootstrap"])
api_router.include_router(health_router, prefix="/health", tags=["Health"])
api_router.include_router(notifications_router, prefix="/notifications", tags=["Notifications"])
api_router.include_router(help_router, prefix="/help", tags=["Help"])
api_router.include_router(metrics_router, prefix="/metrics", tags=["Metrics"])
api_router.include_router(builtin_playbooks_router, prefix="/builtin-playbooks", tags=["Builtin Playbooks"])
api_router.include_router(server_router, prefix="/server", tags=["Server"])
api_router.include_router(alerts_router, prefix="/alerts", tags=["Alerts"])
__all__ = [
"api_router",
"hosts_router",
"groups_router",
"tasks_router",
"logs_router",
"ansible_router",
"playbooks_router",
"schedules_router",
"adhoc_router",
"bootstrap_router",
"health_router",
"notifications_router",
"help_router",
]
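
How the aggregated router is typically mounted; the `/api` prefix is an assumption here, the actual prefix lives in the application entry point:

```python
from fastapi import FastAPI

from app.routes import api_router

app = FastAPI(title="Homelab Automation API")
app.include_router(api_router, prefix="/api")
```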

111
app/routes/adhoc.py Normal file
View File

@ -0,0 +1,111 @@
"""
Routes API pour l'historique des commandes ad-hoc.
"""
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Request
from app.core.dependencies import verify_api_key
from app.services import adhoc_history_service
router = APIRouter()
@router.get("/history")
async def get_adhoc_history(
category: Optional[str] = None,
search: Optional[str] = None,
limit: int = 50,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère l'historique des commandes ad-hoc."""
commands = await adhoc_history_service.get_commands(
category=category,
search=search,
limit=limit,
)
return {
"commands": [cmd.dict() for cmd in commands],
"count": len(commands)
}
@router.get("/categories")
async def get_adhoc_categories(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère la liste des catégories de commandes ad-hoc."""
categories = await adhoc_history_service.get_categories()
return {"categories": [cat.dict() for cat in categories]}
@router.post("/categories")
async def create_adhoc_category(
name: str,
description: Optional[str] = None,
color: str = "#7c3aed",
icon: str = "fa-folder",
api_key_valid: bool = Depends(verify_api_key)
):
"""Crée une nouvelle catégorie de commandes ad-hoc."""
category = await adhoc_history_service.add_category(name, description, color, icon)
return {"category": category.dict(), "message": "Catégorie créée"}
@router.put("/categories/{category_name}")
async def update_adhoc_category(
category_name: str,
request: Request,
api_key_valid: bool = Depends(verify_api_key)
):
"""Met à jour une catégorie existante."""
try:
data = await request.json()
new_name = data.get("name", category_name)
description = data.get("description", "")
color = data.get("color", "#7c3aed")
icon = data.get("icon", "fa-folder")
success = await adhoc_history_service.update_category(category_name, new_name, description, color, icon)
if not success:
raise HTTPException(status_code=404, detail="Catégorie non trouvée")
return {"message": "Catégorie mise à jour", "category": new_name}
except HTTPException:
# Ne pas convertir le 404 ci-dessus en 400 générique
raise
except Exception as e:
raise HTTPException(status_code=400, detail=str(e))
@router.delete("/categories/{category_name}")
async def delete_adhoc_category(
category_name: str,
api_key_valid: bool = Depends(verify_api_key)
):
"""Supprime une catégorie et déplace ses commandes vers 'default'."""
if category_name == "default":
raise HTTPException(status_code=400, detail="La catégorie 'default' ne peut pas être supprimée")
success = await adhoc_history_service.delete_category(category_name)
if not success:
raise HTTPException(status_code=404, detail="Catégorie non trouvée")
return {"message": "Catégorie supprimée", "category": category_name}
@router.put("/history/{command_id}/category")
async def update_adhoc_command_category(
command_id: str,
category: str,
description: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key)
):
"""Met à jour la catégorie d'une commande dans l'historique."""
success = await adhoc_history_service.update_command_category(command_id, category, description)
if not success:
raise HTTPException(status_code=404, detail="Commande non trouvée")
return {"message": "Catégorie mise à jour", "command_id": command_id, "category": category}
@router.delete("/history/{command_id}")
async def delete_adhoc_command(command_id: str, api_key_valid: bool = Depends(verify_api_key)):
"""Supprime une commande de l'historique."""
success = await adhoc_history_service.delete_command(command_id)
if not success:
raise HTTPException(status_code=404, detail="Commande non trouvée")
return {"message": "Commande supprimée", "command_id": command_id}

158
app/routes/alerts.py Normal file
View File

@ -0,0 +1,158 @@
"""
Routes API pour les alertes.
"""
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.dependencies import get_db, verify_api_key
from app.crud.alert import AlertRepository
from app.services import ws_manager
router = APIRouter()
class AlertCreate(BaseModel):
category: str = "general"
level: Optional[str] = "info"
title: Optional[str] = None
message: str
source: Optional[str] = None
@router.post("")
async def create_alert(
alert_data: AlertCreate,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Crée une nouvelle alerte."""
repo = AlertRepository(db_session)
alert = await repo.create(
category=alert_data.category,
level=alert_data.level,
title=alert_data.title,
message=alert_data.message,
source=alert_data.source,
)
await db_session.commit()
await ws_manager.broadcast({
"type": "alert_created",
"data": {
"id": alert.id,
"title": alert.title,
"message": alert.message,
"level": alert.level,
}
})
return {
"id": alert.id,
"title": alert.title,
"message": alert.message,
"level": alert.level,
"category": alert.category,
"created_at": alert.created_at,
}
@router.get("")
async def get_alerts(
limit: int = 50,
offset: int = 0,
unread_only: bool = False,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Récupère les alertes avec pagination."""
repo = AlertRepository(db_session)
alerts = await repo.list(limit=limit, offset=offset, unread_only=unread_only)
return {
"alerts": [
{
"id": a.id,
"title": a.title,
"message": a.message,
"level": a.level,
"source": a.source,
"category": a.category,
"read": a.read_at is not None,
"read_at": a.read_at,
"created_at": a.created_at,
}
for a in alerts
],
"count": len(alerts)
}
@router.get("/unread-count")
async def get_unread_count(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Récupère le nombre d'alertes non lues."""
repo = AlertRepository(db_session)
count = await repo.count_unread()
return {"unread": count}
@router.post("/{alert_id}/read")
async def mark_as_read(
alert_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Marque une alerte comme lue."""
repo = AlertRepository(db_session)
alert = await repo.get(alert_id)
if not alert:
raise HTTPException(status_code=404, detail="Alerte non trouvée")
await repo.mark_as_read(alert_id)
await db_session.commit()
return {"message": "Alerte marquée comme lue", "id": alert_id}
@router.post("/mark-all-read")
async def mark_all_as_read(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Marque toutes les alertes comme lues."""
repo = AlertRepository(db_session)
count = await repo.mark_all_as_read()
await db_session.commit()
await ws_manager.broadcast({
"type": "alerts_cleared",
"data": {"count": count}
})
return {"message": f"{count} alerte(s) marquée(s) comme lue(s)"}
@router.delete("/{alert_id}")
async def delete_alert(
alert_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Supprime une alerte."""
repo = AlertRepository(db_session)
alert = await repo.get(alert_id)
if not alert:
raise HTTPException(status_code=404, detail="Alerte non trouvée")
await repo.delete(alert_id)
await db_session.commit()
return {"message": "Alerte supprimée", "id": alert_id}

519
app/routes/ansible.py Normal file
View File

@ -0,0 +1,519 @@
"""
Routes API pour l'exécution Ansible.
"""
import uuid
import subprocess
from datetime import datetime, timezone
from time import perf_counter
from typing import Optional, Dict, Any
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.config import settings
from app.core.dependencies import get_db, verify_api_key
from app.crud.task import TaskRepository
from app.schemas.ansible import AnsibleExecutionRequest, AdHocCommandRequest, AdHocCommandResult
from app.schemas.task_api import Task
from app.schemas.common import LogEntry
from app.services import (
ansible_service,
ws_manager,
notification_service,
adhoc_history_service,
db,
)
from app.services.task_log_service import TaskLogService
from app.utils.ssh_utils import find_ssh_private_key
router = APIRouter()
# Instance du service de logs de tâches
task_log_service = TaskLogService(settings.tasks_logs_dir)
@router.get("/playbooks")
async def get_ansible_playbooks(
target: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key)
):
"""Liste les playbooks Ansible disponibles."""
if target:
playbooks = ansible_service.get_compatible_playbooks(target)
else:
playbooks = ansible_service.get_playbooks()
return {
"playbooks": playbooks,
"categories": ansible_service.get_playbook_categories(),
"ansible_dir": str(settings.ansible_dir),
"filter": target
}
@router.get("/inventory")
async def get_ansible_inventory(
group: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère l'inventaire Ansible."""
return {
"hosts": [h.dict() for h in ansible_service.get_hosts_from_inventory(group_filter=group)],
"groups": ansible_service.get_groups(),
"inventory_path": str(ansible_service.inventory_path),
"filter": group
}
@router.get("/groups")
async def get_ansible_groups(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère la liste des groupes Ansible."""
return {"groups": ansible_service.get_groups()}
@router.get("/ssh-config")
async def get_ssh_config(api_key_valid: bool = Depends(verify_api_key)):
"""Diagnostic de la configuration SSH."""
from pathlib import Path
import shutil
ssh_key_path = Path(settings.ssh_key_path)
ssh_dir = ssh_key_path.parent
available_files = []
if ssh_dir.exists():
available_files = [f.name for f in ssh_dir.iterdir()]
private_key_exists = ssh_key_path.exists()
public_key_exists = Path(settings.ssh_key_path + ".pub").exists()
pub_keys_found = []
for key_type in ["id_rsa", "id_ed25519", "id_ecdsa", "id_dsa"]:
key_path = ssh_dir / f"{key_type}.pub"
if key_path.exists():
pub_keys_found.append(str(key_path))
active_private_key = find_ssh_private_key()
return {
"ssh_key_path": settings.ssh_key_path,
"ssh_dir": str(ssh_dir),
"ssh_dir_exists": ssh_dir.exists(),
"private_key_exists": private_key_exists,
"public_key_exists": public_key_exists,
"available_files": available_files,
"public_keys_found": pub_keys_found,
"active_private_key": active_private_key,
"ssh_user": settings.ssh_user,
"sshpass_available": shutil.which("sshpass") is not None,
}
@router.post("/execute")
async def execute_ansible_playbook(
request: AnsibleExecutionRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Exécute un playbook Ansible."""
start_time_dt = datetime.now(timezone.utc)
# Valider la compatibilité playbook-target
playbooks = ansible_service.get_playbooks()
playbook_info = next(
(pb for pb in playbooks if pb['filename'] == request.playbook or pb['name'] == request.playbook.replace('.yml', '').replace('.yaml', '')),
None
)
if playbook_info:
playbook_hosts = playbook_info.get('hosts', 'all')
if not ansible_service.is_target_compatible_with_playbook(request.target, playbook_hosts):
raise HTTPException(
status_code=400,
detail=f"Le playbook '{request.playbook}' (hosts: {playbook_hosts}) n'est pas compatible avec la cible '{request.target}'."
)
# Créer une tâche en BD
task_repo = TaskRepository(db_session)
task_id = f"pb_{uuid.uuid4().hex[:12]}"
playbook_name = request.playbook.replace('.yml', '').replace('-', ' ').title()
db_task = await task_repo.create(
id=task_id,
action=f"playbook:{request.playbook}",
target=request.target,
playbook=request.playbook,
status="running",
)
await task_repo.update(db_task, started_at=start_time_dt)
await db_session.commit()
# Créer aussi en mémoire
task = Task(
id=task_id,
name=f"Playbook: {playbook_name}",
host=request.target,
status="running",
progress=0,
start_time=start_time_dt
)
db.tasks.insert(0, task)
try:
result = await ansible_service.execute_playbook(
playbook=request.playbook,
target=request.target,
extra_vars=request.extra_vars,
check_mode=request.check_mode,
verbose=request.verbose
)
# Mettre à jour la tâche
task.status = "completed" if result["success"] else "failed"
task.progress = 100
task.end_time = datetime.now(timezone.utc)
task.duration = f"{result.get('execution_time', 0):.1f}s"
task.output = result.get("stdout", "")
task.error = result.get("stderr", "") if not result["success"] else None
# Ajouter un log
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="INFO" if result["success"] else "ERROR",
message=f"Playbook {request.playbook} exécuté sur {request.target}: {'succès' if result['success'] else 'échec'}",
source="ansible",
host=request.target
)
db.logs.insert(0, log_entry)
# Sauvegarder le log markdown
try:
log_path = task_log_service.save_task_log(
task=task,
output=result.get("stdout", ""),
error=result.get("stderr", "")
)
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception as log_error:
print(f"Erreur sauvegarde log markdown: {log_error}")
await ws_manager.broadcast({
"type": "ansible_execution",
"data": result
})
# Mettre à jour la BD
await task_repo.update(
db_task,
status=task.status,
completed_at=task.end_time,
error_message=task.error,
result_data={"output": result.get("stdout", "")[:5000]}
)
await db_session.commit()
# Notification
if result["success"]:
await notification_service.notify_task_completed(
task_name=task.name,
target=request.target,
duration=task.duration
)
else:
await notification_service.notify_task_failed(
task_name=task.name,
target=request.target,
error=result.get("stderr", "Erreur inconnue")[:200]
)
result["task_id"] = task_id
return result
except FileNotFoundError as e:
task.status = "failed"
task.end_time = datetime.now(timezone.utc)
task.error = str(e)
try:
log_path = task_log_service.save_task_log(task=task, error=str(e))
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception:
pass
await task_repo.update(db_task, status="failed", completed_at=task.end_time, error_message=str(e))
await db_session.commit()
await notification_service.notify_task_failed(task_name=task.name, target=request.target, error=str(e)[:200])
raise HTTPException(status_code=404, detail=str(e))
except Exception as e:
task.status = "failed"
task.end_time = datetime.now(timezone.utc)
task.error = str(e)
try:
log_path = task_log_service.save_task_log(task=task, error=str(e))
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception:
pass
await task_repo.update(db_task, status="failed", completed_at=task.end_time, error_message=str(e))
await db_session.commit()
await notification_service.notify_task_failed(task_name=task.name, target=request.target, error=str(e)[:200])
raise HTTPException(status_code=500, detail=str(e))
@router.post("/adhoc", response_model=AdHocCommandResult)
async def execute_adhoc_command(
request: AdHocCommandRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Exécute une commande ad-hoc Ansible."""
start_time_perf = perf_counter()
start_time_dt = datetime.now(timezone.utc)
# Créer une tâche en BD
task_repo = TaskRepository(db_session)
task_id = f"adhoc_{uuid.uuid4().hex[:12]}"
task_name = f"Ad-hoc: {request.command[:40]}{'...' if len(request.command) > 40 else ''}"
db_task = await task_repo.create(
id=task_id,
action=f"adhoc:{request.module}",
target=request.target,
playbook=None,
status="running",
)
await task_repo.update(db_task, started_at=start_time_dt)
await db_session.commit()
# Créer aussi en mémoire
task = Task(
id=task_id,
name=task_name,
host=request.target,
status="running",
progress=0,
start_time=start_time_dt
)
db.tasks.insert(0, task)
# Construire la commande ansible
ansible_cmd = [
"ansible",
request.target,
"-i", str(settings.ansible_dir / "inventory" / "hosts.yml"),
"-m", request.module,
"-a", request.command,
"--timeout", str(request.timeout),
]
if request.become:
ansible_cmd.append("--become")
private_key = find_ssh_private_key()
if private_key:
ansible_cmd.extend(["--private-key", private_key])
if settings.ssh_user:
ansible_cmd.extend(["-u", settings.ssh_user])
try:
result = subprocess.run(
ansible_cmd,
capture_output=True,
text=True,
timeout=request.timeout + 10,
cwd=str(settings.ansible_dir)
)
duration = perf_counter() - start_time_perf
success = result.returncode == 0
task.status = "completed" if success else "failed"
task.progress = 100
task.end_time = datetime.now(timezone.utc)
task.duration = f"{round(duration, 2)}s"
task.output = result.stdout
task.error = result.stderr if result.stderr else None
log_path = task_log_service.save_task_log(task, output=result.stdout, error=result.stderr or "", source_type='adhoc')
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="INFO" if success else "WARN",
message=f"Ad-hoc [{request.module}] sur {request.target}: {request.command[:50]}{'...' if len(request.command) > 50 else ''}",
source="ansible-adhoc",
host=request.target
)
db.logs.insert(0, log_entry)
await ws_manager.broadcast({
"type": "adhoc_executed",
"data": {
"target": request.target,
"command": request.command,
"success": success,
"task_id": task_id
}
})
await adhoc_history_service.add_command(
command=request.command,
target=request.target,
module=request.module,
become=request.become,
category=request.category or "default"
)
await task_repo.update(
db_task,
status=task.status,
completed_at=task.end_time,
error_message=task.error,
result_data={"output": result.stdout[:5000] if result.stdout else None}
)
await db_session.commit()
if success:
await notification_service.notify_task_completed(
task_name=task.name,
target=request.target,
duration=task.duration
)
else:
await notification_service.notify_task_failed(
task_name=task.name,
target=request.target,
error=(result.stderr or "Erreur inconnue")[:200]
)
return AdHocCommandResult(
target=request.target,
command=request.command,
success=success,
return_code=result.returncode,
stdout=result.stdout,
stderr=result.stderr if result.stderr else None,
duration=round(duration, 2)
)
except subprocess.TimeoutExpired:
duration = perf_counter() - start_time_perf
task.status = "failed"
task.progress = 100
task.end_time = datetime.now(timezone.utc)
task.duration = f"{round(duration, 2)}s"
task.error = f"Timeout après {request.timeout} secondes"
try:
log_path = task_log_service.save_task_log(task, error=task.error, source_type='adhoc')
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception:
pass
await task_repo.update(db_task, status="failed", completed_at=task.end_time, error_message=task.error)
await db_session.commit()
await notification_service.notify_task_failed(task_name=task.name, target=request.target, error=task.error[:200])
return AdHocCommandResult(
target=request.target,
command=request.command,
success=False,
return_code=-1,
stdout="",
stderr=f"Timeout après {request.timeout} secondes",
duration=round(duration, 2)
)
except FileNotFoundError:
duration = perf_counter() - start_time_perf
error_msg = "ansible non trouvé. Vérifiez que Ansible est installé."
task.status = "failed"
task.progress = 100
task.end_time = datetime.now(timezone.utc)
task.duration = f"{round(duration, 2)}s"
task.error = error_msg
try:
log_path = task_log_service.save_task_log(task, error=error_msg, source_type='adhoc')
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception:
pass
await task_repo.update(db_task, status="failed", completed_at=task.end_time, error_message=error_msg)
await db_session.commit()
await notification_service.notify_task_failed(task_name=task.name, target=request.target, error=error_msg[:200])
return AdHocCommandResult(
target=request.target,
command=request.command,
success=False,
return_code=-1,
stdout="",
stderr=error_msg,
duration=round(duration, 2)
)
except Exception as e:
duration = perf_counter() - start_time_perf
error_msg = f"Erreur interne: {str(e)}"
task.status = "failed"
task.progress = 100
task.end_time = datetime.now(timezone.utc)
task.duration = f"{round(duration, 2)}s"
task.error = error_msg
try:
log_path = task_log_service.save_task_log(task, error=error_msg, source_type='adhoc')
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception:
pass
await task_repo.update(db_task, status="failed", completed_at=task.end_time, error_message=error_msg)
await db_session.commit()
await notification_service.notify_task_failed(task_name=task.name, target=request.target, error=error_msg[:200])
return AdHocCommandResult(
target=request.target,
command=request.command,
success=False,
return_code=-1,
stdout="",
stderr=error_msg,
duration=round(duration, 2)
)
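
A hypothetical playbook execution request; the field names follow `AnsibleExecutionRequest` as used above, everything else (URL, token, playbook and target names) is an assumption:

```python
import httpx

resp = httpx.post(
    "http://localhost:8000/api/ansible/execute",
    json={"playbook": "ping.yml", "target": "env_prod",
          "check_mode": True, "verbose": False},
    headers={"Authorization": "Bearer <jwt-token>"},
    timeout=300,  # playbooks can run long; don't use the default client timeout
)
result = resp.json()
print(result["task_id"], result.get("success"))
```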

224
app/routes/auth.py Normal file
View File

@ -0,0 +1,224 @@
"""
Routes API pour l'authentification JWT.
"""
from datetime import datetime, timezone
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, status
from fastapi.security import OAuth2PasswordRequestForm
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.dependencies import get_db, get_current_user, get_current_user_optional
from app.crud.user import UserRepository
from app.schemas.auth import (
LoginRequest, Token, UserOut, UserCreate, PasswordChange
)
from app.services import auth_service
router = APIRouter()
@router.get("/status")
async def auth_status(
db_session: AsyncSession = Depends(get_db),
current_user: Optional[dict] = Depends(get_current_user_optional)
):
"""Vérifie le statut d'authentification et si le setup initial est requis."""
repo = UserRepository(db_session)
users_count = await repo.count()
if users_count == 0:
return {
"setup_required": True,
"authenticated": False,
"user": None
}
if current_user:
return {
"setup_required": False,
"authenticated": True,
"user": {
"id": current_user.get("user_id"),
"username": current_user.get("username"),
"role": current_user.get("role"),
"display_name": current_user.get("display_name")
}
}
return {
"setup_required": False,
"authenticated": False,
"user": None
}
@router.post("/setup")
async def setup_admin(
user_data: UserCreate,
db_session: AsyncSession = Depends(get_db)
):
"""Crée le premier utilisateur admin (uniquement si aucun utilisateur n'existe)."""
repo = UserRepository(db_session)
users_count = await repo.count()
if users_count > 0:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Le setup a déjà été effectué. Utilisez /login pour vous connecter."
)
# Hasher le mot de passe
hashed_password = auth_service.hash_password(user_data.password)
# Créer l'utilisateur admin
user = await repo.create(
username=user_data.username,
hashed_password=hashed_password,
email=user_data.email,
display_name=user_data.display_name,
role="admin"
)
await db_session.commit()
return {
"message": "Compte administrateur créé avec succès",
"user": {
"id": user.id,
"username": user.username,
"role": user.role
}
}
@router.post("/login", response_model=Token)
async def login_form(
form_data: OAuth2PasswordRequestForm = Depends(),
db_session: AsyncSession = Depends(get_db)
):
"""Connexion via formulaire OAuth2 (form-urlencoded)."""
repo = UserRepository(db_session)
user = await repo.get_by_username(form_data.username)
if not user or not auth_service.verify_password(form_data.password, user.hashed_password):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Nom d'utilisateur ou mot de passe incorrect",
headers={"WWW-Authenticate": "Bearer"}
)
if not user.is_active:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Compte désactivé"
)
# Mettre à jour last_login
await repo.update(user, last_login=datetime.now(timezone.utc))
await db_session.commit()
# Créer le token
token, expires_in = auth_service.create_token_for_user(user)
return Token(
access_token=token,
token_type="bearer",
expires_in=expires_in
)
@router.post("/login/json", response_model=Token)
async def login_json(
credentials: LoginRequest,
db_session: AsyncSession = Depends(get_db)
):
"""Connexion via JSON body."""
repo = UserRepository(db_session)
user = await repo.get_by_username(credentials.username)
if not user or not auth_service.verify_password(credentials.password, user.hashed_password):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Nom d'utilisateur ou mot de passe incorrect"
)
if not user.is_active:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Compte désactivé"
)
# Mettre à jour last_login
await repo.update(user, last_login=datetime.now(timezone.utc))
await db_session.commit()
# Créer le token
token, expires_in = auth_service.create_token_for_user(user)
return Token(
access_token=token,
token_type="bearer",
expires_in=expires_in
)
@router.get("/me", response_model=UserOut)
async def get_current_user_info(
current_user: dict = Depends(get_current_user),
db_session: AsyncSession = Depends(get_db)
):
"""Récupère les informations de l'utilisateur connecté."""
if not current_user:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Non authentifié"
)
repo = UserRepository(db_session)
user = await repo.get(current_user.get("user_id"))
if not user:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Utilisateur non trouvé"
)
return UserOut.model_validate(user)
@router.put("/password")
async def change_password(
password_data: PasswordChange,
current_user: dict = Depends(get_current_user),
db_session: AsyncSession = Depends(get_db)
):
"""Change le mot de passe de l'utilisateur connecté."""
if not current_user:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Non authentifié"
)
repo = UserRepository(db_session)
user = await repo.get(current_user.get("user_id"))
if not user:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Utilisateur non trouvé"
)
# Vérifier l'ancien mot de passe
if not auth_service.verify_password(password_data.current_password, user.hashed_password):
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Mot de passe actuel incorrect"
)
# Hasher et enregistrer le nouveau mot de passe
new_hashed = auth_service.hash_password(password_data.new_password)
await repo.update(user, hashed_password=new_hashed)
await db_session.commit()
return {"message": "Mot de passe modifié avec succès"}

182
app/routes/bootstrap.py Normal file
View File

@ -0,0 +1,182 @@
"""
Routes API pour le bootstrap des hôtes.
"""
import asyncio
from datetime import datetime, timezone
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from app.core.dependencies import verify_api_key
from app.schemas.ansible import BootstrapRequest
from app.schemas.common import CommandResult, LogEntry
from app.services import (
bootstrap_status_service,
ws_manager,
notification_service,
db,
)
from app.utils.ssh_utils import bootstrap_host
router = APIRouter()
@router.get("/status")
async def get_all_bootstrap_status(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère le statut de bootstrap de tous les hôtes."""
return {
"hosts": bootstrap_status_service.get_all_status()
}
@router.get("/status/{host_name}")
async def get_host_bootstrap_status(
host_name: str,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère le statut de bootstrap d'un hôte spécifique."""
status = bootstrap_status_service.get_bootstrap_status(host_name)
return {
"host": host_name,
**status
}
@router.post("/status/{host_name}")
async def set_host_bootstrap_status(
host_name: str,
success: bool = True,
details: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key)
):
"""Définit manuellement le statut de bootstrap d'un hôte."""
result = bootstrap_status_service.set_bootstrap_status(
host_name=host_name,
success=success,
details=details or "Status défini manuellement"
)
db.invalidate_hosts_cache()
await ws_manager.broadcast({
"type": "bootstrap_status_updated",
"data": {
"host": host_name,
"bootstrap_ok": success
}
})
return {
"host": host_name,
"status": "updated",
**result
}
@router.post("", response_model=CommandResult)
async def bootstrap_ansible_host(
request: BootstrapRequest,
api_key_valid: bool = Depends(verify_api_key)
):
"""Bootstrap un hôte pour Ansible.
Cette opération:
1. Se connecte à l'hôte via SSH avec le mot de passe root
2. Crée l'utilisateur d'automatisation
3. Configure la clé SSH publique
4. Configure sudo sans mot de passe
5. Installe Python3
6. Vérifie la connexion SSH par clé
"""
import logging
import traceback
logger = logging.getLogger("bootstrap_endpoint")
try:
logger.info(f"Bootstrap request for host={request.host}, user={request.automation_user}")
result = bootstrap_host(
host=request.host,
root_password=request.root_password,
automation_user=request.automation_user
)
logger.info(f"Bootstrap result: status={result.status}, return_code={result.return_code}")
if result.return_code != 0:
raise HTTPException(
status_code=500,
detail={
"status": result.status,
"return_code": result.return_code,
"stdout": result.stdout,
"stderr": result.stderr
}
)
# Trouver le nom de l'hôte
host_name = request.host
for h in db.hosts:
if h.ip == request.host or h.name == request.host:
host_name = h.name
break
# Enregistrer le statut de bootstrap réussi
bootstrap_status_service.set_bootstrap_status(
host_name=host_name,
success=True,
details=f"Bootstrap réussi via API (user: {request.automation_user})"
)
db.invalidate_hosts_cache()
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="INFO",
message=f"Bootstrap réussi pour {host_name} (user: {request.automation_user})",
source="bootstrap",
host=host_name
)
db.logs.insert(0, log_entry)
await ws_manager.broadcast({
"type": "bootstrap_success",
"data": {
"host": host_name,
"user": request.automation_user,
"status": "ok",
"bootstrap_ok": True
}
})
asyncio.create_task(notification_service.notify_bootstrap_success(host_name))
return result
except HTTPException as http_exc:
error_detail = str(http_exc.detail) if http_exc.detail else "Erreur inconnue"
asyncio.create_task(notification_service.notify_bootstrap_failed(
hostname=request.host,
error=error_detail[:200]
))
raise
except Exception as e:
logger.error(f"Bootstrap exception: {e}")
logger.error(traceback.format_exc())
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="ERROR",
message=f"Échec bootstrap pour {request.host}: {str(e)}",
source="bootstrap",
host=request.host
)
db.logs.insert(0, log_entry)
asyncio.create_task(notification_service.notify_bootstrap_failed(
hostname=request.host,
error=str(e)[:200]
))
raise HTTPException(status_code=500, detail=str(e))
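
A hypothetical bootstrap call; the `BootstrapRequest` fields are taken from their use above, and the host and password values are placeholders:

```python
import httpx

resp = httpx.post(
    "http://localhost:8000/api/bootstrap",
    json={"host": "192.168.1.50", "root_password": "<root-pw>",
          "automation_user": "ansible"},
    headers={"Authorization": "Bearer <jwt-token>"},
    timeout=120,  # la création d'utilisateur + installation Python peut être lente
)
print(resp.json())
```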

View File

@ -0,0 +1,174 @@
"""
Routes API pour les builtin playbooks (collecte métriques).
"""
import asyncio
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.config import settings
from app.core.dependencies import get_db, verify_api_key
from app.crud.host import HostRepository
from app.crud.host_metrics import HostMetricsRepository
from app.services import ansible_service, ws_manager
from app.services.builtin_playbooks import (
BuiltinPlaybookService,
BUILTIN_PLAYBOOKS,
get_builtin_playbook_service,
init_builtin_playbook_service,
)
router = APIRouter()
def _get_service() -> BuiltinPlaybookService:
"""Récupère ou initialise le service builtin playbooks."""
try:
return get_builtin_playbook_service()
except RuntimeError:
return init_builtin_playbook_service(settings.ansible_dir, ansible_service)
@router.get("")
async def list_builtin_playbooks(api_key_valid: bool = Depends(verify_api_key)):
"""Liste tous les builtin playbooks disponibles."""
return [pb.model_dump() for pb in BUILTIN_PLAYBOOKS.values()]
@router.get("/{builtin_id}")
async def get_builtin_playbook(
builtin_id: str,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère les détails d'un builtin playbook."""
if builtin_id not in BUILTIN_PLAYBOOKS:
raise HTTPException(status_code=404, detail=f"Builtin playbook '{builtin_id}' non trouvé")
return BUILTIN_PLAYBOOKS[builtin_id].model_dump()
@router.post("/execute")
async def execute_builtin_playbook(
builtin_id: str,
target: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Exécute un builtin playbook sur une cible."""
if builtin_id not in BUILTIN_PLAYBOOKS:
raise HTTPException(status_code=404, detail=f"Builtin playbook '{builtin_id}' non trouvé")
service = _get_service()
result = await service.execute_builtin(builtin_id, target)
# Si collecte de métriques, sauvegarder en BD
if BUILTIN_PLAYBOOKS[builtin_id].collect_metrics and result.get("success"):
metrics_repo = HostMetricsRepository(db_session)
host_repo = HostRepository(db_session)
for hostname, metrics_data in result.get("parsed_metrics", {}).items():
# Trouver l'host_id
host = await host_repo.get_by_name(hostname)
if host:
metrics_create = service.create_metrics_from_parsed(
host_id=host.id,
parsed_data=metrics_data,
builtin_id=builtin_id,
execution_time_ms=result.get("execution_time_ms", 0)
)
await metrics_repo.create(**metrics_create.model_dump())
await db_session.commit()
await ws_manager.broadcast({
"type": "builtin_executed",
"data": {
"builtin_id": builtin_id,
"target": target,
"success": result.get("success", False)
}
})
return result
@router.post("/execute-background")
async def execute_builtin_playbook_background(
builtin_id: str,
target: str,
background_tasks: BackgroundTasks,
api_key_valid: bool = Depends(verify_api_key)
):
"""Exécute un builtin playbook en arrière-plan."""
if builtin_id not in BUILTIN_PLAYBOOKS:
raise HTTPException(status_code=404, detail=f"Builtin playbook '{builtin_id}' non trouvé")
async def run_in_background():
service = _get_service()
result = await service.execute_builtin(builtin_id, target)
await ws_manager.broadcast({
"type": "builtin_executed",
"data": {
"builtin_id": builtin_id,
"target": target,
"success": result.get("success", False)
}
})
# Planifier l'exécution en arrière-plan
background_tasks.add_task(run_in_background)
return {
"message": f"Builtin playbook '{builtin_id}' planifié pour exécution sur {target}",
"builtin_id": builtin_id,
"target": target
}
@router.post("/collect-all")
async def collect_all_metrics(
background_tasks: BackgroundTasks,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Collecte les métriques de tous les hôtes."""
host_repo = HostRepository(db_session)
hosts = await host_repo.list(limit=1000)
if not hosts:
return {"message": "Aucun hôte trouvé", "hosts_count": 0}
async def collect_for_all():
service = _get_service()
results = []
for host in hosts:
try:
result = await service.execute_builtin("collect_system_info", host.name)
results.append({
"host": host.name,
"success": result.get("success", False)
})
except Exception as e:
results.append({
"host": host.name,
"success": False,
"error": str(e)
})
await ws_manager.broadcast({
"type": "metrics_collection_complete",
"data": {
"total": len(hosts),
"success": sum(1 for r in results if r.get("success")),
"failed": sum(1 for r in results if not r.get("success"))
}
})
background_tasks.add_task(collect_for_all)
return {
"message": f"Collecte des métriques lancée pour {len(hosts)} hôte(s)",
"hosts_count": len(hosts)
}
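
A minimal, self-contained illustration of the `BackgroundTasks` pattern used above: Starlette awaits async callables passed to `add_task` after the response is sent, so no explicit `asyncio.create_task` wrapper is needed (names below are illustrative):

```python
from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

async def slow_job(name: str) -> None:
    # Runs after the HTTP response has been returned to the client.
    print(f"running {name} in the background")

@app.post("/jobs")
async def schedule_job(background_tasks: BackgroundTasks):
    background_tasks.add_task(slow_job, "collect_system_info")
    return {"message": "planifié"}
```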

148
app/routes/groups.py Normal file
View File

@ -0,0 +1,148 @@
"""
Routes API pour la gestion des groupes Ansible.
"""
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from app.core.dependencies import verify_api_key
from app.schemas.group import GroupRequest, GroupUpdateRequest, GroupDeleteRequest
from app.services import ansible_service
router = APIRouter()
@router.get("")
async def get_groups(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère la liste de tous les groupes Ansible."""
groups = ansible_service.get_groups()
env_groups = ansible_service.get_env_groups()
role_groups = ansible_service.get_role_groups()
result = []
for group in groups:
group_type = "env" if group in env_groups else ("role" if group in role_groups else "other")
hosts = ansible_service.get_group_hosts(group)
result.append({
"name": group,
"type": group_type,
"display_name": group.replace("env_", "").replace("role_", "").replace("_", " ").title(),
"hosts_count": len(hosts),
"hosts": hosts,
})
return {
"groups": result,
"env_count": len(env_groups),
"role_count": len(role_groups),
}
@router.get("/{group_name}")
async def get_group(group_name: str, api_key_valid: bool = Depends(verify_api_key)):
"""Récupère les détails d'un groupe."""
if not ansible_service.group_exists(group_name):
raise HTTPException(status_code=404, detail=f"Groupe '{group_name}' non trouvé")
env_groups = ansible_service.get_env_groups()
role_groups = ansible_service.get_role_groups()
group_type = "env" if group_name in env_groups else ("role" if group_name in role_groups else "other")
hosts = ansible_service.get_group_hosts(group_name)
return {
"name": group_name,
"type": group_type,
"display_name": group_name.replace("env_", "").replace("role_", "").replace("_", " ").title(),
"hosts_count": len(hosts),
"hosts": hosts,
}
@router.post("")
async def create_group(request: GroupRequest, api_key_valid: bool = Depends(verify_api_key)):
"""Crée un nouveau groupe."""
# Valider le préfixe selon le type
expected_prefix = f"{request.type}_"
if not request.name.startswith(expected_prefix):
raise HTTPException(
status_code=400,
detail=f"Le nom du groupe de type '{request.type}' doit commencer par '{expected_prefix}'"
)
# Vérifier si le groupe existe déjà
if ansible_service.group_exists(request.name):
raise HTTPException(status_code=400, detail=f"Le groupe '{request.name}' existe déjà")
try:
ansible_service.add_group(request.name, request.type)
return {
"message": f"Groupe '{request.name}' créé avec succès",
"group": {
"name": request.name,
"type": request.type,
"hosts_count": 0,
}
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur lors de la création du groupe: {str(e)}")
@router.put("/{group_name}")
async def update_group(
group_name: str,
request: GroupUpdateRequest,
api_key_valid: bool = Depends(verify_api_key)
):
"""Renomme un groupe."""
if not ansible_service.group_exists(group_name):
raise HTTPException(status_code=404, detail=f"Groupe '{group_name}' non trouvé")
# Vérifier que le nouveau nom a le même préfixe
old_prefix = group_name.split("_")[0] + "_" if "_" in group_name else ""
new_prefix = request.new_name.split("_")[0] + "_" if "_" in request.new_name else ""
if old_prefix and old_prefix != new_prefix:
raise HTTPException(
status_code=400,
detail=f"Le nouveau nom doit conserver le même préfixe '{old_prefix}'"
)
# Vérifier si le nouveau nom existe déjà
if ansible_service.group_exists(request.new_name):
raise HTTPException(status_code=400, detail=f"Le groupe '{request.new_name}' existe déjà")
try:
ansible_service.rename_group(group_name, request.new_name)
return {
"message": f"Groupe renommé de '{group_name}' vers '{request.new_name}'",
"old_name": group_name,
"new_name": request.new_name,
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur lors du renommage: {str(e)}")
@router.delete("/{group_name}")
async def delete_group(
group_name: str,
move_hosts_to: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key)
):
"""Supprime un groupe."""
if not ansible_service.group_exists(group_name):
raise HTTPException(status_code=404, detail=f"Groupe '{group_name}' non trouvé")
# Vérifier le groupe de destination si spécifié
if move_hosts_to and not ansible_service.group_exists(move_hosts_to):
raise HTTPException(status_code=400, detail=f"Groupe de destination '{move_hosts_to}' non trouvé")
hosts = ansible_service.get_group_hosts(group_name)
try:
ansible_service.delete_group(group_name, move_hosts_to)
return {
"message": f"Groupe '{group_name}' supprimé",
"hosts_moved": len(hosts) if move_hosts_to else 0,
"moved_to": move_hosts_to,
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur lors de la suppression: {str(e)}")
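
A hypothetical group creation call; per the validation above, the name must carry the prefix matching its type (`env_` or `role_`), and the URL and token are placeholders:

```python
import httpx

resp = httpx.post(
    "http://localhost:8000/api/groups",
    json={"name": "env_staging", "type": "env"},  # "staging" sans préfixe serait rejeté (400)
    headers={"Authorization": "Bearer <jwt-token>"},
)
print(resp.status_code, resp.json())
```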

90
app/routes/health.py Normal file
View File

@ -0,0 +1,90 @@
"""
Routes API pour les health checks.
"""
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException
from app.core.dependencies import verify_api_key
from app.schemas.health import HealthCheck
from app.schemas.common import LogEntry, SystemMetrics
from app.services import ws_manager, db
router = APIRouter()
@router.get("", response_model=SystemMetrics)
async def get_metrics(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère les métriques système."""
return db.metrics
@router.get("/global")
async def global_health_check():
"""Endpoint de healthcheck global utilisé par Docker.
Ne nécessite pas de clé API pour permettre aux orchestrateurs
de vérifier l'état du service facilement.
"""
return {
"status": "ok",
"service": "homelab-automation-api",
"timestamp": datetime.now(timezone.utc).isoformat()
}
@router.get("/{host_name}", response_model=HealthCheck)
async def check_host_health(host_name: str, api_key_valid: bool = Depends(verify_api_key)):
"""Effectue un health check sur un hôte spécifique."""
host = next((h for h in db.hosts if h.name == host_name), None)
if not host:
raise HTTPException(status_code=404, detail="Hôte non trouvé")
# Simuler un health check
health_check = HealthCheck(
host=host_name,
ssh_ok=host.status == "online",
ansible_ok=host.status == "online",
sudo_ok=host.status == "online",
reachable=host.status != "offline",
response_time=0.123 if host.status == "online" else None,
error_message=None if host.status != "offline" else "Hôte injoignable"
)
# Mettre à jour le statut runtime
new_status = "online" if health_check.reachable else "offline"
db.update_host_status(host_name, new_status, host.os)
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="INFO" if health_check.reachable else "ERROR",
message=f"Health check {'réussi' if health_check.reachable else 'échoué'} pour {host_name}",
source="health_check",
host=host_name
)
db.logs.insert(0, log_entry)
await ws_manager.broadcast({
"type": "health_check",
"data": health_check.dict()
})
return health_check
@router.post("/refresh")
async def refresh_hosts(api_key_valid: bool = Depends(verify_api_key)):
"""Force le rechargement des hôtes depuis l'inventaire Ansible."""
from app.services import ansible_service
ansible_service.invalidate_cache()
hosts = db.refresh_hosts()
await ws_manager.broadcast({
"type": "hosts_refreshed",
"data": {"count": len(hosts)}
})
return {"message": f"{len(hosts)} hôtes rechargés depuis l'inventaire Ansible"}

52
app/routes/help.py Normal file
View File

@ -0,0 +1,52 @@
"""
Routes API pour l'aide et la documentation.
"""
from pathlib import Path
from fastapi import APIRouter, Depends
from fastapi.responses import Response
from app.core.config import settings
from app.core.dependencies import verify_api_key
from app.utils.markdown_parser import build_help_markdown
from app.utils.pdf_generator import markdown_to_pdf_bytes
router = APIRouter()
@router.get("/documentation.md")
async def download_help_markdown(api_key_valid: bool = Depends(verify_api_key)):
"""Télécharge la documentation d'aide en format Markdown."""
# Essayer de charger depuis index.html
html_path = settings.base_dir / "index.html"
markdown_content = build_help_markdown(html_path=html_path)
return Response(
content=markdown_content,
media_type="text/markdown",
headers={
"Content-Disposition": "attachment; filename=homelab-automation-help.md"
}
)
@router.get("/documentation.pdf")
async def download_help_pdf(api_key_valid: bool = Depends(verify_api_key)):
"""Télécharge la documentation d'aide en format PDF."""
# Essayer de charger depuis index.html
html_path = settings.base_dir / "index.html"
markdown_content = build_help_markdown(html_path=html_path)
pdf_bytes = markdown_to_pdf_bytes(
markdown_content,
title="Homelab Automation - Documentation"
)
return Response(
content=pdf_bytes,
media_type="application/pdf",
headers={
"Content-Disposition": "attachment; filename=homelab-automation-help.pdf"
}
)
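
A hypothetical download of the generated PDF documentation (URL and token are placeholders):

```python
import httpx

resp = httpx.get(
    "http://localhost:8000/api/help/documentation.pdf",
    headers={"Authorization": "Bearer <jwt-token>"},
)
with open("homelab-automation-help.pdf", "wb") as fh:
    fh.write(resp.content)  # octets PDF générés via reportlab
```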

399
app/routes/hosts.py Normal file
View File

@ -0,0 +1,399 @@
"""
Routes API pour la gestion des hôtes.
"""
import uuid
from datetime import datetime, timezone
from typing import List, Optional
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.dependencies import get_db, verify_api_key
from app.crud.host import HostRepository
from app.crud.bootstrap_status import BootstrapStatusRepository
from app.schemas.host_api import HostRequest, HostUpdateRequest, HostResponse
from app.services import ansible_service, ws_manager
router = APIRouter()
def _host_to_response(host, bootstrap=None) -> dict:
"""Convertit un modèle Host DB en réponse API."""
return {
"id": host.id,
"name": host.name,
"ip": host.ip_address,
"status": host.status or "unknown",
"os": "Linux",
"last_seen": host.last_seen,
"created_at": host.created_at,
"groups": [host.ansible_group] if host.ansible_group else [],
"bootstrap_ok": bootstrap.status == "success" if bootstrap else False,
"bootstrap_date": bootstrap.last_attempt if bootstrap else None,
}
@router.get("/groups")
async def get_host_groups(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère la liste des groupes disponibles pour les hôtes."""
return {
"env_groups": ansible_service.get_env_groups(),
"role_groups": ansible_service.get_role_groups(),
"all_groups": ansible_service.get_groups(),
}
@router.get("/by-name/{host_name}")
async def get_host_by_name(
host_name: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère un hôte par son nom."""
repo = HostRepository(db_session)
bs_repo = BootstrapStatusRepository(db_session)
host = await repo.get_by_name(host_name)
if not host:
host = await repo.get_by_ip(host_name)
if not host:
raise HTTPException(status_code=404, detail=f"Hôte '{host_name}' non trouvé")
bootstrap = await bs_repo.latest_for_host(host.id)
return _host_to_response(host, bootstrap)
@router.post("/refresh")
async def refresh_hosts(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Force le rechargement des hôtes depuis l'inventaire Ansible."""
ansible_service.invalidate_cache()
hosts = ansible_service.get_hosts_from_inventory()
await ws_manager.broadcast({
"type": "hosts_refreshed",
"data": {"count": len(hosts)}
})
return {
"message": f"{len(hosts)} hôtes rechargés depuis l'inventaire Ansible",
"count": len(hosts)
}
@router.post("/sync")
async def sync_hosts_from_ansible(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Synchronise les hôtes depuis l'inventaire Ansible vers la base de données."""
repo = HostRepository(db_session)
ansible_service.invalidate_cache()
inventory_hosts = ansible_service.get_hosts_from_inventory()
created_count = 0
updated_count = 0
for inv_host in inventory_hosts:
existing = await repo.get_by_name(inv_host.name)
if existing:
await repo.update(
existing,
ip_address=inv_host.ansible_host or inv_host.name,
ansible_group=inv_host.groups[0] if inv_host.groups else None,
)
updated_count += 1
else:
await repo.create(
id=uuid.uuid4().hex,
name=inv_host.name,
ip_address=inv_host.ansible_host or inv_host.name,
ansible_group=inv_host.groups[0] if inv_host.groups else None,
status="unknown",
reachable=False,
)
created_count += 1
await db_session.commit()
await ws_manager.broadcast({
"type": "hosts_synced",
"data": {
"created": created_count,
"updated": updated_count,
"total": len(inventory_hosts)
}
})
return {
"message": f"Synchronisation terminée: {created_count} créé(s), {updated_count} mis à jour",
"created": created_count,
"updated": updated_count,
"total": len(inventory_hosts)
}
@router.get("/{host_id}")
async def get_host(
host_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère un hôte par son ID."""
repo = HostRepository(db_session)
bs_repo = BootstrapStatusRepository(db_session)
host = await repo.get(host_id)
if not host:
raise HTTPException(status_code=404, detail="Hôte non trouvé")
bootstrap = await bs_repo.latest_for_host(host.id)
return _host_to_response(host, bootstrap)
@router.get("")
async def get_hosts(
bootstrap_status: Optional[str] = None,
limit: int = 100,
offset: int = 0,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère la liste des hôtes."""
from app.services import db
repo = HostRepository(db_session)
bs_repo = BootstrapStatusRepository(db_session)
hosts = await repo.list(limit=limit, offset=offset)
# Si la base ne contient aucun hôte, fallback sur les données de l'inventaire Ansible
if not hosts:
hybrid_hosts = db.hosts
fallback_results = []
for h in hybrid_hosts:
# Appliquer les filtres de bootstrap
if bootstrap_status == "ready" and not h.bootstrap_ok:
continue
if bootstrap_status == "not_configured" and h.bootstrap_ok:
continue
fallback_results.append({
"id": h.id,
"name": h.name,
"ip": h.ip,
"status": h.status,
"os": h.os,
"last_seen": h.last_seen,
"created_at": h.created_at,
"groups": h.groups,
"bootstrap_ok": h.bootstrap_ok,
"bootstrap_date": h.bootstrap_date,
})
return fallback_results
result = []
for host in hosts:
bootstrap = await bs_repo.latest_for_host(host.id)
# Appliquer les filtres de bootstrap
if bootstrap_status == "ready" and not (bootstrap and bootstrap.status == "success"):
continue
if bootstrap_status == "not_configured" and bootstrap and bootstrap.status == "success":
continue
result.append(_host_to_response(host, bootstrap))
return result
@router.post("")
async def create_host(
host_request: HostRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Crée un nouvel hôte."""
repo = HostRepository(db_session)
bs_repo = BootstrapStatusRepository(db_session)
# Vérifier si l'hôte existe déjà
existing = await repo.get_by_name(host_request.name)
if existing:
raise HTTPException(status_code=400, detail=f"L'hôte '{host_request.name}' existe déjà")
# Valider le groupe d'environnement
env_groups = ansible_service.get_env_groups()
if host_request.env_group not in env_groups and not host_request.env_group.startswith("env_"):
raise HTTPException(
status_code=400,
detail=f"Le groupe d'environnement doit commencer par 'env_'. Groupes existants: {env_groups}"
)
# Valider les groupes de rôles
role_groups = ansible_service.get_role_groups()
for role in host_request.role_groups:
if role not in role_groups and not role.startswith("role_"):
raise HTTPException(
status_code=400,
detail=f"Le groupe de rôle '{role}' doit commencer par 'role_'."
)
try:
# Ajouter l'hôte à l'inventaire Ansible
ansible_service.add_host_to_inventory(
hostname=host_request.name,
env_group=host_request.env_group,
role_groups=host_request.role_groups,
ansible_host=host_request.ip,
)
# Créer en base
host = await repo.create(
id=uuid.uuid4().hex,
name=host_request.name,
ip_address=host_request.ip or host_request.name,
ansible_group=host_request.env_group,
status="unknown",
reachable=False,
last_seen=None,
)
bootstrap = await bs_repo.latest_for_host(host.id)
await db_session.commit()
# Notifier les clients WebSocket
await ws_manager.broadcast({
"type": "host_created",
"data": _host_to_response(host, bootstrap),
})
return {
"message": f"Hôte '{host_request.name}' ajouté avec succès",
"host": _host_to_response(host, bootstrap),
"inventory_updated": True,
}
except HTTPException:
raise
except Exception as e:
await db_session.rollback()
raise HTTPException(status_code=500, detail=f"Erreur lors de l'ajout de l'hôte: {str(e)}")
@router.put("/{host_name}")
async def update_host(
host_name: str,
update_request: HostUpdateRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Met à jour un hôte existant."""
repo = HostRepository(db_session)
bs_repo = BootstrapStatusRepository(db_session)
host = await repo.get_by_name(host_name)
if not host:
host = await repo.get(host_name)
if not host:
raise HTTPException(status_code=404, detail=f"Hôte '{host_name}' non trouvé")
# Valider le groupe d'environnement si fourni
if update_request.env_group:
env_groups = ansible_service.get_env_groups()
if update_request.env_group not in env_groups and not update_request.env_group.startswith("env_"):
raise HTTPException(status_code=400, detail="Le groupe d'environnement doit commencer par 'env_'")
# Valider les groupes de rôles si fournis
if update_request.role_groups:
for role in update_request.role_groups:
if not role.startswith("role_"):
raise HTTPException(status_code=400, detail=f"Le groupe de rôle '{role}' doit commencer par 'role_'")
try:
ansible_service.update_host_groups(
hostname=host_name,
env_group=update_request.env_group,
role_groups=update_request.role_groups,
ansible_host=update_request.ansible_host,
)
await repo.update(
host,
ansible_group=update_request.env_group or host.ansible_group,
)
await db_session.commit()
bootstrap = await bs_repo.latest_for_host(host.id)
await ws_manager.broadcast({
"type": "host_updated",
"data": _host_to_response(host, bootstrap),
})
return {
"message": f"Hôte '{host_name}' mis à jour avec succès",
"host": _host_to_response(host, bootstrap),
"inventory_updated": True,
}
except HTTPException:
await db_session.rollback()
raise
except Exception as e:
await db_session.rollback()
raise HTTPException(status_code=500, detail=f"Erreur lors de la mise à jour: {str(e)}")
@router.delete("/by-name/{host_name}")
async def delete_host_by_name(
host_name: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Supprime un hôte par son nom."""
repo = HostRepository(db_session)
host = await repo.get_by_name(host_name)
if not host:
host = await repo.get(host_name)
if not host:
raise HTTPException(status_code=404, detail=f"Hôte '{host_name}' non trouvé")
try:
ansible_service.remove_host_from_inventory(host_name)
await repo.soft_delete(host.id)
await db_session.commit()
await ws_manager.broadcast({
"type": "host_deleted",
"data": {"name": host_name},
})
return {"message": f"Hôte '{host_name}' supprimé avec succès", "inventory_updated": True}
except HTTPException:
await db_session.rollback()
raise
except Exception as e:
await db_session.rollback()
raise HTTPException(status_code=500, detail=f"Erreur lors de la suppression: {str(e)}")
@router.delete("/{host_id}")
async def delete_host(
host_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Supprime un hôte par son ID."""
repo = HostRepository(db_session)
host = await repo.get(host_id)
if not host:
raise HTTPException(status_code=404, detail="Hôte non trouvé")
return await delete_host_by_name(host.name, api_key_valid, db_session)

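These host routes close out the CRUD surface for the inventory. A minimal client sketch of the create flow, assuming the router is mounted under `/api/hosts` and that `verify_api_key` also accepts a JWT Bearer token (both assumptions; the mount prefix and auth wiring are defined elsewhere in the app):

```python
import requests

BASE = "http://localhost:8000/api/hosts"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}  # token from the login endpoint

# env_group must start with "env_" and each role group with "role_",
# otherwise the route answers 400 before touching the inventory.
payload = {
    "name": "nas.domain.home",
    "ip": "192.168.1.50",
    "env_group": "env_homelab",
    "role_groups": ["role_nas"],
}
resp = requests.post(BASE, json=payload, headers=HEADERS)
resp.raise_for_status()
print(resp.json()["message"])
```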
88
app/routes/logs.py Normal file
View File

@ -0,0 +1,88 @@
"""
Routes API pour la gestion des logs.
"""
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import delete
from app.core.dependencies import get_db, verify_api_key
from app.crud.log import LogRepository
from app.services import ws_manager
router = APIRouter()
@router.get("")
async def get_logs(
limit: int = 50,
offset: int = 0,
level: Optional[str] = None,
source: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère les logs récents avec filtrage optionnel."""
repo = LogRepository(db_session)
logs = await repo.list(limit=limit, offset=offset, level=level, source=source)
return [
{
"id": log.id,
"timestamp": log.created_at,
"level": log.level,
"message": log.message,
"source": log.source,
"host": log.host_id,
}
for log in logs
]
@router.post("")
async def create_log(
level: str,
message: str,
source: Optional[str] = None,
host_id: Optional[str] = None,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Ajoute une nouvelle entrée de log."""
repo = LogRepository(db_session)
log = await repo.create(
level=level.upper(),
message=message,
source=source,
host_id=host_id,
)
await db_session.commit()
response_data = {
"id": log.id,
"timestamp": log.created_at,
"level": log.level,
"message": log.message,
"source": log.source,
"host": log.host_id,
}
await ws_manager.broadcast({
"type": "new_log",
"data": response_data
})
return response_data
@router.delete("")
async def clear_logs(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Efface tous les logs (attention: opération destructive)."""
from app.models.log import Log as LogModel
await db_session.execute(delete(LogModel))
await db_session.commit()
return {"message": "Tous les logs ont été supprimés"}

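Worth noting on these log routes: `create_log` declares bare scalar parameters (`level`, `message`, ...), so FastAPI reads them from the query string rather than from a JSON body. A quick sketch, again assuming an `/api/logs` mount and JWT Bearer auth:

```python
import requests

BASE = "http://localhost:8000/api/logs"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

# Scalar route parameters travel as query params, not JSON
requests.post(BASE, params={"level": "info", "message": "test manuel"}, headers=HEADERS)

# Read back the 10 most recent INFO entries
for entry in requests.get(BASE, params={"limit": 10, "level": "INFO"}, headers=HEADERS).json():
    print(entry["timestamp"], entry["level"], entry["message"])
```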
143
app/routes/metrics.py Normal file
View File

@ -0,0 +1,143 @@
"""
Routes API pour les métriques système et des hôtes.
"""
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.dependencies import get_db, verify_api_key
from app.crud.host_metrics import HostMetricsRepository
from app.crud.app_setting import AppSettingRepository
from app.services import db
class CollectionScheduleRequest(BaseModel):
interval: str = "off"
router = APIRouter()
@router.get("")
async def get_metrics(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère les métriques système globales."""
return db.metrics
@router.get("/all-hosts")
async def get_all_hosts_metrics(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Récupère les métriques de tous les hôtes."""
try:
repo = HostMetricsRepository(db_session)
metrics_dict = await repo.get_all_latest()
# Convertir en dict host_id -> metrics
result = {}
for host_id, m in metrics_dict.items():
result[m.host_id] = {
"host_id": m.host_id,
"metric_type": m.metric_type,
"cpu_usage_percent": m.cpu_usage_percent,
"cpu_load_1m": m.cpu_load_1m,
"cpu_model": m.cpu_model,
"cpu_count": m.cpu_count,
"memory_usage_percent": m.memory_usage_percent,
"memory_total_mb": m.memory_total_mb,
"memory_used_mb": m.memory_used_mb,
"disk_root_usage_percent": m.disk_root_usage_percent,
"disk_root_total_gb": m.disk_root_total_gb,
"disk_root_used_gb": m.disk_root_used_gb,
"os_name": m.os_name,
"uptime_human": m.uptime_human,
"collected_at": m.collected_at,
"collection_status": "success" if not m.error_message else "failed",
"error_message": m.error_message,
}
return result
    except Exception:
        # Si la table n'existe pas encore ou autre erreur
        return {}
@router.get("/collection-schedule")
async def get_collection_schedule(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Récupère l'intervalle de collecte des métriques."""
try:
repo = AppSettingRepository(db_session)
setting = await repo.get("metrics_collection_interval")
interval = setting.value if setting else "off"
return {"interval": interval}
except Exception:
return {"interval": "off"}
@router.post("/collection-schedule")
async def set_collection_schedule(
request: CollectionScheduleRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Définit l'intervalle de collecte des métriques."""
interval = request.interval
valid_intervals = ["off", "5min", "15min", "30min", "1h", "6h", "12h", "24h"]
if interval not in valid_intervals:
raise HTTPException(
status_code=400,
detail=f"Intervalle invalide. Valeurs acceptées: {valid_intervals}"
)
try:
repo = AppSettingRepository(db_session)
await repo.set("metrics_collection_interval", interval)
await db_session.commit()
return {"interval": interval, "message": f"Intervalle de collecte défini à {interval}"}
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.get("/{host_id}")
async def get_host_metrics(
host_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""Récupère les métriques d'un hôte spécifique."""
try:
repo = HostMetricsRepository(db_session)
metrics = await repo.get_latest_for_host(host_id)
if not metrics:
return {"host_id": host_id, "collection_status": "no_data"}
return {
"host_id": metrics.host_id,
"metric_type": metrics.metric_type,
"cpu_usage_percent": metrics.cpu_usage_percent,
"cpu_load_1m": metrics.cpu_load_1m,
"cpu_model": metrics.cpu_model,
"cpu_count": metrics.cpu_count,
"memory_usage_percent": metrics.memory_usage_percent,
"memory_total_mb": metrics.memory_total_mb,
"memory_used_mb": metrics.memory_used_mb,
"disk_root_usage_percent": metrics.disk_root_usage_percent,
"disk_root_total_gb": metrics.disk_root_total_gb,
"disk_root_used_gb": metrics.disk_root_used_gb,
"disk_info": metrics.disk_info,
"os_name": metrics.os_name,
"uptime_human": metrics.uptime_human,
"collected_at": metrics.collected_at,
"collection_status": "success" if not metrics.error_message else "failed",
"error_message": metrics.error_message,
}
except Exception as e:
return {"host_id": host_id, "collection_status": "error", "error_message": str(e)}

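The collection-schedule pair gives a simple on/off knob for background metric collection. A sketch of driving it, with the same assumed `/api/metrics` mount and Bearer token:

```python
import requests

BASE = "http://localhost:8000/api/metrics"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

# Interval must be one of the whitelisted values ("off", "5min", ..., "24h")
requests.post(f"{BASE}/collection-schedule", json={"interval": "15min"}, headers=HEADERS)

# Latest snapshot per host; the route answers an empty dict if the table does not exist yet
for host_id, m in requests.get(f"{BASE}/all-hosts", headers=HEADERS).json().items():
    print(host_id, m["cpu_usage_percent"], m["memory_usage_percent"])
```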
81
app/routes/notifications.py Normal file
View File

@ -0,0 +1,81 @@
"""
Routes API pour les notifications ntfy.
"""
from typing import Optional
from fastapi import APIRouter, Depends
from app.core.dependencies import verify_api_key
from app.schemas.notification import NotificationRequest, NotificationResponse, NtfyConfig
from app.services import notification_service
router = APIRouter()
@router.get("/config")
async def get_notification_config(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère la configuration actuelle des notifications ntfy."""
config = notification_service.config
return {
"enabled": config.enabled,
"base_url": config.base_url,
"default_topic": config.default_topic,
"timeout": config.timeout,
"has_auth": config.has_auth,
}
@router.post("/test")
async def test_notification(
topic: Optional[str] = None,
message: str = "🧪 Test de notification depuis Homelab Automation API",
api_key_valid: bool = Depends(verify_api_key)
):
"""Envoie une notification de test."""
success = await notification_service.send(
topic=topic,
message=message,
title="🔔 Test Notification",
priority=3,
tags=["test_tube", "robot"]
)
return {
"success": success,
"topic": topic or notification_service.config.default_topic,
"message": "Notification envoyée" if success else "Échec de l'envoi (voir logs serveur)"
}
@router.post("/send", response_model=NotificationResponse)
async def send_custom_notification(
request: NotificationRequest,
api_key_valid: bool = Depends(verify_api_key)
):
"""Envoie une notification personnalisée via ntfy."""
return await notification_service.send_request(request)
@router.post("/toggle")
async def toggle_notifications(
enabled: bool,
api_key_valid: bool = Depends(verify_api_key)
):
"""Active ou désactive les notifications ntfy."""
current_config = notification_service.config
new_config = NtfyConfig(
base_url=current_config.base_url,
default_topic=current_config.default_topic,
enabled=enabled,
timeout=current_config.timeout,
username=current_config.username,
password=current_config.password,
token=current_config.token,
)
notification_service.reconfigure(new_config)
return {
"enabled": enabled,
"message": f"Notifications {'activées' if enabled else 'désactivées'}"
}

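As with the log routes, `toggle_notifications` and `test_notification` take bare scalars, so `enabled`, `topic` and `message` are query parameters. A sketch against an assumed `/api/notifications` mount:

```python
import requests

BASE = "http://localhost:8000/api/notifications"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

# Bare bool parameter -> query string
requests.post(f"{BASE}/toggle", params={"enabled": True}, headers=HEADERS)

# Fire a test message on a custom topic; the default topic is used if omitted
r = requests.post(f"{BASE}/test", params={"topic": "homelab-alerts"}, headers=HEADERS)
print(r.json()["message"])
```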
145
app/routes/playbooks.py Normal file
View File

@ -0,0 +1,145 @@
"""
Routes API pour la gestion des playbooks.
"""
import re
from datetime import datetime, timezone
import yaml
from fastapi import APIRouter, Depends, HTTPException
from app.core.dependencies import verify_api_key
from app.schemas.ansible import PlaybookContentRequest
from app.schemas.common import LogEntry
from app.services import ansible_service, db
router = APIRouter()
@router.get("/{filename}/content")
async def get_playbook_content(
filename: str,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère le contenu d'un playbook."""
playbook_path = ansible_service.playbooks_dir / filename
if not filename.endswith(('.yml', '.yaml')):
raise HTTPException(status_code=400, detail="Extension de fichier invalide. Utilisez .yml ou .yaml")
if not playbook_path.exists():
raise HTTPException(status_code=404, detail=f"Playbook non trouvé: {filename}")
# Vérifier que le fichier est bien dans le répertoire playbooks (sécurité)
try:
playbook_path.resolve().relative_to(ansible_service.playbooks_dir.resolve())
except ValueError:
raise HTTPException(status_code=403, detail="Accès non autorisé")
try:
content = playbook_path.read_text(encoding='utf-8')
stat = playbook_path.stat()
return {
"filename": filename,
"content": content,
"size": stat.st_size,
"modified": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat()
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur lecture fichier: {str(e)}")
@router.put("/{filename}/content")
async def save_playbook_content(
filename: str,
request: PlaybookContentRequest,
api_key_valid: bool = Depends(verify_api_key)
):
"""Sauvegarde le contenu d'un playbook."""
if not filename.endswith(('.yml', '.yaml')):
raise HTTPException(status_code=400, detail="Extension de fichier invalide. Utilisez .yml ou .yaml")
# Valider le nom de fichier (sécurité)
if not re.match(r'^[a-zA-Z0-9_-]+\.(yml|yaml)$', filename):
raise HTTPException(status_code=400, detail="Nom de fichier invalide")
playbook_path = ansible_service.playbooks_dir / filename
# S'assurer que le répertoire existe
ansible_service.playbooks_dir.mkdir(parents=True, exist_ok=True)
# Valider le contenu YAML
try:
parsed = yaml.safe_load(request.content)
if parsed is None:
raise HTTPException(status_code=400, detail="Contenu YAML vide ou invalide")
except yaml.YAMLError as e:
raise HTTPException(status_code=400, detail=f"Erreur de syntaxe YAML: {str(e)}")
is_new = not playbook_path.exists()
try:
playbook_path.write_text(request.content, encoding='utf-8')
stat = playbook_path.stat()
# Log l'action
action = "créé" if is_new else "modifié"
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="INFO",
message=f"Playbook {filename} {action}",
source="playbook_editor"
)
db.logs.insert(0, log_entry)
return {
"success": True,
"message": f"Playbook {filename} {'créé' if is_new else 'sauvegardé'} avec succès",
"filename": filename,
"size": stat.st_size,
"modified": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
"is_new": is_new
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur sauvegarde fichier: {str(e)}")
@router.delete("/{filename}")
async def delete_playbook(
filename: str,
api_key_valid: bool = Depends(verify_api_key)
):
"""Supprime un playbook."""
if not filename.endswith(('.yml', '.yaml')):
raise HTTPException(status_code=400, detail="Extension de fichier invalide")
playbook_path = ansible_service.playbooks_dir / filename
if not playbook_path.exists():
raise HTTPException(status_code=404, detail=f"Playbook non trouvé: {filename}")
# Vérifier que le fichier est bien dans le répertoire playbooks (sécurité)
try:
playbook_path.resolve().relative_to(ansible_service.playbooks_dir.resolve())
except ValueError:
raise HTTPException(status_code=403, detail="Accès non autorisé")
try:
playbook_path.unlink()
log_entry = LogEntry(
id=db.get_next_id("logs"),
timestamp=datetime.now(timezone.utc),
level="WARN",
message=f"Playbook {filename} supprimé",
source="playbook_editor"
)
db.logs.insert(0, log_entry)
return {
"success": True,
"message": f"Playbook {filename} supprimé avec succès"
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur suppression fichier: {str(e)}")

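Since `save_playbook_content` rejects anything that fails `yaml.safe_load` or the filename regex, a client can mirror both checks before uploading. A sketch against an assumed `/api/playbooks` mount:

```python
import re

import requests
import yaml

BASE = "http://localhost:8000/api/playbooks"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

filename = "ping-all.yml"
content = """
- name: Ping everything
  hosts: all
  tasks:
    - name: ping
      ansible.builtin.ping:
"""

# Mirror the server-side checks locally before the round-trip
assert re.match(r'^[a-zA-Z0-9_-]+\.(yml|yaml)$', filename)
yaml.safe_load(content)

r = requests.put(f"{BASE}/{filename}/content", json={"content": content}, headers=HEADERS)
print(r.json()["message"])
```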
554
app/routes/schedules.py Normal file
View File

@ -0,0 +1,554 @@
"""
Routes API pour la gestion des schedules.
"""
import json
import uuid
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.dependencies import get_db, verify_api_key
from app.crud.schedule import ScheduleRepository
from app.crud.schedule_run import ScheduleRunRepository
from app.crud.log import LogRepository
from app.schemas.schedule_api import (
Schedule,
ScheduleCreateRequest,
ScheduleUpdateRequest,
)
from app.services import ansible_service, ws_manager, scheduler_service
router = APIRouter()
@router.get("")
async def get_schedules(
enabled: Optional[bool] = None,
playbook: Optional[str] = None,
tag: Optional[str] = None,
limit: int = 100,
offset: int = 0,
api_key_valid: bool = Depends(verify_api_key),
):
"""Liste tous les schedules."""
schedules = scheduler_service.get_all_schedules(
enabled=enabled,
playbook=playbook,
tag=tag,
)
paginated = schedules[offset:offset + limit]
results = []
for s in paginated:
rec = s.recurrence
results.append({
"id": s.id,
"name": s.name,
"playbook": s.playbook,
"target": s.target,
"schedule_type": s.schedule_type,
"recurrence": rec.model_dump() if rec else None,
"enabled": s.enabled,
"notification_type": getattr(s, 'notification_type', 'all'),
"tags": s.tags,
"next_run_at": s.next_run_at,
"last_run_at": s.last_run_at,
"last_status": s.last_status,
"run_count": s.run_count,
"success_count": s.success_count,
"failure_count": s.failure_count,
"created_at": s.created_at,
"updated_at": s.updated_at,
})
return {"schedules": results, "count": len(schedules)}
@router.get("/stats")
async def get_schedules_stats(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère les statistiques des schedules."""
stats = scheduler_service.get_stats()
upcoming = scheduler_service.get_upcoming_executions(limit=5)
return {
"stats": stats.dict(),
"upcoming": upcoming
}
@router.get("/upcoming")
async def get_upcoming_schedules(
limit: int = 10,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère les prochaines exécutions planifiées."""
upcoming = scheduler_service.get_upcoming_executions(limit=limit)
return {
"upcoming": upcoming,
"count": len(upcoming)
}
@router.get("/validate-cron")
async def validate_cron_expression(
expression: str,
api_key_valid: bool = Depends(verify_api_key)
):
"""Valide une expression cron."""
result = scheduler_service.validate_cron_expression(expression)
return result
@router.get("/{schedule_id}")
async def get_schedule(
schedule_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère les détails d'un schedule."""
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not schedule:
raise HTTPException(status_code=404, detail=f"Schedule '{schedule_id}' non trouvé")
return {
"id": schedule.id,
"name": schedule.name,
"playbook": schedule.playbook,
"target": schedule.target,
"schedule_type": schedule.schedule_type,
"recurrence_type": schedule.recurrence_type,
"recurrence_time": schedule.recurrence_time,
"recurrence_days": json.loads(schedule.recurrence_days) if schedule.recurrence_days else None,
"cron_expression": schedule.cron_expression,
"enabled": schedule.enabled,
"notification_type": schedule.notification_type or "all",
"tags": json.loads(schedule.tags) if schedule.tags else [],
"next_run": schedule.next_run,
"last_run": schedule.last_run,
"created_at": schedule.created_at,
"updated_at": schedule.updated_at,
}
@router.post("")
async def create_schedule(
request: ScheduleCreateRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Crée un nouveau schedule."""
# Vérifier que le playbook existe
playbooks = ansible_service.get_playbooks()
playbook_names = [p['filename'] for p in playbooks] + [p['name'] for p in playbooks]
playbook_file = request.playbook
if not playbook_file.endswith(('.yml', '.yaml')):
playbook_file = f"{playbook_file}.yml"
if playbook_file not in playbook_names and request.playbook not in playbook_names:
raise HTTPException(status_code=400, detail=f"Playbook '{request.playbook}' non trouvé")
# Récupérer les infos du playbook pour validation
playbook_info = next((pb for pb in playbooks if pb['filename'] == playbook_file or pb['name'] == request.playbook), None)
# Vérifier la cible
if request.target_type == "group":
groups = ansible_service.get_groups()
if request.target not in groups and request.target != "all":
raise HTTPException(status_code=400, detail=f"Groupe '{request.target}' non trouvé")
else:
if not ansible_service.host_exists(request.target):
raise HTTPException(status_code=400, detail=f"Hôte '{request.target}' non trouvé")
# Valider la compatibilité playbook-target
if playbook_info:
playbook_hosts = playbook_info.get('hosts', 'all')
if not ansible_service.is_target_compatible_with_playbook(request.target, playbook_hosts):
raise HTTPException(
status_code=400,
detail=f"Le playbook '{request.playbook}' (hosts: {playbook_hosts}) n'est pas compatible avec la cible '{request.target}'."
)
# Valider la récurrence
if request.schedule_type == "recurring" and not request.recurrence:
raise HTTPException(status_code=400, detail="La récurrence est requise pour un schedule récurrent")
if request.recurrence and request.recurrence.type == "custom":
if not request.recurrence.cron_expression:
raise HTTPException(status_code=400, detail="Expression cron requise pour le type 'custom'")
validation = scheduler_service.validate_cron_expression(request.recurrence.cron_expression)
if not validation["valid"]:
raise HTTPException(status_code=400, detail=f"Expression cron invalide: {validation.get('error')}")
# Créer en DB
repo = ScheduleRepository(db_session)
schedule_id = f"sched_{uuid.uuid4().hex[:12]}"
recurrence = request.recurrence
schedule_obj = await repo.create(
id=schedule_id,
name=request.name,
description=request.description,
playbook=playbook_file,
target_type=request.target_type,
target=request.target,
extra_vars=request.extra_vars,
schedule_type=request.schedule_type,
schedule_time=request.start_at,
recurrence_type=recurrence.type if recurrence else None,
recurrence_time=recurrence.time if recurrence else None,
recurrence_days=json.dumps(recurrence.days) if recurrence and recurrence.days else None,
cron_expression=recurrence.cron_expression if recurrence else None,
timezone=request.timezone,
start_at=request.start_at,
end_at=request.end_at,
enabled=request.enabled,
retry_on_failure=request.retry_on_failure,
timeout=request.timeout,
notification_type=request.notification_type,
tags=json.dumps(request.tags) if request.tags else None,
)
await db_session.commit()
# Créer le schedule Pydantic et l'ajouter au cache du scheduler
pydantic_schedule = Schedule(
id=schedule_id,
name=request.name,
description=request.description,
playbook=playbook_file,
target_type=request.target_type,
target=request.target,
extra_vars=request.extra_vars,
schedule_type=request.schedule_type,
recurrence=request.recurrence,
timezone=request.timezone,
start_at=request.start_at,
end_at=request.end_at,
enabled=request.enabled,
retry_on_failure=request.retry_on_failure,
timeout=request.timeout,
notification_type=request.notification_type,
tags=request.tags or [],
)
scheduler_service.add_schedule_to_cache(pydantic_schedule)
# Log en DB
log_repo = LogRepository(db_session)
await log_repo.create(
level="INFO",
message=f"Schedule '{request.name}' créé pour {playbook_file} sur {request.target}",
source="scheduler",
)
await db_session.commit()
# Notifier via WebSocket
await ws_manager.broadcast({
"type": "schedule_created",
"data": {
"id": schedule_obj.id,
"name": schedule_obj.name,
"playbook": schedule_obj.playbook,
"target": schedule_obj.target,
}
})
return {
"success": True,
"message": f"Schedule '{request.name}' créé avec succès",
"schedule": {
"id": schedule_obj.id,
"name": schedule_obj.name,
"playbook": schedule_obj.playbook,
"target": schedule_obj.target,
"enabled": schedule_obj.enabled,
}
}
@router.put("/{schedule_id}")
async def update_schedule(
schedule_id: str,
request: ScheduleUpdateRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Met à jour un schedule existant."""
sched = scheduler_service.get_schedule(schedule_id)
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not sched and not schedule:
raise HTTPException(status_code=404, detail=f"Schedule '{schedule_id}' non trouvé")
schedule_name = sched.name if sched else schedule.name
# Valider le playbook si modifié
if request.playbook:
playbooks = ansible_service.get_playbooks()
playbook_names = [p['filename'] for p in playbooks] + [p['name'] for p in playbooks]
playbook_file = request.playbook
if not playbook_file.endswith(('.yml', '.yaml')):
playbook_file = f"{playbook_file}.yml"
if playbook_file not in playbook_names and request.playbook not in playbook_names:
raise HTTPException(status_code=400, detail=f"Playbook '{request.playbook}' non trouvé")
# Valider l'expression cron si modifiée
if request.recurrence and request.recurrence.type == "custom":
if request.recurrence.cron_expression:
validation = scheduler_service.validate_cron_expression(request.recurrence.cron_expression)
if not validation["valid"]:
raise HTTPException(status_code=400, detail=f"Expression cron invalide: {validation.get('error')}")
# Mettre à jour en DB
update_fields = {}
if request.name:
update_fields["name"] = request.name
if request.description:
update_fields["description"] = request.description
if request.playbook:
update_fields["playbook"] = request.playbook
if request.target:
update_fields["target"] = request.target
if request.schedule_type:
update_fields["schedule_type"] = request.schedule_type
if request.timezone:
update_fields["timezone"] = request.timezone
if request.enabled is not None:
update_fields["enabled"] = request.enabled
if request.retry_on_failure is not None:
update_fields["retry_on_failure"] = request.retry_on_failure
if request.timeout is not None:
update_fields["timeout"] = request.timeout
if request.notification_type:
update_fields["notification_type"] = request.notification_type
if request.tags:
update_fields["tags"] = json.dumps(request.tags)
if request.recurrence:
update_fields["recurrence_type"] = request.recurrence.type
update_fields["recurrence_time"] = request.recurrence.time
update_fields["recurrence_days"] = json.dumps(request.recurrence.days) if request.recurrence.days else None
update_fields["cron_expression"] = request.recurrence.cron_expression
if schedule:
await repo.update(schedule, **update_fields)
await db_session.commit()
scheduler_service.update_schedule(schedule_id, request)
# Log en DB
log_repo = LogRepository(db_session)
await log_repo.create(
level="INFO",
message=f"Schedule '{schedule_name}' mis à jour",
source="scheduler",
)
await db_session.commit()
await ws_manager.broadcast({
"type": "schedule_updated",
"data": {"id": schedule_id, "name": schedule_name}
})
return {
"success": True,
"message": f"Schedule '{schedule_name}' mis à jour",
"schedule": {"id": schedule_id, "name": schedule_name}
}
@router.delete("/{schedule_id}")
async def delete_schedule(
schedule_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Supprime un schedule."""
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not schedule:
try:
scheduler_service.delete_schedule(schedule_id)
except Exception:
pass
return {
"success": True,
"message": f"Schedule '{schedule_id}' déjà supprimé ou inexistant."
}
schedule_name = schedule.name
await repo.soft_delete(schedule_id)
await db_session.commit()
scheduler_service.delete_schedule(schedule_id)
log_repo = LogRepository(db_session)
await log_repo.create(
level="WARN",
message=f"Schedule '{schedule_name}' supprimé",
source="scheduler",
)
await db_session.commit()
await ws_manager.broadcast({
"type": "schedule_deleted",
"data": {"id": schedule_id, "name": schedule_name}
})
return {
"success": True,
"message": f"Schedule '{schedule_name}' supprimé"
}
@router.post("/{schedule_id}/run")
async def run_schedule_now(
schedule_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Exécute immédiatement un schedule."""
sched = scheduler_service.get_schedule(schedule_id)
if not sched:
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not schedule:
raise HTTPException(status_code=404, detail=f"Schedule '{schedule_id}' non trouvé")
schedule_name = schedule.name
else:
schedule_name = sched.name
run = await scheduler_service.run_now(schedule_id)
return {
"success": True,
"message": f"Schedule '{schedule_name}' lancé",
"run": run.dict() if run else None
}
@router.post("/{schedule_id}/pause")
async def pause_schedule(
schedule_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Met en pause un schedule."""
sched = scheduler_service.get_schedule(schedule_id)
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not sched and not schedule:
raise HTTPException(status_code=404, detail=f"Schedule '{schedule_id}' non trouvé")
schedule_name = sched.name if sched else schedule.name
if schedule:
await repo.update(schedule, enabled=False)
await db_session.commit()
scheduler_service.pause_schedule(schedule_id)
log_repo = LogRepository(db_session)
await log_repo.create(
level="INFO",
message=f"Schedule '{schedule_name}' mis en pause",
source="scheduler",
)
await db_session.commit()
await ws_manager.broadcast({
"type": "schedule_updated",
"data": {"id": schedule_id, "name": schedule_name, "enabled": False}
})
return {
"success": True,
"message": f"Schedule '{schedule_name}' mis en pause",
"schedule": {"id": schedule_id, "name": schedule_name, "enabled": False}
}
@router.post("/{schedule_id}/resume")
async def resume_schedule(
schedule_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Reprend un schedule en pause."""
sched = scheduler_service.get_schedule(schedule_id)
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not sched and not schedule:
raise HTTPException(status_code=404, detail=f"Schedule '{schedule_id}' non trouvé")
schedule_name = sched.name if sched else schedule.name
if schedule:
await repo.update(schedule, enabled=True)
await db_session.commit()
scheduler_service.resume_schedule(schedule_id)
log_repo = LogRepository(db_session)
await log_repo.create(
level="INFO",
message=f"Schedule '{schedule_name}' repris",
source="scheduler",
)
await db_session.commit()
await ws_manager.broadcast({
"type": "schedule_updated",
"data": {"id": schedule_id, "name": schedule_name, "enabled": True}
})
return {
"success": True,
"message": f"Schedule '{schedule_name}' repris",
"schedule": {"id": schedule_id, "name": schedule_name, "enabled": True}
}
@router.get("/{schedule_id}/runs")
async def get_schedule_runs(
schedule_id: str,
limit: int = 50,
offset: int = 0,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère l'historique des exécutions d'un schedule."""
sched = scheduler_service.get_schedule(schedule_id)
repo = ScheduleRepository(db_session)
schedule = await repo.get(schedule_id)
if not sched and not schedule:
raise HTTPException(status_code=404, detail=f"Schedule '{schedule_id}' non trouvé")
schedule_name = sched.name if sched else schedule.name
run_repo = ScheduleRunRepository(db_session)
runs = await run_repo.list_for_schedule(schedule_id, limit=limit, offset=offset)
return {
"schedule_id": schedule_id,
"schedule_name": schedule_name,
"runs": [
{
"id": r.id,
"status": r.status,
"started_at": r.started_at,
"finished_at": r.completed_at,
"duration_seconds": r.duration,
"error_message": r.error_message,
}
for r in runs
],
"count": len(runs)
}

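Creating a schedule is the busiest route here: it validates the playbook, the target, the playbook/target compatibility, and the recurrence before writing anything. A hedged client sketch — the field names are read off `create_schedule` above, but recurrence type values other than `"custom"` (e.g. `"daily"`) are assumptions, since the recurrence schema is not part of this excerpt:

```python
import requests

BASE = "http://localhost:8000/api/schedules"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

payload = {
    "name": "Nightly apt update",
    "playbook": "update-apt",              # ".yml" is appended server-side if missing
    "target_type": "group",
    "target": "env_homelab",
    "schedule_type": "recurring",
    "recurrence": {"type": "daily", "time": "03:00"},  # "daily" is an assumed type value
    "enabled": True,
}
r = requests.post(BASE, json=payload, headers=HEADERS)
print(r.json()["message"])

# Immediate one-off run of the schedule we just created
sched_id = r.json()["schedule"]["id"]
requests.post(f"{BASE}/{sched_id}/run", headers=HEADERS)
```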
65
app/routes/server.py Normal file
View File

@ -0,0 +1,65 @@
"""
Routes API pour les logs serveur.
"""
from typing import Optional
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.dependencies import get_db, verify_api_key
from app.crud.log import LogRepository
from app.services import console_log_service
router = APIRouter()
@router.get("/logs")
async def get_server_logs(
limit: int = 500,
offset: int = 0,
level: Optional[str] = None,
source: Optional[str] = None,
log_source: str = "console",
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db)
):
"""
Récupère les logs serveur avec pagination.
Args:
log_source: "console" pour les logs stdout/stderr en temps réel,
"db" pour les logs stockés en base de données
"""
if log_source == "console":
# Logs console en temps réel
logs = console_log_service.get_logs(limit=limit, offset=offset, level=level)
return {
"logs": logs,
"count": console_log_service.get_count(),
"limit": limit,
"offset": offset,
"source": "console"
}
else:
# Logs depuis la base de données
repo = LogRepository(db_session)
logs = await repo.list(limit=limit, offset=offset, level=level, source=source)
return {
"logs": [
{
"id": log.id,
"timestamp": log.created_at,
"level": log.level,
"message": log.message,
"source": log.source,
"host": log.host_id,
}
for log in logs
],
"count": len(logs),
"limit": limit,
"offset": offset,
"source": "db"
}

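The `log_source` switch picks between the in-memory console ring buffer and the DB-backed logs. Sketch, with the usual mount and auth assumptions:

```python
import requests

BASE = "http://localhost:8000/api/server"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

# Tail the live console buffer rather than the database
r = requests.get(f"{BASE}/logs", params={"log_source": "console", "limit": 100}, headers=HEADERS)
data = r.json()
print(data["count"], "console lines buffered")
for line in data["logs"][:5]:
    print(line)
```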
526
app/routes/tasks.py Normal file
View File

@ -0,0 +1,526 @@
"""
Routes API pour la gestion des tâches.
"""
import uuid
import asyncio
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.config import settings
from app.core.constants import ACTION_PLAYBOOK_MAP, ACTION_DISPLAY_NAMES
from app.core.dependencies import get_db, verify_api_key
from app.crud.task import TaskRepository
from app.crud.log import LogRepository
from app.schemas.task_api import TaskRequest
from app.services import ws_manager, db, ansible_service, notification_service
from app.services.task_log_service import TaskLogService
from app.models.database import async_session_maker
from app.schemas.task_api import Task
from app.schemas.common import LogEntry
from time import perf_counter
router = APIRouter()
# Instance du service de logs de tâches
task_log_service = TaskLogService(settings.tasks_logs_dir)
# Dictionnaire des tâches en cours d'exécution
running_task_handles = {}
@router.get("")
async def get_tasks(
limit: int = 100,
offset: int = 0,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère la liste des tâches."""
repo = TaskRepository(db_session)
tasks = await repo.list(limit=limit, offset=offset)
return [
{
"id": t.id,
"name": t.action,
"host": t.target,
"status": t.status,
"progress": 100 if t.status == "completed" else (50 if t.status == "running" else 0),
"start_time": t.started_at,
"end_time": t.completed_at,
"duration": None,
"output": t.result_data.get("output") if t.result_data else None,
"error": t.error_message,
}
for t in tasks
]
@router.get("/running")
async def get_running_tasks(
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère les tâches en cours d'exécution."""
repo = TaskRepository(db_session)
tasks = await repo.list(limit=100, offset=0)
running_tasks = [t for t in tasks if t.status in ("running", "pending")]
return {
"tasks": [
{
"id": t.id,
"name": t.action,
"host": t.target,
"status": t.status,
"progress": 50 if t.status == "running" else 0,
"start_time": t.started_at,
"end_time": t.completed_at,
}
for t in running_tasks
],
"count": len(running_tasks)
}
@router.get("/logs")
async def get_task_logs(
status: Optional[str] = None,
year: Optional[str] = None,
month: Optional[str] = None,
day: Optional[str] = None,
hour_start: Optional[str] = None,
hour_end: Optional[str] = None,
target: Optional[str] = None,
category: Optional[str] = None,
source_type: Optional[str] = None,
limit: int = 50,
offset: int = 0,
api_key_valid: bool = Depends(verify_api_key)
):
"""Récupère les logs de tâches depuis les fichiers markdown."""
logs, total_count = task_log_service.get_task_logs(
year=year,
month=month,
day=day,
status=status,
target=target,
category=category,
source_type=source_type,
hour_start=hour_start,
hour_end=hour_end,
limit=limit,
offset=offset
)
return {
"logs": [log.dict() for log in logs],
"count": len(logs),
"total_count": total_count,
"has_more": offset + len(logs) < total_count,
"filters": {
"status": status,
"year": year,
"month": month,
"day": day,
"hour_start": hour_start,
"hour_end": hour_end,
"target": target,
"source_type": source_type
},
"pagination": {
"limit": limit,
"offset": offset
}
}
@router.get("/logs/dates")
async def get_task_logs_dates(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère les dates disponibles pour le filtrage."""
return task_log_service.get_available_dates()
@router.get("/logs/stats")
async def get_task_logs_stats(api_key_valid: bool = Depends(verify_api_key)):
"""Récupère les statistiques des logs de tâches."""
return task_log_service.get_stats()
@router.get("/logs/{log_id}")
async def get_task_log_content(log_id: str, api_key_valid: bool = Depends(verify_api_key)):
"""Récupère le contenu d'un log de tâche spécifique."""
logs, _ = task_log_service.get_task_logs(limit=0)
log = next((l for l in logs if l.id == log_id), None)
if not log:
raise HTTPException(status_code=404, detail="Log non trouvé")
try:
content = Path(log.path).read_text(encoding='utf-8')
return {
"log": log.dict(),
"content": content
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur lecture du fichier: {str(e)}")
@router.delete("/logs/{log_id}")
async def delete_task_log(log_id: str, api_key_valid: bool = Depends(verify_api_key)):
"""Supprime un fichier de log de tâche."""
logs, _ = task_log_service.get_task_logs(limit=0)
log = next((l for l in logs if l.id == log_id), None)
if not log:
raise HTTPException(status_code=404, detail="Log non trouvé")
try:
log_path = Path(log.path)
if log_path.exists():
log_path.unlink()
# Le TaskLogService met en cache l'index des fichiers pendant ~60s.
# Invalider immédiatement pour que la suppression soit visible instantanément.
task_log_service.invalidate_index()
await ws_manager.broadcast({
"type": "task_log_deleted",
"data": {"id": log_id}
})
return {"message": "Log supprimé", "id": log_id}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Erreur suppression: {str(e)}")
@router.post("/{task_id}/cancel")
async def cancel_task(
task_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Annule une tâche en cours d'exécution."""
repo = TaskRepository(db_session)
task = await repo.get(task_id)
if not task:
raise HTTPException(status_code=404, detail="Tâche non trouvée")
if task.status not in ("running", "pending"):
raise HTTPException(status_code=400, detail=f"La tâche n'est pas en cours (statut: {task.status})")
# Marquer comme annulée
if task_id in running_task_handles:
running_task_handles[task_id]["cancelled"] = True
async_task = running_task_handles[task_id].get("asyncio_task")
if async_task and not async_task.done():
async_task.cancel()
process = running_task_handles[task_id].get("process")
if process:
try:
process.terminate()
await asyncio.sleep(0.5)
if process.returncode is None:
process.kill()
except Exception:
pass
del running_task_handles[task_id]
await repo.update(
task,
status="cancelled",
completed_at=datetime.now(timezone.utc),
error_message="Tâche annulée par l'utilisateur"
)
await db_session.commit()
# Log
log_repo = LogRepository(db_session)
await log_repo.create(
level="WARNING",
message=f"Tâche '{task.action}' annulée manuellement",
source="task",
task_id=task_id,
)
await db_session.commit()
await ws_manager.broadcast({
"type": "task_cancelled",
"data": {
"id": task_id,
"status": "cancelled",
"message": "Tâche annulée par l'utilisateur"
}
})
return {
"success": True,
"message": f"Tâche {task_id} annulée avec succès",
"task_id": task_id
}
@router.get("/{task_id}")
async def get_task(
task_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Récupère une tâche spécifique."""
repo = TaskRepository(db_session)
task = await repo.get(task_id)
if not task:
raise HTTPException(status_code=404, detail="Tâche non trouvée")
return {
"id": task.id,
"name": task.action,
"host": task.target,
"status": task.status,
"progress": 100 if task.status == "completed" else (50 if task.status == "running" else 0),
"start_time": task.started_at,
"end_time": task.completed_at,
"duration": None,
"output": task.result_data.get("output") if task.result_data else None,
"error": task.error_message,
}
@router.delete("/{task_id}")
async def delete_task(
task_id: str,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Supprime une tâche."""
repo = TaskRepository(db_session)
task = await repo.get(task_id)
if not task:
raise HTTPException(status_code=404, detail="Tâche non trouvée")
await db_session.delete(task)
await db_session.commit()
await ws_manager.broadcast({
"type": "task_deleted",
"data": {"id": task_id}
})
return {"message": "Tâche supprimée avec succès"}
@router.post("")
async def create_task(
task_request: TaskRequest,
api_key_valid: bool = Depends(verify_api_key),
db_session: AsyncSession = Depends(get_db),
):
"""Crée une nouvelle tâche et exécute le playbook correspondant."""
repo = TaskRepository(db_session)
task_id = uuid.uuid4().hex
target = task_request.host or task_request.group or "all"
playbook = ACTION_PLAYBOOK_MAP.get(task_request.action)
task_obj = await repo.create(
id=task_id,
action=task_request.action,
target=target,
playbook=playbook,
status="running",
)
await repo.update(task_obj, started_at=datetime.now(timezone.utc))
await db_session.commit()
task_name = ACTION_DISPLAY_NAMES.get(task_request.action, f"Tâche {task_request.action}")
response_data = {
"id": task_obj.id,
"name": task_name,
"host": target,
"status": "running",
"progress": 0,
"start_time": task_obj.started_at,
"end_time": None,
"duration": None,
"output": None,
"error": None,
}
await ws_manager.broadcast({
"type": "task_created",
"data": response_data
})
# Exécuter le playbook en arrière-plan
if playbook:
asyncio.create_task(_execute_task_playbook(
task_id=task_id,
task_name=task_name,
playbook=playbook,
target=target,
extra_vars=task_request.extra_vars,
check_mode=task_request.dry_run if hasattr(task_request, 'dry_run') else False
))
return response_data
async def _execute_task_playbook(
task_id: str,
task_name: str,
playbook: str,
target: str,
    extra_vars: Optional[dict] = None,
check_mode: bool = False
):
"""Exécute un playbook Ansible en arrière-plan et met à jour le statut."""
start_time = perf_counter()
# Créer une tâche en mémoire pour le service de logs
mem_task = Task(
id=task_id,
name=task_name,
host=target,
status="running",
progress=10,
start_time=datetime.now(timezone.utc)
)
# Notifier la progression
await ws_manager.broadcast({
"type": "task_progress",
"data": {"id": task_id, "progress": 10, "message": "Démarrage du playbook..."}
})
try:
# Exécuter le playbook
result = await ansible_service.execute_playbook(
playbook=playbook,
target=target,
extra_vars=extra_vars,
check_mode=check_mode,
verbose=True
)
execution_time = perf_counter() - start_time
success = result.get("success", False)
# Mettre à jour la tâche en mémoire
mem_task.status = "completed" if success else "failed"
mem_task.progress = 100
mem_task.end_time = datetime.now(timezone.utc)
mem_task.duration = f"{execution_time:.1f}s"
mem_task.output = result.get("stdout", "")
mem_task.error = result.get("stderr", "") if not success else None
# Mettre à jour en BD
async with async_session_maker() as session:
repo = TaskRepository(session)
db_task = await repo.get(task_id)
if db_task:
await repo.update(
db_task,
status=mem_task.status,
completed_at=mem_task.end_time,
error_message=mem_task.error,
result_data={"output": result.get("stdout", "")[:5000]}
)
await session.commit()
# Sauvegarder le log markdown
try:
log_path = task_log_service.save_task_log(
task=mem_task,
output=result.get("stdout", ""),
error=result.get("stderr", "")
)
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception as log_error:
print(f"Erreur sauvegarde log markdown: {log_error}")
# Notifier la fin via WebSocket
await ws_manager.broadcast({
"type": "task_completed",
"data": {
"id": task_id,
"status": mem_task.status,
"progress": 100,
"duration": mem_task.duration,
"success": success
}
})
# Notification ntfy
if success:
await notification_service.notify_task_completed(
task_name=task_name,
target=target,
duration=mem_task.duration
)
else:
await notification_service.notify_task_failed(
task_name=task_name,
target=target,
error=result.get("stderr", "Erreur inconnue")[:200]
)
except Exception as e:
execution_time = perf_counter() - start_time
error_msg = str(e)
mem_task.status = "failed"
mem_task.end_time = datetime.now(timezone.utc)
mem_task.duration = f"{execution_time:.1f}s"
mem_task.error = error_msg
# Mettre à jour en BD
try:
async with async_session_maker() as session:
repo = TaskRepository(session)
db_task = await repo.get(task_id)
if db_task:
await repo.update(
db_task,
status="failed",
completed_at=mem_task.end_time,
error_message=error_msg
)
await session.commit()
except Exception as db_error:
print(f"Erreur mise à jour BD: {db_error}")
# Sauvegarder le log markdown
try:
log_path = task_log_service.save_task_log(task=mem_task, error=error_msg)
created_log = task_log_service.index_log_file(log_path)
if created_log:
await ws_manager.broadcast({
"type": "task_log_created",
"data": created_log.dict()
})
except Exception:
pass
# Notifier l'échec
await ws_manager.broadcast({
"type": "task_failed",
"data": {
"id": task_id,
"status": "failed",
"error": error_msg
}
})
await notification_service.notify_task_failed(
task_name=task_name,
target=target,
error=error_msg[:200]
)

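End to end, `create_task` returns immediately with a running task and hands the playbook off to `_execute_task_playbook`; clients follow progress over WebSocket or poll `/running`. A sketch (the `"ping"` action is a hypothetical key of `ACTION_PLAYBOOK_MAP`, which is not shown in this excerpt):

```python
import requests

BASE = "http://localhost:8000/api/tasks"  # assumed mount prefix
HEADERS = {"Authorization": "Bearer <jwt-access-token>"}

# "ping" is a hypothetical action key; the real keys live in ACTION_PLAYBOOK_MAP
task = requests.post(BASE, json={"action": "ping", "host": "nas.domain.home"}, headers=HEADERS).json()
print(task["id"], task["status"])

# Poll the task and cancel it if it is still running (WebSocket events are the nicer option)
detail = requests.get(f"{BASE}/{task['id']}", headers=HEADERS).json()
if detail["status"] in ("running", "pending"):
    requests.post(f"{BASE}/{task['id']}/cancel", headers=HEADERS)
```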
22
app/routes/websocket.py Normal file
View File

@ -0,0 +1,22 @@
"""
Routes WebSocket pour les mises à jour en temps réel.
"""
from fastapi import APIRouter, WebSocket, WebSocketDisconnect
from app.services import ws_manager
router = APIRouter()
@router.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
"""Endpoint WebSocket pour les mises à jour en temps réel."""
await ws_manager.connect(websocket)
try:
while True:
# Garder la connexion ouverte
data = await websocket.receive_text()
# Traiter les messages entrants si nécessaire
except WebSocketDisconnect:
ws_manager.disconnect(websocket)

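A matching client for this endpoint just reads the broadcast envelopes (`type` + `data`) emitted throughout the routes above. A sketch using the third-party `websockets` package (assumed installed; the `/ws` path comes straight from the decorator):

```python
import asyncio
import json

import websockets  # third-party client library, assumed available


async def listen() -> None:
    async with websockets.connect("ws://localhost:8000/ws") as ws:
        while True:
            event = json.loads(await ws.recv())
            # e.g. task_created, task_progress, schedule_updated, new_log, ...
            print(event["type"], event.get("data"))


asyncio.run(listen())
```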
23
app/schemas/__init__.py
View File

@ -10,6 +10,18 @@ from .notification import (
    NotificationResponse,
    NotificationTemplates,
)
from .auth import (
LoginRequest,
Token,
TokenData,
UserBase,
UserCreate,
UserUpdate,
UserOut,
UserSetup,
PasswordChange,
AuthStatus,
)
__all__ = [
"HostCreate", "HostCreate",
@ -31,4 +43,15 @@ __all__ = [
"NotificationRequest", "NotificationRequest",
"NotificationResponse", "NotificationResponse",
"NotificationTemplates", "NotificationTemplates",
# Auth
"LoginRequest",
"Token",
"TokenData",
"UserBase",
"UserCreate",
"UserUpdate",
"UserOut",
"UserSetup",
"PasswordChange",
"AuthStatus",
]

108
app/schemas/ansible.py Normal file
View File

@ -0,0 +1,108 @@
"""
Schémas Pydantic pour l'exécution Ansible.
"""
from datetime import datetime, timezone
from typing import Optional, List, Dict, Any, Literal
from pydantic import BaseModel, Field, field_validator
class AnsibleExecutionRequest(BaseModel):
"""Requête d'exécution de playbook Ansible."""
playbook: str = Field(..., description="Nom du playbook à exécuter")
target: str = Field(default="all", description="Hôte ou groupe cible")
extra_vars: Optional[Dict[str, Any]] = Field(default=None, description="Variables supplémentaires")
check_mode: bool = Field(default=False, description="Mode dry-run (--check)")
verbose: bool = Field(default=False, description="Mode verbeux")
class AdHocCommandRequest(BaseModel):
"""Requête pour exécuter une commande ad-hoc Ansible."""
target: str = Field(..., description="Hôte ou groupe cible")
command: str = Field(..., description="Commande shell à exécuter")
module: str = Field(default="shell", description="Module Ansible (shell, command, raw)")
become: bool = Field(default=False, description="Exécuter avec sudo")
timeout: int = Field(default=60, ge=5, le=600, description="Timeout en secondes")
category: Optional[str] = Field(default="default", description="Catégorie d'historique pour cette commande")
class AdHocCommandResult(BaseModel):
"""Résultat d'une commande ad-hoc."""
target: str
command: str
success: bool
return_code: int
stdout: str
stderr: Optional[str] = None
duration: float
hosts_results: Optional[Dict[str, Any]] = None
class AdHocHistoryEntry(BaseModel):
"""Entrée dans l'historique des commandes ad-hoc."""
id: str
command: str
target: str
module: str
become: bool
category: str = "default"
description: Optional[str] = None
created_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
last_used: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
use_count: int = 1
class AdHocHistoryCategory(BaseModel):
"""Catégorie pour organiser les commandes ad-hoc."""
name: str
description: Optional[str] = None
color: str = "#7c3aed"
icon: str = "fa-folder"
class PlaybookInfo(BaseModel):
"""Informations sur un playbook."""
name: str
filename: str
path: str
category: str = "general"
subcategory: str = "other"
hosts: str = "all"
size: int = 0
modified: Optional[str] = None
description: Optional[str] = None
class PlaybookContentRequest(BaseModel):
"""Requête pour sauvegarder le contenu d'un playbook."""
content: str = Field(..., description="Contenu YAML du playbook")
class PlaybooksListResponse(BaseModel):
"""Réponse API pour la liste des playbooks."""
playbooks: List[PlaybookInfo] = Field(default_factory=list)
categories: Dict[str, List[str]] = Field(default_factory=dict)
ansible_dir: str = ""
filter: Optional[str] = None
class BootstrapRequest(BaseModel):
"""Requête de bootstrap pour un hôte."""
host: str = Field(..., description="Adresse IP ou hostname de l'hôte")
root_password: str = Field(..., description="Mot de passe root pour la connexion initiale")
automation_user: str = Field(default="automation", description="Nom de l'utilisateur d'automatisation à créer")
class SSHConfigResponse(BaseModel):
"""Réponse de diagnostic de la configuration SSH."""
ssh_key_path: str
ssh_dir: str
ssh_dir_exists: bool
private_key_exists: bool
public_key_exists: bool
available_files: List[str] = Field(default_factory=list)
public_keys_found: List[str] = Field(default_factory=list)
active_private_key: Optional[str] = None
ssh_user: str
sshpass_available: bool

112
app/schemas/auth.py Normal file
View File

@ -0,0 +1,112 @@
"""Authentication schemas for login, token, and user management."""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, ConfigDict, EmailStr, Field, field_validator
class LoginRequest(BaseModel):
"""Request schema for user login."""
username: str = Field(..., min_length=3, max_length=50, description="Username")
password: str = Field(..., min_length=6, description="Password")
class Token(BaseModel):
"""JWT token response."""
access_token: str
token_type: str = "bearer"
expires_in: int = Field(description="Token expiration time in seconds")
class TokenData(BaseModel):
"""Data extracted from JWT token."""
username: Optional[str] = None
user_id: Optional[int] = None
role: Optional[str] = None
# User schemas
class UserBase(BaseModel):
"""Base user schema with common fields."""
username: str = Field(..., min_length=3, max_length=50)
email: Optional[EmailStr] = None
display_name: Optional[str] = Field(None, max_length=100)
role: str = Field(default="admin", description="User role: admin, operator, viewer")
is_active: bool = True
class UserCreate(UserBase):
"""Schema for creating a new user."""
password: str = Field(..., min_length=6, max_length=128, description="Password (min 6 chars)")
@field_validator('password')
@classmethod
def password_strength(cls, v: str) -> str:
"""Validate password has minimum complexity."""
if len(v) < 6:
raise ValueError('Password must be at least 6 characters')
return v
class UserUpdate(BaseModel):
"""Schema for updating user (all fields optional)."""
email: Optional[EmailStr] = None
display_name: Optional[str] = Field(None, max_length=100)
role: Optional[str] = None
is_active: Optional[bool] = None
class PasswordChange(BaseModel):
"""Schema for changing password."""
current_password: str = Field(..., description="Current password")
new_password: str = Field(..., min_length=6, max_length=128, description="New password")
@field_validator('new_password')
@classmethod
def password_strength(cls, v: str) -> str:
if len(v) < 6:
raise ValueError('Password must be at least 6 characters')
return v
class UserOut(BaseModel):
"""Schema for user output (without password)."""
id: int
username: str
email: Optional[str] = None
display_name: Optional[str] = None
role: str
is_active: bool
is_superuser: bool
created_at: datetime
last_login: Optional[datetime] = None
    model_config = ConfigDict(from_attributes=True)
class UserSetup(BaseModel):
"""Schema for initial admin setup (first user creation)."""
username: str = Field(..., min_length=3, max_length=50)
password: str = Field(..., min_length=6, max_length=128)
email: Optional[EmailStr] = None
display_name: Optional[str] = None
@field_validator('password')
@classmethod
def password_strength(cls, v: str) -> str:
if len(v) < 6:
raise ValueError('Password must be at least 6 characters')
return v
class AuthStatus(BaseModel):
"""Response for auth status check."""
authenticated: bool
user: Optional[UserOut] = None
setup_required: bool = Field(
default=False,
description="True if no users exist and setup is needed"
)

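These schemas pin down the login contract even though the auth routes themselves sit outside this excerpt. A sketch of the expected flow — the `/api/auth/login` path is an assumption:

```python
import requests

BASE = "http://localhost:8000"  # assumed dev server address

# Login endpoint path is assumed; the payload matches LoginRequest
r = requests.post(f"{BASE}/api/auth/login",
                  json={"username": "admin", "password": "secret123"})
token = r.json()  # shaped like Token: access_token, token_type, expires_in

# Every other call then carries the Bearer token
headers = {"Authorization": f"Bearer {token['access_token']}"}
print(requests.get(f"{BASE}/api/hosts", headers=headers).status_code)
```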
66
app/schemas/common.py Normal file
View File

@ -0,0 +1,66 @@
"""
Schémas Pydantic communs utilisés dans plusieurs modules.
"""
from datetime import datetime, timezone
from typing import Optional, Literal, Dict, Any
from pydantic import BaseModel, Field, ConfigDict
class CommandResult(BaseModel):
"""Résultat d'une commande SSH ou Ansible."""
status: str
return_code: int
stdout: str
stderr: Optional[str] = None
execution_time: Optional[float] = None
timestamp: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
class LogEntry(BaseModel):
"""Entrée de log (modèle mémoire)."""
id: int = 0
timestamp: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
level: Literal["DEBUG", "INFO", "WARN", "WARNING", "ERROR"] = "INFO"
message: str
source: Optional[str] = None
host: Optional[str] = None
model_config = ConfigDict(
json_encoders={datetime: lambda v: v.isoformat()}
)
class SystemMetrics(BaseModel):
"""Métriques système du dashboard."""
online_hosts: int = 0
total_tasks: int = 0
success_rate: float = 100.0
uptime: float = 99.9
cpu_usage: float = 0.0
memory_usage: float = 0.0
disk_usage: float = 0.0
class PaginatedResponse(BaseModel):
"""Réponse paginée générique."""
items: list = Field(default_factory=list)
count: int = 0
total_count: int = 0
has_more: bool = False
limit: int = 50
offset: int = 0
class SuccessResponse(BaseModel):
"""Réponse de succès générique."""
success: bool = True
message: str
class ErrorResponse(BaseModel):
"""Réponse d'erreur générique."""
error: str
message: str
details: Optional[Dict[str, Any]] = None

68
app/schemas/group.py Normal file
View File

@ -0,0 +1,68 @@
"""
Schémas Pydantic pour la gestion des groupes Ansible.
"""
from typing import Optional, List
import re
from pydantic import BaseModel, Field, field_validator
class GroupRequest(BaseModel):
"""Requête pour créer un groupe."""
name: str = Field(..., min_length=3, max_length=50, description="Nom du groupe (ex: env_prod, role_web)")
type: str = Field(..., description="Type de groupe: 'env' ou 'role'")
@field_validator('name')
@classmethod
def validate_name(cls, v: str) -> str:
if not re.match(r'^[a-zA-Z0-9_-]+$', v):
raise ValueError('Le nom du groupe ne peut contenir que des lettres, chiffres, tirets et underscores')
return v
@field_validator('type')
@classmethod
def validate_type(cls, v: str) -> str:
if v not in ['env', 'role']:
raise ValueError("Le type doit être 'env' ou 'role'")
return v
class GroupUpdateRequest(BaseModel):
"""Requête pour modifier un groupe."""
new_name: str = Field(..., min_length=3, max_length=50, description="Nouveau nom du groupe")
@field_validator('new_name')
@classmethod
def validate_new_name(cls, v: str) -> str:
if not re.match(r'^[a-zA-Z0-9_-]+$', v):
raise ValueError('Le nom du groupe ne peut contenir que des lettres, chiffres, tirets et underscores')
return v
class GroupDeleteRequest(BaseModel):
"""Requête pour supprimer un groupe."""
move_hosts_to: Optional[str] = Field(default=None, description="Groupe vers lequel déplacer les hôtes")
class GroupResponse(BaseModel):
"""Réponse API pour un groupe."""
name: str
type: str
display_name: str
hosts_count: int = 0
hosts: List[str] = Field(default_factory=list)
class GroupsListResponse(BaseModel):
"""Réponse API pour la liste des groupes."""
groups: List[GroupResponse] = Field(default_factory=list)
env_count: int = 0
role_count: int = 0
class HostGroupsResponse(BaseModel):
"""Réponse API pour les groupes disponibles."""
env_groups: List[str] = Field(default_factory=list)
role_groups: List[str] = Field(default_factory=list)
all_groups: List[str] = Field(default_factory=list)
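The two validators reject anything that is not an identifier-style name or a known group type. A quick sketch of what they accept and refuse (same import assumption as above):

```python
from pydantic import ValidationError
from app.schemas.group import GroupRequest

GroupRequest(name="env_prod", type="env")        # passes both validators

try:
    GroupRequest(name="env prod!", type="env")   # space and '!' fail the regex
except ValidationError as exc:
    print(exc.errors()[0]["msg"])

try:
    GroupRequest(name="misc", type="tag")        # only 'env' or 'role' is allowed
except ValidationError as exc:
    print(exc.errors()[0]["msg"])
```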

app/schemas/health.py (new file)

@ -0,0 +1,27 @@
"""
Schémas Pydantic pour les health checks.
"""
from typing import Optional
from pydantic import BaseModel
class HealthCheck(BaseModel):
"""Résultat d'un health check sur un hôte."""
host: str
ssh_ok: bool = False
ansible_ok: bool = False
sudo_ok: bool = False
reachable: bool = False
error_message: Optional[str] = None
response_time: Optional[float] = None
cached: bool = False
cache_age: int = 0
class GlobalHealthResponse(BaseModel):
"""Réponse du healthcheck global de l'API."""
status: str = "ok"
service: str = "homelab-automation-api"
timestamp: str

app/schemas/host_api.py (new file)

@ -0,0 +1,73 @@
"""
Schémas Pydantic pour les hôtes - modèles API.
Ces modèles sont utilisés pour les requêtes/réponses API,
distincts des modèles de base de données.
"""
from datetime import datetime, timezone
from typing import Optional, List, Literal
from pydantic import BaseModel, Field, ConfigDict, field_validator
class Host(BaseModel):
"""Modèle complet d'un hôte pour l'API."""
id: str
name: str
ip: str
status: Literal["online", "offline", "warning", "unknown"] = "unknown"
os: str = "Linux"
last_seen: Optional[datetime] = None
created_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
groups: List[str] = Field(default_factory=list)
bootstrap_ok: bool = False
bootstrap_date: Optional[datetime] = None
model_config = ConfigDict(
json_encoders={datetime: lambda v: v.isoformat() if v else None}
)
class HostRequest(BaseModel):
"""Requête de création d'un hôte."""
name: str = Field(..., min_length=3, max_length=100, description="Hostname (ex: server.domain.home)")
ip: Optional[str] = Field(default=None, description="Adresse IP ou hostname (optionnel si hostname résolvable)")
os: str = Field(default="Linux", min_length=3, max_length=50)
ssh_user: Optional[str] = Field(default="root", min_length=1, max_length=50)
ssh_port: int = Field(default=22, ge=1, le=65535)
description: Optional[str] = Field(default=None, max_length=200)
env_group: str = Field(..., description="Groupe d'environnement (ex: env_homelab, env_prod)")
role_groups: List[str] = Field(default_factory=list, description="Groupes de rôles (ex: role_proxmox, role_sbc)")
class HostUpdateRequest(BaseModel):
"""Requête de mise à jour d'un hôte."""
env_group: Optional[str] = Field(default=None, description="Nouveau groupe d'environnement")
role_groups: Optional[List[str]] = Field(default=None, description="Nouveaux groupes de rôles")
ansible_host: Optional[str] = Field(default=None, description="Nouvelle adresse ansible_host")
class HostResponse(BaseModel):
"""Réponse API pour un hôte."""
id: str
name: str
ip: Optional[str] = None
status: str = "unknown"
os: str = "Linux"
last_seen: Optional[datetime] = None
created_at: Optional[datetime] = None
groups: List[str] = Field(default_factory=list)
bootstrap_ok: bool = False
bootstrap_date: Optional[datetime] = None
model_config = ConfigDict(from_attributes=True)
class AnsibleInventoryHost(BaseModel):
"""Hôte de l'inventaire Ansible."""
name: str
ansible_host: str
group: str
groups: List[str] = Field(default_factory=list)
vars: dict = Field(default_factory=dict)
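Only `name` and `env_group` are mandatory in `HostRequest`; everything else falls back to the declared defaults. A hedged example (the host and group names are illustrative):

```python
from app.schemas.host_api import HostRequest

req = HostRequest(
    name="nas.domain.home",
    env_group="env_homelab",
    role_groups=["role_storage"],   # hypothetical role group
)
# Unset optional fields take their declared defaults
print(req.ip, req.ssh_user, req.ssh_port, req.os)  # None root 22 Linux
```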

app/schemas/host_metrics.py (new file)

@ -0,0 +1,172 @@
from __future__ import annotations
from datetime import datetime
from typing import Optional, List, Dict, Any
from pydantic import BaseModel, Field, ConfigDict
class DiskInfo(BaseModel):
"""Informations sur un disque/partition"""
mount_point: str
filesystem: Optional[str] = None
total_gb: float
used_gb: float
free_gb: float
usage_percent: float
class NetworkInterface(BaseModel):
"""Informations sur une interface réseau"""
name: str
ip_address: Optional[str] = None
mac_address: Optional[str] = None
status: Optional[str] = None
class HostMetricsBase(BaseModel):
"""Schéma de base pour les métriques d'hôte"""
metric_type: str = Field(..., description="Type de métrique (system_info, full, etc.)")
# CPU
cpu_count: Optional[int] = None
cpu_model: Optional[str] = None
cpu_cores: Optional[int] = None
cpu_threads: Optional[int] = None
cpu_threads_per_core: Optional[int] = None
cpu_sockets: Optional[int] = None
cpu_mhz: Optional[float] = None
cpu_max_mhz: Optional[float] = None
cpu_min_mhz: Optional[float] = None
cpu_load_1m: Optional[float] = None
cpu_load_5m: Optional[float] = None
cpu_load_15m: Optional[float] = None
cpu_usage_percent: Optional[float] = None
cpu_temperature: Optional[float] = None
# Memory
memory_total_mb: Optional[int] = None
memory_used_mb: Optional[int] = None
memory_free_mb: Optional[int] = None
memory_usage_percent: Optional[float] = None
swap_total_mb: Optional[int] = None
swap_used_mb: Optional[int] = None
swap_usage_percent: Optional[float] = None
# Disk
disk_info: Optional[List[Dict[str, Any]]] = None
disk_devices: Optional[List[Dict[str, Any]]] = None
disk_root_total_gb: Optional[float] = None
disk_root_used_gb: Optional[float] = None
disk_root_usage_percent: Optional[float] = None
# Storage stacks
lvm_info: Optional[Dict[str, Any]] = None
zfs_info: Optional[Dict[str, Any]] = None
# System info
os_name: Optional[str] = None
os_version: Optional[str] = None
kernel_version: Optional[str] = None
hostname: Optional[str] = None
uptime_seconds: Optional[int] = None
uptime_human: Optional[str] = None
# Network
network_info: Optional[List[Dict[str, Any]]] = None
class HostMetricsCreate(HostMetricsBase):
"""Schéma pour créer des métriques"""
host_id: str
raw_data: Optional[Dict[str, Any]] = None
collection_source: Optional[str] = None
collection_duration_ms: Optional[int] = None
error_message: Optional[str] = None
class HostMetricsOut(HostMetricsBase):
"""Schéma de sortie pour les métriques"""
id: int
host_id: str
collected_at: datetime
created_at: datetime
collection_source: Optional[str] = None
collection_duration_ms: Optional[int] = None
error_message: Optional[str] = None
model_config = ConfigDict(from_attributes=True)
class HostMetricsSummary(BaseModel):
"""Résumé des métriques pour affichage dans la carte d'hôte"""
host_id: str
host_name: Optional[str] = None
last_collected: Optional[datetime] = None
# Résumé CPU
cpu_usage_percent: Optional[float] = None
cpu_load_1m: Optional[float] = None
cpu_temperature: Optional[float] = None
cpu_model: Optional[str] = None
cpu_count: Optional[int] = None
cpu_cores: Optional[int] = None
cpu_threads: Optional[int] = None
cpu_max_mhz: Optional[float] = None
# Résumé mémoire
memory_usage_percent: Optional[float] = None
memory_total_mb: Optional[int] = None
memory_used_mb: Optional[int] = None
# Résumé disque (partition racine)
disk_root_usage_percent: Optional[float] = None
disk_root_total_gb: Optional[float] = None
disk_root_used_gb: Optional[float] = None
disk_info: Optional[List[Dict[str, Any]]] = None
disk_devices: Optional[List[Dict[str, Any]]] = None
# Storage stacks
lvm_info: Optional[Dict[str, Any]] = None
zfs_info: Optional[Dict[str, Any]] = None
# Système
os_name: Optional[str] = None
uptime_human: Optional[str] = None
# Statut de la collecte
collection_status: str = "unknown" # success, failed, pending, unknown
error_message: Optional[str] = None
class BuiltinPlaybookDefinition(BaseModel):
"""Définition d'un builtin playbook"""
id: str = Field(..., description="Identifiant unique du builtin playbook")
name: str = Field(..., description="Nom affiché")
description: str = Field(..., description="Description du playbook")
playbook_file: str = Field(..., description="Nom du fichier playbook Ansible")
category: str = Field(..., description="Catégorie (metrics, maintenance, etc.)")
icon: str = Field(default="fas fa-cog", description="Icône FontAwesome")
color: str = Field(default="blue", description="Couleur du bouton")
collect_metrics: bool = Field(default=False, description="Si true, les résultats sont stockés comme métriques")
schedule_enabled: bool = Field(default=True, description="Peut être planifié")
visible_in_ui: bool = Field(default=True, description="Visible dans l'interface")
class BuiltinPlaybookExecutionRequest(BaseModel):
"""Requête d'exécution d'un builtin playbook"""
builtin_id: str = Field(..., description="ID du builtin playbook")
target: str = Field(..., description="Cible (host ou groupe)")
extra_vars: Optional[Dict[str, Any]] = None
class BuiltinPlaybookExecutionResult(BaseModel):
"""Résultat d'exécution d'un builtin playbook"""
success: bool
builtin_id: str
target: str
execution_time: float
metrics_saved: bool = False
hosts_processed: int = 0
error: Optional[str] = None
log_path: Optional[str] = None
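The create/out/summary split mirrors the usual write-model vs read-model pattern: the collector writes a `HostMetricsCreate`, the UI reads a `HostMetricsSummary`. A small sketch of deriving one from the other (all field values are made up):

```python
from app.schemas.host_metrics import HostMetricsCreate, HostMetricsSummary

m = HostMetricsCreate(
    host_id="host_abc123",          # hypothetical id
    metric_type="system_info",
    cpu_count=4,
    memory_total_mb=8192,
    memory_used_mb=2048,
)

summary = HostMetricsSummary(
    host_id=m.host_id,
    cpu_count=m.cpu_count,
    memory_total_mb=m.memory_total_mb,
    memory_usage_percent=round(100 * m.memory_used_mb / m.memory_total_mb, 1),  # 25.0
    collection_status="success",
)
print(summary.model_dump(exclude_none=True))
```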

(modified file)

@ -252,7 +252,7 @@ class NotificationTemplates:
            details.append(f"• Taille : {size}")
        return NotificationRequest(
-           topic=None,
+           topic="homelab-backup",
            title="✅ Backup terminé avec succès",
            message="\n".join(details),
            priority=3,
@ -266,7 +266,7 @@ class NotificationTemplates:
    ) -> NotificationRequest:
        """Notification d'échec de backup."""
        return NotificationRequest(
-           topic=None,
+           topic="homelab-backup",
            title="❌ Échec du backup",
            message=f"• Hôte : {hostname}\n• Erreur : {error}",
            priority=5,
@ -277,18 +277,18 @@ class NotificationTemplates:
    def bootstrap_started(hostname: str) -> NotificationRequest:
        """Notification de début de bootstrap."""
        return NotificationRequest(
-           topic=None,
+           topic="homelab-bootstrap",
            title="🔧 Bootstrap en cours",
            message=f"Configuration initiale en cours pour l'hôte {hostname}.",
            priority=3,
-           tags=["wrench", "computer"]
+           tags=["rocket", "wrench", "computer"]
        )

    @staticmethod
    def bootstrap_success(hostname: str) -> NotificationRequest:
        """Notification de succès de bootstrap."""
        return NotificationRequest(
-           topic=None,
+           topic="homelab-bootstrap",
            title="✅ Bootstrap terminé avec succès",
            message=f"L'hôte {hostname} est maintenant configuré et prêt pour Ansible.",
            priority=3,
@ -299,7 +299,7 @@ class NotificationTemplates:
    def bootstrap_failed(hostname: str, error: str) -> NotificationRequest:
        """Notification d'échec de bootstrap."""
        return NotificationRequest(
-           topic=None,
+           topic="homelab-bootstrap",
            title="❌ Échec du bootstrap",
            message=f"• Hôte : {hostname}\n• Erreur : {error}",
            priority=5,
@ -315,16 +315,16 @@ class NotificationTemplates:
        """Notification de changement d'état de santé."""
        if new_status == "down":
            return NotificationRequest(
-               topic=None,
-               title="🔴 Hôte inaccessible",
+               topic="homelab-health",
+               title="🔴 Host DOWN",
                message=f"L'hôte {hostname} ne répond plus." + (f"\n• Détails : {details}" if details else ""),
                priority=5,
                tags=["red_circle", "warning"]
            )
        else:
            return NotificationRequest(
-               topic=None,
-               title="🟢 Hôte de nouveau accessible",
+               topic="homelab-health",
+               title="🟢 Host UP",
                message=f"L'hôte {hostname} est de nouveau en ligne." + (f"\n• Détails : {details}" if details else ""),
                priority=3,
                tags=["green_circle", "white_check_mark"]
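The behavioural change above is that every template now publishes to a dedicated ntfy topic instead of `topic=None`. A hedged sketch of how a dispatcher might use that field (the `app.schemas.notification` module path and the server URL are assumptions, not shown in this diff):

```python
# Hypothetical dispatcher: route each template to its ntfy topic.
# Assumes NotificationRequest lives in app.schemas.notification.
from app.schemas.notification import NotificationRequest

NTFY_BASE = "https://ntfy.example.home"  # hypothetical server URL

def resolve_url(req: NotificationRequest, default_topic: str = "homelab") -> str:
    # Before this commit topic was always None, so every message fell back to
    # the default topic; now backup/bootstrap/health each have their own channel.
    return f"{NTFY_BASE}/{req.topic or default_topic}"
```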

app/schemas/schedule_api.py (new file)

@ -0,0 +1,161 @@
"""
Schémas Pydantic pour les schedules - modèles API complets.
"""
from datetime import datetime, timezone
from typing import Optional, List, Literal, Dict, Any
import uuid
from pydantic import BaseModel, Field, ConfigDict, field_validator
import pytz
class ScheduleRecurrence(BaseModel):
"""Configuration de récurrence pour un schedule."""
type: Literal["daily", "weekly", "monthly", "custom"] = "daily"
time: str = Field(default="02:00", description="Heure d'exécution HH:MM")
days: Optional[List[int]] = Field(default=None, description="Jours de la semaine (1-7, lundi=1) pour weekly")
day_of_month: Optional[int] = Field(default=None, ge=1, le=31, description="Jour du mois (1-31) pour monthly")
cron_expression: Optional[str] = Field(default=None, description="Expression cron pour custom")
class Schedule(BaseModel):
"""Modèle complet d'un schedule pour l'API."""
id: str = Field(default_factory=lambda: f"sched_{uuid.uuid4().hex[:12]}")
name: str = Field(..., min_length=3, max_length=100, description="Nom du schedule")
description: Optional[str] = Field(default=None, max_length=500)
playbook: str = Field(..., description="Nom du playbook à exécuter")
target_type: Literal["group", "host"] = Field(default="group", description="Type de cible")
target: str = Field(default="all", description="Nom du groupe ou hôte cible")
extra_vars: Optional[Dict[str, Any]] = Field(default=None, description="Variables supplémentaires")
schedule_type: Literal["once", "recurring"] = Field(default="recurring")
recurrence: Optional[ScheduleRecurrence] = Field(default=None)
timezone: str = Field(default="America/Montreal", description="Fuseau horaire")
start_at: Optional[datetime] = Field(default=None, description="Date de début (optionnel)")
end_at: Optional[datetime] = Field(default=None, description="Date de fin (optionnel)")
next_run_at: Optional[datetime] = Field(default=None, description="Prochaine exécution calculée")
last_run_at: Optional[datetime] = Field(default=None, description="Dernière exécution")
last_status: Literal["success", "failed", "running", "never"] = Field(default="never")
enabled: bool = Field(default=True, description="Schedule actif ou en pause")
retry_on_failure: int = Field(default=0, ge=0, le=3, description="Nombre de tentatives en cas d'échec")
timeout: int = Field(default=3600, ge=60, le=86400, description="Timeout en secondes")
notification_type: Literal["none", "all", "errors"] = Field(default="all", description="Type de notification")
tags: List[str] = Field(default_factory=list, description="Tags pour catégorisation")
run_count: int = Field(default=0, description="Nombre total d'exécutions")
success_count: int = Field(default=0, description="Nombre de succès")
failure_count: int = Field(default=0, description="Nombre d'échecs")
created_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
updated_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
model_config = ConfigDict(
json_encoders={datetime: lambda v: v.isoformat() if v else None}
)
@field_validator('recurrence', mode='before')
@classmethod
def validate_recurrence(cls, v, info):
return v
class ScheduleRun(BaseModel):
"""Historique d'une exécution de schedule."""
id: str = Field(default_factory=lambda: f"run_{uuid.uuid4().hex[:12]}")
schedule_id: str = Field(..., description="ID du schedule parent")
task_id: Optional[str] = Field(default=None, description="ID de la tâche créée")
started_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
finished_at: Optional[datetime] = Field(default=None)
status: Literal["running", "success", "failed", "canceled"] = Field(default="running")
duration_seconds: Optional[float] = Field(default=None)
hosts_impacted: int = Field(default=0)
error_message: Optional[str] = Field(default=None)
retry_attempt: int = Field(default=0, description="Numéro de la tentative (0 = première)")
model_config = ConfigDict(
json_encoders={datetime: lambda v: v.isoformat() if v else None}
)
class ScheduleCreateRequest(BaseModel):
"""Requête de création d'un schedule."""
name: str = Field(..., min_length=3, max_length=100)
description: Optional[str] = Field(default=None, max_length=500)
playbook: str = Field(...)
target_type: Literal["group", "host"] = Field(default="group")
target: str = Field(default="all")
extra_vars: Optional[Dict[str, Any]] = Field(default=None)
schedule_type: Literal["once", "recurring"] = Field(default="recurring")
recurrence: Optional[ScheduleRecurrence] = Field(default=None)
timezone: str = Field(default="America/Montreal")
start_at: Optional[datetime] = Field(default=None)
end_at: Optional[datetime] = Field(default=None)
enabled: bool = Field(default=True)
retry_on_failure: int = Field(default=0, ge=0, le=3)
timeout: int = Field(default=3600, ge=60, le=86400)
notification_type: Literal["none", "all", "errors"] = Field(default="all")
tags: List[str] = Field(default_factory=list)
@field_validator('timezone')
@classmethod
def validate_timezone(cls, v: str) -> str:
try:
pytz.timezone(v)
return v
except pytz.exceptions.UnknownTimeZoneError:
raise ValueError(f"Fuseau horaire invalide: {v}")
class ScheduleUpdateRequest(BaseModel):
"""Requête de mise à jour d'un schedule."""
name: Optional[str] = Field(default=None, min_length=3, max_length=100)
description: Optional[str] = Field(default=None, max_length=500)
playbook: Optional[str] = Field(default=None)
target_type: Optional[Literal["group", "host"]] = Field(default=None)
target: Optional[str] = Field(default=None)
extra_vars: Optional[Dict[str, Any]] = Field(default=None)
schedule_type: Optional[Literal["once", "recurring"]] = Field(default=None)
recurrence: Optional[ScheduleRecurrence] = Field(default=None)
timezone: Optional[str] = Field(default=None)
start_at: Optional[datetime] = Field(default=None)
end_at: Optional[datetime] = Field(default=None)
enabled: Optional[bool] = Field(default=None)
retry_on_failure: Optional[int] = Field(default=None, ge=0, le=3)
timeout: Optional[int] = Field(default=None, ge=60, le=86400)
notification_type: Optional[Literal["none", "all", "errors"]] = Field(default=None)
tags: Optional[List[str]] = Field(default=None)
class ScheduleStats(BaseModel):
"""Statistiques globales des schedules."""
total: int = 0
active: int = 0
paused: int = 0
expired: int = 0
next_execution: Optional[datetime] = None
next_schedule_name: Optional[str] = None
failures_24h: int = 0
executions_24h: int = 0
success_rate_7d: float = 0.0
class ScheduleListResponse(BaseModel):
"""Réponse API pour la liste des schedules."""
schedules: List[dict] = Field(default_factory=list)
count: int = 0
class UpcomingExecution(BaseModel):
"""Prochaine exécution planifiée."""
schedule_id: str
schedule_name: str
playbook: str
target: str
next_run_at: Optional[str] = None
tags: List[str] = Field(default_factory=list)
class CronValidationResult(BaseModel):
"""Résultat de validation d'une expression cron."""
valid: bool
expression: str
next_runs: Optional[List[str]] = None
error: Optional[str] = None
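The timezone validator leans on `pytz`, which the module already imports. A short sketch of a weekly schedule request and of the validator rejecting a bogus zone (schedule and playbook names are illustrative):

```python
from pydantic import ValidationError
from app.schemas.schedule_api import ScheduleCreateRequest, ScheduleRecurrence

req = ScheduleCreateRequest(
    name="nightly-backup",              # hypothetical schedule
    playbook="backup.yml",
    target="env_homelab",
    recurrence=ScheduleRecurrence(type="weekly", time="02:00", days=[1, 3, 5]),
)
print(req.timezone)                     # America/Montreal (the default)

try:
    ScheduleCreateRequest(name="bad-tz", playbook="backup.yml", timezone="Mars/Olympus")
except ValidationError as exc:
    print(exc)                          # rejected by the pytz-based validator
```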

app/schemas/task_api.py (new file)

@ -0,0 +1,117 @@
"""
Schémas Pydantic pour les tâches - modèles API.
"""
from datetime import datetime, timezone
from typing import Optional, List, Literal, Dict, Any
from pydantic import BaseModel, Field, ConfigDict, field_validator
class Task(BaseModel):
"""Modèle complet d'une tâche pour l'API."""
id: str
name: str
host: str
status: Literal["pending", "running", "completed", "failed", "cancelled"] = "pending"
progress: int = Field(ge=0, le=100, default=0)
start_time: Optional[datetime] = None
end_time: Optional[datetime] = None
duration: Optional[str] = None
output: Optional[str] = None
error: Optional[str] = None
model_config = ConfigDict(
json_encoders={datetime: lambda v: v.isoformat() if v else None}
)
class TaskRequest(BaseModel):
"""Requête de création d'une tâche."""
host: Optional[str] = Field(default=None, description="Hôte cible")
group: Optional[str] = Field(default=None, description="Groupe cible")
action: str = Field(..., description="Action à exécuter")
cmd: Optional[str] = Field(default=None, description="Commande personnalisée")
extra_vars: Optional[Dict[str, Any]] = Field(default=None, description="Variables Ansible")
tags: Optional[List[str]] = Field(default=None, description="Tags Ansible")
dry_run: bool = Field(default=False, description="Mode simulation")
ssh_user: Optional[str] = Field(default=None, description="Utilisateur SSH")
ssh_password: Optional[str] = Field(default=None, description="Mot de passe SSH")
@field_validator('action')
@classmethod
def validate_action(cls, v: str) -> str:
valid_actions = ['upgrade', 'reboot', 'health-check', 'backup', 'deploy', 'rollback', 'maintenance', 'bootstrap']
if v not in valid_actions:
raise ValueError(f'Action doit être l\'une de: {", ".join(valid_actions)}')
return v
class TaskLogFile(BaseModel):
"""Représentation d'un fichier de log de tâche."""
id: str
filename: str
path: str
task_name: str
target: str
status: str
date: str # Format YYYY-MM-DD
year: str
month: str
day: str
created_at: datetime
size_bytes: int
start_time: Optional[str] = None
end_time: Optional[str] = None
duration: Optional[str] = None
duration_seconds: Optional[int] = None
hosts: List[str] = Field(default_factory=list)
category: Optional[str] = None
subcategory: Optional[str] = None
target_type: Optional[str] = None
source_type: Optional[str] = None
class TasksFilterParams(BaseModel):
"""Paramètres de filtrage des tâches."""
status: Optional[str] = None
year: Optional[str] = None
month: Optional[str] = None
day: Optional[str] = None
hour_start: Optional[str] = None
hour_end: Optional[str] = None
target: Optional[str] = None
source_type: Optional[str] = None
search: Optional[str] = None
limit: int = 50
offset: int = 0
class TaskLogsResponse(BaseModel):
"""Réponse API pour les logs de tâches."""
logs: List[dict] = Field(default_factory=list)
count: int = 0
total_count: int = 0
has_more: bool = False
filters: Dict[str, Optional[str]] = Field(default_factory=dict)
pagination: Dict[str, int] = Field(default_factory=dict)
class TaskLogDatesResponse(BaseModel):
"""Structure des dates disponibles pour le filtrage."""
years: Dict[str, Any] = Field(default_factory=dict)
class TaskStatsResponse(BaseModel):
"""Statistiques des tâches."""
total: int = 0
completed: int = 0
failed: int = 0
running: int = 0
pending: int = 0
class RunningTasksResponse(BaseModel):
"""Réponse API pour les tâches en cours."""
tasks: List[dict] = Field(default_factory=list)
count: int = 0
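`validate_action` whitelists the eight supported actions, so a typo is caught at the schema layer before anything reaches Ansible. Sketch:

```python
from pydantic import ValidationError
from app.schemas.task_api import TaskRequest

TaskRequest(host="nas.domain.home", action="health-check")   # accepted

try:
    TaskRequest(host="nas.domain.home", action="format-disk")
except ValidationError as exc:
    print(exc)   # message lists the valid actions from validate_action
```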

app/services/__init__.py (modified)

@ -2,9 +2,58 @@
Services métier pour l'API Homelab Automation.
"""
+from .auth_service import AuthService, auth_service, verify_password, hash_password, create_access_token, decode_token
from .notification_service import NotificationService, notification_service
+from .builtin_playbooks import BuiltinPlaybookService, builtin_playbook_service, init_builtin_playbook_service
+from .websocket_service import WebSocketManager, ws_manager
+from .host_status_service import HostStatusService, host_status_service
+from .bootstrap_status_service import BootstrapStatusService, bootstrap_status_service
+from .task_log_service import TaskLogService
+from .adhoc_history_service import AdHocHistoryService, adhoc_history_service
+from .ansible_service import AnsibleService, ansible_service
+from .scheduler_service import SchedulerService, scheduler_service
+from .hybrid_db import HybridDB, db
+from .console_log_service import ConsoleLogCapture, console_log_service

__all__ = [
+    # Auth
+    "AuthService",
+    "auth_service",
+    "verify_password",
+    "hash_password",
+    "create_access_token",
+    "decode_token",
+    # Notifications
    "NotificationService",
    "notification_service",
+    # Builtin playbooks
+    "BuiltinPlaybookService",
+    "builtin_playbook_service",
+    "init_builtin_playbook_service",
+    # WebSocket
+    "WebSocketManager",
+    "ws_manager",
+    # Host status
+    "HostStatusService",
+    "host_status_service",
+    # Bootstrap status
+    "BootstrapStatusService",
+    "bootstrap_status_service",
+    # Task logs
+    "TaskLogService",
+    # Ad-hoc history
+    "AdHocHistoryService",
+    "adhoc_history_service",
+    # Ansible
+    "AnsibleService",
+    "ansible_service",
+    # Scheduler
+    "SchedulerService",
+    "scheduler_service",
+    # Hybrid DB
+    "HybridDB",
+    "db",
+    # Console logs
+    "ConsoleLogCapture",
+    "console_log_service",
]

app/services/adhoc_history_service.py (new file)

@ -0,0 +1,304 @@
"""
Service de gestion de l'historique des commandes ad-hoc.
"""
import uuid
from datetime import datetime, timezone
from typing import List, Optional
from sqlalchemy import select, update, delete
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.database import async_session_maker
from app.schemas.ansible import AdHocHistoryEntry, AdHocHistoryCategory
from app.core.constants import DEFAULT_ADHOC_CATEGORIES
class AdHocHistoryService:
"""Service pour gérer l'historique des commandes ad-hoc en base de données."""
def __init__(self):
self._default_categories_initialized = False
async def _ensure_default_categories(self, session: AsyncSession):
"""S'assure que les catégories par défaut existent."""
if self._default_categories_initialized:
return
from app.crud.log import LogRepository
repo = LogRepository(session)
# Vérifier si des catégories existent déjà
existing = await self.get_categories()
if not existing:
# Créer les catégories par défaut
for cat in DEFAULT_ADHOC_CATEGORIES:
await self._create_category_internal(
session,
name=cat["name"],
description=cat.get("description"),
color=cat.get("color", "#7c3aed"),
icon=cat.get("icon", "fa-folder")
)
await session.commit()
self._default_categories_initialized = True
async def _create_category_internal(
self,
session: AsyncSession,
name: str,
description: str = None,
color: str = "#7c3aed",
icon: str = "fa-folder"
):
"""Crée une catégorie en base (interne)."""
from app.models.log import Log
# Utiliser le modèle Log avec un type spécial pour stocker les catégories
log = Log(
level="ADHOC_CATEGORY",
message=name,
source=description or "",
host_id=None,
task_id=f"{color}|{icon}", # Stocker color et icon dans task_id
)
session.add(log)
async def add_command(
self,
command: str,
target: str,
module: str = "shell",
become: bool = False,
category: str = "default",
description: str = None
) -> AdHocHistoryEntry:
"""Ajoute une commande à l'historique."""
async with async_session_maker() as session:
await self._ensure_default_categories(session)
from app.models.log import Log
cmd_id = f"adhoc_{uuid.uuid4().hex[:12]}"
log = Log(
level="ADHOC_COMMAND",
message=command,
source=category,
host_id=target,
task_id=cmd_id,
)
session.add(log)
await session.commit()
return AdHocHistoryEntry(
id=cmd_id,
command=command,
target=target,
module=module,
become=become,
category=category,
description=description,
created_at=datetime.now(timezone.utc),
last_used=datetime.now(timezone.utc),
use_count=1
)
async def get_commands(
self,
category: str = None,
search: str = None,
limit: int = 50
) -> List[AdHocHistoryEntry]:
"""Récupère les commandes de l'historique."""
async with async_session_maker() as session:
from app.models.log import Log
stmt = select(Log).where(Log.level == "ADHOC_COMMAND")
if category and category != "all":
stmt = stmt.where(Log.source == category)
stmt = stmt.order_by(Log.created_at.desc()).limit(limit)
result = await session.execute(stmt)
logs = result.scalars().all()
commands = []
for log in logs:
if search and search.lower() not in log.message.lower():
continue
commands.append(AdHocHistoryEntry(
id=log.task_id or str(log.id),
command=log.message,
target=log.host_id or "all",
module="shell",
become=False,
category=log.source or "default",
created_at=log.created_at,
last_used=log.created_at,
use_count=1
))
return commands
async def update_command_category(
self,
command_id: str,
category: str,
description: str = None
) -> bool:
"""Met à jour la catégorie d'une commande."""
async with async_session_maker() as session:
from app.models.log import Log
stmt = (
update(Log)
.where(Log.task_id == command_id)
.where(Log.level == "ADHOC_COMMAND")
.values(source=category)
)
result = await session.execute(stmt)
await session.commit()
return result.rowcount > 0
async def delete_command(self, command_id: str) -> bool:
"""Supprime une commande de l'historique."""
async with async_session_maker() as session:
from app.models.log import Log
stmt = (
delete(Log)
.where(Log.task_id == command_id)
.where(Log.level == "ADHOC_COMMAND")
)
result = await session.execute(stmt)
await session.commit()
return result.rowcount > 0
async def get_categories(self) -> List[AdHocHistoryCategory]:
"""Récupère la liste des catégories."""
async with async_session_maker() as session:
from app.models.log import Log
stmt = select(Log).where(Log.level == "ADHOC_CATEGORY")
result = await session.execute(stmt)
logs = result.scalars().all()
if not logs:
# Retourner les catégories par défaut
return [
AdHocHistoryCategory(
name=cat["name"],
description=cat.get("description"),
color=cat.get("color", "#7c3aed"),
icon=cat.get("icon", "fa-folder")
)
for cat in DEFAULT_ADHOC_CATEGORIES
]
categories = []
for log in logs:
# Extraire color et icon depuis task_id
color, icon = "#7c3aed", "fa-folder"
if log.task_id and "|" in log.task_id:
parts = log.task_id.split("|", 1)
color = parts[0]
icon = parts[1] if len(parts) > 1 else "fa-folder"
categories.append(AdHocHistoryCategory(
name=log.message,
description=log.source or None,
color=color,
icon=icon
))
return categories
async def add_category(
self,
name: str,
description: str = None,
color: str = "#7c3aed",
icon: str = "fa-folder"
) -> AdHocHistoryCategory:
"""Ajoute une nouvelle catégorie."""
async with async_session_maker() as session:
await self._create_category_internal(session, name, description, color, icon)
await session.commit()
return AdHocHistoryCategory(
name=name,
description=description,
color=color,
icon=icon
)
async def update_category(
self,
old_name: str,
new_name: str,
description: str = None,
color: str = "#7c3aed",
icon: str = "fa-folder"
) -> bool:
"""Met à jour une catégorie existante."""
async with async_session_maker() as session:
from app.models.log import Log
# Mettre à jour la catégorie
stmt = (
update(Log)
.where(Log.message == old_name)
.where(Log.level == "ADHOC_CATEGORY")
.values(
message=new_name,
source=description or "",
task_id=f"{color}|{icon}"
)
)
result = await session.execute(stmt)
# Mettre à jour les commandes associées si le nom a changé
if old_name != new_name:
stmt2 = (
update(Log)
.where(Log.source == old_name)
.where(Log.level == "ADHOC_COMMAND")
.values(source=new_name)
)
await session.execute(stmt2)
await session.commit()
return result.rowcount > 0
async def delete_category(self, name: str) -> bool:
"""Supprime une catégorie et déplace ses commandes vers 'default'."""
if name == "default":
return False
async with async_session_maker() as session:
from app.models.log import Log
# Déplacer les commandes vers default
stmt1 = (
update(Log)
.where(Log.source == name)
.where(Log.level == "ADHOC_COMMAND")
.values(source="default")
)
await session.execute(stmt1)
# Supprimer la catégorie
stmt2 = (
delete(Log)
.where(Log.message == name)
.where(Log.level == "ADHOC_CATEGORY")
)
result = await session.execute(stmt2)
await session.commit()
return result.rowcount > 0
# Instance singleton du service
adhoc_history_service = AdHocHistoryService()
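Because every method opens its own `async_session_maker()` session, the service is only usable from async code. A usage sketch (assumes the SQLite schema behind `app.models.database` has been created):

```python
import asyncio
from app.services.adhoc_history_service import adhoc_history_service

async def main():
    entry = await adhoc_history_service.add_command(
        command="uptime",
        target="env_homelab",
        category="default",
    )
    recent = await adhoc_history_service.get_commands(limit=10)
    print(entry.id, [c.command for c in recent])

asyncio.run(main())
```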

app/services/ansible_service.py (new file)

@ -0,0 +1,577 @@
"""
Service de gestion d'Ansible (playbooks, inventaire, exécution).
"""
import asyncio
import os
import re
import shutil
from datetime import datetime, timezone
from pathlib import Path
from time import perf_counter
from typing import Any, Dict, List, Optional
import yaml
from app.core.config import settings
from app.schemas.host_api import AnsibleInventoryHost
from app.schemas.ansible import PlaybookInfo
class AnsibleService:
"""Service pour gérer les playbooks et l'inventaire Ansible."""
def __init__(self, ansible_dir: Path = None, ssh_key_path: str = None, ssh_user: str = None):
self.ansible_dir = ansible_dir or settings.ansible_dir
self.playbooks_dir = self.ansible_dir / "playbooks"
self.inventory_path = self.ansible_dir / "inventory" / "hosts.yml"
self.ssh_key_path = ssh_key_path or settings.ssh_key_path
self.ssh_user = ssh_user or settings.ssh_user
# Cache
self._inventory_cache: Optional[Dict] = None
self._inventory_cache_time: float = 0
self._playbooks_cache: Optional[List[PlaybookInfo]] = None
self._playbooks_cache_time: float = 0
self._cache_ttl = settings.inventory_cache_ttl
def invalidate_cache(self):
"""Invalide les caches."""
self._inventory_cache = None
self._playbooks_cache = None
# ===== PLAYBOOKS =====
def get_playbooks(self) -> List[Dict[str, Any]]:
"""Récupère la liste des playbooks disponibles."""
import time
current_time = time.time()
if self._playbooks_cache and (current_time - self._playbooks_cache_time) < self._cache_ttl:
return self._playbooks_cache
playbooks = []
if not self.playbooks_dir.exists():
return playbooks
# Parcourir le répertoire principal
for item in self.playbooks_dir.iterdir():
if item.is_file() and item.suffix in ['.yml', '.yaml']:
pb = self._parse_playbook_file(item, "general", "other")
if pb:
playbooks.append(pb)
elif item.is_dir() and not item.name.startswith('.'):
# Sous-répertoire = catégorie
category = item.name
for subitem in item.iterdir():
if subitem.is_file() and subitem.suffix in ['.yml', '.yaml']:
pb = self._parse_playbook_file(subitem, category, "other")
if pb:
playbooks.append(pb)
elif subitem.is_dir() and not subitem.name.startswith('.'):
# Sous-sous-répertoire = subcategory
subcategory = subitem.name
for subsubitem in subitem.iterdir():
if subsubitem.is_file() and subsubitem.suffix in ['.yml', '.yaml']:
pb = self._parse_playbook_file(subsubitem, category, subcategory)
if pb:
playbooks.append(pb)
self._playbooks_cache = playbooks
self._playbooks_cache_time = current_time
return playbooks
def _parse_playbook_file(self, file_path: Path, category: str, subcategory: str) -> Optional[Dict[str, Any]]:
"""Parse un fichier playbook et extrait ses métadonnées."""
try:
stat = file_path.stat()
# Lire le contenu pour extraire hosts
hosts = "all"
description = None
try:
content = file_path.read_text(encoding='utf-8')
data = yaml.safe_load(content)
if isinstance(data, list) and len(data) > 0:
first_play = data[0]
if isinstance(first_play, dict):
hosts = first_play.get('hosts', 'all')
# Chercher une description dans les commentaires
if content.startswith('#'):
first_line = content.split('\n')[0]
description = first_line.lstrip('#').strip()
except Exception:
pass
return {
"name": file_path.stem,
"filename": file_path.name,
"path": str(file_path),
"category": category,
"subcategory": subcategory,
"hosts": hosts,
"size": stat.st_size,
"modified": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
"description": description
}
except Exception:
return None
def get_playbook_categories(self) -> Dict[str, List[str]]:
"""Retourne les catégories de playbooks organisées."""
playbooks = self.get_playbooks()
categories = {}
for pb in playbooks:
cat = pb.get("category", "general")
subcat = pb.get("subcategory", "other")
if cat not in categories:
categories[cat] = []
if subcat not in categories[cat]:
categories[cat].append(subcat)
return categories
def is_target_compatible_with_playbook(self, target: str, playbook_hosts: str) -> bool:
"""Vérifie si une cible est compatible avec un playbook."""
# 'all' est toujours compatible
if playbook_hosts == 'all' or target == 'all':
return True
# Si le playbook cible exactement notre target
if playbook_hosts == target:
return True
# Vérifier si target fait partie des hosts du playbook
# Le playbook peut avoir une expression avec ":"
pb_hosts = [h.strip() for h in playbook_hosts.split(':')]
if target in pb_hosts:
return True
# Si le target est un groupe qui pourrait contenir les hosts du playbook
# Dans ce cas, on laisse passer car c'est géré par Ansible
groups = self.get_groups()
if target in groups:
return True
# Si le playbook cible un groupe spécifique et notre target est un hôte
hosts = self.get_hosts_from_inventory()
host_names = [h.name for h in hosts]
if target in host_names:
return True
return False
def get_compatible_playbooks(self, target: str) -> List[Dict[str, Any]]:
"""Retourne les playbooks compatibles avec une cible."""
all_playbooks = self.get_playbooks()
compatible = []
for pb in all_playbooks:
if self.is_target_compatible_with_playbook(target, pb.get('hosts', 'all')):
compatible.append(pb)
return compatible
# ===== INVENTAIRE =====
def load_inventory(self) -> Dict:
"""Charge l'inventaire Ansible depuis le fichier YAML."""
import time
current_time = time.time()
if self._inventory_cache and (current_time - self._inventory_cache_time) < self._cache_ttl:
return self._inventory_cache
if not self.inventory_path.exists():
return {}
try:
with open(self.inventory_path, 'r', encoding='utf-8') as f:
inventory = yaml.safe_load(f) or {}
self._inventory_cache = inventory
self._inventory_cache_time = current_time
return inventory
except Exception:
return {}
def _save_inventory(self, inventory: Dict):
"""Sauvegarde l'inventaire dans le fichier YAML."""
self.inventory_path.parent.mkdir(parents=True, exist_ok=True)
with open(self.inventory_path, 'w', encoding='utf-8') as f:
yaml.dump(inventory, f, default_flow_style=False, allow_unicode=True)
# Invalider le cache
self._inventory_cache = None
def get_hosts_from_inventory(self, group_filter: str = None) -> List[AnsibleInventoryHost]:
"""Récupère les hôtes depuis l'inventaire Ansible."""
inventory = self.load_inventory()
# Dictionnaire pour collecter tous les groupes de chaque hôte
host_data: Dict[str, Dict] = {}
def extract_hosts(data: Dict, parent_group: str = None):
if not isinstance(data, dict):
return
for key, value in data.items():
if key == 'hosts' and isinstance(value, dict):
for host_name, host_vars in value.items():
if host_name not in host_data:
ansible_host = host_name
if isinstance(host_vars, dict):
ansible_host = host_vars.get('ansible_host', host_name)
host_data[host_name] = {
'ansible_host': ansible_host,
'groups': [],
'vars': host_vars if isinstance(host_vars, dict) else {}
}
# Ajouter ce groupe à la liste des groupes de l'hôte
if parent_group and parent_group not in host_data[host_name]['groups']:
host_data[host_name]['groups'].append(parent_group)
elif key == 'children' and isinstance(value, dict):
for child_group, child_data in value.items():
extract_hosts(child_data, child_group)
elif isinstance(value, dict) and key not in ['hosts', 'vars', 'children']:
extract_hosts(value, key)
extract_hosts(inventory)
# Convertir en liste d'objets AnsibleInventoryHost
hosts = []
for host_name, data in host_data.items():
# Filtrer par groupe si demandé
if group_filter and group_filter not in data['groups']:
continue
# Déterminer le groupe principal (premier groupe env_ ou premier groupe)
primary_group = "ungrouped"
for g in data['groups']:
if g.startswith('env_'):
primary_group = g
break
if primary_group == "ungrouped" and data['groups']:
primary_group = data['groups'][0]
hosts.append(AnsibleInventoryHost(
name=host_name,
ansible_host=data['ansible_host'],
group=primary_group,
groups=data['groups'],
vars=data['vars']
))
return hosts
def get_groups(self) -> List[str]:
"""Récupère la liste de tous les groupes."""
inventory = self.load_inventory()
groups = set()
def extract_groups(data: Dict):
if not isinstance(data, dict):
return
for key, value in data.items():
if key in ['hosts', 'vars']:
continue
if key == 'children' and isinstance(value, dict):
for child_group, child_data in value.items():
groups.add(child_group)
extract_groups(child_data)
elif isinstance(value, dict):
groups.add(key)
extract_groups(value)
extract_groups(inventory)
return sorted(list(groups))
def get_env_groups(self) -> List[str]:
"""Récupère les groupes d'environnement (préfixe env_)."""
return [g for g in self.get_groups() if g.startswith('env_')]
def get_role_groups(self) -> List[str]:
"""Récupère les groupes de rôles (préfixe role_)."""
return [g for g in self.get_groups() if g.startswith('role_')]
def host_exists(self, hostname: str) -> bool:
"""Vérifie si un hôte existe dans l'inventaire."""
hosts = self.get_hosts_from_inventory()
return any(h.name == hostname or h.ansible_host == hostname for h in hosts)
def group_exists(self, group_name: str) -> bool:
"""Vérifie si un groupe existe."""
return group_name in self.get_groups()
def add_host_to_inventory(
self,
hostname: str,
env_group: str,
role_groups: List[str] = None,
ansible_host: str = None
):
"""Ajoute un hôte à l'inventaire."""
inventory = self.load_inventory()
if 'all' not in inventory:
inventory['all'] = {'children': {}}
children = inventory['all'].setdefault('children', {})
# Ajouter au groupe d'environnement
if env_group not in children:
children[env_group] = {'hosts': {}}
env_data = children[env_group]
if 'hosts' not in env_data:
env_data['hosts'] = {}
host_vars = {}
if ansible_host and ansible_host != hostname:
host_vars['ansible_host'] = ansible_host
env_data['hosts'][hostname] = host_vars or None
# Ajouter aux groupes de rôles
for role in (role_groups or []):
if role not in children:
children[role] = {'hosts': {}}
role_data = children[role]
if 'hosts' not in role_data:
role_data['hosts'] = {}
role_data['hosts'][hostname] = None
self._save_inventory(inventory)
def remove_host_from_inventory(self, hostname: str):
"""Supprime un hôte de l'inventaire."""
inventory = self.load_inventory()
def remove_from_dict(data: Dict):
if not isinstance(data, dict):
return
if 'hosts' in data and isinstance(data['hosts'], dict):
data['hosts'].pop(hostname, None)
if 'children' in data and isinstance(data['children'], dict):
for child_data in data['children'].values():
remove_from_dict(child_data)
for key, value in list(data.items()):
if key not in ['hosts', 'vars', 'children'] and isinstance(value, dict):
remove_from_dict(value)
remove_from_dict(inventory)
self._save_inventory(inventory)
def update_host_groups(
self,
hostname: str,
env_group: str = None,
role_groups: List[str] = None,
ansible_host: str = None
):
"""Met à jour les groupes d'un hôte."""
# Supprimer l'hôte de tous les groupes
self.remove_host_from_inventory(hostname)
# Réajouter avec les nouveaux groupes
if env_group:
self.add_host_to_inventory(
hostname=hostname,
env_group=env_group,
role_groups=role_groups or [],
ansible_host=ansible_host
)
def add_group(self, group_name: str, group_type: str = "role"):
"""Ajoute un nouveau groupe."""
inventory = self.load_inventory()
if 'all' not in inventory:
inventory['all'] = {'children': {}}
children = inventory['all'].setdefault('children', {})
if group_name not in children:
children[group_name] = {'hosts': {}}
self._save_inventory(inventory)
def rename_group(self, old_name: str, new_name: str):
"""Renomme un groupe."""
inventory = self.load_inventory()
if 'all' not in inventory or 'children' not in inventory['all']:
return
children = inventory['all']['children']
if old_name in children:
children[new_name] = children.pop(old_name)
self._save_inventory(inventory)
def delete_group(self, group_name: str, move_hosts_to: str = None):
"""Supprime un groupe (optionnellement déplace les hôtes)."""
inventory = self.load_inventory()
if 'all' not in inventory or 'children' not in inventory['all']:
return
children = inventory['all']['children']
if group_name not in children:
return
# Récupérer les hôtes du groupe
group_hosts = []
if 'hosts' in children[group_name]:
group_hosts = list(children[group_name]['hosts'].keys())
# Déplacer les hôtes si demandé
if move_hosts_to and move_hosts_to in children:
target_group = children[move_hosts_to]
if 'hosts' not in target_group:
target_group['hosts'] = {}
for host in group_hosts:
target_group['hosts'][host] = children[group_name]['hosts'].get(host)
# Supprimer le groupe
del children[group_name]
self._save_inventory(inventory)
def get_group_hosts(self, group_name: str) -> List[str]:
"""Récupère les hôtes d'un groupe."""
hosts = self.get_hosts_from_inventory(group_filter=group_name)
return [h.name for h in hosts]
# ===== EXÉCUTION =====
async def execute_playbook(
self,
playbook: str,
target: str = "all",
extra_vars: Dict[str, Any] = None,
check_mode: bool = False,
verbose: bool = False
) -> Dict[str, Any]:
"""Exécute un playbook Ansible de manière asynchrone."""
start_time = perf_counter()
# Construire le chemin du playbook
if not playbook.endswith(('.yml', '.yaml')):
playbook = f"{playbook}.yml"
playbook_path = self._find_playbook_path(playbook)
if not playbook_path or not playbook_path.exists():
raise FileNotFoundError(f"Playbook non trouvé: {playbook}")
# Trouver la clé SSH
private_key = self._find_ssh_private_key()
# Construire la commande
cmd = [
"ansible-playbook",
str(playbook_path),
"-i", str(self.inventory_path),
"-l", target,
]
if check_mode:
cmd.append("--check")
if verbose:
cmd.append("-v")
if private_key:
cmd.extend(["--private-key", private_key])
if self.ssh_user:
cmd.extend(["-u", self.ssh_user])
if extra_vars:
import json
cmd.extend(["--extra-vars", json.dumps(extra_vars)])
# Exécuter la commande
try:
process = await asyncio.create_subprocess_exec(
*cmd,
stdout=asyncio.subprocess.PIPE,
stderr=asyncio.subprocess.PIPE,
cwd=str(self.ansible_dir)
)
stdout, stderr = await process.communicate()
execution_time = perf_counter() - start_time
return {
"success": process.returncode == 0,
"return_code": process.returncode,
"stdout": stdout.decode('utf-8', errors='replace'),
"stderr": stderr.decode('utf-8', errors='replace'),
"execution_time": execution_time,
"playbook": playbook,
"target": target,
"check_mode": check_mode
}
except FileNotFoundError:
return {
"success": False,
"return_code": -1,
"stdout": "",
"stderr": "ansible-playbook non trouvé. Vérifiez que Ansible est installé.",
"execution_time": perf_counter() - start_time,
"playbook": playbook,
"target": target,
"check_mode": check_mode
}
def _find_playbook_path(self, playbook: str) -> Optional[Path]:
"""Trouve le chemin complet d'un playbook."""
# Chemin direct
direct_path = self.playbooks_dir / playbook
if direct_path.exists():
return direct_path
# Chercher dans les sous-répertoires
for item in self.playbooks_dir.rglob(playbook):
if item.is_file():
return item
return None
def _find_ssh_private_key(self) -> Optional[str]:
"""Trouve une clé SSH privée valide."""
# Essayer le chemin configuré
if self.ssh_key_path:
key_path = Path(self.ssh_key_path)
if key_path.exists():
return str(key_path)
# Chercher dans les emplacements standard
candidates = [
Path.home() / ".ssh" / "id_rsa",
Path.home() / ".ssh" / "id_ed25519",
Path.home() / ".ssh" / "id_ecdsa",
Path("/app/docker/ssh_keys/id_automation_ansible"),
]
for candidate in candidates:
if candidate.exists():
return str(candidate)
return None
# Instance singleton du service
ansible_service = AnsibleService()
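A hedged usage sketch of the executor: `check_mode=True` keeps the run side-effect free, and the `.yml` suffix is appended automatically. It assumes `ansible-playbook` is on PATH, the inventory file exists, and a playbook named `upgrade.yml` is present (illustrative name):

```python
import asyncio
from app.services.ansible_service import ansible_service

async def main():
    result = await ansible_service.execute_playbook(
        playbook="upgrade",        # resolved to upgrade.yml via _find_playbook_path
        target="env_homelab",
        check_mode=True,           # dry run: adds --check to the command
    )
    print(result["success"], result["return_code"], f"{result['execution_time']:.1f}s")

asyncio.run(main())
```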

app/services/auth_service.py (new file)

@ -0,0 +1,129 @@
"""Authentication service with JWT and password hashing.
Uses:
- python-jose for JWT encoding/decoding
- bcrypt for password hashing
"""
from __future__ import annotations
import os
from datetime import datetime, timedelta, timezone
from typing import Optional
import bcrypt
from jose import JWTError, jwt
from app.models.user import User
from app.schemas.auth import TokenData
# Configuration from environment variables
SECRET_KEY = os.environ.get("JWT_SECRET_KEY", "homelab-secret-key-change-in-production")
ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = int(os.environ.get("JWT_EXPIRE_MINUTES", "1440")) # 24 hours default
class AuthService:
"""Service for authentication operations."""
@staticmethod
def verify_password(plain_password: str, hashed_password: str) -> bool:
"""Verify a password against its hash."""
return bcrypt.checkpw(
plain_password.encode('utf-8'),
hashed_password.encode('utf-8')
)
@staticmethod
def hash_password(password: str) -> str:
"""Hash a password for storage."""
salt = bcrypt.gensalt()
return bcrypt.hashpw(password.encode('utf-8'), salt).decode('utf-8')
@staticmethod
def create_access_token(
data: dict,
expires_delta: Optional[timedelta] = None
) -> tuple[str, int]:
"""Create a JWT access token.
Returns:
Tuple of (token_string, expires_in_seconds)
"""
to_encode = data.copy()
if expires_delta:
expire = datetime.now(timezone.utc) + expires_delta
expires_in = int(expires_delta.total_seconds())
else:
expire = datetime.now(timezone.utc) + timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
expires_in = ACCESS_TOKEN_EXPIRE_MINUTES * 60
to_encode.update({
"exp": expire,
"iat": datetime.now(timezone.utc),
})
encoded_jwt = jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)
return encoded_jwt, expires_in
@staticmethod
def decode_token(token: str) -> Optional[TokenData]:
"""Decode and validate a JWT token.
Returns:
TokenData if valid, None if invalid or expired.
"""
try:
payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
username: str = payload.get("sub")
user_id: int = payload.get("user_id")
role: str = payload.get("role")
if username is None:
return None
return TokenData(username=username, user_id=user_id, role=role)
except JWTError:
return None
@staticmethod
def create_token_for_user(user: User) -> tuple[str, int]:
"""Create a JWT token for a user.
Returns:
Tuple of (token_string, expires_in_seconds)
"""
token_data = {
"sub": user.username,
"user_id": user.id,
"role": user.role,
}
return AuthService.create_access_token(token_data)
# Convenience functions for direct use
def verify_password(plain_password: str, hashed_password: str) -> bool:
"""Verify a password against its hash."""
return AuthService.verify_password(plain_password, hashed_password)
def hash_password(password: str) -> str:
"""Hash a password for storage."""
return AuthService.hash_password(password)
def create_access_token(
data: dict,
expires_delta: Optional[timedelta] = None
) -> tuple[str, int]:
"""Create a JWT access token."""
return AuthService.create_access_token(data, expires_delta)
def decode_token(token: str) -> Optional[TokenData]:
"""Decode and validate a JWT token."""
return AuthService.decode_token(token)
# Singleton instance
auth_service = AuthService()
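A round-trip sketch of the service: hash and verify a password, then mint and decode a token. Constructing `User(id=..., username=..., role=...)` directly assumes the ORM model accepts those keyword arguments:

```python
from app.services.auth_service import AuthService, hash_password, verify_password
from app.models.user import User  # assumed constructor kwargs below

hashed = hash_password("s3cret!")
assert verify_password("s3cret!", hashed)
assert not verify_password("wrong", hashed)

user = User(id=1, username="admin", role="admin")
token, expires_in = AuthService.create_token_for_user(user)
data = AuthService.decode_token(token)
print(data.username, data.role, expires_in)  # admin admin 86400 (24 h default)

assert AuthService.decode_token(token + "x") is None  # tampered token is rejected
```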

app/services/bootstrap_status_service.py (new file)

@ -0,0 +1,120 @@
"""
Service de gestion du statut de bootstrap des hôtes.
"""
import asyncio
from datetime import datetime, timezone
from typing import Dict, Optional
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.database import async_session_maker
class BootstrapStatusService:
"""Service pour gérer le statut de bootstrap des hôtes.
Cette version utilise la base de données SQLite via SQLAlchemy async.
Note: Le modèle BD utilise host_id (FK), mais ce service utilise host_name
pour la compatibilité avec le code existant. Il fait la correspondance via HostRepository.
"""
def __init__(self):
# Cache en mémoire pour éviter les requêtes BD répétées
self._cache: Dict[str, Dict] = {}
async def _get_host_id_by_name(self, session: AsyncSession, host_name: str) -> Optional[str]:
"""Récupère l'ID d'un hôte par son nom."""
from app.crud.host import HostRepository
repo = HostRepository(session)
host = await repo.get_by_name(host_name)
return host.id if host else None
def set_bootstrap_status(self, host_name: str, success: bool, details: str = None) -> Dict:
"""Enregistre le statut de bootstrap d'un hôte (version synchrone avec cache)."""
status_data = {
"bootstrap_ok": success,
"bootstrap_date": datetime.now(timezone.utc).isoformat(),
"details": details
}
self._cache[host_name] = status_data
# Planifier la sauvegarde en BD de manière asynchrone
asyncio.create_task(self._save_to_db(host_name, success, details))
return status_data
async def _save_to_db(self, host_name: str, success: bool, details: str = None):
"""Sauvegarde le statut dans la BD."""
try:
async with async_session_maker() as session:
host_id = await self._get_host_id_by_name(session, host_name)
if not host_id:
print(f"Host '{host_name}' non trouvé en BD pour bootstrap status")
return
from app.crud.bootstrap_status import BootstrapStatusRepository
repo = BootstrapStatusRepository(session)
await repo.create(
host_id=host_id,
status="success" if success else "failed",
last_attempt=datetime.now(timezone.utc),
error_message=None if success else details,
)
await session.commit()
except Exception as e:
print(f"Erreur sauvegarde bootstrap status en BD: {e}")
def get_bootstrap_status(self, host_name: str) -> Dict:
"""Récupère le statut de bootstrap d'un hôte depuis le cache."""
return self._cache.get(host_name, {
"bootstrap_ok": False,
"bootstrap_date": None,
"details": None
})
def get_all_status(self) -> Dict[str, Dict]:
"""Récupère le statut de tous les hôtes depuis le cache."""
return self._cache.copy()
def remove_host(self, host_name: str) -> bool:
"""Supprime le statut d'un hôte du cache."""
if host_name in self._cache:
del self._cache[host_name]
return True
return False
async def load_from_db(self):
"""Charge tous les statuts depuis la BD dans le cache (appelé au démarrage)."""
try:
async with async_session_maker() as session:
from sqlalchemy import select
from app.models.bootstrap_status import BootstrapStatus
from app.models.host import Host
# Récupérer tous les derniers statuts avec les noms d'hôtes
stmt = (
select(BootstrapStatus, Host.name)
.join(Host, BootstrapStatus.host_id == Host.id)
.order_by(BootstrapStatus.created_at.desc())
)
result = await session.execute(stmt)
# Garder seulement le dernier statut par hôte
seen_hosts = set()
for bs, host_name in result:
if host_name not in seen_hosts:
self._cache[host_name] = {
"bootstrap_ok": bs.status == "success",
"bootstrap_date": bs.last_attempt.isoformat() if bs.last_attempt else bs.created_at.isoformat(),
"details": bs.error_message
}
seen_hosts.add(host_name)
print(f"📋 {len(self._cache)} statut(s) bootstrap chargé(s) depuis la BD")
except Exception as e:
print(f"Erreur chargement bootstrap status depuis BD: {e}")
# Instance singleton du service
bootstrap_status_service = BootstrapStatusService()
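`set_bootstrap_status()` is synchronous only for its cache: it schedules the DB write with `asyncio.create_task()`, so it must be called from inside a running event loop. Sketch (the background write needs the SQLite schema, but the cache path works regardless since `_save_to_db` swallows its own errors):

```python
import asyncio
from app.services.bootstrap_status_service import bootstrap_status_service

async def main():
    bootstrap_status_service.set_bootstrap_status("nas.domain.home", success=True)
    await asyncio.sleep(0)  # let the background _save_to_db task start
    print(bootstrap_status_service.get_bootstrap_status("nas.domain.home"))

asyncio.run(main())
```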

app/services/builtin_playbooks.py (new file)

@ -0,0 +1,526 @@
"""
Service de gestion des Builtin Playbooks.
Ce service gère les playbooks intégrés à l'application pour la collecte
automatique d'informations sur les hôtes (métriques système, disque, mémoire, etc.).
Les résultats sont stockés dans la table host_metrics et visibles dans les Logs,
mais pas dans la section Tasks (pour éviter de polluer l'interface).
"""
from __future__ import annotations
import asyncio
import json
import re
import time
from datetime import datetime, timezone, timedelta
from pathlib import Path
from typing import Dict, Any, List, Optional
from pydantic import BaseModel
from app.schemas.host_metrics import (
BuiltinPlaybookDefinition,
HostMetricsCreate,
HostMetricsSummary,
)
# Définitions des builtin playbooks
BUILTIN_PLAYBOOKS: Dict[str, BuiltinPlaybookDefinition] = {
"install_base_tools": BuiltinPlaybookDefinition(
id="install_base_tools",
name="Installer les outils de base",
description="Installe les commandes requises pour la collecte et l'affichage des métriques (df, lsblk, python3, etc.)",
playbook_file="_builtin_install_base_tools.yml",
category="maintenance",
icon="fas fa-tools",
color="blue",
collect_metrics=False,
schedule_enabled=False,
visible_in_ui=True,
),
"collect_system_info": BuiltinPlaybookDefinition(
id="collect_system_info",
name="Collecte Info Système",
description="Collecte les informations système complètes (CPU, mémoire, disque, OS)",
playbook_file="_builtin_collect_system_info.yml",
category="metrics",
icon="fas fa-microchip",
color="cyan",
collect_metrics=True,
schedule_enabled=True,
visible_in_ui=True,
),
"collect_disk_usage": BuiltinPlaybookDefinition(
id="collect_disk_usage",
name="Espace Disque",
description="Collecte l'utilisation de l'espace disque sur tous les points de montage",
playbook_file="_builtin_collect_disk_usage.yml",
category="metrics",
icon="fas fa-hdd",
color="amber",
collect_metrics=True,
schedule_enabled=True,
visible_in_ui=True,
),
"collect_memory_info": BuiltinPlaybookDefinition(
id="collect_memory_info",
name="Utilisation Mémoire",
description="Collecte les informations de mémoire RAM et swap",
playbook_file="_builtin_collect_memory_info.yml",
category="metrics",
icon="fas fa-memory",
color="purple",
collect_metrics=True,
schedule_enabled=True,
visible_in_ui=True,
),
"collect_cpu_info": BuiltinPlaybookDefinition(
id="collect_cpu_info",
name="Informations CPU",
description="Collecte les informations CPU (charge, température, modèle)",
playbook_file="_builtin_collect_cpu_info.yml",
category="metrics",
icon="fas fa-tachometer-alt",
color="red",
collect_metrics=True,
schedule_enabled=True,
visible_in_ui=True,
),
"collect_network_info": BuiltinPlaybookDefinition(
id="collect_network_info",
name="Informations Réseau",
description="Collecte les informations des interfaces réseau",
playbook_file="_builtin_collect_network_info.yml",
category="metrics",
icon="fas fa-network-wired",
color="green",
collect_metrics=True,
schedule_enabled=True,
visible_in_ui=True,
),
}
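# Exemple (illustratif) : get_playbook_path("collect_cpu_info") du service ci-dessous
# résout "_builtin_collect_cpu_info.yml" sous <ansible_dir>/playbooks/builtin/.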
class BuiltinPlaybookService:
"""Service pour gérer et exécuter les builtin playbooks."""
def __init__(self, ansible_dir: Path, ansible_service=None):
"""
Args:
ansible_dir: Répertoire racine Ansible (contenant playbooks/)
ansible_service: Instance du service Ansible pour l'exécution
"""
self.ansible_dir = ansible_dir
self.playbooks_dir = ansible_dir / "playbooks"
self.builtin_dir = ansible_dir / "playbooks" / "builtin"
self.ansible_service = ansible_service
# Créer le répertoire builtin s'il n'existe pas
self.builtin_dir.mkdir(parents=True, exist_ok=True)
def get_all_definitions(self) -> List[BuiltinPlaybookDefinition]:
"""Retourne toutes les définitions de builtin playbooks."""
return list(BUILTIN_PLAYBOOKS.values())
def get_definition(self, builtin_id: str) -> Optional[BuiltinPlaybookDefinition]:
"""Retourne la définition d'un builtin playbook par son ID."""
return BUILTIN_PLAYBOOKS.get(builtin_id)
def get_playbook_path(self, builtin_id: str) -> Optional[Path]:
"""Retourne le chemin complet du fichier playbook."""
definition = self.get_definition(builtin_id)
if not definition:
return None
return self.builtin_dir / definition.playbook_file
def is_builtin_playbook(self, filename: str) -> bool:
"""Vérifie si un fichier est un builtin playbook (commence par _builtin_)."""
return filename.startswith("_builtin_")
async def execute_builtin(
self,
builtin_id: str,
target: str,
extra_vars: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
"""
Exécute un builtin playbook et retourne les résultats.
Args:
builtin_id: ID du builtin playbook
target: Cible (hostname ou groupe)
extra_vars: Variables supplémentaires pour Ansible
Returns:
Dict avec success, stdout, stderr, parsed_metrics, etc.
"""
definition = self.get_definition(builtin_id)
if not definition:
return {
"success": False,
"error": f"Builtin playbook '{builtin_id}' non trouvé",
"parsed_metrics": {},
"stdout": "",
"stderr": f"Builtin playbook '{builtin_id}' non trouvé",
}
playbook_path = self.builtin_dir / definition.playbook_file
if not playbook_path.exists():
return {
"success": False,
"error": f"Fichier playbook '{definition.playbook_file}' non trouvé à {playbook_path}",
"parsed_metrics": {},
"stdout": "",
"stderr": f"Fichier playbook '{definition.playbook_file}' non trouvé à {playbook_path}",
}
if not self.ansible_service:
return {
"success": False,
"error": "Service Ansible non initialisé",
"parsed_metrics": {},
"stdout": "",
"stderr": "Service Ansible non initialisé",
}
start_time = time.time()
try:
# Exécuter le playbook via le service Ansible
# Le playbook doit être dans le sous-dossier builtin/
playbook_relative = f"builtin/{definition.playbook_file}"
result = await self.ansible_service.execute_playbook(
playbook=playbook_relative,
target=target,
extra_vars=extra_vars or {},
check_mode=False,
verbose=False,
)
execution_time = time.time() - start_time
# Parser les métriques depuis la sortie JSON
parsed_metrics = {}
if result.get("success") and definition.collect_metrics:
parsed_metrics = self._parse_metrics_from_output(
result.get("stdout", ""),
builtin_id
)
return {
"success": result.get("success", False),
"stdout": result.get("stdout", ""),
"stderr": result.get("stderr", ""),
"execution_time": execution_time,
"execution_time_ms": int(execution_time * 1000),
"parsed_metrics": parsed_metrics,
"builtin_id": builtin_id,
"target": target,
"return_code": result.get("return_code", -1),
}
except Exception as e:
execution_time = time.time() - start_time
error_msg = f"Exception lors de l'exécution du builtin playbook: {str(e)}"
print(f"[BUILTIN] {error_msg}")
import traceback
traceback.print_exc()
return {
"success": False,
"stdout": "",
"stderr": error_msg,
"error": error_msg,
"execution_time": execution_time,
"execution_time_ms": int(execution_time * 1000),
"parsed_metrics": {},
"builtin_id": builtin_id,
"target": target,
"return_code": -1,
}
def _parse_metrics_from_output(
self,
stdout: str,
builtin_id: str
) -> Dict[str, Dict[str, Any]]:
"""
Parse les métriques JSON depuis la sortie du playbook.
Les playbooks builtin utilisent le format:
METRICS_JSON_START:{"host": "hostname", "data": {...}}:METRICS_JSON_END
Returns:
Dict mapping hostname to metrics data
"""
metrics_by_host = {}
print(f"[BUILTIN] Parsing metrics from stdout ({len(stdout)} chars)")
# Pattern pour extraire les blocs JSON de métriques
# Format: METRICS_JSON_START:{...}:METRICS_JSON_END
pattern = r'METRICS_JSON_START:(.*?):METRICS_JSON_END'
matches = re.findall(pattern, stdout, re.DOTALL)
print(f"[BUILTIN] Found {len(matches)} METRICS_JSON matches")
for match in matches:
try:
data = json.loads(match.strip())
host = data.get("host", "unknown")
metrics = data.get("data", {})
metrics_by_host[host] = metrics
print(f"[BUILTIN] Parsed metrics for host: {host}")
except json.JSONDecodeError as e:
print(f"[BUILTIN] JSON decode error: {e}")
continue
# Fallback: essayer de parser les debug outputs Ansible standards
if not metrics_by_host:
print("[BUILTIN] No metrics found with primary pattern, trying fallback...")
metrics_by_host = self._parse_ansible_debug_output(stdout, builtin_id)
# Fallback 2: chercher le format "msg": "METRICS_JSON_START:..."
if not metrics_by_host:
print("[BUILTIN] Trying msg pattern fallback...")
# Pattern pour le format Ansible debug: "msg": "METRICS_JSON_START:...:METRICS_JSON_END"
msg_pattern = r'"msg":\s*"METRICS_JSON_START:(.*?):METRICS_JSON_END"'
msg_matches = re.findall(msg_pattern, stdout, re.DOTALL)
print(f"[BUILTIN] Found {len(msg_matches)} msg pattern matches")
for match in msg_matches:
try:
# Le JSON est échappé dans le msg, il faut le décoder
unescaped = match.replace('\\"', '"').replace('\\n', '\n')
data = json.loads(unescaped.strip())
host = data.get("host", "unknown")
metrics = data.get("data", {})
metrics_by_host[host] = metrics
print(f"[BUILTIN] Parsed metrics from msg for host: {host}")
except json.JSONDecodeError as e:
print(f"[BUILTIN] JSON decode error in msg pattern: {e}")
continue
print(f"[BUILTIN] Total hosts with metrics: {len(metrics_by_host)}")
if not metrics_by_host and stdout:
# Log un extrait du stdout pour debug
print(f"[BUILTIN] Stdout sample (first 500 chars): {stdout[:500]}")
return metrics_by_host
def _parse_ansible_debug_output(
self,
stdout: str,
builtin_id: str
) -> Dict[str, Dict[str, Any]]:
"""
Parse les métriques depuis les messages debug Ansible standards.
Format attendu: "host | SUCCESS => {...}" ou debug msg avec JSON
"""
metrics_by_host = {}
# Pattern pour les résultats ad-hoc ou debug
# Ex: hostname | SUCCESS => {"ansible_facts": {...}}
pattern = r'(\S+)\s*\|\s*(?:SUCCESS|CHANGED)\s*=>\s*(\{.*?\})\s*(?=\n\S|\Z)'
for line in stdout.split('\n'):
# Chercher les lignes de debug avec JSON
if '"metrics":' in line or '"cpu_' in line or '"memory_' in line or '"disk_' in line:
try:
# Trouver le JSON dans la ligne
json_match = re.search(r'\{.*\}', line)
if json_match:
data = json.loads(json_match.group())
# Essayer d'extraire le hostname depuis le contexte
host_match = re.search(r'^(\S+)\s*:', line)
if host_match:
host = host_match.group(1)
metrics_by_host[host] = data
except json.JSONDecodeError:
continue
return metrics_by_host
def _clean_numeric_value(self, value: Any) -> Optional[float]:
"""Convertit une valeur en float, retourne None si vide ou invalide."""
if value is None or value == '' or value == 'null':
return None
try:
return float(value)
except (ValueError, TypeError):
return None
def _clean_int_value(self, value: Any) -> Optional[int]:
"""Convertit une valeur en int, retourne None si vide ou invalide."""
if value is None or value == '' or value == 'null':
return None
try:
return int(float(value)) # float d'abord pour gérer "3.0"
except (ValueError, TypeError):
return None
def _clean_string_value(self, value: Any) -> Optional[str]:
"""Retourne None si la valeur est vide."""
if value is None or value == '' or value == 'null' or value == 'Unknown':
return None
return str(value)
def create_metrics_from_parsed(
self,
host_id: str,
parsed_data: Dict[str, Any],
builtin_id: str,
execution_time_ms: int
) -> HostMetricsCreate:
"""
Crée un objet HostMetricsCreate à partir des données parsées.
"""
# Mapper le builtin_id vers metric_type
metric_type_map = {
"collect_system_info": "system_info",
"collect_disk_usage": "disk_usage",
"collect_memory_info": "memory",
"collect_cpu_info": "cpu",
"collect_network_info": "network",
}
metric_type = metric_type_map.get(builtin_id, "unknown")
return HostMetricsCreate(
host_id=host_id,
metric_type=metric_type,
# CPU
cpu_count=self._clean_int_value(parsed_data.get("cpu_count")),
cpu_model=self._clean_string_value(parsed_data.get("cpu_model")),
cpu_cores=self._clean_int_value(parsed_data.get("cpu_cores")),
cpu_threads=self._clean_int_value(parsed_data.get("cpu_threads")),
cpu_threads_per_core=self._clean_int_value(parsed_data.get("cpu_threads_per_core")),
cpu_sockets=self._clean_int_value(parsed_data.get("cpu_sockets")),
cpu_mhz=self._clean_numeric_value(parsed_data.get("cpu_mhz")),
cpu_max_mhz=self._clean_numeric_value(parsed_data.get("cpu_max_mhz")),
cpu_min_mhz=self._clean_numeric_value(parsed_data.get("cpu_min_mhz")),
cpu_load_1m=self._clean_numeric_value(parsed_data.get("cpu_load_1m")),
cpu_load_5m=self._clean_numeric_value(parsed_data.get("cpu_load_5m")),
cpu_load_15m=self._clean_numeric_value(parsed_data.get("cpu_load_15m")),
cpu_usage_percent=self._clean_numeric_value(parsed_data.get("cpu_usage_percent")),
cpu_temperature=self._clean_numeric_value(parsed_data.get("cpu_temperature")),
# Memory
memory_total_mb=self._clean_int_value(parsed_data.get("memory_total_mb")),
memory_used_mb=self._clean_int_value(parsed_data.get("memory_used_mb")),
memory_free_mb=self._clean_int_value(parsed_data.get("memory_free_mb")),
memory_usage_percent=self._clean_numeric_value(parsed_data.get("memory_usage_percent")),
swap_total_mb=self._clean_int_value(parsed_data.get("swap_total_mb")),
swap_used_mb=self._clean_int_value(parsed_data.get("swap_used_mb")),
swap_usage_percent=self._clean_numeric_value(parsed_data.get("swap_usage_percent")),
# Disk
disk_info=parsed_data.get("disk_info"),
disk_devices=parsed_data.get("disk_devices"),
disk_root_total_gb=self._clean_numeric_value(parsed_data.get("disk_root_total_gb")),
disk_root_used_gb=self._clean_numeric_value(parsed_data.get("disk_root_used_gb")),
disk_root_usage_percent=self._clean_numeric_value(parsed_data.get("disk_root_usage_percent")),
# Storage stacks
lvm_info=parsed_data.get("lvm_info"),
zfs_info=parsed_data.get("zfs_info"),
# System
os_name=self._clean_string_value(parsed_data.get("os_name")),
os_version=self._clean_string_value(parsed_data.get("os_version")),
kernel_version=self._clean_string_value(parsed_data.get("kernel_version")),
hostname=self._clean_string_value(parsed_data.get("hostname")),
uptime_seconds=self._clean_int_value(parsed_data.get("uptime_seconds")),
uptime_human=self._clean_string_value(parsed_data.get("uptime_human")),
# Network
network_info=parsed_data.get("network_info"),
# Metadata
raw_data=parsed_data,
collection_source=builtin_id,
collection_duration_ms=execution_time_ms,
)
def metrics_to_summary(
self,
metrics: Any, # HostMetrics model
host_name: Optional[str] = None
) -> HostMetricsSummary:
"""Convertit un objet HostMetrics en HostMetricsSummary pour l'UI."""
if not metrics:
return HostMetricsSummary(
host_id="unknown",
host_name=host_name,
collection_status="unknown"
)
# Normaliser le timestamp en heure locale (UTC-5) pour l'affichage
collected_at = metrics.collected_at
if collected_at is not None:
if getattr(collected_at, "tzinfo", None) is None:
collected_at = collected_at.replace(tzinfo=timezone.utc)
app_tz = timezone(timedelta(hours=-5))
collected_at = collected_at.astimezone(app_tz)
return HostMetricsSummary(
host_id=metrics.host_id,
host_name=host_name,
last_collected=collected_at,
# CPU
cpu_usage_percent=metrics.cpu_usage_percent,
cpu_load_1m=metrics.cpu_load_1m,
cpu_temperature=metrics.cpu_temperature,
cpu_model=metrics.cpu_model,
cpu_count=metrics.cpu_count,
cpu_cores=getattr(metrics, "cpu_cores", None),
cpu_threads=getattr(metrics, "cpu_threads", None),
cpu_max_mhz=getattr(metrics, "cpu_max_mhz", None),
# Memory
memory_usage_percent=metrics.memory_usage_percent,
memory_total_mb=metrics.memory_total_mb,
memory_used_mb=metrics.memory_used_mb,
# Disk
disk_root_usage_percent=metrics.disk_root_usage_percent,
disk_root_total_gb=metrics.disk_root_total_gb,
disk_root_used_gb=metrics.disk_root_used_gb,
disk_info=metrics.disk_info if getattr(metrics, "disk_info", None) else None,
disk_devices=getattr(metrics, "disk_devices", None),
# Storage stacks
lvm_info=getattr(metrics, "lvm_info", None),
zfs_info=getattr(metrics, "zfs_info", None),
# System
os_name=metrics.os_name,
uptime_human=metrics.uptime_human,
# Status
collection_status="success" if not metrics.error_message else "failed",
error_message=metrics.error_message,
)
# Instance globale (sera initialisée au démarrage de l'application)
builtin_playbook_service: Optional[BuiltinPlaybookService] = None
def get_builtin_playbook_service() -> BuiltinPlaybookService:
"""Retourne l'instance du service builtin playbooks."""
global builtin_playbook_service
if builtin_playbook_service is None:
raise RuntimeError("BuiltinPlaybookService not initialized")
return builtin_playbook_service
def init_builtin_playbook_service(ansible_dir: Path, ansible_service=None) -> BuiltinPlaybookService:
"""Initialise le service builtin playbooks."""
global builtin_playbook_service
builtin_playbook_service = BuiltinPlaybookService(ansible_dir, ansible_service)
return builtin_playbook_service
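Démo autonome (illustrative) du fallback « msg » décrit dans _parse_metrics_from_output : l'hôte srv-web-01 et la métrique sont fictifs, seul le motif METRICS_JSON_START/END vient du service.

```python
# Démo : extraction des métriques depuis une sortie debug Ansible fictive.
import json
import re

stdout = (
    'ok: [srv-web-01] => {\n'
    '    "msg": "METRICS_JSON_START:'
    '{\\"host\\": \\"srv-web-01\\", \\"data\\": {\\"memory_total_mb\\": 2048}}'
    ':METRICS_JSON_END"\n'
    '}\n'
)

msg_pattern = r'"msg":\s*"METRICS_JSON_START:(.*?):METRICS_JSON_END"'
for match in re.findall(msg_pattern, stdout, re.DOTALL):
    data = json.loads(match.replace('\\"', '"'))  # le JSON est échappé dans le msg
    print(data["host"], "->", data["data"])  # srv-web-01 -> {'memory_total_mb': 2048}
```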

View File

@ -0,0 +1,218 @@
"""
Service de capture des logs console (stdout/stderr).
Capture les logs de l'application en temps réel pour les afficher dans l'UI.
"""
import sys
import io
import re
import logging
import threading
from datetime import datetime, timezone
from collections import deque
from typing import List, Optional
from dataclasses import dataclass, asdict
@dataclass
class ConsoleLogEntry:
"""Entrée de log console."""
id: int
timestamp: str
level: str
message: str
source: str = "console"
def to_dict(self):
return asdict(self)
class ConsoleLogCapture:
"""
Capture les logs console (stdout/stderr) et les stocke en mémoire.
Utilise un buffer circulaire pour limiter l'utilisation mémoire.
"""
def __init__(self, max_entries: int = 2000):
self.max_entries = max_entries
self._logs: deque = deque(maxlen=max_entries)
self._lock = threading.Lock()
self._id_counter = 0
self._original_stdout = sys.stdout
self._original_stderr = sys.stderr
self._capturing = False
# Patterns pour détecter le niveau de log
self._level_patterns = [
(re.compile(r'\bERROR\b', re.IGNORECASE), 'ERROR'),
(re.compile(r'\bWARN(?:ING)?\b', re.IGNORECASE), 'WARN'),
(re.compile(r'\bDEBUG\b', re.IGNORECASE), 'DEBUG'),
(re.compile(r'\b(INFO|Started|Waiting|Application)\b', re.IGNORECASE), 'INFO'),
(re.compile(r'[✅🚀📋📦⏰🔔]'), 'INFO'),
(re.compile(r'[⚠️❌]'), 'WARN'),
]
def _detect_level(self, message: str) -> str:
"""Détecte le niveau de log à partir du message."""
for pattern, level in self._level_patterns:
if pattern.search(message):
return level
return 'INFO'
def add_log(self, message: str, level: Optional[str] = None, source: str = "console"):
"""Ajoute un log au buffer."""
if not message or not message.strip():
return
message = message.strip()
if not level:
level = self._detect_level(message)
with self._lock:
# Éviter les doublons consécutifs (même message dans les 2 dernières entrées)
if len(self._logs) > 0:
recent = list(self._logs)[-2:] if len(self._logs) >= 2 else list(self._logs)
for recent_log in recent:
if recent_log.message == message and recent_log.source == source:
return # Doublon, ignorer
self._id_counter += 1
entry = ConsoleLogEntry(
id=self._id_counter,
timestamp=datetime.now(timezone.utc).isoformat(),
level=level,
message=message,
source=source
)
self._logs.append(entry)
def get_logs(self, limit: int = 500, offset: int = 0, level: Optional[str] = None) -> List[dict]:
"""Récupère les logs avec pagination."""
with self._lock:
logs = list(self._logs)
# Filtrer par niveau si spécifié
if level:
logs = [l for l in logs if l.level.upper() == level.upper()]
# Trier par ID décroissant (plus récent en premier)
logs = sorted(logs, key=lambda x: x.id, reverse=True)
# Pagination
start = offset
end = offset + limit
paginated = logs[start:end]
return [l.to_dict() for l in paginated]
def get_count(self) -> int:
"""Retourne le nombre total de logs."""
with self._lock:
return len(self._logs)
def clear(self):
"""Vide le buffer de logs."""
with self._lock:
self._logs.clear()
def start_capture(self):
"""Démarre la capture des logs stdout/stderr et uvicorn."""
if self._capturing:
return
self._capturing = True
log_service = self
# Wrapper pour stdout
class StdoutWrapper:
def __init__(wrapper_self, original):
wrapper_self._original = original
wrapper_self._buffer = ""
def write(wrapper_self, text):
wrapper_self._original.write(text)
wrapper_self._original.flush()
# Accumuler et traiter les lignes complètes
wrapper_self._buffer += text
while '\n' in wrapper_self._buffer:
line, wrapper_self._buffer = wrapper_self._buffer.split('\n', 1)
if line.strip():
log_service.add_log(line, source="stdout")
return len(text)
def flush(wrapper_self):
wrapper_self._original.flush()
def __getattr__(wrapper_self, name):
return getattr(wrapper_self._original, name)
# Wrapper pour stderr
class StderrWrapper:
def __init__(wrapper_self, original):
wrapper_self._original = original
wrapper_self._buffer = ""
def write(wrapper_self, text):
wrapper_self._original.write(text)
wrapper_self._original.flush()
wrapper_self._buffer += text
while '\n' in wrapper_self._buffer:
line, wrapper_self._buffer = wrapper_self._buffer.split('\n', 1)
if line.strip():
log_service.add_log(line, source="stderr")
return len(text)
def flush(wrapper_self):
wrapper_self._original.flush()
def __getattr__(wrapper_self, name):
return getattr(wrapper_self._original, name)
sys.stdout = StdoutWrapper(self._original_stdout)
sys.stderr = StderrWrapper(self._original_stderr)
# Handler pour capturer les logs uvicorn/logging
class LogCaptureHandler(logging.Handler):
def emit(handler_self, record):
try:
msg = handler_self.format(record)
level_map = {
logging.DEBUG: 'DEBUG',
logging.INFO: 'INFO',
logging.WARNING: 'WARN',
logging.ERROR: 'ERROR',
logging.CRITICAL: 'ERROR',
}
level = level_map.get(record.levelno, 'INFO')
log_service.add_log(msg, level=level, source=record.name)
except Exception:
pass
# Ajouter le handler aux loggers uvicorn
self._log_handler = LogCaptureHandler()
self._log_handler.setFormatter(logging.Formatter('%(message)s'))
for logger_name in ['uvicorn', 'uvicorn.access', 'uvicorn.error']:
logger = logging.getLogger(logger_name)
logger.addHandler(self._log_handler)
def stop_capture(self):
"""Arrête la capture des logs."""
if not self._capturing:
return
self._capturing = False
sys.stdout = self._original_stdout
sys.stderr = self._original_stderr
# Retirer le handler des loggers uvicorn
if hasattr(self, '_log_handler'):
for logger_name in ['uvicorn', 'uvicorn.access', 'uvicorn.error']:
logger = logging.getLogger(logger_name)
logger.removeHandler(self._log_handler)
# Instance globale
console_log_service = ConsoleLogCapture()
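Exemple d'utilisation (esquisse) : le chemin d'import est supposé et les messages sont fictifs ; la détection de niveau suit les patterns définis plus haut.

```python
# Esquisse : capture du stdout, détection de niveau, puis lecture paginée.
from app.services.console_log_service import console_log_service  # chemin supposé

console_log_service.start_capture()
print("🚀 Application démarrée")      # niveau détecté : INFO (emoji reconnu)
print("ERROR: connexion BD refusée")  # niveau détecté : ERROR
console_log_service.stop_capture()

for entry in console_log_service.get_logs(limit=10):
    print(entry["level"], "|", entry["message"])
```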

View File

@ -0,0 +1,57 @@
"""
Service de gestion du statut runtime des hôtes.
"""
from datetime import datetime
from typing import Dict, Any, Optional
class HostStatusService:
"""Service simple pour stocker le statut runtime des hôtes en mémoire.
Cette implémentation ne persiste plus dans un fichier JSON ; les données
sont conservées uniquement pendant la vie du processus.
"""
def __init__(self):
# Dictionnaire: host_name -> {"status": str, "last_seen": Optional[datetime|str], "os": Optional[str]}
self._hosts: Dict[str, Dict[str, Any]] = {}
def set_status(
self,
host_name: str,
status: str,
last_seen: Optional[datetime] = None,
os_info: Optional[str] = None
) -> Dict[str, Any]:
"""Met à jour le statut d'un hôte en mémoire."""
entry = {
"status": status,
"last_seen": last_seen if isinstance(last_seen, datetime) else last_seen,
"os": os_info,
}
self._hosts[host_name] = entry
return entry
def get_status(self, host_name: str) -> Dict[str, Any]:
"""Récupère le statut d'un hôte, avec valeurs par défaut si absent."""
return self._hosts.get(host_name, {"status": "online", "last_seen": None, "os": None})
def get_all_status(self) -> Dict[str, Dict[str, Any]]:
"""Retourne une copie de tous les statuts connus."""
return dict(self._hosts)
def remove_host(self, host_name: str) -> bool:
"""Supprime le statut d'un hôte de la mémoire."""
if host_name in self._hosts:
del self._hosts[host_name]
return True
return False
def clear(self):
"""Efface tous les statuts."""
self._hosts.clear()
# Instance singleton du service
host_status_service = HostStatusService()
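Exemple d'utilisation (illustratif, hôte fictif) :

```python
from datetime import datetime, timezone

from app.services.host_status_service import host_status_service

host_status_service.set_status(
    "srv-web-01",
    status="online",
    last_seen=datetime.now(timezone.utc),
    os_info="Debian 12",
)
print(host_status_service.get_status("srv-web-01"))
# Hôte inconnu : valeurs par défaut {'status': 'online', 'last_seen': None, 'os': None}
print(host_status_service.get_status("inconnu"))
```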

199
app/services/hybrid_db.py Normal file
View File

@ -0,0 +1,199 @@
"""
Base de données hybride combinant données Ansible et données runtime.
"""
from datetime import datetime, timezone
from typing import Any, Dict, List, Optional
from app.schemas.host_api import Host
from app.schemas.task_api import Task
from app.schemas.common import LogEntry, SystemMetrics
class HybridDB:
"""Base de données hybride combinant l'inventaire Ansible avec les données runtime.
Cette classe agit comme un agrégateur de données provenant de plusieurs sources:
- Inventaire Ansible (via AnsibleService)
- Statuts bootstrap (via BootstrapStatusService)
- Statuts runtime (via HostStatusService)
- Tâches et logs en mémoire
"""
def __init__(self):
# Cache des hôtes
self._hosts_cache: Optional[List[Host]] = None
self._hosts_cache_time: float = 0
self._cache_ttl = 60 # secondes
# Données en mémoire
self.tasks: List[Task] = []
self.logs: List[LogEntry] = []
# Compteurs pour les IDs
self._id_counters: Dict[str, int] = {
"tasks": 0,
"logs": 0,
}
def get_next_id(self, entity: str) -> int:
"""Génère un nouvel ID pour une entité."""
self._id_counters[entity] = self._id_counters.get(entity, 0) + 1
return self._id_counters[entity]
@property
def hosts(self) -> List[Host]:
"""Retourne la liste des hôtes, en la mettant à jour si nécessaire."""
import time
current_time = time.time()
if self._hosts_cache and (current_time - self._hosts_cache_time) < self._cache_ttl:
return self._hosts_cache
return self.refresh_hosts()
def refresh_hosts(self) -> List[Host]:
"""Rafraîchit la liste des hôtes depuis l'inventaire Ansible."""
import time
from app.services.ansible_service import ansible_service
from app.services.bootstrap_status_service import bootstrap_status_service
from app.services.host_status_service import host_status_service
hosts = []
inventory_hosts = ansible_service.get_hosts_from_inventory()
for inv_host in inventory_hosts:
# Récupérer le statut bootstrap
bs_status = bootstrap_status_service.get_bootstrap_status(inv_host.name)
# Récupérer le statut runtime
rt_status = host_status_service.get_status(inv_host.name)
host = Host(
id=inv_host.name, # Utiliser le nom comme ID
name=inv_host.name,
ip=inv_host.ansible_host or inv_host.name,
status=rt_status.get("status") or "unknown",
os=rt_status.get("os") or "Linux",
last_seen=rt_status.get("last_seen"),
groups=inv_host.groups or ([inv_host.group] if inv_host.group else []),
bootstrap_ok=bs_status.get("bootstrap_ok", False),
bootstrap_date=bs_status.get("bootstrap_date")
)
hosts.append(host)
self._hosts_cache = hosts
self._hosts_cache_time = time.time()
return hosts
def invalidate_hosts_cache(self):
"""Invalide le cache des hôtes."""
self._hosts_cache = None
def get_host(self, host_id: str) -> Optional[Host]:
"""Récupère un hôte par son ID ou nom."""
for host in self.hosts:
if host.id == host_id or host.name == host_id or host.ip == host_id:
return host
return None
def update_host_status(
self,
host_name: str,
status: str,
os_info: Optional[str] = None
):
"""Met à jour le statut d'un hôte."""
from app.services.host_status_service import host_status_service
host_status_service.set_status(
host_name=host_name,
status=status,
last_seen=datetime.now(timezone.utc),
os_info=os_info
)
# Invalider le cache pour forcer le rechargement
self.invalidate_hosts_cache()
@property
def metrics(self) -> SystemMetrics:
"""Calcule et retourne les métriques système."""
hosts = self.hosts
online_count = sum(1 for h in hosts if h.status == "online")
total_tasks = len(self.tasks)
# Calculer le taux de succès
completed = sum(1 for t in self.tasks if t.status == "completed")
failed = sum(1 for t in self.tasks if t.status == "failed")
total_finished = completed + failed
success_rate = (completed / total_finished * 100) if total_finished > 0 else 100.0
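# Exemple : completed=8, failed=2 -> success_rate = 8 / 10 * 100 = 80.0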
return SystemMetrics(
online_hosts=online_count,
total_tasks=total_tasks,
success_rate=round(success_rate, 1),
uptime=99.9, # TODO: calculer depuis le démarrage
cpu_usage=0.0,
memory_usage=0.0,
disk_usage=0.0
)
def add_task(self, task: Task):
"""Ajoute une tâche à la liste."""
self.tasks.insert(0, task)
# Limiter la taille de la liste
if len(self.tasks) > 1000:
self.tasks = self.tasks[:1000]
def get_task(self, task_id: str) -> Optional[Task]:
"""Récupère une tâche par son ID."""
for task in self.tasks:
if str(task.id) == str(task_id):
return task
return None
def update_task(self, task_id: str, **kwargs):
"""Met à jour une tâche existante."""
task = self.get_task(task_id)
if task:
for key, value in kwargs.items():
if hasattr(task, key):
setattr(task, key, value)
def add_log(self, log: LogEntry):
"""Ajoute une entrée de log."""
if log.id == 0:
log.id = self.get_next_id("logs")
self.logs.insert(0, log)
# Limiter la taille de la liste
if len(self.logs) > 5000:
self.logs = self.logs[:5000]
def get_recent_logs(self, limit: int = 50, level: Optional[str] = None, source: Optional[str] = None) -> List[LogEntry]:
"""Récupère les logs récents avec filtrage optionnel."""
logs = self.logs
if level:
logs = [l for l in logs if l.level == level]
if source:
logs = [l for l in logs if l.source == source]
return logs[:limit]
def clear_logs(self):
"""Efface tous les logs."""
self.logs.clear()
def clear_tasks(self):
"""Efface toutes les tâches."""
self.tasks.clear()
# Instance singleton de la base de données hybride
db = HybridDB()
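Esquisse d'utilisation (illustrative, hôte fictif) de l'agrégateur :

```python
from app.services.hybrid_db import db

db.update_host_status("srv-web-01", status="online", os_info="Debian 12")
for host in db.hosts:  # sert le cache (TTL 60 s) ou rafraîchit depuis l'inventaire
    print(host.name, host.status, host.bootstrap_ok)

m = db.metrics
print(f"{m.online_hosts} hôte(s) en ligne, taux de succès {m.success_rate}%")
```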

View File

@ -29,6 +29,7 @@ from base64 import b64encode
import httpx
try:
from schemas.notification import (
NtfyConfig,
NtfyAction,
@ -36,6 +37,14 @@ from schemas.notification import (
NotificationResponse,
NotificationTemplates,
)
except ModuleNotFoundError:
from app.schemas.notification import (
NtfyConfig,
NtfyAction,
NotificationRequest,
NotificationResponse,
NotificationTemplates,
)
# Logger dédié pour le service de notification
logger = logging.getLogger("homelab.notifications")
@ -110,6 +119,49 @@ class NotificationService:
return headers
def _build_headers(
self,
title: Optional[str] = None,
priority: Optional[int] = None,
tags: Optional[List[str]] = None,
click: Optional[str] = None,
attach: Optional[str] = None,
delay: Optional[str] = None,
) -> Dict[str, str]:
"""Construit les headers ntfy (ASCII-only) pour tests/compat.
Note: en prod on envoie en JSON pour supporter UTF-8 dans title/tags,
mais les tests unitaires valident encore cette méthode.
"""
headers: Dict[str, str] = {}
if title:
headers["Title"] = title
if priority is not None:
mapping = {
1: "min",
2: "low",
3: "default",
4: "high",
5: "urgent",
}
headers["Priority"] = mapping.get(int(priority), "default")
if tags:
headers["Tags"] = ",".join(tags)
if click:
headers["Click"] = click
if attach:
headers["Attach"] = attach
if delay:
headers["Delay"] = delay
headers.update(self._build_auth_headers())
return headers
def _should_send(self, level: str) -> bool:
"""Détermine si une notification d'un certain niveau doit être envoyée.
@ -241,8 +293,8 @@ class NotificationService:
# Utiliser le topic par défaut si non spécifié
target_topic = topic or self._config.default_topic
-# Construire l'URL de base (sans le topic, car il est dans le JSON)
-url = self._config.base_url.rstrip('/')
+# Construire l'URL (les tests attendent /<topic>)
+url = f"{self._config.base_url.rstrip('/')}/{target_topic}"
# Construire le payload JSON (supporte UTF-8 dans le titre et les tags)
payload = self._build_json_payload(
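Esquisse (hypothétique) d'une publication ntfy en mode « headers » (ASCII only), vers <base_url>/<topic> comme le fait désormais le service ; en production le service passe par un payload JSON pour supporter l'UTF-8 dans le titre et les tags. L'URL et le topic reprennent les valeurs par défaut NTFY_BASE_URL et NTFY_DEFAULT_TOPIC du projet.

```python
import asyncio

import httpx


async def publier():
    headers = {
        "Title": "Tache terminee",  # ASCII only dans les headers
        "Priority": "high",         # priorité 4 -> "high" via le mapping du service
        "Tags": "white_check_mark,homelab",
    }
    async with httpx.AsyncClient(timeout=5) as client:
        r = await client.post(
            "http://localhost:8150/homelab-events",  # NTFY_BASE_URL + topic par défaut
            content="Playbook exécuté avec succès",
            headers=headers,
        )
        r.raise_for_status()

asyncio.run(publier())
```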

View File

@ -0,0 +1,577 @@
"""
Service de planification des tâches avec APScheduler.
"""
import asyncio
import json
import uuid
from datetime import datetime, timezone, timedelta
from typing import Any, Dict, List, Optional
import pytz
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from apscheduler.triggers.cron import CronTrigger
from apscheduler.triggers.date import DateTrigger
from app.core.config import settings
from app.models.database import async_session_maker
from app.schemas.schedule_api import (
Schedule,
ScheduleRecurrence,
ScheduleRun,
ScheduleCreateRequest,
ScheduleUpdateRequest,
ScheduleStats,
)
class SchedulerService:
"""Service pour gérer les schedules avec APScheduler."""
def __init__(self):
self._scheduler: Optional[AsyncIOScheduler] = None
self._schedules_cache: Dict[str, Schedule] = {}
self._timezone = pytz.timezone(settings.scheduler_timezone)
self._started = False
@property
def scheduler(self) -> AsyncIOScheduler:
"""Retourne l'instance du scheduler, le créant si nécessaire."""
if self._scheduler is None:
self._scheduler = AsyncIOScheduler(
timezone=self._timezone,
job_defaults={
'coalesce': True,
'max_instances': 1,
'misfire_grace_time': settings.scheduler_misfire_grace_time
}
)
return self._scheduler
async def start_async(self):
"""Démarre le scheduler et charge les schedules depuis la BD."""
if self._started:
return
await self._load_active_schedules_from_db()
self.scheduler.start()
self._started = True
print(f"⏰ Scheduler démarré avec {len(self._schedules_cache)} schedule(s)")
def shutdown(self):
"""Arrête le scheduler proprement."""
if self._scheduler and self._started:
self._scheduler.shutdown(wait=False)
self._started = False
print("⏰ Scheduler arrêté")
async def _load_active_schedules_from_db(self):
"""Charge les schedules actifs depuis la base de données."""
try:
async with async_session_maker() as session:
from app.crud.schedule import ScheduleRepository
repo = ScheduleRepository(session)
db_schedules = await repo.list_active()
for db_sched in db_schedules:
pydantic_sched = self._db_to_pydantic(db_sched)
self._schedules_cache[pydantic_sched.id] = pydantic_sched
if pydantic_sched.enabled:
self._add_job_for_schedule(pydantic_sched)
except Exception as e:
print(f"Erreur chargement schedules: {e}")
def _db_to_pydantic(self, db_sched) -> Schedule:
"""Convertit un modèle DB en modèle Pydantic."""
recurrence = None
if db_sched.recurrence_type:
recurrence = ScheduleRecurrence(
type=db_sched.recurrence_type,
time=db_sched.recurrence_time or "02:00",
days=json.loads(db_sched.recurrence_days) if db_sched.recurrence_days else None,
cron_expression=db_sched.cron_expression
)
return Schedule(
id=db_sched.id,
name=db_sched.name,
description=db_sched.description,
playbook=db_sched.playbook,
target_type=db_sched.target_type or "group",
target=db_sched.target,
extra_vars=db_sched.extra_vars,
schedule_type=db_sched.schedule_type,
recurrence=recurrence,
timezone=db_sched.timezone or settings.scheduler_timezone,
start_at=db_sched.start_at,
end_at=db_sched.end_at,
next_run_at=db_sched.next_run,
last_run_at=db_sched.last_run,
last_status=db_sched.last_status or "never",
enabled=db_sched.enabled,
retry_on_failure=db_sched.retry_on_failure or 0,
timeout=db_sched.timeout or 3600,
notification_type=db_sched.notification_type or "all",
tags=json.loads(db_sched.tags) if db_sched.tags else [],
run_count=db_sched.run_count or 0,
success_count=db_sched.success_count or 0,
failure_count=db_sched.failure_count or 0,
created_at=db_sched.created_at,
updated_at=db_sched.updated_at,
)
def _build_cron_trigger(self, recurrence: ScheduleRecurrence, tz: pytz.timezone) -> CronTrigger:
"""Construit un trigger cron à partir de la récurrence."""
hour, minute = 2, 0
if recurrence.time:
parts = recurrence.time.split(':')
hour = int(parts[0])
minute = int(parts[1]) if len(parts) > 1 else 0
if recurrence.type == "custom" and recurrence.cron_expression:
return CronTrigger.from_crontab(recurrence.cron_expression, timezone=tz)
elif recurrence.type == "daily":
return CronTrigger(hour=hour, minute=minute, timezone=tz)
elif recurrence.type == "weekly":
days = recurrence.days or [1] # Lundi par défaut
day_of_week = ','.join(str((d - 1) % 7) for d in days) # Convertir 1-7 en 0-6
return CronTrigger(day_of_week=day_of_week, hour=hour, minute=minute, timezone=tz)
elif recurrence.type == "monthly":
day = recurrence.day_of_month or 1
return CronTrigger(day=day, hour=hour, minute=minute, timezone=tz)
else:
return CronTrigger(hour=hour, minute=minute, timezone=tz)
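# Exemple (illustratif) : recurrence.type == "weekly" avec days=[1, 3, 7]
# (1=lundi ... 7=dimanche) donne day_of_week="0,2,6", soit lundi, mercredi
# et dimanche pour APScheduler (0=lundi).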
def _add_job_for_schedule(self, schedule: Schedule):
"""Ajoute un job APScheduler pour un schedule."""
job_id = f"schedule_{schedule.id}"
# Supprimer l'ancien job s'il existe
existing = self.scheduler.get_job(job_id)
if existing:
self.scheduler.remove_job(job_id)
tz = pytz.timezone(schedule.timezone)
if schedule.schedule_type == "once":
if schedule.start_at:
trigger = DateTrigger(run_date=schedule.start_at, timezone=tz)
else:
return # Pas de date définie
else:
if not schedule.recurrence:
return
trigger = self._build_cron_trigger(schedule.recurrence, tz)
self.scheduler.add_job(
self._execute_schedule,
trigger=trigger,
id=job_id,
args=[schedule.id],
name=schedule.name,
replace_existing=True
)
# Mettre à jour next_run_at
job = self.scheduler.get_job(job_id)
if job and job.next_run_time:
schedule.next_run_at = job.next_run_time
self._schedules_cache[schedule.id] = schedule
async def _execute_schedule(self, schedule_id: str):
"""Exécute un schedule (appelé par APScheduler)."""
schedule = self._schedules_cache.get(schedule_id)
if not schedule:
return
run_id = f"run_{uuid.uuid4().hex[:12]}"
start_time = datetime.now(timezone.utc)
# Créer l'entrée de run
run = ScheduleRun(
id=run_id,
schedule_id=schedule_id,
started_at=start_time,
status="running"
)
try:
# Importer les services nécessaires
from app.services.ansible_service import ansible_service
from app.services.websocket_service import ws_manager
from app.services.notification_service import notification_service
# Mettre à jour le statut
schedule.last_status = "running"
self._schedules_cache[schedule_id] = schedule
# Notifier via WebSocket
await ws_manager.broadcast({
"type": "schedule_started",
"data": {
"schedule_id": schedule_id,
"schedule_name": schedule.name,
"run_id": run_id
}
})
# Exécuter le playbook
result = await ansible_service.execute_playbook(
playbook=schedule.playbook,
target=schedule.target,
extra_vars=schedule.extra_vars,
check_mode=False,
verbose=True
)
# Mettre à jour le run
end_time = datetime.now(timezone.utc)
duration = (end_time - start_time).total_seconds()
run.finished_at = end_time
run.duration_seconds = duration
run.status = "success" if result["success"] else "failed"
run.error_message = result.get("stderr") if not result["success"] else None
# Mettre à jour le schedule
schedule.last_run_at = end_time
schedule.last_status = run.status
schedule.run_count += 1
if result["success"]:
schedule.success_count += 1
else:
schedule.failure_count += 1
self._schedules_cache[schedule_id] = schedule
# Persister en BD
await self._persist_run(run)
await self._update_schedule_stats_in_db(schedule)
# Mettre à jour next_run
job = self.scheduler.get_job(f"schedule_{schedule_id}")
if job and job.next_run_time:
schedule.next_run_at = job.next_run_time
# Notifier via WebSocket
await ws_manager.broadcast({
"type": "schedule_completed",
"data": {
"schedule_id": schedule_id,
"schedule_name": schedule.name,
"run_id": run_id,
"status": run.status,
"duration": duration
}
})
# Envoyer notification selon la configuration
if schedule.notification_type != "none":
if result["success"] and schedule.notification_type == "all":
await notification_service.notify_task_completed(
task_name=f"[Planifié] {schedule.name}",
target=schedule.target,
duration=f"{duration:.1f}s"
)
elif not result["success"]:
await notification_service.notify_task_failed(
task_name=f"[Planifié] {schedule.name}",
target=schedule.target,
error=result.get("stderr", "Erreur inconnue")[:200]
)
except Exception as e:
end_time = datetime.now(timezone.utc)
duration = (end_time - start_time).total_seconds()
run.finished_at = end_time
run.duration_seconds = duration
run.status = "failed"
run.error_message = str(e)
schedule.last_run_at = end_time
schedule.last_status = "failed"
schedule.run_count += 1
schedule.failure_count += 1
self._schedules_cache[schedule_id] = schedule
await self._persist_run(run)
await self._update_schedule_stats_in_db(schedule)
print(f"Erreur exécution schedule {schedule_id}: {e}")
async def _persist_run(self, run: ScheduleRun):
"""Persiste un run dans la base de données."""
try:
async with async_session_maker() as session:
from app.crud.schedule_run import ScheduleRunRepository
repo = ScheduleRunRepository(session)
await repo.create(
schedule_id=run.schedule_id,
task_id=run.task_id,
status=run.status,
started_at=run.started_at,
completed_at=run.finished_at,
duration=run.duration_seconds,
error_message=run.error_message,
)
await session.commit()
except Exception as e:
print(f"Erreur persistance run: {e}")
async def _update_schedule_stats_in_db(self, schedule: Schedule):
"""Met à jour les stats du schedule en BD."""
try:
async with async_session_maker() as session:
from app.crud.schedule import ScheduleRepository
repo = ScheduleRepository(session)
db_sched = await repo.get(schedule.id)
if db_sched:
await repo.update(
db_sched,
last_run=schedule.last_run_at,
last_status=schedule.last_status,
run_count=schedule.run_count,
success_count=schedule.success_count,
failure_count=schedule.failure_count,
next_run=schedule.next_run_at
)
await session.commit()
except Exception as e:
print(f"Erreur mise à jour stats schedule: {e}")
# ===== API PUBLIQUE =====
def get_all_schedules(
self,
enabled: Optional[bool] = None,
playbook: Optional[str] = None,
tag: Optional[str] = None
) -> List[Schedule]:
"""Récupère tous les schedules avec filtrage optionnel."""
schedules = list(self._schedules_cache.values())
if enabled is not None:
schedules = [s for s in schedules if s.enabled == enabled]
if playbook:
schedules = [s for s in schedules if playbook in s.playbook]
if tag:
schedules = [s for s in schedules if tag in s.tags]
# Trier par prochaine exécution
schedules.sort(key=lambda s: s.next_run_at or datetime.max.replace(tzinfo=timezone.utc))
return schedules
def get_schedule(self, schedule_id: str) -> Optional[Schedule]:
"""Récupère un schedule par son ID."""
return self._schedules_cache.get(schedule_id)
def add_schedule_to_cache(self, schedule: Schedule):
"""Ajoute un schedule au cache et crée le job."""
self._schedules_cache[schedule.id] = schedule
if schedule.enabled:
self._add_job_for_schedule(schedule)
def remove_schedule_from_cache(self, schedule_id: str):
"""Supprime un schedule du cache et son job."""
if schedule_id in self._schedules_cache:
del self._schedules_cache[schedule_id]
job_id = f"schedule_{schedule_id}"
if self.scheduler.get_job(job_id):
self.scheduler.remove_job(job_id)
def update_schedule(self, schedule_id: str, update: ScheduleUpdateRequest) -> Optional[Schedule]:
"""Met à jour un schedule."""
schedule = self._schedules_cache.get(schedule_id)
if not schedule:
return None
# Appliquer les mises à jour
if update.name is not None:
schedule.name = update.name
if update.description is not None:
schedule.description = update.description
if update.playbook is not None:
schedule.playbook = update.playbook
if update.target is not None:
schedule.target = update.target
if update.schedule_type is not None:
schedule.schedule_type = update.schedule_type
if update.recurrence is not None:
schedule.recurrence = update.recurrence
if update.timezone is not None:
schedule.timezone = update.timezone
if update.enabled is not None:
schedule.enabled = update.enabled
if update.notification_type is not None:
schedule.notification_type = update.notification_type
if update.tags is not None:
schedule.tags = update.tags
schedule.updated_at = datetime.now(timezone.utc)
self._schedules_cache[schedule_id] = schedule
# Recréer le job si activé
job_id = f"schedule_{schedule_id}"
if self.scheduler.get_job(job_id):
self.scheduler.remove_job(job_id)
if schedule.enabled:
self._add_job_for_schedule(schedule)
return schedule
def delete_schedule(self, schedule_id: str) -> bool:
"""Supprime un schedule."""
self.remove_schedule_from_cache(schedule_id)
return True
def pause_schedule(self, schedule_id: str) -> bool:
"""Met un schedule en pause."""
schedule = self._schedules_cache.get(schedule_id)
if not schedule:
return False
schedule.enabled = False
self._schedules_cache[schedule_id] = schedule
job_id = f"schedule_{schedule_id}"
job = self.scheduler.get_job(job_id)
if job:
self.scheduler.pause_job(job_id)
return True
def resume_schedule(self, schedule_id: str) -> bool:
"""Reprend un schedule en pause."""
schedule = self._schedules_cache.get(schedule_id)
if not schedule:
return False
schedule.enabled = True
self._schedules_cache[schedule_id] = schedule
job_id = f"schedule_{schedule_id}"
job = self.scheduler.get_job(job_id)
if job:
self.scheduler.resume_job(job_id)
else:
self._add_job_for_schedule(schedule)
return True
async def run_now(self, schedule_id: str) -> Optional[ScheduleRun]:
"""Exécute immédiatement un schedule."""
schedule = self._schedules_cache.get(schedule_id)
if not schedule:
return None
# Exécuter dans une tâche séparée
asyncio.create_task(self._execute_schedule(schedule_id))
return ScheduleRun(
id=f"run_{uuid.uuid4().hex[:12]}",
schedule_id=schedule_id,
started_at=datetime.now(timezone.utc),
status="running"
)
def get_stats(self) -> ScheduleStats:
"""Récupère les statistiques globales."""
schedules = list(self._schedules_cache.values())
active = sum(1 for s in schedules if s.enabled)
paused = len(schedules) - active
# Trouver la prochaine exécution
next_exec = None
next_name = None
for s in schedules:
if s.enabled and s.next_run_at:
if next_exec is None or s.next_run_at < next_exec:
next_exec = s.next_run_at
next_name = s.name
# Statistiques 24h
failures_24h = sum(1 for s in schedules if s.last_status == "failed" and s.last_run_at and s.last_run_at > datetime.now(timezone.utc) - timedelta(hours=24))
executions_24h = sum(1 for s in schedules if s.last_run_at and s.last_run_at > datetime.now(timezone.utc) - timedelta(hours=24))
# Taux de succès 7 jours
total_runs = sum(s.run_count for s in schedules)
total_success = sum(s.success_count for s in schedules)
success_rate = (total_success / total_runs * 100) if total_runs > 0 else 0.0
return ScheduleStats(
total=len(schedules),
active=active,
paused=paused,
expired=0,
next_execution=next_exec,
next_schedule_name=next_name,
failures_24h=failures_24h,
executions_24h=executions_24h,
success_rate_7d=success_rate
)
def get_upcoming_executions(self, limit: int = 10) -> List[Dict[str, Any]]:
"""Récupère les prochaines exécutions planifiées."""
upcoming = []
for schedule in self._schedules_cache.values():
if schedule.enabled and schedule.next_run_at:
upcoming.append({
"schedule_id": schedule.id,
"schedule_name": schedule.name,
"playbook": schedule.playbook,
"target": schedule.target,
"next_run_at": schedule.next_run_at.isoformat() if schedule.next_run_at else None,
"tags": schedule.tags
})
# Trier par date
upcoming.sort(key=lambda x: x["next_run_at"] or "")
return upcoming[:limit]
def validate_cron_expression(self, expression: str) -> Dict[str, Any]:
"""Valide une expression cron et retourne les prochaines exécutions."""
try:
trigger = CronTrigger.from_crontab(expression, timezone=self._timezone)
# Calculer les 5 prochaines exécutions
next_runs = []
next_time = datetime.now(self._timezone)
for _ in range(5):
next_time = trigger.get_next_fire_time(None, next_time)
if next_time:
next_runs.append(next_time.isoformat())
next_time = next_time + timedelta(seconds=1)
return {
"valid": True,
"expression": expression,
"next_runs": next_runs,
"error": None
}
except Exception as e:
return {
"valid": False,
"expression": expression,
"next_runs": None,
"error": str(e)
}
# Instance singleton du service
scheduler_service = SchedulerService()
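Esquisse (illustrative) : la validation d'une expression cron fonctionne sans démarrer le scheduler ; le chemin d'import est supposé.

```python
from app.services.scheduler_service import scheduler_service  # chemin supposé

resultat = scheduler_service.validate_cron_expression("0 2 * * *")  # tous les jours à 02:00
if resultat["valid"]:
    print("Prochaines exécutions :", resultat["next_runs"])
else:
    print("Expression invalide :", resultat["error"])
```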

View File

@ -0,0 +1,681 @@
"""
Service de vérification des prérequis au démarrage de l'application.
Valide les dépendances externes, les clés SSH, et le fonctionnement d'Ansible.
"""
import asyncio
import os
import shutil
import subprocess
from dataclasses import dataclass, field
from pathlib import Path
from typing import List, Optional, Tuple
from enum import Enum
class CheckStatus(Enum):
"""Statut d'une vérification"""
OK = "ok"
WARNING = "warning"
ERROR = "error"
SKIPPED = "skipped"
@dataclass
class CheckResult:
"""Résultat d'une vérification individuelle"""
name: str
status: CheckStatus
message: str
details: Optional[str] = None
@dataclass
class StartupCheckReport:
"""Rapport complet des vérifications au démarrage"""
results: List[CheckResult] = field(default_factory=list)
@property
def has_errors(self) -> bool:
return any(r.status == CheckStatus.ERROR for r in self.results)
@property
def has_warnings(self) -> bool:
return any(r.status == CheckStatus.WARNING for r in self.results)
@property
def all_ok(self) -> bool:
return all(r.status in (CheckStatus.OK, CheckStatus.SKIPPED) for r in self.results)
def add(self, result: CheckResult):
self.results.append(result)
def print_report(self):
"""Affiche le rapport des vérifications dans la console"""
print("\n" + "=" * 60)
print("🔍 VÉRIFICATION DES PRÉREQUIS AU DÉMARRAGE")
print("=" * 60)
for result in self.results:
icon = self._get_status_icon(result.status)
print(f"{icon} {result.name}: {result.message}")
if result.details:
# Indenter les détails
for line in result.details.split('\n'):
if line.strip():
print(f" └─ {line}")
print("-" * 60)
if self.all_ok:
print("✅ Tous les prérequis sont satisfaits")
elif self.has_errors:
print("❌ Des erreurs critiques ont été détectées")
else:
print("⚠️ Des avertissements ont été détectés")
print("=" * 60 + "\n")
def _get_status_icon(self, status: CheckStatus) -> str:
icons = {
CheckStatus.OK: "",
CheckStatus.WARNING: "⚠️ ",
CheckStatus.ERROR: "",
CheckStatus.SKIPPED: "⏭️ ",
}
return icons.get(status, "")
class StartupChecksService:
"""Service de vérification des prérequis au démarrage"""
def __init__(
self,
ansible_dir: Path,
ssh_key_path: str,
ssh_user: str = "automation",
test_host: str = "localhost",
):
self.ansible_dir = ansible_dir
self.ssh_key_path = Path(ssh_key_path)
self.ssh_user = ssh_user
self.test_host = test_host
self.report = StartupCheckReport()
async def run_all_checks(self) -> StartupCheckReport:
"""Exécute toutes les vérifications et retourne le rapport"""
self.report = StartupCheckReport()
# 1. Vérification des packages Python requis
await self._check_python_packages()
# 2. Vérification des variables d'environnement
await self._check_env_vars()
# 3. Vérification des outils système (ansible, ssh)
await self._check_system_tools()
# 4. Vérification de la clé SSH
await self._check_ssh_key()
# 5. Vérification de la configuration Ansible
await self._check_ansible_config()
# 6. Vérification de l'inventaire Ansible
await self._check_ansible_inventory()
# 7. Test de connexion SSH vers localhost
await self._check_ssh_connection()
# 8. Test d'exécution Ansible (ping localhost)
await self._check_ansible_ping()
return self.report
async def _check_python_packages(self):
"""Vérifie que les packages Python requis sont installés"""
required_packages = [
("ansible", "ansible"),
("yaml", "pyyaml"),
("aiosqlite", "aiosqlite"),
("sqlalchemy", "sqlalchemy"),
("fastapi", "fastapi"),
("uvicorn", "uvicorn"),
("httpx", "httpx"),
("apscheduler", "apscheduler"),
]
missing = []
installed = []
for import_name, package_name in required_packages:
try:
__import__(import_name)
installed.append(package_name)
except ImportError:
missing.append(package_name)
if missing:
self.report.add(CheckResult(
name="Packages Python",
status=CheckStatus.ERROR,
message=f"{len(missing)} package(s) manquant(s)",
details=f"Manquants: {', '.join(missing)}"
))
else:
self.report.add(CheckResult(
name="Packages Python",
status=CheckStatus.OK,
message=f"{len(installed)} packages requis installés"
))
async def _check_env_vars(self):
"""Vérifie les variables d'environnement importantes et affiche leurs valeurs (sensibles masquées)."""
# Définition des variables à contrôler
# required=True indique qu'elles sont importantes pour la sécurité ou la config,
# même si le code a une valeur par défaut.
env_defs = [
# Sécurité / Auth
{"key": "API_KEY", "required": True, "sensitive": True, "dev_default": "dev-key-12345"},
{"key": "JWT_SECRET_KEY", "required": True, "sensitive": True, "dev_default": "homelab-secret-key-change-in-production"},
{"key": "JWT_EXPIRE_MINUTES", "required": False, "sensitive": False, "dev_default": "1440"},
# Base de données
{"key": "DATABASE_URL", "required": False, "sensitive": False, "dev_default": None},
{"key": "DB_PATH", "required": False, "sensitive": False, "dev_default": None},
# Logs et chemins
{"key": "LOGS_DIR", "required": False, "sensitive": False, "dev_default": "/logs"},
{"key": "DIR_LOGS_TASKS", "required": False, "sensitive": False, "dev_default": "./tasks_logs"},
# SSH / Ansible
{"key": "SSH_USER", "required": False, "sensitive": False, "dev_default": "automation"},
{"key": "SSH_REMOTE_USER", "required": False, "sensitive": False, "dev_default": "root"},
{"key": "SSH_KEY_PATH", "required": False, "sensitive": False, "dev_default": None},
{"key": "ANSIBLE_INVENTORY", "required": False, "sensitive": False, "dev_default": "./ansible/inventory"},
{"key": "ANSIBLE_PLAYBOOKS", "required": False, "sensitive": False, "dev_default": "./ansible/playbooks"},
{"key": "ANSIBLE_GROUP_VARS", "required": False, "sensitive": False, "dev_default": "./ansible/inventory/group_vars"},
# Notifications ntfy
{"key": "NTFY_BASE_URL", "required": False, "sensitive": False, "dev_default": "http://localhost:8150"},
{"key": "NTFY_DEFAULT_TOPIC", "required": False, "sensitive": False, "dev_default": "homelab-events"},
{"key": "NTFY_ENABLED", "required": False, "sensitive": False, "dev_default": "true"},
{"key": "NTFY_TIMEOUT", "required": False, "sensitive": False, "dev_default": "5"},
{"key": "NTFY_MSG_TYPE", "required": False, "sensitive": False, "dev_default": "ALL"},
{"key": "NTFY_USERNAME", "required": False, "sensitive": True, "dev_default": None},
{"key": "NTFY_PASSWORD", "required": False, "sensitive": True, "dev_default": None},
{"key": "NTFY_TOKEN", "required": False, "sensitive": True, "dev_default": None},
]
details_lines: List[str] = []
warnings = 0
errors = 0
for env_def in env_defs:
key = env_def["key"]
required = env_def["required"]
sensitive = env_def["sensitive"]
dev_default = env_def["dev_default"]
value = os.environ.get(key)
if value is None or value == "":
if required:
# Valeur manquante mais le code a généralement un fallback interne
warnings += 1
details_lines.append(f"{key}=<non défini> (valeur par défaut interne utilisée)")
else:
details_lines.append(f"{key}=<non défini>")
continue
# Il y a une valeur définie
display_value: str
if sensitive:
# Masquer les valeurs sensibles (clés, tokens, mots de passe)
if len(value) <= 4:
masked = "*" * len(value)
else:
masked = value[:2] + "***" + value[-2:]
display_value = masked
else:
display_value = value
# Détecter l'utilisation de valeurs de développement connues
if dev_default is not None and value == dev_default and required:
warnings += 1
details_lines.append(f"{key}={display_value} (valeur de DEV, à changer en production)")
else:
details_lines.append(f"{key}={display_value}")
# Si aucune ligne (cas improbable), éviter un message vide
if not details_lines:
details_lines.append("Aucune variable d'environnement spécifique détectée")
if errors > 0:
status = CheckStatus.ERROR
message = f"{errors} variable(s) d'environnement critique(s) manquante(s)"
elif warnings > 0:
status = CheckStatus.WARNING
message = f"{warnings} avertissement(s) de configuration d'environnement"
else:
status = CheckStatus.OK
message = "Variables d'environnement principales définies"
self.report.add(CheckResult(
name="Variables d'environnement",
status=status,
message=message,
details="\n".join(details_lines),
))
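# Exemple (illustratif) : avec JWT_SECRET_KEY="homelab-secret-key-change-in-production"
# (le dev_default), la valeur affichée est "ho***on" et la ligne est marquée
# "valeur de DEV, à changer en production".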
async def _check_system_tools(self):
"""Vérifie que les outils système requis sont disponibles"""
tools = {
"ansible": "ansible --version",
"ansible-playbook": "ansible-playbook --version",
"ssh": "ssh -V",
}
results = []
for tool, cmd in tools.items():
path = shutil.which(tool)
if path:
# Récupérer la version
try:
result = await asyncio.to_thread(
subprocess.run,
cmd.split(),
capture_output=True,
text=True,
timeout=10
)
# Combiner stdout et stderr pour trouver la version
output = result.stdout + result.stderr
# Chercher une ligne contenant une version
version_line = ""
# Patterns à ignorer (code Python, tracebacks, etc.)
skip_starts = ('Traceback', 'File', ' ', 'from ', 'import ', '~', '^',
'if ', 'def ', 'class ', 'return ', 'raise ', 'OSError', 'WinError')
for line in output.split('\n'):
line = line.strip()
# Ignorer les lignes de traceback, import, code Python, etc.
if line and not any(line.startswith(x) for x in skip_starts):
# Chercher des patterns de version
if any(x in line.lower() for x in ['version', 'openssh', 'core [']):
version_line = line[:60]
break
# Pattern spécifique pour ansible
if tool.startswith('ansible') and 'ansible' in line.lower() and '[' in line:
version_line = line[:60]
break
if not version_line:
# Prendre la première ligne non vide qui n'est pas du code
for line in output.split('\n'):
line = line.strip()
if line and not any(line.startswith(x) for x in skip_starts) and not any(x in line for x in ['(', ')', ':', '=']):
version_line = line[:60]
break
# Si toujours pas de version, juste indiquer que c'est installé
results.append((tool, True, version_line if version_line else f"installé à {path}"))
except Exception:
results.append((tool, True, f"installé à {path}"))
else:
results.append((tool, False, "non trouvé"))
missing = [r[0] for r in results if not r[1]]
if missing:
self.report.add(CheckResult(
name="Outils système",
status=CheckStatus.ERROR,
message=f"{len(missing)} outil(s) manquant(s): {', '.join(missing)}",
details="\n".join([f"{r[0]}: {r[2]}" for r in results])
))
else:
self.report.add(CheckResult(
name="Outils système",
status=CheckStatus.OK,
message="ansible, ansible-playbook, ssh disponibles",
details="\n".join([f"{r[0]}: {r[2]}" for r in results if r[1]])
))
async def _check_ssh_key(self):
"""Vérifie que la clé SSH est disponible et valide"""
# Vérifier si le fichier existe
if not self.ssh_key_path.exists():
self.report.add(CheckResult(
name="Clé SSH",
status=CheckStatus.ERROR,
message=f"Clé SSH non trouvée",
details=f"Chemin: {self.ssh_key_path}"
))
return
# Vérifier les permissions (sur Linux/Mac)
if os.name != 'nt': # Non-Windows
stat_info = self.ssh_key_path.stat()
mode = oct(stat_info.st_mode)[-3:]
if mode not in ('600', '400'):
self.report.add(CheckResult(
name="Clé SSH",
status=CheckStatus.WARNING,
message=f"Permissions incorrectes ({mode})",
details=f"Chemin: {self.ssh_key_path}\nPermissions recommandées: 600"
))
return
# Vérifier que c'est une clé valide
try:
result = await asyncio.to_thread(
subprocess.run,
["ssh-keygen", "-l", "-f", str(self.ssh_key_path)],
capture_output=True,
text=True,
timeout=10
)
if result.returncode == 0:
key_info = result.stdout.strip()
self.report.add(CheckResult(
name="Clé SSH",
status=CheckStatus.OK,
message="Clé SSH valide",
details=f"Chemin: {self.ssh_key_path}\n{key_info}"
))
else:
self.report.add(CheckResult(
name="Clé SSH",
status=CheckStatus.ERROR,
message="Clé SSH invalide",
details=result.stderr.strip()
))
except FileNotFoundError:
# ssh-keygen non disponible (Windows sans OpenSSH)
self.report.add(CheckResult(
name="Clé SSH",
status=CheckStatus.OK,
message="Clé SSH présente (validation partielle)",
details=f"Chemin: {self.ssh_key_path}\nTaille: {self.ssh_key_path.stat().st_size} bytes"
))
except Exception as e:
self.report.add(CheckResult(
name="Clé SSH",
status=CheckStatus.WARNING,
message=f"Impossible de valider la clé: {str(e)}",
details=f"Chemin: {self.ssh_key_path}"
))
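For reference, a sketch of the same permission test using `stat.S_IMODE` instead of slicing the octal string; the key path in the usage comment is hypothetical:

```python
import os
import stat
from pathlib import Path

def key_permissions_ok(key_path: Path) -> bool:
    """True if the private key is readable by its owner only (0600 or 0400)."""
    if os.name == "nt":  # no POSIX mode bits on Windows
        return True
    return stat.S_IMODE(key_path.stat().st_mode) in (0o600, 0o400)

# Usage (hypothetical path):
# key_permissions_ok(Path.home() / ".ssh" / "id_ed25519")
```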
async def _check_ansible_config(self):
"""Vérifie la configuration Ansible"""
ansible_cfg = self.ansible_dir / "ansible.cfg"
if not ansible_cfg.exists():
self.report.add(CheckResult(
name="Configuration Ansible",
status=CheckStatus.WARNING,
message="Fichier ansible.cfg non trouvé",
details=f"Chemin attendu: {ansible_cfg}"
))
return
# Make sure the file is readable and contains the essential sections
try:
content = ansible_cfg.read_text()
has_defaults = "[defaults]" in content
has_inventory = "inventory" in content
if has_defaults and has_inventory:
self.report.add(CheckResult(
name="Configuration Ansible",
status=CheckStatus.OK,
message="ansible.cfg valide",
details=f"Chemin: {ansible_cfg}"
))
else:
self.report.add(CheckResult(
name="Configuration Ansible",
status=CheckStatus.WARNING,
message="Configuration Ansible incomplète",
details=f"[defaults]: {'' if has_defaults else ''}, inventory: {'' if has_inventory else ''}"
))
except Exception as e:
self.report.add(CheckResult(
name="Configuration Ansible",
status=CheckStatus.ERROR,
message=f"Erreur lecture ansible.cfg: {str(e)}"
))
async def _check_ansible_inventory(self):
"""Vérifie l'inventaire Ansible"""
inventory_path = self.ansible_dir / "inventory" / "hosts.yml"
if not inventory_path.exists():
self.report.add(CheckResult(
name="Inventaire Ansible",
status=CheckStatus.ERROR,
message="Fichier d'inventaire non trouvé",
details=f"Chemin attendu: {inventory_path}"
))
return
try:
import yaml
content = inventory_path.read_text()
inventory = yaml.safe_load(content)
# Count hosts and groups
host_count = 0
group_count = 0
def count_hosts(data, depth=0):
nonlocal host_count, group_count
if isinstance(data, dict):
if 'hosts' in data and isinstance(data['hosts'], dict):
host_count += len(data['hosts'])
if 'children' in data:
group_count += len(data['children'])
for child in data['children'].values():
count_hosts(child, depth + 1)
count_hosts(inventory.get('all', {}))
self.report.add(CheckResult(
name="Inventaire Ansible",
status=CheckStatus.OK,
message=f"{host_count} hôte(s) dans {group_count} groupe(s)",
details=f"Chemin: {inventory_path}"
))
except Exception as e:
self.report.add(CheckResult(
name="Inventaire Ansible",
status=CheckStatus.ERROR,
message=f"Erreur lecture inventaire: {str(e)}"
))
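The recursive walk counts entries under `hosts` and follows `children`, so nested groups are included. A self-contained check of that logic against a made-up inventory:

```python
import yaml

# Made-up inventory, shaped like ansible/inventory/hosts.yml
SAMPLE = """
all:
  children:
    env_prod:
      hosts:
        web01.lab.local: {}
        db01.lab.local: {}
    env_dev:
      hosts:
        web01.dev.local: {}
"""

def count(data):
    """Return (hosts, groups) found under an inventory node."""
    hosts = groups = 0
    if isinstance(data, dict):
        if isinstance(data.get("hosts"), dict):
            hosts += len(data["hosts"])
        for child in data.get("children", {}).values():
            groups += 1
            h, g = count(child)
            hosts += h
            groups += g
    return hosts, groups

print(count(yaml.safe_load(SAMPLE).get("all", {})))  # -> (3, 2)
```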
async def _check_ssh_connection(self):
"""Teste la connexion SSH vers l'hôte de test"""
# Pour localhost, on utilise la connexion locale Ansible, pas SSH
if self.test_host == "localhost":
self.report.add(CheckResult(
name="Connexion SSH",
status=CheckStatus.SKIPPED,
message="Test SSH ignoré pour localhost",
details="Utilisation de la connexion locale Ansible"
))
return
# First make sure the SSH key exists
if not self.ssh_key_path.exists():
self.report.add(CheckResult(
name="Connexion SSH",
status=CheckStatus.SKIPPED,
message="Test SSH ignoré (clé SSH non disponible)",
details=f"Clé manquante: {self.ssh_key_path}"
))
return
try:
# SSH probe with a short timeout
cmd = [
"ssh",
"-o", "StrictHostKeyChecking=no",
"-o", "BatchMode=yes",
"-o", "ConnectTimeout=5",
"-i", str(self.ssh_key_path),
f"{self.ssh_user}@{self.test_host}",
"echo", "SSH_OK"
]
result = await asyncio.to_thread(
subprocess.run,
cmd,
capture_output=True,
text=True,
timeout=15
)
if result.returncode == 0 and "SSH_OK" in result.stdout:
self.report.add(CheckResult(
name="Connexion SSH",
status=CheckStatus.OK,
message=f"Connexion SSH vers {self.test_host} réussie",
details=f"Utilisateur: {self.ssh_user}"
))
else:
error_msg = result.stderr.strip() if result.stderr else "Erreur inconnue"
self.report.add(CheckResult(
name="Connexion SSH",
status=CheckStatus.WARNING,
message=f"Connexion SSH vers {self.test_host} échouée",
details=f"Erreur: {error_msg[:100]}"
))
except subprocess.TimeoutExpired:
self.report.add(CheckResult(
name="Connexion SSH",
status=CheckStatus.WARNING,
message=f"Timeout connexion SSH vers {self.test_host}",
details="La connexion a dépassé le délai de 15 secondes"
))
except Exception as e:
self.report.add(CheckResult(
name="Connexion SSH",
status=CheckStatus.WARNING,
message=f"Test SSH non effectué: {str(e)}"
))
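When this check fails, replaying the exact probe by hand helps; this sketch prints a copy-pasteable shell line (user, host, and key path are made up):

```python
import shlex

cmd = [
    "ssh",
    "-o", "StrictHostKeyChecking=no",  # do not prompt for unknown host keys
    "-o", "BatchMode=yes",             # fail fast instead of asking for a password
    "-o", "ConnectTimeout=5",
    "-i", "/home/automation/.ssh/id_ed25519",
    "automation@host01.lab.local",
    "echo", "SSH_OK",
]
print(shlex.join(cmd))
```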
async def _check_ansible_ping(self):
"""Teste le ping Ansible vers l'hôte de test"""
try:
# For localhost, use the local connection (no SSH needed)
if self.test_host == "localhost":
cmd = [
"ansible",
self.test_host,
"-m", "ping",
"-i", str(self.ansible_dir / "inventory" / "hosts.yml"),
"-c", "local", # Connexion locale
"-o", # One-line output
]
else:
cmd = [
"ansible",
self.test_host,
"-m", "ping",
"-i", str(self.ansible_dir / "inventory" / "hosts.yml"),
"--private-key", str(self.ssh_key_path),
"-u", self.ssh_user,
"-o", # One-line output
]
result = await asyncio.to_thread(
subprocess.run,
cmd,
capture_output=True,
text=True,
timeout=30,
cwd=str(self.ansible_dir)
)
if result.returncode == 0 and "SUCCESS" in result.stdout:
self.report.add(CheckResult(
name="Ansible Ping",
status=CheckStatus.OK,
message=f"Ansible ping vers {self.test_host} réussi",
details="Module ping exécuté avec succès"
))
else:
# Extract the relevant error message (filter out Python tracebacks)
error_output = result.stdout + result.stderr
# Detect Windows-specific errors
if 'WinError' in error_output or 'blocking_io' in error_output.lower():
self.report.add(CheckResult(
name="Ansible Ping",
status=CheckStatus.WARNING,
message=f"Ansible non compatible avec cet environnement Windows",
details="Ansible fonctionne mieux sous WSL ou Linux"
))
return
# Filter out traceback lines and keep the useful messages
useful_lines = []
skip_patterns = ('Traceback', 'File ', ' File', ' ', 'from ', 'import ',
'~', '^', 'check_', 'if ', 'def ', 'OSError', 'raise ')
for line in error_output.split('\n'):
line = line.strip()
if line and not any(line.startswith(p) for p in skip_patterns):
# Keep Ansible error lines and other relevant messages
if any(x in line.lower() for x in ['error', 'failed', 'unreachable', 'fatal', 'msg:', 'permission']):
useful_lines.append(line[:80])
if len(useful_lines) >= 2:
break
error_detail = "\n".join(useful_lines) if useful_lines else "Vérifiez la configuration Ansible"
self.report.add(CheckResult(
name="Ansible Ping",
status=CheckStatus.WARNING,
message=f"Ansible ping vers {self.test_host} échoué",
details=error_detail
))
except subprocess.TimeoutExpired:
self.report.add(CheckResult(
name="Ansible Ping",
status=CheckStatus.WARNING,
message="Timeout Ansible ping",
details="L'exécution a dépassé 30 secondes"
))
except Exception as e:
self.report.add(CheckResult(
name="Ansible Ping",
status=CheckStatus.WARNING,
message=f"Test Ansible non effectué: {str(e)}"
))
# Global service instance (configured at startup)
startup_checks_service: Optional[StartupChecksService] = None
async def run_startup_checks(
ansible_dir: Path,
ssh_key_path: str,
ssh_user: str = "automation",
test_host: str = "localhost",
) -> StartupCheckReport:
"""
Helper that runs the startup checks.
Args:
ansible_dir: Path to the Ansible directory
ssh_key_path: Path to the private SSH key
ssh_user: SSH user for the tests
test_host: Test host for the SSH/Ansible connections
Returns:
StartupCheckReport: Report of the checks
"""
global startup_checks_service
startup_checks_service = StartupChecksService(
ansible_dir=ansible_dir,
ssh_key_path=ssh_key_path,
ssh_user=ssh_user,
test_host=test_host,
)
report = await startup_checks_service.run_all_checks()
report.print_report()
return report
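A sketch of how the helper could be wired into the FastAPI app through a lifespan hook; the paths and the hook itself are assumptions, not necessarily the project's actual wiring:

```python
from contextlib import asynccontextmanager
from pathlib import Path

from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Run the environment checks once, before serving requests
    await run_startup_checks(
        ansible_dir=Path("ansible"),
        ssh_key_path=str(Path.home() / ".ssh" / "id_ed25519"),
        ssh_user="automation",
        test_host="localhost",
    )
    yield

app = FastAPI(lifespan=lifespan)
```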


@ -0,0 +1,649 @@
"""
Service for managing task logs as markdown files.
"""
import json
import re
from datetime import datetime, timezone
from pathlib import Path
from typing import Any, Dict, List, Optional, Tuple
import uuid
import pytz
from app.schemas.task_api import TaskLogFile
class TaskLogService:
"""Service pour gérer les logs de tâches en fichiers markdown."""
def __init__(self, base_dir: Path):
self.base_dir = base_dir
self._ensure_base_dir()
# Metadata cache, to avoid re-reading files
self._metadata_cache: Dict[str, Dict[str, Any]] = {}
self._cache_file = base_dir / ".metadata_cache.json"
# Full log index (built once, updated incrementally)
self._logs_index: List[Dict[str, Any]] = []
self._index_built = False
self._last_scan_time = 0.0
self._load_cache()
def _ensure_base_dir(self):
"""Crée le répertoire de base s'il n'existe pas."""
self.base_dir.mkdir(parents=True, exist_ok=True)
def _load_cache(self):
"""Charge le cache des métadonnées depuis le fichier."""
try:
if self._cache_file.exists():
with open(self._cache_file, 'r', encoding='utf-8') as f:
self._metadata_cache = json.load(f)
except Exception:
self._metadata_cache = {}
def _save_cache(self):
"""Sauvegarde le cache des métadonnées dans le fichier."""
try:
with open(self._cache_file, 'w', encoding='utf-8') as f:
json.dump(self._metadata_cache, f, ensure_ascii=False)
except Exception:
pass
def _get_cached_metadata(self, file_path: str, file_mtime: float) -> Optional[Dict[str, Any]]:
"""Récupère les métadonnées du cache si elles sont valides."""
cached = self._metadata_cache.get(file_path)
if cached and cached.get('_mtime') == file_mtime:
return cached
return None
def _cache_metadata(self, file_path: str, file_mtime: float, metadata: Dict[str, Any]):
"""Met en cache les métadonnées d'un fichier."""
metadata['_mtime'] = file_mtime
self._metadata_cache[file_path] = metadata
def _build_index(self, force: bool = False):
"""Construit l'index complet des logs (appelé une seule fois au démarrage ou après 60s)."""
import time
current_time = time.time()
# Only rebuild when needed (at most every 60 seconds, or when forced)
if self._index_built and not force and (current_time - self._last_scan_time) < 60:
return
self._logs_index = []
cache_updated = False
if not self.base_dir.exists():
self._index_built = True
self._last_scan_time = current_time
return
# Walk every file
for year_dir in self.base_dir.iterdir():
if not year_dir.is_dir() or not year_dir.name.isdigit():
continue
for month_dir in year_dir.iterdir():
if not month_dir.is_dir():
continue
for day_dir in month_dir.iterdir():
if not day_dir.is_dir():
continue
for md_file in day_dir.glob("*.md"):
try:
entry = self._index_file(md_file)
if entry:
if entry.get('_cache_updated'):
cache_updated = True
del entry['_cache_updated']
self._logs_index.append(entry)
except Exception:
continue
# Sort by date, newest first
self._logs_index.sort(key=lambda x: x.get('created_at', 0), reverse=True)
self._index_built = True
self._last_scan_time = current_time
if cache_updated:
self._save_cache()
def _index_file(self, md_file: Path) -> Optional[Dict[str, Any]]:
"""Indexe un fichier markdown et retourne ses métadonnées."""
parts = md_file.stem.split("_")
if len(parts) < 4:
return None
file_status = parts[-1]
file_hour_str = parts[1] if len(parts) > 1 else "000000"
# Extract the date from the path
try:
rel_path = md_file.relative_to(self.base_dir)
path_parts = rel_path.parts
if len(path_parts) >= 3:
log_year, log_month, log_day = path_parts[0], path_parts[1], path_parts[2]
else:
return None
except Exception:
return None
stat = md_file.stat()
file_path_str = str(md_file)
file_mtime = stat.st_mtime
# Check the cache
cached = self._get_cached_metadata(file_path_str, file_mtime)
cache_updated = False
if cached:
task_name = cached.get('task_name', '')
file_target = cached.get('target', '')
metadata = cached
else:
# Read the file
if len(parts) >= 5:
file_target = parts[3]
task_name_from_file = "_".join(parts[4:-1]) if len(parts) > 5 else parts[4] if len(parts) > 4 else "unknown"
else:
file_target = ""
task_name_from_file = "_".join(parts[3:-1]) if len(parts) > 4 else parts[3] if len(parts) > 3 else "unknown"
try:
content = md_file.read_text(encoding='utf-8')
metadata = self._parse_markdown_metadata(content)
task_name_match = re.search(r'^#\s*[✅❌🔄⏳🚫❓]?\s*(.+)$', content, re.MULTILINE)
if task_name_match:
task_name = task_name_match.group(1).strip()
else:
task_name = task_name_from_file.replace("_", " ")
target_match = re.search(r'\|\s*\*\*Cible\*\*\s*\|\s*`([^`]+)`', content)
if target_match:
file_target = target_match.group(1).strip()
detected_source = self._detect_source_type(task_name, content)
metadata['source_type'] = detected_source
metadata['task_name'] = task_name
metadata['target'] = file_target
self._cache_metadata(file_path_str, file_mtime, metadata)
cache_updated = True
except Exception:
metadata = {'source_type': 'manual'}
task_name = task_name_from_file.replace("_", " ")
return {
'id': parts[0] + "_" + parts[1] + "_" + parts[2] if len(parts) > 2 else parts[0],
'filename': md_file.name,
'path': file_path_str,
'task_name': task_name,
'target': file_target,
'status': file_status,
'date': f"{log_year}-{log_month}-{log_day}",
'year': log_year,
'month': log_month,
'day': log_day,
'hour_str': file_hour_str,
'created_at': stat.st_ctime,
'size_bytes': stat.st_size,
'start_time': metadata.get('start_time'),
'end_time': metadata.get('end_time'),
'duration': metadata.get('duration'),
'duration_seconds': metadata.get('duration_seconds'),
'hosts': metadata.get('hosts', []),
'category': metadata.get('category'),
'subcategory': metadata.get('subcategory'),
'target_type': metadata.get('target_type'),
'source_type': metadata.get('source_type'),
'_cache_updated': cache_updated
}
def invalidate_index(self):
"""Force la reconstruction de l'index au prochain appel."""
self._index_built = False
def _get_date_path(self, dt: datetime = None) -> Path:
"""Retourne le chemin du répertoire pour une date donnée (YYYY/MM/JJ)."""
if dt is None:
dt = datetime.now(timezone.utc)
# Use the local timezone for folder names
local_tz = pytz.timezone("America/Montreal")
if dt.tzinfo is None:
dt_local = local_tz.localize(dt)
else:
dt_local = dt.astimezone(local_tz)
year = dt_local.strftime("%Y")
month = dt_local.strftime("%m")
day = dt_local.strftime("%d")
return self.base_dir / year / month / day
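Localizing before building the path matters around midnight: a run at 01:00 UTC still belongs to the previous local day in America/Montreal, for example:

```python
from datetime import datetime, timezone
import pytz

utc_dt = datetime(2025, 12, 15, 1, 0, tzinfo=timezone.utc)
local = utc_dt.astimezone(pytz.timezone("America/Montreal"))
print(local.strftime("%Y/%m/%d"))  # -> 2025/12/14 (EST is UTC-5)
```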
def _generate_task_id(self) -> str:
"""Génère un ID unique pour une tâche."""
return f"task_{datetime.now(timezone.utc).strftime('%H%M%S')}_{uuid.uuid4().hex[:6]}"
def save_task_log(self, task, output: str = "", error: str = "", source_type: str = None) -> str:
"""Sauvegarde un log de tâche en markdown et retourne le chemin."""
dt = task.start_time or datetime.now(timezone.utc)
date_path = self._get_date_path(dt)
date_path.mkdir(parents=True, exist_ok=True)
# Build the file name
task_id = self._generate_task_id()
status_emoji = {
"completed": "✅",
"failed": "❌",
"running": "🔄",
"pending": "⏳",
"cancelled": "🚫"
}.get(task.status, "❓")
# Detect the source type when not provided
if not source_type:
task_name_lower = task.name.lower()
if '[planifié]' in task_name_lower or '[scheduled]' in task_name_lower:
source_type = 'scheduled'
elif 'ad-hoc' in task_name_lower or 'adhoc' in task_name_lower:
source_type = 'adhoc'
else:
source_type = 'manual'
# Labels for the source type
source_labels = {'scheduled': 'Planifié', 'manual': 'Manuel', 'adhoc': 'Ad-hoc'}
source_label = source_labels.get(source_type, 'Manuel')
# Sanitize task name and host for filename
safe_name = task.name.replace(' ', '_').replace(':', '').replace('/', '-')[:50]
safe_host = task.host.replace(' ', '_').replace(':', '').replace('/', '-')[:30] if task.host else 'unknown'
filename = f"{task_id}_{safe_host}_{safe_name}_{task.status}.md"
filepath = date_path / filename
# Build the markdown content (section labels stay in French: the parsing regexes below depend on them)
md_content = f"""# {status_emoji} {task.name}
## Informations
| Propriété | Valeur |
|-----------|--------|
| **ID** | `{task.id}` |
| **Nom** | {task.name} |
| **Cible** | `{task.host}` |
| **Statut** | {task.status} |
| **Type** | {source_label} |
| **Progression** | {task.progress}% |
| **Début** | {task.start_time.isoformat() if task.start_time else 'N/A'} |
| **Fin** | {task.end_time.isoformat() if task.end_time else 'N/A'} |
| **Durée** | {task.duration or 'N/A'} |
## Sortie
```
{output or task.output or '(Aucune sortie)'}
```
"""
if error or task.error:
md_content += f"""## Erreurs
```
{error or task.error}
```
"""
md_content += f"""---
*Généré automatiquement par Homelab Automation Dashboard*
*Date: {datetime.now(timezone.utc).isoformat()}*
"""
# Write the file
filepath.write_text(md_content, encoding='utf-8')
# Invalidate the index so it is rebuilt on the next call
self.invalidate_index()
return str(filepath)
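Since the method only reads plain attributes off the task, a stand-in object is enough to exercise it, assuming the class above is importable (the directory and task fields below are made up):

```python
from datetime import datetime, timezone
from pathlib import Path
from types import SimpleNamespace

fake_task = SimpleNamespace(
    id="task_demo", name="Playbook Health Check", host="web01.lab.local",
    status="completed", progress=100,
    start_time=datetime.now(timezone.utc), end_time=datetime.now(timezone.utc),
    duration="12s", output="ok", error="",
)
service = TaskLogService(Path("data/task_logs"))
print(service.save_task_log(fake_task, source_type="manual"))
```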
def _parse_markdown_metadata(self, content: str) -> Dict[str, Any]:
"""Parse le contenu markdown pour extraire les métadonnées enrichies."""
metadata = {
'start_time': None,
'end_time': None,
'duration': None,
'duration_seconds': None,
'hosts': [],
'category': None,
'subcategory': None,
'target_type': None,
'source_type': None
}
# Extract start and end times
start_match = re.search(r'\|\s*\*\*Début\*\*\s*\|\s*([^|]+)', content)
if start_match:
start_val = start_match.group(1).strip()
if start_val and start_val != 'N/A':
metadata['start_time'] = start_val
end_match = re.search(r'\|\s*\*\*Fin\*\*\s*\|\s*([^|]+)', content)
if end_match:
end_val = end_match.group(1).strip()
if end_val and end_val != 'N/A':
metadata['end_time'] = end_val
duration_match = re.search(r'\|\s*\*\*Durée\*\*\s*\|\s*([^|]+)', content)
if duration_match:
dur_val = duration_match.group(1).strip()
if dur_val and dur_val != 'N/A':
metadata['duration'] = dur_val
metadata['duration_seconds'] = self._parse_duration_to_seconds(dur_val)
# Extract hosts from the Ansible output
host_patterns = [
r'^([a-zA-Z0-9][a-zA-Z0-9._-]+)\s*:\s*ok=',
r'^\s*([a-zA-Z0-9][a-zA-Z0-9._-]+)\s*\|\s*(SUCCESS|CHANGED|FAILED|UNREACHABLE)',
]
hosts_found = set()
for pattern in host_patterns:
for match in re.finditer(pattern, content, re.MULTILINE):
host = match.group(1).strip()
if host and len(host) > 2 and '.' in host or len(host) > 5:
hosts_found.add(host)
metadata['hosts'] = sorted(list(hosts_found))
# Detect the category
task_name_match = re.search(r'^#\s*[✅❌🔄⏳🚫❓]?\s*(.+)$', content, re.MULTILINE)
if task_name_match:
task_name = task_name_match.group(1).strip().lower()
if 'playbook' in task_name:
metadata['category'] = 'Playbook'
if 'health' in task_name:
metadata['subcategory'] = 'Health Check'
elif 'backup' in task_name:
metadata['subcategory'] = 'Backup'
elif 'upgrade' in task_name or 'update' in task_name:
metadata['subcategory'] = 'Upgrade'
elif 'bootstrap' in task_name:
metadata['subcategory'] = 'Bootstrap'
elif 'reboot' in task_name:
metadata['subcategory'] = 'Reboot'
elif 'ad-hoc' in task_name or 'adhoc' in task_name:
metadata['category'] = 'Ad-hoc'
else:
metadata['category'] = 'Autre'
# Detect the target type
target_match = re.search(r'\|\s*\*\*Cible\*\*\s*\|\s*`([^`]+)`', content)
if target_match:
target_val = target_match.group(1).strip()
if target_val == 'all':
metadata['target_type'] = 'group'
elif target_val.startswith('env_') or target_val.startswith('role_'):
metadata['target_type'] = 'group'
elif '.' in target_val:
metadata['target_type'] = 'host'
else:
metadata['target_type'] = 'group'
# Extract the source type from the markdown
type_match = re.search(r'\|\s*\*\*Type\*\*\s*\|\s*([^|]+)', content)
if type_match:
type_val = type_match.group(1).strip().lower()
if 'planifié' in type_val or 'scheduled' in type_val:
metadata['source_type'] = 'scheduled'
elif 'ad-hoc' in type_val or 'adhoc' in type_val:
metadata['source_type'] = 'adhoc'
elif 'manuel' in type_val or 'manual' in type_val:
metadata['source_type'] = 'manual'
return metadata
def _parse_duration_to_seconds(self, duration_str: str) -> Optional[int]:
"""Convertit une chaîne de durée en secondes."""
if not duration_str:
return None
total_seconds = 0
s_clean = duration_str.strip()
# Handle bare seconds ("45s")
sec_only_match = re.match(r'^(\d+(?:[\.,]\d+)?)\s*s$', s_clean)
if sec_only_match:
sec_val_str = sec_only_match.group(1).replace(',', '.')
try:
sec_val = float(sec_val_str)
except ValueError:
sec_val = 0.0
return int(round(sec_val)) if sec_val > 0 else None
# Format HH:MM:SS
hms_match = re.match(r'^(\d+):(\d+):(\d+)$', s_clean)
if hms_match:
h, m, s = map(int, hms_match.groups())
return h * 3600 + m * 60 + s
# Format with h/m/s units
hours = re.search(r'(\d+)\s*h', s_clean)
minutes = re.search(r'(\d+)\s*m', s_clean)
seconds = re.search(r'(\d+)\s*s', s_clean)
if hours:
total_seconds += int(hours.group(1)) * 3600
if minutes:
total_seconds += int(minutes.group(1)) * 60
if seconds:
total_seconds += int(seconds.group(1))
return total_seconds if total_seconds > 0 else None
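A few spot checks of the expected conversions; the method never touches `self`, so it can be called unbound:

```python
parse = TaskLogService._parse_duration_to_seconds

assert parse(None, "45s") == 45
assert parse(None, "01:02:03") == 3723   # 1 h, 2 min, 3 s
assert parse(None, "1h 30m") == 5400
assert parse(None, "") is None
```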
def get_task_logs(
self,
year: str = None,
month: str = None,
day: str = None,
status: str = None,
target: str = None,
category: str = None,
source_type: str = None,
hour_start: str = None,
hour_end: str = None,
limit: int = 50,
offset: int = 0
) -> Tuple[List[TaskLogFile], int]:
"""Récupère la liste des logs de tâches avec filtrage et pagination."""
self._build_index()
# Convert the hour filters to minutes
hour_start_minutes = None
hour_end_minutes = None
if hour_start:
try:
h, m = map(int, hour_start.split(':'))
hour_start_minutes = h * 60 + m
except Exception:
pass
if hour_end:
try:
h, m = map(int, hour_end.split(':'))
hour_end_minutes = h * 60 + m
except Exception:
pass
# Filter the index
filtered = []
for entry in self._logs_index:
if year and entry['year'] != year:
continue
if month and entry['month'] != month:
continue
if day and entry['day'] != day:
continue
if status and status != "all" and entry['status'] != status:
continue
if hour_start_minutes is not None or hour_end_minutes is not None:
try:
file_hour_str = entry.get('hour_str', '000000')
file_h = int(file_hour_str[:2])
file_m = int(file_hour_str[2:4])
file_minutes = file_h * 60 + file_m
if hour_start_minutes is not None and file_minutes < hour_start_minutes:
continue
if hour_end_minutes is not None and file_minutes > hour_end_minutes:
continue
except Exception:
pass
if target and target != "all":
file_target = entry.get('target', '')
if file_target and target.lower() not in file_target.lower():
continue
if category and category != "all":
file_category = entry.get('category', '')
if file_category and category.lower() not in file_category.lower():
continue
if source_type and source_type != "all":
file_source = entry.get('source_type', '')
if file_source != source_type:
continue
filtered.append(entry)
# Convert to TaskLogFile
total_count = len(filtered)
paginated = filtered[offset:offset + limit] if limit > 0 else filtered
logs = [
TaskLogFile(
id=e['id'],
filename=e['filename'],
path=e['path'],
task_name=e['task_name'],
target=e['target'],
status=e['status'],
date=e['date'],
year=e['year'],
month=e['month'],
day=e['day'],
created_at=datetime.fromtimestamp(e['created_at'], tz=timezone.utc),
size_bytes=e['size_bytes'],
start_time=e.get('start_time'),
end_time=e.get('end_time'),
duration=e.get('duration'),
duration_seconds=e.get('duration_seconds'),
hosts=e.get('hosts', []),
category=e.get('category'),
subcategory=e.get('subcategory'),
target_type=e.get('target_type'),
source_type=e.get('source_type')
)
for e in paginated
]
return logs, total_count
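Typical usage of the filtered listing, for instance the most recent failed scheduled runs of a given month (the base directory is made up):

```python
from pathlib import Path

service = TaskLogService(Path("data/task_logs"))
logs, total = service.get_task_logs(
    year="2025", month="12",
    status="failed", source_type="scheduled",
    limit=20, offset=0,
)
print(f"{total} matching log(s), showing {len(logs)}")
```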
def index_log_file(self, file_path: str) -> Optional[TaskLogFile]:
md_file = Path(file_path)
if not md_file.exists():
return None
try:
entry = self._index_file(md_file)
except Exception:
return None
if not entry:
return None
try:
return TaskLogFile(
id=entry['id'],
filename=entry['filename'],
path=entry['path'],
task_name=entry['task_name'],
target=entry['target'],
status=entry['status'],
date=entry['date'],
year=entry['year'],
month=entry['month'],
day=entry['day'],
created_at=datetime.fromtimestamp(entry['created_at'], tz=timezone.utc),
size_bytes=entry['size_bytes'],
start_time=entry.get('start_time'),
end_time=entry.get('end_time'),
duration=entry.get('duration'),
duration_seconds=entry.get('duration_seconds'),
hosts=entry.get('hosts', []),
category=entry.get('category'),
subcategory=entry.get('subcategory'),
target_type=entry.get('target_type'),
source_type=entry.get('source_type')
)
except Exception:
return None
def _detect_source_type(self, task_name: str, content: str) -> str:
"""Détecte le type de source d'une tâche."""
task_name_lower = task_name.lower()
content_lower = content.lower()
if '[planifié]' in task_name_lower or '[scheduled]' in task_name_lower:
return 'scheduled'
if 'schedule_id' in content_lower or 'planifié' in content_lower:
return 'scheduled'
if 'ad-hoc' in task_name_lower or 'adhoc' in task_name_lower:
return 'adhoc'
if 'commande ad-hoc' in content_lower or 'ansible ad-hoc' in content_lower:
return 'adhoc'
if re.search(r'\|\s*\*\*Module\*\*\s*\|', content):
return 'adhoc'
return 'manual'
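A quick sanity check of the detection rules, bypassing `__init__` since the method needs no state:

```python
svc = TaskLogService.__new__(TaskLogService)  # skip __init__, no base dir needed

print(svc._detect_source_type("[Planifié] Backup hebdomadaire", ""))  # scheduled
print(svc._detect_source_type("Commande Ad-hoc: uptime", ""))         # adhoc
print(svc._detect_source_type("Playbook Health Check", ""))           # manual
```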
def get_available_dates(self) -> Dict[str, Any]:
"""Retourne la structure des dates disponibles pour le filtrage."""
dates = {"years": {}}
if not self.base_dir.exists():
return dates
for year_dir in sorted(self.base_dir.iterdir(), reverse=True):
if year_dir.is_dir() and year_dir.name.isdigit():
year = year_dir.name
dates["years"][year] = {"months": {}}
for month_dir in sorted(year_dir.iterdir(), reverse=True):
if month_dir.is_dir() and month_dir.name.isdigit():
month = month_dir.name
dates["years"][year]["months"][month] = {"days": []}
for day_dir in sorted(month_dir.iterdir(), reverse=True):
if day_dir.is_dir() and day_dir.name.isdigit():
day = day_dir.name
count = len(list(day_dir.glob("*.md")))
dates["years"][year]["months"][month]["days"].append({
"day": day,
"count": count
})
return dates
def get_stats(self) -> Dict[str, int]:
"""Retourne les statistiques des tâches."""
stats = {"total": 0, "completed": 0, "failed": 0, "running": 0, "pending": 0}
logs, _ = self.get_task_logs(limit=0)
for log in logs:
stats["total"] += 1
if log.status in stats:
stats[log.status] += 1
return stats


@ -0,0 +1,60 @@
"""
WebSocket service for real-time updates.
"""
from typing import List
from threading import Lock
from fastapi import WebSocket
class WebSocketManager:
"""Gestionnaire des connexions WebSocket."""
def __init__(self):
self.active_connections: List[WebSocket] = []
self.lock = Lock()
async def connect(self, websocket: WebSocket):
"""Accepte et enregistre une nouvelle connexion WebSocket."""
await websocket.accept()
with self.lock:
self.active_connections.append(websocket)
def disconnect(self, websocket: WebSocket):
"""Déconnecte un client WebSocket."""
with self.lock:
if websocket in self.active_connections:
self.active_connections.remove(websocket)
async def broadcast(self, message: dict):
"""Send a message to all connected clients."""
# Snapshot the list under the lock, then send outside it: awaiting
# send_json while still holding the threading.Lock would stall every
# other coroutine that tries to connect or disconnect in the meantime.
with self.lock:
connections = list(self.active_connections)
disconnected = []
for connection in connections:
try:
await connection.send_json(message)
except Exception:
disconnected.append(connection)
# Clean up dropped connections
with self.lock:
for conn in disconnected:
if conn in self.active_connections:
self.active_connections.remove(conn)
async def send_to_client(self, websocket: WebSocket, message: dict):
"""Envoie un message à un client spécifique."""
try:
await websocket.send_json(message)
except Exception:
self.disconnect(websocket)
@property
def connection_count(self) -> int:
"""Retourne le nombre de connexions actives."""
with self.lock:
return len(self.active_connections)
# Singleton instance of the WebSocket manager
ws_manager = WebSocketManager()
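A typical way to expose the manager from the API; the route name and message shape are assumptions:

```python
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()

@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    await ws_manager.connect(websocket)
    try:
        while True:
            # Keep the connection open; inbound messages are ignored here
            await websocket.receive_text()
    except WebSocketDisconnect:
        ws_manager.disconnect(websocket)

# Elsewhere, e.g. when a task changes state:
# await ws_manager.broadcast({"type": "task_update", "status": "completed"})
```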

BIN
app/static/favicon.ico Normal file

Binary file not shown (favicon, 3.4 KiB).

Some files were not shown because too many files have changed in this diff.