innercontext/backend/innercontext/validators/__init__.py
Piotr Oleszczyk 2a9391ad32 feat(api): add LLM response validation and input sanitization
Implement Phase 1: Safety & Validation for all LLM-based suggestion engines.

- Add input sanitization module to prevent prompt injection attacks
- Implement 5 comprehensive validators (routine, batch, shopping, product parse, photo)
- Add 10+ critical safety checks (retinoid+acid conflicts, barrier compatibility, etc.)
- Integrate validation into all 5 API endpoints (routines, products, skincare)
- Add validation fields to ai_call_logs table (validation_errors, validation_warnings, auto_fixed)
- Create database migration for validation fields
- Add comprehensive test suite (9/9 tests passing, 88% coverage on validators)

Safety improvements:
- Blocks retinoid + acid conflicts in same routine/day
- Rejects unknown product IDs
- Enforces min_interval_hours rules
- Protects compromised skin barriers
- Prevents prohibited fields (dose, amount) in responses
- Validates all enum values and score ranges

All validation failures are logged and responses are rejected with HTTP 502.
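In spirit, the retinoid + acid conflict check listed above could look like the following self-contained sketch. The category names (`retinoid`, `aha`, `bha`) and the step/day data shape are assumptions for illustration, not the actual innercontext implementation:

```python
# Hypothetical sketch of a retinoid + exfoliating-acid conflict check;
# category labels and routine structure are assumed, not taken from the repo.
ACID_CATEGORIES = {"aha", "bha"}

def find_conflicts(routine_day: dict) -> list[str]:
    """Return error messages for a routine day that mixes a retinoid
    with an exfoliating acid (AHA/BHA) in the same routine."""
    categories = {step["category"] for step in routine_day["steps"]}
    errors = []
    if "retinoid" in categories and categories & ACID_CATEGORIES:
        errors.append(
            f"Day '{routine_day['name']}': retinoid combined with an "
            "exfoliating acid in the same routine"
        )
    return errors
```

A validator built this way returns errors rather than raising, so the endpoint layer can decide whether to log, auto-fix, or reject with HTTP 502.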
2026-03-06 10:16:47 +01:00


"""LLM response validators for safety and quality checks."""
from innercontext.validators.base import ValidationResult
from innercontext.validators.batch_validator import BatchValidator
from innercontext.validators.photo_validator import PhotoValidator
from innercontext.validators.product_parse_validator import ProductParseValidator
from innercontext.validators.routine_validator import RoutineSuggestionValidator
from innercontext.validators.shopping_validator import ShoppingValidator

__all__ = [
    "ValidationResult",
    "RoutineSuggestionValidator",
    "ShoppingValidator",
    "ProductParseValidator",
    "BatchValidator",
    "PhotoValidator",
]
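
For context, a `ValidationResult` base type of the kind exported here is commonly a small dataclass. A minimal sketch follows; the field names are assumptions inferred from the `validation_errors` / `validation_warnings` / `auto_fixed` columns mentioned in the commit message, not the actual class in `innercontext.validators.base`:

```python
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    """Hypothetical result type; field names mirror the ai_call_logs
    columns from the commit message and may differ from the real class."""
    errors: list[str] = field(default_factory=list)
    warnings: list[str] = field(default_factory=list)
    auto_fixed: bool = False

    @property
    def is_valid(self) -> bool:
        # Warnings are logged but do not invalidate the response;
        # any error triggers the HTTP 502 rejection path.
        return not self.errors
```

Each concrete validator (routine, batch, shopping, product parse, photo) would then return one of these, letting the API layer treat logging and rejection uniformly.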