Compare commits
24 commits
main
...
feature/mu
| Author | SHA1 | Date | |
|---|---|---|---|
| cc73c8a7f4 | |||
| dac787b81b | |||
| b11f64d5a1 | |||
| 1d5630ed8c | |||
| 4bfa4ea02d | |||
| ffa3b71309 | |||
| cd8e39939a | |||
| 803bc3b4cd | |||
| 1f47974f48 | |||
| 4782fad5b9 | |||
| 2704d58673 | |||
| 04daadccda | |||
| e29d62f949 | |||
| 470d49b061 | |||
| 157cbc425e | |||
| ed547703ad | |||
| 0253b2377d | |||
| e20c18c2ee | |||
| d91d06455b | |||
| bb5d402c15 | |||
| 5d9d18bd05 | |||
| cebea2ac86 | |||
| fecfa0b9e4 | |||
| 1c457d62a3 |
179 changed files with 28951 additions and 4432 deletions
|
|
@ -83,8 +83,6 @@ jobs:
|
||||||
backend-test:
|
backend-test:
|
||||||
name: Backend Tests
|
name: Backend Tests
|
||||||
runs-on: lxc
|
runs-on: lxc
|
||||||
# Disabled for now since tests are not integrated yet
|
|
||||||
if: false
|
|
||||||
steps:
|
steps:
|
||||||
- name: Checkout code
|
- name: Checkout code
|
||||||
uses: actions/checkout@v4
|
uses: actions/checkout@v4
|
||||||
|
|
|
||||||
15
.gitignore
vendored
15
.gitignore
vendored
|
|
@ -19,3 +19,18 @@ wheels/
|
||||||
|
|
||||||
# SQLite dev database
|
# SQLite dev database
|
||||||
*.db
|
*.db
|
||||||
|
|
||||||
|
# Unit test / coverage reports
|
||||||
|
htmlcov/
|
||||||
|
.tox/
|
||||||
|
.nox/
|
||||||
|
.coverage
|
||||||
|
.coverage.*
|
||||||
|
.cache
|
||||||
|
nosetests.xml
|
||||||
|
coverage.xml
|
||||||
|
*.cover
|
||||||
|
*.py.cover
|
||||||
|
.hypothesis/
|
||||||
|
.pytest_cache/
|
||||||
|
cover/
|
||||||
|
|
|
||||||
9
.sisyphus/boulder.json
Normal file
9
.sisyphus/boulder.json
Normal file
|
|
@ -0,0 +1,9 @@
|
||||||
|
{
|
||||||
|
"active_plan": "/Users/piotr/dev/innercontext/.sisyphus/plans/multi-user-authelia-oidc.md",
|
||||||
|
"started_at": "2026-03-12T13:31:34.526Z",
|
||||||
|
"session_ids": [
|
||||||
|
"ses_31e44571affeyTpySqhHAuYVAm"
|
||||||
|
],
|
||||||
|
"plan_name": "multi-user-authelia-oidc",
|
||||||
|
"agent": "atlas"
|
||||||
|
}
|
||||||
1
.sisyphus/evidence/task-T1-identity-models.txt
Normal file
1
.sisyphus/evidence/task-T1-identity-models.txt
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
['household_memberships', 'households', 'users']
|
||||||
1
.sisyphus/evidence/task-T1-sharing-default.txt
Normal file
1
.sisyphus/evidence/task-T1-sharing-default.txt
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
False
|
||||||
9
.sisyphus/evidence/task-T10-health-check.txt
Normal file
9
.sisyphus/evidence/task-T10-health-check.txt
Normal file
|
|
@ -0,0 +1,9 @@
|
||||||
|
Scenario: Health check handles auth redirects
|
||||||
|
Steps: Run updated scripts/healthcheck.sh
|
||||||
|
Expected: Success despite auth redirects
|
||||||
|
|
||||||
|
Output:
|
||||||
|
[2026-03-12 15:55:04] ✓ innercontext is healthy
|
||||||
|
[2026-03-12 15:55:04] ✓ innercontext-node is healthy (status 302)
|
||||||
|
[2026-03-12 15:55:04] ✓ innercontext-pricing-worker is running
|
||||||
|
[2026-03-12 15:55:04] All services healthy
|
||||||
41
.sisyphus/evidence/task-T10-missing-env.txt
Normal file
41
.sisyphus/evidence/task-T10-missing-env.txt
Normal file
|
|
@ -0,0 +1,41 @@
|
||||||
|
Scenario: Deploy validation rejects missing auth config
|
||||||
|
Steps: Run scripts/validate-env.sh with missing OIDC var
|
||||||
|
Expected: Exits non-zero, names missing var
|
||||||
|
|
||||||
|
Output:
|
||||||
|
=== Validating Shared Directory Structure ===
|
||||||
|
✓ Shared directory exists: /tmp/innercontext/shared
|
||||||
|
✓ Shared backend .env exists: /tmp/innercontext/shared/backend/.env
|
||||||
|
✓ Shared frontend .env.production exists: /tmp/innercontext/shared/frontend/.env.production
|
||||||
|
|
||||||
|
=== Validating Symlinks in Current Release ===
|
||||||
|
✓ Symlink correct: /tmp/innercontext/current/backend/.env -> ../../../shared/backend/.env
|
||||||
|
✓ Symlink correct: /tmp/innercontext/current/frontend/.env.production -> ../../../shared/frontend/.env.production
|
||||||
|
|
||||||
|
=== Validating Backend Environment Variables ===
|
||||||
|
✓ DATABASE_URL is set
|
||||||
|
✓ GEMINI_API_KEY is set
|
||||||
|
⚠ LOG_LEVEL not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
⚠ CORS_ORIGINS not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
✗ OIDC_ISSUER not found in /tmp/innercontext/shared/backend/.env
|
||||||
|
✗ OIDC_CLIENT_ID not found in /tmp/innercontext/shared/backend/.env
|
||||||
|
✗ OIDC_DISCOVERY_URL not found in /tmp/innercontext/shared/backend/.env
|
||||||
|
✗ OIDC_ADMIN_GROUPS not found in /tmp/innercontext/shared/backend/.env
|
||||||
|
✗ OIDC_MEMBER_GROUPS not found in /tmp/innercontext/shared/backend/.env
|
||||||
|
⚠ OIDC_JWKS_CACHE_TTL_SECONDS not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
⚠ BOOTSTRAP_ADMIN_OIDC_ISSUER not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
⚠ BOOTSTRAP_ADMIN_OIDC_SUB not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
⚠ BOOTSTRAP_ADMIN_EMAIL not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
⚠ BOOTSTRAP_ADMIN_NAME not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
⚠ BOOTSTRAP_HOUSEHOLD_NAME not found in /tmp/innercontext/shared/backend/.env (optional)
|
||||||
|
|
||||||
|
=== Validating Frontend Environment Variables ===
|
||||||
|
✓ PUBLIC_API_BASE is set
|
||||||
|
✓ ORIGIN is set
|
||||||
|
✗ SESSION_SECRET not found in /tmp/innercontext/shared/frontend/.env.production
|
||||||
|
✗ OIDC_ISSUER not found in /tmp/innercontext/shared/frontend/.env.production
|
||||||
|
✗ OIDC_CLIENT_ID not found in /tmp/innercontext/shared/frontend/.env.production
|
||||||
|
✗ OIDC_DISCOVERY_URL not found in /tmp/innercontext/shared/frontend/.env.production
|
||||||
|
|
||||||
|
✗ Found 9 error(s) in environment configuration
|
||||||
|
And 8 warning(s)
|
||||||
283
.sisyphus/evidence/task-T11-backend-regression.txt
Normal file
283
.sisyphus/evidence/task-T11-backend-regression.txt
Normal file
|
|
@ -0,0 +1,283 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
testpaths: tests
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 221 items
|
||||||
|
|
||||||
|
tests/test_admin_households.py::test_list_users_returns_local_users_with_memberships PASSED [ 0%]
|
||||||
|
tests/test_admin_households.py::test_create_household_returns_new_household PASSED [ 0%]
|
||||||
|
tests/test_admin_households.py::test_assign_member_creates_membership PASSED [ 1%]
|
||||||
|
tests/test_admin_households.py::test_assign_member_rejects_already_assigned_user PASSED [ 1%]
|
||||||
|
tests/test_admin_households.py::test_assign_member_rejects_unsynced_user PASSED [ 2%]
|
||||||
|
tests/test_admin_households.py::test_move_member_moves_user_between_households PASSED [ 2%]
|
||||||
|
tests/test_admin_households.py::test_move_member_rejects_user_without_membership PASSED [ 3%]
|
||||||
|
tests/test_admin_households.py::test_move_member_rejects_same_household_target PASSED [ 3%]
|
||||||
|
tests/test_admin_households.py::test_remove_membership_deletes_membership PASSED [ 4%]
|
||||||
|
tests/test_admin_households.py::test_remove_membership_requires_matching_household PASSED [ 4%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[get-/admin/users-None] PASSED [ 4%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[post-/admin/households-None] PASSED [ 5%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[post-/admin/households/10224193-681a-4152-9f5d-0891985e14b6/members-json_body2] PASSED [ 5%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[patch-/admin/households/d7b58743-f82d-4443-876b-1d400df1d467/members/aca7a450-3653-4189-9ae7-5ae6c9e7bc49-None] PASSED [ 6%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[delete-/admin/households/70436972-2a6a-4294-a0d6-d864791866d1/members/da4bec8d-c1ae-43fa-a0ad-05a4d9803918-None] PASSED [ 6%]
|
||||||
|
tests/test_ai_logs.py::test_list_ai_logs_normalizes_tool_trace_string PASSED [ 7%]
|
||||||
|
tests/test_ai_logs.py::test_get_ai_log_normalizes_tool_trace_string PASSED [ 7%]
|
||||||
|
tests/test_auth.py::test_validate_access_token_uses_cached_jwks PASSED [ 8%]
|
||||||
|
tests/test_auth.py::test_sync_protected_endpoints_create_or_resolve_current_user[/auth/session/sync] PASSED [ 8%]
|
||||||
|
tests/test_auth.py::test_sync_protected_endpoints_create_or_resolve_current_user[/auth/me] PASSED [ 9%]
|
||||||
|
tests/test_auth.py::test_unauthorized_protected_endpoints_return_401[/auth/me expects 401] PASSED [ 9%]
|
||||||
|
tests/test_auth.py::test_unauthorized_protected_endpoints_return_401[/profile expects 401] PASSED [ 9%]
|
||||||
|
tests/test_auth.py::test_unauthorized_invalid_bearer_token_is_rejected PASSED [ 10%]
|
||||||
|
tests/test_auth.py::test_require_admin_raises_for_member PASSED [ 10%]
|
||||||
|
tests/test_authz.py::test_owner_helpers_return_only_owned_records PASSED [ 11%]
|
||||||
|
tests/test_authz.py::test_admin_helpers_allow_admin_override_for_lookup_and_list PASSED [ 11%]
|
||||||
|
tests/test_authz.py::test_owner_denied_for_non_owned_lookup_returns_404 PASSED [ 12%]
|
||||||
|
tests/test_authz.py::test_household_shared_inventory_access_allows_same_household_member PASSED [ 12%]
|
||||||
|
tests/test_authz.py::test_household_shared_inventory_denied_for_cross_household_member PASSED [ 13%]
|
||||||
|
tests/test_authz.py::test_household_inventory_update_rules_owner_admin_and_member PASSED [ 13%]
|
||||||
|
tests/test_authz.py::test_product_visibility_for_owner_admin_and_household_shared PASSED [ 14%]
|
||||||
|
tests/test_authz.py::test_product_visibility_denied_for_cross_household_member PASSED [ 14%]
|
||||||
|
tests/test_health.py::test_create_medication_minimal PASSED [ 14%]
|
||||||
|
tests/test_health.py::test_create_medication_invalid_kind PASSED [ 15%]
|
||||||
|
tests/test_health.py::test_list_medications_empty PASSED [ 15%]
|
||||||
|
tests/test_health.py::test_list_filter_kind PASSED [ 16%]
|
||||||
|
tests/test_health.py::test_list_filter_product_name PASSED [ 16%]
|
||||||
|
tests/test_health.py::test_get_medication PASSED [ 17%]
|
||||||
|
tests/test_health.py::test_get_medication_not_found PASSED [ 17%]
|
||||||
|
tests/test_health.py::test_update_medication PASSED [ 18%]
|
||||||
|
tests/test_health.py::test_update_medication_not_found PASSED [ 18%]
|
||||||
|
tests/test_health.py::test_delete_medication_no_usages PASSED [ 19%]
|
||||||
|
tests/test_health.py::test_delete_medication_with_usages PASSED [ 19%]
|
||||||
|
tests/test_health.py::test_create_usage PASSED [ 19%]
|
||||||
|
tests/test_health.py::test_create_usage_medication_not_found PASSED [ 20%]
|
||||||
|
tests/test_health.py::test_list_usages_empty PASSED [ 20%]
|
||||||
|
tests/test_health.py::test_list_usages_returns_entries PASSED [ 21%]
|
||||||
|
tests/test_health.py::test_update_usage PASSED [ 21%]
|
||||||
|
tests/test_health.py::test_update_usage_not_found PASSED [ 22%]
|
||||||
|
tests/test_health.py::test_delete_usage PASSED [ 22%]
|
||||||
|
tests/test_health.py::test_delete_usage_not_found PASSED [ 23%]
|
||||||
|
tests/test_health.py::test_create_lab_result PASSED [ 23%]
|
||||||
|
tests/test_health.py::test_create_lab_result_invalid_code PASSED [ 23%]
|
||||||
|
tests/test_health.py::test_create_lab_result_invalid_flag PASSED [ 24%]
|
||||||
|
tests/test_health.py::test_list_lab_results_empty PASSED [ 24%]
|
||||||
|
tests/test_health.py::test_list_filter_test_code PASSED [ 25%]
|
||||||
|
tests/test_health.py::test_list_filter_flag PASSED [ 25%]
|
||||||
|
tests/test_health.py::test_list_filter_date_range PASSED [ 26%]
|
||||||
|
tests/test_health.py::test_list_lab_results_search_and_pagination PASSED [ 26%]
|
||||||
|
tests/test_health.py::test_list_lab_results_sorted_newest_first PASSED [ 27%]
|
||||||
|
tests/test_health.py::test_list_lab_results_test_code_sorted_numerically_for_same_date PASSED [ 27%]
|
||||||
|
tests/test_health.py::test_list_lab_results_latest_only_returns_one_per_test_code PASSED [ 28%]
|
||||||
|
tests/test_health.py::test_get_lab_result PASSED [ 28%]
|
||||||
|
tests/test_health.py::test_get_lab_result_not_found PASSED [ 28%]
|
||||||
|
tests/test_health.py::test_update_lab_result PASSED [ 29%]
|
||||||
|
tests/test_health.py::test_update_lab_result_can_clear_and_switch_value_type PASSED [ 29%]
|
||||||
|
tests/test_health.py::test_delete_lab_result PASSED [ 30%]
|
||||||
|
tests/test_inventory.py::test_get_inventory_by_id PASSED [ 30%]
|
||||||
|
tests/test_inventory.py::test_get_inventory_not_found PASSED [ 31%]
|
||||||
|
tests/test_inventory.py::test_update_inventory_opened PASSED [ 31%]
|
||||||
|
tests/test_inventory.py::test_update_inventory_not_found PASSED [ 32%]
|
||||||
|
tests/test_inventory.py::test_delete_inventory PASSED [ 32%]
|
||||||
|
tests/test_inventory.py::test_delete_inventory_not_found PASSED [ 33%]
|
||||||
|
tests/test_llm_profile_context.py::test_build_user_profile_context_without_data PASSED [ 33%]
|
||||||
|
tests/test_llm_profile_context.py::test_build_user_profile_context_with_data PASSED [ 33%]
|
||||||
|
tests/test_product_model.py::test_always_present_keys PASSED [ 34%]
|
||||||
|
tests/test_product_model.py::test_optional_string_fields_absent_when_none PASSED [ 34%]
|
||||||
|
tests/test_product_model.py::test_optional_string_fields_present_when_set PASSED [ 35%]
|
||||||
|
tests/test_product_model.py::test_ph_exact_collapses PASSED [ 35%]
|
||||||
|
tests/test_product_model.py::test_ph_range PASSED [ 36%]
|
||||||
|
tests/test_product_model.py::test_ph_only_min PASSED [ 36%]
|
||||||
|
tests/test_product_model.py::test_ph_only_max PASSED [ 37%]
|
||||||
|
tests/test_product_model.py::test_actives_pydantic_objects PASSED [ 37%]
|
||||||
|
tests/test_product_model.py::test_actives_raw_dicts PASSED [ 38%]
|
||||||
|
tests/test_product_model.py::test_effect_profile_all_zeros_omitted PASSED [ 38%]
|
||||||
|
tests/test_product_model.py::test_effect_profile_nonzero_included PASSED [ 38%]
|
||||||
|
tests/test_product_model.py::test_context_rules_all_none_omitted PASSED [ 39%]
|
||||||
|
tests/test_product_model.py::test_context_rules_with_value PASSED [ 39%]
|
||||||
|
tests/test_product_model.py::test_safety_dict_present_when_set PASSED [ 40%]
|
||||||
|
tests/test_product_model.py::test_empty_lists_omitted PASSED [ 40%]
|
||||||
|
tests/test_product_model.py::test_nonempty_lists_included PASSED [ 41%]
|
||||||
|
tests/test_products.py::test_create_minimal PASSED [ 41%]
|
||||||
|
tests/test_products.py::test_create_with_actives PASSED [ 42%]
|
||||||
|
tests/test_products.py::test_create_invalid_enum PASSED [ 42%]
|
||||||
|
tests/test_products.py::test_create_missing_required PASSED [ 42%]
|
||||||
|
tests/test_products.py::test_list_empty PASSED [ 43%]
|
||||||
|
tests/test_products.py::test_list_returns_created PASSED [ 43%]
|
||||||
|
tests/test_products.py::test_list_filter_category PASSED [ 44%]
|
||||||
|
tests/test_products.py::test_list_filter_brand PASSED [ 44%]
|
||||||
|
tests/test_products.py::test_list_filter_is_medication PASSED [ 45%]
|
||||||
|
tests/test_products.py::test_list_filter_targets PASSED [ 45%]
|
||||||
|
tests/test_products.py::test_get_by_id PASSED [ 46%]
|
||||||
|
tests/test_products.py::test_get_not_found PASSED [ 46%]
|
||||||
|
tests/test_products.py::test_update_name PASSED [ 47%]
|
||||||
|
tests/test_products.py::test_update_json_field PASSED [ 47%]
|
||||||
|
tests/test_products.py::test_update_not_found PASSED [ 47%]
|
||||||
|
tests/test_products.py::test_delete PASSED [ 48%]
|
||||||
|
tests/test_products.py::test_delete_not_found PASSED [ 48%]
|
||||||
|
tests/test_products.py::test_list_inventory_empty PASSED [ 49%]
|
||||||
|
tests/test_products.py::test_list_inventory_product_not_found PASSED [ 49%]
|
||||||
|
tests/test_products.py::test_create_inventory PASSED [ 50%]
|
||||||
|
tests/test_products.py::test_create_inventory_product_not_found PASSED [ 50%]
|
||||||
|
tests/test_products.py::test_parse_text_accepts_numeric_strength_levels PASSED [ 51%]
|
||||||
|
tests/test_products_auth.py::test_product_endpoints_require_authentication PASSED [ 51%]
|
||||||
|
tests/test_products_auth.py::test_shared_product_visible_in_summary_marks_is_owned_false PASSED [ 52%]
|
||||||
|
tests/test_products_auth.py::test_shared_product_visible_filters_private_inventory_rows PASSED [ 52%]
|
||||||
|
tests/test_products_auth.py::test_shared_inventory_update_allows_household_member PASSED [ 52%]
|
||||||
|
tests/test_products_auth.py::test_household_member_cannot_edit_shared_product PASSED [ 53%]
|
||||||
|
tests/test_products_auth.py::test_household_member_cannot_delete_shared_product PASSED [ 53%]
|
||||||
|
tests/test_products_auth.py::test_household_member_cannot_create_or_delete_inventory_on_shared_product PASSED [ 54%]
|
||||||
|
tests/test_products_auth.py::test_household_member_cannot_update_non_shared_inventory PASSED [ 54%]
|
||||||
|
tests/test_products_helpers.py::test_build_shopping_context PASSED [ 55%]
|
||||||
|
tests/test_products_helpers.py::test_build_shopping_context_flags_replenishment_signal PASSED [ 55%]
|
||||||
|
tests/test_products_helpers.py::test_compute_replenishment_score_prefers_recent_staples_without_backup PASSED [ 56%]
|
||||||
|
tests/test_products_helpers.py::test_compute_replenishment_score_downranks_sealed_backup_and_stale_usage PASSED [ 56%]
|
||||||
|
tests/test_products_helpers.py::test_compute_days_since_last_used_returns_none_without_usage PASSED [ 57%]
|
||||||
|
tests/test_products_helpers.py::test_suggest_shopping PASSED [ 57%]
|
||||||
|
tests/test_products_helpers.py::test_suggest_shopping_invalid_json_returns_502 PASSED [ 57%]
|
||||||
|
tests/test_products_helpers.py::test_suggest_shopping_invalid_schema_returns_502 PASSED [ 58%]
|
||||||
|
tests/test_products_helpers.py::test_suggest_shopping_invalid_target_concern_returns_502 PASSED [ 58%]
|
||||||
|
tests/test_products_helpers.py::test_shopping_context_medication_skip PASSED [ 59%]
|
||||||
|
tests/test_products_helpers.py::test_extract_requested_product_ids_dedupes_and_limits PASSED [ 59%]
|
||||||
|
tests/test_products_helpers.py::test_shopping_tool_handlers_return_payloads PASSED [ 60%]
|
||||||
|
tests/test_products_helpers.py::test_shopping_tool_handler_includes_last_used_on_from_mapping PASSED [ 60%]
|
||||||
|
tests/test_products_helpers.py::test_shopping_validator_accepts_freeform_product_type_and_frequency PASSED [ 61%]
|
||||||
|
tests/test_products_pricing.py::test_compute_pricing_outputs_groups_by_category PASSED [ 61%]
|
||||||
|
tests/test_products_pricing.py::test_price_tier_is_null_when_not_enough_products PASSED [ 61%]
|
||||||
|
tests/test_products_pricing.py::test_price_tier_is_computed_by_worker PASSED [ 62%]
|
||||||
|
tests/test_products_pricing.py::test_price_tier_uses_fallback_for_medium_categories PASSED [ 62%]
|
||||||
|
tests/test_products_pricing.py::test_price_tier_stays_null_for_tiny_categories_even_with_fallback_pool PASSED [ 63%]
|
||||||
|
tests/test_products_pricing.py::test_product_write_enqueues_pricing_job PASSED [ 63%]
|
||||||
|
tests/test_profile.py::test_get_profile_empty PASSED [ 64%]
|
||||||
|
tests/test_profile.py::test_upsert_profile_create_and_get PASSED [ 64%]
|
||||||
|
tests/test_profile.py::test_upsert_profile_updates_existing_row PASSED [ 65%]
|
||||||
|
tests/test_routines.py::test_create_routine_minimal PASSED [ 65%]
|
||||||
|
tests/test_routines.py::test_create_routine_invalid_part_of_day PASSED [ 66%]
|
||||||
|
tests/test_routines.py::test_list_routines_empty PASSED [ 66%]
|
||||||
|
tests/test_routines.py::test_list_filter_date_range PASSED [ 66%]
|
||||||
|
tests/test_routines.py::test_list_filter_part_of_day PASSED [ 67%]
|
||||||
|
tests/test_routines.py::test_get_routine PASSED [ 67%]
|
||||||
|
tests/test_routines.py::test_get_routine_not_found PASSED [ 68%]
|
||||||
|
tests/test_routines.py::test_update_routine_notes PASSED [ 68%]
|
||||||
|
tests/test_routines.py::test_update_routine_not_found PASSED [ 69%]
|
||||||
|
tests/test_routines.py::test_delete_routine PASSED [ 69%]
|
||||||
|
tests/test_routines.py::test_add_step_action_only PASSED [ 70%]
|
||||||
|
tests/test_routines.py::test_add_step_with_product PASSED [ 70%]
|
||||||
|
tests/test_routines.py::test_add_step_routine_not_found PASSED [ 71%]
|
||||||
|
tests/test_routines.py::test_update_step PASSED [ 71%]
|
||||||
|
tests/test_routines.py::test_update_step_not_found PASSED [ 71%]
|
||||||
|
tests/test_routines.py::test_delete_step PASSED [ 72%]
|
||||||
|
tests/test_routines.py::test_delete_step_not_found PASSED [ 72%]
|
||||||
|
tests/test_routines.py::test_list_grooming_schedule_empty PASSED [ 73%]
|
||||||
|
tests/test_routines.py::test_create_grooming_schedule PASSED [ 73%]
|
||||||
|
tests/test_routines.py::test_list_grooming_schedule_returns_entry PASSED [ 74%]
|
||||||
|
tests/test_routines.py::test_update_grooming_schedule PASSED [ 74%]
|
||||||
|
tests/test_routines.py::test_delete_grooming_schedule PASSED [ 75%]
|
||||||
|
tests/test_routines.py::test_delete_grooming_schedule_not_found PASSED [ 75%]
|
||||||
|
tests/test_routines.py::test_suggest_routine PASSED [ 76%]
|
||||||
|
tests/test_routines.py::test_suggest_batch PASSED [ 76%]
|
||||||
|
tests/test_routines.py::test_suggest_batch_invalid_date_range PASSED [ 76%]
|
||||||
|
tests/test_routines.py::test_suggest_batch_too_long PASSED [ 77%]
|
||||||
|
tests/test_routines_auth.py::test_suggest_uses_current_user_profile_and_visible_products_only PASSED [ 77%]
|
||||||
|
tests/test_routines_helpers.py::test_contains_minoxidil_text PASSED [ 78%]
|
||||||
|
tests/test_routines_helpers.py::test_is_minoxidil_product PASSED [ 78%]
|
||||||
|
tests/test_routines_helpers.py::test_ev PASSED [ 79%]
|
||||||
|
tests/test_routines_helpers.py::test_build_skin_context PASSED [ 79%]
|
||||||
|
tests/test_routines_helpers.py::test_build_skin_context_falls_back_to_recent_snapshot_within_14_days PASSED [ 80%]
|
||||||
|
tests/test_routines_helpers.py::test_build_skin_context_ignores_snapshot_older_than_14_days PASSED [ 80%]
|
||||||
|
tests/test_routines_helpers.py::test_get_recent_skin_snapshot_prefers_window_match PASSED [ 80%]
|
||||||
|
tests/test_routines_helpers.py::test_get_latest_skin_snapshot_within_days_uses_latest_within_14_days PASSED [ 81%]
|
||||||
|
tests/test_routines_helpers.py::test_build_grooming_context PASSED [ 81%]
|
||||||
|
tests/test_routines_helpers.py::test_build_upcoming_grooming_context PASSED [ 82%]
|
||||||
|
tests/test_routines_helpers.py::test_build_recent_history PASSED [ 82%]
|
||||||
|
tests/test_routines_helpers.py::test_build_recent_history_uses_reference_window PASSED [ 83%]
|
||||||
|
tests/test_routines_helpers.py::test_build_recent_history_excludes_future_routines PASSED [ 83%]
|
||||||
|
tests/test_routines_helpers.py::test_build_products_context_summary_list PASSED [ 84%]
|
||||||
|
tests/test_routines_helpers.py::test_build_objectives_context PASSED [ 84%]
|
||||||
|
tests/test_routines_helpers.py::test_build_day_context PASSED [ 85%]
|
||||||
|
tests/test_routines_helpers.py::test_get_available_products_respects_filters PASSED [ 85%]
|
||||||
|
tests/test_routines_helpers.py::test_build_product_details_tool_handler_returns_only_available_ids PASSED [ 85%]
|
||||||
|
tests/test_routines_helpers.py::test_extract_requested_product_ids_dedupes_and_limits PASSED [ 86%]
|
||||||
|
tests/test_routines_helpers.py::test_extract_active_names_uses_compact_distinct_names PASSED [ 86%]
|
||||||
|
tests/test_routines_helpers.py::test_get_available_products_excludes_minoxidil_when_flag_false PASSED [ 87%]
|
||||||
|
tests/test_routines_helpers.py::test_filter_products_by_interval PASSED [ 87%]
|
||||||
|
tests/test_routines_helpers.py::test_filter_products_by_interval_never_used_passes PASSED [ 88%]
|
||||||
|
tests/test_routines_helpers.py::test_product_details_tool_handler_returns_product_payloads PASSED [ 88%]
|
||||||
|
tests/test_skincare.py::test_create_snapshot_minimal PASSED [ 89%]
|
||||||
|
tests/test_skincare.py::test_create_snapshot_full PASSED [ 89%]
|
||||||
|
tests/test_skincare.py::test_create_snapshot_invalid_state PASSED [ 90%]
|
||||||
|
tests/test_skincare.py::test_list_snapshots_empty PASSED [ 90%]
|
||||||
|
tests/test_skincare.py::test_list_filter_date_range PASSED [ 90%]
|
||||||
|
tests/test_skincare.py::test_list_filter_overall_state PASSED [ 91%]
|
||||||
|
tests/test_skincare.py::test_get_snapshot PASSED [ 91%]
|
||||||
|
tests/test_skincare.py::test_get_snapshot_not_found PASSED [ 92%]
|
||||||
|
tests/test_skincare.py::test_update_snapshot_state PASSED [ 92%]
|
||||||
|
tests/test_skincare.py::test_update_snapshot_concerns PASSED [ 93%]
|
||||||
|
tests/test_skincare.py::test_update_snapshot_not_found PASSED [ 93%]
|
||||||
|
tests/test_skincare.py::test_delete_snapshot PASSED [ 94%]
|
||||||
|
tests/test_skincare.py::test_delete_snapshot_not_found PASSED [ 94%]
|
||||||
|
tests/test_skincare.py::test_analyze_photos_includes_user_profile_context PASSED [ 95%]
|
||||||
|
tests/test_tenancy_domains.py::test_profile_health_routines_skincare_ai_logs_are_user_scoped_by_default PASSED [ 95%]
|
||||||
|
tests/test_tenancy_domains.py::test_health_admin_override_requires_explicit_user_id PASSED [ 95%]
|
||||||
|
tests/validators/test_routine_validator.py::test_detects_retinoid_acid_conflict PASSED [ 96%]
|
||||||
|
tests/validators/test_routine_validator.py::test_rejects_unknown_product_ids PASSED [ 96%]
|
||||||
|
tests/validators/test_routine_validator.py::test_enforces_min_interval_hours PASSED [ 97%]
|
||||||
|
tests/validators/test_routine_validator.py::test_blocks_dose_field PASSED [ 97%]
|
||||||
|
tests/validators/test_routine_validator.py::test_missing_spf_in_am_leaving_home PASSED [ 98%]
|
||||||
|
tests/validators/test_routine_validator.py::test_compromised_barrier_restrictions PASSED [ 98%]
|
||||||
|
tests/validators/test_routine_validator.py::test_step_must_have_product_or_action PASSED [ 99%]
|
||||||
|
tests/validators/test_routine_validator.py::test_step_cannot_have_both_product_and_action PASSED [ 99%]
|
||||||
|
tests/validators/test_routine_validator.py::test_accepts_valid_routine PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/admin.py 93 1 99% 142
|
||||||
|
innercontext/api/ai_logs.py 63 12 81% 19, 21, 25-26, 29-30, 55-57, 77, 79, 109
|
||||||
|
innercontext/api/auth.py 68 4 94% 66, 69, 74, 109
|
||||||
|
innercontext/api/auth_deps.py 25 1 96% 43
|
||||||
|
innercontext/api/authz.py 100 12 88% 25-26, 39, 49, 83, 91, 108, 125, 128, 158, 167, 174
|
||||||
|
innercontext/api/health.py 236 8 97% 145, 158-163, 412, 414, 418
|
||||||
|
innercontext/api/inventory.py 30 0 100%
|
||||||
|
innercontext/api/llm_context.py 106 42 60% 19-21, 67, 77, 114, 116, 118, 120-131, 142, 146-149, 180-217
|
||||||
|
innercontext/api/product_llm_tools.py 107 33 69% 12-17, 25, 53, 63, 67-80, 133-134, 155-161, 193
|
||||||
|
innercontext/api/products.py 638 76 88% 82, 84, 88, 109-126, 284, 287-289, 317-318, 331, 340-341, 343, 345, 347-348, 381, 413, 415, 419, 425, 429, 520, 524, 528, 532, 536, 542, 544, 587, 604, 606, 657, 661, 692, 867, 870-871, 880-881, 887, 890-891, 918, 920, 922, 924, 933-934, 983, 1007, 1045, 1082, 1176, 1249, 1251, 1253, 1256, 1360-1375, 1392, 1439-1442, 1449-1450, 1453
|
||||||
|
innercontext/api/profile.py 39 0 100%
|
||||||
|
innercontext/api/routines.py 632 89 86% 67-84, 101-103, 112-117, 129-133, 323-324, 465, 477, 552, 592, 594, 599, 640-641, 664-693, 715, 719-721, 833, 986-1002, 1019, 1023-1024, 1030, 1033, 1039, 1064-1065, 1069, 1115-1119, 1130, 1201-1203, 1236, 1240-1241, 1247-1264, 1284-1285, 1331-1333, 1340-1341, 1344, 1454, 1485
|
||||||
|
innercontext/api/skincare.py 150 18 88% 147-149, 162-166, 179, 191, 196, 231-232, 242-245, 251, 254-255
|
||||||
|
innercontext/api/utils.py 22 2 91% 51, 59
|
||||||
|
innercontext/auth.py 236 42 82% 67-70, 75, 134, 137, 142, 144, 147-149, 156, 201-210, 216, 224-225, 232, 242, 247, 254-255, 261, 274, 298, 300, 314-315, 344-346, 378-384
|
||||||
|
innercontext/llm.py 134 117 13% 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 6 67% 18, 59, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 34 85% 203-205, 209-230, 253, 255, 257, 259, 261, 263, 265, 267, 271, 286, 318, 320, 336, 338, 340, 342, 349-354
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 89 29 67% 35, 39, 53-67, 74-80, 94, 123-130, 136
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 2 91% 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 84 34% 61-62, 67-68, 71-72, 82-83, 87-91, 100-119, 123-142, 146, 167-203, 214-240, 250-273
|
||||||
|
innercontext/validators/photo_validator.py 65 33 49% 82-87, 94-101, 108-115, 122-129, 145, 151-152, 165, 171-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 46 58% 112, 115, 117, 142, 165, 172, 186, 192-198, 205-239, 244-249, 252-257, 266-267, 274-275, 282, 287, 291, 298, 308, 315, 319, 339
|
||||||
|
innercontext/validators/routine_validator.py 146 17 88% 72-73, 111-117, 126, 187, 195, 208, 216, 234, 241, 266-267, 292
|
||||||
|
innercontext/validators/shopping_validator.py 78 20 74% 52-53, 58-59, 70, 91, 114, 123, 137-138, 142, 151-152, 156-159, 161, 193, 196-199, 203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 4077 770 81%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
============================= 221 passed in 2.98s ==============================
|
||||||
6
.sisyphus/evidence/task-T11-ci-enabled.txt
Normal file
6
.sisyphus/evidence/task-T11-ci-enabled.txt
Normal file
|
|
@ -0,0 +1,6 @@
|
||||||
|
backend-test:
|
||||||
|
name: Backend Tests
|
||||||
|
runs-on: lxc
|
||||||
|
steps:
|
||||||
|
- name: Checkout code
|
||||||
|
uses: actions/checkout@v4
|
||||||
37
.sisyphus/evidence/task-T2-migration-missing-bootstrap.txt
Normal file
37
.sisyphus/evidence/task-T2-migration-missing-bootstrap.txt
Normal file
|
|
@ -0,0 +1,37 @@
|
||||||
|
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
|
||||||
|
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
|
||||||
|
INFO [alembic.runtime.migration] Running upgrade 9f3a2c1b4d5e -> 4b7d2e9f1c3a, add auth tables and ownership
|
||||||
|
Traceback (most recent call last):
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/bin/alembic", line 10, in <module>
|
||||||
|
sys.exit(main())
|
||||||
|
^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/config.py", line 1047, in main
|
||||||
|
CommandLine(prog=prog).main(argv=argv)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/config.py", line 1037, in main
|
||||||
|
self.run_cmd(cfg, options)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/config.py", line 971, in run_cmd
|
||||||
|
fn(
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/command.py", line 483, in upgrade
|
||||||
|
script.run_env()
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/script/base.py", line 545, in run_env
|
||||||
|
util.load_python_file(self.dir, "env.py")
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/util/pyfiles.py", line 116, in load_python_file
|
||||||
|
module = load_module_py(module_id, path)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/util/pyfiles.py", line 136, in load_module_py
|
||||||
|
spec.loader.exec_module(module) # type: ignore
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "<frozen importlib._bootstrap_external>", line 999, in exec_module
|
||||||
|
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/alembic/env.py", line 51, in <module>
|
||||||
|
run_migrations_online()
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/alembic/env.py", line 45, in run_migrations_online
|
||||||
|
context.run_migrations()
|
||||||
|
File "<string>", line 8, in run_migrations
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/runtime/environment.py", line 969, in run_migrations
|
||||||
|
self.get_context().run_migrations(**kw)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/alembic/runtime/migration.py", line 626, in run_migrations
|
||||||
|
step.migration_fn(**kw)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/alembic/versions/4b7d2e9f1c3a_add_auth_tables_and_ownership.py", line 243, in upgrade
|
||||||
|
raise RuntimeError(
|
||||||
|
RuntimeError: Legacy data requires bootstrap admin identity; missing required env vars: BOOTSTRAP_ADMIN_OIDC_ISSUER, BOOTSTRAP_ADMIN_OIDC_SUB
|
||||||
3
.sisyphus/evidence/task-T2-migration-upgrade.txt
Normal file
3
.sisyphus/evidence/task-T2-migration-upgrade.txt
Normal file
|
|
@ -0,0 +1,3 @@
|
||||||
|
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
|
||||||
|
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
|
||||||
|
INFO [alembic.runtime.migration] Running upgrade 9f3a2c1b4d5e -> 4b7d2e9f1c3a, add auth tables and ownership
|
||||||
BIN
.sisyphus/evidence/task-T2-missing-bootstrap.sqlite
Normal file
BIN
.sisyphus/evidence/task-T2-missing-bootstrap.sqlite
Normal file
Binary file not shown.
BIN
.sisyphus/evidence/task-T2-upgrade.sqlite
Normal file
BIN
.sisyphus/evidence/task-T2-upgrade.sqlite
Normal file
Binary file not shown.
63
.sisyphus/evidence/task-T4-authz-denied.txt
Normal file
63
.sisyphus/evidence/task-T4-authz-denied.txt
Normal file
|
|
@ -0,0 +1,63 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 8 items / 5 deselected / 3 selected
|
||||||
|
|
||||||
|
tests/test_authz.py::test_owner_denied_for_non_owned_lookup_returns_404 PASSED [ 33%]
|
||||||
|
tests/test_authz.py::test_household_shared_inventory_denied_for_cross_household_member PASSED [ 66%]
|
||||||
|
tests/test_authz.py::test_product_visibility_denied_for_cross_household_member PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/ai_logs.py 50 25 50% 15-27, 53-59, 79-83
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 13 48% 23, 36-48, 52-57
|
||||||
|
innercontext/api/authz.py 96 41 57% 25-26, 39, 49, 63, 66, 75-83, 89-93, 105-108, 118, 121, 125, 128, 133, 141-146, 154, 157, 160, 163, 170, 172
|
||||||
|
innercontext/api/health.py 216 97 55% 75-79, 147-152, 157-161, 166, 175-181, 186-196, 206-210, 223-232, 241-247, 252-254, 276-344, 349-353, 358, 367-373, 378-380
|
||||||
|
innercontext/api/inventory.py 25 11 56% 16, 25-31, 36-38
|
||||||
|
innercontext/api/llm_context.py 92 81 12% 11, 17-20, 24-46, 68-119, 146-182, 214-218
|
||||||
|
innercontext/api/product_llm_tools.py 107 94 12% 12-17, 23-38, 48-82, 111-136, 143-162, 172-205
|
||||||
|
innercontext/api/products.py 616 403 35% 72-92, 247-255, 265-267, 278-353, 362-397, 481, 485-503, 507-516, 526-532, 536-537, 547-597, 614-658, 663-677, 681, 803-841, 853-931, 936-943, 950-961, 966-969, 979-981, 992-1001, 1010-1015, 1025-1156, 1162-1164, 1168-1181, 1187, 1217-1377
|
||||||
|
innercontext/api/profile.py 35 14 60% 29-32, 43-56
|
||||||
|
innercontext/api/routines.py 550 373 32% 58-78, 254-257, 261-278, 282-287, 296-308, 321-322, 336-345, 359-376, 384-418, 426-456, 464-475, 484-493, 497-510, 516, 527-537, 556-573, 577-583, 587-590, 681-706, 711-715, 728-960, 968-1138, 1144, 1149-1155, 1164-1170, 1175-1177, 1191-1196, 1205-1211, 1216-1218, 1230-1234, 1243-1249, 1254-1256
|
||||||
|
innercontext/api/skincare.py 131 56 57% 100, 146-219, 229-236, 241-245, 250, 259-265, 270-272
|
||||||
|
innercontext/api/utils.py 22 8 64% 22-25, 34, 43, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 119 11% 22, 44, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 106 53% 76-78, 203-205, 209-230, 238-356
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 62 53 15% 12-23, 27-48, 52-66, 70-85, 89-93
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 5 77% 23, 27, 31, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 114 22% 69-167, 173-175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 3774 2143 43%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
======================= 3 passed, 5 deselected in 0.35s ========================
|
||||||
68
.sisyphus/evidence/task-T4-authz-happy.txt
Normal file
68
.sisyphus/evidence/task-T4-authz-happy.txt
Normal file
|
|
@ -0,0 +1,68 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 8 items
|
||||||
|
|
||||||
|
tests/test_authz.py::test_owner_helpers_return_only_owned_records PASSED [ 12%]
|
||||||
|
tests/test_authz.py::test_admin_helpers_allow_admin_override_for_lookup_and_list PASSED [ 25%]
|
||||||
|
tests/test_authz.py::test_owner_denied_for_non_owned_lookup_returns_404 PASSED [ 37%]
|
||||||
|
tests/test_authz.py::test_household_shared_inventory_access_allows_same_household_member PASSED [ 50%]
|
||||||
|
tests/test_authz.py::test_household_shared_inventory_denied_for_cross_household_member PASSED [ 62%]
|
||||||
|
tests/test_authz.py::test_household_inventory_update_rules_owner_admin_and_member PASSED [ 75%]
|
||||||
|
tests/test_authz.py::test_product_visibility_for_owner_admin_and_household_shared PASSED [ 87%]
|
||||||
|
tests/test_authz.py::test_product_visibility_denied_for_cross_household_member PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/ai_logs.py 50 25 50% 15-27, 53-59, 79-83
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 13 48% 23, 36-48, 52-57
|
||||||
|
innercontext/api/authz.py 96 19 80% 25-26, 39, 49, 63, 78, 81-83, 91, 108, 118, 121, 125, 128, 143, 154, 163, 170
|
||||||
|
innercontext/api/health.py 216 97 55% 75-79, 147-152, 157-161, 166, 175-181, 186-196, 206-210, 223-232, 241-247, 252-254, 276-344, 349-353, 358, 367-373, 378-380
|
||||||
|
innercontext/api/inventory.py 25 11 56% 16, 25-31, 36-38
|
||||||
|
innercontext/api/llm_context.py 92 81 12% 11, 17-20, 24-46, 68-119, 146-182, 214-218
|
||||||
|
innercontext/api/product_llm_tools.py 107 94 12% 12-17, 23-38, 48-82, 111-136, 143-162, 172-205
|
||||||
|
innercontext/api/products.py 616 403 35% 72-92, 247-255, 265-267, 278-353, 362-397, 481, 485-503, 507-516, 526-532, 536-537, 547-597, 614-658, 663-677, 681, 803-841, 853-931, 936-943, 950-961, 966-969, 979-981, 992-1001, 1010-1015, 1025-1156, 1162-1164, 1168-1181, 1187, 1217-1377
|
||||||
|
innercontext/api/profile.py 35 14 60% 29-32, 43-56
|
||||||
|
innercontext/api/routines.py 550 373 32% 58-78, 254-257, 261-278, 282-287, 296-308, 321-322, 336-345, 359-376, 384-418, 426-456, 464-475, 484-493, 497-510, 516, 527-537, 556-573, 577-583, 587-590, 681-706, 711-715, 728-960, 968-1138, 1144, 1149-1155, 1164-1170, 1175-1177, 1191-1196, 1205-1211, 1216-1218, 1230-1234, 1243-1249, 1254-1256
|
||||||
|
innercontext/api/skincare.py 131 56 57% 100, 146-219, 229-236, 241-245, 250, 259-265, 270-272
|
||||||
|
innercontext/api/utils.py 22 8 64% 22-25, 34, 43, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 119 11% 22, 44, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 106 53% 76-78, 203-205, 209-230, 238-356
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 62 53 15% 12-23, 27-48, 52-66, 70-85, 89-93
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 5 77% 23, 27, 31, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 114 22% 69-167, 173-175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 3774 2121 44%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
============================== 8 passed in 0.40s ===============================
|
||||||
62
.sisyphus/evidence/task-T5-product-denied.txt
Normal file
62
.sisyphus/evidence/task-T5-product-denied.txt
Normal file
|
|
@ -0,0 +1,62 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 8 items / 6 deselected / 2 selected
|
||||||
|
|
||||||
|
tests/test_products_auth.py::test_household_member_cannot_edit_shared_product PASSED [ 50%]
|
||||||
|
tests/test_products_auth.py::test_household_member_cannot_delete_shared_product PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/ai_logs.py 63 34 46% 18-30, 53-57, 69-81, 106-113
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 13 48% 23, 36-48, 52-57
|
||||||
|
innercontext/api/authz.py 100 68 32% 25-26, 35-40, 48-51, 60-66, 78, 80, 83, 89-93, 105-108, 116-133, 141-150, 156-177
|
||||||
|
innercontext/api/health.py 238 115 52% 77-81, 142-146, 156-163, 179-185, 195-204, 214, 231-243, 253-270, 285-298, 313-330, 341-353, 363-371, 395-464, 474-483, 493, 510-522, 532-540
|
||||||
|
innercontext/api/inventory.py 30 13 57% 26, 36-44, 53-60
|
||||||
|
innercontext/api/llm_context.py 102 87 15% 17-21, 30-31, 39-42, 52-74, 96-147, 174-210, 242-246
|
||||||
|
innercontext/api/product_llm_tools.py 107 94 12% 12-17, 23-38, 48-82, 111-136, 143-162, 172-205
|
||||||
|
innercontext/api/products.py 638 389 39% 82, 84, 88, 106-126, 281-289, 299-301, 312-387, 396-431, 515, 519-537, 541-550, 560-566, 570-571, 581-631, 649-703, 712-727, 731, 853-891, 918, 920, 922, 924, 933-934, 983, 1005-1015, 1027-1029, 1043-1048, 1060-1072, 1081-1086, 1097-1232, 1238-1240, 1244-1257, 1263, 1296-1457
|
||||||
|
innercontext/api/profile.py 39 15 62% 36-39, 55-69
|
||||||
|
innercontext/api/routines.py 586 402 31% 63-83, 98-102, 108-116, 126-132, 300-303, 307-324, 328-333, 343-356, 371-372, 388-398, 414-433, 442-478, 487-519, 528-555, 564-573, 577-590, 596, 609-629, 652-681, 685-691, 695-698, 789-814, 819-823, 836-1068, 1076-1246, 1252, 1257-1263, 1272-1278, 1283-1285, 1299-1304, 1313-1319, 1324-1326, 1338-1342, 1351-1357, 1362-1364
|
||||||
|
innercontext/api/skincare.py 150 70 53% 103, 145-149, 158-166, 178-255, 267-277, 287-296, 306, 322-333, 343-350
|
||||||
|
innercontext/api/utils.py 22 4 82% 24, 34, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 119 11% 22, 44, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 106 53% 76-78, 203-205, 209-230, 238-356
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 62 52 16% 18-23, 27-48, 52-66, 70-85, 89-93
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 5 77% 23, 27, 31, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 114 22% 69-167, 173-175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 3909 2230 43%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
======================= 2 passed, 6 deselected in 0.49s ========================
|
||||||
63
.sisyphus/evidence/task-T5-product-sharing.txt
Normal file
63
.sisyphus/evidence/task-T5-product-sharing.txt
Normal file
|
|
@ -0,0 +1,63 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 8 items / 5 deselected / 3 selected
|
||||||
|
|
||||||
|
tests/test_products_auth.py::test_shared_product_visible_in_summary_marks_is_owned_false PASSED [ 33%]
|
||||||
|
tests/test_products_auth.py::test_shared_product_visible_filters_private_inventory_rows PASSED [ 66%]
|
||||||
|
tests/test_products_auth.py::test_shared_inventory_update_allows_household_member PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/ai_logs.py 63 34 46% 18-30, 53-57, 69-81, 106-113
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 13 48% 23, 36-48, 52-57
|
||||||
|
innercontext/api/authz.py 100 46 54% 25-26, 39, 49, 60-66, 78, 80, 83, 89-93, 105-108, 116-133, 143, 145, 147, 149, 158, 161, 164, 167, 174, 177
|
||||||
|
innercontext/api/health.py 238 115 52% 77-81, 142-146, 156-163, 179-185, 195-204, 214, 231-243, 253-270, 285-298, 313-330, 341-353, 363-371, 395-464, 474-483, 493, 510-522, 532-540
|
||||||
|
innercontext/api/inventory.py 30 5 83% 26, 37, 53-60
|
||||||
|
innercontext/api/llm_context.py 102 87 15% 17-21, 30-31, 39-42, 52-74, 96-147, 174-210, 242-246
|
||||||
|
innercontext/api/product_llm_tools.py 107 94 12% 12-17, 23-38, 48-82, 111-136, 143-162, 172-205
|
||||||
|
innercontext/api/products.py 638 389 39% 82, 84, 88, 106-126, 281-289, 299-301, 312-387, 396-431, 515, 519-537, 541-550, 560-566, 570-571, 581-631, 649-703, 712-727, 731, 853-891, 918, 920, 922, 924, 933-934, 983, 1005-1015, 1027-1029, 1043-1048, 1060-1072, 1081-1086, 1097-1232, 1238-1240, 1244-1257, 1263, 1296-1457
|
||||||
|
innercontext/api/profile.py 39 15 62% 36-39, 55-69
|
||||||
|
innercontext/api/routines.py 586 402 31% 63-83, 98-102, 108-116, 126-132, 300-303, 307-324, 328-333, 343-356, 371-372, 388-398, 414-433, 442-478, 487-519, 528-555, 564-573, 577-590, 596, 609-629, 652-681, 685-691, 695-698, 789-814, 819-823, 836-1068, 1076-1246, 1252, 1257-1263, 1272-1278, 1283-1285, 1299-1304, 1313-1319, 1324-1326, 1338-1342, 1351-1357, 1362-1364
|
||||||
|
innercontext/api/skincare.py 150 70 53% 103, 145-149, 158-166, 178-255, 267-277, 287-296, 306, 322-333, 343-350
|
||||||
|
innercontext/api/utils.py 22 4 82% 24, 34, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 119 11% 22, 44, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 106 53% 76-78, 203-205, 209-230, 238-356
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 62 52 16% 18-23, 27-48, 52-66, 70-85, 89-93
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 5 77% 23, 27, 31, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 114 22% 69-167, 173-175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 3909 2200 44%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
======================= 3 passed, 5 deselected in 0.51s ========================
|
||||||
62
.sisyphus/evidence/task-T6-domain-tenancy.txt
Normal file
62
.sisyphus/evidence/task-T6-domain-tenancy.txt
Normal file
|
|
@ -0,0 +1,62 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 2 items
|
||||||
|
|
||||||
|
tests/test_tenancy_domains.py::test_profile_health_routines_skincare_ai_logs_are_user_scoped_by_default PASSED [ 50%]
|
||||||
|
tests/test_tenancy_domains.py::test_health_admin_override_requires_explicit_user_id PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/ai_logs.py 63 21 67% 18-30, 55-57, 77, 79, 109, 112-113
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 13 48% 23, 36-48, 52-57
|
||||||
|
innercontext/api/authz.py 100 70 30% 25-26, 31, 35-40, 48-51, 63, 66, 75-83, 89-93, 105-108, 116-133, 141-150, 156-177
|
||||||
|
innercontext/api/health.py 236 68 71% 78, 145, 158-163, 182, 184, 231-243, 253-270, 285-298, 313-330, 341-353, 363-371, 400-401, 408, 410, 412, 414, 416, 418, 422-442, 492, 509-521, 531-539
|
||||||
|
innercontext/api/inventory.py 30 13 57% 26, 36-44, 53-60
|
||||||
|
innercontext/api/llm_context.py 106 58 45% 19-21, 31, 46, 59, 67, 77, 107-131, 138-149, 180-217
|
||||||
|
innercontext/api/product_llm_tools.py 107 51 52% 12-17, 25, 31, 33, 37, 50-80, 133-134, 144, 155-161, 193
|
||||||
|
innercontext/api/products.py 638 410 36% 81-89, 97, 106-126, 281-289, 299-301, 312-387, 396-431, 515, 519-537, 541-550, 560-566, 570-571, 581-631, 649-703, 714, 731, 853-891, 904-972, 981-992, 1002-1015, 1024-1029, 1043-1048, 1060-1072, 1081-1086, 1097-1232, 1238-1240, 1244-1257, 1263, 1296-1457
|
||||||
|
innercontext/api/profile.py 39 3 92% 39, 62-63
|
||||||
|
innercontext/api/routines.py 632 285 55% 67-84, 101-103, 112-117, 129-133, 309, 311, 313, 315, 319-324, 330, 334, 355, 398-399, 415-434, 451-479, 498-520, 552, 555, 559, 561, 563, 577-581, 587-600, 606, 620, 640-641, 664-693, 698, 709-710, 715, 719-721, 817, 819, 821, 827-833, 837-841, 927-929, 986-1002, 1019, 1023-1024, 1030, 1033, 1039, 1064-1065, 1069, 1115-1119, 1130, 1143-1348, 1358-1359, 1379-1386, 1397-1409, 1419-1427, 1443-1464, 1475-1491, 1501-1509, 1524-1529, 1540-1552, 1562-1570
|
||||||
|
innercontext/api/skincare.py 150 53 65% 103, 147-149, 162-166, 178-255, 272, 274, 276, 322-333, 343-350
|
||||||
|
innercontext/api/utils.py 22 7 68% 22-25, 43, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 118 12% 22, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 67 70% 78, 203-205, 209-230, 250, 253, 255, 257, 259, 261, 263, 265, 267, 269, 271, 273, 275-288, 291-298, 304, 306, 309-315, 318, 320, 331, 333, 336, 338, 340, 342, 349-354
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 89 71 20% 28-49, 53-67, 71-80, 89-107, 111-130, 134-138
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 3 86% 27, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 90 38% 72-73, 92-95, 98-101, 107-147, 151, 158, 175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 3984 1931 52%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
============================== 2 passed in 0.49s ===============================
|
||||||
61
.sisyphus/evidence/task-T6-routine-scope.txt
Normal file
61
.sisyphus/evidence/task-T6-routine-scope.txt
Normal file
|
|
@ -0,0 +1,61 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 1 item
|
||||||
|
|
||||||
|
tests/test_routines_auth.py::test_suggest_uses_current_user_profile_and_visible_products_only PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/ai_logs.py 63 34 46% 18-30, 53-57, 69-81, 106-113
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 13 48% 23, 36-48, 52-57
|
||||||
|
innercontext/api/authz.py 100 79 21% 16, 20, 24-27, 31, 35-40, 48-51, 60-66, 75-83, 89-93, 105-108, 116-133, 141-150, 156-177
|
||||||
|
innercontext/api/health.py 236 113 52% 77-81, 142-146, 156-163, 179-185, 195-204, 214, 231-243, 253-270, 285-298, 313-330, 341-353, 363-371, 395-463, 473-482, 492, 509-521, 531-539
|
||||||
|
innercontext/api/inventory.py 30 13 57% 26, 36-44, 53-60
|
||||||
|
innercontext/api/llm_context.py 106 58 45% 19-21, 31, 46, 59, 67, 77, 107-131, 138-149, 180-217
|
||||||
|
innercontext/api/product_llm_tools.py 107 51 52% 12-17, 25, 31, 33, 37, 50-80, 133-134, 144, 155-161, 193
|
||||||
|
innercontext/api/products.py 638 410 36% 81-89, 97, 106-126, 281-289, 299-301, 312-387, 396-431, 515, 519-537, 541-550, 560-566, 570-571, 581-631, 649-703, 714, 731, 853-891, 904-972, 981-992, 1002-1015, 1024-1029, 1043-1048, 1060-1072, 1081-1086, 1097-1232, 1238-1240, 1244-1257, 1263, 1296-1457
|
||||||
|
innercontext/api/profile.py 39 3 92% 39, 62-63
|
||||||
|
innercontext/api/routines.py 632 285 55% 67-84, 101-103, 112-117, 129-133, 309, 311, 313, 315, 319-324, 330, 334, 355, 398-399, 415-434, 451-479, 498-520, 552, 555, 559, 561, 563, 577-581, 587-600, 606, 620, 640-641, 664-693, 698, 709-710, 715, 719-721, 817, 819, 821, 827-833, 837-841, 927-929, 986-1002, 1019, 1023-1024, 1030, 1033, 1039, 1064-1065, 1069, 1115-1119, 1130, 1143-1348, 1358-1359, 1379-1386, 1397-1409, 1419-1427, 1443-1464, 1475-1491, 1501-1509, 1524-1529, 1540-1552, 1562-1570
|
||||||
|
innercontext/api/skincare.py 150 53 65% 103, 147-149, 162-166, 178-255, 272, 274, 276, 322-333, 343-350
|
||||||
|
innercontext/api/utils.py 22 7 68% 22-25, 43, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 118 12% 22, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 67 70% 78, 203-205, 209-230, 250, 253, 255, 257, 259, 261, 263, 265, 267, 269, 271, 273, 275-288, 291-298, 304, 306, 309-315, 318, 320, 331, 333, 336, 338, 340, 342, 349-354
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 89 71 20% 28-49, 53-67, 71-80, 89-107, 111-130, 134-138
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 3 86% 27, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 90 38% 72-73, 92-95, 98-101, 107-147, 151, 158, 175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 3984 1998 50%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
============================== 1 passed in 0.47s ===============================
|
||||||
158
.sisyphus/evidence/task-T8-backend-qa.log
Normal file
158
.sisyphus/evidence/task-T8-backend-qa.log
Normal file
|
|
@ -0,0 +1,158 @@
|
||||||
|
INFO: Started server process [65594]
|
||||||
|
INFO: Waiting for application startup.
|
||||||
|
INFO: Application startup complete.
|
||||||
|
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
|
||||||
|
INFO: 127.0.0.1:56744 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56751 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56758 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56764 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56770 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56776 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56782 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56788 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56794 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56800 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56806 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56813 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56820 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56826 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56832 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56838 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56844 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56850 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56856 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56862 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56868 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56874 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56880 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56887 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56893 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56899 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56905 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56911 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56917 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56923 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56940 - "POST /auth/session/sync HTTP/1.1" 500 Internal Server Error
|
||||||
|
ERROR: Exception in ASGI application
|
||||||
|
Traceback (most recent call last):
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
|
||||||
|
self.dialect.do_execute(
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
|
||||||
|
cursor.execute(statement, parameters)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/psycopg/cursor.py", line 117, in execute
|
||||||
|
raise ex.with_traceback(None)
|
||||||
|
psycopg.errors.UndefinedColumn: column user_profiles.user_id does not exist
|
||||||
|
LINE 1: SELECT user_profiles.id, user_profiles.user_id, user_profile...
|
||||||
|
^
|
||||||
|
|
||||||
|
The above exception was the direct cause of the following exception:
|
||||||
|
|
||||||
|
Traceback (most recent call last):
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 416, in run_asgi
|
||||||
|
result = await app( # type: ignore[func-returns-value]
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
|
||||||
|
return await self.app(scope, receive, send)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1158, in __call__
|
||||||
|
await super().__call__(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/applications.py", line 107, in __call__
|
||||||
|
await self.middleware_stack(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
|
||||||
|
raise exc
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
|
||||||
|
await self.app(scope, receive, _send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/middleware/cors.py", line 87, in __call__
|
||||||
|
await self.app(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 63, in __call__
|
||||||
|
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
|
||||||
|
raise exc
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
|
||||||
|
await app(scope, receive, sender)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
|
||||||
|
await self.app(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/routing.py", line 716, in __call__
|
||||||
|
await self.middleware_stack(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/routing.py", line 736, in app
|
||||||
|
await route.handle(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/routing.py", line 290, in handle
|
||||||
|
await self.app(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 119, in app
|
||||||
|
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
|
||||||
|
raise exc
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
|
||||||
|
await app(scope, receive, sender)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 105, in app
|
||||||
|
response = await f(request)
|
||||||
|
^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 431, in app
|
||||||
|
raw_response = await run_endpoint_function(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 315, in run_endpoint_function
|
||||||
|
return await run_in_threadpool(dependant.call, **values)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/starlette/concurrency.py", line 32, in run_in_threadpool
|
||||||
|
return await anyio.to_thread.run_sync(func)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/anyio/to_thread.py", line 63, in run_sync
|
||||||
|
return await get_async_backend().run_sync_in_worker_thread(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 2502, in run_sync_in_worker_thread
|
||||||
|
return await future
|
||||||
|
^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 986, in run
|
||||||
|
result = context.run(func, *args)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/innercontext/api/auth.py", line 158, in sync_session
|
||||||
|
return _response(session, synced_user)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/innercontext/api/auth.py", line 143, in _response
|
||||||
|
profile=_profile_public(_get_profile(session, current_user.user_id)),
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/innercontext/api/auth.py", line 100, in _get_profile
|
||||||
|
return session.exec(
|
||||||
|
^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlmodel/orm/session.py", line 75, in exec
|
||||||
|
results = super().execute(
|
||||||
|
^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2351, in execute
|
||||||
|
return self._execute_internal(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2249, in _execute_internal
|
||||||
|
result: Result[Any] = compile_state_cls.orm_execute_statement(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/orm/context.py", line 306, in orm_execute_statement
|
||||||
|
result = conn.execute(
|
||||||
|
^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
|
||||||
|
return meth(
|
||||||
|
^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 527, in _execute_on_connection
|
||||||
|
return connection._execute_clauseelement(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1641, in _execute_clauseelement
|
||||||
|
ret = self._execute_context(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
|
||||||
|
return self._exec_single_context(
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
|
||||||
|
self._handle_dbapi_exception(
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2363, in _handle_dbapi_exception
|
||||||
|
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
|
||||||
|
self.dialect.do_execute(
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
|
||||||
|
cursor.execute(statement, parameters)
|
||||||
|
File "/Users/piotr/dev/innercontext/backend/.venv/lib/python3.12/site-packages/psycopg/cursor.py", line 117, in execute
|
||||||
|
raise ex.with_traceback(None)
|
||||||
|
sqlalchemy.exc.ProgrammingError: (psycopg.errors.UndefinedColumn) column user_profiles.user_id does not exist
|
||||||
|
LINE 1: SELECT user_profiles.id, user_profiles.user_id, user_profile...
|
||||||
|
^
|
||||||
|
[SQL: SELECT user_profiles.id, user_profiles.user_id, user_profiles.birth_date, user_profiles.sex_at_birth, user_profiles.created_at, user_profiles.updated_at
|
||||||
|
FROM user_profiles
|
||||||
|
WHERE user_profiles.user_id = %(user_id_1)s::UUID]
|
||||||
|
[parameters: {'user_id_1': UUID('c6968c10-98af-4a32-a794-708aca0cc362')}]
|
||||||
|
(Background on this error at: https://sqlalche.me/e/20/f405)
|
||||||
40
.sisyphus/evidence/task-T8-backend-sqlite.log
Normal file
40
.sisyphus/evidence/task-T8-backend-sqlite.log
Normal file
|
|
@ -0,0 +1,40 @@
|
||||||
|
INFO: Started server process [67156]
|
||||||
|
INFO: Waiting for application startup.
|
||||||
|
INFO: Application startup complete.
|
||||||
|
INFO: Uvicorn running on http://127.0.0.1:8002 (Press CTRL+C to quit)
|
||||||
|
INFO: 127.0.0.1:56962 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:56974 - "POST /auth/session/sync HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57014 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57015 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57016 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57024 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57025 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57026 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57035 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57036 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57037 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57035 - "GET /products/summary HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57035 - "GET /profile HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57035 - "GET /routines?from_date=2026-02-10 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57075 - "POST /auth/session/sync HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57075 - "GET /products/summary HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57089 - "POST /auth/session/sync HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57089 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57090 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57091 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57089 - "GET /products/summary HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57109 - "POST /auth/session/sync HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57109 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57110 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57111 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57166 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57167 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57168 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57182 - "POST /auth/session/sync HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57182 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57183 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57184 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57182 - "GET /routines?from_date=2026-02-26 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57183 - "GET /skincare?from_date=2026-01-11 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57184 - "GET /health/lab-results?latest_only=true&limit=8 HTTP/1.1" 200 OK
|
||||||
|
INFO: 127.0.0.1:57401 - "GET /profile HTTP/1.1" 200 OK
|
||||||
5
.sisyphus/evidence/task-T8-backend.log
Normal file
5
.sisyphus/evidence/task-T8-backend.log
Normal file
|
|
@ -0,0 +1,5 @@
|
||||||
|
INFO: Started server process [63874]
|
||||||
|
INFO: Waiting for application startup.
|
||||||
|
INFO: Application startup complete.
|
||||||
|
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
|
||||||
|
INFO: 127.0.0.1:56616 - "GET /health-check HTTP/1.1" 200 OK
|
||||||
44
.sisyphus/evidence/task-T8-frontend-qa.log
Normal file
44
.sisyphus/evidence/task-T8-frontend-qa.log
Normal file
|
|
@ -0,0 +1,44 @@
|
||||||
|
|
||||||
|
> frontend@0.0.1 dev /Users/piotr/dev/innercontext/frontend
|
||||||
|
> vite dev --host 127.0.0.1 --port 4174
|
||||||
|
|
||||||
|
✔ [paraglide-js] Compilation complete (locale-modules)
|
||||||
|
|
||||||
|
VITE v7.3.1 ready in 1355 ms
|
||||||
|
|
||||||
|
➜ Local: http://127.0.0.1:4174/
|
||||||
|
|
||||||
|
[1;31m[404] GET /favicon.ico[0m
|
||||||
|
16:19:52 [vite] (ssr) page reload .svelte-kit/generated/server/internal.js
|
||||||
|
16:19:52 [vite] (ssr) page reload .svelte-kit/generated/root.svelte
|
||||||
|
16:19:52 [vite] (ssr) page reload .svelte-kit/generated/root.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/registry.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/en.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/pl.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/en.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/pl.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/pl.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/registry.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/messages/en.js
|
||||||
|
16:21:17 [vite] (ssr) page reload src/lib/api.ts
|
||||||
|
16:23:49 [vite] (ssr) page reload src/routes/+layout.svelte
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/server/internal.js
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/root.svelte
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/root.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/registry.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/en.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/pl.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
56
.sisyphus/evidence/task-T8-frontend-sqlite.log
Normal file
56
.sisyphus/evidence/task-T8-frontend-sqlite.log
Normal file
|
|
@ -0,0 +1,56 @@
|
||||||
|
|
||||||
|
> frontend@0.0.1 dev /Users/piotr/dev/innercontext/frontend
|
||||||
|
> vite dev --host 127.0.0.1 --port 4175
|
||||||
|
|
||||||
|
✔ [paraglide-js] Compilation complete (locale-modules)
|
||||||
|
|
||||||
|
VITE v7.3.1 ready in 1402 ms
|
||||||
|
|
||||||
|
➜ Local: http://127.0.0.1:4175/
|
||||||
|
|
||||||
|
[1;31m[404] GET /api/routines[0m
|
||||||
|
|
||||||
|
[1;31m[404] GET /api/skincare[0m
|
||||||
|
|
||||||
|
[1;31m[404] GET /api/health/lab-results[0m
|
||||||
|
|
||||||
|
[1;31m[500] GET /[0m
|
||||||
|
Error: Not Found
|
||||||
|
at request (src/lib/api.ts:68:11)
|
||||||
|
at process.processTicksAndRejections (node:internal/process/task_queues:104:5)
|
||||||
|
at async Promise.all (index 0)
|
||||||
|
at async load (src/routes/+page.server.ts:9:44)
|
||||||
|
|
||||||
|
[1;31m[404] GET /favicon.ico[0m
|
||||||
|
16:21:17 [vite] (ssr) page reload src/lib/api.ts
|
||||||
|
Avoid calling `fetch` eagerly during server-side rendering — put your `fetch` calls inside `onMount` or a `load` function instead
|
||||||
|
Avoid calling `fetch` eagerly during server-side rendering — put your `fetch` calls inside `onMount` or a `load` function instead
|
||||||
|
16:23:49 [vite] (client) hmr update /src/routes/+layout.svelte, /src/app.css
|
||||||
|
16:23:49 [vite] (ssr) page reload src/routes/+layout.svelte
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/nodes/0.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/nodes/1.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/nodes/2.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/nodes/7.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/nodes/11.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/nodes/12.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/app.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/client/matchers.js
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/server/internal.js
|
||||||
|
16:24:41 [vite] (client) page reload .svelte-kit/generated/root.js
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/root.js
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/root.svelte
|
||||||
|
16:24:42 [vite] (client) hmr update /src/lib/components/LanguageSwitcher.svelte, /src/routes/+layout.svelte, /src/routes/+page.svelte, /src/routes/products/+page.svelte, /src/routes/profile/+page.svelte, /src/routes/routines/+page.svelte
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:24:42 [vite] (client) hmr update /src/routes/+layout.svelte, /src/routes/+page.svelte, /src/routes/products/+page.svelte, /src/routes/profile/+page.svelte, /src/routes/routines/+page.svelte
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/registry.js
|
||||||
|
16:24:42 [vite] (client) hmr update /src/routes/+layout.svelte, /src/routes/+page.svelte, /src/routes/products/+page.svelte, /src/routes/profile/+page.svelte, /src/routes/routines/+page.svelte
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages.js
|
||||||
|
16:24:42 [vite] (client) hmr update /src/routes/+layout.svelte, /src/routes/+page.svelte, /src/routes/products/+page.svelte, /src/routes/profile/+page.svelte, /src/routes/routines/+page.svelte
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
|
16:24:42 [vite] (client) page reload src/lib/paraglide/messages/en.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/en.js
|
||||||
|
16:24:42 [vite] (client) page reload src/lib/paraglide/messages/pl.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/pl.js
|
||||||
|
16:24:42 [vite] (client) hmr update /src/routes/+layout.svelte, /src/routes/+page.svelte, /src/routes/products/+page.svelte, /src/routes/profile/+page.svelte, /src/routes/routines/+page.svelte
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/messages/_index.js
|
||||||
228
.sisyphus/evidence/task-T8-frontend.log
Normal file
228
.sisyphus/evidence/task-T8-frontend.log
Normal file
|
|
@ -0,0 +1,228 @@
|
||||||
|
|
||||||
|
> frontend@0.0.1 dev /Users/piotr/dev/innercontext/frontend
|
||||||
|
> vite dev --host 127.0.0.1 --port 4173
|
||||||
|
|
||||||
|
✔ [paraglide-js] Compilation complete (locale-modules)
|
||||||
|
16:14:07 [vite] (client) Re-optimizing dependencies because lockfile has changed
|
||||||
|
|
||||||
|
VITE v7.3.1 ready in 1541 ms
|
||||||
|
|
||||||
|
➜ Local: http://127.0.0.1:4173/
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
|
||||||
|
[1;31m[500] GET /auth/login[0m
|
||||||
|
Error: Missing required auth environment variable: OIDC_ISSUER
|
||||||
|
at requiredEnv (src/lib/server/auth.ts:111:11)
|
||||||
|
at getAuthConfig (src/lib/server/auth.ts:94:18)
|
||||||
|
at getSecretKey (src/lib/server/auth.ts:136:22)
|
||||||
|
at encryptValue (src/lib/server/auth.ts:160:15)
|
||||||
|
at setLoginFlowCookie (src/lib/server/auth.ts:514:5)
|
||||||
|
at createLoginRedirect (src/lib/server/auth.ts:533:3)
|
||||||
|
at GET (src/routes/auth/login/+server.ts:6:26)
|
||||||
|
16:18:10 [vite] (ssr) page reload .svelte-kit/generated/server/internal.js
|
||||||
|
16:18:10 [vite] (ssr) page reload .svelte-kit/generated/root.svelte
|
||||||
|
16:18:10 [vite] (ssr) page reload .svelte-kit/generated/root.js
|
||||||
|
16:18:10 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:18:10 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:19:52 [vite] (ssr) page reload .svelte-kit/generated/server/internal.js
|
||||||
|
16:19:52 [vite] (ssr) page reload .svelte-kit/generated/root.svelte
|
||||||
|
16:19:52 [vite] (ssr) page reload .svelte-kit/generated/root.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
|
16:19:53 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:21:17 [vite] (ssr) page reload src/lib/api.ts
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/server/internal.js
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/root.svelte
|
||||||
|
16:24:41 [vite] (ssr) page reload .svelte-kit/generated/root.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/runtime.js
|
||||||
|
16:24:42 [vite] (ssr) page reload src/lib/paraglide/server.js
|
||||||
85
.sisyphus/evidence/task-T8-oidc-mock.log
Normal file
85
.sisyphus/evidence/task-T8-oidc-mock.log
Normal file
|
|
@ -0,0 +1,85 @@
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Pe0C5a4zQ4Vi55paBNm20bGetmj3Y3yX&code_challenge=bq7aLLFrO4nIa6kvBUM47B56asCKazcoQbOkATvooYM&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=vt-gcRpDGF4HG7V1QCVbA-NX2WLV6UAY&code_challenge=UH0-E3tc3A3U3TCc-FUZDzvzJ1asqarJbznagQ8Lj7o&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=m3l7BolBJCvjhuxRcsSFDZ6gZE6rqTb1&code_challenge=2g7E4QZDpYNRMtQUt5ryXhaYycdglHIeFM1UxZ-XDDs&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=eoKVLYkr20ziVr_dZ8V9JWBkGowz7NYI&code_challenge=vUxYDfnKXdVXnk5aHxY8LjHKCszU3SEiIebjzFPv4J0&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=pd7rjNvJbBvu6NQYuRX6D8bF5szwi8y9&code_challenge=SWGABggwp25CFq8PLbTT7DLSTsvezc77M9PGX_YYNPY&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=In2AvuYgIN7a4n1xiCp1gHtVBn0zpdg-&code_challenge=5ys7xQwmLRLdhb9ZPyI0h3e0bxfaPzonIBTYr4uj6Xg&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=P5shwVgZUoH9xchcI0o078fsgVweezIS&code_challenge=RmboHw0sdVGiU1POAwcptydVlhwdgUzQLLjUMT9S8OU&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Kp2yUUykkn9ImErvLiLfeOAmJscmF6q6&code_challenge=epJaom98WkEBzjRkkNcLTAp89sB5NkzYrdrjZRxWLok&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=L86d5LRsIOMdz6WD744sjrwSe6iKkW0L&code_challenge=uvl59gn613ivLBLxHhQLPnEFAv48m7jTe9PBGknM1A0&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=vCs34wOnVVizzt0YuT44Mb35qRNttfUH&code_challenge=drdkN8hf7ScN7PSjYY4wmpVhmRv2BJYRyCia0P3oPwM&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=x7j_uSKquOLIvhjzs6SRbF81uQo1TGb_&code_challenge=nBHiDMdcFNj6JjXbJnB4-Ogsp98QtRplWP8IZnHXZ84&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Kbw1x2FH82MPkXsIs_cK-8-5uHbTpXl1&code_challenge=XweHN-subsxNoIOcpvxoqUx9ILiGh6RG5eHJ6O2jmKI&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=-PIRB74zLnjfijXRSXKeVht6x0jjARr8&code_challenge=nno5wTo4kvMXh6Hbv9Q4UQ42Ah64rdX1RPh1Qas8FTQ&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=ttM5YXVAn1VugFsWVIw9DpnRtYnUfMWW&code_challenge=5cMexMQ8ioSPTSw1FAuAImlWZm5ogbZ5vemeAZtXjvs&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Psc20ibq2QaTNOYGWuGsWopRtcIYMyDt&code_challenge=wsR_Ly1BzlxHsHCFOcoLNXi5hbRVej1iowuBCynBEXQ&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Qynh8qyA--irimulhuBnjW7vheoMFv1d&code_challenge=UW4T2DllHIe-d48yu4B33kMEJ8CZuk4uG-g8xGELX2U&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=6i9k2lfIJpNcPY_bE-BjmnEyatt6eSEO&code_challenge=LilRpHaBD6Iij-x69kT0jg9hebm66PUSGBP3CkPQ4R0&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=hv06PULt9VCe6Z57ICw_kfH_rOA68Ye8&code_challenge=H6WP5l4QfrsPqEESZOUm61MWtHDrSrKe4xLl2j0Mqa4&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=bPb6NuPvTTpSLSBf0lcEyiovQMrf25ZU&code_challenge=c8QWtQNvy9pNw9nfeoAym2SM20WpBVePYjG80kV59tY&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=19pfKm4slbG0zxJTa_aNKZauy1pAlM24&code_challenge=fJsle6PvRo2Q53ibfrj2aHQYDfbLoXyWXH11-hJO1-8&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Ls5-Ps19zFfHr6vdvFrgZXPU4HK2cdMH&code_challenge=hIaTDVr0WpZVBppRxfd4h0nc48Cq6llSetMzG5NU6R0&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=UCR6DlUCqwChXZiu5m3IezhnovkLU1hP&code_challenge=mf1NzdhG0OQnoB2_L_VQNlgohTdD3ZQ1gnrC-WM5xic&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=cikc5ZIqH3gtF0JNS4xeB8ye36C0FeYK&code_challenge=jRsCnQlVsNs4qftFnFdHUm63CxT8tkiprb5QYGcvVQs&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=sxqmLO0piajwnRePC4fq3rSE_EErnT3j&code_challenge=Pg_VsUw-qJXnF_JpdlJbUrxJTPRjzMY2q_rTad86iv4&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=H2123ucRCauviiQcfOgL2ACEaMJMCnCd&code_challenge=J_32kAoALP8nRUhLdpWSnHr9uePiK9ek8K5gYXKrqrM&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=QB4ItNBiVPTcsbvf8jw5fC0wCmLdrBaR&code_challenge=7IJ0bQzKs3-0atAvro4GR87lUJ3rYcUO5nWQ7fGBbJE&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=8f4SjKIZk6Zr23J5PhkIKaWZcZOoHbNy&code_challenge=S0TxmFf5PZ0vgx2krjLl2WIcSyxHI5CcxVdH3SDu7GE&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=gIsLo0NRxP-EP3W8oTjf1O2QS1Nijj-g&code_challenge=bnYTMZcj4cWQ32OPHKWN3638Nuoh3pzPH0wP0GmHLOc&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=RVs6ZqxJdOCnK_RDdRHJbeJFLcjH8jDZ&code_challenge=ghcwJxfIcbS1vZuFbltcVmX9IHbobNEoIvFRrjRo4hc&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=BaIv9zAbKGEmbOlhh7J8I0sye2qpRPcJ&code_challenge=yckMaHRptZd3J04kALAyxxCqIIulPIf9PrTmeA6eN7s&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4174%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=8ikyRz70r3g4_lJGDkx1wEl27neWvVFS&code_challenge=b5f8yNS9lEemYbrnFBnG3OANpefQSU3NXwwFkMS63A4&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"POST /token HTTP/1.1" 200 -
|
||||||
|
"GET /userinfo HTTP/1.1" 200 -
|
||||||
|
"GET /jwks HTTP/1.1" 200 -
|
||||||
|
"GET /.well-known/openid-configuration HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=C0AvwQhDg-kdVylKE43TG9lawjgYQW0G&code_challenge=88-OcQzCn4HlOoJfXOJ_CQo1PV9YRJCgracnfsuO5LU&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=t_oQGVk1spVtWPABdSOUPjgdkk0GhGzR&code_challenge=bpLSYPmkEZpr_RbpzfcO6kYEKBBTuRFemoQDH9mBUbU&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"POST /token HTTP/1.1" 200 -
|
||||||
|
"GET /userinfo HTTP/1.1" 200 -
|
||||||
|
"GET /jwks HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=3wTcjLbHZ73Tq2RdHPs79-kJEcBcPBLQ&code_challenge=OMJ_4YniZmJEdxtS0kpHCgdbZfZWnNSNhMLOeom1y2g&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"POST /token HTTP/1.1" 200 -
|
||||||
|
"GET /userinfo HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=Rhe95uXtrJH8dXF0T1e_bLVh4ALeny99&code_challenge=1C8cQjsJ1XMTbESSapVbalnDO7QITK1ovYIt2N2OQ-M&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"POST /token HTTP/1.1" 200 -
|
||||||
|
"GET /userinfo HTTP/1.1" 200 -
|
||||||
|
"GET /logout?client_id=innercontext-web&post_logout_redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2F HTTP/1.1" 303 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=yYJja-mcNuteTK_vfmRG2q1ngy8uQCYA&code_challenge=X-3bMxYVrLNx0WI_jBRjT33JGo-ge_2SMbuDjMQSB7o&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"POST /token HTTP/1.1" 200 -
|
||||||
|
"GET /userinfo HTTP/1.1" 200 -
|
||||||
|
"GET /authorize?client_id=innercontext-web&response_type=code&redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2Fauth%2Fcallback&scope=openid+profile+email+groups+offline_access&state=y64pl2E_jbiDANFJFjySZe06mGwJxLPY&code_challenge=l3OC99rarYHtSPsiS01wxbIdThMXnTRPUCu_9dUSA5w&code_challenge_method=S256 HTTP/1.1" 303 -
|
||||||
|
"POST /token HTTP/1.1" 200 -
|
||||||
|
"GET /userinfo HTTP/1.1" 200 -
|
||||||
|
"GET /jwks HTTP/1.1" 200 -
|
||||||
18
.sisyphus/evidence/task-T8-protected-nav.md
Normal file
18
.sisyphus/evidence/task-T8-protected-nav.md
Normal file
|
|
@ -0,0 +1,18 @@
|
||||||
|
# Task T8 Protected Navigation
|
||||||
|
|
||||||
|
- QA app: `http://127.0.0.1:4175`
|
||||||
|
- Backend: `http://127.0.0.1:8002`
|
||||||
|
- Mock OIDC issuer: `http://127.0.0.1:9100`
|
||||||
|
- Backend DB: `.sisyphus/evidence/task-T8-qa.sqlite`
|
||||||
|
|
||||||
|
Authenticated shell and protected route checks executed with Playwright:
|
||||||
|
|
||||||
|
- `/` -> title `Dashboard - innercontext`, heading `Dashboard`, shell user `Playwright User`, role `Użytkownik`, logout visible `true`
|
||||||
|
- `/products` -> title `Produkty — innercontext`, heading `Produkty`, shell user `Playwright User`, role `Użytkownik`, logout visible `true`
|
||||||
|
- `/profile` -> title `Profil — innercontext`, heading `Profil`, shell user `Playwright User`, role `Użytkownik`, logout visible `true`
|
||||||
|
- `/routines` -> title `Rutyny — innercontext`, heading `Rutyny`, shell user `Playwright User`, role `Użytkownik`, logout visible `true`
|
||||||
|
|
||||||
|
Logout endpoint check executed with Playwright request API:
|
||||||
|
|
||||||
|
- `GET /auth/logout` -> `303`
|
||||||
|
- Location -> `http://127.0.0.1:9100/logout?client_id=innercontext-web&post_logout_redirect_uri=http%3A%2F%2F127.0.0.1%3A4175%2F`
|
||||||
BIN
.sisyphus/evidence/task-T8-qa.sqlite
Normal file
BIN
.sisyphus/evidence/task-T8-qa.sqlite
Normal file
Binary file not shown.
10
.sisyphus/evidence/task-T8-signed-out-network.txt
Normal file
10
.sisyphus/evidence/task-T8-signed-out-network.txt
Normal file
|
|
@ -0,0 +1,10 @@
|
||||||
|
Playwright unauthenticated request check
|
||||||
|
|
||||||
|
request: GET http://127.0.0.1:4175/products
|
||||||
|
cookies: none
|
||||||
|
maxRedirects: 0
|
||||||
|
|
||||||
|
status: 303
|
||||||
|
location: /auth/login?returnTo=%2Fproducts
|
||||||
|
|
||||||
|
result: protected page redirects to the login flow before returning page content.
|
||||||
67
.sisyphus/evidence/task-T9-admin-households-denied.txt
Normal file
67
.sisyphus/evidence/task-T9-admin-households-denied.txt
Normal file
|
|
@ -0,0 +1,67 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 15 items / 9 deselected / 6 selected
|
||||||
|
|
||||||
|
tests/test_admin_households.py::test_assign_member_rejects_unsynced_user PASSED [ 16%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[get-/admin/users-None] PASSED [ 33%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[post-/admin/households-None] PASSED [ 50%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[post-/admin/households/ad0a09c5-b0bb-4565-895c-eada9498db52/members-json_body2] PASSED [ 66%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[patch-/admin/households/25453802-90e7-44a8-99ab-3241db73c5c0/members/56c635a2-bb8b-48ff-876e-f085ae7cff6c-None] PASSED [ 83%]
|
||||||
|
tests/test_admin_households.py::test_admin_household_routes_forbidden_for_member[delete-/admin/households/4390cd61-0028-4970-a573-a9fca06aff36/members/e5ba2298-ad10-44c0-a77a-b3ea7efa929f-None] PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/admin.py 93 26 72% 78, 102-110, 142, 165-183, 195-206
|
||||||
|
innercontext/api/ai_logs.py 63 34 46% 18-30, 53-57, 69-81, 106-113
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 10 60% 23, 36-48
|
||||||
|
innercontext/api/authz.py 100 79 21% 16, 20, 24-27, 31, 35-40, 48-51, 60-66, 75-83, 89-93, 105-108, 116-133, 141-150, 156-177
|
||||||
|
innercontext/api/health.py 236 113 52% 77-81, 142-146, 156-163, 179-185, 195-204, 214, 231-243, 253-270, 285-298, 313-330, 341-353, 363-371, 395-463, 473-482, 492, 509-521, 531-539
|
||||||
|
innercontext/api/inventory.py 30 13 57% 26, 36-44, 53-60
|
||||||
|
innercontext/api/llm_context.py 106 91 14% 17-21, 30-36, 44-47, 57-79, 101-153, 180-217, 249-253
|
||||||
|
innercontext/api/product_llm_tools.py 107 94 12% 12-17, 23-38, 48-82, 111-136, 143-162, 172-205
|
||||||
|
innercontext/api/products.py 638 419 34% 81-89, 97, 106-126, 281-289, 299-301, 312-387, 396-431, 515, 519-537, 541-550, 560-566, 570-571, 581-631, 649-703, 712-727, 731, 853-891, 904-972, 981-992, 1002-1015, 1024-1029, 1043-1048, 1060-1072, 1081-1086, 1097-1232, 1238-1240, 1244-1257, 1263, 1296-1457
|
||||||
|
innercontext/api/profile.py 39 15 62% 36-39, 55-69
|
||||||
|
innercontext/api/routines.py 632 446 29% 64-84, 99-103, 109-117, 127-133, 301-304, 308-325, 329-334, 344-357, 372-373, 389-399, 415-434, 443-479, 488-520, 529-565, 574-583, 587-600, 606, 619-641, 664-693, 697-703, 707-710, 714-721, 814-842, 852-857, 871-1134, 1143-1348, 1358-1359, 1371-1386, 1397-1409, 1419-1427, 1443-1464, 1475-1491, 1501-1509, 1524-1529, 1540-1552, 1562-1570
|
||||||
|
innercontext/api/skincare.py 150 70 53% 103, 145-149, 158-166, 178-255, 267-277, 287-296, 306, 322-333, 343-350
|
||||||
|
innercontext/api/utils.py 22 4 82% 34, 43, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 119 11% 22, 44, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 106 53% 76-78, 203-205, 209-230, 238-356
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 89 76 15% 19-24, 28-49, 53-67, 71-80, 89-107, 111-130, 134-138
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 5 77% 23, 27, 31, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 114 22% 69-167, 173-175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 4077 2364 42%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
======================= 6 passed, 9 deselected in 0.41s ========================
|
||||||
65
.sisyphus/evidence/task-T9-admin-households.txt
Normal file
65
.sisyphus/evidence/task-T9-admin-households.txt
Normal file
|
|
@ -0,0 +1,65 @@
|
||||||
|
============================= test session starts ==============================
|
||||||
|
platform darwin -- Python 3.12.12, pytest-9.0.2, pluggy-1.6.0 -- /Users/piotr/dev/innercontext/backend/.venv/bin/python3
|
||||||
|
cachedir: .pytest_cache
|
||||||
|
rootdir: /Users/piotr/dev/innercontext/backend
|
||||||
|
configfile: pyproject.toml
|
||||||
|
plugins: anyio-4.12.1, cov-7.0.0
|
||||||
|
collecting ... collected 15 items / 11 deselected / 4 selected
|
||||||
|
|
||||||
|
tests/test_admin_households.py::test_create_household_returns_new_household PASSED [ 25%]
|
||||||
|
tests/test_admin_households.py::test_assign_member_creates_membership PASSED [ 50%]
|
||||||
|
tests/test_admin_households.py::test_assign_member_rejects_already_assigned_user PASSED [ 75%]
|
||||||
|
tests/test_admin_households.py::test_assign_member_rejects_unsynced_user PASSED [100%]
|
||||||
|
|
||||||
|
================================ tests coverage ================================
|
||||||
|
______________ coverage: platform darwin, python 3.12.12-final-0 _______________
|
||||||
|
|
||||||
|
Name Stmts Miss Cover Missing
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
innercontext/api/__init__.py 0 0 100%
|
||||||
|
innercontext/api/admin.py 93 26 72% 78, 102-110, 142, 165-183, 195-206
|
||||||
|
innercontext/api/ai_logs.py 63 34 46% 18-30, 53-57, 69-81, 106-113
|
||||||
|
innercontext/api/auth.py 68 18 74% 65-79, 100, 106-109, 122-129, 153-158, 166
|
||||||
|
innercontext/api/auth_deps.py 25 10 60% 23, 36-48
|
||||||
|
innercontext/api/authz.py 100 79 21% 16, 20, 24-27, 31, 35-40, 48-51, 60-66, 75-83, 89-93, 105-108, 116-133, 141-150, 156-177
|
||||||
|
innercontext/api/health.py 236 113 52% 77-81, 142-146, 156-163, 179-185, 195-204, 214, 231-243, 253-270, 285-298, 313-330, 341-353, 363-371, 395-463, 473-482, 492, 509-521, 531-539
|
||||||
|
innercontext/api/inventory.py 30 13 57% 26, 36-44, 53-60
|
||||||
|
innercontext/api/llm_context.py 106 91 14% 17-21, 30-36, 44-47, 57-79, 101-153, 180-217, 249-253
|
||||||
|
innercontext/api/product_llm_tools.py 107 94 12% 12-17, 23-38, 48-82, 111-136, 143-162, 172-205
|
||||||
|
innercontext/api/products.py 638 419 34% 81-89, 97, 106-126, 281-289, 299-301, 312-387, 396-431, 515, 519-537, 541-550, 560-566, 570-571, 581-631, 649-703, 712-727, 731, 853-891, 904-972, 981-992, 1002-1015, 1024-1029, 1043-1048, 1060-1072, 1081-1086, 1097-1232, 1238-1240, 1244-1257, 1263, 1296-1457
|
||||||
|
innercontext/api/profile.py 39 15 62% 36-39, 55-69
|
||||||
|
innercontext/api/routines.py 632 446 29% 64-84, 99-103, 109-117, 127-133, 301-304, 308-325, 329-334, 344-357, 372-373, 389-399, 415-434, 443-479, 488-520, 529-565, 574-583, 587-600, 606, 619-641, 664-693, 697-703, 707-710, 714-721, 814-842, 852-857, 871-1134, 1143-1348, 1358-1359, 1371-1386, 1397-1409, 1419-1427, 1443-1464, 1475-1491, 1501-1509, 1524-1529, 1540-1552, 1562-1570
|
||||||
|
innercontext/api/skincare.py 150 70 53% 103, 145-149, 158-166, 178-255, 267-277, 287-296, 306, 322-333, 343-350
|
||||||
|
innercontext/api/utils.py 22 4 82% 34, 43, 51, 59
|
||||||
|
innercontext/auth.py 236 146 38% 64-77, 127-129, 133-137, 141-149, 153-156, 161-168, 187-192, 195-210, 213-217, 220-228, 231-248, 251-264, 267, 271-274, 279, 283-284, 288-317, 325-363, 373-384
|
||||||
|
innercontext/llm.py 134 119 11% 22, 44, 62-66, 74-102, 118-214, 231-326
|
||||||
|
innercontext/llm_safety.py 18 14 22% 17-45, 58-61, 80-83
|
||||||
|
innercontext/models/__init__.py 13 0 100%
|
||||||
|
innercontext/models/ai_log.py 33 0 100%
|
||||||
|
innercontext/models/api_metadata.py 15 0 100%
|
||||||
|
innercontext/models/base.py 3 0 100%
|
||||||
|
innercontext/models/domain.py 4 0 100%
|
||||||
|
innercontext/models/enums.py 152 0 100%
|
||||||
|
innercontext/models/health.py 64 0 100%
|
||||||
|
innercontext/models/household.py 14 0 100%
|
||||||
|
innercontext/models/household_membership.py 20 0 100%
|
||||||
|
innercontext/models/pricing.py 19 0 100%
|
||||||
|
innercontext/models/product.py 226 106 53% 76-78, 203-205, 209-230, 238-356
|
||||||
|
innercontext/models/profile.py 17 0 100%
|
||||||
|
innercontext/models/routine.py 42 0 100%
|
||||||
|
innercontext/models/skincare.py 37 0 100%
|
||||||
|
innercontext/models/user.py 19 0 100%
|
||||||
|
innercontext/services/__init__.py 0 0 100%
|
||||||
|
innercontext/services/fx.py 57 42 26% 16, 20-22, 26-48, 54-67, 71-77
|
||||||
|
innercontext/services/pricing_jobs.py 89 76 15% 19-24, 28-49, 53-67, 71-80, 89-107, 111-130, 134-138
|
||||||
|
innercontext/validators/__init__.py 7 0 100%
|
||||||
|
innercontext/validators/base.py 22 5 77% 23, 27, 31, 35, 52
|
||||||
|
innercontext/validators/batch_validator.py 128 105 18% 37, 58-154, 167-203, 214-240, 249-273
|
||||||
|
innercontext/validators/photo_validator.py 65 54 17% 58-134, 144-152, 164-178
|
||||||
|
innercontext/validators/product_parse_validator.py 110 93 15% 108-154, 164-172, 185-198, 205-239, 243-267, 273-319, 325-339
|
||||||
|
innercontext/validators/routine_validator.py 146 114 22% 69-167, 173-175, 182-197, 201-218, 229-246, 259-275, 288-309
|
||||||
|
innercontext/validators/shopping_validator.py 78 58 26% 49-96, 102-114, 122-123, 136-142, 150-161, 169-203
|
||||||
|
----------------------------------------------------------------------------------
|
||||||
|
TOTAL 4077 2364 42%
|
||||||
|
Coverage HTML written to dir htmlcov
|
||||||
|
======================= 4 passed, 11 deselected in 0.41s =======================
|
||||||
|
|
@ -0,0 +1,13 @@
|
||||||
|
# T10: Runtime Configuration and Validation
|
||||||
|
|
||||||
|
## Learnings
|
||||||
|
- Nginx needs `X-Forwarded-Host` and `X-Forwarded-Port` for proper OIDC callback URL generation.
|
||||||
|
- `curl -f` fails on 302 redirects, which are common when a page is protected by OIDC.
|
||||||
|
- Health checks and deployment scripts must be updated to allow 302/303/307 status codes for the frontend root.
|
||||||
|
- Bash `((errors++))` returns 1 if `errors` is 0, which can kill the script if `set -e` is active. Use `errors=$((errors + 1))` instead.
|
||||||
|
- Documenting required environment variables in systemd service files and `DEPLOYMENT.md` is crucial for operators.
|
||||||
|
- Authelia client configuration requires specific `redirect_uris` and `scopes` (openid, profile, email, groups).
|
||||||
|
|
||||||
|
## Verification
|
||||||
|
- `scripts/validate-env.sh` correctly identifies missing OIDC and session variables.
|
||||||
|
- `scripts/healthcheck.sh` and `deploy.sh` now handle auth redirects (302) for the frontend.
|
||||||
5
.sisyphus/notepads/multi-user-authelia-oidc/decisions.md
Normal file
5
.sisyphus/notepads/multi-user-authelia-oidc/decisions.md
Normal file
|
|
@ -0,0 +1,5 @@
|
||||||
|
- Added `users`, `households`, and `household_memberships` tables with OIDC identity key (`oidc_issuer`, `oidc_subject`) and one-household-per-user enforced via unique `household_memberships.user_id`.
|
||||||
|
- Added `is_household_shared` to `product_inventory` with default `False` so sharing remains per-row opt-in.
|
||||||
|
- Migration enforces ownership in two phases: nullable + backfill to bootstrap admin, then non-null constraints on all owned tables.
|
||||||
|
- Correction: migration 4b7d2e9f1c3a applies a two-step ownership rollout (nullable user_id, bootstrap+backfill, then NOT NULL on owned tables).
|
||||||
|
- Centralized tenant authorization in `innercontext/api/authz.py` and exposed wrappers in `api/utils.py` so routers can move from global `get_or_404` to scoped helpers.
|
||||||
3
.sisyphus/notepads/multi-user-authelia-oidc/issues.md
Normal file
3
.sisyphus/notepads/multi-user-authelia-oidc/issues.md
Normal file
|
|
@ -0,0 +1,3 @@
|
||||||
|
- Full backend pytest currently has pre-existing failures unrelated to this task scope (5 failing tests in routines/skincare helpers after schema changes in this branch context).
|
||||||
|
- Existing historical migration executes , which breaks full SQLite-from-base upgrades; T2 QA used a synthetic DB pinned at revision to validate the new migration behavior in isolation.
|
||||||
|
- Correction: historical migration 7c91e4b2af38 runs DROP TYPE IF EXISTS pricetier, which breaks SQLite full-chain upgrades; T2 evidence therefore uses a synthetic DB pinned to revision 9f3a2c1b4d5e for isolated migration validation.
|
||||||
15
.sisyphus/notepads/multi-user-authelia-oidc/learnings.md
Normal file
15
.sisyphus/notepads/multi-user-authelia-oidc/learnings.md
Normal file
|
|
@ -0,0 +1,15 @@
|
||||||
|
- For ownership rollout without API auth wiring, `user_id` columns can be added as nullable to avoid breaking existing write paths and tests.
|
||||||
|
- Alembic is needed for SQLite-safe ownership column/FK addition and later non-null enforcement across legacy tables.
|
||||||
|
- Correction: Alembic batch_alter_table is required for SQLite-safe ownership column/FK addition and non-null enforcement across legacy tables.
|
||||||
|
- New tenant helpers should keep unauthorized lookups indistinguishable from missing rows by raising `404` with model-not-found detail.
|
||||||
|
- Product visibility and inventory access are separate checks: household-shared inventory can grant view access without granting update rights.
|
||||||
|
- Products should be visible when user is owner, admin, or in the same household as at least one household-shared inventory row; inventory payloads must still be filtered to shared rows only for non-owners.
|
||||||
|
- Shared inventory update rules differ from create/delete: household members in the same household can PATCH shared rows, but POST/DELETE inventory stays owner/admin only.
|
||||||
|
- Product summary ownership should use Product.user_id (is_owned) rather than active inventory presence, so shared products render as accessible-but-not-owned.
|
||||||
|
- SvelteKit can keep PKCE server-only by storing the verifier/state in a short-lived encrypted HTTP-only cookie and storing the refreshed app session in a separate encrypted HTTP-only cookie.
|
||||||
|
- `handleFetch` is enough to attach bearer tokens for server loads/actions that hit `PUBLIC_API_BASE`, but browser-direct `$lib/api` calls to `/api` still need follow-up proxy/auth plumbing outside this task.
|
||||||
|
- 2026-03-12 T6: Domain routers now enforce per-user ownership by default with explicit `?user_id=` admin override in profile/health/routines/skincare/ai-logs; routine suggestion product pool is constrained to owned+household-shared visibility and uses current user profile context.
|
||||||
|
- 2026-03-12 T6: QA evidence generated at `.sisyphus/evidence/task-T6-domain-tenancy.txt` and `.sisyphus/evidence/task-T6-routine-scope.txt` with passing scenarios.
|
||||||
|
- 2026-03-12 T9: Admin household management can stay backend-only by listing synced local `users` plus current membership state, creating bare `households`, and handling assign/move/remove as explicit membership operations.
|
||||||
|
- 2026-03-12 T9: Unsynced identities should fail assignment via local `User` lookup rather than implicit creation, keeping Authelia as the only identity source and preserving the v1 one-household-per-user rule.
|
||||||
|
- 2026-03-12 T8: Server-side frontend API helpers should call `PUBLIC_API_BASE` directly with the access token from `event.locals.session`; same-origin SvelteKit endpoints are still the right bridge for browser-only interactions like AI modals and inline PATCHes.
|
||||||
3
.sisyphus/notepads/multi-user-authelia-oidc/problems.md
Normal file
3
.sisyphus/notepads/multi-user-authelia-oidc/problems.md
Normal file
|
|
@ -0,0 +1,3 @@
|
||||||
|
- Pending follow-up migration task is required to materialize new models/columns in PostgreSQL schema.
|
||||||
|
- End-to-end SQLite from base revision remains blocked by pre-existing non-SQLite-safe migration logic () outside T2 scope.
|
||||||
|
- Correction: full SQLite alembic upgrade head from base is still blocked by pre-existing DROP TYPE usage in migration 7c91e4b2af38 (outside T2 scope).
|
||||||
604
.sisyphus/plans/multi-user-authelia-oidc.md
Normal file
604
.sisyphus/plans/multi-user-authelia-oidc.md
Normal file
|
|
@ -0,0 +1,604 @@
|
||||||
|
# Multi-User Support with Authelia OIDC
|
||||||
|
|
||||||
|
## TL;DR
|
||||||
|
> **Summary**: Convert the monorepo from a single-user personal system into a multi-user application authenticated by Authelia OIDC, with SvelteKit owning the login/session flow and FastAPI enforcing row-level ownership and household-scoped inventory sharing.
|
||||||
|
> **Deliverables**:
|
||||||
|
> - OIDC login/logout/session flow in SvelteKit
|
||||||
|
> - FastAPI token validation, current-user resolution, and authorization helpers
|
||||||
|
> - New local identity/household schema plus ownership migrations for existing data
|
||||||
|
> - Household-shared inventory support with owner/admin product controls
|
||||||
|
> - Updated infra, CI, and verification coverage for the new auth model
|
||||||
|
> **Effort**: XL
|
||||||
|
> **Parallel**: YES - 3 waves
|
||||||
|
> **Critical Path**: T1 -> T2 -> T3 -> T4 -> T5/T6 -> T7/T8 -> T11
|
||||||
|
|
||||||
|
## Context
|
||||||
|
### Original Request
|
||||||
|
Add multi-user support with login handled by Authelia using OpenID Connect.
|
||||||
|
|
||||||
|
### Interview Summary
|
||||||
|
- Auth model: app-managed OIDC with SvelteKit-owned session handling; FastAPI acts as the resource server.
|
||||||
|
- Roles: `admin` and `member`; admins can manage member data and household memberships, but v1 excludes impersonation and a full user-management console.
|
||||||
|
- Ownership model: records are user-owned by default; `products` stay user-owned in v1.
|
||||||
|
- Sharing exception: product inventory may be shared among members of the same household; shared household members may view and update inventory entries, but only the product owner or an admin may edit/delete the underlying product.
|
||||||
|
- Rollout: retrofit the existing application in one implementation plan rather than staging auth separately.
|
||||||
|
- Identity source: Authelia remains the source of truth; no in-app signup/provisioning UI in v1.
|
||||||
|
- Verification preference: do not add a permanent frontend test suite in this pass; still require backend tests plus agent-executed QA scenarios.
|
||||||
|
|
||||||
|
### Metis Review (gaps addressed)
|
||||||
|
- Made household sharing explicit with a local `households` + `household_memberships` model instead of overloading OIDC groups.
|
||||||
|
- Added a deterministic legacy-data backfill step so existing single-user records are assigned to the first configured admin identity during migration.
|
||||||
|
- Called out `llm_context.py`, helper functions like `get_or_404()`, and all row-fetching routes as mandatory scoping points so no single-user path survives.
|
||||||
|
- Chose JWT access-token validation via Authelia JWKS for FastAPI, with SvelteKit calling `userinfo` to hydrate the app session and local user record.
|
||||||
|
- Kept browser QA agent-executed and out of repo while still requiring backend auth tests and CI enablement.
|
||||||
|
|
||||||
|
## Work Objectives
|
||||||
|
### Core Objective
|
||||||
|
Implement a secure, decision-complete multi-user architecture that uses Authelia OIDC for authentication, local app users/households for authorization, row ownership across existing data models, and household-scoped inventory sharing without broadening scope into a full account-management product.
|
||||||
|
|
||||||
|
### Deliverables
|
||||||
|
- Backend identity/auth models for local users, households, memberships, and role mapping.
|
||||||
|
- Alembic migration/backfill converting all existing domain data to owned records.
|
||||||
|
- FastAPI auth dependencies, token validation, and authorization utilities.
|
||||||
|
- Retrofitted API routes and LLM context builders that enforce ownership.
|
||||||
|
- SvelteKit login, callback, logout, refresh, and protected-route behavior.
|
||||||
|
- Auth-aware API access from frontend server actions and protected page loads.
|
||||||
|
- Admin-only backend endpoints for household membership management without a UI console.
|
||||||
|
- nginx, deploy, CI, and environment updates needed for OIDC rollout.
|
||||||
|
|
||||||
|
### Definition of Done (verifiable conditions with commands)
|
||||||
|
- `cd backend && uv run pytest`
|
||||||
|
- `cd backend && uv run ruff check .`
|
||||||
|
- `cd frontend && pnpm check`
|
||||||
|
- `cd frontend && pnpm lint`
|
||||||
|
- `cd frontend && pnpm build`
|
||||||
|
- `cd backend && uv run python -c "import json; from main import app; print(json.dumps(app.openapi())[:200])"`
|
||||||
|
|
||||||
|
### Must Have
|
||||||
|
- OIDC Authorization Code flow with PKCE, server-handled callback, HTTP-only app session cookie, refresh-token renewal, and logout.
|
||||||
|
- FastAPI bearer-token validation against Authelia JWKS; no trusted identity headers between app tiers.
|
||||||
|
- Local `users`, `households`, and `household_memberships` tables keyed by `issuer + sub` rather than email.
|
||||||
|
- `user_id` ownership enforcement across profile, health, routines, skincare, AI logs, and products.
|
||||||
|
- Household inventory-sharing model that permits view/update of shared inventory by household members while preserving owner/admin control of product records.
|
||||||
|
- Deterministic backfill of legacy records to a configured bootstrap admin identity.
|
||||||
|
- Admin/member authorization rules enforced in backend dependencies and mirrored in frontend navigation/controls.
|
||||||
|
- Backend auth and authorization tests, plus CI job enablement for those tests.
|
||||||
|
|
||||||
|
### Must NOT Have (guardrails, AI slop patterns, scope boundaries)
|
||||||
|
- No proxy-header trust model between SvelteKit and FastAPI.
|
||||||
|
- No in-app signup, password reset, email verification, impersonation, or full user-management console.
|
||||||
|
- No multi-household membership per user in v1.
|
||||||
|
- No global shared product catalog refactor in this pass.
|
||||||
|
- No audit-log productization, notification system, or support tooling.
|
||||||
|
- No permanent Playwright/Vitest suite added to the repo in this pass.
|
||||||
|
|
||||||
|
## Verification Strategy
|
||||||
|
> ZERO HUMAN INTERVENTION - all verification is agent-executed.
|
||||||
|
- Test decision: tests-after using existing backend `pytest` + `TestClient`; no new committed frontend suite, but include agent-executed browser QA and curl-based verification.
|
||||||
|
- QA policy: every task includes happy-path and failure/edge-case scenarios with exact commands or browser actions.
|
||||||
|
- Evidence: `.sisyphus/evidence/task-{N}-{slug}.{ext}`
|
||||||
|
|
||||||
|
## Execution Strategy
|
||||||
|
### Parallel Execution Waves
|
||||||
|
> Target: 5-8 tasks per wave. <3 per wave (except final) = under-splitting.
|
||||||
|
> Extract shared dependencies as Wave-1 tasks for max parallelism.
|
||||||
|
|
||||||
|
Wave 1: T1 identity models, T2 ownership migration, T3 backend token validation, T4 tenant-aware authorization helpers
|
||||||
|
|
||||||
|
Wave 2: T5 product/inventory authorization retrofit, T6 remaining domain scoping retrofit, T7 SvelteKit auth/session flow, T8 frontend auth-aware plumbing and shell behavior
|
||||||
|
|
||||||
|
Wave 3: T9 admin household-management endpoints, T10 infra/env/CI/deploy updates, T11 backend auth regression coverage and release verification
|
||||||
|
|
||||||
|
### Dependency Matrix (full, all tasks)
|
||||||
|
| Task | Depends On | Blocks |
|
||||||
|
| --- | --- | --- |
|
||||||
|
| T1 | - | T2, T3, T4, T9 |
|
||||||
|
| T2 | T1 | T5, T6, T11 |
|
||||||
|
| T3 | T1 | T4, T5, T6, T7, T8, T9, T11 |
|
||||||
|
| T4 | T1, T3 | T5, T6, T9 |
|
||||||
|
| T5 | T2, T3, T4 | T11 |
|
||||||
|
| T6 | T2, T3, T4 | T11 |
|
||||||
|
| T7 | T3 | T8, T10, T11 |
|
||||||
|
| T8 | T7 | T11 |
|
||||||
|
| T9 | T1, T2, T3, T4 | T11 |
|
||||||
|
| T10 | T3, T7 | T11 |
|
||||||
|
| T11 | T2, T3, T4, T5, T6, T7, T8, T9, T10 | Final verification |
|
||||||
|
|
||||||
|
### Agent Dispatch Summary (wave -> task count -> categories)
|
||||||
|
- Wave 1 -> 4 tasks -> `deep`, `unspecified-high`
|
||||||
|
- Wave 2 -> 4 tasks -> `deep`, `unspecified-high`, `writing`
|
||||||
|
- Wave 3 -> 3 tasks -> `unspecified-high`, `writing`, `deep`
|
||||||
|
|
||||||
|
## TODOs
|
||||||
|
> Implementation + Test = ONE task. Never separate.
|
||||||
|
> EVERY task MUST have: Agent Profile + Parallelization + QA Scenarios.
|
||||||
|
|
||||||
|
- [x] T1. Add local identity, role, household, and sharing models
|
||||||
|
|
||||||
|
**What to do**: Add a new backend model module for `User`, `Household`, and `HouseholdMembership`; extend existing domain models with ownership fields; add a compact role enum (`admin`, `member`) and a household-membership role enum (`owner`, `member`). Use `issuer + subject` as the immutable OIDC identity key, enforce at most one household membership per user in v1, and add `is_household_shared: bool = False` to `ProductInventory` so sharing is opt-in per inventory row rather than automatic for an entire household.
|
||||||
|
**Must NOT do**: Do not key users by email, do not introduce multi-household membership, do not split `Product` into catalog vs overlay tables in this pass, and do not add frontend management UI here.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `deep` - Reason: cross-cutting schema design with downstream auth and authorization consequences
|
||||||
|
- Skills: `[]` - Existing backend conventions are the main source of truth
|
||||||
|
- Omitted: `svelte-code-writer` - No Svelte files belong in this task
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: NO | Wave 1 | Blocks: T2, T3, T4, T9 | Blocked By: -
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/innercontext/models/profile.py:13` - Simple SQLModel table with UUID PK and timestamp conventions to follow for user-owned profile data.
|
||||||
|
- Pattern: `backend/innercontext/models/product.py:138` - Main table-model style, JSON-column usage, and `updated_at` pattern.
|
||||||
|
- Pattern: `backend/innercontext/models/product.py:353` - Existing `ProductInventory` table to extend with ownership and sharing fields.
|
||||||
|
- Pattern: `backend/innercontext/models/__init__.py:1` - Export surface that must include every new model/type.
|
||||||
|
- API/Type: `backend/innercontext/models/enums.py` - Existing enum location; add role enums here unless a dedicated auth model module makes more sense.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] `backend/innercontext/models/` defines `User`, `Household`, and `HouseholdMembership` with UUID PKs, timestamps, uniqueness on `(oidc_issuer, oidc_subject)`, and one-household-per-user enforcement.
|
||||||
|
- [ ] `Product`, `ProductInventory`, `UserProfile`, `MedicationEntry`, `MedicationUsage`, `LabResult`, `Routine`, `RoutineStep`, `GroomingSchedule`, `SkinConditionSnapshot`, and `AICallLog` each expose an ownership field (`user_id`) in model code, with `ProductInventory` also exposing `is_household_shared`.
|
||||||
|
- [ ] `innercontext.models` re-exports the new auth/household types so metadata loading and imports continue to work.
|
||||||
|
- [ ] `cd backend && uv run python -c "import innercontext.models as m; print(all(hasattr(m, name) for name in ['User','Household','HouseholdMembership']))"` prints `True`.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Identity models load into SQLModel metadata
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run python -c "import innercontext.models; from sqlmodel import SQLModel; print(sorted(t.name for t in SQLModel.metadata.sorted_tables if t.name in {'users','households','household_memberships'}))" > ../.sisyphus/evidence/task-T1-identity-models.txt`
|
||||||
|
Expected: Evidence file lists `['household_memberships', 'households', 'users']`
|
||||||
|
Evidence: .sisyphus/evidence/task-T1-identity-models.txt
|
||||||
|
|
||||||
|
Scenario: Product inventory sharing stays opt-in
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run python -c "from innercontext.models.product import ProductInventory; f=ProductInventory.model_fields['is_household_shared']; print(f.default)" > ../.sisyphus/evidence/task-T1-sharing-default.txt`
|
||||||
|
Expected: Evidence file contains `False`
|
||||||
|
Evidence: .sisyphus/evidence/task-T1-sharing-default.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(auth): add local user and household models` | Files: `backend/innercontext/models/*`
|
||||||
|
|
||||||
|
- [x] T2. Add Alembic migration and bootstrap backfill for legacy single-user data
|
||||||
|
|
||||||
|
**What to do**: Create an Alembic revision that creates `users`, `households`, and `household_memberships`, adds `user_id` ownership columns and related foreign keys/indexes to all owned tables, and adds `is_household_shared` to `product_inventory`. Use a two-step migration: add nullable columns, create/bootstrap a local admin user + default household from environment variables, backfill every existing row to that bootstrap user, then enforce non-null ownership constraints. Use env names `BOOTSTRAP_ADMIN_OIDC_ISSUER`, `BOOTSTRAP_ADMIN_OIDC_SUB`, `BOOTSTRAP_ADMIN_EMAIL`, `BOOTSTRAP_ADMIN_NAME`, and `BOOTSTRAP_HOUSEHOLD_NAME`; abort the migration with a clear error if legacy data exists and the required issuer/sub values are missing.
|
||||||
|
**Must NOT do**: Do not assign ownership based on email matching, do not silently create random bootstrap identities, and do not leave owned tables nullable after the migration completes.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `deep` - Reason: schema migration, backfill, and irreversible data-shape change
|
||||||
|
- Skills: `[]` - Use existing Alembic patterns from the repo
|
||||||
|
- Omitted: `git-master` - Commit strategy is already prescribed here
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: NO | Wave 1 | Blocks: T5, T6, T11 | Blocked By: T1
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/alembic/versions/` - Existing migration naming/layout conventions to follow.
|
||||||
|
- Pattern: `backend/innercontext/models/product.py:180` - Timestamp/nullability expectations that migrated columns must preserve.
|
||||||
|
- Pattern: `backend/db.py:17` - Metadata creation path; migration must leave runtime startup compatible.
|
||||||
|
- API/Type: `backend/innercontext/models/profile.py:13` - Existing singleton-style table that must become owned data.
|
||||||
|
- API/Type: `backend/innercontext/models/product.py:353` - Inventory table receiving the sharing flag.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] A new Alembic revision exists under `backend/alembic/versions/` creating auth/household tables and ownership columns/indexes/foreign keys.
|
||||||
|
- [ ] The migration backfills all existing owned rows to the bootstrap admin user and creates that user's default household + owner membership.
|
||||||
|
- [ ] The migration aborts with a readable exception if legacy data exists and `BOOTSTRAP_ADMIN_OIDC_ISSUER` or `BOOTSTRAP_ADMIN_OIDC_SUB` is absent.
|
||||||
|
- [ ] Owned tables end with non-null `user_id` constraints after upgrade.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Migration upgrade succeeds with bootstrap identity configured
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Create a disposable DB URL (for example `sqlite:///../.sisyphus/evidence/task-T2-upgrade.sqlite`), then run `cd backend && DATABASE_URL=sqlite:///../.sisyphus/evidence/task-T2-upgrade.sqlite BOOTSTRAP_ADMIN_OIDC_ISSUER=https://auth.example.test BOOTSTRAP_ADMIN_OIDC_SUB=legacy-admin BOOTSTRAP_ADMIN_EMAIL=owner@example.test BOOTSTRAP_ADMIN_NAME='Legacy Owner' BOOTSTRAP_HOUSEHOLD_NAME='Default Household' uv run alembic upgrade head > ../.sisyphus/evidence/task-T2-migration-upgrade.txt`
|
||||||
|
Expected: Command exits 0 and evidence file shows Alembic reached `head`
|
||||||
|
Evidence: .sisyphus/evidence/task-T2-migration-upgrade.txt
|
||||||
|
|
||||||
|
Scenario: Migration fails fast when bootstrap identity is missing for legacy data
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Seed a disposable SQLite DB with one legacy row using the pre-migration schema, then run `cd backend && DATABASE_URL=sqlite:///../.sisyphus/evidence/task-T2-missing-bootstrap.sqlite uv run alembic upgrade head 2> ../.sisyphus/evidence/task-T2-migration-missing-bootstrap.txt`
|
||||||
|
Expected: Upgrade exits non-zero and evidence contains a message naming both missing bootstrap env vars
|
||||||
|
Evidence: .sisyphus/evidence/task-T2-migration-missing-bootstrap.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(db): backfill tenant ownership for existing records` | Files: `backend/alembic/versions/*`, `backend/innercontext/models/*`
|
||||||
|
|
||||||
|
- [x] T3. Implement FastAPI token validation, user sync, and current-user dependencies
|
||||||
|
|
||||||
|
**What to do**: Add backend auth modules that validate Authelia JWT access tokens via JWKS with cached key material, enforce issuer/audience/expiry checks, map role groups to local roles, and expose dependencies like `get_current_user()` and `require_admin()`. Create protected auth endpoints for session sync and self introspection (for example `/auth/session/sync` and `/auth/me`) so SvelteKit can exchange token-derived/userinfo-derived identity details for a local `User` row and current app profile. Use env/config values for issuer, JWKS URL/discovery URL, client ID, and group names instead of hard-coding them.
|
||||||
|
**Must NOT do**: Do not trust `X-Forwarded-User`-style headers, do not skip signature validation, do not derive role from email domain, and do not make backend routes public except health-check.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `unspecified-high` - Reason: focused backend auth implementation with security-sensitive logic
|
||||||
|
- Skills: `[]` - No project skill is better than direct backend work here
|
||||||
|
- Omitted: `svelte-code-writer` - No Svelte components involved
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: NO | Wave 1 | Blocks: T4, T5, T6, T7, T8, T9, T11 | Blocked By: T1
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/main.py:37` - Current FastAPI app construction and router registration point.
|
||||||
|
- Pattern: `backend/db.py:12` - Session dependency shape that auth dependencies must compose with.
|
||||||
|
- Pattern: `backend/innercontext/api/profile.py:27` - Router/dependency style used throughout the API.
|
||||||
|
- External: `https://www.authelia.com/configuration/identity-providers/openid-connect/provider/` - OIDC provider/discovery and JWKS behavior.
|
||||||
|
- External: `https://www.authelia.com/integration/openid-connect/openid-connect-1.0-claims/` - Claims and userinfo behavior; use `issuer + sub` as identity key.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] A backend auth module validates bearer tokens against Authelia JWKS with issuer/audience checks and cached key refresh.
|
||||||
|
- [ ] Protected dependencies expose a normalized current user object with local `user_id`, role, and household membership information.
|
||||||
|
- [ ] Backend includes protected auth sync/introspection endpoints used by SvelteKit to upsert local users from OIDC identity data.
|
||||||
|
- [ ] Unauthenticated access to owned API routes returns `401`; authenticated access with a valid token reaches router logic.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Valid bearer token resolves a current user
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_auth.py -k sync > ../.sisyphus/evidence/task-T3-auth-sync.txt`
|
||||||
|
Expected: Auth sync/introspection tests pass and evidence includes the protected auth endpoint names
|
||||||
|
Evidence: .sisyphus/evidence/task-T3-auth-sync.txt
|
||||||
|
|
||||||
|
Scenario: Missing or invalid bearer token is rejected
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_auth.py -k unauthorized > ../.sisyphus/evidence/task-T3-auth-unauthorized.txt`
|
||||||
|
Expected: Tests pass and evidence shows `401` expectations
|
||||||
|
Evidence: .sisyphus/evidence/task-T3-auth-unauthorized.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(auth): validate Authelia tokens in FastAPI` | Files: `backend/main.py`, `backend/innercontext/auth.py`, `backend/innercontext/api/auth*.py`
|
||||||
|
|
||||||
|
- [x] T4. Centralize tenant-aware fetch helpers and authorization predicates
|
||||||
|
|
||||||
|
**What to do**: Replace single-user helper assumptions with reusable authorization helpers that every router can call. Add tenant-aware helpers for owned lookup, admin override, same-household checks, and household-shared inventory visibility/update rules. Keep `get_session()` unchanged, but add helpers/dependencies that make it difficult for routers to accidentally query global rows. Update or supersede `get_or_404()` with helpers that scope by `user_id` and return `404` for unauthorized record lookups unless the route intentionally needs `403`.
|
||||||
|
**Must NOT do**: Do not leave routers performing raw `session.get()` on owned models, do not duplicate household-sharing logic in every route, and do not use admin bypasses that skip existence checks.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `deep` - Reason: authorization rules must become the shared execution path for many routers
|
||||||
|
- Skills: `[]` - This is backend architecture work, not skill-driven tooling
|
||||||
|
- Omitted: `frontend-design` - No UI work belongs here
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: NO | Wave 1 | Blocks: T5, T6, T9 | Blocked By: T1, T3
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/innercontext/api/utils.py:9` - Existing naive `get_or_404()` helper that must no longer be used for owned records.
|
||||||
|
- Pattern: `backend/innercontext/api/products.py:934` - Current direct object fetch/update/delete route pattern to replace.
|
||||||
|
- Pattern: `backend/innercontext/api/inventory.py:14` - Inventory routes that currently expose rows globally.
|
||||||
|
- Pattern: `backend/innercontext/api/health.py:141` - Representative list/get/update/delete health routes requiring shared helpers.
|
||||||
|
- Pattern: `backend/innercontext/api/routines.py:674` - Another high-volume router that must consume the same authz utilities.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Backend provides shared helper/dependency functions for owned lookups, admin checks, same-household checks, and shared-inventory updates.
|
||||||
|
- [ ] `get_or_404()` is either retired for owned data or wrapped so no owned router path still uses the unscoped helper directly.
|
||||||
|
- [ ] Shared inventory authorization distinguishes product ownership from inventory update rights.
|
||||||
|
- [ ] Helper tests cover owner access, admin override, same-household shared inventory access, and cross-household denial.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Authorization helpers allow owner/admin/household-shared access correctly
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_authz.py -k 'owner or admin or household' > ../.sisyphus/evidence/task-T4-authz-happy.txt`
|
||||||
|
Expected: Tests pass and evidence includes owner/admin/household cases
|
||||||
|
Evidence: .sisyphus/evidence/task-T4-authz-happy.txt
|
||||||
|
|
||||||
|
Scenario: Cross-household access is denied without leaking row existence
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_authz.py -k denied > ../.sisyphus/evidence/task-T4-authz-denied.txt`
|
||||||
|
Expected: Tests pass and evidence shows `404` or `403` assertions exactly where specified by the helper contract
|
||||||
|
Evidence: .sisyphus/evidence/task-T4-authz-denied.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `refactor(api): centralize tenant authorization helpers` | Files: `backend/innercontext/api/utils.py`, `backend/innercontext/api/authz.py`, router call sites
|
||||||
|
|
||||||
|
- [x] T5. Retrofit products and inventory endpoints for owned access plus household sharing
|
||||||
|
|
||||||
|
**What to do**: Update `products` and `inventory` APIs so product visibility is `owned OR household-visible-via-shared-inventory OR admin`, while product mutation remains `owner OR admin`. Keep `Product` user-owned. For household members, allow `GET` on shared products/inventory rows and `PATCH` on shared inventory rows, but keep `POST /products`, `PATCH /products/{id}`, `DELETE /products/{id}`, `POST /products/{id}/inventory`, and `DELETE /inventory/{id}` restricted to owner/admin. Reuse the existing `ProductListItem.is_owned` field so shared-but-not-owned products are clearly marked in summaries. Ensure suggestion and summary endpoints only use products accessible to the current user.
|
||||||
|
**Must NOT do**: Do not expose non-shared inventory across a household, do not let household members edit `personal_tolerance_notes`, and do not return global product lists anymore.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `deep` - Reason: most nuanced authorization rules live in product and inventory flows
|
||||||
|
- Skills: `[]` - Backend logic and existing product patterns are sufficient
|
||||||
|
- Omitted: `frontend-design` - No UI polish belongs here
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: YES | Wave 2 | Blocks: T11 | Blocked By: T2, T3, T4
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/innercontext/api/products.py:605` - List route currently returning global products.
|
||||||
|
- Pattern: `backend/innercontext/api/products.py:844` - Summary route already exposes `is_owned`; extend rather than replacing it.
|
||||||
|
- Pattern: `backend/innercontext/api/products.py:934` - Detail/update/delete routes that currently use direct lookup.
|
||||||
|
- Pattern: `backend/innercontext/api/products.py:977` - Product inventory list/create routes.
|
||||||
|
- Pattern: `backend/innercontext/api/inventory.py:14` - Direct inventory get/update/delete routes that currently bypass ownership.
|
||||||
|
- API/Type: `backend/innercontext/models/product.py:353` - Inventory model fields involved in household sharing.
|
||||||
|
- Test: `backend/tests/test_products.py:38` - Existing CRUD/filter test style to extend for authz cases.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Product list/detail/summary/suggest endpoints only return products accessible to the current user.
|
||||||
|
- [ ] Shared household members can `GET` shared products/inventory and `PATCH` shared inventory rows, but cannot mutate product records or create/delete another user's inventory rows.
|
||||||
|
- [ ] Product summaries preserve `is_owned` semantics for shared products.
|
||||||
|
- [ ] Product/inventory tests cover owner, admin, same-household shared member, and different-household member cases.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Household member can view a shared product and update its shared inventory row
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_products_auth.py -k 'shared_inventory_update or shared_product_visible' > ../.sisyphus/evidence/task-T5-product-sharing.txt`
|
||||||
|
Expected: Tests pass and evidence shows `200` assertions for shared view/update cases
|
||||||
|
Evidence: .sisyphus/evidence/task-T5-product-sharing.txt
|
||||||
|
|
||||||
|
Scenario: Household member cannot edit or delete another user's product
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_products_auth.py -k 'cannot_edit_shared_product or cannot_delete_shared_product' > ../.sisyphus/evidence/task-T5-product-denied.txt`
|
||||||
|
Expected: Tests pass and evidence shows `403` or `404` assertions matching the route contract
|
||||||
|
Evidence: .sisyphus/evidence/task-T5-product-denied.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(api): scope products and inventory by owner and household` | Files: `backend/innercontext/api/products.py`, `backend/innercontext/api/inventory.py`, related tests
|
||||||
|
|
||||||
|
- [x] T6. Retrofit remaining domain routes, LLM context, and jobs for per-user ownership
|
||||||
|
|
||||||
|
**What to do**: Update profile, health, routines, skincare, AI log, and LLM-context code so every query is user-scoped by default and admin override is explicit. `UserProfile` becomes one-per-user rather than singleton; `build_user_profile_context()` and product-context builders must accept the current user and only include accessible data. Routine suggestion/batch flows must use the current user's profile plus products visible under the owned/shared rules from T5. Ensure background pricing/job paths preserve `user_id` on products and logs, and that list endpoints never aggregate cross-user data for non-admins.
|
||||||
|
**Must NOT do**: Do not keep any `select(Model)` query unfiltered on an owned model, do not keep singleton profile lookups, and do not leak other users' AI logs or health data through helper functions.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `deep` - Reason: many routers and helper layers need consistent tenancy retrofits
|
||||||
|
- Skills: `[]` - Backend cross-module work only
|
||||||
|
- Omitted: `svelte-code-writer` - No Svelte component work in this task
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: YES | Wave 2 | Blocks: T11 | Blocked By: T2, T3, T4
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/innercontext/api/profile.py:27` - Current singleton profile route using `get_user_profile(session)`.
|
||||||
|
- Pattern: `backend/innercontext/api/llm_context.py:10` - Single-user helper that currently selects the most recent profile globally.
|
||||||
|
- Pattern: `backend/innercontext/api/health.py:141` - Medication and lab-result CRUD/list route layout.
|
||||||
|
- Pattern: `backend/innercontext/api/routines.py:674` - Routine list/create/suggest entry points that need scoped product/profile data.
|
||||||
|
- Pattern: `backend/innercontext/api/skincare.py:222` - Snapshot list/get/update/delete route structure.
|
||||||
|
- Pattern: `backend/innercontext/api/ai_logs.py:46` - AI-log exposure that must become owned/admin-only.
|
||||||
|
- Pattern: `backend/innercontext/services/pricing_jobs.py` - Background queue path that must preserve product ownership.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Every non-admin router outside products/inventory scopes owned data by `user_id` before returning or mutating rows.
|
||||||
|
- [ ] `GET /profile` and `PATCH /profile` operate on the current user's profile, not the newest global profile.
|
||||||
|
- [ ] Routine suggestion and batch suggestion flows use only the current user's profile plus accessible products.
|
||||||
|
- [ ] AI logs are owned/admin-only, and background job/log creation stores `user_id` when applicable.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Member only sees their own health, routine, profile, skin, and AI-log data
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_tenancy_domains.py -k 'profile or health or routines or skincare or ai_logs' > ../.sisyphus/evidence/task-T6-domain-tenancy.txt`
|
||||||
|
Expected: Tests pass and evidence shows only owned/admin-allowed access patterns
|
||||||
|
Evidence: .sisyphus/evidence/task-T6-domain-tenancy.txt
|
||||||
|
|
||||||
|
Scenario: Routine suggestions ignore another user's products and profile
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_routines_auth.py -k suggest > ../.sisyphus/evidence/task-T6-routine-scope.txt`
|
||||||
|
Expected: Tests pass and evidence shows suggestion inputs are scoped to the authenticated user plus shared inventory visibility rules
|
||||||
|
Evidence: .sisyphus/evidence/task-T6-routine-scope.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(api): enforce ownership across health routines and profile flows` | Files: `backend/innercontext/api/profile.py`, `backend/innercontext/api/health.py`, `backend/innercontext/api/routines.py`, `backend/innercontext/api/skincare.py`, `backend/innercontext/api/ai_logs.py`, `backend/innercontext/api/llm_context.py`
|
||||||
|
|
||||||
|
- [x] T7. Implement SvelteKit OIDC login, callback, logout, refresh, and protected-session handling
|
||||||
|
|
||||||
|
**What to do**: Add server-only auth utilities under `frontend/src/lib/server/` and implement `Authorization Code + PKCE` in SvelteKit using Authelia discovery/token/userinfo endpoints. Create `/auth/login`, `/auth/callback`, and `/auth/logout` server routes. Extend `hooks.server.ts` to decrypt/load the app session, refresh the access token when it is near expiry, populate `event.locals.user` and `event.locals.session`, and redirect unauthenticated requests on all application routes except `/auth/*` and static assets. Use an encrypted HTTP-only cookie named `innercontext_session` with `sameSite=lax`, `secure` in production, and a 32-byte secret from private env.
|
||||||
|
**Must NOT do**: Do not store access or refresh tokens in `localStorage`, do not expose client secrets via `$env/static/public`, and do not protect routes with client-only guards.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `unspecified-high` - Reason: server-side SvelteKit auth flow with cookies, hooks, and redirects
|
||||||
|
- Skills: [`svelte-code-writer`] - Required for editing SvelteKit auth and route modules cleanly
|
||||||
|
- Omitted: `frontend-design` - This task is auth/session behavior, not visual redesign
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: YES | Wave 2 | Blocks: T8, T10, T11 | Blocked By: T3
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `frontend/src/hooks.server.ts:1` - Current global request hook; auth must compose with existing Paraglide middleware rather than replacing it.
|
||||||
|
- Pattern: `frontend/src/app.d.ts:3` - Add typed `App.Locals`/`PageData` session fields here.
|
||||||
|
- Pattern: `frontend/src/routes/+layout.svelte:30` - App shell/navigation that will consume authenticated user state later.
|
||||||
|
- Pattern: `frontend/src/routes/products/suggest/+page.server.ts:4` - Existing SvelteKit server action style using `fetch`.
|
||||||
|
- External: `https://www.authelia.com/configuration/identity-providers/openid-connect/clients/` - Client configuration expectations for auth code flow and PKCE.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] SvelteKit exposes login/callback/logout server routes that complete the OIDC flow against Authelia and create/destroy `innercontext_session`.
|
||||||
|
- [ ] `hooks.server.ts` populates `event.locals.user`/`event.locals.session`, refreshes tokens near expiry, and redirects unauthenticated users away from protected pages.
|
||||||
|
- [ ] The callback flow calls backend auth sync before treating the user as signed in.
|
||||||
|
- [ ] Session cookies are HTTP-only and sourced only from private env/config.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Login callback establishes an authenticated server session
|
||||||
|
Tool: Playwright
|
||||||
|
Steps: Navigate to `/products` while signed out, follow redirect to `/auth/login`, on the Authelia page fill the `Username` and `Password` fields using `E2E_AUTHELIA_USERNAME`/`E2E_AUTHELIA_PASSWORD`, submit the primary login button, wait for redirect back to the app, then save an accessibility snapshot to `.sisyphus/evidence/task-T7-login-flow.md`
|
||||||
|
Expected: Final URL is inside the app, the protected page renders, and the session cookie exists
|
||||||
|
Evidence: .sisyphus/evidence/task-T7-login-flow.md
|
||||||
|
|
||||||
|
Scenario: Expired or refresh-failed session redirects back to login
|
||||||
|
Tool: Playwright
|
||||||
|
Steps: Start from an authenticated session, replace the `innercontext_session` cookie with one containing an expired access token or invalidate the refresh endpoint in the browser session, reload `/products`, and save a snapshot to `.sisyphus/evidence/task-T7-refresh-failure.md`
|
||||||
|
Expected: The app clears the session cookie and redirects to `/auth/login`
|
||||||
|
Evidence: .sisyphus/evidence/task-T7-refresh-failure.md
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(frontend): add Authelia OIDC session flow` | Files: `frontend/src/hooks.server.ts`, `frontend/src/app.d.ts`, `frontend/src/lib/server/auth.ts`, `frontend/src/routes/auth/*`
|
||||||
|
|
||||||
|
- [x] T8. Refactor frontend data access, route guards, and shell state around the server session
|
||||||
|
|
||||||
|
**What to do**: Refactor frontend API access so protected backend calls always originate from SvelteKit server loads/actions/endpoints using the access token from `event.locals.session`. Convert browser-side direct `$lib/api` usage to server actions or same-origin SvelteKit endpoints, add a `+layout.server.ts` that exposes authenticated user data to the shell, and update `+layout.svelte` to show the current user role/name plus a logout action. Regenerate OpenAPI types if backend response models change and keep `$lib/types` as the canonical import surface.
|
||||||
|
**Must NOT do**: Do not keep browser-side bearer-token fetches, do not bypass the server session by calling backend APIs directly from components, and do not hardcode English auth labels without Paraglide message keys.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `unspecified-high` - Reason: SvelteKit route plumbing plus shell-state integration
|
||||||
|
- Skills: [`svelte-code-writer`] - Required because this task edits `.svelte` and SvelteKit route modules
|
||||||
|
- Omitted: `frontend-design` - Preserve the existing editorial shell instead of redesigning it
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: YES | Wave 2 | Blocks: T11 | Blocked By: T7
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `frontend/src/lib/api.ts:25` - Current request helper branching between browser and server; replace with session-aware server usage.
|
||||||
|
- Pattern: `frontend/src/routes/+layout.svelte:63` - Existing app shell where user state/logout should appear without breaking navigation.
|
||||||
|
- Pattern: `frontend/src/routes/+page.server.ts` - Representative server-load pattern already used throughout the app.
|
||||||
|
- Pattern: `frontend/src/routes/skin/new/+page.svelte` - Existing browser-side API import to eliminate or proxy through server logic.
|
||||||
|
- Pattern: `frontend/src/routes/routines/[id]/+page.svelte` - Another browser-side API import that must stop calling the backend directly.
|
||||||
|
- Pattern: `frontend/src/routes/products/suggest/+page.server.ts:4` - Server action pattern to reuse for auth-aware fetches.
|
||||||
|
- API/Type: `frontend/src/lib/types.ts` - Keep as the only frontend import surface after any `pnpm generate:api` run.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Protected backend calls in frontend code use the server session access token and no longer depend on browser token storage.
|
||||||
|
- [ ] Direct component-level `$lib/api` usage on protected paths is removed or wrapped behind same-origin server endpoints/actions.
|
||||||
|
- [ ] App shell receives authenticated user/session data from server load and exposes a logout affordance.
|
||||||
|
- [ ] `pnpm generate:api` is run if backend auth/API response changes require regenerated frontend types.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Authenticated user navigates protected pages and sees session-aware shell state
|
||||||
|
Tool: Playwright
|
||||||
|
Steps: Log in, visit `/`, `/products`, `/profile`, and `/routines`; capture an accessibility snapshot to `.sisyphus/evidence/task-T8-protected-nav.md`
|
||||||
|
Expected: Each page loads without redirect loops, and the shell shows the current user plus logout control
|
||||||
|
Evidence: .sisyphus/evidence/task-T8-protected-nav.md
|
||||||
|
|
||||||
|
Scenario: Unauthenticated browser access cannot hit protected data paths directly
|
||||||
|
Tool: Playwright
|
||||||
|
Steps: Start from a signed-out browser, open a page that previously imported `$lib/api` from a component, attempt the same interaction, capture console/network output to `.sisyphus/evidence/task-T8-signed-out-network.txt`
|
||||||
|
Expected: The app redirects or blocks cleanly without leaking backend JSON responses into the UI
|
||||||
|
Evidence: .sisyphus/evidence/task-T8-signed-out-network.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `refactor(frontend): route protected API access through server session` | Files: `frontend/src/lib/api.ts`, `frontend/src/routes/**/*.server.ts`, `frontend/src/routes/+layout.*`, selected `.svelte` files, `frontend/src/lib/types.ts`
|
||||||
|
|
||||||
|
- [x] T9. Add admin-only household management API without a frontend console
|
||||||
|
|
||||||
|
**What to do**: Add a small admin-only backend router for household administration so the app can support real household sharing without a management UI. Provide endpoints to list local users who have logged in, create a household, assign a user to a household, move a user between households, and remove a membership. Enforce the v1 rule that a user can belong to at most one household. Do not manage identity creation here; Authelia remains the identity source, and only locally synced users may be assigned. Non-bootstrap users should remain household-less until an admin assigns them.
|
||||||
|
**Must NOT do**: Do not add Svelte pages for household management, do not let non-admins call these endpoints, and do not allow membership assignment for users who have never authenticated into the app.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `unspecified-high` - Reason: contained backend admin surface with sensitive authorization logic
|
||||||
|
- Skills: `[]` - Backend conventions already exist in repo
|
||||||
|
- Omitted: `frontend-design` - Explicitly no console/UI in scope
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: YES | Wave 3 | Blocks: T11 | Blocked By: T1, T2, T3, T4
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/main.py:50` - Router registration area; add a dedicated admin router here.
|
||||||
|
- Pattern: `backend/innercontext/api/profile.py:41` - Simple patch/upsert route style for small admin mutation endpoints.
|
||||||
|
- Pattern: `backend/innercontext/api/utils.py:9` - Error-handling pattern to preserve with tenant-aware replacements.
|
||||||
|
- API/Type: `backend/innercontext/models/profile.py:13` - Example of owned record exposed without extra wrapper models.
|
||||||
|
- Test: `backend/tests/conftest.py:34` - Dependency-override style for admin/member API tests.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Backend exposes admin-only household endpoints for list/create/assign/move/remove operations.
|
||||||
|
- [ ] Membership moves preserve the one-household-per-user rule.
|
||||||
|
- [ ] Membership assignment only works for users already present in the local `users` table.
|
||||||
|
- [ ] Admin-route tests cover admin success, member denial, and attempted assignment of unsynced users.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Admin can create a household and assign a logged-in member
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_admin_households.py -k 'create_household or assign_member' > ../.sisyphus/evidence/task-T9-admin-households.txt`
|
||||||
|
Expected: Tests pass and evidence shows admin-only success cases
|
||||||
|
Evidence: .sisyphus/evidence/task-T9-admin-households.txt
|
||||||
|
|
||||||
|
Scenario: Member cannot manage households and unsynced users cannot be assigned
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest tests/test_admin_households.py -k 'forbidden or unsynced' > ../.sisyphus/evidence/task-T9-admin-households-denied.txt`
|
||||||
|
Expected: Tests pass and evidence shows `403`/validation failures for forbidden assignments
|
||||||
|
Evidence: .sisyphus/evidence/task-T9-admin-households-denied.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `feat(api): add admin household management endpoints` | Files: `backend/main.py`, `backend/innercontext/api/admin*.py`, related tests
|
||||||
|
|
||||||
|
- [x] T10. Update runtime configuration, validation scripts, deploy checks, and operator docs for OIDC
|
||||||
|
|
||||||
|
**What to do**: Update runtime configuration for both services so frontend and backend receive the new OIDC/session env vars at runtime, and document the exact Authelia client/server setup required. Keep nginx in a pure reverse-proxy role (no `auth_request`), but make sure forwarded host/proto information remains sufficient for callback URL generation. Extend `scripts/validate-env.sh` and deploy validation so missing auth env vars fail fast, and update `scripts/healthcheck.sh` plus `deploy.sh` health expectations because authenticated pages may now redirect to login instead of returning `200` for signed-out probes. Document bootstrap-admin env usage for the migration.
|
||||||
|
**Must NOT do**: Do not add proxy-level auth, do not require manual post-deploy DB edits, and do not leave deploy health checks assuming `/` must return `200` when the app intentionally redirects signed-out users.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `writing` - Reason: configuration, deployment, and operator-facing documentation dominate this task
|
||||||
|
- Skills: `[]` - Repo docs and service files are the governing references
|
||||||
|
- Omitted: `svelte-code-writer` - No Svelte component changes needed
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: YES | Wave 3 | Blocks: T11 | Blocked By: T3, T7
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `nginx/innercontext.conf:1` - Current reverse-proxy setup that must remain proxy-only.
|
||||||
|
- Pattern: `deploy.sh:313` - Service-wait and health-check functions to update for signed-out redirects and auth env validation.
|
||||||
|
- Pattern: `deploy.sh:331` - Backend/frontend health-check behavior currently assuming public app pages.
|
||||||
|
- Pattern: `scripts/validate-env.sh:57` - Existing required-env validation script to extend with OIDC/session/bootstrap keys.
|
||||||
|
- Pattern: `scripts/healthcheck.sh:10` - Current frontend health check that assumes `/` returns `200`.
|
||||||
|
- Pattern: `systemd/innercontext.service` - Backend runtime env injection point.
|
||||||
|
- Pattern: `systemd/innercontext-node.service` - Frontend runtime env injection point.
|
||||||
|
- Pattern: `docs/DEPLOYMENT.md` - Canonical operator runbook to update.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Backend and frontend runtime configs declare/document all required OIDC/session/bootstrap env vars.
|
||||||
|
- [ ] Deploy validation fails fast when required auth env vars are missing.
|
||||||
|
- [ ] Frontend health checks accept the signed-out auth redirect behavior or target a public route that remains intentionally available.
|
||||||
|
- [ ] Deployment docs describe Authelia client config, callback/logout URLs, JWKS/issuer envs, and bootstrap-migration envs.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Deploy validation rejects missing auth configuration
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `scripts/validate-env.sh` (or the deploy wrapper that calls it) with one required OIDC/session variable removed, and redirect output to `.sisyphus/evidence/task-T10-missing-env.txt`
|
||||||
|
Expected: Validation exits non-zero and names the missing variable
|
||||||
|
Evidence: .sisyphus/evidence/task-T10-missing-env.txt
|
||||||
|
|
||||||
|
Scenario: Signed-out frontend health behavior matches updated deployment expectations
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run the updated `scripts/healthcheck.sh` or deploy health-check path and save output to `.sisyphus/evidence/task-T10-health-check.txt`
|
||||||
|
Expected: Evidence shows a successful probe despite protected app routes (either via accepted redirect or a dedicated public health target)
|
||||||
|
Evidence: .sisyphus/evidence/task-T10-health-check.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `chore(deploy): wire OIDC runtime configuration` | Files: `nginx/innercontext.conf`, `deploy.sh`, `scripts/validate-env.sh`, `scripts/healthcheck.sh`, `systemd/*`, `docs/DEPLOYMENT.md`
|
||||||
|
|
||||||
|
- [ ] T11. Add shared auth fixtures, full regression coverage, and CI enforcement
|
||||||
|
|
||||||
|
**What to do**: Build reusable backend test fixtures for authenticated users, roles, households, and shared inventory, then add regression tests covering auth sync, unauthenticated access, admin/member authorization, household inventory sharing, routine/product visibility, and migration-sensitive ownership behavior. Use dependency overrides in tests instead of hitting a live Authelia server. Enable the existing backend CI job so these tests run in Forgejo, and make sure the final verification command set includes backend tests, lint, frontend check/lint/build, and any required API type generation.
|
||||||
|
**Must NOT do**: Do not depend on a live Authelia instance in CI, do not leave the backend test job disabled, and do not add a committed frontend browser test suite in this pass.
|
||||||
|
|
||||||
|
**Recommended Agent Profile**:
|
||||||
|
- Category: `unspecified-high` - Reason: broad regression coverage plus CI wiring across the monorepo
|
||||||
|
- Skills: `[]` - Existing pytest/CI patterns are sufficient
|
||||||
|
- Omitted: `playwright` - Browser QA stays agent-executed, not repository-committed
|
||||||
|
|
||||||
|
**Parallelization**: Can Parallel: NO | Wave 3 | Blocks: Final verification | Blocked By: T2, T3, T4, T5, T6, T7, T8, T9, T10
|
||||||
|
|
||||||
|
**References** (executor has NO interview context - be exhaustive):
|
||||||
|
- Pattern: `backend/tests/conftest.py:16` - Per-test DB isolation and dependency override technique.
|
||||||
|
- Pattern: `backend/tests/test_products.py:4` - Existing endpoint-test style to mirror for authz coverage.
|
||||||
|
- Pattern: `.forgejo/workflows/ci.yml:83` - Disabled backend test job that must be enabled.
|
||||||
|
- Pattern: `frontend/package.json:6` - Final frontend verification commands available in the repo.
|
||||||
|
- Pattern: `backend/pyproject.toml` - Pytest command/config surface for any new test files.
|
||||||
|
|
||||||
|
**Acceptance Criteria** (agent-executable only):
|
||||||
|
- [ ] Shared auth fixtures exist for admin/member identities, household membership, and shared inventory setup.
|
||||||
|
- [ ] Backend tests cover `401`, owner success, admin override, same-household shared inventory update, and different-household denial across representative routes.
|
||||||
|
- [ ] Forgejo backend tests run by default instead of being gated by `if: false`.
|
||||||
|
- [ ] Final command set passes: backend tests + lint, frontend check + lint + build, and API type generation only if required by backend schema changes.
|
||||||
|
|
||||||
|
**QA Scenarios** (MANDATORY - task incomplete without these):
|
||||||
|
```
|
||||||
|
Scenario: Full backend auth regression suite passes locally
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Run `cd backend && uv run pytest > ../.sisyphus/evidence/task-T11-backend-regression.txt`
|
||||||
|
Expected: Evidence file shows the full suite passing, including new auth/tenancy tests
|
||||||
|
Evidence: .sisyphus/evidence/task-T11-backend-regression.txt
|
||||||
|
|
||||||
|
Scenario: CI config now runs backend tests instead of skipping them
|
||||||
|
Tool: Bash
|
||||||
|
Steps: Read `.forgejo/workflows/ci.yml`, confirm the backend-test job no longer contains `if: false`, and save a grep extract to `.sisyphus/evidence/task-T11-ci-enabled.txt`
|
||||||
|
Expected: Evidence shows the backend-test job is active and executes `uv run pytest`
|
||||||
|
Evidence: .sisyphus/evidence/task-T11-ci-enabled.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
**Commit**: YES | Message: `test(auth): add multi-user regression coverage` | Files: `backend/tests/*`, `.forgejo/workflows/ci.yml`
|
||||||
|
|
||||||
|
## Final Verification Wave (4 parallel agents, ALL must APPROVE)
|
||||||
|
- [ ] F1. Plan Compliance Audit - oracle
|
||||||
|
- [ ] F2. Code Quality Review - unspecified-high
|
||||||
|
- [ ] F3. Real Manual QA - unspecified-high (+ playwright if UI)
|
||||||
|
- [ ] F4. Scope Fidelity Check - deep
|
||||||
|
|
||||||
|
## Commit Strategy
|
||||||
|
- Use atomic commits after stable checkpoints: Wave 1 foundation, Wave 2 application integration, Wave 3 infra/tests.
|
||||||
|
- Prefer conventional commits with monorepo scopes such as `feat(auth): ...`, `feat(frontend): ...`, `feat(api): ...`, `test(auth): ...`, `chore(deploy): ...`.
|
||||||
|
- Do not merge unrelated refactors into auth/tenancy commits; keep schema, auth flow, frontend session, and infra/test changes reviewable.
|
||||||
|
|
||||||
|
## Success Criteria
|
||||||
|
- Every protected route and API request resolves a concrete current user before touching owned data.
|
||||||
|
- Non-admin users cannot read or mutate records outside their ownership, except household-shared inventory entries.
|
||||||
|
- Household members can view/update shared inventory without gaining product edit rights.
|
||||||
|
- Existing single-user data survives migration and becomes accessible to the bootstrap admin account after first login.
|
||||||
|
- Frontend protected navigation/login/logout flow works without browser-stored bearer tokens.
|
||||||
|
- Backend test suite and CI catch auth regressions before deploy.
|
||||||
134
AGENTS.md
134
AGENTS.md
|
|
@ -1,83 +1,123 @@
|
||||||
# AGENTS.md
|
# AGENTS.md
|
||||||
|
|
||||||
This file provides guidance to AI coding agents when working with code in this repository.
|
Personal health & skincare data hub with LLM agent integration. Monorepo: Python FastAPI backend + SvelteKit frontend.
|
||||||
|
|
||||||
## Repository Structure
|
## Structure
|
||||||
|
|
||||||
This is a monorepo with **backend** and **frontend** directories.
|
```
|
||||||
|
innercontext/
|
||||||
|
├── backend/ # Python 3.12, FastAPI, SQLModel, PostgreSQL, Gemini
|
||||||
|
│ ├── innercontext/ # Main package
|
||||||
|
│ │ ├── api/ # 7 FastAPI routers + LLM endpoints
|
||||||
|
│ │ ├── models/ # SQLModel tables + Pydantic types (12 files)
|
||||||
|
│ │ ├── validators/# LLM response validators (6 validators)
|
||||||
|
│ │ ├── services/ # FX rates (NBP API), pricing jobs
|
||||||
|
│ │ └── workers/ # Background pricing worker
|
||||||
|
│ ├── tests/ # pytest (171 tests, SQLite in-memory)
|
||||||
|
│ ├── alembic/ # DB migrations (17 versions)
|
||||||
|
│ ├── main.py # App entry, lifespan, CORS, router registration
|
||||||
|
│ └── db.py # Engine, get_session(), create_db_and_tables()
|
||||||
|
├── frontend/ # SvelteKit 2, Svelte 5, Tailwind v4, bits-ui
|
||||||
|
│ └── src/
|
||||||
|
│ ├── routes/ # File-based routing (15+ pages)
|
||||||
|
│ ├── lib/ # API client, types, components, i18n
|
||||||
|
│ └── app.css # Theme + editorial design system
|
||||||
|
├── docs/ # Deployment guides + frontend-design-cookbook.md
|
||||||
|
├── nginx/ # Reverse proxy (strips /api prefix → backend:8000)
|
||||||
|
├── systemd/ # 3 units: backend, frontend-node, pricing-worker
|
||||||
|
├── scripts/ # Health checks, backups, env validation
|
||||||
|
└── deploy.sh # Push-based deploy (Capistrano-style symlinked releases)
|
||||||
|
```
|
||||||
|
|
||||||
## Agent Skills
|
## Agent Skills
|
||||||
|
|
||||||
Use repository skills when applicable:
|
- `svelte-code-writer`: REQUIRED for `.svelte`, `.svelte.ts`, `.svelte.js` files.
|
||||||
|
- `frontend-design`: Frontend UI, page, and component design work.
|
||||||
|
- `conventional-commit`: Commit messages following Conventional Commits.
|
||||||
|
- `gemini-api-dev`: Gemini API integrations, multimodal, function calling, structured output.
|
||||||
|
|
||||||
- `svelte-code-writer`: required for creating, editing, or analyzing `.svelte`, `.svelte.ts`, and `.svelte.js` files.
|
When editing frontend code, follow `docs/frontend-design-cookbook.md` and update it when introducing or modifying reusable UI patterns, visual rules, or shared styling conventions.
|
||||||
- `frontend-design`: use for frontend UI, page, and component design work.
|
|
||||||
- `conventional-commit`: use when drafting commit messages that follow Conventional Commits.
|
|
||||||
- `gemini-api-dev`: use when implementing Gemini API integrations, multimodal flows, function calling, or model selection details.
|
|
||||||
|
|
||||||
When editing frontend code, always follow `docs/frontend-design-cookbook.md` and update it in the same change whenever you introduce or modify reusable UI patterns, visual rules, or shared styling conventions.
|
## Where to Look
|
||||||
|
|
||||||
## Commit Guidelines
|
| Task | Location | Notes |
|
||||||
|
|------|----------|-------|
|
||||||
This repository uses Conventional Commits (e.g., `feat(api): ...`, `fix(frontend): ...`, `test(models): ...`). Always format commit messages accordingly and ensure you include the correct scope to indicate which part of the monorepo is affected.
|
| Add API endpoint | `backend/innercontext/api/` | Follow router pattern, use `get_or_404()` |
|
||||||
|
| Add/modify model | `backend/innercontext/models/` | See `backend/AGENTS.md` for JSON col + timestamp conventions |
|
||||||
|
| Add DB migration | `backend/alembic/` | `cd backend && uv run alembic revision --autogenerate -m "desc"` |
|
||||||
|
| Add frontend page | `frontend/src/routes/` | `+page.svelte` + `+page.server.ts` (load + actions) |
|
||||||
|
| Add component | `frontend/src/lib/components/` | Use bits-ui primitives, check design cookbook |
|
||||||
|
| Add LLM feature | `backend/innercontext/api/` + `llm.py` | `call_gemini()` or `call_gemini_with_function_tools()` |
|
||||||
|
| Add LLM validator | `backend/innercontext/validators/` | Extend `BaseValidator`, return `ValidationResult` |
|
||||||
|
| Add i18n strings | `frontend/messages/{en,pl}.json` | Auto-generates to `src/lib/paraglide/` |
|
||||||
|
| Modify design system | `frontend/src/app.css` + `docs/frontend-design-cookbook.md` | Update both in same change |
|
||||||
|
| Modify types | `backend/innercontext/models/` → `pnpm generate:api` → `frontend/src/lib/types.ts` | Auto-generated from OpenAPI; bridge file may need augmentation |
|
||||||
|
|
||||||
## Commands
|
## Commands
|
||||||
|
|
||||||
Run the backend from the `backend/` directory:
|
|
||||||
|
|
||||||
```bash
|
```bash
|
||||||
# Backend
|
# Backend
|
||||||
cd backend && uv run python main.py
|
cd backend && uv run python main.py # Start API server
|
||||||
|
cd backend && uv run ruff check . # Lint
|
||||||
|
cd backend && uv run black . # Format
|
||||||
|
cd backend && uv run isort . # Sort imports
|
||||||
|
cd backend && uv run pytest # Run tests
|
||||||
|
|
||||||
# Linting / formatting
|
|
||||||
cd backend && uv run ruff check .
|
|
||||||
cd backend && uv run black .
|
|
||||||
cd backend && uv run isort .
|
|
||||||
```
|
|
||||||
|
|
||||||
Run the frontend from the `frontend/` directory:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
# Frontend
|
# Frontend
|
||||||
cd frontend && pnpm dev
|
cd frontend && pnpm dev # Dev server (API proxied to :8000)
|
||||||
|
cd frontend && pnpm check # Type check + Svelte validation
|
||||||
# Type checking / linting / formatting
|
cd frontend && pnpm lint # ESLint
|
||||||
cd frontend && pnpm check
|
cd frontend && pnpm format # Prettier
|
||||||
cd frontend && pnpm lint
|
cd frontend && pnpm build # Production build → build/
|
||||||
cd frontend && pnpm format
|
cd frontend && pnpm generate:api # Regenerate types from backend OpenAPI
|
||||||
```
|
```
|
||||||
|
|
||||||
No test suite exists yet (backend has some test files but they're not integrated into CI).
|
## Commit Guidelines
|
||||||
|
|
||||||
|
Conventional Commits: `feat(api): ...`, `fix(frontend): ...`, `test(models): ...`. Include scope indicating which part of the monorepo is affected.
|
||||||
|
|
||||||
## Architecture
|
## Architecture
|
||||||
|
|
||||||
**innercontext** collects personal health and skincare data and exposes it to an LLM agent.
|
**Backend:** Python 3.12, FastAPI, SQLModel 0.0.37 + SQLAlchemy, Pydantic v2, PostgreSQL (psycopg3), Gemini API (google-genai).
|
||||||
|
|
||||||
**Backend Stack:** Python 3.12, SQLModel (0.0.37) + SQLAlchemy, Pydantic v2, FastAPI, PostgreSQL (psycopg3).
|
**Frontend:** SvelteKit 2, Svelte 5 (Runes), TypeScript, Tailwind CSS v4, bits-ui (shadcn-svelte), Paraglide (i18n), svelte-dnd-action, adapter-node.
|
||||||
|
|
||||||
**Frontend Stack:** SvelteKit 5, Tailwind CSS v4, bits-ui, inlang/paraglide (i18n), svelte-dnd-action.
|
### Cross-Cutting Patterns
|
||||||
|
|
||||||
### Models (`backend/innercontext/models/`)
|
- **Type sharing**: Auto-generated from backend OpenAPI schema via `@hey-api/openapi-ts`. Run `cd frontend && pnpm generate:api` after backend model changes. `src/lib/types.ts` is a bridge file with re-exports, renames, and `Require<>` augmentations. See `frontend/AGENTS.md` § Type Generation.
|
||||||
|
- **API proxy**: Frontend server-side uses `PUBLIC_API_BASE` (http://localhost:8000). Browser uses `/api` (nginx strips prefix → backend).
|
||||||
|
- **Auth**: None. Single-user personal system.
|
||||||
|
- **Error flow**: Backend `HTTPException(detail=...)` → Frontend catches `.detail` field → `FlashMessages` or `StructuredErrorDisplay`.
|
||||||
|
- **LLM validation errors**: Non-blocking (HTTP 200). Returned in `validation_warnings` field. Frontend parses semicolon-separated strings into list.
|
||||||
|
|
||||||
|
### Models
|
||||||
|
|
||||||
| File | Tables |
|
| File | Tables |
|
||||||
|------|--------|
|
|------|--------|
|
||||||
| `product.py` | `products`, `product_inventory` |
|
| `product.py` | `products`, `product_inventory` |
|
||||||
| `health.py` | `medication_entries`, `medication_usages`, `lab_results` |
|
| `health.py` | `medication_entries`, `medication_usages`, `lab_results` |
|
||||||
| `routine.py` | `routines`, `routine_steps` |
|
| `routine.py` | `routines`, `routine_steps`, `grooming_schedules` |
|
||||||
| `skincare.py` | `skin_condition_snapshots` |
|
| `skincare.py` | `skin_condition_snapshots` |
|
||||||
|
| `profile.py` | `user_profiles` |
|
||||||
|
| `pricing.py` | `pricing_recalc_jobs` |
|
||||||
|
| `ai_log.py` | `ai_call_logs` |
|
||||||
|
|
||||||
**`Product`** is the core model. JSON columns store `inci` (list), `actives` (list of `ActiveIngredient`), `recommended_for`, `targets`, `incompatible_with`, `synergizes_with`, `context_rules`, and `product_effect_profile`. The `to_llm_context()` method returns a token-optimised dict for LLM usage.
|
**Product** is the core model with JSON columns for `inci`, `actives`, `recommended_for`, `targets`, `product_effect_profile`, and `context_rules`. `to_llm_context()` returns a token-optimised dict for LLM usage.
|
||||||
|
|
||||||
**`ProductInventory`** tracks physical packages (opened status, expiry, remaining weight). One product → many inventory entries.
|
### Deployment
|
||||||
|
|
||||||
**`Routine` / `RoutineStep`** record daily AM/PM skincare sessions. A step references either a `Product` or a free-text `action` (e.g. shaving).
|
- **CI**: Forgejo (`.forgejo/workflows/`), manual trigger only.
|
||||||
|
- **Deploy**: `deploy.sh` pushes via SSH to LXC host. Capistrano-style timestamped releases with `current` symlink. Auto-rollback on health check failure.
|
||||||
|
- **Services**: 3 systemd units — backend (uvicorn :8000), frontend-node (:3000), pricing-worker.
|
||||||
|
- **Env**: Backend `.env` has `DATABASE_URL` + `GEMINI_API_KEY`. Frontend `PUBLIC_API_BASE` set at build time.
|
||||||
|
|
||||||
**`SkinConditionSnapshot`** is a weekly LLM-filled record (skin state, metrics 1–5, active concerns).
|
## Anti-Patterns (this project)
|
||||||
|
|
||||||
### Key Conventions
|
- `model_validator(mode="after")` does NOT fire on `table=True` SQLModel instances (SQLModel 0.0.37 + Pydantic v2 bug). Validators in Product are documentation only.
|
||||||
|
- Never use plain `Field(default_factory=...)` for `updated_at` — must use `sa_column=Column(DateTime(timezone=True), onupdate=utc_now)`.
|
||||||
- All `table=True` models use `Column(DateTime(timezone=True), onupdate=utc_now)` for `updated_at` via raw SQLAlchemy column — do not use plain `Field(default_factory=...)` for auto-update.
|
- JSON columns use `sa_column=Column(JSON, nullable=...)` — NOT JSONB. DB-agnostic.
|
||||||
- List/complex fields stored as JSON use `sa_column=Column(JSON, nullable=...)` pattern (DB-agnostic; not JSONB).
|
- Gemini API rejects int-enum in `response_schema` — `AIActiveIngredient` overrides with `int` + `# type: ignore[assignment]`.
|
||||||
- `model_validator(mode="after")` **does not fire** on `table=True` SQLModel instances (SQLModel 0.0.37 + Pydantic v2 bug). Validators in `Product` are present for documentation but are unreliable at construction time.
|
- `backend/skincare.yaml` is legacy notes — ignore, not part of data model.
|
||||||
- `backend/skincare.yaml` is a legacy notes file — ignore it, it is not part of the data model and will not be imported.
|
- ESLint rule `svelte/no-navigation-without-resolve` has `ignoreGoto: true` workaround (upstream bug sveltejs/eslint-plugin-svelte#1327).
|
||||||
- `_ev()` helper in `product.py` normalises enum values when fields may be raw dicts (as returned from DB) or Python enum instances.
|
- `_ev()` helper in `product.py` normalises enum values when fields may be raw dicts (from DB) or Python enum instances.
|
||||||
|
- No frontend tests exist. Backend tests use SQLite in-memory (not PostgreSQL).
|
||||||
|
|
|
||||||
Binary file not shown.
121
backend/AGENTS.md
Normal file
121
backend/AGENTS.md
Normal file
|
|
@ -0,0 +1,121 @@
|
||||||
|
# Backend
|
||||||
|
|
||||||
|
Python 3.12 FastAPI backend. Entry: `main.py` → `db.py` → routers in `innercontext/api/`.
|
||||||
|
|
||||||
|
## Structure
|
||||||
|
|
||||||
|
```
|
||||||
|
backend/
|
||||||
|
├── main.py # FastAPI app, lifespan, CORS, router registration
|
||||||
|
├── db.py # Engine, get_session() dependency, create_db_and_tables()
|
||||||
|
├── innercontext/
|
||||||
|
│ ├── api/ # 7 FastAPI routers
|
||||||
|
│ │ ├── products.py # CRUD + LLM parse/suggest + pricing
|
||||||
|
│ │ ├── routines.py # CRUD + LLM suggest/batch + grooming schedule
|
||||||
|
│ │ ├── health.py # Medications + lab results CRUD
|
||||||
|
│ │ ├── skincare.py # Snapshots + photo analysis (Gemini vision)
|
||||||
|
│ │ ├── inventory.py # Product inventory CRUD
|
||||||
|
│ │ ├── profile.py # User profile upsert
|
||||||
|
│ │ ├── ai_logs.py # LLM call log viewer
|
||||||
|
│ │ ├── llm_context.py # Context builders (Tier 1 summary / Tier 2 detailed)
|
||||||
|
│ │ ├── product_llm_tools.py # Gemini function tool declarations + handlers
|
||||||
|
│ │ └── utils.py # get_or_404()
|
||||||
|
│ ├── models/ # SQLModel tables + Pydantic types
|
||||||
|
│ │ ├── product.py # Product, ProductInventory, _ev(), to_llm_context()
|
||||||
|
│ │ ├── health.py # MedicationEntry, MedicationUsage, LabResult
|
||||||
|
│ │ ├── routine.py # Routine, RoutineStep, GroomingSchedule
|
||||||
|
│ │ ├── skincare.py # SkinConditionSnapshot (JSON: concerns, risks, priorities)
|
||||||
|
│ │ ├── profile.py # UserProfile
|
||||||
|
│ │ ├── pricing.py # PricingRecalcJob (async tier calculation)
|
||||||
|
│ │ ├── ai_log.py # AICallLog (token metrics, reasoning chain, tool trace)
|
||||||
|
│ │ ├── enums.py # 20+ enums (ProductCategory, SkinType, SkinConcern, etc.)
|
||||||
|
│ │ ├── base.py # utc_now() helper
|
||||||
|
│ │ ├── domain.py # Domain enum (HEALTH, SKINCARE)
|
||||||
|
│ │ └── api_metadata.py # ResponseMetadata, TokenMetrics (Phase 3 observability)
|
||||||
|
│ ├── validators/ # LLM response validators (non-blocking)
|
||||||
|
│ │ ├── base.py # ValidationResult, BaseValidator abstract
|
||||||
|
│ │ ├── routine_validator.py # Retinoid+acid, intervals, SPF, barrier safety
|
||||||
|
│ │ ├── batch_validator.py # Multi-day frequency + same-day conflicts
|
||||||
|
│ │ ├── product_parse_validator.py # Enum checks, effect_profile, pH, actives
|
||||||
|
│ │ ├── shopping_validator.py # Category, priority, text quality
|
||||||
|
│ │ └── photo_validator.py # Skin metrics 1-5, enum checks
|
||||||
|
│ ├── services/
|
||||||
|
│ │ ├── fx.py # NBP API currency conversion (24h cache, thread-safe)
|
||||||
|
│ │ └── pricing_jobs.py # Job queue (enqueue, claim with FOR UPDATE SKIP LOCKED)
|
||||||
|
│ ├── workers/
|
||||||
|
│ │ └── pricing.py # Background pricing worker
|
||||||
|
│ ├── llm.py # Gemini client, call_gemini(), call_gemini_with_function_tools()
|
||||||
|
│ └── llm_safety.py # Prompt injection prevention (sanitize + isolate)
|
||||||
|
├── tests/ # 171 pytest tests (SQLite in-memory, isolated per test)
|
||||||
|
├── alembic/ # 17 migration versions
|
||||||
|
└── pyproject.toml # uv, pytest (--cov), ruff, black, isort (black profile)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Model Conventions
|
||||||
|
|
||||||
|
- **JSON columns**: `sa_column=Column(JSON, nullable=...)` on `table=True` models only. DB-agnostic (not JSONB).
|
||||||
|
- **`updated_at`**: MUST use `sa_column=Column(DateTime(timezone=True), onupdate=utc_now)`. Never plain `Field(default_factory=...)`.
|
||||||
|
- **`_ev()` helper** (`product.py`): Normalises enum values — returns `.value` if enum, `str()` otherwise. Required when fields may be raw dicts (from DB) or Python enum instances.
|
||||||
|
- **`model_validator(mode="after")`**: Does NOT fire on `table=True` instances (SQLModel 0.0.37 + Pydantic v2 bug). Product validators are documentation only.
|
||||||
|
- **`to_llm_context()`**: Returns token-optimised dict. Filters `effect_profile` to nonzero values (≥2). Handles both dict and object forms.
|
||||||
|
- **`short_id`**: 8-char UUID prefix on Product. Used in LLM context for token efficiency → expanded to full UUID before DB queries.
|
||||||
|
|
||||||
|
## LLM Integration
|
||||||
|
|
||||||
|
Two config patterns in `llm.py`:
|
||||||
|
- `get_extraction_config()`: temp=0.0, MINIMAL thinking. Deterministic data parsing.
|
||||||
|
- `get_creative_config()`: temp=0.4, MEDIUM thinking. Suggestions with reasoning chain capture.
|
||||||
|
|
||||||
|
Three context tiers in `llm_context.py`:
|
||||||
|
- **Tier 1** (~15-20 tokens/product): One-line summary with status, key effects, safety flags.
|
||||||
|
- **Tier 2** (~40-50 tokens/product): Top 5 actives + effect_profile + context_rules. Used in function tool responses.
|
||||||
|
- **Tier 3**: Full `to_llm_context()`. Token-heavy, rarely used.
|
||||||
|
|
||||||
|
Function calling (`product_llm_tools.py`):
|
||||||
|
- `call_gemini_with_function_tools()`: Iterative tool loop, max 2 roundtrips.
|
||||||
|
- `PRODUCT_DETAILS_FUNCTION_DECLARATION`: Gemini function schema for product lookups.
|
||||||
|
- INCI lists excluded from LLM context by default (~12-15KB per product saved).
|
||||||
|
|
||||||
|
All calls logged to `AICallLog` with: token metrics, reasoning_chain, tool_trace, validation results.
|
||||||
|
|
||||||
|
Safety (`llm_safety.py`):
|
||||||
|
- `sanitize_user_input()`: Removes prompt injection patterns, limits length.
|
||||||
|
- `isolate_user_input()`: Wraps with boundary markers, treats as data not instructions.
|
||||||
|
|
||||||
|
## Validators
|
||||||
|
|
||||||
|
All extend `BaseValidator`, return `ValidationResult` (errors, warnings, auto_fixes). Validation is **non-blocking** — errors returned in response body as `validation_warnings`, not as HTTP 4xx.
|
||||||
|
|
||||||
|
Key safety checks in `routine_validator.py`:
|
||||||
|
- No retinoid + acid in same routine (detects via `effect_profile.retinoid_strength > 0` and exfoliant functions in actives)
|
||||||
|
- Respect `min_interval_hours` and `max_frequency_per_week`
|
||||||
|
- Check `context_rules`: `safe_after_shaving`, `safe_with_compromised_barrier`
|
||||||
|
- AM routines need SPF when `leaving_home=True`
|
||||||
|
- No high `irritation_risk` or `barrier_disruption_risk` with compromised barrier
|
||||||
|
|
||||||
|
## API Patterns
|
||||||
|
|
||||||
|
- All routers use `Depends(get_session)` for DB access.
|
||||||
|
- `get_or_404(session, Model, id)` for 404 responses.
|
||||||
|
- LLM endpoints: build context → call Gemini → validate → log to AICallLog → return data + `ResponseMetadata`.
|
||||||
|
- Product pricing: enqueues `PricingRecalcJob` on create/update. Worker claims with `FOR UPDATE SKIP LOCKED`.
|
||||||
|
- Gemini API rejects int-enum in `response_schema` — `AIActiveIngredient` overrides fields with plain `int` + `# type: ignore[assignment]`.
|
||||||
|
|
||||||
|
## Environment
|
||||||
|
|
||||||
|
| Variable | Default | Required |
|
||||||
|
|----------|---------|----------|
|
||||||
|
| `DATABASE_URL` | `postgresql+psycopg://localhost/innercontext` | Yes |
|
||||||
|
| `GEMINI_API_KEY` | — | For LLM features |
|
||||||
|
| `GEMINI_MODEL` | `gemini-3-flash-preview` | No |
|
||||||
|
|
||||||
|
`main.py` calls `load_dotenv()` before importing `db.py` to ensure `DATABASE_URL` is read from `.env`.
|
||||||
|
|
||||||
|
## Testing
|
||||||
|
|
||||||
|
- `cd backend && uv run pytest`
|
||||||
|
- SQLite in-memory per test — fully isolated, no cleanup needed.
|
||||||
|
- `conftest.py` fixtures: `session`, `client` (TestClient with patched engine), `product_data`, `created_product`, `medication_data`, `created_medication`, `created_routine`.
|
||||||
|
- LLM calls mocked with `unittest.mock.patch` and `monkeypatch`.
|
||||||
|
- Coverage: `--cov=innercontext --cov-report=term-missing`.
|
||||||
|
- No test markers or parametrize — explicit test functions only.
|
||||||
|
|
@ -0,0 +1,289 @@
|
||||||
|
"""add auth tables and ownership
|
||||||
|
|
||||||
|
Revision ID: 4b7d2e9f1c3a
|
||||||
|
Revises: 9f3a2c1b4d5e
|
||||||
|
Create Date: 2026-03-12 12:00:00.000000
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
from collections.abc import Sequence
|
||||||
|
from datetime import datetime, timezone
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
import sqlalchemy as sa
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = "4b7d2e9f1c3a"
|
||||||
|
down_revision: str | Sequence[str] | None = "9f3a2c1b4d5e"
|
||||||
|
branch_labels: str | Sequence[str] | None = None
|
||||||
|
depends_on: str | Sequence[str] | None = None
|
||||||
|
|
||||||
|
OWNED_TABLES: tuple[str, ...] = (
|
||||||
|
"products",
|
||||||
|
"product_inventory",
|
||||||
|
"user_profiles",
|
||||||
|
"medication_entries",
|
||||||
|
"medication_usages",
|
||||||
|
"lab_results",
|
||||||
|
"routines",
|
||||||
|
"routine_steps",
|
||||||
|
"grooming_schedule",
|
||||||
|
"skin_condition_snapshots",
|
||||||
|
"ai_call_logs",
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _table_has_rows(connection: sa.Connection, table_name: str) -> bool:
|
||||||
|
query = sa.text(f"SELECT 1 FROM {table_name} LIMIT 1")
|
||||||
|
return connection.execute(query).first() is not None
|
||||||
|
|
||||||
|
|
||||||
|
def _legacy_data_exists(connection: sa.Connection) -> bool:
|
||||||
|
return any(_table_has_rows(connection, table_name) for table_name in OWNED_TABLES)
|
||||||
|
|
||||||
|
|
||||||
|
def _ensure_bootstrap_user_and_household(
|
||||||
|
connection: sa.Connection,
|
||||||
|
*,
|
||||||
|
issuer: str,
|
||||||
|
subject: str,
|
||||||
|
) -> UUID:
|
||||||
|
now = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
users_table = sa.table(
|
||||||
|
"users",
|
||||||
|
sa.column("id", sa.Uuid()),
|
||||||
|
sa.column("oidc_issuer", sa.String(length=512)),
|
||||||
|
sa.column("oidc_subject", sa.String(length=512)),
|
||||||
|
sa.column("role", sa.Enum("ADMIN", "MEMBER", name="role")),
|
||||||
|
sa.column("created_at", sa.DateTime()),
|
||||||
|
sa.column("updated_at", sa.DateTime(timezone=True)),
|
||||||
|
)
|
||||||
|
|
||||||
|
user_id = connection.execute(
|
||||||
|
sa.select(users_table.c.id).where(
|
||||||
|
users_table.c.oidc_issuer == issuer,
|
||||||
|
users_table.c.oidc_subject == subject,
|
||||||
|
)
|
||||||
|
).scalar_one_or_none()
|
||||||
|
|
||||||
|
if user_id is None:
|
||||||
|
user_id = uuid4()
|
||||||
|
_ = connection.execute(
|
||||||
|
sa.insert(users_table).values(
|
||||||
|
id=user_id,
|
||||||
|
oidc_issuer=issuer,
|
||||||
|
oidc_subject=subject,
|
||||||
|
role="ADMIN",
|
||||||
|
created_at=now,
|
||||||
|
updated_at=now,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
households_table = sa.table(
|
||||||
|
"households",
|
||||||
|
sa.column("id", sa.Uuid()),
|
||||||
|
sa.column("created_at", sa.DateTime()),
|
||||||
|
sa.column("updated_at", sa.DateTime(timezone=True)),
|
||||||
|
)
|
||||||
|
memberships_table = sa.table(
|
||||||
|
"household_memberships",
|
||||||
|
sa.column("id", sa.Uuid()),
|
||||||
|
sa.column("user_id", sa.Uuid()),
|
||||||
|
sa.column("household_id", sa.Uuid()),
|
||||||
|
sa.column("role", sa.Enum("OWNER", "MEMBER", name="householdrole")),
|
||||||
|
sa.column("created_at", sa.DateTime()),
|
||||||
|
sa.column("updated_at", sa.DateTime(timezone=True)),
|
||||||
|
)
|
||||||
|
|
||||||
|
membership_id = connection.execute(
|
||||||
|
sa.select(memberships_table.c.id).where(memberships_table.c.user_id == user_id)
|
||||||
|
).scalar_one_or_none()
|
||||||
|
if membership_id is None:
|
||||||
|
household_id = uuid4()
|
||||||
|
_ = connection.execute(
|
||||||
|
sa.insert(households_table).values(
|
||||||
|
id=household_id,
|
||||||
|
created_at=now,
|
||||||
|
updated_at=now,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
_ = connection.execute(
|
||||||
|
sa.insert(memberships_table).values(
|
||||||
|
id=uuid4(),
|
||||||
|
user_id=user_id,
|
||||||
|
household_id=household_id,
|
||||||
|
role="OWNER",
|
||||||
|
created_at=now,
|
||||||
|
updated_at=now,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
return user_id
|
||||||
|
|
||||||
|
|
||||||
|
def _backfill_owned_rows(connection: sa.Connection, user_id: UUID) -> None:
|
||||||
|
for table_name in OWNED_TABLES:
|
||||||
|
table = sa.table(table_name, sa.column("user_id", sa.Uuid()))
|
||||||
|
_ = connection.execute(
|
||||||
|
sa.update(table).where(table.c.user_id.is_(None)).values(user_id=user_id)
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
bind = op.get_bind()
|
||||||
|
|
||||||
|
role_enum = sa.Enum("ADMIN", "MEMBER", name="role")
|
||||||
|
household_role_enum = sa.Enum("OWNER", "MEMBER", name="householdrole")
|
||||||
|
role_enum.create(bind, checkfirst=True)
|
||||||
|
household_role_enum.create(bind, checkfirst=True)
|
||||||
|
|
||||||
|
_ = op.create_table(
|
||||||
|
"users",
|
||||||
|
sa.Column("id", sa.Uuid(), nullable=False),
|
||||||
|
sa.Column("oidc_issuer", sa.String(length=512), nullable=False),
|
||||||
|
sa.Column("oidc_subject", sa.String(length=512), nullable=False),
|
||||||
|
sa.Column("role", role_enum, nullable=False),
|
||||||
|
sa.Column("created_at", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint(
|
||||||
|
"oidc_issuer", "oidc_subject", name="uq_users_oidc_identity"
|
||||||
|
),
|
||||||
|
)
|
||||||
|
op.create_index(op.f("ix_users_role"), "users", ["role"], unique=False)
|
||||||
|
|
||||||
|
_ = op.create_table(
|
||||||
|
"households",
|
||||||
|
sa.Column("id", sa.Uuid(), nullable=False),
|
||||||
|
sa.Column("created_at", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
|
||||||
|
_ = op.create_table(
|
||||||
|
"household_memberships",
|
||||||
|
sa.Column("id", sa.Uuid(), nullable=False),
|
||||||
|
sa.Column("user_id", sa.Uuid(), nullable=False),
|
||||||
|
sa.Column("household_id", sa.Uuid(), nullable=False),
|
||||||
|
sa.Column("role", household_role_enum, nullable=False),
|
||||||
|
sa.Column("created_at", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.ForeignKeyConstraint(
|
||||||
|
["household_id"], ["households.id"], ondelete="CASCADE"
|
||||||
|
),
|
||||||
|
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint("user_id", name="uq_household_memberships_user_id"),
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_household_memberships_household_id"),
|
||||||
|
"household_memberships",
|
||||||
|
["household_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_household_memberships_role"),
|
||||||
|
"household_memberships",
|
||||||
|
["role"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_household_memberships_user_id"),
|
||||||
|
"household_memberships",
|
||||||
|
["user_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
for table_name in OWNED_TABLES:
|
||||||
|
with op.batch_alter_table(table_name) as batch_op:
|
||||||
|
batch_op.add_column(sa.Column("user_id", sa.Uuid(), nullable=True))
|
||||||
|
batch_op.create_index(
|
||||||
|
op.f(f"ix_{table_name}_user_id"), ["user_id"], unique=False
|
||||||
|
)
|
||||||
|
batch_op.create_foreign_key(
|
||||||
|
f"fk_{table_name}_user_id_users",
|
||||||
|
"users",
|
||||||
|
["user_id"],
|
||||||
|
["id"],
|
||||||
|
ondelete="CASCADE",
|
||||||
|
)
|
||||||
|
if table_name == "product_inventory":
|
||||||
|
batch_op.add_column(
|
||||||
|
sa.Column(
|
||||||
|
"is_household_shared",
|
||||||
|
sa.Boolean(),
|
||||||
|
nullable=False,
|
||||||
|
server_default=sa.false(),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
connection = op.get_bind()
|
||||||
|
legacy_data_exists = _legacy_data_exists(connection)
|
||||||
|
|
||||||
|
issuer = os.getenv("BOOTSTRAP_ADMIN_OIDC_ISSUER", "").strip()
|
||||||
|
subject = os.getenv("BOOTSTRAP_ADMIN_OIDC_SUB", "").strip()
|
||||||
|
bootstrap_email = os.getenv("BOOTSTRAP_ADMIN_EMAIL", "").strip()
|
||||||
|
bootstrap_name = os.getenv("BOOTSTRAP_ADMIN_NAME", "").strip()
|
||||||
|
bootstrap_household_name = os.getenv("BOOTSTRAP_HOUSEHOLD_NAME", "").strip()
|
||||||
|
_ = (bootstrap_email, bootstrap_name, bootstrap_household_name)
|
||||||
|
|
||||||
|
if legacy_data_exists:
|
||||||
|
missing_required: list[str] = []
|
||||||
|
if not issuer:
|
||||||
|
missing_required.append("BOOTSTRAP_ADMIN_OIDC_ISSUER")
|
||||||
|
if not subject:
|
||||||
|
missing_required.append("BOOTSTRAP_ADMIN_OIDC_SUB")
|
||||||
|
|
||||||
|
if missing_required:
|
||||||
|
missing_csv = ", ".join(missing_required)
|
||||||
|
raise RuntimeError(
|
||||||
|
f"Legacy data requires bootstrap admin identity; missing required env vars: {missing_csv}"
|
||||||
|
)
|
||||||
|
|
||||||
|
bootstrap_user_id = _ensure_bootstrap_user_and_household(
|
||||||
|
connection,
|
||||||
|
issuer=issuer,
|
||||||
|
subject=subject,
|
||||||
|
)
|
||||||
|
_backfill_owned_rows(connection, bootstrap_user_id)
|
||||||
|
|
||||||
|
for table_name in OWNED_TABLES:
|
||||||
|
with op.batch_alter_table(table_name) as batch_op:
|
||||||
|
batch_op.alter_column("user_id", existing_type=sa.Uuid(), nullable=False)
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
for table_name in reversed(OWNED_TABLES):
|
||||||
|
with op.batch_alter_table(table_name) as batch_op:
|
||||||
|
batch_op.drop_constraint(
|
||||||
|
f"fk_{table_name}_user_id_users", type_="foreignkey"
|
||||||
|
)
|
||||||
|
batch_op.drop_index(op.f(f"ix_{table_name}_user_id"))
|
||||||
|
if table_name == "product_inventory":
|
||||||
|
batch_op.drop_column("is_household_shared")
|
||||||
|
batch_op.drop_column("user_id")
|
||||||
|
|
||||||
|
op.drop_index(
|
||||||
|
op.f("ix_household_memberships_user_id"), table_name="household_memberships"
|
||||||
|
)
|
||||||
|
op.drop_index(
|
||||||
|
op.f("ix_household_memberships_role"), table_name="household_memberships"
|
||||||
|
)
|
||||||
|
op.drop_index(
|
||||||
|
op.f("ix_household_memberships_household_id"),
|
||||||
|
table_name="household_memberships",
|
||||||
|
)
|
||||||
|
op.drop_table("household_memberships")
|
||||||
|
op.drop_table("households")
|
||||||
|
op.drop_index(op.f("ix_users_role"), table_name="users")
|
||||||
|
op.drop_table("users")
|
||||||
|
|
||||||
|
bind = op.get_bind()
|
||||||
|
household_role_enum = sa.Enum("OWNER", "MEMBER", name="householdrole")
|
||||||
|
role_enum = sa.Enum("ADMIN", "MEMBER", name="role")
|
||||||
|
household_role_enum.drop(bind, checkfirst=True)
|
||||||
|
role_enum.drop(bind, checkfirst=True)
|
||||||
|
|
@ -0,0 +1,69 @@
|
||||||
|
"""replace product weights with inventory remaining level
|
||||||
|
|
||||||
|
Revision ID: 9f3a2c1b4d5e
|
||||||
|
Revises: 7e6f73d1cc95
|
||||||
|
Create Date: 2026-03-08 12:00:00.000000
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Sequence, Union
|
||||||
|
|
||||||
|
import sqlalchemy as sa
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = "9f3a2c1b4d5e"
|
||||||
|
down_revision: Union[str, Sequence[str], None] = "7e6f73d1cc95"
|
||||||
|
branch_labels: Union[str, Sequence[str], None] = None
|
||||||
|
depends_on: Union[str, Sequence[str], None] = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
bind = op.get_bind()
|
||||||
|
remaining_level_enum = sa.Enum(
|
||||||
|
"HIGH",
|
||||||
|
"MEDIUM",
|
||||||
|
"LOW",
|
||||||
|
"NEARLY_EMPTY",
|
||||||
|
name="remaininglevel",
|
||||||
|
)
|
||||||
|
remaining_level_enum.create(bind, checkfirst=True)
|
||||||
|
|
||||||
|
op.add_column(
|
||||||
|
"product_inventory",
|
||||||
|
sa.Column("remaining_level", remaining_level_enum, nullable=True),
|
||||||
|
)
|
||||||
|
op.drop_column("product_inventory", "last_weighed_at")
|
||||||
|
op.drop_column("product_inventory", "current_weight_g")
|
||||||
|
op.drop_column("products", "personal_repurchase_intent")
|
||||||
|
op.drop_column("products", "empty_weight_g")
|
||||||
|
op.drop_column("products", "full_weight_g")
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
bind = op.get_bind()
|
||||||
|
remaining_level_enum = sa.Enum(
|
||||||
|
"HIGH",
|
||||||
|
"MEDIUM",
|
||||||
|
"LOW",
|
||||||
|
"NEARLY_EMPTY",
|
||||||
|
name="remaininglevel",
|
||||||
|
)
|
||||||
|
|
||||||
|
op.add_column(
|
||||||
|
"products",
|
||||||
|
sa.Column("personal_repurchase_intent", sa.Boolean(), nullable=True),
|
||||||
|
)
|
||||||
|
op.add_column(
|
||||||
|
"product_inventory",
|
||||||
|
sa.Column("current_weight_g", sa.Float(), nullable=True),
|
||||||
|
)
|
||||||
|
op.add_column(
|
||||||
|
"product_inventory",
|
||||||
|
sa.Column("last_weighed_at", sa.Date(), nullable=True),
|
||||||
|
)
|
||||||
|
op.add_column("products", sa.Column("full_weight_g", sa.Float(), nullable=True))
|
||||||
|
op.add_column("products", sa.Column("empty_weight_g", sa.Float(), nullable=True))
|
||||||
|
op.drop_column("product_inventory", "remaining_level")
|
||||||
|
remaining_level_enum.drop(bind, checkfirst=True)
|
||||||
206
backend/innercontext/api/admin.py
Normal file
206
backend/innercontext/api/admin.py
Normal file
|
|
@ -0,0 +1,206 @@
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import Annotated
|
||||||
|
from uuid import UUID
|
||||||
|
|
||||||
|
from fastapi import APIRouter, Depends, HTTPException, Response, status
|
||||||
|
from sqlmodel import Session, SQLModel, select
|
||||||
|
|
||||||
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import require_admin
|
||||||
|
from innercontext.api.utils import get_or_404
|
||||||
|
from innercontext.models import (
|
||||||
|
Household,
|
||||||
|
HouseholdMembership,
|
||||||
|
HouseholdRole,
|
||||||
|
Role,
|
||||||
|
User,
|
||||||
|
)
|
||||||
|
|
||||||
|
router = APIRouter(dependencies=[Depends(require_admin)])
|
||||||
|
SessionDep = Annotated[Session, Depends(get_session)]
|
||||||
|
|
||||||
|
|
||||||
|
class AdminHouseholdPublic(SQLModel):
|
||||||
|
id: UUID
|
||||||
|
created_at: datetime
|
||||||
|
updated_at: datetime
|
||||||
|
|
||||||
|
|
||||||
|
class AdminHouseholdMembershipPublic(SQLModel):
|
||||||
|
id: UUID
|
||||||
|
user_id: UUID
|
||||||
|
household_id: UUID
|
||||||
|
role: HouseholdRole
|
||||||
|
created_at: datetime
|
||||||
|
updated_at: datetime
|
||||||
|
|
||||||
|
|
||||||
|
class AdminUserPublic(SQLModel):
|
||||||
|
id: UUID
|
||||||
|
oidc_issuer: str
|
||||||
|
oidc_subject: str
|
||||||
|
role: Role
|
||||||
|
created_at: datetime
|
||||||
|
updated_at: datetime
|
||||||
|
household_membership: AdminHouseholdMembershipPublic | None = None
|
||||||
|
|
||||||
|
|
||||||
|
class AdminHouseholdMembershipCreate(SQLModel):
|
||||||
|
user_id: UUID
|
||||||
|
role: HouseholdRole = HouseholdRole.MEMBER
|
||||||
|
|
||||||
|
|
||||||
|
def _membership_public(
|
||||||
|
membership: HouseholdMembership,
|
||||||
|
) -> AdminHouseholdMembershipPublic:
|
||||||
|
return AdminHouseholdMembershipPublic(
|
||||||
|
id=membership.id,
|
||||||
|
user_id=membership.user_id,
|
||||||
|
household_id=membership.household_id,
|
||||||
|
role=membership.role,
|
||||||
|
created_at=membership.created_at,
|
||||||
|
updated_at=membership.updated_at,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _household_public(household: Household) -> AdminHouseholdPublic:
|
||||||
|
return AdminHouseholdPublic(
|
||||||
|
id=household.id,
|
||||||
|
created_at=household.created_at,
|
||||||
|
updated_at=household.updated_at,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _user_public(
|
||||||
|
user: User,
|
||||||
|
membership: HouseholdMembership | None,
|
||||||
|
) -> AdminUserPublic:
|
||||||
|
return AdminUserPublic(
|
||||||
|
id=user.id,
|
||||||
|
oidc_issuer=user.oidc_issuer,
|
||||||
|
oidc_subject=user.oidc_subject,
|
||||||
|
role=user.role,
|
||||||
|
created_at=user.created_at,
|
||||||
|
updated_at=user.updated_at,
|
||||||
|
household_membership=(
|
||||||
|
_membership_public(membership) if membership is not None else None
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _get_membership_for_user(
|
||||||
|
session: Session,
|
||||||
|
user_id: UUID,
|
||||||
|
) -> HouseholdMembership | None:
|
||||||
|
return session.exec(
|
||||||
|
select(HouseholdMembership).where(HouseholdMembership.user_id == user_id)
|
||||||
|
).first()
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/users", response_model=list[AdminUserPublic])
|
||||||
|
def list_users(session: SessionDep):
|
||||||
|
users = sorted(
|
||||||
|
session.exec(select(User)).all(),
|
||||||
|
key=lambda user: (user.created_at, str(user.id)),
|
||||||
|
)
|
||||||
|
memberships = session.exec(select(HouseholdMembership)).all()
|
||||||
|
memberships_by_user_id = {
|
||||||
|
membership.user_id: membership for membership in memberships
|
||||||
|
}
|
||||||
|
return [_user_public(user, memberships_by_user_id.get(user.id)) for user in users]
|
||||||
|
|
||||||
|
|
||||||
|
@router.post(
|
||||||
|
"/households",
|
||||||
|
response_model=AdminHouseholdPublic,
|
||||||
|
status_code=status.HTTP_201_CREATED,
|
||||||
|
)
|
||||||
|
def create_household(session: SessionDep):
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
return _household_public(household)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post(
|
||||||
|
"/households/{household_id}/members",
|
||||||
|
response_model=AdminHouseholdMembershipPublic,
|
||||||
|
status_code=status.HTTP_201_CREATED,
|
||||||
|
)
|
||||||
|
def assign_household_member(
|
||||||
|
household_id: UUID,
|
||||||
|
payload: AdminHouseholdMembershipCreate,
|
||||||
|
session: SessionDep,
|
||||||
|
):
|
||||||
|
_ = get_or_404(session, Household, household_id)
|
||||||
|
_ = get_or_404(session, User, payload.user_id)
|
||||||
|
existing_membership = _get_membership_for_user(session, payload.user_id)
|
||||||
|
if existing_membership is not None:
|
||||||
|
detail = "User already belongs to a household"
|
||||||
|
if existing_membership.household_id == household_id:
|
||||||
|
detail = "User already belongs to this household"
|
||||||
|
raise HTTPException(status_code=status.HTTP_409_CONFLICT, detail=detail)
|
||||||
|
|
||||||
|
membership = HouseholdMembership(
|
||||||
|
user_id=payload.user_id,
|
||||||
|
household_id=household_id,
|
||||||
|
role=payload.role,
|
||||||
|
)
|
||||||
|
session.add(membership)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(membership)
|
||||||
|
return _membership_public(membership)
|
||||||
|
|
||||||
|
|
||||||
|
@router.patch(
|
||||||
|
"/households/{household_id}/members/{user_id}",
|
||||||
|
response_model=AdminHouseholdMembershipPublic,
|
||||||
|
)
|
||||||
|
def move_household_member(
|
||||||
|
household_id: UUID,
|
||||||
|
user_id: UUID,
|
||||||
|
session: SessionDep,
|
||||||
|
):
|
||||||
|
_ = get_or_404(session, Household, household_id)
|
||||||
|
_ = get_or_404(session, User, user_id)
|
||||||
|
membership = _get_membership_for_user(session, user_id)
|
||||||
|
if membership is None:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_404_NOT_FOUND,
|
||||||
|
detail="HouseholdMembership not found",
|
||||||
|
)
|
||||||
|
if membership.household_id == household_id:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_409_CONFLICT,
|
||||||
|
detail="User already belongs to this household",
|
||||||
|
)
|
||||||
|
|
||||||
|
membership.household_id = household_id
|
||||||
|
session.add(membership)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(membership)
|
||||||
|
return _membership_public(membership)
|
||||||
|
|
||||||
|
|
||||||
|
@router.delete(
|
||||||
|
"/households/{household_id}/members/{user_id}",
|
||||||
|
status_code=status.HTTP_204_NO_CONTENT,
|
||||||
|
)
|
||||||
|
def remove_household_member(
|
||||||
|
household_id: UUID,
|
||||||
|
user_id: UUID,
|
||||||
|
session: SessionDep,
|
||||||
|
):
|
||||||
|
_ = get_or_404(session, Household, household_id)
|
||||||
|
_ = get_or_404(session, User, user_id)
|
||||||
|
membership = _get_membership_for_user(session, user_id)
|
||||||
|
if membership is None or membership.household_id != household_id:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_404_NOT_FOUND,
|
||||||
|
detail="HouseholdMembership not found",
|
||||||
|
)
|
||||||
|
|
||||||
|
session.delete(membership)
|
||||||
|
session.commit()
|
||||||
|
return Response(status_code=status.HTTP_204_NO_CONTENT)
|
||||||
|
|
@ -2,10 +2,13 @@ import json
|
||||||
from typing import Any, Optional
|
from typing import Any, Optional
|
||||||
from uuid import UUID
|
from uuid import UUID
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, HTTPException
|
from fastapi import APIRouter, Depends, HTTPException, Query
|
||||||
from sqlmodel import Session, SQLModel, col, select
|
from sqlmodel import Session, SQLModel, col, select
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
|
from innercontext.models.enums import Role
|
||||||
from innercontext.models.ai_log import AICallLog
|
from innercontext.models.ai_log import AICallLog
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
@ -43,14 +46,33 @@ class AICallLogPublic(SQLModel):
|
||||||
error_detail: Optional[str] = None
|
error_detail: Optional[str] = None
|
||||||
|
|
||||||
|
|
||||||
|
def _resolve_target_user_id(
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
) -> UUID:
|
||||||
|
if user_id is None:
|
||||||
|
return current_user.user_id
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
raise HTTPException(status_code=403, detail="Admin role required")
|
||||||
|
return user_id
|
||||||
|
|
||||||
|
|
||||||
@router.get("", response_model=list[AICallLogPublic])
|
@router.get("", response_model=list[AICallLogPublic])
|
||||||
def list_ai_logs(
|
def list_ai_logs(
|
||||||
endpoint: Optional[str] = None,
|
endpoint: Optional[str] = None,
|
||||||
success: Optional[bool] = None,
|
success: Optional[bool] = None,
|
||||||
limit: int = 50,
|
limit: int = 50,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
stmt = select(AICallLog).order_by(col(AICallLog.created_at).desc()).limit(limit)
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
stmt = (
|
||||||
|
select(AICallLog)
|
||||||
|
.where(AICallLog.user_id == target_user_id)
|
||||||
|
.order_by(col(AICallLog.created_at).desc())
|
||||||
|
.limit(limit)
|
||||||
|
)
|
||||||
if endpoint is not None:
|
if endpoint is not None:
|
||||||
stmt = stmt.where(AICallLog.endpoint == endpoint)
|
stmt = stmt.where(AICallLog.endpoint == endpoint)
|
||||||
if success is not None:
|
if success is not None:
|
||||||
|
|
@ -75,9 +97,17 @@ def list_ai_logs(
|
||||||
|
|
||||||
|
|
||||||
@router.get("/{log_id}", response_model=AICallLog)
|
@router.get("/{log_id}", response_model=AICallLog)
|
||||||
def get_ai_log(log_id: UUID, session: Session = Depends(get_session)):
|
def get_ai_log(
|
||||||
|
log_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
log = session.get(AICallLog, log_id)
|
log = session.get(AICallLog, log_id)
|
||||||
if log is None:
|
if log is None:
|
||||||
raise HTTPException(status_code=404, detail="Log not found")
|
raise HTTPException(status_code=404, detail="Log not found")
|
||||||
|
if log.user_id != target_user_id:
|
||||||
|
raise HTTPException(status_code=404, detail="Log not found")
|
||||||
log.tool_trace = _normalize_tool_trace(getattr(log, "tool_trace", None))
|
log.tool_trace = _normalize_tool_trace(getattr(log, "tool_trace", None))
|
||||||
return log
|
return log
|
||||||
|
|
|
||||||
166
backend/innercontext/api/auth.py
Normal file
166
backend/innercontext/api/auth.py
Normal file
|
|
@ -0,0 +1,166 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import date, datetime
|
||||||
|
from uuid import UUID
|
||||||
|
|
||||||
|
from fastapi import APIRouter, Depends, HTTPException, status
|
||||||
|
from sqlmodel import Field, Session, SQLModel, select
|
||||||
|
|
||||||
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import CurrentUser, IdentityData, sync_current_user
|
||||||
|
from innercontext.models import HouseholdRole, Role, UserProfile
|
||||||
|
|
||||||
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
|
class SessionSyncRequest(SQLModel):
|
||||||
|
iss: str | None = None
|
||||||
|
sub: str | None = None
|
||||||
|
email: str | None = None
|
||||||
|
name: str | None = None
|
||||||
|
preferred_username: str | None = None
|
||||||
|
groups: list[str] | None = None
|
||||||
|
|
||||||
|
|
||||||
|
class AuthHouseholdMembershipPublic(SQLModel):
|
||||||
|
household_id: UUID
|
||||||
|
role: HouseholdRole
|
||||||
|
|
||||||
|
|
||||||
|
class AuthUserPublic(SQLModel):
|
||||||
|
id: UUID
|
||||||
|
role: Role
|
||||||
|
household_membership: AuthHouseholdMembershipPublic | None = None
|
||||||
|
|
||||||
|
|
||||||
|
class AuthIdentityPublic(SQLModel):
|
||||||
|
issuer: str
|
||||||
|
subject: str
|
||||||
|
email: str | None = None
|
||||||
|
name: str | None = None
|
||||||
|
preferred_username: str | None = None
|
||||||
|
groups: list[str] = Field(default_factory=list)
|
||||||
|
|
||||||
|
|
||||||
|
class AuthProfilePublic(SQLModel):
|
||||||
|
id: UUID
|
||||||
|
user_id: UUID | None
|
||||||
|
birth_date: date | None = None
|
||||||
|
sex_at_birth: str | None = None
|
||||||
|
created_at: datetime
|
||||||
|
updated_at: datetime
|
||||||
|
|
||||||
|
|
||||||
|
class AuthSessionResponse(SQLModel):
|
||||||
|
user: AuthUserPublic
|
||||||
|
identity: AuthIdentityPublic
|
||||||
|
profile: AuthProfilePublic | None = None
|
||||||
|
|
||||||
|
|
||||||
|
def _build_identity(
|
||||||
|
current_user: CurrentUser,
|
||||||
|
payload: SessionSyncRequest | None,
|
||||||
|
) -> IdentityData:
|
||||||
|
if payload is None:
|
||||||
|
return current_user.identity
|
||||||
|
|
||||||
|
if payload.iss is not None and payload.iss != current_user.identity.issuer:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_400_BAD_REQUEST,
|
||||||
|
detail="Session sync issuer does not match bearer token",
|
||||||
|
)
|
||||||
|
if payload.sub is not None and payload.sub != current_user.identity.subject:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_400_BAD_REQUEST,
|
||||||
|
detail="Session sync subject does not match bearer token",
|
||||||
|
)
|
||||||
|
|
||||||
|
return IdentityData(
|
||||||
|
issuer=current_user.identity.issuer,
|
||||||
|
subject=current_user.identity.subject,
|
||||||
|
email=(
|
||||||
|
payload.email if payload.email is not None else current_user.identity.email
|
||||||
|
),
|
||||||
|
name=payload.name if payload.name is not None else current_user.identity.name,
|
||||||
|
preferred_username=(
|
||||||
|
payload.preferred_username
|
||||||
|
if payload.preferred_username is not None
|
||||||
|
else current_user.identity.preferred_username
|
||||||
|
),
|
||||||
|
groups=(
|
||||||
|
tuple(payload.groups)
|
||||||
|
if payload.groups is not None
|
||||||
|
else current_user.identity.groups
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _get_profile(session: Session, user_id: UUID) -> UserProfile | None:
|
||||||
|
return session.exec(
|
||||||
|
select(UserProfile).where(UserProfile.user_id == user_id)
|
||||||
|
).first()
|
||||||
|
|
||||||
|
|
||||||
|
def _profile_public(profile: UserProfile | None) -> AuthProfilePublic | None:
|
||||||
|
if profile is None:
|
||||||
|
return None
|
||||||
|
|
||||||
|
return AuthProfilePublic(
|
||||||
|
id=profile.id,
|
||||||
|
user_id=profile.user_id,
|
||||||
|
birth_date=profile.birth_date,
|
||||||
|
sex_at_birth=(
|
||||||
|
profile.sex_at_birth.value if profile.sex_at_birth is not None else None
|
||||||
|
),
|
||||||
|
created_at=profile.created_at,
|
||||||
|
updated_at=profile.updated_at,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _response(session: Session, current_user: CurrentUser) -> AuthSessionResponse:
|
||||||
|
household_membership = None
|
||||||
|
if current_user.household_membership is not None:
|
||||||
|
household_membership = AuthHouseholdMembershipPublic(
|
||||||
|
household_id=current_user.household_membership.household_id,
|
||||||
|
role=current_user.household_membership.role,
|
||||||
|
)
|
||||||
|
|
||||||
|
return AuthSessionResponse(
|
||||||
|
user=AuthUserPublic(
|
||||||
|
id=current_user.user_id,
|
||||||
|
role=current_user.role,
|
||||||
|
household_membership=household_membership,
|
||||||
|
),
|
||||||
|
identity=AuthIdentityPublic(
|
||||||
|
issuer=current_user.identity.issuer,
|
||||||
|
subject=current_user.identity.subject,
|
||||||
|
email=current_user.identity.email,
|
||||||
|
name=current_user.identity.name,
|
||||||
|
preferred_username=current_user.identity.preferred_username,
|
||||||
|
groups=list(current_user.identity.groups),
|
||||||
|
),
|
||||||
|
profile=_profile_public(_get_profile(session, current_user.user_id)),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/session/sync", response_model=AuthSessionResponse)
|
||||||
|
def sync_session(
|
||||||
|
payload: SessionSyncRequest | None = None,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
synced_user = sync_current_user(
|
||||||
|
session,
|
||||||
|
current_user.claims,
|
||||||
|
identity=_build_identity(current_user, payload),
|
||||||
|
)
|
||||||
|
return _response(session, synced_user)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/me", response_model=AuthSessionResponse)
|
||||||
|
def get_me(
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
return _response(session, current_user)
|
||||||
57
backend/innercontext/api/auth_deps.py
Normal file
57
backend/innercontext/api/auth_deps.py
Normal file
|
|
@ -0,0 +1,57 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from typing import Annotated
|
||||||
|
|
||||||
|
from fastapi import Depends, HTTPException, status
|
||||||
|
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
|
||||||
|
from sqlmodel import Session
|
||||||
|
|
||||||
|
from db import get_session
|
||||||
|
from innercontext.auth import (
|
||||||
|
AuthConfigurationError,
|
||||||
|
CurrentUser,
|
||||||
|
TokenValidationError,
|
||||||
|
sync_current_user,
|
||||||
|
validate_access_token,
|
||||||
|
)
|
||||||
|
from innercontext.models import Role
|
||||||
|
|
||||||
|
_bearer_scheme = HTTPBearer(auto_error=False)
|
||||||
|
|
||||||
|
|
||||||
|
def _unauthorized(detail: str) -> HTTPException:
|
||||||
|
return HTTPException(
|
||||||
|
status_code=status.HTTP_401_UNAUTHORIZED,
|
||||||
|
detail=detail,
|
||||||
|
headers={"WWW-Authenticate": "Bearer"},
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_current_user(
|
||||||
|
credentials: Annotated[
|
||||||
|
HTTPAuthorizationCredentials | None, Depends(_bearer_scheme)
|
||||||
|
],
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
) -> CurrentUser:
|
||||||
|
if credentials is None or credentials.scheme.lower() != "bearer":
|
||||||
|
raise _unauthorized("Missing bearer token")
|
||||||
|
|
||||||
|
try:
|
||||||
|
claims = validate_access_token(credentials.credentials)
|
||||||
|
return sync_current_user(session, claims)
|
||||||
|
except AuthConfigurationError as exc:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
|
||||||
|
detail=str(exc),
|
||||||
|
) from exc
|
||||||
|
except TokenValidationError as exc:
|
||||||
|
raise _unauthorized(str(exc)) from exc
|
||||||
|
|
||||||
|
|
||||||
|
def require_admin(current_user: CurrentUser = Depends(get_current_user)) -> CurrentUser:
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=status.HTTP_403_FORBIDDEN,
|
||||||
|
detail="Admin role required",
|
||||||
|
)
|
||||||
|
return current_user
|
||||||
177
backend/innercontext/api/authz.py
Normal file
177
backend/innercontext/api/authz.py
Normal file
|
|
@ -0,0 +1,177 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from typing import TypeVar, cast
|
||||||
|
from uuid import UUID
|
||||||
|
|
||||||
|
from fastapi import HTTPException
|
||||||
|
from sqlmodel import Session, select
|
||||||
|
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
|
from innercontext.models import HouseholdMembership, Product, ProductInventory, Role
|
||||||
|
|
||||||
|
_T = TypeVar("_T")
|
||||||
|
|
||||||
|
|
||||||
|
def _not_found(model_name: str) -> HTTPException:
|
||||||
|
return HTTPException(status_code=404, detail=f"{model_name} not found")
|
||||||
|
|
||||||
|
|
||||||
|
def _user_scoped_model_name(model: type[object]) -> str:
|
||||||
|
return getattr(model, "__name__", str(model))
|
||||||
|
|
||||||
|
|
||||||
|
def _record_user_id(model: type[object], record: object) -> object:
|
||||||
|
if not hasattr(record, "user_id"):
|
||||||
|
model_name = _user_scoped_model_name(model)
|
||||||
|
raise TypeError(f"{model_name} does not expose user_id")
|
||||||
|
return cast(object, getattr(record, "user_id"))
|
||||||
|
|
||||||
|
|
||||||
|
def _is_admin(current_user: CurrentUser) -> bool:
|
||||||
|
return current_user.role is Role.ADMIN
|
||||||
|
|
||||||
|
|
||||||
|
def _owner_household_id(session: Session, owner_user_id: UUID) -> UUID | None:
|
||||||
|
membership = session.exec(
|
||||||
|
select(HouseholdMembership).where(HouseholdMembership.user_id == owner_user_id)
|
||||||
|
).first()
|
||||||
|
if membership is None:
|
||||||
|
return None
|
||||||
|
return membership.household_id
|
||||||
|
|
||||||
|
|
||||||
|
def _is_same_household(
|
||||||
|
session: Session,
|
||||||
|
owner_user_id: UUID,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> bool:
|
||||||
|
if current_user.household_membership is None:
|
||||||
|
return False
|
||||||
|
owner_household_id = _owner_household_id(session, owner_user_id)
|
||||||
|
return owner_household_id == current_user.household_membership.household_id
|
||||||
|
|
||||||
|
|
||||||
|
def get_owned_or_404(
|
||||||
|
session: Session,
|
||||||
|
model: type[_T],
|
||||||
|
record_id: object,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> _T:
|
||||||
|
obj = session.get(model, record_id)
|
||||||
|
model_name = _user_scoped_model_name(model)
|
||||||
|
if obj is None:
|
||||||
|
raise _not_found(model_name)
|
||||||
|
if _record_user_id(model, obj) != current_user.user_id:
|
||||||
|
raise _not_found(model_name)
|
||||||
|
return obj
|
||||||
|
|
||||||
|
|
||||||
|
def get_owned_or_404_admin_override(
|
||||||
|
session: Session,
|
||||||
|
model: type[_T],
|
||||||
|
record_id: object,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> _T:
|
||||||
|
obj = session.get(model, record_id)
|
||||||
|
model_name = _user_scoped_model_name(model)
|
||||||
|
if obj is None:
|
||||||
|
raise _not_found(model_name)
|
||||||
|
if _is_admin(current_user):
|
||||||
|
return obj
|
||||||
|
if _record_user_id(model, obj) != current_user.user_id:
|
||||||
|
raise _not_found(model_name)
|
||||||
|
return obj
|
||||||
|
|
||||||
|
|
||||||
|
def list_owned(
|
||||||
|
session: Session, model: type[_T], current_user: CurrentUser
|
||||||
|
) -> list[_T]:
|
||||||
|
model_name = _user_scoped_model_name(model)
|
||||||
|
if not hasattr(model, "user_id"):
|
||||||
|
raise TypeError(f"{model_name} does not expose user_id")
|
||||||
|
records = cast(list[_T], session.exec(select(model)).all())
|
||||||
|
return [
|
||||||
|
record
|
||||||
|
for record in records
|
||||||
|
if _record_user_id(model, record) == current_user.user_id
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
def list_owned_admin_override(
|
||||||
|
session: Session,
|
||||||
|
model: type[_T],
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> list[_T]:
|
||||||
|
if _is_admin(current_user):
|
||||||
|
statement = select(model)
|
||||||
|
return cast(list[_T], session.exec(statement).all())
|
||||||
|
return list_owned(session, model, current_user)
|
||||||
|
|
||||||
|
|
||||||
|
def check_household_inventory_access(
|
||||||
|
session: Session,
|
||||||
|
inventory_id: UUID,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> ProductInventory:
|
||||||
|
inventory = session.get(ProductInventory, inventory_id)
|
||||||
|
if inventory is None:
|
||||||
|
raise _not_found(ProductInventory.__name__)
|
||||||
|
|
||||||
|
if _is_admin(current_user):
|
||||||
|
return inventory
|
||||||
|
|
||||||
|
owner_user_id = inventory.user_id
|
||||||
|
if owner_user_id == current_user.user_id:
|
||||||
|
return inventory
|
||||||
|
|
||||||
|
if not inventory.is_household_shared or owner_user_id is None:
|
||||||
|
raise _not_found(ProductInventory.__name__)
|
||||||
|
|
||||||
|
if not _is_same_household(session, owner_user_id, current_user):
|
||||||
|
raise _not_found(ProductInventory.__name__)
|
||||||
|
|
||||||
|
return inventory
|
||||||
|
|
||||||
|
|
||||||
|
def can_update_inventory(
|
||||||
|
session: Session,
|
||||||
|
inventory_id: UUID,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> bool:
|
||||||
|
inventory = session.get(ProductInventory, inventory_id)
|
||||||
|
if inventory is None:
|
||||||
|
return False
|
||||||
|
if _is_admin(current_user):
|
||||||
|
return True
|
||||||
|
if inventory.user_id == current_user.user_id:
|
||||||
|
return True
|
||||||
|
if not inventory.is_household_shared or inventory.user_id is None:
|
||||||
|
return False
|
||||||
|
return _is_same_household(session, inventory.user_id, current_user)
|
||||||
|
|
||||||
|
|
||||||
|
def is_product_visible(
|
||||||
|
session: Session, product_id: UUID, current_user: CurrentUser
|
||||||
|
) -> bool:
|
||||||
|
product = session.get(Product, product_id)
|
||||||
|
if product is None:
|
||||||
|
return False
|
||||||
|
|
||||||
|
if _is_admin(current_user):
|
||||||
|
return True
|
||||||
|
|
||||||
|
if product.user_id == current_user.user_id:
|
||||||
|
return True
|
||||||
|
|
||||||
|
if current_user.household_membership is None:
|
||||||
|
return False
|
||||||
|
|
||||||
|
inventories = session.exec(
|
||||||
|
select(ProductInventory).where(ProductInventory.product_id == product_id)
|
||||||
|
).all()
|
||||||
|
for inventory in inventories:
|
||||||
|
if not inventory.is_household_shared or inventory.user_id is None:
|
||||||
|
continue
|
||||||
|
if _is_same_household(session, inventory.user_id, current_user):
|
||||||
|
return True
|
||||||
|
return False
|
||||||
|
|
@ -3,15 +3,17 @@ from datetime import datetime
|
||||||
from typing import Optional
|
from typing import Optional
|
||||||
from uuid import UUID, uuid4
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, Query
|
from fastapi import APIRouter, Depends, HTTPException, Query
|
||||||
from pydantic import field_validator
|
from pydantic import field_validator
|
||||||
from sqlalchemy import Integer, cast, func, or_
|
from sqlalchemy import Integer, cast, func, or_
|
||||||
from sqlmodel import Session, SQLModel, col, select
|
from sqlmodel import Session, SQLModel, col, select
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
from innercontext.api.utils import get_or_404
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.api.utils import get_owned_or_404
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.models import LabResult, MedicationEntry, MedicationUsage
|
from innercontext.models import LabResult, MedicationEntry, MedicationUsage
|
||||||
from innercontext.models.enums import MedicationKind, ResultFlag
|
from innercontext.models.enums import MedicationKind, ResultFlag, Role
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
@ -133,6 +135,34 @@ class LabResultListResponse(SQLModel):
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
|
||||||
|
def _resolve_target_user_id(
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
) -> UUID:
|
||||||
|
if user_id is None:
|
||||||
|
return current_user.user_id
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
raise HTTPException(status_code=403, detail="Admin role required")
|
||||||
|
return user_id
|
||||||
|
|
||||||
|
|
||||||
|
def _get_owned_or_admin_override(
|
||||||
|
session: Session,
|
||||||
|
model: type[MedicationEntry] | type[MedicationUsage] | type[LabResult],
|
||||||
|
record_id: UUID,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
):
|
||||||
|
if user_id is None:
|
||||||
|
return get_owned_or_404(session, model, record_id, current_user)
|
||||||
|
record = session.get(model, record_id)
|
||||||
|
if record is None or record.user_id != _resolve_target_user_id(
|
||||||
|
current_user, user_id
|
||||||
|
):
|
||||||
|
raise HTTPException(status_code=404, detail=f"{model.__name__} not found")
|
||||||
|
return record
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
# Medication routes
|
# Medication routes
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
@ -142,9 +172,12 @@ class LabResultListResponse(SQLModel):
|
||||||
def list_medications(
|
def list_medications(
|
||||||
kind: Optional[MedicationKind] = None,
|
kind: Optional[MedicationKind] = None,
|
||||||
product_name: Optional[str] = None,
|
product_name: Optional[str] = None,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
stmt = select(MedicationEntry)
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
stmt = select(MedicationEntry).where(MedicationEntry.user_id == target_user_id)
|
||||||
if kind is not None:
|
if kind is not None:
|
||||||
stmt = stmt.where(MedicationEntry.kind == kind)
|
stmt = stmt.where(MedicationEntry.kind == kind)
|
||||||
if product_name is not None:
|
if product_name is not None:
|
||||||
|
|
@ -153,8 +186,18 @@ def list_medications(
|
||||||
|
|
||||||
|
|
||||||
@router.post("/medications", response_model=MedicationEntry, status_code=201)
|
@router.post("/medications", response_model=MedicationEntry, status_code=201)
|
||||||
def create_medication(data: MedicationCreate, session: Session = Depends(get_session)):
|
def create_medication(
|
||||||
entry = MedicationEntry(record_id=uuid4(), **data.model_dump())
|
data: MedicationCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
entry = MedicationEntry(
|
||||||
|
record_id=uuid4(),
|
||||||
|
user_id=target_user_id,
|
||||||
|
**data.model_dump(),
|
||||||
|
)
|
||||||
session.add(entry)
|
session.add(entry)
|
||||||
session.commit()
|
session.commit()
|
||||||
session.refresh(entry)
|
session.refresh(entry)
|
||||||
|
|
@ -162,17 +205,36 @@ def create_medication(data: MedicationCreate, session: Session = Depends(get_ses
|
||||||
|
|
||||||
|
|
||||||
@router.get("/medications/{medication_id}", response_model=MedicationEntry)
|
@router.get("/medications/{medication_id}", response_model=MedicationEntry)
|
||||||
def get_medication(medication_id: UUID, session: Session = Depends(get_session)):
|
def get_medication(
|
||||||
return get_or_404(session, MedicationEntry, medication_id)
|
medication_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
return _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationEntry,
|
||||||
|
medication_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
@router.patch("/medications/{medication_id}", response_model=MedicationEntry)
|
@router.patch("/medications/{medication_id}", response_model=MedicationEntry)
|
||||||
def update_medication(
|
def update_medication(
|
||||||
medication_id: UUID,
|
medication_id: UUID,
|
||||||
data: MedicationUpdate,
|
data: MedicationUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
entry = get_or_404(session, MedicationEntry, medication_id)
|
entry = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationEntry,
|
||||||
|
medication_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(entry, key, value)
|
setattr(entry, key, value)
|
||||||
session.add(entry)
|
session.add(entry)
|
||||||
|
|
@ -182,13 +244,25 @@ def update_medication(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/medications/{medication_id}", status_code=204)
|
@router.delete("/medications/{medication_id}", status_code=204)
|
||||||
def delete_medication(medication_id: UUID, session: Session = Depends(get_session)):
|
def delete_medication(
|
||||||
entry = get_or_404(session, MedicationEntry, medication_id)
|
medication_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
entry = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationEntry,
|
||||||
|
medication_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
# Delete usages first (no cascade configured at DB level)
|
# Delete usages first (no cascade configured at DB level)
|
||||||
usages = session.exec(
|
usages = session.exec(
|
||||||
select(MedicationUsage).where(
|
select(MedicationUsage)
|
||||||
MedicationUsage.medication_record_id == medication_id
|
.where(MedicationUsage.medication_record_id == medication_id)
|
||||||
)
|
.where(MedicationUsage.user_id == target_user_id)
|
||||||
).all()
|
).all()
|
||||||
for u in usages:
|
for u in usages:
|
||||||
session.delete(u)
|
session.delete(u)
|
||||||
|
|
@ -202,10 +276,24 @@ def delete_medication(medication_id: UUID, session: Session = Depends(get_sessio
|
||||||
|
|
||||||
|
|
||||||
@router.get("/medications/{medication_id}/usages", response_model=list[MedicationUsage])
|
@router.get("/medications/{medication_id}/usages", response_model=list[MedicationUsage])
|
||||||
def list_usages(medication_id: UUID, session: Session = Depends(get_session)):
|
def list_usages(
|
||||||
get_or_404(session, MedicationEntry, medication_id)
|
medication_id: UUID,
|
||||||
stmt = select(MedicationUsage).where(
|
user_id: UUID | None = Query(default=None),
|
||||||
MedicationUsage.medication_record_id == medication_id
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
_ = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationEntry,
|
||||||
|
medication_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
|
stmt = (
|
||||||
|
select(MedicationUsage)
|
||||||
|
.where(MedicationUsage.medication_record_id == medication_id)
|
||||||
|
.where(MedicationUsage.user_id == target_user_id)
|
||||||
)
|
)
|
||||||
return session.exec(stmt).all()
|
return session.exec(stmt).all()
|
||||||
|
|
||||||
|
|
@ -218,11 +306,21 @@ def list_usages(medication_id: UUID, session: Session = Depends(get_session)):
|
||||||
def create_usage(
|
def create_usage(
|
||||||
medication_id: UUID,
|
medication_id: UUID,
|
||||||
data: UsageCreate,
|
data: UsageCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
get_or_404(session, MedicationEntry, medication_id)
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
_ = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationEntry,
|
||||||
|
medication_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
usage = MedicationUsage(
|
usage = MedicationUsage(
|
||||||
record_id=uuid4(),
|
record_id=uuid4(),
|
||||||
|
user_id=target_user_id,
|
||||||
medication_record_id=medication_id,
|
medication_record_id=medication_id,
|
||||||
**data.model_dump(),
|
**data.model_dump(),
|
||||||
)
|
)
|
||||||
|
|
@ -236,9 +334,17 @@ def create_usage(
|
||||||
def update_usage(
|
def update_usage(
|
||||||
usage_id: UUID,
|
usage_id: UUID,
|
||||||
data: UsageUpdate,
|
data: UsageUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
usage = get_or_404(session, MedicationUsage, usage_id)
|
usage = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationUsage,
|
||||||
|
usage_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(usage, key, value)
|
setattr(usage, key, value)
|
||||||
session.add(usage)
|
session.add(usage)
|
||||||
|
|
@ -248,8 +354,19 @@ def update_usage(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/usages/{usage_id}", status_code=204)
|
@router.delete("/usages/{usage_id}", status_code=204)
|
||||||
def delete_usage(usage_id: UUID, session: Session = Depends(get_session)):
|
def delete_usage(
|
||||||
usage = get_or_404(session, MedicationUsage, usage_id)
|
usage_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
usage = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationUsage,
|
||||||
|
usage_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
session.delete(usage)
|
session.delete(usage)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
|
|
@ -265,32 +382,41 @@ def list_lab_results(
|
||||||
test_code: Optional[str] = None,
|
test_code: Optional[str] = None,
|
||||||
flag: Optional[ResultFlag] = None,
|
flag: Optional[ResultFlag] = None,
|
||||||
flags: list[ResultFlag] = Query(default_factory=list),
|
flags: list[ResultFlag] = Query(default_factory=list),
|
||||||
|
without_flag: bool = False,
|
||||||
from_date: Optional[datetime] = None,
|
from_date: Optional[datetime] = None,
|
||||||
to_date: Optional[datetime] = None,
|
to_date: Optional[datetime] = None,
|
||||||
latest_only: bool = False,
|
latest_only: bool = False,
|
||||||
limit: int = Query(default=50, ge=1, le=200),
|
limit: int = Query(default=50, ge=1, le=200),
|
||||||
offset: int = Query(default=0, ge=0),
|
offset: int = Query(default=0, ge=0),
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
filters = []
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
if q is not None and q.strip():
|
|
||||||
query = f"%{q.strip()}%"
|
def _apply_filters(statement):
|
||||||
filters.append(
|
statement = statement.where(col(LabResult.user_id) == target_user_id)
|
||||||
or_(
|
if q is not None and q.strip():
|
||||||
col(LabResult.test_code).ilike(query),
|
query = f"%{q.strip()}%"
|
||||||
col(LabResult.test_name_original).ilike(query),
|
statement = statement.where(
|
||||||
|
or_(
|
||||||
|
col(LabResult.test_code).ilike(query),
|
||||||
|
col(LabResult.test_name_original).ilike(query),
|
||||||
|
)
|
||||||
)
|
)
|
||||||
)
|
if test_code is not None:
|
||||||
if test_code is not None:
|
statement = statement.where(col(LabResult.test_code) == test_code)
|
||||||
filters.append(LabResult.test_code == test_code)
|
if flag is not None:
|
||||||
if flag is not None:
|
statement = statement.where(col(LabResult.flag) == flag)
|
||||||
filters.append(LabResult.flag == flag)
|
if flags:
|
||||||
if flags:
|
statement = statement.where(col(LabResult.flag).in_(flags))
|
||||||
filters.append(col(LabResult.flag).in_(flags))
|
if without_flag:
|
||||||
if from_date is not None:
|
statement = statement.where(col(LabResult.flag).is_(None))
|
||||||
filters.append(LabResult.collected_at >= from_date)
|
if from_date is not None:
|
||||||
if to_date is not None:
|
statement = statement.where(col(LabResult.collected_at) >= from_date)
|
||||||
filters.append(LabResult.collected_at <= to_date)
|
if to_date is not None:
|
||||||
|
statement = statement.where(col(LabResult.collected_at) <= to_date)
|
||||||
|
return statement
|
||||||
|
|
||||||
if latest_only:
|
if latest_only:
|
||||||
ranked_stmt = select(
|
ranked_stmt = select(
|
||||||
|
|
@ -306,8 +432,7 @@ def list_lab_results(
|
||||||
)
|
)
|
||||||
.label("rank"),
|
.label("rank"),
|
||||||
)
|
)
|
||||||
if filters:
|
ranked_stmt = _apply_filters(ranked_stmt)
|
||||||
ranked_stmt = ranked_stmt.where(*filters)
|
|
||||||
|
|
||||||
ranked_subquery = ranked_stmt.subquery()
|
ranked_subquery = ranked_stmt.subquery()
|
||||||
latest_ids = select(ranked_subquery.c.record_id).where(
|
latest_ids = select(ranked_subquery.c.record_id).where(
|
||||||
|
|
@ -320,11 +445,8 @@ def list_lab_results(
|
||||||
.subquery()
|
.subquery()
|
||||||
)
|
)
|
||||||
else:
|
else:
|
||||||
stmt = select(LabResult)
|
stmt = _apply_filters(select(LabResult))
|
||||||
count_stmt = select(func.count()).select_from(LabResult)
|
count_stmt = _apply_filters(select(func.count()).select_from(LabResult))
|
||||||
if filters:
|
|
||||||
stmt = stmt.where(*filters)
|
|
||||||
count_stmt = count_stmt.where(*filters)
|
|
||||||
|
|
||||||
test_code_numeric = cast(
|
test_code_numeric = cast(
|
||||||
func.replace(col(LabResult.test_code), "-", ""),
|
func.replace(col(LabResult.test_code), "-", ""),
|
||||||
|
|
@ -342,8 +464,18 @@ def list_lab_results(
|
||||||
|
|
||||||
|
|
||||||
@router.post("/lab-results", response_model=LabResult, status_code=201)
|
@router.post("/lab-results", response_model=LabResult, status_code=201)
|
||||||
def create_lab_result(data: LabResultCreate, session: Session = Depends(get_session)):
|
def create_lab_result(
|
||||||
result = LabResult(record_id=uuid4(), **data.model_dump())
|
data: LabResultCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
result = LabResult(
|
||||||
|
record_id=uuid4(),
|
||||||
|
user_id=target_user_id,
|
||||||
|
**data.model_dump(),
|
||||||
|
)
|
||||||
session.add(result)
|
session.add(result)
|
||||||
session.commit()
|
session.commit()
|
||||||
session.refresh(result)
|
session.refresh(result)
|
||||||
|
|
@ -351,17 +483,36 @@ def create_lab_result(data: LabResultCreate, session: Session = Depends(get_sess
|
||||||
|
|
||||||
|
|
||||||
@router.get("/lab-results/{result_id}", response_model=LabResult)
|
@router.get("/lab-results/{result_id}", response_model=LabResult)
|
||||||
def get_lab_result(result_id: UUID, session: Session = Depends(get_session)):
|
def get_lab_result(
|
||||||
return get_or_404(session, LabResult, result_id)
|
result_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
return _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
LabResult,
|
||||||
|
result_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
@router.patch("/lab-results/{result_id}", response_model=LabResult)
|
@router.patch("/lab-results/{result_id}", response_model=LabResult)
|
||||||
def update_lab_result(
|
def update_lab_result(
|
||||||
result_id: UUID,
|
result_id: UUID,
|
||||||
data: LabResultUpdate,
|
data: LabResultUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
result = get_or_404(session, LabResult, result_id)
|
result = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
LabResult,
|
||||||
|
result_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(result, key, value)
|
setattr(result, key, value)
|
||||||
session.add(result)
|
session.add(result)
|
||||||
|
|
@ -371,7 +522,18 @@ def update_lab_result(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/lab-results/{result_id}", status_code=204)
|
@router.delete("/lab-results/{result_id}", status_code=204)
|
||||||
def delete_lab_result(result_id: UUID, session: Session = Depends(get_session)):
|
def delete_lab_result(
|
||||||
result = get_or_404(session, LabResult, result_id)
|
result_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
result = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
LabResult,
|
||||||
|
result_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
session.delete(result)
|
session.delete(result)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
|
||||||
|
|
@ -1,19 +1,29 @@
|
||||||
from uuid import UUID
|
from uuid import UUID
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends
|
from fastapi import APIRouter, Depends, HTTPException
|
||||||
from sqlmodel import Session
|
from sqlmodel import Session
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.api.authz import (
|
||||||
|
can_update_inventory,
|
||||||
|
check_household_inventory_access,
|
||||||
|
)
|
||||||
from innercontext.api.products import InventoryUpdate
|
from innercontext.api.products import InventoryUpdate
|
||||||
from innercontext.api.utils import get_or_404
|
from innercontext.api.utils import get_or_404, get_owned_or_404_admin_override
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.models import ProductInventory
|
from innercontext.models import ProductInventory
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
@router.get("/{inventory_id}", response_model=ProductInventory)
|
@router.get("/{inventory_id}", response_model=ProductInventory)
|
||||||
def get_inventory(inventory_id: UUID, session: Session = Depends(get_session)):
|
def get_inventory(
|
||||||
return get_or_404(session, ProductInventory, inventory_id)
|
inventory_id: UUID,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
return check_household_inventory_access(session, inventory_id, current_user)
|
||||||
|
|
||||||
|
|
||||||
@router.patch("/{inventory_id}", response_model=ProductInventory)
|
@router.patch("/{inventory_id}", response_model=ProductInventory)
|
||||||
|
|
@ -21,7 +31,10 @@ def update_inventory(
|
||||||
inventory_id: UUID,
|
inventory_id: UUID,
|
||||||
data: InventoryUpdate,
|
data: InventoryUpdate,
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
|
if not can_update_inventory(session, inventory_id, current_user):
|
||||||
|
raise HTTPException(status_code=404, detail="ProductInventory not found")
|
||||||
entry = get_or_404(session, ProductInventory, inventory_id)
|
entry = get_or_404(session, ProductInventory, inventory_id)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(entry, key, value)
|
setattr(entry, key, value)
|
||||||
|
|
@ -32,7 +45,16 @@ def update_inventory(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/{inventory_id}", status_code=204)
|
@router.delete("/{inventory_id}", status_code=204)
|
||||||
def delete_inventory(inventory_id: UUID, session: Session = Depends(get_session)):
|
def delete_inventory(
|
||||||
entry = get_or_404(session, ProductInventory, inventory_id)
|
inventory_id: UUID,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
entry = get_owned_or_404_admin_override(
|
||||||
|
session,
|
||||||
|
ProductInventory,
|
||||||
|
inventory_id,
|
||||||
|
current_user,
|
||||||
|
)
|
||||||
session.delete(entry)
|
session.delete(entry)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
|
||||||
|
|
@ -2,14 +2,41 @@ from datetime import date
|
||||||
from typing import Any
|
from typing import Any
|
||||||
from uuid import UUID
|
from uuid import UUID
|
||||||
|
|
||||||
|
from fastapi import HTTPException
|
||||||
from sqlmodel import Session, col, select
|
from sqlmodel import Session, col, select
|
||||||
|
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.models import Product, UserProfile
|
from innercontext.models import Product, UserProfile
|
||||||
|
from innercontext.models.enums import Role
|
||||||
|
|
||||||
|
|
||||||
def get_user_profile(session: Session) -> UserProfile | None:
|
def _resolve_target_user_id(
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
) -> UUID:
|
||||||
|
if user_id is None:
|
||||||
|
return current_user.user_id
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
raise HTTPException(status_code=403, detail="Admin role required")
|
||||||
|
return user_id
|
||||||
|
|
||||||
|
|
||||||
|
def get_user_profile(
|
||||||
|
session: Session,
|
||||||
|
current_user: CurrentUser | None = None,
|
||||||
|
*,
|
||||||
|
user_id: UUID | None = None,
|
||||||
|
) -> UserProfile | None:
|
||||||
|
if current_user is None:
|
||||||
|
return session.exec(
|
||||||
|
select(UserProfile).order_by(col(UserProfile.created_at).desc())
|
||||||
|
).first()
|
||||||
|
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
return session.exec(
|
return session.exec(
|
||||||
select(UserProfile).order_by(col(UserProfile.created_at).desc())
|
select(UserProfile)
|
||||||
|
.where(UserProfile.user_id == target_user_id)
|
||||||
|
.order_by(col(UserProfile.created_at).desc())
|
||||||
).first()
|
).first()
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -20,8 +47,14 @@ def calculate_age(birth_date: date, reference_date: date) -> int:
|
||||||
return years
|
return years
|
||||||
|
|
||||||
|
|
||||||
def build_user_profile_context(session: Session, reference_date: date) -> str:
|
def build_user_profile_context(
|
||||||
profile = get_user_profile(session)
|
session: Session,
|
||||||
|
reference_date: date,
|
||||||
|
current_user: CurrentUser | None = None,
|
||||||
|
*,
|
||||||
|
user_id: UUID | None = None,
|
||||||
|
) -> str:
|
||||||
|
profile = get_user_profile(session, current_user, user_id=user_id)
|
||||||
if profile is None:
|
if profile is None:
|
||||||
return "USER PROFILE: no data\n"
|
return "USER PROFILE: no data\n"
|
||||||
|
|
||||||
|
|
@ -69,8 +102,9 @@ def build_product_context_summary(product: Product, has_inventory: bool = False)
|
||||||
|
|
||||||
# Get effect profile scores if available
|
# Get effect profile scores if available
|
||||||
effects = []
|
effects = []
|
||||||
if hasattr(product, "effect_profile") and product.effect_profile:
|
effect_profile = getattr(product, "product_effect_profile", None)
|
||||||
profile = product.effect_profile
|
if effect_profile:
|
||||||
|
profile = effect_profile
|
||||||
# Only include notable effects (score > 0)
|
# Only include notable effects (score > 0)
|
||||||
# Handle both dict (from DB) and object (from Pydantic)
|
# Handle both dict (from DB) and object (from Pydantic)
|
||||||
if isinstance(profile, dict):
|
if isinstance(profile, dict):
|
||||||
|
|
@ -165,11 +199,12 @@ def build_product_context_detailed(
|
||||||
|
|
||||||
# Effect profile
|
# Effect profile
|
||||||
effect_profile = None
|
effect_profile = None
|
||||||
if hasattr(product, "effect_profile") and product.effect_profile:
|
product_effect_profile = getattr(product, "effect_profile", None)
|
||||||
if isinstance(product.effect_profile, dict):
|
if product_effect_profile:
|
||||||
effect_profile = product.effect_profile
|
if isinstance(product_effect_profile, dict):
|
||||||
|
effect_profile = product_effect_profile
|
||||||
else:
|
else:
|
||||||
effect_profile = product.effect_profile.model_dump()
|
effect_profile = product_effect_profile.model_dump()
|
||||||
|
|
||||||
# Context rules
|
# Context rules
|
||||||
context_rules = None
|
context_rules = None
|
||||||
|
|
|
||||||
|
|
@ -1,7 +1,9 @@
|
||||||
|
# pyright: reportImportCycles=false, reportIncompatibleVariableOverride=false
|
||||||
|
|
||||||
import json
|
import json
|
||||||
import logging
|
import logging
|
||||||
from datetime import date
|
from datetime import date
|
||||||
from typing import Any, Literal, Optional
|
from typing import Any, Literal, Optional, cast
|
||||||
from uuid import UUID, uuid4
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, HTTPException, Query
|
from fastapi import APIRouter, Depends, HTTPException, Query
|
||||||
|
|
@ -13,6 +15,8 @@ from sqlalchemy import select as sa_select
|
||||||
from sqlmodel import Field, Session, SQLModel, col, select
|
from sqlmodel import Field, Session, SQLModel, col, select
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.api.authz import is_product_visible
|
||||||
from innercontext.api.llm_context import build_user_profile_context
|
from innercontext.api.llm_context import build_user_profile_context
|
||||||
from innercontext.api.product_llm_tools import (
|
from innercontext.api.product_llm_tools import (
|
||||||
PRODUCT_DETAILS_FUNCTION_DECLARATION,
|
PRODUCT_DETAILS_FUNCTION_DECLARATION,
|
||||||
|
|
@ -24,7 +28,8 @@ from innercontext.api.product_llm_tools import (
|
||||||
build_last_used_on_by_product,
|
build_last_used_on_by_product,
|
||||||
build_product_details_tool_handler,
|
build_product_details_tool_handler,
|
||||||
)
|
)
|
||||||
from innercontext.api.utils import get_or_404
|
from innercontext.api.utils import get_or_404, get_owned_or_404_admin_override
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.llm import (
|
from innercontext.llm import (
|
||||||
call_gemini,
|
call_gemini,
|
||||||
call_gemini_with_function_tools,
|
call_gemini_with_function_tools,
|
||||||
|
|
@ -42,12 +47,14 @@ from innercontext.models import (
|
||||||
SkinConcern,
|
SkinConcern,
|
||||||
SkinConditionSnapshot,
|
SkinConditionSnapshot,
|
||||||
)
|
)
|
||||||
|
from innercontext.models import Role
|
||||||
from innercontext.models.ai_log import AICallLog
|
from innercontext.models.ai_log import AICallLog
|
||||||
from innercontext.models.api_metadata import ResponseMetadata, TokenMetrics
|
from innercontext.models.api_metadata import ResponseMetadata, TokenMetrics
|
||||||
from innercontext.models.enums import (
|
from innercontext.models.enums import (
|
||||||
AbsorptionSpeed,
|
AbsorptionSpeed,
|
||||||
DayTime,
|
DayTime,
|
||||||
PriceTier,
|
PriceTier,
|
||||||
|
RemainingLevel,
|
||||||
SkinType,
|
SkinType,
|
||||||
TextureType,
|
TextureType,
|
||||||
)
|
)
|
||||||
|
|
@ -66,6 +73,34 @@ logger = logging.getLogger(__name__)
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
|
def _is_inventory_visible_to_user(
|
||||||
|
inventory: ProductInventory,
|
||||||
|
session: Session,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> bool:
|
||||||
|
if current_user.role is Role.ADMIN:
|
||||||
|
return True
|
||||||
|
if inventory.user_id == current_user.user_id:
|
||||||
|
return True
|
||||||
|
if not inventory.is_household_shared:
|
||||||
|
return False
|
||||||
|
if inventory.user_id is None:
|
||||||
|
return False
|
||||||
|
return is_product_visible(session, inventory.product_id, current_user)
|
||||||
|
|
||||||
|
|
||||||
|
def _visible_inventory_for_product(
|
||||||
|
inventories: list[ProductInventory],
|
||||||
|
session: Session,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> list[ProductInventory]:
|
||||||
|
return [
|
||||||
|
inventory
|
||||||
|
for inventory in inventories
|
||||||
|
if _is_inventory_visible_to_user(inventory, session, current_user)
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
def _build_response_metadata(session: Session, log_id: Any) -> ResponseMetadata | None:
|
def _build_response_metadata(session: Session, log_id: Any) -> ResponseMetadata | None:
|
||||||
"""Build ResponseMetadata from AICallLog for Phase 3 observability."""
|
"""Build ResponseMetadata from AICallLog for Phase 3 observability."""
|
||||||
if not log_id:
|
if not log_id:
|
||||||
|
|
@ -128,8 +163,6 @@ class ProductUpdate(SQLModel):
|
||||||
price_amount: Optional[float] = None
|
price_amount: Optional[float] = None
|
||||||
price_currency: Optional[str] = None
|
price_currency: Optional[str] = None
|
||||||
size_ml: Optional[float] = None
|
size_ml: Optional[float] = None
|
||||||
full_weight_g: Optional[float] = None
|
|
||||||
empty_weight_g: Optional[float] = None
|
|
||||||
pao_months: Optional[int] = None
|
pao_months: Optional[int] = None
|
||||||
|
|
||||||
inci: Optional[list[str]] = None
|
inci: Optional[list[str]] = None
|
||||||
|
|
@ -159,7 +192,6 @@ class ProductUpdate(SQLModel):
|
||||||
needle_length_mm: Optional[float] = None
|
needle_length_mm: Optional[float] = None
|
||||||
|
|
||||||
personal_tolerance_notes: Optional[str] = None
|
personal_tolerance_notes: Optional[str] = None
|
||||||
personal_repurchase_intent: Optional[bool] = None
|
|
||||||
|
|
||||||
|
|
||||||
class ProductParseRequest(SQLModel):
|
class ProductParseRequest(SQLModel):
|
||||||
|
|
@ -181,8 +213,6 @@ class ProductParseResponse(SQLModel):
|
||||||
price_amount: Optional[float] = None
|
price_amount: Optional[float] = None
|
||||||
price_currency: Optional[str] = None
|
price_currency: Optional[str] = None
|
||||||
size_ml: Optional[float] = None
|
size_ml: Optional[float] = None
|
||||||
full_weight_g: Optional[float] = None
|
|
||||||
empty_weight_g: Optional[float] = None
|
|
||||||
pao_months: Optional[int] = None
|
pao_months: Optional[int] = None
|
||||||
inci: Optional[list[str]] = None
|
inci: Optional[list[str]] = None
|
||||||
actives: Optional[list[ActiveIngredient]] = None
|
actives: Optional[list[ActiveIngredient]] = None
|
||||||
|
|
@ -218,15 +248,15 @@ class ProductListItem(SQLModel):
|
||||||
|
|
||||||
class AIActiveIngredient(ActiveIngredient):
|
class AIActiveIngredient(ActiveIngredient):
|
||||||
# Gemini API rejects int-enum values in response_schema; override with plain int.
|
# Gemini API rejects int-enum values in response_schema; override with plain int.
|
||||||
strength_level: Optional[int] = None # type: ignore[assignment]
|
strength_level: Optional[int] = None # pyright: ignore[reportIncompatibleVariableOverride]
|
||||||
irritation_potential: Optional[int] = None # type: ignore[assignment]
|
irritation_potential: Optional[int] = None # pyright: ignore[reportIncompatibleVariableOverride]
|
||||||
|
|
||||||
|
|
||||||
class ProductParseLLMResponse(ProductParseResponse):
|
class ProductParseLLMResponse(ProductParseResponse):
|
||||||
# Gemini response schema currently requires enum values to be strings.
|
# Gemini response schema currently requires enum values to be strings.
|
||||||
# Strength fields are numeric in our domain (1-3), so keep them as ints here
|
# Strength fields are numeric in our domain (1-3), so keep them as ints here
|
||||||
# and convert via ProductParseResponse validation afterward.
|
# and convert via ProductParseResponse validation afterward.
|
||||||
actives: Optional[list[AIActiveIngredient]] = None # type: ignore[assignment]
|
actives: Optional[list[AIActiveIngredient]] = None # pyright: ignore[reportIncompatibleVariableOverride]
|
||||||
|
|
||||||
|
|
||||||
class InventoryCreate(SQLModel):
|
class InventoryCreate(SQLModel):
|
||||||
|
|
@ -234,8 +264,7 @@ class InventoryCreate(SQLModel):
|
||||||
opened_at: Optional[date] = None
|
opened_at: Optional[date] = None
|
||||||
finished_at: Optional[date] = None
|
finished_at: Optional[date] = None
|
||||||
expiry_date: Optional[date] = None
|
expiry_date: Optional[date] = None
|
||||||
current_weight_g: Optional[float] = None
|
remaining_level: Optional[RemainingLevel] = None
|
||||||
last_weighed_at: Optional[date] = None
|
|
||||||
notes: Optional[str] = None
|
notes: Optional[str] = None
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -244,24 +273,191 @@ class InventoryUpdate(SQLModel):
|
||||||
opened_at: Optional[date] = None
|
opened_at: Optional[date] = None
|
||||||
finished_at: Optional[date] = None
|
finished_at: Optional[date] = None
|
||||||
expiry_date: Optional[date] = None
|
expiry_date: Optional[date] = None
|
||||||
current_weight_g: Optional[float] = None
|
remaining_level: Optional[RemainingLevel] = None
|
||||||
last_weighed_at: Optional[date] = None
|
|
||||||
notes: Optional[str] = None
|
notes: Optional[str] = None
|
||||||
|
|
||||||
|
|
||||||
|
def _remaining_level_rank(level: RemainingLevel | str | None) -> int:
|
||||||
|
if level in (RemainingLevel.NEARLY_EMPTY, "nearly_empty"):
|
||||||
|
return 0
|
||||||
|
if level in (RemainingLevel.LOW, "low"):
|
||||||
|
return 1
|
||||||
|
if level in (RemainingLevel.MEDIUM, "medium"):
|
||||||
|
return 2
|
||||||
|
if level in (RemainingLevel.HIGH, "high"):
|
||||||
|
return 3
|
||||||
|
return 99
|
||||||
|
|
||||||
|
|
||||||
|
_STAPLE_CATEGORIES = {"cleanser", "moisturizer", "spf"}
|
||||||
|
_OCCASIONAL_CATEGORIES = {"exfoliant", "mask", "spot_treatment", "tool"}
|
||||||
|
|
||||||
|
|
||||||
|
def _compute_days_since_last_used(
|
||||||
|
last_used_on: date | None, reference_date: date
|
||||||
|
) -> int | None:
|
||||||
|
if last_used_on is None:
|
||||||
|
return None
|
||||||
|
return max((reference_date - last_used_on).days, 0)
|
||||||
|
|
||||||
|
|
||||||
|
def _compute_replenishment_score(
|
||||||
|
*,
|
||||||
|
has_stock: bool,
|
||||||
|
sealed_backup_count: int,
|
||||||
|
lowest_remaining_level: str | None,
|
||||||
|
days_since_last_used: int | None,
|
||||||
|
category: ProductCategory | str,
|
||||||
|
) -> dict[str, object]:
|
||||||
|
score = 0
|
||||||
|
reason_codes: list[str] = []
|
||||||
|
category_value = _ev(category)
|
||||||
|
|
||||||
|
if not has_stock:
|
||||||
|
score = 90
|
||||||
|
reason_codes.append("out_of_stock")
|
||||||
|
elif sealed_backup_count > 0:
|
||||||
|
score = 10
|
||||||
|
reason_codes.append("has_sealed_backup")
|
||||||
|
elif lowest_remaining_level == "nearly_empty":
|
||||||
|
score = 80
|
||||||
|
reason_codes.append("nearly_empty_opened")
|
||||||
|
elif lowest_remaining_level == "low":
|
||||||
|
score = 60
|
||||||
|
reason_codes.append("low_opened")
|
||||||
|
elif lowest_remaining_level == "medium":
|
||||||
|
score = 25
|
||||||
|
elif lowest_remaining_level == "high":
|
||||||
|
score = 5
|
||||||
|
else:
|
||||||
|
reason_codes.append("insufficient_remaining_data")
|
||||||
|
|
||||||
|
if days_since_last_used is not None:
|
||||||
|
if days_since_last_used <= 3:
|
||||||
|
score += 20
|
||||||
|
reason_codes.append("recently_used")
|
||||||
|
elif days_since_last_used <= 7:
|
||||||
|
score += 12
|
||||||
|
reason_codes.append("recently_used")
|
||||||
|
elif days_since_last_used <= 14:
|
||||||
|
score += 6
|
||||||
|
elif days_since_last_used <= 30:
|
||||||
|
pass
|
||||||
|
elif days_since_last_used <= 60:
|
||||||
|
score -= 10
|
||||||
|
reason_codes.append("stale_usage")
|
||||||
|
else:
|
||||||
|
score -= 20
|
||||||
|
reason_codes.append("stale_usage")
|
||||||
|
|
||||||
|
if category_value in _STAPLE_CATEGORIES:
|
||||||
|
score += 15
|
||||||
|
reason_codes.append("staple_category")
|
||||||
|
elif category_value in _OCCASIONAL_CATEGORIES:
|
||||||
|
score -= 10
|
||||||
|
reason_codes.append("occasional_category")
|
||||||
|
elif category_value == "serum":
|
||||||
|
score += 5
|
||||||
|
|
||||||
|
if sealed_backup_count > 0 and has_stock:
|
||||||
|
score = min(score, 15)
|
||||||
|
if (
|
||||||
|
days_since_last_used is not None
|
||||||
|
and days_since_last_used > 60
|
||||||
|
and category_value not in _STAPLE_CATEGORIES
|
||||||
|
):
|
||||||
|
score = min(score, 25)
|
||||||
|
if (
|
||||||
|
lowest_remaining_level is None
|
||||||
|
and has_stock
|
||||||
|
and (days_since_last_used is None or days_since_last_used > 14)
|
||||||
|
):
|
||||||
|
score = min(score, 20)
|
||||||
|
|
||||||
|
score = max(0, min(score, 100))
|
||||||
|
if score >= 80:
|
||||||
|
priority_hint = "high"
|
||||||
|
elif score >= 50:
|
||||||
|
priority_hint = "medium"
|
||||||
|
elif score >= 25:
|
||||||
|
priority_hint = "low"
|
||||||
|
else:
|
||||||
|
priority_hint = "none"
|
||||||
|
|
||||||
|
return {
|
||||||
|
"replenishment_score": score,
|
||||||
|
"replenishment_priority_hint": priority_hint,
|
||||||
|
"repurchase_candidate": priority_hint != "none",
|
||||||
|
"replenishment_reason_codes": reason_codes,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def _summarize_inventory_state(entries: list[ProductInventory]) -> dict[str, object]:
|
||||||
|
active_entries = [entry for entry in entries if entry.finished_at is None]
|
||||||
|
opened_entries = [entry for entry in active_entries if entry.is_opened]
|
||||||
|
sealed_entries = [entry for entry in active_entries if not entry.is_opened]
|
||||||
|
|
||||||
|
opened_levels = [
|
||||||
|
_ev(entry.remaining_level)
|
||||||
|
for entry in opened_entries
|
||||||
|
if entry.remaining_level is not None
|
||||||
|
]
|
||||||
|
opened_levels_sorted = sorted(
|
||||||
|
opened_levels,
|
||||||
|
key=_remaining_level_rank,
|
||||||
|
)
|
||||||
|
lowest_opened_level = opened_levels_sorted[0] if opened_levels_sorted else None
|
||||||
|
|
||||||
|
stock_state = "healthy"
|
||||||
|
if not active_entries:
|
||||||
|
stock_state = "out_of_stock"
|
||||||
|
elif sealed_entries:
|
||||||
|
stock_state = "healthy"
|
||||||
|
elif lowest_opened_level == "nearly_empty":
|
||||||
|
stock_state = "urgent"
|
||||||
|
elif lowest_opened_level == "low":
|
||||||
|
stock_state = "low"
|
||||||
|
elif lowest_opened_level == "medium":
|
||||||
|
stock_state = "monitor"
|
||||||
|
|
||||||
|
replenishment_signal = "none"
|
||||||
|
if stock_state == "out_of_stock":
|
||||||
|
replenishment_signal = "out_of_stock"
|
||||||
|
elif stock_state == "urgent":
|
||||||
|
replenishment_signal = "urgent"
|
||||||
|
elif stock_state == "low":
|
||||||
|
replenishment_signal = "soon"
|
||||||
|
|
||||||
|
return {
|
||||||
|
"has_stock": bool(active_entries),
|
||||||
|
"active_count": len(active_entries),
|
||||||
|
"opened_count": len(opened_entries),
|
||||||
|
"sealed_backup_count": len(sealed_entries),
|
||||||
|
"opened_levels": opened_levels_sorted,
|
||||||
|
"lowest_opened_level": lowest_opened_level,
|
||||||
|
"stock_state": stock_state,
|
||||||
|
"replenishment_signal": replenishment_signal,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
# Shopping suggestion schemas
|
# Shopping suggestion schemas
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
|
||||||
class ProductSuggestion(PydanticBase):
|
class ProductSuggestion(PydanticBase):
|
||||||
category: str
|
category: ProductCategory
|
||||||
product_type: str
|
product_type: str
|
||||||
|
priority: Literal["high", "medium", "low"]
|
||||||
key_ingredients: list[str]
|
key_ingredients: list[str]
|
||||||
target_concerns: list[str]
|
target_concerns: list[SkinConcern]
|
||||||
why_needed: str
|
recommended_time: DayTime
|
||||||
recommended_time: str
|
|
||||||
frequency: str
|
frequency: str
|
||||||
|
short_reason: str
|
||||||
|
reason_to_buy_now: str
|
||||||
|
reason_not_needed_if_budget_tight: str | None = None
|
||||||
|
fit_with_current_routine: str
|
||||||
|
usage_cautions: list[str]
|
||||||
|
|
||||||
|
|
||||||
class ShoppingSuggestionResponse(PydanticBase):
|
class ShoppingSuggestionResponse(PydanticBase):
|
||||||
|
|
@ -274,13 +470,18 @@ class ShoppingSuggestionResponse(PydanticBase):
|
||||||
|
|
||||||
|
|
||||||
class _ProductSuggestionOut(PydanticBase):
|
class _ProductSuggestionOut(PydanticBase):
|
||||||
category: str
|
category: ProductCategory
|
||||||
product_type: str
|
product_type: str
|
||||||
|
priority: Literal["high", "medium", "low"]
|
||||||
key_ingredients: list[str]
|
key_ingredients: list[str]
|
||||||
target_concerns: list[str]
|
target_concerns: list[SkinConcern]
|
||||||
why_needed: str
|
recommended_time: DayTime
|
||||||
recommended_time: str
|
|
||||||
frequency: str
|
frequency: str
|
||||||
|
short_reason: str
|
||||||
|
reason_to_buy_now: str
|
||||||
|
reason_not_needed_if_budget_tight: str | None = None
|
||||||
|
fit_with_current_routine: str
|
||||||
|
usage_cautions: list[str]
|
||||||
|
|
||||||
|
|
||||||
class _ShoppingSuggestionsOut(PydanticBase):
|
class _ShoppingSuggestionsOut(PydanticBase):
|
||||||
|
|
@ -314,15 +515,6 @@ def _estimated_amount_per_use(category: ProductCategory) -> float | None:
|
||||||
return _ESTIMATED_AMOUNT_PER_USE.get(category)
|
return _ESTIMATED_AMOUNT_PER_USE.get(category)
|
||||||
|
|
||||||
|
|
||||||
def _net_weight_g(product: Product) -> float | None:
|
|
||||||
if product.full_weight_g is None or product.empty_weight_g is None:
|
|
||||||
return None
|
|
||||||
net = product.full_weight_g - product.empty_weight_g
|
|
||||||
if net <= 0:
|
|
||||||
return None
|
|
||||||
return net
|
|
||||||
|
|
||||||
|
|
||||||
def _price_per_use_pln(product: Product) -> float | None:
|
def _price_per_use_pln(product: Product) -> float | None:
|
||||||
if product.price_amount is None or product.price_currency is None:
|
if product.price_amount is None or product.price_currency is None:
|
||||||
return None
|
return None
|
||||||
|
|
@ -332,8 +524,6 @@ def _price_per_use_pln(product: Product) -> float | None:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
pack_amount = product.size_ml
|
pack_amount = product.size_ml
|
||||||
if pack_amount is None or pack_amount <= 0:
|
|
||||||
pack_amount = _net_weight_g(product)
|
|
||||||
if pack_amount is None or pack_amount <= 0:
|
if pack_amount is None or pack_amount <= 0:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
|
|
@ -454,6 +644,7 @@ def list_products(
|
||||||
is_medication: Optional[bool] = None,
|
is_medication: Optional[bool] = None,
|
||||||
is_tool: Optional[bool] = None,
|
is_tool: Optional[bool] = None,
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
stmt = select(Product)
|
stmt = select(Product)
|
||||||
if category is not None:
|
if category is not None:
|
||||||
|
|
@ -466,6 +657,12 @@ def list_products(
|
||||||
stmt = stmt.where(Product.is_tool == is_tool)
|
stmt = stmt.where(Product.is_tool == is_tool)
|
||||||
|
|
||||||
products = list(session.exec(stmt).all())
|
products = list(session.exec(stmt).all())
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
products = [
|
||||||
|
product
|
||||||
|
for product in products
|
||||||
|
if is_product_visible(session, product.id, current_user)
|
||||||
|
]
|
||||||
|
|
||||||
# Filter by targets (JSON column — done in Python)
|
# Filter by targets (JSON column — done in Python)
|
||||||
if targets:
|
if targets:
|
||||||
|
|
@ -490,26 +687,37 @@ def list_products(
|
||||||
if product_ids
|
if product_ids
|
||||||
else []
|
else []
|
||||||
)
|
)
|
||||||
inv_by_product: dict = {}
|
inv_by_product: dict[UUID, list[ProductInventory]] = {}
|
||||||
for inv in inventory_rows:
|
for inv in inventory_rows:
|
||||||
inv_by_product.setdefault(inv.product_id, []).append(inv)
|
inv_by_product.setdefault(inv.product_id, []).append(inv)
|
||||||
|
|
||||||
results = []
|
results = []
|
||||||
for p in products:
|
for p in products:
|
||||||
r = ProductWithInventory.model_validate(p, from_attributes=True)
|
r = ProductWithInventory.model_validate(p, from_attributes=True)
|
||||||
r.inventory = inv_by_product.get(p.id, [])
|
r.inventory = _visible_inventory_for_product(
|
||||||
|
inv_by_product.get(p.id, []),
|
||||||
|
session,
|
||||||
|
current_user,
|
||||||
|
)
|
||||||
results.append(r)
|
results.append(r)
|
||||||
return results
|
return results
|
||||||
|
|
||||||
|
|
||||||
@router.post("", response_model=ProductPublic, status_code=201)
|
@router.post("", response_model=ProductPublic, status_code=201)
|
||||||
def create_product(data: ProductCreate, session: Session = Depends(get_session)):
|
def create_product(
|
||||||
|
data: ProductCreate,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
payload = data.model_dump()
|
payload = data.model_dump()
|
||||||
if payload.get("price_currency"):
|
if payload.get("price_currency"):
|
||||||
payload["price_currency"] = str(payload["price_currency"]).upper()
|
payload["price_currency"] = str(payload["price_currency"]).upper()
|
||||||
|
|
||||||
|
product_id = uuid4()
|
||||||
product = Product(
|
product = Product(
|
||||||
id=uuid4(),
|
id=product_id,
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
short_id=str(product_id)[:8],
|
||||||
**payload,
|
**payload,
|
||||||
)
|
)
|
||||||
session.add(product)
|
session.add(product)
|
||||||
|
|
@ -589,8 +797,6 @@ OUTPUT SCHEMA (all fields optional — omit what you cannot determine):
|
||||||
"price_amount": number,
|
"price_amount": number,
|
||||||
"price_currency": string,
|
"price_currency": string,
|
||||||
"size_ml": number,
|
"size_ml": number,
|
||||||
"full_weight_g": number,
|
|
||||||
"empty_weight_g": number,
|
|
||||||
"pao_months": integer,
|
"pao_months": integer,
|
||||||
"inci": [string, ...],
|
"inci": [string, ...],
|
||||||
"actives": [
|
"actives": [
|
||||||
|
|
@ -693,10 +899,12 @@ def list_products_summary(
|
||||||
is_medication: Optional[bool] = None,
|
is_medication: Optional[bool] = None,
|
||||||
is_tool: Optional[bool] = None,
|
is_tool: Optional[bool] = None,
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
product_table = inspect(Product).local_table
|
product_table = inspect(Product).local_table
|
||||||
stmt = sa_select(
|
stmt = sa_select(
|
||||||
product_table.c.id,
|
product_table.c.id,
|
||||||
|
product_table.c.user_id,
|
||||||
product_table.c.name,
|
product_table.c.name,
|
||||||
product_table.c.brand,
|
product_table.c.brand,
|
||||||
product_table.c.category,
|
product_table.c.category,
|
||||||
|
|
@ -716,6 +924,10 @@ def list_products_summary(
|
||||||
stmt = stmt.where(product_table.c.is_tool == is_tool)
|
stmt = stmt.where(product_table.c.is_tool == is_tool)
|
||||||
|
|
||||||
rows = list(session.execute(stmt).all())
|
rows = list(session.execute(stmt).all())
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
rows = [
|
||||||
|
row for row in rows if is_product_visible(session, row[0], current_user)
|
||||||
|
]
|
||||||
|
|
||||||
if targets:
|
if targets:
|
||||||
target_values = {t.value for t in targets}
|
target_values = {t.value for t in targets}
|
||||||
|
|
@ -728,26 +940,11 @@ def list_products_summary(
|
||||||
)
|
)
|
||||||
]
|
]
|
||||||
|
|
||||||
product_ids = [row[0] for row in rows]
|
|
||||||
inventory_rows = (
|
|
||||||
session.exec(
|
|
||||||
select(ProductInventory).where(
|
|
||||||
col(ProductInventory.product_id).in_(product_ids)
|
|
||||||
)
|
|
||||||
).all()
|
|
||||||
if product_ids
|
|
||||||
else []
|
|
||||||
)
|
|
||||||
owned_ids = {
|
|
||||||
inv.product_id
|
|
||||||
for inv in inventory_rows
|
|
||||||
if inv.product_id is not None and inv.finished_at is None
|
|
||||||
}
|
|
||||||
|
|
||||||
results: list[ProductListItem] = []
|
results: list[ProductListItem] = []
|
||||||
for row in rows:
|
for row in rows:
|
||||||
(
|
(
|
||||||
product_id,
|
product_id,
|
||||||
|
product_user_id,
|
||||||
name,
|
name,
|
||||||
brand_value,
|
brand_value,
|
||||||
category_value,
|
category_value,
|
||||||
|
|
@ -765,7 +962,7 @@ def list_products_summary(
|
||||||
category=category_value,
|
category=category_value,
|
||||||
recommended_time=recommended_time,
|
recommended_time=recommended_time,
|
||||||
targets=row_targets or [],
|
targets=row_targets or [],
|
||||||
is_owned=product_id in owned_ids,
|
is_owned=product_user_id == current_user.user_id,
|
||||||
price_tier=price_tier,
|
price_tier=price_tier,
|
||||||
price_per_use_pln=price_per_use_pln,
|
price_per_use_pln=price_per_use_pln,
|
||||||
price_tier_source=price_tier_source,
|
price_tier_source=price_tier_source,
|
||||||
|
|
@ -776,22 +973,35 @@ def list_products_summary(
|
||||||
|
|
||||||
|
|
||||||
@router.get("/{product_id}", response_model=ProductWithInventory)
|
@router.get("/{product_id}", response_model=ProductWithInventory)
|
||||||
def get_product(product_id: UUID, session: Session = Depends(get_session)):
|
def get_product(
|
||||||
|
product_id: UUID,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
product = get_or_404(session, Product, product_id)
|
product = get_or_404(session, Product, product_id)
|
||||||
|
if not is_product_visible(session, product_id, current_user):
|
||||||
|
raise HTTPException(status_code=404, detail="Product not found")
|
||||||
|
|
||||||
inventory = session.exec(
|
inventory = session.exec(
|
||||||
select(ProductInventory).where(ProductInventory.product_id == product_id)
|
select(ProductInventory).where(ProductInventory.product_id == product_id)
|
||||||
).all()
|
).all()
|
||||||
result = ProductWithInventory.model_validate(product, from_attributes=True)
|
result = ProductWithInventory.model_validate(product, from_attributes=True)
|
||||||
result.inventory = list(inventory)
|
result.inventory = _visible_inventory_for_product(
|
||||||
|
list(inventory), session, current_user
|
||||||
|
)
|
||||||
return result
|
return result
|
||||||
|
|
||||||
|
|
||||||
@router.patch("/{product_id}", response_model=ProductPublic)
|
@router.patch("/{product_id}", response_model=ProductPublic)
|
||||||
def update_product(
|
def update_product(
|
||||||
product_id: UUID, data: ProductUpdate, session: Session = Depends(get_session)
|
product_id: UUID,
|
||||||
|
data: ProductUpdate,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
product = get_or_404(session, Product, product_id)
|
product = get_owned_or_404_admin_override(
|
||||||
|
session, Product, product_id, current_user
|
||||||
|
)
|
||||||
patch_data = data.model_dump(exclude_unset=True)
|
patch_data = data.model_dump(exclude_unset=True)
|
||||||
if patch_data.get("price_currency"):
|
if patch_data.get("price_currency"):
|
||||||
patch_data["price_currency"] = str(patch_data["price_currency"]).upper()
|
patch_data["price_currency"] = str(patch_data["price_currency"]).upper()
|
||||||
|
|
@ -806,8 +1016,14 @@ def update_product(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/{product_id}", status_code=204)
|
@router.delete("/{product_id}", status_code=204)
|
||||||
def delete_product(product_id: UUID, session: Session = Depends(get_session)):
|
def delete_product(
|
||||||
product = get_or_404(session, Product, product_id)
|
product_id: UUID,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
product = get_owned_or_404_admin_override(
|
||||||
|
session, Product, product_id, current_user
|
||||||
|
)
|
||||||
session.delete(product)
|
session.delete(product)
|
||||||
enqueue_pricing_recalc(session)
|
enqueue_pricing_recalc(session)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
@ -819,10 +1035,17 @@ def delete_product(product_id: UUID, session: Session = Depends(get_session)):
|
||||||
|
|
||||||
|
|
||||||
@router.get("/{product_id}/inventory", response_model=list[ProductInventory])
|
@router.get("/{product_id}/inventory", response_model=list[ProductInventory])
|
||||||
def list_product_inventory(product_id: UUID, session: Session = Depends(get_session)):
|
def list_product_inventory(
|
||||||
|
product_id: UUID,
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
get_or_404(session, Product, product_id)
|
get_or_404(session, Product, product_id)
|
||||||
|
if not is_product_visible(session, product_id, current_user):
|
||||||
|
raise HTTPException(status_code=404, detail="Product not found")
|
||||||
stmt = select(ProductInventory).where(ProductInventory.product_id == product_id)
|
stmt = select(ProductInventory).where(ProductInventory.product_id == product_id)
|
||||||
return session.exec(stmt).all()
|
inventories = list(session.exec(stmt).all())
|
||||||
|
return _visible_inventory_for_product(inventories, session, current_user)
|
||||||
|
|
||||||
|
|
||||||
@router.post(
|
@router.post(
|
||||||
|
|
@ -832,10 +1055,14 @@ def create_product_inventory(
|
||||||
product_id: UUID,
|
product_id: UUID,
|
||||||
data: InventoryCreate,
|
data: InventoryCreate,
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
get_or_404(session, Product, product_id)
|
product = get_owned_or_404_admin_override(
|
||||||
|
session, Product, product_id, current_user
|
||||||
|
)
|
||||||
entry = ProductInventory(
|
entry = ProductInventory(
|
||||||
id=uuid4(),
|
id=uuid4(),
|
||||||
|
user_id=product.user_id or current_user.user_id,
|
||||||
product_id=product_id,
|
product_id=product_id,
|
||||||
**data.model_dump(),
|
**data.model_dump(),
|
||||||
)
|
)
|
||||||
|
|
@ -859,8 +1086,19 @@ def _ev(v: object) -> str:
|
||||||
return str(v)
|
return str(v)
|
||||||
|
|
||||||
|
|
||||||
def _build_shopping_context(session: Session, reference_date: date) -> str:
|
def _build_shopping_context(
|
||||||
profile_ctx = build_user_profile_context(session, reference_date=reference_date)
|
session: Session,
|
||||||
|
reference_date: date,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
*,
|
||||||
|
products: list[Product] | None = None,
|
||||||
|
last_used_on_by_product: dict[str, date] | None = None,
|
||||||
|
) -> str:
|
||||||
|
profile_ctx = build_user_profile_context(
|
||||||
|
session,
|
||||||
|
reference_date=reference_date,
|
||||||
|
current_user=current_user,
|
||||||
|
)
|
||||||
snapshot = session.exec(
|
snapshot = session.exec(
|
||||||
select(SkinConditionSnapshot).order_by(
|
select(SkinConditionSnapshot).order_by(
|
||||||
col(SkinConditionSnapshot.snapshot_date).desc()
|
col(SkinConditionSnapshot.snapshot_date).desc()
|
||||||
|
|
@ -882,9 +1120,14 @@ def _build_shopping_context(session: Session, reference_date: date) -> str:
|
||||||
else:
|
else:
|
||||||
skin_lines.append(" (brak danych)")
|
skin_lines.append(" (brak danych)")
|
||||||
|
|
||||||
products = _get_shopping_products(session)
|
if products is None:
|
||||||
|
products = _get_shopping_products(session)
|
||||||
|
|
||||||
product_ids = [p.id for p in products]
|
product_ids = [p.id for p in products]
|
||||||
|
last_used_on_by_product = last_used_on_by_product or build_last_used_on_by_product(
|
||||||
|
session,
|
||||||
|
product_ids=product_ids,
|
||||||
|
)
|
||||||
inventory_rows = (
|
inventory_rows = (
|
||||||
session.exec(
|
session.exec(
|
||||||
select(ProductInventory).where(
|
select(ProductInventory).where(
|
||||||
|
|
@ -894,21 +1137,37 @@ def _build_shopping_context(session: Session, reference_date: date) -> str:
|
||||||
if product_ids
|
if product_ids
|
||||||
else []
|
else []
|
||||||
)
|
)
|
||||||
inv_by_product: dict = {}
|
inv_by_product: dict[UUID, list[ProductInventory]] = {}
|
||||||
for inv in inventory_rows:
|
for inv in inventory_rows:
|
||||||
inv_by_product.setdefault(inv.product_id, []).append(inv)
|
inv_by_product.setdefault(inv.product_id, []).append(inv)
|
||||||
|
|
||||||
products_lines = ["POSIADANE PRODUKTY:"]
|
products_lines = ["POSIADANE PRODUKTY:"]
|
||||||
products_lines.append(
|
products_lines.append(
|
||||||
" Legenda: [✓] = produkt dostępny (w magazynie), [✗] = brak w magazynie"
|
" Legenda: [✓] = aktywny zapas istnieje, [✗] = brak aktywnego zapasu"
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" Pola: stock_state, sealed_backup_count, lowest_remaining_level, days_since_last_used, replenishment_score, replenishment_priority_hint, repurchase_candidate"
|
||||||
)
|
)
|
||||||
for p in products:
|
for p in products:
|
||||||
active_inv = [i for i in inv_by_product.get(p.id, []) if i.finished_at is None]
|
inventory_summary = _summarize_inventory_state(inv_by_product.get(p.id, []))
|
||||||
has_stock = len(active_inv) > 0 # any unfinished inventory = in stock
|
stock = "✓" if inventory_summary["has_stock"] else "✗"
|
||||||
stock = "✓" if has_stock else "✗"
|
last_used_on = last_used_on_by_product.get(str(p.id))
|
||||||
|
days_since_last_used = _compute_days_since_last_used(
|
||||||
|
last_used_on, reference_date
|
||||||
|
)
|
||||||
|
replenishment = _compute_replenishment_score(
|
||||||
|
has_stock=bool(inventory_summary["has_stock"]),
|
||||||
|
sealed_backup_count=cast(int, inventory_summary["sealed_backup_count"]),
|
||||||
|
lowest_remaining_level=(
|
||||||
|
str(inventory_summary["lowest_opened_level"])
|
||||||
|
if inventory_summary["lowest_opened_level"] is not None
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
days_since_last_used=days_since_last_used,
|
||||||
|
category=p.category,
|
||||||
|
)
|
||||||
|
|
||||||
actives = _extract_active_names(p)
|
actives = _extract_active_names(p)
|
||||||
actives_str = f", actives: {actives}" if actives else ""
|
|
||||||
|
|
||||||
ep = p.product_effect_profile
|
ep = p.product_effect_profile
|
||||||
if isinstance(ep, dict):
|
if isinstance(ep, dict):
|
||||||
|
|
@ -919,13 +1178,55 @@ def _build_shopping_context(session: Session, reference_date: date) -> str:
|
||||||
for k, v in ep.model_dump().items()
|
for k, v in ep.model_dump().items()
|
||||||
if v >= 3
|
if v >= 3
|
||||||
}
|
}
|
||||||
effects_str = f", effects: {effects}" if effects else ""
|
|
||||||
|
|
||||||
targets = [_ev(t) for t in (p.targets or [])]
|
targets = [_ev(t) for t in (p.targets or [])]
|
||||||
|
product_header = f" [{stock}] id={p.id} {p.name}"
|
||||||
|
if p.brand:
|
||||||
|
product_header += f" ({p.brand})"
|
||||||
|
products_lines.append(product_header)
|
||||||
|
products_lines.append(f" category={_ev(p.category)}")
|
||||||
|
products_lines.append(f" targets={targets}")
|
||||||
|
if actives:
|
||||||
|
products_lines.append(f" actives={actives}")
|
||||||
|
if effects:
|
||||||
|
products_lines.append(f" effects={effects}")
|
||||||
|
products_lines.append(f" stock_state={inventory_summary['stock_state']}")
|
||||||
|
products_lines.append(f" active_count={inventory_summary['active_count']}")
|
||||||
|
products_lines.append(f" opened_count={inventory_summary['opened_count']}")
|
||||||
products_lines.append(
|
products_lines.append(
|
||||||
f" [{stock}] id={p.id} {p.name} ({p.brand or ''}) - {_ev(p.category)}, "
|
f" sealed_backup_count={inventory_summary['sealed_backup_count']}"
|
||||||
f"targets: {targets}{actives_str}{effects_str}"
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" lowest_remaining_level="
|
||||||
|
+ (
|
||||||
|
str(inventory_summary["lowest_opened_level"])
|
||||||
|
if inventory_summary["lowest_opened_level"] is not None
|
||||||
|
else "null"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" last_used_on=" + (last_used_on.isoformat() if last_used_on else "null")
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" days_since_last_used="
|
||||||
|
+ (
|
||||||
|
str(days_since_last_used)
|
||||||
|
if days_since_last_used is not None
|
||||||
|
else "null"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
f" replenishment_score={replenishment['replenishment_score']}"
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" replenishment_priority_hint="
|
||||||
|
+ str(replenishment["replenishment_priority_hint"])
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" repurchase_candidate="
|
||||||
|
+ ("true" if replenishment["repurchase_candidate"] else "false")
|
||||||
|
)
|
||||||
|
products_lines.append(
|
||||||
|
" reason_codes=" + str(replenishment["replenishment_reason_codes"])
|
||||||
)
|
)
|
||||||
|
|
||||||
return (
|
return (
|
||||||
|
|
@ -963,40 +1264,54 @@ def _extract_requested_product_ids(
|
||||||
|
|
||||||
|
|
||||||
_SHOPPING_SYSTEM_PROMPT = """Jesteś asystentem zakupowym w dziedzinie pielęgnacji skóry.
|
_SHOPPING_SYSTEM_PROMPT = """Jesteś asystentem zakupowym w dziedzinie pielęgnacji skóry.
|
||||||
Twoim zadaniem jest przeanalizować stan skóry użytkownika oraz produkty, które już posiada,
|
|
||||||
a następnie zasugerować TYPY produktów (bez marek), które mogłyby uzupełnić ich rutynę.
|
|
||||||
|
|
||||||
LEGENDA:
|
Oceń dwie rzeczy: realne luki w rutynie oraz odkupy produktów, które warto uzupełnić teraz.
|
||||||
- [✓] = produkt dostępny w magazynie (nawet jeśli jest zapieczętowany)
|
Działaj konserwatywnie: sugeruj tylko wtedy, gdy istnieje wyraźny powód praktyczny.
|
||||||
- [✗] = produkt niedostępny (brak w magazynie, wszystkie opakowania zużyte)
|
Najpierw rozważ luki w rutynie, potem odkupy.
|
||||||
|
Traktuj `replenishment_score`, `replenishment_priority_hint`, `repurchase_candidate`, `stock_state`, `lowest_remaining_level`, `sealed_backup_count`, `last_used_on` i `days_since_last_used` jako główne sygnały decyzji zakupowej.
|
||||||
|
`sealed_backup_count` odnosi się do zapasu tego produktu lub bardzo zbliżonego typu; inny produkt z tej samej kategorii obniża pilność tylko wtedy, gdy realistycznie pełni podobną funkcję w rutynie.
|
||||||
|
Jeśli zakup nie jest pilny dzięki alternatywie, wyjaśnij, czy chodzi o rzeczywisty zapas tego samego typu produktu, czy o funkcjonalny zamiennik z tej samej kategorii.
|
||||||
|
Jeśli istnieje sealed backup lub bardzo bliski funkcjonalny zamiennik, sugestia zwykle nie powinna mieć `priority=high`, chyba że potrzeba odkupu jest wyraźnie wyjątkowa.
|
||||||
|
`product_type` ma być krótką nazwą typu produktu i opisywać funkcję produktu, a nie opis marketingowy lub pełną specyfikację składu.
|
||||||
|
Przy odkupie możesz odwoływać się do konkretnych już posiadanych produktów, jeśli pomaga to uzasadnić decyzję. Przy lukach w rutynie sugeruj typy produktów, nie marki.
|
||||||
|
Uwzględniaj aktywne problemy skóry, miejsce produktu w rutynie, konflikty składników i bezpieczeństwo przy naruszonej barierze.
|
||||||
|
Pisz po polsku, językiem praktycznym i zakupowym. Unikaj nadmiernie medycznego lub diagnostycznego tonu.
|
||||||
|
Nie używaj w tekstach dla użytkownika surowych sygnałów systemowych ani dosłownych etykiet z warstwy danych, takich jak `id`, `score`, `status low` czy `poziom produktu jest niski`; opisuj naturalnie wniosek, np. jako kończący się zapas, niski zapas lub wysoką pilność odkupu.
|
||||||
|
Możesz zwrócić pustą listę `suggestions`, jeśli nie widzisz realnej potrzeby.
|
||||||
|
`target_concerns` musi używać wyłącznie wartości enumu `SkinConcern` poniżej. `priority` ustawiaj jako: high = wyraźna luka lub pilna potrzeba, medium = sensowne uzupełnienie, low = opcjonalny upgrade.
|
||||||
|
|
||||||
ZASADY:
|
DOZWOLONE WARTOŚCI ENUMÓW:
|
||||||
0. Sugeruj tylko wtedy, gdy jest realna potrzeba - nie zwracaj stałej liczby produktów
|
- category: "cleanser" | "toner" | "essence" | "serum" | "moisturizer" | "spf" | "mask" | "exfoliant" | "hair_treatment" | "tool" | "spot_treatment" | "oil"
|
||||||
1. Sugeruj TYLKO typy produktów, NIGDY konkretne marki (np. "Salicylic Acid 2% Masque", nie "La Roche-Posay")
|
- target_concerns: "acne" | "rosacea" | "hyperpigmentation" | "aging" | "dehydration" | "redness" | "damaged_barrier" | "pore_visibility" | "uneven_texture" | "hair_growth" | "sebum_excess"
|
||||||
2. Produkty oznaczone [✗] to te, których NIE MA w magazynie - możesz je zasugerować
|
- recommended_time: "am" | "pm" | "both"
|
||||||
3. Produkty oznaczone [✓] są już dostępne - nie sugeruj ich ponownie
|
|
||||||
4. Bierz pod uwagę aktywne problemy skóry (acne, hyperpigmentacja, aging, etc.)
|
|
||||||
5. Sugeruj realistyczną częstotliwość użycia (dzienna, 2-3x tygodniowo, etc.)
|
|
||||||
6. Zachowaj kolejność warstw: cleanse → toner → serum → moisturizer → SPF
|
|
||||||
7. Jeśli użytkownik ma uszkodzoną barierę, unikaj silnych eksfoliantów i retinoidów
|
|
||||||
8. Zwracaj uwagę na ewentualne konflikty polecanych składników z tymi, które użytkownik już posiada (np. nie polecaj peptydów miedziowych jeśli użytkownik nadużywa kwasów)
|
|
||||||
9. Odpowiadaj w języku polskim
|
|
||||||
|
|
||||||
Format odpowiedzi - zwróć wyłącznie JSON zgodny z podanym schematem."""
|
Format odpowiedzi - zwróć wyłącznie JSON zgodny z podanym schematem."""
|
||||||
|
|
||||||
|
|
||||||
@router.post("/suggest", response_model=ShoppingSuggestionResponse)
|
@router.post("/suggest", response_model=ShoppingSuggestionResponse)
|
||||||
def suggest_shopping(session: Session = Depends(get_session)):
|
def suggest_shopping(
|
||||||
context = _build_shopping_context(session, reference_date=date.today())
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
reference_date = date.today()
|
||||||
shopping_products = _get_shopping_products(session)
|
shopping_products = _get_shopping_products(session)
|
||||||
last_used_on_by_product = build_last_used_on_by_product(
|
last_used_on_by_product = build_last_used_on_by_product(
|
||||||
session,
|
session,
|
||||||
product_ids=[p.id for p in shopping_products],
|
product_ids=[p.id for p in shopping_products],
|
||||||
)
|
)
|
||||||
|
context = _build_shopping_context(
|
||||||
|
session,
|
||||||
|
reference_date=reference_date,
|
||||||
|
current_user=current_user,
|
||||||
|
products=shopping_products,
|
||||||
|
last_used_on_by_product=last_used_on_by_product,
|
||||||
|
)
|
||||||
|
|
||||||
prompt = (
|
prompt = (
|
||||||
f"Na podstawie poniższych danych przeanalizuj, jakie TYPY produktów "
|
"Przeanalizuj dane użytkownika i zaproponuj tylko te zakupy, które mają realny sens teraz.\n\n"
|
||||||
f"mogłyby uzupełnić rutynę pielęgnacyjną użytkownika.\n\n"
|
"Najpierw rozważ luki w rutynie, potem ewentualny odkup kończących się produktów.\n"
|
||||||
|
"Jeśli produkt już istnieje, ale ma niski zapas i jest nadal realnie używany, możesz zasugerować odkup tego typu produktu.\n"
|
||||||
|
"Jeśli produkt ma sealed backup albo nie był używany od dawna, zwykle nie sugeruj odkupu.\n\n"
|
||||||
f"{context}\n\n"
|
f"{context}\n\n"
|
||||||
"NARZEDZIA:\n"
|
"NARZEDZIA:\n"
|
||||||
"- Masz dostep do funkcji: get_product_details.\n"
|
"- Masz dostep do funkcji: get_product_details.\n"
|
||||||
|
|
@ -1081,13 +1396,27 @@ def suggest_shopping(session: Session = Depends(get_session)):
|
||||||
except json.JSONDecodeError as e:
|
except json.JSONDecodeError as e:
|
||||||
raise HTTPException(status_code=502, detail=f"LLM returned invalid JSON: {e}")
|
raise HTTPException(status_code=502, detail=f"LLM returned invalid JSON: {e}")
|
||||||
|
|
||||||
|
try:
|
||||||
|
parsed_response = _ShoppingSuggestionsOut.model_validate(parsed)
|
||||||
|
except ValidationError as exc:
|
||||||
|
formatted_errors = "; ".join(
|
||||||
|
f"{'/'.join(str(part) for part in err['loc'])}: {err['msg']}"
|
||||||
|
for err in exc.errors()
|
||||||
|
)
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=502,
|
||||||
|
detail=(
|
||||||
|
f"LLM returned invalid shopping suggestion schema: {formatted_errors}"
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
# Get products with inventory (those user already owns)
|
# Get products with inventory (those user already owns)
|
||||||
products_with_inventory = session.exec(
|
products_with_inventory_ids = session.exec(
|
||||||
select(Product).join(ProductInventory).distinct()
|
select(ProductInventory.product_id).distinct()
|
||||||
).all()
|
).all()
|
||||||
|
|
||||||
shopping_context = ShoppingValidationContext(
|
shopping_context = ShoppingValidationContext(
|
||||||
owned_product_ids=set(p.id for p in products_with_inventory),
|
owned_product_ids=set(products_with_inventory_ids),
|
||||||
valid_categories=set(ProductCategory),
|
valid_categories=set(ProductCategory),
|
||||||
valid_targets=set(SkinConcern),
|
valid_targets=set(SkinConcern),
|
||||||
)
|
)
|
||||||
|
|
@ -1097,8 +1426,11 @@ def suggest_shopping(session: Session = Depends(get_session)):
|
||||||
|
|
||||||
# Build initial shopping response without metadata
|
# Build initial shopping response without metadata
|
||||||
shopping_response = ShoppingSuggestionResponse(
|
shopping_response = ShoppingSuggestionResponse(
|
||||||
suggestions=[ProductSuggestion(**s) for s in parsed.get("suggestions", [])],
|
suggestions=[
|
||||||
reasoning=parsed.get("reasoning", ""),
|
ProductSuggestion.model_validate(s.model_dump())
|
||||||
|
for s in parsed_response.suggestions
|
||||||
|
],
|
||||||
|
reasoning=parsed_response.reasoning,
|
||||||
)
|
)
|
||||||
|
|
||||||
validation_result = validator.validate(shopping_response, shopping_context)
|
validation_result = validator.validate(shopping_response, shopping_context)
|
||||||
|
|
|
||||||
|
|
@ -1,11 +1,14 @@
|
||||||
from datetime import date, datetime
|
from datetime import date, datetime
|
||||||
from typing import Optional
|
from typing import Optional
|
||||||
|
from uuid import UUID
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends
|
from fastapi import APIRouter, Depends, Query
|
||||||
from sqlmodel import Session, SQLModel
|
from sqlmodel import Session, SQLModel
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
from innercontext.api.llm_context import get_user_profile
|
from innercontext.api.llm_context import get_user_profile
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.models import SexAtBirth, UserProfile
|
from innercontext.models import SexAtBirth, UserProfile
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
@ -25,8 +28,12 @@ class UserProfilePublic(SQLModel):
|
||||||
|
|
||||||
|
|
||||||
@router.get("", response_model=UserProfilePublic | None)
|
@router.get("", response_model=UserProfilePublic | None)
|
||||||
def get_profile(session: Session = Depends(get_session)):
|
def get_profile(
|
||||||
profile = get_user_profile(session)
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
profile = get_user_profile(session, current_user, user_id=user_id)
|
||||||
if profile is None:
|
if profile is None:
|
||||||
return None
|
return None
|
||||||
return UserProfilePublic(
|
return UserProfilePublic(
|
||||||
|
|
@ -39,12 +46,18 @@ def get_profile(session: Session = Depends(get_session)):
|
||||||
|
|
||||||
|
|
||||||
@router.patch("", response_model=UserProfilePublic)
|
@router.patch("", response_model=UserProfilePublic)
|
||||||
def upsert_profile(data: UserProfileUpdate, session: Session = Depends(get_session)):
|
def upsert_profile(
|
||||||
profile = get_user_profile(session)
|
data: UserProfileUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = user_id if user_id is not None else current_user.user_id
|
||||||
|
profile = get_user_profile(session, current_user, user_id=user_id)
|
||||||
payload = data.model_dump(exclude_unset=True)
|
payload = data.model_dump(exclude_unset=True)
|
||||||
|
|
||||||
if profile is None:
|
if profile is None:
|
||||||
profile = UserProfile(**payload)
|
profile = UserProfile(user_id=target_user_id, **payload)
|
||||||
else:
|
else:
|
||||||
for key, value in payload.items():
|
for key, value in payload.items():
|
||||||
setattr(profile, key, value)
|
setattr(profile, key, value)
|
||||||
|
|
|
||||||
|
|
@ -5,12 +5,15 @@ from datetime import date, timedelta
|
||||||
from typing import Any, Optional
|
from typing import Any, Optional
|
||||||
from uuid import UUID, uuid4
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, HTTPException
|
from fastapi import APIRouter, Depends, HTTPException, Query
|
||||||
from google.genai import types as genai_types
|
from google.genai import types as genai_types
|
||||||
from pydantic import BaseModel as PydanticBase
|
from pydantic import BaseModel as PydanticBase
|
||||||
|
|
||||||
from sqlmodel import Field, Session, SQLModel, col, select
|
from sqlmodel import Field, Session, SQLModel, col, select
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.api.authz import is_product_visible
|
||||||
from innercontext.api.llm_context import (
|
from innercontext.api.llm_context import (
|
||||||
build_products_context_summary_list,
|
build_products_context_summary_list,
|
||||||
build_user_profile_context,
|
build_user_profile_context,
|
||||||
|
|
@ -25,7 +28,8 @@ from innercontext.api.product_llm_tools import (
|
||||||
build_last_used_on_by_product,
|
build_last_used_on_by_product,
|
||||||
build_product_details_tool_handler,
|
build_product_details_tool_handler,
|
||||||
)
|
)
|
||||||
from innercontext.api.utils import get_or_404
|
from innercontext.api.utils import get_owned_or_404
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.llm import (
|
from innercontext.llm import (
|
||||||
call_gemini,
|
call_gemini,
|
||||||
call_gemini_with_function_tools,
|
call_gemini_with_function_tools,
|
||||||
|
|
@ -33,6 +37,7 @@ from innercontext.llm import (
|
||||||
)
|
)
|
||||||
from innercontext.llm_safety import isolate_user_input, sanitize_user_input
|
from innercontext.llm_safety import isolate_user_input, sanitize_user_input
|
||||||
from innercontext.models import (
|
from innercontext.models import (
|
||||||
|
HouseholdMembership,
|
||||||
GroomingSchedule,
|
GroomingSchedule,
|
||||||
Product,
|
Product,
|
||||||
ProductInventory,
|
ProductInventory,
|
||||||
|
|
@ -43,12 +48,16 @@ from innercontext.models import (
|
||||||
from innercontext.models.ai_log import AICallLog
|
from innercontext.models.ai_log import AICallLog
|
||||||
from innercontext.models.api_metadata import ResponseMetadata, TokenMetrics
|
from innercontext.models.api_metadata import ResponseMetadata, TokenMetrics
|
||||||
from innercontext.models.enums import GroomingAction, PartOfDay
|
from innercontext.models.enums import GroomingAction, PartOfDay
|
||||||
|
from innercontext.models.enums import Role
|
||||||
from innercontext.validators import BatchValidator, RoutineSuggestionValidator
|
from innercontext.validators import BatchValidator, RoutineSuggestionValidator
|
||||||
from innercontext.validators.batch_validator import BatchValidationContext
|
from innercontext.validators.batch_validator import BatchValidationContext
|
||||||
from innercontext.validators.routine_validator import RoutineValidationContext
|
from innercontext.validators.routine_validator import RoutineValidationContext
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
HISTORY_WINDOW_DAYS = 5
|
||||||
|
SNAPSHOT_FALLBACK_DAYS = 14
|
||||||
|
|
||||||
|
|
||||||
def _build_response_metadata(session: Session, log_id: Any) -> ResponseMetadata | None:
|
def _build_response_metadata(session: Session, log_id: Any) -> ResponseMetadata | None:
|
||||||
"""Build ResponseMetadata from AICallLog for Phase 3 observability."""
|
"""Build ResponseMetadata from AICallLog for Phase 3 observability."""
|
||||||
|
|
@ -83,6 +92,47 @@ def _build_response_metadata(session: Session, log_id: Any) -> ResponseMetadata
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
|
def _resolve_target_user_id(
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
) -> UUID:
|
||||||
|
if user_id is None:
|
||||||
|
return current_user.user_id
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
raise HTTPException(status_code=403, detail="Admin role required")
|
||||||
|
return user_id
|
||||||
|
|
||||||
|
|
||||||
|
def _shared_household_user_ids(
|
||||||
|
session: Session, current_user: CurrentUser
|
||||||
|
) -> set[UUID]:
|
||||||
|
membership = current_user.household_membership
|
||||||
|
if membership is None:
|
||||||
|
return set()
|
||||||
|
user_ids = session.exec(
|
||||||
|
select(HouseholdMembership.user_id).where(
|
||||||
|
HouseholdMembership.household_id == membership.household_id
|
||||||
|
)
|
||||||
|
).all()
|
||||||
|
return {uid for uid in user_ids if uid != current_user.user_id}
|
||||||
|
|
||||||
|
|
||||||
|
def _get_owned_or_admin_override(
|
||||||
|
session: Session,
|
||||||
|
model: type[Routine] | type[RoutineStep] | type[GroomingSchedule],
|
||||||
|
record_id: UUID,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
):
|
||||||
|
if user_id is None:
|
||||||
|
return get_owned_or_404(session, model, record_id, current_user)
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
record = session.get(model, record_id)
|
||||||
|
if record is None or record.user_id != target_user_id:
|
||||||
|
raise HTTPException(status_code=404, detail=f"{model.__name__} not found")
|
||||||
|
return record
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
# Schemas
|
# Schemas
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
@ -284,12 +334,65 @@ def _ev(v: object) -> str:
|
||||||
return str(v)
|
return str(v)
|
||||||
|
|
||||||
|
|
||||||
def _build_skin_context(session: Session) -> str:
|
def _get_recent_skin_snapshot(
|
||||||
|
session: Session,
|
||||||
|
target_user_id: UUID,
|
||||||
|
reference_date: date,
|
||||||
|
window_days: int = HISTORY_WINDOW_DAYS,
|
||||||
|
fallback_days: int = SNAPSHOT_FALLBACK_DAYS,
|
||||||
|
) -> SkinConditionSnapshot | None:
|
||||||
|
window_cutoff = reference_date - timedelta(days=window_days)
|
||||||
|
fallback_cutoff = reference_date - timedelta(days=fallback_days)
|
||||||
|
|
||||||
snapshot = session.exec(
|
snapshot = session.exec(
|
||||||
select(SkinConditionSnapshot).order_by(
|
select(SkinConditionSnapshot)
|
||||||
col(SkinConditionSnapshot.snapshot_date).desc()
|
.where(SkinConditionSnapshot.user_id == target_user_id)
|
||||||
)
|
.where(SkinConditionSnapshot.snapshot_date <= reference_date)
|
||||||
|
.where(SkinConditionSnapshot.snapshot_date >= window_cutoff)
|
||||||
|
.order_by(col(SkinConditionSnapshot.snapshot_date).desc())
|
||||||
).first()
|
).first()
|
||||||
|
if snapshot is not None:
|
||||||
|
return snapshot
|
||||||
|
|
||||||
|
return session.exec(
|
||||||
|
select(SkinConditionSnapshot)
|
||||||
|
.where(SkinConditionSnapshot.user_id == target_user_id)
|
||||||
|
.where(SkinConditionSnapshot.snapshot_date <= reference_date)
|
||||||
|
.where(SkinConditionSnapshot.snapshot_date >= fallback_cutoff)
|
||||||
|
.order_by(col(SkinConditionSnapshot.snapshot_date).desc())
|
||||||
|
).first()
|
||||||
|
|
||||||
|
|
||||||
|
def _get_latest_skin_snapshot_within_days(
|
||||||
|
session: Session,
|
||||||
|
target_user_id: UUID,
|
||||||
|
reference_date: date,
|
||||||
|
max_age_days: int = SNAPSHOT_FALLBACK_DAYS,
|
||||||
|
) -> SkinConditionSnapshot | None:
|
||||||
|
cutoff = reference_date - timedelta(days=max_age_days)
|
||||||
|
return session.exec(
|
||||||
|
select(SkinConditionSnapshot)
|
||||||
|
.where(SkinConditionSnapshot.user_id == target_user_id)
|
||||||
|
.where(SkinConditionSnapshot.snapshot_date <= reference_date)
|
||||||
|
.where(SkinConditionSnapshot.snapshot_date >= cutoff)
|
||||||
|
.order_by(col(SkinConditionSnapshot.snapshot_date).desc())
|
||||||
|
).first()
|
||||||
|
|
||||||
|
|
||||||
|
def _build_skin_context(
|
||||||
|
session: Session,
|
||||||
|
target_user_id: UUID,
|
||||||
|
reference_date: date,
|
||||||
|
window_days: int = HISTORY_WINDOW_DAYS,
|
||||||
|
fallback_days: int = SNAPSHOT_FALLBACK_DAYS,
|
||||||
|
) -> str:
|
||||||
|
snapshot = _get_recent_skin_snapshot(
|
||||||
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
window_days=window_days,
|
||||||
|
fallback_days=fallback_days,
|
||||||
|
)
|
||||||
if snapshot is None:
|
if snapshot is None:
|
||||||
return "SKIN CONDITION: no data\n"
|
return "SKIN CONDITION: no data\n"
|
||||||
ev = _ev
|
ev = _ev
|
||||||
|
|
@ -305,10 +408,14 @@ def _build_skin_context(session: Session) -> str:
|
||||||
|
|
||||||
|
|
||||||
def _build_grooming_context(
|
def _build_grooming_context(
|
||||||
session: Session, weekdays: Optional[list[int]] = None
|
session: Session,
|
||||||
|
target_user_id: UUID,
|
||||||
|
weekdays: Optional[list[int]] = None,
|
||||||
) -> str:
|
) -> str:
|
||||||
entries = session.exec(
|
entries = session.exec(
|
||||||
select(GroomingSchedule).order_by(col(GroomingSchedule.day_of_week))
|
select(GroomingSchedule)
|
||||||
|
.where(GroomingSchedule.user_id == target_user_id)
|
||||||
|
.order_by(col(GroomingSchedule.day_of_week))
|
||||||
).all()
|
).all()
|
||||||
if not entries:
|
if not entries:
|
||||||
return "GROOMING SCHEDULE: none\n"
|
return "GROOMING SCHEDULE: none\n"
|
||||||
|
|
@ -327,10 +434,62 @@ def _build_grooming_context(
|
||||||
return "\n".join(lines) + "\n"
|
return "\n".join(lines) + "\n"
|
||||||
|
|
||||||
|
|
||||||
def _build_recent_history(session: Session) -> str:
|
def _build_upcoming_grooming_context(
|
||||||
cutoff = date.today() - timedelta(days=7)
|
session: Session,
|
||||||
|
target_user_id: UUID,
|
||||||
|
start_date: date,
|
||||||
|
days: int = 7,
|
||||||
|
) -> str:
|
||||||
|
entries = session.exec(
|
||||||
|
select(GroomingSchedule)
|
||||||
|
.where(GroomingSchedule.user_id == target_user_id)
|
||||||
|
.order_by(col(GroomingSchedule.day_of_week))
|
||||||
|
).all()
|
||||||
|
if not entries:
|
||||||
|
return f"UPCOMING GROOMING (next {days} days): none\n"
|
||||||
|
|
||||||
|
entries_by_weekday: dict[int, list[GroomingSchedule]] = {}
|
||||||
|
for entry in entries:
|
||||||
|
entries_by_weekday.setdefault(entry.day_of_week, []).append(entry)
|
||||||
|
|
||||||
|
lines = [f"UPCOMING GROOMING (next {days} days):"]
|
||||||
|
for offset in range(days):
|
||||||
|
target_date = start_date + timedelta(days=offset)
|
||||||
|
day_entries = entries_by_weekday.get(target_date.weekday(), [])
|
||||||
|
if not day_entries:
|
||||||
|
continue
|
||||||
|
|
||||||
|
if offset == 0:
|
||||||
|
relative_label = "dzisiaj"
|
||||||
|
elif offset == 1:
|
||||||
|
relative_label = "jutro"
|
||||||
|
else:
|
||||||
|
relative_label = f"za {offset} dni"
|
||||||
|
|
||||||
|
day_name = _DAY_NAMES[target_date.weekday()]
|
||||||
|
actions = ", ".join(
|
||||||
|
f"{_ev(entry.action)}" + (f" ({entry.notes})" if entry.notes else "")
|
||||||
|
for entry in day_entries
|
||||||
|
)
|
||||||
|
lines.append(f" {relative_label} ({target_date}, {day_name}): {actions}")
|
||||||
|
|
||||||
|
if len(lines) == 1:
|
||||||
|
lines.append(" (no entries in this window)")
|
||||||
|
|
||||||
|
return "\n".join(lines) + "\n"
|
||||||
|
|
||||||
|
|
||||||
|
def _build_recent_history(
|
||||||
|
session: Session,
|
||||||
|
target_user_id: UUID,
|
||||||
|
reference_date: date,
|
||||||
|
window_days: int = HISTORY_WINDOW_DAYS,
|
||||||
|
) -> str:
|
||||||
|
cutoff = reference_date - timedelta(days=window_days)
|
||||||
routines = session.exec(
|
routines = session.exec(
|
||||||
select(Routine)
|
select(Routine)
|
||||||
|
.where(Routine.user_id == target_user_id)
|
||||||
|
.where(Routine.routine_date <= reference_date)
|
||||||
.where(Routine.routine_date >= cutoff)
|
.where(Routine.routine_date >= cutoff)
|
||||||
.order_by(col(Routine.routine_date).desc())
|
.order_by(col(Routine.routine_date).desc())
|
||||||
).all()
|
).all()
|
||||||
|
|
@ -341,6 +500,7 @@ def _build_recent_history(session: Session) -> str:
|
||||||
steps = session.exec(
|
steps = session.exec(
|
||||||
select(RoutineStep)
|
select(RoutineStep)
|
||||||
.where(RoutineStep.routine_id == r.id)
|
.where(RoutineStep.routine_id == r.id)
|
||||||
|
.where(RoutineStep.user_id == target_user_id)
|
||||||
.order_by(col(RoutineStep.order_index))
|
.order_by(col(RoutineStep.order_index))
|
||||||
).all()
|
).all()
|
||||||
step_names = []
|
step_names = []
|
||||||
|
|
@ -362,11 +522,37 @@ def _build_recent_history(session: Session) -> str:
|
||||||
|
|
||||||
def _get_available_products(
|
def _get_available_products(
|
||||||
session: Session,
|
session: Session,
|
||||||
|
current_user: CurrentUser,
|
||||||
time_filter: Optional[str] = None,
|
time_filter: Optional[str] = None,
|
||||||
include_minoxidil: bool = True,
|
include_minoxidil: bool = True,
|
||||||
) -> list[Product]:
|
) -> list[Product]:
|
||||||
stmt = select(Product).where(col(Product.is_tool).is_(False))
|
stmt = select(Product).where(col(Product.is_tool).is_(False))
|
||||||
products = session.exec(stmt).all()
|
if current_user.role is not Role.ADMIN:
|
||||||
|
owned_products = session.exec(
|
||||||
|
stmt.where(col(Product.user_id) == current_user.user_id)
|
||||||
|
).all()
|
||||||
|
shared_user_ids = _shared_household_user_ids(session, current_user)
|
||||||
|
shared_product_ids = (
|
||||||
|
session.exec(
|
||||||
|
select(ProductInventory.product_id)
|
||||||
|
.where(col(ProductInventory.is_household_shared).is_(True))
|
||||||
|
.where(col(ProductInventory.user_id).in_(list(shared_user_ids)))
|
||||||
|
.distinct()
|
||||||
|
).all()
|
||||||
|
if shared_user_ids
|
||||||
|
else []
|
||||||
|
)
|
||||||
|
shared_products = (
|
||||||
|
session.exec(stmt.where(col(Product.id).in_(shared_product_ids))).all()
|
||||||
|
if shared_product_ids
|
||||||
|
else []
|
||||||
|
)
|
||||||
|
products_by_id = {p.id: p for p in owned_products}
|
||||||
|
for product in shared_products:
|
||||||
|
products_by_id.setdefault(product.id, product)
|
||||||
|
products = list(products_by_id.values())
|
||||||
|
else:
|
||||||
|
products = session.exec(stmt).all()
|
||||||
result: list[Product] = []
|
result: list[Product] = []
|
||||||
for p in products:
|
for p in products:
|
||||||
if p.is_medication and not _is_minoxidil_product(p):
|
if p.is_medication and not _is_minoxidil_product(p):
|
||||||
|
|
@ -421,7 +607,9 @@ def _extract_requested_product_ids(
|
||||||
|
|
||||||
|
|
||||||
def _get_products_with_inventory(
|
def _get_products_with_inventory(
|
||||||
session: Session, product_ids: list[UUID]
|
session: Session,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
product_ids: list[UUID],
|
||||||
) -> set[UUID]:
|
) -> set[UUID]:
|
||||||
"""
|
"""
|
||||||
Return set of product IDs that have active (non-finished) inventory.
|
Return set of product IDs that have active (non-finished) inventory.
|
||||||
|
|
@ -431,17 +619,33 @@ def _get_products_with_inventory(
|
||||||
if not product_ids:
|
if not product_ids:
|
||||||
return set()
|
return set()
|
||||||
|
|
||||||
inventory_rows = session.exec(
|
stmt = (
|
||||||
select(ProductInventory.product_id)
|
select(ProductInventory.product_id)
|
||||||
.where(col(ProductInventory.product_id).in_(product_ids))
|
.where(col(ProductInventory.product_id).in_(product_ids))
|
||||||
.where(col(ProductInventory.finished_at).is_(None))
|
.where(col(ProductInventory.finished_at).is_(None))
|
||||||
.distinct()
|
)
|
||||||
).all()
|
if current_user.role is not Role.ADMIN:
|
||||||
|
owned_inventory_rows = session.exec(
|
||||||
|
stmt.where(col(ProductInventory.user_id) == current_user.user_id).distinct()
|
||||||
|
).all()
|
||||||
|
shared_user_ids = _shared_household_user_ids(session, current_user)
|
||||||
|
shared_inventory_rows = session.exec(
|
||||||
|
stmt.where(col(ProductInventory.is_household_shared).is_(True))
|
||||||
|
.where(col(ProductInventory.user_id).in_(list(shared_user_ids)))
|
||||||
|
.distinct()
|
||||||
|
).all()
|
||||||
|
inventory_rows = set(owned_inventory_rows)
|
||||||
|
inventory_rows.update(shared_inventory_rows)
|
||||||
|
return inventory_rows
|
||||||
|
inventory_rows = session.exec(stmt.distinct()).all()
|
||||||
return set(inventory_rows)
|
return set(inventory_rows)
|
||||||
|
|
||||||
|
|
||||||
def _expand_product_id(session: Session, short_or_full_id: str) -> UUID | None:
|
def _expand_product_id(
|
||||||
|
session: Session,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
short_or_full_id: str,
|
||||||
|
) -> UUID | None:
|
||||||
"""
|
"""
|
||||||
Expand 8-char short_id to full UUID, or validate full UUID.
|
Expand 8-char short_id to full UUID, or validate full UUID.
|
||||||
|
|
||||||
|
|
@ -462,7 +666,13 @@ def _expand_product_id(session: Session, short_or_full_id: str) -> UUID | None:
|
||||||
uuid_obj = UUID(short_or_full_id)
|
uuid_obj = UUID(short_or_full_id)
|
||||||
# Verify it exists
|
# Verify it exists
|
||||||
product = session.get(Product, uuid_obj)
|
product = session.get(Product, uuid_obj)
|
||||||
return uuid_obj if product else None
|
if product is None:
|
||||||
|
return None
|
||||||
|
return (
|
||||||
|
uuid_obj
|
||||||
|
if is_product_visible(session, uuid_obj, current_user)
|
||||||
|
else None
|
||||||
|
)
|
||||||
except (ValueError, TypeError):
|
except (ValueError, TypeError):
|
||||||
return None
|
return None
|
||||||
|
|
||||||
|
|
@ -471,7 +681,13 @@ def _expand_product_id(session: Session, short_or_full_id: str) -> UUID | None:
|
||||||
product = session.exec(
|
product = session.exec(
|
||||||
select(Product).where(Product.short_id == short_or_full_id)
|
select(Product).where(Product.short_id == short_or_full_id)
|
||||||
).first()
|
).first()
|
||||||
return product.id if product else None
|
if product is None:
|
||||||
|
return None
|
||||||
|
return (
|
||||||
|
product.id
|
||||||
|
if is_product_visible(session, product.id, current_user)
|
||||||
|
else None
|
||||||
|
)
|
||||||
|
|
||||||
# Invalid length
|
# Invalid length
|
||||||
return None
|
return None
|
||||||
|
|
@ -494,6 +710,17 @@ def _build_day_context(leaving_home: Optional[bool]) -> str:
|
||||||
return f"DAY CONTEXT:\n Leaving home: {val}\n"
|
return f"DAY CONTEXT:\n Leaving home: {val}\n"
|
||||||
|
|
||||||
|
|
||||||
|
def _coerce_action_type(value: object) -> GroomingAction | None:
|
||||||
|
if isinstance(value, GroomingAction):
|
||||||
|
return value
|
||||||
|
if isinstance(value, str):
|
||||||
|
try:
|
||||||
|
return GroomingAction(value)
|
||||||
|
except ValueError:
|
||||||
|
return None
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
_ROUTINES_SYSTEM_PROMPT = """\
|
_ROUTINES_SYSTEM_PROMPT = """\
|
||||||
Jesteś ekspertem planowania pielęgnacji.
|
Jesteś ekspertem planowania pielęgnacji.
|
||||||
|
|
||||||
|
|
@ -580,9 +807,12 @@ def list_routines(
|
||||||
from_date: Optional[date] = None,
|
from_date: Optional[date] = None,
|
||||||
to_date: Optional[date] = None,
|
to_date: Optional[date] = None,
|
||||||
part_of_day: Optional[PartOfDay] = None,
|
part_of_day: Optional[PartOfDay] = None,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
stmt = select(Routine)
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
stmt = select(Routine).where(Routine.user_id == target_user_id)
|
||||||
if from_date is not None:
|
if from_date is not None:
|
||||||
stmt = stmt.where(Routine.routine_date >= from_date)
|
stmt = stmt.where(Routine.routine_date >= from_date)
|
||||||
if to_date is not None:
|
if to_date is not None:
|
||||||
|
|
@ -592,10 +822,12 @@ def list_routines(
|
||||||
routines = session.exec(stmt).all()
|
routines = session.exec(stmt).all()
|
||||||
|
|
||||||
routine_ids = [r.id for r in routines]
|
routine_ids = [r.id for r in routines]
|
||||||
steps_by_routine: dict = {}
|
steps_by_routine: dict[UUID, list[RoutineStep]] = {}
|
||||||
if routine_ids:
|
if routine_ids:
|
||||||
all_steps = session.exec(
|
all_steps = session.exec(
|
||||||
select(RoutineStep).where(col(RoutineStep.routine_id).in_(routine_ids))
|
select(RoutineStep)
|
||||||
|
.where(col(RoutineStep.routine_id).in_(routine_ids))
|
||||||
|
.where(RoutineStep.user_id == target_user_id)
|
||||||
).all()
|
).all()
|
||||||
for step in all_steps:
|
for step in all_steps:
|
||||||
steps_by_routine.setdefault(step.routine_id, []).append(step)
|
steps_by_routine.setdefault(step.routine_id, []).append(step)
|
||||||
|
|
@ -611,8 +843,14 @@ def list_routines(
|
||||||
|
|
||||||
|
|
||||||
@router.post("", response_model=Routine, status_code=201)
|
@router.post("", response_model=Routine, status_code=201)
|
||||||
def create_routine(data: RoutineCreate, session: Session = Depends(get_session)):
|
def create_routine(
|
||||||
routine = Routine(id=uuid4(), **data.model_dump())
|
data: RoutineCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
routine = Routine(id=uuid4(), user_id=target_user_id, **data.model_dump())
|
||||||
session.add(routine)
|
session.add(routine)
|
||||||
session.commit()
|
session.commit()
|
||||||
session.refresh(routine)
|
session.refresh(routine)
|
||||||
|
|
@ -628,15 +866,35 @@ def create_routine(data: RoutineCreate, session: Session = Depends(get_session))
|
||||||
def suggest_routine(
|
def suggest_routine(
|
||||||
data: SuggestRoutineRequest,
|
data: SuggestRoutineRequest,
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
|
target_user_id = current_user.user_id
|
||||||
weekday = data.routine_date.weekday()
|
weekday = data.routine_date.weekday()
|
||||||
skin_ctx = _build_skin_context(session)
|
skin_ctx = _build_skin_context(
|
||||||
profile_ctx = build_user_profile_context(session, reference_date=data.routine_date)
|
session,
|
||||||
grooming_ctx = _build_grooming_context(session, weekdays=[weekday])
|
target_user_id=target_user_id,
|
||||||
history_ctx = _build_recent_history(session)
|
reference_date=data.routine_date,
|
||||||
|
)
|
||||||
|
profile_ctx = build_user_profile_context(
|
||||||
|
session,
|
||||||
|
reference_date=data.routine_date,
|
||||||
|
current_user=current_user,
|
||||||
|
)
|
||||||
|
upcoming_grooming_ctx = _build_upcoming_grooming_context(
|
||||||
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
start_date=data.routine_date,
|
||||||
|
days=7,
|
||||||
|
)
|
||||||
|
history_ctx = _build_recent_history(
|
||||||
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
reference_date=data.routine_date,
|
||||||
|
)
|
||||||
day_ctx = _build_day_context(data.leaving_home)
|
day_ctx = _build_day_context(data.leaving_home)
|
||||||
available_products = _get_available_products(
|
available_products = _get_available_products(
|
||||||
session,
|
session,
|
||||||
|
current_user=current_user,
|
||||||
time_filter=data.part_of_day.value,
|
time_filter=data.part_of_day.value,
|
||||||
include_minoxidil=data.include_minoxidil_beard,
|
include_minoxidil=data.include_minoxidil_beard,
|
||||||
)
|
)
|
||||||
|
|
@ -652,7 +910,9 @@ def suggest_routine(
|
||||||
|
|
||||||
# Phase 2: Use tiered context (summary mode for initial prompt)
|
# Phase 2: Use tiered context (summary mode for initial prompt)
|
||||||
products_with_inventory = _get_products_with_inventory(
|
products_with_inventory = _get_products_with_inventory(
|
||||||
session, [p.id for p in available_products]
|
session,
|
||||||
|
current_user,
|
||||||
|
[p.id for p in available_products],
|
||||||
)
|
)
|
||||||
products_ctx = build_products_context_summary_list(
|
products_ctx = build_products_context_summary_list(
|
||||||
available_products, products_with_inventory
|
available_products, products_with_inventory
|
||||||
|
|
@ -675,7 +935,7 @@ def suggest_routine(
|
||||||
f"na {data.routine_date} ({day_name}).\n\n"
|
f"na {data.routine_date} ({day_name}).\n\n"
|
||||||
f"{mode_line}\n"
|
f"{mode_line}\n"
|
||||||
"INPUT DATA:\n"
|
"INPUT DATA:\n"
|
||||||
f"{profile_ctx}\n{skin_ctx}\n{grooming_ctx}\n{history_ctx}\n{day_ctx}\n{products_ctx}\n{objectives_ctx}"
|
f"{profile_ctx}\n{skin_ctx}\n{upcoming_grooming_ctx}\n{history_ctx}\n{day_ctx}\n{products_ctx}\n{objectives_ctx}"
|
||||||
"\nNARZEDZIA:\n"
|
"\nNARZEDZIA:\n"
|
||||||
"- Masz dostep do funkcji: get_product_details.\n"
|
"- Masz dostep do funkcji: get_product_details.\n"
|
||||||
"- Wywoluj narzedzia tylko, gdy potrzebujesz detali do decyzji klinicznej/bezpieczenstwa.\n"
|
"- Wywoluj narzedzia tylko, gdy potrzebujesz detali do decyzji klinicznej/bezpieczenstwa.\n"
|
||||||
|
|
@ -765,22 +1025,35 @@ def suggest_routine(
|
||||||
|
|
||||||
# Translation layer: Expand short_ids (8 chars) to full UUIDs (36 chars)
|
# Translation layer: Expand short_ids (8 chars) to full UUIDs (36 chars)
|
||||||
steps = []
|
steps = []
|
||||||
for s in parsed.get("steps", []):
|
raw_steps = parsed.get("steps", [])
|
||||||
|
if not isinstance(raw_steps, list):
|
||||||
|
raw_steps = []
|
||||||
|
for s in raw_steps:
|
||||||
|
if not isinstance(s, dict):
|
||||||
|
continue
|
||||||
product_id_str = s.get("product_id")
|
product_id_str = s.get("product_id")
|
||||||
product_id_uuid = None
|
product_id_uuid = None
|
||||||
|
|
||||||
if product_id_str:
|
if isinstance(product_id_str, str) and product_id_str:
|
||||||
# Expand short_id or validate full UUID
|
# Expand short_id or validate full UUID
|
||||||
product_id_uuid = _expand_product_id(session, product_id_str)
|
product_id_uuid = _expand_product_id(session, current_user, product_id_str)
|
||||||
|
|
||||||
|
action_type = s.get("action_type")
|
||||||
|
action_notes = s.get("action_notes")
|
||||||
|
region = s.get("region")
|
||||||
|
why_this_step = s.get("why_this_step")
|
||||||
|
optional = s.get("optional")
|
||||||
|
|
||||||
steps.append(
|
steps.append(
|
||||||
SuggestedStep(
|
SuggestedStep(
|
||||||
product_id=product_id_uuid,
|
product_id=product_id_uuid,
|
||||||
action_type=s.get("action_type") or None,
|
action_type=_coerce_action_type(action_type),
|
||||||
action_notes=s.get("action_notes"),
|
action_notes=action_notes if isinstance(action_notes, str) else None,
|
||||||
region=s.get("region"),
|
region=region if isinstance(region, str) else None,
|
||||||
why_this_step=s.get("why_this_step"),
|
why_this_step=(
|
||||||
optional=s.get("optional"),
|
why_this_step if isinstance(why_this_step, str) else None
|
||||||
|
),
|
||||||
|
optional=optional if isinstance(optional, bool) else None,
|
||||||
)
|
)
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
@ -802,10 +1075,11 @@ def suggest_routine(
|
||||||
)
|
)
|
||||||
|
|
||||||
# Get skin snapshot for barrier state
|
# Get skin snapshot for barrier state
|
||||||
stmt = select(SkinConditionSnapshot).order_by(
|
skin_snapshot = _get_latest_skin_snapshot_within_days(
|
||||||
col(SkinConditionSnapshot.snapshot_date).desc()
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
reference_date=data.routine_date,
|
||||||
)
|
)
|
||||||
skin_snapshot = session.exec(stmt).first()
|
|
||||||
|
|
||||||
# Build validation context
|
# Build validation context
|
||||||
products_by_id = {p.id: p for p in available_products}
|
products_by_id = {p.id: p for p in available_products}
|
||||||
|
|
@ -864,7 +1138,9 @@ def suggest_routine(
|
||||||
def suggest_batch(
|
def suggest_batch(
|
||||||
data: SuggestBatchRequest,
|
data: SuggestBatchRequest,
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
|
target_user_id = current_user.user_id
|
||||||
delta = (data.to_date - data.from_date).days + 1
|
delta = (data.to_date - data.from_date).days + 1
|
||||||
if delta > 14:
|
if delta > 14:
|
||||||
raise HTTPException(
|
raise HTTPException(
|
||||||
|
|
@ -876,18 +1152,37 @@ def suggest_batch(
|
||||||
weekdays = list(
|
weekdays = list(
|
||||||
{(data.from_date + timedelta(days=i)).weekday() for i in range(delta)}
|
{(data.from_date + timedelta(days=i)).weekday() for i in range(delta)}
|
||||||
)
|
)
|
||||||
profile_ctx = build_user_profile_context(session, reference_date=data.from_date)
|
profile_ctx = build_user_profile_context(
|
||||||
skin_ctx = _build_skin_context(session)
|
session,
|
||||||
grooming_ctx = _build_grooming_context(session, weekdays=weekdays)
|
reference_date=data.from_date,
|
||||||
history_ctx = _build_recent_history(session)
|
current_user=current_user,
|
||||||
|
)
|
||||||
|
skin_ctx = _build_skin_context(
|
||||||
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
reference_date=data.from_date,
|
||||||
|
)
|
||||||
|
grooming_ctx = _build_grooming_context(
|
||||||
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
weekdays=weekdays,
|
||||||
|
)
|
||||||
|
history_ctx = _build_recent_history(
|
||||||
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
reference_date=data.from_date,
|
||||||
|
)
|
||||||
batch_products = _get_available_products(
|
batch_products = _get_available_products(
|
||||||
session,
|
session,
|
||||||
|
current_user=current_user,
|
||||||
include_minoxidil=data.include_minoxidil_beard,
|
include_minoxidil=data.include_minoxidil_beard,
|
||||||
)
|
)
|
||||||
|
|
||||||
# Phase 2: Use tiered context (summary mode for batch planning)
|
# Phase 2: Use tiered context (summary mode for batch planning)
|
||||||
products_with_inventory = _get_products_with_inventory(
|
products_with_inventory = _get_products_with_inventory(
|
||||||
session, [p.id for p in batch_products]
|
session,
|
||||||
|
current_user,
|
||||||
|
[p.id for p in batch_products],
|
||||||
)
|
)
|
||||||
products_ctx = build_products_context_summary_list(
|
products_ctx = build_products_context_summary_list(
|
||||||
batch_products, products_with_inventory
|
batch_products, products_with_inventory
|
||||||
|
|
@ -945,25 +1240,39 @@ def suggest_batch(
|
||||||
except json.JSONDecodeError as e:
|
except json.JSONDecodeError as e:
|
||||||
raise HTTPException(status_code=502, detail=f"LLM returned invalid JSON: {e}")
|
raise HTTPException(status_code=502, detail=f"LLM returned invalid JSON: {e}")
|
||||||
|
|
||||||
def _parse_steps(raw_steps: list) -> list[SuggestedStep]:
|
def _parse_steps(raw_steps: list[dict[str, object]]) -> list[SuggestedStep]:
|
||||||
"""Parse steps and expand short_ids to full UUIDs."""
|
"""Parse steps and expand short_ids to full UUIDs."""
|
||||||
result = []
|
result = []
|
||||||
for s in raw_steps:
|
for s in raw_steps:
|
||||||
product_id_str = s.get("product_id")
|
product_id_str = s.get("product_id")
|
||||||
product_id_uuid = None
|
product_id_uuid = None
|
||||||
|
|
||||||
if product_id_str:
|
if isinstance(product_id_str, str) and product_id_str:
|
||||||
# Translation layer: expand short_id to full UUID
|
# Translation layer: expand short_id to full UUID
|
||||||
product_id_uuid = _expand_product_id(session, product_id_str)
|
product_id_uuid = _expand_product_id(
|
||||||
|
session,
|
||||||
|
current_user,
|
||||||
|
product_id_str,
|
||||||
|
)
|
||||||
|
|
||||||
|
action_type = s.get("action_type")
|
||||||
|
action_notes = s.get("action_notes")
|
||||||
|
region = s.get("region")
|
||||||
|
why_this_step = s.get("why_this_step")
|
||||||
|
optional = s.get("optional")
|
||||||
|
|
||||||
result.append(
|
result.append(
|
||||||
SuggestedStep(
|
SuggestedStep(
|
||||||
product_id=product_id_uuid,
|
product_id=product_id_uuid,
|
||||||
action_type=s.get("action_type") or None,
|
action_type=_coerce_action_type(action_type),
|
||||||
action_notes=s.get("action_notes"),
|
action_notes=(
|
||||||
region=s.get("region"),
|
action_notes if isinstance(action_notes, str) else None
|
||||||
why_this_step=s.get("why_this_step"),
|
),
|
||||||
optional=s.get("optional"),
|
region=region if isinstance(region, str) else None,
|
||||||
|
why_this_step=(
|
||||||
|
why_this_step if isinstance(why_this_step, str) else None
|
||||||
|
),
|
||||||
|
optional=optional if isinstance(optional, bool) else None,
|
||||||
)
|
)
|
||||||
)
|
)
|
||||||
return result
|
return result
|
||||||
|
|
@ -984,10 +1293,11 @@ def suggest_batch(
|
||||||
)
|
)
|
||||||
|
|
||||||
# Get skin snapshot for barrier state
|
# Get skin snapshot for barrier state
|
||||||
stmt = select(SkinConditionSnapshot).order_by(
|
skin_snapshot = _get_latest_skin_snapshot_within_days(
|
||||||
col(SkinConditionSnapshot.snapshot_date).desc()
|
session,
|
||||||
|
target_user_id=target_user_id,
|
||||||
|
reference_date=data.from_date,
|
||||||
)
|
)
|
||||||
skin_snapshot = session.exec(stmt).first()
|
|
||||||
|
|
||||||
# Build validation context
|
# Build validation context
|
||||||
products_by_id = {p.id: p for p in batch_products}
|
products_by_id = {p.id: p for p in batch_products}
|
||||||
|
|
@ -1040,15 +1350,36 @@ def suggest_batch(
|
||||||
|
|
||||||
# Grooming-schedule GET must appear before /{routine_id} to avoid being shadowed
|
# Grooming-schedule GET must appear before /{routine_id} to avoid being shadowed
|
||||||
@router.get("/grooming-schedule", response_model=list[GroomingSchedule])
|
@router.get("/grooming-schedule", response_model=list[GroomingSchedule])
|
||||||
def list_grooming_schedule(session: Session = Depends(get_session)):
|
def list_grooming_schedule(
|
||||||
return session.exec(select(GroomingSchedule)).all()
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
return session.exec(
|
||||||
|
select(GroomingSchedule).where(GroomingSchedule.user_id == target_user_id)
|
||||||
|
).all()
|
||||||
|
|
||||||
|
|
||||||
@router.get("/{routine_id}")
|
@router.get("/{routine_id}")
|
||||||
def get_routine(routine_id: UUID, session: Session = Depends(get_session)):
|
def get_routine(
|
||||||
routine = get_or_404(session, Routine, routine_id)
|
routine_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
routine = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
Routine,
|
||||||
|
routine_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
steps = session.exec(
|
steps = session.exec(
|
||||||
select(RoutineStep).where(RoutineStep.routine_id == routine_id)
|
select(RoutineStep)
|
||||||
|
.where(RoutineStep.routine_id == routine_id)
|
||||||
|
.where(RoutineStep.user_id == target_user_id)
|
||||||
).all()
|
).all()
|
||||||
data = routine.model_dump(mode="json")
|
data = routine.model_dump(mode="json")
|
||||||
data["steps"] = [step.model_dump(mode="json") for step in steps]
|
data["steps"] = [step.model_dump(mode="json") for step in steps]
|
||||||
|
|
@ -1059,9 +1390,17 @@ def get_routine(routine_id: UUID, session: Session = Depends(get_session)):
|
||||||
def update_routine(
|
def update_routine(
|
||||||
routine_id: UUID,
|
routine_id: UUID,
|
||||||
data: RoutineUpdate,
|
data: RoutineUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
routine = get_or_404(session, Routine, routine_id)
|
routine = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
Routine,
|
||||||
|
routine_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(routine, key, value)
|
setattr(routine, key, value)
|
||||||
session.add(routine)
|
session.add(routine)
|
||||||
|
|
@ -1071,8 +1410,19 @@ def update_routine(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/{routine_id}", status_code=204)
|
@router.delete("/{routine_id}", status_code=204)
|
||||||
def delete_routine(routine_id: UUID, session: Session = Depends(get_session)):
|
def delete_routine(
|
||||||
routine = get_or_404(session, Routine, routine_id)
|
routine_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
routine = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
Routine,
|
||||||
|
routine_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
session.delete(routine)
|
session.delete(routine)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
|
|
@ -1086,10 +1436,28 @@ def delete_routine(routine_id: UUID, session: Session = Depends(get_session)):
|
||||||
def add_step(
|
def add_step(
|
||||||
routine_id: UUID,
|
routine_id: UUID,
|
||||||
data: RoutineStepCreate,
|
data: RoutineStepCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
get_or_404(session, Routine, routine_id)
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
step = RoutineStep(id=uuid4(), routine_id=routine_id, **data.model_dump())
|
_ = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
Routine,
|
||||||
|
routine_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
|
if data.product_id and not is_product_visible(
|
||||||
|
session, data.product_id, current_user
|
||||||
|
):
|
||||||
|
raise HTTPException(status_code=404, detail="Product not found")
|
||||||
|
step = RoutineStep(
|
||||||
|
id=uuid4(),
|
||||||
|
user_id=target_user_id,
|
||||||
|
routine_id=routine_id,
|
||||||
|
**data.model_dump(),
|
||||||
|
)
|
||||||
session.add(step)
|
session.add(step)
|
||||||
session.commit()
|
session.commit()
|
||||||
session.refresh(step)
|
session.refresh(step)
|
||||||
|
|
@ -1100,9 +1468,21 @@ def add_step(
|
||||||
def update_step(
|
def update_step(
|
||||||
step_id: UUID,
|
step_id: UUID,
|
||||||
data: RoutineStepUpdate,
|
data: RoutineStepUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
step = get_or_404(session, RoutineStep, step_id)
|
step = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
RoutineStep,
|
||||||
|
step_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
|
if data.product_id and not is_product_visible(
|
||||||
|
session, data.product_id, current_user
|
||||||
|
):
|
||||||
|
raise HTTPException(status_code=404, detail="Product not found")
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(step, key, value)
|
setattr(step, key, value)
|
||||||
session.add(step)
|
session.add(step)
|
||||||
|
|
@ -1112,8 +1492,19 @@ def update_step(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/steps/{step_id}", status_code=204)
|
@router.delete("/steps/{step_id}", status_code=204)
|
||||||
def delete_step(step_id: UUID, session: Session = Depends(get_session)):
|
def delete_step(
|
||||||
step = get_or_404(session, RoutineStep, step_id)
|
step_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
step = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
RoutineStep,
|
||||||
|
step_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
session.delete(step)
|
session.delete(step)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
|
|
@ -1125,9 +1516,13 @@ def delete_step(step_id: UUID, session: Session = Depends(get_session)):
|
||||||
|
|
||||||
@router.post("/grooming-schedule", response_model=GroomingSchedule, status_code=201)
|
@router.post("/grooming-schedule", response_model=GroomingSchedule, status_code=201)
|
||||||
def create_grooming_schedule(
|
def create_grooming_schedule(
|
||||||
data: GroomingScheduleCreate, session: Session = Depends(get_session)
|
data: GroomingScheduleCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
entry = GroomingSchedule(id=uuid4(), **data.model_dump())
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
entry = GroomingSchedule(id=uuid4(), user_id=target_user_id, **data.model_dump())
|
||||||
session.add(entry)
|
session.add(entry)
|
||||||
session.commit()
|
session.commit()
|
||||||
session.refresh(entry)
|
session.refresh(entry)
|
||||||
|
|
@ -1138,9 +1533,17 @@ def create_grooming_schedule(
|
||||||
def update_grooming_schedule(
|
def update_grooming_schedule(
|
||||||
entry_id: UUID,
|
entry_id: UUID,
|
||||||
data: GroomingScheduleUpdate,
|
data: GroomingScheduleUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
entry = get_or_404(session, GroomingSchedule, entry_id)
|
entry = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
GroomingSchedule,
|
||||||
|
entry_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(entry, key, value)
|
setattr(entry, key, value)
|
||||||
session.add(entry)
|
session.add(entry)
|
||||||
|
|
@ -1150,7 +1553,18 @@ def update_grooming_schedule(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/grooming-schedule/{entry_id}", status_code=204)
|
@router.delete("/grooming-schedule/{entry_id}", status_code=204)
|
||||||
def delete_grooming_schedule(entry_id: UUID, session: Session = Depends(get_session)):
|
def delete_grooming_schedule(
|
||||||
entry = get_or_404(session, GroomingSchedule, entry_id)
|
entry_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
entry = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
GroomingSchedule,
|
||||||
|
entry_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
session.delete(entry)
|
session.delete(entry)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
|
||||||
|
|
@ -4,15 +4,17 @@ from datetime import date
|
||||||
from typing import Optional
|
from typing import Optional
|
||||||
from uuid import UUID, uuid4
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
from fastapi import APIRouter, Depends, File, HTTPException, UploadFile
|
from fastapi import APIRouter, Depends, File, HTTPException, Query, UploadFile
|
||||||
from google.genai import types as genai_types
|
from google.genai import types as genai_types
|
||||||
from pydantic import BaseModel as PydanticBase
|
from pydantic import BaseModel as PydanticBase
|
||||||
from pydantic import ValidationError
|
from pydantic import ValidationError
|
||||||
from sqlmodel import Session, SQLModel, select
|
from sqlmodel import Session, SQLModel, select
|
||||||
|
|
||||||
from db import get_session
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
from innercontext.api.llm_context import build_user_profile_context
|
from innercontext.api.llm_context import build_user_profile_context
|
||||||
from innercontext.api.utils import get_or_404
|
from innercontext.api.utils import get_owned_or_404
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
from innercontext.llm import call_gemini, get_extraction_config
|
from innercontext.llm import call_gemini, get_extraction_config
|
||||||
from innercontext.models import (
|
from innercontext.models import (
|
||||||
SkinConditionSnapshot,
|
SkinConditionSnapshot,
|
||||||
|
|
@ -26,6 +28,7 @@ from innercontext.models.enums import (
|
||||||
SkinTexture,
|
SkinTexture,
|
||||||
SkinType,
|
SkinType,
|
||||||
)
|
)
|
||||||
|
from innercontext.models.enums import Role
|
||||||
from innercontext.validators import PhotoValidator
|
from innercontext.validators import PhotoValidator
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
@ -135,6 +138,34 @@ OUTPUT (all fields optional):
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
|
||||||
|
def _resolve_target_user_id(
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
) -> UUID:
|
||||||
|
if user_id is None:
|
||||||
|
return current_user.user_id
|
||||||
|
if current_user.role is not Role.ADMIN:
|
||||||
|
raise HTTPException(status_code=403, detail="Admin role required")
|
||||||
|
return user_id
|
||||||
|
|
||||||
|
|
||||||
|
def _get_owned_or_admin_override(
|
||||||
|
session: Session,
|
||||||
|
snapshot_id: UUID,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
user_id: UUID | None,
|
||||||
|
) -> SkinConditionSnapshot:
|
||||||
|
if user_id is None:
|
||||||
|
return get_owned_or_404(
|
||||||
|
session, SkinConditionSnapshot, snapshot_id, current_user
|
||||||
|
)
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
snapshot = session.get(SkinConditionSnapshot, snapshot_id)
|
||||||
|
if snapshot is None or snapshot.user_id != target_user_id:
|
||||||
|
raise HTTPException(status_code=404, detail="SkinConditionSnapshot not found")
|
||||||
|
return snapshot
|
||||||
|
|
||||||
|
|
||||||
MAX_IMAGE_BYTES = 5 * 1024 * 1024 # 5 MB
|
MAX_IMAGE_BYTES = 5 * 1024 * 1024 # 5 MB
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -142,6 +173,7 @@ MAX_IMAGE_BYTES = 5 * 1024 * 1024 # 5 MB
|
||||||
async def analyze_skin_photos(
|
async def analyze_skin_photos(
|
||||||
photos: list[UploadFile] = File(...),
|
photos: list[UploadFile] = File(...),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
) -> SkinPhotoAnalysisResponse:
|
) -> SkinPhotoAnalysisResponse:
|
||||||
if not (1 <= len(photos) <= 3):
|
if not (1 <= len(photos) <= 3):
|
||||||
raise HTTPException(status_code=422, detail="Send between 1 and 3 photos.")
|
raise HTTPException(status_code=422, detail="Send between 1 and 3 photos.")
|
||||||
|
|
@ -174,7 +206,11 @@ async def analyze_skin_photos(
|
||||||
)
|
)
|
||||||
parts.append(
|
parts.append(
|
||||||
genai_types.Part.from_text(
|
genai_types.Part.from_text(
|
||||||
text=build_user_profile_context(session, reference_date=date.today())
|
text=build_user_profile_context(
|
||||||
|
session,
|
||||||
|
reference_date=date.today(),
|
||||||
|
current_user=current_user,
|
||||||
|
)
|
||||||
)
|
)
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
@ -224,9 +260,14 @@ def list_snapshots(
|
||||||
from_date: Optional[date] = None,
|
from_date: Optional[date] = None,
|
||||||
to_date: Optional[date] = None,
|
to_date: Optional[date] = None,
|
||||||
overall_state: Optional[OverallSkinState] = None,
|
overall_state: Optional[OverallSkinState] = None,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
stmt = select(SkinConditionSnapshot)
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
stmt = select(SkinConditionSnapshot).where(
|
||||||
|
SkinConditionSnapshot.user_id == target_user_id
|
||||||
|
)
|
||||||
if from_date is not None:
|
if from_date is not None:
|
||||||
stmt = stmt.where(SkinConditionSnapshot.snapshot_date >= from_date)
|
stmt = stmt.where(SkinConditionSnapshot.snapshot_date >= from_date)
|
||||||
if to_date is not None:
|
if to_date is not None:
|
||||||
|
|
@ -237,8 +278,18 @@ def list_snapshots(
|
||||||
|
|
||||||
|
|
||||||
@router.post("", response_model=SkinConditionSnapshotPublic, status_code=201)
|
@router.post("", response_model=SkinConditionSnapshotPublic, status_code=201)
|
||||||
def create_snapshot(data: SnapshotCreate, session: Session = Depends(get_session)):
|
def create_snapshot(
|
||||||
snapshot = SkinConditionSnapshot(id=uuid4(), **data.model_dump())
|
data: SnapshotCreate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
target_user_id = _resolve_target_user_id(current_user, user_id)
|
||||||
|
snapshot = SkinConditionSnapshot(
|
||||||
|
id=uuid4(),
|
||||||
|
user_id=target_user_id,
|
||||||
|
**data.model_dump(),
|
||||||
|
)
|
||||||
session.add(snapshot)
|
session.add(snapshot)
|
||||||
session.commit()
|
session.commit()
|
||||||
session.refresh(snapshot)
|
session.refresh(snapshot)
|
||||||
|
|
@ -246,17 +297,34 @@ def create_snapshot(data: SnapshotCreate, session: Session = Depends(get_session
|
||||||
|
|
||||||
|
|
||||||
@router.get("/{snapshot_id}", response_model=SkinConditionSnapshotPublic)
|
@router.get("/{snapshot_id}", response_model=SkinConditionSnapshotPublic)
|
||||||
def get_snapshot(snapshot_id: UUID, session: Session = Depends(get_session)):
|
def get_snapshot(
|
||||||
return get_or_404(session, SkinConditionSnapshot, snapshot_id)
|
snapshot_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
return _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
snapshot_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
@router.patch("/{snapshot_id}", response_model=SkinConditionSnapshotPublic)
|
@router.patch("/{snapshot_id}", response_model=SkinConditionSnapshotPublic)
|
||||||
def update_snapshot(
|
def update_snapshot(
|
||||||
snapshot_id: UUID,
|
snapshot_id: UUID,
|
||||||
data: SnapshotUpdate,
|
data: SnapshotUpdate,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
session: Session = Depends(get_session),
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
):
|
):
|
||||||
snapshot = get_or_404(session, SkinConditionSnapshot, snapshot_id)
|
snapshot = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
snapshot_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
for key, value in data.model_dump(exclude_unset=True).items():
|
for key, value in data.model_dump(exclude_unset=True).items():
|
||||||
setattr(snapshot, key, value)
|
setattr(snapshot, key, value)
|
||||||
session.add(snapshot)
|
session.add(snapshot)
|
||||||
|
|
@ -266,7 +334,17 @@ def update_snapshot(
|
||||||
|
|
||||||
|
|
||||||
@router.delete("/{snapshot_id}", status_code=204)
|
@router.delete("/{snapshot_id}", status_code=204)
|
||||||
def delete_snapshot(snapshot_id: UUID, session: Session = Depends(get_session)):
|
def delete_snapshot(
|
||||||
snapshot = get_or_404(session, SkinConditionSnapshot, snapshot_id)
|
snapshot_id: UUID,
|
||||||
|
user_id: UUID | None = Query(default=None),
|
||||||
|
session: Session = Depends(get_session),
|
||||||
|
current_user: CurrentUser = Depends(get_current_user),
|
||||||
|
):
|
||||||
|
snapshot = _get_owned_or_admin_override(
|
||||||
|
session,
|
||||||
|
snapshot_id,
|
||||||
|
current_user,
|
||||||
|
user_id,
|
||||||
|
)
|
||||||
session.delete(snapshot)
|
session.delete(snapshot)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
|
||||||
|
|
@ -3,6 +3,18 @@ from typing import TypeVar
|
||||||
from fastapi import HTTPException
|
from fastapi import HTTPException
|
||||||
from sqlmodel import Session
|
from sqlmodel import Session
|
||||||
|
|
||||||
|
from innercontext.api.authz import (
|
||||||
|
get_owned_or_404 as authz_get_owned_or_404,
|
||||||
|
)
|
||||||
|
from innercontext.api.authz import (
|
||||||
|
get_owned_or_404_admin_override as authz_get_owned_or_404_admin_override,
|
||||||
|
)
|
||||||
|
from innercontext.api.authz import list_owned as authz_list_owned
|
||||||
|
from innercontext.api.authz import (
|
||||||
|
list_owned_admin_override as authz_list_owned_admin_override,
|
||||||
|
)
|
||||||
|
from innercontext.auth import CurrentUser
|
||||||
|
|
||||||
_T = TypeVar("_T")
|
_T = TypeVar("_T")
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -11,3 +23,37 @@ def get_or_404(session: Session, model: type[_T], record_id: object) -> _T:
|
||||||
if obj is None:
|
if obj is None:
|
||||||
raise HTTPException(status_code=404, detail=f"{model.__name__} not found")
|
raise HTTPException(status_code=404, detail=f"{model.__name__} not found")
|
||||||
return obj
|
return obj
|
||||||
|
|
||||||
|
|
||||||
|
def get_owned_or_404(
|
||||||
|
session: Session,
|
||||||
|
model: type[_T],
|
||||||
|
record_id: object,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> _T:
|
||||||
|
return authz_get_owned_or_404(session, model, record_id, current_user)
|
||||||
|
|
||||||
|
|
||||||
|
def get_owned_or_404_admin_override(
|
||||||
|
session: Session,
|
||||||
|
model: type[_T],
|
||||||
|
record_id: object,
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> _T:
|
||||||
|
return authz_get_owned_or_404_admin_override(
|
||||||
|
session, model, record_id, current_user
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def list_owned(
|
||||||
|
session: Session, model: type[_T], current_user: CurrentUser
|
||||||
|
) -> list[_T]:
|
||||||
|
return authz_list_owned(session, model, current_user)
|
||||||
|
|
||||||
|
|
||||||
|
def list_owned_admin_override(
|
||||||
|
session: Session,
|
||||||
|
model: type[_T],
|
||||||
|
current_user: CurrentUser,
|
||||||
|
) -> list[_T]:
|
||||||
|
return authz_list_owned_admin_override(session, model, current_user)
|
||||||
|
|
|
||||||
384
backend/innercontext/auth.py
Normal file
384
backend/innercontext/auth.py
Normal file
|
|
@ -0,0 +1,384 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import os
|
||||||
|
import time
|
||||||
|
from dataclasses import dataclass, field
|
||||||
|
from datetime import UTC, datetime
|
||||||
|
from functools import lru_cache
|
||||||
|
from threading import Lock
|
||||||
|
from typing import Any, Mapping
|
||||||
|
from uuid import UUID
|
||||||
|
|
||||||
|
import httpx
|
||||||
|
import jwt
|
||||||
|
from jwt import InvalidTokenError, PyJWKSet
|
||||||
|
from sqlmodel import Session, select
|
||||||
|
|
||||||
|
from innercontext.models import HouseholdMembership, HouseholdRole, Role, User
|
||||||
|
|
||||||
|
_DISCOVERY_PATH = "/.well-known/openid-configuration"
|
||||||
|
_SUPPORTED_ALGORITHMS = frozenset(
|
||||||
|
{"RS256", "RS384", "RS512", "ES256", "ES384", "ES512"}
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class AuthConfigurationError(RuntimeError):
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
class TokenValidationError(ValueError):
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, slots=True)
|
||||||
|
class AuthSettings:
|
||||||
|
issuer: str
|
||||||
|
client_id: str
|
||||||
|
audiences: tuple[str, ...]
|
||||||
|
discovery_url: str
|
||||||
|
jwks_url: str | None
|
||||||
|
groups_claim: str
|
||||||
|
admin_groups: tuple[str, ...]
|
||||||
|
member_groups: tuple[str, ...]
|
||||||
|
jwks_cache_ttl_seconds: int
|
||||||
|
http_timeout_seconds: float
|
||||||
|
clock_skew_seconds: int
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, slots=True)
|
||||||
|
class TokenClaims:
|
||||||
|
issuer: str
|
||||||
|
subject: str
|
||||||
|
audience: tuple[str, ...]
|
||||||
|
expires_at: datetime
|
||||||
|
groups: tuple[str, ...] = ()
|
||||||
|
email: str | None = None
|
||||||
|
name: str | None = None
|
||||||
|
preferred_username: str | None = None
|
||||||
|
raw_claims: Mapping[str, Any] = field(default_factory=dict, repr=False)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def from_payload(
|
||||||
|
cls, payload: Mapping[str, Any], settings: AuthSettings
|
||||||
|
) -> "TokenClaims":
|
||||||
|
audience = payload.get("aud")
|
||||||
|
if isinstance(audience, str):
|
||||||
|
audiences = (audience,)
|
||||||
|
elif isinstance(audience, list):
|
||||||
|
audiences = tuple(str(item) for item in audience)
|
||||||
|
else:
|
||||||
|
audiences = ()
|
||||||
|
|
||||||
|
groups = _normalize_groups(payload.get(settings.groups_claim))
|
||||||
|
exp = payload.get("exp")
|
||||||
|
if not isinstance(exp, (int, float)):
|
||||||
|
raise TokenValidationError("Access token missing exp claim")
|
||||||
|
|
||||||
|
return cls(
|
||||||
|
issuer=str(payload["iss"]),
|
||||||
|
subject=str(payload["sub"]),
|
||||||
|
audience=audiences,
|
||||||
|
expires_at=datetime.fromtimestamp(exp, tz=UTC),
|
||||||
|
groups=groups,
|
||||||
|
email=_optional_str(payload.get("email")),
|
||||||
|
name=_optional_str(payload.get("name")),
|
||||||
|
preferred_username=_optional_str(payload.get("preferred_username")),
|
||||||
|
raw_claims=dict(payload),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, slots=True)
|
||||||
|
class IdentityData:
|
||||||
|
issuer: str
|
||||||
|
subject: str
|
||||||
|
email: str | None = None
|
||||||
|
name: str | None = None
|
||||||
|
preferred_username: str | None = None
|
||||||
|
groups: tuple[str, ...] = ()
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def from_claims(cls, claims: TokenClaims) -> "IdentityData":
|
||||||
|
return cls(
|
||||||
|
issuer=claims.issuer,
|
||||||
|
subject=claims.subject,
|
||||||
|
email=claims.email,
|
||||||
|
name=claims.name,
|
||||||
|
preferred_username=claims.preferred_username,
|
||||||
|
groups=claims.groups,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, slots=True)
|
||||||
|
class CurrentHouseholdMembership:
|
||||||
|
household_id: UUID
|
||||||
|
role: HouseholdRole
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, slots=True)
|
||||||
|
class CurrentUser:
|
||||||
|
user_id: UUID
|
||||||
|
role: Role
|
||||||
|
identity: IdentityData
|
||||||
|
claims: TokenClaims = field(repr=False)
|
||||||
|
household_membership: CurrentHouseholdMembership | None = None
|
||||||
|
|
||||||
|
|
||||||
|
def _split_csv(value: str | None) -> tuple[str, ...]:
|
||||||
|
if value is None:
|
||||||
|
return ()
|
||||||
|
return tuple(item.strip() for item in value.split(",") if item.strip())
|
||||||
|
|
||||||
|
|
||||||
|
def _optional_str(value: Any) -> str | None:
|
||||||
|
if value is None:
|
||||||
|
return None
|
||||||
|
if isinstance(value, str):
|
||||||
|
return value
|
||||||
|
return str(value)
|
||||||
|
|
||||||
|
|
||||||
|
def _normalize_groups(value: Any) -> tuple[str, ...]:
|
||||||
|
if value is None:
|
||||||
|
return ()
|
||||||
|
if isinstance(value, str):
|
||||||
|
return (value,)
|
||||||
|
if isinstance(value, list):
|
||||||
|
return tuple(str(item) for item in value)
|
||||||
|
if isinstance(value, tuple):
|
||||||
|
return tuple(str(item) for item in value)
|
||||||
|
return (str(value),)
|
||||||
|
|
||||||
|
|
||||||
|
def _required_env(name: str) -> str:
|
||||||
|
value = os.environ.get(name)
|
||||||
|
if value:
|
||||||
|
return value
|
||||||
|
raise AuthConfigurationError(f"Missing required auth environment variable: {name}")
|
||||||
|
|
||||||
|
|
||||||
|
@lru_cache
|
||||||
|
def get_auth_settings() -> AuthSettings:
|
||||||
|
issuer = _required_env("OIDC_ISSUER")
|
||||||
|
client_id = _required_env("OIDC_CLIENT_ID")
|
||||||
|
audiences = _split_csv(os.environ.get("OIDC_AUDIENCE")) or (client_id,)
|
||||||
|
discovery_url = os.environ.get("OIDC_DISCOVERY_URL") or (
|
||||||
|
issuer.rstrip("/") + _DISCOVERY_PATH
|
||||||
|
)
|
||||||
|
|
||||||
|
return AuthSettings(
|
||||||
|
issuer=issuer,
|
||||||
|
client_id=client_id,
|
||||||
|
audiences=audiences,
|
||||||
|
discovery_url=discovery_url,
|
||||||
|
jwks_url=os.environ.get("OIDC_JWKS_URL"),
|
||||||
|
groups_claim=os.environ.get("OIDC_GROUPS_CLAIM", "groups"),
|
||||||
|
admin_groups=_split_csv(os.environ.get("OIDC_ADMIN_GROUPS")),
|
||||||
|
member_groups=_split_csv(os.environ.get("OIDC_MEMBER_GROUPS")),
|
||||||
|
jwks_cache_ttl_seconds=int(
|
||||||
|
os.environ.get("OIDC_JWKS_CACHE_TTL_SECONDS", "300")
|
||||||
|
),
|
||||||
|
http_timeout_seconds=float(os.environ.get("OIDC_HTTP_TIMEOUT_SECONDS", "5")),
|
||||||
|
clock_skew_seconds=int(os.environ.get("OIDC_CLOCK_SKEW_SECONDS", "30")),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class CachedJwksClient:
|
||||||
|
def __init__(self, settings: AuthSettings):
|
||||||
|
self._settings = settings
|
||||||
|
self._lock = Lock()
|
||||||
|
self._jwks: PyJWKSet | None = None
|
||||||
|
self._jwks_fetched_at = 0.0
|
||||||
|
self._discovery_jwks_url: str | None = None
|
||||||
|
self._discovery_fetched_at = 0.0
|
||||||
|
|
||||||
|
def get_signing_key(self, kid: str) -> Any:
|
||||||
|
with self._lock:
|
||||||
|
jwks = self._get_jwks_locked()
|
||||||
|
key = self._find_key(jwks, kid)
|
||||||
|
if key is not None:
|
||||||
|
return key
|
||||||
|
|
||||||
|
self._refresh_jwks_locked(
|
||||||
|
force_discovery_refresh=self._settings.jwks_url is None
|
||||||
|
)
|
||||||
|
if self._jwks is None:
|
||||||
|
raise TokenValidationError("JWKS cache is empty")
|
||||||
|
|
||||||
|
key = self._find_key(self._jwks, kid)
|
||||||
|
if key is None:
|
||||||
|
raise TokenValidationError(f"No signing key found for kid '{kid}'")
|
||||||
|
return key
|
||||||
|
|
||||||
|
def _get_jwks_locked(self) -> PyJWKSet:
|
||||||
|
if self._jwks is None or self._is_stale(self._jwks_fetched_at):
|
||||||
|
self._refresh_jwks_locked(force_discovery_refresh=False)
|
||||||
|
if self._jwks is None:
|
||||||
|
raise TokenValidationError("Unable to load JWKS")
|
||||||
|
return self._jwks
|
||||||
|
|
||||||
|
def _refresh_jwks_locked(self, force_discovery_refresh: bool) -> None:
|
||||||
|
jwks_url = self._resolve_jwks_url_locked(force_refresh=force_discovery_refresh)
|
||||||
|
data = self._fetch_json(jwks_url)
|
||||||
|
try:
|
||||||
|
self._jwks = PyJWKSet.from_dict(data)
|
||||||
|
except Exception as exc:
|
||||||
|
raise TokenValidationError(
|
||||||
|
"OIDC provider returned an invalid JWKS payload"
|
||||||
|
) from exc
|
||||||
|
self._jwks_fetched_at = time.monotonic()
|
||||||
|
|
||||||
|
def _resolve_jwks_url_locked(self, force_refresh: bool) -> str:
|
||||||
|
if self._settings.jwks_url:
|
||||||
|
return self._settings.jwks_url
|
||||||
|
|
||||||
|
if (
|
||||||
|
force_refresh
|
||||||
|
or self._discovery_jwks_url is None
|
||||||
|
or self._is_stale(self._discovery_fetched_at)
|
||||||
|
):
|
||||||
|
discovery = self._fetch_json(self._settings.discovery_url)
|
||||||
|
jwks_uri = discovery.get("jwks_uri")
|
||||||
|
if not isinstance(jwks_uri, str) or not jwks_uri:
|
||||||
|
raise TokenValidationError("OIDC discovery document missing jwks_uri")
|
||||||
|
self._discovery_jwks_url = jwks_uri
|
||||||
|
self._discovery_fetched_at = time.monotonic()
|
||||||
|
|
||||||
|
if self._discovery_jwks_url is None:
|
||||||
|
raise TokenValidationError("Unable to resolve JWKS URL")
|
||||||
|
return self._discovery_jwks_url
|
||||||
|
|
||||||
|
def _fetch_json(self, url: str) -> dict[str, Any]:
|
||||||
|
try:
|
||||||
|
response = httpx.get(url, timeout=self._settings.http_timeout_seconds)
|
||||||
|
response.raise_for_status()
|
||||||
|
except httpx.HTTPError as exc:
|
||||||
|
raise TokenValidationError(
|
||||||
|
f"Failed to fetch OIDC metadata from {url}"
|
||||||
|
) from exc
|
||||||
|
|
||||||
|
data = response.json()
|
||||||
|
if not isinstance(data, dict):
|
||||||
|
raise TokenValidationError(
|
||||||
|
f"OIDC metadata from {url} must be a JSON object"
|
||||||
|
)
|
||||||
|
return data
|
||||||
|
|
||||||
|
def _is_stale(self, fetched_at: float) -> bool:
|
||||||
|
return (time.monotonic() - fetched_at) >= self._settings.jwks_cache_ttl_seconds
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _find_key(jwks: PyJWKSet, kid: str) -> Any | None:
|
||||||
|
for jwk in jwks.keys:
|
||||||
|
if jwk.key_id == kid:
|
||||||
|
return jwk.key
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
@lru_cache
|
||||||
|
def get_jwks_client() -> CachedJwksClient:
|
||||||
|
return CachedJwksClient(get_auth_settings())
|
||||||
|
|
||||||
|
|
||||||
|
def reset_auth_caches() -> None:
|
||||||
|
get_auth_settings.cache_clear()
|
||||||
|
get_jwks_client.cache_clear()
|
||||||
|
|
||||||
|
|
||||||
|
def validate_access_token(token: str) -> TokenClaims:
|
||||||
|
settings = get_auth_settings()
|
||||||
|
|
||||||
|
try:
|
||||||
|
unverified_header = jwt.get_unverified_header(token)
|
||||||
|
except InvalidTokenError as exc:
|
||||||
|
raise TokenValidationError("Malformed access token header") from exc
|
||||||
|
|
||||||
|
kid = unverified_header.get("kid")
|
||||||
|
algorithm = unverified_header.get("alg")
|
||||||
|
if not isinstance(kid, str) or not kid:
|
||||||
|
raise TokenValidationError("Access token missing kid header")
|
||||||
|
if not isinstance(algorithm, str) or algorithm not in _SUPPORTED_ALGORITHMS:
|
||||||
|
raise TokenValidationError("Access token uses an unsupported signing algorithm")
|
||||||
|
|
||||||
|
signing_key = get_jwks_client().get_signing_key(kid)
|
||||||
|
|
||||||
|
try:
|
||||||
|
payload = jwt.decode(
|
||||||
|
token,
|
||||||
|
key=signing_key,
|
||||||
|
algorithms=[algorithm],
|
||||||
|
audience=settings.audiences,
|
||||||
|
issuer=settings.issuer,
|
||||||
|
options={"require": ["exp", "iss", "sub"]},
|
||||||
|
leeway=settings.clock_skew_seconds,
|
||||||
|
)
|
||||||
|
except InvalidTokenError as exc:
|
||||||
|
raise TokenValidationError("Invalid access token") from exc
|
||||||
|
|
||||||
|
return TokenClaims.from_payload(payload, settings)
|
||||||
|
|
||||||
|
|
||||||
|
def sync_current_user(
|
||||||
|
session: Session,
|
||||||
|
claims: TokenClaims,
|
||||||
|
identity: IdentityData | None = None,
|
||||||
|
) -> CurrentUser:
|
||||||
|
effective_identity = identity or IdentityData.from_claims(claims)
|
||||||
|
statement = select(User).where(
|
||||||
|
User.oidc_issuer == effective_identity.issuer,
|
||||||
|
User.oidc_subject == effective_identity.subject,
|
||||||
|
)
|
||||||
|
user = session.exec(statement).first()
|
||||||
|
existing_role = user.role if user is not None else None
|
||||||
|
resolved_role = resolve_role(effective_identity.groups, existing_role=existing_role)
|
||||||
|
needs_commit = False
|
||||||
|
|
||||||
|
if user is None:
|
||||||
|
user = User(
|
||||||
|
oidc_issuer=effective_identity.issuer,
|
||||||
|
oidc_subject=effective_identity.subject,
|
||||||
|
role=resolved_role,
|
||||||
|
)
|
||||||
|
session.add(user)
|
||||||
|
needs_commit = True
|
||||||
|
elif user.role != resolved_role:
|
||||||
|
user.role = resolved_role
|
||||||
|
session.add(user)
|
||||||
|
needs_commit = True
|
||||||
|
|
||||||
|
if needs_commit:
|
||||||
|
session.commit()
|
||||||
|
session.refresh(user)
|
||||||
|
|
||||||
|
membership = session.exec(
|
||||||
|
select(HouseholdMembership).where(HouseholdMembership.user_id == user.id)
|
||||||
|
).first()
|
||||||
|
|
||||||
|
household_membership = None
|
||||||
|
if membership is not None:
|
||||||
|
household_membership = CurrentHouseholdMembership(
|
||||||
|
household_id=membership.household_id,
|
||||||
|
role=membership.role,
|
||||||
|
)
|
||||||
|
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=user.id,
|
||||||
|
role=user.role,
|
||||||
|
identity=effective_identity,
|
||||||
|
claims=claims,
|
||||||
|
household_membership=household_membership,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def resolve_role(groups: tuple[str, ...], existing_role: Role | None = None) -> Role:
|
||||||
|
settings = get_auth_settings()
|
||||||
|
if groups:
|
||||||
|
group_set = set(groups)
|
||||||
|
if settings.admin_groups and group_set.intersection(settings.admin_groups):
|
||||||
|
return Role.ADMIN
|
||||||
|
if settings.member_groups:
|
||||||
|
if group_set.intersection(settings.member_groups):
|
||||||
|
return Role.MEMBER
|
||||||
|
return Role.MEMBER
|
||||||
|
return Role.MEMBER
|
||||||
|
|
||||||
|
return existing_role or Role.MEMBER
|
||||||
|
|
@ -6,13 +6,16 @@ from .enums import (
|
||||||
DayTime,
|
DayTime,
|
||||||
EvidenceLevel,
|
EvidenceLevel,
|
||||||
GroomingAction,
|
GroomingAction,
|
||||||
|
HouseholdRole,
|
||||||
IngredientFunction,
|
IngredientFunction,
|
||||||
MedicationKind,
|
MedicationKind,
|
||||||
OverallSkinState,
|
OverallSkinState,
|
||||||
PartOfDay,
|
PartOfDay,
|
||||||
PriceTier,
|
PriceTier,
|
||||||
ProductCategory,
|
ProductCategory,
|
||||||
|
RemainingLevel,
|
||||||
ResultFlag,
|
ResultFlag,
|
||||||
|
Role,
|
||||||
RoutineRole,
|
RoutineRole,
|
||||||
SexAtBirth,
|
SexAtBirth,
|
||||||
SkinConcern,
|
SkinConcern,
|
||||||
|
|
@ -23,6 +26,8 @@ from .enums import (
|
||||||
UsageFrequency,
|
UsageFrequency,
|
||||||
)
|
)
|
||||||
from .health import LabResult, MedicationEntry, MedicationUsage
|
from .health import LabResult, MedicationEntry, MedicationUsage
|
||||||
|
from .household import Household
|
||||||
|
from .household_membership import HouseholdMembership
|
||||||
from .pricing import PricingRecalcJob
|
from .pricing import PricingRecalcJob
|
||||||
from .product import (
|
from .product import (
|
||||||
ActiveIngredient,
|
ActiveIngredient,
|
||||||
|
|
@ -41,6 +46,7 @@ from .skincare import (
|
||||||
SkinConditionSnapshotBase,
|
SkinConditionSnapshotBase,
|
||||||
SkinConditionSnapshotPublic,
|
SkinConditionSnapshotPublic,
|
||||||
)
|
)
|
||||||
|
from .user import User
|
||||||
|
|
||||||
__all__ = [
|
__all__ = [
|
||||||
# ai logs
|
# ai logs
|
||||||
|
|
@ -53,13 +59,16 @@ __all__ = [
|
||||||
"DayTime",
|
"DayTime",
|
||||||
"EvidenceLevel",
|
"EvidenceLevel",
|
||||||
"GroomingAction",
|
"GroomingAction",
|
||||||
|
"HouseholdRole",
|
||||||
"IngredientFunction",
|
"IngredientFunction",
|
||||||
"MedicationKind",
|
"MedicationKind",
|
||||||
"OverallSkinState",
|
"OverallSkinState",
|
||||||
"PartOfDay",
|
"PartOfDay",
|
||||||
"PriceTier",
|
"PriceTier",
|
||||||
|
"RemainingLevel",
|
||||||
"ProductCategory",
|
"ProductCategory",
|
||||||
"ResultFlag",
|
"ResultFlag",
|
||||||
|
"Role",
|
||||||
"RoutineRole",
|
"RoutineRole",
|
||||||
"SexAtBirth",
|
"SexAtBirth",
|
||||||
"SkinConcern",
|
"SkinConcern",
|
||||||
|
|
@ -72,6 +81,8 @@ __all__ = [
|
||||||
"LabResult",
|
"LabResult",
|
||||||
"MedicationEntry",
|
"MedicationEntry",
|
||||||
"MedicationUsage",
|
"MedicationUsage",
|
||||||
|
"Household",
|
||||||
|
"HouseholdMembership",
|
||||||
# product
|
# product
|
||||||
"ActiveIngredient",
|
"ActiveIngredient",
|
||||||
"Product",
|
"Product",
|
||||||
|
|
@ -83,6 +94,7 @@ __all__ = [
|
||||||
"ProductWithInventory",
|
"ProductWithInventory",
|
||||||
"PricingRecalcJob",
|
"PricingRecalcJob",
|
||||||
"UserProfile",
|
"UserProfile",
|
||||||
|
"User",
|
||||||
# routine
|
# routine
|
||||||
"GroomingSchedule",
|
"GroomingSchedule",
|
||||||
"Routine",
|
"Routine",
|
||||||
|
|
|
||||||
|
|
@ -10,10 +10,11 @@ from .domain import Domain
|
||||||
|
|
||||||
|
|
||||||
class AICallLog(SQLModel, table=True):
|
class AICallLog(SQLModel, table=True):
|
||||||
__tablename__ = "ai_call_logs"
|
__tablename__ = "ai_call_logs" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset()
|
__domains__: ClassVar[frozenset[Domain]] = frozenset()
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
||||||
endpoint: str = Field(index=True)
|
endpoint: str = Field(index=True)
|
||||||
model: str
|
model: str
|
||||||
|
|
|
||||||
|
|
@ -29,6 +29,16 @@ class UsageFrequency(str, Enum):
|
||||||
AS_NEEDED = "as_needed"
|
AS_NEEDED = "as_needed"
|
||||||
|
|
||||||
|
|
||||||
|
class Role(str, Enum):
|
||||||
|
ADMIN = "admin"
|
||||||
|
MEMBER = "member"
|
||||||
|
|
||||||
|
|
||||||
|
class HouseholdRole(str, Enum):
|
||||||
|
OWNER = "owner"
|
||||||
|
MEMBER = "member"
|
||||||
|
|
||||||
|
|
||||||
class ProductCategory(str, Enum):
|
class ProductCategory(str, Enum):
|
||||||
CLEANSER = "cleanser"
|
CLEANSER = "cleanser"
|
||||||
TONER = "toner"
|
TONER = "toner"
|
||||||
|
|
@ -124,6 +134,13 @@ class PriceTier(str, Enum):
|
||||||
LUXURY = "luxury"
|
LUXURY = "luxury"
|
||||||
|
|
||||||
|
|
||||||
|
class RemainingLevel(str, Enum):
|
||||||
|
HIGH = "high"
|
||||||
|
MEDIUM = "medium"
|
||||||
|
LOW = "low"
|
||||||
|
NEARLY_EMPTY = "nearly_empty"
|
||||||
|
|
||||||
|
|
||||||
class EvidenceLevel(str, Enum):
|
class EvidenceLevel(str, Enum):
|
||||||
LOW = "low"
|
LOW = "low"
|
||||||
MIXED = "mixed"
|
MIXED = "mixed"
|
||||||
|
|
|
||||||
|
|
@ -11,10 +11,11 @@ from .enums import MedicationKind, ResultFlag
|
||||||
|
|
||||||
|
|
||||||
class MedicationEntry(SQLModel, table=True):
|
class MedicationEntry(SQLModel, table=True):
|
||||||
__tablename__ = "medication_entries"
|
__tablename__ = "medication_entries" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.HEALTH})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.HEALTH})
|
||||||
|
|
||||||
record_id: UUID = Field(default_factory=uuid4, primary_key=True)
|
record_id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
|
|
||||||
kind: MedicationKind = Field(index=True)
|
kind: MedicationKind = Field(index=True)
|
||||||
|
|
||||||
|
|
@ -43,10 +44,11 @@ class MedicationEntry(SQLModel, table=True):
|
||||||
|
|
||||||
|
|
||||||
class MedicationUsage(SQLModel, table=True):
|
class MedicationUsage(SQLModel, table=True):
|
||||||
__tablename__ = "medication_usages"
|
__tablename__ = "medication_usages" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.HEALTH})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.HEALTH})
|
||||||
|
|
||||||
record_id: UUID = Field(default_factory=uuid4, primary_key=True)
|
record_id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
medication_record_id: UUID = Field(
|
medication_record_id: UUID = Field(
|
||||||
foreign_key="medication_entries.record_id", index=True
|
foreign_key="medication_entries.record_id", index=True
|
||||||
)
|
)
|
||||||
|
|
@ -78,10 +80,11 @@ class MedicationUsage(SQLModel, table=True):
|
||||||
|
|
||||||
|
|
||||||
class LabResult(SQLModel, table=True):
|
class LabResult(SQLModel, table=True):
|
||||||
__tablename__ = "lab_results"
|
__tablename__ = "lab_results" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.HEALTH})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.HEALTH})
|
||||||
|
|
||||||
record_id: UUID = Field(default_factory=uuid4, primary_key=True)
|
record_id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
|
|
||||||
collected_at: datetime = Field(index=True)
|
collected_at: datetime = Field(index=True)
|
||||||
test_code: str = Field(index=True, regex=r"^\d+-\d$")
|
test_code: str = Field(index=True, regex=r"^\d+-\d$")
|
||||||
|
|
|
||||||
36
backend/innercontext/models/household.py
Normal file
36
backend/innercontext/models/household.py
Normal file
|
|
@ -0,0 +1,36 @@
|
||||||
|
# pyright: reportImportCycles=false
|
||||||
|
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import TYPE_CHECKING, ClassVar
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
from sqlalchemy import Column, DateTime
|
||||||
|
from sqlmodel import Field, Relationship, SQLModel
|
||||||
|
|
||||||
|
from .base import utc_now
|
||||||
|
from .domain import Domain
|
||||||
|
|
||||||
|
if TYPE_CHECKING:
|
||||||
|
from .household_membership import HouseholdMembership # pyright: ignore[reportImportCycles]
|
||||||
|
|
||||||
|
|
||||||
|
class Household(SQLModel, table=True):
|
||||||
|
__tablename__ = "households" # pyright: ignore[reportAssignmentType]
|
||||||
|
__domains__: ClassVar[frozenset[Domain]] = frozenset()
|
||||||
|
|
||||||
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
||||||
|
updated_at: datetime = Field(
|
||||||
|
default_factory=utc_now,
|
||||||
|
sa_column=Column(
|
||||||
|
DateTime(timezone=True),
|
||||||
|
default=utc_now,
|
||||||
|
onupdate=utc_now,
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
memberships: list["HouseholdMembership"] = Relationship(
|
||||||
|
back_populates="household",
|
||||||
|
sa_relationship_kwargs={"cascade": "all, delete-orphan"},
|
||||||
|
)
|
||||||
45
backend/innercontext/models/household_membership.py
Normal file
45
backend/innercontext/models/household_membership.py
Normal file
|
|
@ -0,0 +1,45 @@
|
||||||
|
# pyright: reportImportCycles=false
|
||||||
|
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import TYPE_CHECKING, ClassVar
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
from sqlalchemy import Column, DateTime, UniqueConstraint
|
||||||
|
from sqlmodel import Field, Relationship, SQLModel
|
||||||
|
|
||||||
|
from .base import utc_now
|
||||||
|
from .domain import Domain
|
||||||
|
from .enums import HouseholdRole
|
||||||
|
|
||||||
|
if TYPE_CHECKING:
|
||||||
|
from .household import Household # pyright: ignore[reportImportCycles]
|
||||||
|
from .user import User # pyright: ignore[reportImportCycles]
|
||||||
|
|
||||||
|
|
||||||
|
class HouseholdMembership(SQLModel, table=True):
|
||||||
|
__tablename__ = "household_memberships" # pyright: ignore[reportAssignmentType]
|
||||||
|
__domains__: ClassVar[frozenset[Domain]] = frozenset()
|
||||||
|
__table_args__ = (
|
||||||
|
UniqueConstraint("user_id", name="uq_household_memberships_user_id"),
|
||||||
|
)
|
||||||
|
|
||||||
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID = Field(foreign_key="users.id", index=True, ondelete="CASCADE")
|
||||||
|
household_id: UUID = Field(
|
||||||
|
foreign_key="households.id", index=True, ondelete="CASCADE"
|
||||||
|
)
|
||||||
|
role: HouseholdRole = Field(default=HouseholdRole.MEMBER, index=True)
|
||||||
|
|
||||||
|
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
||||||
|
updated_at: datetime = Field(
|
||||||
|
default_factory=utc_now,
|
||||||
|
sa_column=Column(
|
||||||
|
DateTime(timezone=True),
|
||||||
|
default=utc_now,
|
||||||
|
onupdate=utc_now,
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
user: "User" = Relationship(back_populates="household_membership")
|
||||||
|
household: "Household" = Relationship(back_populates="memberships")
|
||||||
|
|
@ -1,4 +1,5 @@
|
||||||
from datetime import date, datetime
|
from datetime import date, datetime
|
||||||
|
from enum import Enum
|
||||||
from typing import Any, ClassVar, Optional, cast
|
from typing import Any, ClassVar, Optional, cast
|
||||||
from uuid import UUID, uuid4
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
|
@ -14,6 +15,7 @@ from .enums import (
|
||||||
IngredientFunction,
|
IngredientFunction,
|
||||||
PriceTier,
|
PriceTier,
|
||||||
ProductCategory,
|
ProductCategory,
|
||||||
|
RemainingLevel,
|
||||||
SkinConcern,
|
SkinConcern,
|
||||||
SkinType,
|
SkinType,
|
||||||
StrengthLevel,
|
StrengthLevel,
|
||||||
|
|
@ -71,7 +73,9 @@ class ProductContext(SQLModel):
|
||||||
|
|
||||||
def _ev(v: object) -> str:
|
def _ev(v: object) -> str:
|
||||||
"""Return enum value or string as-is (handles both DB-loaded dicts and Python enums)."""
|
"""Return enum value or string as-is (handles both DB-loaded dicts and Python enums)."""
|
||||||
return v.value if hasattr(v, "value") else str(v) # type: ignore[union-attr]
|
if isinstance(v, Enum):
|
||||||
|
return str(v.value)
|
||||||
|
return str(v)
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
@ -97,8 +101,6 @@ class ProductBase(SQLModel):
|
||||||
price_amount: float | None = Field(default=None, gt=0)
|
price_amount: float | None = Field(default=None, gt=0)
|
||||||
price_currency: str | None = Field(default=None, min_length=3, max_length=3)
|
price_currency: str | None = Field(default=None, min_length=3, max_length=3)
|
||||||
size_ml: float | None = Field(default=None, gt=0)
|
size_ml: float | None = Field(default=None, gt=0)
|
||||||
full_weight_g: float | None = Field(default=None, gt=0)
|
|
||||||
empty_weight_g: float | None = Field(default=None, gt=0)
|
|
||||||
pao_months: int | None = Field(default=None, ge=1, le=60)
|
pao_months: int | None = Field(default=None, ge=1, le=60)
|
||||||
|
|
||||||
inci: list[str] = Field(default_factory=list)
|
inci: list[str] = Field(default_factory=list)
|
||||||
|
|
@ -129,7 +131,6 @@ class ProductBase(SQLModel):
|
||||||
needle_length_mm: float | None = Field(default=None, gt=0)
|
needle_length_mm: float | None = Field(default=None, gt=0)
|
||||||
|
|
||||||
personal_tolerance_notes: str | None = None
|
personal_tolerance_notes: str | None = None
|
||||||
personal_repurchase_intent: bool | None = None
|
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
@ -138,10 +139,11 @@ class ProductBase(SQLModel):
|
||||||
|
|
||||||
|
|
||||||
class Product(ProductBase, table=True):
|
class Product(ProductBase, table=True):
|
||||||
__tablename__ = "products"
|
__tablename__ = "products" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
short_id: str = Field(
|
short_id: str = Field(
|
||||||
max_length=8,
|
max_length=8,
|
||||||
unique=True,
|
unique=True,
|
||||||
|
|
@ -232,8 +234,8 @@ class Product(ProductBase, table=True):
|
||||||
*,
|
*,
|
||||||
computed_price_tier: PriceTier | None = None,
|
computed_price_tier: PriceTier | None = None,
|
||||||
price_per_use_pln: float | None = None,
|
price_per_use_pln: float | None = None,
|
||||||
) -> dict:
|
) -> dict[str, Any]:
|
||||||
ctx: dict = {
|
ctx: dict[str, Any] = {
|
||||||
"id": str(self.id),
|
"id": str(self.id),
|
||||||
"name": self.name,
|
"name": self.name,
|
||||||
"brand": self.brand,
|
"brand": self.brand,
|
||||||
|
|
@ -275,7 +277,7 @@ class Product(ProductBase, table=True):
|
||||||
if isinstance(a, dict):
|
if isinstance(a, dict):
|
||||||
actives_ctx.append(a)
|
actives_ctx.append(a)
|
||||||
else:
|
else:
|
||||||
a_dict: dict = {"name": a.name}
|
a_dict: dict[str, Any] = {"name": a.name}
|
||||||
if a.percent is not None:
|
if a.percent is not None:
|
||||||
a_dict["percent"] = a.percent
|
a_dict["percent"] = a.percent
|
||||||
if a.functions:
|
if a.functions:
|
||||||
|
|
@ -338,16 +340,16 @@ class Product(ProductBase, table=True):
|
||||||
ctx["needle_length_mm"] = self.needle_length_mm
|
ctx["needle_length_mm"] = self.needle_length_mm
|
||||||
if self.personal_tolerance_notes:
|
if self.personal_tolerance_notes:
|
||||||
ctx["personal_tolerance_notes"] = self.personal_tolerance_notes
|
ctx["personal_tolerance_notes"] = self.personal_tolerance_notes
|
||||||
if self.personal_repurchase_intent is not None:
|
|
||||||
ctx["personal_repurchase_intent"] = self.personal_repurchase_intent
|
|
||||||
|
|
||||||
try:
|
try:
|
||||||
opened_items = [
|
opened_items = [
|
||||||
inv for inv in (self.inventory or []) if inv.is_opened and inv.opened_at
|
inv for inv in (self.inventory or []) if inv.is_opened and inv.opened_at
|
||||||
]
|
]
|
||||||
if opened_items:
|
if opened_items:
|
||||||
most_recent = max(opened_items, key=lambda x: x.opened_at)
|
most_recent = max(opened_items, key=lambda x: cast(date, x.opened_at))
|
||||||
ctx["days_since_opened"] = (date.today() - most_recent.opened_at).days
|
ctx["days_since_opened"] = (
|
||||||
|
date.today() - cast(date, most_recent.opened_at)
|
||||||
|
).days
|
||||||
except Exception:
|
except Exception:
|
||||||
pass
|
pass
|
||||||
|
|
||||||
|
|
@ -355,18 +357,19 @@ class Product(ProductBase, table=True):
|
||||||
|
|
||||||
|
|
||||||
class ProductInventory(SQLModel, table=True):
|
class ProductInventory(SQLModel, table=True):
|
||||||
__tablename__ = "product_inventory"
|
__tablename__ = "product_inventory" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
product_id: UUID = Field(foreign_key="products.id", index=True, ondelete="CASCADE")
|
product_id: UUID = Field(foreign_key="products.id", index=True, ondelete="CASCADE")
|
||||||
|
|
||||||
|
is_household_shared: bool = Field(default=False)
|
||||||
is_opened: bool = Field(default=False)
|
is_opened: bool = Field(default=False)
|
||||||
opened_at: date | None = Field(default=None)
|
opened_at: date | None = Field(default=None)
|
||||||
finished_at: date | None = Field(default=None)
|
finished_at: date | None = Field(default=None)
|
||||||
expiry_date: date | None = Field(default=None)
|
expiry_date: date | None = Field(default=None)
|
||||||
current_weight_g: float | None = Field(default=None, gt=0)
|
remaining_level: RemainingLevel | None = None
|
||||||
last_weighed_at: date | None = Field(default=None)
|
|
||||||
notes: str | None = None
|
notes: str | None = None
|
||||||
|
|
||||||
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
||||||
|
|
|
||||||
|
|
@ -11,12 +11,13 @@ from .enums import SexAtBirth
|
||||||
|
|
||||||
|
|
||||||
class UserProfile(SQLModel, table=True):
|
class UserProfile(SQLModel, table=True):
|
||||||
__tablename__ = "user_profiles"
|
__tablename__ = "user_profiles" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset(
|
__domains__: ClassVar[frozenset[Domain]] = frozenset(
|
||||||
{Domain.HEALTH, Domain.SKINCARE}
|
{Domain.HEALTH, Domain.SKINCARE}
|
||||||
)
|
)
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
birth_date: date | None = Field(default=None)
|
birth_date: date | None = Field(default=None)
|
||||||
sex_at_birth: SexAtBirth | None = Field(
|
sex_at_birth: SexAtBirth | None = Field(
|
||||||
default=None,
|
default=None,
|
||||||
|
|
|
||||||
|
|
@ -14,7 +14,7 @@ if TYPE_CHECKING:
|
||||||
|
|
||||||
|
|
||||||
class Routine(SQLModel, table=True):
|
class Routine(SQLModel, table=True):
|
||||||
__tablename__ = "routines"
|
__tablename__ = "routines" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
||||||
__table_args__ = (
|
__table_args__ = (
|
||||||
UniqueConstraint(
|
UniqueConstraint(
|
||||||
|
|
@ -23,6 +23,7 @@ class Routine(SQLModel, table=True):
|
||||||
)
|
)
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
routine_date: date = Field(index=True)
|
routine_date: date = Field(index=True)
|
||||||
part_of_day: PartOfDay = Field(index=True)
|
part_of_day: PartOfDay = Field(index=True)
|
||||||
notes: str | None = Field(default=None)
|
notes: str | None = Field(default=None)
|
||||||
|
|
@ -45,20 +46,22 @@ class Routine(SQLModel, table=True):
|
||||||
|
|
||||||
|
|
||||||
class GroomingSchedule(SQLModel, table=True):
|
class GroomingSchedule(SQLModel, table=True):
|
||||||
__tablename__ = "grooming_schedule"
|
__tablename__ = "grooming_schedule" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
day_of_week: int = Field(ge=0, le=6, index=True) # 0 = poniedziałek, 6 = niedziela
|
day_of_week: int = Field(ge=0, le=6, index=True) # 0 = poniedziałek, 6 = niedziela
|
||||||
action: GroomingAction
|
action: GroomingAction
|
||||||
notes: str | None = Field(default=None)
|
notes: str | None = Field(default=None)
|
||||||
|
|
||||||
|
|
||||||
class RoutineStep(SQLModel, table=True):
|
class RoutineStep(SQLModel, table=True):
|
||||||
__tablename__ = "routine_steps"
|
__tablename__ = "routine_steps" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
routine_id: UUID = Field(foreign_key="routines.id", index=True)
|
routine_id: UUID = Field(foreign_key="routines.id", index=True)
|
||||||
product_id: UUID | None = Field(default=None, foreign_key="products.id", index=True)
|
product_id: UUID | None = Field(default=None, foreign_key="products.id", index=True)
|
||||||
order_index: int = Field(ge=0)
|
order_index: int = Field(ge=0)
|
||||||
|
|
|
||||||
|
|
@ -51,11 +51,12 @@ class SkinConditionSnapshot(SkinConditionSnapshotBase, table=True):
|
||||||
i kontekstu rutyny. Wszystkie metryki numeryczne w skali 1–5.
|
i kontekstu rutyny. Wszystkie metryki numeryczne w skali 1–5.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
__tablename__ = "skin_condition_snapshots"
|
__tablename__ = "skin_condition_snapshots" # pyright: ignore[reportAssignmentType]
|
||||||
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
__domains__: ClassVar[frozenset[Domain]] = frozenset({Domain.SKINCARE})
|
||||||
__table_args__ = (UniqueConstraint("snapshot_date", name="uq_skin_snapshot_date"),)
|
__table_args__ = (UniqueConstraint("snapshot_date", name="uq_skin_snapshot_date"),)
|
||||||
|
|
||||||
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)
|
||||||
|
|
||||||
# Override: add index for table context
|
# Override: add index for table context
|
||||||
snapshot_date: date = Field(index=True)
|
snapshot_date: date = Field(index=True)
|
||||||
|
|
|
||||||
41
backend/innercontext/models/user.py
Normal file
41
backend/innercontext/models/user.py
Normal file
|
|
@ -0,0 +1,41 @@
|
||||||
|
# pyright: reportImportCycles=false
|
||||||
|
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import TYPE_CHECKING, ClassVar
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
from sqlalchemy import Column, DateTime, String, UniqueConstraint
|
||||||
|
from sqlmodel import Field, Relationship, SQLModel
|
||||||
|
|
||||||
|
from .base import utc_now
|
||||||
|
from .domain import Domain
|
||||||
|
from .enums import Role
|
||||||
|
|
||||||
|
if TYPE_CHECKING:
|
||||||
|
from .household_membership import HouseholdMembership # pyright: ignore[reportImportCycles]
|
||||||
|
|
||||||
|
|
||||||
|
class User(SQLModel, table=True):
|
||||||
|
__tablename__ = "users" # pyright: ignore[reportAssignmentType]
|
||||||
|
__domains__: ClassVar[frozenset[Domain]] = frozenset()
|
||||||
|
__table_args__ = (
|
||||||
|
UniqueConstraint("oidc_issuer", "oidc_subject", name="uq_users_oidc_identity"),
|
||||||
|
)
|
||||||
|
|
||||||
|
id: UUID = Field(default_factory=uuid4, primary_key=True)
|
||||||
|
oidc_issuer: str = Field(sa_column=Column(String(length=512), nullable=False))
|
||||||
|
oidc_subject: str = Field(sa_column=Column(String(length=512), nullable=False))
|
||||||
|
role: Role = Field(default=Role.MEMBER, index=True)
|
||||||
|
|
||||||
|
created_at: datetime = Field(default_factory=utc_now, nullable=False)
|
||||||
|
updated_at: datetime = Field(
|
||||||
|
default_factory=utc_now,
|
||||||
|
sa_column=Column(
|
||||||
|
DateTime(timezone=True),
|
||||||
|
default=utc_now,
|
||||||
|
onupdate=utc_now,
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
household_membership: "HouseholdMembership" = Relationship(back_populates="user")
|
||||||
|
|
@ -1,4 +1,5 @@
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
|
from uuid import UUID
|
||||||
|
|
||||||
from sqlmodel import Session, col, select
|
from sqlmodel import Session, col, select
|
||||||
|
|
||||||
|
|
@ -66,9 +67,53 @@ def _apply_pricing_snapshot(session: Session, computed_at: datetime) -> int:
|
||||||
return len(products)
|
return len(products)
|
||||||
|
|
||||||
|
|
||||||
|
def _scope_user_id(scope: str) -> UUID | None:
|
||||||
|
prefix = "user:"
|
||||||
|
if not scope.startswith(prefix):
|
||||||
|
return None
|
||||||
|
raw_user_id = scope[len(prefix) :].strip()
|
||||||
|
if not raw_user_id:
|
||||||
|
return None
|
||||||
|
try:
|
||||||
|
return UUID(raw_user_id)
|
||||||
|
except ValueError:
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def _apply_pricing_snapshot_for_scope(
|
||||||
|
session: Session,
|
||||||
|
*,
|
||||||
|
computed_at: datetime,
|
||||||
|
scope: str,
|
||||||
|
) -> int:
|
||||||
|
from innercontext.api.products import _compute_pricing_outputs
|
||||||
|
|
||||||
|
scoped_user_id = _scope_user_id(scope)
|
||||||
|
stmt = select(Product)
|
||||||
|
if scoped_user_id is not None:
|
||||||
|
stmt = stmt.where(Product.user_id == scoped_user_id)
|
||||||
|
products = list(session.exec(stmt).all())
|
||||||
|
pricing_outputs = _compute_pricing_outputs(products)
|
||||||
|
|
||||||
|
for product in products:
|
||||||
|
tier, price_per_use_pln, tier_source = pricing_outputs.get(
|
||||||
|
product.id, (None, None, None)
|
||||||
|
)
|
||||||
|
product.price_tier = tier
|
||||||
|
product.price_per_use_pln = price_per_use_pln
|
||||||
|
product.price_tier_source = tier_source
|
||||||
|
product.pricing_computed_at = computed_at
|
||||||
|
|
||||||
|
return len(products)
|
||||||
|
|
||||||
|
|
||||||
def process_pricing_job(session: Session, job: PricingRecalcJob) -> int:
|
def process_pricing_job(session: Session, job: PricingRecalcJob) -> int:
|
||||||
try:
|
try:
|
||||||
updated_count = _apply_pricing_snapshot(session, computed_at=utc_now())
|
updated_count = _apply_pricing_snapshot_for_scope(
|
||||||
|
session,
|
||||||
|
computed_at=utc_now(),
|
||||||
|
scope=job.scope,
|
||||||
|
)
|
||||||
job.status = "succeeded"
|
job.status = "succeeded"
|
||||||
job.finished_at = utc_now()
|
job.finished_at = utc_now()
|
||||||
job.error = None
|
job.error = None
|
||||||
|
|
|
||||||
|
|
@ -22,48 +22,9 @@ class ShoppingValidationContext:
|
||||||
|
|
||||||
|
|
||||||
class ShoppingValidator(BaseValidator):
|
class ShoppingValidator(BaseValidator):
|
||||||
"""Validates shopping suggestions for product types."""
|
"""Validates shopping suggestion schema and copy quality."""
|
||||||
|
|
||||||
# Realistic product type patterns (not exhaustive, just sanity checks)
|
VALID_PRIORITIES = {"high", "medium", "low"}
|
||||||
VALID_PRODUCT_TYPE_PATTERNS = {
|
|
||||||
"serum",
|
|
||||||
"cream",
|
|
||||||
"cleanser",
|
|
||||||
"toner",
|
|
||||||
"essence",
|
|
||||||
"moisturizer",
|
|
||||||
"spf",
|
|
||||||
"sunscreen",
|
|
||||||
"oil",
|
|
||||||
"balm",
|
|
||||||
"mask",
|
|
||||||
"exfoliant",
|
|
||||||
"acid",
|
|
||||||
"retinoid",
|
|
||||||
"vitamin",
|
|
||||||
"niacinamide",
|
|
||||||
"hyaluronic",
|
|
||||||
"ceramide",
|
|
||||||
"peptide",
|
|
||||||
"antioxidant",
|
|
||||||
"aha",
|
|
||||||
"bha",
|
|
||||||
"pha",
|
|
||||||
}
|
|
||||||
|
|
||||||
VALID_FREQUENCIES = {
|
|
||||||
"daily",
|
|
||||||
"twice daily",
|
|
||||||
"am",
|
|
||||||
"pm",
|
|
||||||
"both",
|
|
||||||
"2x weekly",
|
|
||||||
"3x weekly",
|
|
||||||
"2-3x weekly",
|
|
||||||
"weekly",
|
|
||||||
"as needed",
|
|
||||||
"occasional",
|
|
||||||
}
|
|
||||||
|
|
||||||
def validate(
|
def validate(
|
||||||
self, response: Any, context: ShoppingValidationContext
|
self, response: Any, context: ShoppingValidationContext
|
||||||
|
|
@ -73,19 +34,17 @@ class ShoppingValidator(BaseValidator):
|
||||||
|
|
||||||
Checks:
|
Checks:
|
||||||
1. suggestions field present
|
1. suggestions field present
|
||||||
2. Product types are realistic (contain known keywords)
|
2. Categories are valid
|
||||||
3. Not suggesting products user already owns (should mark as [✗])
|
3. Targets are valid
|
||||||
4. Recommended frequencies are valid
|
4. Each suggestion has required fields
|
||||||
5. Categories are valid
|
5. Decision-support fields are well formed
|
||||||
6. Targets are valid
|
|
||||||
7. Each suggestion has required fields
|
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
response: Parsed shopping suggestion response
|
response: Parsed shopping suggestion response
|
||||||
context: Validation context
|
context: Validation context
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
ValidationResult with any errors/warnings
|
ValidationResult with schema errors and lightweight quality warnings
|
||||||
"""
|
"""
|
||||||
result = ValidationResult()
|
result = ValidationResult()
|
||||||
|
|
||||||
|
|
@ -112,15 +71,8 @@ class ShoppingValidator(BaseValidator):
|
||||||
f"Suggestion {sug_num}: invalid category '{suggestion.category}'"
|
f"Suggestion {sug_num}: invalid category '{suggestion.category}'"
|
||||||
)
|
)
|
||||||
|
|
||||||
# Check product type is realistic
|
if hasattr(suggestion, "priority") and suggestion.priority:
|
||||||
if hasattr(suggestion, "product_type") and suggestion.product_type:
|
self._check_priority_valid(suggestion.priority, sug_num, result)
|
||||||
self._check_product_type_realistic(
|
|
||||||
suggestion.product_type, sug_num, result
|
|
||||||
)
|
|
||||||
|
|
||||||
# Check frequency is valid
|
|
||||||
if hasattr(suggestion, "frequency") and suggestion.frequency:
|
|
||||||
self._check_frequency_valid(suggestion.frequency, sug_num, result)
|
|
||||||
|
|
||||||
# Check targets are valid
|
# Check targets are valid
|
||||||
if hasattr(suggestion, "target_concerns") and suggestion.target_concerns:
|
if hasattr(suggestion, "target_concerns") and suggestion.target_concerns:
|
||||||
|
|
@ -128,6 +80,11 @@ class ShoppingValidator(BaseValidator):
|
||||||
suggestion.target_concerns, sug_num, context, result
|
suggestion.target_concerns, sug_num, context, result
|
||||||
)
|
)
|
||||||
|
|
||||||
|
if hasattr(suggestion, "usage_cautions"):
|
||||||
|
self._check_usage_cautions(suggestion.usage_cautions, sug_num, result)
|
||||||
|
|
||||||
|
self._check_text_quality(suggestion, sug_num, result)
|
||||||
|
|
||||||
# Check recommended_time is valid
|
# Check recommended_time is valid
|
||||||
if hasattr(suggestion, "recommended_time") and suggestion.recommended_time:
|
if hasattr(suggestion, "recommended_time") and suggestion.recommended_time:
|
||||||
if suggestion.recommended_time not in ("am", "pm", "both"):
|
if suggestion.recommended_time not in ("am", "pm", "both"):
|
||||||
|
|
@ -142,7 +99,15 @@ class ShoppingValidator(BaseValidator):
|
||||||
self, suggestion: Any, sug_num: int, result: ValidationResult
|
self, suggestion: Any, sug_num: int, result: ValidationResult
|
||||||
) -> None:
|
) -> None:
|
||||||
"""Check suggestion has required fields."""
|
"""Check suggestion has required fields."""
|
||||||
required = ["category", "product_type", "why_needed"]
|
required = [
|
||||||
|
"category",
|
||||||
|
"product_type",
|
||||||
|
"priority",
|
||||||
|
"short_reason",
|
||||||
|
"reason_to_buy_now",
|
||||||
|
"fit_with_current_routine",
|
||||||
|
"usage_cautions",
|
||||||
|
]
|
||||||
|
|
||||||
for field in required:
|
for field in required:
|
||||||
if not hasattr(suggestion, field) or getattr(suggestion, field) is None:
|
if not hasattr(suggestion, field) or getattr(suggestion, field) is None:
|
||||||
|
|
@ -150,64 +115,14 @@ class ShoppingValidator(BaseValidator):
|
||||||
f"Suggestion {sug_num}: missing required field '{field}'"
|
f"Suggestion {sug_num}: missing required field '{field}'"
|
||||||
)
|
)
|
||||||
|
|
||||||
def _check_product_type_realistic(
|
def _check_priority_valid(
|
||||||
self, product_type: str, sug_num: int, result: ValidationResult
|
self, priority: str, sug_num: int, result: ValidationResult
|
||||||
) -> None:
|
) -> None:
|
||||||
"""Check product type contains realistic keywords."""
|
"""Check priority uses supported enum values."""
|
||||||
product_type_lower = product_type.lower()
|
if priority not in self.VALID_PRIORITIES:
|
||||||
|
|
||||||
# Check if any valid pattern appears in the product type
|
|
||||||
has_valid_keyword = any(
|
|
||||||
pattern in product_type_lower
|
|
||||||
for pattern in self.VALID_PRODUCT_TYPE_PATTERNS
|
|
||||||
)
|
|
||||||
|
|
||||||
if not has_valid_keyword:
|
|
||||||
result.add_warning(
|
|
||||||
f"Suggestion {sug_num}: product type '{product_type}' looks unusual - "
|
|
||||||
"verify it's a real skincare product category"
|
|
||||||
)
|
|
||||||
|
|
||||||
# Check for brand names (shouldn't suggest specific brands)
|
|
||||||
suspicious_brands = [
|
|
||||||
"la roche",
|
|
||||||
"cerave",
|
|
||||||
"paula",
|
|
||||||
"ordinary",
|
|
||||||
"skinceuticals",
|
|
||||||
"drunk elephant",
|
|
||||||
"versed",
|
|
||||||
"inkey",
|
|
||||||
"cosrx",
|
|
||||||
"pixi",
|
|
||||||
]
|
|
||||||
|
|
||||||
if any(brand in product_type_lower for brand in suspicious_brands):
|
|
||||||
result.add_error(
|
result.add_error(
|
||||||
f"Suggestion {sug_num}: product type contains brand name - "
|
f"Suggestion {sug_num}: invalid priority '{priority}' "
|
||||||
"should suggest product TYPES only, not specific brands"
|
"(must be 'high', 'medium', or 'low')"
|
||||||
)
|
|
||||||
|
|
||||||
def _check_frequency_valid(
|
|
||||||
self, frequency: str, sug_num: int, result: ValidationResult
|
|
||||||
) -> None:
|
|
||||||
"""Check frequency is a recognized pattern."""
|
|
||||||
frequency_lower = frequency.lower()
|
|
||||||
|
|
||||||
# Check for exact matches or common patterns
|
|
||||||
is_valid = (
|
|
||||||
frequency_lower in self.VALID_FREQUENCIES
|
|
||||||
or "daily" in frequency_lower
|
|
||||||
or "weekly" in frequency_lower
|
|
||||||
or "am" in frequency_lower
|
|
||||||
or "pm" in frequency_lower
|
|
||||||
or "x" in frequency_lower # e.g. "2x weekly"
|
|
||||||
)
|
|
||||||
|
|
||||||
if not is_valid:
|
|
||||||
result.add_warning(
|
|
||||||
f"Suggestion {sug_num}: unusual frequency '{frequency}' - "
|
|
||||||
"verify it's a realistic usage pattern"
|
|
||||||
)
|
)
|
||||||
|
|
||||||
def _check_targets_valid(
|
def _check_targets_valid(
|
||||||
|
|
@ -227,3 +142,64 @@ class ShoppingValidator(BaseValidator):
|
||||||
result.add_error(
|
result.add_error(
|
||||||
f"Suggestion {sug_num}: invalid target concern '{target}'"
|
f"Suggestion {sug_num}: invalid target concern '{target}'"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
def _check_usage_cautions(
|
||||||
|
self, usage_cautions: Any, sug_num: int, result: ValidationResult
|
||||||
|
) -> None:
|
||||||
|
"""Check usage cautions are a list of short strings."""
|
||||||
|
if not isinstance(usage_cautions, list):
|
||||||
|
result.add_error(f"Suggestion {sug_num}: usage_cautions must be a list")
|
||||||
|
return
|
||||||
|
|
||||||
|
for caution in usage_cautions:
|
||||||
|
if not isinstance(caution, str):
|
||||||
|
result.add_error(
|
||||||
|
f"Suggestion {sug_num}: usage_cautions entries must be strings"
|
||||||
|
)
|
||||||
|
continue
|
||||||
|
if len(caution.strip()) > 180:
|
||||||
|
result.add_warning(
|
||||||
|
f"Suggestion {sug_num}: usage caution is too long - keep it concise"
|
||||||
|
)
|
||||||
|
|
||||||
|
def _check_text_quality(
|
||||||
|
self, suggestion: Any, sug_num: int, result: ValidationResult
|
||||||
|
) -> None:
|
||||||
|
"""Warn when decision-support copy is too generic or empty-ish."""
|
||||||
|
generic_phrases = {
|
||||||
|
"wspiera skore",
|
||||||
|
"pomaga skorze",
|
||||||
|
"moze pomoc",
|
||||||
|
"dobry wybor",
|
||||||
|
"uzupelnia rutyne",
|
||||||
|
"supports the skin",
|
||||||
|
"may help",
|
||||||
|
"good option",
|
||||||
|
"complements the routine",
|
||||||
|
}
|
||||||
|
|
||||||
|
text_fields = [
|
||||||
|
("short_reason", getattr(suggestion, "short_reason", None), 12),
|
||||||
|
("reason_to_buy_now", getattr(suggestion, "reason_to_buy_now", None), 18),
|
||||||
|
(
|
||||||
|
"fit_with_current_routine",
|
||||||
|
getattr(suggestion, "fit_with_current_routine", None),
|
||||||
|
18,
|
||||||
|
),
|
||||||
|
]
|
||||||
|
|
||||||
|
for field_name, value, min_length in text_fields:
|
||||||
|
if not isinstance(value, str):
|
||||||
|
continue
|
||||||
|
stripped = value.strip()
|
||||||
|
if len(stripped) < min_length:
|
||||||
|
result.add_warning(
|
||||||
|
f"Suggestion {sug_num}: {field_name} is very short - add more decision context"
|
||||||
|
)
|
||||||
|
continue
|
||||||
|
|
||||||
|
lowered = stripped.lower()
|
||||||
|
if lowered in generic_phrases:
|
||||||
|
result.add_warning(
|
||||||
|
f"Suggestion {sug_num}: {field_name} is too generic - make it more specific"
|
||||||
|
)
|
||||||
|
|
|
||||||
|
|
@ -1,17 +1,19 @@
|
||||||
|
from collections.abc import AsyncIterator
|
||||||
from contextlib import asynccontextmanager
|
from contextlib import asynccontextmanager
|
||||||
from typing import AsyncIterator
|
|
||||||
|
|
||||||
from dotenv import load_dotenv
|
from dotenv import load_dotenv
|
||||||
|
|
||||||
load_dotenv() # load .env before db.py reads DATABASE_URL
|
_ = load_dotenv() # load .env before db.py reads DATABASE_URL
|
||||||
|
|
||||||
from fastapi import FastAPI # noqa: E402
|
from fastapi import Depends, FastAPI # noqa: E402
|
||||||
from fastapi.middleware.cors import CORSMiddleware # noqa: E402
|
from fastapi.middleware.cors import CORSMiddleware # noqa: E402
|
||||||
from sqlmodel import Session # noqa: E402
|
from sqlmodel import Session # noqa: E402
|
||||||
|
|
||||||
from db import create_db_and_tables, engine # noqa: E402
|
from db import create_db_and_tables, engine # noqa: E402
|
||||||
from innercontext.api import ( # noqa: E402
|
from innercontext.api import ( # noqa: E402
|
||||||
|
admin,
|
||||||
ai_logs,
|
ai_logs,
|
||||||
|
auth,
|
||||||
health,
|
health,
|
||||||
inventory,
|
inventory,
|
||||||
products,
|
products,
|
||||||
|
|
@ -19,15 +21,16 @@ from innercontext.api import ( # noqa: E402
|
||||||
routines,
|
routines,
|
||||||
skincare,
|
skincare,
|
||||||
)
|
)
|
||||||
|
from innercontext.api.auth_deps import get_current_user # noqa: E402
|
||||||
from innercontext.services.pricing_jobs import enqueue_pricing_recalc # noqa: E402
|
from innercontext.services.pricing_jobs import enqueue_pricing_recalc # noqa: E402
|
||||||
|
|
||||||
|
|
||||||
@asynccontextmanager
|
@asynccontextmanager
|
||||||
async def lifespan(app: FastAPI) -> AsyncIterator[None]:
|
async def lifespan(_app: FastAPI) -> AsyncIterator[None]:
|
||||||
create_db_and_tables()
|
create_db_and_tables()
|
||||||
try:
|
try:
|
||||||
with Session(engine) as session:
|
with Session(engine) as session:
|
||||||
enqueue_pricing_recalc(session)
|
_ = enqueue_pricing_recalc(session)
|
||||||
session.commit()
|
session.commit()
|
||||||
except Exception as exc: # pragma: no cover
|
except Exception as exc: # pragma: no cover
|
||||||
print(f"[startup] failed to enqueue pricing recalculation job: {exc}")
|
print(f"[startup] failed to enqueue pricing recalculation job: {exc}")
|
||||||
|
|
@ -47,13 +50,52 @@ app.add_middleware(
|
||||||
allow_headers=["*"],
|
allow_headers=["*"],
|
||||||
)
|
)
|
||||||
|
|
||||||
app.include_router(products.router, prefix="/products", tags=["products"])
|
protected = [Depends(get_current_user)]
|
||||||
app.include_router(inventory.router, prefix="/inventory", tags=["inventory"])
|
|
||||||
app.include_router(profile.router, prefix="/profile", tags=["profile"])
|
app.include_router(auth.router, prefix="/auth", tags=["auth"])
|
||||||
app.include_router(health.router, prefix="/health", tags=["health"])
|
app.include_router(admin.router, prefix="/admin", tags=["admin"])
|
||||||
app.include_router(routines.router, prefix="/routines", tags=["routines"])
|
app.include_router(
|
||||||
app.include_router(skincare.router, prefix="/skincare", tags=["skincare"])
|
products.router,
|
||||||
app.include_router(ai_logs.router, prefix="/ai-logs", tags=["ai-logs"])
|
prefix="/products",
|
||||||
|
tags=["products"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
app.include_router(
|
||||||
|
inventory.router,
|
||||||
|
prefix="/inventory",
|
||||||
|
tags=["inventory"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
app.include_router(
|
||||||
|
profile.router,
|
||||||
|
prefix="/profile",
|
||||||
|
tags=["profile"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
app.include_router(
|
||||||
|
health.router,
|
||||||
|
prefix="/health",
|
||||||
|
tags=["health"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
app.include_router(
|
||||||
|
routines.router,
|
||||||
|
prefix="/routines",
|
||||||
|
tags=["routines"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
app.include_router(
|
||||||
|
skincare.router,
|
||||||
|
prefix="/skincare",
|
||||||
|
tags=["skincare"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
app.include_router(
|
||||||
|
ai_logs.router,
|
||||||
|
prefix="/ai-logs",
|
||||||
|
tags=["ai-logs"],
|
||||||
|
dependencies=protected,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
@app.get("/health-check")
|
@app.get("/health-check")
|
||||||
|
|
|
||||||
|
|
@ -8,6 +8,7 @@ dependencies = [
|
||||||
"alembic>=1.14",
|
"alembic>=1.14",
|
||||||
"fastapi>=0.132.0",
|
"fastapi>=0.132.0",
|
||||||
"google-genai>=1.65.0",
|
"google-genai>=1.65.0",
|
||||||
|
"pyjwt[crypto]>=2.10.1",
|
||||||
"psycopg[binary]>=3.3.3",
|
"psycopg[binary]>=3.3.3",
|
||||||
"python-dotenv>=1.2.1",
|
"python-dotenv>=1.2.1",
|
||||||
"python-multipart>=0.0.22",
|
"python-multipart>=0.0.22",
|
||||||
|
|
|
||||||
|
|
@ -1,4 +1,6 @@
|
||||||
import os
|
import os
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from uuid import uuid4
|
||||||
|
|
||||||
# Must be set before importing db (which calls create_engine at module level)
|
# Must be set before importing db (which calls create_engine at module level)
|
||||||
os.environ.setdefault("DATABASE_URL", "sqlite://")
|
os.environ.setdefault("DATABASE_URL", "sqlite://")
|
||||||
|
|
@ -10,6 +12,9 @@ from sqlmodel.pool import StaticPool
|
||||||
|
|
||||||
import db as db_module
|
import db as db_module
|
||||||
from db import get_session
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import CurrentUser, IdentityData, TokenClaims
|
||||||
|
from innercontext.models import Role
|
||||||
from main import app
|
from main import app
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -32,13 +37,35 @@ def session(monkeypatch):
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture()
|
@pytest.fixture()
|
||||||
def client(session, monkeypatch):
|
def current_user() -> CurrentUser:
|
||||||
|
claims = TokenClaims(
|
||||||
|
issuer="https://auth.test",
|
||||||
|
subject="test-user",
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
groups=("innercontext-admin",),
|
||||||
|
raw_claims={"iss": "https://auth.test", "sub": "test-user"},
|
||||||
|
)
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=uuid4(),
|
||||||
|
role=Role.ADMIN,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def client(session, monkeypatch, current_user):
|
||||||
"""TestClient using the per-test session for every request."""
|
"""TestClient using the per-test session for every request."""
|
||||||
|
|
||||||
def _override():
|
def _override():
|
||||||
yield session
|
yield session
|
||||||
|
|
||||||
|
def _current_user_override():
|
||||||
|
return current_user
|
||||||
|
|
||||||
app.dependency_overrides[get_session] = _override
|
app.dependency_overrides[get_session] = _override
|
||||||
|
app.dependency_overrides[get_current_user] = _current_user_override
|
||||||
with TestClient(app) as c:
|
with TestClient(app) as c:
|
||||||
yield c
|
yield c
|
||||||
app.dependency_overrides.clear()
|
app.dependency_overrides.clear()
|
||||||
|
|
|
||||||
354
backend/tests/test_admin_households.py
Normal file
354
backend/tests/test_admin_households.py
Normal file
|
|
@ -0,0 +1,354 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from collections.abc import Generator
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from typing import cast
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from fastapi.testclient import TestClient
|
||||||
|
from sqlmodel import Session
|
||||||
|
|
||||||
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import (
|
||||||
|
CurrentHouseholdMembership,
|
||||||
|
CurrentUser,
|
||||||
|
IdentityData,
|
||||||
|
TokenClaims,
|
||||||
|
)
|
||||||
|
from innercontext.models import (
|
||||||
|
Household,
|
||||||
|
HouseholdMembership,
|
||||||
|
HouseholdRole,
|
||||||
|
Role,
|
||||||
|
User,
|
||||||
|
)
|
||||||
|
from main import app
|
||||||
|
|
||||||
|
|
||||||
|
def _current_user(
|
||||||
|
user_id: UUID,
|
||||||
|
*,
|
||||||
|
role: Role = Role.ADMIN,
|
||||||
|
household_id: UUID | None = None,
|
||||||
|
) -> CurrentUser:
|
||||||
|
claims = TokenClaims(
|
||||||
|
issuer="https://auth.test",
|
||||||
|
subject=str(user_id),
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
groups=(
|
||||||
|
("innercontext-admin",) if role is Role.ADMIN else ("innercontext-member",)
|
||||||
|
),
|
||||||
|
raw_claims={"iss": "https://auth.test", "sub": str(user_id)},
|
||||||
|
)
|
||||||
|
membership = None
|
||||||
|
if household_id is not None:
|
||||||
|
membership = CurrentHouseholdMembership(
|
||||||
|
household_id=household_id,
|
||||||
|
role=HouseholdRole.MEMBER,
|
||||||
|
)
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=user_id,
|
||||||
|
role=role,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
household_membership=membership,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _create_user(
|
||||||
|
session: Session,
|
||||||
|
*,
|
||||||
|
role: Role = Role.MEMBER,
|
||||||
|
subject: str | None = None,
|
||||||
|
) -> User:
|
||||||
|
user = User(
|
||||||
|
oidc_issuer="https://auth.test",
|
||||||
|
oidc_subject=subject or str(uuid4()),
|
||||||
|
role=role,
|
||||||
|
)
|
||||||
|
session.add(user)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(user)
|
||||||
|
return user
|
||||||
|
|
||||||
|
|
||||||
|
def _create_household(session: Session) -> Household:
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
return household
|
||||||
|
|
||||||
|
|
||||||
|
def _create_membership(
|
||||||
|
session: Session,
|
||||||
|
*,
|
||||||
|
user_id: UUID,
|
||||||
|
household_id: UUID,
|
||||||
|
role: HouseholdRole = HouseholdRole.MEMBER,
|
||||||
|
) -> HouseholdMembership:
|
||||||
|
membership = HouseholdMembership(
|
||||||
|
user_id=user_id,
|
||||||
|
household_id=household_id,
|
||||||
|
role=role,
|
||||||
|
)
|
||||||
|
session.add(membership)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(membership)
|
||||||
|
return membership
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def auth_client(
|
||||||
|
session: Session,
|
||||||
|
) -> Generator[tuple[TestClient, dict[str, CurrentUser]], None, None]:
|
||||||
|
auth_state = {"current_user": _current_user(uuid4(), role=Role.ADMIN)}
|
||||||
|
|
||||||
|
def _session_override():
|
||||||
|
yield session
|
||||||
|
|
||||||
|
def _current_user_override():
|
||||||
|
return auth_state["current_user"]
|
||||||
|
|
||||||
|
app.dependency_overrides[get_session] = _session_override
|
||||||
|
app.dependency_overrides[get_current_user] = _current_user_override
|
||||||
|
with TestClient(app) as client:
|
||||||
|
yield client, auth_state
|
||||||
|
app.dependency_overrides.clear()
|
||||||
|
|
||||||
|
|
||||||
|
def test_list_users_returns_local_users_with_memberships(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
unassigned_user = _create_user(session, subject="member-a")
|
||||||
|
assigned_user = _create_user(session, subject="member-b")
|
||||||
|
household = _create_household(session)
|
||||||
|
membership = _create_membership(
|
||||||
|
session,
|
||||||
|
user_id=assigned_user.id,
|
||||||
|
household_id=household.id,
|
||||||
|
role=HouseholdRole.OWNER,
|
||||||
|
)
|
||||||
|
|
||||||
|
response = client.get("/admin/users")
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
response_users = cast(list[dict[str, object]], response.json())
|
||||||
|
users = {item["id"]: item for item in response_users}
|
||||||
|
assert users[str(unassigned_user.id)]["household_membership"] is None
|
||||||
|
assert users[str(assigned_user.id)]["household_membership"] == {
|
||||||
|
"id": str(membership.id),
|
||||||
|
"user_id": str(assigned_user.id),
|
||||||
|
"household_id": str(household.id),
|
||||||
|
"role": "owner",
|
||||||
|
"created_at": membership.created_at.isoformat(),
|
||||||
|
"updated_at": membership.updated_at.isoformat(),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def test_create_household_returns_new_household(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
response = client.post("/admin/households")
|
||||||
|
|
||||||
|
assert response.status_code == 201
|
||||||
|
payload = cast(dict[str, object], response.json())
|
||||||
|
household_id = UUID(cast(str, payload["id"]))
|
||||||
|
created = session.get(Household, household_id)
|
||||||
|
assert created is not None
|
||||||
|
|
||||||
|
|
||||||
|
def test_assign_member_creates_membership(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
household = _create_household(session)
|
||||||
|
|
||||||
|
response = client.post(
|
||||||
|
f"/admin/households/{household.id}/members",
|
||||||
|
json={"user_id": str(user.id), "role": "owner"},
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 201
|
||||||
|
payload = cast(dict[str, object], response.json())
|
||||||
|
assert payload["user_id"] == str(user.id)
|
||||||
|
assert payload["household_id"] == str(household.id)
|
||||||
|
assert payload["role"] == "owner"
|
||||||
|
|
||||||
|
membership = session.get(HouseholdMembership, UUID(cast(str, payload["id"])))
|
||||||
|
assert membership is not None
|
||||||
|
assert membership.user_id == user.id
|
||||||
|
assert membership.household_id == household.id
|
||||||
|
assert membership.role is HouseholdRole.OWNER
|
||||||
|
|
||||||
|
|
||||||
|
def test_assign_member_rejects_already_assigned_user(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
current_household = _create_household(session)
|
||||||
|
target_household = _create_household(session)
|
||||||
|
_ = _create_membership(session, user_id=user.id, household_id=current_household.id)
|
||||||
|
|
||||||
|
response = client.post(
|
||||||
|
f"/admin/households/{target_household.id}/members",
|
||||||
|
json={"user_id": str(user.id)},
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 409
|
||||||
|
assert response.json()["detail"] == "User already belongs to a household"
|
||||||
|
|
||||||
|
|
||||||
|
def test_assign_member_rejects_unsynced_user(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
household = _create_household(session)
|
||||||
|
user_id = uuid4()
|
||||||
|
|
||||||
|
response = client.post(
|
||||||
|
f"/admin/households/{household.id}/members",
|
||||||
|
json={"user_id": str(user_id)},
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
assert response.json()["detail"] == "User not found"
|
||||||
|
|
||||||
|
|
||||||
|
def test_move_member_moves_user_between_households(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
source_household = _create_household(session)
|
||||||
|
target_household = _create_household(session)
|
||||||
|
membership = _create_membership(
|
||||||
|
session,
|
||||||
|
user_id=user.id,
|
||||||
|
household_id=source_household.id,
|
||||||
|
role=HouseholdRole.OWNER,
|
||||||
|
)
|
||||||
|
|
||||||
|
response = client.patch(
|
||||||
|
f"/admin/households/{target_household.id}/members/{user.id}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
payload = cast(dict[str, object], response.json())
|
||||||
|
assert payload["id"] == str(membership.id)
|
||||||
|
assert payload["household_id"] == str(target_household.id)
|
||||||
|
assert payload["role"] == "owner"
|
||||||
|
|
||||||
|
session.refresh(membership)
|
||||||
|
assert membership.household_id == target_household.id
|
||||||
|
|
||||||
|
|
||||||
|
def test_move_member_rejects_user_without_membership(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
target_household = _create_household(session)
|
||||||
|
|
||||||
|
response = client.patch(
|
||||||
|
f"/admin/households/{target_household.id}/members/{user.id}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
assert response.json()["detail"] == "HouseholdMembership not found"
|
||||||
|
|
||||||
|
|
||||||
|
def test_move_member_rejects_same_household_target(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
household = _create_household(session)
|
||||||
|
_ = _create_membership(session, user_id=user.id, household_id=household.id)
|
||||||
|
|
||||||
|
response = client.patch(f"/admin/households/{household.id}/members/{user.id}")
|
||||||
|
|
||||||
|
assert response.status_code == 409
|
||||||
|
assert response.json()["detail"] == "User already belongs to this household"
|
||||||
|
|
||||||
|
|
||||||
|
def test_remove_membership_deletes_membership(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
household = _create_household(session)
|
||||||
|
membership = _create_membership(session, user_id=user.id, household_id=household.id)
|
||||||
|
|
||||||
|
response = client.delete(f"/admin/households/{household.id}/members/{user.id}")
|
||||||
|
|
||||||
|
assert response.status_code == 204
|
||||||
|
assert session.get(HouseholdMembership, membership.id) is None
|
||||||
|
|
||||||
|
|
||||||
|
def test_remove_membership_requires_matching_household(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
client, _ = auth_client
|
||||||
|
|
||||||
|
user = _create_user(session)
|
||||||
|
household = _create_household(session)
|
||||||
|
other_household = _create_household(session)
|
||||||
|
_ = _create_membership(session, user_id=user.id, household_id=household.id)
|
||||||
|
|
||||||
|
response = client.delete(
|
||||||
|
f"/admin/households/{other_household.id}/members/{user.id}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
assert response.json()["detail"] == "HouseholdMembership not found"
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.parametrize(
|
||||||
|
("method", "path", "json_body"),
|
||||||
|
[
|
||||||
|
("get", "/admin/users", None),
|
||||||
|
("post", "/admin/households", None),
|
||||||
|
("post", f"/admin/households/{uuid4()}/members", {"user_id": str(uuid4())}),
|
||||||
|
("patch", f"/admin/households/{uuid4()}/members/{uuid4()}", None),
|
||||||
|
("delete", f"/admin/households/{uuid4()}/members/{uuid4()}", None),
|
||||||
|
],
|
||||||
|
)
|
||||||
|
def test_admin_household_routes_forbidden_for_member(
|
||||||
|
auth_client: tuple[TestClient, dict[str, CurrentUser]],
|
||||||
|
method: str,
|
||||||
|
path: str,
|
||||||
|
json_body: dict[str, str] | None,
|
||||||
|
):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
auth_state["current_user"] = _current_user(uuid4(), role=Role.MEMBER)
|
||||||
|
|
||||||
|
response = client.request(method, path, json=json_body)
|
||||||
|
|
||||||
|
assert response.status_code == 403
|
||||||
|
assert response.json()["detail"] == "Admin role required"
|
||||||
|
|
@ -4,12 +4,13 @@ from typing import Any, cast
|
||||||
from innercontext.models.ai_log import AICallLog
|
from innercontext.models.ai_log import AICallLog
|
||||||
|
|
||||||
|
|
||||||
def test_list_ai_logs_normalizes_tool_trace_string(client, session):
|
def test_list_ai_logs_normalizes_tool_trace_string(client, session, current_user):
|
||||||
log = AICallLog(
|
log = AICallLog(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
endpoint="routines/suggest",
|
endpoint="routines/suggest",
|
||||||
model="gemini-3-flash-preview",
|
model="gemini-3-flash-preview",
|
||||||
success=True,
|
success=True,
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
log.tool_trace = cast(
|
log.tool_trace = cast(
|
||||||
Any,
|
Any,
|
||||||
|
|
@ -26,12 +27,13 @@ def test_list_ai_logs_normalizes_tool_trace_string(client, session):
|
||||||
assert data[0]["tool_trace"]["events"][0]["function"] == "get_product_inci"
|
assert data[0]["tool_trace"]["events"][0]["function"] == "get_product_inci"
|
||||||
|
|
||||||
|
|
||||||
def test_get_ai_log_normalizes_tool_trace_string(client, session):
|
def test_get_ai_log_normalizes_tool_trace_string(client, session, current_user):
|
||||||
log = AICallLog(
|
log = AICallLog(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
endpoint="routines/suggest",
|
endpoint="routines/suggest",
|
||||||
model="gemini-3-flash-preview",
|
model="gemini-3-flash-preview",
|
||||||
success=True,
|
success=True,
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
log.tool_trace = cast(Any, '{"mode":"function_tools","round":1}')
|
log.tool_trace = cast(Any, '{"mode":"function_tools","round":1}')
|
||||||
session.add(log)
|
session.add(log)
|
||||||
|
|
|
||||||
275
backend/tests/test_auth.py
Normal file
275
backend/tests/test_auth.py
Normal file
|
|
@ -0,0 +1,275 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import json
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
import jwt
|
||||||
|
import pytest
|
||||||
|
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||||
|
from fastapi import HTTPException
|
||||||
|
from fastapi.testclient import TestClient
|
||||||
|
from jwt import algorithms
|
||||||
|
from sqlmodel import Session, SQLModel, create_engine
|
||||||
|
from sqlmodel.pool import StaticPool
|
||||||
|
|
||||||
|
import db as db_module
|
||||||
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import require_admin
|
||||||
|
from innercontext.auth import (
|
||||||
|
CurrentHouseholdMembership,
|
||||||
|
CurrentUser,
|
||||||
|
IdentityData,
|
||||||
|
TokenClaims,
|
||||||
|
reset_auth_caches,
|
||||||
|
validate_access_token,
|
||||||
|
)
|
||||||
|
from innercontext.models import (
|
||||||
|
Household,
|
||||||
|
HouseholdMembership,
|
||||||
|
HouseholdRole,
|
||||||
|
Role,
|
||||||
|
User,
|
||||||
|
)
|
||||||
|
from main import app
|
||||||
|
|
||||||
|
|
||||||
|
class _MockResponse:
|
||||||
|
def __init__(self, payload: dict[str, object], status_code: int = 200):
|
||||||
|
self._payload = payload
|
||||||
|
self.status_code = status_code
|
||||||
|
|
||||||
|
def raise_for_status(self) -> None:
|
||||||
|
if self.status_code >= 400:
|
||||||
|
raise RuntimeError(f"unexpected status {self.status_code}")
|
||||||
|
|
||||||
|
def json(self) -> dict[str, object]:
|
||||||
|
return self._payload
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def auth_env(monkeypatch):
|
||||||
|
monkeypatch.setenv("OIDC_ISSUER", "https://auth.example.test")
|
||||||
|
monkeypatch.setenv("OIDC_CLIENT_ID", "innercontext-web")
|
||||||
|
monkeypatch.setenv(
|
||||||
|
"OIDC_DISCOVERY_URL",
|
||||||
|
"https://auth.example.test/.well-known/openid-configuration",
|
||||||
|
)
|
||||||
|
monkeypatch.setenv("OIDC_ADMIN_GROUPS", "innercontext-admin")
|
||||||
|
monkeypatch.setenv("OIDC_MEMBER_GROUPS", "innercontext-member")
|
||||||
|
monkeypatch.setenv("OIDC_JWKS_CACHE_TTL_SECONDS", "3600")
|
||||||
|
reset_auth_caches()
|
||||||
|
yield
|
||||||
|
reset_auth_caches()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def rsa_keypair():
|
||||||
|
private_key = rsa.generate_private_key(
|
||||||
|
public_exponent=65537,
|
||||||
|
key_size=2048,
|
||||||
|
)
|
||||||
|
return private_key, private_key.public_key()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def auth_session(monkeypatch):
|
||||||
|
engine = create_engine(
|
||||||
|
"sqlite://",
|
||||||
|
connect_args={"check_same_thread": False},
|
||||||
|
poolclass=StaticPool,
|
||||||
|
)
|
||||||
|
monkeypatch.setattr(db_module, "engine", engine)
|
||||||
|
import innercontext.models # noqa: F401
|
||||||
|
|
||||||
|
SQLModel.metadata.create_all(engine)
|
||||||
|
with Session(engine) as session:
|
||||||
|
yield session
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def auth_client(auth_session):
|
||||||
|
def _override():
|
||||||
|
yield auth_session
|
||||||
|
|
||||||
|
app.dependency_overrides[get_session] = _override
|
||||||
|
with TestClient(app) as client:
|
||||||
|
yield client
|
||||||
|
app.dependency_overrides.clear()
|
||||||
|
|
||||||
|
|
||||||
|
def _public_jwk(public_key, kid: str) -> dict[str, object]:
|
||||||
|
jwk = json.loads(algorithms.RSAAlgorithm.to_jwk(public_key))
|
||||||
|
jwk["kid"] = kid
|
||||||
|
jwk["use"] = "sig"
|
||||||
|
jwk["alg"] = "RS256"
|
||||||
|
return jwk
|
||||||
|
|
||||||
|
|
||||||
|
def _sign_token(private_key, kid: str, **claims_overrides: object) -> str:
|
||||||
|
now = datetime.now(UTC)
|
||||||
|
payload: dict[str, object] = {
|
||||||
|
"iss": "https://auth.example.test",
|
||||||
|
"sub": "user-123",
|
||||||
|
"aud": "innercontext-web",
|
||||||
|
"exp": int((now + timedelta(hours=1)).timestamp()),
|
||||||
|
"iat": int(now.timestamp()),
|
||||||
|
"groups": ["innercontext-admin"],
|
||||||
|
"email": "user@example.test",
|
||||||
|
"name": "Inner Context User",
|
||||||
|
"preferred_username": "ictx-user",
|
||||||
|
}
|
||||||
|
payload.update(claims_overrides)
|
||||||
|
return jwt.encode(payload, private_key, algorithm="RS256", headers={"kid": kid})
|
||||||
|
|
||||||
|
|
||||||
|
def _mock_oidc(monkeypatch, public_key, *, fetch_counts: dict[str, int] | None = None):
|
||||||
|
def _fake_get(url: str, timeout: float):
|
||||||
|
if fetch_counts is not None:
|
||||||
|
fetch_counts[url] = fetch_counts.get(url, 0) + 1
|
||||||
|
if url.endswith("/.well-known/openid-configuration"):
|
||||||
|
return _MockResponse({"jwks_uri": "https://auth.example.test/jwks.json"})
|
||||||
|
if url.endswith("/jwks.json"):
|
||||||
|
return _MockResponse({"keys": [_public_jwk(public_key, "kid-1")]})
|
||||||
|
raise AssertionError(f"unexpected URL {url} with timeout {timeout}")
|
||||||
|
|
||||||
|
monkeypatch.setattr("innercontext.auth.httpx.get", _fake_get)
|
||||||
|
|
||||||
|
|
||||||
|
def test_validate_access_token_uses_cached_jwks(auth_env, rsa_keypair, monkeypatch):
|
||||||
|
private_key, public_key = rsa_keypair
|
||||||
|
fetch_counts: dict[str, int] = {}
|
||||||
|
_mock_oidc(monkeypatch, public_key, fetch_counts=fetch_counts)
|
||||||
|
|
||||||
|
validate_access_token(_sign_token(private_key, "kid-1", sub="user-a"))
|
||||||
|
validate_access_token(_sign_token(private_key, "kid-1", sub="user-b"))
|
||||||
|
|
||||||
|
assert (
|
||||||
|
fetch_counts["https://auth.example.test/.well-known/openid-configuration"] == 1
|
||||||
|
)
|
||||||
|
assert fetch_counts["https://auth.example.test/jwks.json"] == 1
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.parametrize(
|
||||||
|
("path", "payload"),
|
||||||
|
[
|
||||||
|
(
|
||||||
|
"/auth/session/sync",
|
||||||
|
{
|
||||||
|
"email": "sync@example.test",
|
||||||
|
"name": "Synced User",
|
||||||
|
"preferred_username": "synced-user",
|
||||||
|
"groups": ["innercontext-admin"],
|
||||||
|
},
|
||||||
|
),
|
||||||
|
("/auth/me", None),
|
||||||
|
],
|
||||||
|
ids=["/auth/session/sync", "/auth/me"],
|
||||||
|
)
|
||||||
|
def test_sync_protected_endpoints_create_or_resolve_current_user(
|
||||||
|
auth_env,
|
||||||
|
auth_client,
|
||||||
|
auth_session,
|
||||||
|
rsa_keypair,
|
||||||
|
monkeypatch,
|
||||||
|
path: str,
|
||||||
|
payload: dict[str, object] | None,
|
||||||
|
):
|
||||||
|
private_key, public_key = rsa_keypair
|
||||||
|
_mock_oidc(monkeypatch, public_key)
|
||||||
|
token = _sign_token(private_key, "kid-1")
|
||||||
|
|
||||||
|
if path == "/auth/me":
|
||||||
|
user = User(
|
||||||
|
oidc_issuer="https://auth.example.test",
|
||||||
|
oidc_subject="user-123",
|
||||||
|
role=Role.ADMIN,
|
||||||
|
)
|
||||||
|
auth_session.add(user)
|
||||||
|
auth_session.commit()
|
||||||
|
auth_session.refresh(user)
|
||||||
|
|
||||||
|
household = Household()
|
||||||
|
auth_session.add(household)
|
||||||
|
auth_session.commit()
|
||||||
|
auth_session.refresh(household)
|
||||||
|
|
||||||
|
membership = HouseholdMembership(
|
||||||
|
user_id=user.id,
|
||||||
|
household_id=household.id,
|
||||||
|
role=HouseholdRole.OWNER,
|
||||||
|
)
|
||||||
|
auth_session.add(membership)
|
||||||
|
auth_session.commit()
|
||||||
|
|
||||||
|
response = auth_client.request(
|
||||||
|
"POST" if path.endswith("sync") else "GET",
|
||||||
|
path,
|
||||||
|
headers={"Authorization": f"Bearer {token}"},
|
||||||
|
json=payload,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.json()
|
||||||
|
assert data["user"]["role"] == "admin"
|
||||||
|
assert data["identity"]["issuer"] == "https://auth.example.test"
|
||||||
|
assert data["identity"]["subject"] == "user-123"
|
||||||
|
|
||||||
|
synced_user = auth_session.get(User, UUID(data["user"]["id"]))
|
||||||
|
assert synced_user is not None
|
||||||
|
assert synced_user.oidc_issuer == "https://auth.example.test"
|
||||||
|
assert synced_user.oidc_subject == "user-123"
|
||||||
|
|
||||||
|
if path == "/auth/session/sync":
|
||||||
|
assert data["identity"]["email"] == "sync@example.test"
|
||||||
|
assert data["identity"]["groups"] == ["innercontext-admin"]
|
||||||
|
else:
|
||||||
|
assert data["user"]["household_membership"]["role"] == "owner"
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.parametrize(
|
||||||
|
"path",
|
||||||
|
["/auth/me", "/profile"],
|
||||||
|
ids=["/auth/me expects 401", "/profile expects 401"],
|
||||||
|
)
|
||||||
|
def test_unauthorized_protected_endpoints_return_401(auth_env, auth_client, path: str):
|
||||||
|
response = auth_client.get(path)
|
||||||
|
assert response.status_code == 401
|
||||||
|
assert response.json()["detail"] == "Missing bearer token"
|
||||||
|
|
||||||
|
|
||||||
|
def test_unauthorized_invalid_bearer_token_is_rejected(
|
||||||
|
auth_env, auth_client, rsa_keypair, monkeypatch
|
||||||
|
):
|
||||||
|
_, public_key = rsa_keypair
|
||||||
|
_mock_oidc(monkeypatch, public_key)
|
||||||
|
response = auth_client.get(
|
||||||
|
"/auth/me",
|
||||||
|
headers={"Authorization": "Bearer not-a-jwt"},
|
||||||
|
)
|
||||||
|
assert response.status_code == 401
|
||||||
|
|
||||||
|
|
||||||
|
def test_require_admin_raises_for_member():
|
||||||
|
claims = TokenClaims(
|
||||||
|
issuer="https://auth.example.test",
|
||||||
|
subject="member-1",
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
raw_claims={"iss": "https://auth.example.test", "sub": "member-1"},
|
||||||
|
)
|
||||||
|
current_user = CurrentUser(
|
||||||
|
user_id=uuid4(),
|
||||||
|
role=Role.MEMBER,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
household_membership=CurrentHouseholdMembership(
|
||||||
|
household_id=uuid4(),
|
||||||
|
role=HouseholdRole.MEMBER,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
with pytest.raises(HTTPException) as exc_info:
|
||||||
|
require_admin(current_user)
|
||||||
|
|
||||||
|
assert exc_info.value.status_code == 403
|
||||||
293
backend/tests/test_authz.py
Normal file
293
backend/tests/test_authz.py
Normal file
|
|
@ -0,0 +1,293 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from fastapi import HTTPException
|
||||||
|
from sqlmodel import Session
|
||||||
|
|
||||||
|
from innercontext.api.authz import (
|
||||||
|
can_update_inventory,
|
||||||
|
check_household_inventory_access,
|
||||||
|
get_owned_or_404,
|
||||||
|
get_owned_or_404_admin_override,
|
||||||
|
is_product_visible,
|
||||||
|
list_owned,
|
||||||
|
list_owned_admin_override,
|
||||||
|
)
|
||||||
|
from innercontext.auth import (
|
||||||
|
CurrentHouseholdMembership,
|
||||||
|
CurrentUser,
|
||||||
|
IdentityData,
|
||||||
|
TokenClaims,
|
||||||
|
)
|
||||||
|
from innercontext.models import (
|
||||||
|
Household,
|
||||||
|
HouseholdMembership,
|
||||||
|
HouseholdRole,
|
||||||
|
DayTime,
|
||||||
|
MedicationEntry,
|
||||||
|
MedicationKind,
|
||||||
|
Product,
|
||||||
|
ProductCategory,
|
||||||
|
ProductInventory,
|
||||||
|
Role,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _claims(subject: str) -> TokenClaims:
|
||||||
|
return TokenClaims(
|
||||||
|
issuer="https://auth.example.test",
|
||||||
|
subject=subject,
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
raw_claims={"iss": "https://auth.example.test", "sub": subject},
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _current_user(
|
||||||
|
user_id: UUID,
|
||||||
|
*,
|
||||||
|
role: Role = Role.MEMBER,
|
||||||
|
household_id: UUID | None = None,
|
||||||
|
) -> CurrentUser:
|
||||||
|
claims = _claims(str(user_id))
|
||||||
|
membership = None
|
||||||
|
if household_id is not None:
|
||||||
|
membership = CurrentHouseholdMembership(
|
||||||
|
household_id=household_id,
|
||||||
|
role=HouseholdRole.MEMBER,
|
||||||
|
)
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=user_id,
|
||||||
|
role=role,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
household_membership=membership,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _create_household(session: Session) -> Household:
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
return household
|
||||||
|
|
||||||
|
|
||||||
|
def _create_membership(
|
||||||
|
session: Session, user_id: UUID, household_id: UUID
|
||||||
|
) -> HouseholdMembership:
|
||||||
|
membership = HouseholdMembership(user_id=user_id, household_id=household_id)
|
||||||
|
session.add(membership)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(membership)
|
||||||
|
return membership
|
||||||
|
|
||||||
|
|
||||||
|
def _create_medication(session: Session, user_id: UUID) -> MedicationEntry:
|
||||||
|
entry = MedicationEntry(
|
||||||
|
user_id=user_id,
|
||||||
|
kind=MedicationKind.PRESCRIPTION,
|
||||||
|
product_name="Test medication",
|
||||||
|
)
|
||||||
|
session.add(entry)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(entry)
|
||||||
|
return entry
|
||||||
|
|
||||||
|
|
||||||
|
def _create_product(session: Session, user_id: UUID, short_id: str) -> Product:
|
||||||
|
product = Product(
|
||||||
|
user_id=user_id,
|
||||||
|
short_id=short_id,
|
||||||
|
name="Shared product",
|
||||||
|
brand="Test brand",
|
||||||
|
category=ProductCategory.MOISTURIZER,
|
||||||
|
recommended_time=DayTime.BOTH,
|
||||||
|
leave_on=True,
|
||||||
|
)
|
||||||
|
setattr(product, "product_effect_profile", {})
|
||||||
|
session.add(product)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(product)
|
||||||
|
return product
|
||||||
|
|
||||||
|
|
||||||
|
def _create_inventory(
|
||||||
|
session: Session,
|
||||||
|
*,
|
||||||
|
user_id: UUID,
|
||||||
|
product_id: UUID,
|
||||||
|
is_household_shared: bool,
|
||||||
|
) -> ProductInventory:
|
||||||
|
inventory = ProductInventory(
|
||||||
|
user_id=user_id,
|
||||||
|
product_id=product_id,
|
||||||
|
is_household_shared=is_household_shared,
|
||||||
|
)
|
||||||
|
session.add(inventory)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(inventory)
|
||||||
|
return inventory
|
||||||
|
|
||||||
|
|
||||||
|
def test_owner_helpers_return_only_owned_records(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
other_id = uuid4()
|
||||||
|
owner_user = _current_user(owner_id)
|
||||||
|
owner_entry = _create_medication(session, owner_id)
|
||||||
|
_ = _create_medication(session, other_id)
|
||||||
|
|
||||||
|
fetched = get_owned_or_404(
|
||||||
|
session, MedicationEntry, owner_entry.record_id, owner_user
|
||||||
|
)
|
||||||
|
owned_entries = list_owned(session, MedicationEntry, owner_user)
|
||||||
|
|
||||||
|
assert fetched.record_id == owner_entry.record_id
|
||||||
|
assert len(owned_entries) == 1
|
||||||
|
assert owned_entries[0].user_id == owner_id
|
||||||
|
|
||||||
|
|
||||||
|
def test_admin_helpers_allow_admin_override_for_lookup_and_list(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
admin_user = _current_user(uuid4(), role=Role.ADMIN)
|
||||||
|
owner_entry = _create_medication(session, owner_id)
|
||||||
|
|
||||||
|
fetched = get_owned_or_404_admin_override(
|
||||||
|
session,
|
||||||
|
MedicationEntry,
|
||||||
|
owner_entry.record_id,
|
||||||
|
admin_user,
|
||||||
|
)
|
||||||
|
listed = list_owned_admin_override(session, MedicationEntry, admin_user)
|
||||||
|
|
||||||
|
assert fetched.record_id == owner_entry.record_id
|
||||||
|
assert len(listed) == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_owner_denied_for_non_owned_lookup_returns_404(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
intruder = _current_user(uuid4())
|
||||||
|
owner_entry = _create_medication(session, owner_id)
|
||||||
|
|
||||||
|
with pytest.raises(HTTPException) as exc_info:
|
||||||
|
_ = get_owned_or_404(session, MedicationEntry, owner_entry.record_id, intruder)
|
||||||
|
|
||||||
|
assert exc_info.value.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_shared_inventory_access_allows_same_household_member(
|
||||||
|
session: Session,
|
||||||
|
):
|
||||||
|
owner_id = uuid4()
|
||||||
|
household_member_id = uuid4()
|
||||||
|
household = _create_household(session)
|
||||||
|
_ = _create_membership(session, owner_id, household.id)
|
||||||
|
_ = _create_membership(session, household_member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(session, owner_id, short_id="abcd0001")
|
||||||
|
inventory = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
current_user = _current_user(household_member_id, household_id=household.id)
|
||||||
|
fetched = check_household_inventory_access(session, inventory.id, current_user)
|
||||||
|
|
||||||
|
assert fetched.id == inventory.id
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_shared_inventory_denied_for_cross_household_member(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
outsider_id = uuid4()
|
||||||
|
owner_household = _create_household(session)
|
||||||
|
outsider_household = _create_household(session)
|
||||||
|
_ = _create_membership(session, owner_id, owner_household.id)
|
||||||
|
_ = _create_membership(session, outsider_id, outsider_household.id)
|
||||||
|
|
||||||
|
product = _create_product(session, owner_id, short_id="abcd0002")
|
||||||
|
inventory = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
outsider = _current_user(outsider_id, household_id=outsider_household.id)
|
||||||
|
|
||||||
|
with pytest.raises(HTTPException) as exc_info:
|
||||||
|
_ = check_household_inventory_access(session, inventory.id, outsider)
|
||||||
|
|
||||||
|
assert exc_info.value.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_inventory_update_rules_owner_admin_and_member(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = _create_household(session)
|
||||||
|
_ = _create_membership(session, owner_id, household.id)
|
||||||
|
_ = _create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(session, owner_id, short_id="abcd0003")
|
||||||
|
inventory = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
owner = _current_user(owner_id, household_id=household.id)
|
||||||
|
admin = _current_user(uuid4(), role=Role.ADMIN)
|
||||||
|
member = _current_user(member_id, household_id=household.id)
|
||||||
|
|
||||||
|
assert can_update_inventory(session, inventory.id, owner) is True
|
||||||
|
assert can_update_inventory(session, inventory.id, admin) is True
|
||||||
|
assert can_update_inventory(session, inventory.id, member) is True
|
||||||
|
|
||||||
|
|
||||||
|
def test_product_visibility_for_owner_admin_and_household_shared(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = _create_household(session)
|
||||||
|
_ = _create_membership(session, owner_id, household.id)
|
||||||
|
_ = _create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(session, owner_id, short_id="abcd0004")
|
||||||
|
_ = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
owner = _current_user(owner_id, household_id=household.id)
|
||||||
|
admin = _current_user(uuid4(), role=Role.ADMIN)
|
||||||
|
member = _current_user(member_id, household_id=household.id)
|
||||||
|
|
||||||
|
assert is_product_visible(session, product.id, owner) is True
|
||||||
|
assert is_product_visible(session, product.id, admin) is True
|
||||||
|
assert is_product_visible(session, product.id, member) is True
|
||||||
|
|
||||||
|
|
||||||
|
def test_product_visibility_denied_for_cross_household_member(session: Session):
|
||||||
|
owner_id = uuid4()
|
||||||
|
outsider_id = uuid4()
|
||||||
|
owner_household = _create_household(session)
|
||||||
|
outsider_household = _create_household(session)
|
||||||
|
_ = _create_membership(session, owner_id, owner_household.id)
|
||||||
|
_ = _create_membership(session, outsider_id, outsider_household.id)
|
||||||
|
|
||||||
|
product = _create_product(session, owner_id, short_id="abcd0005")
|
||||||
|
_ = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
outsider = _current_user(outsider_id, household_id=outsider_household.id)
|
||||||
|
|
||||||
|
assert is_product_visible(session, product.id, outsider) is False
|
||||||
|
|
@ -24,12 +24,17 @@ def test_update_inventory_opened(client, created_product):
|
||||||
|
|
||||||
r2 = client.patch(
|
r2 = client.patch(
|
||||||
f"/inventory/{inv_id}",
|
f"/inventory/{inv_id}",
|
||||||
json={"is_opened": True, "opened_at": "2026-01-15"},
|
json={
|
||||||
|
"is_opened": True,
|
||||||
|
"opened_at": "2026-01-15",
|
||||||
|
"remaining_level": "low",
|
||||||
|
},
|
||||||
)
|
)
|
||||||
assert r2.status_code == 200
|
assert r2.status_code == 200
|
||||||
data = r2.json()
|
data = r2.json()
|
||||||
assert data["is_opened"] is True
|
assert data["is_opened"] is True
|
||||||
assert data["opened_at"] == "2026-01-15"
|
assert data["opened_at"] == "2026-01-15"
|
||||||
|
assert data["remaining_level"] == "low"
|
||||||
|
|
||||||
|
|
||||||
def test_update_inventory_not_found(client):
|
def test_update_inventory_not_found(client):
|
||||||
|
|
|
||||||
|
|
@ -187,11 +187,15 @@ def test_list_inventory_product_not_found(client):
|
||||||
|
|
||||||
def test_create_inventory(client, created_product):
|
def test_create_inventory(client, created_product):
|
||||||
pid = created_product["id"]
|
pid = created_product["id"]
|
||||||
r = client.post(f"/products/{pid}/inventory", json={"is_opened": False})
|
r = client.post(
|
||||||
|
f"/products/{pid}/inventory",
|
||||||
|
json={"is_opened": True, "remaining_level": "medium"},
|
||||||
|
)
|
||||||
assert r.status_code == 201
|
assert r.status_code == 201
|
||||||
data = r.json()
|
data = r.json()
|
||||||
assert data["product_id"] == pid
|
assert data["product_id"] == pid
|
||||||
assert data["is_opened"] is False
|
assert data["is_opened"] is True
|
||||||
|
assert data["remaining_level"] == "medium"
|
||||||
|
|
||||||
|
|
||||||
def test_create_inventory_product_not_found(client):
|
def test_create_inventory_product_not_found(client):
|
||||||
|
|
@ -204,11 +208,16 @@ def test_parse_text_accepts_numeric_strength_levels(client, monkeypatch):
|
||||||
|
|
||||||
class _FakeResponse:
|
class _FakeResponse:
|
||||||
text = (
|
text = (
|
||||||
'{"name":"Test Serum","actives":[{"name":"Niacinamide","percent":10,'
|
'{"name":"Test Serum","category":"serum","recommended_time":"both",'
|
||||||
|
'"leave_on":true,"actives":[{"name":"Niacinamide","percent":10,'
|
||||||
'"functions":["niacinamide"],"strength_level":2,"irritation_potential":1}]}'
|
'"functions":["niacinamide"],"strength_level":2,"irritation_potential":1}]}'
|
||||||
)
|
)
|
||||||
|
|
||||||
monkeypatch.setattr(products_api, "call_gemini", lambda **kwargs: _FakeResponse())
|
monkeypatch.setattr(
|
||||||
|
products_api,
|
||||||
|
"call_gemini",
|
||||||
|
lambda **kwargs: (_FakeResponse(), None),
|
||||||
|
)
|
||||||
|
|
||||||
r = client.post("/products/parse-text", json={"text": "dummy input"})
|
r = client.post("/products/parse-text", json={"text": "dummy input"})
|
||||||
assert r.status_code == 200
|
assert r.status_code == 200
|
||||||
|
|
|
||||||
370
backend/tests/test_products_auth.py
Normal file
370
backend/tests/test_products_auth.py
Normal file
|
|
@ -0,0 +1,370 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from uuid import UUID, uuid4
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from fastapi.testclient import TestClient
|
||||||
|
|
||||||
|
from db import get_session
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import (
|
||||||
|
CurrentHouseholdMembership,
|
||||||
|
CurrentUser,
|
||||||
|
IdentityData,
|
||||||
|
TokenClaims,
|
||||||
|
)
|
||||||
|
from innercontext.models import (
|
||||||
|
DayTime,
|
||||||
|
Household,
|
||||||
|
HouseholdMembership,
|
||||||
|
HouseholdRole,
|
||||||
|
Product,
|
||||||
|
ProductCategory,
|
||||||
|
ProductInventory,
|
||||||
|
Role,
|
||||||
|
)
|
||||||
|
from main import app
|
||||||
|
|
||||||
|
|
||||||
|
def _current_user(
|
||||||
|
user_id: UUID,
|
||||||
|
*,
|
||||||
|
role: Role = Role.MEMBER,
|
||||||
|
household_id: UUID | None = None,
|
||||||
|
) -> CurrentUser:
|
||||||
|
claims = TokenClaims(
|
||||||
|
issuer="https://auth.test",
|
||||||
|
subject=str(user_id),
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
groups=("innercontext-member",),
|
||||||
|
raw_claims={"iss": "https://auth.test", "sub": str(user_id)},
|
||||||
|
)
|
||||||
|
membership = None
|
||||||
|
if household_id is not None:
|
||||||
|
membership = CurrentHouseholdMembership(
|
||||||
|
household_id=household_id,
|
||||||
|
role=HouseholdRole.MEMBER,
|
||||||
|
)
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=user_id,
|
||||||
|
role=role,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
household_membership=membership,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _create_membership(session, user_id: UUID, household_id: UUID) -> None:
|
||||||
|
membership = HouseholdMembership(
|
||||||
|
user_id=user_id,
|
||||||
|
household_id=household_id,
|
||||||
|
role=HouseholdRole.MEMBER,
|
||||||
|
)
|
||||||
|
session.add(membership)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
|
||||||
|
def _create_product(session, *, user_id: UUID, short_id: str, name: str) -> Product:
|
||||||
|
product = Product(
|
||||||
|
user_id=user_id,
|
||||||
|
short_id=short_id,
|
||||||
|
name=name,
|
||||||
|
brand="Brand",
|
||||||
|
category=ProductCategory.SERUM,
|
||||||
|
recommended_time=DayTime.BOTH,
|
||||||
|
leave_on=True,
|
||||||
|
)
|
||||||
|
setattr(product, "product_effect_profile", {})
|
||||||
|
session.add(product)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(product)
|
||||||
|
return product
|
||||||
|
|
||||||
|
|
||||||
|
def _create_inventory(
|
||||||
|
session,
|
||||||
|
*,
|
||||||
|
user_id: UUID,
|
||||||
|
product_id: UUID,
|
||||||
|
is_household_shared: bool,
|
||||||
|
) -> ProductInventory:
|
||||||
|
entry = ProductInventory(
|
||||||
|
user_id=user_id,
|
||||||
|
product_id=product_id,
|
||||||
|
is_household_shared=is_household_shared,
|
||||||
|
)
|
||||||
|
session.add(entry)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(entry)
|
||||||
|
return entry
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture()
|
||||||
|
def auth_client(session):
|
||||||
|
auth_state = {"current_user": _current_user(uuid4(), role=Role.ADMIN)}
|
||||||
|
|
||||||
|
def _session_override():
|
||||||
|
yield session
|
||||||
|
|
||||||
|
def _current_user_override():
|
||||||
|
return auth_state["current_user"]
|
||||||
|
|
||||||
|
app.dependency_overrides[get_session] = _session_override
|
||||||
|
app.dependency_overrides[get_current_user] = _current_user_override
|
||||||
|
with TestClient(app) as client:
|
||||||
|
yield client, auth_state
|
||||||
|
app.dependency_overrides.clear()
|
||||||
|
|
||||||
|
|
||||||
|
def test_product_endpoints_require_authentication(session):
|
||||||
|
def _session_override():
|
||||||
|
yield session
|
||||||
|
|
||||||
|
app.dependency_overrides[get_session] = _session_override
|
||||||
|
app.dependency_overrides.pop(get_current_user, None)
|
||||||
|
with TestClient(app) as client:
|
||||||
|
response = client.get("/products")
|
||||||
|
app.dependency_overrides.clear()
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
assert response.json()["detail"] == "Missing bearer token"
|
||||||
|
|
||||||
|
|
||||||
|
def test_shared_product_visible_in_summary_marks_is_owned_false(auth_client, session):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
shared_product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd001",
|
||||||
|
name="Shared Product",
|
||||||
|
)
|
||||||
|
_ = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=shared_product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
response = client.get("/products/summary")
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
items = response.json()
|
||||||
|
shared_item = next(item for item in items if item["id"] == str(shared_product.id))
|
||||||
|
assert shared_item["is_owned"] is False
|
||||||
|
|
||||||
|
|
||||||
|
def test_shared_product_visible_filters_private_inventory_rows(auth_client, session):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd002",
|
||||||
|
name="Shared Inventory Product",
|
||||||
|
)
|
||||||
|
shared_row = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
_ = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
response = client.get(f"/products/{product.id}")
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
inventory_ids = {entry["id"] for entry in response.json()["inventory"]}
|
||||||
|
assert str(shared_row.id) in inventory_ids
|
||||||
|
assert len(inventory_ids) == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_shared_inventory_update_allows_household_member(auth_client, session):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd003",
|
||||||
|
name="Shared Update Product",
|
||||||
|
)
|
||||||
|
inventory = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
response = client.patch(
|
||||||
|
f"/inventory/{inventory.id}",
|
||||||
|
json={"is_opened": True, "remaining_level": "low"},
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
assert response.json()["is_opened"] is True
|
||||||
|
assert response.json()["remaining_level"] == "low"
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_member_cannot_edit_shared_product(auth_client, session):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd004",
|
||||||
|
name="Shared No Edit",
|
||||||
|
)
|
||||||
|
_ = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
response = client.patch(f"/products/{product.id}", json={"name": "Intrusion"})
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_member_cannot_delete_shared_product(auth_client, session):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd005",
|
||||||
|
name="Shared No Delete",
|
||||||
|
)
|
||||||
|
_ = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
response = client.delete(f"/products/{product.id}")
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_member_cannot_create_or_delete_inventory_on_shared_product(
|
||||||
|
auth_client, session
|
||||||
|
):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd006",
|
||||||
|
name="Shared Inventory Restrictions",
|
||||||
|
)
|
||||||
|
inventory = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
create_response = client.post(f"/products/{product.id}/inventory", json={})
|
||||||
|
delete_response = client.delete(f"/inventory/{inventory.id}")
|
||||||
|
|
||||||
|
assert create_response.status_code == 404
|
||||||
|
assert delete_response.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
def test_household_member_cannot_update_non_shared_inventory(auth_client, session):
|
||||||
|
client, auth_state = auth_client
|
||||||
|
|
||||||
|
owner_id = uuid4()
|
||||||
|
member_id = uuid4()
|
||||||
|
household = Household()
|
||||||
|
session.add(household)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(household)
|
||||||
|
_create_membership(session, owner_id, household.id)
|
||||||
|
_create_membership(session, member_id, household.id)
|
||||||
|
|
||||||
|
product = _create_product(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
short_id="shprd007",
|
||||||
|
name="Private Inventory",
|
||||||
|
)
|
||||||
|
inventory = _create_inventory(
|
||||||
|
session,
|
||||||
|
user_id=owner_id,
|
||||||
|
product_id=product.id,
|
||||||
|
is_household_shared=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
auth_state["current_user"] = _current_user(member_id, household_id=household.id)
|
||||||
|
response = client.patch(f"/inventory/{inventory.id}", json={"is_opened": True})
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
|
@ -5,33 +5,50 @@ from unittest.mock import patch
|
||||||
from sqlmodel import Session
|
from sqlmodel import Session
|
||||||
|
|
||||||
from innercontext.api.products import (
|
from innercontext.api.products import (
|
||||||
|
ProductSuggestion,
|
||||||
|
ShoppingSuggestionResponse,
|
||||||
_build_shopping_context,
|
_build_shopping_context,
|
||||||
|
_compute_days_since_last_used,
|
||||||
|
_compute_replenishment_score,
|
||||||
_extract_requested_product_ids,
|
_extract_requested_product_ids,
|
||||||
build_product_details_tool_handler,
|
build_product_details_tool_handler,
|
||||||
)
|
)
|
||||||
from innercontext.models import (
|
from innercontext.models import (
|
||||||
Product,
|
Product,
|
||||||
|
ProductCategory,
|
||||||
ProductInventory,
|
ProductInventory,
|
||||||
SexAtBirth,
|
SexAtBirth,
|
||||||
|
SkinConcern,
|
||||||
SkinConditionSnapshot,
|
SkinConditionSnapshot,
|
||||||
)
|
)
|
||||||
from innercontext.models.profile import UserProfile
|
from innercontext.models.profile import UserProfile
|
||||||
|
from innercontext.validators.shopping_validator import (
|
||||||
|
ShoppingValidationContext,
|
||||||
|
ShoppingValidator,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
def test_build_shopping_context(session: Session):
|
def test_build_shopping_context(session: Session, current_user):
|
||||||
# Empty context
|
# Empty context
|
||||||
ctx = _build_shopping_context(session, reference_date=date.today())
|
ctx = _build_shopping_context(
|
||||||
|
session, reference_date=date.today(), current_user=current_user
|
||||||
|
)
|
||||||
assert "USER PROFILE: no data" in ctx
|
assert "USER PROFILE: no data" in ctx
|
||||||
assert "(brak danych)" in ctx
|
assert "(brak danych)" in ctx
|
||||||
assert "POSIADANE PRODUKTY" in ctx
|
assert "POSIADANE PRODUKTY" in ctx
|
||||||
|
|
||||||
profile = UserProfile(birth_date=date(1990, 1, 10), sex_at_birth=SexAtBirth.MALE)
|
profile = UserProfile(
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
birth_date=date(1990, 1, 10),
|
||||||
|
sex_at_birth=SexAtBirth.MALE,
|
||||||
|
)
|
||||||
session.add(profile)
|
session.add(profile)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
# Add snapshot
|
# Add snapshot
|
||||||
snap = SkinConditionSnapshot(
|
snap = SkinConditionSnapshot(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
snapshot_date=date.today(),
|
snapshot_date=date.today(),
|
||||||
overall_state="fair",
|
overall_state="fair",
|
||||||
skin_type="combination",
|
skin_type="combination",
|
||||||
|
|
@ -46,6 +63,7 @@ def test_build_shopping_context(session: Session):
|
||||||
# Add product
|
# Add product
|
||||||
p = Product(
|
p = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Soothing Serum",
|
name="Soothing Serum",
|
||||||
brand="BrandX",
|
brand="BrandX",
|
||||||
category="serum",
|
category="serum",
|
||||||
|
|
@ -59,11 +77,18 @@ def test_build_shopping_context(session: Session):
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
# Add inventory
|
# Add inventory
|
||||||
inv = ProductInventory(id=uuid.uuid4(), product_id=p.id, is_opened=True)
|
inv = ProductInventory(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
product_id=p.id,
|
||||||
|
is_opened=True,
|
||||||
|
remaining_level="medium",
|
||||||
|
)
|
||||||
session.add(inv)
|
session.add(inv)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
ctx = _build_shopping_context(session, reference_date=date(2026, 3, 5))
|
ctx = _build_shopping_context(
|
||||||
|
session, reference_date=date(2026, 3, 5), current_user=current_user
|
||||||
|
)
|
||||||
assert "USER PROFILE:" in ctx
|
assert "USER PROFILE:" in ctx
|
||||||
assert "Age: 36" in ctx
|
assert "Age: 36" in ctx
|
||||||
assert "Sex at birth: male" in ctx
|
assert "Sex at birth: male" in ctx
|
||||||
|
|
@ -78,12 +103,165 @@ def test_build_shopping_context(session: Session):
|
||||||
assert "Soothing Serum" in ctx
|
assert "Soothing Serum" in ctx
|
||||||
assert f"id={p.id}" in ctx
|
assert f"id={p.id}" in ctx
|
||||||
assert "BrandX" in ctx
|
assert "BrandX" in ctx
|
||||||
assert "targets: ['redness']" in ctx
|
assert "targets=['redness']" in ctx
|
||||||
assert "actives: ['Centella']" in ctx
|
assert "actives=['Centella']" in ctx
|
||||||
assert "effects: {'soothing': 4}" in ctx
|
assert "effects={'soothing': 4}" in ctx
|
||||||
|
assert "stock_state=monitor" in ctx
|
||||||
|
assert "opened_count=1" in ctx
|
||||||
|
assert "sealed_backup_count=0" in ctx
|
||||||
|
assert "lowest_remaining_level=medium" in ctx
|
||||||
|
assert "replenishment_score=30" in ctx
|
||||||
|
assert "replenishment_priority_hint=low" in ctx
|
||||||
|
assert "repurchase_candidate=true" in ctx
|
||||||
|
|
||||||
|
|
||||||
|
def test_build_shopping_context_flags_replenishment_signal(
|
||||||
|
session: Session, current_user
|
||||||
|
):
|
||||||
|
product = Product(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
|
name="Barrier Cleanser",
|
||||||
|
brand="BrandY",
|
||||||
|
category="cleanser",
|
||||||
|
recommended_time="both",
|
||||||
|
leave_on=False,
|
||||||
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
)
|
||||||
|
session.add(product)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
session.add(
|
||||||
|
ProductInventory(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
product_id=product.id,
|
||||||
|
is_opened=True,
|
||||||
|
remaining_level="nearly_empty",
|
||||||
|
)
|
||||||
|
)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
ctx = _build_shopping_context(
|
||||||
|
session, reference_date=date.today(), current_user=current_user
|
||||||
|
)
|
||||||
|
assert "lowest_remaining_level=nearly_empty" in ctx
|
||||||
|
assert "stock_state=urgent" in ctx
|
||||||
|
assert "replenishment_priority_hint=high" in ctx
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_replenishment_score_prefers_recent_staples_without_backup():
|
||||||
|
result = _compute_replenishment_score(
|
||||||
|
has_stock=True,
|
||||||
|
sealed_backup_count=0,
|
||||||
|
lowest_remaining_level="low",
|
||||||
|
days_since_last_used=2,
|
||||||
|
category=ProductCategory.CLEANSER,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert result["replenishment_score"] == 95
|
||||||
|
assert result["replenishment_priority_hint"] == "high"
|
||||||
|
assert result["repurchase_candidate"] is True
|
||||||
|
assert result["replenishment_reason_codes"] == [
|
||||||
|
"low_opened",
|
||||||
|
"recently_used",
|
||||||
|
"staple_category",
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_replenishment_score_downranks_sealed_backup_and_stale_usage():
|
||||||
|
result = _compute_replenishment_score(
|
||||||
|
has_stock=True,
|
||||||
|
sealed_backup_count=1,
|
||||||
|
lowest_remaining_level="nearly_empty",
|
||||||
|
days_since_last_used=70,
|
||||||
|
category=ProductCategory.EXFOLIANT,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert result["replenishment_score"] == 0
|
||||||
|
assert result["replenishment_priority_hint"] == "none"
|
||||||
|
assert result["repurchase_candidate"] is False
|
||||||
|
assert result["replenishment_reason_codes"] == [
|
||||||
|
"has_sealed_backup",
|
||||||
|
"stale_usage",
|
||||||
|
"occasional_category",
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
def test_compute_days_since_last_used_returns_none_without_usage():
|
||||||
|
assert _compute_days_since_last_used(None, date(2026, 3, 9)) is None
|
||||||
|
assert _compute_days_since_last_used(date(2026, 3, 7), date(2026, 3, 9)) == 2
|
||||||
|
|
||||||
|
|
||||||
def test_suggest_shopping(client, session):
|
def test_suggest_shopping(client, session):
|
||||||
|
with patch(
|
||||||
|
"innercontext.api.products.call_gemini_with_function_tools"
|
||||||
|
) as mock_gemini:
|
||||||
|
product = Product(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
|
name="Owned Serum",
|
||||||
|
brand="BrandX",
|
||||||
|
category="serum",
|
||||||
|
recommended_time="both",
|
||||||
|
leave_on=True,
|
||||||
|
product_effect_profile={},
|
||||||
|
)
|
||||||
|
session.add(product)
|
||||||
|
session.commit()
|
||||||
|
session.add(
|
||||||
|
ProductInventory(id=uuid.uuid4(), product_id=product.id, is_opened=True)
|
||||||
|
)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
mock_response = type(
|
||||||
|
"Response",
|
||||||
|
(),
|
||||||
|
{
|
||||||
|
"text": '{"suggestions": [{"category": "cleanser", "product_type": "cleanser", "priority": "high", "key_ingredients": ["glycerin"], "target_concerns": ["dehydration"], "recommended_time": "am", "frequency": "daily", "short_reason": "Brakuje lagodnego kroku myjacego rano.", "reason_to_buy_now": "Obecnie nie masz delikatnego produktu do porannego oczyszczania i wsparcia bariery.", "reason_not_needed_if_budget_tight": "Mozesz tymczasowo ograniczyc sie do samego splukania twarzy rano, jesli skora jest spokojna.", "fit_with_current_routine": "To domknie podstawowy krok cleanse bez dokladania agresywnych aktywow.", "usage_cautions": ["unikaj mocnego domywania przy podraznieniu"]}], "reasoning": "Test shopping"}'
|
||||||
|
},
|
||||||
|
)
|
||||||
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
|
r = client.post("/products/suggest")
|
||||||
|
assert r.status_code == 200
|
||||||
|
data = r.json()
|
||||||
|
assert len(data["suggestions"]) == 1
|
||||||
|
assert data["suggestions"][0]["product_type"] == "cleanser"
|
||||||
|
assert data["suggestions"][0]["priority"] == "high"
|
||||||
|
assert data["suggestions"][0]["short_reason"]
|
||||||
|
assert data["suggestions"][0]["usage_cautions"] == [
|
||||||
|
"unikaj mocnego domywania przy podraznieniu"
|
||||||
|
]
|
||||||
|
assert data["reasoning"] == "Test shopping"
|
||||||
|
kwargs = mock_gemini.call_args.kwargs
|
||||||
|
assert "USER PROFILE:" in kwargs["contents"]
|
||||||
|
assert (
|
||||||
|
'category: "cleanser" | "toner" | "essence"'
|
||||||
|
in kwargs["config"].system_instruction
|
||||||
|
)
|
||||||
|
assert (
|
||||||
|
'recommended_time: "am" | "pm" | "both"'
|
||||||
|
in kwargs["config"].system_instruction
|
||||||
|
)
|
||||||
|
assert "function_handlers" in kwargs
|
||||||
|
assert "get_product_details" in kwargs["function_handlers"]
|
||||||
|
|
||||||
|
|
||||||
|
def test_suggest_shopping_invalid_json_returns_502(client):
|
||||||
|
with patch(
|
||||||
|
"innercontext.api.products.call_gemini_with_function_tools"
|
||||||
|
) as mock_gemini:
|
||||||
|
mock_response = type("Response", (), {"text": "{"})
|
||||||
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
|
r = client.post("/products/suggest")
|
||||||
|
|
||||||
|
assert r.status_code == 502
|
||||||
|
assert "LLM returned invalid JSON" in r.json()["detail"]
|
||||||
|
|
||||||
|
|
||||||
|
def test_suggest_shopping_invalid_schema_returns_502(client):
|
||||||
with patch(
|
with patch(
|
||||||
"innercontext.api.products.call_gemini_with_function_tools"
|
"innercontext.api.products.call_gemini_with_function_tools"
|
||||||
) as mock_gemini:
|
) as mock_gemini:
|
||||||
|
|
@ -91,26 +269,42 @@ def test_suggest_shopping(client, session):
|
||||||
"Response",
|
"Response",
|
||||||
(),
|
(),
|
||||||
{
|
{
|
||||||
"text": '{"suggestions": [{"category": "cleanser", "product_type": "cleanser", "priority": "high", "key_ingredients": [], "target_concerns": [], "why_needed": "reason", "recommended_time": "am", "frequency": "daily"}], "reasoning": "Test shopping"}'
|
"text": '{"suggestions": [{"category": "cleanser", "product_type": "cleanser", "priority": "urgent", "key_ingredients": [], "target_concerns": [], "recommended_time": "am", "frequency": "daily", "short_reason": "x", "reason_to_buy_now": "y", "fit_with_current_routine": "z", "usage_cautions": []}], "reasoning": "Test shopping"}'
|
||||||
},
|
},
|
||||||
)
|
)
|
||||||
mock_gemini.return_value = mock_response
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
r = client.post("/products/suggest")
|
r = client.post("/products/suggest")
|
||||||
assert r.status_code == 200
|
|
||||||
data = r.json()
|
assert r.status_code == 502
|
||||||
assert len(data["suggestions"]) == 1
|
assert "LLM returned invalid shopping suggestion schema" in r.json()["detail"]
|
||||||
assert data["suggestions"][0]["product_type"] == "cleanser"
|
assert "suggestions/0/priority" in r.json()["detail"]
|
||||||
assert data["reasoning"] == "Test shopping"
|
|
||||||
kwargs = mock_gemini.call_args.kwargs
|
|
||||||
assert "USER PROFILE:" in kwargs["contents"]
|
|
||||||
assert "function_handlers" in kwargs
|
|
||||||
assert "get_product_details" in kwargs["function_handlers"]
|
|
||||||
|
|
||||||
|
|
||||||
def test_shopping_context_medication_skip(session: Session):
|
def test_suggest_shopping_invalid_target_concern_returns_502(client):
|
||||||
|
with patch(
|
||||||
|
"innercontext.api.products.call_gemini_with_function_tools"
|
||||||
|
) as mock_gemini:
|
||||||
|
mock_response = type(
|
||||||
|
"Response",
|
||||||
|
(),
|
||||||
|
{
|
||||||
|
"text": '{"suggestions": [{"category": "cleanser", "product_type": "cleanser", "priority": "high", "key_ingredients": ["glycerin"], "target_concerns": ["inflammation"], "recommended_time": "am", "frequency": "daily", "short_reason": "Brakuje lagodnego kroku myjacego rano.", "reason_to_buy_now": "Obecnie nie masz delikatnego produktu do porannego oczyszczania i wsparcia bariery.", "fit_with_current_routine": "To domknie podstawowy krok cleanse bez dokladania agresywnych aktywow.", "usage_cautions": []}], "reasoning": "Test shopping"}'
|
||||||
|
},
|
||||||
|
)
|
||||||
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
|
r = client.post("/products/suggest")
|
||||||
|
|
||||||
|
assert r.status_code == 502
|
||||||
|
assert "LLM returned invalid shopping suggestion schema" in r.json()["detail"]
|
||||||
|
assert "suggestions/0/target_concerns/0" in r.json()["detail"]
|
||||||
|
|
||||||
|
|
||||||
|
def test_shopping_context_medication_skip(session: Session, current_user):
|
||||||
p = Product(
|
p = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Epiduo",
|
name="Epiduo",
|
||||||
brand="Galderma",
|
brand="Galderma",
|
||||||
category="serum",
|
category="serum",
|
||||||
|
|
@ -118,11 +312,14 @@ def test_shopping_context_medication_skip(session: Session):
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
is_medication=True,
|
is_medication=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
session.add(p)
|
session.add(p)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
ctx = _build_shopping_context(session, reference_date=date.today())
|
ctx = _build_shopping_context(
|
||||||
|
session, reference_date=date.today(), current_user=current_user
|
||||||
|
)
|
||||||
assert "Epiduo" not in ctx
|
assert "Epiduo" not in ctx
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -137,6 +334,7 @@ def test_extract_requested_product_ids_dedupes_and_limits():
|
||||||
def test_shopping_tool_handlers_return_payloads(session: Session):
|
def test_shopping_tool_handlers_return_payloads(session: Session):
|
||||||
product = Product(
|
product = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Test Product",
|
name="Test Product",
|
||||||
brand="Brand",
|
brand="Brand",
|
||||||
category="serum",
|
category="serum",
|
||||||
|
|
@ -151,10 +349,10 @@ def test_shopping_tool_handlers_return_payloads(session: Session):
|
||||||
payload = {"product_ids": [str(product.id)]}
|
payload = {"product_ids": [str(product.id)]}
|
||||||
|
|
||||||
details = build_product_details_tool_handler([product])(payload)
|
details = build_product_details_tool_handler([product])(payload)
|
||||||
assert details["products"][0]["inci"] == ["Water", "Niacinamide"]
|
|
||||||
assert details["products"][0]["actives"][0]["name"] == "Niacinamide"
|
assert details["products"][0]["actives"][0]["name"] == "Niacinamide"
|
||||||
assert "context_rules" in details["products"][0]
|
assert "context_rules" in details["products"][0]
|
||||||
assert details["products"][0]["last_used_on"] is None
|
assert details["products"][0]["last_used_on"] is None
|
||||||
|
assert "inci" not in details["products"][0]
|
||||||
|
|
||||||
|
|
||||||
def test_shopping_tool_handler_includes_last_used_on_from_mapping(session: Session):
|
def test_shopping_tool_handler_includes_last_used_on_from_mapping(session: Session):
|
||||||
|
|
@ -175,3 +373,48 @@ def test_shopping_tool_handler_includes_last_used_on_from_mapping(session: Sessi
|
||||||
)(payload)
|
)(payload)
|
||||||
|
|
||||||
assert details["products"][0]["last_used_on"] == "2026-03-01"
|
assert details["products"][0]["last_used_on"] == "2026-03-01"
|
||||||
|
|
||||||
|
|
||||||
|
def test_shopping_validator_accepts_freeform_product_type_and_frequency():
|
||||||
|
response = ShoppingSuggestionResponse(
|
||||||
|
suggestions=[
|
||||||
|
ProductSuggestion(
|
||||||
|
category="spot_treatment",
|
||||||
|
product_type="Punktowy preparat na wypryski z ichtiolem lub cynkiem",
|
||||||
|
priority="high",
|
||||||
|
key_ingredients=["ichtiol", "cynk"],
|
||||||
|
target_concerns=["acne"],
|
||||||
|
recommended_time="pm",
|
||||||
|
frequency="Codziennie (punktowo na zmiany)",
|
||||||
|
short_reason="Pomaga opanowac aktywne zmiany bez dokladania pelnego aktywu na cala twarz.",
|
||||||
|
reason_to_buy_now="Brakuje Ci dedykowanego produktu punktowego na pojedyncze wypryski.",
|
||||||
|
fit_with_current_routine="Mozesz dolozyc go tylko na zmiany po serum lub zamiast mocniejszego aktywu.",
|
||||||
|
usage_cautions=["stosuj tylko miejscowo"],
|
||||||
|
),
|
||||||
|
ProductSuggestion(
|
||||||
|
category="mask",
|
||||||
|
product_type="Lagodna maska oczyszczajaca",
|
||||||
|
priority="low",
|
||||||
|
key_ingredients=["glinka"],
|
||||||
|
target_concerns=["sebum_excess"],
|
||||||
|
recommended_time="pm",
|
||||||
|
frequency="1 raz w tygodniu",
|
||||||
|
short_reason="To opcjonalne wsparcie przy nadmiarze sebum.",
|
||||||
|
reason_to_buy_now="Moze pomoc domknac sporadyczne oczyszczanie, gdy skora jest bardziej przetluszczona.",
|
||||||
|
fit_with_current_routine="Najlepiej traktowac to jako dodatkowy krok, nie zamiennik podstaw rutyny.",
|
||||||
|
usage_cautions=[],
|
||||||
|
),
|
||||||
|
],
|
||||||
|
reasoning="Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
result = ShoppingValidator().validate(
|
||||||
|
response,
|
||||||
|
ShoppingValidationContext(
|
||||||
|
owned_product_ids=set(),
|
||||||
|
valid_categories=set(ProductCategory),
|
||||||
|
valid_targets=set(SkinConcern),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
assert not any("unusual frequency" in warning for warning in result.warnings)
|
||||||
|
|
|
||||||
|
|
@ -1,6 +1,10 @@
|
||||||
import uuid
|
import uuid
|
||||||
|
from datetime import date
|
||||||
from unittest.mock import patch
|
from unittest.mock import patch
|
||||||
|
|
||||||
|
from innercontext.models import Routine, SkinConditionSnapshot
|
||||||
|
from innercontext.models.enums import BarrierState, OverallSkinState, PartOfDay
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
# Routines
|
# Routines
|
||||||
# ---------------------------------------------------------------------------
|
# ---------------------------------------------------------------------------
|
||||||
|
|
@ -219,10 +223,23 @@ def test_delete_grooming_schedule_not_found(client):
|
||||||
assert r.status_code == 404
|
assert r.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
def test_suggest_routine(client, session):
|
def test_suggest_routine(client, session, current_user):
|
||||||
with patch(
|
with patch(
|
||||||
"innercontext.api.routines.call_gemini_with_function_tools"
|
"innercontext.api.routines.call_gemini_with_function_tools"
|
||||||
) as mock_gemini:
|
) as mock_gemini:
|
||||||
|
session.add(
|
||||||
|
SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=date(2026, 2, 22),
|
||||||
|
overall_state=OverallSkinState.GOOD,
|
||||||
|
hydration_level=4,
|
||||||
|
barrier_state=BarrierState.INTACT,
|
||||||
|
priorities=["hydration"],
|
||||||
|
)
|
||||||
|
)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
# Mock the Gemini response
|
# Mock the Gemini response
|
||||||
mock_response = type(
|
mock_response = type(
|
||||||
"Response",
|
"Response",
|
||||||
|
|
@ -231,7 +248,7 @@ def test_suggest_routine(client, session):
|
||||||
"text": '{"steps": [{"product_id": null, "action_type": "shaving_razor"}], "reasoning": "because"}'
|
"text": '{"steps": [{"product_id": null, "action_type": "shaving_razor"}], "reasoning": "because"}'
|
||||||
},
|
},
|
||||||
)
|
)
|
||||||
mock_gemini.return_value = mock_response
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
r = client.post(
|
r = client.post(
|
||||||
"/routines/suggest",
|
"/routines/suggest",
|
||||||
|
|
@ -249,12 +266,35 @@ def test_suggest_routine(client, session):
|
||||||
assert data["reasoning"] == "because"
|
assert data["reasoning"] == "because"
|
||||||
kwargs = mock_gemini.call_args.kwargs
|
kwargs = mock_gemini.call_args.kwargs
|
||||||
assert "USER PROFILE:" in kwargs["contents"]
|
assert "USER PROFILE:" in kwargs["contents"]
|
||||||
|
assert "UPCOMING GROOMING (next 7 days):" in kwargs["contents"]
|
||||||
|
assert "snapshot from 2026-02-22" in kwargs["contents"]
|
||||||
|
assert "RECENT ROUTINES: none" in kwargs["contents"]
|
||||||
assert "function_handlers" in kwargs
|
assert "function_handlers" in kwargs
|
||||||
assert "get_product_details" in kwargs["function_handlers"]
|
assert "get_product_details" in kwargs["function_handlers"]
|
||||||
|
|
||||||
|
|
||||||
def test_suggest_batch(client, session):
|
def test_suggest_batch(client, session, current_user):
|
||||||
with patch("innercontext.api.routines.call_gemini") as mock_gemini:
|
with patch("innercontext.api.routines.call_gemini") as mock_gemini:
|
||||||
|
session.add(
|
||||||
|
Routine(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_date=date(2026, 2, 27),
|
||||||
|
part_of_day=PartOfDay.PM,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
session.add(
|
||||||
|
SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=date(2026, 2, 20),
|
||||||
|
overall_state=OverallSkinState.GOOD,
|
||||||
|
hydration_level=4,
|
||||||
|
barrier_state=BarrierState.INTACT,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
# Mock the Gemini response
|
# Mock the Gemini response
|
||||||
mock_response = type(
|
mock_response = type(
|
||||||
"Response",
|
"Response",
|
||||||
|
|
@ -263,7 +303,7 @@ def test_suggest_batch(client, session):
|
||||||
"text": '{"days": [{"date": "2026-03-03", "am_steps": [], "pm_steps": [], "reasoning": "none"}], "overall_reasoning": "batch test"}'
|
"text": '{"days": [{"date": "2026-03-03", "am_steps": [], "pm_steps": [], "reasoning": "none"}], "overall_reasoning": "batch test"}'
|
||||||
},
|
},
|
||||||
)
|
)
|
||||||
mock_gemini.return_value = mock_response
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
r = client.post(
|
r = client.post(
|
||||||
"/routines/suggest-batch",
|
"/routines/suggest-batch",
|
||||||
|
|
@ -280,6 +320,8 @@ def test_suggest_batch(client, session):
|
||||||
assert data["overall_reasoning"] == "batch test"
|
assert data["overall_reasoning"] == "batch test"
|
||||||
kwargs = mock_gemini.call_args.kwargs
|
kwargs = mock_gemini.call_args.kwargs
|
||||||
assert "USER PROFILE:" in kwargs["contents"]
|
assert "USER PROFILE:" in kwargs["contents"]
|
||||||
|
assert "2026-02-27 PM:" in kwargs["contents"]
|
||||||
|
assert "snapshot from 2026-02-20" in kwargs["contents"]
|
||||||
|
|
||||||
|
|
||||||
def test_suggest_batch_invalid_date_range(client):
|
def test_suggest_batch_invalid_date_range(client):
|
||||||
|
|
|
||||||
112
backend/tests/test_routines_auth.py
Normal file
112
backend/tests/test_routines_auth.py
Normal file
|
|
@ -0,0 +1,112 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from unittest.mock import patch
|
||||||
|
from uuid import uuid4
|
||||||
|
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import CurrentUser, IdentityData, TokenClaims
|
||||||
|
from innercontext.models import Role
|
||||||
|
from main import app
|
||||||
|
|
||||||
|
|
||||||
|
def _user(subject: str, *, role: Role = Role.MEMBER) -> CurrentUser:
|
||||||
|
claims = TokenClaims(
|
||||||
|
issuer="https://auth.test",
|
||||||
|
subject=subject,
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
raw_claims={"iss": "https://auth.test", "sub": subject},
|
||||||
|
)
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=uuid4(),
|
||||||
|
role=role,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _set_current_user(user: CurrentUser) -> None:
|
||||||
|
app.dependency_overrides[get_current_user] = lambda: user
|
||||||
|
|
||||||
|
|
||||||
|
def test_suggest_uses_current_user_profile_and_visible_products_only(client):
|
||||||
|
owner = _user("owner")
|
||||||
|
other = _user("other")
|
||||||
|
|
||||||
|
_set_current_user(owner)
|
||||||
|
owner_profile = client.patch(
|
||||||
|
"/profile", json={"birth_date": "1991-01-15", "sex_at_birth": "male"}
|
||||||
|
)
|
||||||
|
owner_product = client.post(
|
||||||
|
"/products",
|
||||||
|
json={
|
||||||
|
"name": "Owner Serum",
|
||||||
|
"brand": "Test",
|
||||||
|
"category": "serum",
|
||||||
|
"recommended_time": "both",
|
||||||
|
"leave_on": True,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
assert owner_profile.status_code == 200
|
||||||
|
assert owner_product.status_code == 201
|
||||||
|
|
||||||
|
_set_current_user(other)
|
||||||
|
other_profile = client.patch(
|
||||||
|
"/profile", json={"birth_date": "1975-06-20", "sex_at_birth": "female"}
|
||||||
|
)
|
||||||
|
other_product = client.post(
|
||||||
|
"/products",
|
||||||
|
json={
|
||||||
|
"name": "Other Serum",
|
||||||
|
"brand": "Test",
|
||||||
|
"category": "serum",
|
||||||
|
"recommended_time": "both",
|
||||||
|
"leave_on": True,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
assert other_profile.status_code == 200
|
||||||
|
assert other_product.status_code == 201
|
||||||
|
|
||||||
|
_set_current_user(owner)
|
||||||
|
|
||||||
|
with patch(
|
||||||
|
"innercontext.api.routines.call_gemini_with_function_tools"
|
||||||
|
) as mock_gemini:
|
||||||
|
mock_response = type(
|
||||||
|
"Response",
|
||||||
|
(),
|
||||||
|
{
|
||||||
|
"text": '{"steps": [{"product_id": null, "action_type": "shaving_razor"}], "reasoning": "ok", "summary": {"primary_goal": "safe", "constraints_applied": [], "confidence": 0.7}}'
|
||||||
|
},
|
||||||
|
)
|
||||||
|
mock_gemini.return_value = (mock_response, None)
|
||||||
|
|
||||||
|
response = client.post(
|
||||||
|
"/routines/suggest",
|
||||||
|
json={
|
||||||
|
"routine_date": "2026-03-05",
|
||||||
|
"part_of_day": "am",
|
||||||
|
"include_minoxidil_beard": False,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
assert response.status_code == 200
|
||||||
|
|
||||||
|
kwargs = mock_gemini.call_args.kwargs
|
||||||
|
prompt = kwargs["contents"]
|
||||||
|
assert "Birth date: 1991-01-15" in prompt
|
||||||
|
assert "Birth date: 1975-06-20" not in prompt
|
||||||
|
assert "Owner Serum" in prompt
|
||||||
|
assert "Other Serum" not in prompt
|
||||||
|
|
||||||
|
handler = kwargs["function_handlers"]["get_product_details"]
|
||||||
|
payload = handler(
|
||||||
|
{
|
||||||
|
"product_ids": [
|
||||||
|
owner_product.json()["id"],
|
||||||
|
other_product.json()["id"],
|
||||||
|
]
|
||||||
|
}
|
||||||
|
)
|
||||||
|
assert len(payload["products"]) == 1
|
||||||
|
assert payload["products"][0]["name"] == "Owner Serum"
|
||||||
|
|
@ -3,30 +3,33 @@ from datetime import date, timedelta
|
||||||
|
|
||||||
from sqlmodel import Session
|
from sqlmodel import Session
|
||||||
|
|
||||||
|
from innercontext.api.llm_context import build_products_context_summary_list
|
||||||
from innercontext.api.routines import (
|
from innercontext.api.routines import (
|
||||||
_build_day_context,
|
_build_day_context,
|
||||||
_build_grooming_context,
|
_build_grooming_context,
|
||||||
_build_objectives_context,
|
_build_objectives_context,
|
||||||
_build_products_context,
|
|
||||||
_build_recent_history,
|
_build_recent_history,
|
||||||
_build_skin_context,
|
_build_skin_context,
|
||||||
|
_build_upcoming_grooming_context,
|
||||||
_contains_minoxidil_text,
|
_contains_minoxidil_text,
|
||||||
_ev,
|
_ev,
|
||||||
_extract_active_names,
|
_extract_active_names,
|
||||||
_extract_requested_product_ids,
|
_extract_requested_product_ids,
|
||||||
_filter_products_by_interval,
|
_filter_products_by_interval,
|
||||||
_get_available_products,
|
_get_available_products,
|
||||||
|
_get_latest_skin_snapshot_within_days,
|
||||||
|
_get_recent_skin_snapshot,
|
||||||
_is_minoxidil_product,
|
_is_minoxidil_product,
|
||||||
build_product_details_tool_handler,
|
build_product_details_tool_handler,
|
||||||
)
|
)
|
||||||
from innercontext.models import (
|
from innercontext.models import (
|
||||||
GroomingSchedule,
|
GroomingSchedule,
|
||||||
Product,
|
Product,
|
||||||
ProductInventory,
|
|
||||||
Routine,
|
Routine,
|
||||||
RoutineStep,
|
RoutineStep,
|
||||||
SkinConditionSnapshot,
|
SkinConditionSnapshot,
|
||||||
)
|
)
|
||||||
|
from innercontext.models.enums import BarrierState, OverallSkinState, SkinConcern
|
||||||
|
|
||||||
|
|
||||||
def test_contains_minoxidil_text():
|
def test_contains_minoxidil_text():
|
||||||
|
|
@ -75,59 +78,253 @@ def test_ev():
|
||||||
assert _ev("string") == "string"
|
assert _ev("string") == "string"
|
||||||
|
|
||||||
|
|
||||||
def test_build_skin_context(session: Session):
|
def test_build_skin_context(session: Session, current_user):
|
||||||
# Empty
|
# Empty
|
||||||
assert _build_skin_context(session) == "SKIN CONDITION: no data\n"
|
reference_date = date(2026, 3, 10)
|
||||||
|
assert (
|
||||||
|
_build_skin_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
== "SKIN CONDITION: no data\n"
|
||||||
|
)
|
||||||
|
|
||||||
# With data
|
# With data
|
||||||
snap = SkinConditionSnapshot(
|
snap = SkinConditionSnapshot(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
snapshot_date=date.today(),
|
user_id=current_user.user_id,
|
||||||
overall_state="good",
|
snapshot_date=reference_date,
|
||||||
|
overall_state=OverallSkinState.GOOD,
|
||||||
hydration_level=4,
|
hydration_level=4,
|
||||||
barrier_state="intact",
|
barrier_state=BarrierState.INTACT,
|
||||||
active_concerns=["acne", "dryness"],
|
active_concerns=[SkinConcern.ACNE, SkinConcern.DEHYDRATION],
|
||||||
priorities=["hydration"],
|
priorities=["hydration"],
|
||||||
notes="Feeling good",
|
notes="Feeling good",
|
||||||
)
|
)
|
||||||
session.add(snap)
|
session.add(snap)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
ctx = _build_skin_context(session)
|
ctx = _build_skin_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
assert "SKIN CONDITION (snapshot from" in ctx
|
assert "SKIN CONDITION (snapshot from" in ctx
|
||||||
assert "Overall state: good" in ctx
|
assert "Overall state: good" in ctx
|
||||||
assert "Hydration: 4/5" in ctx
|
assert "Hydration: 4/5" in ctx
|
||||||
assert "Barrier: intact" in ctx
|
assert "Barrier: intact" in ctx
|
||||||
assert "Active concerns: acne, dryness" in ctx
|
assert "Active concerns: acne, dehydration" in ctx
|
||||||
assert "Priorities: hydration" in ctx
|
assert "Priorities: hydration" in ctx
|
||||||
assert "Notes: Feeling good" in ctx
|
assert "Notes: Feeling good" in ctx
|
||||||
|
|
||||||
|
|
||||||
def test_build_grooming_context(session: Session):
|
def test_build_skin_context_falls_back_to_recent_snapshot_within_14_days(
|
||||||
assert _build_grooming_context(session) == "GROOMING SCHEDULE: none\n"
|
session: Session,
|
||||||
|
current_user,
|
||||||
|
):
|
||||||
|
reference_date = date(2026, 3, 20)
|
||||||
|
snap = SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=reference_date - timedelta(days=10),
|
||||||
|
overall_state=OverallSkinState.FAIR,
|
||||||
|
hydration_level=3,
|
||||||
|
barrier_state=BarrierState.COMPROMISED,
|
||||||
|
active_concerns=[SkinConcern.REDNESS],
|
||||||
|
priorities=["barrier"],
|
||||||
|
)
|
||||||
|
session.add(snap)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
ctx = _build_skin_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert f"snapshot from {reference_date - timedelta(days=10)}" in ctx
|
||||||
|
assert "Barrier: compromised" in ctx
|
||||||
|
|
||||||
|
|
||||||
|
def test_build_skin_context_ignores_snapshot_older_than_14_days(
|
||||||
|
session: Session, current_user
|
||||||
|
):
|
||||||
|
reference_date = date(2026, 3, 20)
|
||||||
|
snap = SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=reference_date - timedelta(days=15),
|
||||||
|
overall_state=OverallSkinState.FAIR,
|
||||||
|
hydration_level=3,
|
||||||
|
barrier_state=BarrierState.INTACT,
|
||||||
|
)
|
||||||
|
session.add(snap)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
assert (
|
||||||
|
_build_skin_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
== "SKIN CONDITION: no data\n"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def test_get_recent_skin_snapshot_prefers_window_match(session: Session, current_user):
|
||||||
|
reference_date = date(2026, 3, 20)
|
||||||
|
older = SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=reference_date - timedelta(days=10),
|
||||||
|
overall_state=OverallSkinState.POOR,
|
||||||
|
hydration_level=2,
|
||||||
|
barrier_state=BarrierState.COMPROMISED,
|
||||||
|
)
|
||||||
|
newer = SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=reference_date - timedelta(days=2),
|
||||||
|
overall_state=OverallSkinState.GOOD,
|
||||||
|
hydration_level=4,
|
||||||
|
barrier_state=BarrierState.INTACT,
|
||||||
|
)
|
||||||
|
session.add_all([older, newer])
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
snapshot = _get_recent_skin_snapshot(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert snapshot is not None
|
||||||
|
assert snapshot.id == newer.id
|
||||||
|
|
||||||
|
|
||||||
|
def test_get_latest_skin_snapshot_within_days_uses_latest_within_14_days(
|
||||||
|
session: Session,
|
||||||
|
current_user,
|
||||||
|
):
|
||||||
|
reference_date = date(2026, 3, 20)
|
||||||
|
older = SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=reference_date - timedelta(days=10),
|
||||||
|
overall_state=OverallSkinState.POOR,
|
||||||
|
hydration_level=2,
|
||||||
|
barrier_state=BarrierState.COMPROMISED,
|
||||||
|
)
|
||||||
|
newer = SkinConditionSnapshot(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
snapshot_date=reference_date - timedelta(days=2),
|
||||||
|
overall_state=OverallSkinState.GOOD,
|
||||||
|
hydration_level=4,
|
||||||
|
barrier_state=BarrierState.INTACT,
|
||||||
|
)
|
||||||
|
session.add_all([older, newer])
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
snapshot = _get_latest_skin_snapshot_within_days(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert snapshot is not None
|
||||||
|
assert snapshot.id == newer.id
|
||||||
|
|
||||||
|
|
||||||
|
def test_build_grooming_context(session: Session, current_user):
|
||||||
|
assert (
|
||||||
|
_build_grooming_context(session, target_user_id=current_user.user_id)
|
||||||
|
== "GROOMING SCHEDULE: none\n"
|
||||||
|
)
|
||||||
|
|
||||||
sch = GroomingSchedule(
|
sch = GroomingSchedule(
|
||||||
id=uuid.uuid4(), day_of_week=0, action="shaving_oneblade", notes="Morning"
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
day_of_week=0,
|
||||||
|
action="shaving_oneblade",
|
||||||
|
notes="Morning",
|
||||||
)
|
)
|
||||||
session.add(sch)
|
session.add(sch)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
ctx = _build_grooming_context(session)
|
ctx = _build_grooming_context(session, target_user_id=current_user.user_id)
|
||||||
assert "GROOMING SCHEDULE:" in ctx
|
assert "GROOMING SCHEDULE:" in ctx
|
||||||
assert "poniedziałek: shaving_oneblade (Morning)" in ctx
|
assert "poniedziałek: shaving_oneblade (Morning)" in ctx
|
||||||
|
|
||||||
# Test weekdays filter
|
# Test weekdays filter
|
||||||
ctx2 = _build_grooming_context(session, weekdays=[1]) # not monday
|
ctx2 = _build_grooming_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
weekdays=[1],
|
||||||
|
) # not monday
|
||||||
assert "(no entries for specified days)" in ctx2
|
assert "(no entries for specified days)" in ctx2
|
||||||
|
|
||||||
|
|
||||||
def test_build_recent_history(session: Session):
|
def test_build_upcoming_grooming_context(session: Session, current_user):
|
||||||
assert _build_recent_history(session) == "RECENT ROUTINES: none\n"
|
assert (
|
||||||
|
_build_upcoming_grooming_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
start_date=date(2026, 3, 2),
|
||||||
|
days=7,
|
||||||
|
)
|
||||||
|
== "UPCOMING GROOMING (next 7 days): none\n"
|
||||||
|
)
|
||||||
|
|
||||||
r = Routine(id=uuid.uuid4(), routine_date=date.today(), part_of_day="am")
|
monday = GroomingSchedule(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
day_of_week=0,
|
||||||
|
action="shaving_oneblade",
|
||||||
|
notes="Morning",
|
||||||
|
)
|
||||||
|
wednesday = GroomingSchedule(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
day_of_week=2,
|
||||||
|
action="dermarolling",
|
||||||
|
)
|
||||||
|
session.add_all([monday, wednesday])
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
ctx = _build_upcoming_grooming_context(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
start_date=date(2026, 3, 2),
|
||||||
|
days=7,
|
||||||
|
)
|
||||||
|
assert "UPCOMING GROOMING (next 7 days):" in ctx
|
||||||
|
assert "dzisiaj (2026-03-02, poniedziałek): shaving_oneblade (Morning)" in ctx
|
||||||
|
assert "za 2 dni (2026-03-04, środa): dermarolling" in ctx
|
||||||
|
|
||||||
|
|
||||||
|
def test_build_recent_history(session: Session, current_user):
|
||||||
|
reference_date = date(2026, 3, 10)
|
||||||
|
assert (
|
||||||
|
_build_recent_history(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
== "RECENT ROUTINES: none\n"
|
||||||
|
)
|
||||||
|
|
||||||
|
r = Routine(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_date=reference_date,
|
||||||
|
part_of_day="am",
|
||||||
|
)
|
||||||
session.add(r)
|
session.add(r)
|
||||||
p = Product(
|
p = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Cleanser",
|
name="Cleanser",
|
||||||
category="cleanser",
|
category="cleanser",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
|
|
@ -138,19 +335,37 @@ def test_build_recent_history(session: Session):
|
||||||
session.add(p)
|
session.add(p)
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
s1 = RoutineStep(id=uuid.uuid4(), routine_id=r.id, order_index=1, product_id=p.id)
|
s1 = RoutineStep(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_id=r.id,
|
||||||
|
order_index=1,
|
||||||
|
product_id=p.id,
|
||||||
|
)
|
||||||
s2 = RoutineStep(
|
s2 = RoutineStep(
|
||||||
id=uuid.uuid4(), routine_id=r.id, order_index=2, action_type="shaving_razor"
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_id=r.id,
|
||||||
|
order_index=2,
|
||||||
|
action_type="shaving_razor",
|
||||||
)
|
)
|
||||||
# Step with non-existent product
|
# Step with non-existent product
|
||||||
s3 = RoutineStep(
|
s3 = RoutineStep(
|
||||||
id=uuid.uuid4(), routine_id=r.id, order_index=3, product_id=uuid.uuid4()
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_id=r.id,
|
||||||
|
order_index=3,
|
||||||
|
product_id=uuid.uuid4(),
|
||||||
)
|
)
|
||||||
|
|
||||||
session.add_all([s1, s2, s3])
|
session.add_all([s1, s2, s3])
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
ctx = _build_recent_history(session)
|
ctx = _build_recent_history(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
assert "RECENT ROUTINES:" in ctx
|
assert "RECENT ROUTINES:" in ctx
|
||||||
assert "AM:" in ctx
|
assert "AM:" in ctx
|
||||||
assert "cleanser [" in ctx
|
assert "cleanser [" in ctx
|
||||||
|
|
@ -158,19 +373,70 @@ def test_build_recent_history(session: Session):
|
||||||
assert "unknown [" in ctx
|
assert "unknown [" in ctx
|
||||||
|
|
||||||
|
|
||||||
def test_build_products_context(session: Session):
|
def test_build_recent_history_uses_reference_window(session: Session, current_user):
|
||||||
|
reference_date = date(2026, 3, 10)
|
||||||
|
recent = Routine(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_date=reference_date - timedelta(days=3),
|
||||||
|
part_of_day="pm",
|
||||||
|
)
|
||||||
|
old = Routine(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_date=reference_date - timedelta(days=6),
|
||||||
|
part_of_day="am",
|
||||||
|
)
|
||||||
|
session.add_all([recent, old])
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
ctx = _build_recent_history(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert str(recent.routine_date) in ctx
|
||||||
|
assert str(old.routine_date) not in ctx
|
||||||
|
|
||||||
|
|
||||||
|
def test_build_recent_history_excludes_future_routines(session: Session, current_user):
|
||||||
|
reference_date = date(2026, 3, 10)
|
||||||
|
future = Routine(
|
||||||
|
id=uuid.uuid4(),
|
||||||
|
user_id=current_user.user_id,
|
||||||
|
routine_date=reference_date + timedelta(days=1),
|
||||||
|
part_of_day="am",
|
||||||
|
)
|
||||||
|
session.add(future)
|
||||||
|
session.commit()
|
||||||
|
|
||||||
|
assert (
|
||||||
|
_build_recent_history(
|
||||||
|
session,
|
||||||
|
target_user_id=current_user.user_id,
|
||||||
|
reference_date=reference_date,
|
||||||
|
)
|
||||||
|
== "RECENT ROUTINES: none\n"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def test_build_products_context_summary_list(session: Session, current_user):
|
||||||
p1 = Product(
|
p1 = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
name="Regaine",
|
short_id=str(uuid.uuid4())[:8],
|
||||||
|
name="Regaine Minoxidil",
|
||||||
category="serum",
|
category="serum",
|
||||||
is_medication=True,
|
is_medication=True,
|
||||||
brand="J&J",
|
brand="J&J",
|
||||||
recommended_time="both",
|
recommended_time="both",
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
p2 = Product(
|
p2 = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Sunscreen",
|
name="Sunscreen",
|
||||||
category="spf",
|
category="spf",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
|
|
@ -181,53 +447,23 @@ def test_build_products_context(session: Session):
|
||||||
context_rules={"safe_after_shaving": False},
|
context_rules={"safe_after_shaving": False},
|
||||||
min_interval_hours=12,
|
min_interval_hours=12,
|
||||||
max_frequency_per_week=7,
|
max_frequency_per_week=7,
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
session.add_all([p1, p2])
|
session.add_all([p1, p2])
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
# Inventory
|
products_am = _get_available_products(
|
||||||
inv1 = ProductInventory(
|
session,
|
||||||
id=uuid.uuid4(),
|
current_user=current_user,
|
||||||
product_id=p2.id,
|
time_filter="am",
|
||||||
is_opened=True,
|
|
||||||
opened_at=date.today() - timedelta(days=10),
|
|
||||||
expiry_date=date.today() + timedelta(days=365),
|
|
||||||
)
|
)
|
||||||
inv2 = ProductInventory(id=uuid.uuid4(), product_id=p2.id, is_opened=False)
|
ctx = build_products_context_summary_list(products_am, {p2.id})
|
||||||
session.add_all([inv1, inv2])
|
|
||||||
session.commit()
|
|
||||||
|
|
||||||
# Usage
|
|
||||||
r = Routine(id=uuid.uuid4(), routine_date=date.today(), part_of_day="am")
|
|
||||||
session.add(r)
|
|
||||||
session.commit()
|
|
||||||
s = RoutineStep(id=uuid.uuid4(), routine_id=r.id, order_index=1, product_id=p2.id)
|
|
||||||
session.add(s)
|
|
||||||
session.commit()
|
|
||||||
|
|
||||||
products_am = _get_available_products(session, time_filter="am")
|
|
||||||
ctx = _build_products_context(session, products_am, reference_date=date.today())
|
|
||||||
# p1 is medication but not minoxidil (wait, Regaine name doesn't contain minoxidil!) -> skipped
|
|
||||||
assert "Regaine" not in ctx
|
|
||||||
|
|
||||||
# Let's fix p1 to be minoxidil
|
|
||||||
p1.name = "Regaine Minoxidil"
|
|
||||||
session.add(p1)
|
|
||||||
session.commit()
|
|
||||||
|
|
||||||
products_am = _get_available_products(session, time_filter="am")
|
|
||||||
ctx = _build_products_context(session, products_am, reference_date=date.today())
|
|
||||||
assert "Regaine Minoxidil" in ctx
|
assert "Regaine Minoxidil" in ctx
|
||||||
assert "Sunscreen" in ctx
|
assert "Sunscreen" in ctx
|
||||||
assert "inventory_status={active:2,opened:1,sealed:1}" in ctx
|
assert "[✓]" in ctx
|
||||||
assert "nearest_open_expiry=" in ctx
|
assert "hydration=2" in ctx
|
||||||
assert "nearest_open_pao_deadline=" in ctx
|
assert "!post_shave" in ctx
|
||||||
assert "pao_months=6" in ctx
|
|
||||||
assert "effects={'hydration_immediate': 2}" in ctx
|
|
||||||
assert "context_rules={'safe_after_shaving': False}" in ctx
|
|
||||||
assert "min_interval_hours=12" in ctx
|
|
||||||
assert "max_frequency_per_week=7" in ctx
|
|
||||||
assert "used_in_last_7_days=1" in ctx
|
|
||||||
|
|
||||||
|
|
||||||
def test_build_objectives_context():
|
def test_build_objectives_context():
|
||||||
|
|
@ -241,9 +477,10 @@ def test_build_day_context():
|
||||||
assert "Leaving home: no" in _build_day_context(False)
|
assert "Leaving home: no" in _build_day_context(False)
|
||||||
|
|
||||||
|
|
||||||
def test_get_available_products_respects_filters(session: Session):
|
def test_get_available_products_respects_filters(session: Session, current_user):
|
||||||
regular_med = Product(
|
regular_med = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Tretinoin",
|
name="Tretinoin",
|
||||||
category="serum",
|
category="serum",
|
||||||
is_medication=True,
|
is_medication=True,
|
||||||
|
|
@ -251,9 +488,11 @@ def test_get_available_products_respects_filters(session: Session):
|
||||||
recommended_time="pm",
|
recommended_time="pm",
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
minoxidil_med = Product(
|
minoxidil_med = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Minoxidil 5%",
|
name="Minoxidil 5%",
|
||||||
category="serum",
|
category="serum",
|
||||||
is_medication=True,
|
is_medication=True,
|
||||||
|
|
@ -261,29 +500,38 @@ def test_get_available_products_respects_filters(session: Session):
|
||||||
recommended_time="both",
|
recommended_time="both",
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
am_product = Product(
|
am_product = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="AM SPF",
|
name="AM SPF",
|
||||||
category="spf",
|
category="spf",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
recommended_time="am",
|
recommended_time="am",
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
pm_product = Product(
|
pm_product = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="PM Cream",
|
name="PM Cream",
|
||||||
category="moisturizer",
|
category="moisturizer",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
recommended_time="pm",
|
recommended_time="pm",
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
session.add_all([regular_med, minoxidil_med, am_product, pm_product])
|
session.add_all([regular_med, minoxidil_med, am_product, pm_product])
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
am_available = _get_available_products(session, time_filter="am")
|
am_available = _get_available_products(
|
||||||
|
session,
|
||||||
|
current_user=current_user,
|
||||||
|
time_filter="am",
|
||||||
|
)
|
||||||
am_names = {p.name for p in am_available}
|
am_names = {p.name for p in am_available}
|
||||||
assert "Tretinoin" not in am_names
|
assert "Tretinoin" not in am_names
|
||||||
assert "Minoxidil 5%" in am_names
|
assert "Minoxidil 5%" in am_names
|
||||||
|
|
@ -296,6 +544,7 @@ def test_build_product_details_tool_handler_returns_only_available_ids(
|
||||||
):
|
):
|
||||||
available = Product(
|
available = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Available",
|
name="Available",
|
||||||
category="serum",
|
category="serum",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
|
|
@ -306,6 +555,7 @@ def test_build_product_details_tool_handler_returns_only_available_ids(
|
||||||
)
|
)
|
||||||
unavailable = Product(
|
unavailable = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Unavailable",
|
name="Unavailable",
|
||||||
category="serum",
|
category="serum",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
|
|
@ -330,9 +580,8 @@ def test_build_product_details_tool_handler_returns_only_available_ids(
|
||||||
assert "products" in payload
|
assert "products" in payload
|
||||||
products = payload["products"]
|
products = payload["products"]
|
||||||
assert len(products) == 1
|
assert len(products) == 1
|
||||||
assert products[0]["id"] == str(available.id)
|
assert products[0]["id"] == available.short_id
|
||||||
assert products[0]["name"] == "Available"
|
assert products[0]["name"] == "Available"
|
||||||
assert products[0]["inci"] == ["Water", "Niacinamide"]
|
|
||||||
assert "actives" in products[0]
|
assert "actives" in products[0]
|
||||||
assert "safety" in products[0]
|
assert "safety" in products[0]
|
||||||
|
|
||||||
|
|
@ -374,9 +623,13 @@ def test_extract_active_names_uses_compact_distinct_names(session: Session):
|
||||||
assert names == ["Niacinamide", "Zinc PCA"]
|
assert names == ["Niacinamide", "Zinc PCA"]
|
||||||
|
|
||||||
|
|
||||||
def test_get_available_products_excludes_minoxidil_when_flag_false(session: Session):
|
def test_get_available_products_excludes_minoxidil_when_flag_false(
|
||||||
|
session: Session,
|
||||||
|
current_user,
|
||||||
|
):
|
||||||
minoxidil = Product(
|
minoxidil = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Minoxidil 5%",
|
name="Minoxidil 5%",
|
||||||
category="hair_treatment",
|
category="hair_treatment",
|
||||||
is_medication=True,
|
is_medication=True,
|
||||||
|
|
@ -384,27 +637,38 @@ def test_get_available_products_excludes_minoxidil_when_flag_false(session: Sess
|
||||||
recommended_time="both",
|
recommended_time="both",
|
||||||
leave_on=True,
|
leave_on=True,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
regular = Product(
|
regular = Product(
|
||||||
id=uuid.uuid4(),
|
id=uuid.uuid4(),
|
||||||
|
short_id=str(uuid.uuid4())[:8],
|
||||||
name="Cleanser",
|
name="Cleanser",
|
||||||
category="cleanser",
|
category="cleanser",
|
||||||
brand="Test",
|
brand="Test",
|
||||||
recommended_time="both",
|
recommended_time="both",
|
||||||
leave_on=False,
|
leave_on=False,
|
||||||
product_effect_profile={},
|
product_effect_profile={},
|
||||||
|
user_id=current_user.user_id,
|
||||||
)
|
)
|
||||||
session.add_all([minoxidil, regular])
|
session.add_all([minoxidil, regular])
|
||||||
session.commit()
|
session.commit()
|
||||||
|
|
||||||
# With flag True (default) - minoxidil included
|
# With flag True (default) - minoxidil included
|
||||||
products = _get_available_products(session, include_minoxidil=True)
|
products = _get_available_products(
|
||||||
|
session,
|
||||||
|
current_user=current_user,
|
||||||
|
include_minoxidil=True,
|
||||||
|
)
|
||||||
names = {p.name for p in products}
|
names = {p.name for p in products}
|
||||||
assert "Minoxidil 5%" in names
|
assert "Minoxidil 5%" in names
|
||||||
assert "Cleanser" in names
|
assert "Cleanser" in names
|
||||||
|
|
||||||
# With flag False - minoxidil excluded
|
# With flag False - minoxidil excluded
|
||||||
products = _get_available_products(session, include_minoxidil=False)
|
products = _get_available_products(
|
||||||
|
session,
|
||||||
|
current_user=current_user,
|
||||||
|
include_minoxidil=False,
|
||||||
|
)
|
||||||
names = {p.name for p in products}
|
names = {p.name for p in products}
|
||||||
assert "Minoxidil 5%" not in names
|
assert "Minoxidil 5%" not in names
|
||||||
assert "Cleanser" in names
|
assert "Cleanser" in names
|
||||||
|
|
|
||||||
|
|
@ -140,7 +140,7 @@ def test_analyze_photos_includes_user_profile_context(client, monkeypatch):
|
||||||
|
|
||||||
def _fake_call_gemini(**kwargs):
|
def _fake_call_gemini(**kwargs):
|
||||||
captured.update(kwargs)
|
captured.update(kwargs)
|
||||||
return _FakeResponse()
|
return _FakeResponse(), None
|
||||||
|
|
||||||
monkeypatch.setattr(skincare_api, "call_gemini", _fake_call_gemini)
|
monkeypatch.setattr(skincare_api, "call_gemini", _fake_call_gemini)
|
||||||
|
|
||||||
|
|
|
||||||
100
backend/tests/test_tenancy_domains.py
Normal file
100
backend/tests/test_tenancy_domains.py
Normal file
|
|
@ -0,0 +1,100 @@
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import UTC, datetime, timedelta
|
||||||
|
from uuid import uuid4
|
||||||
|
|
||||||
|
from innercontext.api.auth_deps import get_current_user
|
||||||
|
from innercontext.auth import CurrentUser, IdentityData, TokenClaims
|
||||||
|
from innercontext.models import Role
|
||||||
|
from innercontext.models.ai_log import AICallLog
|
||||||
|
from main import app
|
||||||
|
|
||||||
|
|
||||||
|
def _user(subject: str, *, role: Role = Role.MEMBER) -> CurrentUser:
|
||||||
|
claims = TokenClaims(
|
||||||
|
issuer="https://auth.test",
|
||||||
|
subject=subject,
|
||||||
|
audience=("innercontext-web",),
|
||||||
|
expires_at=datetime.now(UTC) + timedelta(hours=1),
|
||||||
|
raw_claims={"iss": "https://auth.test", "sub": subject},
|
||||||
|
)
|
||||||
|
return CurrentUser(
|
||||||
|
user_id=uuid4(),
|
||||||
|
role=role,
|
||||||
|
identity=IdentityData.from_claims(claims),
|
||||||
|
claims=claims,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _set_current_user(user: CurrentUser) -> None:
|
||||||
|
app.dependency_overrides[get_current_user] = lambda: user
|
||||||
|
|
||||||
|
|
||||||
|
def test_profile_health_routines_skincare_ai_logs_are_user_scoped_by_default(
|
||||||
|
client, session
|
||||||
|
):
|
||||||
|
owner = _user("owner")
|
||||||
|
intruder = _user("intruder")
|
||||||
|
|
||||||
|
_set_current_user(owner)
|
||||||
|
profile = client.patch(
|
||||||
|
"/profile", json={"birth_date": "1991-01-15", "sex_at_birth": "male"}
|
||||||
|
)
|
||||||
|
medication = client.post(
|
||||||
|
"/health/medications", json={"kind": "prescription", "product_name": "Owner Rx"}
|
||||||
|
)
|
||||||
|
routine = client.post(
|
||||||
|
"/routines", json={"routine_date": "2026-03-01", "part_of_day": "am"}
|
||||||
|
)
|
||||||
|
snapshot = client.post("/skincare", json={"snapshot_date": "2026-03-01"})
|
||||||
|
log = AICallLog(endpoint="routines/suggest", model="gemini-3-flash-preview")
|
||||||
|
log.user_id = owner.user_id
|
||||||
|
session.add(log)
|
||||||
|
session.commit()
|
||||||
|
session.refresh(log)
|
||||||
|
|
||||||
|
assert profile.status_code == 200
|
||||||
|
assert medication.status_code == 201
|
||||||
|
assert routine.status_code == 201
|
||||||
|
assert snapshot.status_code == 201
|
||||||
|
|
||||||
|
medication_id = medication.json()["record_id"]
|
||||||
|
routine_id = routine.json()["id"]
|
||||||
|
snapshot_id = snapshot.json()["id"]
|
||||||
|
|
||||||
|
_set_current_user(intruder)
|
||||||
|
assert client.get("/profile").json() is None
|
||||||
|
assert client.get("/health/medications").json() == []
|
||||||
|
assert client.get("/routines").json() == []
|
||||||
|
assert client.get("/skincare").json() == []
|
||||||
|
assert client.get("/ai-logs").json() == []
|
||||||
|
|
||||||
|
assert client.get(f"/health/medications/{medication_id}").status_code == 404
|
||||||
|
assert client.get(f"/routines/{routine_id}").status_code == 404
|
||||||
|
assert client.get(f"/skincare/{snapshot_id}").status_code == 404
|
||||||
|
assert client.get(f"/ai-logs/{log.id}").status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
def test_health_admin_override_requires_explicit_user_id(client):
|
||||||
|
owner = _user("owner")
|
||||||
|
admin = _user("admin", role=Role.ADMIN)
|
||||||
|
|
||||||
|
_set_current_user(owner)
|
||||||
|
created = client.post(
|
||||||
|
"/health/lab-results",
|
||||||
|
json={
|
||||||
|
"collected_at": "2026-03-01T00:00:00",
|
||||||
|
"test_code": "718-7",
|
||||||
|
"test_name_original": "Hemoglobin",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
assert created.status_code == 201
|
||||||
|
|
||||||
|
_set_current_user(admin)
|
||||||
|
default_scope = client.get("/health/lab-results")
|
||||||
|
assert default_scope.status_code == 200
|
||||||
|
assert default_scope.json()["items"] == []
|
||||||
|
|
||||||
|
overridden = client.get(f"/health/lab-results?user_id={owner.user_id}")
|
||||||
|
assert overridden.status_code == 200
|
||||||
|
assert len(overridden.json()["items"]) == 1
|
||||||
16
backend/uv.lock
generated
16
backend/uv.lock
generated
|
|
@ -557,6 +557,7 @@ dependencies = [
|
||||||
{ name = "fastapi" },
|
{ name = "fastapi" },
|
||||||
{ name = "google-genai" },
|
{ name = "google-genai" },
|
||||||
{ name = "psycopg", extra = ["binary"] },
|
{ name = "psycopg", extra = ["binary"] },
|
||||||
|
{ name = "pyjwt", extra = ["crypto"] },
|
||||||
{ name = "python-dotenv" },
|
{ name = "python-dotenv" },
|
||||||
{ name = "python-multipart" },
|
{ name = "python-multipart" },
|
||||||
{ name = "sqlmodel" },
|
{ name = "sqlmodel" },
|
||||||
|
|
@ -580,6 +581,7 @@ requires-dist = [
|
||||||
{ name = "fastapi", specifier = ">=0.132.0" },
|
{ name = "fastapi", specifier = ">=0.132.0" },
|
||||||
{ name = "google-genai", specifier = ">=1.65.0" },
|
{ name = "google-genai", specifier = ">=1.65.0" },
|
||||||
{ name = "psycopg", extras = ["binary"], specifier = ">=3.3.3" },
|
{ name = "psycopg", extras = ["binary"], specifier = ">=3.3.3" },
|
||||||
|
{ name = "pyjwt", extras = ["crypto"], specifier = ">=2.10.1" },
|
||||||
{ name = "python-dotenv", specifier = ">=1.2.1" },
|
{ name = "python-dotenv", specifier = ">=1.2.1" },
|
||||||
{ name = "python-multipart", specifier = ">=0.0.22" },
|
{ name = "python-multipart", specifier = ">=0.0.22" },
|
||||||
{ name = "sqlmodel", specifier = ">=0.0.37" },
|
{ name = "sqlmodel", specifier = ">=0.0.37" },
|
||||||
|
|
@ -909,6 +911,20 @@ wheels = [
|
||||||
{ url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
|
{ url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
|
||||||
]
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "pyjwt"
|
||||||
|
version = "2.11.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/5c/5a/b46fa56bf322901eee5b0454a34343cdbdae202cd421775a8ee4e42fd519/pyjwt-2.11.0.tar.gz", hash = "sha256:35f95c1f0fbe5d5ba6e43f00271c275f7a1a4db1dab27bf708073b75318ea623", size = 98019, upload-time = "2026-01-30T19:59:55.694Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/6f/01/c26ce75ba460d5cd503da9e13b21a33804d38c2165dec7b716d06b13010c/pyjwt-2.11.0-py3-none-any.whl", hash = "sha256:94a6bde30eb5c8e04fee991062b534071fd1439ef58d2adc9ccb823e7bcd0469", size = 28224, upload-time = "2026-01-30T19:59:54.539Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[package.optional-dependencies]
|
||||||
|
crypto = [
|
||||||
|
{ name = "cryptography" },
|
||||||
|
]
|
||||||
|
|
||||||
[[package]]
|
[[package]]
|
||||||
name = "pytest"
|
name = "pytest"
|
||||||
version = "9.0.2"
|
version = "9.0.2"
|
||||||
|
|
|
||||||
|
|
@ -346,7 +346,8 @@ check_backend_health() {
|
||||||
check_frontend_health() {
|
check_frontend_health() {
|
||||||
local i
|
local i
|
||||||
for ((i = 1; i <= 30; i++)); do
|
for ((i = 1; i <= 30; i++)); do
|
||||||
if remote "curl -sf http://127.0.0.1:3000/ >/dev/null"; then
|
# Allow 200 OK or 302/303/307 Redirect (to login)
|
||||||
|
if remote "curl -s -o /dev/null -w '%{http_code}' http://127.0.0.1:3000/ | grep -qE '^(200|302|303|307)$'"; then
|
||||||
log "Frontend health check passed"
|
log "Frontend health check passed"
|
||||||
return 0
|
return 0
|
||||||
fi
|
fi
|
||||||
|
|
|
||||||
|
|
@ -94,117 +94,82 @@ chown -R innercontext:innercontext /opt/innercontext
|
||||||
cat > /opt/innercontext/shared/backend/.env <<'EOF'
|
cat > /opt/innercontext/shared/backend/.env <<'EOF'
|
||||||
DATABASE_URL=postgresql+psycopg://innercontext:change-me@<pg-ip>/innercontext
|
DATABASE_URL=postgresql+psycopg://innercontext:change-me@<pg-ip>/innercontext
|
||||||
GEMINI_API_KEY=your-key
|
GEMINI_API_KEY=your-key
|
||||||
|
|
||||||
|
# OIDC Configuration
|
||||||
|
OIDC_ISSUER=https://auth.example.com
|
||||||
|
OIDC_CLIENT_ID=innercontext-backend
|
||||||
|
OIDC_DISCOVERY_URL=https://auth.example.com/.well-known/openid-configuration
|
||||||
|
OIDC_ADMIN_GROUPS=admins
|
||||||
|
OIDC_MEMBER_GROUPS=members
|
||||||
|
|
||||||
|
# Bootstrap Admin (Optional, used for initial setup)
|
||||||
|
# BOOTSTRAP_ADMIN_OIDC_ISSUER=https://auth.example.com
|
||||||
|
# BOOTSTRAP_ADMIN_OIDC_SUB=user-sub-from-authelia
|
||||||
|
# BOOTSTRAP_ADMIN_EMAIL=admin@example.com
|
||||||
|
# BOOTSTRAP_ADMIN_NAME="Admin User"
|
||||||
|
# BOOTSTRAP_HOUSEHOLD_NAME="My Household"
|
||||||
EOF
|
EOF
|
||||||
|
|
||||||
cat > /opt/innercontext/shared/frontend/.env.production <<'EOF'
|
cat > /opt/innercontext/shared/frontend/.env.production <<'EOF'
|
||||||
PUBLIC_API_BASE=http://127.0.0.1:8000
|
PUBLIC_API_BASE=http://127.0.0.1:8000
|
||||||
ORIGIN=http://innercontext.lan
|
ORIGIN=http://innercontext.lan
|
||||||
|
|
||||||
|
# Session and OIDC
|
||||||
|
SESSION_SECRET=generate-a-long-random-string
|
||||||
|
OIDC_ISSUER=https://auth.example.com
|
||||||
|
OIDC_CLIENT_ID=innercontext-frontend
|
||||||
|
OIDC_DISCOVERY_URL=https://auth.example.com/.well-known/openid-configuration
|
||||||
EOF
|
EOF
|
||||||
|
|
||||||
chmod 600 /opt/innercontext/shared/backend/.env
|
|
||||||
chmod 600 /opt/innercontext/shared/frontend/.env.production
|
|
||||||
chown innercontext:innercontext /opt/innercontext/shared/backend/.env
|
|
||||||
chown innercontext:innercontext /opt/innercontext/shared/frontend/.env.production
|
|
||||||
```
|
```
|
||||||
|
|
||||||
### 4) Grant deploy sudo permissions
|
## OIDC Setup (Authelia)
|
||||||
|
|
||||||
```bash
|
This project uses OIDC for authentication. You need an OIDC provider like Authelia.
|
||||||
cat > /etc/sudoers.d/innercontext-deploy << 'EOF'
|
|
||||||
innercontext ALL=(root) NOPASSWD: \
|
|
||||||
/usr/bin/systemctl restart innercontext, \
|
|
||||||
/usr/bin/systemctl restart innercontext-node, \
|
|
||||||
/usr/bin/systemctl restart innercontext-pricing-worker, \
|
|
||||||
/usr/bin/systemctl is-active innercontext, \
|
|
||||||
/usr/bin/systemctl is-active innercontext-node, \
|
|
||||||
/usr/bin/systemctl is-active innercontext-pricing-worker
|
|
||||||
EOF
|
|
||||||
|
|
||||||
chmod 440 /etc/sudoers.d/innercontext-deploy
|
### Authelia Client Configuration
|
||||||
visudo -c -f /etc/sudoers.d/innercontext-deploy
|
|
||||||
|
|
||||||
# Must work without password or TTY prompt:
|
Add the following to your Authelia `configuration.yml`:
|
||||||
sudo -u innercontext sudo -n -l
|
|
||||||
|
```yaml
|
||||||
|
identity_providers:
|
||||||
|
oidc:
|
||||||
|
clients:
|
||||||
|
- id: innercontext-frontend
|
||||||
|
description: InnerContext Frontend
|
||||||
|
secret: '$pbkdf2-sha512$...' # Not used for public client, but Authelia may require it
|
||||||
|
public: true
|
||||||
|
authorization_policy: one_factor
|
||||||
|
redirect_uris:
|
||||||
|
- http://innercontext.lan/auth/callback
|
||||||
|
scopes:
|
||||||
|
- openid
|
||||||
|
- profile
|
||||||
|
- email
|
||||||
|
- groups
|
||||||
|
userinfo_signed_response_alg: none
|
||||||
|
|
||||||
|
- id: innercontext-backend
|
||||||
|
description: InnerContext Backend
|
||||||
|
secret: '$pbkdf2-sha512$...'
|
||||||
|
public: false
|
||||||
|
authorization_policy: one_factor
|
||||||
|
redirect_uris: []
|
||||||
|
scopes:
|
||||||
|
- openid
|
||||||
|
- profile
|
||||||
|
- email
|
||||||
|
- groups
|
||||||
|
userinfo_signed_response_alg: none
|
||||||
```
|
```
|
||||||
|
|
||||||
If `sudo -n -l` fails, deployments will fail during restart/rollback with:
|
### Bootstrap Admin
|
||||||
`sudo: a terminal is required` or `sudo: a password is required`.
|
|
||||||
|
|
||||||
### 5) Install systemd and nginx configs
|
To create the first user and household, set the `BOOTSTRAP_ADMIN_*` environment variables in the backend `.env` file and restart the backend. The backend will automatically create the user and household on startup if they don't exist. After the first successful login, you can remove these variables.
|
||||||
|
|
||||||
After first deploy (or after copying repo content to `/opt/innercontext/current`), install configs:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
cp /opt/innercontext/current/systemd/innercontext.service /etc/systemd/system/
|
|
||||||
cp /opt/innercontext/current/systemd/innercontext-node.service /etc/systemd/system/
|
|
||||||
cp /opt/innercontext/current/systemd/innercontext-pricing-worker.service /etc/systemd/system/
|
|
||||||
systemctl daemon-reload
|
|
||||||
systemctl enable innercontext
|
|
||||||
systemctl enable innercontext-node
|
|
||||||
systemctl enable innercontext-pricing-worker
|
|
||||||
|
|
||||||
cp /opt/innercontext/current/nginx/innercontext.conf /etc/nginx/sites-available/innercontext
|
|
||||||
ln -sf /etc/nginx/sites-available/innercontext /etc/nginx/sites-enabled/innercontext
|
|
||||||
rm -f /etc/nginx/sites-enabled/default
|
|
||||||
nginx -t && systemctl reload nginx
|
|
||||||
```
|
|
||||||
|
|
||||||
## Local Machine Setup
|
|
||||||
|
|
||||||
`~/.ssh/config`:
|
|
||||||
|
|
||||||
```
|
|
||||||
Host innercontext
|
|
||||||
HostName <lxc-ip>
|
|
||||||
User innercontext
|
|
||||||
```
|
|
||||||
|
|
||||||
Ensure your public key is in `/home/innercontext/.ssh/authorized_keys`.
|
|
||||||
|
|
||||||
## Deploy Commands
|
|
||||||
|
|
||||||
From repository root on external machine:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
./deploy.sh # full deploy (default = all)
|
|
||||||
./deploy.sh all
|
|
||||||
./deploy.sh backend
|
|
||||||
./deploy.sh frontend
|
|
||||||
./deploy.sh list
|
|
||||||
./deploy.sh rollback
|
|
||||||
```
|
|
||||||
|
|
||||||
Optional overrides:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
DEPLOY_SERVER=innercontext ./deploy.sh all
|
|
||||||
DEPLOY_ROOT=/opt/innercontext ./deploy.sh backend
|
|
||||||
DEPLOY_ALLOW_DIRTY=1 ./deploy.sh frontend
|
|
||||||
```
|
|
||||||
|
|
||||||
## What `deploy.sh` Does
|
|
||||||
|
|
||||||
For `backend` / `frontend` / `all`:
|
|
||||||
|
|
||||||
1. Local checks (strict, fail-fast)
|
|
||||||
2. Acquire `/opt/innercontext/.deploy.lock`
|
|
||||||
3. Create `<timestamp>` release directory
|
|
||||||
4. Upload selected component(s)
|
|
||||||
5. Link shared env files in the release directory
|
|
||||||
6. `uv sync` + `alembic upgrade head` (backend scope)
|
|
||||||
7. Upload `scripts/`, `systemd/`, `nginx/`
|
|
||||||
8. Switch `current` to the prepared release
|
|
||||||
9. Restart affected services
|
|
||||||
10. Run health checks
|
|
||||||
11. Remove old releases (keep last 5)
|
|
||||||
12. Write deploy entry to `/opt/innercontext/deploy.log`
|
|
||||||
|
|
||||||
If anything fails after promotion, script auto-rolls back to previous release.
|
|
||||||
|
|
||||||
## Health Checks
|
## Health Checks
|
||||||
|
|
||||||
- Backend: `http://127.0.0.1:8000/health-check`
|
- Backend: `http://127.0.0.1:8000/health-check` (returns 200)
|
||||||
- Frontend: `http://127.0.0.1:3000/`
|
- Frontend: `http://127.0.0.1:3000/` (returns 200 or 302 redirect to login)
|
||||||
- Worker: `systemctl is-active innercontext-pricing-worker`
|
- Worker: `systemctl is-active innercontext-pricing-worker`
|
||||||
|
|
||||||
Manual checks:
|
Manual checks:
|
||||||
|
|
|
||||||
|
|
@ -80,10 +80,12 @@ Use these wrappers before introducing route-specific structure:
|
||||||
|
|
||||||
- `editorial-page`: standard constrained content width for route pages.
|
- `editorial-page`: standard constrained content width for route pages.
|
||||||
- `editorial-hero`: top summary strip for title, subtitle, and primary actions.
|
- `editorial-hero`: top summary strip for title, subtitle, and primary actions.
|
||||||
|
- `PageHeader.svelte`: preferred reusable wrapper for page-level hero sections; use it to keep title hierarchy, backlinks, meta rows, and action placement consistent.
|
||||||
- `editorial-panel`: primary surface for forms, tables, and ledgers.
|
- `editorial-panel`: primary surface for forms, tables, and ledgers.
|
||||||
- `editorial-toolbar`: compact action row under hero copy.
|
- `editorial-toolbar`: compact action row under hero copy.
|
||||||
- `editorial-backlink`: standard top-left back navigation style.
|
- `editorial-backlink`: standard top-left back navigation style.
|
||||||
- `editorial-alert`, `editorial-alert--error`, `editorial-alert--success`, `editorial-alert--warning`, `editorial-alert--info`: feedback banners.
|
- `editorial-alert`, `editorial-alert--error`, `editorial-alert--success`, `editorial-alert--warning`, `editorial-alert--info`: feedback banners.
|
||||||
|
- `page-header-meta`, `page-header-foot`, `hero-strip`: shared secondary rows inside page headers for compact metadata and summary stats.
|
||||||
|
|
||||||
### Collapsible panels
|
### Collapsible panels
|
||||||
|
|
||||||
|
|
@ -117,10 +119,14 @@ This matches the warm editorial aesthetic and maintains visual consistency with
|
||||||
|
|
||||||
These classes are already in use and should be reused:
|
These classes are already in use and should be reused:
|
||||||
|
|
||||||
- Lists and ledgers: `routine-ledger-row`, `products-mobile-card`, `health-entry-row`
|
- Lists and ledgers: `routine-ledger-row`, `editorial-mobile-card`, `health-entry-row`
|
||||||
- Group headers: `products-section-title`
|
- Group headers: `editorial-section-title`
|
||||||
- Table shell: `products-table-shell`
|
- Table shell: `editorial-table-shell`
|
||||||
|
- Compact metadata rows: `editorial-meta-strip`
|
||||||
- Tabs shell: `products-tabs`, `editorial-tabs`
|
- Tabs shell: `products-tabs`, `editorial-tabs`
|
||||||
|
- App shell/navigation: `app-mobile-header`, `app-drawer`, `app-nav-list`, `app-nav-link`, `app-sidebar-footer`
|
||||||
|
- Reusable locale control: `LanguageSwitcher.svelte` with `language-switcher*` classes
|
||||||
|
- Dashboard summary patterns: `dashboard-stat-strip`, `dashboard-stat-card`, `dashboard-attention-list`, `dashboard-attention-item`
|
||||||
- Health semantic pills: `health-kind-pill*`, `health-flag-pill*`
|
- Health semantic pills: `health-kind-pill*`, `health-flag-pill*`
|
||||||
- Lab results utilities:
|
- Lab results utilities:
|
||||||
- metadata chips: `lab-results-meta-strip`, `lab-results-meta-pill`
|
- metadata chips: `lab-results-meta-strip`, `lab-results-meta-pill`
|
||||||
|
|
@ -145,6 +151,7 @@ These classes are already in use and should be reused:
|
||||||
- In dense row-based lists, prefer `ghost` action controls; use icon-only buttons on desktop tables and short text+icon `ghost` actions on mobile cards to keep row actions subordinate to data.
|
- In dense row-based lists, prefer `ghost` action controls; use icon-only buttons on desktop tables and short text+icon `ghost` actions on mobile cards to keep row actions subordinate to data.
|
||||||
- For editable data tables, open a dedicated inline edit panel above the list (instead of per-row expanded forms) and prefill it from row actions; keep users on the same filtered/paginated context after save.
|
- For editable data tables, open a dedicated inline edit panel above the list (instead of per-row expanded forms) and prefill it from row actions; keep users on the same filtered/paginated context after save.
|
||||||
- When a list is narrowed to a single entity key (for example `test_code`), display an explicit "filtered by" banner with a one-click clear action and avoid extra grouping wrappers that add no context.
|
- When a list is narrowed to a single entity key (for example `test_code`), display an explicit "filtered by" banner with a one-click clear action and avoid extra grouping wrappers that add no context.
|
||||||
|
- For dashboard-style summaries, prefer compact stat strips and attention rows over large decorative cards; each item should pair one strong value with one short explanatory line.
|
||||||
|
|
||||||
### DRY form primitives
|
### DRY form primitives
|
||||||
|
|
||||||
|
|
@ -211,6 +218,7 @@ These classes are already in use and should be reused:
|
||||||
|
|
||||||
- Core tokens and global look: `frontend/src/app.css`
|
- Core tokens and global look: `frontend/src/app.css`
|
||||||
- App shell and route domain mapping: `frontend/src/routes/+layout.svelte`
|
- App shell and route domain mapping: `frontend/src/routes/+layout.svelte`
|
||||||
|
- Shared page header: `frontend/src/lib/components/PageHeader.svelte`
|
||||||
- Route examples using the pattern:
|
- Route examples using the pattern:
|
||||||
- `frontend/src/routes/+page.svelte`
|
- `frontend/src/routes/+page.svelte`
|
||||||
- `frontend/src/routes/products/+page.svelte`
|
- `frontend/src/routes/products/+page.svelte`
|
||||||
|
|
|
||||||
|
|
@ -1,6 +1,8 @@
|
||||||
node_modules
|
node_modules
|
||||||
.svelte-kit
|
.svelte-kit
|
||||||
paraglide
|
paraglide
|
||||||
|
src/lib/api/generated
|
||||||
|
openapi.json
|
||||||
build
|
build
|
||||||
dist
|
dist
|
||||||
.env
|
.env
|
||||||
|
|
|
||||||
160
frontend/AGENTS.md
Normal file
160
frontend/AGENTS.md
Normal file
|
|
@ -0,0 +1,160 @@
|
||||||
|
# Frontend
|
||||||
|
|
||||||
|
SvelteKit 2 + Svelte 5 (Runes) web UI. Adapter: `@sveltejs/adapter-node` (required for form actions).
|
||||||
|
|
||||||
|
## Structure
|
||||||
|
|
||||||
|
```
|
||||||
|
frontend/src/
|
||||||
|
├── app.css # Tailwind v4 theme + editorial design system (1420 lines)
|
||||||
|
├── app.html # HTML shell (Cormorant Infant + Manrope fonts)
|
||||||
|
├── hooks.server.ts # Paraglide i18n middleware
|
||||||
|
├── routes/ # SvelteKit file-based routing
|
||||||
|
│ ├── +layout.svelte # App shell, sidebar, mobile drawer, domain-based theming
|
||||||
|
│ ├── +page.svelte # Dashboard (routines, snapshots, lab results)
|
||||||
|
│ ├── products/ # List, [id] detail/edit, new, suggest (AI)
|
||||||
|
│ ├── routines/ # List, [id] detail/edit, new, suggest (AI), grooming-schedule/
|
||||||
|
│ ├── health/ # medications/ (list, new), lab-results/ (list, new)
|
||||||
|
│ ├── skin/ # Snapshots list, new (with photo analysis)
|
||||||
|
│ └── profile/ # User profile
|
||||||
|
└── lib/
|
||||||
|
├── api.ts # Typed fetch wrappers (server: PUBLIC_API_BASE, browser: /api)
|
||||||
|
├── types.ts # Type bridge — re-exports from generated OpenAPI types with augmentations
|
||||||
|
├── api/generated/ # Auto-generated types from backend OpenAPI schema — DO NOT EDIT
|
||||||
|
├── utils.ts # cn() class merger, bits-ui types
|
||||||
|
├── utils/ # forms.ts (preventIfNotConfirmed), skin-display.ts (label helpers)
|
||||||
|
├── paraglide/ # Generated i18n runtime — DO NOT EDIT
|
||||||
|
└── components/
|
||||||
|
├── ui/ # bits-ui primitives: button, card, badge, input, label, select, tabs, table, separator
|
||||||
|
├── forms/ # DRY helpers: SimpleSelect, GroupedSelect, HintCheckbox, LabeledInputField, FormSectionCard, form-classes.ts
|
||||||
|
├── product-form/ # Sectioned form: Basic, Details, Classification, Assessment, Notes
|
||||||
|
├── PageHeader.svelte # Reusable page header (kicker, title, subtitle, backlink, actions via snippets)
|
||||||
|
├── ProductForm.svelte # Main product form (tabbed, 737 lines)
|
||||||
|
├── ProductFormAiModal.svelte # AI text-to-product parsing modal
|
||||||
|
├── FlashMessages.svelte # Error/success/warning/info alerts
|
||||||
|
├── StructuredErrorDisplay.svelte # Parses semicolon-separated backend errors into list
|
||||||
|
├── ValidationWarningsAlert.svelte # LLM validation warnings display
|
||||||
|
├── ReasoningChainViewer.svelte # AI reasoning chain viewer (collapsible)
|
||||||
|
├── MetadataDebugPanel.svelte # Token metrics, model info (collapsible)
|
||||||
|
├── AutoFixBadge.svelte # Auto-fix indicator
|
||||||
|
└── LanguageSwitcher.svelte # i18n locale toggle
|
||||||
|
```
|
||||||
|
|
||||||
|
## Design System
|
||||||
|
|
||||||
|
**MUST READ**: `docs/frontend-design-cookbook.md` — update when introducing new UI patterns.
|
||||||
|
|
||||||
|
- **Typography**: `Cormorant Infant` (display/headings), `Manrope` (body/UI).
|
||||||
|
- **Colors**: CSS variables in `app.css`. Domain accents per route: products (green), routines (cyan), skin (orange), profile (blue), health-labs (purple), health-meds (teal).
|
||||||
|
- **Layout wrappers**: `.editorial-page`, `.editorial-hero`, `.editorial-panel`, `.editorial-toolbar`, `.editorial-backlink`, `.editorial-alert`.
|
||||||
|
- **Page header**: Use `PageHeader.svelte` for consistent title hierarchy, backlinks, and actions.
|
||||||
|
- **Accent rule**: ~10-15% of visual area. Never full backgrounds or body text.
|
||||||
|
- **Motion**: Short purposeful reveals. Always respect `prefers-reduced-motion`.
|
||||||
|
|
||||||
|
## Route Patterns
|
||||||
|
|
||||||
|
Every page: `+page.svelte` (UI) + `+page.server.ts` (load + actions).
|
||||||
|
|
||||||
|
Load functions fetch from API, return data:
|
||||||
|
```typescript
|
||||||
|
export const load: PageServerLoad = async () => {
|
||||||
|
const data = await getProducts();
|
||||||
|
return { products: data };
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
Form actions parse FormData, call API, return result or `fail()`:
|
||||||
|
```typescript
|
||||||
|
export const actions = {
|
||||||
|
default: async ({ request }) => {
|
||||||
|
const form = await request.formData();
|
||||||
|
try {
|
||||||
|
const result = await createProduct(payload);
|
||||||
|
return { success: true, product: result };
|
||||||
|
} catch (e) {
|
||||||
|
return fail(500, { error: (e as Error).message });
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
## Component Conventions
|
||||||
|
|
||||||
|
- Prefer `SimpleSelect` / `GroupedSelect` over bits-ui `ui/select` unless search/rich popup UX needed.
|
||||||
|
- Use `form-classes.ts` tokens (`baseSelectClass`, `baseTextareaClass`) for consistent form styling.
|
||||||
|
- Svelte 5 runes: `$props()`, `$state()`, `$derived()`, `$effect()`, `$bindable()`.
|
||||||
|
- Snippet-based composition in `PageHeader` (actions, meta, children snippets).
|
||||||
|
- Compound components: Card → CardHeader, CardContent, CardFooter, etc.
|
||||||
|
|
||||||
|
## i18n
|
||||||
|
|
||||||
|
- Source messages: `frontend/messages/{en,pl}.json`.
|
||||||
|
- Generated runtime: `src/lib/paraglide/` (via Vite plugin).
|
||||||
|
- Import: `import * as m from '$lib/paraglide/messages.js'`.
|
||||||
|
- **No hardcoded English labels.** Use `m.*` keys. Add new keys to message files if needed.
|
||||||
|
- Fallback display: use `m.common_unknown()` not hardcoded `n/a`.
|
||||||
|
|
||||||
|
## API Client
|
||||||
|
|
||||||
|
`src/lib/api.ts` — typed fetch wrappers.
|
||||||
|
|
||||||
|
```typescript
|
||||||
|
const base = browser ? "/api" : PUBLIC_API_BASE;
|
||||||
|
// Browser: /api (nginx proxies, strips prefix to backend)
|
||||||
|
// Server-side (SSR): PUBLIC_API_BASE (http://localhost:8000)
|
||||||
|
```
|
||||||
|
|
||||||
|
Methods: `api.get<T>()`, `api.post<T>()`, `api.patch<T>()`, `api.del()`.
|
||||||
|
File upload: `analyzeSkinPhotos()` uses FormData (not JSON).
|
||||||
|
Error handling: throws `Error` with `.detail` from backend response.
|
||||||
|
|
||||||
|
## Environment
|
||||||
|
|
||||||
|
| Variable | Default | Set at |
|
||||||
|
|----------|---------|--------|
|
||||||
|
| `PUBLIC_API_BASE` | `http://localhost:8000` | Build time |
|
||||||
|
|
||||||
|
Production: `PUBLIC_API_BASE=http://innercontext.lan/api pnpm build`.
|
||||||
|
|
||||||
|
## Commands
|
||||||
|
|
||||||
|
```bash
|
||||||
|
pnpm dev # Dev server (API proxied to :8000)
|
||||||
|
pnpm check # Type check + Svelte validation
|
||||||
|
pnpm lint # ESLint
|
||||||
|
pnpm format # Prettier
|
||||||
|
pnpm build # Production build → build/
|
||||||
|
pnpm generate:api # Regenerate TypeScript types from backend OpenAPI schema
|
||||||
|
```
|
||||||
|
|
||||||
|
## Anti-Patterns
|
||||||
|
|
||||||
|
- No frontend tests exist. Only linting + type checking.
|
||||||
|
- ESLint `svelte/no-navigation-without-resolve` has `ignoreGoto: true` workaround (upstream bug sveltejs/eslint-plugin-svelte#1327).
|
||||||
|
- `src/paraglide/` is a legacy output path — active i18n output is in `src/lib/paraglide/`.
|
||||||
|
|
||||||
|
## Type Generation
|
||||||
|
|
||||||
|
TypeScript types are auto-generated from the FastAPI backend's OpenAPI schema using `@hey-api/openapi-ts`.
|
||||||
|
|
||||||
|
### Workflow
|
||||||
|
|
||||||
|
1. Generate `openapi.json` from backend: `cd backend && uv run python -c "import json; from main import app; print(json.dumps(app.openapi(), indent=2))" > ../frontend/openapi.json`
|
||||||
|
2. Generate types: `cd frontend && pnpm generate:api`
|
||||||
|
3. Output lands in `src/lib/api/generated/types.gen.ts` — **never edit this file directly**.
|
||||||
|
|
||||||
|
### Architecture
|
||||||
|
|
||||||
|
- **Generated types**: `src/lib/api/generated/types.gen.ts` — raw OpenAPI types, auto-generated.
|
||||||
|
- **Bridge file**: `src/lib/types.ts` — re-exports from generated types with:
|
||||||
|
- **Renames**: `ProductWithInventory` → `Product`, `ProductListItem` → `ProductSummary`, `UserProfilePublic` → `UserProfile`, `SkinConditionSnapshotPublic` → `SkinConditionSnapshot`.
|
||||||
|
- **`Require<T, K>` augmentations**: Fields with `default_factory` in SQLModel are optional in OpenAPI but always present in API responses (e.g. `id`, `created_at`, `updated_at`, `targets`, `inventory`).
|
||||||
|
- **Relationship fields**: SQLModel `Relationship()` fields are excluded from OpenAPI schema. Added manually: `MedicationEntry.usage_history`, `Routine.steps`, `RoutineStep.product`, `ProductInventory.product`, `Product.inventory` (with augmented `ProductInventory`).
|
||||||
|
- **Manual types**: `PriceTierSource`, `ShoppingPriority` — inline literals in backend, not named in OpenAPI.
|
||||||
|
- **Canonical import**: Always `import type { ... } from '$lib/types'` — never import from `$lib/api/generated` directly.
|
||||||
|
|
||||||
|
### When to regenerate
|
||||||
|
|
||||||
|
- After adding/modifying backend models or response schemas.
|
||||||
|
- After adding/modifying API endpoints that change the OpenAPI spec.
|
||||||
|
- After updating the bridge file, run `pnpm check` to verify type compatibility.
|
||||||
|
|
@ -69,6 +69,6 @@ Or use the provided systemd service: `../systemd/innercontext-node.service`.
|
||||||
| File | Purpose |
|
| File | Purpose |
|
||||||
| ------------------ | --------------------------------- |
|
| ------------------ | --------------------------------- |
|
||||||
| `src/lib/api.ts` | API client (typed fetch wrappers) |
|
| `src/lib/api.ts` | API client (typed fetch wrappers) |
|
||||||
| `src/lib/types.ts` | Shared TypeScript types |
|
| `src/lib/types.ts` | Type bridge (re-exports from generated OpenAPI types) |
|
||||||
| `src/app.css` | Tailwind v4 theme + global styles |
|
| `src/app.css` | Tailwind v4 theme + global styles |
|
||||||
| `svelte.config.js` | SvelteKit config (adapter-node) |
|
| `svelte.config.js` | SvelteKit config (adapter-node) |
|
||||||
|
|
|
||||||
|
|
@ -12,6 +12,7 @@ export default [
|
||||||
"dist",
|
"dist",
|
||||||
"**/paraglide/**",
|
"**/paraglide/**",
|
||||||
"**/lib/paraglide/**",
|
"**/lib/paraglide/**",
|
||||||
|
"**/api/generated/**",
|
||||||
],
|
],
|
||||||
},
|
},
|
||||||
js.configs.recommended,
|
js.configs.recommended,
|
||||||
|
|
|
||||||
|
|
@ -10,6 +10,11 @@
|
||||||
"nav_appName": "innercontext",
|
"nav_appName": "innercontext",
|
||||||
"nav_appSubtitle": "personal health & skincare",
|
"nav_appSubtitle": "personal health & skincare",
|
||||||
|
|
||||||
|
"auth_signedInAs": "Signed in as",
|
||||||
|
"auth_roleAdmin": "Admin",
|
||||||
|
"auth_roleMember": "Member",
|
||||||
|
"auth_logout": "Log out",
|
||||||
|
|
||||||
"common_save": "Save",
|
"common_save": "Save",
|
||||||
"common_cancel": "Cancel",
|
"common_cancel": "Cancel",
|
||||||
"common_add": "Add",
|
"common_add": "Add",
|
||||||
|
|
@ -32,8 +37,74 @@
|
||||||
|
|
||||||
"dashboard_title": "Dashboard",
|
"dashboard_title": "Dashboard",
|
||||||
"dashboard_subtitle": "Your recent health & skincare overview",
|
"dashboard_subtitle": "Your recent health & skincare overview",
|
||||||
|
"dashboard_dailyBriefing": "A quick read on what changed, what is missing, and where to look next.",
|
||||||
"dashboard_latestSnapshot": "Latest Skin Snapshot",
|
"dashboard_latestSnapshot": "Latest Skin Snapshot",
|
||||||
"dashboard_recentRoutines": "Recent Routines",
|
"dashboard_recentRoutines": "Recent Routines",
|
||||||
|
"dashboard_requiresAttention": "Requires attention",
|
||||||
|
"dashboard_healthPulse": "Health Pulse",
|
||||||
|
"dashboard_viewSkinHistory": "Open skin history",
|
||||||
|
"dashboard_viewLabResults": "Open lab results",
|
||||||
|
"dashboard_heroFreshness": "Freshness",
|
||||||
|
"dashboard_sinceLastSnapshot": "since last snapshot",
|
||||||
|
"dashboard_skinFreshness": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "Last snapshot was {count} day ago.",
|
||||||
|
"countPlural=*": "Last snapshot was {count} days ago."
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_daysAgo": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "{count} day ago",
|
||||||
|
"countPlural=*": "{count} days ago"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_daysAgoShort": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "{count}d",
|
||||||
|
"countPlural=*": "{count}d"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_flaggedResults": "flagged results",
|
||||||
|
"dashboard_flaggedLabsCount": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "{count} flagged result",
|
||||||
|
"countPlural=*": "{count} flagged results"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_attentionSnapshot": "Skin log",
|
||||||
|
"dashboard_attentionRoutineAM": "AM routine",
|
||||||
|
"dashboard_attentionRoutinePM": "PM routine",
|
||||||
|
"dashboard_attentionLabs": "Lab review",
|
||||||
|
"dashboard_attentionMissing": "Missing",
|
||||||
|
"dashboard_attentionToday": "Today",
|
||||||
|
"dashboard_attentionStable": "No flagged items",
|
||||||
|
"dashboard_statusLogged": "Logged",
|
||||||
|
"dashboard_statusOpen": "Open",
|
||||||
|
"dashboard_metricHydration": "Hydration",
|
||||||
|
"dashboard_metricSensitivity": "Sensitivity",
|
||||||
|
"dashboard_metricSebumTzone": "Sebum T-zone",
|
||||||
|
"dashboard_metricSebumCheeks": "Sebum cheeks",
|
||||||
|
"dashboard_metricDelta": "Delta {delta}",
|
||||||
|
"dashboard_averageSteps": "Avg. {count} steps",
|
||||||
|
"dashboard_lastRoutine": "Last routine",
|
||||||
|
"dashboard_lastLabDate": "Last collection",
|
||||||
|
"dashboard_noLabResults": "No lab results yet.",
|
||||||
"dashboard_noSnapshots": "No skin snapshots yet.",
|
"dashboard_noSnapshots": "No skin snapshots yet.",
|
||||||
"dashboard_noRoutines": "No routines in the past 2 weeks.",
|
"dashboard_noRoutines": "No routines in the past 2 weeks.",
|
||||||
|
|
||||||
|
|
@ -58,6 +129,15 @@
|
||||||
"products_suggestResults": "Suggestions",
|
"products_suggestResults": "Suggestions",
|
||||||
"products_suggestTime": "Time",
|
"products_suggestTime": "Time",
|
||||||
"products_suggestFrequency": "Frequency",
|
"products_suggestFrequency": "Frequency",
|
||||||
|
"products_suggestPriorityHigh": "High priority",
|
||||||
|
"products_suggestPriorityMedium": "Medium priority",
|
||||||
|
"products_suggestPriorityLow": "Low priority",
|
||||||
|
"products_suggestBuyNow": "Buy now because",
|
||||||
|
"products_suggestRoutineFit": "How it fits your routine",
|
||||||
|
"products_suggestBudgetSkip": "If you're cutting the budget",
|
||||||
|
"products_suggestKeyIngredients": "Key ingredients",
|
||||||
|
"products_suggestTargets": "Targets",
|
||||||
|
"products_suggestCautions": "Cautions",
|
||||||
"products_suggestRegenerate": "Regenerate",
|
"products_suggestRegenerate": "Regenerate",
|
||||||
"products_suggestNoResults": "No suggestions.",
|
"products_suggestNoResults": "No suggestions.",
|
||||||
"products_noProducts": "No products found.",
|
"products_noProducts": "No products found.",
|
||||||
|
|
@ -86,8 +166,11 @@
|
||||||
"inventory_openedDate": "Opened date",
|
"inventory_openedDate": "Opened date",
|
||||||
"inventory_finishedDate": "Finished date",
|
"inventory_finishedDate": "Finished date",
|
||||||
"inventory_expiryDate": "Expiry date",
|
"inventory_expiryDate": "Expiry date",
|
||||||
"inventory_currentWeight": "Current weight (g)",
|
"inventory_remainingLevel": "Remaining product level",
|
||||||
"inventory_lastWeighed": "Last weighed",
|
"inventory_remainingHigh": "high",
|
||||||
|
"inventory_remainingMedium": "medium",
|
||||||
|
"inventory_remainingLow": "low",
|
||||||
|
"inventory_remainingNearlyEmpty": "nearly empty",
|
||||||
"inventory_notes": "Notes",
|
"inventory_notes": "Notes",
|
||||||
"inventory_badgeOpen": "Open",
|
"inventory_badgeOpen": "Open",
|
||||||
"inventory_badgeSealed": "Sealed",
|
"inventory_badgeSealed": "Sealed",
|
||||||
|
|
@ -95,8 +178,6 @@
|
||||||
"inventory_exp": "Exp:",
|
"inventory_exp": "Exp:",
|
||||||
"inventory_opened": "Opened:",
|
"inventory_opened": "Opened:",
|
||||||
"inventory_finished": "Finished:",
|
"inventory_finished": "Finished:",
|
||||||
"inventory_remaining": "g remaining",
|
|
||||||
"inventory_weighed": "Weighed:",
|
|
||||||
"inventory_confirmDelete": "Delete this package?",
|
"inventory_confirmDelete": "Delete this package?",
|
||||||
|
|
||||||
"routines_title": "Routines",
|
"routines_title": "Routines",
|
||||||
|
|
@ -142,6 +223,9 @@
|
||||||
"grooming_title": "Grooming Schedule",
|
"grooming_title": "Grooming Schedule",
|
||||||
"grooming_backToRoutines": "Routines",
|
"grooming_backToRoutines": "Routines",
|
||||||
"grooming_addEntry": "+ Add entry",
|
"grooming_addEntry": "+ Add entry",
|
||||||
|
"grooming_newTitle": "New grooming entry",
|
||||||
|
"grooming_newSubtitle": "Add a recurring entry to your weekly grooming schedule.",
|
||||||
|
"grooming_newSectionIntro": "Choose the day, action, and an optional note.",
|
||||||
"grooming_entryAdded": "Entry added.",
|
"grooming_entryAdded": "Entry added.",
|
||||||
"grooming_entryUpdated": "Entry updated.",
|
"grooming_entryUpdated": "Entry updated.",
|
||||||
"grooming_entryDeleted": "Entry deleted.",
|
"grooming_entryDeleted": "Entry deleted.",
|
||||||
|
|
@ -244,6 +328,8 @@
|
||||||
],
|
],
|
||||||
"medications_addNew": "+ Add medication",
|
"medications_addNew": "+ Add medication",
|
||||||
"medications_newTitle": "New medication",
|
"medications_newTitle": "New medication",
|
||||||
|
"medications_newSubtitle": "Add a basic medication or supplement record for later tracking.",
|
||||||
|
"medications_newSectionIntro": "Start with the type, product name, and active substance.",
|
||||||
"medications_kind": "Kind",
|
"medications_kind": "Kind",
|
||||||
"medications_productName": "Product name *",
|
"medications_productName": "Product name *",
|
||||||
"medications_productNamePlaceholder": "e.g. Vitamin D3",
|
"medications_productNamePlaceholder": "e.g. Vitamin D3",
|
||||||
|
|
@ -281,9 +367,21 @@
|
||||||
],
|
],
|
||||||
"labResults_addNew": "+ Add result",
|
"labResults_addNew": "+ Add result",
|
||||||
"labResults_newTitle": "New lab result",
|
"labResults_newTitle": "New lab result",
|
||||||
|
"labResults_newSubtitle": "Save a single lab result and add it to your health history.",
|
||||||
|
"labResults_newSectionIntro": "Start with the date and LOINC code, then add the remaining details.",
|
||||||
"labResults_flagFilter": "Flag:",
|
"labResults_flagFilter": "Flag:",
|
||||||
"labResults_flagAll": "All",
|
"labResults_flagAll": "All",
|
||||||
"labResults_flagNone": "None",
|
"labResults_flagNone": "None",
|
||||||
|
"labResults_statusAll": "All",
|
||||||
|
"labResults_statusAbnormal": "Abnormal",
|
||||||
|
"labResults_statusNormal": "Normal",
|
||||||
|
"labResults_statusUninterpreted": "No interpretation",
|
||||||
|
"labResults_activeFilters": "Active filters",
|
||||||
|
"labResults_activeFilterSearch": "Search: {value}",
|
||||||
|
"labResults_activeFilterCode": "Code: {value}",
|
||||||
|
"labResults_activeFilterFrom": "From: {value}",
|
||||||
|
"labResults_activeFilterTo": "To: {value}",
|
||||||
|
"labResults_activeFilterHistory": "Full history",
|
||||||
"labResults_date": "Date *",
|
"labResults_date": "Date *",
|
||||||
"labResults_loincCode": "LOINC code *",
|
"labResults_loincCode": "LOINC code *",
|
||||||
"labResults_loincExample": "e.g. 718-7",
|
"labResults_loincExample": "e.g. 718-7",
|
||||||
|
|
@ -364,6 +462,8 @@
|
||||||
"skin_analyzePhotos": "Analyze photos",
|
"skin_analyzePhotos": "Analyze photos",
|
||||||
"skin_analyzing": "Analyzing…",
|
"skin_analyzing": "Analyzing…",
|
||||||
"skin_newSnapshotTitle": "New skin snapshot",
|
"skin_newSnapshotTitle": "New skin snapshot",
|
||||||
|
"skin_newSubtitle": "Capture today’s skin state manually or prefill the form with AI photo analysis.",
|
||||||
|
"skin_newSectionIntro": "Start with the date and overall condition, then refine the details.",
|
||||||
"skin_date": "Date *",
|
"skin_date": "Date *",
|
||||||
"skin_overallState": "Overall state",
|
"skin_overallState": "Overall state",
|
||||||
"skin_texture": "Texture",
|
"skin_texture": "Texture",
|
||||||
|
|
@ -505,7 +605,6 @@
|
||||||
"productForm_isTool": "Is tool (e.g. dermaroller)",
|
"productForm_isTool": "Is tool (e.g. dermaroller)",
|
||||||
"productForm_needleLengthMm": "Needle length (mm, tools only)",
|
"productForm_needleLengthMm": "Needle length (mm, tools only)",
|
||||||
"productForm_personalNotes": "Personal notes",
|
"productForm_personalNotes": "Personal notes",
|
||||||
"productForm_repurchaseIntent": "Repurchase intent",
|
|
||||||
"productForm_toleranceNotes": "Tolerance notes",
|
"productForm_toleranceNotes": "Tolerance notes",
|
||||||
"productForm_toleranceNotesPlaceholder": "e.g. Causes mild stinging, fine after 2 weeks",
|
"productForm_toleranceNotesPlaceholder": "e.g. Causes mild stinging, fine after 2 weeks",
|
||||||
|
|
||||||
|
|
|
||||||
|
|
@ -10,6 +10,11 @@
|
||||||
"nav_appName": "innercontext",
|
"nav_appName": "innercontext",
|
||||||
"nav_appSubtitle": "zdrowie & pielęgnacja",
|
"nav_appSubtitle": "zdrowie & pielęgnacja",
|
||||||
|
|
||||||
|
"auth_signedInAs": "Zalogowano jako",
|
||||||
|
"auth_roleAdmin": "Administrator",
|
||||||
|
"auth_roleMember": "Użytkownik",
|
||||||
|
"auth_logout": "Wyloguj",
|
||||||
|
|
||||||
"common_save": "Zapisz",
|
"common_save": "Zapisz",
|
||||||
"common_cancel": "Anuluj",
|
"common_cancel": "Anuluj",
|
||||||
"common_add": "Dodaj",
|
"common_add": "Dodaj",
|
||||||
|
|
@ -32,8 +37,82 @@
|
||||||
|
|
||||||
"dashboard_title": "Dashboard",
|
"dashboard_title": "Dashboard",
|
||||||
"dashboard_subtitle": "Przegląd zdrowia i pielęgnacji",
|
"dashboard_subtitle": "Przegląd zdrowia i pielęgnacji",
|
||||||
|
"dashboard_dailyBriefing": "Szybki rzut oka na zmiany, braki i miejsca, które warto teraz sprawdzić.",
|
||||||
"dashboard_latestSnapshot": "Ostatni stan skóry",
|
"dashboard_latestSnapshot": "Ostatni stan skóry",
|
||||||
"dashboard_recentRoutines": "Ostatnie rutyny",
|
"dashboard_recentRoutines": "Ostatnie rutyny",
|
||||||
|
"dashboard_requiresAttention": "Wymaga uwagi",
|
||||||
|
"dashboard_healthPulse": "Puls zdrowia",
|
||||||
|
"dashboard_viewSkinHistory": "Otwórz historię skóry",
|
||||||
|
"dashboard_viewLabResults": "Otwórz wyniki badań",
|
||||||
|
"dashboard_heroFreshness": "Świeżość",
|
||||||
|
"dashboard_sinceLastSnapshot": "od ostatniego wpisu",
|
||||||
|
"dashboard_skinFreshness": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "Ostatni wpis był {count} dzień temu.",
|
||||||
|
"countPlural=few": "Ostatni wpis był {count} dni temu.",
|
||||||
|
"countPlural=many": "Ostatni wpis był {count} dni temu.",
|
||||||
|
"countPlural=*": "Ostatni wpis był {count} dni temu."
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_daysAgo": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "{count} dzień temu",
|
||||||
|
"countPlural=few": "{count} dni temu",
|
||||||
|
"countPlural=many": "{count} dni temu",
|
||||||
|
"countPlural=*": "{count} dni temu"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_daysAgoShort": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "{count} d",
|
||||||
|
"countPlural=few": "{count} d",
|
||||||
|
"countPlural=many": "{count} d",
|
||||||
|
"countPlural=*": "{count} d"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_flaggedResults": "wyników oznaczonych flagą",
|
||||||
|
"dashboard_flaggedLabsCount": [
|
||||||
|
{
|
||||||
|
"declarations": ["input count", "local countPlural = count: plural"],
|
||||||
|
"selectors": ["countPlural"],
|
||||||
|
"match": {
|
||||||
|
"countPlural=one": "{count} wynik z flagą",
|
||||||
|
"countPlural=few": "{count} wyniki z flagą",
|
||||||
|
"countPlural=many": "{count} wyników z flagą",
|
||||||
|
"countPlural=*": "{count} wyników z flagą"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"dashboard_attentionSnapshot": "Dziennik skóry",
|
||||||
|
"dashboard_attentionRoutineAM": "Rutyna AM",
|
||||||
|
"dashboard_attentionRoutinePM": "Rutyna PM",
|
||||||
|
"dashboard_attentionLabs": "Przegląd badań",
|
||||||
|
"dashboard_attentionMissing": "Brak",
|
||||||
|
"dashboard_attentionToday": "Dzisiaj",
|
||||||
|
"dashboard_attentionStable": "Brak flagowanych pozycji",
|
||||||
|
"dashboard_statusLogged": "Zapisane",
|
||||||
|
"dashboard_statusOpen": "Do uzupełnienia",
|
||||||
|
"dashboard_metricHydration": "Nawodnienie",
|
||||||
|
"dashboard_metricSensitivity": "Wrażliwość",
|
||||||
|
"dashboard_metricSebumTzone": "Sebum T-zone",
|
||||||
|
"dashboard_metricSebumCheeks": "Sebum policzki",
|
||||||
|
"dashboard_metricDelta": "Zmiana {delta}",
|
||||||
|
"dashboard_averageSteps": "Śr. {count} kroków",
|
||||||
|
"dashboard_lastRoutine": "Ostatnia rutyna",
|
||||||
|
"dashboard_lastLabDate": "Ostatnie badanie",
|
||||||
|
"dashboard_noLabResults": "Brak wyników badań.",
|
||||||
"dashboard_noSnapshots": "Brak wpisów o stanie skóry.",
|
"dashboard_noSnapshots": "Brak wpisów o stanie skóry.",
|
||||||
"dashboard_noRoutines": "Brak rutyn w ciągu ostatnich 2 tygodni.",
|
"dashboard_noRoutines": "Brak rutyn w ciągu ostatnich 2 tygodni.",
|
||||||
|
|
||||||
|
|
@ -60,6 +139,15 @@
|
||||||
"products_suggestResults": "Propozycje",
|
"products_suggestResults": "Propozycje",
|
||||||
"products_suggestTime": "Pora",
|
"products_suggestTime": "Pora",
|
||||||
"products_suggestFrequency": "Częstotliwość",
|
"products_suggestFrequency": "Częstotliwość",
|
||||||
|
"products_suggestPriorityHigh": "Wysoki priorytet",
|
||||||
|
"products_suggestPriorityMedium": "Średni priorytet",
|
||||||
|
"products_suggestPriorityLow": "Niski priorytet",
|
||||||
|
"products_suggestBuyNow": "Kup teraz, bo",
|
||||||
|
"products_suggestRoutineFit": "Jak wpisuje się w rutynę",
|
||||||
|
"products_suggestBudgetSkip": "Jeśli tniesz budżet",
|
||||||
|
"products_suggestKeyIngredients": "Kluczowe składniki",
|
||||||
|
"products_suggestTargets": "Cele",
|
||||||
|
"products_suggestCautions": "Uwagi",
|
||||||
"products_suggestRegenerate": "Wygeneruj ponownie",
|
"products_suggestRegenerate": "Wygeneruj ponownie",
|
||||||
"products_suggestNoResults": "Brak propozycji.",
|
"products_suggestNoResults": "Brak propozycji.",
|
||||||
"products_noProducts": "Nie znaleziono produktów.",
|
"products_noProducts": "Nie znaleziono produktów.",
|
||||||
|
|
@ -88,8 +176,11 @@
|
||||||
"inventory_openedDate": "Data otwarcia",
|
"inventory_openedDate": "Data otwarcia",
|
||||||
"inventory_finishedDate": "Data skończenia",
|
"inventory_finishedDate": "Data skończenia",
|
||||||
"inventory_expiryDate": "Data ważności",
|
"inventory_expiryDate": "Data ważności",
|
||||||
"inventory_currentWeight": "Aktualna waga (g)",
|
"inventory_remainingLevel": "Poziom pozostałego produktu:",
|
||||||
"inventory_lastWeighed": "Ostatnie ważenie",
|
"inventory_remainingHigh": "dużo",
|
||||||
|
"inventory_remainingMedium": "średnio",
|
||||||
|
"inventory_remainingLow": "mało",
|
||||||
|
"inventory_remainingNearlyEmpty": "prawie puste",
|
||||||
"inventory_notes": "Notatki",
|
"inventory_notes": "Notatki",
|
||||||
"inventory_badgeOpen": "Otwarte",
|
"inventory_badgeOpen": "Otwarte",
|
||||||
"inventory_badgeSealed": "Zamknięte",
|
"inventory_badgeSealed": "Zamknięte",
|
||||||
|
|
@ -97,8 +188,6 @@
|
||||||
"inventory_exp": "Wazność:",
|
"inventory_exp": "Wazność:",
|
||||||
"inventory_opened": "Otwarto:",
|
"inventory_opened": "Otwarto:",
|
||||||
"inventory_finished": "Skończono:",
|
"inventory_finished": "Skończono:",
|
||||||
"inventory_remaining": "g pozostało",
|
|
||||||
"inventory_weighed": "Ważono:",
|
|
||||||
"inventory_confirmDelete": "Usunąć to opakowanie?",
|
"inventory_confirmDelete": "Usunąć to opakowanie?",
|
||||||
|
|
||||||
"routines_title": "Rutyny",
|
"routines_title": "Rutyny",
|
||||||
|
|
@ -146,6 +235,9 @@
|
||||||
"grooming_title": "Harmonogram pielęgnacji",
|
"grooming_title": "Harmonogram pielęgnacji",
|
||||||
"grooming_backToRoutines": "Rutyny",
|
"grooming_backToRoutines": "Rutyny",
|
||||||
"grooming_addEntry": "+ Dodaj wpis",
|
"grooming_addEntry": "+ Dodaj wpis",
|
||||||
|
"grooming_newTitle": "Nowy wpis pielęgnacyjny",
|
||||||
|
"grooming_newSubtitle": "Dodaj stały wpis do tygodniowego harmonogramu pielęgnacji.",
|
||||||
|
"grooming_newSectionIntro": "Ustal dzień, czynność i krótką notatkę, jeśli chcesz.",
|
||||||
"grooming_entryAdded": "Wpis dodany.",
|
"grooming_entryAdded": "Wpis dodany.",
|
||||||
"grooming_entryUpdated": "Wpis zaktualizowany.",
|
"grooming_entryUpdated": "Wpis zaktualizowany.",
|
||||||
"grooming_entryDeleted": "Wpis usunięty.",
|
"grooming_entryDeleted": "Wpis usunięty.",
|
||||||
|
|
@ -252,6 +344,8 @@
|
||||||
],
|
],
|
||||||
"medications_addNew": "+ Dodaj lek",
|
"medications_addNew": "+ Dodaj lek",
|
||||||
"medications_newTitle": "Nowy lek",
|
"medications_newTitle": "Nowy lek",
|
||||||
|
"medications_newSubtitle": "Dodaj podstawowy rekord leku lub suplementu do dalszego śledzenia.",
|
||||||
|
"medications_newSectionIntro": "Zacznij od rodzaju, nazwy i substancji czynnej.",
|
||||||
"medications_kind": "Rodzaj",
|
"medications_kind": "Rodzaj",
|
||||||
"medications_productName": "Nazwa produktu *",
|
"medications_productName": "Nazwa produktu *",
|
||||||
"medications_productNamePlaceholder": "np. Witamina D3",
|
"medications_productNamePlaceholder": "np. Witamina D3",
|
||||||
|
|
@ -293,9 +387,21 @@
|
||||||
],
|
],
|
||||||
"labResults_addNew": "+ Dodaj wynik",
|
"labResults_addNew": "+ Dodaj wynik",
|
||||||
"labResults_newTitle": "Nowy wynik badania",
|
"labResults_newTitle": "Nowy wynik badania",
|
||||||
|
"labResults_newSubtitle": "Zapisz pojedynczy wynik badania, aby dołączyć go do historii zdrowia.",
|
||||||
|
"labResults_newSectionIntro": "Najpierw podaj datę i kod LOINC, resztę możesz uzupełnić skrótowo.",
|
||||||
"labResults_flagFilter": "Flaga:",
|
"labResults_flagFilter": "Flaga:",
|
||||||
"labResults_flagAll": "Wszystkie",
|
"labResults_flagAll": "Wszystkie",
|
||||||
"labResults_flagNone": "Brak",
|
"labResults_flagNone": "Brak",
|
||||||
|
"labResults_statusAll": "Wszystkie",
|
||||||
|
"labResults_statusAbnormal": "Nieprawidłowe",
|
||||||
|
"labResults_statusNormal": "Prawidłowe",
|
||||||
|
"labResults_statusUninterpreted": "Bez interpretacji",
|
||||||
|
"labResults_activeFilters": "Aktywne filtry",
|
||||||
|
"labResults_activeFilterSearch": "Szukaj: {value}",
|
||||||
|
"labResults_activeFilterCode": "Kod: {value}",
|
||||||
|
"labResults_activeFilterFrom": "Od: {value}",
|
||||||
|
"labResults_activeFilterTo": "Do: {value}",
|
||||||
|
"labResults_activeFilterHistory": "Pełna historia",
|
||||||
"labResults_date": "Data *",
|
"labResults_date": "Data *",
|
||||||
"labResults_loincCode": "Kod LOINC *",
|
"labResults_loincCode": "Kod LOINC *",
|
||||||
"labResults_loincExample": "np. 718-7",
|
"labResults_loincExample": "np. 718-7",
|
||||||
|
|
@ -378,6 +484,8 @@
|
||||||
"skin_analyzePhotos": "Analizuj zdjęcia",
|
"skin_analyzePhotos": "Analizuj zdjęcia",
|
||||||
"skin_analyzing": "Analizuję…",
|
"skin_analyzing": "Analizuję…",
|
||||||
"skin_newSnapshotTitle": "Nowy wpis",
|
"skin_newSnapshotTitle": "Nowy wpis",
|
||||||
|
"skin_newSubtitle": "Zapisz bieżący stan skóry ręcznie lub uzupełnij pola analizą AI ze zdjęć.",
|
||||||
|
"skin_newSectionIntro": "Zacznij od daty i ogólnej oceny, a potem doprecyzuj szczegóły.",
|
||||||
"skin_date": "Data *",
|
"skin_date": "Data *",
|
||||||
"skin_overallState": "Ogólny stan",
|
"skin_overallState": "Ogólny stan",
|
||||||
"skin_texture": "Tekstura",
|
"skin_texture": "Tekstura",
|
||||||
|
|
@ -519,7 +627,6 @@
|
||||||
"productForm_isTool": "To narzędzie (np. dermaroller)",
|
"productForm_isTool": "To narzędzie (np. dermaroller)",
|
||||||
"productForm_needleLengthMm": "Długość igły (mm, tylko narzędzia)",
|
"productForm_needleLengthMm": "Długość igły (mm, tylko narzędzia)",
|
||||||
"productForm_personalNotes": "Notatki osobiste",
|
"productForm_personalNotes": "Notatki osobiste",
|
||||||
"productForm_repurchaseIntent": "Zamiar ponownego zakupu",
|
|
||||||
"productForm_toleranceNotes": "Notatki o tolerancji",
|
"productForm_toleranceNotes": "Notatki o tolerancji",
|
||||||
"productForm_toleranceNotesPlaceholder": "np. Lekkie pieczenie, ustępuje po 2 tygodniach",
|
"productForm_toleranceNotesPlaceholder": "np. Lekkie pieczenie, ustępuje po 2 tygodniach",
|
||||||
|
|
||||||
|
|
|
||||||
14
frontend/openapi-ts.config.ts
Normal file
14
frontend/openapi-ts.config.ts
Normal file
|
|
@ -0,0 +1,14 @@
|
||||||
|
import { defineConfig } from "@hey-api/openapi-ts";
|
||||||
|
|
||||||
|
export default defineConfig({
|
||||||
|
input: "./openapi.json",
|
||||||
|
output: {
|
||||||
|
path: "src/lib/api/generated",
|
||||||
|
},
|
||||||
|
plugins: [
|
||||||
|
{
|
||||||
|
name: "@hey-api/typescript",
|
||||||
|
enums: false, // union types, matching existing frontend pattern
|
||||||
|
},
|
||||||
|
],
|
||||||
|
});
|
||||||
9077
frontend/openapi.json
Normal file
9077
frontend/openapi.json
Normal file
File diff suppressed because it is too large
Load diff
|
|
@ -11,10 +11,12 @@
|
||||||
"check": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json",
|
"check": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json",
|
||||||
"check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
|
"check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
|
||||||
"lint": "eslint .",
|
"lint": "eslint .",
|
||||||
"format": "prettier --write ."
|
"format": "prettier --write .",
|
||||||
|
"generate:api": "cd ../backend && uv run python -c \"import json; from main import app; print(json.dumps(app.openapi(), indent=2))\" > ../frontend/openapi.json && cd ../frontend && openapi-ts"
|
||||||
},
|
},
|
||||||
"devDependencies": {
|
"devDependencies": {
|
||||||
"@eslint/js": "^10.0.1",
|
"@eslint/js": "^10.0.1",
|
||||||
|
"@hey-api/openapi-ts": "^0.94.0",
|
||||||
"@internationalized/date": "^3.11.0",
|
"@internationalized/date": "^3.11.0",
|
||||||
"@lucide/svelte": "^0.561.0",
|
"@lucide/svelte": "^0.561.0",
|
||||||
"@sveltejs/adapter-node": "^5.0.0",
|
"@sveltejs/adapter-node": "^5.0.0",
|
||||||
|
|
|
||||||
3156
frontend/pnpm-lock.yaml
generated
3156
frontend/pnpm-lock.yaml
generated
File diff suppressed because it is too large
Load diff
|
|
@ -153,40 +153,182 @@ body {
|
||||||
}
|
}
|
||||||
|
|
||||||
.app-mobile-header {
|
.app-mobile-header {
|
||||||
|
position: sticky;
|
||||||
|
top: 0;
|
||||||
|
z-index: 40;
|
||||||
|
display: flex;
|
||||||
|
align-items: center;
|
||||||
|
justify-content: space-between;
|
||||||
|
gap: 1rem;
|
||||||
border-bottom: 1px solid hsl(35 22% 76% / 0.7);
|
border-bottom: 1px solid hsl(35 22% 76% / 0.7);
|
||||||
background: linear-gradient(180deg, hsl(44 35% 97%), hsl(44 25% 94%));
|
background: linear-gradient(180deg, hsl(44 35% 97% / 0.92), hsl(44 25% 94% / 0.96));
|
||||||
|
backdrop-filter: blur(16px);
|
||||||
|
padding: 0.9rem 1rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-mobile-titleblock {
|
||||||
|
min-width: 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-mobile-overline,
|
||||||
|
.app-sidebar-subtitle {
|
||||||
|
margin: 0;
|
||||||
|
color: var(--muted-foreground);
|
||||||
|
font-size: 0.68rem;
|
||||||
|
font-weight: 700;
|
||||||
|
letter-spacing: 0.16em;
|
||||||
|
line-height: 1.35;
|
||||||
|
overflow-wrap: anywhere;
|
||||||
|
text-transform: uppercase;
|
||||||
}
|
}
|
||||||
|
|
||||||
.app-mobile-title,
|
.app-mobile-title,
|
||||||
.app-brand {
|
.app-brand {
|
||||||
|
display: block;
|
||||||
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
||||||
font-size: 1.2rem;
|
font-size: 1.2rem;
|
||||||
font-weight: 600;
|
font-weight: 600;
|
||||||
letter-spacing: 0.02em;
|
letter-spacing: 0.02em;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.app-mobile-toggle,
|
||||||
.app-icon-button {
|
.app-icon-button {
|
||||||
display: flex;
|
display: flex;
|
||||||
height: 2rem;
|
height: 2.6rem;
|
||||||
width: 2rem;
|
width: 2.6rem;
|
||||||
align-items: center;
|
align-items: center;
|
||||||
justify-content: center;
|
justify-content: center;
|
||||||
border: 1px solid hsl(34 21% 75%);
|
border: 1px solid hsl(34 21% 75% / 0.95);
|
||||||
border-radius: 0.45rem;
|
border-radius: 999px;
|
||||||
|
background: hsl(42 32% 95% / 0.92);
|
||||||
color: var(--muted-foreground);
|
color: var(--muted-foreground);
|
||||||
|
box-shadow: 0 10px 24px -20px hsl(220 32% 14% / 0.55);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.app-mobile-toggle:hover,
|
||||||
.app-icon-button:hover {
|
.app-icon-button:hover {
|
||||||
color: var(--foreground);
|
color: var(--foreground);
|
||||||
border-color: var(--page-accent);
|
border-color: var(--page-accent);
|
||||||
background: var(--page-accent-soft);
|
background: var(--page-accent-soft);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.app-drawer-backdrop {
|
||||||
|
position: fixed;
|
||||||
|
inset: 0;
|
||||||
|
z-index: 50;
|
||||||
|
background: hsl(220 40% 8% / 0.42);
|
||||||
|
backdrop-filter: blur(4px);
|
||||||
|
}
|
||||||
|
|
||||||
.app-sidebar {
|
.app-sidebar {
|
||||||
border-right: 1px solid hsl(36 20% 73% / 0.75);
|
border-right: 1px solid hsl(36 20% 73% / 0.75);
|
||||||
background: linear-gradient(180deg, hsl(44 34% 97%), hsl(42 28% 94%));
|
background: linear-gradient(180deg, hsl(44 34% 97%), hsl(42 28% 94%));
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.app-drawer {
|
||||||
|
position: fixed;
|
||||||
|
inset: 0 auto 0 0;
|
||||||
|
z-index: 60;
|
||||||
|
display: flex;
|
||||||
|
width: min(20rem, calc(100vw - 1.5rem));
|
||||||
|
flex-direction: column;
|
||||||
|
gap: 1rem;
|
||||||
|
overflow-y: auto;
|
||||||
|
border-right: 1px solid hsl(36 20% 73% / 0.75);
|
||||||
|
background: linear-gradient(180deg, hsl(44 35% 97%), hsl(42 29% 94%));
|
||||||
|
padding: 1.1rem 0.85rem 1rem;
|
||||||
|
box-shadow: 0 28px 56px -28px hsl(220 34% 14% / 0.42);
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-sidebar-brandblock {
|
||||||
|
margin-bottom: 0.8rem;
|
||||||
|
display: flex;
|
||||||
|
align-items: flex-start;
|
||||||
|
justify-content: space-between;
|
||||||
|
gap: 0.75rem;
|
||||||
|
padding: 0 0.8rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-sidebar-brandcopy {
|
||||||
|
min-width: 0;
|
||||||
|
max-width: 100%;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-mobile-header--menu-open {
|
||||||
|
border-bottom-color: transparent;
|
||||||
|
background: transparent;
|
||||||
|
backdrop-filter: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-nav-list {
|
||||||
|
display: flex;
|
||||||
|
flex-direction: column;
|
||||||
|
gap: 0.3rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-nav-link {
|
||||||
|
display: flex;
|
||||||
|
align-items: center;
|
||||||
|
gap: 0.75rem;
|
||||||
|
border: 1px solid transparent;
|
||||||
|
border-radius: 0.95rem;
|
||||||
|
padding: 0.8rem 0.9rem;
|
||||||
|
color: var(--muted-foreground);
|
||||||
|
font-size: 0.93rem;
|
||||||
|
text-decoration: none;
|
||||||
|
transition: border-color 140ms ease, background-color 140ms ease, color 140ms ease, transform 140ms ease;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-nav-link:hover {
|
||||||
|
transform: translateX(2px);
|
||||||
|
border-color: hsl(35 23% 76% / 0.75);
|
||||||
|
background: hsl(42 28% 92% / 0.78);
|
||||||
|
color: var(--foreground);
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-nav-link--active {
|
||||||
|
border-color: color-mix(in srgb, var(--page-accent) 45%, white);
|
||||||
|
background: color-mix(in srgb, var(--page-accent) 13%, white);
|
||||||
|
color: var(--foreground);
|
||||||
|
box-shadow: inset 0 1px 0 hsl(0 0% 100% / 0.7);
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-sidebar-footer {
|
||||||
|
margin-top: auto;
|
||||||
|
padding: 0.8rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.language-switcher {
|
||||||
|
display: inline-flex;
|
||||||
|
width: 100%;
|
||||||
|
border: 1px solid hsl(35 22% 75% / 0.82);
|
||||||
|
border-radius: 999px;
|
||||||
|
background: hsl(42 30% 93% / 0.9);
|
||||||
|
padding: 0.2rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.language-switcher__button {
|
||||||
|
flex: 1;
|
||||||
|
border-radius: 999px;
|
||||||
|
padding: 0.45rem 0.7rem;
|
||||||
|
color: var(--muted-foreground);
|
||||||
|
font-size: 0.72rem;
|
||||||
|
font-weight: 700;
|
||||||
|
letter-spacing: 0.16em;
|
||||||
|
text-transform: uppercase;
|
||||||
|
transition: background-color 140ms ease, color 140ms ease, box-shadow 140ms ease;
|
||||||
|
}
|
||||||
|
|
||||||
|
.language-switcher__button:hover {
|
||||||
|
color: var(--foreground);
|
||||||
|
}
|
||||||
|
|
||||||
|
.language-switcher__button--active {
|
||||||
|
background: color-mix(in srgb, var(--page-accent) 14%, white);
|
||||||
|
color: var(--foreground);
|
||||||
|
box-shadow: inset 0 1px 0 hsl(0 0% 100% / 0.74);
|
||||||
|
}
|
||||||
|
|
||||||
.app-sidebar a {
|
.app-sidebar a {
|
||||||
border: 1px solid transparent;
|
border: 1px solid transparent;
|
||||||
}
|
}
|
||||||
|
|
@ -212,6 +354,7 @@ body {
|
||||||
width: min(1160px, 100%);
|
width: min(1160px, 100%);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.app-main h1,
|
||||||
.app-main h2 {
|
.app-main h2 {
|
||||||
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
||||||
font-size: clamp(1.9rem, 3.3vw, 2.7rem);
|
font-size: clamp(1.9rem, 3.3vw, 2.7rem);
|
||||||
|
|
@ -241,6 +384,19 @@ body {
|
||||||
color: var(--foreground);
|
color: var(--foreground);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.page-header-meta,
|
||||||
|
.page-header-foot {
|
||||||
|
grid-column: 1 / -1;
|
||||||
|
}
|
||||||
|
|
||||||
|
.page-header-meta {
|
||||||
|
margin-top: 0.65rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.page-header-foot {
|
||||||
|
margin-top: 1rem;
|
||||||
|
}
|
||||||
|
|
||||||
.editorial-toolbar {
|
.editorial-toolbar {
|
||||||
margin-top: 0.9rem;
|
margin-top: 0.9rem;
|
||||||
display: flex;
|
display: flex;
|
||||||
|
|
@ -257,6 +413,23 @@ body {
|
||||||
margin-bottom: 0.65rem;
|
margin-bottom: 0.65rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.lab-results-filter-chip {
|
||||||
|
display: inline-flex;
|
||||||
|
align-items: center;
|
||||||
|
gap: 0.35rem;
|
||||||
|
border: 1px solid color-mix(in srgb, var(--page-accent) 30%, var(--border));
|
||||||
|
border-radius: 999px;
|
||||||
|
background: color-mix(in srgb, var(--page-accent) 10%, white);
|
||||||
|
padding: 0.35rem 0.65rem;
|
||||||
|
color: var(--foreground);
|
||||||
|
font-size: 0.78rem;
|
||||||
|
text-decoration: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
.lab-results-filter-chip:hover {
|
||||||
|
background: color-mix(in srgb, var(--page-accent) 16%, white);
|
||||||
|
}
|
||||||
|
|
||||||
.editorial-alert {
|
.editorial-alert {
|
||||||
border-radius: 0.7rem;
|
border-radius: 0.7rem;
|
||||||
border: 1px solid hsl(34 25% 75% / 0.8);
|
border: 1px solid hsl(34 25% 75% / 0.8);
|
||||||
|
|
@ -289,7 +462,7 @@ body {
|
||||||
color: hsl(207 78% 28%);
|
color: hsl(207 78% 28%);
|
||||||
}
|
}
|
||||||
|
|
||||||
.products-table-shell {
|
.editorial-table-shell {
|
||||||
border: 1px solid hsl(35 24% 74% / 0.85);
|
border: 1px solid hsl(35 24% 74% / 0.85);
|
||||||
border-radius: 0.9rem;
|
border-radius: 0.9rem;
|
||||||
overflow: hidden;
|
overflow: hidden;
|
||||||
|
|
@ -299,14 +472,14 @@ body {
|
||||||
background: color-mix(in srgb, var(--page-accent) 10%, white);
|
background: color-mix(in srgb, var(--page-accent) 10%, white);
|
||||||
}
|
}
|
||||||
|
|
||||||
.products-mobile-card {
|
.editorial-mobile-card {
|
||||||
display: block;
|
display: block;
|
||||||
border: 1px solid hsl(35 21% 76% / 0.85);
|
border: 1px solid hsl(35 21% 76% / 0.85);
|
||||||
border-radius: 0.8rem;
|
border-radius: 0.8rem;
|
||||||
padding: 0.95rem;
|
padding: 0.95rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
.products-section-title {
|
.editorial-section-title {
|
||||||
border-bottom: 1px dashed color-mix(in srgb, var(--page-accent) 35%, var(--border));
|
border-bottom: 1px dashed color-mix(in srgb, var(--page-accent) 35%, var(--border));
|
||||||
padding-bottom: 0.3rem;
|
padding-bottom: 0.3rem;
|
||||||
padding-top: 0.5rem;
|
padding-top: 0.5rem;
|
||||||
|
|
@ -317,11 +490,7 @@ body {
|
||||||
text-transform: uppercase;
|
text-transform: uppercase;
|
||||||
}
|
}
|
||||||
|
|
||||||
.products-sticky-actions {
|
.editorial-meta-strip {
|
||||||
border-color: color-mix(in srgb, var(--page-accent) 25%, var(--border));
|
|
||||||
}
|
|
||||||
|
|
||||||
.products-meta-strip {
|
|
||||||
display: flex;
|
display: flex;
|
||||||
flex-wrap: wrap;
|
flex-wrap: wrap;
|
||||||
align-items: center;
|
align-items: center;
|
||||||
|
|
@ -488,7 +657,7 @@ body {
|
||||||
font-feature-settings: 'tnum';
|
font-feature-settings: 'tnum';
|
||||||
}
|
}
|
||||||
|
|
||||||
.lab-results-mobile-grid .products-section-title {
|
.lab-results-mobile-grid .editorial-section-title {
|
||||||
margin-top: 0.15rem;
|
margin-top: 0.15rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
@ -525,6 +694,20 @@ body {
|
||||||
}
|
}
|
||||||
|
|
||||||
@media (min-width: 768px) {
|
@media (min-width: 768px) {
|
||||||
|
.app-mobile-header,
|
||||||
|
.app-drawer,
|
||||||
|
.app-drawer-backdrop {
|
||||||
|
display: none !important;
|
||||||
|
}
|
||||||
|
|
||||||
|
.app-sidebar {
|
||||||
|
position: sticky;
|
||||||
|
top: 0;
|
||||||
|
align-self: flex-start;
|
||||||
|
height: 100vh;
|
||||||
|
overflow-y: auto;
|
||||||
|
}
|
||||||
|
|
||||||
.app-shell {
|
.app-shell {
|
||||||
flex-direction: row;
|
flex-direction: row;
|
||||||
}
|
}
|
||||||
|
|
@ -556,6 +739,10 @@ body {
|
||||||
grid-area: subtitle;
|
grid-area: subtitle;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-stat-strip {
|
||||||
|
grid-column: 1 / -1;
|
||||||
|
}
|
||||||
|
|
||||||
.editorial-toolbar {
|
.editorial-toolbar {
|
||||||
grid-area: actions;
|
grid-area: actions;
|
||||||
margin-top: 0;
|
margin-top: 0;
|
||||||
|
|
@ -661,6 +848,122 @@ body {
|
||||||
font-weight: 600;
|
font-weight: 600;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-stat-strip {
|
||||||
|
margin-top: 1.8rem;
|
||||||
|
display: grid;
|
||||||
|
gap: 0.65rem;
|
||||||
|
grid-template-columns: repeat(4, minmax(0, 1fr));
|
||||||
|
border-top: 1px dashed color-mix(in srgb, var(--page-accent) 24%, var(--editorial-line));
|
||||||
|
padding-top: 1rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-stat-card {
|
||||||
|
border: 1px solid hsl(36 18% 77% / 0.62);
|
||||||
|
border-radius: 0.85rem;
|
||||||
|
background: linear-gradient(180deg, hsl(44 32% 96% / 0.74), hsl(45 24% 93% / 0.66));
|
||||||
|
padding: 0.72rem 0.78rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-stat-label,
|
||||||
|
.dashboard-metric-label,
|
||||||
|
.dashboard-featured-label,
|
||||||
|
.dashboard-health-subline,
|
||||||
|
.dashboard-attention-label {
|
||||||
|
margin: 0;
|
||||||
|
color: var(--editorial-muted);
|
||||||
|
font-size: 0.73rem;
|
||||||
|
font-weight: 700;
|
||||||
|
letter-spacing: 0.12em;
|
||||||
|
text-transform: uppercase;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-stat-value,
|
||||||
|
.dashboard-metric-value,
|
||||||
|
.dashboard-featured-value {
|
||||||
|
margin: 0.28rem 0 0;
|
||||||
|
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
||||||
|
font-size: 1.28rem;
|
||||||
|
font-weight: 600;
|
||||||
|
line-height: 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-stat-detail,
|
||||||
|
.dashboard-featured-meta,
|
||||||
|
.dashboard-featured-notes,
|
||||||
|
.dashboard-health-value,
|
||||||
|
.dashboard-attention-value,
|
||||||
|
.dashboard-panel-note,
|
||||||
|
.dashboard-metric-trend {
|
||||||
|
margin: 0.28rem 0 0;
|
||||||
|
color: var(--editorial-muted);
|
||||||
|
font-size: 0.8rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-panel {
|
||||||
|
position: relative;
|
||||||
|
z-index: 1;
|
||||||
|
margin-bottom: 1rem;
|
||||||
|
border: 1px solid hsl(36 25% 74% / 0.8);
|
||||||
|
border-radius: 1.2rem;
|
||||||
|
background: linear-gradient(180deg, hsl(44 40% 95% / 0.92), hsl(42 32% 93% / 0.94));
|
||||||
|
box-shadow:
|
||||||
|
0 20px 40px -34px hsl(219 32% 14% / 0.38),
|
||||||
|
inset 0 1px 0 hsl(0 0% 100% / 0.7);
|
||||||
|
padding: 0.9rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-header {
|
||||||
|
display: flex;
|
||||||
|
align-items: baseline;
|
||||||
|
gap: 0.75rem;
|
||||||
|
margin-bottom: 0.85rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-header h3 {
|
||||||
|
margin: 0;
|
||||||
|
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
||||||
|
font-size: clamp(1.3rem, 2.3vw, 1.6rem);
|
||||||
|
font-weight: 600;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-list {
|
||||||
|
display: grid;
|
||||||
|
gap: 0.6rem;
|
||||||
|
grid-template-columns: repeat(4, minmax(0, 1fr));
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-item {
|
||||||
|
display: flex;
|
||||||
|
min-height: 4.35rem;
|
||||||
|
flex-direction: column;
|
||||||
|
justify-content: space-between;
|
||||||
|
border: 1px solid hsl(36 22% 74% / 0.75);
|
||||||
|
border-radius: 0.9rem;
|
||||||
|
padding: 0.72rem 0.8rem;
|
||||||
|
text-decoration: none;
|
||||||
|
color: inherit;
|
||||||
|
transition: transform 140ms ease, border-color 140ms ease, background-color 140ms ease;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-item:hover {
|
||||||
|
transform: translateY(-1px);
|
||||||
|
border-color: color-mix(in srgb, var(--page-accent) 42%, var(--border));
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-item--alert {
|
||||||
|
background: linear-gradient(180deg, hsl(27 76% 95%), hsl(18 60% 91%));
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-item--calm {
|
||||||
|
background: linear-gradient(180deg, hsl(130 24% 95%), hsl(136 26% 91%));
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-attention-value {
|
||||||
|
color: var(--foreground);
|
||||||
|
font-size: 0.88rem;
|
||||||
|
font-weight: 600;
|
||||||
|
}
|
||||||
|
|
||||||
.editorial-grid {
|
.editorial-grid {
|
||||||
position: relative;
|
position: relative;
|
||||||
z-index: 1;
|
z-index: 1;
|
||||||
|
|
@ -671,11 +974,16 @@ body {
|
||||||
|
|
||||||
.editorial-panel {
|
.editorial-panel {
|
||||||
border-radius: 1.2rem;
|
border-radius: 1.2rem;
|
||||||
padding: 1rem;
|
padding: 0.9rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-grid {
|
||||||
|
grid-template-columns: minmax(0, 1.08fr) minmax(0, 1fr) minmax(0, 0.92fr);
|
||||||
|
align-items: start;
|
||||||
}
|
}
|
||||||
|
|
||||||
.panel-header {
|
.panel-header {
|
||||||
margin-bottom: 0.9rem;
|
margin-bottom: 0.75rem;
|
||||||
display: flex;
|
display: flex;
|
||||||
align-items: baseline;
|
align-items: baseline;
|
||||||
justify-content: space-between;
|
justify-content: space-between;
|
||||||
|
|
@ -687,7 +995,7 @@ body {
|
||||||
.panel-header h3 {
|
.panel-header h3 {
|
||||||
margin: 0;
|
margin: 0;
|
||||||
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
font-family: 'Cormorant Infant', 'Times New Roman', serif;
|
||||||
font-size: clamp(1.35rem, 2.4vw, 1.7rem);
|
font-size: clamp(1.22rem, 2.1vw, 1.56rem);
|
||||||
font-weight: 600;
|
font-weight: 600;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
@ -714,7 +1022,7 @@ body {
|
||||||
|
|
||||||
.snapshot-date {
|
.snapshot-date {
|
||||||
color: var(--editorial-muted);
|
color: var(--editorial-muted);
|
||||||
font-size: 0.9rem;
|
font-size: 0.84rem;
|
||||||
font-weight: 600;
|
font-weight: 600;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
@ -786,33 +1094,51 @@ body {
|
||||||
list-style: none;
|
list-style: none;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-metric-row {
|
||||||
|
margin-top: 0.85rem;
|
||||||
|
display: grid;
|
||||||
|
gap: 0.55rem;
|
||||||
|
grid-template-columns: repeat(2, minmax(0, 1fr));
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-metric-card {
|
||||||
|
border: 1px solid hsl(36 23% 74% / 0.76);
|
||||||
|
border-radius: 0.8rem;
|
||||||
|
background: hsl(42 36% 93% / 0.7);
|
||||||
|
padding: 0.65rem 0.72rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-metric-value {
|
||||||
|
font-size: 1.08rem;
|
||||||
|
}
|
||||||
|
|
||||||
.routine-summary-strip {
|
.routine-summary-strip {
|
||||||
margin-bottom: 0.7rem;
|
margin-bottom: 0.6rem;
|
||||||
display: flex;
|
display: flex;
|
||||||
flex-wrap: wrap;
|
flex-wrap: wrap;
|
||||||
align-items: center;
|
align-items: center;
|
||||||
gap: 0.4rem;
|
gap: 0.35rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
.routine-summary-chip {
|
.routine-summary-chip {
|
||||||
border: 1px solid hsl(35 24% 71% / 0.85);
|
border: 1px solid hsl(35 24% 71% / 0.85);
|
||||||
border-radius: 999px;
|
border-radius: 999px;
|
||||||
padding: 0.22rem 0.62rem;
|
padding: 0.16rem 0.5rem;
|
||||||
color: var(--editorial-muted);
|
color: var(--editorial-muted);
|
||||||
font-size: 0.74rem;
|
font-size: 0.68rem;
|
||||||
font-weight: 700;
|
font-weight: 700;
|
||||||
letter-spacing: 0.08em;
|
letter-spacing: 0.06em;
|
||||||
}
|
}
|
||||||
|
|
||||||
.panel-action-link,
|
.panel-action-link,
|
||||||
.routine-summary-link {
|
.routine-summary-link {
|
||||||
border: 1px solid color-mix(in srgb, var(--page-accent) 38%, var(--editorial-line));
|
border: 1px solid color-mix(in srgb, var(--page-accent) 38%, var(--editorial-line));
|
||||||
border-radius: 999px;
|
border-radius: 999px;
|
||||||
padding: 0.24rem 0.64rem;
|
padding: 0.2rem 0.56rem;
|
||||||
color: var(--page-accent);
|
color: var(--page-accent);
|
||||||
font-size: 0.76rem;
|
font-size: 0.68rem;
|
||||||
font-weight: 700;
|
font-weight: 700;
|
||||||
letter-spacing: 0.08em;
|
letter-spacing: 0.06em;
|
||||||
text-decoration: none;
|
text-decoration: none;
|
||||||
text-transform: uppercase;
|
text-transform: uppercase;
|
||||||
}
|
}
|
||||||
|
|
@ -826,6 +1152,32 @@ body {
|
||||||
background: var(--page-accent-soft);
|
background: var(--page-accent-soft);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-featured-routine {
|
||||||
|
display: block;
|
||||||
|
margin-bottom: 0.7rem;
|
||||||
|
border: 1px solid hsl(36 24% 73% / 0.82);
|
||||||
|
border-radius: 0.88rem;
|
||||||
|
background: linear-gradient(155deg, color-mix(in srgb, var(--page-accent) 5%, white), hsl(44 30% 94%));
|
||||||
|
padding: 0.78rem 0.84rem;
|
||||||
|
text-decoration: none;
|
||||||
|
color: inherit;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-featured-routine:hover {
|
||||||
|
border-color: color-mix(in srgb, var(--page-accent) 44%, var(--border));
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-featured-routine-topline {
|
||||||
|
display: flex;
|
||||||
|
align-items: center;
|
||||||
|
justify-content: space-between;
|
||||||
|
gap: 0.75rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-featured-notes {
|
||||||
|
line-height: 1.4;
|
||||||
|
}
|
||||||
|
|
||||||
.routine-item + .routine-item {
|
.routine-item + .routine-item {
|
||||||
border-top: 1px dashed hsl(36 26% 72% / 0.7);
|
border-top: 1px dashed hsl(36 26% 72% / 0.7);
|
||||||
}
|
}
|
||||||
|
|
@ -834,7 +1186,7 @@ body {
|
||||||
display: flex;
|
display: flex;
|
||||||
align-items: center;
|
align-items: center;
|
||||||
gap: 0.6rem;
|
gap: 0.6rem;
|
||||||
padding: 0.78rem 0;
|
padding: 0.68rem 0;
|
||||||
text-decoration: none;
|
text-decoration: none;
|
||||||
color: inherit;
|
color: inherit;
|
||||||
transition: transform 140ms ease, color 160ms ease;
|
transition: transform 140ms ease, color 160ms ease;
|
||||||
|
|
@ -858,9 +1210,9 @@ body {
|
||||||
.routine-meta {
|
.routine-meta {
|
||||||
display: flex;
|
display: flex;
|
||||||
flex-wrap: wrap;
|
flex-wrap: wrap;
|
||||||
gap: 0.75rem;
|
gap: 0.6rem;
|
||||||
color: var(--editorial-muted);
|
color: var(--editorial-muted);
|
||||||
font-size: 0.8rem;
|
font-size: 0.76rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
.routine-note-inline {
|
.routine-note-inline {
|
||||||
|
|
@ -882,7 +1234,7 @@ body {
|
||||||
}
|
}
|
||||||
|
|
||||||
.routine-date {
|
.routine-date {
|
||||||
font-size: 0.93rem;
|
font-size: 0.88rem;
|
||||||
font-weight: 600;
|
font-weight: 600;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
@ -909,6 +1261,56 @@ body {
|
||||||
display: flex;
|
display: flex;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-health-meta-strip {
|
||||||
|
margin-bottom: 0.65rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-list {
|
||||||
|
display: flex;
|
||||||
|
flex-direction: column;
|
||||||
|
gap: 0.55rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-item {
|
||||||
|
display: flex;
|
||||||
|
align-items: flex-start;
|
||||||
|
justify-content: space-between;
|
||||||
|
gap: 0.65rem;
|
||||||
|
border: 1px solid hsl(36 22% 75% / 0.78);
|
||||||
|
border-radius: 0.85rem;
|
||||||
|
background: hsl(44 34% 95% / 0.75);
|
||||||
|
padding: 0.68rem 0.78rem;
|
||||||
|
text-decoration: none;
|
||||||
|
color: inherit;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-item:hover {
|
||||||
|
border-color: color-mix(in srgb, var(--page-accent) 36%, var(--border));
|
||||||
|
background: var(--page-accent-soft);
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-test {
|
||||||
|
margin: 0;
|
||||||
|
font-size: 0.88rem;
|
||||||
|
font-weight: 600;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-value-wrap {
|
||||||
|
display: flex;
|
||||||
|
min-width: 0;
|
||||||
|
flex-direction: column;
|
||||||
|
align-items: flex-end;
|
||||||
|
gap: 0.35rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-value {
|
||||||
|
margin-top: 0;
|
||||||
|
color: var(--foreground);
|
||||||
|
font-size: 0.8rem;
|
||||||
|
font-weight: 600;
|
||||||
|
text-align: right;
|
||||||
|
}
|
||||||
|
|
||||||
.reveal-1,
|
.reveal-1,
|
||||||
.reveal-2,
|
.reveal-2,
|
||||||
.reveal-3 {
|
.reveal-3 {
|
||||||
|
|
@ -933,12 +1335,23 @@ body {
|
||||||
}
|
}
|
||||||
|
|
||||||
@media (max-width: 1024px) {
|
@media (max-width: 1024px) {
|
||||||
|
.dashboard-stat-strip,
|
||||||
|
.dashboard-attention-list,
|
||||||
.editorial-grid {
|
.editorial-grid {
|
||||||
grid-template-columns: minmax(0, 1fr);
|
grid-template-columns: minmax(0, 1fr);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-grid {
|
||||||
|
grid-template-columns: minmax(0, 1fr);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@media (max-width: 640px) {
|
@media (max-width: 640px) {
|
||||||
|
.dashboard-stat-strip {
|
||||||
|
margin-top: 1.35rem;
|
||||||
|
padding-top: 0.8rem;
|
||||||
|
}
|
||||||
|
|
||||||
.editorial-title {
|
.editorial-title {
|
||||||
font-size: 2.05rem;
|
font-size: 2.05rem;
|
||||||
}
|
}
|
||||||
|
|
@ -951,6 +1364,21 @@ body {
|
||||||
font-size: 1.4rem;
|
font-size: 1.4rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.dashboard-metric-row {
|
||||||
|
grid-template-columns: minmax(0, 1fr);
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-item,
|
||||||
|
.dashboard-featured-routine-topline,
|
||||||
|
.dashboard-attention-item {
|
||||||
|
align-items: flex-start;
|
||||||
|
flex-direction: column;
|
||||||
|
}
|
||||||
|
|
||||||
|
.dashboard-health-value-wrap {
|
||||||
|
align-items: flex-start;
|
||||||
|
}
|
||||||
|
|
||||||
.state-pill,
|
.state-pill,
|
||||||
.routine-pill {
|
.routine-pill {
|
||||||
letter-spacing: 0.08em;
|
letter-spacing: 0.08em;
|
||||||
|
|
|
||||||
Some files were not shown because too many files have changed in this diff Show more
Loading…
Add table
Add a link
Reference in a new issue