v1.2.0 (February 2026, latest)
BYOM, Security Hardening, Rate Limiting & Production Readiness
- Bring Your Own Model (BYOM): organizations can choose their LLM (GPT-4o, Claude, DeepSeek, Llama 3, or 100+ providers via LiteLLM)
- Per-org LLM configuration from the Settings page, with preset selection and custom model support
- Unified authentication: all API routes protected by session-cookie or API-key auth
- Redis-backed rate limiting on all abuse-prone endpoints (auth, webhooks, contact forms)
- Pydantic schema validation on all API inputs, including webhook payloads
- CSRF protection on all OAuth/OIDC flows, with Redis-backed state tokens
- Multi-stage Docker builds with non-root containers for all services
- Health checks on backend, worker, and frontend services in production
- HSTS headers enforced in production (max-age=63072000; includeSubDomains; preload)
- React Error Boundary for graceful frontend crash recovery
- Global API request timeouts (30 s) and automatic handling of 401 session expiry
- Safe production environment configuration: no localhost fallbacks in prod builds
- pip-audit and npm audit integrated into the CI pipeline
- Dependabot configured for the pip, npm, Docker, and GitHub Actions ecosystems
- Dependency versions pinned with upper bounds to prevent breaking upgrades
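The per-org BYOM resolution described above might be sketched as follows; the preset names, settings shape, and `resolve_model` helper are illustrative assumptions, not the product's actual schema.

```python
# Hypothetical sketch of per-org model resolution for BYOM.
# Preset keys and the org-settings dict shape are assumptions.
PRESETS = {
    "openai": "gpt-4o",
    "anthropic": "claude-3-5-sonnet-20241022",
    "deepseek": "deepseek/deepseek-chat",
    "llama3": "ollama/llama3",
}
DEFAULT_MODEL = "gpt-4o"

def resolve_model(org_settings: dict) -> str:
    """A custom model string wins; otherwise fall back to a named preset."""
    if org_settings.get("custom_model"):
        return org_settings["custom_model"]
    return PRESETS.get(org_settings.get("preset", ""), DEFAULT_MODEL)
```

The resolved string would then be passed as the `model` argument to `litellm.completion(...)`, which routes the request to the matching provider.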
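The Redis-backed rate limiting above typically follows the fixed-window INCR/EXPIRE pattern; a minimal sketch of that logic is shown below, with an in-memory dictionary standing in for Redis and all names illustrative.

```python
import time

# In-memory stand-in for Redis; a real deployment would use
# redis.incr(key) plus redis.expire(key, window) instead.
_counters: dict[str, tuple[int, float]] = {}

def allow_request(key: str, limit: int = 10, window: float = 60.0) -> bool:
    """Fixed-window limiter: at most `limit` hits per `window` seconds per key."""
    now = time.monotonic()
    count, expires_at = _counters.get(key, (0, now + window))
    if now >= expires_at:           # window elapsed: start a fresh window
        count, expires_at = 0, now + window
    if count >= limit:              # over budget: reject this request
        return False
    _counters[key] = (count + 1, expires_at)
    return True
```

Keys would usually combine the endpoint and a client identifier (IP or org ID), e.g. `"login:1.2.3.4"`, so one abusive client cannot exhaust another's budget.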
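The Redis-backed CSRF state tokens for OAuth/OIDC flows can be sketched as a single-use, short-TTL token store; again the dictionary below stands in for Redis (`SETEX`-style storage), and both helper names are assumptions.

```python
import secrets
import time

# In-memory stand-in for Redis; production would store the token
# server-side with a short TTL, keyed by the OAuth `state` parameter.
_state_store: dict[str, float] = {}

def issue_state(ttl: float = 300.0) -> str:
    """Generate an unguessable OAuth `state` token and record its expiry."""
    token = secrets.token_urlsafe(32)
    _state_store[token] = time.monotonic() + ttl
    return token

def consume_state(token: str) -> bool:
    """Valid only once and only before expiry (mitigates CSRF and replay)."""
    expires_at = _state_store.pop(token, None)
    return expires_at is not None and time.monotonic() < expires_at
```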
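Enforcing HSTS only in production, as noted above, amounts to injecting one response header; a minimal middleware-style sketch (function name and signature are illustrative):

```python
# Hypothetical sketch: add the HSTS header to every production response.
# The header value matches the one stated in the release notes.
HSTS_VALUE = "max-age=63072000; includeSubDomains; preload"

def add_security_headers(headers: dict, is_production: bool = True) -> dict:
    """Set Strict-Transport-Security on production responses only."""
    if is_production:
        headers["Strict-Transport-Security"] = HSTS_VALUE
    return headers
```

Skipping the header outside production avoids browsers pinning HTTPS for local development hosts.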