Warning
Phlox is an experimental project. For full details on limitations and risks, please read the Usage Warning section carefully before proceeding.
Phlox is an open-source patient management system integrating AI-powered medical transcription, clinical note generation, and an AI chatbot interface. It is designed to run entirely on your own hardware, using local models for both inference and transcription.
- 🔒 100% Local & Private: Runs entirely on your machine with no third-party services - all data stays local. Free and open source forever; the security of your data is in your hands.
- 🖥️ Desktop App: Native Apple Silicon application with bundled LLM and transcription servers - no external dependencies required.
- 🎤 AI Medical Transcription & Summarization: Convert patient encounters to structured clinical notes using customizable templates.
- 📝 Flexible Template System: Structure clinical notes to your preferences, with versioning and automated template generation from example notes.
- ✅ Task Manager: Parse clinical plans into actionable task lists with AI-generated summaries.
- ✉️ Correspondence Generation: One-click generation of patient letters based on clinical notes.
- 🤖 AI-chat/RAG: Reference tool to query medical guidelines, literature, and documentation backed by a local knowledge base (ChromaDB).
- 💡 Adaptive Refinement: Outputs improve the more you use it; Phlox learns from your previous notes.
- 📰 Dashboard with RSS Reader: Stay updated with LLM-summarized articles from medical RSS feeds.
- Frontend: Chakra UI (React/Vite)
- Backend: FastAPI (Python/uv)
- Database: SQLCipher
- Desktop Wrapper: Tauri (Rust)
- LLM Backend: Ollama, OpenAI-compatible endpoints, or bundled llama.cpp server
- Transcription: Whisper-compatible endpoints or bundled whisper.cpp server
- RAG: ChromaDB
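The LLM backend above speaks the OpenAI chat-completions wire format (Ollama also exposes an OpenAI-compatible `/v1` API). As a rough illustration of what such a backend consumes, here is a minimal sketch of a request payload; the model name and prompts are hypothetical, not Phlox's actual configuration:

```python
# Hypothetical sketch: the payload an OpenAI-compatible
# /v1/chat/completions endpoint expects. Model name and prompts
# are illustrative only, not Phlox's actual configuration.

def build_chat_request(transcript: str, model: str = "llama3.1") -> dict:
    """Build an OpenAI-compatible chat-completions payload."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Summarise this encounter as a structured clinical note."},
            {"role": "user", "content": transcript},
        ],
        "stream": False,
    }

payload = build_chat_request("Patient presents with chest pain ...")
```

Any server that accepts this shape (Ollama, llama.cpp's server, or a hosted endpoint) can slot in as the LLM backend.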
Pre-built Apple Silicon binaries are available from GitHub Releases.
Note: The desktop app provides transcription and correspondence features only. For extended reference tools (Chat, RAG, PDF Upload), use the Docker/Podman deployment below.
- Prerequisites: Podman/Docker, Ollama/OpenAI-compatible endpoint, Whisper endpoint.
- Hardware Requirements: For reasonable performance, a GPU (CUDA, ROCm) or Apple M-Series chip is strongly recommended. Without these, especially with larger models, the system will run extremely slowly.
- Clone: `git clone https://github.com/bloodworks-io/phlox.git && cd phlox`
- Build: `docker build -t phlox:latest .`
- Environment: Create `.env` in `phlox/` (see example in documentation).
- Run: `docker-compose up` (production) or `docker-compose -f docker-compose.dev.yml up` (development).
- Access: http://localhost:5000
For detailed setup, feature explanations, and important warnings, please see the Documentation.
The complete Phlox experience with all features:
- Medical transcription and clinical notes
- Correspondence generation
- AI Chat interface
- RAG/document knowledge base
- Dashboard with RSS reader
Native desktop application for Apple Silicon:
- Medical transcription and clinical notes
- Correspondence generation
- Bundled llama.cpp and whisper.cpp servers - no external dependencies
- All data stored locally. Nothing leaves your machine.
Additional platforms and full feature parity coming in future releases.
Here's the Phlox roadmap (some items below, such as OpenAI-compatible endpoints and the Tauri desktop app, have already shipped):
- Use structured JSON outputs for managing LLM responses
- Add support for OpenAI-compatible endpoints
- Tauri desktop app with local inference (llama.cpp + whisper bundled)
- MCP server support for custom tools and agentic workflows
- Advanced template version control
- Meeting and multi-disciplinary meeting scribing
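The structured-JSON roadmap item above generally means constraining the model's reply to a known shape and validating it before use. A minimal sketch of the validation side, with hypothetical field names rather than Phlox's actual note schema:

```python
import json

# Illustrative schema fields; NOT Phlox's actual note structure.
NOTE_FIELDS = {"summary", "plan", "tasks"}

def parse_note_response(raw: str) -> dict:
    """Parse an LLM reply and check it carries the expected fields."""
    note = json.loads(raw)
    missing = NOTE_FIELDS - note.keys()
    if missing:
        raise ValueError(f"LLM reply missing fields: {sorted(missing)}")
    return note

reply = '{"summary": "Stable angina.", "plan": "Start aspirin.", "tasks": ["Order ECG"]}'
note = parse_note_response(reply)
```

Rejecting malformed replies early keeps downstream features like the task manager from acting on incomplete model output.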
Phlox is an experimental project intended for educational and personal use. It is not a certified medical device and should NOT be used for clinical decision-making.
Phlox is not suitable for production deployment in the form provided in this repo. If you intend to use it in a clinical setting, you are responsible for ensuring compliance with applicable local regulations (HIPAA, GDPR, TGA, etc.).
AI outputs can be unreliable. Always verify AI-generated content and use professional clinical judgment. The application displays a disclaimer on startup with full details.
Security note: The Docker deployment binds to 0.0.0.0 by default and has no authentication. You MUST place it behind a reverse proxy with an authentication layer like Authelia. The desktop app requires a passphrase to unlock the encrypted database.
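One way to satisfy that requirement is a Caddy front end that delegates authentication to Authelia. A minimal Caddyfile sketch, assuming both services run on the same Docker network; the hostnames and forward-auth path are examples you will need to adapt to your own deployment and Authelia version:

```
phlox.example.com {
    # Every request must pass Authelia before reaching Phlox.
    forward_auth authelia:9091 {
        uri /api/authz/forward-auth
        copy_headers Remote-User Remote-Groups
    }
    reverse_proxy phlox:5000
}
```

Whatever proxy you choose, ensure the Phlox port itself is not exposed directly to the network.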

