MedScribe AI
Local-first AI clinical documentation — speech to structured FHIR notes
About This Project
MedScribe AI is a privacy-first healthcare platform that turns doctor-patient conversations into structured clinical notes using local AI. It transcribes audio with Whisper (running on-premises), generates SOAP/clinical notes with a local LLM via Ollama, and exports in FHIR R4, HL7 v2, and KITH XML formats — all without a single byte of patient data leaving the hospital.
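Conceptually, the note-generation and export stages can be sketched as below. Everything in this sketch is illustrative: the function names, the stub SOAP logic, and the trimmed-down Composition structure are assumptions for exposition, not MedScribe's actual API.

```python
# Illustrative sketch of the MedScribe pipeline stages — function names
# and structures are hypothetical, not the project's real code.

def generate_soap_note(transcript: str) -> dict:
    """Shape a transcript into the four SOAP sections (stub logic)."""
    return {
        "subjective": transcript,  # patient-reported history
        "objective": "",           # exam findings, vitals
        "assessment": "",          # clinician's working diagnosis
        "plan": "",                # treatment / follow-up
    }

def to_fhir_document(note: dict, patient_id: str) -> dict:
    """Wrap a note in a minimal, FHIR R4 Composition-like structure."""
    return {
        "resourceType": "Composition",
        "status": "preliminary",
        "subject": {"reference": f"Patient/{patient_id}"},
        "section": [
            {"title": title.capitalize(), "text": {"div": body}}
            for title, body in note.items()
        ],
    }

note = generate_soap_note("Patient reports intermittent chest pain.")
doc = to_fhir_document(note, "123")
print(doc["resourceType"], len(doc["section"]))  # Composition 4
```

A real FHIR Composition carries more required elements (type, date, author, title); the sketch keeps only enough to show the shape of the export step.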
Project Structure
```
MedScribe-AI/
├── medscribe/              # Python backend (FastAPI)
│   ├── api/                # Route handlers (35+ endpoints)
│   ├── agents/             # AI agents (diagnosis, referral, etc.)
│   ├── models/             # SQLAlchemy database models
│   ├── schemas/            # Pydantic request/response schemas
│   ├── services/           # Business logic (transcription, notes, RAG)
│   │   ├── whisper.py      # Speech-to-text with faster-whisper
│   │   ├── ollama.py       # Local LLM integration
│   │   ├── fhir.py         # FHIR R4 export
│   │   └── safety.py       # Hallucination detection
│   └── main.py             # FastAPI app + startup
├── frontend/               # React 19 + TypeScript + Vite
│   ├── src/
│   │   ├── components/     # UI components
│   │   ├── pages/          # Route pages
│   │   └── api/            # API client
│   └── package.json
├── tests/                  # 38 pytest unit tests
├── k8s/                    # Kubernetes manifests
├── .env.example
└── pyproject.toml
```
Setup Guide
Clone the repository
Clone MedScribe AI from GitHub and navigate into the project directory.
```shell
git clone https://github.com/asmanasir/MedScribe-AI.git
cd MedScribe-AI
```
Create a Python virtual environment
Always use a venv to keep project dependencies isolated from your system Python.
```shell
# Create the virtual environment
python -m venv .venv

# Activate it — Linux/macOS:
source .venv/bin/activate

# Activate it — Windows:
.venv\Scripts\activate
```
Install Python dependencies
Install the project with dev and local AI dependencies. This includes FastAPI, SQLAlchemy, faster-whisper, and all other backend packages.
```shell
pip install -e ".[dev,local]"
```
Set up environment variables
Copy the example .env file and review the configuration. For local development the defaults work out of the box.
```shell
cp .env.example .env

# Open .env in your editor and review the settings. Key variables:
#   DATABASE_URL=sqlite:///./medscribe.db   (default — SQLite for dev)
#   JWT_SECRET_KEY=your-secret-key-here
#   OLLAMA_BASE_URL=http://localhost:11434
```
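For reference, a backend typically reads these variables at startup along the lines of the sketch below. The `get_setting` helper is illustrative; MedScribe's actual settings code may differ, but the variable names and defaults match `.env.example` above.

```python
import os

# Hedged sketch: reading the configuration listed in .env.example,
# falling back to the development defaults when a variable is unset.
def get_setting(name: str, default: str) -> str:
    """Return an environment variable, or the dev default if unset."""
    return os.getenv(name, default)

DATABASE_URL = get_setting("DATABASE_URL", "sqlite:///./medscribe.db")
OLLAMA_BASE_URL = get_setting("OLLAMA_BASE_URL", "http://localhost:11434")
```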
Pull the local LLM model with Ollama
MedScribe uses llama3.2:3b for clinical note generation. This runs entirely on your machine — no OpenAI key needed.
```shell
# Pull the model (~2 GB download)
ollama pull llama3.2:3b

# Verify it's available
ollama list
```
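Once the model is pulled, you can talk to it directly over Ollama's REST API (`POST /api/generate`) to confirm it responds. This standalone sketch uses only the standard library; the prompt is illustrative and unrelated to MedScribe's actual prompting code.

```python
import json
import urllib.request

# Build the request body Ollama's /api/generate endpoint expects.
def build_request(prompt: str, model: str = "llama3.2:3b") -> dict:
    return {"model": model, "prompt": prompt, "stream": False}

# Send a prompt to the local Ollama server and return the model's reply.
def generate(prompt: str, base_url: str = "http://localhost:11434") -> str:
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires Ollama running locally):
#   print(generate("Summarize: patient reports mild headache for two days."))
```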
Install frontend dependencies
The React frontend is in the /frontend subdirectory.
```shell
cd frontend
npm install
cd ..
```
Running the Project
Start Ollama (if not already running)
Ollama must be running before the backend starts. It listens on port 11434 by default.
```shell
ollama serve
```
Start the FastAPI backend
Open a new terminal. The backend starts on port 8000 and auto-reloads on file changes.
```shell
# Make sure your venv is activated
source .venv/bin/activate   # or .venv\Scripts\activate on Windows

python -m medscribe
```
Start the React frontend
Open another terminal. The frontend dev server starts on port 3000.
```shell
cd frontend
npm run dev
```
Verify everything is running
Open your browser and check these URLs to confirm the system is healthy.
```shell
# API documentation (Swagger UI)
http://localhost:8000/docs

# React frontend
http://localhost:3000

# Health check
curl http://localhost:8000/health
```
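The same health check can be scripted, e.g. for CI or a readiness probe. This sketch assumes only that `/health` returns HTTP 200 when the backend is up; the response body is not inspected.

```python
import urllib.request

# Build the health-check URL for a given backend base address.
def health_url(base: str = "http://localhost:8000") -> str:
    return f"{base}/health"

# Return True if the backend answers /health with HTTP 200.
def check_health(base: str = "http://localhost:8000") -> bool:
    try:
        with urllib.request.urlopen(health_url(base), timeout=2) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, timeout, or HTTP error — treat as unhealthy.
        return False

# Example (requires the backend running):
#   print(check_health())  # True when healthy
```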
Run the test suite
MedScribe has 38 unit tests. Run them to verify your setup is working correctly.
```shell
pytest tests/ -v
```
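A unit test in the suite might look like the following sketch. The `validate_soap` helper and both tests are hypothetical, shown only to illustrate the pytest style; they are not MedScribe's actual tests.

```python
# Hypothetical example of a pytest-style unit test — illustrative only.
REQUIRED_SECTIONS = {"subjective", "objective", "assessment", "plan"}

def validate_soap(note: dict) -> bool:
    """A note is valid only if every SOAP section is present and non-empty."""
    return REQUIRED_SECTIONS <= set(note) and all(
        note[s] for s in REQUIRED_SECTIONS
    )

def test_complete_note_passes():
    note = {s: "some text" for s in REQUIRED_SECTIONS}
    assert validate_soap(note)

def test_missing_section_fails():
    assert not validate_soap({"subjective": "some text"})
```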
Project Info
Prerequisites
- Python 3.10+ installed
- Node.js 18+ installed
- Ollama installed — download from ollama.ai
- Git installed
- 8 GB RAM minimum (16 GB recommended for GPU inference)
- Basic familiarity with Python and React
Project Author
Asma Nasir, a senior AI/healthcare systems engineer with deep experience in FHIR, Azure healthcare services, and production AI systems. This project demonstrates how to build privacy-compliant clinical AI that runs entirely on-premises — no cloud required.