This application connects to the Azalea Health FHIR API to identify patient encounters with multiple unsigned clinical notes, collects comprehensive chart data, processes it with AWS Bedrock AI, and updates the EHR with recommendations for provider review.
Help healthcare providers at Comprehensive Medical Clinic clear their backlog of unsigned notes by:
- Finding encounters with >1 unsigned clinical notes
- Collecting all relevant patient chart data
- Processing with AWS Bedrock AI for note enhancement
- Updating notes in EHR with AI recommendations for comparison
- Storing results in SQL Server for tracking
- Automated Discovery: Finds encounters with multiple unsigned notes
- Comprehensive Data Collection: Gathers complete patient chart information
- AI Enhancement: Uses AWS Bedrock to enhance clinical documentation
- EHR Integration: Updates notes with AI suggestions for provider review
- SQL Server Storage: Tracks all processed notes and recommendations
- Comprehensive Logging: Full audit trail of all operations
- Sandbox Support: Test safely in sandbox environment before production
- Python 3.7+
- Access to Azalea Health API (client credentials)
- AWS Account with Bedrock access
- SQL Server database (optional)
```bash
# Clone the repository
git clone [your-repo-url]
cd azalea_api

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

The `config.json` file contains your API credentials:
- `client_id`: Your application's client ID
- `client_secret`: Your application's client secret
- `token_url`: OAuth token endpoint
- `base_fhir_url`: FHIR API base URL
- `active_environment`: Switch between "sandbox" and "production"
**Important:** Keep your `client_secret` secure and never commit it to version control.
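The credential fields above can be loaded and validated at application startup. A minimal sketch using only the standard library — the key names match the list above, but the validation logic is illustrative, not the application's actual loader:

```python
import json

# Key names taken from the config.json description above.
REQUIRED_KEYS = {"client_id", "client_secret", "token_url",
                 "base_fhir_url", "active_environment"}

def load_config(path="config.json"):
    """Load API credentials from config.json and fail fast on missing keys."""
    with open(path) as f:
        config = json.load(f)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise KeyError(f"config.json is missing keys: {sorted(missing)}")
    if config["active_environment"] not in ("sandbox", "production"):
        raise ValueError("active_environment must be 'sandbox' or 'production'")
    return config
```

Failing fast here avoids discovering a typo in `config.json` halfway through a batch run.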
Create a .env file with the following variables:
```env
# SQL Server Connection (optional)
SQL_CONNECTION_STRING=Driver={ODBC Driver 17 for SQL Server};Server=your-server.database.windows.net;Database=your-database;Uid=your-username;Pwd=your-password;Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;

# AWS Credentials for Bedrock
AWS_ACCESS_KEY_ID=your-aws-access-key
AWS_SECRET_ACCESS_KEY=your-aws-secret-key
AWS_DEFAULT_REGION=us-east-1

# Optional: Override Bedrock model
BEDROCK_MODEL_ID=anthropic.claude-v2
```

Run the application:

```bash
python main.py
```

This will:
- Authenticate with the API (using sandbox by default)
- Search for encounters from the last 30 days
- Find encounters with >1 unsigned notes
- Collect full patient chart data
- Process with AWS Bedrock AI (if configured)
- Update documents in EHR with AI recommendations
- Save results to SQL Server (if configured)
- Create JSON exports in the `chart_exports/` directory
- Log all operations to `unsigned_notes_processor.log`
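The optional integrations configured through the environment variables above can be collected before the run starts. A sketch using only `os.environ` — the variable names come from the `.env` example, but `read_settings` itself is an illustrative helper, not part of the application:

```python
import os

def read_settings():
    """Collect optional integration settings from the environment."""
    return {
        # SQL Server is optional: processing still runs without DB tracking
        "sql_connection_string": os.environ.get("SQL_CONNECTION_STRING"),
        # AWS credentials enable the Bedrock enhancement step
        "aws_access_key_id": os.environ.get("AWS_ACCESS_KEY_ID"),
        "aws_secret_access_key": os.environ.get("AWS_SECRET_ACCESS_KEY"),
        "aws_region": os.environ.get("AWS_DEFAULT_REGION", "us-east-1"),
        # Model ID can be overridden; default matches the .env example
        "bedrock_model_id": os.environ.get("BEDROCK_MODEL_ID",
                                           "anthropic.claude-v2"),
    }
```

Returning `None` for absent settings lets the caller decide which steps (AI processing, SQL storage) to skip, matching the "if configured" behavior described above.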
You can modify the parameters in the main function call:
```python
process_unsigned_notes_batch(
    config=app_config,
    access_token=token,
    db_manager=db_manager,      # SQL Server connection (optional)
    ai_processor=ai_processor,  # AWS Bedrock processor (optional)
    days_back=30,               # Look back N days for encounters
    max_encounters=5            # Process up to N encounters
)
```

- FHIR API Client: Handles all API interactions with Azalea Health
- DatabaseManager: Manages SQL Server operations for tracking processed notes
- BedrockAIProcessor: Interfaces with AWS Bedrock for AI processing
- Document Updater: Updates EHR documents with AI recommendations
- Discovery: Find encounters with unsigned notes
- Collection: Gather comprehensive patient data
- Processing: Send to AWS Bedrock for enhancement
- Storage: Save to SQL Server for tracking
- Update: Push recommendations back to EHR
- Review: Provider reviews and approves/edits suggestions
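The six stages above can be sketched as a simple pipeline. The function names here are illustrative placeholders standing in for the Bedrock, SQL Server, and EHR components, not the application's actual API:

```python
def run_pipeline(encounters, enhance, store, update):
    """Run collection -> processing -> storage -> update for each encounter.

    `encounters` yields (encounter, chart_data) pairs from the discovery and
    collection stages; `enhance`, `store`, and `update` are callables standing
    in for the Bedrock, SQL Server, and EHR steps.
    """
    results = []
    for encounter, chart in encounters:
        suggestion = enhance(chart)       # Processing: AWS Bedrock
        store(encounter, suggestion)      # Storage: SQL Server tracking
        update(encounter, suggestion)     # Update: push back to EHR
        results.append((encounter, suggestion))
    return results                        # Review: provider approves/edits
```

Keeping each stage behind a callable makes the optional steps (AI, SQL) easy to replace with no-ops when they are not configured.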
Created in the `chart_exports/` directory:

`patient_[PATIENT_ID]_encounter_[ENCOUNTER_ID].json`
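A small helper for building the export path according to the naming pattern above (illustrative, not the application's code):

```python
from pathlib import Path

def export_path(patient_id, encounter_id, base_dir="chart_exports"):
    """Build the per-encounter JSON export path, e.g.
    chart_exports/patient_12345_encounter_67890.json."""
    return Path(base_dir) / f"patient_{patient_id}_encounter_{encounter_id}.json"
```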
Stored in the `processed_notes` table with:
- Original and enhanced text
- AI recommendations
- Processing timestamps
- Status tracking
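The tracking fields above map naturally onto a table like the following. This sketch uses SQLite as a stand-in for SQL Server, and the column names are assumptions based on the list above, not the application's actual schema:

```python
import sqlite3

# Column names are illustrative; the real processed_notes schema may differ.
SCHEMA = """
CREATE TABLE IF NOT EXISTS processed_notes (
    id INTEGER PRIMARY KEY,
    encounter_id TEXT NOT NULL,
    original_text TEXT,
    enhanced_text TEXT,
    recommendations TEXT,
    processed_at TEXT,
    status TEXT DEFAULT 'pending_review'
)
"""

def save_processed_note(conn, encounter_id, original, enhanced, recs):
    """Record one processed note with a processing timestamp."""
    conn.execute(SCHEMA)
    conn.execute(
        "INSERT INTO processed_notes "
        "(encounter_id, original_text, enhanced_text, recommendations, processed_at) "
        "VALUES (?, ?, ?, ?, datetime('now'))",
        (encounter_id, original, enhanced, recs),
    )
    conn.commit()
```

Storing both the original and enhanced text side by side is what makes the later provider comparison possible.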
Documents are updated with a comparison format:

```
[Original Note]

========== AI ENHANCED VERSION (For Review) ==========
[AI Enhanced Note]

========== AI RECOMMENDATIONS ==========
[Specific Recommendations]

========== END AI SUGGESTIONS ==========
Note: Please review the AI suggestions above and edit as needed before signing.
```
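A helper that assembles the comparison format shown above might look like this (an illustrative sketch, not the application's actual updater):

```python
def build_comparison(original, enhanced, recommendations):
    """Combine the original note, AI-enhanced version, and recommendations
    into the review format written back to the EHR document."""
    return "\n".join([
        original,
        "========== AI ENHANCED VERSION (For Review) ==========",
        enhanced,
        "========== AI RECOMMENDATIONS ==========",
        recommendations,
        "========== END AI SUGGESTIONS ==========",
        "Note: Please review the AI suggestions above and edit as needed "
        "before signing.",
    ])
```

Keeping the original note first and untouched means a provider who rejects the suggestions can simply delete everything below the first separator.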
```json
{
  "patient_id": "12345",
  "encounter_id": "67890",
  "collected_at": "2024-01-01T10:00:00",
  "data": {
    "patient": { /* FHIR Patient resource */ },
    "encounter": { /* FHIR Encounter resource */ },
    "clinical_notes": [ /* Array of DocumentReference resources */ ],
    "conditions": [ /* Array of Condition resources */ ],
    "allergies": [ /* Array of AllergyIntolerance resources */ ],
    "medications": [ /* Array of MedicationRequest resources */ ],
    "vital_signs": [ /* Array of Observation resources */ ],
    "problem_list": [ /* Array of Condition resources */ ]
  }
}
```

The application provides comprehensive logging:
- Console Output: Real-time progress updates
- Log File: `unsigned_notes_processor.log` with detailed operations
- Log Levels: INFO for normal operations, ERROR for issues, DEBUG for troubleshooting
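The dual console/file logging described above can be configured with the standard `logging` module. A sketch — the handler and format details are assumptions, not the application's exact setup:

```python
import logging

def configure_logging(log_file="unsigned_notes_processor.log"):
    """Log INFO and above to both the console and the audit log file."""
    logger = logging.getLogger("unsigned_notes")
    logger.setLevel(logging.DEBUG)  # DEBUG remains available for troubleshooting
    fmt = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
    for handler in (logging.StreamHandler(), logging.FileHandler(log_file)):
        handler.setFormatter(fmt)
        handler.setLevel(logging.INFO)
        logger.addHandler(handler)
    return logger
```

Writing the same records to console and file gives real-time progress during a run plus the persistent audit trail the security checklist below calls for.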
- Always use HTTPS connections
- Store credentials securely (use environment variables)
- Limit API scope to minimum required permissions
- Ensure HIPAA compliance when handling PHI
- Implement audit logging for production use
- Use sandbox for testing before production
- Encrypt data at rest in SQL Server
- Follow AWS best practices for Bedrock access
- **Authentication Failed**
  - Verify `client_id` and `client_secret`
  - Check that the application has the required scopes
  - Ensure you are using the correct environment (sandbox vs. production)
- **No Encounters Found**
  - Adjust the `days_back` parameter to search a wider date range
  - Verify the organization has encounters with unsigned notes
  - Check encounter status filters
  - Review logs for specific errors
- **AWS Bedrock Errors**
  - Verify AWS credentials are set correctly
  - Check Bedrock service availability in your region
  - Ensure your AWS account has Bedrock access enabled
  - Review the model ID configuration
- **SQL Server Connection Issues**
  - Verify the connection string format
  - Check firewall rules
  - Ensure the ODBC driver is installed
  - Test the connection with SQL Server Management Studio
- **API Rate Limits**
  - Implement delays between requests if needed
  - Process in smaller batches
  - Contact Azalea Health for rate limit information
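The delay-between-requests advice above can be implemented with simple exponential backoff. A generic sketch — the retry counts and delays are assumptions, not Azalea Health's documented limits:

```python
import time

def with_backoff(call, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Retry `call` with exponential backoff (1s, 2s, 4s between attempts).

    Re-raises the last exception if every attempt fails. `sleep` is
    injectable so tests can observe delays without waiting.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

Wrapping each FHIR request in a helper like this also gives one place to add per-request pacing when processing larger batches.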
- **Switch to Production**
  - Update `active_environment` in `config.json` to "production"
  - Update `base_fhir_url` to your organization's endpoint
- **Security Hardening**
  - Use Azure Key Vault or AWS Secrets Manager for credentials
  - Implement proper access controls
  - Enable audit logging
- **Monitoring**
  - Set up alerts for processing failures
  - Monitor API usage and rate limits
  - Track AI processing costs
- **Backup & Recovery**
  - Schedule regular SQL Server backups
  - Archive processed JSON files
  - Document recovery procedures
After implementation:
- Monitor provider adoption and feedback
- Fine-tune AI prompts based on results
- Implement automated scheduling for batch processing
- Build dashboard for tracking progress
- Add support for different note types
- Integrate with provider workflow tools
- API Issues: Contact Azalea Health support
- AWS Bedrock: Check AWS documentation and support
- Application Issues: Review the logs and `unsigned_notes_plan.md`
[Your License Here]