# Sleep-Apply

Apply to jobs automatically while you sleep.
Sleep-Apply is an intelligent job application automation system that scrapes job postings, generates tailored resumes using AI, and automatically fills out applications across multiple job portals.
## Features

- 🔍 **Smart Job Scraping** - Find jobs from LinkedIn, Indeed, ZipRecruiter, and more
- 🤖 **AI-Powered Resume Generation** - Tailored resumes for each job using GPT
- ⚡ **Automated Application** - Auto-fill forms across different job portals
- 📊 **Application Tracking** - Monitor status and success rates
- 📧 **Email Integration** - Parse interview invites automatically
- 🧠 **Learning Engine** - Improve over time based on success patterns
- 🔒 **Secure** - Encrypted credential storage and session management
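The email-integration feature, for instance, comes down to classifying incoming messages. A minimal, self-contained sketch of how interview invites might be detected from subject lines (the keyword list and function name are illustrative assumptions, not the project's actual implementation):

```python
import re

# Hypothetical keyword patterns for spotting interview invitations.
INVITE_PATTERNS = [
    r"\binterview\b",
    r"\bphone screen\b",
    r"\bnext steps?\b",
    r"\bschedule a (call|chat)\b",
]

def looks_like_interview_invite(subject: str) -> bool:
    """Return True if an email subject resembles an interview invite."""
    subject = subject.lower()
    return any(re.search(p, subject) for p in INVITE_PATTERNS)

print(looks_like_interview_invite("Interview availability - Software Engineer"))  # True
print(looks_like_interview_invite("Your weekly job alert digest"))                # False
```

A real parser would also inspect the sender and message body, but keyword matching on the subject is a reasonable first pass.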
## Prerequisites

- Python 3.11+
- Docker & Docker Compose (recommended)
- OpenAI API key
## Quick Start

1. Clone the repository:

   ```bash
   git clone https://github.com/YanmiYu/Sleep-Apply.git
   cd Sleep-Apply
   ```

2. Set up your environment:

   ```bash
   cp .env.example .env
   # Edit .env with your API keys
   ```

3. Start the services:

   ```bash
   make setup
   make start
   ```

4. Access the application:
   - Frontend Demo: http://localhost:8080/demo.html
   - API Docs: http://localhost:8000/docs
## Project Structure

```
Sleep-Apply/
├── hunt/                       # Job scraping & automation module
│   ├── src/
│   │   ├── scraping_service.py # Multi-platform job scraper
│   │   └── apply_automation.py # Browser automation
│   └── tests/
│
├── auto-fill/                  # Form auto-fill system
│   ├── backend/                # FastAPI backend
│   ├── frontend/               # Demo UI
│   ├── extension/              # Chrome extension
│   └── demo.html               # Interactive demo
│
├── backend/                    # Main backend (MVP - in progress)
│   ├── app/
│   │   ├── models/             # Database models
│   │   ├── services/           # Business logic
│   │   └── routers/            # API endpoints
│   └── tests/
│
├── docker-compose.yml          # Full stack deployment
├── Makefile                    # Convenient commands
└── MVP_PLAN.md                 # 4-week implementation plan
```
## Usage

### Job Scraping

```python
from hunt.src.scraping_service import scrape_jobs_universal, save_jobs_csv

# Scrape jobs
jobs = scrape_jobs_universal(
    search_term="software engineer",
    location="San Francisco, CA",
    site_name=["linkedin", "indeed"],
    results_wanted=50,
    is_remote=True,
)

# Save to CSV
save_jobs_csv(jobs, "jobs.csv")
```

### Automated Application

```python
from hunt.src.apply_automation import JobApplicationAutomator

# Apply to a job
automator = JobApplicationAutomator(headless=True)
result = automator.click_apply_button("https://www.linkedin.com/jobs/view/123456")
if result["success"]:
    print(f"Applied! Redirected to: {result['new_url']}")
```
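Result dicts like the one above feed naturally into the application-tracking feature. A hedged, self-contained sketch of logging each attempt to a CSV for later analysis (the `success`/`new_url` keys follow the example above; the CSV layout and function name are assumptions):

```python
import csv
from datetime import datetime, timezone

def log_application(result: dict, job_url: str, path: str = "applications.csv") -> None:
    """Append one application attempt to a CSV log for later tracking."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),  # when the attempt happened
            job_url,                                 # which posting was targeted
            result.get("success", False),            # did the apply click succeed
            result.get("new_url", ""),               # where the browser landed
        ])

# Example with a fake result dict
log_application({"success": True, "new_url": "https://example.com/apply"},
                "https://www.linkedin.com/jobs/view/123456")
```

Appending rows keeps the log crash-safe; success rates can then be computed offline from the third column.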
### Auto-Fill Demo

1. Start the demo server:

   ```bash
   cd auto-fill
   python3 -m http.server 8080
   ```

2. Use the keyboard shortcuts:
   - `Ctrl+Shift+F` - Auto-fill forms
   - `Ctrl+Shift+R` - Refresh field detection
   - `Ctrl+Shift+A` - Analyze page
## Development

```bash
make setup      # Initial setup
make start      # Start all services
make stop       # Stop services
make test       # Run tests
make logs       # View logs
make migrate    # Run database migrations
```

### Running Tests

```bash
# Hunt module tests
cd hunt
uv run pytest tests/ -v

# Auto-fill tests
cd auto-fill
python -m pytest tests/
```

### Linting

```bash
make lint       # Check code quality
make lint-fix   # Auto-fix issues
```

## Roadmap

We're building an MVP in 4 weeks. See MVP_PLAN.md for details.
- Week 1: Foundation & Database
- Week 2: Core Automation
- Week 3: Tracking & Analytics
- Week 4: Polish & Documentation

**Current Status:** Week 1 - Setting up infrastructure
## Contributing

Contributions are welcome! Please:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## Disclaimer

This tool is for educational purposes. Automated job applications may violate the terms of service of job sites. Use responsibly and at your own risk, and always review applications before submission.
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- JobSpy - Job scraping library
- AIHawk - Inspiration for automation
- Playwright - Browser automation
## Contact

- GitHub: @YanmiYu
- Issues: GitHub Issues
Made with ❤️ by the Sleep-Apply Team