CASE STUDY

TAGAP: AI-Powered Lesson Plan Generator for Filipino Teachers

How I built an AI tool that reduces lesson planning time by 75% for 800,000+ Filipino public school teachers.

Role: Product Owner & Technical Lead
Timeline: Dec 2024 – Jan 2025
Status: Live Production
75% Time Saved | 800K+ Target Teachers | 6 Weeks to Launch | 100% DepEd Compliant

The Administrative Burden on Teachers

Filipino K-12 public school teachers spend 2-4 hours per week creating Daily Lesson Logs (DLLs)—a mandatory submission format required by the Department of Education. This administrative burden directly competes with actual teaching preparation time.

"I spend my Sundays making lesson plans instead of resting. By Friday, I'm just copying from last year's plans."

— Grade 4 Teacher, Quezon City

"The MATATAG curriculum is new. I don't know if my lesson plans are compliant anymore."

— Grade 2 Teacher, Cebu

"I can't afford the subscription tools my colleagues use. I just do everything manually."

— High School Teacher, Bicol Region

Why Existing Solutions Fail

Solution | Critical Gap
Manual Creation | 2-4 hours per week; inconsistent quality
Template Libraries | Static; don't reflect the new MATATAG curriculum
Paid SaaS Tools | ₱299-599/month; unaffordable for most teachers
ChatGPT/Generic AI | Requires prompt engineering; output in the wrong format

User Research Approach

Method | Participants | Key Focus
Teacher interviews | 8 teachers (Grades 1-10) | Workflow pain points, current solutions
Facebook group observation | 3 communities (~50K members) | Common complaints, shared resources
Competitive analysis | 6 existing tools | Pricing, features, format compliance
Curriculum document review | DepEd MATATAG guidelines | Required format, learning competencies

Key Insights

Finding | Product Implication
Teachers spend 70% of planning time on format, not content | Automate the structure; let teachers focus on customization
MATATAG curriculum (2024) invalidated existing templates | Build curriculum data directly into the product
₱500/month is the absolute maximum teachers can spend | Must be free; monetize through ads if needed
Teachers don't trust cloud storage with their work | Local storage first; no account required (see the sketch after this table)
Many teachers use phones, not computers | Mobile-responsive is P0, not P1
Filipino and Mother Tongue subjects require specific dialects | Support 12+ regional languages for generation
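
The local-storage insight is cheap to honor in the browser: drafts live entirely on the teacher's device, with no sign-up and no server round-trip. A minimal sketch, assuming a hypothetical LessonPlanDraft shape (TAGAP's actual field names may differ):

```typescript
// Local-first draft persistence: no account, no cloud, nothing leaves the device.
// LessonPlanDraft is a hypothetical shape; TAGAP's actual fields may differ.
interface LessonPlanDraft {
  id: string;
  gradeLevel: number;
  subject: string;
  generatedHtml: string;
  updatedAt: string; // ISO timestamp
}

const STORAGE_KEY = "tagap:drafts";

function loadDrafts(): LessonPlanDraft[] {
  try {
    return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
  } catch {
    return []; // corrupted storage should never block the user
  }
}

function saveDraft(draft: LessonPlanDraft): void {
  const others = loadDrafts().filter((d) => d.id !== draft.id);
  others.push({ ...draft, updatedAt: new Date().toISOString() });
  localStorage.setItem(STORAGE_KEY, JSON.stringify(others));
}
```

Drafts survive page reloads without an account, and nothing is uploaded unless the teacher explicitly exports.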

Feature Prioritization

P0 Must-Haves (Launch Requirements)

Feature | Rationale
AI-generated DLL matching DepEd format exactly | Core value proposition
All K-12 grade levels and subjects | Can't exclude any teacher segment
MATATAG curriculum compliance (Grades 1-5) | New curriculum = highest demand
Export to Word/PDF | Required for submission
Mobile-responsive interface | 60%+ of teachers access via phone
Zero account requirement | Reduce friction to zero

Deliberately Deferred (Post-Launch)

  • User accounts and cloud sync: Adds complexity; teachers expressed privacy concerns
  • Lesson plan history/archive: Local storage sufficient for MVP
  • Collaborative editing: Single-teacher use case is 95%+ of demand

Critical Trade-off Decisions

Trade-off | Choice | Rationale
Build curriculum database vs. let AI infer competencies | Build a comprehensive curriculum database (sketched below) | AI hallucination risk too high for official documents
Provide shared API vs. users bring their own key | Users provide their own free Gemini API key | Zero ongoing costs; Gemini's free tier is generous enough
Native mobile app vs. responsive web | Responsive web only | Single codebase; no app store delays; instant updates
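
The first two choices meet in the generation flow: competencies are looked up in the local curriculum database and passed to Gemini verbatim, using the teacher's own key, so the model formats official text instead of inventing it. A minimal sketch against Gemini's public REST endpoint; the model name, prompt wording, and function signature are illustrative rather than TAGAP's exact implementation:

```typescript
// Generate a DLL draft with Gemini, grounded in official MATATAG competencies
// pulled from the local curriculum database (never inferred by the model).
// Model name and prompt wording are illustrative, not TAGAP's exact values.
async function generateLessonPlan(
  apiKey: string,         // the teacher's own free Gemini API key
  grade: number,
  subject: string,
  competencies: string[], // looked up locally, passed to the model verbatim
): Promise<string> {
  const prompt = [
    `Create a DepEd Daily Lesson Log for Grade ${grade} ${subject}.`,
    "Use ONLY these MATATAG learning competencies, verbatim:",
    ...competencies.map((c) => `- ${c}`),
  ].join("\n");

  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
    },
  );
  if (!res.ok) throw new Error(`Gemini request failed: ${res.status}`);
  const data = await res.json();
  return data.candidates[0].content.parts[0].text;
}
```

Keeping the key client-side also means there is no backend to pay for or secure: the teacher's browser talks to Gemini directly.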

Build Approach

Phase | Duration | Deliverable
Phase 1: Core Engine | Weeks 1-2 | AI generation + basic UI + single grade/subject
Phase 2: Curriculum Data | Weeks 2-3 | All grades, subjects, MATATAG competencies
Phase 3: Export & Polish | Weeks 4-5 | Word/PDF export, print formatting, mobile optimization
Phase 4: Production | Weeks 5-6 | SEO, analytics, deployment, monitoring

Validation Checkpoints

Each checkpoint had an explicit pass criterion:

  • Prototype (3 teachers): "I would submit this to my principal"
  • Alpha (8 teachers across grade levels): fewer than 2 structural errors per generated plan
  • Beta (open testing via Facebook groups): under 5% regeneration rate due to format issues

Key Iterations

Feedback | Response
"The assessment section is too generic" | Added subject-specific assessment templates to the prompt (sketched after this table)
"I can't find my dialect" | Expanded Mother Tongue support to 12 regional languages
"Exported Word file loses table borders" | Rebuilt export using custom HTML-to-DOCX conversion with inline styles
"Plan doesn't match new Grade 3 MATATAG competencies" | Updated curriculum database from the latest DepEd memorandum

Major Challenges Solved

Challenge 1: Curriculum Data Accuracy

The MATATAG curriculum was new (2024), with limited structured data sources. I manually extracted learning competencies from DepEd PDF documents, validated them against teacher feedback, and structured them into a queryable format. This took 40% of total development time but eliminated the #1 teacher complaint about AI-generated plans.
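
A minimal sketch of what "queryable format" means here; the field names and the sample entry are hypothetical stand-ins for the real TAGAP schema:

```typescript
// One MATATAG learning competency, keyed by grade, subject, and quarter.
// Field names are hypothetical; the real TAGAP schema may differ.
interface Competency {
  code: string;        // DepEd competency code
  description: string; // official competency text, verbatim from the PDF
  grade: number;
  subject: string;
  quarter: 1 | 2 | 3 | 4;
}

// In production this would be a static JSON file extracted from the DepEd
// PDFs; one illustrative entry stands in for the full dataset here.
const curriculum: Competency[] = [
  {
    code: "MATH4-Q1-01",
    description: "Illustrative competency text, copied verbatim from the source PDF.",
    grade: 4,
    subject: "Mathematics",
    quarter: 1,
  },
];

function findCompetencies(grade: number, subject: string, quarter: number): Competency[] {
  return curriculum.filter(
    (c) => c.grade === grade && c.subject === subject && c.quarter === quarter,
  );
}
```

Because plans are grounded in this lookup, they can only cite competencies that actually appear in the DepEd source documents.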

Challenge 2: Export Fidelity

Teachers submit DLLs in Word format. Standard HTML-to-Word libraries produced broken tables. I implemented a custom export pipeline that preserves table structures, handles merged cells correctly, and maintains print formatting—validated by opening exports in Word 2016, 2019, and 365.
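
One widely used technique for this kind of pipeline is to inline every table style before conversion, since Word ignores external CSS. A minimal sketch, assuming a converter like html-docx-js (its asBlob call is real, but TAGAP's pipeline is custom, so treat the wiring as illustrative):

```typescript
// Inline table borders before converting, because Word drops external CSS:
// every <table>, <th>, and <td> must carry its styles directly.
import htmlDocx from "html-docx-js/dist/html-docx";

function inlineTableBorders(container: HTMLElement): void {
  container.querySelectorAll<HTMLElement>("table, th, td").forEach((el) => {
    el.style.border = "1px solid #000";
    el.style.borderCollapse = "collapse";
  });
}

function exportToWord(planHtml: string, filename: string): void {
  const container = document.createElement("div");
  container.innerHTML = planHtml;
  inlineTableBorders(container);

  const doc = `<!DOCTYPE html><html><head><meta charset="utf-8"></head><body>${container.innerHTML}</body></html>`;
  const blob = htmlDocx.asBlob(doc);

  // Trigger a client-side download of the exported file.
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = filename; // e.g. "DLL-Grade4-Math.docx" (illustrative name)
  link.click();
  URL.revokeObjectURL(link.href);
}
```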

Results & Reflection

Current State

  • Live at tagap-ph.com with organic traffic from teacher communities
  • VPS deployment with Nginx, PM2 process management

Measured Results

  • Time to first lesson plan: <90s
  • Generation success rate: 97%+
  • Export completion rate: 94%
  • Mobile usage: 58%

What Worked Well

  • Zero-cost model — Completely free, with teachers providing their own Gemini API key; this removes the subscription barrier that excludes most teachers
  • Curriculum-first approach — Building the competency database eliminated the "AI hallucination" problem that plagues generic tools
  • Local storage decision — Teachers appreciated not having to trust a new service with their data
  • Language support — Mother Tongue dialect coverage addressed an unmet need no competitor had solved

What Didn't Work as Expected

Expectation | Reality | Learning
Teachers would customize heavily after generation | Most export with minimal edits | AI output quality was higher than anticipated; simplify the editing UX
API key setup would be a friction point | Teachers complete setup quickly with the guide | Clear step-by-step instructions removed the barrier
Print would be the primary export method | Word export is 3x more popular than print | Teachers share files digitally with supervisors more than they print

What I Learned

1. Compliance-critical products require authoritative data sources

For education tools, "close enough" isn't acceptable—teachers face real consequences for format errors. Investing in curriculum data accuracy was the right call.

2. Friction removal beats feature richness

Every step I eliminated (accounts, payments, configuration) increased conversion. Teachers wanted to solve their problem, not learn a new tool.

3. User research in existing communities is underrated

Facebook teacher groups surfaced more authentic pain points in one week than formal interviews would have in a month.

4. Export is the product for document-generation tools

The lesson plan generator isn't the end—the Word file teachers submit is. I should have prioritized export fidelity earlier.

What I'd Do Differently

Start with export testing earlier

I treated export as a Phase 3 concern; it should have been Phase 1. The export format defines the entire data structure.

Build a curriculum update pipeline

Manual updates from DepEd PDFs don't scale. I'd invest in structured data extraction for ongoing curriculum changes.

Add lightweight analytics from day one

I delayed analytics setup; earlier data would have informed feature prioritization faster.

Prototype with actual teachers in-session

Remote feedback missed UX friction that I only caught while watching a teacher use the tool in real time.

Interested in working together?