How We Built QuickMark: A Technical Breakdown

QuickMark Team
9 min read
Friday, March 20, 2026

A practical look at the architecture, tradeoffs, and engineering decisions behind QuickMark's interactive exam-practice platform.

Why We Built QuickMark

Past papers are the backbone of exam preparation, but the workflow around them is usually fragmented. Students often jump between PDFs, answer sheets, notes, and separate tools for tracking progress.

QuickMark was built to reduce that friction. The goal was simple: keep practice, feedback, and progress tracking in one place so revision sessions become more consistent and easier to repeat.

The Product Problem We Focused On

For most IGCSE, O Level, and AS and A Level students, the hard part is not finding questions. The hard part is building a reliable loop:

  • attempt questions under realistic conditions
  • check answers quickly
  • review mistakes in context
  • choose what to practice next

If any step is slow, students skip it. That usually means weaker review quality, repeated mistakes, and unstable scores.

Architecture Principles

We designed the platform around four engineering principles:

  1. Keep interactions fast so students complete full practice loops.
  2. Keep state reliable so attempts and progress are not lost.
  3. Keep real-time flows consistent in ranked sessions.
  4. Keep the system modular so full-paper and topical modes can evolve independently.

Browser-Side Paper Processing

One of the biggest design choices was where PDF handling should happen. We process question extraction in the browser using PDF.js. This keeps interaction immediate from the student's perspective and avoids adding an upload-and-process round trip to the default flow.

The extraction pipeline focuses on structure first, then rendering.
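As a rough illustration of what "structure first" means, the sketch below groups positioned text items (similar in shape to what PDF.js's `getTextContent()` returns) into numbered questions before any rendering happens. The item shape, marker regex, and function names are illustrative assumptions, not QuickMark's actual pipeline.

```typescript
// Hypothetical shape of a positioned text item, similar to what
// PDF.js getTextContent() returns (a string plus page coordinates).
interface TextItem {
  str: string;
  x: number; // horizontal position on the page
  y: number; // vertical position (larger = higher on the page)
}

interface Question {
  number: number;
  lines: string[];
}

// Group a page's text items into questions by detecting leading
// markers like "1." or "2)" in reading order (top to bottom).
function groupIntoQuestions(items: TextItem[]): Question[] {
  const sorted = [...items].sort((a, b) => b.y - a.y || a.x - b.x);
  const questions: Question[] = [];
  for (const item of sorted) {
    const m = item.str.match(/^(\d+)[.)]\s*(.*)$/);
    if (m) {
      // A new question starts at this marker.
      questions.push({ number: Number(m[1]), lines: m[2] ? [m[2]] : [] });
    } else if (questions.length > 0) {
      // Everything else belongs to the most recent question.
      questions[questions.length - 1].lines.push(item.str);
    }
  }
  return questions;
}
```

Once questions exist as structured data, rendering and answer collection can work against that structure rather than raw page text.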

This approach improves iteration speed during practice, especially when students are running many short sessions.

Frontend Responsibilities

The frontend uses Next.js App Router, React, and TypeScript. That combination gives us route-level separation for marketing and product surfaces, typed contracts for UI state, and predictable behavior as features grow.

Frontend responsibilities include:

  • rendering practice sessions from parsed paper data
  • collecting answers and timing information
  • showing review views and post-session summaries
  • handling ranked-mode UI updates

Type safety matters here because mistakes in session state can directly affect scoring and review quality.
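A minimal sketch of what typed session state can look like, assuming hypothetical field names (these are not QuickMark's actual types). Recording an answer as a pure update keeps React state transitions predictable and easy to test:

```typescript
// Illustrative session state with typed answer records,
// including per-question timing.
interface AnswerRecord {
  questionId: string;
  choice: string;
  elapsedMs: number; // time spent on this question before answering
}

interface SessionState {
  sessionId: string;
  startedAt: number; // epoch ms when the session began
  answers: AnswerRecord[];
}

// Pure update: returns a new state object instead of mutating,
// so scoring and review always see a consistent snapshot.
function recordAnswer(
  state: SessionState,
  questionId: string,
  choice: string,
  now: number
): SessionState {
  // Time already accounted for = session start + all prior elapsed times.
  const answeredUpTo =
    state.startedAt + state.answers.reduce((sum, a) => sum + a.elapsedMs, 0);
  return {
    ...state,
    answers: [
      ...state.answers,
      { questionId, choice, elapsedMs: now - answeredUpTo },
    ],
  };
}
```

Because the update is pure, a bug in timing or ordering shows up in a unit test rather than in a student's review screen.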

Backend Responsibilities

The backend is built with Fastify and Node.js, with Prisma for data access. It handles workloads where server authority is important:

  • validating and storing attempts
  • powering reports and aggregate progress views
  • coordinating ranked matchmaking and results
  • serving leaderboard and account-scoped data

This split keeps the client focused on interaction while the backend remains the source of truth for persistent outcomes.
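To make "server authority" concrete, here is a hedged sketch of what validating a submitted attempt before persistence might look like. The field names and limits are illustrative assumptions, not QuickMark's actual schema or API:

```typescript
// Hypothetical shape of an attempt submitted by the client.
interface AttemptSubmission {
  sessionId: string;
  questionId: string;
  choice: string;
  clientElapsedMs: number; // client-reported timing, treated as untrusted
}

type ValidationResult = { ok: true } | { ok: false; reason: string };

// Server-side checks run before an attempt is written to the database;
// the client's view is never trusted for persistent outcomes.
function validateAttempt(
  a: AttemptSubmission,
  validChoices: string[],
  maxElapsedMs: number
): ValidationResult {
  if (!a.sessionId || !a.questionId) {
    return { ok: false, reason: "missing identifiers" };
  }
  if (!validChoices.includes(a.choice)) {
    return { ok: false, reason: "choice not valid for this question" };
  }
  if (a.clientElapsedMs < 0 || a.clientElapsedMs > maxElapsedMs) {
    return { ok: false, reason: "implausible timing" };
  }
  return { ok: true };
}
```

In a Fastify handler this kind of check would run before the Prisma write, so rejected attempts never reach reports or leaderboards.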

Data Layer Design

We use PostgreSQL for durable records and Redis for high-speed coordination paths.

PostgreSQL is used for:

  • accounts and identity-linked data
  • attempt history
  • reportable progress data
  • leaderboard persistence

Redis is used for:

  • low-latency ranked-session coordination
  • caching hot read paths
  • reducing repeated query pressure during active usage windows

Using both lets us optimize for durability and responsiveness without forcing one system to do everything.
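The "caching hot read paths" point is essentially the cache-aside pattern. The sketch below uses an in-memory `Map` as a stand-in for Redis so it stays self-contained; in production the same shape maps onto Redis `GET`/`SETEX` calls, with PostgreSQL behind the loader. Names and the TTL are illustrative:

```typescript
// Cache-aside sketch for a hot read path (e.g. a leaderboard page).
// A Map stands in for Redis here; the loader stands in for a
// PostgreSQL query.
type Loader<T> = (key: string) => T;

class CacheAside<T> {
  private store = new Map<string, { value: T; expiresAt: number }>();

  constructor(private ttlMs: number, private load: Loader<T>) {}

  get(key: string, now: number): T {
    const hit = this.store.get(key);
    if (hit && hit.expiresAt > now) {
      return hit.value; // serve from cache, no database pressure
    }
    const value = this.load(key); // fall through to the durable store
    this.store.set(key, { value, expiresAt: now + this.ttlMs });
    return value;
  }
}
```

During an active usage window, repeated reads of the same key hit the cache until the TTL expires, which is what relieves query pressure on the durable store.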

Real-Time Ranked Sessions

Ranked play needs strict consistency. Two players should see coherent match state even if their network conditions differ.

Socket-based events are used so the server can coordinate key transitions:

  • queue and match creation
  • session start
  • answer submissions
  • match completion and result publication

The practical objective is fairness: players should compete against each other, not against inconsistent session state.
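One common way to keep socket-driven transitions like the ones above coherent is a server-authoritative state machine: the server accepts only valid transitions, so both players converge on the same match state regardless of network timing. This is a minimal sketch under that assumption; the state and event names mirror the list above but are illustrative:

```typescript
// Minimal server-authoritative match state machine for ranked play.
type MatchState = "queued" | "matched" | "in_progress" | "completed";
type MatchEvent = "match_created" | "session_start" | "match_complete";

// Only these transitions are legal; anything else is rejected.
const transitions: Record<MatchState, Partial<Record<MatchEvent, MatchState>>> = {
  queued: { match_created: "matched" },
  matched: { session_start: "in_progress" },
  in_progress: { match_complete: "completed" },
  completed: {},
};

function applyEvent(state: MatchState, event: MatchEvent): MatchState {
  const next = transitions[state][event];
  if (!next) {
    // Out-of-order or duplicate events (e.g. from a lagging client)
    // are rejected instead of corrupting match state.
    throw new Error(`invalid transition: ${state} + ${event}`);
  }
  return next;
}
```

Because the server is the only party applying events, a slow or duplicated client message can never move a match backwards or complete it twice.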

Tradeoffs We Accepted

No architecture choice is free. We accepted a few deliberate tradeoffs:

  • client-heavy interactivity increases frontend complexity
  • real-time features require stricter state design and observability
  • multiple data systems improve performance but add operational complexity

These tradeoffs were chosen because they directly improve learner experience during repeated exam-style practice.

Reliability Guardrails

Fast experiences only help when they are reliable. We prioritize:

  • clear loading and retry behavior in session flows
  • defensive handling for malformed or difficult paper layouts
  • server-authoritative completion logic for ranked results
  • robust storage of attempt outcomes for later review

Reliability is not a single feature; it is a collection of small safeguards across the full session lifecycle.
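The "clear retry behavior" safeguard is typically bounded retry with exponential backoff, so transient failures recover without hammering the server. This is a generic sketch, not QuickMark's actual client code; the attempt count and base delay are illustrative:

```typescript
// Bounded retry with exponential backoff for session-flow requests.
async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts: number,
  baseDelayMs: number
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff: base, 2x base, 4x base, ...
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  // All attempts failed: surface the last error to the UI's
  // explicit error/retry state rather than failing silently.
  throw lastError;
}
```

Bounding the attempts matters as much as the backoff: an unbounded retry loop can mask a real outage and leave a student staring at a spinner.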

What This Means for Students

At a practical level, architecture decisions show up as revision quality:

  • shorter delay between solving and reviewing
  • fewer context switches between tools
  • clearer continuity across sessions
  • more stable ranked experience

Students do better when the practice -> review -> fix loop is easy to repeat. Our technical decisions are built around protecting that loop.

Final Takeaway

QuickMark is designed as an exam-practice system first, not just a paper viewer. The architecture prioritizes speed, consistency, and repeatable feedback so students can spend more time learning from attempts and less time managing tools.

If you are preparing for IGCSE, O Level, or AS and A Level exams, start with a short focused session and review every mistake before your next paper. Consistency compounds quickly.

