Turning a struggling job-seeker toolkit into a scalable, AI-powered application engine
JobGap started as an all-in-one hub: a job board, AI résumé reviewer, and applications CRM. We acquired 5,000 users, but engagement and monetisation struggled. Qualitative discovery through customer conversations revealed the real pain: users didn't need more CV tips, they lacked the time to apply. I led the full product redesign that turned it into an autonomous AI application engine, optimising the acquisition funnel to £50,000 in monthly revenue.
- Client: JOBGAP AI
- Role: Growth Product Designer — UX Strategy, Behavioural Analytics, Conversion Optimisation, Prototyping
- Tools: Figma · Hotjar · Mixpanel · Jitter
Context & the Pivot
The original JobGap product was comprehensive on paper: a job board, AI résumé reviewer, interview prep, and application tracker. The trouble was that the internet had already solved each of those individually. Users signed up, looked around, and left.
We briefly pivoted to connecting candidates with Virtual Assistants who applied on their behalf. Traction was promising but the model wasn't scalable. The 2023 AI boom gave us one shot to automate what humans were doing manually. We rebuilt the entire product around a centralised job application engine: users set their preferences once — job title, location, salary, visa requirements — and the AI found matching roles, tailored a CV for each one, completed the application form, and submitted it.
5,000 users acquired before the rebuild — with near-zero retention
automated job applications delivered per paying user, monthly
The Onboarding Problem
What existed before — and why it failed
I started with what felt like the right instinct: a comprehensive 7-step onboarding form. The form covered visa status, work arrangement preferences, salary expectations, preferred industries, location, and company size — everything the AI needed to do its job well. Completeness seemed logical. The data requirement was real.
Original 7-step onboarding — step 2/7. Hotjar recordings showed steep drop-off from this point.
Behavioural data analysis via Hotjar session recordings told a brutal story: most users clicked through step one with enthusiasm, slowed at step two, and abandoned entirely around step four. The users who did push through all seven steps almost always converted to paid — the problem was pressing enough for them that they powered through. But they were a tiny fraction of everyone who started.
Hotjar behavioural data analysis: 831 sessions, 18:27 avg. session duration. Top-clicked CTA was "Sign In" — users were returning, not converting.
My first hypothesis was that the form was simply too long. I cut it to 3 steps. Drop-off reduced but persisted. User interviews then surfaced a sequencing problem rather than a length problem: I had been asking for sensitive data — visa type, work authorisation — before showing a single piece of product value. We had the sequence completely backwards.
- V1, 7-step form: complete upfront data collection. Near-total abandonment at step 4.
- V2, 3-step form: drop-off reduced but persisted. User interviews revealed the issue wasn't length but sequence.
- V3, value-first flow: product output shown before the paywall, single-screen completion. Conversions improved immediately.
Competitive Research — Activation Patterns That Worked
To find the right onboarding model, I ran a competitive audit of high-converting mobile apps using a value-first activation pattern. Two clear patterns emerged: show a personalised output immediately, and defer the payment ask until after the "aha moment."
Restore — paywall deferred until after personalised value is shown
Restore — progressive data collection, minimal friction per step
Value-first prediction: personalised output shown before any payment ask
CoinSnap — core value (identify + grade) surfaced directly from the home screen
Re-engineering the Form — Value-First Onboarding
The audit distilled into two principles that changed everything: collect the absolute minimum upfront, and show value before asking for anything sensitive. I had been doing the opposite: presenting a paywall at the end of a data-collection exercise, asking users who had given everything and received nothing to now hand over their credit card.
The "Aha Moment" — Activation Metric Optimisation
Once users experienced the product before the paywall, the conversion ask finally had context behind it. The redesigned flow: user completes one minimal page → app hits the API → pulls real matching job listings → generates a tailored CV for one of those roles on screen, in real time. Not a demo video. Not a marketing mockup. Their actual data, processed instantly, producing a real output in under a minute. Only after that moment did we present the paywall with a visible option to skip to the free tier.
Incremental fixes surfaced through ongoing behavioural data analysis:
- Dropdown menus for job titles replaced with open text fields — support tickets about missing job titles stopped the same week
- Country selector expanded to city-level — users targeting specific metro areas could finally set that
- Company & Industry exclusion list added — users told the AI which employers to avoid based on past experience. This one feature alone visibly increased trust during onboarding
- Multi-country + multi-city support: select Canada, then specify Toronto or Vancouver
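Taken together, those fixes imply a preference shape along these lines. This is a minimal sketch with assumed field names, not the real data model:

```python
from dataclasses import dataclass, field

@dataclass
class SearchPreferences:
    job_title: str                                   # open text, not a dropdown
    locations: dict[str, list[str]] = field(default_factory=dict)  # country -> cities
    excluded_companies: set[str] = field(default_factory=set)
    excluded_industries: set[str] = field(default_factory=set)

    def allows(self, job: dict) -> bool:
        """Filter applied before the AI ever submits an application."""
        if job["company"] in self.excluded_companies:
            return False
        if job["industry"] in self.excluded_industries:
            return False
        cities = self.locations.get(job["country"])
        if cities is None:
            return False          # country not selected at all
        return not cities or job["city"] in cities  # empty list = whole country
```

Mapping countries to optional city lists covers both cases from the list above: "Canada" alone, or "Canada → Toronto, Vancouver".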
The Conversion Engine
Two Dashboards. Two Jobs to Be Done.
Applying jobs-to-be-done to information architecture revealed that free users and paid users had fundamentally different needs — not just different features. Free users needed to want the product. Paid users needed to trust it. I designed each dashboard for exactly that outcome.
Free Dashboard
Its purpose was to show value and encourage upgrades. A free CV ATS analysis was included to create desire: "if this is free, what's paid like?" Matched jobs were visible but locked, with an unavoidable upgrade prompt: "Get 200+ applications submitted automatically on your behalf."
Paid Dashboard
It served as a live command centre, visibly tracking applications through a pipeline: Found → Tailoring CV → Submitting → Submitted. Crucially, users could open any application to see the exact CV, cover letter, and AI-written form answers used. This deliberate transparency was essential for retaining users who entrusted the platform with their job search.
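The visible pipeline reduces to a small state machine. A minimal sketch, with state names mirroring the dashboard copy (the enum and helper are illustrative, not the production code):

```python
from enum import Enum

class ApplicationStage(Enum):
    FOUND = "Found"
    TAILORING_CV = "Tailoring CV"
    SUBMITTING = "Submitting"
    SUBMITTED = "Submitted"

# Stages in display order, as shown on the paid dashboard.
PIPELINE = list(ApplicationStage)

def advance(stage: ApplicationStage) -> ApplicationStage:
    """Move an application one step forward; Submitted is terminal."""
    i = PIPELINE.index(stage)
    return PIPELINE[min(i + 1, len(PIPELINE) - 1)]
```

Making the states explicit is what lets the dashboard show progress truthfully: every application is always in exactly one visible stage.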
Dashboard Evolution
The paid dashboard was iterated across multiple cycles based on user feedback and session data.
Transparency as a Retention Tool
Paid users could open any submitted application and see exactly what the AI filled in — CV, cover letter, personal information, and form answers. This visibility was critical to trust.
Application detail view: full transparency into every field the AI submitted — personal info, work authorisation, salary expectation, CV and cover letter.
The Business Model Shift
Subscriptions weren't converting. I recommended the pricing model shift based on financial risk patterns I was seeing in user interviews — job seekers were financially stressed, making a recurring monthly charge feel like a risk they couldn't absorb. We moved on it the same week. Switching to a one-time payment, explicitly stated in the hero section, immediately boosted conversions by removing a previously unnamed objection.
Subscription → consistent resistance, low conversion
One-time payment stated upfront → immediate spike in signups and paid conversions the same day it launched
Instalment option added → unlocked a segment that wanted the product but couldn't commit to a lump sum
Final pricing card: $469 one-time, instalment options via Afterpay & Klarna. "One-time payment. No subscription." stated prominently.
Landing Page — The 80% Rule
Behavioural data analysis showed 80% of ad traffic never scrolled past the hero. I had been optimising the wrong part of the page. Realising most visitors decided on the first screen, I shifted all investment from below the fold to the hero section. I rewrote the hero three times before it was effective — early, feature-focused copy performed poorly. The breakthrough came from accepting the uncomfortable truth: users prioritise what the product does for them, not what it does.
"AI-powered job applications"
Generic sub-copy about AI capability
"Sign Up"
"Land your dream job without sending hundreds of applications every week"
Pain point addressed + preference reassurance in one line
"Start Applying For Me"
Early version — generic pain framing, feature-led copy
Final version — outcome-led headline, one-time payment stated in hero, dual CTA
Retention & Reactivation
The Reactivation Engine
A segment of users signed up, went quiet, and never came back. Rather than accepting this as normal churn, I identified it as a reactivation opportunity — users who cared enough to sign up but hadn't yet seen the product work for them personally. I designed an AI email engine where no two emails were the same.
The engine matched each dormant user's profile (role, location, preferences, CV) with live job listings and crafted personalised emails — not generic re-engagement blasts, but a specific role, a tailored fit analysis, and an explicit offer:
"I found this role at [Company]. Based on your profile, here is why I think you are a strong match. Here are a few things worth considering before applying. Should I handle the application for you?"
Subject line: personalised with recipient name and specific role context — not a generic re-engagement blast.
Body: specific role, company context, personalised fit rationale — drafted by the AI from the user's live profile.
Sign-off: explicit offer to handle the application + direct application link. Sarah (JobgapAI Career Assistant) as sender persona.
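The email assembly can be sketched as a template over the matched role. This is an illustrative reconstruction built around the template quoted above; the function name and fields are assumptions, not the production engine:

```python
def draft_reactivation_email(
    user: dict,
    job: dict,
    fit_reasons: list[str],
    sender: str = "Sarah, JobgapAI Career Assistant",
) -> dict:
    """Draft a personalised reactivation email from a dormant user's
    profile and one live, matched job listing."""
    subject = (
        f"{user['name']}, a {job['title']} role at {job['company']} "
        "looks like a strong fit"
    )
    body = (
        f"I found this role at {job['company']}. Based on your profile, "
        f"here is why I think you are a strong match: {'; '.join(fit_reasons)}. "
        "Should I handle the application for you?"
    )
    return {"subject": subject, "body": body, "from": sender, "link": job["url"]}
```

Because the subject, fit rationale, and link all derive from one specific listing matched to one specific profile, no two emails come out the same.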
Once users saw the product actively working on their behalf before committing financially, the conversion ask carried evidence rather than promise. Paid conversions clustered consistently at emails 2 and 3: users needed to see the product working for them more than once before committing.
Results
Every decision compounded. Here is where the funnel landed.
Once the sequence matched the user's trust curve — value before data collection — abandonment collapsed.
7 steps of near-total abandonment reduced to a single screen users actually completed
Once users experienced the product before the paywall, the conversion ask finally had context behind it.
£50,000 in monthly sales generated consistently from the optimised funnel
Removing the recurring risk from financially stressed job seekers immediately cleared a hidden objection.
One-time payment switch produced an immediate, visible uplift the same day it launched
Personalised, profile-matched emails gave dormant users a concrete reason to return rather than a generic reminder.
3-part email sequence reliably converted dormant users at message 2 and 3
Automating the full application loop meant volume scaled with AI, not with human hours.
35,000+ automated job applications delivered per paying user every month
The value-first preview flow gave users proof the product worked before asking them to pay.
Value-first preview flow drove the strongest free-to-paid rate in the product's history
Personal Learnings
Earn the right to ask
Asking for sensitive data too early reduces conversions and breaks trust. I learned this after three form versions — each one a deliberate experiment, each one teaching something the previous hadn't.
The sequence is the solution
The 7-step form failed not due to length, but because it appeared before the user was invested. The fix wasn't subtraction — it was reordering.
The business model is a UX decision
The shift from subscription to one-time payment was driven by understanding job seekers' financial risk tolerance. Pricing is a product decision, not just a commercial one.
Transparency retains users
Visible AI decisions — exact CVs, cover letters, form answers — ensured paid users returned. They needed proof their delegated job search was active.
Kill experiments before they become the default
Quickly abandoning unsuccessful elements — free trial, 7-step form, subscription model — based on data led us to the combinations that compounded.