Enrola
Intelligent course matching for prospective students
Enrola is an education comparison service that helps Australians navigate the overwhelming landscape of professional training options by recommending quality providers, offering guidance and streamlining enrolments. I joined the early-stage startup as a solo designer to address user problems that were impacting enrolments.

Problem
In mid-to-late 2024, Enrola saw a high proportion of low-quality leads, which slowed down enrolment processes. An estimated 1 in 7 prospective students were applying to the wrong course on the course advisor app. Furthermore, students who did apply to the correct course often missed important practical information.
Outcome
Leading the UX/UI design efforts, I identified and addressed key user problems by improving visual hierarchy and scannability while contributing to a smarter quiz and recommendations flow. The clarity-focused redesign resulted in a 12-15% increase in leads or student enrolments across all but one of the A/B tests.
Research
To understand the reason behind the high proportion of unqualified leads, my initial research involved:
- Collaborating with the student acquisition team to gather qualitative insights about student motivations, needs and concerns.
- Reviewing past A/B test results.
- Analysing chat support logs to measure the prevalence of common questions, such as those about course fees, funded-place eligibility and online availability.
- Conducting a comparative assessment to identify opportunities and gaps in the product.

Key findings
Focusing initially on applicants for popular government-funded courses in the individual support sector, we found a need to address:
- Funding complexity: Government funding eligibility was a major draw but also a significant source of confusion due to varying state-based rules.
- Visual hierarchy and clarity: From session replays, I noted that the recommendations page lacked clear visual hierarchy and alignment, likely causing students to overlook course specifics and eligibility requirements.
- Trust and retention: Alongside the usability and clarity issues identified, we remained mindful of the broader industry challenge of user trust in intermediary platforms and the potential for lead leakage. While not directly quantified, a trustworthy experience was treated as an underlying factor in successful enrolment.

Wireframes
In my wireframes, I focused on the identified usability issues and incorporated opportunities to improve conversion, primarily addressing:
- Need for content clarity: Given the complexity of funding, clear and unambiguous wording of the quiz questions was paramount. Before finalising layouts, we refined the question flow and language, initially using collaborative documents such as Google Docs to iterate quickly and ensure accuracy. This content-first approach helped clarify complex requirements for users.
- State-specific funding logic: To tackle the confusion around government funding, we introduced logic based on the user's postcode, directing them through a tailored sequence of questions specific to their state's eligibility requirements (see the sketch after this list). The aim was to surface accurate eligibility information early and prevent users from applying for unsuitable courses.
- Building trust & credibility: Establishing trust with new customers is essential for a startup, so we tested wireframe and design elements aimed at improving conversion through enhanced credibility. This included adding external reviews for course providers and prominently displaying symbols of authority, such as the Nationally Recognised Training logo and official course provider logos, to reassure users of the legitimacy of Enrola and the courses offered.
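A minimal sketch of how that postcode-driven routing could work, assuming simplified postcode ranges and illustrative question IDs (none of these names reflect Enrola's actual implementation):

```typescript
// Hypothetical sketch: resolve a quiz user's state from their postcode and
// serve that state's funding-eligibility questions. Ranges and question IDs
// are illustrative only.
type State = "NSW" | "VIC" | "QLD" | "SA" | "WA" | "TAS" | "NT" | "ACT";

// Simplified postcode ranges; real Australian ranges have exceptions.
const POSTCODE_RANGES: Array<{ state: State; from: number; to: number }> = [
  { state: "NSW", from: 2000, to: 2999 },
  { state: "ACT", from: 2600, to: 2618 },
  { state: "VIC", from: 3000, to: 3999 },
  { state: "QLD", from: 4000, to: 4999 },
  { state: "SA", from: 5000, to: 5999 },
  { state: "WA", from: 6000, to: 6999 },
  { state: "TAS", from: 7000, to: 7999 },
  { state: "NT", from: 800, to: 999 },
];

// Each state defines its own sequence of eligibility questions.
const ELIGIBILITY_QUESTIONS: Partial<Record<State, string[]>> = {
  NSW: ["residency-status", "prior-qualifications", "employment-status"],
  VIC: ["residency-status", "prior-government-funding", "age-bracket"],
  // ...remaining states would define their own sequences
};

function stateFromPostcode(postcode: string): State | null {
  const code = Number.parseInt(postcode, 10);
  if (Number.isNaN(code)) return null;
  // ACT ranges sit inside the NSW block, so prefer the narrowest match.
  const match = POSTCODE_RANGES
    .filter((r) => code >= r.from && code <= r.to)
    .sort((a, b) => (a.to - a.from) - (b.to - b.from))[0];
  return match ? match.state : null;
}

function questionsForPostcode(postcode: string): string[] {
  const state = stateFromPostcode(postcode);
  // Fall back to asking for the state directly when the postcode is unknown.
  return state ? ELIGIBILITY_QUESTIONS[state] ?? [] : ["state-select"];
}

console.log(questionsForPostcode("3150")); // VIC flow in this sketch
```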
Prototype
Prioritising rapid delivery in the early stages, we tackled the redesign one webpage at a time. This allowed for steady two-week cycles of product updates and simplified A/B testing analysis. The full redesign, from the quiz to the enrolment form, took approximately three months to complete.

We ran six A/B tests to validate each product page's redesign before full release. With the exception of the enrolment form, A/B testing confirmed that my redesign improved conversion rates for leads or enrolments, with each successful test seeing an uplift of 12-15%. Most tests were run until they reached statistical significance.
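For context, deciding when a test has reached significance typically comes down to a two-proportion test on conversion rates. A minimal sketch of that check, with purely illustrative numbers rather than Enrola's results:

```typescript
// Hypothetical sketch: two-proportion z-test for an A/B test's conversion rate.
// The visitor and conversion counts below are illustrative only.
function conversionZScore(
  controlConversions: number,
  controlVisitors: number,
  variantConversions: number,
  variantVisitors: number,
): number {
  const p1 = controlConversions / controlVisitors;
  const p2 = variantConversions / variantVisitors;
  // Pooled rate under the null hypothesis that both variants convert equally.
  const pooled =
    (controlConversions + variantConversions) / (controlVisitors + variantVisitors);
  const standardError = Math.sqrt(
    pooled * (1 - pooled) * (1 / controlVisitors + 1 / variantVisitors),
  );
  return (p2 - p1) / standardError;
}

// |z| >= 1.96 corresponds to p < 0.05 for a two-sided test.
const z = conversionZScore(120, 2000, 138, 2000);
console.log(z.toFixed(2), Math.abs(z) >= 1.96 ? "significant" : "keep running");
```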
While my redesigned form initially saw a statistically insignificant decrease in application submissions compared to the original, zooming out revealed a positive development: the proportion of applicants who went on to commence studies was gradually increasing. It was a learning moment for me, highlighting the challenges of testing large redesigns, where the reasons behind the results can be unclear.
One drawback of the one-webpage-at-a-time approach was the accumulation of minor inconsistencies across pages, which required extra refinement later and partially offset the initial speed gains. Furthermore, disagreements over early visual design decisions surfaced towards the end of the full product redesign, leading to an exploratory visual design phase that affected the final timeline.

Key Takeaways
Prioritise visual design conversations: This project demonstrated the importance of cohesive visual design and branding in building trust and differentiation for an early-stage startup. Moving forward, I'll prioritise product vision and brand direction early, especially when tackling a redesign one page at a time.
Understand the data structure: Designing with a solid grasp of the underlying data was invaluable for collaboration and efficiency. Access to the admin console helped me understand the system's information architecture, such as how fees were structured, ensuring my designs were feasible and efficient to implement. I will continue this content-driven practice in future projects.
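To illustrate the kind of structure involved, fees for a single course might vary by state and funding status along these lines (field names and figures are hypothetical, not Enrola's actual schema):

```typescript
// Hypothetical sketch of per-state course fee data; field names and figures
// are illustrative, not Enrola's actual schema.
interface StateFees {
  governmentFunded?: number; // shown only when the student is eligible
  concession?: number;
  fullFee: number;
}

interface CourseFees {
  courseId: string;
  feesByState: Record<string, StateFees>;
}

const exampleCourse: CourseFees = {
  courseId: "individual-support-cert-iii",
  feesByState: {
    VIC: { governmentFunded: 0, concession: 250, fullFee: 3500 },
    NSW: { fullFee: 4200 },
  },
};

// A recommendation card would pick the right price for the user's state and
// eligibility, falling back to the full fee.
function displayedFee(course: CourseFees, state: string, eligible: boolean): number | undefined {
  const fees = course.feesByState[state];
  if (!fees) return undefined;
  return eligible && fees.governmentFunded !== undefined ? fees.governmentFunded : fees.fullFee;
}
```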