Doubling digital affordability check opt-in
Increasing opt-in by 216% in tenant reference checks, saving ~£175K/yr in costs.
2023 | B2C responsive web research & design, Rightmove
Product
Referencing product, Lead to Keys Suite
Team
Product Owner, Engineering Lead, QA, 2 x engineers, Head of Lettings, Head of Referencing, Head of Product, General Manager of Referencing
First, a bit of context. What is an affordability check at Rightmove?
Tenants being referenced for a rental property undergo mandatory affordability checks conducted by our internal referencing team on behalf of an agent & landlord.
Two ways affordability can be checked:
- Digitally – utilising open banking tech to instantly pull a consented, one-time 12-month snapshot of the tenant's ingoings & outgoings (sketched below).
- Manually – bank statements are requested via email and manually reviewed by the team for fraud detection & verification.
Diagram to illustrate context, process & actors involved.
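For context, a minimal sketch of what the digital path does conceptually, assuming a generic open-banking-style provider API; the endpoint, token handling and field names here are hypothetical and not the actual Rightmove/Equifax integration. Once a tenant grants one-time consent, a 12-month transaction snapshot is pulled and summarised into average monthly ingoings and outgoings for the referencing team.

```ts
// Hypothetical types and endpoint - illustration only, not the real integration.
interface Transaction {
  bookedAt: string; // ISO date of the transaction
  amount: number;   // positive = ingoing (credit), negative = outgoing (debit)
}

interface AffordabilitySnapshot {
  months: number;
  averageMonthlyIngoings: number;
  averageMonthlyOutgoings: number;
}

// Pull a one-time, consented 12-month snapshot and summarise it for the referencing team.
async function buildAffordabilitySnapshot(consentToken: string): Promise<AffordabilitySnapshot> {
  // Placeholder URL; a real provider exposes its own consented-data endpoint.
  const response = await fetch("https://openbanking-provider.example.com/v1/transactions?months=12", {
    headers: { Authorization: `Bearer ${consentToken}` },
  });
  const transactions: Transaction[] = await response.json();

  const ingoings = transactions.filter((t) => t.amount > 0).reduce((sum, t) => sum + t.amount, 0);
  const outgoings = transactions.filter((t) => t.amount < 0).reduce((sum, t) => sum - t.amount, 0);

  return {
    months: 12,
    averageMonthlyIngoings: ingoings / 12,
    averageMonthlyOutgoings: outgoings / 12,
  };
}
```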
The problem
- Our key OKR was to reduce the cost of a reference by 35%, and open banking technology was our main product lever to achieve that.
- Six months after implementation, despite a 22% reduction in cost per reference and a saving of 1.5 days per reference per headcount, adoption sat at half the industry standard.
- For open banking's cost savings to scale, we needed to raise opt-in and subsequent conversion.
How it works
Tenants consent to one-time access to annual income & expense data as part of their tenancy reference.
The alternative involves gathering payslips and other sensitive documents and sending them manually via email.
Benchmarks
Our opt-in rate was 30%, contrasting with the industry & competitor benchmark of 60%.
My task was to investigate the reasons behind this discrepancy and design a solution to optimise adoption.
Goals
50% tenant flow completion, which requires 65% opt-in less the 15% expected drop-off.
Open banking boosts security against fraud, increases efficiency and decreases internal cost per reference, a key OKR for the business.
What we set out to solve
There were clear heuristic improvements to be made to the experience, but generative research would need to be undertaken to understand the 'why' behind this low adoption rate.
- Enhance user experience to increase adoption rates.
- Identify and address reasons for low adoption rate.
- Effectively communicate benefits to help tenants make an informed decision.
Previous Open Banking experience
Cross-disciplinary kick-off
Goal: align a large number of tricky stakeholders on some clear goals to deliver impact quickly.
I brought together 8 stakeholders to build a shared understanding of the current experience and appraise the challenges alongside the qual and quant insights we had available. This helped the group understand the problems, empathise with users, and define the hypotheses and metrics for the project.
Deepening the team's shared understanding of the problems and their significance.
Analysing the flow together to build shared understanding, challenge assumptions, and share knowledge.
We voted and aligned on targets for the project: 65% of consumers opting in to use Open Banking, and 50% completing the flow, given the industry-standard 15% baseline drop-off.
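As a quick back-of-envelope on how those two targets relate (my own illustration of the arithmetic, treating the 15% drop-off as percentage points of all tenants entering the flow):

```ts
// Kick-off targets, expressed as shares of all tenants entering the referencing flow.
const optInTarget = 0.65;     // tenants choosing the digital affordability check
const expectedDropOff = 0.15; // industry-standard baseline drop-off

// Completion target = opt-in target less the expected drop-off.
const completionTarget = optInTarget - expectedDropOff;

console.log(`Completion target: ${(completionTarget * 100).toFixed(0)}% of tenants`); // 50%
```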
Other success measures defined:
- 8/10 satisfaction in a Hotjar survey, built to gather a CSAT rating and qualitative insights from tenants to understand performance.
- Reduce affordability-check-related inbound by 20%.
First round of User Research
Moderated - attitudinal & evaluative testing of current experience
Method
5 moderated user tests with recent renters in the UK aged 19-55.
- Used Usertesting.com.
- Product Owner observed & took notes.
- Users were asked questions about their experience with the referencing process generally.
- Users were shown the original flow and asked to think aloud, explaining their expectations and understanding.
Goals
Understand why people drop off, from a qualitative perspective.
- Understand any concerns with regard to open banking & trust generally.
- Gain insight into how well tenants understand the process & how they perceive it from the current experience.
- Use these insights to optimise the design, addressing concerns & friction points, to double the opt-in rate.
Analysed findings with the Product Owner
- Triangulated qual and quant insights to frame the problems.
- Refined these into problem statements.
- Devised 'How Might We' (HMW) statements to solve in the next design iteration.
- Drew rough flows and wireframes in Miro and on paper to discuss flow integration, which allowed engineers to run a feasibility spike early.
Round 1 user testing raw insights & analysis
Key Findings
Most tenants felt they didn't understand the Open Banking process well enough to commit, despite understanding the benefits.
Security was the core concern
- 100% of participants were concerned about the depth and level of access required.
- Participants also raised concerns about the 12 months of access, as previous experiences had only required 3 months of information.
- Comprehension of what Open Banking was remained poor until the later steps.
Most would actually opt in if we surfaced the information displayed on the consent page earlier.
- All 3 users who would have opted out immediately (had they not been under test conditions) would have been happy to continue after seeing the consent page; it mitigated all of their concerns.
- Only 1 of the 5 participants said they'd opt in.
- 3 dropped off at the first step, 1 at the consent step, and 1 made it to the end.
Quotes from testing
Problems and 'How Might We' statements to solve
Constraints to call out
3rd party changes limited
Our partner Equifax offered limited customisation of the screens that followed our product experience. Their flow also had to open in a new tab, which made the experience more confusing for users.
High business risk
The current implementation was already saving 22% per reference. At scale, it was a big part of driving down the cost to serve, a key OKR. We risked losing those cost savings by making changes, and the solution needed to remain compliant.
Limited design system and illustrations
I wasn't able to influence the design of any part of the screen except the body.
We could only make changes to select copy on these 3rd-party screens.
Key design changes informed by quant and qual research
Hypotheses after round one of testing
- Integrate open banking into the 'Finances - Affordability' section
This contextual placement will lead to increased user engagement, as it better aligns with users' mental models.
- Make the choice binary and mandatory
This will heighten the perceived importance of the step, resulting in increased user engagement.
- Prioritise key information upfront
Optimise the information hierarchy by surfacing reassuring elements, in more precise language, earlier in the flow.
- Educate on the alternative manual choice
Educating users on the alternative manual affordability check will improve trust through transparency, fostering clearer comprehension.
Some earlier iterations experimenting with skip, progression, and balance of information.
Iterations round 1
Second round of user testing
Unmoderated - evaluative & usability testing of new design
Method
5 unmoderated user tests with recent renters.
- Used Usertesting.com.
- Evaluative usability & perception test.
- Users were shown the iterated conceptual flow and asked to think aloud, explaining their expectations.
Goals
Understand to what extent the new design addresses the problems identified.
- Understand whether integration into the flow helped contextualise the prompt.
- Understand to what extent the copy sufficiently addresses security & education concerns.
- Identify any usability issues.
- Understand to what extent the comparison aided comprehension.
Key takeaways
- Integration of the step into the 'Finances - Affordability' section was not just understood but expected by users.
- Introducing the comparison between manual and digital helped users feel empowered to make an informed decision, and education was no longer a problem.
- More users reported they were likely to opt in as a result of clearer key information upfront.
- Still unsolved: security remained a concern for some participants.
- Learning: with all of this information and the caveats, users felt it might take a long time to complete.
User testing findings blurred.
"...if it was reviewed by a professional website then I would do [the digital check]."
Tenant from 2nd round testing
Changes after unmoderated testing - round two
Security concerns
Security was still a concern for 2 of 5 users
- Both suggested displaying any accreditations as reassurance of legitimacy.
- Added 'FCA regulated' copy and the provider's logo.
- Clarified data-sharing details and duration.
Expectations of time
Helping users understand how long each choice would take.
- The '44% faster' statistic was lost on users, as they didn't know how long this step would normally take.
- Instead, the final design displays an estimate in minutes for each of the manual and digital options.
Design optimisations
Moving the choice above the fold
- The accordion functioned well in testing but pushed the manual section below the fold – a poor experience, as the user must make a choice to proceed.
- Opted to replace it with a tooltip to reduce vertical space (a rough sketch follows below).
Taken from actual project progress update deck.
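As a rough sketch of the pattern we settled on (a plain DOM illustration with hypothetical element IDs, not the production implementation): a binary, mandatory choice, with the manual-check detail tucked into a tooltip so both options stay above the fold.

```ts
// Hypothetical element IDs - illustration only.
const digitalOption = document.querySelector<HTMLInputElement>("#option-digital");
const manualOption = document.querySelector<HTMLInputElement>("#option-manual");
const continueButton = document.querySelector<HTMLButtonElement>("#continue");
const tooltipTrigger = document.querySelector<HTMLButtonElement>("#manual-info-trigger");
const manualTooltip = document.querySelector<HTMLElement>("#manual-info-tooltip");

// The choice is binary and mandatory: continue stays disabled until one option is picked.
function updateContinueState(): void {
  const hasChoice = Boolean(digitalOption?.checked || manualOption?.checked);
  if (continueButton) continueButton.disabled = !hasChoice;
}
digitalOption?.addEventListener("change", updateContinueState);
manualOption?.addEventListener("change", updateContinueState);

// A tooltip toggled on demand replaces the always-expanded accordion,
// keeping both options above the fold.
tooltipTrigger?.addEventListener("click", () => manualTooltip?.toggleAttribute("hidden"));
```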
Flow diagrams
I first put together a flow diagram of the current state to understand the experience while gathering requirements.
Later, I mapped the new experience.
I find flow diagrams help a lot in facilitating focused conversations with the engineers and QAs, from early feasibility input all the way through to handover, QA testing and performance monitoring over time. They also help me identify any edge cases or unhappy paths that may have been overlooked.
Handover & early release performance issue false alarm
Monitoring performance after release led to some great teamwork and the diagnosis of issues for quick iteration.
An insight into my approach to identifying tracking and performance issues:
- Jumped on a call with the Product Owner to get the context.
- Interrogated the validity of the data myself - dove into Google Analytics (GA) to poke around. How were our funnel categories defined? Were we double counting? Unique users or unique sessions? (A sketch of this kind of check follows after this list.)
- Mapped out data points (in blue) and issues (in red) along the journey for engineers to investigate.
- Wrote out the desired metrics & definitions for engineers to implement (top of screenshot).
- Requested data from the 3rd party to triangulate.
- I also looked into other dimensions like devices and browser sessions.
- 75% of users were on Safari, which led to the discovery of a Safari pop-up issue that explained the high drop-off rates.
- Sat alongside the QA to reproduce the error successfully.
- Finally, looked in Hotjar to see what trends I could find. Observed the Safari issue first-hand and picked up on another issue with the flow for the unsuccessful path.
- This led to a quick workshop with the engineers to get on the same page (I created a flow diagram) and the design of some quick post-launch iterations to make the experience more seamless.
- Speedy collaboration with the cross-disciplinary squad to diagnose issues.
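To make the diagnostic approach concrete, here is a minimal sketch of the kind of check described above, using a hypothetical event shape rather than our actual GA schema: count unique users (not raw events or sessions) at each funnel step, then split by browser, which is how a signal like the Safari drop-off surfaces.

```ts
// Hypothetical analytics event - not the real GA schema.
interface FunnelEvent {
  userId: string;
  step: "intro" | "consent" | "provider" | "complete";
  browser: string; // e.g. "Safari", "Chrome"
}

// Count unique users per browser and step, so users who repeat a step
// across sessions are not double counted.
function uniqueUsersByBrowserAndStep(events: FunnelEvent[]): Map<string, Map<string, number>> {
  const users = new Map<string, Map<string, Set<string>>>();
  for (const e of events) {
    if (!users.has(e.browser)) users.set(e.browser, new Map());
    const steps = users.get(e.browser)!;
    if (!steps.has(e.step)) steps.set(e.step, new Set());
    steps.get(e.step)!.add(e.userId);
  }
  const counts = new Map<string, Map<string, number>>();
  for (const [browser, steps] of users) {
    const stepCounts = new Map<string, number>();
    for (const [step, ids] of steps) stepCounts.set(step, ids.size);
    counts.set(browser, stepCounts);
  }
  return counts;
}

// Drop-off between consent and completion for one browser segment.
function consentToCompleteDropOff(stepCounts: Map<string, number>): number {
  const consented = stepCounts.get("consent") ?? 0;
  const completed = stepCounts.get("complete") ?? 0;
  return consented === 0 ? 0 : 1 - completed / consented;
}
```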
The Solution
Integrated the open banking check into the Finances step of the application form, reduced exit points, and provided security reassurance with copy and visual cues.
Screens 3-7 not designed by me.
Results
Small scope, mighty impact
- Increasing opt-in from 30% to 65% was a significant achievement.
- It reduces headcount costs significantly and saves our team ~1.5 days per reference.
- It was the biggest contributor towards the key OKR of reducing referencing costs.
£174,240
Est. annual cost-savings from efficiency
+216%
Tenants starting digital check
+21%
Tenants completing digital check
"Sophie, you are an absolute gun"
Head of Product, Lead to Keys
"I wish we'd had you 15 years ago"
Head of Lettings
Learnings
The key to speed is to slow down at the beginning.
I think the success of this project hinged on a well-planned kick-off that aligned stakeholders, focused on the initial metrics, and secured buy-in from tricky stakeholders.
As a group, we audited the experience, triangulating several SME perspectives across user, internal and customer viewpoints, which allowed us to streamline our user interviews by addressing known issues upfront, saving us time.