
AI is handy for mortgage curiosity but leaves advisers correcting oversimplified information – Marketwatch

Written by: Shekina Tuahene
Posted: December 3, 2025
Updated: December 3, 2025

With people consulting some form of an artificial intelligence (AI) tool to gather information on various topics, it makes sense to assume it is also being used for mortgages.

However, research shows that large language models (LLMs), such as ChatGPT, are not 100% accurate and can struggle with quantitative data, which could pose problems if people rely on them to assess their mortgage options.

So, this time, Mortgage Solutions is asking: Do you come across clients who rely on AI for mortgage information, and what are the challenges in correcting what the technology tells them? 


Jason Foord, founder of Verifi Mortgages 


I actually talk about this a lot with my network, as we work with an AI team for our RateDrop app, and I believe that AI isn’t the issue; unregulated AI is. 

AI can be a fantastic tool for helping people understand the basics, like what loan to value (LTV) means or how early repayment charges (ERCs) work. It breaks down the noise better than most online guides and gives people a helpful starting point before speaking to a broker. But there is a fine line between using AI to learn the basics and relying on it for mortgage criteria or product advice.

These models are not regulated by the Financial Conduct Authority (FCA), they do not have access to live lender systems, and they can't read lender websites word for word. They only know the information fed into them, and because lenders update criteria and rates constantly, accuracy is never guaranteed. 

I’m seeing more clients come to me with AI advice that is simply wrong, especially around bad credit. AI often tells people that a low credit score means an automatic decline, when in reality, several lenders don’t credit score at all; they manually review the credit report. I have also spoken with clients who said AI advised that missed payments did not matter, or that county court judgments (CCJs) must be satisfied before applying, both of which are untrue, as it depends entirely on the lender.

These oversimplifications are well-known issues with unregulated AI tools. 

Some clients do question advice because an AI tool has told them something different, but that is where an FCA-regulated adviser should step in to re-educate their client by showing them real lender criteria and affordability figures.

AI is a great starting point, but it can’t replace regulated, up-to-date advice from someone dealing with real cases every day. 

 

Sam Fox, founder of UK Mortgage Centre 

Yes, this is definitely something we’re seeing more often. Over the past year especially, clients have been coming to us after using AI tools to ‘check’ their mortgage options before speaking to us.

On the positive side, it means they’re engaged and curious, which is never a bad starting point. But the challenge is that AI often gives them information that’s very generic and vanilla. If you’ve used it yourself, it will always feel like it defaults to agree with you and summarise an answer or thought process. 

The biggest hurdle isn’t that they don’t trust us, but that they assume AI is pulling from accurate information simply because they asked it one question. Where buyers used to spend their own time researching, people now accept the AI information at face value, then ask us the questions. 

So, a client might come in convinced they can use a certain lender, and we then have to unwind that expectation and explain, based on our experience, why that lender isn’t an option. 

It doesn’t make advice harder; we’ve always had to weigh our advice against clients' own research, but it does add an education step at the beginning of the journey. Once clients understand that AI can give broad guidance but not personalised, regulated advice, they’re generally relieved to have a human providing real advice. 

Broadly, clients are happy to engage with AI, and so are those of us working in the industry, but it’s the relationship piece that carries through. 

 

Michelle Niziol, CEO of IMS Property Group 

Yes, we are absolutely seeing more clients arrive having used AI tools like ChatGPT or online comparison platforms to pre-research their mortgage options. 

In many ways, that curiosity is positive; people are more engaged with their finances than ever. However, the challenge is that AI can only work with generic information and assumptions, whereas mortgages are deeply personal, regulated and dependent on nuance. 

Where it becomes problematic is when clients understandably assume the information they’ve received is precise for their situation. In reality, AI does not factor in complex income structures, credit history nuances, adverse events, portfolio expansion strategy, lender appetite shifts or real-time policy changes. 

That is where re-education becomes necessary. 

We do not find clients questioning professional advice in a negative way, but they do often ask: “Why is this different from what AI told me?” That opens the door for a valuable conversation about context, regulation, affordability modelling and lender discretion, all the elements technology simply cannot apply with accuracy. 

AI is a useful starting point for awareness, but it is not a substitute for bespoke advice. Our role as brokers has evolved into not just arranging finance, but also translating complexity into clarity and protecting clients from costly misunderstandings.