The Evolution of Financial Advice

Investment firms fear AI will replace human financial advisors. The real question is whether AI might actually grow the market for human advice, not shrink it.

This week I attended The Investment Network (TIN)'s AI conference, which brought together thought leaders from across the investment industry and provided excellent insights and thought-provoking discussion.

The theme of the event was AI (of course). Investment firms have been concerned about the risks of AI services such as ChatGPT being “good enough” for users to obtain investment advice, bypassing the need to work with expert, human IFAs (Independent Financial Advisors).

During the event a live trial of ChatGPT vs a conference full of IFA experts was run, and the results were interesting:

  • For ‘basic’ financial advice questions, ChatGPT provided confident, plausible, and fast answers, though not always accurate ones
  • For most retail potential investors, the accuracy of the advice would be very hard to determine
  • The protections offered by certified financial advice firms and advisers (FCA registered, QCF/RQF qualifications, SPS, CPD) are absent

Despite this, the mood at the conference was one of apprehension: how long until these limitations and barriers are overcome? Is the industry about to enter a period of crisis?

The on-ramp, not the replacement

Before answering that question, it is worth reframing it. The market for financial advice is very large, with only a small minority of citizens having ever used financial advice services. The users who may be fine using an AI financial advice chatbot are statistically unlikely to use a human IFA, and vice versa.

If anything, financial advice apps that improve financial literacy, for example by providing a safe space for users to “ask the dumb questions”, may serve as an on-ramp to IFA clients rather than a substitute for them. The threat narrative assumes a fixed market being carved up by automation. The more interesting possibility is that AI expands the market by bringing more people into the conversation about their financial futures.

The regulatory moat

The strongest structural defence that the financial advice industry has is regulation and liability.

Financial advisors operate within a framework of FCA registration, professional qualifications, complaints procedures, and professional indemnity insurance. When a human advisor gives bad advice, there is a clear chain of accountability and redress. When a chatbot gives bad advice and someone loses their pension, who is liable? The provider? The model developer? The user who chose to trust it?

These are not hypothetical questions. They are unresolved, and until they are resolved, the regulatory framework around financial advice remains a durable barrier to AI replacement that no amount of model improvement can bypass on its own.

The human advantage, for now

There are also genuine capability gaps that favour human advisors today. Investment decisions are not only about risk profile, preferences, life goals, and capital; they are often as much emotional as intellectual. EQ is as important as IQ when making investment decisions, and a good IFA knows how to read their clients as well as produce a well-matched suitability report. Sometimes a little grey hair, and being the trusted advisor who advised your parents, is no bad thing.

A previous client of Yellow Radio operated in the AI-powered mental wellbeing space, providing therapy and counselling services via an AI-powered app built with considerable industry expertise and caution. A target go-to-market segment was higher education students and young people (Gen Z in general); surely they would be willing to use an app?

In short: unlikely. Gen Z and younger users may “live on their phones”, but they often hate the experience and crave in-person interaction, particularly where human perception, judgment, experience, privacy, and safety matter. The research we conducted confirmed this: as of 2025, at least, very few users want to use an AI app instead of seeing a mental health professional. Ironically, seeing a professional in person is perceived as more secure, authentic, accurate, and private than trusting an online app. AI mental health apps have their uses and benefits, but they are not an effective substitute for professional care, and users know it.

When it comes to investment advice, the same principle holds. There are certainly cases where basic financial Q&A via an app is fine, and useful. But for reliable, expert, trusted advice from a professional with lived experience, seeing a human is preferred.

That said, the emotional intelligence gap (reading body language, sensing hesitation, understanding family dynamics) is exactly the kind of capability that multimodal AI and agentic systems are explicitly targeting. It is a real advantage today, but not necessarily a permanent one. And generational preferences for in-person interaction may reflect the quality of today’s AI more than any fixed trait.

No grounds for complacency

None of this justifies complacency. LLM capabilities are advancing rapidly, particularly with the rise of agentic AI, and the stock market’s reaction to perceived AI disruption is hardly rational at the best of times. Financial advice firms would be wise to explore how to differentiate, and how to integrate this technology into their long-term strategy.

But the framing matters. The industry’s instinct at events like TIN is to ask “how do we defend against AI?” The better question may be: “how do we position ourselves to benefit from the clients that AI sends our way?” The winners will be the firms ready to meet newly informed clients with the expertise, judgment, and trust that no chatbot can yet replicate.

Got comments?

We'd love to hear your thoughts on this article.
