Enrico Louw, General Practice Principal at Old Mutual Personal Finance
More customers are arriving at meetings with financial plans created by generative artificial intelligence (AI), believing they have already mapped out their financial journey from start to finish. But beneath the neat spreadsheets and projections lie gaps: overgeneralised savings rules, overconfident return assumptions, and overlooked protections such as severe illness or estate planning.
It’s exactly these shortcomings that highlight why human advice remains essential, says Enrico Louw, General Practice Principal at Old Mutual Personal Finance. “For financial advisers, the real opportunity is to show the value of a human-led financial plan. It’s a plan that grows with people’s lives, adapting to their circumstances and anticipating challenges in ways digital advice cannot.
“It’s encouraging to see customers taking the initiative to map out their financial futures, but a plan built solely on a large language model is, at best, an average answer based on probability. It can only go so far. This is where we step in to turn those first steps into something robust, personal, and aligned with long-term financial wellbeing.”
The Human Edge Over AI
Spotting these shortcomings also shows why human advice is indispensable. It positions the financial adviser as the family's financial coach, protects customers from costly mistakes, and demonstrates a depth of judgment no algorithm can replicate. "That depth becomes even clearer when compared with the limitations of AI," says Louw.
While computer models can crunch numbers and outline generic strategies, they cannot account for personal values, a change in life stage, or how an individual might react under market stress. “AI won’t ask: ‘How would you feel if your investment dropped 20–30%?’ That emotional layer, how customers react under stress, is where human financial advisers can show their expertise,” Louw explains.
He cautions that AI can also act like a yes-man, a tendency known as sycophancy, reinforcing what customers want to hear rather than challenging their assumptions. "That's where the real value of advice comes in," he says. "We ask the tough questions, stress-test overly optimistic projections, and bring realism into the discussion so customers are prepared for setbacks as well as successes."
Ultimately, the rise of AI-driven Do-It-Yourself (DIY) plans signals a shift in the customer journey, not the end of advice. “By turning flawed digital blueprints into collaborative conversations, financial advisers can reaffirm their role as trusted partners: delivering not only financial strategies but also reassurance, perspective, and human judgment that technology cannot replicate,” says Louw.
From Red Flags to Relationships
By reframing AI’s shortcomings as teachable moments, financial advisers can turn potentially misinformed customers into engaged partners. Respectful correction builds trust and confidence, while also underscoring the limits of machine-generated advice. “AI may start the conversation, but financial advisers give it real direction,” Louw notes.
Beyond building trust, financial advisers who address these red flags also strengthen their customers’ long-term financial wellbeing. “When we respectfully correct flawed assumptions, customers feel guided rather than judged,” says Louw. “That’s what builds lasting relationships and keeps the customer engaged over the long term.”
He adds: “Each shortcoming in a DIY plan is a chance to guide customers with expertise and sound knowledge, improve financial wellbeing, and build stronger relationships rooted in trust. In a digital-first world, that human edge is the financial adviser’s greatest advantage.”
Six Gaps to Spot in a DIY AI Plan, and How to Respond
1. Overgeneralisation: Customers may assume rules like "save 15% of your income" apply universally, but this ignores personal realities such as life stage, dependents, debt, and financial goals. Response: Acknowledge the customer's effort, then explain why a plan tailored to their unique situation is more realistic and achievable.
2. Overconfidence in Returns: Many DIY plans assume constant high growth, often 10% or more annually, but markets fluctuate, and these inflated expectations can leave customers underprepared. Response: Stress-test the plan with more conservative assumptions and demonstrate how different market scenarios impact outcomes.
3. Blind Spots in Risk Protection: Self-made plans often focus only on retirement and discretionary savings while overlooking other needs, including severe illness, disability, or retrenchment, which leaves customers exposed to life’s curveballs. Response: Use real-life examples to show why protection strategies are critical for long-term financial wellbeing. These are uncomfortable conversations that AI won’t initiate, because it is designed to reinforce what customers want to hear rather than challenge their assumptions.
4. Ignoring Tax Efficiency: AI-generated plans may suggest the right products but fail to calculate actual tax implications, leading to unnecessary costs, fees and taxes from poorly timed withdrawals or insufficient contributions. Response: Walk customers through the tax implications of different options, and illustrate the benefits of structuring contributions (lump sum or recurring) and timing withdrawals more effectively.
5. No Emotional Layer: DIY advice assumes perfect behaviour during volatility, but generative AI often tells customers what they want to hear and doesn’t prepare them for the stress of a 20–30% market dip. Response: Ask reflective questions like “How would you feel if your investment dropped 20%?” to connect numbers to real behaviour and build emotional readiness.
6. Missing Pieces in Holistic Planning: AI often sidesteps difficult but essential topics, such as whether the customer has a valid will, whether beneficiaries are up to date, or whether there is enough liquidity to cover estate costs. Response: These are uncomfortable conversations, but they are too important to ignore. Unlike AI, which acts like a yes-man and avoids raising the subject of death, advisers can address it with sensitivity and ensure families are properly protected.
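The overconfidence gap (point 2 above) can be made concrete with simple compound-interest arithmetic. The sketch below compares the same recurring contribution under an optimistic 10% annual return against a more conservative 6%; the contribution amount, horizon, and rates are illustrative assumptions for discussion, not a projection or a recommendation.

```python
# Illustrative stress test of return assumptions in a DIY plan.
# All figures (contribution, rates, horizon) are hypothetical examples.

def future_value(monthly_contribution, annual_return, years):
    """Future value of a recurring monthly contribution,
    compounded monthly at the given annual rate."""
    r = annual_return / 12   # monthly rate
    n = years * 12           # number of monthly contributions
    return monthly_contribution * (((1 + r) ** n - 1) / r)

optimistic = future_value(2_000, 0.10, 30)    # the DIY plan's 10% assumption
conservative = future_value(2_000, 0.06, 30)  # a more cautious 6% assumption

print(f"At 10% p.a.: {optimistic:,.0f}")
print(f"At 6% p.a.:  {conservative:,.0f}")
print(f"Potential shortfall if markets underperform: {optimistic - conservative:,.0f}")
```

Showing the gap between the two outcomes, rather than debating the "right" rate, turns an abstract warning about inflated expectations into a number the customer can react to.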
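The emotional-readiness question in point 5 also has an arithmetic edge worth showing: losses and gains are asymmetric, so a 20–30% drop needs a larger percentage gain just to break even. A minimal sketch of that relationship:

```python
# Illustrative: the gain required to recover from a percentage drawdown.
# A portfolio that falls by fraction d must then grow by d / (1 - d)
# to return to its pre-drop value.

def recovery_gain(drawdown):
    """Fractional gain needed to recover a given fractional drawdown."""
    return drawdown / (1 - drawdown)

for dd in (0.20, 0.25, 0.30):
    print(f"A {dd:.0%} drop needs a {recovery_gain(dd):.1%} gain to recover")
```

Pairing this with the reflective question "How would you feel if your investment dropped 20%?" connects the behavioural conversation to a concrete cost of panic-selling at the bottom.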
“The message for financial advisers is clear: AI-generated plans are not obstacles but openings. Each gap is a chance to show the value of a human-led financial plan, improve financial wellbeing, and build stronger relationships rooted in trust. In a digital-first world, that human edge, reinforced by regular reviews of the financial plan, is our greatest advantage,” concludes Louw.
ENDS