Welcome back, readers, as we continue coverage of Financial Literacy Month. Dedicated readers will remember that last year, I wrote an article on how TikTok has become the most trusted source of financial literacy and advice for younger generations. A year later, TikTok still remains a top contender, but it doesn’t reign quite as supreme as it once did. Today, I want to discuss a new competitor that’s entering the race to be your members’ most trusted financial companion: AI.
In the last few years, you’ve probably read articles arguing that AI is the best thing that could happen to your credit union, others arguing it’s the worst, and perhaps a few somewhere in the middle. But no matter your stance, and whether or not your credit union has chosen to adopt AI, the technology is already permeating your credit union through your members, and they’re using it in one of the most worrisome ways possible: to get financial advice.
Just last year, Credit Karma released a survey reporting that 66% of Americans are turning to AI for financial advice, a figure that climbs even higher among younger generations, reaching 82% for both Gen Z and Millennials. Most commonly, participants asked AI about “financial education and basic personal finance concepts (35%), financial goal setting and action plans (35%), budgeting and expense management (34%), optimizing savings (33%), saving for retirement (31%), and investing in the stock market (32%).”
As the list above shows, most of this AI use is aimed at very basic financial advice and education. But if members are looking for help, why turn to AI and not the credit union?
AI: boyfriend, therapist, and financial counselor
Since its inception, AI has been slowly gaining the public’s trust, both directly and indirectly. Millions reach out to AI platforms such as ChatGPT every single day, while others experience forced exposure to the technology, such as Google’s AI Overview, which now appears by default at the top of many searches.
With this increase in experience and exposure, many have become comfortable (maybe too comfortable) with AI. In the last few years, AI chatbots have become the largest provider of mental health support in the US, according to a recent survey, while other studies show millions forming relationships with AI companions, including over 1 in 5 US adults who have engaged with a romantic chatbot. For millions, AI serves as a therapist, friend, and partner.
It all essentially boils down to this: users trust AI, are comfortable with the technology, and are open to sharing their concerns with it. Given that level of trust and openness, it’s no wonder they’re now looking to AI to be their personal financial advisor.
For many, getting access to sound financial advice can be a tricky and nerve-wracking experience. Members often feel embarrassed asking questions about basic financial concepts, as they worry about looking stupid for not knowing the answer. They might also feel ashamed of their current financial situation, their lack of knowledge, or their poor financial choices. This fear of being poorly perceived can make them hesitant to utilize available resources, such as going into a branch or meeting with a financial advisor.
These emotions are a major force in determining where and how members get their financial education and support. There’s a reason TikTok became so popular in the financial education sphere, and, spoiler alert, it’s the same reason AI is growing in popularity now: ease of access, ease of use, anonymity, and lack of judgment. Scrolling through their phones or typing questions into Google is a lot easier and less complicated than scheduling an appointment, driving to a branch, and telling another person, a total stranger, where they’re struggling.
The importance of accessibility also cannot be overstated, especially for those living in underbanked and underserved communities, who may lack a trusted community financial institution or one that offers financial counseling. Of course, a credit union is going to be a better resource for sound financial advice, but if the member lives in a banking desert, doesn’t have a car, or simply works a 9-to-5 (the same hours the credit union is open), AI’s accessibility is going to win out every time.
Okay then, if AI is so trusted and accessible, and members feel comfortable using it, what’s the issue?
Widely trusted, often wrong
While it’s great that members have an accessible and trusted source of financial advice to turn to, that trust is often misplaced when it comes to AI. For all that it can do, AI’s biggest weakness is still that it’s often wrong—and confidently so. Anyone who has ever skimmed that aforementioned Google AI Overview has surely seen incorrect information pop up there more often than they’d like.
According to a New York Times article, “testing suggests that approximately one in 10 Google AI search overviews contains false information. Given that the search engine processes roughly 5 trillion queries per year, users could be exposed to more than 57 million inaccurate answers each hour—nearly 1 million per minute…Researchers also found that AI overviews are susceptible to manipulation. In one example, a BBC journalist published a blog post containing false information and later found that Google repeated those claims the following day.”
As for ChatGPT, a 2025 study found that not only was the program wrong roughly 20% of the time, it was right only 60% of the time even when it wasn’t relying on guesswork, and its answers were wildly inconsistent when asked the same question repeatedly: 27% of the time, ChatGPT would flip its response, calling something true one moment, false the next, and then true again.
“Current AI tools don’t understand the world the way we do — they don’t have a ‘brain,’” said Washington State University professor Mesut Cicek, author of the study. “They just memorize, and they can give you some insight, but they don’t understand what they’re talking about.”
Ultimately, this means that if a member asks AI for financial advice, there’s a decent chance the answer is not only wrong, but completely different from the one it would have given two minutes earlier. The AI is also simply repeating whatever information it pulls from the internet—which, as mentioned earlier, can include outdated or incorrect information and outright misinformation—rather than actually “thinking” about the question and giving a nuanced response, something it is incapable of doing.
So, despite how comfortable users might be telling AI their financial concerns and woes, its advice should not be trusted. Or at least, it should be triple-checked and verified before members act on it, which, in the end, defeats the purpose of asking it to begin with.
It only knows what you tell it
Another huge failure on the part of AI in offering financial advice is how little it knows about the user on the other end. The AI only knows as much as the user tells it. So, when a member turns to generally accessible AI tools for financial advice, they’ll be getting very generic, surface-level information that does not take their specific finances and concerns into account.
The user has to provide a large amount of information just to get an answer that reflects their current financial situation, and even then, the information the AI is working from is very limited in scope. Users may also omit pertinent information or provide incorrect details, intentionally or not, skewing the answers they receive.
An AI working off what little info it has, making assumptions based on general advice, and an end user applying the result to their real-world finances is a recipe for disaster. That same Credit Karma survey found that about 85% of respondents who have used GenAI in this manner acted on the recommendations provided. Of those, 52% say they have made a poor financial decision or mistake based on the information they received.
A real financial advisor or credit union employee, on the other hand, knows the member, has access to their financial information, and knows the community, local resources, and more that can support the member. They can also take the nuances of the member’s personal situation into account. So how do we get the member away from AI and pointed toward a credit union?
How to counter
Your credit union needs to be a trusted source of financial education, advice, and support. But that means, now more than ever, you need to be focused on accessibility, ease of use, and avenues that avoid perceived judgment. TikTok and other social media platforms are great outlets for that, as they allow users to engage with content anonymously, where they are, when they need it. (If you need help getting started, you can learn more about TikTok and financial literacy here.)
Of course, that’s not to say you shouldn’t offer in-person services such as financial counseling appointments, but starting digital is a great way for members to build up trust with the credit union, and once they become comfortable with you, they may eventually utilize those in-branch resources.
However, you should also be using those digital channels to promote your in-person services if you want members to take advantage of them. How will they know financial counseling is provided at your credit union if your social media doesn’t advertise it? And if you’re about to say, “Well, they can go to the website or call,” sorry, you’ve already lost many of them.
The bottom line? To get members where you want them, you first need to be present, active, and helpful where they are, which, yes, for the time being, means TikTok, Instagram Reels, and the like. Members are turning to AI because it’s right in front of them, not because it’s helpful in the long run, and that immediacy is exactly what we’re competing with.
In order to win, we need to make “meeting members where they are” less of a mission statement and more of a plan of action.