CFPB Warns Poorly Deployed AI Chatbots Can Hinder Customer Service
June 8, 2023
Working with customers to resolve a problem or answer a question is an essential function for financial institutions. Customers turn to their financial institutions for assistance with financial products and services and rightfully expect to receive timely, straightforward answers, regardless of the processes or technologies used.
Many financial institutions have deployed chatbots intended to simulate human-like responses through computer programming and to help institutions reduce the costs of human customer service agents. These chatbots sometimes have human names and use popup features to encourage engagement. Some chatbots use more complex technologies marketed as “artificial intelligence” to generate responses to customers.
The Consumer Financial Protection Bureau released a report highlighting some of the challenges associated with the deployment of chatbots in consumer financial services. As sectors across the economy continue to integrate “artificial intelligence” solutions into customer service operations, there will likely be a number of strong financial incentives to substitute away from support offered in-person, over the phone, and through live chat.
Deficient chatbots that prevent access to live, human support can lead to law violations, diminished service, and other harms, according to the CFPB. The shift away from relationship banking and toward algorithmic banking will have several long-term implications that the CFPB will continue to monitor closely.
Approximately 37% of the United States population is estimated to have interacted with a bank’s chatbot in 2022, a figure that is projected to grow. Among the top 10 commercial banks in the country, all use chatbots of varying complexity to engage with customers. Financial institutions advertise that their chatbots offer a variety of features to consumers like retrieving account balances, looking up recent transactions, and paying bills. Much of the industry uses simple rule-based chatbots with either decision tree logic or databases of keywords or emojis that trigger preset, limited responses or route customers to frequently asked questions (FAQs). Other institutions have built their own chatbots by training algorithms with real customer conversations and chat logs, like Capital One’s Eno and Bank of America’s Erica. More recently, the banking industry has begun adopting advanced technologies, such as generative chatbots, to support customer service needs.
Financial products and services can be complex, and the information being sought by people shopping for or using those products and services may not be easily retrievable or effectively reduced to an FAQ response. Financial institutions should avoid using chatbots as their primary customer service delivery channel when it is reasonably clear that the chatbot is unable to meet customer needs.
The report found the use of chatbots raised several risks, including:
- Noncompliance with federal consumer financial protection laws. Financial institutions run the risk that when chatbots ingest customer communications and generate responses, the information they provide may be inaccurate, the technology may fail to recognize that a consumer is invoking their federal rights, or it may fail to protect consumers’ privacy and data.
- Diminished customer service and trust. When consumers require assistance from their financial institution, the circumstances could be dire and urgent. Instead of finding help, consumers can face repetitive loops of unhelpful jargon. Consumers also can struggle to get the response they need, including an inability to access a human customer service representative. Overall, their chatbot interactions can diminish their confidence and trust in their financial institutions.
- Harm to consumers. When chatbots provide inaccurate information about a consumer financial product or service, there is potential for considerable harm. Inaccurate information could lead consumers to select a product or service that is wrong for their needs, and consumers who receive inaccurate information about making payments could be assessed fees or other penalties.
Contact ALTA at 202-296-3671 or email@example.com.