This story appears in the March 2026 issue of Utah Business.
Utah, like most other states, faces a shortage of mental health professionals to support student well-being in schools, with just a quarter of the recommended number of school psychologists and half the number of school counselors.
ElizaChat, a Utah-based AI company that has created a mental health support chatbot, believes it can help fill this gap for students in Utah and around the country. Founded in 2023, ElizaChat now works with school districts in Illinois, Oklahoma, Oregon, Montana and Utah, giving students access to a 24/7 chatbot that offers support in processing emotions or managing the stress of a busy schedule.
Creating a product that keeps student users safe while navigating an ever-changing regulatory landscape was no easy task. In Utah, however, this process was made smoother by collaborating with the state’s new Office of Artificial Intelligence Policy (OAIP). Launched in July 2024, OAIP was created to help facilitate collaboration between lawmakers and AI businesses in the state.
ElizaChat co-founder and CEO Dave Barney says he was enthusiastic when OAIP took off. “Government positively engaging with industry like this, in wanting to learn, is a good thing,” he says.
Designed differently
From the start, Barney and regulators agreed on one thing: A chatbot for mental health support needed to be designed and trained differently than many popular AI “companions.”
One major concern for all AI users, but especially minors, is the development of unhealthy emotional attachments to chatbots, leading to social isolation and, in some cases, suicide. These unhealthy relationships often develop with chatbots designed to keep users engaged for as long as possible by simulating emotional connections.
“There is early evidence that the majority of people who do end up being in an unhealthy relationship with AI … they don’t intend to at first,” says Zach Boyd, director at OAIP and BYU applied mathematics professor. “Actually, it seems to be a consequence of the product design. You can design your product in such a way that it continues to strengthen these emotional ties in a way that’s not really healthy.”

According to Barney, ElizaChat is designed differently. “Philosophically, we don’t believe that we should have an emotional, personal connection with our user,” Barney says. Instead of speaking to a student as if it were a friend, the chatbot engages with students more like a school counselor would.
Where other chatbots have been found to engage in inappropriate conversations with minors or encourage risky behaviors, ElizaChat’s goal is to redirect focus to well-being while limiting how much time users spend with the chatbot. Barney says, “If your school counselor keeps the conversation on topic — ‘How are you doing? How is school going? What’s going on in your life?’ — no teenager is going to want to just chit-chat.”
Balancing innovation and safety
Five months after OAIP was launched, ElizaChat became the first company to enter a regulatory relief agreement with the office.
This type of agreement gives AI companies flexibility. It exempts them from certain regulations while working with OAIP to ensure their products remain safe and effective. “We wanted a flexible vehicle to let companies that had a reasonable idea charge forward and be leaders, and at the same time, [allow] for the government to learn,” Boyd says.
ElizaChat’s regulatory relief agreement required the company to work with OAIP to determine a number of measures to keep its users safe, from data privacy policies to a crisis response plan. The agreement also allows ElizaChat a 30-day window to rectify any incidents before state regulators impose penalties.
One of the primary stipulations is that ElizaChat cannot provide mental health therapy because it is not a licensed human therapist. This limits the way ElizaChat can support students as they process emotions or manage stress: it cannot diagnose or treat particular mental health disorders.
Another set of parameters concerns privacy and data sharing. When a student begins their first conversation with ElizaChat, the chatbot clarifies that it is not a human being and will share information with school emergency contacts if the student indicates they are in danger. ElizaChat shares anonymized general data with school officials to help them understand the issues their students are facing.
“Students have to know and trust that what they’re saying is private if it’s going to be useful … but we also want to provide schools with information and value that helps them to better address needs,” Barney says.
Because ElizaChat is a mental health tool, emergencies are bound to happen. When a student user indicates they are in danger of harming themselves or others, or that they are being harmed by someone else, the regulatory relief agreement requires ElizaChat to automatically ping their school’s emergency contacts via text and email.

While the onus is on the school to enact its existing emergency protocol, ElizaChat will continue to engage with the student, according to Barney. He says, “We don’t just drop it there.” Instead, ElizaChat will ask the student questions such as “Hey, is there someone you can talk to? Do you need to call 911? Are you by a friend or a family member?”
A few months after ElizaChat’s agreement with OAIP was inked, Utah lawmakers passed legislation codifying many of the same provisions from that deal, informed by insights from ElizaChat and other businesses working with OAIP. “Our input helped shape the law,” Barney says, meaning lawmakers “could protect the consumer without creating an unnecessarily burdensome regulation on business.”
Bree Jones, who has previously consulted for OAIP and recently became ElizaChat’s chief experience officer, sees the regulatory relief agreement as a positive example of how government and industry can work together to their mutual benefit. Jones says, “They don’t have to be opposites. They don’t have to be antagonistic with one another. They can be harmonized quite well if we’re willing to put in the effort.”
