In this vlog, we discuss the relevant and timely topic of AI in home care. Larson is one of our agency owners, and he’s been using artificial intelligence (AI) resources like ChatGPT to see how it could help his business.
He was gracious enough to speak with me about his discoveries – both good and bad – so other home care operators could learn from his research.
Watch the full video for all the important details, but here are some key takeaways from our conversation.
AI Isn’t HIPAA Compliant
Larson’s first word of caution for any agency attempting to use AI in home care is that it isn’t intrinsically safe to use without time and effort.
When entering information into a tool like ChatGPT, users have to make sure they’re not taking records straight from their database and uploading them into the AI tool. Doing so would put them at risk of violating HIPAA if the records include protected health information (PHI), such as a full name or phone number.
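As a rough illustration of the kind of scrubbing described above, here is a minimal Python sketch that drops obvious identifier fields and masks phone-like numbers before text is pasted into an AI tool. The field names and pattern are hypothetical assumptions, not a complete HIPAA de-identification process or Rosemark’s actual data format.

```python
import re

# Matches common US phone number formats like 555-123-4567 (illustrative only)
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def deidentify(record: dict, pii_fields=("name", "address", "phone", "email")) -> dict:
    """Drop known identifier fields and mask phone-like numbers in free text."""
    cleaned = {}
    for key, value in record.items():
        if key in pii_fields:
            continue  # omit direct identifiers entirely
        if isinstance(value, str):
            value = PHONE_RE.sub("[REDACTED]", value)
        cleaned[key] = value
    return cleaned

# Hypothetical shift note pulled from an agency database
shift_note = {
    "name": "Jane Doe",
    "phone": "555-123-4567",
    "notes": "Client prefers morning visits; call 555-123-4567 to confirm.",
    "hours": 4,
}
print(deidentify(shift_note))
```

Even with a scrub step like this, a human should review the output before uploading, since free-text notes can contain identifiers no simple pattern will catch.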
Use Cases
After many attempts, Larson found no positive use cases for AI in solving scheduling issues.
However, one positive use case he did find is creating staff utilization reports that show how well the agency is staffing its open shifts and whether caregivers are overextended.
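To make the utilization idea concrete, here is a minimal Python sketch of that kind of summary: given a list of shifts, it reports the fill rate and flags caregivers whose filled hours exceed a threshold. The field names, statuses, and 40-hour cutoff are illustrative assumptions, not the report Larson actually built or Rosemark’s export format.

```python
from collections import Counter

def utilization_report(shifts: list, max_weekly_hours: float = 40.0) -> dict:
    """Summarize shift fill rate and flag caregivers over an hours threshold."""
    counts = Counter(s["status"] for s in shifts)
    filled = counts.get("filled", 0)
    total = len(shifts)
    hours = Counter()
    for s in shifts:
        if s["status"] == "filled":
            hours[s["caregiver"]] += s["hours"]
    return {
        "fill_rate": filled / total if total else 0.0,
        "overextended": [c for c, h in hours.items() if h > max_weekly_hours],
    }

# Hypothetical week of shifts for a small agency
shifts = [
    {"status": "filled", "caregiver": "A", "hours": 30},
    {"status": "filled", "caregiver": "A", "hours": 15},
    {"status": "filled", "caregiver": "B", "hours": 20},
    {"status": "open", "caregiver": None, "hours": 8},
]
print(utilization_report(shifts))
# fill_rate is 0.75; caregiver "A" is over 40 hours and gets flagged
```

The point of a sketch like this is the aggregation step: the tedious part AI (or a short script) can take over, while a person still interprets the numbers.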
Hallucinations from AI
AI can produce false and inaccurate information. Human management and oversight are still needed when using AI for home care because it’s not fully automated and not fully intelligent. Also keep in mind that something could break over time as AI models change and evolve: just because the output was right today doesn’t mean it will be tomorrow.
AI in Home Care – There is No “Easy Button”
How do we build an AI model that is actually useful? In Larson’s example of staff utilization reports, he estimated it took four hours to build the model he now uses to run the report.
His caution to other agency owners and operators is that using AI doesn’t mean there’s an easy button to hit. He still has to pull data out of Rosemark and de-identify the information he’s putting into ChatGPT. It still requires a lot of manual work, but it is saving him time in terms of manual data aggregation.
He also wants to remind other agencies that any reports or data generated from AI must still be reviewed for accuracy.
Be Realistic About AI
As an agency owner, Larson said he went into the process expecting AI to be a panacea for all his problems. He discovered that it can be a time saver, but it cannot solve every problem.
There are specific questions agencies should ask when delving into AI:
- What do you want from AI?
- Realistically, what is it going to do for you and your agency?
- Can it save time?
- Can it aggregate or summarize things more efficiently than a human can?
Larson found that AI is a great tool for creating summaries and aggregations. However, he doesn’t see a situation where he can reduce staff overhead or payroll with the help of AI.
For more information about Rosemark, reach out to a member of our Customer Success Team.