October 01, 2025
2 min read
Key takeaways:
- AI has significant potential for relieving administrative burden, Mikhail O. Varshavski, DO, said.
- However, it cannot replace human interaction.
NASHVILLE — It is a “scary time” to be a medical student amid new AI technology and distrust in health care, but such developments will never replace the impact of human interaction, Mikhail O. Varshavski, DO, said at OMED.
“You don’t know if the health care system is going to be there to support you. You don’t know if AI is going to take over the work you do day-to-day. You don’t know if patients are actually going to trust you when you’re giving them advice because they read something through ChatGPT, Google [or] WebMD,” Varshavski, a family medicine physician and social media personality known as “Dr. Mike,” said. “But I promise you those worries are unfounded.”
Varshavski said that clinical jobs “are going nowhere as long as you remember the one most important takeaway from the art of practicing medicine — and that is to never forget the human that is at the heart of why you do what you do. While that human may feel like it’s you, it’s actually the person sitting across from you.”
“It’s so easy to start thinking about the science behind it, the pathophysiology behind it, the marketing, the hospitals, the government — but it’s the human at the end of that interaction that’s key,” he continued. “If you don’t lose focus on that, I don’t care how smart AI gets, you’re going nowhere.”
During his presentation, Varshavski discussed the benefits of evolving digital health care tools, particularly their potential to ease the persistent problem of administrative burden.
“There’s no doubt that the administrative burden is making the practice of medicine suck,” he said. “With tools like clinical AI, we’re able to take shortcuts in areas where it gives us back the art of practicing medicine. It gives us time to stop touching the keyboard and make eye contact with our patients.”
But he added that “at the same time, we need to have our ears open and our eyes open to make sure we’re not creating misinformation.”
“With AI, there are legitimate problems. Hallucinations. Making up sources,” Varshavski said. “I’ve used AI in the past and have gotten an answer with a source that looks very legitimate, only to realize an hour later that source was fully AI. That’s why tools that use AI are just that — tools.”
He said that every tool in health care, “whether it’s a medication, surgery [or] therapy — all have benefits and side effects.”
“When we take into account what those benefits and side effects are, we can better prescribe them and use them,” he added. “Anything that can be used to relieve that administrative burden and pressure and allow me to do what I love doing, which is spending time with my patients … I’m absolutely excited about this.”