Ashis Barad was hired a year ago as the Hospital for Special Surgery’s first chief digital and technology officer. The New York-based academic medical center, known for its orthopedics program, made the hire amid a broader push by its new CEO to prioritize data and technology.
Barad, a pediatric gastroenterologist by training, previously led digital efforts at Allegheny Health Network in Pittsburgh and Baylor Scott & White Health in Dallas.
MedTech Dive spoke with Barad about his priorities for digital efforts at HSS and his thoughts on agentic AI, a technology that can perform tasks without supervision. Barad also discussed how that work intersects with orthopedic devices.
This interview has been edited for length and clarity.

MEDTECH DIVE: Now that you’re a year into your role, what are your top priorities at HSS?
ASHIS BARAD: One of the issues with AI that I hope we avoid is thinking only about efficiency. With new technology, people often say, “Oh my gosh, we can continue doing everything we do, but just be so much more efficient.”
But are we really not going to assume there will be new capabilities we can unlock with AI inside healthcare? We should start thinking about expanding care models outside of what we’ve historically done.
My biggest theme is [that] I believe agentic AI is going to be the great orchestrator of healthcare.
HSS wants to move into the movement arena more broadly, not just be a place where you go to get orthopedic surgery, because we already do a lot outside of surgery. So, we look at AI as the orchestration of a massive ecosystem, all the way from prevention to regenerative to performance to wellness to longevity to surgical to non-surgical care. It’s about taking what is special in the Upper East Side of New York and scaling that to the rest of the U.S. and across the globe.
When you talk about AI agents, is this something you’re thinking of as long term, big picture? Or is this something you’re looking to roll out soon?
There are two levels of agents. The agents of today are, “let me do a simple task for you.” So, these could be inside a call center to triage phone calls. This could be an agent to help you schedule an appointment, this could be an agent that checks on you after surgery, [or] this could be an agent to look up HR policies as an employee.
When I talk about agents long term, I think of them as minions from “Despicable Me.” You would never send a minion out to go do a big thing. If you put 100 minions together, they can do amazing things, but somebody’s going to have to guide them and orchestrate them. How do you organize these minions to do these really complex tasks? That’s going to have to be architected through data — that’s going to take years.
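To make that orchestration idea concrete, here is a minimal, hypothetical sketch of a coordinator dispatching the kinds of simple task agents Barad describes. The agent names and logic are invented for illustration and do not represent any HSS system.

```python
# Hypothetical sketch: an "orchestrator" routing a high-level goal to simple task agents.
# All agent names and behaviors are illustrative only.

def scheduling_agent(task: dict) -> str:
    return f"Booked {task['visit_type']} appointment for {task['patient']}."

def follow_up_agent(task: dict) -> str:
    return f"Scheduled post-surgery check-in for {task['patient']}."

def policy_lookup_agent(task: dict) -> str:
    return f"Found HR policy section on '{task['topic']}'."

# The orchestrator decomposes one complex goal into small tasks,
# sends each to the right agent, and collects the results.
AGENTS = {
    "schedule": scheduling_agent,
    "follow_up": follow_up_agent,
    "policy": policy_lookup_agent,
}

def orchestrate(tasks: list[dict]) -> list[str]:
    return [AGENTS[task["type"]](task) for task in tasks]

if __name__ == "__main__":
    plan = [
        {"type": "schedule", "patient": "J. Doe", "visit_type": "joint replacement consult"},
        {"type": "follow_up", "patient": "J. Doe"},
    ]
    for result in orchestrate(plan):
        print(result)
```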
We’ve talked about some administrative uses. How does your work with AI connect to medical devices?
Quite a few things come to mind.
We have an implant lab here at HSS where we partner with a company, and we are 3-D printing [implants]. There’s tons of AI involved in 3-D printing an actual knee or hip. If somebody has a massive defect and needs a custom, personalized implant, that can happen at HSS.
Gait analysis is another really burgeoning [area] inside the orthopedic space. Up to this point, the technology has mostly relied on markers. You have to actually put markers all over your body for the cameras to be able to see movement. What’s happening inside the world of computer vision is markerless, so patients can now walk through a hall and we can see their gait, how they’re moving. Our surgeons, right now, within the research realm, are actively doing this inside our joint replacement service. That isn’t just to get cool data; it’s actually informing the right surgery for that patient.
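For a rough sense of how markerless gait analysis works, the sketch below uses an off-the-shelf pose-estimation library (MediaPipe, assumed here purely as an example; the interview does not specify HSS’s tooling) to estimate a knee angle from ordinary video. The file name and angle calculation are illustrative only.

```python
# Illustrative sketch of markerless gait analysis: estimate the right-knee angle
# from plain video using MediaPipe pose estimation. Not an HSS system.
import cv2
import mediapipe as mp
import numpy as np

mp_pose = mp.solutions.pose

def joint_angle(a, b, c):
    """Angle at landmark b (degrees) formed by landmarks a-b-c."""
    v1 = np.array([a.x - b.x, a.y - b.y])
    v2 = np.array([c.x - b.x, c.y - b.y])
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

cap = cv2.VideoCapture("gait_clip.mp4")  # hypothetical hallway walking clip
with mp_pose.Pose(static_image_mode=False) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV reads frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            angle = joint_angle(
                lm[mp_pose.PoseLandmark.RIGHT_HIP],
                lm[mp_pose.PoseLandmark.RIGHT_KNEE],
                lm[mp_pose.PoseLandmark.RIGHT_ANKLE],
            )
            print(f"right knee angle: {angle:.1f} degrees")
cap.release()
```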
We have an emerging technologies committee at HSS. We look at every vendor. We are constantly looking at the newest AI, robotics and navigation, but I would separate those last two. A lot of companies have paired them together. There are also now open platforms that work with any robot and any joint.
There’s not a lot happening inside the medical device world with generative AI at this point. What’s fascinating is the new interface to interact with technology is turning [from] ambient into voice. I think there's a world with AI navigation and robotics — I think of Iron Man with Jarvis — that the surgeon is speaking to the software and saying, “Can you move it a little bit here? Can you shift?”
As you’re evaluating different AI technologies, how do you decide what is a helpful tool versus what to pass on?
[After] 16 years of full-time practice, I got frustrated with so many technologies coming in that didn't actually solve real problems for the front line. It has to solve real problems that are affecting our front lines, our patients. Can it do better and can it do good?
Making sure we identify the problem and that we’re not a hammer looking for a nail. There’s some really cool stuff out there, but I think we've done that a lot with digital health and telemedicine and other technologies before that. I feel like it’s my duty to make sure we don’t get caught up in the shiny object syndrome and [that] we are solving real problems that are affecting people.
I look mostly at platform companies over point solutions. Point solutions are fine, but I think, right now, we have to be horizontal more than we have to be vertical. I'm looking at the world of platforms way before somebody comes to me and says, “I'm just going to solve this one little narrow part of your journey.” It has to have business value or patient value, meaning outcomes, safety, quality, or financial ROI.
We are setting up robust data governance, AI governance, cybersecurity governance, and reviews of the ethics, regulatory and equity aspects of these tools. It is a new world. Making sure that we look at these tools responsibly within our new governance structures is also critical.