Digitizing Surgery (LinkedIn article by Peter Verrillo, CEO @ enhatch)
Imagine, as a total joint replacement orthopedic surgeon today, being handed a mallet and an osteotome and told that is all you have to implant a state-of-the-art total knee system. It’s hard to conceive of in today’s environment, but this is exactly how the modern era of total condylar knee replacement began in the late 1960s and early ’70s.
The development of improved implants, instruments, and techniques for total knee arthroplasty (TKA) since then is well documented. Several well-known surgeons, including Insall, Kelly, Hungerford, Ranawat, et al., made major contributions to improving all aspects of TKA procedures with the goal of more accurate bone cuts, better kinematics, and improved outcomes.
Then, in 1997, Dr. Kenneth Krackow performed the first total knee replacement using a digital navigation system he developed in his lab at the University at Buffalo. As an orthopedic surgeon with an advanced degree in mathematics, he recognized the need for a new system that enabled surgeons to make precise decisions on the alignment and orientation of instruments, the location and depth of bone cuts, and the placement of knee implant components.
The advent of digitally navigated total joint replacement is arguably the single most significant event that ushered in the digital era in orthopedic surgery. Today’s navigation landscape has expanded beyond TKA into multiple types of procedures, all with the goal of more accurate implant alignment to improve clinical outcomes.
Since 1997, this landscape has continued to evolve, with new navigation technologies addressing issues such as ergonomics, ease of use, and surgical accuracy. The evolution started with classic mechanical approaches, then moved to monitor-based portable systems, then to augmented reality, and then to in-vivo intraoperative sensor technologies. In this infographic, we capture the advancements that already exist and paint a picture of what the future will look like: a fully autonomous, seamless OR experience.
Some companies have developed systems that married technology with traditional, mechanical methods of navigation to produce patient-specific solutions.
- HipXpert utilizes a sextant to accurately align the acetabular cup during total hip arthroplasty (THA) and hip resurfacing
- Corin’s OPS uses a laser alignment system projected onto the OR wall to provide accurate alignment during THA
Beyond mechanical systems, another wave of surgical navigation consists of portable, cart-based systems that use cameras and fluoroscopy equipment in the OR to display real-time navigation information on a monitor.
- 7D Surgical has developed a system with the camera inside the OR light that registers the patient’s spinal anatomy within 12 seconds and displays the image for navigation on a monitor attached to the cart. This is a major advantage, as line of sight is a constant issue during navigated surgery.
- Surgivisio integrates fluoroscopy into the procedure to render 3D alignment and implant positioning from 2D images in as little as 5 minutes
While these systems are conveniently located on carts that can be wheeled from OR to OR with ease, an ergonomic penalty arises from the surgeon constantly having to look back and forth between the patient and the screen to follow the navigation.
To address this, several augmented reality companies now allow surgeons to visualize the surgery without taking their eyes off the patient. Through headsets and similar systems, these technologies provide real-time, intraoperative, 3D visualization overlaid on the surgeon’s view of the surgical field.
- HoloSurgical utilizes artificial intelligence that allows computer-generated anatomy to be viewed in conjunction with the surgeon’s real-time line of sight through a small, mounted monitor.
- Caira Surgical is also developing a headset that addresses this problem, streamlining the procedure and providing an immersive view for the surgeon.
In addition to their ergonomic benefits, these technologies allow for a significant decrease in soft tissue exposure, reducing the patient’s risk of infection as well as overall OR time.
In-vivo sensors are also being utilized to provide real-time navigation in many orthopedic procedures. These sensors provide important data on ligament tension and balancing that was previously inaccessible. There are a few different methods by which this is accomplished.
- Orthosensor assesses balance after the bone cuts are made, allowing the surgeon to fine-tune the knee through ligament release.
- Corin BalanceBot takes a different approach, balancing the tension prior to performing the distal femoral cut. This tensioning data feeds back into navigation software, which then positions the cutting guide to replicate tissue balance with the final implants. The cutting guide also provides useful haptic feedback, helping the surgeon execute the surgical plan accurately.
What’s exciting about Corin’s OmniBotics platform is that it is the first system to tie sensor data to navigated robotic surgery. While the Mako system includes soft tissue balancing as part of its algorithm, it is based on geometric measurements rather than real-time, intraoperative ligament balancing.