Pediatric Neurosurgery in 50 Years

Samuel R. Browd

What does pediatric neurosurgery look like in 50 years? What advances have been made? Have we cured cancer, eliminated hydrocephalus, eradicated intraoperative infections? As I work to build my academic career around innovation and device development in neurosurgery, it is fun to speculate where our field will move during the next half century as our surgical successors treat our grand- and great-grandchildren.

We are fortunate to be practicing neurosurgery during one of the most exciting times in biomedical technology—ever. New ideas and concepts unthinkable ten years ago are now becoming reality, and innovators are looking for ways to move technological advancements into medical practice. The triple aim of improving the patient experience of care (including quality and satisfaction), improving the health of populations, and reducing the per capita cost of health care will be met through innovation. Precision (or personalized) medicine, big data, standardized clinical and surgical work, and cost containment will reshape how we practice and provide huge incentives for innovation.

Four key verticals will drive much of the biomedical innovation in the next 50 years: the Internet, computing power, genomics, and additive manufacturing. The evolution of the “Internet of things” will reshape how we practice medicine as the patient-physician relationship becomes closer with monitored implants, virtual and home-based diagnostics, and telemedicine (the house call is coming back). As an example, shunts will become electromechanical devices with auto-diagnostic capabilities, sensing and adjusting to intracranial pressure (ICP) or flow and reporting failures before symptoms develop.
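
Purely to illustrate the kind of closed-loop behavior such a shunt might implement, here is a minimal sketch in Python; the sensor interface, valve model, and every threshold below are hypothetical, not a description of any existing device.

```python
# Hypothetical "smart shunt" control tick: sense ICP, nudge the valve toward a
# target pressure, and raise an alarm when pressure stays high even with the
# valve wide open. All names and thresholds are invented for illustration.
from dataclasses import dataclass

TARGET_ICP_MMHG = 12.0    # assumed therapeutic target
ALARM_ICP_MMHG = 20.0     # assumed failure/alarm threshold
VALVE_STEP = 0.05         # fraction of full opening changed per tick

@dataclass
class ShuntState:
    valve_opening: float = 0.5   # 0 = closed, 1 = fully open
    alarm: bool = False

def update(state: ShuntState, icp_mmhg: float) -> ShuntState:
    """One control tick: adjust the valve toward target, detect likely failure."""
    if icp_mmhg > TARGET_ICP_MMHG:
        state.valve_opening = min(1.0, state.valve_opening + VALVE_STEP)
    elif icp_mmhg < TARGET_ICP_MMHG:
        state.valve_opening = max(0.0, state.valve_opening - VALVE_STEP)

    # Pressure above the alarm threshold with the valve fully open suggests an
    # obstruction; report it before symptoms develop.
    state.alarm = icp_mmhg > ALARM_ICP_MMHG and state.valve_opening >= 1.0
    return state
```

The interesting engineering would lie in the sensing and telemetry, but the loop above captures the sense-adjust-report idea described here.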

Moore’s Law has seen computing power increase exponentially since the 1970s, with performance roughly doubling every 18 months. Computer processing on this scale allows for advanced diagnostics, applications of big data, and next-generation image processing and display (think real-time virtual reality or remote, robotically driven OR suites).
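
To make that rate concrete, a doubling every 18 months compounds to roughly 2^33, on the order of a ten-billion-fold increase, over 50 years; whether the historical trend actually continues that long is of course speculation. A back-of-the-envelope check:

```python
# Growth implied by one doubling every 18 months, compounded over 50 years.
years = 50
doublings = years / 1.5              # about 33 doublings
growth_factor = 2 ** doublings       # roughly 1e10
print(f"~{doublings:.0f} doublings -> ~{growth_factor:.1e}x more computing power")
```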

Recent advances in genomic science have been equally amazing and set up a future where precision, or individualized, medicine is the standard of care. In 2007 it cost upwards of $1 million to sequence an individual’s genome. Now it costs a few thousand dollars, and the race to the $100 genome is well underway. Companies already offer limited consumer genetic testing for $99; 23andMe is an example of the exploding field of personalized, predictive genomic diagnostics.

Finally, additive manufacturing, including 3D printing, has rapidly evolved in the last few years and is poised to drive a huge paradigm shift in the multi-billion-dollar surgical implant industry, a shift that will redefine current models of hospital purchasing and demand flow.

Fast forward 50 years—Dr. Marty McFly is in his first year as a pediatric neurosurgery attending. His training comprised both real and virtual patients, with at least 50 percent of his operative experience gained on simulated patients using virtual reality and realistic haptic feedback. He has performed hundreds of pediatric operations—Chiari malformation decompressions, in utero myelomeningocele closures, posterior fossa tumor resections—and masterfully handled a plethora of complications in virtual reality, drawn from real-world experiences archived and cataloged in a national neurosurgical teaching data repository.

Today Dr. McFly is operating on a four-year-old with a medulloblastoma in the posterior fossa. Preoperatively, many details are already known. Imaging has been performed with advanced MRI. He already knows many specifics of the tumor—the anatomy is defined; he knows exactly where any infiltrating cells reside outside the main tumor border, because he administered an advanced contrast agent that binds specific tumor cell-surface markers; and advanced chemical spectroscopy has helped refine the likely tumor subtype.

Functional imaging provides a clear assessment of cognition, including specific processing deficits compared to matched controls, and predicts postoperative function based on the planned resection borders. Dr. McFly has uploaded the imaging, practiced the surgery virtually, and set the limits of his surgical resection. During surgery the intraoperative navigation will warn him if he strays outside his pre-defined plan or operative corridor, or is in danger of harming vital structures.
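
A toy sketch of the per-frame geometric check such navigation might run is below; the cylindrical corridor model, the 2 mm safety margin, and the point-sampled representation of vital structures are all assumptions made for illustration, not how any particular navigation system works.

```python
# Hypothetical per-frame navigation check: warn if the tracked tool tip leaves
# the planned corridor or comes within a safety margin of a vital structure.
import numpy as np

SAFETY_MARGIN_MM = 2.0  # assumed clearance around vital structures

def check_tool_tip(tip, corridor_start, corridor_end, corridor_radius_mm,
                   critical_points):
    """tip: (3,) tool-tip position in mm; corridor_*: planned corridor modeled
    as a cylinder around a line segment; critical_points: (N, 3) sampled
    surface points of structures to avoid. Returns a list of warnings."""
    tip = np.asarray(tip, dtype=float)
    a = np.asarray(corridor_start, dtype=float)
    b = np.asarray(corridor_end, dtype=float)
    warnings = []

    # Distance from the tip to the corridor's center line (clamped to segment).
    ab = b - a
    t = np.clip(np.dot(tip - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    if np.linalg.norm(tip - (a + t * ab)) > corridor_radius_mm:
        warnings.append("outside planned operative corridor")

    # Distance from the tip to the nearest critical-structure point.
    pts = np.asarray(critical_points, dtype=float)
    if pts.size and np.linalg.norm(pts - tip, axis=1).min() < SAFETY_MARGIN_MM:
        warnings.append("within safety margin of a vital structure")

    return warnings
```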

The day of surgery arrives. The patient is brought into the room, the “time-out” is performed, and the patient’s identity and surgery are confirmed using RFID and facial recognition technologies. The patient is anesthetized; the exact level of sedation is known, and cortical, cranial nerve, and spinal cord function are monitored with wireless electrodes. The monitoring system is simple and is placed by the anesthesiologist, who also watches the automated diagnostic feedback. Dr. McFly dons his virtual reality goggles and goes to scrub. Scrubbing no longer requires soap and water; UV irradiation sterilizes his skin, and the patient is “prepped” in a similar fashion.

The patient is positioned with a pin-less Mayfield system, and the navigation automatically registers to the patient without additional input. Dr. McFly uses his VR goggles to view the operative field in real time through an assortment of cameras directed at the field. He can magnify his view at will with voice, gesture, or ocular tracking. Likewise, he can view images, his operative plan, or surgical atlases overlaid and warped to the patient’s anatomy on demand. His senior partner, on vacation in Hawaii, plans to join him virtually to offer guidance during the case.

Surgery starts with a bloodless opening using a harmonic scalpel; the drill has been supplanted by a device that effortlessly opens the cranium without compromising the dura. Similarly, the dura is opened with a simple device that cuts and oversews the edge simultaneously. The operative microscope of 2015 has been replaced by a simple articulated camera pointed at the operative field. The camera’s robotic arm moves to follow the surgeon’s line of sight and uses operative navigation to enter the surgical corridor, offering superior magnification and illumination. Dr. McFly operates comfortably with two hands, never needing to readjust the camera.

The instrumentation he uses is equally multifaceted. Suction calibers change automatically without the scrub tech intervening, and clogs are automatically detected and cleared. Tumor resection occurs with a device that aspirates and coagulates simultaneously; it is tracked via navigation and samples the aspirate to determine whether tumor is present. The patient was injected with “tumor paint” at incision, and the edges of the lesion fluoresce during the resection to guide margins. In real time, the navigation autocorrects for brain shift, providing sub-millimeter accuracy throughout the case.

A small intraoperative MRI is brought over the field prior to closure to confirm complete resection. The closure occurs quickly once all bleeding is controlled. Any fine vessels are coagulated with a focused laser beam, and the VR goggles pinpoint the specific location of bleeding based on thermography or spectral discrimination. The dura is closed with an autosuture device, and any defects are filled with 3D-printed or cut synthetic biologics. The template is created during the procedure from intraoperative topographic measurements of the defect, taken with a laser scanner and sent to the sterile printer. The bone flap is replaced and spot-welded in strategic locations with instant-curing bone cement; the skin is reapproximated and closed with an autosuture device.

The tumor is precisely categorized, and its specific genetic defects are compared to those of all age-matched patients with the same tumor type worldwide. Big data is used to recommend the best treatment options. Prior to initiating treatment, live tumor is tested in vitro against hundreds of chemotherapy options using microfluidic devices, and the most robust combination of medications is given to the patient for precise, individualized treatment after resection. The patient makes a full recovery, and yearly follow-up is done via telemedicine, utilizing local imaging facilities, remote diagnostics, and examination surrogates (semi-autonomous robots or smart instruments—ophthalmoscopes, stethoscopes, etc.).
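
As a toy illustration of that last selection step, one could imagine ranking candidate regimens by their measured in vitro response; the regimen names and kill fractions below are entirely invented.

```python
# Hypothetical ranking of chemotherapy combinations by in vitro kill fraction
# (0-1) measured in a microfluidic assay; all data here are made up.
assay_results = {
    ("drug_A",): 0.42,
    ("drug_A", "drug_B"): 0.71,
    ("drug_B", "drug_C"): 0.64,
    ("drug_A", "drug_B", "drug_C"): 0.69,
}

def best_regimen(results):
    """Return the combination with the highest measured kill fraction."""
    return max(results, key=results.get)

print(best_regimen(assay_results))   # -> ('drug_A', 'drug_B')
```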

Technology in 2065 has improved the likelihood of a safe and successful surgery. The patient and their family are happy, and we’ve improved health and reduced overall cost of care. After a productive day in the OR, it’s time to head to the 6D cinema, relax, and watch Back to the Future VII. (Yes, there still will be movie theaters. Maybe. And sequels? Definitely.)

Disclosures: Dr. Browd is co-founder and chief medical officer of Aqueduct Neurosciences Inc., Aqueduct Critical Care Inc., Navisonics Inc., and Vicis Inc. Dr. Browd is vice president of business development at ThermaNeurosciences Inc.