Image Guided Surgery

In conjunction with our clinical collaborators (Prof. Brian Davidson) at the Royal Free Hospital, TIG is developing the Smart Liver Surgery system. The aim is to provide augmented reality image overlays, showing the surgeon pre-operative CT data fused with live laparoscopic video.


Background

 

In the UK, approximately 1,800 liver resections are performed annually, and globally the figure is estimated at 150,000, so liver cancer represents a major health problem. Laparoscopic surgery has significant benefits for the patient in terms of reduced pain, shorter hospital stays and an earlier return to work, benefits that fit clearly with the strategic priorities of the NHS. However, even in specialist HPB centres such as the Royal Free Hospital, only 10% of cases are currently performed laparoscopically. Large liver cancers and those close to major vascular or biliary structures are generally excluded from a laparoscopic approach because of the difficulty in controlling bleeding and the uncertainty as to whether the cancer can be removed with a clear margin. Furthermore, laparoscopic surgery is more technically challenging than open surgery, owing to the narrower field of view, the lack of tactile feedback and the limitations of currently available tools.

 

We have developed the Smart Liver Surgery system [2,3,4] and are trialling it at the Royal Free Hospital. The aim is to overlay pre-operative data onto the laparoscopic video to provide guidance to the surgeon, giving surgeons greater confidence to attempt more resections laparoscopically. The system will also be applicable to pancreatic resection (2,200/year in the UK), kidney operations (3,300/year in the UK) and removal of the gallbladder (60,000/year in the UK).

 

Technical Developments

The aim of this project was to develop the technology needed to provide surgical guidance during laparoscopic liver resection. The project started in November 2012, and we developed our core system for image-guided interventions. By December 2014 we had deployed a system for human studies at the Royal Free Hospital in London. To date, we have used the system in 7 laparoscopic staging and 5 laparoscopic resection procedures, where the aim was either to record data for offline analysis or to perform image overlay in the operating room. The system is currently operated by technical staff and has been demonstrated to 7 surgeons. The current clinical system uses manual alignment, and several technologies are under development with a view to integrating them into the clinical system when they are ready.
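To make the manual alignment concrete: the overlay amounts to applying a user-set rigid transform to the pre-operative liver model and projecting it through the calibrated laparoscope camera onto each video frame. The Python/OpenCV sketch below illustrates that projection step only; the function name and the placeholder inputs (model vertices, intrinsics, the manual transform) are illustrative assumptions and are not taken from the clinical system.

```python
import numpy as np
import cv2


def overlay_model(frame, model_points, T_model_to_camera, K, dist_coeffs):
    """Project a pre-operative surface model into a laparoscopic video frame.

    frame             : BGR image from the laparoscope
    model_points      : (N, 3) vertices of the pre-operative liver model (CT space, mm)
    T_model_to_camera : 4x4 rigid transform set by manual alignment
    K, dist_coeffs    : intrinsics and distortion from laparoscope calibration
    """
    R = T_model_to_camera[:3, :3]
    t = T_model_to_camera[:3, 3]
    rvec, _ = cv2.Rodrigues(R)  # rotation matrix -> Rodrigues vector for projectPoints
    pixels, _ = cv2.projectPoints(model_points.astype(np.float64), rvec, t, K, dist_coeffs)
    pixels = pixels.reshape(-1, 2)

    # Draw only vertices that are in front of the camera and inside the image.
    cam_points = (R @ model_points.T).T + t
    h, w = frame.shape[:2]
    for (u, v), z in zip(pixels, cam_points[:, 2]):
        if z > 0 and 0 <= u < w and 0 <= v < h:
            cv2.circle(frame, (int(u), int(v)), 1, (0, 255, 0), -1)
    return frame
```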


Specifically we have:

(a) developed a software platform for our ongoing research in image-guided interventions [1].

(b) published a novel, fast, GPU-enhanced stereo surface reconstruction method [2] (a CPU sketch of the idea follows this list).

(c) validated our surface-based registration on a phantom and a porcine model, measuring accuracies of 2.9 mm and <10 mm respectively [3] (a minimal registration sketch also follows this list).

(d) demonstrated locally rigid, vessel-based registration for laparoscopic ultrasound [4].

(e) proposed a new hand-eye calibration method that can be performed on the fly during surgery, and demonstrated the improvement obtained by switching to electromagnetic tracking [5] (a standard hand-eye sketch follows this list).
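The GPU implementation published in [2] is not reproduced here, but the underlying idea of (b), recovering a semi-dense organ surface from a calibrated stereo laparoscope by matching the two views and re-projecting the disparities, can be sketched with OpenCV's CPU stereo matcher. This is a minimal sketch, assuming an already rectified image pair and a Q matrix from stereo calibration; the matcher parameters are untuned placeholders.

```python
import numpy as np
import cv2


def reconstruct_surface(left_bgr, right_bgr, Q):
    """Semi-dense surface reconstruction from a rectified stereo laparoscope pair.

    left_bgr, right_bgr : rectified left/right images
    Q                   : 4x4 disparity-to-depth matrix from cv2.stereoRectify
    Returns an (N, 3) point cloud in the left camera frame.
    """
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    # Semi-global block matching; parameters would need tuning for laparoscopic scenes.
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=128,   # must be divisible by 16
                                    blockSize=5,
                                    P1=8 * 5 * 5,
                                    P2=32 * 5 * 5,
                                    uniquenessRatio=10,
                                    speckleWindowSize=100,
                                    speckleRange=2)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed point -> pixels

    # Re-project valid disparities to 3D; unmatched pixels are simply dropped,
    # which is what makes the result semi-dense.
    points = cv2.reprojectImageTo3D(disparity, Q)
    valid = disparity > matcher.getMinDisparity()
    return points[valid]
```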
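The surface-based registration validated in (c) aligns a surface patch reconstructed from the video with the liver surface segmented from the pre-operative CT. The sketch below is a minimal point-to-point ICP in NumPy/SciPy, written to show the structure of such a registration rather than the implementation evaluated in [3]; the vessel-based registration of (d) follows a broadly similar correspondence-and-fit loop, but driven by vessel features from laparoscopic ultrasound.

```python
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def icp(surface_patch, ct_surface, iterations=50):
    """Point-to-point ICP: align a reconstructed patch to the CT liver surface."""
    tree = cKDTree(ct_surface)     # nearest-neighbour lookup on the CT surface
    R_total, t_total = np.eye(3), np.zeros(3)
    moved = surface_patch.copy()
    for _ in range(iterations):
        _, idx = tree.query(moved)             # closest CT point for each patch point
        R, t = best_rigid_transform(moved, ct_surface[idx])
        moved = (R @ moved.T).T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total        # maps the original patch into CT space
```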
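Hand-eye calibration, the subject of (e), estimates the fixed transform between the tracking marker (or electromagnetic sensor) on the laparoscope and the camera's optical frame, so that tracked poses can drive the overlay. The invariant-point formulation of [5] is not shown here; the sketch below uses OpenCV's standard AX = XB solver purely to illustrate what is being estimated, and the pose inputs are placeholders.

```python
import numpy as np
import cv2


def hand_eye_from_poses(marker_poses, target_poses):
    """Classic AX = XB hand-eye calibration for a tracked laparoscope.

    marker_poses : list of 4x4 marker-to-tracker transforms from the tracking system
    target_poses : list of 4x4 calibration-target-to-camera transforms, e.g. from
                   chessboard pose estimation in the matching video frames
    Returns the 4x4 camera-to-marker transform.
    """
    R_marker, t_marker, R_target, t_target = [], [], [], []
    for T_m, T_t in zip(marker_poses, target_poses):
        R_marker.append(T_m[:3, :3])
        t_marker.append(T_m[:3, 3].reshape(3, 1))
        R_target.append(T_t[:3, :3])
        t_target.append(T_t[:3, 3].reshape(3, 1))

    # OpenCV's Tsai-Lenz solver; the tracked marker plays the role of the robot "gripper".
    R_cam2marker, t_cam2marker = cv2.calibrateHandEye(
        R_marker, t_marker, R_target, t_target,
        method=cv2.CALIB_HAND_EYE_TSAI)

    T = np.eye(4)
    T[:3, :3] = R_cam2marker
    T[:3, 3] = t_cam2marker.ravel()
    return T
```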

 

We are currently improving our vision-based and ultrasound-based registration, refining the clinical user interface and developing new visualisation technology. We will present [5] at IPCAI 2016.

 

Team

Steve Thompson, Guofang Xiao, Maria Robu, Joao Ramalhinho, Michele Bosi, Eddie Edwards, Matt Clarkson, Dave Hawkes.

 

References

[1] Clarkson et al. The NifTK software platform for image-guided interventions: platform overview and NiftyLink messaging.

[2] Totz et al. Fast semi-dense surface reconstruction from stereoscopic video in laparoscopic surgery.

[3] Thompson et al. Accuracy validation of an image guided laparoscopy system for liver resection.

[4] Song et al. Locally rigid, vessel-based registration for laparoscopic liver surgery.

[5] Thompson et al. Hand–eye calibration for rigid laparoscopes using an invariant point.