The third SmartOT project meeting was again a great success. It took place on 21 September 2021 in hybrid form: some project partners met in person in Oldenburg, while the others joined online. Since the last meeting, the project work has made significant progress in all areas. The current interim results were presented in various contributions and illustrated with demonstrators and simulations. Everyone is already looking forward to the next big step: setting up a test system at the Pius-Hospital Oldenburg to evaluate its functionality in different scenarios.
At the Eurographics 2021 conference, the Computer Graphics and Virtual Reality Lab (CGVR) of the University of Bremen won the public voting award for the best poster. The poster “Fast and Robust Registration and Calibration of Depth-Only Sensors” presents a newly developed, fast and robust registration method that is used in the SmartOT project.
At various points in the SmartOT project, we rely on depth cameras that can record 3D geometry in the form of point clouds. For example, we use depth images of real abdominal ORs to test and improve the light control algorithms developed in the SmartOT project.
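A depth image becomes a point cloud by back-projecting each pixel through the standard pinhole camera model. The following is a minimal sketch of that conversion; the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are purely illustrative placeholders, not values from the cameras used in the project.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to an Nx3 point cloud
    using the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy example: a flat 2x2 depth image, everything at 1 m distance
depth = np.ones((2, 2))
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

Real depth sensors additionally require filtering of noise and missing measurements, which is omitted here.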
Non-contact tracking with cameras inherently suffers from occlusion of the area to be observed, regardless of the particular sensor technology and application. For this reason, several cameras often record the scene simultaneously from different angles. To combine the recordings of several depth cameras into a common point cloud, a registration (also called extrinsic calibration) is necessary.
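The core of any such registration is estimating the rigid transform (rotation and translation) that maps one camera's point cloud into the coordinate frame of another. As a generic illustration of this step (not the CGVR method itself, whose details are in the cited publications), here is the classic least-squares alignment of corresponding points via SVD, often called the Kabsch method:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst
    for corresponding Nx3 point sets (Kabsch/SVD method)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Sanity check: recover a known rotation about z plus a translation
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.3])
src = np.random.default_rng(0).normal(size=(50, 3))
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
```

In practice the hard part, which the developed method addresses, is finding reliable correspondences between depth-only views in the first place; the alignment above assumes they are already known.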
For this purpose, the CGVR group developed a new, fast and robust registration method for depth-only cameras. It has already been used for the recorded images of real operations and will also be used in the real prototype in the Living Lab of the Pius Hospital in Oldenburg. The method was submitted to Eurographics 2021 in February 2021 as a poster entitled “Fast and Robust Registration and Calibration of Depth-Only Sensors”, where it won the public voting award for best poster in May 2021. A full paper submitted to Cyberworlds 2021, entitled “Fast and Robust Registration of Multiple Depth-Sensors and Virtual Worlds”, which describes and evaluates the developed method in more detail, was recently accepted and will be published in the near future.
The developed registration procedure is not only beneficial for the SmartOT project but is also used within the VIVATOP project.
In April, the consortium had the opportunity to present the Smart-OT project for the first time to a broader medical audience at the 138th German Congress of Surgeons (DCK). As the annual congress of the German Society of Surgeons, the DCK is one of the most important specialist events for surgery in Germany; this year it took place as a digital event under the motto “Competence - Creativity - Communication” from April 12 to 16.
The first session of the event took place on the afternoon of April 6, 2021, as part of the pre-congress and addressed a wide variety of surgical issues in short presentations. There, the University Department of Visceral Surgery of the Carl von Ossietzky University of Oldenburg presented the project in a five-minute short communication on behalf of all participating institutions. In particular, the partners involved in the Smart-OT project explained why more modern OR lighting is needed and what a prototype of this new type of lighting system might look like. The feedback from the expert discussion that followed the presentation was especially eagerly awaited, as it provided the assessment of scientists and physicians from other disciplines who were not involved in the project.
The discussion showed that the planned system is recognized as innovative and of practical relevance, especially regarding the handling of shadowing in the OR area. Overall, this confirmed that the project is on the right track and addresses issues that are seen as relevant across the boundaries of different surgical disciplines.
Another online project meeting with all project partners took place on October 26, 2020. The current progress of the project was presented and discussed on the basis of various prototypes and demonstrators for the lighting and interaction system. The current results show that the project is well on track.
A consortium from science and health care industry is developing a lighting system that automatically provides optimal lighting in the operating room. It compensates for shadows cast by the movements of the surgical team and can also be controlled specifically via gestures and speech when surgeons want to illuminate a certain region particularly well.
The new research project “Smart-OT” focuses on increasing safety in the operating theatre. The primary goal of the participating consortium of science and medical technology companies is to develop an intelligent surgical lighting system that provides precise illumination without the need for manual interaction. The advantages: surgeons have their hands free and always have a good view, even when people are moving around. This reduces the workload of the OR staff and at the same time increases patient safety.
The consortium leader is Dr. Mach GmbH & Co. KG, one of the world’s leading manufacturers of medical lighting systems. Also involved are the Pius Hospital Oldenburg – a clinical partner of the University of Oldenburg – as well as the company Qioptiq Photonics and the Clinical Innovation Centre for Medical Technology Oldenburg (KIZMO). In addition, the University of Bremen is represented by the Technology Centre for Computer Science and Information Technology (TZI). The project, which will run until the end of 2021, is funded by the German Federal Ministry of Education and Research (BMBF) with 1.2 million euros.
Important step on the way to everyday life in the operating theatre
Intelligent surgical lighting is made possible by the use of novel lighting and control concepts that are interactively linked and should be transferable to other devices in the operating room. “Smart-OT” is based on the “Intra-operative Information” project, which was carried out by the University of Bremen together with regional hospitals after winning the excellence competition and has demonstrated several possibilities for improvement in the operating theatre. With “Smart-OT” an important step towards practical implementation is now being taken.
The hardware will be developed within the project mainly by the companies Dr. Mach and Qioptiq. The Computer Graphics and Virtual Reality group of the TZI at the University of Bremen (Prof. Gabriel Zachmann) provides the software for autonomous lighting control. One challenge is the optimal arrangement and coordination of a multitude of small light sources, which will be used instead of the large traditional surgery lamps.
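One building block of such a coordination is deciding which of the many small light sources currently have an unobstructed line of sight to the surgical field. The following sketch is purely illustrative and not the project's algorithm: it models occluders (e.g., heads and hands of the OR staff) as spheres and keeps only those lights whose ray to the target point misses all of them. All positions and radii are made-up example values.

```python
import numpy as np

def ray_hits_sphere(origin, target, center, radius):
    """True if the segment origin->target intersects the given sphere
    (standard quadratic ray-sphere intersection test)."""
    d = target - origin
    f = origin - center
    a = d @ d
    b = 2.0 * (f @ d)
    c = f @ f - radius ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return False
    s = np.sqrt(disc)
    t1, t2 = (-b - s) / (2.0 * a), (-b + s) / (2.0 * a)
    return (0.0 <= t1 <= 1.0) or (0.0 <= t2 <= 1.0)

def unoccluded_lights(lights, target, occluders):
    """Indices of lights with a free line of sight to the target point."""
    return [i for i, p in enumerate(lights)
            if not any(ray_hits_sphere(p, target, c, r) for c, r in occluders)]

# Hypothetical setup: three lights 2 m above the table, one occluder
lights = [np.array([x, 0.0, 2.0]) for x in (-1.0, 0.0, 1.0)]
target = np.array([0.0, 0.0, 0.0])             # surgical field
occluders = [(np.array([0.0, 0.0, 1.0]), 0.2)]  # e.g., a head above the wound
active = unoccluded_lights(lights, target, occluders)
```

A real controller would additionally have to balance illuminance, color rendering and glare across the selected sources, which this sketch deliberately leaves out.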
The Digital Media Lab working group (Prof. Rainer Malaka) of the TZI supplements the system with the possibility of gesture and speech control. These novel types of interaction will enable surgeons to readjust the lighting with little effort.
The University of Oldenburg with the Pius-Hospital (Prof. Dirk Weyhe) together with the KIZMO ensures the practical suitability of the system – from requirements analysis to usability and evaluation of the demonstrators. In a laboratory especially designed for real-life research questions (“Living Lab”), surgeries can be simulated with the developed technologies and work processes can be tested.
Work on the project continued during the coronavirus pandemic. A virtual project meeting with 14 participants took place on April 28, 2020, where the current project status was discussed and decisions for the next phase were made. Despite the special circumstances, the participants were very satisfied with both the implementation and the results.