5G promises to bring a wealth of new features especially in the area of mission critical Internet of Things applications. Healthcare is one sector looking to capitalise on this opportunity. For example, Ericsson and King’s College London are collaborating to develop remote robotic surgery using 5G technology.
The collaboration sprang from a chance meeting involving John Cunliffe, formerly of Ericsson; Peter Marshall, Head of Network Product Solutions at Ericsson; and Professor Mischa Dohler, Director of the Centre for Telecommunications Research in the Department of Informatics at King's College London, whose research interests include 5G and the Internet of Things.
‘We agreed 5G would bring a lot of new opportunities, so we thought: what can we do together?’ recalls Marshall. ‘King’s has a lot of expertise in tactile research, so we wondered whether we could add a remote sense of touch, and if so, what value that would provide for healthcare, remote response and emergency situations.’
After only a short time the team expanded to include Dr Toktam Mahmoodi and Dr Maria Lema Rosas, both from King’s College London, who have been instrumental in the recent collaboration and research. For the last 15 to 18 months, King’s and Ericsson have been working closely with the NHS on developing robotic surgery using haptic devices and tactile gloves.
‘It’s about providing a good, tangible use case; creating something because it makes sense, not just because it is good research, and that’s where King’s adds such a lot of value,’ says Marshall.
‘By doing this research we can see what it means from a 5G architectural and technology development point of view. We can see how things will evolve over the next three years – adding virtual reality, for example.’
The current level of robotic surgery is represented by products like the da Vinci Surgical System. It enables surgeons to perform operations by translating the surgeon’s hand movements into smaller, precise movements of tiny instruments inside the patient’s body. The instruments bend and rotate far more than a human hand is capable of doing.
One of the instruments is a laparoscope, a thin tube with a tiny camera and light at the end. It provides a magnified vision system to give surgeons a 3D HD view inside the patient’s body by sending images to a video monitor in the operating room to guide doctors during surgery.
Dr Toktam Mahmoodi, Lecturer in Telecommunications at King’s College London, explains that from a purely medical point of view there are two areas where 5G can advance this level of robotic surgery.
‘The first came out of our discussions with the doctors, who told us they wanted to have the full set of human senses. In particular, they wanted to get back their sense of touch in robotic surgery, so as to improve precision further. They also wanted to have touch, movement and vision in a synchronised way, which is not possible at the moment because of the tight latency requirements,’ she says.
‘The second aspect 5G can provide is a sense of geographic distance,’ continues Mahmoodi. ‘At the moment the doctor has to be in the operating theatre even when using robotic surgery. 5G now brings the possibility of carrying out operations remotely. These are the two main lines of development we think 5G will bring.’
Marshall adds: ‘The da Vinci robot provides the most advanced video for surgeons. They can see and then act according to what they see by moving the robotic arm, but they don’t “feel” if they hit a bone. So, this is what we are trying to enable.’
The first showcase for the initial fruits of the Ericsson and King’s College collaboration was a demonstration of tactile robotic surgery at 5G World 2016 at Olympia in London earlier this year.
The ‘Remote Control and Intervention’ 5G medical use case showed a probe as a robotic representation of a biological finger, which gives the surgeon the sense of touch in minimally invasive surgery, and which is able to send accurate real-time localisation of hard nodules in soft tissue.
In a real surgical environment this would mean the probe, or robotic finger, is able to identify cancerous tissue, for example, and send information back to the surgeon as haptic feedback. The sense of touch was combined with real-time visual feedback from cameras to provide a close view of the soft tissue model.
From the technical point of view, the demonstration was enabled through software-defined networking (SDN), which was configured to provide the necessary Quality of Service by implementing network slicing end-to-end.
Network slicing moves away from the traditional one-size-fits-all network architecture. Using cloud, SDN and network functions virtualisation (NFV) technologies, the network can be broken into building blocks and assembled both programmatically and virtually to suit particular services or scales of activity.
5G will allow ‘slices’ of the network to be configured in this way to provide, for example, a mission critical healthcare service offering. The flexibility provided by these customisable software-defined functions allows the mobile operator to guarantee particular types and levels of services defined by geographical coverage area, duration, capacity, speed, latency, robustness, security and availability.
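The slice parameters listed above can be pictured as a simple data structure. The sketch below is purely illustrative, assuming hypothetical field names and values rather than the actual 3GPP slice templates operators would use:

```python
from dataclasses import dataclass

@dataclass
class NetworkSlice:
    """One isolated, end-to-end partition of the physical network.

    Field names and figures here are illustrative assumptions, not
    real slice templates from the 3GPP/GSMA specifications.
    """
    name: str
    max_latency_ms: float      # one-way latency budget
    min_bandwidth_mbps: float  # guaranteed throughput
    availability: float        # e.g. 0.99999 for mission-critical
    isolated: bool = True      # resources reserved, not shared

# A mission-critical healthcare slice alongside a best-effort one
surgery = NetworkSlice("remote-surgery", max_latency_ms=1.0,
                       min_bandwidth_mbps=50.0, availability=0.99999)
browsing = NetworkSlice("best-effort", max_latency_ms=100.0,
                        min_bandwidth_mbps=1.0, availability=0.99,
                        isolated=False)
```

The point of the sketch is that each slice carries its own guarantees: the operator's SLA is a promise about the fields on that slice alone, independent of what other slices on the same hardware are doing.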
The demonstration at 5G World Summit was largely devised by Dr Maria Lema Rosas, Research Associate at the Centre for Telecommunications Research at King’s College London, who says: ‘It was a natural thing to put together, as in the Department of Informatics we have advanced research in both robotics and telecommunications.
‘The Centre for Robotics Research has been working on haptic feedback for robot sensors and how to add this capability into robots. There are several ways to embed sensors using tactile feedback and we can gather information on laparoscopic surgery (also known as band-aid, keyhole or minimally invasive surgery), for example,’ says Rosas.
‘The question was how to de-couple the doctor from the robot,’ she continues. ‘We talked to the person who built the robot for the demo and he reproduced a biological finger with the ability to feel hard and soft material.
‘You feel with muscles and tendons, so he replicated that with four sensors and strings and put that into the robot. That replaced one of the laparoscopic tools, so that while the robot is touching inside the body, the doctor can have haptic feedback of what it is feeling.’
The team also worked with NeuroDigital Technologies based in Almeria, Spain, creators of the Gloveone haptic glove. ‘We decided to put these two together to see if one could control the other using gestures with the glove and once the robotic finger touched a hard object it would send back tactile information. We converted tactile information into kinaesthetic information,’ explains Rosas.
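That conversion from tactile to kinaesthetic information can be sketched as a mapping from the robotic finger's force sensors onto the glove's actuators. The four-sensor layout, the linear mapping and the function name below are assumptions for illustration, not NeuroDigital's actual Gloveone API:

```python
def tactile_to_kinaesthetic(sensor_readings, max_force=10.0):
    """Map raw forces (newtons) from the robotic finger's four strain
    sensors onto normalised vibration intensities for the glove.

    Linear scaling clipped to 1.0 is an illustrative assumption; a
    real system would calibrate against tissue stiffness.
    """
    return [min(f / max_force, 1.0) for f in sensor_readings]

# Touching a hard nodule spikes one sensor, so the glove vibrates
# hardest at the corresponding fingertip (hypothetical readings)
intensities = tactile_to_kinaesthetic([0.5, 9.0, 1.0, 12.0])
print(intensities)  # → [0.05, 0.9, 0.1, 1.0]
```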
The demonstration also served to show how the different information streams behaved within the network, as the two video streams, the haptic glove and the robot sensors each have very different requirements.
The team was also keen to test the SDN tools by implementing a queue of assigned priorities to these different information flows. For example, by isolating the different flows, the team could ensure the tactile information had a guaranteed level of service, while the visual feeds from the video might have suffered.
Rosas adds: ‘We also intentionally crashed one video to show people that by using SDN and network slicing we could still run one service even though there might not be enough network resources to run the other services. We do this using resource reservation implemented by planning this into the network beforehand.’
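The behaviour Rosas describes, where the tactile flow keeps its guaranteed service while a video stream is allowed to fail, amounts to strict priority scheduling over reserved capacity. A minimal sketch, assuming hypothetical flow names and priority values rather than the team's actual SDN configuration:

```python
import heapq

# Lower number = higher priority (hypothetical assignment)
PRIORITY = {"haptic": 0, "video-primary": 1, "video-secondary": 2}

def schedule(packets, capacity):
    """Serve packets strictly by priority until link capacity is spent.

    `packets` is a list of (flow, size) tuples; whatever does not fit
    is dropped, so the haptic flow keeps its guaranteed service while
    a lower-priority video stream may be starved.
    """
    queue = [(PRIORITY[flow], i, flow, size)
             for i, (flow, size) in enumerate(packets)]
    heapq.heapify(queue)
    served, remaining = [], capacity
    while queue:
        _, _, flow, size = heapq.heappop(queue)
        if size <= remaining:
            served.append(flow)
            remaining -= size
    return served

# With only 12 units of capacity, both video flows cannot be carried:
print(schedule([("video-primary", 8), ("haptic", 2),
                ("video-secondary", 8)], capacity=12))
# → ['haptic', 'video-primary']
```

In a real deployment this policy would live in the SDN switches' queue configuration rather than application code, but the outcome is the same: the mission-critical flow never competes with best-effort traffic.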
Marshall points out that the mission critical aspects of remote surgery are heavily reliant on network slicing. ‘Being able to show the mission critical elements through network slicing was a very good addition to the primary demonstration of tactile capabilities,’ he says.
‘One aspect it was important for us to show was that even if you don’t have an always available, low latency, high data rate network, you can have this isolation between different information with different sensitivities,’ Mahmoodi notes. ‘You can ensure that if there is a bit of your network with that low latency or high data rate, then you can still put applications of high sensitivity into that bit of the network.’
Rosas points out that this does not necessarily mean some applications suffer at the expense of others. Different types of media and use cases have different service level needs, so it does not matter if lower-priority services are a little slower.
‘Each slice has a business point of view,’ says Rosas. ‘So, for example, the healthcare slice would have a priority over most other applications and the mobile operator would have a service level agreement (SLA) with the healthcare industry to provide a certain quality of service. But 4G lacks the flexibility to provide this.’
Marshall adds: ‘You can do a lot with 4G, but if you are able to add augmented reality and virtual reality on top of that, then from a services point of view you are going from a small sphere of influence to a massive sphere of influence in terms of what you can actually achieve.
‘If you then add a sense of touch as well, that takes you to another level again. From a service and application point of view 5G can offer a massive array of options and healthcare is one of them.’
Other 5G healthcare use cases
When it comes to other areas of healthcare which might benefit from 5G technology, Mahmoodi points to the rehabilitation of patients. ‘We have an ageing society. There were a record number of stroke patients in the UK in 2015, so that means a lot of people who need care. The trouble is that keeping them in hospital is too expensive, but there is not enough care available to look after them at home.
‘We think advanced wearable devices are key here,’ she says. ‘This is not sci-fi anymore; there are small companies who are making these kinds of products. A wearable device connected to a 5G network would allow people to roam around and live and work normally.
‘But you would also need perfect coverage to maintain the connectivity, so you can move and roam around,’ she adds. ‘This is not possible with 4G and you would also need a highly responsive network, which again you cannot get with 4G.’
Telecare is another area where 5G could help. It need not necessarily be for rehabilitation, but more for those with chronic conditions who need constant care. If they are on medication it could help ensure they take it. There are already devices that can do this such as implants that measure a diabetic’s blood sugar levels and automatically inject the correct dosage of insulin, for example.
How might a 5G network be architected then? Marshall thinks it will certainly be different to current network architectures. ‘You could do it with small cells, a distributed network with one cell covering 500m. But why not 500 small cells the size of a matchbox distributed down a street just covering a few meters, but providing ubiquitous coverage, very high bandwidth and very low latency?
‘I think we will get a stacked aspect and a sub-stacked aspect, depending on what you want to do with the 5G sub-localised system. When we talk to mobile operators we ask: do you really want to be in charge of everything? Or do you just want to be part of a bigger system with lots of private local networks monitoring things like smart bins?
‘In this respect there is a big question over how existing operators will support such localised systems – will they own them or partner with others?’
So what next for the team? Mahmoodi says they are working with Guy’s and St Thomas’ Hospital Trust to trial some aspects of 5G technology, which could be available pre-2020, the generally agreed date for commercialisation of 5G. However, because of third party involvement the trial is still confidential at this stage. ‘It should be very exciting this time next year,’ teases Marshall.
Rosas says: ‘The main difference I see now from seven years ago when I started is that then it was academia on one side and industry on the other. Before, industry would pick up on what we were doing, but now we are working together much more. Industry needs to make things happen quickly, so working together helps enable that.’
Mahmoodi agrees: ‘Developments are happening faster now because of having the complete knowledge set in one place and the speed of modern communication to go with it. The interfaces here are close; it’s easy to just go and talk to a colleague.’
Marshall sums up: ‘It’s good to have this three-pronged approach combining industry, academia and suppliers in a triangular mix. Whether you are a supplier, an academic or from industry, we all want to be at the forefront of technology. We all have a common goal, but we all come at it from slightly different angles.’
‘What has been achieved from the initial discussions to where we are now has been done in only six months,’ points out Marshall. ‘But in that time we have been able to show a practical use case and we have established an evolutionary path.’
The team hopes to be able to present the next stage of its research in early 2017 when virtual reality and an expanded touch capability are expected to feature.
Image Credit: Photo_Concepts / iStock