Greater use of standardised systems will be needed for the maritime industry to achieve the advantages of digitalisation, says Tore Morten Olsen, President, Maritime, Marlink
We know that digitalisation will have a transformative effect on the maritime industry and that it has the potential to improve everything from operational safety and port call optimisation to environmental performance.
What is not always appreciated is that it is difficult to achieve a revolution in the way that people, assets and data are managed using the technology of a decade ago. Just as Moore’s Law has delivered ever more powerful processors, communications technology has also evolved.
As if we need reminding, maritime communication is ‘different’ to terrestrial communications, principally because of its inherent latency, the delay between data being sent and its arrival. All networks face the challenge of terminating traffic with as low latency as possible, and an increase in throughput does not by itself solve the latency issue, because the signal’s journey is still into space and back. Overcoming this challenge requires a combination of engineering, hardware and software expertise.
The trend in data analytics is once again moving inexorably towards the cloud. But maritime satellite services operate with a level of latency that makes cloud-based computing and applications challenging. Even the use of LEO constellations does not eliminate the problem, because even though the distance between earth and satellites is smaller, the journey from the vessel across the network to an end point is still long compared to terrestrial services.
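The scale of the gap can be sketched with a back-of-the-envelope propagation calculation. This is a deliberate simplification, assuming a satellite directly overhead and ignoring processing, queuing and terrestrial backhaul, which in practice add substantially to the figures, especially for LEO:

```python
# Round-trip propagation delay for GEO vs LEO satellite links.
# Simplified: satellite assumed directly overhead; processing,
# queuing and ground-network backhaul are ignored.

SPEED_OF_LIGHT_KM_S = 299_792  # km/s in vacuum

def round_trip_ms(altitude_km: float) -> float:
    """Ship -> satellite -> ground station and back, in milliseconds."""
    one_way_km = 2 * altitude_km  # up to the satellite, back down to Earth
    return 2 * one_way_km / SPEED_OF_LIGHT_KM_S * 1000

GEO_ALTITUDE_KM = 35_786   # geostationary orbit
LEO_ALTITUDE_KM = 550      # altitude typical of modern LEO constellations

print(f"GEO round trip: {round_trip_ms(GEO_ALTITUDE_KM):.0f} ms")  # ~477 ms
print(f"LEO round trip: {round_trip_ms(LEO_ALTITUDE_KM):.1f} ms")  # ~7.3 ms
```

Even with LEO’s far smaller propagation delay, the end-to-end path from vessel to a cloud endpoint adds the ground-network leg that keeps it well above a purely terrestrial connection.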
Our experience in smart hybrid networks suggests there are mitigations, including a form of buffering that smooths out the end user experience by manipulating the journey of the data to enable it to be delivered as a complete package even if the process is interrupted.
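One way to picture such a mitigation is a reassembly buffer that holds partial transfers and hands data to the application only as a complete package, so a link interruption shows up as a pause rather than a partial delivery. The sketch below is purely illustrative; the class and chunk layout are invented for the example, not a description of any vendor’s implementation:

```python
class ReassemblyBuffer:
    """Illustrative buffer: collects out-of-order chunks of a transfer
    and releases the payload only once every chunk has arrived."""

    def __init__(self, total_chunks: int):
        self.total_chunks = total_chunks
        self.chunks: dict[int, bytes] = {}

    def receive(self, index: int, data: bytes) -> None:
        # Duplicate chunks (e.g. after a retransmission) are harmless.
        self.chunks[index] = data

    def complete(self) -> bool:
        return len(self.chunks) == self.total_chunks

    def payload(self) -> bytes:
        if not self.complete():
            raise ValueError("transfer still in progress")
        return b"".join(self.chunks[i] for i in range(self.total_chunks))

buf = ReassemblyBuffer(total_chunks=3)
buf.receive(0, b"weather ")
buf.receive(2, b"package")     # arrives out of order after an interruption
buf.receive(1, b"routeing ")   # link recovers, the gap is filled
print(buf.payload())           # b'weather routeing package'
```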
Network performance can be affected for better or worse by a number of factors: the orbital spread of the available satellites, the likelihood of congestion and the number of overlapping beams in areas of densest traffic. One-off problems such as blockage of the signal by the ship’s superstructure are a fact of life, but they shouldn’t be a reason to incur higher costs or sacrifice performance.
For vessel operators, getting the service they require will mean comparing their options in detail and creating a benchmark that gives them the certainty they need. The absolute number of satellites in a constellation may not be decisive above a certain point, but the orbital spread and the number of overlapping beams make a critical difference to service quality.
The core differentiator is whether operators are using a guaranteed service or one that provides ‘best effort’. For some low-level application usage, best effort will perform adequately, but when communications become critical, an uptime and throughput guarantee will be required as the industry moves towards more complex solutions.
There are different categories of urgency and owners need to plan and prioritise the traffic in a particular data stream. Some OT systems may only need to ping once in 24 hours, others will send more frequent updates and systems must also be able to cope with unforeseen events that need to take bandwidth priority.
Achieving a new level of operational efficiency, certainty and security on the basis of “this has worked until now” may not be enough for the new technologies coming into play. A long-term connectivity partner will need to do more than connect one user to another; value-added services go well beyond basic user applications to backbone systems such as automatic updating and to more emergent technologies like routeing data around software-defined networks.
The trend towards cloud-based computing and applications also suggests that the industry can only take full advantage of digitalisation by moving towards the use of standardised software tools rather than the traditional maritime-specific systems that have emerged independently over time. This is particularly true of cloud services which tend to employ disparate standards; these need to converge for the maritime cloud to work to its full potential.
The software, monitoring and performance tools that shipping has used until now can continue to be used over hybrid networks. The challenge lies ahead, as growing demand for cloud computing pushes latency further up the agenda. Being able to connect to shore and transmit data should be considered ‘business as usual’; storing and processing data in cloud-based corporate systems will require us to think differently.
These challenges must be considered as part of the digitalisation story, and the shipping industry needs to understand the limitations and challenges in coming closer to an ‘onshore experience’ in the near future.
The physics of satellite connectivity might work against delivering standard shore-based offerings such as Microsoft Office 365 or Citrix to maritime users, but with the proper expertise and consideration, we can optimise every possible parameter to build a good user experience, making standardised tools work within the framework of the maritime communications market.
About Tore Morten Olsen, President, Marlink Group
Tore Morten Olsen holds an M.Sc in Telecommunications, obtained in 1993 from the Norwegian Technical University, and has participated in Executive MBA programmes at Wharton Business School in the United States, INSEAD in France and the Stockholm School of Economics in Sweden. He has 24 years of experience in the satellite communications sector, starting out as a technical product manager in 1994 and moving on to hold several senior management positions with Telenor, Astrium Services, Airbus Defence and Space, and Marlink.
The post Shipping and the cloud: ‘business as usual’ versus ‘next generation’ appeared first on SAFETY4SEA.