We need to enact a strategic pivot in oceanographic research infrastructure from a fossil-fuel dependent, ship-centric model to a green, digitally enabled ecosystem that synthesises advances in autonomous technology with next-generation research vessels.
Why? Because there is an imperative to measure the ocean in ever greater detail if we are to understand, predict and mitigate the environmental, social and economic impacts of the climate crisis. And we must be part of the solution while we do it.
The ocean is the key component of the Earth’s climate and ecological systems. It covers over 70% of the Earth’s surface and acts as the planet’s primary reservoir of heat and carbon, absorbing over 90% of the extra heat from global warming and around 30% of carbon dioxide emissions. It plays a critical role in supporting life on our planet, from the air we breathe and the food we eat to driving weather and climate patterns. As one might expect, the ocean is a complex and dynamic system, and it is only by exploring it that we can seek to understand its current state and attempt to predict future changes.
Recent studies have documented and advanced understanding of the current and ongoing changes occurring in our oceans: these include rapid ocean warming, increasing ocean acidification and deoxygenation, accelerating sea level rise, dramatic sea ice declines in both polar regions, and the increasing frequency and intensity of marine heatwaves.
The data collected and analyses conducted have provided the evidence base that underpins key international assessments such as those by the Intergovernmental Panel on Climate Change (IPCC) and the recently agreed High Seas Treaty under the UN Convention on the Law of the Sea (UNCLOS). And there is broad international consensus on the key priorities to advance ‘the science we need for the ocean we want’.
The UK currently has two world-class research ships (the RRS Discovery and RRS James Cook) and the fantastic polar research vessel, the RRS Sir David Attenborough. With sufficient investment the UK can maintain its position as a top-tier nation in oceanography, a position held since the HMS Challenger expedition of 1872. The future infrastructure envisaged places data at its core and would link up hundreds of autonomous platforms to scientific satellites, oceanic floats and research vessels.
AI would optimise the mission planning and support vastly improved data sharing. The 3D world of our coastal seas and deep ocean would be made transparent as hundreds of data streams are beamed back to land and brought together to stream this fascinating yet under-explored ecosystem in high definition.
Gathering and accessing marine data
For this strategic pivot to be successful and support our net-zero targets by 2040, we need to accelerate the adoption of new technologies. Success relies upon data that users trust, and upon sophisticated technology that is robust, available, interoperable and affordable.
But observing the ocean is complicated due to its vastness, unique characteristics (including extreme pressure at depth, partly covered in ice, dynamic and with complex interactions on all scales), its inhospitable nature (to humans) and its acute incompatibility with electronics. Capturing data is therefore often time consuming, inherently risky, expensive and technically challenging. Operating at sea and collecting the data is always the final stage in a long logistical effort that involves multiple teams and is dependent upon a wide structural foundation.
In the future those foundations will require ‘in-the-loop’ testing facilities in bespoke labs, exemplified by the MARS facilities at NOC, through to in-water trials in fully instrumented test and evaluation centres, such as the SMART-sound facility in Plymouth harbour. Maintenance and servicing facilities that allow novel sensors to be integrated, new power systems to be installed and software upgrades to be downloaded will have to be expanded.
And because this is cutting-edge and often ground-breaking those teams and structures aren’t yet embedded or optimised. In fact, in many cases the current systems work against them as funding and processes are tied to the ships. So, this pivot is a real challenge, but it has a growing number of supporters who can see both the opportunities and the imperative.
Very few countries can afford to build and operate global-class, multi-role research ships capable of exploring all parts of the ocean, from the edge of ice sheets to the deepest trenches. But over the last few decades the introduction of scientific satellites and oceanic floats has opened up access to a swathe of marine data.
The low entry costs of autonomous platforms (around £100k for an uncrewed surface vehicle or autonomous underwater vehicle, compared with £100M+ for a research ship) and the reduced shore infrastructure requirements will allow more countries and institutes to access this technology. This should continue to expand access to marine data, especially if the Argo and Copernicus models of open data are followed.
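The scale of that cost gap is worth making explicit. A rough sketch using the indicative figures quoted above (these are order-of-magnitude numbers, not procurement costs):

```python
# Rough cost-scaling comparison using the indicative figures in the text:
# a global-class research ship at £100M+ versus an uncrewed surface
# vehicle or AUV at roughly £100k. Illustrative arithmetic only.

SHIP_COST_GBP = 100_000_000   # £100M+ for a global-class research ship
AUV_COST_GBP = 100_000        # ~£100k for a USV or AUV

platforms_per_ship = SHIP_COST_GBP // AUV_COST_GBP
print(f"One ship's capital cost ≈ {platforms_per_ship} autonomous platforms")
```

On these figures the capital cost of a single ship would fund on the order of a thousand autonomous platforms, which is why even a modest fleet dramatically changes the economics of access.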
In addition, more people will be able to access the data, and a wider group can become involved in operating the equipment as the requirement to deploy on ships for weeks or months is removed. So a major element of a future marine research infrastructure should be easy scalability, in both capability and cost, delivered by a more diverse workforce.
The risks – network effects and switching costs
But the real opportunity, the future vision that places data as the central core and links up hundreds of autonomous platforms to scientific satellites, oceanic floats and research vessels, will be lost if we aren’t able to make the systems interoperable and/or they aren’t coordinated in a sensible manner.
The ‘network effect’ is a phenomenon whereby each additional participant increases the value of a network for all its users. The internet is the best example of this. As more users gained access to the internet, they produced more content, information, and services. The development and improvement of websites attracted more users to connect and do business with each other. As it grew, the internet offered more value, leading to the ‘network effect’.
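One common way to formalise this (not stated in the article, but a standard illustration often attributed to Metcalfe) is that the number of possible pairwise connections in a network of n participants grows as n(n−1)/2, so the potential value grows roughly with the square of the number of participants:

```python
# Illustrative sketch of the network effect: the number of possible
# pairwise data-sharing links among n networked platforms grows as
# n(n-1)/2. Fleet sizes below are illustrative, not from the article.

def pairwise_links(n: int) -> int:
    """Number of distinct pairs among n networked platforms."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:>5} platforms -> {pairwise_links(n):>7} possible links")
```

Ten platforms offer 45 possible links; a thousand offer nearly half a million, which is why connecting fleets matters far more than growing any single fleet.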
The autonomous element of the future marine research infrastructure is significantly improved by making real-time data accessible. Planned missions and monitoring activities can be optimised by information collected separately and shared in real time. But behind that statement lie common data standards, interoperability, retrievability of data from data portals and accessible metadata, among other issues. And the opportunities made available by the network effect are reduced by switching costs, particularly as this is still an emerging technology and sector and so the pace of change is rapid.
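To make that concrete, here is a minimal sketch of what a shared, self-describing observation record might look like. The field names are hypothetical, loosely inspired by CF/Argo-style conventions rather than any standard named in the article; the point is that a record must carry enough metadata to be retrievable and reusable without contacting the originating team:

```python
# Hypothetical minimal schema for an interoperable ocean observation.
# Field names are illustrative, loosely following CF/Argo-style
# conventions; they are not a real standard from the article.
import json

REQUIRED_FIELDS = {"platform_id", "timestamp_utc", "latitude",
                   "longitude", "variable", "units", "value"}

def validate(record: dict) -> bool:
    """A record is shareable only if all required metadata is present."""
    return REQUIRED_FIELDS <= record.keys()

observation = {
    "platform_id": "ALR-42",          # hypothetical long-range AUV ID
    "timestamp_utc": "2024-05-01T12:00:00Z",
    "latitude": -60.5,
    "longitude": -45.2,
    "depth_m": 950.0,                 # optional extra context
    "variable": "sea_water_temperature",
    "units": "degC",
    "value": 1.87,
}

print(json.dumps(observation, indent=2))
```

A shared validation step like this, applied at every data portal, is one small example of the coordination the network effect depends on.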
Switching costs are the costs that a user incurs as a result of changing brand, suppliers or products. If those costs are too high, then we risk having disparate systems with users locked into niche or old technology and therefore not able to support the wider network, which has moved on to newer technology. If we try to use autonomy in the same way that we have used research ships, as isolated research expeditions that quarantine data for years, we will fail to achieve the spatial and temporal benefits that are possible, and we won’t be able to scale costs and maximise potential users of the data. Currently there is no ‘guiding hand’ for this, although the Marine Technology Society (MTS), National Oceanic and Atmospheric Administration (NOAA) and Global Ocean Observing System (GOOS) recently convened a series of excellent workshops to discuss these and associated issues.
The big prize
Precise, accurate sensors are the key element of past, current and future research infrastructures. The key challenge for the future is the requirement to make them smaller, lower energy and ‘plug and playable’. Fully pulling back the veil on our ocean is only possible if we have sufficient sensors, capable of measuring all parts of the physical, chemical, biological and geological worlds and we can deploy them in sufficient numbers.
The R&D into sensors therefore needs to increase dramatically and engineers must be incentivised to both take risks and share lessons – it can take a decade from initial idea to adoption, and we don’t have that amount of time. But even then, at the risk of repeating myself, unless the data streams from all of those sensors are relayed back to shore and shared between all those who wish to use them, we will fail to win the big prize and may even go backwards in capability.
Digital Twins of the Ocean
To better understand Digital Twins, I’d recommend listening to Dr John Siddorn’s podcast (https://www.youtube.com/watch?v=D4qyDdvn5rQ). However, I like to think of them as animations of life under the waves that we can poke and tweak to see what happens when one part of them changes. There are lots of teams working on these sorts of visualisations, and they capture the imagination of children and the general public in ways that scientific papers never will. But they require regular measurements to validate those ‘animations’. And the exciting bit is that researchers might use them to test ideas and then direct ‘Boatys’ to see if their ideas are correct. We could experiment in the lab and a few weeks later confirm those experiments in the ocean.
Boaty McBoatface – the benefit of long-range AUVs
Boaty McBoatface is a fantastic case study. Working with the UK National Oceanography Centre (NOC), the Natural Environment Research Council (NERC) has invested over a long period of time to support the development of this long-range AUV. As a result of that collaboration, we now have access to a platform that can dive to 6000m, explore miles and miles under ice shelves, swim through complex terrains and even hibernate on the seabed until required.
It has been designed to carry multiple sensors and to allow new ones to be integrated as easily as possible. It can be deployed from shore and stay at sea for months at a time. Imagine what marine scientists could do if the UK had access to 100 of these platforms – and the cost would be half that of building a research ship!
The next generation of ocean scientists
I hope that we can get this technology to work in the way envisaged, because the opportunities for the next generation of ocean scientists are immense: the coverage provided could be extraordinary, barriers to joining that community could be lowered, and collaborative networks based outside of traditional structures could be introduced. The intention is to ensure the next generation are at the heart of the development, but funding structures make that a challenge: early-career scientists are often forced to focus on teaching, research proposals or both, so the time they have available to work closely with engineers is limited and seems to be reducing further.
I often reflect upon the life of Dr John Swallow FRS – the inventor of the ‘Swallow float’. The citation for his election to the Royal Society in 1968 noted that “by numerous observations with this ingenious device, he and others have completely changed our picture of the deep circulation of the ocean”. But the Swallow float was not an immediate success and was refined over multiple research expeditions – how might we ensure that opportunity to innovate is available in the context of this new infrastructure?