Sea Trials Begin for Mayflower Autonomous Ship's 'AI Captain'
New class of marine AI to advance $90BN autonomous shipping market
A new class of marine AI has been developed by Promare and IBM engineers to advance the $90BN autonomous shipping market. IBM (NYSE: IBM) and marine research organization Promare announced that a new 'AI Captain' will enable the Mayflower Autonomous Ship (MAS) to self-navigate across the Atlantic later this year, and that it will go to sea this month for testing.
The trial, which will take place on a manned research vessel off the coast of Plymouth in the UK, will evaluate how the AI Captain uses cameras, AI and edge computing systems to safely navigate around ships, buoys and other ocean hazards that it is expected to meet during its transatlantic voyage in September 2020.
MAS will trace the route of the original 1620 Mayflower to commemorate the 400th anniversary of the famous voyage. Sailing from Plymouth, UK to Plymouth, Massachusetts with no human captain or onboard crew, it will become one of the first full-sized, fully autonomous vessels to cross the Atlantic. The mission will further the development of commercial autonomous ships and help transform the future of marine research.
Don Scott, CTO of the Mayflower Autonomous Ship, stated: “Many of today’s autonomous ships are really just automated – robots which do not dynamically adapt to new situations and rely heavily on operator override. Using an integrated set of IBM’s AI, cloud, and edge technologies, we are aiming to give the Mayflower the ability to operate independently in some of the most challenging circumstances on the planet.”
Two years of training and a million nautical images
Over the past two years, the Mayflower team has been training the ship’s AI models using over a million nautical images collected from cameras in the Plymouth Sound in the UK as well as open source databases. To meet the processing demands of machine learning, the team used an IBM Power AC922 fuelled by IBM Power9 CPUs and NVIDIA V100 Tensor Core GPUs, the same technologies behind the world’s smartest AI supercomputers. Now, using IBM’s computer vision technology, the Mayflower’s AI Captain should be able to independently detect and classify ships, buoys and other hazards such as land, breakwaters and debris.
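The detection-and-classification step described above can be sketched in miniature. The class list, threshold, and function below are illustrative assumptions for this article, not the Mayflower's actual model or interfaces:

```python
# Hypothetical sketch: turning per-class scores from a vision model run
# on one camera frame into a ranked list of detected hazards. The class
# names and threshold are illustrative, not the Mayflower's real ones.

HAZARD_CLASSES = ["ship", "buoy", "land", "breakwater", "debris"]

def classify_hazards(scores, threshold=0.5):
    """Return (label, score) pairs for every class scoring above threshold.

    `scores` holds one probability per entry in HAZARD_CLASSES, e.g. the
    softmax output of an image classifier for a single frame.
    """
    detections = [
        (label, s)
        for label, s in zip(HAZARD_CLASSES, scores)
        if s >= threshold
    ]
    # Highest-confidence hazards first, so the navigation layer can
    # prioritise the most certain detections.
    return sorted(detections, key=lambda d: d[1], reverse=True)

# One frame's scores: a likely ship and some floating debris.
print(classify_hazards([0.91, 0.12, 0.03, 0.08, 0.64]))
# → [('ship', 0.91), ('debris', 0.64)]
```

In practice the per-class scores would come from a model trained on the million-image corpus described above; the post-processing shown here is the simple part that converts those scores into navigable information.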
The trial begins
MAS will rely on IBM’s advanced AI and edge computing systems to sense, think and make decisions at sea, even with no human intervention. With the three hulls of the trimaran MAS currently reaching the final phase of construction in Gdansk, Poland, a prototype of the AI Captain will first take to the water on a manned vessel – the Plymouth Quest – a research ship owned and operated by the Plymouth Marine Laboratory in the UK.
The March sea trials will be conducted in the waters of Smart Sound Plymouth under the watchful eye of the Plymouth Quest’s human crew. They will help determine how the Mayflower’s AI Captain performs in real-world maritime scenarios, and provide valuable feedback to help refine the ship’s machine learning models.
Getting there (safely)
In addition to pursuing the overall mission objective of reaching Plymouth, Massachusetts in the shortest amount of time, the AI Captain will draw on IBM’s rule management system (Operational Decision Manager – ODM) to follow the International Regulations for Preventing Collisions at Sea (COLREGs) as well as recommendations from the International Convention for the Safety of Life at Sea (SOLAS). Used widely across the financial services industry, ODM is particularly suited to the Mayflower project as it provides a completely transparent record of its decision-making process, avoiding ‘black box’ scenarios.
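The appeal of a rules engine here is auditability: every rule that fires leaves a record. A minimal sketch of that idea, using drastically simplified stand-ins for two COLREGs rules (the encounter labels, function, and rule wording are assumptions for illustration, not ODM's API or the full regulations):

```python
# Hedged sketch of a transparent rules engine in the spirit of ODM:
# each fired rule is recorded, so the decision path can be audited
# afterwards instead of disappearing into a 'black box'.

def evaluate_colregs(encounter):
    """Apply two simplified collision rules and log each rule fired."""
    trace = []
    action = "hold course"
    if encounter == "head-on":
        # COLREGs Rule 14 (simplified): vessels meeting head-on
        # each alter course to starboard.
        trace.append("Rule 14: head-on -> alter course to starboard")
        action = "alter to starboard"
    elif encounter == "crossing-from-starboard":
        # COLREGs Rule 15 (simplified): give way to a vessel
        # crossing from your starboard side.
        trace.append("Rule 15: crossing from starboard -> give way")
        action = "give way (slow or alter to starboard)"
    else:
        trace.append("No collision rule fired -> stand on")
    return action, trace

action, trace = evaluate_colregs("head-on")
print(action)  # the recommended manoeuvre
print(trace)   # the auditable record of which rules fired
```

A production rules engine externalises the rules themselves (so mariners and regulators can review them without reading code); the trace shown here is the property the article highlights.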
As the weather is one of the most significant factors impacting the success of the voyage, the AI Captain will use forecast data from The Weather Company to help make navigation decisions. A Safety Manager function (running on RHEL) will review all of the AI Captain’s decisions to ensure they are safe – for the Mayflower, and for other vessels in its vicinity.
How the Mayflower senses, thinks and acts at sea
The Mayflower’s AI Captain will combine these technologies and processes to independently assess situations and decide what action to take. It will sense, think and act in the following ways:
Senses (assesses current environment & identifies hazards)
- Radar detects multiple hazards in MAS’s path, 2.5 nautical miles ahead
- Onboard cameras provide visual input to IBM computer vision system which identifies hazards as: a cargo ship, a fishing vessel and three partially submerged shipping containers floating in the water
- Automatic Identification System (AIS) provides specific information about the cargo ship’s class, weight, speed, cargo, etc.
- GPS Navigation System – provides MAS’s current location, heading, speed and course
- MAS’s nautical chart server provides geospatial information about its chosen route
- Weather data provided by The Weather Company
- Attitude Sensors – assess local sea state (how MAS pitches and rolls due to waves)
- Fathometer – provides water depth measurements
- Vehicle Management System – provides operational data such as MAS’s battery charge level, power consumption, communications, science payloads etc.
Thinks (evaluates options)
- IBM Operational Decision Manager (ODM) evaluates COLREGs with respect to the other vessels in the vicinity and generates a risk map indicating an “unsafe” situation ahead
- MAS’s AI Captain ingests the ODM recommendation, computer vision input, current and forecasted weather and assesses several options to avoid hazard
Acts (chooses best actions and instructs vessel)
- AI Captain determines the best action for MAS, e.g. steering to starboard to avoid an unexpected navigation hazard
- MAS’s Safety Manager verifies the decision as safe
- AI Captain instructs MAS’s Vehicle Management system to change course and speed.
As the ocean is an ever-changing dynamic environment, the AI Captain will constantly re-evaluate the situation and update the course of the Mayflower as situations evolve.
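The sense–think–act cycle above, including the Safety Manager's veto, can be sketched as a simple control loop. Every function name and data shape here is an assumption for illustration, not the Mayflower's real interfaces:

```python
# Illustrative sense–think–act loop with a Safety Manager veto, modelled
# on the pipeline described above. The world view is reduced to two
# booleans for the sake of the sketch.

def sense():
    # Fuse radar, cameras, AIS, GPS, weather, etc. into one world view.
    return {"hazard_ahead": True, "clear_to_starboard": True}

def think(world):
    # Evaluate options against the rules and pick a candidate action.
    if world["hazard_ahead"] and world["clear_to_starboard"]:
        return "steer to starboard"
    return "hold course and speed"

def safety_check(action, world):
    # Safety Manager: veto any manoeuvre into a known hazard.
    if action == "steer to starboard" and not world["clear_to_starboard"]:
        return False
    return True

def step():
    """One tick of the control loop; rerun continuously at sea."""
    world = sense()
    action = think(world)
    if safety_check(action, world):
        return action               # forwarded to the Vehicle Management System
    return "hold course and speed"  # safe fallback if the decision is vetoed

print(step())  # → steer to starboard
```

Running `step` on a timer, with `sense` fed by live instruments, gives the constant re-evaluation the article describes: each tick starts from a fresh world view rather than a stale plan.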
The March sea trials will take place for approximately two months on the Plymouth Quest with the ship’s human captain and crew at the helm. In the first stage of testing, the Mayflower AI Captain’s inference engine will receive input from the Quest’s radar, AIS, GPS and navigation system, as well as data about visibility. Cameras, computer vision, edge and autonomy capabilities will be added in the next phase of testing from April.