1. Introduction
The primary task of maritime pilots is to ensure the safe passage of vessels through challenging or congested waters by providing expert navigation guidance [1]. This remains important despite advancements in navigation technology. Pilots possess in-depth knowledge of local waterways, including currents, tides, depth variations, and potential hazards. While pilots provide guidance, the ultimate responsibility for the safety and navigation of the vessel rests with the ship’s captain. Pilots serve in an advisory capacity, offering recommendations and assistance based on their expertise. Pilotage remains a dangerous profession, with several casualties occurring each year; in particular, the process of boarding and leaving a vessel is associated with risks [2]. This points to one of the benefits of remote pilotage: by removing the need for a pilot to be physically present onboard a vessel, remote pilotage can reduce the risk of accidents and injuries during pilot transfer operations. Additional benefits are scheduling flexibility and time savings, according to a Finnish study [3].
However, moving from onboard to remote pilotage comes with challenges, particularly regarding human factors. Communication remains crucial: the pilot and master must still communicate efficiently to ensure smooth and safe navigation despite being geographically separated and lacking non-verbal communication. Additionally, with the pilot no longer on the bridge, a high level of situational awareness (SA) must be ensured for the pilot ashore. Any future Remote Pilotage Technology must address these challenges so that its benefits can be realized without adverse effects on navigational safety. New technology and interaction concepts should therefore also be investigated with regard to their applicability. Thus, this paper tests and assesses different user interface concepts for remote pilotage with regard to their principal usability, answering the question of whether they are suitable visualization and interaction technologies for the Remote Pilotage Systems of the future. A significant innovation of this paper is the on-site testing of these technologies on an actual vessel, a pioneering approach in the field. Our research evaluates the practical application and usability of these systems under real-world maritime conditions. This hands-on testing not only provides more realistic insights, but also bridges the gap between theoretical concepts and practical implementation, offering valuable contributions to the current literature and industry practice. The tested technologies are a classical desktop visualization for shore-based pilots, immersive virtual reality (VR) technology for shore-based pilots, and immersive augmented reality (AR) technology for onboard masters being piloted.
For the immersive technologies, prototype systems introduced in the concept presented in [4] have been used. By transmitting sensor data and a 360° video stream from the ship to shore and displaying them in a VR environment together with an integrated electronic nautical chart application, a high level of SA shall be assured for the pilot. On the ship, an AR system superimposes essential data about the traffic situation onto the master’s view. Voice communication between shore and ship is supplemented by a marker and hint system aimed at efficient and unambiguous communication. The prototype of the concept was tested in an in situ trial on a ferry during a voyage in the Baltic Sea. For the desktop visualization, a similar prototype with a comparable level of technological readiness was created. As the focus was on testing the principal usability of the different technologies rather than a specific system implementation, all systems had a pre-commercial implementation standard.
The Introduction is followed by a brief recap of the state of the art in remote pilotage and maritime mixed reality in Section 2. The system prototypes are described in Section 3. The testing procedure and the assessment are presented in Section 4 and Section 5, respectively. A discussion follows in Section 6, and Section 7 concludes the paper.
2. State of the Art
2.1. Remote Pilotage
Remote pilotage has been the subject of scientific publications for at least 20 years [5]. To date, however, the most advanced practical efforts to establish remote pilotage have been limited to parts of the fairway [6]. An example of this is the port of Rotterdam, where remote pilotage (here called shore-based pilotage) is offered between the Maas Centre pilot station and the Hoek van Holland traffic center. In this port, the pilot communicates with the ship via a VTS-monitored VHF channel and is supported by a VTS monitoring view and a land-based radar image [7]. Apart from a few exceptional cases, the pilot still boards the ship at some point; complete remote pilotage does not take place [8].
Implementations of remote pilotage in the context of research projects are often limited to simulation environments [6]. One exception is the Sea4Value research project, in which remote pilotage was tested in Finnish waters in 2022. To support the pilot, sensor data are transmitted from the ship to the shore, including the ship’s forward view as a video stream. The above-mentioned research projects and implementations are based on conventional display technology, with communication between the shore station and the ship taking place exclusively verbally via a radio link.
There is broad consensus in the scientific community that communication and trust are the two most important factors for the successful execution of pilotage maneuvers [9,10]. In addition, communication is considered essential for successful decision-making and for building SA [6]. A survey of pilots confirms that communication is a key factor for successful piloting and that advanced technologies facilitating communication are seen as a prerequisite for remote pilotage [6]. Besides human factors, communication stability and proper officer qualification are seen as key enablers for such services [3].
2.2. Maritime Mixed Reality
The approach of developing a remote pilotage system based on VR and AR technology has not yet been implemented. Modern ship navigation on a bridge is already supported by a variety of sophisticated digital and automated tools such as radar, automatic radar plotting aids, and electronic navigation systems. However, the International Regulations for Preventing Collisions at Sea (COLREGs) still apply. Rule 5 requires that every vessel shall at all times maintain a proper look-out by sight and hearing as well as by all available means appropriate in the prevailing circumstances and conditions so as to make a full appraisal of the situation and of the risk of collision [11]. It is expected that AR technology will allow bridge personnel to fulfill their duty to keep a lookout while having the information from modern tools at their disposal.
There are already several research projects investigating AR in ship navigation [12,13] and maritime traffic control [14]. In most cases, augmentation is limited to overlaying the navigator’s field of view with AIS data. Furthermore, most implementations have a low technology readiness level (TRL). In a systematic literature review on AR in maritime collaboration, van den Oever et al. [15] concluded that it would be more advantageous to develop prototypes with a higher TRL and criticized the lack of scientific evaluation of the prototypes developed so far.
Until now, virtual reality in the maritime sector has primarily been used for training. An exception is the FernSAMS project, where a VR setup is used to steer a tugboat from shore with a 360° video stream as the primary sensor input [16,17]. This work led to the MR demonstrator for remote pilotage presented in [4], which is used here for the technology assessment. To the authors’ knowledge, no further approaches for VR-based remote pilotage stations exist at the moment.
3. System Overview
This section outlines the development and deployment of a mixed reality infrastructure aimed at facilitating remote pilotage operations. An in-depth analysis of the infrastructure can be found in the paper “Use Case Remote Pilotage—Technology Overview” [4]. Focusing on augmenting SA and the interaction between the ship’s captain and remote pilots, this infrastructure integrates augmented reality on the ship side with a desktop and a virtual reality application on the shore side. Notably, all system validations, including those of the shore-side applications, were conducted onboard a vessel, reflecting a unique approach to evaluating the interaction dynamics under real navigational conditions. All applications were developed in the Unity game engine (Unity Technologies, San Francisco, CA, USA).
3.1. System Infrastructure
Data acquisition is facilitated through a comprehensive network of onboard sensors capturing vital navigational inputs such as GNSS and AIS signals. These sensors channel data to a central processing unit. A general overview is given in Figure 1.
It is worth mentioning that the setup for the user study differed from the conceptual setup described above. For logistical and organizational reasons, the shore-side equipment was also placed on the ship during testing. Therefore, data exchange was performed via a WiFi router instead of a 4G/5G connection. This paper will continue to refer to the respective interfaces as the shore UI/shore application and the ship UI/ship application for clarity and consistency with the initial conceptual framework.
3.2. Information Exchange between Ship and Shore
To ensure a high SA of the pilot, sensor data are transmitted from ship to shore. These data include AIS data and a 360° video stream. In our setup, the GeoVision VR360 camera was used. It produces an H264-encoded RTSP video stream with a resolution of 3840 × 2160 pixels. One of the installed cameras faces forward from its position directly above the ship’s bridge. The second camera is mounted on the port side at the forwardmost corner, also above the ship’s bridge.
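As a rough illustration of the shore-side ingestion of such a stream, the following minimal Python sketch opens an RTSP connection with OpenCV and pulls equirectangular frames. The URL, credentials, and reconnection handling are placeholders, not the project’s actual configuration.

```python
# Minimal sketch (not the project's actual code): reading the ship camera's
# H264/RTSP stream on the shore side with OpenCV.
import cv2

STREAM_URL = "rtsp://user:password@camera-ip:554/stream"  # hypothetical endpoint

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("could not open RTSP stream")

while True:
    ok, frame = cap.read()       # frame: 2160 x 3840 x 3 BGR equirectangular image
    if not ok:
        break                    # stream interrupted; a real client would reconnect
    # hand the equirectangular frame to the renderer (VR sphere or desktop viewport)
    cv2.imshow("360 preview", frame)
    if cv2.waitKey(1) == 27:     # ESC quits the preview
        break

cap.release()
cv2.destroyAllWindows()
```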
Our system offers verbal communication between ship and shore via a VoIP connection. Additionally, we aim to compensate for the lack of non-verbal communication with a system of markers and hints that can be placed by the pilot to give the master additional information (a minimal data sketch follows the list). These are as follows:
Markers for guiding attention towards specific coordinates in the environment.
Bearing line for guiding attention towards a specific direction.
Highlighting for guiding attention towards specific vessels in the environment.
A message system with predefined messages for ensuring safe communication about a chosen target.
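A minimal data sketch of these annotation types, assuming hypothetical field names and message wording rather than the project’s actual schema:

```python
# Illustrative sketch of the shore-to-ship annotation messages described above.
# Field names, types, and the predefined message wording are assumptions.
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    GIVE_WAY = "give way"            # example predefined messages (hypothetical)
    STAND_ON = "stand on"
    CONTACT_VHF = "contact via VHF"

@dataclass
class Marker:                        # guides attention to a coordinate
    lat: float
    lon: float

@dataclass
class BearingLine:                   # guides attention to a direction
    bearing_deg: float               # 0-360, relative to true north

@dataclass
class Highlight:                     # guides attention to a specific vessel
    mmsi: int                        # vessel identified by its AIS MMSI

@dataclass
class TargetMessage:                 # safe, predefined communication about a target
    mmsi: int
    status: Status
```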
3.3. Shore-Side Implementation
The shore-side system was implemented in two variations: a desktop solution and a VR solution. The former runs on a 27-inch touch screen for interaction purposes, while the latter uses the Varjo XR-3 headset with hand-tracking capabilities (see the Shore Application column in Figure 1). The VR application requires significant computational power to deliver a smooth and immersive experience; for our application, we used an Alienware laptop equipped with an NVIDIA RTX 3070 graphics card and a 13th-generation Intel CPU. Communication with the ship’s system was established via local WiFi, as both systems were placed on the ship. As the focus was on usability, this modification was acceptable during testing. The available information is the same in both variants, and the user interfaces are designed to be as similar as possible. Both variants present the live video stream from the 360° cameras on the ship and an electronic sea chart (see Note 1). It is possible to change the displayed chart section and to alter the scale of the map. Essential information about the ship, namely heading, course, and speed, is always visible. Symbols for the vessels in the environment are shown on the chart according to the received AIS data. These symbols can be selected to open a panel with detailed information about a vessel. The info panel also offers options for tagging a vessel with a highlight and attaching a message to it. Markers highlighting specific locations for the master can be placed on the chart.
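To illustrate how AIS targets could be placed on such a chart, the following hedged sketch projects WGS84 positions to Web Mercator and scales them into the current viewport; the projection choice and all function names are assumptions for illustration only.

```python
# Illustrative sketch: placing AIS targets on the chart by projecting WGS84
# latitude/longitude to Web Mercator and mapping into viewport pixels.
import math

def mercator(lat_deg, lon_deg):
    """WGS84 -> Web Mercator coordinates in meters."""
    R = 6378137.0
    x = math.radians(lon_deg) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2)) * R
    return x, y

def to_screen(lat, lon, center, scale, width, height):
    """Map a target into viewport pixels; center is the (lat, lon) of the view,
    scale is pixels per meter at the chart's current zoom level."""
    cx, cy = mercator(*center)
    tx, ty = mercator(lat, lon)
    return (width / 2 + (tx - cx) * scale,
            height / 2 - (ty - cy) * scale)   # screen y grows downward
```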
3.3.1. Desktop Application
The layout of the UI of the desktop application can be seen in Figure 2. The interface is divided between a section for the video stream and a section for the chart. The chart can also be closed so that the video stream is shown in full screen. The interface shows at most a 140° section of the 360° video stream. The compass element and the ship information are placed on top of the video stream.
All interactions with the application are performed via touch gestures on the display surface. The selection of UI elements is performed via a single touch; changing the chart section and the displayed section of the video stream is performed via swipe gestures. Zooming in and out of the chart and the video stream is performed via two-finger gestures.
Figure 2 shows the UI in split-screen mode.
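A simplified sketch of how a viewport of at most 140° could be cut out of the equirectangular frame is given below. A plain horizontal crop is only an approximation (a production viewer would re-project onto a pinhole camera model), and all names are illustrative.

```python
# Simplified sketch: cutting a <=140 degree viewport out of the 360 degree
# equirectangular frame. In equirectangular video, horizontal pixels map
# linearly to azimuth, so a crop approximates the desktop viewport.
import numpy as np

FRAME_W, FRAME_H = 3840, 2160       # resolution of the GeoVision stream

def crop_viewport(frame: np.ndarray, center_deg: float, fov_deg: float = 140.0) -> np.ndarray:
    """Return the horizontal slice centered on center_deg (0..360) spanning fov_deg."""
    px_per_deg = FRAME_W / 360.0
    half = int(fov_deg / 2 * px_per_deg)
    center = int((center_deg % 360.0) * px_per_deg)
    cols = np.arange(center - half, center + half) % FRAME_W  # wrap around the seam
    return frame[:, cols]

# a swipe gesture would simply shift center_deg before the next crop
```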
3.3.2. Virtual Reality Application
In the virtual environment, the user is surrounded by a sphere onto which the video stream is projected. The chart is placed as a 3D element in front of the user; its position can be changed between predefined options, or the chart can be disabled completely if needed. The compass element is implemented as a ribbon in the upper part of the user’s field of view. Figure 3 shows the user interface of the VR application. Interaction with the environment is performed via the hand-tracking capabilities of the headset: rays from the shoulder through the hand intersecting with the UI elements in the environment define the cursor position, and a finger pinch gesture selects the element the cursor is currently on.
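The cursor logic described above amounts to a ray-plane intersection. The following sketch, with illustrative vector names and a flat UI panel assumed, shows the geometry:

```python
# Geometric sketch of the shoulder-to-hand ray cursor: the cursor is where
# the ray from the shoulder through the tracked hand hits a flat UI panel.
import numpy as np

def cursor_on_panel(shoulder: np.ndarray, hand: np.ndarray,
                    panel_point: np.ndarray, panel_normal: np.ndarray):
    """Intersect the ray (shoulder -> hand) with the plane of a UI panel.
    Returns the 3D intersection point, or None if there is no forward hit."""
    direction = hand - shoulder
    denom = np.dot(panel_normal, direction)
    if abs(denom) < 1e-6:                       # ray parallel to the panel
        return None
    t = np.dot(panel_normal, panel_point - shoulder) / denom
    if t < 0:                                   # panel is behind the user
        return None
    return shoulder + t * direction

# Example: chart panel 1.5 m in front of the user, facing back at them
cursor = cursor_on_panel(np.array([0.0, 1.4, 0.0]),    # shoulder
                         np.array([0.3, 1.2, 0.5]),    # tracked hand
                         np.array([0.0, 1.3, 1.5]),    # point on the panel
                         np.array([0.0, 0.0, -1.0]))   # panel normal
```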
3.4. Ship-Side Implementation
The ship-side system harnesses the capabilities of the Microsoft Hololens 2 (Microsoft Corporation, Redmond, WA, USA) to superimpose critical navigational data, including AIS information and route specifics, directly into the captain’s field of vision. This integration is designed to augment the physical maritime environment with digital data.
Figure 4 shows an AR overlay with navigational data for a vessel called “Testship”. On the left side, the vessel information is collapsed behind an AR button; after interacting with the button, the information on the right side is displayed. Essential information such as heading, course, speed, and status is presented alongside the vessel’s position, and a 3D wireframe model provides visual context. Additionally, collision avoidance details such as the closest point of approach (CPA) and the time to the closest point of approach (TCPA) are shown.
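The collision avoidance values shown in the overlay follow the classical CPA/TCPA geometry for two vessels on straight courses. The sketch below is a generic implementation of that computation, not necessarily the one used onboard.

```python
# Hedged sketch of the classical CPA/TCPA computation. Positions are in a
# local metric frame, velocities in m/s; both tracks are assumed straight.
import numpy as np

def cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Return (cpa_distance_m, tcpa_s) for two vessels on straight courses."""
    rel_pos = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    speed2 = np.dot(rel_vel, rel_vel)
    if speed2 < 1e-9:                       # identical velocities: range is constant
        return float(np.linalg.norm(rel_pos)), 0.0
    tcpa = max(-np.dot(rel_pos, rel_vel) / speed2, 0.0)  # clamp if already past
    cpa = float(np.linalg.norm(rel_pos + tcpa * rel_vel))
    return cpa, tcpa

# Example: stationary target 2 km dead ahead, own ship closing at 5 m/s
print(cpa_tcpa((0, 0), (0, 5), (0, 2000), (0, 0)))   # -> (0.0, 400.0)
```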
Interaction with the Hololens 2 system is designed to occur intuitively through hand tracking and hand gestures, alongside voice commands. Specific gestures allow for the expansion or collapse of ship information by pinching the thumb and index finger. Additionally, looking at the right palm activates the display of the own vessel’s navigational information, while glancing at the left palm summons a menu for layer management (Figure 5). This feature permits the toggling of various data layers, such as ships, markers, and waypoints.
Figure 6 delineates the overlays in greater detail. Besides AIS data, the overlays include route information and markers, with AIS details encircling identifiable objects. Depending on the type of vessel, a 3D wireframe model is also presented, enhancing the spatial understanding of nearby maritime traffic. Ships that are highlighted via the shore-side application are marked with arrows at the edges of the field of view if positioned outside the captain’s immediate visual range. Moreover, a bearing indicator set in the shore-side application can be visualized within the Hololens 2, ensuring synchronized navigation and planning between the ship and remote pilots.
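The edge arrows for highlighted vessels outside the captain’s view reduce to a bearing test against the headset’s field of view. A sketch of this logic follows, with the FOV value assumed for illustration (the Hololens 2 horizontal FOV is roughly 43°):

```python
# Sketch of the off-screen indicator logic: if a highlighted vessel's bearing
# relative to the captain's head yaw falls outside the AR field of view, show
# an arrow at the nearer edge. Angles in degrees; FOV value is an assumption.
def edge_arrow(target_bearing, head_yaw, fov_deg=43.0):
    """Return None if the target is visible, otherwise 'left' or 'right'."""
    rel = (target_bearing - head_yaw + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    if abs(rel) <= fov_deg / 2:
        return None              # inside the field of view: draw the normal overlay
    return "right" if rel > 0 else "left"

print(edge_arrow(100.0, 10.0))   # -> 'right'
print(edge_arrow(5.0, 10.0))     # -> None
```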
The technical setup had already been installed and tested on voyages prior to the user study. During these voyages, informal feedback was collected from the ship’s crew and integrated into the prototype. To improve the system further, it was deemed necessary to collect systematic feedback from pilots in an in situ test run.
4. Testing and Evaluation
The testing procedure consists of several campaigns in which the usability, intuitive operation, effectiveness, and accuracy of the systems are evaluated. The results aim to determine the potential of each system to establish maritime SA for safe and robust remote pilotage and, where possible, provide insight into which enhancements could improve these systems. The test setup is divided into four phases aimed at assessing different aspects of technology application in a maritime context (see Figure 7).
4.1. Survey Methods Applied
During testing, three survey methods were employed that are utilized in human–computer interaction research, particularly with respect to immersive systems such as AR and VR. These methods aid in understanding and evaluating user experience, system performance, and the physical and psychological impacts of these technologies on users:
System Usability Scale (SUS): The SUS is an effective tool for assessing the usability of a system. It consists of a brief questionnaire with 10 items, providing a quick gauge of how user-friendly a system or product is [18]. The questions are designed to be generic, making them applicable to a wide range of products or systems, including AR and VR. The SUS scores on a scale from 0 to 100, though the score is not a percentage. Scores can be categorized into ranges: 85 to 100 indicates excellent usability; 68 to 84 reflects good usability; 51 to 67 is considered marginal; below 50 is deemed poor, highlighting significant usability issues. It is critical to acknowledge that the SUS does not provide detailed insights into the specific issues or potential enhancements of a system. Rather, it serves as a general metric for a system’s usability and can be employed as a benchmark for comparing different systems or iterations of the same system.
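For reference, the standard SUS scoring procedure [18] can be sketched as follows; the example responses are invented.

```python
# Sketch of standard SUS scoring (Brooke): odd items are positively worded,
# even items negatively worded, each answered on a 1-5 scale; the summed
# contributions are scaled by 2.5 to land on the 0-100 range.
def sus_score(responses):
    """responses: list of 10 answers, items 1-10 in order, each 1..5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 answers on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd: r-1, even: 5-r
    return total * 2.5

# Example: a fairly positive questionnaire
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```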
Situational Awareness Rating Technique (SART): SART was developed to assess the level of SA experienced by users during system interaction [19]; it quantifies SA [20]. This is particularly critical in AR and VR environments, as these technologies aim to seamlessly integrate digital information into the user’s visual surroundings. A high level of SA implies that users can effectively perceive, comprehend, and respond to the information provided by the system. With this method, subjects rate their own awareness using a questionnaire of ten questions on a 7-point scale, with the lowest score being 1 and the highest 7. Based on these scores, the questions are divided into three categories (understanding, demand, and supply) from which SA is calculated. SART scores range from −14 to 46.
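A sketch of this scoring follows, assuming the question-to-dimension mapping listed in Section 5.2 and the common SART combination SA = Understanding − (Demand − Supply):

```python
# Sketch of 10-item SART scoring, assuming Demand = Q1-Q3, Supply = Q4-Q7,
# Understanding = Q8-Q10, combined as SA = Understanding - (Demand - Supply).
def sart_score(answers):
    """answers: list of 10 ratings, each 1..7, in question order."""
    if len(answers) != 10 or not all(1 <= a <= 7 for a in answers):
        raise ValueError("expected 10 ratings on a 7-point scale")
    demand = sum(answers[0:3])          # attentional demand of the situation
    supply = sum(answers[3:7])          # attentional resources available
    understanding = sum(answers[7:10])  # grasp of the situation
    return understanding - (demand - supply)

# Bounds check: high demand with low supply/understanding gives the minimum,
# low demand with high supply/understanding the maximum
print(sart_score([7]*3 + [1]*4 + [1]*3))   # -> -14
print(sart_score([1]*3 + [7]*4 + [7]*3))   # -> 46
```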
Simulator Sickness Questionnaire (SSQ): The SSQ is used to assess symptoms of simulator sickness (also known as cybersickness) that may occur when using MR systems. Symptoms include nausea, disorientation, and discomfort [21]. The SSQ assists developers and researchers in identifying system aspects that may cause discomfort so that they can be improved and the user experience optimized.
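The SSQ scoring itself, as defined by Kennedy et al. [21], weights three overlapping symptom clusters. The sketch below reproduces the commonly cited symptom-to-subscale mapping and weights; verify against the original paper before reuse.

```python
# Sketch of Kennedy et al. (1993) SSQ scoring. Sixteen symptoms are rated
# 0-3; each loads on one or two of the Nausea (N), Oculomotor (O), and
# Disorientation (D) subscales, whose raw sums are scaled by fixed weights.
SYMPTOMS = [  # (name, subscales it loads on)
    ("general discomfort", "NO"), ("fatigue", "O"), ("headache", "O"),
    ("eye strain", "O"), ("difficulty focusing", "OD"),
    ("increased salivation", "N"), ("sweating", "N"), ("nausea", "ND"),
    ("difficulty concentrating", "NO"), ("fullness of head", "D"),
    ("blurred vision", "OD"), ("dizziness (eyes open)", "D"),
    ("dizziness (eyes closed)", "D"), ("vertigo", "D"),
    ("stomach awareness", "N"), ("burping", "N"),
]

def ssq_scores(ratings):
    """ratings: 16 integers in 0..3, in the order of SYMPTOMS."""
    raw = {"N": 0, "O": 0, "D": 0}
    for (name, scales), r in zip(SYMPTOMS, ratings):
        for s in scales:
            raw[s] += r
    return {
        "nausea":         raw["N"] * 9.54,
        "oculomotor":     raw["O"] * 7.58,
        "disorientation": raw["D"] * 13.92,
        "total":          (raw["N"] + raw["O"] + raw["D"]) * 3.74,
    }
```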
In addition to these standard surveys, application-specific questionnaires (AIS Data and Efficiency, Live Video Stream, and General Questions) were incorporated and discussed with the participants during the testing campaign.
4.2. Phase 1: Familiarization
The introduction of new technologies requires an adjustment period in which users can become acquainted with the functionalities and possibilities of these technologies. This is crucial for acceptance and effective use. In the first test run, the participants were familiarized with the AR, VR, and desktop systems. This allowed them to become comfortable with the functions and operating methods of the systems. The benefits of this familiarization are that the participants develop a better understanding of the functionality of the systems and can assess their user-friendliness:
Objective: determination of the adjustment time and comfort of the participants with the various technologies (AR, VR, desktop).
Methodology: Participants are systematically guided in using the AR, VR, and desktop systems. The SUS and SSQ surveys provide quantitative data on usability and potential physical impairments caused by the technology. This establishes an initial baseline of usability and user comfort and examines the effects of the technologies on the users.
4.3. Phase 2: Optimization
In this testing campaign, the usability and intuitive operation of the systems were evaluated. Participants were asked to perform tasks with the various systems and then complete the SUS and SSQ surveys. Intuitive operation and user-friendliness are crucial for the effectiveness of the systems in critical situations: high usability reduces the risk of operating errors and increases acceptance among users. This phase focuses on how intuitively and effectively the systems can be applied by end-users, which is crucial for the practical implementation of the technologies. Smaller features and changes were implemented in the applications to optimize the routine for the subsequent tests:
Objective: evaluation of the usability and intuitive operation of the shore-side and ship-side systems.
Methodology: performing tasks using the various systems and subsequent evaluation through the SUS and SSQ to quantify usability. SSQ surveys are used to monitor the well-being of the users.
4.4. Phase 3: Consolidation
This phase was designed to evaluate how effectively and accurately the mixed reality technologies could deliver specific, critical information and support users in executing precise maneuvers. By guiding participants through a series of predefined tasks, the tests assessed the capacity of both the VR and AR infrastructure, as well as the desktop and AR setup, to provide real-time assistance in navigational decision-making. With developers at the opposite end of the application, the evaluation was structured to simulate a realistic pilotage scenario in which the developers relayed tasks to the pilots via VoIP communication. The pilots’ interaction with the systems was consolidated in this phase in preparation for the next one:
Objective: to investigate the effectiveness and accuracy of both VR and AR systems, as well as the desktop and AR setup in handling specific pilotage tasks.
Methodology: Each pilot participant was required to test every system once to evaluate the systems’ efficiency and accuracy. The developers, serving as the counterparts in these tests, assigned a set of tasks that the pilots had to perform using the systems. These tasks included the following:
Highlighting a specific ship within the visual field.
Setting a status message for a ship to communicate its operational condition.
Placing a marker in proximity to their vessel or on top of other ships to designate points of interest or navigational relevance.
Adjusting the bearing indicator to aid in the navigation and orientation process.
This phase aimed to provide insights into the user’s ability to complete navigation-specific tasks effectively with each system and to determine the operational accuracy of the mixed reality infrastructure.
4.5. Phase 4: Cooperation
Phase 4 is pivotal in assessing the independent operational capability of the mixed reality systems by examining the direct interaction between pilots without developer intervention. This phase focuses on the core of maritime operations: the effective collaboration and communication between ship masters and pilots under authentic conditions:
Objective: to evaluate the independent usability and efficiency of the shore-side and ship-side applications for cooperative maritime tasks.
Methodology: Pilots enacted typical maritime scenarios, including vessel entry and departure from a harbor, by collaboratively performing tasks using the mixed reality systems. These tasks were designed to simulate the coordination required during actual pilotage without external assistance, ensuring the systems facilitate effective pilot-to-pilot interaction.
This phase of testing emphasizes real-world applicability and the self-sufficiency of the pilots in utilizing the systems’ collaborative tools. The execution and repetition of these scenarios contribute to the iterative enhancement of system performance, mirroring the dynamic and sometimes unpredictable conditions of maritime navigation. Through these exercises, the systems’ potential to reinforce safety and improve operational fluidity in maritime navigation is rigorously examined.
4.6. Test and Interpretation Notes
In total, 42 tests with 3 test participants and 3 systems were conducted. Table 1 gives an overview of the executed test schedule. With regard to interpretation, it must be noted that the test faced limitations concerning the diversity and number of test persons. While all participants had professional maritime backgrounds as pilots or navigators, they were all male and in the age range from 45 to 65. This is representative of today’s pilotage peer group, but the scope for evaluating the system’s usability and effectiveness across a broader demographic spectrum was consequently constrained. Additionally, as testing took place under in situ conditions, the external maritime traffic was representative of real-world situations, but not controllable or comparable between all phases of testing, which could leave room for differing interpretations, as system and traffic assessment can be interlinked by human test participants.
5. System Assessment and User Feedback
During the testing phase, participants provided valuable insights into the usability, functionality, and practical limitations of the systems for remote pilotage. Their feedback is instrumental in identifying areas for improvement and potential enhancements to the system.
5.1. User Interface and Usability
Participants reported that both the AR and VR headsets, along with the screen interface, were generally user-friendly and intuitive to operate. However, they noted that accurately selecting options using the AR headset required a period of adjustment. Concerns were raised about the image quality provided by the 360° video stream, specifically mentioning that it was insufficient for operational needs. Additionally, the zoom functionality did not meet expectations, and the system’s performance during night-time operations was deemed inadequate. Prolonged use of the VR headset was found to cause eye fatigue, suggesting a need for further ergonomic optimization to ensure user comfort during extended periods of use.
Despite these challenges, the overall system architecture aligns with standard nautical procedures and workflows, making it usable in principle. It was acknowledged, however, that the system prototype has not yet achieved full operational readiness, but served as a technology demonstrator. In terms of usability, the overall SUS score, with an average of 68.33 across participants, indicates that the system is on the threshold of above-average usability (see Figure 8). Given the small group of test participants, this is only a first indication and not representative, especially as the assessments differ between the participants in magnitude as well as in order of preference.
5.2. Situational Awareness
The average values of the respondents’ ratings for the individual questions are shown in Figure 9. The computed SA scores of all participants were above the middle of the SART range, indicating that they had good SA during the test scenario.
The overall SA score is calculated from three dimensions, commonly combined as SA = Understanding − (Demand − Supply). They are described as follows [22]:
“Demand” quantifies the extent of the attentional demand placed on the subject during the simulation (represented by Questions 1, 2, and 3).
“Supply” represents the cognitive capacity and uncommitted attention available to the subject during the simulation (represented by Questions 4, 5, 6, and 7).
“Understanding” indicates the extent to which the individual grasped the situational circumstances during the simulation (represented by Questions 8, 9, and 10).
SA describes the recognition, understanding, and anticipation of environmental factors and events within defined time frames, particularly in dynamic and complex contexts. SA is typically divided into three hierarchical levels [22]:
Level 1: perception of the surrounding elements.
Level 2: comprehension of the current situation.
Level 3: projection of future states.
The characteristics of effective situational awareness include the following:
Complete and accurate perception: all relevant information is recorded completely and without error.
Correct understanding of the situation: the meanings and interactions of the various elements are correctly interpreted.
Effective projection of future developments: future states and developments are reliably predicted, enabling informed and proactive decisions.
In contrast, the characteristics of inadequate situational awareness are incomplete or erroneous perception, misunderstanding of the situation, and inadequate projection of future states. Endsley’s theory [22] provides a central model of SA that emphasizes the importance of the three levels and shows how effective SA is achieved through a combination of environmental perception, processing, and cognitive prediction.
Having effective SA is essential for decision-making and the ability to act in dynamic and complex environments, while inadequate SA can lead to poor decisions and ineffective actions.
5.3. VR and AR Sickness
In the context of VR, subjective symptom reports were gathered from participants during testing sessions. On March 12th, during the late session, Test Persons A and B both experienced mild fatigue prior to engaging with the VR prototype. Following the test, Test Person A reported symptoms of mild eye strain and a sensation of fullness in the head. Test Person B experienced only mild eye strain. Test Person C, who initially felt moderately fatigued, did not report any post-testing issues.
On the morning of March 13th, Test Person A commenced the VR application without any pre-existing simulator sickness symptoms, but subsequently reported “difficulty in focusing”. Test Persons B and C did not report any simulator sickness symptoms either before or after their VR tests.
Later that day, Test Person A maintained an absence of simulator sickness symptoms both before and after the VR test. Conversely, Test Persons B and C both noted mild fatigue related to simulator sickness in the pre- and post-testing phases.
These findings indicate variability in the manifestation of simulator-related symptoms among participants within the VR environment, with some experiencing mild discomfort and others reporting no adverse effects. The symptoms observed, such as eye strain and difficulty focusing, align with known indicators of simulator sickness and highlight the need for further research into their etiology and potential mitigation within VR development.
The evaluation of the SSQ is based on the assessment of the three primary dimensions of simulator sickness: nausea, oculomotor (visual) symptoms, and disorientation. The interpretation of the SSQ should be considered in conjunction with other factors such as the test environment, duration of exposure, and individual experiences. Since the test participants conducted the VR sessions standing and for a relatively short duration (10–15 min), the results were, as expected, positive. For future tests, longer durations should be considered, as should tests that involve sitting and that combine the VR and desktop setups within a single test session. It must further be noted that the VR system was presented under more challenging conditions than intended, as the shore application would normally be operated on solid ground and not on a moving vessel whose movements are neither aligned nor synchronized with the movements within the VR environment.
5.4. Comparative Analysis of VR, AR, and Desktop
A comparison between the VR and AR headsets and the desktop interface highlighted the distinct presentation styles of each platform. Notably, the VR headset, while offering an immersive experience, was considered to have limited practical viability for prolonged use due to eye strain. The AR headset, on the other hand, posed fewer issues regarding extended wear, though it was observed to diminish overall sensory perception. Such limitations could potentially lead to disadvantages in nautical practice, raising questions about the system’s effectiveness in critical scenarios, such as a fire alarm on the bridge.
In the course of the test campaign, the test subjects had the opportunity to test all three systems. As part of these tests, there was continuous feedback on the comfort of the systems, how intuitive they were to use, and comparisons with similar conventional systems. The results are shown in the following graphs. It can be seen that the comfort of the systems and the intuitive operation were always rated above average. For the desktop system, touch operation was preferred over a conventional setup.
The VR glasses (Figure 10) are characterized by several functions, in particular the availability of the electronic chart application, which allows navigation information to be integrated into the VR environment. The abilities to enter data and to access ECDIS-like data are also important features. The 360° overview of the situation and the intuitive user interface improve SA, and the full camera view supports the visual capture of the surroundings.
The availability and use of chart information within the VR environment were seen as particularly useful. Although the functions are not yet fully developed in the current development phase, the use of VR is considered conceivable and promising for the future. This could even lead to the integration of existing systems and proven applications, such as PPU software, into the VR environment.
The Hololens 2 AR glasses (Figure 11) can make work more difficult in certain situations, especially when several ships are on the same bearing. Problems can arise here if there is an overflow of data and too many visualizations block the view. A key point mentioned was that radar and ECDIS are considered more reliable information systems and that the AR glasses have only a supplementary function. The glasses can also interfere with normal vision and make it difficult to find relevant information quickly, which is particularly problematic during maneuvers.
Useful functions of the setup are the display of ship data, interaction with the AIS system, the ability to deactivate information, the three-dimensional representation of ships, the distance to waypoints, and the color coding of traffic vessels for better differentiation.
The desktop solution (Figure 12) offers a number of features that users highlighted. These include the ability to interact between the chart and the live view, including centering the camera; improved overall operation on the desktop; selecting and tagging other ships; and intuitive handling.
Sending messages appears to be particularly useful, although, according to the test subjects, it involves certain risks that still need to be assessed. It was also noted that the available functions may not yet be sufficient to make well-founded decisions, which is due to the early stage of development.
5.5. Interaction Design
The system’s interactions were designed to be user-friendly, with a specific emphasis on minimizing the need for a physical controller. However, feedback suggested that foregoing a controller entirely should only be considered after comprehensive and effective training to ensure complete operational proficiency and safety.
6. Discussion
The development process during the testing phase was characterized by iterative enhancements and responsive adaptations. An integral finding was the inconsistency of hand tracking under low-light conditions, a common scenario on the bridge at night. The precise positioning of the user’s hand relative to the Hololens 2 camera proved crucial; users had to slightly tilt their hands to ensure visibility for the camera to register the interaction, which was not entirely intuitive. Moreover, the standard settings of the Hololens 2 presented challenges in dynamic maritime environments. The system, which relies on built-in cameras and a gyroscope for spatial tracking, faced difficulties with the ship’s movement, especially in ocean waves: the gyroscope’s detection of tilting conflicted with the cameras’ perception of a stationary bridge. To address this, enabling the Moving Platform Mode was a critical adaptation, minimizing the gyroscope’s influence and stabilizing the AR experience amidst the vessel’s motion. Consequently, a maritimization of AR and VR equipment is needed if commercial onboard operation is intended.
While the participants refrained from evaluating whether the prototypes could fully replace onboard pilots at this stage of the project, they concurred with the paper’s outlined advantages and disadvantages. Looking ahead, the addition of a night vision feature by including infrared/thermal imaging was proposed as a beneficial enhancement, particularly for operations in foggy conditions. This indicates potential directions for future development in the VR application.
The quality and speed of communication over VoIP were considered very good by all participants.
The journey to integrate mixed reality technologies into maritime navigation is ongoing, and this analysis indicates that a usability level at least similar to that of desktop solutions can already be achieved today at the prototype level. However, the ultimate objective remains clear: to create a system that not only aligns with, but enhances the natural workflows of maritime professionals, thereby promoting safety, efficiency, and precision in the complex domain of seafaring, using the best technology options for the respective tasks.
7. Conclusions and Future Work
Enabling remote pilotage requires solutions addressing the communication and SA challenges inherent to this novel concept. Within this paper, different technology options for visualization and interaction have been initially tested to support technology scouting when setting up such systems in a safe and interactive manner. Given the results with the peer group of three experienced nautical persons, it can be noted that the classical desktop as well as the immersive mixed reality technologies reached comparable usability on the SUS score: all passed the SUS threshold for good usability on average and reached a good SA level in SART. Consequently, this indicates that all three technology options are suitable in principle. However, the limited peer group prevents a comprehensive understanding of how the system would perform or be received by users of varying ages, experience levels, and roles within the maritime industry.
Despite the principal usability of the prototypes, several technical constraints have been identified that require ergonomic enhancements to properly exploit the technologies for the remote pilotage use case:
High-resolution cameras: cameras with better specifications than the GeoVision VR360 are necessary to provide clear, detailed images with minimal latency and high frame rates, which is essential for maintaining accurate SA.
Sufficient zoom functionality: the system’s zoom capabilities must maintain image clarity and detail even when magnified. High-quality digital zoom is required to avoid significant loss of resolution and to minimize noise, ensuring that small details remain visible and actionable.
Reduced eye fatigue in AR and VR: for prolonged use, the hardware must incorporate advanced display technologies to mitigate eye strain, including high refresh rates, adaptive brightness, and ergonomic designs that facilitate extended use without discomfort.
Consistent hand tracking: the Varjo XR-3 and Hololens 2 must ensure precise and responsive gesture recognition, allowing users to interact intuitively with virtual elements; this level of accuracy and responsiveness is critical for immersive systems.
Stable and reliable AR overlays: the Hololens 2 must deliver stable and accurate overlays that reliably superimpose essential navigational data, including AIS information, route specifics, and collision avoidance details, onto the captain’s field of view. The system must be robust enough to withstand maritime environmental challenges, such as ship motion and varying light conditions, ensuring continuous and reliable support for safe navigation.
By addressing these technical constraints, Remote Pilotage Technology can effectively support remote pilotage operations, enhancing both safety and efficiency without compromising navigational integrity.
Future work shall include extended test durations. The combination of desktop and immersive technologies within a single test session also needs to be examined to offer pilots the flexibility to choose the interface that best suits the task at hand.
Author Contributions
Conceptualization, A.U., P.H., R.G. and H.-C.B.; methodology, A.U., P.H. and R.G.; state of the art, P.H.; supervision, H.-C.B.; technical implementation, A.U. and P.H.; testing, A.U. and P.H.; assessment, R.G.; figures, R.G.; writing—original draft preparation, A.U., P.H. and R.G.; writing—review and editing, H.-C.B.; introduction, P.H. and H.-C.B.; conclusion and discussion, R.G. and H.-C.B. All authors have read and agreed to the published version of the manuscript.
Funding
This research was internally funded within the Fraunhofer Innovation Platform for Smart Shipping FIP-S2@Novia collaboration by Fraunhofer CML.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
The SUS, SART, and SSQ data sets from the test runs are available upon request from the corresponding author.
Acknowledgments
The authors would like to thank the three test participants for their willingness to join a three-day ferry ride in winter during these tests, as well as the ferry operator for facilitating these tests. Specifically, the authors would also like to thank Niclas Seligson for his support in facilitating this research in practice.
Conflicts of Interest
The authors declare no conflicts of interest.
Abbreviations
The following abbreviations are used in this manuscript:
AIS: Automatic Identification System
AR: augmented reality
COLREGs: Convention on the International Regulations for Preventing Collisions at Sea
CPA: Closest Point of Approach
ECDIS: Electronic Chart Display and Information System
GNSS: Global Navigation Satellite System
MR: mixed reality
PPU: Portable Pilot Unit
RTSP: Real-Time Streaming Protocol
SA: situational awareness
SART: Situational Awareness Rating Technique
SSQ: Simulator Sickness Questionnaire
SUS: System Usability Scale
TCPA: Time to Closest Point of Approach
TRL: technology readiness level
UI: user interface
VHF: very high frequency
VoIP: Voice over Internet Protocol
VR: virtual reality
VTS: Vessel Traffic Service
Note
1. Within this prototype, basic sea chart implementations have been used. The authors are aware of the existence of special screen-based pilotage software, so-called Portable Pilot Units (PPUs), that offer pilotage-specific chart applications. For a final commercial system, the PPUs’ functionalities or even the software itself could be integrated here so that pilots can also benefit remotely from their established set of known functionalities; however, as this was not the focus of these usability tests, simplified prototypes have been used. It is important to note that Remote Pilotage Systems are not a substitute for PPUs; rather, the authors recommend fully integrating PPUs into such systems in the future for improved SA and to smooth the transition between onboard and remote pilotage execution.
References
- Wild, C.R.J. The Paradigm and the Paradox of Perfect Pilotage. J. Navig. 2011, 64, 183–191. [Google Scholar] [CrossRef]
- Sakar, C.; Sokukcu, M. Dynamic analysis of pilot transfer accidents. Ocean. Eng. 2023, 287, 115823. [Google Scholar] [CrossRef]
- Heikkilä, M.; Himmanen, H.; Soininen, O.; Sonninen, S.; Heikkilä, J. Navigating the Future: Developing Smart Fairways for Enhanced Maritime Safety and Efficiency. J. Mar. Sci. Eng. 2024, 12, 324. [Google Scholar] [CrossRef]
- Grundmann, R.; Ujkani, A.; Weisheit, J.; Seppänen, J.; Salokorpi, M.; Burmeister, H.C. Use Case Remote Pilotage—Technology Overview. J. Phys. Conf. Ser. 2023, 2618, 012007. [Google Scholar] [CrossRef]
- Hadley, M.; Pourzanjani, M. How remote is remote pilotage? WMU J. Marit. Aff. 2003, 2, 181–197. [Google Scholar] [CrossRef]
- Lahtinen, J.; Valdez Banda, O.A.; Kujala, P.; Hirdaris, S. Remote piloting in an intelligent fairway—A paradigm for future pilotage. Saf. Sci. 2020, 130, 104889. [Google Scholar] [CrossRef]
- Rotterdam Port Authority. Port Information Guide. 2023. Available online: https://www.portofrotterdam.com/sites/default/files/2023-01/port-information-guide_0.pdf (accessed on 1 March 2024).
- Verbeek, E. Shore Based Pilotage, a matter of trust. Seaways 2021, 10, 6–8. [Google Scholar]
- Bruno, K.; Lützhöft, M. Shore-Based Pilotage: Pilot or Autopilot? Piloting as a Control Problem. J. Navig. 2009, 62, 427–437. [Google Scholar] [CrossRef]
- Hadley, M.A. Issues in Remote Pilotage. J. Navig. 1999, 52, 1–10. [Google Scholar] [CrossRef]
- IMO. COLREG—Collision Regulations 1972; IMO: London, UK, 2019; Available online: https://www.imo.org/en/About/Conventions/Pages/COLREG.aspx (accessed on 1 July 2021).
- Rowen, A.; Grabowski, M.; Rancy, J.P. Moving and improving in safety-critical systems: Impacts of head-mounted displays on operator mobility, performance, and situation awareness. Int. J. Hum.-Comput. Stud. 2021, 150, 102606. [Google Scholar] [CrossRef]
- Okazaki, T.; Kitagawa, R.; Matsubara, K.; Kashima, H. Development of maneuvering support system for ship docking. In Proceedings of the 2017 Joint 17th World Congress of International Fuzzy Systems Association and 9th International Conference on Soft Computing and Intelligent Systems (IFSA-SCIS), Otsu, Japan, 27–30 June 2017; pp. 1–5. [Google Scholar]
- Nađ, Đ.; Mišković, N.; Omerdic, E. Multi-Modal Supervision Interface Concept for Marine Systems. In Proceedings of the OCEANS 2019, Marseille, France, 17–20 June 2019; pp. 1–5. [Google Scholar] [CrossRef]
- van den Oever, F.; Fjeld, M.; Sætrevik, B. A Systematic Literature Review of Augmented Reality for Maritime Collaboration. Int. J. Hum.-Comput. Interact. 2023, 1–16. [Google Scholar] [CrossRef]
- Burmeister, H.C.; Grundmann, R.; Schulte, B. Situational Awareness in AR/VR during remote maneuvering with MASS: The tug case. In Proceedings of the Global Oceans 2020: Singapore—U.S. Gulf Coast, Biloxi, MS, USA, 5–30 October 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Byeon, S.; Grundmann, R.; Burmeister, H.C. Remote-controlled tug operation via VR/AR: Results of an in-situ model test. TransNav Int. J. Mar. Navig. Saf. Sea Transp. 2021, 15, 4. [Google Scholar] [CrossRef]
- Brooke, J. SUS: A quick and dirty usability scale. Usability Eval. Ind. 1995, 189, 4–7. [Google Scholar]
- Bolton, M.; Biltekoff, E.; Humphrey, L. The Level of Measurement of Subjective Situation Awareness and Its Dimensions in the Situation Awareness Rating Technique (SART). IEEE Trans. Hum.-Mach. Syst. 2021, 52, 1147–1154. [Google Scholar] [CrossRef]
- Taylor, R. Situational Awareness Rating Technique (SART): The Development of a Tool for Aircrew Systems Design. In Situational Awareness in Aerospace Operations (AGARD-CP-478); NATO-AGARD: Neuilly Sur Seine, France; Routledge: London, UK, 1990. [Google Scholar]
- Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220. [Google Scholar] [CrossRef]
- Endsley, M.R. Measurement of situation awareness in dynamic systems. Hum. Factors 1995, 37, 65–84. [Google Scholar] [CrossRef]