
See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/376654458

Maritime Search and Rescue Operations

Technical Report · December 2023


DOI: 10.13140/RG.2.2.22490.72640


1 author:

Karthika Punnengattu Padi Subramanian


Bournemouth University

All content following this page was uploaded by Karthika Punnengattu Padi Subramanian on 20 December 2023.



Maritime Search and Rescue Operations
Karthika Punnengattu Padi Subramanian
Department of Computing And Informatics
Bournemouth University
Bournemouth, United Kingdom
s5553380@bournemouth.ac.uk

Abstract— Search and Rescue (SAR) plans need to be developed with efficiency and scientific rigour, since maritime accidents are driven by complicated hydrometeorological conditions and unpredictability. This research evaluates the use of state-of-the-art technologies in marine environments to improve search and rescue decision-making. The significance of technology in disaster mitigation, preparedness, response, and recovery stages is emphasized, and creative solutions to optical detection issues are presented. The research presents the Lagrange particle tracking technique for drift prediction, and autonomous robots, such as UAVs, USVs, and UUVs, for SAR operations. With a focus on semantic segmentation and object detection, it discusses computer vision models. The importance of these technological breakthroughs in transforming SAR capabilities is emphasized in the final section.

Keywords— Search and Rescue (SAR), Autonomous robots, Deep Learning, Computer vision

I. INTRODUCTION

Maritime accidents pose unprecedented threats, marked by the unpredictability of occurrences and the intricate hydrometeorological conditions that characterize emergency response efforts [1]. Such events can result in catastrophic consequences, as evidenced by the tragic collision of the ships "TIANYU 2" and "LIAOSUIYU 66528" in September 2017, leading to the loss of six lives due to the absence of a timely and scientifically devised search and rescue plan [3].

The urgency of effective search and rescue operations is further underscored by the escalating numbers of individuals attempting illegal entry into Europe through the perilous maritime route across the Mediterranean. Data from the International Organization for Migration reveals alarming figures, emphasizing the human toll and the pressing need to address the challenges associated with irregular migration via this route [1].

One of the most formidable challenges in sea-based search and rescue efforts lies in visually identifying survivors. The vastness of the maritime environment, coupled with visual fatigue and the difficulty of discerning small targets against a shifting sea background, can lead to human errors during surveillance missions, whether conducted from ships or aircraft. Detection of lifeboats or life rafts faces similar difficulties [1].

The general understanding of disasters has grown over time [7]. During natural disasters like hurricanes, tsunamis, and earthquakes, or acts of terrorism, Search and Rescue (SAR) operations become imperative for locating, rescuing, and providing assistance to those stranded in perilous areas. Time sensitivity is critical, with a narrow window of approximately 48 hours for rescuers to locate survivors before their chances of survival significantly diminish [4]. However, the difficulties associated with catastrophe environments, including exposure to asbestos dust, toxic gases, hazardous materials, radiation, and extreme temperatures, make these tasks exceptionally challenging for human responders [5].

Addressing these challenges, robots emerge as a potential solution to navigate hazardous conditions swiftly and efficiently. In contrast to human limitations, robots have the capability to mitigate threats and expedite the search for victims, offering a promising avenue to augment the capabilities of SAR personnel [4], [6].

Robotic SAR systems vary across several dimensions, encompassing their designated operational environments (urban, maritime, or wilderness; Fig. 1), the types and quantity of robots utilized (such as Unmanned Surface Vehicles (USVs), Unmanned Aerial Vehicles (UAVs), Unmanned Ground Vehicles (UGVs), and Unmanned Underwater Vehicles (UUVs)), the degree of autonomy embedded in these robotic systems, and the methods employed by human operators for supervision and control, among other factors.

Fig. 1 : Autonomous robot types and advantages [17]

This paper delves into the integration of cutting-edge technologies as a solution to address existing gaps in current approaches to marine safety. The overarching objective is to enhance the decision-making capabilities of search and rescue teams, ultimately leading to faster responses in emergency situations, a heightened likelihood of success, and an overall reduction in response times. Emphasizing the necessity of technological innovation to navigate evolving challenges and improve safety measures in marine environments, this study critically evaluates the application of various technologies within search and rescue operations at sea. By exploring these concepts, the paper aims to contribute to a deeper understanding of how advanced technologies can significantly impact and advance the field of marine safety.

II. RELATED WORK

The EU-funded ICARUS (Integrated Components for Search and Rescue) project [1], [15] focuses on the development of robotic tools for rescue and search missions in both urban and maritime environments, emphasizing operational and technological interoperability. The project
integrates robotic tools across land, sea, and air domains, achieving success in multi-robot search and rescue missions. Leveraging an Unmanned Surface Vehicle (USV) and an Autonomous Underwater Vehicle (AUV), the team autonomously surveyed and mapped the area. The AUV, equipped with automatic target detection, precisely located and inspected targets, including underwater pipe leaks and victims. Their achievements, which included detecting the leak and mapping infrastructure, earned them second place in the overall grand challenge. Additionally, the team validated their capabilities in autonomous transportation and deployment through a successful autonomous AUV deployment mission.

DARIUS [1], [18] is a project tailored for multinational Search and Rescue (SAR) operations, involving a structured operational approach with Coordination, Tactical, and Execution levels. Coordination manages planning, Tactical oversees tasking and operations, and Execution involves on-the-ground teams. The project seamlessly integrates unmanned and manned systems, aspiring to establish a collaborative "system of systems."

To assess its success, DARIUS uses three scenarios collaboratively designed with end users, addressing both large-scale events and everyday SAR challenges. These scenarios undergo rigorous training in simulation platforms and real-world field trials.

DARIUS' Generic Ground Stations (GGS) employ OGC services for efficient data exchange with multiple Ground Control Stations (GCS). These services include the Sensor Observation Service (SOS) for querying sensor capabilities and the Web Feature Service (WFS) for distributing tactical mission features among ground stations, ensuring a cohesive and interoperable SAR system.

In sea survival searches, the vast water expanse poses a challenge for visual detection, leading to potential errors in traditional surveillance against the sea background. Common vision techniques, such as colour recognition, may result in false positives [1]. An innovative solution involves using visual saliency to identify rescue vehicles, offering resilience to changes in geometry and colour. Despite methods like saliency formation between frames, challenges persist. To address these issues, thermal cameras play a crucial role in reducing false positives and identifying less obvious targets. By segmenting the visual field based on heat sensitivity, they distinguish human heat signatures. The application of cutting-edge technologies to tackle optical detection challenges significantly improves the efficacy of search and rescue operations at sea.

III. METHODOLOGY

The Search and Rescue (SAR) process is categorized into four phases (Fig. 2): mitigation, preparedness, response, and recovery [9].

Disaster mitigation involves a comprehensive approach encompassing risk identification, analysis, and mitigation through technical means, public awareness campaigns, and spatial planning. This proactive phase aims to decrease the impact of potential disasters [8]. Preparedness, another integral aspect, involves making decisions on how to respond to emergencies, including activities like emergency planning, training, and the establishment of monitoring and forecasting systems. When a disaster occurs, the focus shifts to emergency response, which prioritizes Search and Rescue (SAR) operations and addressing the basic humanitarian needs of affected communities. Finally, emergency recovery steps in to restore living conditions in areas devastated by disasters. This phase involves a rapid assessment of damage, reconstruction, and rehabilitation efforts to bring about a swift recovery and return to normalcy.

Fig 2 : Search and Rescue Process [9]

The IAMSAR (International Aeronautical and Maritime Search and Rescue) guideline classifies situations into three emergency phases, reflecting the level of endangerment to people's safety [19]:

• Uncertainty: This phase occurs when there is uncertainty about a ship's safety, marked by a failure to disclose its position or by the ship arriving later than expected.
• Alert: Declared when a ship needs assistance but isn't immediately in danger.
• Distress: This phase is declared when there's a high certainty of immediate danger, necessitating urgent assistance.

Understanding these phases is vital for the Search and Rescue Mission Coordinator (SMC) to determine appropriate actions, each phase having a checklist primarily focused on information gathering and reminders to dispatch a Search and Rescue Unit (SRU).

A. Search region

Marine SAR decisions heavily hinge on data from the marine dynamic environment, including wind, wave, current, and sea temperature. Wind and flow data assist in predicting the target's drift trajectory, while water temperature data is crucial for estimating the survival period of the SAR target [3].

AIS (Automatic Identification System) data provides information on the current location of a ship. Typically, individuals obtain AIS data by interfacing with free radar AIS and shore-based AIS. In Search and Rescue (SAR) operations, knowing the precise position of a ship or person is crucial. However, the ship in distress may be influenced by the surrounding sea environment and drift after the
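As an illustration of how a ground station might query such an OGC service, the sketch below builds a key-value-pair (KVP) GetObservation request for an SOS endpoint. The parameter names follow the SOS 2.0 KVP binding, but the endpoint address, offering, and observed property are hypothetical placeholders; in a real deployment they would come from the service's GetCapabilities response.

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint: str, offering: str,
                            observed_property: str) -> str:
    """Build a KVP GetObservation request for an OGC Sensor Observation
    Service (SOS 2.0). Identifiers here are illustrative only."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical ground-station endpoint and sensor offering.
url = sos_get_observation_url(
    "https://ggs.example.org/sos", "uav1-weather", "air_temperature"
)
print(url)
```

The resulting URL can be fetched with any HTTP client; the analogous WFS exchange would use `service=WFS` with a `GetFeature` request.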
incident. Therefore, the key to effective SAR operations lies in accurately estimating the ship's drift path.

The distress target's force formula is [3]:

    M × (dv/dt) + m_f = F_w + F_c        (1)

where:

    M   : the mass of the distress target
    v   : the drift velocity of the distress target
    m_f : the Coriolis force
    F_w : the force of the wind on the exposed portion of the distressed target
    F_c : the force which seawater exerts on the distress target.

Fig. 3 : The SAR target's drift trajectory and the anticipated scatter dispersion at every instant [3]

The Lagrange particle tracking algorithm [20] is employed to calculate particle displacement, capturing variations in physical quantities as particles move and intuitively generating their motion paths. This dynamic particle tracking is facilitated by the algorithm's continuous solution for particle displacement. However, in the drift prediction equation, the use of both experimental and empirical coefficients introduces inherent calculation errors. To enhance prediction accuracy, the model calculates drift trajectories for five hundred particles (Fig. 3), determining the Search and Rescue (SAR) range based on their collective drift range [3].

To create a convex hull, the Graham scanning process [22] is used. Graham's scan is an O(n log n) method for finding the convex hull of a finite set of points in the plane. The convex hull is formed by combining two convex polygons. However, using the convex polygon as the search area can affect search route planning. To address this, the convex polygon is transformed into the Minimal Area Bounding Rectangle (MABR) [21] that encompasses it, and the search region is determined by the drift time [3].

B. Autonomous robots

Once the search region is identified, autonomous robots, such as Unmanned Surface Vehicles (USVs), Unmanned Underwater Vehicles (UUVs) [15], and support Unmanned Aerial Vehicles (UAVs) [27], are essential in Search and Rescue (SAR) operations, especially in marine situations. Due to their adaptability, affordability, and simplicity of use, small-scale unmanned aerial vehicles (UAVs) are being used in a wide range of industries, including construction, border control, environmental monitoring, disaster surveillance, search and rescue operations, and commodities delivery. These applications need sophisticated three-dimensional (3D) spatial awareness and reliable wireless connectivity to be successful [23].

Along with support UAVs, surface and underwater robots are often employed in maritime SAR missions. These robots are outfitted with specialised sensors designed to meet the demands of marine SAR. Seismometers, hydrophones, sensors for water and weather conditions, and sensors measuring seafloor pressure are examples of this. Their uses range from employing laser fluorosensors to identify compounds to detecting earthquakes, tsunamis, and oil spills [17].

Robotic SAR systems must react quickly in sea SAR operations [25] because these operations are characterised by unexpected incidents and are impacted by several aspects such as ambient conditions, location devices, and injury circumstances [17]. Particularly in close proximity to shorelines, the vital function these systems serve is exemplified by innovations such as the life-ring drone delivery system [26].

Projects like ICARUS [15] serve as examples of how autonomous robots, including support UAVs, UUVs, and USVs, might work together in maritime SAR. USVs are superior in assessment, whereas UAVs are useful in mapping. The combined use of USVs and UUVs, frequently with UAV assistance, for activities including mapping, surveying, identifying pipe leaks, and locating victims underwater is still being investigated [17].

Fig. 4 : Pars robot [27]

One illustrative example of an autonomous robot is Pars, an innovative Unmanned Aerial System (UAS) designed for maritime rescue. Through radio control, lifeguards or rescue teams can operate Pars. By lowering life rings, this eight-rotor rescue robot effectively reaches injured people at sea. In a practical exercise showcasing the device's greater speed, a victim 75 metres off the coast received a float from Pars in 22 seconds, as opposed to 90 seconds for a human lifeguard. It is vital to identify the victim's exact location precisely because it guarantees the proper deployment of emergency flotation devices. A possibly life-saving innovation in maritime rescue operations, Pars can notably carry three life buoys in a single flight [27].

C. Computer vision models

Rapid human and hazard detection along with situational awareness in real time are critical components of SAR missions. Developments in computer vision, mostly based on Deep Learning (DL) models, are essential, but for the best real-time performance, particularly on portable devices, their computational weight needs to be addressed. The goal of ongoing work is to improve these models for effective emergency response [17].

This section looks at single-agent and multi-agent machine perception techniques that can be used in situations and missions similar to SAR. In this context, Deep Learning (DL) is an essential component for efficient victim identification and scenario assessment. Since cameras are widely employed as sensors in Search and Rescue (SAR) robotics, the primary emphasis is on image-based perception, more precisely on semantic segmentation and object detection. Labelling every element the agent sees is the task of semantic segmentation, while object detection labels only the items that the agent finds interesting [17].

1) Semantic segmentation

Neural networks have been applied to attain considerable improvements in recent deep learning research, particularly in the area of semantic segmentation. Semantic segmentation requires assigning a class of objects or non-objects to each region or pixel, and deep neural networks have proven to be quite effective at this task. Segmentation is essential for understanding images. It is also necessary for many image analysis tasks, including object recognition, boundary localization, and image classification [28].

The most often used feature extraction techniques are VGG and ResNet, both of which have exceptional efficacy. The incorporation of techniques such as nested region sequences or multi-scale feature extraction guarantees performance criteria while maintaining computational economy. In semantic segmentation, recent research focusing on VGG and ResNet methods has produced impressive results [28].

The assessment of three widely adopted, state-of-the-art deep learning semantic segmentation methods (U-Net [29], PSP-Net [30], and DeepLabv2 [31]) in a maritime environment offers valuable insights into their respective performances [17].

2) Water segmentation

Water segmentation is the process of separating water from other matter in the scene, offering a basic but reliable safety precaution. The USV is only designed to proceed in the event that open water is found along the intended path. In all other cases, such as when there are obstructions, the USV should slow down, halt, or choose a different path to prevent possible collisions. In addition, during search operations, water segmentation facilitates the identification and detection of adjacent items. By restricting the area analysed by the algorithms to segmented water zones, higher-resolution visual input can be used for a more accurate analysis of identified objects while assuring real-time system functioning [37].

This study's methodology focuses on water segmentation using visually-based data from widely available high-resolution RGB cameras, rather than relying on expensive and specialised sensors like thermal cameras or LIDARs. Specifically, the KittiSeg [36] deep learning segmentation algorithm is modified for the purpose of water segmentation, which is the preferred method [37].

3) Object detection

Searching digital images and videos for instances of a specific class of semantic objects is the focus of the object detection field in computer vision and image processing. In order to extract features, carry out classification, and aid in localization in input photos or videos, state-of-the-art object detectors usually use deep learning networks as their foundation. Extensively studied object detection domains include face, multi-category, edge, salient object, pose, scene text, and pedestrian detection [32].

Current domain-specific image object detectors can be classified into two major categories. The first is the two-stage detector, illustrated by Faster R-CNN [33], which offers great accuracy in object recognition and localization. The second is the one-stage detector, represented by You Only Look Once (YOLO) [34] and the Single Shot MultiBox Detector (SSD) [35], renowned for achieving high inference speed. Significant quantities of memory and processing power are frequently required for object detection tasks, especially in real-time applications. Cloud computing and small-scale object identification techniques have been studied for Unmanned Aerial Vehicle (UAV) applications in order to overcome this [17].

IV. CONCLUSION

This study concludes by highlighting the revolutionary influence of cutting-edge technologies on search and rescue operations and maritime safety. Technological advancements have been successful in tackling difficulties related to marine catastrophes and irregular migration crises, as evidenced by the robotic instruments used in the ICARUS project and the multinational approach of the DARIUS initiative. The precision of SAR procedures is improved by the application of the Lagrange particle tracking method and the IAMSAR guideline, which are essential for making wise decisions in emergency situations. The addition of autonomous robots that are outfitted with specific sensors enhances SAR performance even further in maritime settings. Furthermore, computer vision models are essential for quick human and hazard recognition, which helps to maintain situational awareness in real time; this is especially true for semantic segmentation and object detection. Technology plays an increasingly important part in the phases of preparedness, response, and recovery after a disaster as it develops, pointing to a future in which SAR activities will be more effective, flexible, and adaptable to changing circumstances.

However, it's crucial to acknowledge that weather conditions, such as strong winds or rain, pose significant challenges and can impact the effective use of Unmanned Aerial Vehicles (UAVs) in maritime Search and Rescue (SAR) operations. These challenges need to be addressed for UAVs to consistently perform optimally in varying environmental conditions during SAR missions.
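The force balance in equation (1) can be stepped forward in time to propagate a drift estimate. The sketch below is a minimal explicit-Euler integration; the mass and force values are synthetic placeholders for illustration, not calibrated data from [3], and the forces are held constant over each step.

```python
import numpy as np

def integrate_drift(m, v0, p0, f_wind, f_current, f_coriolis, dt, steps):
    """Explicit Euler integration of M * dv/dt + m_f = F_w + F_c.

    All vectors are 2-D (east, north) arrays; forces in newtons, mass in
    kg, dt in seconds. Returns the trajectory of positions, (steps+1, 2).
    """
    v = np.asarray(v0, dtype=float)
    p = np.asarray(p0, dtype=float)
    traj = [p.copy()]
    for _ in range(steps):
        accel = (f_wind + f_current - f_coriolis) / m  # dv/dt from (1)
        v = v + accel * dt
        p = p + v * dt
        traj.append(p.copy())
    return np.array(traj)

# Synthetic example: a 2000 kg target pushed east by wind, north by current.
traj = integrate_drift(
    m=2000.0, v0=[0.0, 0.0], p0=[0.0, 0.0],
    f_wind=np.array([300.0, 0.0]),
    f_current=np.array([0.0, 150.0]),
    f_coriolis=np.array([0.0, 0.0]),
    dt=60.0, steps=10,
)
print(traj.shape)  # (11, 2)
```

In practice the wind and current forces would be refreshed each step from hydrometeorological forecast fields rather than held fixed.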
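The ensemble step described above, propagating several hundred perturbed particles and taking their collective spread as the search range, can be sketched as follows. The leeway coefficient and perturbation scale here are illustrative assumptions standing in for the experimental and empirical coefficients of [3], not their calibrated values.

```python
import numpy as np

rng = np.random.default_rng(42)

def drift_ensemble(p0, wind, current, dt, steps, n=500,
                   leeway=0.03, noise=0.5):
    """Propagate n drift particles with a simple leeway model.

    Each particle moves with current + leeway * wind, plus Gaussian
    velocity perturbations representing coefficient uncertainty.
    Returns the final positions, shape (n, 2), in metres.
    """
    pos = np.tile(np.asarray(p0, dtype=float), (n, 1))
    for _ in range(steps):
        v = current + leeway * wind + rng.normal(0.0, noise, size=(n, 2))
        pos += v * dt
    return pos

# Illustrative inputs: 10 m/s westerly wind, 0.4 m/s northward current.
final = drift_ensemble([0.0, 0.0], wind=np.array([10.0, 0.0]),
                       current=np.array([0.0, 0.4]), dt=600.0, steps=12)
# The SAR range follows from the collective spread of the particles.
x_min, y_min = final.min(axis=0)
x_max, y_max = final.max(axis=0)
print(final.shape)
```

The axis-aligned extent computed at the end is only a first cut; the next paragraphs tighten it via the convex hull and a minimal-area bounding rectangle.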
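The hull-and-rectangle construction just described can be illustrated directly: compute the convex hull of the particle endpoints (here with Andrew's monotone-chain variant of the Graham scan, also O(n log n)), then test a rectangle aligned with each hull edge to find the minimum-area bounding rectangle. A self-contained sketch:

```python
import math

def convex_hull(points):
    """Monotone-chain convex hull (a Graham-scan variant), O(n log n)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]  # counter-clockwise hull vertices

def min_area_bounding_rectangle(hull):
    """Minimal-area bounding rectangle: one side lies along a hull edge."""
    best = None
    n = len(hull)
    for i in range(n):
        (x1, y1), (x2, y2) = hull[i], hull[(i + 1) % n]
        theta = math.atan2(y2 - y1, x2 - x1)
        c, s = math.cos(-theta), math.sin(-theta)
        xs = [c * x - s * y for x, y in hull]  # rotate edge onto x-axis
        ys = [s * x + c * y for x, y in hull]
        area = (max(xs) - min(xs)) * (max(ys) - min(ys))
        if best is None or area < best:
            best = area
    return best

pts = [(0, 0), (2, 0), (2, 1), (0, 1), (1, 0.5)]  # interior point discarded
hull = convex_hull(pts)
print(len(hull), min_area_bounding_rectangle(hull))
```

For an axis-aligned 2 × 1 rectangle of endpoints the hull has four vertices and the MABR area is 2, as expected.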
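As a toy illustration of the pixel-level perception such models perform, inference-time semantic segmentation reduces to an argmax over per-class score maps. The sketch below assumes a network has already produced scores for three hypothetical classes (water, person, other) and derives the label mask from them:

```python
import numpy as np

# Hypothetical per-class score maps for a 4x4 image, as a segmentation
# network might output them: shape (num_classes, H, W),
# with classes (0: water, 1: person, 2: other).
rng = np.random.default_rng(0)
scores = rng.normal(size=(3, 4, 4))
scores[0] += 2.0  # bias toward "water" so the toy mask is mostly class 0

# Semantic segmentation assigns one class label to every pixel.
labels = scores.argmax(axis=0)   # shape (4, 4), values in {0, 1, 2}
water_mask = labels == 0         # boolean mask of water pixels
print(labels.shape, water_mask.mean())
```

A real model would produce the score maps from learned features; everything after that point is exactly this argmax-and-mask step.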
V. REFERENCES

[1] Mendonça, R., Marques, M.M., Marques, F., Lourenco, A., Pinto, E., Santana, P., Coito, F., Lobo, V. and Barata, J., 2016, September. A cooperative multi-robot team for the surveillance of shipwreck survivors at sea. In OCEANS 2016 MTS/IEEE Monterey (pp. 1-6). IEEE.
[2] Paltrinieri, N., Dechy, N., Salzano, E., Wardman, M. and Cozzani, V., 2013. Towards a new approach for the identification of atypical accident scenarios. Journal of Risk Research, 16(3-4), pp.337-354.
[3] Ai, B., Li, B., Gao, S., Xu, J. and Shang, H., 2019. An intelligent decision algorithm for the generation of maritime search and rescue emergency response plans. IEEE Access, 7, pp.155835-155850.
[4] Shah, B. and Choset, H., 2004. Survey on urban search and rescue robots. Journal of the Robotics Society of Japan, 22(5), pp.582-586.
[5] Casper, J. and Murphy, R.R., 2003. Human-robot interactions during the robot-assisted urban search and rescue response at the World Trade Center. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 33(3), pp.367-385.
[6] Grayson, S., 2014. Search & rescue using multi-robot systems. School of Computer Science and Informatics, University College Dublin, pp.1-14.
[7] Bello, O.M. and Aina, Y.A., 2014. Satellite remote sensing as a tool in disaster management and sustainable development: towards a synergistic approach. Procedia - Social and Behavioral Sciences, 120, pp.365-373.
[8] Poser, K. and Dransch, D., 2010. Volunteered geographic information for disaster management with application to rapid flood damage estimation. Geomatica, 64(1), pp.89-98.
[9] Nasar, W., Da Silva Torres, R., Gundersen, O.E. and Karlsen, A.T., 2023. The use of decision support in search and rescue: A systematic literature review. ISPRS International Journal of Geo-Information, 12(5), p.182.
[10] Hani'Hairzaki, U., Ibrahim, R., Halim, S.A. and Yokoi, T., 2017, September. A review on service oriented architecture approach in flood disaster management framework for sentiment analysis: Malaysia context.
[11] In New Trends in Intelligent Software Methodologies, Tools and Techniques: Proceedings of the 16th International Conference SoMeT_17 (Vol. 297, p. 362). IOS Press.
[12] Nunavath, V. and Goodwin, M., 2019, December. The use of artificial intelligence in disaster management - a systematic literature review. In 2019 International Conference on Information and Communication Technologies for Disaster Management (ICT-DM) (pp. 1-8). IEEE.
[13] Shah, S.A., Seker, D.Z., Hameed, S. and Draheim, D., 2019. The rising role of big data analytics and IoT in disaster management: recent advances, taxonomy and prospects. IEEE Access, 7, pp.54595-54614.
[14] Shahrah, A.Y. and Al-Mashari, M.A., 2017, March. Emergency response systems: research directions and current challenges. In Proceedings of the Second International Conference on Internet of Things, Data and Cloud Computing (pp. 1-6).
[15] Matos, A., Martins, A., Dias, A., Ferreira, B., Almeida, J.M., Ferreira, H., Amaral, G., Figueiredo, A., Almeida, R. and Silva, F., 2016, April. Multiple robot operations for maritime search and rescue in euRathlon 2015 competition. In OCEANS 2016 - Shanghai (pp. 1-7). IEEE.
[16] De Cubber, G., Doroftei, D., Serrano, D., Chintamani, K., Sabino, R. and Ourevitch, S., 2013, October. The EU-ICARUS project: Developing assistive robotic tools for search and rescue operations. In 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR) (pp. 1-4). IEEE.
[17] Queralta, J.P., Taipalmaa, J., Pullinen, B.C., Sarker, V.K., Gia, T.N., Tenhunen, H., Gabbouj, M., Raitoharju, J. and Westerlund, T., 2020. Collaborative multi-robot search and rescue: Planning, coordination, perception, and active vision. IEEE Access, 8, pp.191617-191643.
[18] Serrano, D., De Cubber, G., Leventakis, G., Chrobocinski, P., Moore, D. and Govindaraj, S., 2015, January. ICARUS and DARIUS approaches towards interoperability. In IARP RISE Workshop, Lisbon, Portugal. Proceedings of the NATO STO Lecture Series SCI-271 (Vol. 1).
[19] Storvik, M.H.R., 2020. Case-Based Reasoning for Decision Support in Search and Rescue (Master's thesis, NTNU).
[20] Szwaykowska, K. and Zhang, F., 2018. Controlled Lagrangian particle tracking: Error growth under feedback control. IEEE Transactions on Control Systems Technology, 26(3), pp.874-889.
[21] Zhang, J., Teixeira, Â.P., Soares, C.G. and Yan, X., 2017. Probabilistic modelling of the drifting trajectory of an object under the effect of wind and current for maritime search and rescue. Ocean Engineering, 129, pp.253-264.
[22] Graham, R.L. and Yao, F.F., 1983. Finding the convex hull of a simple polygon. Journal of Algorithms, 4(4), pp.324-331.
[23] Hayat, S., Yanmaz, E. and Muzaffar, R., 2016. Survey on unmanned aerial vehicle networks for civil applications: A communications viewpoint. IEEE Communications Surveys & Tutorials, 18(4), pp.2624-2661.
[24] Andre, T., Hummel, K.A., Schoellig, A.P., Yanmaz, E., Asadpour, M., Bettstetter, C., Grippa, P., Hellwagner, H., Sand, S. and Zhang, S., 2014. Application-driven design of aerial communication networks. IEEE Communications Magazine, 52(5), pp.129-137.
[25] Zhao, Q., Ding, J., Xia, B., Guo, Y., Ge, B. and Yang, K., 2019, October. Search and rescue at sea: Situational factors analysis and similarity measure. In 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC) (pp. 2678-2683). IEEE.
[26] Xiang, G., Hardy, A., Rajeh, M. and Venuthurupalli, L., 2016, April. Design of the life-ring drone delivery system for rip current rescue. In 2016 IEEE Systems and Information Engineering Design Symposium (SIEDS) (pp. 181-186). IEEE.
[27] Yeong, S.P., King, L.M. and Dol, S.S., 2015. A review on marine search and rescue operations using unmanned aerial vehicles. International Journal of Marine and Environmental Sciences, 9(2), pp.396-399.
[28] Lateef, F. and Ruichek, Y., 2019. Survey on semantic segmentation using deep learning techniques. Neurocomputing, 338, pp.321-348.
[29] Ronneberger, O., Fischer, P. and Brox, T., 2015. U-Net: Convolutional networks for biomedical image segmentation. In Medical Image Computing and Computer-Assisted Intervention - MICCAI 2015 (pp. 234-241). Springer International Publishing.
[30] Zhao, H., Shi, J., Qi, X., Wang, X. and Jia, J., 2017. Pyramid scene parsing network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 2881-2890).
[31] Chen, L.C., Papandreou, G., Kokkinos, I., Murphy, K. and Yuille, A.L., 2017. DeepLab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Transactions on Pattern Analysis and Machine Intelligence, 40(4), pp.834-848.
[32] Jiao, L., Zhang, F., Liu, F., Yang, S., Li, L., Feng, Z. and Qu, R., 2019. A survey of deep learning-based object detection. IEEE Access, 7, pp.128837-128868.
[33] Ren, S., He, K., Girshick, R. and Sun, J., 2015. Faster R-CNN: Towards real-time object detection with region proposal networks. Advances in Neural Information Processing Systems, 28.
[34] Redmon, J., Divvala, S., Girshick, R. and Farhadi, A., 2016. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 779-788).
[35] Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C.Y. and Berg, A.C., 2016. SSD: Single shot multibox detector. In Computer Vision - ECCV 2016 (pp. 21-37). Springer International Publishing.
[36] Teichmann, M., Weber, M., Zoellner, M., Cipolla, R. and Urtasun, R., 2018, June. MultiNet: Real-time joint semantic reasoning for autonomous driving. In 2018 IEEE Intelligent Vehicles Symposium (IV) (pp. 1013-1020). IEEE.
[37] Taipalmaa, J., Passalis, N., Zhang, H., Gabbouj, M. and Raitoharju, J., 2019, October. High-resolution water segmentation for autonomous unmanned surface vehicles: A novel dataset and evaluation. In 2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP) (pp. 1-6). IEEE.
[38] Meuth, R.J., Saad, E.W., Wunsch, D.C. and Vian, J., 2009, November. Adaptive task allocation for search area coverage. In 2009 IEEE International Conference on Technologies for Practical Robot Applications (pp. 67-74). IEEE.
