Abstract Proceedings of ICIRESM – 2020
HUMAN RESCUE SYSTEM FOR FLOOD AREAS USING OPENCV COMPUTER VISION
Floods are becoming more frequent and severe natural disasters worldwide due to extreme climate change. In addition to causing heavy economic damage to property, they lead to a substantial loss of human life. Early detection is critical for a timely response that prevents damage to property and life, so it is crucial to use all available technologies, including Earth observation, for prevention and mitigation. Equally essential is a focus on the immediate actions to be taken after the onset of a flood. Person detection and tracking remains a popular and very active field of research in computer vision, with many camera-based safety and security applications such as search and rescue, surveillance, driver assistance systems, and autonomous driving. Previous methods for flood detection rely on specialized satellite imagery. In this project, we propose a method for real-time human detection using a deep neural network, based on video content analysis of feeds from surveillance cameras, which are now common and readily available. We demonstrate that OpenCV is an effective and comparatively fast method for recognition and localization on the COCO person dataset. Once a human is detected, the system captures the detected image and posts it to a social media platform such as Instagram. This enables not only the rescue team, who will be heavily occupied at that time, but also people near the location to help and rescue the detected humans, saving their lives.
Keywords: Flood, human detection.
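The abstract describes real-time person detection with OpenCV's deep neural network support on a surveillance-camera feed. The following is a minimal illustrative sketch of that idea, not the authors' actual implementation: it assumes a MobileNet-SSD model trained on COCO (the file names "frozen_inference_graph.pb" and "ssd_mobilenet.pbtxt" and the camera index 0 are placeholders), detects the COCO "person" class, and saves the annotated frame that would later be posted to social media.

```python
# Sketch: person detection on a video feed with OpenCV's DNN module.
# Model files and camera source below are assumed placeholders, not from the paper.
import cv2

CONF_THRESHOLD = 0.5
PERSON_CLASS_ID = 1  # "person" class id in the COCO label map used by TF SSD models

net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb",
                                    "ssd_mobilenet.pbtxt")
cap = cv2.VideoCapture(0)  # surveillance-camera index or stream URL (assumed)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True)
    net.setInput(blob)
    detections = net.forward()  # shape (1, 1, N, 7): [batch, class, conf, x1, y1, x2, y2]
    for det in detections[0, 0]:
        class_id, confidence = int(det[1]), float(det[2])
        if class_id == PERSON_CLASS_ID and confidence > CONF_THRESHOLD:
            x1, y1 = int(det[3] * w), int(det[4] * h)
            x2, y2 = int(det[5] * w), int(det[6] * h)
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
            # Save the annotated frame; posting it to a social platform would
            # happen in a separate step not shown here.
            cv2.imwrite("detected_person.jpg", frame)
    cv2.imshow("Flood rescue - person detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```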
13/11/2020