<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
<title>2026</title>
<link href="https://ar.iub.edu.bd/handle/11348/1043" rel="alternate"/>
<subtitle/>
<id>https://ar.iub.edu.bd/handle/11348/1043</id>
<updated>2026-04-14T21:27:24Z</updated>
<dc:date>2026-04-14T21:27:24Z</dc:date>
<entry>
<title>Dual-Task Real-Time Low-Light Lane and Pothole Detection for Resource-Constrained Environments</title>
<link href="https://ar.iub.edu.bd/handle/11348/1044" rel="alternate"/>
<author>
<name>Md Iftekharul, Alam</name>
</author>
<id>https://ar.iub.edu.bd/handle/11348/1044</id>
<updated>2026-02-26T06:36:13Z</updated>
<published>2026-01-04T00:00:00Z</published>
<summary type="text">Dual-Task Real-Time Low-Light Lane and Pothole Detection for Resource-Constrained Environments
Md Iftekharul, Alam
Lane detection and road hazard awareness are crucial for ensuring safety in autonomous driving and Advanced Driver-Assistance Systems (ADAS). These systems rely heavily on clear visual cues, which are often compromised in low-light driving scenarios. The challenge is especially pronounced in low- and middle-income countries (LMICs), where poorly illuminated roads, faded lane markings, and unmaintained surfaces frequently co-occur. Under such conditions, conventional single-model detectors trained for daytime environments degrade sharply, as lane cues and pothole textures often compete in the same field of view. To address this, we present a lightweight dual-model pipeline that integrates a low-light enhancement front end with an OpenCV-based lane delineation pipeline and a YOLOv12 detector for pothole localization. The models run in parallel on shared inputs, and their outputs are fused to generate a unified lane geometry and hazard map in a single pass. The architecture is optimized for modest compute and memory budgets, enabling deployment in resource-constrained settings while maintaining high throughput. Evaluated on evening-time urban road scenes from Bangladesh, the system achieves 88.7% on potholes and 89.3 FPS on an NVIDIA GTX 1050 Ti, outperforming a single-detector baseline. These results highlight the potential of our approach for practical, real-time ADAS perception in underserved regions.
Index Terms—Low-light imaging, Lane detection, Pothole detection, YOLOv12, OpenCV, Image enhancement, Edge computing, Autonomous driving
</summary>
<dc:date>2026-01-04T00:00:00Z</dc:date>
</entry>
</feed>
