dc.contributor.author: Tabib, Fatin Israq
dc.contributor.author: Khatun, Halima
dc.contributor.author: Sarker, Shuvro Sourav
dc.date.accessioned: 2025-05-18T06:24:29Z
dc.date.available: 2025-05-18T06:24:29Z
dc.date.issued: 2025-01
dc.identifier.uri: http://ar.iub.edu.bd/handle/11348/987
dc.description.abstract: The UFlow-Net architecture relies on the addition of flow estimation layers during the downsampling process, which allows relevant motion details to permeate into frame synthesis. This design, in conjunction with a progressive refinement approach, yields high-fidelity output while remaining computationally efficient. Extensive experiments on the Vimeo90K, UCF101, and Middlebury datasets evaluate UFlow-Net against many previous optical flow models in terms of the Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) of the interpolated results, and show that UFlow-Net's interpolated frames better preserve structural detail and spatial coherence. Although UFlow-Net outperforms current methods by a large margin, it may still struggle with very complex background scenes or highly rapid motion; these limitations guide further studies in video frame interpolation. The research also has significant practical implications: UFlow-Net's balance of accuracy and computational efficiency makes it especially applicable to real-world video enhancement tasks. This work marks a major step forward in the development of high-quality video processing systems and offers a promising basis for future research.
dc.language.iso: en
dc.publisher: IUB
dc.title: UFlow-Net: integration of flow estimation and kernel-based method in a modified U-Net architecture for video frame interpolation
dc.type: Thesis
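
For reference, the two evaluation metrics named in the abstract, PSNR and SSIM, have standard definitions. The short Python sketch below is illustrative only (it is not the thesis code); it assumes 8-bit RGB frames stored as H x W x 3 NumPy arrays and uses scikit-image (0.19 or later, for the channel_axis argument) for SSIM:

# Minimal sketch of the PSNR/SSIM scoring described in the abstract.
# Illustrative only, not the authors' evaluation code; assumes 8-bit
# RGB frames stored as H x W x 3 NumPy arrays.
import numpy as np
from skimage.metrics import structural_similarity

def psnr(reference: np.ndarray, interpolated: np.ndarray, max_val: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio: 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((reference.astype(np.float64) - interpolated.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def ssim(reference: np.ndarray, interpolated: np.ndarray) -> float:
    """Structural Similarity Index over an RGB frame (channels last)."""
    return structural_similarity(reference, interpolated, data_range=255, channel_axis=-1)

Both scores are computed per interpolated frame against its ground-truth frame and averaged over the test set; higher values indicate better interpolation quality.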

