Paper
9 September 2022 Rotary transformer for image captioning
Proceedings Volume 12328, Second International Conference on Optics and Image Processing (ICOIP 2022); 1232802 (2022) https://doi.org/10.1117/12.2644069
Event: Second International Conference on Optics and Image Processing (ICOIP 2022), 2022, Taian, China
Abstract
Image captioning based on deep learning spans two major domains: computer vision and natural language processing. The Transformer architecture has achieved leading performance in natural language processing, and studies applying Transformers to both the image-caption encoder and decoder have demonstrated better performance than previous solutions. Positional encoding is an essential component of the Transformer. The Rotary Transformer (RoFormer) introduced Rotary Position Embedding (RoPE) and has achieved comparable or superior performance on various language-modeling tasks, yet limited work has adapted the RoFormer architecture to image captioning. This study investigates the positional encoding of the Transformer architecture; our proposed model consists of a modified RoFormer as the encoder and BERT as the decoder. With extracted features as inputs and several training tricks, our model achieves similar or better performance on the MSCOCO dataset compared with "CNN+RNN" models and regular Transformer solutions.
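The Rotary Position Embedding named in the abstract encodes position by rotating paired feature dimensions of each query/key vector by an angle proportional to the token's position, so that attention dot products depend only on relative offsets. A minimal NumPy sketch of this idea follows; it uses the "half-split" pairing convention common in open-source implementations (the RoFormer paper itself interleaves adjacent dimensions), and the function name and `base` parameter are illustrative, not taken from the paper.

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply a Rotary Position Embedding (RoPE) sketch to a sequence.

    x: array of shape (seq_len, dim), dim even.
    Feature pair (x[:, i], x[:, i + dim//2]) is rotated by an angle
    position * base**(-2*i/dim), mirroring the sinusoidal frequency
    schedule; rotations preserve vector norms, and dot products between
    rotated queries and keys depend only on relative position.
    """
    seq_len, dim = x.shape
    half = dim // 2
    # per-pair rotation frequencies (sinusoidal schedule)
    freqs = base ** (-np.arange(half) * 2.0 / dim)
    angles = np.outer(np.arange(seq_len), freqs)  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=1)
```

Because position 0 gets a zero rotation, the first token's vector is unchanged, and every token keeps its norm; only the relative orientation between positions is altered.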
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yile Qiu and Li Zhu "Rotary transformer for image captioning", Proc. SPIE 12328, Second International Conference on Optics and Image Processing (ICOIP 2022), 1232802 (9 September 2022); https://doi.org/10.1117/12.2644069
KEYWORDS: Transformers, Computer programming, Performance modeling, Visualization, Head, Visual process modeling, Animal model studies