GABIC: Graph-based Attention Block for Image Compression
Main Authors:
Format: Journal Article
Language: English
Published: 03-10-2024
Subjects:
Online Access: Get full text
Summary: While standardized codecs like JPEG and HEVC-intra represent the
industry standard in image compression, neural Learned Image Compression (LIC)
codecs offer a promising alternative. In particular, integrating attention
mechanisms from Vision Transformers into LIC models has been shown to improve
compression efficiency. However, this extra efficiency often comes at the cost
of aggregating redundant features. This work proposes a Graph-based Attention
Block for Image Compression (GABIC), a method that reduces feature redundancy
through a k-Nearest Neighbors enhanced attention mechanism. Our experiments
show that GABIC outperforms comparable methods, particularly at high bit rates,
improving compression performance.
DOI: 10.48550/arxiv.2410.02981
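
To illustrate the general idea described in the summary (restricting attention so that each query aggregates only its k most relevant neighbors instead of all positions), below is a minimal sketch of a k-NN-restricted attention block in PyTorch. This is not the authors' implementation: the class name `KNNAttention`, the head count, the value of k, and the flat-token window handling are all assumptions made for illustration; GABIC's actual graph construction and integration into a LIC model differ.

```python
# Hypothetical sketch of k-NN-restricted attention (not the GABIC implementation).
# Each query attends only to its k highest-scoring keys, limiting the
# aggregation of redundant features.
import torch
import torch.nn as nn


class KNNAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, k: int = 16):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.k = k
        self.scale = (dim // num_heads) ** -0.5
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim) — e.g. flattened spatial positions of a feature map.
        B, N, C = x.shape
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, C // self.num_heads)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)            # each: (B, heads, N, C/heads)

        attn = (q @ k.transpose(-2, -1)) * self.scale   # (B, heads, N, N)

        # Keep only the top-k scores per query; mask the rest before softmax.
        topk = min(self.k, N)
        kth = attn.topk(topk, dim=-1).values[..., -1:]  # k-th largest score per query
        attn = attn.masked_fill(attn < kth, float("-inf"))
        attn = attn.softmax(dim=-1)

        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(out)


if __name__ == "__main__":
    # Toy usage: an 8x8 window of 64-dimensional tokens.
    tokens = torch.randn(1, 64, 64)
    block = KNNAttention(dim=64, num_heads=4, k=8)
    print(block(tokens).shape)  # torch.Size([1, 64, 64])
```

The top-k masking here is one simple way to realize neighbor-restricted attention; the paper's block builds the neighborhood as a graph over feature similarities, so consult the full text for the exact formulation.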