Towards a tailored mixed-precision sub-8-bit quantization scheme for Gated Recurrent Units using Genetic Algorithms

Bibliographic Details
Main Authors: Miccini, Riccardo, Cerioli, Alessandro, Laroche, Clément, Piechowiak, Tobias, Sparsø, Jens, Pezzarossa, Luca
Format: Journal Article
Language: English
Published: 19-02-2024
Subjects: Computer Science - Learning; Computer Science - Neural and Evolutionary Computing
Online Access: https://arxiv.org/abs/2402.12263
DOI: 10.48550/arxiv.2402.12263
Copyright: http://arxiv.org/licenses/nonexclusive-distrib/1.0
Abstract: Despite the recent advances in model compression techniques for deep neural networks, deploying such models on ultra-low-power embedded devices still proves challenging. In particular, quantization schemes for Gated Recurrent Units (GRU) are difficult to tune due to their dependence on an internal state, preventing them from fully benefiting from sub-8-bit quantization. In this work, we propose a modular integer quantization scheme for GRUs where the bit width of each operator can be selected independently. We then employ Genetic Algorithms (GA) to explore the vast search space of possible bit widths, simultaneously optimising for model size and accuracy. We evaluate our methods on four different sequential tasks and demonstrate that mixed-precision solutions exceed homogeneous-precision ones in terms of Pareto efficiency. In our results, we achieve a model size reduction between 25% and 55% while maintaining an accuracy comparable with the 8-bit homogeneous equivalent.
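The abstract describes the method only in prose. The minimal Python sketch below illustrates the general idea of a genetic search over per-operator bit widths; the operator names, parameter counts, accuracy proxy, and scalarized fitness are hypothetical stand-ins, not the authors' implementation (which evaluates real quantized models and optimizes size and accuracy jointly for Pareto efficiency).

import random

# Per-operator parameter counts for a toy GRU layer (hypothetical numbers);
# each chromosome gene picks the integer bit width of one operator.
OPERATORS = {
    "W_update": 4096, "W_reset": 4096, "W_new": 4096,   # input weights
    "U_update": 4096, "U_reset": 4096, "U_new": 4096,   # recurrent weights
}
NAMES = list(OPERATORS)
BITS = [2, 3, 4, 5, 6, 7, 8]  # sub-8-bit search space

def model_size(ch):
    """Total weight storage in bits for a bit-width assignment."""
    return sum(b * OPERATORS[n] for n, b in zip(NAMES, ch))

def accuracy(ch):
    """Hypothetical proxy: in the paper this would be the quantized
    model's measured task accuracy; here lower bit widths simply cost
    more, normalized to [0, 1]."""
    return 1.0 - sum((8 - b) ** 2 for b in ch) / (36.0 * len(ch))

def fitness(ch):
    # Scalarized size/accuracy trade-off; a Pareto-based selection
    # (e.g. NSGA-II) would match the paper's framing more closely.
    size_8bit = 8 * sum(OPERATORS.values())
    return accuracy(ch) - 0.5 * model_size(ch) / size_8bit

def mutate(ch, p=0.2):
    return [random.choice(BITS) if random.random() < p else b for b in ch]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=50):
    pop = [[random.choice(BITS) for _ in NAMES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = evolve()
print(dict(zip(NAMES, best)), f"size={model_size(best)} bits")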
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Towards+a+tailored+mixed-precision+sub-8-bit+quantization+scheme+for+Gated+Recurrent+Units+using+Genetic+Algorithms&rft.au=Miccini%2C+Riccardo&rft.au=Cerioli%2C+Alessandro&rft.au=Laroche%2C+Cl%C3%A9ment&rft.au=Piechowiak%2C+Tobias&rft.date=2024-02-19&rft_id=info:doi/10.48550%2Farxiv.2402.12263&rft.externalDocID=2402_12263