A Batch Normalization Classifier for Domain Adaptation
Format: Journal Article
Language: English
Published: 22-03-2021
Summary: Adapting a model to perform well on unforeseen data outside its training set is a common problem that continues to motivate new approaches. We demonstrate that applying batch normalization in the output layer, prior to softmax activation, improves generalization across visual data domains in a refined ResNet model. The approach adds negligible computational complexity yet outperforms many domain adaptation methods that explicitly learn to align data domains. We benchmark this technique on the Office-Home dataset and show that batch normalization is competitive with other leading methods. We show that the method is not sensitive to the presence of source data during adaptation, and further show that the trained tensor distributions tend toward sparsity. Code is available at https://github.com/matthewbehrend/BNC
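The core idea of the abstract, batch-normalizing the output layer's logits before the softmax, can be sketched as follows. This is a minimal NumPy illustration, not the authors' code (see their repository for that); the `gamma` and `beta` parameters, fixed here for clarity, would be learned along with the rest of the network.

```python
import numpy as np

def batch_norm_softmax(logits, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize logits per class across the batch, then apply softmax.

    `gamma`/`beta` stand in for the usual learnable BatchNorm scale and
    shift; at inference, running statistics would replace the batch
    mean and variance.
    """
    mean = logits.mean(axis=0, keepdims=True)          # per-class batch mean
    var = logits.var(axis=0, keepdims=True)            # per-class batch variance
    normed = gamma * (logits - mean) / np.sqrt(var + eps) + beta
    # numerically stable softmax over the class axis
    z = normed - normed.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# toy batch: 4 samples, 3 classes
rng = np.random.default_rng(0)
probs = batch_norm_softmax(rng.standard_normal((4, 3)))
```

Each row of `probs` is a valid class distribution; the only change from a standard classifier head is the normalization step inserted between the final linear layer and the softmax.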
DOI: 10.48550/arxiv.2103.11642