Transfer learning data adaptation using conflation of low-level textural
features
Abstract
Adapting a target dataset to a pre-trained model remains challenging. These
adaptation problems stem from an inadequate transfer of traits from the
source dataset, which often leads to poor model performance and
trial-and-error selection of the best-performing pre-trained model. This
paper introduces the conflation of source-domain low-level textural
features extracted using the first layer of the pre-trained model. The
extracted features are compared with the conflated low-level features of
the target dataset to select a higher-quality target dataset, improving
pre-trained model performance and adaptation. After comparing several
probability distance metrics, Kullback-Leibler divergence is adopted to
compare samples from the two domains.
experiment on three publicly available datasets and two ImageNet
pre-trained models used in past studies for comparison of results. The
proposed approach yields two categories of target samples, with those
having lower Kullback-Leibler values giving better accuracy, precision,
and recall. The samples with lower Kullback-Leibler values achieve a
higher accuracy margin of 6.21% to 7.27%, thereby leading to better model
adaptation for target transfer-learning datasets and tasks.
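The selection step described above can be sketched as follows. This is a
minimal illustration, not the authors' implementation: it assumes the
conflated source and target features are summarized as histograms over
first-layer activations, and the helper names (`kl_divergence`,
`feature_histogram`) and the synthetic feature arrays are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def feature_histogram(features, bins=64, value_range=(0.0, 1.0)):
    """Conflate low-level feature values into one normalized histogram."""
    hist, _ = np.histogram(np.ravel(features), bins=bins, range=value_range)
    return hist / hist.sum()

# Hypothetical stand-ins for first-layer activations of the source domain
# and of one target sample.
rng = np.random.default_rng(0)
source_features = rng.beta(2, 5, size=1000)
target_features = rng.beta(2, 5, size=1000)

kl = kl_divergence(feature_histogram(target_features),
                   feature_histogram(source_features))
# A lower KL value suggests the target sample lies closer to the source
# domain, so it would fall into the better-performing sample category.
```

In this sketch, target samples would be ranked by their KL value against
the conflated source features, and the lower-KL subset retained for
fine-tuning.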