- The two distributions come from different domains. For example: the domain of images of nature and the domain of images of houses; the domain of documents in English and the domain of documents in French; the domain of numbers written in Arabic numerals and the domain of numbers written in Roman numerals.
- Both distributions are generated by the same domain, but their marginals differ; this case is known as domain adaptation. For example, the domain of English documents, with marginals obtained from documents on different topics. Another example is EMG signals: because different people have different hand physiologies, the signal sensed from one person can look quite different from another's. Finally, the classic domain adaptation example: spoken English, with the marginals corresponding to different accents.
- This is the most common scenario, where we use the task learned in the source domain to improve our learning of the target domain.
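One simple way to compensate for differing marginals is correlation alignment (CORAL; Sun et al., 2016, in the references below), which re-colors source features so their second-order statistics match the target's. The sketch below is a minimal NumPy illustration, not the authors' implementation; all names are ours.

```python
import numpy as np

def _sym_power(mat, power):
    """Matrix power of a symmetric PSD matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(mat)
    vals = np.maximum(vals, 0.0)  # clip tiny negative eigenvalues
    return (vecs * vals ** power) @ vecs.T

def coral(source, target, eps=1e-6):
    """Align the covariance of source features to the target's (CORAL).

    source, target: (n_samples, n_features) arrays from the two domains.
    Returns the transformed source features.
    """
    source = source - source.mean(axis=0)
    d = source.shape[1]
    cs = np.cov(source, rowvar=False) + eps * np.eye(d)  # source covariance
    ct = np.cov(target, rowvar=False) + eps * np.eye(d)  # target covariance
    # whiten with the source covariance, then re-color with the target's
    return source @ _sym_power(cs, -0.5) @ _sym_power(ct, 0.5)

rng = np.random.default_rng(0)
xs = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # source domain
xt = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # target domain
aligned = coral(xs, xt)
```

After alignment, the covariance of `aligned` matches that of the target samples (up to the small `eps` regularizer), so a classifier trained on the aligned source features transfers more gracefully.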


Fine Tuning
A complementary approach is fine-tuning, where we allow an upper subset of the base network's layers to train while keeping the weights at the bottom of the base frozen. See this for an example of a fine-tuning approach applied to instance segmentation.

Workshop
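The freeze/unfreeze idea can be sketched without any framework. In the toy network below (plain NumPy, all names illustrative), `w1` plays the role of the frozen bottom of a pretrained base, `w2` an upper base layer we unfreeze for fine-tuning, and `w3` a freshly trained head; only `w2` and `w3` receive gradient updates.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy three-layer network: frozen bottom, fine-tuned upper layer, new head
w1 = rng.normal(size=(8, 6)) * 0.5  # frozen pretrained weights
w2 = rng.normal(size=(6, 4)) * 0.5  # unfrozen for fine-tuning
w3 = rng.normal(size=(4, 1)) * 0.5  # new task head

x = rng.normal(size=(64, 8))
y = rng.normal(size=(64, 1))

w1_before = w1.copy()
lr, losses = 0.05, []
for _ in range(300):
    h1 = np.tanh(x @ w1)   # frozen features: no gradient is computed for w1
    h2 = np.tanh(h1 @ w2)  # fine-tuned layer
    pred = h2 @ w3
    losses.append(float(np.mean((pred - y) ** 2)))
    d_pred = 2 * (pred - y) / len(x)         # dLoss/dPred (MSE)
    d_h2 = (d_pred @ w3.T) * (1 - h2 ** 2)   # backprop through tanh
    w3 -= lr * (h2.T @ d_pred)               # head update
    w2 -= lr * (h1.T @ d_h2)                 # unfrozen upper-layer update
    # w1 is deliberately never updated
```

In a real framework the same effect comes from marking the bottom layers as non-trainable (e.g. setting `layer.trainable = False` in Keras) before compiling.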
The application of transfer learning in Keras is demonstrated here. Key references: (Lopez-Paz et al., 2015; Sun et al., 2016; Zhang et al., 2017)

References
- Lopez-Paz, D., Bottou, L., Schölkopf, B., Vapnik, V. (2015). Unifying distillation and privileged information.
- Sun, B., Feng, J., Saenko, K. (2016). Correlation Alignment for Unsupervised Domain Adaptation.
- Zhang, Y., Xiang, T., Hospedales, T., Lu, H. (2017). Deep Mutual Learning.

