OUT-OF-DISTRIBUTION AS A TARGET CLASS IN SEMI-SUPERVISED LEARNING
Antoine Tadros, Sébastien Drouyer, Rafael Grompone von Gioi
A key limitation of supervised learning is the ability to handle data from unknown distributions. Such methods often fail when presented with samples from a source not represented in the training data. This work proposes an effective way of controlling the behavior of a neural network in the presence of out-of-distribution examples. To this end, the training dataset is supplemented with extraneous data assigned to an additional out-of-distribution class. The extraneous data may come from a different dataset or may even be noise. By applying a Gaussian mixture model to the latent representation, and by taking advantage of the ability of such models to generalize, the method described hereafter performs well. Training on this segregated dataset helps the model distinguish out-of-distribution data, including samples it was never confronted with during training.
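The following is a minimal sketch of the general idea, assuming a feature extractor already trained with an extra out-of-distribution class. The encoder, class indexing, and hyperparameters are illustrative assumptions, not the authors' exact setup: one Gaussian component is fitted per class on the latent codes, and test samples best explained by the extraneous-data component are flagged as out-of-distribution.

```python
# Illustrative sketch only; not the paper's reference implementation.
import numpy as np
from sklearn.mixture import GaussianMixture


def fit_latent_gmm(latents, labels, n_classes):
    """Fit one Gaussian component per class on the latent representation."""
    gmms = []
    for c in range(n_classes):
        gmm = GaussianMixture(n_components=1, covariance_type="full")
        gmm.fit(latents[labels == c])
        gmms.append(gmm)
    return gmms


def predict_with_ood(gmms, latents, ood_class):
    """Assign each sample to the class whose Gaussian gives the highest
    log-likelihood; samples best explained by the out-of-distribution
    component are flagged as OOD."""
    scores = np.stack([g.score_samples(latents) for g in gmms], axis=1)
    pred = scores.argmax(axis=1)
    return pred, pred == ood_class


# Usage (hypothetical names): classes 0..K-1 are in-distribution,
# class K is the extraneous / out-of-distribution class.
# train_z = encoder(train_images)          # latent codes of training data
# gmms = fit_latent_gmm(train_z, train_labels, n_classes=K + 1)
# preds, is_ood = predict_with_ood(gmms, encoder(test_images), ood_class=K)
```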