Concept drift-tolerant transfer learning in dynamic environments.

Existing transfer learning methods focus on problems in stationary environments and are usually not applicable to dynamic environments, where concept drift may occur. To the best of our knowledge, concept drift-tolerant transfer learning (CDTL), whose major challenge is the need to adapt the target model and the knowledge of source domains to changing environments, has yet to be well explored in the literature. This article therefore proposes a hybrid ensemble approach for the CDTL problem, under the assumption that data in the target domain arrive in a streaming, chunk-by-chunk manner from nonstationary environments. At each time step, a class-wise weighted ensemble is presented to adapt the target model to new environments: it assigns a weight vector to each classifier generated from the previous data chunks, allowing each class of the current data to leverage historical knowledge independently. A domain-wise weighted ensemble is then introduced to combine the source and target models and select useful knowledge from each domain. The source models are updated with source instances transformed by the proposed adaptive weighted CORrelation ALignment (AW-CORAL), which iteratively minimizes domain discrepancy while decreasing the influence of unrelated source instances. In this way, positive knowledge from source domains can be promoted while negative knowledge is reduced. Empirical studies on synthetic and real benchmark data sets demonstrate the effectiveness of the proposed algorithm.
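The abstract does not spell out the AW-CORAL update rule. As background, the sketch below shows the standard CORAL alignment that AW-CORAL builds on: source features are whitened and then re-colored with the target covariance so that second-order statistics match. The function names, the regularization constant, and the omission of AW-CORAL's adaptive instance weights are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def sqrtm_psd(M):
    """Matrix square root of a symmetric positive semi-definite matrix."""
    vals, vecs = np.linalg.eigh(M)
    vals = np.clip(vals, 0.0, None)  # guard against tiny negative eigenvalues
    return (vecs * np.sqrt(vals)) @ vecs.T

def coral_align(Xs, Xt, eps=1e-6):
    """Standard CORAL: align source features Xs (n_s, d) to the
    target feature distribution Xt (n_t, d) by matching covariances."""
    d = Xs.shape[1]
    # Ridge-regularized covariances for numerical stability.
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(d)
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(d)
    # Whiten the source features, then re-color with the target covariance.
    return Xs @ np.linalg.inv(sqrtm_psd(Cs)) @ sqrtm_psd(Ct)

# Example usage with synthetic feature matrices:
# Xs_aligned = coral_align(np.random.randn(200, 16), np.random.randn(150, 16))
```

In the paper's setting, a transformation of this kind would be re-applied as new target chunks arrive, with instance weights down-weighting unrelated source examples; those weights are the "adaptive weighted" part of AW-CORAL and are not reproduced here.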
