Continually adapt a model to new domains without catastrophic forgetting using visual prompts for CTTA

--


Decorate the Newcomers: Visual Domain Prompt for Continual Test Time Adaptation
arXiv paper abstract https://arxiv.org/abs/2212.04145
arXiv PDF paper https://arxiv.org/pdf/2212.04145.pdf

Continual Test-Time Adaptation (CTTA) aims to adapt the source model to continually changing unlabeled target domains without access to the source data.
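To make the setting concrete, here is a rough sketch of a continual test-time adaptation loop, not the paper's code: the frozen-at-deployment model receives a stream of unlabeled target batches from changing domains, predicts, and then runs some adaptation step using only the current batch. The toy model and the `adapt_fn` placeholder are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Rough sketch of a continual test-time adaptation (CTTA) loop: unlabeled target
# batches arrive as a stream (the domain may keep shifting), the model predicts,
# then an adaptation step runs on the current batch only, never on source data.
def ctta_stream(model: nn.Module, target_batches, adapt_fn):
    model.eval()
    predictions = []
    for x in target_batches:
        logits = model(x)                   # predict on the current unlabeled batch
        predictions.append(logits.argmax(dim=1))
        adapt_fn(model, x)                  # update adaptation state from x alone
    return predictions

# Toy usage with a placeholder model and a no-op adapter:
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
batches = [torch.randn(8, 3, 32, 32) for _ in range(4)]
preds = ctta_stream(model, batches, adapt_fn=lambda m, x: None)
```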

Existing methods … such as predicting pseudo labels … Since pseudo labels are noisy and unreliable, these methods suffer from catastrophic forgetting and error accumulation when dealing with dynamic data distributions.

Motivated by prompt learning in NLP … During testing, the changing target datasets can be adapted to the source model by reformulating the input data with the learned visual prompts.
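As a minimal sketch of reformulating the input with a learned visual prompt while the source model stays frozen: the additive, full-image prompt below is an assumption for illustration, not the paper's exact prompt design.

```python
import torch
import torch.nn as nn

# Illustrative image-level visual prompt: a learnable perturbation added to the
# input image. At test time only this prompt is optimized; the source model's
# parameters stay frozen. Additive placement and the prompt shape are assumptions.
class AdditiveVisualPrompt(nn.Module):
    def __init__(self, channels=3, height=224, width=224):
        super().__init__()
        self.prompt = nn.Parameter(torch.zeros(1, channels, height, width))

    def forward(self, x):
        return x + self.prompt              # "decorate" the incoming image

def prompted_forward(source_model: nn.Module, prompt: AdditiveVisualPrompt, x):
    for p in source_model.parameters():
        p.requires_grad_(False)             # keep the source model frozen
    return source_model(prompt(x))          # gradients flow only into the prompt
```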

… devise two types of prompts, i.e., domain-specific prompts and domain-agnostic prompts, to extract current domain knowledge and maintain the domain-shared knowledge during continual adaptation.
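A sketch of how the two prompt types might be combined on the input image; adding them together, and giving them the same shape, are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

# Sketch of the two-prompt idea: a domain-specific prompt capturing the current
# domain, and a domain-agnostic prompt carrying knowledge shared across domains.
# Combining them by simple addition is an assumption for illustration.
class DualDomainPrompt(nn.Module):
    def __init__(self, shape=(1, 3, 224, 224)):
        super().__init__()
        self.domain_specific = nn.Parameter(torch.zeros(shape))   # current-domain knowledge
        self.domain_agnostic = nn.Parameter(torch.zeros(shape))   # domain-shared knowledge

    def forward(self, x):
        return x + self.domain_specific + self.domain_agnostic
```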

Furthermore, … design a homeostasis-based prompt adaptation strategy to suppress domain-sensitive parameters in the domain-invariant prompts, so that domain-shared knowledge is learned more effectively.
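One plausible reading of the homeostasis idea, not the paper's exact rule: track how sensitive each element of the domain-shared prompt is, here via a running average of squared gradients, and damp updates where sensitivity is high so that the shared prompt drifts slowly where domains disagree.

```python
import torch

# Hypothetical homeostasis-style update for the domain-agnostic prompt: elements
# whose gradients fluctuate strongly across domains (high sensitivity) receive
# damped updates, preserving domain-shared knowledge. The exact criterion in the
# paper may differ; this is only an illustrative rule.
def homeostatic_step(prompt: torch.nn.Parameter, sensitivity: torch.Tensor,
                     lr: float = 1e-3, ema: float = 0.99):
    with torch.no_grad():
        grad = prompt.grad
        sensitivity.mul_(ema).add_((1.0 - ema) * grad.pow(2))   # running sensitivity
        damping = 1.0 / (1.0 + sensitivity)                     # suppress sensitive entries
        prompt.add_(-lr * damping * grad)
        prompt.grad = None

# Usage: sensitivity = torch.zeros_like(prompt); call homeostatic_step after backward().
```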

… proposed method achieves significant performance gains over state-of-the-art methods …

Stay up to date. Subscribe to my posts https://morrislee1234.wixsite.com/website/contact
Web site with my other posts by category https://morrislee1234.wixsite.com/website

LinkedIn https://www.linkedin.com/in/morris-lee-47877b7b

Photo by Call Me Fred on Unsplash

--

AI News Clips by Morris Lee: News to help your R&D

Written by AI News Clips by Morris Lee: News to help your R&D

A computer vision consultant in artificial intelligence and related high-tech technologies for 37+ years. An innovator with 66+ patents, ready to help a firm's R&D.
