Title:
FAST: Foreground-aware active self-training for domain adaptive object detection.
Authors:
Zhang D (Southwestern University of Finance and Economics, China); Deng H (Southwestern University of Finance and Economics, China); Wang H (Southwestern University of Finance and Economics, China; electronic address: wanghl@uestc.edu.cn); Du Z (University of Electronic Science and Technology of China, China); Liu G (Southwestern University of Finance and Economics, China); Li J (University of Electronic Science and Technology of China, China); Ye M (University of Electronic Science and Technology of China, China)
Source:
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2026 Mar; Vol. 195, pp. 108201. Date of Electronic Publication: 2025 Oct 11.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Pergamon Press
Country of Publication: United States
NLM ID: 8805018
Publication Model: Print-Electronic
Cited Medium: Internet
ISSN: 1879-2782 (Electronic)
Linking ISSN: 0893-6080
NLM ISO Abbreviation: Neural Netw
Subsets: MEDLINE
Imprint Name(s):
Original Publication: New York : Pergamon Press, [c1988-
Contributed Indexing:
Keywords: Active learning; Domain adaptation; Mean teacher self-training; Object detection
Entry Date(s):
Date Created: 20251019
Date Completed: 20260124
Latest Revision: 20260128
Update Code:
20260130
DOI:
10.1016/j.neunet.2025.108201
PMID:
41110199
Database:
MEDLINE

*Further Information*

*Domain adaptive object detection (DAOD) aims to enable object detectors to perform well on an unlabeled target domain that differs from the source domain used for training. Among various approaches, mean-teacher self-training has emerged as a promising framework in DAOD. However, the noisy pseudo-labels generated by the teacher model constrain its potential for further performance improvements, making it challenging to reach fully supervised performance. While annotating all target samples is prohibitively expensive, labeling a small subset is often acceptable. Active domain adaptation (ADA) therefore serves as a promising way to alleviate this issue by selectively annotating the most informative target samples, maximizing performance gains at minimal annotation cost. However, its application to DAOD remains underexplored. This paper proposes Foreground-aware Active Self-Training (FAST), establishing an effective framework for active DAOD. Specifically, FAST introduces two innovative sampling strategies: foreground diversity clustering sampling (FDCS), which maximizes the diversity of selected foreground objects, and teacher-student discrepancy uncertainty sampling (TDUN), which identifies samples with high prediction uncertainty. These strategies are implemented within a decoupled active learning paradigm that employs a dedicated sampling model to identify the most informative target samples. By incorporating the selected samples into the mean-teacher framework, FAST significantly improves detection performance on the target domain. Extensive experiments demonstrate that our method achieves superior performance across multiple DAOD datasets, showcasing its effectiveness in bridging the domain gap in challenging scenarios.
(Copyright © 2025 Elsevier Ltd. All rights reserved.)*
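To make the two sampling criteria in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation. It assumes per-image foreground feature vectors and teacher/student class-probability outputs are already available; the function names `fdcs_select` and `tdun_select`, the plain k-means clustering, and the L1 discrepancy score are all assumptions chosen to mirror the described ideas (diversity via clustering of foreground features; uncertainty via teacher-student disagreement).

```python
import numpy as np

def fdcs_select(foreground_feats, k):
    """Foreground diversity clustering sampling (illustrative sketch).

    Clusters per-image foreground features with a few Lloyd (k-means)
    iterations, then picks the image nearest each cluster centre, so the
    selected set covers diverse foreground appearances."""
    rng = np.random.default_rng(0)
    centers = foreground_feats[rng.choice(len(foreground_feats), k, replace=False)]
    for _ in range(10):
        # Assign each image to its nearest centre, then recompute centres.
        dists = np.linalg.norm(foreground_feats[:, None] - centers[None], axis=-1)
        assign = dists.argmin(axis=1)
        for j in range(k):
            members = foreground_feats[assign == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    dists = np.linalg.norm(foreground_feats[:, None] - centers[None], axis=-1)
    # One representative image per cluster centre.
    return sorted(set(dists.argmin(axis=0)))

def tdun_select(teacher_probs, student_probs, m):
    """Teacher-student discrepancy uncertainty sampling (illustrative).

    Ranks images by the L1 gap between teacher and student class
    probabilities; a large gap is treated as high prediction uncertainty."""
    gap = np.abs(teacher_probs - student_probs).sum(axis=1)
    return list(np.argsort(gap)[::-1][:m])
```

In an active round, the indices returned by both functions would be sent for annotation and the labeled images folded back into mean-teacher training, per the decoupled paradigm the abstract describes.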

*Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors. All authors have contributed substantially to this work and have approved the final manuscript.*