This work addresses the unsupervised adaptation of an existing object detector to a new target domain. We assume that a large number of unlabeled videos from this domain are readily available. We automatically obtain labels on the target data by using high-confidence detections from the existing detector, augmented with hard (misclassified) examples acquired by exploiting temporal cues with a tracker. These automatically obtained labels are then used to re-train the original model. We propose a modified knowledge distillation loss and investigate several ways of assigning soft-labels to the training examples from the target domain. Our approach is empirically evaluated on challenging face and pedestrian detection tasks: a face detector trained on WIDER-Face, which consists of high-quality images crawled from the web, is adapted to a large-scale surveillance data set; a pedestrian detector trained on clear, daytime images from the BDD-100K driving data set is adapted to all other scenarios, such as rain, fog, and nighttime. Our results demonstrate the usefulness of incorporating hard examples obtained from tracking and the advantage of using soft-labels via a distillation loss over hard-labels, and show promising performance as a simple method for unsupervised domain adaptation of object detectors, with minimal dependence on hyper-parameters. Code and models are available at
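The distillation component described above can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's implementation: the function names (`softmax`, `distillation_loss`), the fixed weighting `alpha`, and the temperature `T` are all illustrative choices. The loss blends a hard-label cross-entropy (from high-confidence pseudo-labels) with a soft-label cross-entropy against the source detector's temperature-scaled class distribution.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, hard_label,
                      alpha=0.5, T=2.0):
    """Weighted sum of hard-label CE and soft-label (teacher) CE.

    alpha and T are illustrative hyper-parameters; in practice the
    soft term is often additionally scaled by T**2.
    """
    # Hard-label term: standard cross-entropy on the pseudo-label.
    p_student = softmax(student_logits)
    ce_hard = -np.log(p_student[hard_label] + 1e-12)
    # Soft-label term: cross-entropy against the teacher's
    # temperature-softened distribution.
    p_teacher = softmax(teacher_logits, T)
    p_student_T = softmax(student_logits, T)
    ce_soft = -np.sum(p_teacher * np.log(p_student_T + 1e-12))
    return alpha * ce_hard + (1 - alpha) * ce_soft
```

When the student agrees with both the teacher and the pseudo-label, the loss is small; disagreement with either term raises it, which is the behavior a re-training objective of this kind relies on.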
Dye-sensitized solar cells (DSSCs) with >9% power conversion efficiency (PCE) based on a new D–π–A dye (SK3), which has carbazole as the donor, vinylene-phenylene as the π-bridge, and cyanoacrylic acid as the electron-withdrawing, electron-injecting, and anchoring group, are reported.
Important gains have recently been obtained in object detection by using training objectives that focus on hard negative examples, i.e., negative examples that are currently rated as positive or ambiguous by the detector. These examples can strongly influence parameters when the network is trained to correct them. Unfortunately, they are often sparse in the training data, and are expensive to obtain. In this work, we show how large numbers of hard negatives can be obtained automatically by analyzing the output of a trained detector on video sequences. In particular, detections that are isolated in time, i.e., that have no associated preceding or following detections, are likely to be hard negatives. We describe simple procedures for mining large numbers of such hard negatives (and also hard positives) from unlabeled video data. Our experiments show that retraining detectors on these automatically obtained examples often significantly improves performance. We present experiments on multiple architectures and multiple data sets, including face detection, pedestrian detection and other object categories.
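The temporal-isolation heuristic described above can be sketched as follows. This is a simplified illustration of the idea, not the authors' released code: the function names and the IoU threshold are assumptions. A detection is flagged as a candidate hard negative when no detection in the immediately preceding or following frame overlaps it.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def mine_isolated_detections(dets_per_frame, iou_thresh=0.3):
    """Return (frame_index, box) pairs for detections with no overlapping
    detection in the previous or next frame -- candidate hard negatives."""
    hard_negatives = []
    for t, dets in enumerate(dets_per_frame):
        prev_dets = dets_per_frame[t - 1] if t > 0 else []
        next_dets = dets_per_frame[t + 1] if t + 1 < len(dets_per_frame) else []
        for box in dets:
            # Isolated in time: nothing nearby in adjacent frames.
            if not any(iou(box, nb) >= iou_thresh
                       for nb in prev_dets + next_dets):
                hard_negatives.append((t, box))
    return hard_negatives
```

For example, a box that persists across three consecutive frames is never flagged, while a detection that fires in only one frame is collected as a likely false positive for re-training.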
We report the synthesis of bay-annulated (N, S, and Se) perylene bisimides (PBIs) and their structural, thermal, photophysical, electrochemical, and morphological characterization. In addition, their application in organic field-effect transistors (OFETs) is demonstrated. All the PBIs except PBI-Se exhibited bright emission in solution and in thin films. Molecular planarity, variation of the HOMO–LUMO levels, and the energy gaps were evaluated with the help of Gaussian simulations. The morphology of all the synthesized PBIs was investigated with the aid of polarizing optical microscopy and atomic force microscopy (AFM). AFM topographical images exhibited surface roughness values ranging from 1.98 to 3.52 nm. Powder X-ray diffraction data revealed the lamellar packing of these molecules in the thermally evaporated thin films. The OFET device with a top-contact bottom-gate configuration in which PBI-S served as the semiconducting layer exhibited the highest electron mobility (μ = 4.40 × 10⁻³ cm² V⁻¹ s⁻¹) in comparison to PBI-N and PBI-Se, owing to better thin-film growth. These materials are promising as n-type materials for use in organic electronics.