The effect of doping on the thermoelectric properties
of the half-Heusler (HH) high-entropy alloy (HEA) Ti2NiCoSnSb was studied.
Lower thermal conductivity was observed with increased Sb doping.
Mass scattering by heavy (Ta, Zr) and light (Al) dopants was studied
to further lower the thermal conductivity. Dopants at the level of
up to 50% at the Ti site were studied. A high HH phase content was
obtained in the Zr-doped samples, and a low lattice thermal conductivity
of 1.9 W/(m·K) was observed. This value is one of the lowest
reported lattice thermal conductivities in HH alloys. The poor solubility
of Ta led to undissolved Ta in the samples, which enhanced the electrical
properties. In the case of Al doping, the NiAl phase raised the power
factor value of Ti1.8Al0.2NiCoSn0.5Sb1.5 to 2.2 × 10⁻³ W/(m·K²), which is almost twice the corresponding value reported
for Ti2NiCoSnSb. Interestingly, a maximum ZT of 0.29 was found in all of the doped systems, although the transport
mechanism and microstructure varied widely with the type of dopant.
Optimum dopant levels of 25% Zr, 7.5% Ta, and 10% Al are
necessary to obtain the maximum ZT in these alloys.
Compared to conventional HH systems, HH HEA systems provide
a larger composition field for tuning the transport properties by
simultaneous doping of multiple elements to lower the thermal conductivity.
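The quantities reported above are tied together by the standard thermoelectric figure of merit, ZT = S²σT/κ, where S²σ is the power factor (PF) and κ the total (lattice plus electronic) thermal conductivity. A minimal sketch of this relation follows; the total κ and temperature used are illustrative assumptions only (the abstract reports the lattice contribution, 1.9 W/(m·K), not the total κ or the measurement temperature):

```python
# ZT = S^2 * sigma * T / kappa = PF * T / kappa (dimensionless).
# PF = 2.2e-3 W/(m*K^2) is the reported power factor; kappa_total and
# temperature below are hypothetical values chosen for illustration.

def figure_of_merit(power_factor: float, kappa_total: float, temperature: float) -> float:
    """Dimensionless thermoelectric figure of merit ZT = PF * T / kappa."""
    return power_factor * temperature / kappa_total

# Example with assumed kappa_total = 2.3 W/(m*K) and T = 300 K.
zt = figure_of_merit(power_factor=2.2e-3, kappa_total=2.3, temperature=300.0)
```

This makes explicit why the two levers discussed in the abstract (raising the power factor, lowering the thermal conductivity) both push ZT upward.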
Researchers are interested in Facial Emotion Recognition (FER) because of its many promising applications. The main task of FER is to identify and classify users' facial expressions from digital inputs. Traditional FER pipelines consist mainly of feature extraction and emotion classification. Deep neural networks, specifically Convolutional Neural Networks (CNNs), are widely used in FER because of their inherent image feature extraction. This work presents a novel method, dubbed EfficientNet-XGBoost, based on the Transfer Learning (TL) technique. EfficientNet-XGBoost is a cascade of the EfficientNet and XGBoost techniques, together with enhancements established through experimentation that reflect the novelty of the work. To ensure faster learning of the network and to overcome the vanishing gradient problem, the model incorporates global average pooling, dropout, and dense layers. EfficientNet is fine-tuned by replacing the upper dense layer(s) and cascading the XGBoost classifier, making it suitable for FER. Feature map visualization reveals the reduction in the size of the feature vectors. The proposed method is validated on the benchmark datasets CK+, KDEF, JAFFE, and FER2013. To address class imbalance in the CK+ and FER2013 datasets, the data were augmented artificially through geometric transformation techniques. The proposed method is applied to each dataset individually, and the corresponding results are recorded for performance analysis using metrics such as precision, recall, and F1 measure. Comparative analysis with competing schemes is carried out on the same datasets. Irrespective of the nature of the dataset, the proposed scheme outperforms the rest, with overall accuracies of 100%, 98%, and 98% on the first three datasets, respectively.
However, on the FER2013 dataset, the accuracy achieved by the proposed method is less promising.
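The cascade described above can be sketched compactly: a convolutional backbone produces feature maps, global average pooling collapses them into a compact feature vector, and a gradient-boosted tree classifier is trained on those vectors. In this sketch the pretrained EfficientNet backbone is replaced by a random stand-in, and scikit-learn's `GradientBoostingClassifier` stands in for XGBoost; both substitutions are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

def fake_backbone(images: np.ndarray) -> np.ndarray:
    """Stand-in for a pretrained CNN body: maps (N, H, W, C) images to
    (N, 7, 7, 32) feature maps carrying a class-dependent signal."""
    n = images.shape[0]
    noise = rng.normal(size=(n, 7, 7, 32))
    signal = images.mean(axis=(1, 2, 3))[:, None, None, None]
    return noise + signal

def global_average_pool(feature_maps: np.ndarray) -> np.ndarray:
    """Collapse each (H, W) map to its mean -> (N, C) feature vectors."""
    return feature_maps.mean(axis=(1, 2))

# Toy "images": two classes distinguished by overall brightness.
labels = rng.integers(0, 2, size=200)
images = rng.normal(loc=labels[:, None, None, None].astype(float),
                    size=(200, 48, 48, 3))

features = global_average_pool(fake_backbone(images))  # shape (200, 32)
clf = GradientBoostingClassifier(random_state=0).fit(features, labels)
train_accuracy = clf.score(features, labels)
```

The design point illustrated here is the decoupling: once pooling has reduced each image to a short feature vector, any off-the-shelf tree-based classifier can replace the network's dense softmax head.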
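The geometric-transformation augmentation mentioned above can also be sketched with plain array operations. The exact transformations used by the authors are not specified, so the horizontal flip and small translation below are assumptions chosen for illustration:

```python
import numpy as np

def horizontal_flip(img: np.ndarray) -> np.ndarray:
    """Mirror an (H, W) or (H, W, C) image left-to-right."""
    return img[:, ::-1]

def translate(img: np.ndarray, dx: int) -> np.ndarray:
    """Shift an image dx pixels to the right (negative dx: left),
    padding the vacated columns with zeros."""
    out = np.zeros_like(img)
    if dx > 0:
        out[:, dx:] = img[:, :-dx]
    elif dx < 0:
        out[:, :dx] = img[:, -dx:]
    else:
        out[:] = img
    return out

def augment(images: np.ndarray) -> np.ndarray:
    """Triple the dataset: originals, their flips, and 2-px translations."""
    flipped = np.stack([horizontal_flip(i) for i in images])
    shifted = np.stack([translate(i, 2) for i in images])
    return np.concatenate([images, flipped, shifted])
```

Because these transformations preserve the emotion label, each augmented image reuses the label of its source image, which is what makes them suitable for rebalancing under-represented classes.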