The resolution of computed depth maps of fish in an underwater environment limits 3D fish metric estimation. This paper addresses the problem with object-based matching for underwater fish tracking and depth computation using convolutional neural networks (CNNs). First, for each frame of a stereo video, a joint object classification and semantic segmentation CNN separates fish objects from the background. Next, the fish objects are cropped and matched for subpixel disparity computation using a video interpolation CNN. The calculated disparities and depth values yield fish metrics, including length, height, and weight. The fish are then tracked across frames of the input stereo video so that their metrics can be computed frame by frame, and median metrics are calculated per fish to reduce the noise caused by fish motion. Finally, fish with incorrect stereo-camera measurements are removed before generating the final fish metric distributions, which are relevant inputs for learning decision models to manage a fish farm. We also constructed underwater stereo video datasets with actual fish metrics measured by humans to verify the effectiveness of our approach. Experimental results show a 5% error rate in our fish length estimation.
INDEX TERMS: convolutional neural network, object tracking, object-based stereo matching
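The geometry behind the pipeline can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a rectified pinhole stereo pair, where depth follows from disparity as Z = f·B/d, a fish's length is the distance between its back-projected head and tail points, and the per-fish median suppresses frame-to-frame noise. All function names and the calibration values in the usage example are hypothetical.

```python
import statistics

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo model: depth Z = f * B / d (meters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def pixel_to_3d(u, v, depth, focal_px, cx, cy):
    """Back-project pixel (u, v) at a known depth into camera coordinates."""
    x = (u - cx) * depth / focal_px
    y = (v - cy) * depth / focal_px
    return (x, y, depth)

def fish_length(head_px, tail_px, disparity_px, focal_px, baseline_m, cx, cy):
    """Length = Euclidean distance between the 3D head and tail points."""
    z = depth_from_disparity(disparity_px, focal_px, baseline_m)
    hx, hy, hz = pixel_to_3d(*head_px, z, focal_px, cx, cy)
    tx, ty, tz = pixel_to_3d(*tail_px, z, focal_px, cx, cy)
    return ((hx - tx) ** 2 + (hy - ty) ** 2 + (hz - tz) ** 2) ** 0.5

def median_metric(per_frame_values):
    """Per-frame estimates fluctuate with fish motion; the median is robust."""
    return statistics.median(per_frame_values)

# Usage with illustrative calibration: f = 1000 px, B = 0.1 m, principal
# point (320, 240); a 50 px disparity places the fish 2 m from the camera.
length_m = fish_length((400, 300), (600, 300), 50, 1000, 0.1, 320, 240)
```

Note that subpixel disparity matters here: at Z = f·B/d, a one-pixel disparity error at long range shifts the depth, and hence the length estimate, substantially, which is why the paper computes disparity at subpixel precision.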
Ocean resources have been depleted rapidly over the past decade, and the complementary role of aquaculture in food security has become more critical than ever. Water quality is one of the key factors determining the success of aquaculture, so real-time water quality monitoring is an essential process. This paper proposes a low-cost, easy-to-build artificial intelligence (AI) buoy system that autonomously measures the relevant water quality data and instantly forwards them via wireless channels to a shore server. The data provide aquaculture staff with real-time water quality information and also allow server-side AI programs to apply machine learning techniques for short-term water quality predictions. In particular, we aim at a low-cost design that combines simple electronic devices with server-side AI programs so that the proposed buoy system can measure water velocity. As a result, the cost of a practical implementation is only approximately USD 2,015 for an AI buoy system that measures real-time dissolved oxygen, salinity, water temperature, and velocity. The system also offers short-term estimations of water temperature and velocity, with mean square errors of 0.021 °C and 0.92 cm/s, respectively. Furthermore, we replaced expensive current meters with a flow sensor tube costing only USD 100 to measure water velocity.
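The abstract does not specify which machine learning model produces the short-term predictions. As a hedged illustration of the idea, the sketch below fits a simple autoregressive predictor by least squares to a window of recent buoy readings; the window size, the AR form, and all function names are assumptions, not the paper's method.

```python
import numpy as np

def fit_ar(series, window=3):
    """Fit AR(window) coefficients c by least squares so that
    x_t ~ c[0]*x_{t-window} + ... + c[window-1]*x_{t-1}."""
    X = np.array([series[t - window:t] for t in range(window, len(series))])
    y = np.array(series[window:])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(series, coeffs):
    """One-step-ahead forecast from the last `window` readings."""
    window = len(coeffs)
    return float(np.dot(series[-window:], coeffs))

# Usage: forecast the next water-temperature reading (values illustrative).
temps = [20.0, 20.05, 20.10, 20.15, 20.20, 20.25, 20.30, 20.35, 20.40, 20.45]
coeffs = fit_ar(temps)
next_temp = predict_next(temps, coeffs)
```

A server-side program would retrain such a model as new readings arrive from the buoy, keeping the on-buoy electronics simple and cheap, which matches the paper's stated division of labor between the buoy hardware and the shore server.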
Monitoring the status of cultured fish is an essential task in precision aquaculture, and a smart underwater imaging device provides a non-intrusive way to observe freely swimming fish even in turbid or low-ambient-light water. This paper develops a two-mode underwater surveillance camera system consisting of a sonar imaging device and a stereo camera. The sonar imaging device has two cloud-based artificial intelligence (AI) functions that estimate the quantity of fish in a crowded school and the distribution of their length and weight. Because sonar images can be noisy and fish instances in an overcrowded school often overlap, machine learning technologies such as Mask R-CNN, Gaussian mixture models, convolutional neural networks, and semantic segmentation networks were employed to address the difficulty of analyzing fish in sonar images. Furthermore, the sonar and stereo RGB images were aligned in 3D space, offering an additional AI function for fish annotation based on RGB images. The proposed two-mode surveillance camera was tested on data collected from aquaculture tanks and offshore net cages through a cloud-based AIoT system. The accuracy of the proposed AI functions was evaluated against human-annotated fish metric data sets to verify the feasibility and suitability of the smart camera for remote underwater fish metric estimation.
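The abstract does not state how weight is derived from the sonar-based length estimates. One common convention in fisheries, shown here purely as an assumed illustration (the coefficients and function names are placeholders, not the paper's values), is the allometric length–weight relation W = a·L^b, together with a simple binned histogram for the length distribution.

```python
from collections import Counter

def weight_from_length(length_cm, a=0.015, b=3.0):
    """Allometric length-weight relation W = a * L**b (grams for length in cm).
    a and b are species-specific; the defaults here are illustrative only."""
    return a * length_cm ** b

def length_distribution(lengths_cm, bin_cm=2.0):
    """Histogram of fish lengths: maps each bin's start value to a count."""
    return Counter((l // bin_cm) * bin_cm for l in lengths_cm)

# Usage: summarize a batch of per-fish length estimates (values illustrative).
lengths = [10.1, 10.9, 12.5, 13.0]
dist = length_distribution(lengths)
weights = [weight_from_length(l) for l in lengths]
```

Such per-school distributions are the kind of output the paper's cloud-based AI functions report back to farm operators.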
The pollen grains of Phalaenopsis orchids are clumped tightly together, packed into pollen dispersal units called pollinia. In this study, the morphology, cytology, biochemistry, and sucrose transporters of Phalaenopsis pollinia were investigated. Periodic acid–Schiff (PAS) staining counterstained with DAPI or aniline blue was used to characterize the distribution of sugars and callose at different developmental stages of the pollinia. Ultra-performance liquid chromatography–high-resolution tandem mass spectrometry data indicated that the Phal. orchid accumulates abundant saccharides such as sucrose, galactinol, myo-inositol, and glucose, and trace amounts of raffinose and trehalose, in mature pollinia. We found that galactinol synthase (PAXXG304680) and trehalose-6-phosphate phosphatase (PAXXG016120) were preferentially expressed in mature pollinia. The Phal. aphrodite genome was found to contain 11 sucrose transporters (SUTs), and our qRT-PCR confirmed that two of them (PAXXG030250 and PAXXG195390) were preferentially expressed in the pollinia. Pollinia germinated in Brewbaker and Kwack's (BK) pollen germination medium supplemented with 10% sucrose showed increased callose production and enhanced germination, whereas no callose formed and no germination occurred in BK medium without sucrose. We postulate that the Phal. orchid accumulates high levels of sugars in mature pollinia to provide nutrients and enhances sucrose transporter gene expression for pollinia germination and tube growth.