The automation of inspections in aircraft engines is an ever-growing field of research. In particular, the inspection and quantification of coating damages in confined spaces, usually performed manually with handheld endoscopes, comprise tasks that are challenging to automate. In this study, 2D RGB video data provided by commercial instruments are further analyzed in the form of a segmentation of damage areas. For this purpose, large overview images, which are stitched from the video frames and show the whole coating area, are analyzed with convolutional neural networks (CNNs). However, these overview images need to be divided into smaller image patches to keep the CNN architecture at a functional and fixed size, which leads to a significantly reduced field of view (FOV) and therefore a loss of information and reduced network accuracy. A possible solution is a downsampling of the overview image to decrease the number of patches and increase the FOV of each patch. However, while an increased FOV with downsampling and a small FOV without resampling both exhibit a lack of information, these approaches capture partly different information and abstractions that can be utilized complementarily. Based on this hypothesis, we propose a two-stage segmentation pipeline, which processes image patches with different FOVs and downsampling factors to increase the overall segmentation accuracy for large images. This includes a novel method to optimize the position of image patches, which leads to a further improvement in accuracy. After a validation of the described hypothesis, an evaluation and comparison of the proposed pipeline and methods against the single-network application is conducted in order to demonstrate the accuracy improvements.
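The complementary trade-off between FOV and resolution can be illustrated with a minimal patch-extraction sketch. This is not the authors' implementation; the function name, patch size, and nearest-neighbor striding are illustrative assumptions:

```python
import numpy as np

def extract_patches(image, patch_size, downsample=1):
    """Split an overview image into fixed-size patches.

    A larger downsampling factor widens the field of view covered by each
    patch at the cost of spatial resolution (illustrative parameters).
    """
    # Nearest-neighbor downsampling by striding (placeholder for proper resampling).
    small = image[::downsample, ::downsample]
    h, w = small.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patches.append(small[y:y + patch_size, x:x + patch_size])
    return patches

# A 1024x1024 overview image split into 256x256 network inputs:
overview = np.zeros((1024, 1024), dtype=np.uint8)
fine = extract_patches(overview, 256, downsample=1)    # many patches, narrow FOV
coarse = extract_patches(overview, 256, downsample=4)  # one patch, full FOV
```

With these toy numbers, the full-resolution pass yields 16 narrow-FOV patches, while downsampling by 4 covers the whole image in a single coarse patch; a two-stage pipeline can combine predictions from both.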
In order to provide timely, reliable, and comprehensive data for the maintenance of highly stressed geometries in sheet-bulk metal forming tools, this article features a possible setup combining a 3D measuring endoscope with a two-stage kinematic system. The measurement principle is based on the projection of structured light, allowing time-effective measurements of larger areas. To obtain data of proper quality, several hundred measurements are performed, which then have to be registered and finally merged into one single point cloud. Factors such as heavy, unwieldy specimens impede precise alignment. The rotational axes are therefore possibly misaligned, and the kinematics and the hand-eye transformation remain uncalibrated. By using computer-aided design (CAD) data, registration can be improved, allowing a detailed examination of local features like gear geometries while reducing the sensitivity for detecting shape deviations.
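The final merging step described above, in which registered scans are fused into one point cloud, can be sketched as follows. This is a generic sketch, not the article's code; the function name and the use of 4x4 homogeneous transforms per scan are assumptions:

```python
import numpy as np

def merge_registered_scans(scans, transforms):
    """Apply a rigid 4x4 homogeneous transform to each scan (Nx3 points)
    and concatenate all scans into one point cloud."""
    merged = []
    for pts, T in zip(scans, transforms):
        # Promote points to homogeneous coordinates, transform, and project back.
        homo = np.hstack([pts, np.ones((len(pts), 1))])
        merged.append((homo @ T.T)[:, :3])
    return np.vstack(merged)

# Two toy scans: the second is registered with a known x-translation.
scan_a = np.zeros((5, 3))
scan_b = np.zeros((5, 3))
T_a = np.eye(4)
T_b = np.eye(4)
T_b[:3, 3] = [1.0, 0.0, 0.0]
cloud = merge_registered_scans([scan_a, scan_b], [T_a, T_b])
```

In practice the transforms come from the registration stage (kinematics plus CAD-aided refinement); here they are placeholders.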
Fringe projection profilometry, in combination with other optical measuring technologies, has established itself over the last decades as an essential complement to conventional, tactile measuring devices. The non-contact, holistic reconstruction of complex geometries within fractions of a second, in conjunction with the lightweight and transportable sensor design, opens up many fields of application in production metrology. Furthermore, triangulation-based measuring principles feature good scalability, which has led to 3D scanners for various scale ranges. Innovative and modern production processes, such as sheet-bulk metal forming, thus utilize fringe projection profilometry in many respects to monitor the process, quantify possible wear, and improve production technology. Therefore, it is essential to identify the appropriate 3D scanner for each application and to properly evaluate the acquired data. Through precise knowledge of the measurement volume and the relative uncertainty with respect to the specimen and scanner position, adapted measurement strategies and integrated production concepts can be realized. Although there are extensive industrial standards and guidelines for the quantification of sensor performance, evaluation and tolerancing are mainly global and can, therefore, neither provide assistance in the correct, application-specific positioning and alignment of the sensor nor reflect the local characteristics within the measuring volume. Therefore, this article compares fringe projection systems across various scale ranges by positioning and scanning a calibrated sphere in a high-resolution grid.
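Evaluating a scanned calibrated sphere at each grid position typically involves fitting a sphere to the measured points and comparing the fitted radius against the calibrated value. A minimal algebraic least-squares sphere fit (a standard technique, not necessarily the article's exact evaluation procedure) can be sketched as:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit to Nx3 points.

    Uses the linearization |p|^2 = 2 c . p + (r^2 - |c|^2), solved as a
    linear system for the center c and the offset term.
    """
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic noise-free points on a sphere of radius 2 centered at (1, 2, 3):
rng = np.random.default_rng(0)
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([1.0, 2.0, 3.0]) + 2.0 * dirs
center, radius = fit_sphere(pts)
```

Repeating such a fit across the measurement volume yields the local deviations (fitted radius and residuals per grid position) that a purely global specification cannot capture.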
Inspection in confined spaces and difficult-to-access machines is a challenging quality assurance task and particularly difficult to quantify and automate. Using the example of aero engine inspection, an approach for the high-precision inspection of movable turbine blades in confined spaces is demonstrated. To assess the condition and damages of turbine blades, a borescopic inspection approach is presented in which the pose of the turbine blades is estimated on the basis of measured point clouds. By means of a feature extraction approach, film-cooling holes are identified and used to pre-align the measured point clouds to a reference geometry. Based on the segmented features of the measurement and reference geometry, a RANSAC-based feature matching is applied, and a multi-stage registration process is performed. Subsequently, an initial damage assessment of the turbine blades is derived, and engine disassembly decisions can be assisted by metric geometry deviations. During engine disassembly, the blade root is exposed to high forces, which can damage it; such damage is crucial for a possible repair. To check for dismantling damage, a fast inspection of the blade root is executed using the borescopic sensor.
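Once corresponding features (e.g. matched film-cooling holes) are available, the core alignment step of such a registration reduces to estimating a rigid transform between the two point sets. A minimal sketch of the standard Kabsch/SVD solution is shown below; this illustrates the alignment step only, not the authors' full RANSAC-based multi-stage pipeline, and the names are illustrative:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~= R @ src + t,
    given matched Nx3 point sets (Kabsch algorithm)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Recover a known 90-degree rotation about z plus a translation:
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
src = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0], [1.0, 1.0, 1.0]])
dst = src @ Rz.T + np.array([0.5, -0.2, 0.3])
R, t = rigid_align(src, dst)
```

Within a RANSAC loop, such a fit would be computed on random minimal subsets of the matched features, keeping the transform with the largest inlier set before the final multi-stage refinement.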