In order to predict with some degree of certainty whether any test method can be adapted to a particular inspection problem, or, more fundamentally, to develop methods which can be applied, the physical principles of the methods must be understood as completely as possible. Only after such an understanding has been obtained can further exploitation and technical development along particular lines be justified.

Experience during World War II first revealed the general inadequacy of available methods for the inspection of metals at wartime production rates. The realization of this situation came too late, however, to allow for revision of the methods then in use and the substitution of new ideas for inspection tests. In a few specialized cases new methods were devised and adopted, but these methods were by no means fully exploited. One test method which aroused considerable interest during the war utilizes energy in the form of high-frequency vibrations (ultrasonic vibrations) for the detection of flaws in certain parts. Successful application of this method in specific cases suggested a much wider utilization of the principles it employs.

The report presented herewith is a summary of the technical literature available through 1946 on the subject of ultrasonics as applied to metals. Particular emphasis has been placed on the detection of flaws or discontinuities in metals, since the bulk of the literature encountered has dealt with this application. Other metallurgical applications were encountered, however, and have been included for the sake of completeness and to promote speculation along these lines.
At the risk of being caught in the crossfire of the controversy that has accompanied the subject of radiographic sensitivity since the days of the first radiographic procedure specification, the author will attempt to express his thoughts on this subject in an effort to present a more complete and modern picture of the problem than has been offered in the past. Much of the discussion to be presented will not be new, but it will be presented in the light of recent experience in the preparation of a general radiographic procedure specification.

"The term radiographic sensitivity refers to a combination of contrast and definition which determines the clarity with which variations in dimensions and radiographic opacity of the object being examined are depicted on the exposed film. This quality is, of course, directly related to the ability of the radiograph to reveal defects or other sought-for conditions in the object, but the latter ability is very difficult to express quantitatively."

It is interesting to note that of several textbooks and manuals on radiography examined, not one presented a word definition, and only one of them made specific mention of the term. The one book which did present a picture of the significance of the term did so by means of a chart showing the breakdown of the factors influencing radiographic sensitivity, a very clear means of explanation. The definition quoted above does not state the meaning of the term nearly as well as it states the factors upon which the term depends. This fact is somewhat unfortunate but nonetheless significant. That radiographic sensitivity is difficult to define directly and incapable of quantitative evaluation leads one to question the advisability of using the term as the keynote of good industrial radiography at the present time. To discover why we have been unable to evaluate radiographic sensitivity quantitatively, one must investigate the individual factors involved.
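Although the passage argues that radiographic sensitivity resists a single quantitative evaluation, radiographers have long used penetrameter (image quality indicator) readings as a rough working proxy. The sketch below, with hypothetical numbers not drawn from this report, shows the usual percent-sensitivity arithmetic; the function name and the example values are illustrative assumptions only.

```python
def percent_sensitivity(smallest_visible_change, specimen_thickness):
    """Penetrameter-style percent sensitivity: the smallest thickness
    change discernible on the film, expressed as a percentage of the
    specimen thickness (a smaller figure indicates a better radiograph).
    Both arguments must be in the same unit of length."""
    return 100.0 * smallest_visible_change / specimen_thickness

# Hypothetical example: a 0.02 in. step is the smallest detail visible
# on a radiograph of a 1.0 in. thick plate.
print(percent_sensitivity(0.02, 1.0))  # prints 2.0, i.e. "2% sensitivity"
```

Note that such a figure captures chiefly the contrast component of the quoted definition; the definition (sharpness) component is precisely what makes a single quantitative measure of radiographic sensitivity elusive.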