ABSTRACT. Earlier work by Driscoll and Healy [18] gives an efficient algorithm for computing Fourier transforms of band-limited functions on the 2-sphere. We discuss an implementation of an efficient algorithm for the numerical computation of Fourier transforms of bandlimited functions defined on the rotation group SO(3). The implementation is freely available on the web. The algorithm described herein uses O(B^4) operations to compute the Fourier coefficients of a function whose Fourier expansion uses only the O(B^3) spherical harmonics of degree at most B. This compares very favorably with the direct O(B^6) algorithm derived from a basic quadrature rule on O(B^3) sample points. The efficient Fourier transform also makes possible the efficient calculation of convolution over SO(3), which has been used as the analytic engine for some new approaches to searching 3D databases (Funkhouser et al.). The fast SO(3) algorithm can be improved to give an algorithm of complexity O(B^3 log^2 B), but at a cost in numerical reliability. Numerical results are presented establishing the empirical stability of the basic algorithm. Examples of applications are presented as well.
ABSTRACT. A discrete orthogonal polynomial transform computes sums of the form f̂(j) = Σ_{i=0}^{N−1} f_i P_j(z_i) w(i) for some associated weight function w. These sorts of transforms find important applications in areas such as medical imaging and signal processing. In this paper we present fast algorithms for computing discrete orthogonal polynomial transforms. For a system of N orthogonal polynomials of degree at most N − 1 we give an O(N log^2 N) algorithm for computing a discrete polynomial transform at an arbitrary set of points, instead of the N^2 operations required by direct evaluation. Our algorithm depends only on the fact that orthogonal polynomial sets satisfy a three-term recurrence, and thus it may be applied to any such set of discretely sampled functions. In particular, sampled orthogonal polynomials generate the vector space of functions on a distance transitive graph. As a direct application of our work we are able to give a fast algorithm for computing subspace decompositions of this vector space which respect the action of the symmetry group of such a graph. This has direct applications to treating computational bottlenecks in the spectral analysis of data on distance transitive graphs, and we discuss this in some detail. (J. Driscoll and D. Healy were supported in part by DARPA as administered by the AFOSR under contract AFOSR-90-0292; D. Rockmore was supported in part by an NSF Math Sciences Postdoctoral Fellowship.)
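To make the objects concrete, the direct O(N^2) transform that the fast algorithm improves upon can be written in a few lines. The sketch below is ours, not the paper's: it assumes Legendre polynomials evaluated via their three-term (Bonnet) recurrence and Gauss-Legendre nodes and weights; the names `legendre_samples` and `direct_transform` are illustrative, and the fast O(N log^2 N) algorithm itself is not shown.

```python
import numpy as np

def legendre_samples(max_degree, z):
    """Evaluate P_0, ..., P_max_degree at the sample points z using the
    three-term recurrence (j+1) P_{j+1} = (2j+1) z P_j - j P_{j-1}."""
    z = np.asarray(z, dtype=float)
    P = np.empty((max_degree + 1, len(z)))
    P[0] = 1.0
    if max_degree >= 1:
        P[1] = z
    for j in range(1, max_degree):
        P[j + 1] = ((2 * j + 1) * z * P[j] - j * P[j - 1]) / (j + 1)
    return P

def direct_transform(f, z, w, max_degree):
    """Direct O(N^2) discrete polynomial transform:
    fhat[j] = sum_i f[i] * P_j(z[i]) * w[i]."""
    P = legendre_samples(max_degree, z)
    return P @ (np.asarray(f) * np.asarray(w))
```

With Gauss-Legendre quadrature the transform of a sampled P_2 recovers (up to the normalization 2/(2j+1)) a single nonzero coefficient at j = 2, which is a convenient sanity check on any faster implementation.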
We describe a computational technique for authenticating works of art, specifically paintings and drawings, from high-resolution digital scans of the original works. This approach builds a statistical model of an artist from the scans of a set of authenticated works, against which new works are then compared. The statistical model consists of first- and higher-order wavelet statistics. We show preliminary results from our analysis of 13 drawings that at various times have been attributed to Pieter Bruegel the Elder; these results confirm expert authentications. We also apply these techniques to the problem of determining the number of artists that may have contributed to a painting attributed to Pietro Perugino, and again achieve an analysis agreeing with expert opinion.

It probably was not long after people began paying money for art that a lucrative business in forging art was born, and it probably was not too much later that techniques for detecting art forgeries emerged. Even today, the early techniques for authentication remain preeminent. By and large, these techniques are based on "connoisseurship" and so rely on the discerning eyes of a few experts who are steeped in the work and life of the artist in question. Their opinion may be informed by the "catalogue raisonné," which is the current acknowledged authoritative work on the artist's oeuvre. Other desiderata may include provenance that might be traced back to the artist's circle or his collectors, which makes possible the comparison of the work's implicit biography with the histories of related works, or even a detailed analysis of any signature that may be present. (See ref. 1 for a survey of current techniques.) In addition to the reliance on the human actor, quantitative methods can be brought to bear. X-ray analysis can reveal a painting beneath a painting that can shed light on its origins.
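To give the flavor of such features, here is a minimal sketch (ours, not the paper's pipeline) of the two ingredients: a one-level 1-D Haar wavelet split, and the first- and higher-order statistics (mean, variance, skewness, kurtosis) of the resulting coefficients. The actual method operates on 2-D scans with a multi-level, multi-orientation decomposition; this reduced version only illustrates the idea.

```python
import numpy as np

def haar_step(x):
    """One level of a 1-D Haar wavelet transform: split an even-length
    signal into smooth averages and detail differences."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def coefficient_stats(c):
    """First- and higher-order statistics of a subband of wavelet
    coefficients: mean, variance, skewness, kurtosis."""
    c = np.asarray(c, dtype=float)
    mean = c.mean()
    d = c - mean
    var = (d ** 2).mean()
    if var == 0.0:  # constant subband: higher central moments vanish
        return mean, 0.0, 0.0, 0.0
    skew = (d ** 3).mean() / var ** 1.5
    kurt = (d ** 4).mean() / var ** 2
    return mean, var, skew, kurt
```

Concatenating such statistics across scales and orientations yields a feature vector for each work, and new works are compared against the distribution of feature vectors from the authenticated set.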
Surface analysis of the painting materials is another approach, most famously applied in the investigation of the famous "van Meegeren forgeries." In this case, the forgery of paintings attributed to Johannes Vermeer was confirmed by dating the paintings according to the proportion of a certain lead isotope in the lead-based paint. An elementary application of differential equations allowed the actual isotope content to be compared with the expected content had the work been painted in Vermeer's day (2). This technique marked a first use of mathematics in the service of art authentication.

With the advent of powerful digital technology, computational tools may be able to provide new insights into, and techniques for, the art and science of art authentication. For example, a fractal analysis of Jackson Pollock's drip paintings has revealed interesting relations between the evolution of Pollock's aesthetic and the fractal dimension of his work (3, 4). The analysis also raises the possibility of using fractal dimension to help authenticate work attributed to Pollock. Various techniques from machine learning have been applied to the analysis and classification of […]
Literature is a form of expression whose temporal structure, both in content and style, provides a historical record of the evolution of culture. In this work we take on a quantitative analysis of literary style and conduct the first large-scale temporal stylometric study of literature by using the vast holdings in the Project Gutenberg Digital Library corpus. We find temporal stylistic localization among authors through the analysis of the similarity structure in feature vectors derived from content-free word usage, nonhomogeneous decay rates of stylistic influence, and an accelerating rate of decay of influence among modern authors. Within a given time period we also find evidence for stylistic coherence with a given literary topic, such that writers in different fields adopt different literary styles. This study gives quantitative support to the notion of a literary "style of a time," with a strong trend toward increasingly contemporaneous stylistic influence.

Keywords: cultural evolution | stylometry | culture | complexity | big data

Written works, or literature, provide one of the great bodies of cultural artifacts. The analysis of literature typically involves the aggregation of information on several levels, ranging from words to sentences and even larger scale properties of temporal narratives such as structure, plot, and the use of irony and metaphor (1-3). Quantitative methods have long been applied to literature, most notably in the analysis of style, which can be traced back to a comment by the mathematician Augustus de Morgan regarding the attribution of the Pauline epistles (4) and the late nineteenth century work of the historian of philosophy Wincenty Lutosławski, who brought basic statistical ideas of word usage to the problem of dating the dialogues of Plato (5). It was Lutosławski who coined the word "stylometry" to describe such an approach to investigating questions of literary style.
Since then, a wide range of statistical techniques have been developed toward this end (6), generally with the goal of settling questions of author attribution (see, e.g., refs. 6-11). Stylometric studies have also been pursued in the study of visual art (12, 13) and music [both in composition (14-16) and performance (17)], and are part of a growing body of work in the quantitative analysis of cultural artifacts (18).

In this paper we report our findings from the first large-scale stylometric analysis of literature. The goal of this work is not author attribution (the authorship of all the works is well known) but instead to articulate, in a quantitative fashion, large-scale temporal trends in literary (i.e., writing) style. This type of study has been, until now, impossible to undertake, but the advent of mass digitization has created dramatic new opportunities for scholarly studies in literature as well as in other disciplines (19). Our literature sample is obtained from the Project Gutenberg Digital Library (http://www.gutenberg.org/wiki/Gutenberg:About). Project Gutenberg consists of more than 30,000 public domain […]
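A minimal sketch of the kind of feature vector involved: relative frequencies of "content-free" function words, compared via cosine similarity. The word list below is a tiny stand-in for the study's actual feature set, and the function names are ours, not the paper's.

```python
import math
from collections import Counter

# Hypothetical mini-list of content-free function words; the study's
# real feature set is far larger.
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "it",
                  "is", "was", "for", "with", "as", "but", "not", "on"]

def style_vector(text):
    """Relative frequencies of function words: a simple content-free
    stylistic fingerprint of a text."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors (0.0 if either
    vector is all zeros)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0
```

Computing such similarities between authors binned by time is what makes the "similarity structure" of the corpus, and hence stylistic influence over time, quantifiable.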