One of the main difficulties in studying expression in musical performance is the acquisition of data. While audio recordings abound, automatically extracting precise information on timing, dynamics, and articulation is still not possible at the level of precision required for large-scale music performance studies. In 1989, the Russian pianist Nikita Magaloff performed essentially the complete solo piano works of Frédéric Chopin on a Bösendorfer SE, a computer-controlled grand piano that precisely measures every key and pedal action by the performer. In this paper, we describe the process and the tools used to prepare this collection, which comprises hundreds of thousands of notes. We then present the results of initial exploratory studies of the expressive content of the data, specifically the effects of performer age, performance errors, between-hand asynchronies, and tempo rubato. We also report preliminary results of a systematic study of the shaping of particular rhythmic passages, using the notion of phase-plane trajectories. Finally, we briefly describe how the Magaloff data were used to train a performance rendering system that won the 2008 Rencon International Performance Rendering Contest.