Acknowledgements. This work was funded by NSF EAGER-1514351 to RNA and CAN, a grant and an Emmy Noether Grant (CI-241/1-1) to RMC, and a Philippe Foundation award to LB. We thank the participants, families, and students who made this work possible, Dr. Dimitrios Pantazis for sharing MATLAB code to perform cluster-based inferences on classification time-series, and Heather Kosakowski and Kirsten Lydic for helpful feedback on an earlier version of this manuscript.

Abstract

Computational tools have allowed cognitive neuroscientists to move beyond measuring neural activations to examining neural representations. However, access to the representational content of neural activations early in life has remained limited. We asked whether patterns of neural activity elicited by complex visual stimuli (animals, human bodies) could be decoded from EEG data gathered from 12-15-month-old infants and adult controls. We assessed pairwise classification accuracy at each time point after stimulus onset, for individual infants and adults. Classification accuracies rose above chance in both groups within 500 ms of stimulus onset. In contrast to adults, neural representations in infants were not linearly separable across visual domains. Representations were similar within, but not across, age groups. These findings suggest a developmental reorganization of visual representations between the second year of life and adulthood, and provide a promising proof-of-concept for the feasibility of decoding EEG data within-subject to assess how the infant brain dynamically represents visual objects.
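The core analysis described above, pairwise classification at each time point after stimulus onset, can be sketched as follows. This is a minimal illustration of time-resolved decoding on synthetic data, not the authors' pipeline; the data shapes, classifier choice (LDA with 5-fold cross-validation via scikit-learn), and injected signal are all assumptions made for demonstration.

```python
# Illustrative sketch of time-resolved pairwise EEG decoding (NOT the authors' code).
# Hypothetical data: epochs of shape (n_trials, n_channels, n_times),
# with one condition label per trial (e.g., animal vs. human body).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 60, 32, 50
epochs = rng.standard_normal((n_trials, n_channels, n_times))
labels = np.repeat([0, 1], n_trials // 2)  # two stimulus conditions

# Inject a weak condition difference after a simulated "stimulus onset"
# (time index 20) so that decoding accuracy rises above chance.
epochs[labels == 1, :5, 20:] += 0.8

# Pairwise classification at each time point: the classifier is trained and
# cross-validated on the spatial (channel) pattern at that single time point.
accuracy = np.empty(n_times)
for t in range(n_times):
    X = epochs[:, :, t]  # trials x channels pattern at time t
    accuracy[t] = cross_val_score(
        LinearDiscriminantAnalysis(), X, labels, cv=5
    ).mean()
```

The resulting `accuracy` time course is the per-subject quantity on which significance could then be assessed, for instance with the cluster-based inference on classification time-series mentioned in the acknowledgements.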