Retinitis pigmentosa (RP) is a progressive, inherited, monogenic or rarely digenic1 blinding disease caused by mutations in more than 71 different genes (https://sph.uth.edu/retnet/sum-dis.htm). It affects more than 2 million people worldwide. With the exception of a gene replacement therapy for one form of early-onset RP caused by mutation in the gene RPE65 (ref. 2), there is no approved therapy for RP.

Optogenetic vision restoration3-5 is a mutation-independent approach for restoring visual function at the late stages of RP, after vision is lost6-9. The open-label phase 1/2a PIONEER study (ClinicalTrials.gov identifier: NCT03326336; the clinical trial protocol is provided in the Supplementary Text) was designed to evaluate the safety (primary objective) and efficacy (secondary objective) of an investigational treatment for patients with advanced nonsyndromic RP that combines injection of an optogenetic vector (GS030-Drug Product, GS030-DP) with wearing a medical device, namely light-stimulating goggles (GS030-Medical Device, GS030-MD). The proof of concept for GS030-DP, and the GS030-DP dose used in the PIONEER clinical trial, were established in nonhuman primate studies10,11.

The optogenetic vector, a serotype 2.7m8 (ref. 12) adeno-associated viral vector encoding the light-sensing channelrhodopsin protein ChrimsonR fused to the red fluorescent protein tdTomato13, was administered by a single intravitreal injection into the worse-seeing eye to target mainly foveal retinal ganglion cells10. The fluorescent protein tdTomato was included to increase the expression of ChrimsonR in the cell membrane10. The peak sensitivity of ChrimsonR-tdTomato is around 590 nm (amber)13. We chose ChrimsonR, which has one of the most red-shifted action spectra among the available optogenetic sensors, because amber light is safer and causes less pupil constriction10 than the blue light used to activate many other sensors.
The light-stimulating goggles capture images from the visual world using a neuromorphic camera that detects changes in intensity, pixel by pixel, as distinct events14. The goggles then transform the events into monochromatic images and project them in real time as local 595-nm light pulses onto the retina (Extended Data Fig. 1).

Results

Safety of the optogenetic vector and light-stimulating goggles. In this article, we describe the partial recovery of vision in one participant of the PIONEER study. At inclusion in the study, this 58-year-old male, who had been diagnosed with RP 40 years earlier, had visual acuity limited to light perception. The worse-seeing eye was treated with 5.0 × 10¹⁰ vector genomes of the optogenetic vector. Both before and after the injection, we performed ocular examinations and assessed the anatomy of the retina based on optical coherence tomography images, color fundus photographs and fundus autofluorescence images taken on several occasions over 15 visits spanning 84 weeks, according to the protocol (Extended Data Fig. 2). We monitored potential intraocular inflammation a...
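The event-based image formation described above can be sketched in a few lines. This is an illustrative toy, not the actual GS030-MD processing pipeline: the event tuple format and the function `events_to_frame` are assumptions, showing only the general idea of accumulating per-pixel change events from a neuromorphic camera into a monochromatic frame.

```python
import numpy as np

def events_to_frame(events, height, width, dt, t0):
    """Accumulate event-camera events (t, x, y, polarity) falling in the
    time window [t0, t0 + dt) into a monochromatic frame.

    Any pixel that reported at least one intensity change in the window
    is switched on; all others stay dark. This mimics, in simplified
    form, rendering a sparse event stream as an image for projection.
    """
    frame = np.zeros((height, width), dtype=np.uint8)
    for t, x, y, p in events:
        if t0 <= t < t0 + dt:
            frame[y, x] = 255  # pixel saw a change: illuminate it
    return frame

# Toy event stream: (timestamp_us, x, y, polarity)
events = [(100, 2, 3, 1), (150, 4, 1, -1), (900, 0, 0, 1)]
frame = events_to_frame(events, height=5, width=5, dt=500, t0=0)
print(int(frame.sum() // 255))  # 2 events fall in [0, 500): 2 lit pixels
```

In the real device, successive windows of events would be rendered continuously and projected onto the retina as 595-nm light pulses.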
Coupling behavioral measures and brain imaging in naturalistic, ecological conditions is key to understanding the neural bases of spatial navigation. This highly integrative function encompasses sensorimotor, cognitive, and executive processes that jointly mediate active exploration and spatial learning. However, most neuroimaging approaches in humans are based on static, motion-constrained paradigms and do not account for all these processes, in particular multisensory integration. Following the Mobile Brain/Body Imaging approach, we aimed to explore the cortical correlates of landmark-based navigation in actively behaving young adults solving a Y-maze task in immersive virtual reality. EEG analysis identified a set of brain areas matching the state-of-the-art brain imaging literature of landmark-based navigation. Spatial behavior in mobile conditions additionally involved sensorimotor areas related to motor execution and proprioception that are usually overlooked in static fMRI paradigms. As expected, we located a cortical source in or near the posterior cingulate, in line with the engagement of the retrosplenial complex in spatial reorientation. Consistent with its role in visuo-spatial processing and coding, we observed an alpha-power desynchronization while participants gathered visual information. We also hypothesized behavior-dependent modulations of the cortical signal during navigation. Despite finding few differences between the encoding and retrieval phases of the task, we identified transient time-frequency patterns attributed, for instance, to attentional demand, as reflected in the alpha/gamma range, or memory workload, as reflected in the delta/theta range.

This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
Coupling behavioral and brain imaging in naturalistic, ecological conditions is key to understanding the neural bases of spatial navigation. This highly integrative function encompasses sensorimotor, cognitive, and executive processes that jointly mediate active exploration and spatial learning. However, most neuroimaging approaches in humans do not account for all these processes, in particular multisensory integration, since they are based on static, motion-constrained paradigms. Here, we bring together the technology and data analysis tools to conduct simultaneous brain/body imaging during navigation in mobile conditions. Following the Mobile Brain/Body Imaging approach, we focus on landmark-based navigation in actively behaving young adults solving a virtual reality Y-maze task. The presented EEG analysis identifies exploitable neural signals from a specific network of brain regions that matches the state-of-the-art imaging literature of landmark-based navigation, revealing the concurrent activation of brain areas engaged in active, natural spatial navigation. In particular, we focus on the role of the retrosplenial cortex in visuo-spatial processing and coding. In line with previous evidence, we find behavioral modulations of neural processes during navigation, such as attentional demand, as reflected in the alpha/gamma range, and memory workload in the delta/theta range. Finally, our results show how the fine temporal resolution of mobile EEG recordings captures the time course of neuro-behavioral correlations as participants actively interact with their environment. We confirm that combining mobile high-density EEG and biometric measures can help unravel the brain network and neural modulations subtending ecological landmark-based navigation.
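The band-specific power modulations described above (alpha desynchronization, delta/theta memory workload) are commonly estimated by convolving the EEG signal with a complex Morlet wavelet and taking the squared magnitude. The sketch below is a minimal, self-contained illustration of that general technique using only NumPy and a synthetic signal; the function name and parameters are illustrative and do not reproduce the authors' analysis pipeline.

```python
import numpy as np

def morlet_power(signal, fs, freq, n_cycles=7):
    """Time-resolved band power via convolution with a complex Morlet wavelet.

    Returns |signal * wavelet|^2, the kind of time-frequency estimate used
    to track, e.g., alpha desynchronization in EEG.
    """
    t = np.arange(-0.5, 0.5, 1 / fs)  # wavelet support (shorter than signal)
    sigma = n_cycles / (2 * np.pi * freq)  # Gaussian envelope width (s)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    wavelet /= np.abs(wavelet).sum()  # normalize the wavelet energy
    analytic = np.convolve(signal, wavelet, mode="same")
    return np.abs(analytic) ** 2

fs = 250  # sampling rate (Hz)
time = np.arange(0, 4, 1 / fs)
# Synthetic "EEG": a 10 Hz alpha rhythm present only in the first 2 s
sig = np.where(time < 2, np.sin(2 * np.pi * 10 * time), 0.0)
power = morlet_power(sig, fs, freq=10.0)
half = len(power) // 2
early, late = power[:half].mean(), power[half:].mean()
print(early > late)  # True: alpha power collapses once the rhythm stops
```

Scanning `freq` over a grid of frequencies yields the full time-frequency map from which transient alpha/gamma and delta/theta patterns like those reported here can be read off.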