We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights will be made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model with publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B's architecture and training, and evaluate its performance on a range of language-understanding, mathematics, and knowledge-based tasks. We open-source the training and evaluation code, as well as the model weights, at https://github.com/EleutherAI/gpt-neox.
We present membrane-based steric exclusion chromatography (SXC) as a universal capture step for the purification of adeno-associated virus (AAV) gene transfer vectors, independent of their serotype and surface characteristics. SXC is performed by mixing an unpurified cell culture supernatant containing AAV particles with polyethylene glycol (PEG) and feeding the mixture onto a chromatography filter unit. The purified AAV particles are recovered by flushing the unit with a solution lacking PEG. SXC is an inexpensive single-use method that permits the concentration, purification, and re-buffering of AAV particles with yields >95% and >80% impurity clearance. SXC could theoretically be employed at industrial scales with units of nearly 20 m².