Binary code analysis allows one to analyze binary code without access to the corresponding source code. A binary, after disassembly, is expressed in an assembly language. This inspires us to approach binary analysis by leveraging ideas and techniques from Natural Language Processing (NLP), a fruitful area focused on processing text in various natural languages. We notice that binary code analysis and NLP share many analogous topics, such as semantics extraction, classification, and code/text comparison. This work thus borrows ideas from NLP to address two important code similarity comparison problems: (I) given a pair of basic blocks from different instruction set architectures (ISAs), determine whether their semantics are similar; and (II) given a piece of code of interest, determine whether it is contained in another piece of code compiled for a different ISA. Solutions to these two problems have many applications, such as cross-architecture vulnerability discovery and code plagiarism detection.

Despite the evident importance of Problem I, existing solutions are either inefficient or imprecise. Inspired by Neural Machine Translation (NMT), a recent approach that handles text across natural languages very well, we regard instructions as words and basic blocks as sentences, and propose a novel cross-(assembly-)lingual deep learning approach to Problem I that attains high efficiency and precision. Many solutions have been proposed to determine whether two pieces of code, e.g., functions, are equivalent (the equivalence problem), which is different from Problem II (the containment problem). Resolving the cross-architecture code containment problem is a new and more challenging endeavor. Employing our technique for cross-architecture basic-block comparison, we propose the first solution to Problem II. We implement a prototype system, INNEREYE, and perform a comprehensive evaluation.
A comparison between our approach and existing approaches to Problem I shows that our system outperforms them in terms of accuracy, efficiency, and scalability. Case studies applying the system demonstrate that our solution to Problem II is effective. Moreover, this research showcases how ideas and techniques from NLP can be applied to large-scale binary code analysis.
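As an illustration of the "instructions as words, basic blocks as sentences" analogy, the sketch below embeds each instruction, mean-pools the vectors into a block embedding, and scores a cross-ISA block pair by cosine similarity. This is a hypothetical toy pipeline with deterministic pseudo-random token vectors, not INNEREYE's trained model (which learns embeddings with a neural network); all names and numbers here are illustrative.

```python
import hashlib
import numpy as np

DIM = 16  # embedding dimensionality (arbitrary for this toy)

def embed_instruction(instr: str) -> np.ndarray:
    """Toy 'word' embedding: a deterministic pseudo-random vector
    keyed on the instruction mnemonic. A real system would learn
    these vectors from large disassembled corpora."""
    mnemonic = instr.split()[0]
    seed = int(hashlib.sha256(mnemonic.encode()).hexdigest()[:8], 16)
    rng = np.random.default_rng(seed)
    return rng.standard_normal(DIM)

def embed_block(block: list[str]) -> np.ndarray:
    """Block ('sentence') embedding: mean-pool instruction vectors."""
    return np.mean([embed_instruction(i) for i in block], axis=0)

def similarity(a: list[str], b: list[str]) -> float:
    """Cosine similarity between two basic-block embeddings."""
    va, vb = embed_block(a), embed_block(b)
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Hypothetical x86 and ARM basic blocks with similar semantics.
x86_block = ["mov eax, 1", "add eax, ebx", "ret"]
arm_block = ["mov r0, #1", "add r0, r0, r1", "bx lr"]
print(f"similarity: {similarity(x86_block, arm_block):.3f}")
```

A learned model replaces the hash-based vectors with embeddings trained so that semantically equivalent blocks across ISAs land close together, which is what makes the cosine score meaningful in practice.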
A program is underway at Los Alamos National Laboratory to develop nuclear data libraries for incident neutrons and protons to 150 MeV for accelerator-driven applications. These libraries will be used initially for the design of an accelerator-based facility to produce tritium, including analysis of system performance, induced radiation doses, material activation, heating, damage, and shielding requirements. The libraries are based primarily on nuclear model calculations with the GNASH reaction theory code, including thorough benchmarking of the model calculations against experimental data. All evaluations include specification of production cross sections for light particles, gamma rays, and heavy recoil particles; energy-angle correlated spectra for secondary light particles; and energy spectra for gamma rays and heavy recoil nuclei. The neutron evaluations are combined with ENDF/B-VI evaluations below 20 MeV. To date, neutron and proton evaluations have been completed for 2H, 12C, 16O, 27Al, 28,29,30Si, 40Ca, 54,56,57,58Fe, 182,183,184,186W, and 206,207,208Pb.
A program is being carried out at Lawrence Livermore National Laboratory to develop high-energy evaluated nuclear data libraries for use in Monte Carlo simulations of cancer radiation therapy. In this report we describe evaluated cross sections and kerma factors for neutrons with incident energies up to 100 MeV on 12C; in subsequent reports we shall describe our high-energy libraries for neutrons on 14N, 16O, 31P, and 40Ca, as well as accelerator collimator and shielding materials. The aim of this effort is to incorporate advanced nuclear physics modeling methods, together with new experimental measurements, to generate the cross section libraries needed for an accurate simulation of dose deposition in fast neutron therapy. The evaluated libraries are based mainly on nuclear model calculations, benchmarked against experimental measurements where they exist. We use the GNASH code system, which includes Hauser-Feshbach, preequilibrium, and direct reaction mechanisms. The libraries tabulate elastic and nonelastic cross sections, angle-energy correlated production spectra for light ejectiles with A ≤ 4, and the kinetic energies given to light ejectiles and heavy recoil fragments.
The major steps involved in this effort are: (1) development and validation of nuclear models for incident energies up to 100 MeV; (2) collation of experimental measurements, including new results from Louvain-la-Neuve and Los Alamos; (3) extension of the Livermore ENDL formats for representing high-energy data; (4) calculation and evaluation of nuclear data; and (5) validation of the libraries. We describe the evaluations in detail, with particular emphasis on our new high-energy modeling developments. Our evaluations agree well with experimental measurements of integrated and differential cross sections. We compare our results with the recent ENDF/B-VI evaluation, which extends up to 32 MeV. We also compare kerma factors derived from our evaluated microscopic cross sections with measurements, providing an important integral benchmark of the libraries. The evaluated libraries are described and illustrated in detail.
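The kerma factors mentioned above follow directly from the evaluated microscopic data: at each incident neutron energy, the kerma factor is the sum over reaction channels of the cross section times the mean kinetic energy carried off by charged ejectiles and recoils, scaled by the number of target atoms per gram, k = (N_A/A) Σ_j σ_j Ē_j. The sketch below evaluates this formula for hypothetical channel values; the cross sections and mean energies are made up for illustration and are not evaluated data.

```python
N_A = 6.02214076e23      # Avogadro's number [1/mol]
A = 12.011               # molar mass of carbon [g/mol]
BARN_CM2 = 1e-24         # 1 barn in cm^2
MEV_J = 1.602176634e-13  # 1 MeV in joules

# Hypothetical reaction channels on 12C at one incident energy:
# (cross section [barn], mean charged-particle + recoil energy [MeV])
channels = [
    (0.10, 5.0),   # illustrative (n,p)-like channel
    (0.25, 9.0),   # illustrative (n,alpha)-like channel
    (0.05, 2.0),   # illustrative recoil contribution
]

# k = (N_A / A) * sum_j sigma_j * Ebar_j, first in MeV * cm^2 / g,
# then converted to Gy * cm^2 (1 Gy = 1 J/kg, hence the factor 1e3).
k_mev = (N_A / A) * sum(sig * BARN_CM2 * e for sig, e in channels)
k_gy = k_mev * MEV_J * 1e3
print(f"kerma factor: {k_gy:.3e} Gy cm^2")
```

Multiplying such a kerma factor by a neutron fluence (in cm^-2) gives the kerma in Gy, which is how the microscopic libraries connect to the integral dose measurements used for benchmarking.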