Abstract. The effects of initial nitrogen and lignin contents of six species of hardwood leaves on their decomposition dynamics were studied at the Hubbard Brook Experimental Forest. Rate constants (k) for annual leaf mass loss ranged from -0.08 to -0.47. The rate constants had a negative linear correlation (r² = 0.89) with the ratio of initial lignin concentration to initial nitrogen concentration. Decomposition dynamics of the litter materials were described by inverse linear relationships between the percentage of original mass remaining and the nitrogen concentration in the residual material. Initial lignin concentration was highly correlated (r² = 0.93) with the slope of the inverse linear relationship for each litter type.
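The annual rate constants (k) above belong to a first-order (single-exponential) decay model of litter mass loss. A minimal sketch of that model follows; the masses and times are illustrative values chosen for demonstration, not data from the study — only the reported range of k (roughly -0.08 to -0.47 per year) comes from the abstract.

```python
import math

def mass_remaining(initial_mass, k, years):
    """Single-exponential decay: X(t) = X0 * exp(k * t), with k negative.

    k is the annual rate constant; more negative k means faster
    decomposition (e.g., litter with a lower lignin:N ratio).
    """
    return initial_mass * math.exp(k * years)

# Illustrative comparison after one year, starting from 100 g of litter:
slow = mass_remaining(100.0, -0.08, 1.0)  # slowest reported k
fast = mass_remaining(100.0, -0.47, 1.0)  # fastest reported k
print(round(slow, 1), round(fast, 1))
```

Running this shows the spread the abstract's k range implies: roughly 92% of the original mass remains after one year at k = -0.08, versus about 63% at k = -0.47.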
Perhaps one of the most powerful symbols of the United States' technological prowess is the Mission Control Center (MCC) at the Lyndon B. Johnson Space Center in Houston. The rooms at Mission Control have been witness to major milestones in the history of American technology, such as the first lunar landing, the rescue of Skylab, and the first launch of the Space Shuttle. When Mission Control was first activated in the early 1960s, it was truly a technological marvel. This facility, however, has received only modest upgrades since the Apollo program. Until recently it maintained a mainframe-based architecture that displayed data and left the job of data analysis to flight controllers. The display technology in this system was monochrome and showed primarily text with limited graphics (photo 1). An example display of 250 communication parameters is shown in Figure 1. The mainframe processed incoming data and displayed it to the flight controllers; however, it performed few functions to convert raw data into information. The job of converting data into information upon which flight decisions could be made fell to the flight controllers. In some cases, where additional computational support was required, small offline personal computers were added to the complex. Flight controllers visually copied data off the console display screens and manually entered it into the small personal computers, where offline analysis could be performed. Although this system was technologically outdated, it embodied years of customizing effort and served NASA well through the early Space Shuttle program. Several factors are now driving NASA to change the architecture of Mission Control to accommodate advanced automation. First is the requirement to support an increased flight rate without major growth in the number of personnel assigned to flight control duties.
A second major concern is loss of corporate knowledge due to the unique bimodal age distribution of NASA staff. Hiring freezes between the Apollo and Shuttle programs have left NASA composed of two primary groups: approximately half are Apollo veterans within five years of retirement, and the other half are personnel under the age of 35 with Shuttle-only experience. NASA considers it highly desirable to capture the corporate knowledge of the Apollo veterans in knowledge-based systems before they retire. Because the mainframe complex is oriented primarily toward data display, it is a poor environment for capturing and utilizing knowledge. These factors have led NASA's Mission Operations Directorate to pursue aggressive efforts to deploy a distributed system of Unix engineering-class workstations running a mix of online real-time expert systems and traditional automation, allowing flight controllers to perform more tasks and capturing the corporate knowledge of senior personnel. Starting with the first flight of the Space Shuttle after the Challenger accident, the Real-Time Data System (RTDS) has played an increasingly significant role in the flight-critical decision-making process.
This paper discusses the introduction of advanced information systems technologies, such as artificial intelligence, expert systems, and advanced human-computer interfaces, directly into Space Shuttle software engineering. Completing this objective will validate that these technologies are sufficiently mature to use in NASA space mission operations and will benefit the Space Shuttle Program by improving the quality of software performance analysis. The Process: The Space Shuttle flight software is built and tested within the Software Production Facility (SPF). The SPF is composed of several IBM 3000 Series mainframes and is the backbone of the reconfiguration process.