The UNIX operating system provides an especially congenial programming environment, in which it is not only possible, but actually natural, to write programs quickly and well. Several characteristics of the UNIX system contribute to this desirable state of affairs. Files have no type or internal structure, so data produced by one program can be used by another without impediment. The basic system interface for input and output provides homogeneous treatment of files, I/O devices and programs, so programs need not care where their data comes from or goes to. The command interpreter makes it convenient to connect programs, by arranging for all data communication. Complex procedures are created not by writing large programs from scratch, but by interconnecting relatively small components. These programs are small and concentrate on single functions, and therefore are easy to build, understand, describe, and maintain. They form a high level toolkit whose existence causes programmers to view their work as the use and creation of tools, a viewpoint that encourages growth in place of reinvention. Tools interact in a limited number of ways, but can be used in many different combinations. Thus, an addition to the toolkit tends to improve the programming power of the user faster than it increases the complexity of interconnection and maintenance. Finally, tools are connected at a very high level by a powerful command language interpreter. The error‐prone and expensive process of program writing can often be avoided in favor of program‐using. In this paper we will present a variety of examples to illustrate this methodology, focusing on those aspects of the system and supporting software which make it possible.
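The toolkit philosophy described above can be made concrete with a small illustrative pipeline (a hypothetical example, not one from the paper): counting the most frequent words in some input by connecting single-purpose standard tools, with the shell arranging all data communication between them.

```shell
# Classic word-frequency pipeline: each stage is a small program that
# concentrates on a single function; the shell connects them with pipes.
printf 'the cat sat on the mat the end\n' |
tr -s ' ' '\n' |   # split the input into one word per line
sort |             # bring identical words together
uniq -c |          # count each run of identical lines
sort -rn |         # order by count, highest first
head -3            # keep only the top three
```

None of these programs knows where its data comes from or goes to; each reads standard input and writes standard output, which is exactly what makes such ad hoc combinations possible.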
For decades, computer benchmarkers have fought a War of Means. Although many have raised concerns with the geometric mean (GM), it continues to be used by SPEC and others. This war is an unnecessary misunderstanding caused by inadequately articulated implicit assumptions, plus confusion among populations, their parameters, sampling methods, and sample statistics. In fact, all the Means have their uses, sometimes in combination. Metrics may be algebraically correct, but statistically irrelevant or misleading if applied to population distributions for which they are inappropriate. Normal (Gaussian) distributions are so useful that they are often assumed without question, but many important distributions are not normal. These require different analyses, most commonly by finding a mathematical transformation that yields a normal distribution, computing the metrics, and then back-transforming to the original scale. Consider the distribution of relative performance ratios of programs on two computers. The normal distribution is a good fit only when variance and skew are small; otherwise it generates logical impossibilities (such as a nonzero probability of negative ratios) and misleading statistical measures. A much better choice is the lognormal (or log-normal) distribution, not just on theoretical grounds, but through the (necessary) validation with real data. Normal and lognormal distributions are similar for low variance and skew, but the lognormal handles skewed distributions reasonably, unlike the normal. Lognormal distributions occur frequently elsewhere, are well understood, and have standard methods of analysis. Everyone agrees that "Performance is not a single number" ... and then argues about which number is better. It is more important to understand populations, appropriate methods, and proper ways to convey uncertainty.
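The transform-analyze-back-transform workflow, and the "logical impossibility" a normal fit produces for skewed ratios, can be sketched as follows. This is an illustrative example with synthetic data, not code from the paper; the ratios are simply drawn from a lognormal to stand in for real, right-skewed performance ratios.

```python
import math
import random

random.seed(1)
# Synthetic, right-skewed "performance ratios" (hypothetical data).
ratios = [random.lognormvariate(0.5, 0.8) for _ in range(1000)]

n = len(ratios)
mean = sum(ratios) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in ratios) / (n - 1))

# A fitted normal assigns real probability to impossible negative ratios:
# P(X < 0) = Phi((0 - mean) / sd), nonzero for any finite mean and sd.
z = -mean / sd
p_negative = 0.5 * math.erfc(-z / math.sqrt(2))

# The lognormal route: transform to logs, compute the metric there,
# then back-transform to the original (ratio) scale.
logs = [math.log(x) for x in ratios]
mu = sum(logs) / n
gm = math.exp(mu)  # back-transformed mean = geometric mean

print(f"normal fit:    P(ratio < 0) = {p_negative:.3f}")
print(f"lognormal fit: geometric mean = {gm:.2f}")
```

For data this skewed, the normal fit implies a substantial probability of a negative performance ratio, which is meaningless; the log-scale analysis has no such defect.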
When population parameters are estimated via samples, statistically correct methods must be used to produce the appropriate means, measures of dispersion, skew, confidence levels, and perhaps goodness-of-fit estimators. If the wrong Mean is chosen, it is difficult to achieve much. The GM predicts the mean relative performance of programs, not of workloads. The usual GM formula is rather unintuitive, and is often claimed to have no physical meaning. However, it is the back-transformed average of a lognormal distribution, as can be seen from the mathematical identity below. Its use is not only statistically appropriate in some cases, but enables straightforward computation of other useful statistics.

    GM(x_1, ..., x_n) = (x_1 * x_2 * ... * x_n)^(1/n) = exp((1/n) * SUM_i ln(x_i))

"If a man will begin in certainties, he shall end in doubts, but if he will be content to begin with doubts, he shall end with certainties." — Francis Bacon, in Savage.
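The identity above is easy to verify numerically, and it shows why the GM is the natural mean for log-scale analysis. The ratios below are made-up illustrative values, not data from the paper.

```python
import math

# Hypothetical relative-performance ratios of programs on two machines.
ratios = [1.2, 0.8, 2.5, 1.0, 3.1]
n = len(ratios)

# The usual formula: n-th root of the product.
product = 1.0
for x in ratios:
    product *= x
gm_product = product ** (1.0 / n)

# The identity: the back-transformed average of the logs, i.e. the mean
# of the fitted lognormal's underlying normal, mapped back to ratio scale.
gm_logs = math.exp(sum(math.log(x) for x in ratios) / n)

# Working on the log scale also yields other statistics directly, e.g. a
# multiplicative spread factor from the log-scale standard deviation.
g = math.log(gm_logs)
sigma = math.sqrt(sum((math.log(x) - g) ** 2 for x in ratios) / (n - 1))
spread = math.exp(sigma)

print(f"GM = {gm_logs:.4f}, multiplicative spread = x{spread:.3f}")
```

Both routes give the same GM to machine precision; the log-scale route additionally supports dispersion and confidence-interval computations on a scale where the data are (approximately) normal.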
In 50 years, we've already seen numerous programming systems come and (mostly) go, although some have remained a long time and will probably do so for: decades? centuries? millennia? The questions about language designs, levels of abstraction, libraries, and resulting longevity are numerous. Why do new languages arise? Why is it sometimes easier to write new software than to adapt old software that works? How many different levels of languages make sense? Why do some languages last in the face of "better" ones? We can gather insights from the last 50 years of programming systems to the current time. For the far future, Vernor Vinge's fine science-fiction novel, A Deepness in the Sky, rings all too true. The young protagonist, Pham, has joined a starship crew.
Many, if not most, UNIX* systems are dedicated to specific projects and serve small, cohesive groups of (usually technically oriented) users. The Programmer's Workbench UNIX system (PWB/UNIX for short) is a facility based on the UNIX system that serves as a large, general-purpose, "utility" computing service. It provides a convenient working environment and a uniform set of programming tools to a very diverse group of users. The PWB/UNIX system has several interesting characteristics:
(i) Many of its facilities were built in close cooperation between developers and users.
(ii) It has proven itself to be sufficiently reliable that its users, who develop production software, have abandoned punched cards, private backup tapes, etc.
(iii) It offers a large number of simple, understandable program-development tools that can be combined in a variety of ways; users "package" these tools to create their own specialized environments.
(iv) Most importantly, the above were achieved without compromising the basic elegance, simplicity, generality, and ease of use of the UNIX system.
The result has been an environment that helps large numbers of users to get their work done, that improves their productivity, that adapts quickly to their individual needs, and that provides reliable service at a relatively low cost. This paper discusses some of the problems we encountered in building the PWB/UNIX system, how we solved them, how our system is used, and some of the lessons we learned in the process.
* UNIX is a trademark of Bell Laboratories.