Shelves of Technical Books

Bibliographies
Computer Science

2025-11-22 17:07 -0800

  
see also:
Readings in Personal Computing
Readings in Software Tools
Readings in Information Processing
Readings in Software Engineering
Readings in System Architecture and Design
Readings in Programming Systems and Languages
Readings in Functional Programming Systems

[Ashenhurst1987]
Ashenhurst, Robert L., Graham, Susan L. (eds.).  ACM Turing Award Lectures: The First Twenty Years 1966-1985.  ACM Anthology Series.  ACM Press (New York: 1987).  ISBN 0-201-07794-9.
     The complete list is maintained on the ACM web site.
   Content
     Authors' Biographies
     Preface
     Introduction to Part I Programming Languages and Systems (Susan L. Graham)
          1966 The Synthesis of Algorithmic Systems (Alan J. Perlis)
          1972 The Humble Programmer (Edsger W. Dijkstra)
          1974 Computer Programming as an Art (Donald E. Knuth [1974])
          1976 Logic and Programming Languages (Dana S. Scott)
          1977 Can Programming Be Liberated from the von Neumann Style? (John Backus)
          1978 The Paradigm of Programming (Robert W. Floyd)
          1980 The Emperor's Old Clothes (Charles Anthony Richard Hoare)
          1983 Reflections on Trusting Trust (Ken Thompson)
          1984 From Programming Language Design to Computer Construction (Niklaus Wirth)
     Introduction to Part II Computers and Computing Methodologies (Robert L. Ashenhurst)
          1967 Computers Then and Now (Maurice V. Wilkes)
          1968 One Man's View of Computer Science (R. W. Hamming [1969])
          1969 Form and Content in Computer Science (Marvin Minsky)
          1970 Some Comments from a Numerical Analyst (J. H. Wilkinson)
          1971 Generality in Artificial Intelligence [Postscript] (John McCarthy)
          1973 The Programmer as Navigator (Charles W. Bachman)
          1975 Computer Science as Empirical Inquiry: Symbols and Search (Allen Newell and Herbert A. Simon)
          1976 Complexity of Computations (Michael O. Rabin)
          1979 Notation as a Tool of Thought (Kenneth E. Iverson)
          1981 Relational Database: A Practical Foundation for Productivity (E. F. Codd [1982])
          1982 An Overview of Computational Complexity (Stephen A. Cook)
          1985 Combinatorics, Complexity, and Randomness (Richard M. Karp)
                   Piecing Together Complexity [Postscript] (Karen Frenkel)
                   Complexity and Parallel Processing: An Interview with Richard Karp [Postscript] (Karen Frenkel)
     Index According to ACM Computing Reviews Classification Scheme
     Name Index
     Subject Index
          
[Backus1959]
Backus, J. W.  The Syntax and Semantics of the Proposed International Algebraic Language of the Zurich ACM-GAMM Conference.  Proceedings of the International Conference on Information Processing (UNESCO, 1959), 125-132.  Available on the Internet (typewritten preprint).
  The IAL became known as ALGOL 58 and the notation Backus introduced was adapted by Peter Naur for the ALGOL 60 report.  The notation became known as BNF, now taken as short for "Backus Naur Form."
  
[Backus1978]
Backus, John.  Can programming be liberated from the von Neumann style?  a functional style and its algebra of programs.  Comm. ACM 21, 8 (August 1978), 613-641.  Turing Award Lecture.   Available on the Internet at <doi:10.1145/359576.359579>.
     
[Baker1978]
Baker, Henry G. List processing in real time on a serial computer.  Comm. ACM 21, 4 (April 1978), 280-294.  <doi:10.1145/359460.359470>
   Abstract: "A real-time list processing system is one in which the time required by the elementary list operations (e.g. CONS, CAR, CDR, RPLACA, RPLACD, EQ, and ATOM in LISP) is bounded by a (small) constant. Classical implementations of list processing systems lack this property because allocating a list cell from the heap may cause a garbage collection, which process requires time proportional to the heap size to finish. A real-time list processing system is presented which continuously reclaims garbage, including directed cycles, while linearizing and compacting the accessible cells into contiguous locations to avoid fragmenting the free storage pool. The program is small and requires no time-sharing interrupts, making it suitable for microcode. Finally, the system requires the same average time, and not more than twice the space, of a classical implementation, and those space requirements can be reduced to approximately classical proportions by compact list representation. Arrays of different sizes, a program stack, and hash linking are simple extensions to our system, and reference counting is found to be inferior for many applications."
   [dh:2025-11-22 This is the fundamental Baker algorithm.  I need to go through it more carefully to grasp the specifics and see how it distills down to a workable storage-management scheme for oMiser list structures.  I also think there is something more around generational garbage-collection algorithms and I need to find it.  I will need to see whether [Lieberman1983] improves on this and whether there is yet more (apart from the insight to have heap-0 be on the runtime stack).]
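   The bounded-pause idea is easier to see against the non-incremental baseline: Baker's collector is essentially Cheney's two-space copying collector with the copying work metered out a few cells per CONS.  Below is a minimal sketch of that baseline only (stop-and-copy over a toy cons heap; the incremental metering and read barrier are what the paper adds).  The representation and names are illustrative, not from the paper.

```python
def cheney(heap, roots):
    """Stop-and-copy (Cheney) collection over a toy cons heap.
    heap: {index: (car, cdr)}; a pointer is ('ptr', index); anything else
    is an atom.  Returns the compacted heap and the relocated roots.
    Baker's contribution is doing this work incrementally, a bounded
    amount per allocation, rather than all at once."""
    new = {}        # to-space, addressed 0..n-1 (compacted)
    forward = {}    # forwarding addresses for already-moved cells

    def move(v):
        if not (isinstance(v, tuple) and v[0] == 'ptr'):
            return v                      # atoms move unchanged
        i = v[1]
        if i not in forward:
            forward[i] = len(new)
            new[forward[i]] = heap[i]     # shallow copy; fields fixed in scan
        return ('ptr', forward[i])

    new_roots = [move(r) for r in roots]
    scan = 0
    while scan < len(new):                # Cheney's scan pointer
        car, cdr = new[scan]
        new[scan] = (move(car), move(cdr))
        scan += 1
    return new, new_roots
```

   Note that unreachable cells are never touched, and cycles are handled by the forwarding table.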
  
[Baker1990]
Baker, Henry G. Efficient implementation of bit-vector operation in Common Lisp.  ACM SIGPLAN Lisp Pointers 3, 2-4 (April 1990), 8-22.  <doi:10.1145/121989.121991>.
   From the Abstract: "In this paper we show various techniques for the efficient implementation for the various functions of Common Lisp involving bit-vectors and bit-arrays. ... [The] efficient manipulation of bit-vectors in modern computers represents a curious point in the spectrum of data processing tasks.  On the one hand, the possibility of packing many bits within a single computer word and operating on all of them in parallel offers a potential for speedup not usually available for other types of tasks. ... [The] efficient implementation of bit-vector operations requires special-case code, and is an interesting challenge in ingenuity and engineering.
   "[We] show how word-wide parallelism can be utilized for almost all of [the bit-vector operations in Common Lisp] to achieve speedups.  ... The major exception will be Common Lisp's 'SEARCH' function for bit-vectors, which can also be speeded up [Baker1989]."
   [dh:2025-11-22 This is rather intriguing when seen alongside [Baker1990b].  It would be remarkable if something like this could somehow be introduced into Miser, though I am not confident about oMiser.]
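   The word-wide parallelism being described can be illustrated with a toy packing scheme (illustrative names only; this is not the paper's Common Lisp code): bits are packed 64 to a machine word, so a single logical operation on one word processes 64 vector elements at once.

```python
WORD = 64

def pack(bits):
    """Pack a sequence of 0/1 values into integers, 64 bits per word."""
    words = [0] * ((len(bits) + WORD - 1) // WORD)
    for i, b in enumerate(bits):
        if b:
            words[i // WORD] |= 1 << (i % WORD)
    return words

def bv_and(a, b):
    """Elementwise AND: one integer operation covers 64 bit positions."""
    return [x & y for x, y in zip(a, b)]

def popcount(words):
    """Total number of set bits, counted a word at a time."""
    return sum(bin(w).count("1") for w in words)
```

   The special-case ingenuity the abstract mentions shows up at the edges this sketch glosses over: partial final words, bit offsets, and operations (like SEARCH) that do not decompose wordwise so neatly.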
  
[Baker1990b]
Baker, Henry G. Unify and conquer. pp. 218-226 in LFP'90: Proceedings of the 1990 ACM Conference on LISP and Functional Programming (Nice, France: June 1990).  <doi:10.1145/91556.91652>.
   From the Abstract: Type inference is the process by which an expression in an untyped computer language such as the lambda-calculus, Lisp, or a functional language can be assigned a static data type in order to improve the code generated by a compiler.  Storage use inference is the process by which a program in a computer language can be statically analyzed to model its run-time behavior, particularly the containment and sharing relations among its run-time data structures.  The information generated by storage use inference can also be used to improve the code generated by a compiler, because knowledge of the containment and sharing relations of run-time data structures allows for methods of storage allocation and deallocation which are cheaper than garbage-collected heap storage and allows for the in-place updating of functional aggregates. ... [The] best-known and best-understood of the type inferencing algorithms--Milner's unification method for ML--already generates valuable sharing and containment information ... with no additional overhead during unification; however there is some additional work necessary to extract this information. ... [Our] scheme seems only to work for functional languages like pure Lisp.
   [dh:2025-11-22 This is an intriguing situation.  It might not be workable with regard to type inference from pure functional types, such as that provided by combinators, and it might require distinguished (base) types already.  I take this as something fruitful to explore.]
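   For reference, the unification step the paper piggy-backs on is quite small.  A minimal sketch of Milner-style term unification follows (representation is illustrative; the occurs check, and the sharing/containment extraction that is the paper's actual subject, are omitted):

```python
def walk(t, s):
    """Chase substitution bindings for a type variable."""
    while isinstance(t, str) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    """Extend substitution s so that types a and b become equal, or
    return None on failure.  Type variables are strings; constructed
    types are tuples such as ('->', argument, result)."""
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if isinstance(a, str):                 # bind a free variable
        return {**s, a: b}
    if isinstance(b, str):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):     # unify componentwise
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None
```

   The paper's observation is that the bindings accumulated here already encode sharing information among the unified structures, available with no extra work during unification itself.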
  
[Baker1991]
Baker, Henry G. Shallow binding makes functional arrays fast.  ACM SIGPLAN Notices 26, 8 (August 1991), 145-147.  <doi:10.1145/122598.122614>
   From the Abstract: Access and update for random elements of arrays in imperative programming languages are O(1) operations.  Implementing functional programming languages to achieve equivalent efficiency has proved difficult.  We show how the straightforward application of shallow binding to functional arrays automatically achieves O(1) update for single-threaded usage.
   [dh:2025-11-22 This is a general problem arising from the absence of assignment operations: any alteration of an aggregate involves transformation to a derived aggregate, regardless of how minuscule an individual change might be.  This applies to any kind of aggregated structure, not just arrays.  This article does not illustrate shallow binding itself, and other resources must be consulted.]
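   Since the article itself does not illustrate shallow binding, here is a sketch of the usual reading (illustrative, not from the paper): the most recent version holds the flat array, older versions hold difference records, and accessing a version "reroots" the array to it, which makes single-threaded get/set effectively O(1).

```python
class FArray:
    """Functional array via shallow binding.  The current version holds
    the flat list; every other version holds a (index, displaced value,
    next version) difference node.  Illustrative sketch only."""

    def __init__(self, values):
        self.contents = ("array", list(values))

    def _reroot(self):
        # Move the flat array to this version, reversing diff links on the way.
        if self.contents[0] == "array":
            return self.contents[1]
        _, i, old, nxt = self.contents
        arr = nxt._reroot()
        nxt.contents = ("diff", i, arr[i], self)   # invert the link
        arr[i] = old
        self.contents = ("array", arr)
        return arr

    def get(self, i):
        return self._reroot()[i]

    def set(self, i, v):
        """Return a new version with element i set to v; self stays valid."""
        arr = self._reroot()
        new = FArray.__new__(FArray)
        new.contents = ("array", arr)
        self.contents = ("diff", i, arr[i], new)   # record displaced value
        arr[i] = v
        return new
```

   Single-threaded use (always touching the newest version) never pays for rerooting; ping-ponging between old versions degrades, which matches the paper's "single-threaded usage" qualification.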
 
[Baker1992]
Baker, Henry G. CONS should not CONS its arguments, or, a lazy alloc is a smart alloc.  ACM SIGPLAN Notices 27, 3 (March 1992), 24-34.  <doi:10.1145/130854.130858>
  From the Abstract: Lazy allocation is a model for allocating objects on the execution stack of a high-level language which does not create dangling references.  Our model provides safe transportation into the heap for objects that may survive the deallocation of the surrounding stack frame.  Space for objects that do not survive the deallocation of the surrounding stack frame is reclaimed without additional effort when the stack is popped.
   [dh:2025-11-22 This is precisely the technique that I am interested in for the Miser Project.  [Baker1995] may provide a core aspect of the technique, and this 1992 article may address additional questions about ensuring no references from outside the retreated stack frame are left dangling.]
  
[Baker1992d]
Baker, Henry G. Less complex elementary functions.  ACM SIGPLAN Notices 27, 11 (November 1992), 15-16.
   From the Abstract:  "We give formulae for complex elementary functions in which the real and imaginary parts can be computed separately using only real operations.  Many of these formulae are obvious but some--e.g., asin(z), ..., atanh(z)--are not obvious and/or not well-known."
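   A trivial instance of the pattern, for orientation (exp(z) is one of the obvious cases, not one of the paper's subtle ones): exp(x+iy) = e^x cos y + i e^x sin y, so both parts need only real operations.

```python
import cmath
import math

def cexp(x, y):
    """Real and imaginary parts of exp(x + iy) using only real
    operations: exp(x+iy) = e^x cos y + i e^x sin y."""
    r = math.exp(x)
    return (r * math.cos(y), r * math.sin(y))
```

   The paper's interest is in the cases (asin, atanh, ...) where the corresponding real-only decomposition is far less evident and where naive formulae misbehave numerically.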
 
[Baker1993b]
Baker, Henry G.  The Boyer benchmark meets linear logic.  ACM SIGPLAN Lisp Pointers, 6, 4 (October 1993), 3-10.  Available on the Internet at <doi:10.1145/181889.181890>.
   From the abstract: "Of the Gabriel Lisp Benchmarks, the Boyer Benchmark ("Boyer") is the most representative of real AI applications, because it performs ... rule-directed rewriting and because it relies heavily on garbage collection (GC) for the recovery of storage.  We investigated the notion that such programs are unsuitable for explicit storage management."
   The Boyer benchmark has several bugs that are corrected in this paper.  The paper ends with "A short Tutorial on 'Linear' Lisp."  [dh:2025-11-22 This appears to be making procedures tail-recursive and I'm uncertain about the necessity of the linear logic approach that involves explicit "killing" of variables.  It seems easier to specify as a tail-recursive form to start with.  There does seem to be an opportunity to effectively de-reference the original argument by essentially reusing the "frame."  This needs exploration.] 
   [dh:2025-11-22 This is as much about Functional Programming as a general Computer Science topic.  It is about the nuts-and-bolts under the covers of functional-style operations, and the availability of benchmark stress tests is of particular interest.  The [Gabriel1985: Boyer Benchmark] is useful as a stress test for Functional Programming approaches, including that of the Miser Project, but transposition into an oFrugal script may be challenging.]
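   The tail-recursive rewriting mentioned in the note above is the standard accumulator transformation, sketched here for a trivial function (this is not the paper's linear-logic discipline with explicit "killing" of variables, and CPython does not actually reuse the frame, but the shape of the transformation is the point):

```python
def reverse_rec(xs):
    # Naive recursion: work remains after the recursive call returns,
    # so every element holds a stack frame open.
    if not xs:
        return []
    return reverse_rec(xs[1:]) + [xs[0]]

def reverse_tail(xs, acc=None):
    # Tail form: the recursive call is the last action and all pending
    # state travels in acc, so the caller's frame could in principle be
    # reused (as a tail-call-optimizing implementation would do).
    if acc is None:
        acc = []
    if not xs:
        return acc
    return reverse_tail(xs[1:], [xs[0]] + acc)
```

   The opportunity noted above, effectively de-referencing the original argument by reusing the frame, is what a linear discipline makes explicit: once xs[1:] is passed along, the old binding is dead and its storage is free for reuse.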
  
[Baker1994]
Baker, Henry G. Thermodynamics and garbage collection.  ACM SIGPLAN Notices 29, 4 (April 1, 1994), 58-63.  Available on the Internet at  <doi:10.1145/181761.181770>.
   Abstract: "We discuss the principles of statistical thermodynamics and their application to storage management problems.  We point out problems which result from imprecise usage of the terms information, state, reversible, conservative, etc."
  
[Baker1994b]
Baker, Henry G. Minimizing reference count updating with deferred and anchored pointers for functional data structures.  ACM SIGPLAN Notices 29, 9 (September 1, 1994), 38-43.  Available on the Internet at <doi:10.1145/185009.185016>.
   From the abstract: "Reference counting can be an attractive form of dynamic storage management.  It recovers storage promptly and (with a garbage stack instead of a free list) it can be made "real-time"--i.e., all accesses can be performed in constant time.  Its major drawbacks are its inability to reclaim cycles, its count storage, and its count update overhead.  Update overhead is especially irritating for functional (read-only) data ... .  We show how reference count updating can be largely eliminated for functional data structures ... ."
  
[Baker1995]
Baker, Henry G.  CONS should not CONS its arguments, part II: Cheney on the M.T.A.  ACM SIGPLAN Notices 30, 9 (September 1, 1995), 17-20. Available on the Internet at <doi:10.1145/214448.214454>.
   Part I is [Baker1992].
  
[Biermann1997]
Biermann, Alan W.  Great Ideas in Computer Science: A Gentle Introduction.  ed.2.  MIT Press (Cambridge, MA: 1990, 1997).  ISBN 0-262-52223-3 pbk. alk. paper
     "This is a book about computers--what they are, how they work, what they can do, and what they cannot do.  It is written for people who read about such topics as computers networks or artificial intelligence and want to understand them, for people who need to have data processed on the job and want to know what can and cannot be done, and for people who see the proliferation of computers throughout society and ask about the meaning of it all.  It is written for doctors, lawyers, preachers, teachers, managers, students, and all others who have a curiosity to learn about computing.  It is also written for computer science students and professionals whose education may not have covered all the important areas, and who want to broaden themselves." -- from the Preface, p.xv.
     "This book was written on the assumption that intelligent people can understand every fundamental issue of computer science if the preparation and explanation are adequate. ...
     "Because casual readers may not wish to read all the chapters, the book is designed to encourage dabbling.  Readers are encouraged to jump to any chapter at any time and read as much as is of interest. ...
     "... Chapter sections labeled A include only introductory material and make few demands on the reader.  One can get an overview of the book in a single evening by reading them.  The B sections are the primary material of the book and may require substantial time and effort, but the reader who completes them will have a deep understanding of the major lessons on that topic.  The C material answers questions that careful readers may ask and supplements the main portions of the book."  -- from the Introduction, p.xxiv.
     I just became aware of this book out of recent discussions of my experience seeing how programming languages are taught and on how the principles of computing (as opposed to the technology du jour) are introduced.  I am very much in favor of this approach and the proposal that computing be made accessible to anyone to the degree that they choose to explore it.  I strongly favor any approach that promotes readers and students distilling out the ideas behind the practices and concrete examples.
     This book offers a practical approach that encourages the reader to sit down at a computer and explore and confirm what is presented.  The book is not a programming text -- programs and programming are introduced as vehicles of demonstration.  I am keenly interested in this hands-on approach and in how that can be used to illustrate principles without having the principle associated too tightly (or even be confused with) the technology that is applied as the instrument of illustration.
     A subset of Pascal is used, in the spirit of [Sedgewick1989], and the examples are developed in Turbo Pascal.  There is a single page devoted to what is needed and how to get started, including the following wonderful paragraph: "The [computer system] manual is necessary because it will tell you which buttons to push on your computer to turn it on and to get your program typed in and running.  Finally, you need an instructor or friend to tell you what the manual did not.  Computer Science, like many disciplines, has an oral tradition, and some of the most important facts are passed only by word of mouth.  You may be able to get along without this help, but learning is usually easier if it is there (p.11)."
     The recognition of the amount of tacit knowledge that is assumed in even the most elementary treatment of computing, as far as hands-on treatment goes, is very welcome.  My experience is that more is called for here, though I can accept the author not choosing to take on that task as part of his project.  Now that Pascal has gone out of favor and economical implementations are no longer commonplace, it will take more work to apply the examples.  This has me wondering whether there is an advantage to two levels of exposition -- something like design plus representative implementation -- if there is going to be provision for hands-on confirmation, demonstration, and exploration that can endure technology fashions while also accentuating how the ideas are manifest in the concrete examples. -- dh:2003-07-31
   Content
     Preface to the Second Edition
     Preface
     Studying Academic Computer Science: An Introduction

     1. An Introduction to Programming: Coding Decision Trees
     2. Text Manipulation and Algorithm Design
     3. Numerical Computation and a Study of Functions
     4. Top-Down Programming, Subroutines, and a Database Application
     5. Simulation
     6. Software Engineering
     7. Electric Circuits
     8. Machine Architecture
     9. Language Translation
     10. Virtual Environments for Computing
     11. Computer Communications
     12. Program Execution Time
     13. Parallel Computation
     14. Noncomputability
     15. Artificial Intelligence
     Appendix A.  The Rules for the Subset of Pascal Used in this Book
     Appendix B.  Two Simulation Programs
     Readings
     Index

      
[Biermann2002]
Biermann, Alan W., Ramm, Dietolf.  Great Ideas in Computer Science with Java.  MIT Press (Cambridge, MA: 2002).  ISBN 0-262-02497-7 pbk. alk. paper.
     There is an errata at http://www.cs.duke.edu/~dept/Great_Ideas_with_Java/errata.html.
     "This book is written for people who find themselves on the outside of a high-technology world and who want to find a way in. ...
     "This book presents the story of computer science organized around the theme of the 'great ideas' of the field.  These are the discoveries, the paradigms, the simplifying equations that attract the attention of everyone who comes near, and they provide the best-known paths to understanding. ...
     "The method of teaching is by doing rather than just by reading and talking.  The theory is that if you personally go through the steps of creating a program to do something or if you hand-simulate a process, that mechanism and the related ideas will get built into your body as well as your mind.  They will become instinctive as well as intellectually known. ... This is the problems-oriented approach to teaching that has been so successful in the traditional quantitative sciences." -- from the Preface, pp. xiii-xiv.
     A number of alterations have appeared since the previous version [Biermann1997].  First, scholarly publishing is losing its memory -- there is no recognition of prior editions in the history of the book and certainly not in the copyright notice -- a recent practice about which I have my suspicions and my doubts.  And the Preface completely disregards the possibility that we have been here before.
      Secondly, the language of choice for illustration of computer science is now Java.  With regard to availability, support, and high interest, that is difficult to fault.  It is also an avenue to the currently-favored approach for GUI and graphics work as well as a way of exploring an object-oriented technology (OOT).
      The "gentle introduction" isn't quite so gentle, and the immediate support for hands-on experience does not come across so well nor so early as in chapters 1-2 of the previous edition.  It seems that this book is seriously intended as a textbook with a companion laboratory or instructional support for the nitty-gritty, working-at-a-keyboard parts.  I miss the promise of an introduction for anyone with the interest and curiosity.
      There is something less situated about the introduction of Java and starting with applets so unceremoniously.  I am also concerned with having to deal with OOT so early just because there is no other way to accomplish anything useful in Java.  Here the neophyte may never know, in a clear way, whether we are doing OOP or simply using the OOT to accomplish some straightforward OOP-indifferent task by idiomatic application of the object-oriented technology (e.g., as with main(), or java.lang.Math, or init()-as-main()).  And identifying what we are actually doing when we extend a GUI-component class, and how that ties to OOP, takes some masterful explication (not included here -- interface ActionListener implementation begins on p.86 and is nowhere addressed that I can find).  That it cost the readers of this book an introduction to text processing and algorithm design is unfortunate.
     Having said all of that, I strongly favor a hands-on, experience-it and demonstrate-it-yourself approach.  There is much value here.  And, I think, there's more to accomplish in execution on the promise to demonstrate the great ideas in a way that brings them to life for people.  It would be useful to not have the tool be in the way. --dh:2003-07-31
     An updated, extended version of this précis appears on amazon.com. --dh:2003-10-17
   Content
     Preface
     Studying Academic Computer Science: An Introduction
     1. The World Wide Web
      2. Watch Out: Here Comes Java
     3. Numerical Computation and a Study of Functions
     4. Top-Down Programming, Subroutines and a Database Application
     5. Graphics, Classes, and Objects
     6. Simulation
     7. Software Engineering
     8. Machine Architecture
     9. Language Translation
     10. Virtual Environments for Computing
     11. Security, Privacy, and Wishful Thinking
     12. Computer Communication
     13. Program Execution Time
     14. Parallel Computation
     15. Noncomputability
     16. Artificial Intelligence
     Appendix: The IntField and DoubleField Classes
     Readings
     Index
   
[Bird1997]
Bird, Richard., de Moor, Oege.  Algebra of Programming.  Prentice-Hall (Harlow, England: 1997).  ISBN 0-13-507245-X.
     "[This book] codifies the basic laws of algorithmics, and shows how they can be used to classify many ingenious and important programs into families related by the algebraic properties of their specifications.  The formulae and equations that you will see here share the elegance of those which underlie physics or chemistry or any other branch of basic science; and like them, they inspire our interest, enlarge our understanding, and hold out promise of enduring benefits in application."  From the Foreword by Tony Hoare, p. ix.
   Content
     Foreword
     Preface

     1. Programs
     2. Functions and Categories
     3. Applications
     4. Relations and Allegories
     5. Datatypes in Allegories
     6. Recursive Programs
     7. Optimisation Problems
     8. Thinning Algorithms
     9. Dynamic Programming
     10. Greedy Algorithms
     Appendix
     Bibliography
     Index
   
[Bratman1961]
Bratman, H.  An alternate form of the "UNCOL diagram."  Comm. ACM 4, 3 (March 1961), 142.
   The basic idea was to illustrate, in a building-block manner, how a compiler could be used to produce programs, including compilers for the same or different language.
The specific Universal Common Language (UNCOL) case was an early idea to have common back-ends that would compile from an intermediate language and achieve portable implementations of programming languages by compiling to the UNCOL intermediate.  And I have probably over-simplified the whole thing.  Oddly, the use of a common, often interpretive, run-time has satisfied this requirement in the case of Java and especially .NET.  I do use the diagramming technique though.
   At the Turing Centenary celebration in San Francisco, I approached Charles Bachman and thanked him for his great diagramming technique.  He didn't know what I was talking about, of course.  I had misremembered the name of this source.
  
[Brookshear2003]
Brookshear, J. Glenn.  Computer Science: An Overview.  ed.7.  Addison-Wesley (Boston: 2003).  ISBN 0-201-78130-1 pbk.
     2002-11-18: This book is designed and promoted as a textbook for module CS0, Introduction to Computer Science.  The traditional designation (introduced in Computer Science Curriculum 78) of courses CS1, CS2, ... commences with programming.  In breadth-first approaches, there is an overview of computer science topics that precedes the first collegiate programming course.  The overview can be designed for non-majors as well.  This is apparently CS0, or, in the terminology for Computing Curriculum 2001, CS100b.  Providing this breadth-first overview for non-majors and majors alike is the express purpose of this book.
     "I wrote this text for both computer science majors and students from other disciplines.  As for computer science majors, most begin their studies with the illusion that computer science is programming and Web browsing since that is essentially all they have seen.  Yet computer science is much more than this." -- From the preface, p.v.
     2002-11-02, -11-28: When I first learned that this book is used for the mandatory Computer Structures course of the University of Liverpool M.Sc in Information Technology on-line degree, I checked it out on amazon.com.  I was a little surprised at the cross-section of the reader reviews.  There were those -- apparently practitioners -- who disliked the book for the absence of programming and software approaches.  Many of these dismissed it as too elementary.  There was a second group who found it valuable in preparation for examinations in introductory computer science.  And then there were those who raved about the book because it gave them grounding and insights that they had been seeking.  It would seem that some did not check to see what the author's intention is.  I expect to be satisfied.  One odd thing though.  As soon as I ordered the book, amazon.com began recommending other books of an elementary nature that are considered good study guides for various introductory-subject examinations.  Their recommendations have lately returned to something more like my typical interests, however.
     The author's web site provides an errata for this edition.
     The publisher has a web site for the book, with links, PowerPoint slides, and supplementary materials.
   Content
     Preface
          0. Introduction
     Part One: Machine Architecture
          1. Data Storage
          2. Data Manipulation
     Part Two: Software
          3. Operating Systems and Networks
          4. Algorithms
          5. Programming Languages
          6. Software Engineering
     Part Three: Data Organization
          7. Data Structures
          8. File Structures
          9. Database Structures
     Part Four: The Potential of Machines
          10. Artificial Intelligence
          11. Theory of Computation
     Appendixes
          A. ASCII
          B. Circuits to Manipulate Two's Complement Representations
          C. A Simple Machine Language
          D. High-Level Language Program Examples
          E. The Equivalence of Iterative and Recursive Structures
          F. Answers to Questions/Exercises
     Index
     
[Cheatham1964]
Cheatham, T. E. Jr., Sattley, Kirk.  Syntax-Directed Compiling.  Proceedings of the April 21-23 1964 Spring Joint Computer Conference (AFIPS: April 1964), 31-57.
   A general parser (syntax analyzer) for context-free languages is described.  What we now refer to as a lexical analyzer is identified as a recognizer layer.  The analyzer method is most easily recognized as a top-down parser with back-tracking.  It could even be used to produce all analyses available for an ambiguous input.  Refinements of this method that I have explored include pruning alternatives that cannot succeed if the input is to be valid.  This allows early analysis to be forwarded to a generation layer.  There also needs to be some assistance in the case of left-recursive productions, usually expedited by transformation into tail-iterative forms.
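   The analyzer style described (top-down with back-tracking, capable of producing all analyses of an ambiguous input) can be sketched in a few lines over a toy grammar (illustrative, not from the paper).  Note that, as remarked above, a left-recursive production would send this naive scheme into infinite regress.

```python
# Toy ambiguous grammar: S -> A B ; A -> "a" | "a a" ; B -> "a" | "a a"
GRAMMAR = {
    "S": [["A", "B"]],
    "A": [["a"], ["a", "a"]],
    "B": [["a"], ["a", "a"]],
}

def derive(symbols, tokens, i):
    """Yield every j such that `symbols` can derive tokens[i:j],
    back-tracking through all alternatives (so all analyses appear)."""
    if not symbols:
        yield i
        return
    first, rest = symbols[0], symbols[1:]
    if first in GRAMMAR:                             # nonterminal: expand
        for alt in GRAMMAR[first]:
            yield from derive(alt + rest, tokens, i)
    elif i < len(tokens) and tokens[i] == first:     # terminal: must match
        yield from derive(rest, tokens, i + 1)

def analyses(start, tokens):
    """Number of complete parses of the whole token sequence."""
    return sum(1 for j in derive([start], tokens, 0) if j == len(tokens))
```

   On "a a a" this grammar yields two analyses (A="a", B="a a" and A="a a", B="a"), which is the all-analyses behavior the paper's method supports; pruning would cut alternatives as soon as they cannot reach the end of the input.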
  
[Codd1970]
Codd, E.F.  A Relational Model of Data for Large Shared Data Banks.  Comm. ACM 13, 6 (June 1970), 377-387.  Available on the web as an ACM Classic (without Section 2).  A PDF of the complete paper is available here.
   
[Codd1971]
Codd, E.F.  Normalized Data Base Structure: A Brief Tutorial.  Proc. 1971 ACM SIGFIDET Workshop on Data Description, Access, and Control.  San Diego, California, November 11-12, 1971.
   
[Codd1972a]
Codd, E.F.  Further Normalization of the Data Base Relational Model.  in Rustin, Randall J.(ed.).  Data Base Systems.   Courant Computer Symposia Series, vol. 6.  Prentice-Hall (Englewood Cliffs, NJ: 1972).
     This is where the second and third normal forms were introduced.  Ron Fagin introduced the fourth and fifth normal forms.  Chris Date has introduced a sixth, but it is clear that the fifth is complete.
   
[Codd1972b]
Codd, E.F.  Relational Completeness of Data Base Sublanguages.  pp. 65-98 in Rustin, Randall J.(ed.).  Data Base Systems.   Courant Computer Symposia Series, vol. 6.  Prentice-Hall (Englewood Cliffs, NJ: 1972).
   
[Codd1974]
Codd, E.F.  Interactive Support for Nonprogrammers:  The Relational and Network Approach.  in Proc. ACM SIGMOD Workshop on Data Description, Access and Control, vol. II.  Ann Arbor, Michigan, May, 1974.
   
[Codd1979]
Codd, E.F.  Extending the Data Base Relational Model to Capture More Meaning.  ACM Trans. on Database Systems 4, 4 (December 1979), 397-434.  A PDF is available here.
   
[Codd1981]
Codd, E.F.  Data Models in Database Management.  ACM SIGMOD Record 11, 2 (Feb. 1981).
   
[Codd1982]
Codd, E.F.  The 1981 ACM Turing Award Lecture: Relational Databases -- A Practical Foundation for Productivity.  Comm. ACM 25, 2 (February, 1982).
   
[Date2001]
Date, C.J.  The Database Relational Model: A Retrospective Review and Analysis. Addison-Wesley (Reading, MA: 2001).  ISBN 0-201-61294-1 pbk.  A historical account and assessment of E. F. Codd's contribution to the field of database technology.
     "This book consists in essence of a series of twelve articles, originally published in the print and online portions of the Miller Freeman magazine Intelligent Enterprise (Vol. 1, Nos. 1-3, and Vol. 2, Nos. 1-9, October 1998 onward).  The overall title for the series was '30 Years of Relational,' ... to celebrate the relational model's 30th birthday ... [and] to serve as a historical account and impartial analysis of E. F. Codd's (huge!) contribution to the field of database technology ... with a view to assessing their true significance and restating (and reinforcing) their message for a new generation of database professionals."  -- from the Preface.
   Content
     Preface
     1. The Birth of the Relational Model, Part 1 [October 1998]
     2. The Birth of the Relational Model, Part 2
     3. The Birth of the Relational Model, Part 3 [December 1998]
     4. Codd's Relational Algebra [January 5, 1999]
     5. Codd's Relational Calculus
     6. Data Sublanguage ALPHA, Part 1 [February 16, 1999]
     7. Data Sublanguage ALPHA, Part 2 [March 9, 1999]
     8. The First Three Normal Forms, Part 1 [March 30, 1999]
     9. The First Three Normal Forms, Part 2 [April 20, 1999]
     10. Relational Really Is Different [May 11, 1999]
     11. Extending the Relational Model [June 1, 1999]
     12. Relational Forever! [June 22, 1999]
     Appendix A. A Definition of the Relational Model
     Appendix B. References and Bibliography
     Index
   
[Dawson2000]
Dawson, Christian W.  The Essence of Computing Projects: A Student's Guide.  Pearson Education (Harlow, England: 2000).  ISBN 0-13-021972-X pbk.
   Content
     Preface
          1. Introduction: What Are Computing Projects?
     Part I: Setting Your Project's Foundation
          2. Choosing a Project and Writing a Proposal
          3. Project Planning
     Part II: Conducting Your Project
          4. Literature Searching and Literature Reviews
          5. Doing Your Project
     Part III: Presenting Your Project
          6. Presenting Your Project in Written Form
          7. Presentation Skills
          8. Final Considerations
     Bibliography
     Index

   
[DeMoor1997]
Bird, Richard., de Moor, Oege.  Algebra of Programming.  Prentice-Hall (Harlow, England: 1997).  ISBN 0-13-507245-X.  See [Bird1997]
   
[Denning2025]
Denning, Peter J. Abstractions.  Profession of IT column.  Comm. ACM 68, 3 (March 2025), 21-23. DOI:10.1145/3710809
 
[Earley1970]
Earley, Jay., Sturgis, Howard. A Formalism for Translator Interactions.  Comm. ACM 13, 10 (October 1970), 607-617.
   I became a strong fan of this kind of diagramming technique, though in the simpler form of [Bratman1961], which I have also managed to blend with the use of dataflow and HIPO diagrams.
    
[Ershov1972]
Ershov, Andrei P.  Aesthetics and the Human Factor in Programming.  Comm. ACM 15, 7 (July 1972), 501-505.  Archived on-line at http://doi.acm.org/10.1145/361454.361458.
     This is a wonderful find in the references of [Knuth1974].  I had read the article when it first appeared, but it lands much more clearly for me on rereading it now.  Andrei Ershov was a renowned computer scientist in Novosibirsk.  That he could observe this about the situation of programmers worldwide over 30 years ago demonstrates to me how out-of-balance our progress has been.  I don't want to diminish the substantial ground we have covered, yet it is valuable to notice that we are dealing with essentially the same issues and that our tools don't save us. -- dh:2003-03-22
   
[Evans1961]
Evans, A., Perlis, A.J., Van Zoeren, H.  The Use of Threaded Lists in Construction of a Combined ALGOL and Machine-Like Assembly Processor.  Comm. ACM 4, 1 (Jan. 1961), 36-41.  DOI <https://dl.acm.org/doi/10.1145/366062.366081>.
   Footnote 2 on p.37 describes "mindelayscandoer" as "scan and do translation to final code as quickly as possible." The idea of threaded lists does not apply well to oMiser, although a counterpart might assist in oFrugal operator-precedence handling.  The oFrugal REPL is a true mindelayscandoer.
   
[Fagan1979]
Fagin, Ronald., Nievergelt, Jurg., Strong, H. Raymond.  Extendible Hashing -- A Fast Access Method for Dynamic Files.  ACM Transactions on Database Systems 4, 3 (September 1979), 315-344.
   This is a foundational paper that is cited in the contributions of others such as [Litwin1980] and [Larson1988].  It attends to factors, such as processor paging and caching, that might not be of such concern in relatively moderate applications.  The emphasis is on structuring data whose volume is allowed to grow and shrink by large factors on a dynamic basis.  There is a good summary of approaches and characteristics of dynamic file organization schemes.  Not all of this pertains to in-memory directories/repositories.  It is presumed that the data items being organized are relatively larger than the items in the directory.
     There is useful nomenclature: the key space, S, is whatever differentiates the items to be captured in the directory.  The hash function h on S maps into an address space A whose values are presumed to be fairly evenly distributed.  A is mapped onto a directory with buckets having approximately the same high load factor.  The notion of load factor is also specified.  There is reference to a "buddy" system for allocating buckets [Knuth1997:2.5C].  The basic idea of splitting buckets in half, or coalescing pairs of sparse buckets into one, is fundamental to this proposal and to subsequent proposals such as Linear Hashing, although there it is not done with adjacent (i.e., buddy) blocks and alternative ways might be needed.  Part of the buddy system's appeal is that it accommodates chunks of different (power-of-2) sizes, so it is about allocating other things than, say, simple list-processing cells.  Linear Hashing is about splitting and coalescing fixed buckets of directory entries, not necessarily of the extent of an item, just its directory entry.
     In designing a memory-management system, the various schemes must be walked through very carefully, with testing and instrumentation as well.  There also needs to be some smoothness of operation, as well as amortization of the burps all such schemes have.
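     The split/double mechanics can be sketched compactly.  The following toy Python rendition is my illustration, not the paper's algorithm; the tiny bucket capacity, the class names, and the use of Python's built-in hash are all assumptions made for the demonstration.  A directory of 2**global_depth slots points at buckets; a full bucket splits in two (raising its local depth), and the directory doubles only when the splitting bucket is already at the global depth.

```python
# Toy extendible hashing: directory doubling plus bucket splitting.
BUCKET_CAPACITY = 2  # tiny, so splits happen quickly in a demo

class Bucket:
    def __init__(self, depth):
        self.depth = depth   # local depth: hash bits this bucket distinguishes
        self.items = {}      # key -> value

class ExtendibleHash:
    def __init__(self):
        self.global_depth = 1
        self.directory = [Bucket(1), Bucket(1)]

    def _slot(self, key):
        return hash(key) & ((1 << self.global_depth) - 1)

    def get(self, key):
        return self.directory[self._slot(key)].items.get(key)

    def put(self, key, value):
        bucket = self.directory[self._slot(key)]
        if key in bucket.items or len(bucket.items) < BUCKET_CAPACITY:
            bucket.items[key] = value
            return
        # Full bucket: double the directory only if it is at global depth.
        if bucket.depth == self.global_depth:
            self.directory = self.directory + self.directory
            self.global_depth += 1
        # Split: redistribute over two buckets distinguishing one more bit.
        new_depth = bucket.depth + 1
        b0, b1 = Bucket(new_depth), Bucket(new_depth)
        for i, b in enumerate(self.directory):
            if b is bucket:
                self.directory[i] = b1 if (i >> (new_depth - 1)) & 1 else b0
        for k, v in bucket.items.items():
            self.directory[self._slot(k)].items[k] = v
        self.put(key, value)  # retry; may cascade if all items clumped
```

     The buddy-like character shows in the split: the two new buckets differ only in the newly distinguished bit, which is also what would let sparse pairs be coalesced again.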
  
[Gabriel1985]
Gabriel, Richard P. Performance and Evaluation of Lisp Systems.  MIT Press (Cambridge MA: 1985).  ISBN 9780262256193.  <doi.org:10.7551/mitpress/5298.001.0001>.
   This book is cited for some benchmarks that it includes, including the "Boyer Test."  Although the subject is a Functional Programming system, this is about implementations and relevant Computer Science topics, not so abstract as general Functional Programming application.
   From the abstract: "This final report of the Stanford Lisp Performance Study, conducted over a three year period by the author, describes implementation techniques, performance tradeoffs, benchmarking techniques, and performance results for all of the major Lisp dialects in use today. ... It provides detailed performance information using the tools of benchmarking (the process of utilizing standardized computer programs to test the processing power of different computer systems) to measure the various Lisp systems, and provide an understanding of the technical tradeoffs made during the implementation of a Lisp system.  The study is divided into three major parts.  The first provides the theoretical background, outlining the factors that go into evaluating the performance of a Lisp system.  The second part presents the Lisp implementations ... A final part describes the benchmark suite that was used during the major portion of the study and the results themselves."
    
[Graham1987]
Ashenhurst, Robert L., Graham, Susan L. (eds.).  ACM Turing Award Lectures: The First Twenty Years 1966-1985.  ACM Anthology Series.  ACM Press (New York: 1987).  ISBN 0-201-07794-9.  See [Ashenhurst1987]
   
[Graham1989]
Graham, Ronald L., Knuth, Donald E., Patashnik, Oren.  Concrete Mathematics: A Foundation for Computer Science.  Addison-Wesley (Reading, MA: 1989).  ISBN 0-201-14236-8.
   Content
     Preface
     A Note on Notation

     1. Recurrent Problems
     2. Sums
     3. Integer Functions
     4. Number Theory
     5. Binomial Coefficients
     6. Special Numbers
     7. Generating Functions
     8. Discrete Probability
     9. Asymptotics
     Appendix A.  Answers to Exercises
     Appendix B. Bibliography
     Appendix C. Credits for Exercises
     Index
     List of Tables
   
[Gulutzan1999]
Gulutzan, Peter., Pelzer, Trudy.  SQL-99 Complete, Really: An Example-Based Reference Manual of the New Standard.  R&D Books Miller Freeman (Lawrence KS: 1999).   ISBN 0-87930-568-1 pbk + CD-ROM.  See [Gulutzan1999] under Information Processing.
   
[Hamming1969]
Hamming, R.W.  One Man's View of Computer Science.  J. ACM 16, 1 (Jan. 1969), 3-12.  Archived on-line at http://doi.acm.org/10.1145/321495.321497.  Reprinted in [Ashenhurst1987: pp.207-218].
     "A number of observations and comments are directed toward suggesting that more than the usual engineering flavor be given to computer science. The engineering aspect is important because most present difficulties in this field do not involve the theoretical question of whether certain things can be done, but rather the practical question of how can they be accomplished well and simply.
     "The teaching of computer science could be made more effective by various alterations, for example, the inclusion of a laboratory course in programming, the requirement for a strong minor in something other than mathematics, and more practical coding and less abstract theory, as well as more seriousness and less game playing." -- from the Abstract
   
[Hayes2025]
Hayes, Catherine., Malone, David.  Questioning the Criteria for Evaluating Non-Cryptographic Hash Functions.  Comm. ACM 68, 2 (Feb. 2025), 46-51.
   This is an interesting study around conditions that apply for non-cryptographic hash functions, ones where the concern is how well items from a given distribution distribute into distinct buckets with minimal clumping.  The paper does not address the historically-common case of hashes used on fixed vocabularies, such as symbol tables, and whether or not the order in which items are first hashed can have any material impact, especially when arrivals are not uniformly distributed.  An interesting topic would be reconciliation of the cases here and the considerations of [Knuth1998b:6.4].
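   One generic way to quantify the clumping concern (my illustration, not necessarily among the paper's criteria) is to compare bucket occupancy against a uniform assignment with a chi-squared statistic.  The sequential-identifier key set, the bucket count, and the deliberately poor length-based "hash" below are all assumptions for the demonstration.

```python
# Score how a hash spreads a key set over buckets: chi-squared against
# the uniform expectation.  Values far above the bucket count indicate
# clumping relative to a uniform random assignment.

def chi_squared(keys, hash_fn, num_buckets):
    counts = [0] * num_buckets
    for k in keys:
        counts[hash_fn(k) % num_buckets] += 1
    expected = len(keys) / num_buckets
    return sum((c - expected) ** 2 / expected for c in counts)

# Non-uniform arrivals: sequential identifiers, a historically common case.
keys = [f"sym{i:05d}" for i in range(10000)]

good = chi_squared(keys, hash, 64)              # Python's built-in hash
bad = chi_squared(keys, lambda s: len(s), 64)   # all keys share one length
print(good, bad)  # the length "hash" lands every key in a single bucket
```

   For fixed vocabularies the same statistic can be computed exactly for the actual table size, which is one way the in-advance-known case differs from the streaming case the paper considers.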
  
[Hewitt1983]
Lieberman, Henry., Hewitt, Carl.  A real-time garbage collector based on the lifetimes of objects.  Comm. ACM 26, 6 (June 1983), 419-429.  See [Lieberman1983]
  
[Kent1982]
Kent, William.  A Simple Guide to Five Normal Forms in Relational Database Theory.  Comm. ACM 26, 2 (Feb. 1983), 120-125.  Also IBM Technical Report TR03.159, Aug. 1981.  Also presented at SHARE 62, March 1984, Anaheim, California.  Also in A.R. Hurson, L.L. Miller, and S.H. Pakzad, Parallel Architectures for Database Systems, IEEE Computer Society Press, 1989.
   
[Kent2000]
Kent, William.  Data and Reality.  ed.2.  The International Online Library.  (Bloomington, IN: 1978; 1998, 2000).  ISBN 1-58500-970-9.
   Content
     Preface to the Second Edition
     Preface
     1. Entities
     2. The Nature of an Information System
     3. Naming
     4. Relationships
     5. Attributes
     6. Types and Categories and Sets
     7. Models
     8. The Record Model
     9. The Other Three Popular Models
     10. The Modeling of Relationships
     11. Elementary Concepts: Another Model?
     12. Philosophy
     Bibliography
     Detailed Contents
   
[Knuth1961]
Knuth, Donald E.  Minimizing Drum Latency Time.  J. ACM 8, 2 (April 1961), 119-150.  Archived on-line at http://doi.acm.org/10.1145/321062.321063.
   
[Knuth1974]
Knuth, Donald E.  Computer Programming as an Art.  1974 Turing Award Lecture.  Comm. ACM 17, 12 (Dec. 1974), 667-673.  Reprinted in pp. 33-46 of [Ashenhurst1987] and in Chapter 1 of [Knuth1992].  Archived on-line at http://doi.acm.org/10.1145/361604.361612.
   
[Knuth1989]
Graham, Ronald L., Knuth, Donald E., Patashnik, Oren.  Concrete Mathematics: A Foundation for Computer Science.  Addison-Wesley (Reading, MA: 1989).  ISBN 0-201-14236-8.  See [Graham1989]
 
[Knuth1992]
Knuth, Donald E. «Literate Programming».  CSLI Lecture Notes Number 27.  Center for the Study of Language and Information (Palo Alto: 1992).  ISBN 0-937073-80-6 pbk.
   Content
     Acknowledgments
     Preface

     1. Computer Programming as an Art [A.M. Turing Award Lecture, 1974]
     2. Structured Programming with goto Statements [1974]
     3. A Structured Program to Generate All Topological Sorting Arrangements [with Jayme L. Szwarcfiter, 1974]
     4. Literate Programming [1984]
     5. Programming Pearls, by Jon Bentley: Sampling [1986]
     6. Programming Pearls, Continued: Common Words [1986]
     7. How to Read a WEB [1986]
     8. Excerpts from the Programs for TeX and METAFONT [1986]
     9. Mathematical Writing [1987]
     10. The Errors of TeX [1989]
     11. The Error Log of TeX [1991]
     12. An Example of CWEB [1990]
     Further Reading
     Index
 
[Knuth1993]
Knuth, Donald E.  Artistic Programming.  This Week's Citation Classic.  Current Contents, Physical, Chemical & Earth Sciences 33, 34 (23 August 1993), 8.  Reprinted in [Knuth1996: Chapter 15].
     This short article provides some interesting context on the writing of the Art of Computer Programming [ACP: now [Knuth1997], [Knuth1998], [Knuth1998b], and subsequent volumes and revisions to appear].   In the summer of 1962 I was on the Univac team for which Don developed a small Fortran compiler using the small-compiler techniques that he and others had mastered.  There is also a hint of the degree to which Knuth pursues beauty in his work and as something to preserve in the world, including his 11-year effort to restore mathematical typography, an ultimate tool-building act.  -- dh:2003-03-22
   
[Knuth1993b]
Knuth, Donald E. The Stanford GraphBase: A Platform for Combinatorial Computing.  Addison-Wesley (Boston: 1993).  ISBN 978-0-321-60632-7 pbk.
   "This award-winning book demonstrates the art of literate programming with more than 30 examples. ... The programs contain state-of-the-art explanations of many important algorithms and data structures.  They also define a workbench for combinatorial computing, and standard sets of data that can be used for benchmark tests of computing methods, as well as demonstration programs and games that make use of the data." (Author's note with additional information including Errata and considerations that impact x64 implementations.)
   Andreas Scherer has provided a GitHub SGB repository based on the 2002-01-30 release of the Stanford GraphBase software and data.  Adjustments in the form of patches are provided in separate files there.  Orientation is generally toward use on Unix systems, with standard Unix tools.  The source programs are all in *.w (Web) files that require a CWEB installation.  The code is C, but in a rather classical [K&R] style.  Adjustments to more-modern levels of the C Language standards (ANSI and ISO) are provided in patches.  These factors make it important to have the book and learn to read the published form of the Literate Programming documentation (nicely explained in the book).  There are some handy data files for testing the programs, and other algorithms that the SGB ones may inspire.

  
[Knuth1996]
Knuth, Donald E. Selected Papers on Computer Science. CSLI Lecture Notes Number 59.  Center for the Study of Language and Information (Palo Alto: 1996). ISBN 1-881526-91-7 pbk.
     I opened this up because it contains Don's appreciation of the IBM 650. There is more here, including an example from Bishop's Constructive Analysis (almost on p.100) that put me over the top on the purchase decision.  There is much said here about algorithms, and that matters in The Miser Project, too.  dh:2000-07-18.
     For a related discussion, see my note, Do Programs Teach Algorithms? -- dh.
   Content
     Preface
     Acknowledgments

     0. Algorithms, Programs, and Computer Science [1966; 1992]
     1. Computer Science and its Relation to Mathematics [1973; 1974]
     2. Mathematics and Computer Science: Coping with Finiteness [1976]
     3. Algorithms [1977]
     4. Algorithms in Modern Mathematics and Computer Science [1981]
     5. Algorithmic Themes [1988]
     6. Theory and Practice, I [1977]
     7. Theory and Practice, II [1985]
     8. Theory and Practice, III [1986]
     9. Theory and Practice, IV [1989]
     10. Are Toy Problems Useful? [1977]
     11. Ancient Babylonian Algorithms [1972; 1976]
     12. Von Neumann's First Computer Program [1970]
     13. The IBM 650: An Appreciation from the Field [1986]
     14. George Forsythe and the Development of Computer Science [1972]
     15. Artistic Programming [1993]
     Index
 
[Knuth1997]
Knuth, Donald E.  The Art of Computer Programming, vol.1: Fundamental Algorithms. ed.3.  Addison Wesley Longman (Reading, MA: 1968, 1973, 1997).  ISBN 0-201-89683-4.
   Contents
     Preface
     Preface to the Third Edition
     Procedure for Reading This Set of Books
     Notes on the Exercises

     Chapter 1 - Basic Concepts
          1.1 Algorithms
          1.2 Mathematical Preliminaries
          1.3 MIX
          1.4 Some Fundamental Programming Techniques
     Chapter 2 - Information Structures
          2.1 Introduction
          2.2 Linear Lists
          2.3 Trees
          2.4 Multilinked Structures
          2.5 Dynamic Storage Allocation
          2.6 History and Bibliography
     Answers to Exercises
     Appendix A - Tables of Numerical Quantities
     Appendix B - Index to Notations
     Index and Glossary
 
[Knuth1998]
Knuth, Donald E.  The Art of Computer Programming, vol.2: Seminumerical Algorithms.  ed.3.  Addison Wesley Longman (Reading, MA: 1969, 1981, 1998).  ISBN 0-201-89684-2.
   Content
     Preface
     Preface to the Third Edition
     Notes on the Exercises

     Chapter 3 - Random Numbers
          3.1 Introduction
          3.2 Generating Uniform Random Numbers
          3.3 Statistical Tests
          3.4 Other Types of Random Quantities
          3.5 What Is a Random Sequence?
          3.6 Summary
     Chapter 4 - Arithmetic
          4.1 Positional Number Systems
          4.2 Floating Point Arithmetic
          4.3 Multiple Precision Arithmetic
          4.4 Radix Conversion
          4.5 Rational Arithmetic
          4.6 Polynomial Arithmetic
          4.7 Manipulation of Power Series
     Answers to Exercises
     Appendix A - Tables of Numerical Quantities
     Appendix B - Index to Notations
     Index and Glossary
 
[Knuth1998b]
Knuth, Donald E.  The Art of Computer Programming, vol.3: Sorting and Searching.  ed.2.  Addison Wesley Longman (Reading, MA: 1973, 1998).  ISBN 0-201-89685-0.
   Content
     Preface
     Preface to the Second Edition
     Notes on the Exercises
     Chapter 5 - Sorting
          5.1 Combinatorial Properties of Permutations
          5.2 Internal Sorting
          5.3 Optimum Sorting
          5.4 External Sorting
          5.5 Summary, History, and Bibliography
     Chapter 6 - Searching
          6.1 Sequential Searching
          6.2 Searching by Comparison of Keys
          6.3 Digital Searching
          6.4 Hashing
          6.5 Retrieval on Secondary Keys
     Answers to Exercises
     Appendix A - Tables of Numerical Quantities
     Appendix B - Index to Notations
     Index and Glossary
 
[Knuth2000]
Knuth, Donald E.  Selected Papers on Analysis of Algorithms.  CSLI Lecture Notes Number 102.  Center for the Study of Language and Information (Palo Alto: 2000).  ISBN 1-57586-212-3 pbk.
   
[Knuth2011]
Knuth, Donald E.  Selected Papers on Fun & Games.  CSLI Lecture Notes Number 192. CSLI Publications (Stanford, CA: 2011).  ISBN 1-57586-584-X pbk.
 
[Kurose2003]
Kurose, James F., Ross, Keith W.  Computer Networking: A Top-Down Approach Featuring the Internet.  ed.2, International.  Addison-Wesley (Boston, MA: 2003).  ISBN 0-321-17644-8 pbk.
    
     This is a textbook.  I will be using it for my M.Sc module on Computer Communication beginning 2004-02-05.  There is a companion web site that provides links, errata, and other materials.  It includes additional student resources available to purchasers of the book (via a scratch-off access code bound into the book).  The companion material includes self-assessment, some programming assignments, and material from the previous edition. 
     I am also interested in this book with regard to the treatment of abstractions involving computer communication and networking.  These are pertinent to nfoWare and I want to use consistent terminology and concepts where I am able to do so.  I will see how that works as I dig into the course.  -- dh:2004-01-11
     Although communication and networking texts don't start at the fundamental level that I see called for in Situating Data, this book is rewarding in how it points to meaningful toolcraft for having confirmable experiences with operations of computer communications.  I find high value in the gentleness of the progression and in the provision of concrete examples that can be used in experimental confirmation of networking and communication operations.  [dh:2004-02-13]
   Content
     Preface
     1. Computer Networks and the Internet
     2. Application Layer
     3. Transport Layer
     4. Network Layer and Routing
     5. Link Layer and Local Area Networks
     6. Multimedia Networking
     7. Security in Computer Networks
     8. Network Management
     References
     Index

   
[Larson1988]
Larson, Per-Ake.  Dynamic Hash Tables.  Comm. ACM 31, 4 (April 1988), 446-457.  Available on the Internet at <https://dl.acm.org/doi/10.1145/42404.42410>.
   From the Abstract: "Linear hashing and spiral storage are two dynamic hashing schemes originally designed for external files.  This paper shows how to adapt these two methods for hash tables stored in main memory.  The necessary data structures and algorithms are described, the expected performance is analyzed mathematically, and actual execution times are obtained and compared with alternative techniques.  Linear hashing is found to be both faster and easier to implement than spiral storage. ... Overall, linear hashing is a simple and efficient technique for applications where the cardinality of the key set is not known in advance."
   This article provides a very clear account of Linear Hashing as introduced in [Litwin1980].  The data structures and algorithms are well illustrated in a publication form of ALGOL 60/Pascal, accompanied by useful diagrams.  The design of the experiments and the experimental results are also illustrative and inform how one might employ Linear Hashing for an in-memory directory/database.  It will be important to obtain excellent "representative" test cases.  The cardinality observation is about the key set, not the hashes, although the cardinality of the hash values is set in advance and determines the worst case for a very large data set of distinct keys.  Also, within the hash set, there will be collisions even in the hopefully-impossible largest case, say 24-32 hash bits.
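   The smooth growth that distinguishes linear hashing can be sketched briefly.  This toy Python table is my illustration of the general scheme, not Larson's exact algorithm; the load-factor trigger, initial bucket count, and use of Python's built-in hash are assumptions.  Buckets split one at a time in fixed order, under control of a split pointer and a level, so the table never doubles all at once.

```python
# Toy in-memory linear hashing: one bucket splits per overload, in order.
class LinearHash:
    def __init__(self, initial_buckets=4, max_load=0.8):
        self.n0 = initial_buckets        # buckets at level 0
        self.level = 0
        self.split = 0                   # next bucket to split
        self.buckets = [[] for _ in range(initial_buckets)]
        self.count = 0
        self.max_load = max_load

    def _addr(self, key):
        h = hash(key)
        a = h % (self.n0 << self.level)
        if a < self.split:               # already split: finer hash applies
            a = h % (self.n0 << (self.level + 1))
        return a

    def get(self, key):
        for k, v in self.buckets[self._addr(key)]:
            if k == key:
                return v
        return None

    def put(self, key, value):
        b = self.buckets[self._addr(key)]
        for i, (k, _) in enumerate(b):
            if k == key:
                b[i] = (key, value)
                return
        b.append((key, value))
        self.count += 1
        if self.count / len(self.buckets) > self.max_load:
            self._split_one()

    def _split_one(self):
        # Rehash the designated bucket with the next-level hash function.
        old = self.buckets[self.split]
        self.buckets[self.split] = []
        self.buckets.append([])
        self.split += 1                  # advance BEFORE rehashing
        if self.split == self.n0 << self.level:
            self.level += 1              # a full pass: start the next level
            self.split = 0
        for k, v in old:
            self.buckets[self._addr(k)].append((k, v))
```

   Note that a split bucket's items land either in place or in the single newly appended bucket, which is the smoothness-of-growth property the article analyzes.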
  
[Lieberman1983]
Lieberman, Henry., Hewitt, Carl.  A real-time garbage collector based on the lifetimes of objects.  Comm. ACM 26, 6 (June 1983), 419-429.
   This paper reviews the technique of [Baker1978] and proposes a simple extension: multiple generations (smaller regions) and a scavenging technique that ensures references into a region being evacuated are not left dangling.  Every such reference must be found so that the referenced object can be evacuated and its new location made known to all of the references to it.
   [dh:2025-11-22 I am not sanguine about all of this.  In the case of list structures that are always acyclic and (logically) immutable, this may be far easier.  I need to understand how forward pointers are needed in this Lieberman-Hewitt scheme and if I can always avoid them.]
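   The role of forwarding pointers can be isolated in a generic copying scheme (a Cheney-style sketch in Python of my own, not the Lieberman-Hewitt algorithm): the first reference to reach an object copies it and leaves a forwarding address; later references find the forward and share the single new copy rather than duplicating it.

```python
# Why forwarding pointers arise in any evacuating collector: they
# preserve sharing when several references reach the same object.
class Cell:
    def __init__(self, head, tail=None):
        self.head, self.tail = head, tail
        self.forward = None              # set once the cell is evacuated

def evacuate(roots):
    """Copy every cell reachable from roots; return the new roots."""
    def copy(cell):
        if cell is None:
            return None
        if cell.forward is not None:     # already moved: follow the forward
            return cell.forward
        new = Cell(cell.head)
        cell.forward = new               # install forward BEFORE recursing
        new.tail = copy(cell.tail)
        return new
    return [copy(r) for r in roots]

# Shared structure survives as shared structure:
shared = Cell("x")
a, b = Cell("a", shared), Cell("b", shared)
new_a, new_b = evacuate([a, b])
assert new_a.tail is new_b.tail          # one copy, reached twice
```

   For acyclic, logically immutable lists the forward serves only to preserve sharing (and to bound the work to one copy per cell); dropping it would still terminate but would duplicate shared substructure, which suggests the forwards cannot be avoided entirely even in the easy case.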
  
[Litwin1980]
Litwin, Witold. Linear Hashing: A New Tool for File and Table Addressing.  pp.212-223 in Proceedings of the 6th International Conference on Very Large Databases (VLDB), (Montreal, Quebec, Canada: October 1980).  Available on the Internet at <https://cs-web.bu.edu/faculty/gkollios/ada17/LectNotes/linear-hashing.PDF>.
   From the Abstract: "Linear hashing is a hashing in which the address space may grow or shrink dynamically. A file or a table may then support any number of insertions or deletions without access or memory load performance deterioration.  A record in the file is, in general, found in one access while the load may stay practically constant up to 90%.  A record in a table is found in a mean of 1.7 accesses, while the load is constantly 80%.  No other algorithms attaining such a performance are known."
   In April 2011 I stumbled over Linear Hashing in the context of a patent-infringement claim against Google regarding a means for accomplishing deletions of somehow-expired entries while searching within the bucket used for similar hashes.  Avoiding that patented technique, the use of Linear Hashing is very appealing in how the table of hash-indexed items grows, and maybe even shrinks, dynamically.  There are related considerations concerning the nature of the hash values and their distribution in accesses to a Linear-Hashing-implemented directory/database.
  
[Malone2025]
Hayes, Catherine., Malone, David.  Questioning the Criteria for Evaluating Non-Cryptographic Hash Functions.  Comm. ACM 68, 2 (Feb. 2025), 46-51.  See [Hayes2025]
  
[More1979]
More, Trenchard.  The nested rectangular array as a model of data.  pp. 55-73 in APL '79: Proceedings of the International Conference on APL, Part 1.  DOI 10.1145/390009.804440, available at <https://dl.acm.org/citation.cfm?id=804440>.
   
[Nievergelt1979]
Fagin, Ronald., Nievergelt, Jurg., Strong, H. Raymond.  Extendible Hashing -- A Fast Access Method for Dynamic Files.  ACM Transactions on Database Systems 4, 3 (September 1979), 315-344.  See [Fagan1979]
  
[Patashnik1989]
Graham, Ronald L., Knuth, Donald E., Patashnik, Oren.  Concrete Mathematics: A Foundation for Computer Science.  Addison-Wesley (Reading, MA: 1989).  ISBN 0-201-14236-8.  See [Graham1989]
   
[Pelzer1999]
Gulutzan, Peter., Pelzer, Trudy.  SQL-99 Complete, Really: An Example-Based Reference Manual of the New Standard.  R&D Books Miller Freeman (Lawrence KS: 1999).   ISBN 0-87930-568-1 pbk + CD-ROM.  See [Gulutzan1999] in Information Processing.
   
[Perlis1961]
Evans, A., Perlis, A.J., Van Zoeren, H.  The Use of Threaded Lists in Construction of a Combined ALGOL and Machine-Like Assembly Processor.  Comm. ACM 4, 1 (Jan. 1961), 36-41.  DOI <https://dl.acm.org/doi/10.1145/366062.366081>.  See [Evans1961].
  
[Ramm2002]
Biermann, Alan W., Ramm, Dietolf.  Great Ideas in Computer Science with Java.  MIT Press (Cambridge, MA: 2002).  ISBN 0-262-02497-7 pbk. alk. paper.  See [Biermann2002]
       
[Ross2003]
Kurose, James F., Ross, Keith W.  Computer Networking: A Top-Down Approach Featuring the Internet.  ed.2, International.  Addison-Wesley (Boston, MA: 2003).  ISBN 0-321-17644-8 pbk.  See [Kurose2003]
   
[Sattley1964]
Cheatham, T. E. Jr., Sattley, Kirk.  Syntax-Directed Compiling.  Proceedings of the April 21-23 1964 Spring Joint Computer Conference (AFIPS: April 1964), 31-57.  See [Cheatham1964].
  
[Scott1977]
Scott, Dana S.  Logic and Programming Languages.  1976 ACM Turing Award Lecture.  Comm. ACM 20, 9 (September 1977), 634-641.  PDF available at <http://delivery.acm.org/10.1145/1290000/1283932/a1976-scott.pdf>.
   The discussion on semantic structures and function spaces is valuable collateral on the use of monotonicity and continuity in [Scott1993].
  
[Scott1993]
Scott, Dana S.  A Type-Theoretic Alternative to ISWIM, CUCH, OWHY.  Theoretical Computer Science 121, 1-2 (December 1993), 411-440.  Available at <https://www.sciencedirect.com/science/article/pii/030439759390095B>. 
   This paper was originally circulated informally in 1969, providing a type-theoretic approach that treats types as definite and distinguished.
  
[Scott2012]
Scott, Dana S.  λ-Calculus Then & Now.  Presented at the ACM Turing Centenary Celebration, San Francisco, June 15-16, 2012.  Annotated slides PDF version of 2012-08-25 available at <http://logic.berkeley.edu/colloquium/ScottACMTuring.pdf>.  Video of the presentation available at <https://doi.org/10.1145/2322176.2322185>.
  
[Sedgewick1989]
Sedgewick, Robert.  Algorithms.  Second edition.  Addison-Wesley (Reading, MA: 1983, 1988).  1989 reprint with author's corrections.  ISBN 0-201-06673-4.
     "This book is intended to survey the most important computer algorithms in use today and to teach fundamental techniques to the growing number of people in need of knowing them.  ... The broad perspective taken in the book makes it an appropriate introduction to the field." -- from the Preface, p.v.
     "The orientation of the book is toward algorithms likely to be of practical use.  ... Full implementations of the methods discussed (in an actual programming language) are included in the text, along with descriptions of the operations of these programs on a consistent set of examples.
     "Properties of the algorithms and situations in which they might be useful are discussed in detail.  Though not emphasized, connections to the analysis of algorithms and theoretical computer science are not ignored.  When appropriate, empirical and analytical results are discussed to illustrate why certain algorithms are preferred.  When interesting, the relationship of the practical algorithms being discussed to purely theoretical results is described."  -- from the Preface, p.vii.
     "To learn an algorithm well, one must implement and run it.  Accordingly, the recommended strategy for understanding the programs presented in this book is to implement and test them, experiment with variants, and try them out on real problems.  We will use the Pascal programming language to discuss and implement most of the algorithms; since, however, we use a relatively small subset of the language, our programs can easily be translated into many other modern programming languages." -- from the Introduction, p.3.
     There is more discussion of my concerns about the actual execution of this approach under "Do Programs Teach Algorithms?" -- dh.
   Contents
     Preface
     Fundamentals
          1. Introduction
          2. Pascal
          3. Elementary Data Structures
          4. Trees
          5. Recursion
          6. Analysis of Algorithms
          7. Implementation of Algorithms
     Sorting Algorithms
          8. Elementary Sorting Methods
          9. Quicksort
          10. Radix Sorting
          11. Priority Queues
          12. Mergesort
          13. External Sorting
     Searching Algorithms
          14. Elementary Searching Algorithms
          15. Balanced Trees
          16. Hashing
          17. Radix Searching
          18. External Searching
     String Processing
          19. String Searching
          20. Pattern Matching
          21. Parsing
          22. File Compression
          23. Cryptology
     Geometric Algorithms
          24. Elementary Geometric Methods
          25. Finding the Convex Hull
          26. Range Searching
          27. Geometric Intersection
          28. Closest-Point Problems
     Graph Algorithms
          29. Elementary Graph Algorithms
          30. Connectivity
          31. Weighted Graphs
          32. Directed Graphs
          33. Network Flow
          34. Matching
     Mathematical Algorithms
          35. Random Numbers
          36. Arithmetic
          37. Gaussian Elimination
          38. Curve Fitting
          39. Integration
     Advanced Topics
          40. Parallel Algorithms
          41. The Fast Fourier Transform
          42. Dynamic Programming
          43. Linear Programming
          44. Exhaustive Search
          45. NP-Complete Problems
     Index
     
[Skiena1998]
Skiena, Steven S.  The Algorithm Design Manual.  Springer-Verlag TELOS (New York: 1998).  ISBN 0-387-94860-0 (book & CD-ROM).
   Contents
     Preface
     I. Techniques
          1. Introduction to Algorithms
          2. Data Structures and Sorting
          3. Breaking Problems Down
          4. Graph Algorithms
          5. Combinatorial Search and Heuristic Methods
          6. Intractable Problems and Approximations
          7. How to Design Algorithms
     II. Resources
          8. A Catalog of Algorithmic Problems
          9. Algorithmic Resources
     Bibliography
     Index
   
[Strong1979]
Fagin, Ronald., Nievergelt, Jurg., Strong, H. Raymond.  Extendible Hashing -- A Fast Access Method for Dynamic Files.  ACM Transactions on Database Systems 4, 3 (September 1979), 315-344.  See [Fagan1979]
  
[Sturgis1970]
Earley, Jay., Sturgis, Howard. A Formalism for Translator Interactions.  Comm. ACM 13, 10 (October 1970), 607-617.  See [Earley1970]
 
[VanZoeren1961]
Evans, A., Perlis, A.J., Van Zoeren, H.  The Use of Threaded Lists in Construction of a Combined ALGOL and Machine-Like Assembly Processor.  Comm. ACM 4, 1 (Jan. 1961), 36-41.  DOI <https://dl.acm.org/doi/10.1145/366062.366081>.  See [Evans1961].

The Content Material here was successfully repurposed/preserved/repaved as part of the 2023-08-29 stage of the Site Preservation Project.  Check those pages for additional details of the approach to repurposing/preservation and correction.


created 2000-07-18-18:06 -0700 (pdt) by orcmid