DAVID JOHN BLOWER, 70, passed away suddenly on February 26, 2018. David was born May 12, 1947 in Akron, Ohio, the eldest child of Basil and Jean Blower. He graduated from Ohio State University with a degree in Psychology. He then enlisted in the US Army, spending nearly three years in Berlin, where he met his wife, Elizabeth Claude Blower. Following his honorable discharge from the Army, David continued his academic studies at Stanford, where he received his PhD in Experimental Psychology. He then joined the US Navy, serving at NAMRL Pensacola, FL, NADC Pennsylvania, and Naval Research Center at the University of Central Florida, Orlando. He returned to NAMRL in 1988, retiring honorably from military service in 1994.
David spent his retirement years writing and publishing four books on Information Processing, the Principle of Maximum Entropy, and Bayes' Theorem.
You can download the books below.
Information Processing: Boolean Algebra, Classical Logic, Cellular Automata, and Probability Manipulation
Volume I was corrected and revised in September 2017. This book begins the task of defining, explaining, arguing for, and, in the end, providing a rationale for information processing. Volume I is concerned with the notion that an information processor is mainly engaged in making inferences. An information processor would prefer to reach definite conclusions by using some form of deduction. Unfortunately, it is often thwarted in this desire by a fundamental lack of relevant information. Probability theory has developed as a rigorous way of dealing with the uncertainty surrounding inference. Thus, we begin by treating some of the formal manipulation rules that crop up in probability theory. To provide some foundational basis for the applications to appear in later Volumes, the topics of Boolean Algebra, Classical Logic, and Cellular Automata will make an appearance here.
Information Processing: The Maximum Entropy Principle
Information Processing: An Introduction to Information Geometry
Information Processing: Introduction to Artificial Neural Networks
Information Processing, Volume IV is an incomplete volume. The author passed away suddenly on February 26, 2018, while working on both his Information Processing: Supplemental Exercises for Volume I (published May 30, 2019) and Volume IV of his Information Processing series, entitled ‘Introduction to Artificial Neural Networks’ (dated June 2017). He had completed about 150 pages of Volume IV. His friends and collaborators, Dr. Romke Bontekoe and Dr. Barrie Stokes, graciously offered to edit the Volume IV draft, update the Mathematica code, and bring the draft into a form ready to be published. They did not attempt to complete missing chapters or add original content. Some chapters listed in the draft table of contents were represented only by placeholders in the main text, and there was no reasonable way to fill in this missing material. The original ‘Contents’ section was amended accordingly.
A generic label, artificial neural networks (ANNs), has been bestowed on the whole concept of a biologically inspired structure of interconnected nodes and weights that might mimic in computer code how the human brain solves inferential problems. This volume will provide the interested reader with a glimpse into some of the author's ideas concerning ANNs and their application to the general problems of information processing and inference under uncertainty. One common inferential scenario is classification. This volume compares the approach offered by a combination of our preferred Bayesian and MEP attack with the less probabilistically motivated ANNs. It is profitable to compare these approaches since ANNs might offer a way out of the dilemma posed by the dreaded curse of dimensionality.
Now that Mathematica is endowed with powerful new machine learning and ANN-related functions, they can be used to tackle the typical inferential problems addressed previously in these books, and extensive numerical experiments with ANNs are possible. For appropriately simplified cases, the Mathematica ANN results are compared to those provided by Bayesian and MEP techniques.
The Volume IV Mathematica Notebooks and other supplemental information are available from a public repository on github.com.