The book reviews the origins of the first computer built at the IAS (Institute for Advanced Study) at Princeton, and the personalities who created it. In setting the stage for the personalities, it dives back as far as the Revolutionary War, but mostly surveys fin-de-siècle Eastern Europe, the period leading up to WWII, and the drive for something that could do the calculations necessary for an atomic bomb.
Within this context, I found two strong themes: first, the physical engineering problems involved in creating the computer, and the abilities that distinguished it from previous computers/calculators (I am hoping someone has an explanation of the significance of "numbers that do things as well as numbers that mean things"); and second, the problems the first computers were intended to solve, and the future applications their designers envisioned. The book concludes by looking at the way our digital universe has evolved and continues to evolve.
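My tentative reading of that phrase is the stored-program idea: memory holds only plain numbers, and the machine treats some of them as instructions ("numbers that do things") and others as data ("numbers that mean things"), so a program can even rewrite itself. Here's a toy sketch of that reading; the instruction encoding is entirely made up, not anything from the book or the MANIAC.

```python
# Toy stored-program machine: one memory of plain numbers. Whether a
# number "does" something or "means" something depends only on whether
# the program counter lands on it. Encoding here is invented.

def run(memory):
    """Interpret memory as (opcode, operand) pairs until opcode 0 (HALT)."""
    pc = 0  # program counter
    while memory[pc] != 0:
        opcode, operand = memory[pc], memory[pc + 1]
        if opcode == 1:                  # ADD: add memory[operand] into cell 9
            memory[9] += memory[operand]
        elif opcode == 2:                # STORE: overwrite any cell -- even
            memory[operand] = memory[9]  # one holding an instruction
        pc += 2
    return memory[9]

# Cells 0-5 act as instructions; cells 7-9 act as data.
mem = [1, 7, 1, 8, 0, 0, 0, 20, 22, 0]
print(run(mem))  # adds cells 7 and 8: prints 42
```

The same list of integers could be fed in as data to another program; nothing in the numbers themselves marks them as code.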
I was most out of my depth in the engineering area. I don't know what distinguished the MANIAC, don't understand the mechanics of vacuum tubes or the difficulties of memory allocation in machine language, and have no idea how any of this relates to modern processors. I did have two takeaways, though. First, I'm much more sympathetic than I had been to the impressive leap between envisioning a theoretical device and the practical reality of bringing that device into existence. (Probably related to my experiments in bread making and beer brewing, which are as close as I've come to any engineering problem.)
Second, I was struck by the notion of a central "clock," which is really more of a counter/incrementer, as essential to computers. Instructions are performed at each step of this clock, which means states are allowed to change only when the controller advances, which means finite spaces can be reused, which is really powerful. It also means there's a giant gap between the chunky digital universe and the continuous analog universe.
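A minimal sketch of how I understand the reuse point (my own construction, not from the book): because state only changes at clock ticks, a tiny, fixed set of registers can carry an arbitrarily long computation by being overwritten at each tick.

```python
# Clock-as-counter sketch: two registers are the ONLY storage, and they
# are reused on every tick. Between ticks, nothing moves.

def clocked_factorial(n):
    registers = {"acc": 1, "i": 1}   # finite state, reused forever
    ticks = 0
    while registers["i"] <= n:       # each loop pass = one clock tick
        ticks += 1
        # both updates happen "at" the tick
        registers["acc"] *= registers["i"]
        registers["i"] += 1
    return registers["acc"], ticks

print(clocked_factorial(5))  # (120, 5): 5! computed in 5 ticks, 2 registers
```

An analog device, by contrast, has no such discrete steps at which to freeze and reuse its state, which is one way to see the digital/analog gap.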
As far as the problems solved, the book mentions five, on different timescales: atomic bomb equations, shock waves, meteorology, evolutionary biology, and stellar evolution.
It's an interesting set of problems. The only real comment I have is how much harder meteorology seems to be than was expected. I think (getting back to messy engineering) you need the ability to do really complicated, messy calculations to scare up something like chaos theory.
I'm ... skeptical of the attempts to analogize biological and technological evolution and to suggest machines are shaping our evolution the way biological evolution works. I think the distinction between digital and analog is really big. I also think that no technological ecosystem is anywhere near as complex as our own, and that these are huge qualitative differences between evolutionary environments. I also suspect these are justifications for my knee-jerk reaction, so we'll see what I think as things settle. The flip side is that computers are still in their infancy, are inserting themselves into our world in unimaginable ways, and are clearly highly powerful, addictive, and impactful devices. Maybe Skynet is close? Or maybe whatever human-machine evolution develops will be highly unlike Skynet.
I listened to this as an audiobook, which was great, except that if the book itself has pictures, I missed them, and visualizing many of these situations would have helped. I'm also really glad this was a book club book, because I need help understanding and processing it.
To the extent that I want to get something out of book club, it's:
I'm glad we're discussing this
Can someone clarify what pre-MANIAC calculating devices were like, and why "numbers that do things and numbers that mean things" is so important?
Anything else is gravy.
I may edit for clarity, links, and grammar after book club, potentially with a PS, unless that turns into a new post.