Chaos – James Gleick *****

The book on the most amazing development in mathematics since the introduction of the zero – chaos theory. It has a nicely dramatic style that highlights the importance of the people involved in the maths, but this can’t detract from the remarkable implications of this fundamentally new understanding of how almost everything really works, rather than the approximations we are used to in science…
… at least, that’s certainly the feeling you’ll get when you read the book. But then you take a step back and think, okay, what has chaos done for the world since 1987 (or thereabouts) when the book first came out? And you have to say – very little. Chaos is fascinating, but usually turns out to be fundamentally impractical.
Does this detract from the book? Not at all. It’s still a fascinating read after all these years, and even if the best chaos can give us is Jeff Goldblum in Jurassic Park it doesn’t cease to intrigue.
Review by Peter Spitz
