
Infinitesimal - Amir Alexander ***

While some books have obscure titles, a combination of title and subtitle will usually make it plain what the book is about. But I can pretty much guarantee that most readers, seeing Infinitesimal - how a dangerous mathematical theory shaped the modern world, would leap to an incorrect conclusion, as I did. The dangerous aspect of infinitesimals was surely going to be related in some way to calculus, but I expected it to be about the great priority debate between Newton and Leibniz, whereas in fact the book concentrates on the precursors to their work that made the use of infinitesimals - quantities that are vanishingly close to zero - acceptable in mathematics.
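To give a flavour of why infinitesimals were considered dangerous (an illustration of my own, not one taken from the book): the characteristic move is to treat a vanishingly small quantity ε as nonzero so that you can divide by it, then throw it away at the end, as when finding the slope of y = x²:

\[
\frac{(x+\epsilon)^2 - x^2}{\epsilon} = \frac{2x\epsilon + \epsilon^2}{\epsilon} = 2x + \epsilon \approx 2x .
\]

It was exactly this having-it-both-ways treatment of ε - nonzero for the division, zero afterwards - that opponents of the method found logically indefensible.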

The book is in two distinct sections. The first focuses on the history of the Jesuits, from their founding to their weighing in on the mathematical debate against those who wanted to use infinitesimals in maths. For the Jesuits, everything was cut and dried: while Aristotle's worldview and the geometry of Euclid had an unchanging nature that made them acceptable, the use of infinitesimals was far too redolent of change and rebellion. This was interesting, particularly in the way that the history gave background on Galileo's rise and fall seen from a different viewpoint (as he was in the ascendancy, the Jesuits were temporarily losing power, and vice versa). However, this part goes on far too long, saying pretty much the same thing over and over again. This is, I can't help but feel, a fairly small book trying to look bigger and more important than it is by being padded.

The second section I found considerably more interesting, though this was mostly as a pure history text. I was fairly ignorant about the origins of the English Civil War and the impact of its outcome, and Amir Alexander lays this out well. He also portrays the intellectual battle between the philosopher Thomas Hobbes and the mathematician John Wallis in a very interesting fashion. I knew, for example, that Wallis had been the first to use the lemniscate, the symbol for infinity used in calculus, but wasn't aware how much of a self-taught mathematician he was, taking an approach to maths that would horrify any modern maths professional: treating it more as an experimental science, where induction was key, than as a pure discipline where everything has to be proved.
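As a taste of that experimental style (a sketch of my own, not an example from the book): Wallis's celebrated infinite product for π, which he reached by spotting a pattern in a handful of special cases and boldly extrapolating rather than proving it, can be checked numerically in a few lines of Python:

```python
# Wallis's product: pi/2 = (2*2)/(1*3) * (4*4)/(3*5) * (6*6)/(5*7) * ...
# He inferred the general pattern from a few cases and trusted it -
# the 'experimental', inductive approach to maths described above.

def wallis_pi(terms: int) -> float:
    """Approximate pi using the first `terms` factors of Wallis's product."""
    product = 1.0
    for n in range(1, terms + 1):
        product *= (2 * n) * (2 * n) / ((2 * n - 1) * (2 * n + 1))
    return 2 * product

for terms in (10, 1_000, 100_000):
    print(terms, wallis_pi(terms))  # slowly converges towards 3.14159...
```

Run it and the approximation creeps towards π just as Wallis expected - persuasive evidence rather than a proof, which is exactly the distinction his critics seized on.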

Hobbes I knew only as a name, associated with that horrible frontispiece of his 'masterpiece' Leviathan, which seems to the modern eye a work of madness, envisaging a state where the monarch's word is so supreme that the people are more like automata - cells in a body or bees in a hive - than individual, thinking humans. What I hadn't realised is that Hobbes was also an enthusiastic mathematician who believed it was possible to derive all his philosophy from geometry - and geometry alone, with none of Wallis' cheating little infinitesimals. The pair attacked each other in print for many years, though Hobbes' campaign foundered to some extent on his inability to see that geometry was not capable of everything: he repeatedly claimed to have worked out how to square the circle, a geometrically impossible task.

Although I enjoyed finding out more about the historical context, it's perhaps unfortunate that Alexander is a historian rather than someone with an eye to modern science. The two sections effectively describe how induction and experimentation won out over a view that expected mathematics to be a pure predictor of reality, and I felt they would have benefited hugely from being contrasted with modern physics, where some would argue that far too much depends on starting with mathematics and predicting outcomes, rather than starting with observation and experiment. An interesting book without doubt, but not quite what it could have been.

Review by Brian Clegg
