
Astrophysics for People in a Hurry – Neil deGrasse Tyson *****

When I reviewed James Binney’s Astrophysics: A Very Short Introduction earlier this year, I observed that the very word ‘astrophysics’ in a book’s title is liable to deter many readers from buying it. As a former astrophysicist myself, I’ve never really understood why it’s considered such a scary word, but that’s the way it is. So I was pleasantly surprised to learn, from Wikipedia, that this new book by Neil deGrasse Tyson ‘topped The New York Times non-fiction bestseller list for four weeks in the middle of 2017’.

Like James Binney, Tyson is a professional astrophysicist with a string of research papers to his name – but he’s also one of America’s top science popularisers, and that’s the hat he’s wearing in this book. While Binney addresses an already-physics-literate audience, Tyson sets his sights on a much wider readership. It’s actually very brave – and honest – of him to give physics such prominent billing; the book could easily have been given a more reader-friendly title such as ‘Secrets of the Universe’. But it would still have been astrophysics by stealth, because it’s only thanks to physics that we understand anything beyond our own planet. As Tyson puts it: ‘the universality of physical laws makes the cosmos a marvellously simple place’.

Although the book is new, its chapters (now suitably updated) originated over a period of many years as self-contained magazine articles. They cover a wide range of topics, from the big bang and dark matter, via the electromagnetic spectrum and the periodic table, to asteroids and exoplanets. The coverage isn’t comprehensive; some of the most obvious subjects, like stellar evolution and black holes, are barely touched on. That isn’t a problem, though. The book doesn’t set out to explain everything we know about the universe, but to show that what we do know about it, we know because of physics. That’s just as interesting, and much rarer at a popular science level.

Personally, I loved the book – and I would have loved it even more when I was 15 years old, and my knowledge of physics was largely aspirational rather than actual. In those days, the book would probably have been written by someone like Isaac Asimov – and that’s a fair comparison, because Tyson’s style is a lot like Asimov’s. It manages to be clever, engaging, witty and lucid all at the same time. I kept finding myself stopping to read bits again because they were so good. Here are three examples of the kind of thing I mean:
  • On quarks: ‘The most familiar quarks are ... well, there are no familiar quarks. Each of their six subspecies has been assigned an abstract name that serves no philological, philosophical or pedagogical purpose, except to distinguish it from the others.’
  • On dark energy: ‘When you estimate the amount of repulsive vacuum pressure that arises from the abbreviated lives of virtual particles, the result is more than 10^120 times larger than the experimentally determined value of the cosmological constant. This is a stupidly large factor, leading to the biggest mismatch between theory and observation in the history of science.’
  • On the cosmic microwave background: ‘The molecule cyanogen gets excited by exposure to microwaves. If the microwaves are warmer ... they excite the molecule a little more. In the big bang model, the cyanogen in distant, younger galaxies gets bathed in a warmer cosmic background than the cyanogen in our own Milky Way galaxy. And that’s exactly what we observe (you can’t make this stuff up).’
Although the book’s aimed at beginners, I have to admit that rather spooky last point came as news to me. And it wasn’t the only thing I learned. I never realised there was enough energy in a single cosmic ray particle to knock a golf ball across a putting green. I didn’t know thunderstorms could produce gamma rays. Or that, if we could see Jupiter’s magnetosphere, it would be several times bigger than a full Moon in the sky.

All in all, this is a book I can heartily recommend to anyone, regardless of how much or how little they know about physics.


Review by Andrew May
