Category Archives: Beary Scientist

Batman and Superman

When do you think Lego will make a Superman movie? We had fun at the movies with Lego Batman 🙂

Puffles and Honey at IMAX in Melbourne to see the Lego Batman movie

Little bears are debating Batman vs Superman. The science, not the movie. The movie isn’t much to write home about 😦 Although we warmed up to Ben Affleck’s Batman in Justice League. He was positively teary eyed when they brought back Superman!

Superman can fly, and he has super strength, x-ray vision, invulnerability, super speed, heat vision, freeze breath and super senses. As Lex Luthor described him, he is a “living God on Earth”. Superman gets his powers from Earth’s yellow Sun, essentially making him a large solar-powered battery, which means that while he is here on Earth he has the potential to wipe out all of humanity.

Batman is very cynical and, as he puts it, “he’s had a bad experience with freaks dressed like clowns”, so even though we know Superman is really a good guy, The Dark Knight isn’t convinced and wants to take him down. But Batman wasn’t born with any genetic advantage, and he doesn’t have mutated cells that make him a capable crime fighter; he just uses his brain.

Batman’s abilities stem from science and engineering, and with his family fortune he is able to bankroll some of the most advanced pieces of technology the world has ever seen.

Bruce Wayne / Batman and Barry Allen / Flash in Justice League

Batman made his first appearance in Detective Comics back in 1939, and he was more of a super sleuth than a hero, using his wits and lock picks to help solve crimes. As he moved into the 1940s his gadgetry became more advanced, and he started using infrared goggles that enabled him to see in the dark. Infrared light isn’t visible to the naked eye, but everything above absolute zero gives off infrared radiation in the form of heat. So Batman would be able to use this technology not just to see in the dark, but also to target people, as they would stand out against the colder background. Night vision goggles were first developed by the US in the early 1940s to assist with the war effort; however, the German army was using a form of night vision even earlier, as a short-range searchlight on board its Panther tanks.

Batman made his debut in Detective Comics #27 (May 1939). Cover art by Bob Kane.
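To get a feel for why a warm body stands out through infrared goggles, here is a rough back-of-the-envelope sketch in Python. It treats skin and the night-time background as ideal blackbodies, and the two temperatures are assumed figures chosen only for illustration.

# Why a person glows in the thermal infrared.
WIEN_B = 2.898e-3   # Wien's displacement constant, m*K
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2*K^4)

def peak_wavelength_um(temperature_k):
    """Wavelength of peak thermal emission, in micrometres."""
    return WIEN_B / temperature_k * 1e6

def radiated_power(temperature_k):
    """Power radiated per square metre by an ideal blackbody, in W/m^2."""
    return SIGMA * temperature_k ** 4

skin, background = 305.0, 283.0  # roughly 32 C skin against a 10 C night
print(peak_wavelength_um(skin))        # ~9.5 micrometres: well into the infrared
print(peak_wavelength_um(background))  # ~10.2 micrometres
print(radiated_power(skin) / radiated_power(background))  # ~1.35: the warm body is noticeably brighter

Both peaks sit around 10 micrometres, which is the band thermal-imaging equipment is built to detect.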

As Batman moved into the 1950s he seemed to shift slightly into science fiction. This was to keep up with the trends of the times and would often involve him firing ray guns at enemies or even flying into outer space. In Batman issue 109, he used a heat-ray to detonate explosives underwater, three years before optical lasers were invented. This may seem a little far-fetched, but the military have been working on this for quite some time. They have built a non-lethal weapon called the Active Denial System, which directs high-frequency microwaves over 500 metres. It’s about as hot as a lightbulb and is able to neutralise people without causing them serious harm. Similar to a microwave oven, it excites the water and fat molecules in the skin and instantly heats them. The system was deployed in 2010 during the war in Afghanistan but was never used. The military have developed other equipment that uses microwaves to destroy electrical equipment or even disrupt the guidance systems of missiles launched at planes.

Batman has always managed to stay one step ahead of his mortal enemies, but he’s never taken on anyone quite like Superman. For starters, Superman has the ability to fly. Lex Luthor once theorised that this was because he must have come from a gigantic planet with enormous gravity, so his race had to develop natural anti-gravity organs in order to function. On Earth this would mean he would be able to control his own gravimetric field allowing him to fly.

Batman doesn’t have the ability to fly, but he does fall with style. His cape is made of ‘memory cloth’ which, when charged, realigns its molecules and becomes rigid, letting him use it to glide. Material just like this is actually in the works. One method of creating this effect uses magnetorheological fluid. This is usually a type of oil containing lots of tiny magnetic particles; when the fluid is subjected to a magnetic field, the particles line up and the fluid stiffens, sharply increasing its viscosity. So incorporating this into a kind of fabric could potentially turn Batman’s cape into a reality. However, there are people who glide similarly to Batman without the use of magnetorheological fluid.

Gliding using a wingsuit

They use wingsuits. A wingsuit adds surface area to the body and enables a significant increase in lift. Slits on the arms allow the suit to fill up with air, which permits the wearer to glide. By manipulating their body as they free fall through the air, wingsuit pilots are able to cover great horizontal distances as they plummet towards the ground.
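As a very rough illustration of how that extra surface area translates into lift, here is a sketch in Python; every number in it (speed, lift coefficient, areas, jumper mass) is an assumption picked for illustration rather than measured wingsuit data.

# Rough lift comparison: a body in free fall versus the same body in a wingsuit.
RHO = 1.2    # air density near sea level, kg/m^3
V = 40.0     # assumed forward airspeed, m/s
C_L = 0.6    # assumed lift coefficient for a gliding body

def lift_newtons(area_m2):
    """Aerodynamic lift L = 1/2 * rho * v^2 * C_L * A."""
    return 0.5 * RHO * V ** 2 * C_L * area_m2

weight = 90.0 * 9.81                 # a 90 kg jumper with gear, in newtons
print(lift_newtons(0.5) / weight)    # body alone (~0.5 m^2): about a third of the weight
print(lift_newtons(1.5) / weight)    # wingsuit (~1.5 m^2): close to the full weight, hence a sustained glide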

In the original comics Batman’s costume was made of cloth, but this became problematic as it was constantly being torn. So he redeveloped his suit, not only to stop it tearing, but also to stop bullets.

Molecular structure of Kevlar: bold represents a monomer unit, dashed lines indicate hydrogen bonds.

Kevlar is a para-aramid synthetic fibre that’s five times stronger than steel, making it perfect for Batman. It can be woven into a material, and when worn, the fibres are able to stretch, meaning it’s capable of stopping both bullets and knives. Although in the recent films, we notice he hasn’t got as much dexterity when it comes to hand-to-hand combat. In fact, he hasn’t got enough dexterity to cross a busy road. The Kevlar made his neck increasingly stiff, making it hard for him to react quickly when being attacked. Batman upgraded his suit to Kevlar plates placed on top of titanium-dipped tri-weave fibres. Just as strong, but it gives him more flexibility. This design parallels ceramic armour. Ceramic armour is 70% lighter than Kevlar and uses boron carbide, a black crystalline powder that, when heated, can be turned into ceramic plating that’s as strong as diamond. The military use ceramic plates on tanks and even insert them into soft ballistic vests, making them wearable and protecting soldiers from enemy fire.

Batman’s suit is also coated with Nomex, which is a flame-resistant material. When exposed to intense heat, the fibres carbonise and thicken, creating a barrier between the skin and the fire. The material doesn’t melt and it doesn’t burn, which makes it perfect for firefighters to use. In Batman v Superman: Dawn of Justice, Batman has again upgraded his suit to a hefty suit of armour designed to withstand the blows of super-strong extra-terrestrials. Plus a few other nifty features as well.

Despite the impressive nature of Batman’s suit, the most essential item of clothing he wears is his utility belt. It is packed with everything a vigilante needs: thermite bombs to get through doors, a re-breather to breathe under water and, the most famous of all his gadgets, the Batarang. Ben Affleck got to keep a Batarang from the movie!

Image from Batman: The Movie (1966). Adam West’s Batman using Shark Repellent Bat Spray

In Batman: The Movie (1966), Adam West’s Batman even uses shark repellent bat spray to get rid of a shark that’s munching on his leg as he dangles from a helicopter. This may sound ludicrous, but shark repellent spray does exist and it was specially designed to protect sailors stranded in open water. When it was being developed, scientists found that the thing that drives away a shark is the odour of a dead shark. Dissecting what exactly it is that sends them swimming, they found that certain copper compounds, such as copper sulphate and copper acetate, in combination with other ingredients, could replicate the smell of a dead shark. It’s actually purchasable online at a very reasonable price, so be like Batman, always prepared, and carry shark repellent in your belt just in case of a freak shark attack.

Another helpful tool is Batman’s grappling gun. He uses an air gun to shoot a grappling hook to the top of tall buildings, and it pulls him up at a fairly quick speed. In order to lift the Dark Knight quickly, the grappling gun would need a powerful motorised mechanism. A standard lightbulb uses roughly 60 watts; this motor would need at least 5000 watts to work, but it is doable. Atlas Devices is a global provider of innovative solutions for security and defence, and it uses grappling motors like this in powered ascenders for rescue and extraction missions. They have been specially designed so they can lift two people at a fast pace, increasing the chances of success during rescue missions.

Two of the founding members of Atlas Devices demonstrating their Atlas Powered Ascender (APA)
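The “at least 5000 watts” figure is easy to sanity-check: the power needed is roughly weight times climbing speed. Here is a minimal sketch, with Batman’s mass and ascent speed as assumed values.

# Back-of-the-envelope check of the grapple motor's power requirement.
G = 9.81        # gravitational acceleration, m/s^2
MASS = 100.0    # assumed mass of Batman plus suit and belt, kg
SPEED = 5.0     # assumed ascent speed, m/s (a brisk vertical climb)

power_ideal = MASS * G * SPEED    # mechanical power just to lift the load, W
power_real = power_ideal / 0.8    # allow for ~80% motor and winch efficiency
print(power_ideal)   # ~4,900 W
print(power_real)    # ~6,100 W, in line with the 5000+ watt estimate above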

It is impossible to grow up to become Superman, but it is technically possible to grow up to become Batman. And it’s definitely possible to become best friends, like little bears 🙂

We might give Batman v Superman: Dawn of Justice another go, but for now it’s time to watch Wonder Woman. Again 🙂

Original story on Science Made Simple, heavily biased towards Batman, on account of the fact that in the battle between brains and brawn, brains win all the time. If only! That’s the wishful thinking of a scientist 🙂

Science in the Soul

“I think it’s high time,” writes Richard Dawkins in the introduction to Science in the Soul, “the Nobel Prize for Literature was awarded to a scientist.”

We agree. Dawkins doesn’t mean to imply that he might be one such deserving scientist. Carl Sagan, Stephen Jay Gould, Oliver Sacks and now Stephen Hawking are out of contention, since the prize is not awarded posthumously, but there are Brian Greene, Steven Pinker, Jared Diamond, Neil deGrasse Tyson, James Gleick, Michio Kaku and others…

The laureates for Literature have caused quite a stir at times. The Swedish Academy’s decision to bestow its distinguished literary award, and the accompanying $1.1 million prize, on Bob Dylan in 2016 unleashed a storm of criticism, with many arguing that the American musician and songwriter did not deserve an award typically bestowed on novelists, dramatists and writers of non-fiction. Dylan’s reluctant acknowledgement of the award and his decision not to attend the official award ceremony only added fuel to the fire.

In this era of “alternative facts” and with science under siege, maybe it’s about time a science writer was awarded the prize. But it’s not going to happen this year. The Swedish Academy is dealing with allegations of sexual misconduct, financial malpractice and repeated leaks, and announced earlier this month that no Nobel Prize for Literature will be awarded this year.

If they were to head down the science path, just about the first person the Nobel organizers would bump into would be Richard Dawkins.

Debates about the most influential science book of all time habitually settle into a face-off between Darwin’s Origin of Species and Newton’s Principia Mathematica. But last year, a poll to celebrate the 30th anniversary of the Royal Society Science Book Prize returned a more recent winner: Richard Dawkins’s The Selfish Gene.

Dawkins took a decisive 18% of the vote, while Darwin was jostled into third place by Bill Bryson’s A Short History of Nearly Everything in the Royal Society poll of more than 1,300 readers. As interesting as the votes on the 10 books shortlisted for contention was the often passionate championship of titles that were left off the list. They were dominated by physics and cosmology. Silly not to include David Deutsch, sniffed one of many, who cited a range of works by the Oxford-based quantum physicist. Carl Sagan’s “mind-blowing” 1980 TV tie-in Cosmos garnered a clutch of votes from fans who described it as life-changing.

A less inspiring picture emerges from a crunch of the ratio of recommendations by gender, unsurprisingly perhaps in the context of a prize that only had its first female winner – Gaia Vince – in 2015. Of 313 suggestions outside the shortlisted books, fewer than 20 were for books by women – but they win out on imaginative titles. Hats off to Elizabeth Royte for The Tapir’s Morning Bath and to Robin Wall Kimmerer for Braiding Sweetgrass – and above all, to primatologist Jane Goodall, who summed it up in the five words of her 1971 title: In the Shadow of Man.

The top 10 most influential science books of all time – from the shortlist
The Selfish Gene by Richard Dawkins – 236 votes
A Short History of Nearly Everything by Bill Bryson – 150 votes
On the Origin of Species by Charles Darwin – 118 votes
The Natural History of Selborne by Gilbert White – 101 votes
Bad Science by Ben Goldacre – 88 votes
Fermat’s Last Theorem by Simon Singh – 81 votes
The Immortal Life of Henrietta Lacks by Rebecca Skloot – 77 votes
Silent Spring by Rachel Carson – 39 votes
Married Love by Marie Carmichael Stopes – 5 votes
The Science of Life by HG Wells, Julian Huxley and GP Wells – 4 votes

Out of this list, five authors are eligible for the Nobel Prize as they are still alive. And since the members of the Swedish Academy are busy infighting and have lost public confidence, maybe others should step up to the task.

It was in 1976 that Richard Dawkins suggested, in the opening words of The Selfish Gene, that, were an alien to visit Earth, the question it would pose to judge our intellectual maturity would be: “Have they discovered evolution yet?” We had, of course, by the grace of Charles Darwin and a century of evolutionary biologists who had been trying to figure out how natural selection actually worked. That year, The Selfish Gene became the first real blockbuster popular science book, a poetic mark in the sand to the public and scientists alike: this idea had to enter our thinking, our research and our culture.

The original book cover, illustrated by Desmond Morris.

The idea was this: genes strive for immortality, and individuals, families, and species are merely vehicles in that quest. The behaviour of all living things is in service of their genes; hence, metaphorically, the genes are selfish. Before this, it had been proposed that natural selection was honing the behaviour of living things to promote the continuance through time of the individual creature, or family, or group or species. But in fact, Dawkins said, it was the gene itself that was trying to survive, and it just so happened that the best way for it to survive was in concert with other genes in the impermanent husk of an individual.

This gene-centric view of evolution also began to explain one of the oddities of life on Earth – the behaviour of social insects. What is the point of a worker bee, doomed to remain childless and in the service of a totalitarian queen? Suddenly it made sense that, with the gene itself steering evolution, the fact that the worker shares its DNA with the queen means that its servitude guarantees not the individual’s survival, but the endurance of the genes they share. Or as the Anglo-Indian biologist JBS Haldane put it: “Would I lay down my life to save my brother? No, but I would to save two brothers or eight cousins.”
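Haldane’s quip is really just relatedness arithmetic. Here is a toy sketch of the bookkeeping behind it; it is a deliberately minimal illustration of Hamilton’s rule, not a biological model.

# Haldane's arithmetic: how many relatives "add up to" one copy of yourself?
# r is the coefficient of relatedness: the expected fraction of genes shared by descent.
RELATEDNESS = {
    "identical twin": 1.0,
    "full sibling": 0.5,
    "first cousin": 0.125,
}

def break_even_count(r):
    """Smallest number of relatives saved so that r * count reaches 1."""
    return 1 / r

for kin, r in RELATEDNESS.items():
    print(kin, break_even_count(r))
# full siblings: 2, first cousins: 8 - hence "two brothers or eight cousins".
# Hamilton's rule generalises the idea: altruism is favoured when r * benefit > cost.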

These ideas were espoused by only a handful of scientists in the middle decades of the 20th century – notably Bob Trivers, Bill Hamilton, John Maynard Smith and George Williams. In The Selfish Gene, Dawkins did not merely recapitulate them; he made an impassioned argument for the reality of natural selection. Previous attempts to explain the mechanics of evolution had been academic and rooted in maths. Dawkins walked us through it in prose. Many great popular science books followed – Stephen Hawking’s A Brief History of Time, Steven Pinker’s The Blank Slate, and, recently, The Vital Question by Nick Lane.

More books followed from Dawkins, including The Extended Phenotype (1982), The Blind Watchmaker (1986), which won the Royal Society of Literature Award in 1987, and River Out of Eden (1995). Dawkins particularly sought to address a growing misapprehension of what exactly Darwinian natural selection entailed in Climbing Mount Improbable (1996).

Though much of Dawkins’s oeuvre generated debate for asserting the supremacy of science over religion in explaining the world, nothing matched the response to the polemical The God Delusion (2006). The book relentlessly points out the logical fallacies in religious belief and ultimately concludes that the laws of probability preclude the existence of an omnipotent creator. Dawkins used the book as a platform to launch the Richard Dawkins Foundation for Reason and Science (2006), an organization that, in dual American and British incarnations, sought to foster the acceptance of atheism and championed scientific answers to existential questions. Along with fellow atheists Christopher Hitchens, Sam Harris, and Daniel C. Dennett, he embarked on a campaign of lectures and public debates proselytizing and defending a secular worldview. Dawkins launched the Out Campaign in 2007 in order to urge atheists to publicly declare their beliefs.

In the memoir An Appetite for Wonder: The Making of a Scientist (2013), Dawkins chronicled his life up to the publication of The Selfish Gene. A second volume of memoir, Brief Candle in the Dark: My Life in Science (2015), recorded episodes from the latter part of his career.

Dawkins’s prose is lucid and powerful, his arguments difficult to counter. Science in the Soul is Dawkins’s 14th book and contains the usual vivid explanations of Darwinism and kin selection, replicators and phenotypes, written in sentences that grab you by the throat. He is brilliant, as ever, at evoking a sense of wonder about nature – a fly’s compound eye, the waggle-dance of a honeybee – then showing how the scientific reality is infinitely more complex and beautiful than the appeal to the supernatural.

It is a shame that Dawkins is now perhaps better known for his irritable contempt for religion, since his true legacy is The Selfish Gene and its profound effect on multiple generations of scientists and lay readers.

Setting aside the response to his views on religion and politics, there have been plenty of attacks on the idea of the selfish gene. The Selfish Gene has been attacked variously by philosophers, comedians, vicars and journalists. Much of the enmity stems from people misunderstanding that selfishness is being used as a metaphor. The irony of these attacks is that the selfish gene metaphor actually explains altruism. We help others who are not directly related to us because we share similar versions of genes with them.

Richard Dawkins is one of the great thinkers of the 20th and 21st centuries. He is erudite, considered in his statements and unapologetic in his insistence that facts, empirical evidence and reason take centre stage. He received a standing ovation at the end of his talk and rightly so!

Consider this statement from Kurt Wise, who has a Ph.D. in paleontology and an M.A. in geology from Harvard University, and a B.A. in geology from the University of Chicago:

Although there are scientific reasons for accepting a young earth, I am a young age creationist because that is my understanding of the Scripture. As I shared with my professors years ago when I was in college, if all the evidence in the universe turns against creationism, I would be the first to admit it, but I would still be a creationist because that is what the Word of God seems to indicate.

Here is someone who, despite having the ability and knowledge to engage in critical thinking and think for himself, deliberately chooses not to. And admits it out loud!

A number of Stanford studies became famous for the contention that people can’t really think straight and reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now.

Richard Dawkins doesn’t have any answers for how people can embrace critical thinking and empirical evidence and reason. But he most definitely does not use abusive language towards anyone, individual or group, and while some of his statements might make you cringe, well… the truth sometimes hurts. Criticism is not ‘abuse’. People may get offended and hurt by honest criticism, but that’s still not abuse. And it is noticeable that many of the attacks on Dawkins are ill-tempered spats full of abusive language, while he remains calm and rational and backs up his arguments with evidence.

We don’t have the answers either, and it is quite likely that we have even less optimism than Dawkins that humanity as a whole will evolve to the next stage of overcoming the evolutionary and cultural conditioning to fully embrace facts, empirical evidence and reason.

Consider what’s become known as “confirmation bias”, the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford.

Some cognitive scientists prefer the term “myside bias”. Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.

An experiment performed by a group of cognitive scientists neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.

In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.

This lopsidedness might reflect the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.

And this social game has not changed much over centuries and millennia. The unwritten rules for success, failure, belonging, and other key attributes of people’s lives remain similar to hunter-gatherer times. People need to fit in, to behave in ways that are acceptable to the groups to which they belong. What has changed are the toys people use in playing the game.

Little bears don’t suffer from the hunter-gatherer conditioning, they weren’t around then! 🙂

It says that the book is written by a passionate rationalist for other passionate rationalists…

That’s us!

The Curious Mr Feynman

Richard Feynman was a curious character.

He advertised as much in the subtitle of his autobiography, Surely You’re Joking, Mr. Feynman!: Adventures of a Curious Character. Everybody knew that, in many respects, Feynman was an oddball.

But he was curious in every other sense of the word as well. His curiosity about nature, about how the world works, led to a Nobel Prize in physics and a legendary reputation, both among physicists and the public at large.

Feynman was born 100 years ago May 11. It’s an anniversary inspiring much celebration in the physics world. Feynman was one of the last great physicist celebrities, universally acknowledged as a genius who stood out even from other geniuses.

Another Nobel laureate, Hans Bethe, a Cornell University physicist who worked with Feynman during World War II on the atomic bomb project at Los Alamos (and later on the Cornell faculty), referred to Feynman as a magician. “Normal” geniuses, Bethe said, did things much better than other people, but you could figure out how they did it. And then there were magicians. “Feynman was a magician. I could not imagine how he got his ideas,” Bethe said. “He was a phenomenon. Feynman certainly was the most original physicist I have seen in my life, and I have seen lots of them.”

Feynman was a master conjuror of physics. A mathematical whizz with exceptional intuition, he seemed to pull solutions out of thin air. He crafted a lexicon for particle interactions: iconic squiggles, loops and lines now known as Feynman diagrams. His Nobel-prizewinning work on quantum electrodynamics included methods that even he saw as a sleight-of-hand for removing infinite terms from calculations. Yet, his results — equivalent to more systematic, rigorously expounded mathematical techniques independently proposed by co-laureates Julian Schwinger and Sin-Itiro Tomonaga — matched atomic-physics data beautifully.

Apart from his brilliance as a physicist, Feynman was also known for his skill at playing the bongo drums and cracking safes. Public acclaim came after he served on the presidential commission investigating the explosion of the space shuttle Challenger. In a dramatic moment during a hearing about that disaster, he dipped material from an O-ring (a crucial seal on the shuttle’s rockets) into icy water, demonstrating that an O-ring would not have remained flexible at the launch-time temperature.

His autobiography had already become a best seller, so Feynman was well-known when he died in February 1988.

John Wheeler, Feynman’s doctoral adviser at Princeton University before World War II, said then: “I felt very lucky to have him as my graduate student. There was an immense vitality about Feynman. He was interested in all kinds of problems.”

Feynman’s curiosity was not satisfied merely by being told the solution to a problem, though.

“If you said you had the answer to something, he wouldn’t let you tell it,” Wheeler said. “He had to stand on his head and pace up and down and figure out the answer for himself. It was his way of keeping the ability to make headway into brand new frontiers.”

Feynman found fascination in all sorts of things, some profound, some trivial. In his autobiography, he revealed that he spent a lot of time analyzing ant trails. He sometimes entertained Wheeler’s children by tossing tin cans into the air and then explaining how the way the can turned revealed whether the contents were solid or liquid.

Curiosity of that type was instrumental in the work that led to his Nobel Prize. While eating in the Cornell cafeteria, Feynman noticed someone tossing a plate, kind of like a Frisbee. As the plate flew by, Feynman noticed that the Cornell medallion on the plate was rotating more rapidly than the plate was wobbling. He performed some calculations and showed that the medallion’s rotation rate should be precisely twice the rate of the wobbling. He then perceived an analogy to a problem he had been investigating relating to the motion of electrons. The wobbling plate turned out to provide the clue he needed to develop a new version of the theory of quantum electrodynamics.

“The whole business that I got the Nobel Prize for came from that piddling around with the wobbling plate,” he wrote in his autobiography.

It was not curiosity alone that made Feynman a legend. His approach to physics and life incorporated a willful disdain for authority. He regularly disregarded bureaucratic rules, ignored expert opinion and was willing to fearlessly criticize the most eminent of other scientists.

During his time at Los Alamos, for instance, he encountered Niels Bohr, the foremost atomic physicist of the era. Other physicists held Bohr in awe. “Even to the big shot guys,” Feynman recalled, “Bohr was a great god.” During a meeting in which the “big shots” deferred to Bohr, Feynman kept pestering him with questions. Before the next meeting, Bohr called Feynman in to talk without the big shots. Bohr’s son (and assistant) later explained why. “He’s the only guy who’s not afraid of me, and will say when I’ve got a crazy idea,” Niels had said to his son. “So next time when we want to discuss ideas, we’re not going to be able to do it with these guys who say everything is yes, yes, Dr. Bohr. Get that guy and we’ll talk with him first.”

Feynman knew that he sometimes made mistakes. Once he foolishly even read some papers by experts that turned out to be wrong, retarding his work on understanding the form of radioactivity known as beta decay. He vowed never to make the mistake of listening to “experts” again.

“Of course,” he ended one chapter of his autobiography, “you only live one life, and you make all your mistakes, and learn what not to do, and that’s the end of you.”

Feynman’s books urge us to explore the world with open-minded inquisitiveness, as if encountering it for the first time. He worked from the idea that all of us could aspire to take the same mental leaps as him. But, of course, not every ambitious young magician can be a Harry Houdini. Whereas other educators might try to coddle those who couldn’t keep up, Feynman never relented. The essence of his philosophy was to find something that you can do well, and put your heart and soul into it. If not physics, then another passion — bongos, perhaps.

Original stories on Science News and Nature.

An Evening With Galileo and Bach

It’s World Astronomy Day and little bears are planet gazing 🙂 while listening to the Galileo Project, an imaginative concert from Tafelmusik, commemorating Galileo’s first public demonstration of the telescope. Galileo is considered the father of modern astronomy.

Out of This World

Galileo was born in Pisa on the day that Michelangelo died. In truth, it was probably about a week later, but the records were tweaked to make it seem so. The connection was real, and deep. Galileo spent his life as an engineer and astronomer, but his primary education was almost exclusively in what we would call the liberal arts: music, drawing, poetry and rhetoric — the kind of thing that had made Michelangelo’s Florence the capital of culture in the previous hundred years.

As an interesting aside, Galileo was originally buried in 1642 at the Novitiate Chapel in Santa Croce, under the campanile. He was not allowed a Christian burial inside the church because he had asserted that the Earth revolved around the sun and had been condemned by the Church for it. Ninety-five years later, in 1737, his body was moved to a marble sarcophagus inside the Church of Santa Croce. His tomb is located directly across from Michelangelo’s monument, for ‘it was believed that Michelangelo’s spirit leapt into Galileo’s body between the former’s death and the latter’s birth.’

Galileo was afflicted with a cold and crazy mother — after he made his first telescope, she tried to bribe a servant to betray its secret so that she could sell it on the market! — and some of the chauvinism that flecks his life and his writing may have derived from weird-mom worries. He was, however, very close to his father, Vincenzo Galilei, a lute player and, more important, a musical theorist. Vincenzo wrote a book, startlingly similar in tone and style to the ones his son wrote later, ripping apart ancient Ptolemaic systems of lute tuning, as his son ripped apart Ptolemaic astronomy. Evidently, there were numerological prejudices in the ancient tuning that didn’t pass the test of the ear. The young Galileo took for granted the intellectual freedom conceded to Renaissance musicians. The Inquisition was all ears, but not at concerts.

Part of Galileo’s genius was to transfer the spirit of the Italian Renaissance in the plastic arts to the mathematical and observational ones. He took the competitive, empirical drive with which Florentine painters had been looking at the world and used it to look at the night sky. The intellectual practices of doubting authority and trying out experiments happened on lutes and with tempera on gesso before they turned toward the stars. You had only to study the previous two centuries of Florentine drawing, from the rocky pillars of Masaccio to the twisting perfection of Michelangelo, to see how knowledge grew through a contest in observation. As the physicist and historian of science Mark Peterson points out, the young Galileo used his newly acquired skills as a geometer to lecture on the architecture of Hell as Dante had imagined it, grasping the hidden truth of “scaling up”: an Inferno that big couldn’t be built on classical engineering principles. But the painters and poets could look at the world, safely, through the lens of religious subjects; Galileo, looking through his lens, saw the religious non-subject. They looked at people and saw angels; he looked at the heavens, and didn’t.

In the 1580s, Galileo studied at the University of Pisa, where he absorbed the Aristotelian orthodoxy of his time — one as synthetic as most orthodoxy is. There were Arab-spiced versions of Aristotle, which led first to alchemy and then to chemistry; more pious alternatives merged the Greek philosopher with St. Thomas Aquinas. They all agreed that what made things move in nature was an impetus locked into the moving things themselves. The universe was divided into neat eternal zones: the earth was rough, rugged, and corrupt with mortality, and therefore had settled in, heavy and unhappy, at the center of the universe. Things up above were pure and shining and smooth, and were held aloft, like the ladies in the Renaissance romances, by the conceited self-knowledge of their perfection. Movement was absolute. Things had essences, constantly revealed. You could know in advance how something would move or act by knowing what it was. A brick and a cannonball, dropped from a tower, would fall at different rates based on their weight. And the best argument, often the only argument, for all these beliefs was that Aristotle had said so, and who were you to say otherwise?

Galileo soon began to have doubts about this orthodoxy, which he aired in conversation with friends and then in correspondence with other natural philosophers in Europe, particularly the great German astronomer Johannes Kepler. Mail was already the miracle of the age. In correspondence, the new science passed back and forth through Europe, almost as fluidly as it does in the e-mail era. It’s astonishing to follow the three-way correspondence among Tycho Brahe, Kepler and Galileo, and see how little time was lost in disseminating gossip and discovery. Human curiosity is an amazing accelerant.

Kepler encouraged Galileo to announce publicly his agreement with the sun-centered cosmology of the Polish astronomer monk Copernik, better known to history by the far less euphonious, Latinized name of Copernicus. His system, which greatly eased astronomical calculation, had been published in 1543, to little ideological agitation. It was only half a century later, as the consequences of pushing the earth out into plebeian orbit dawned on the priests, that it became too hot to handle, or even touch.

In 1592, Galileo made his way to Padua, right outside Venice, to teach at the university. He promised to help the Venetian Navy, at the Arsenale, regain its primacy, by using physics to improve the placement of oars on the convict-rowed galleys. Once there, he earned money designing and selling new gadgets. He made a kind of military compass and fought bitterly in support of his claim to have invented it. Oddly, he also made money by casting horoscopes for his students and wealthy patrons. (Did he believe in astrology? Maybe so. He cast them for himself and his daughters, without being paid.)

If you were trying to choose the best places in history to have lived — making allowances for syphilis, childbirth mortality, and all the other pre-antibiotic plagues — Venice in Galileo’s day would have to be high on the list. The most beautiful of cities, with the paint still wet on the late Bellinis and Titians, Venice also had wonderful music, geisha-like courtesans, and a life of endless, mostly free conversation. Galileo called these years the happiest of his life.

He became an ever more convinced Copernican, but he had his crotchets. He never accepted Kepler’s proof that the orbits of the planets in the Copernican system had to be ellipses, because he loved the perfection of circles; and he was sure that the movement of the tides was the best proof that the earth was turning, since the ocean water on the earth’s surface was so obviously sloshing around as it turned. The truth — that the moon was pulling the water at a distance — seemed to him obvious nonsense, and he never tired of mocking it.

Although Copernicus didn’t see any big ideas flowing from the sun-centered system, the Church was slowly beginning to suspect that heliocentrism, heretically, elbowed man right out of the center of things. Galileo alone saw something more: the most interesting thing about the earth’s spinning at high speeds around the sun was that, in the normal course of things, none of us noticed. One of the deepest insights in the history of thought was his slowly developed idea of what we now call the “inertial system”: the idea that the physics stays the same within a system whether it’s in rapid movement or at rest — indeed, that “rest” and “movement” are relative terms. Physical laws, he insisted, are the same in all inertial systems. We experience the earth as stable and still, but it might well be racing around the cosmos, just as we could lock ourselves up in the hold of a ship and, if it was moving evenly, never know that it was moving at all. Fast and slow, large and small, up and down are all relative conditions, and change depending on where you stand and how fast you’re moving. The idea demolished absolutes and democratized the movement of the spheres. Galileo grasped some of the significance of what he had discovered, writing later that “to our natural and human reason, I say that these terms ‘large’, ‘small’, ‘immense’, ‘minute’, etc. are not absolute but relative; the same thing in comparison with various others may be called at one time ‘immense’ and at another ‘imperceptible’.” But he saw only sporadically just how far you could push the principle: he saw the sun at the center of things, and didn’t reflect, at any length, that the sun might itself be turning around some other star.
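Galileo’s claim that the physics looks the same inside any evenly moving system can be put in a single line of algebra. Here is a small symbolic sketch (it uses the sympy library; the constant ship speed v is the only assumption):

# Galilean relativity in one check: shifting to a viewpoint moving at constant
# speed v changes positions and velocities, but not accelerations - and in
# Newtonian physics it is accelerations that the laws of motion care about.
import sympy as sp

t, v, x0, u, a = sp.symbols("t v x0 u a")
x_shore = x0 + u * t + sp.Rational(1, 2) * a * t ** 2   # motion seen from the shore
x_ship = x_shore - v * t                                # the same motion seen from a ship moving at speed v

print(sp.diff(x_shore, t, 2))  # a
print(sp.diff(x_ship, t, 2))   # a - identical, so the ship's passengers see the same physics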

In 1609, Galileo heard rumors about a Dutch gadget that gave you a closeup look at faraway ships and distant buildings. After a friend sent him the dimensions and the basic layout — two lenses in a 48-inch tube — he got to work, and within weeks had made his own telescope. One night in December, he turned it on the moon, and saw what no man had seen before. Or, rather, since there were Dutch gadgets in many hands by then, and many eyes, he understood what he was seeing as no man of his time had before — that shadows from some of the splotches were craters and others mountains. The moon was not a hard, pure sphere; it was geological.

A few weeks later, he pointed his gadget at Jupiter. Some of his notes, scratched on the back of an envelope, still exist, at the Morgan Library in New York. He was startled to see four little stars near the planet. In an episode in the history of thought that can still make the heart beat faster, he noticed that, night after night, they were waltzing back and forth near the big planet: first left, then right, never quite clearing its path, as though the planet were sticky and they wanted to stay near it. In a flash of intuition, he had it: the new stars near Jupiter were actually moons, orbiting the planet as our moon orbits us. So their light might be reflected light, as is our moon’s. All moonlight might be sunshine, bounced off a hall of celestial mirrors. More important still, there in the sky was a miniature Copernican system, visible to the aided eye.

It’s hard to overstate how important the telescope was to Galileo’s image. It was his emblem and icon, the first next big thing, the ancestor of Edison’s light bulb and Steve Jobs’s iPhone. A Tuscan opportunist to the bone, Galileo rushed off letters to the Medici duke in Florence, hinting that, in exchange for a job, he would name the new stars after the Medici. He wanted to go back to Florence, partly, it seems, because he wanted to persuade the smart, well-educated Jesuits who clustered there to accept his world picture. Sell the powerful Jesuits on the New Science, he thought, and you wouldn’t have to worry about the Inquisition or the Pope. Galileo already felt himself under enough religious pressure that he continued to encode all talk of his discoveries in his correspondence with Kepler. He even sent him a letter about the phases of Venus in cipher, ending, “Oy!” Really, he did. His biographer J. L. Heilbron suggests, smilingly, that this hints at Jewish ancestry. (No evidence exists that Kepler replied “Vey!”)

Throughout Italy, the Inquisition was what Heilbron calls “low-level background terrorism”. (One of Galileo’s servants had already reported him for not going to Mass regularly.) It was an Italian Inquisition, meaning subject to the laws and influences of clan, and cheerfully corrupt, but disinclined to killing. Disinclined but not incapable; as recently as 1600, the Roman Inquisition had burned alive, in public, the great Giordano Bruno, who taught the doctrine of the plurality of worlds, uncomfortably like Galileo’s doctrine of many moons. It was unusual for the Inquisition to burn philosophers alive; on the other hand, how many philosophers do you have to burn alive to keep other philosophers from thinking twice before they say anything inflammatory?

The Catholic Church in Italy then was very much like a Communist Party today: an institution in which few of the rulers took their own ideology seriously but still held a monopoly on moral and legal authority, and also the one place where ambitious, intelligent people could rise, even without family connections (though they helped). The Church was pluralistic in practice about everything except an affront to its core powers.

For the next two decades, Galileo tried to do what we would now call basic research while simultaneously negotiating with the Church to let him do it. Eventually, he and the Church came to an implicit understanding: if he would treat Copernicanism merely as a hypothesis, rather than as a truth about the world, it would be acceptable — if he would claim his work only as “istoria,” not as “dimostrazione”, the Inquisitors would leave him alone. The Italian words convey the same ideas as the English equivalents: a new story about the cosmos to contemplate for pleasure is fine, a demonstration of the way things work is not. You could calculate, consider, and even hypothesize with Copernicus. You just couldn’t believe in him.

Galileo even seems to have had six interviews with the sympathetic new Pope, Urban VIII — a member of the sophisticated Barberini family — in which he was more or less promised freedom of expression in exchange for keeping quiet about his Copernicanism. It was a poisoned promise: though Galileo, vain as ever, thought he could finesse the point, Copernicanism was at the heart of what he wanted to express.

It all came to a head in 1632, with the publication of his masterpiece, manifesto, poem, and comedy, Dialogue Concerning the Two Chief World Systems. Set in Venice as a conversation among three curious friends, the book was in part an evocation of happy times there — a highly stylized version of the kinds of evenings and conversations Galileo had once had. It was in honor of those evenings that he named two of the characters after his friends: Salviati, who here speaks entirely for Galileo, and Sagredo, who represents an honest non-scientist of common sense. He invented a third puppet, Simplicio, who speaks, stumblingly, for Aristotle and the establishment — the other World System. Salviati describes him as “one of that herd who, in order to learn how matters such as this take place, do not betake themselves to ships or crossbows or cannons, but retire into their studies and glance through an index and a table of contents to see whether Aristotle has said anything about them.” Aristotle is to Simplicio one of those complete thinkers, of the Heidegger or Ayn Rand kind, whose every thought must be true even if you can’t show why it is in this particular instance: it explains everything except anything.

Dialogue Concerning the Two Chief World Systems is the most entertaining classic of science ever published. Written in the vernacular — the best modern translation is by Stillman Drake — it uses every device of Renaissance humanism: irony, drama, comedy, sarcasm, pointed conflict, and a special kind of fantastic poetry. There are passages that are still funny, four hundred years later. At one point, the dispute takes up the high-minded Aristotelian view that “corrupt” elements must have trajectories different from pure ones, and Sagredo points out that an Aristotelian author “must believe that if a dead cat falls out of a window, a live one cannot possibly fall, too, since it is not a proper thing for a corpse to share in qualities suitable for the living.” The dialogue is also philosophically sophisticated. Though Galileo/Salviati wants to convince Simplicio and Sagredo of the importance of looking for yourself, he also wants to convince them of the importance of not looking for yourself. The Copernican system is counterintuitive, he admits — the earth certainly doesn’t seem to move. It takes intellectual courage to grasp the argument that it does.

Galileo’s tone is thrilling: he is struggling to find things out, and his eye covers everything from the movement of birds in the air to the actual motion of cannonballs fired at the horizon, from the way stars glow to the way all movable bones of animals are rounded. There’s even a lovely moment when, trying to explain to Simplicio how deceptive appearances can be, Sagredo refers to “the appearance to those who travel along a street by night of being followed by the moon, with steps equal to theirs, when they see it go gliding along the eaves of the roofs.” You can’t trust your eyes, but you can’t trust old books, either. What can you trust? Nothing, really, is Galileo/Salviati’s answer, only some fluid mixture of sense impression and strong argument. “Therefore, Simplicius, come either with arguments and demonstrations,” Salviati declares, in Thomas Salusbury’s fine Jacobean translation, in words that remain the slogan of science, “and bring us no more Texts and authorities, for our disputes are about the Sensible World, and not one of Paper.”

Contemporary historians of science have a tendency to deprecate the originality of the so-called scientific revolution, and to stress, instead, its continuities with medieval astrology and alchemy. And they have a point. It wasn’t that one day people were doing astrology in Europe and then there was this revolution and everyone started doing astronomy. Newton practiced alchemy; Galileo drew up all those horoscopes. But if you can’t tell the difference in tone and temperament between Galileo’s sound and that of what went before, then you can’t tell the difference between chalk and cheese. The difference is apparent if you compare what astrologers actually did and what the new astronomers were doing. The Arch-Conjuror of England (Yale), Glynn Parry’s entertaining biography of Galileo’s contemporary the English magician and astrologer John Dee, shows that Dee was, in his own odd way, an honest man and a true intellectual. He races from Prague to Paris, holding conferences with other astrologers and publishing papers, consulting with allies and insulting rivals. He wasn’t a fraud. His life has all the look and sound of a fully respectable intellectual activity, rather like, one feels uneasily, the life of a string theorist today.

The look and the sound of science… but it does have a funny smell. Dee doesn’t once ask himself, “Is any of this real or is it all just bullshit?” If it works, sort of, and you draw up a chart that looks cool, it counts. Galileo never stopped asking himself that question, even when it wasn’t bullshit but sounded as though it might well be. That’s why he went wrong on the tides; the-moon-does-it-at-a-distance explanation sounds too much like the assertion of magic. The temperament is not all-seeing and curious; it is, instead, irritable and impatient with the usual stories. The new stories might be ugly, but they’re not crap. “It is true that the Copernican system creates disturbances in the Aristotelian universe,” Salviati admits in the Dialogue, “but we are dealing with our own real and actual universe.”

What is so strange, and sad, given what would soon happen, is that Two Chief World Systems contains some of the best “accommodationist” rhetoric that has ever been written. To the objections that the Copernican universe, with its vast spaces outside the solar system, is now too big to be beautiful, Galileo has his puppets ask, Too big for whom? How presumptuous to say it is too big for God’s mind! God’s idea of beauty is surely different and more encompassing than ours. The truth that God has his eye on the sparrow means that the space between the sparrow and outer space is impossible for us to see as God sees it.

These are the arguments that, less eloquently put, are used now by smart accommodationists in favor of evolution. Evolution is not an alternative to intelligent design; it is intelligent design, seen from the point of view of a truly intelligent designer. Galileo was happy enough to go on doing research under the generally benevolent umbrella of the Church if only it would let him.

It wouldn’t let him. He provided every argument for toleration he could, and still he wasn’t tolerated. Part of the trouble was traceable to his hubris: he had remembered at the last minute to put the Pope’s favorite argument for a “hypothetical” reading of Copernicus into his book, but he had made it into a closing speech for Simplicio, and when you are going to put the Pope’s words in a puppet’s mouth it is a good idea first to make sure that the puppet is not named Dumbso. But it went deeper than the insult. Whatever might be said to accord faith and Copernicus, religion depends for its myth on a certain sense of scale. Small domestic dogmatists are always merely funny (like Alceste, in The Misanthrope, or the dad in just about any American sitcom). Man must be at the center of a universe on a stable planet, or else the core Catholic claim that the omnipotent ruler of the cosmos could satisfy his sense of justice only by sending his son here to be tortured to death begins to seem a little frayed. Scale matters. If Clark Kent had never left Smallville, then the significance of Superman would be much reduced.

Two books by the historian Thomas F. Mayer take up exactly what happened to Galileo: The Trial of Galileo (Toronto) is specifically about the scientist’s persecution by the Inquisition, while his much longer The Roman Inquisition: A Papal Bureaucracy and Its Laws in the Age of Galileo (Pennsylvania) delves into its social and intellectual context. Mayer deprecates the conventional account as, in the words of another scholar, “shrouded in myth and misunderstanding.” But, when you’ve read through his collected evidence, the myth seems pretty much right: Galileo wrote a book about the world saying that the earth goes around the sun, and the Church threatened to have him tortured or killed if he didn’t stop saying it, so he stopped saying it. Mayer believes that had Galileo been less pugnacious things would have worked out better for science; yet his argument is basically one of those “If you put it in context, threatening people with hideous torture in order to get them to shut up about their ideas was just one of the ways they did things then” efforts, much loved by contemporary historians.

To be sure, Galileo’s trial was a bureaucratic muddle, with crossing lines of responsibility, and it left fruitfully unsettled the question of whether Copernican ideas had been declared heretical or if Galileo had simply been condemned as an individual for continuing to promote them after he had promised not to. But what is certain is that, in 1633, Galileo was threatened with torture, forced on his knees to abjure his beliefs and his book, and then kept under house arrest and close watch for the rest of his life. (Albeit of a fairly loose kind: John Milton came to see him, and the image of the imprisoned scientist appears in Milton’s defense of free speech, the Areopagitica.) Galileo’s words, read a certain way, were not innocent of irony: “I do not hold the Copernican opinion, and have not held it after being ordered by injunction to abandon it.” Notice that he does not say that he never held it, or that he would not still hold it, had he not been forced to abandon it.

Once the book was published, who cared what transparent lies Galileo had to tell to save his life? Martyrdom is the test of faith, but the test of truth is truth. The best reason we have to believe in miracles is the miracle that people are prepared to die for them. But the best reason that we have to believe in the moons of Jupiter is that no one has to be prepared to die for them in order for them to be real.

So the scientist can shrug at the torturer and say, Any way you want me to tell it, I will. You’ve got the waterboard. The stars are still there. It may be no accident that so many of the great scientists really have followed Galileo, in ducking and avoiding the consequences of what they discovered. In the roster of genius, evasion of worldly responsibility seems practically a fixed theme. Newton escaped the world through nuttiness, Darwin through elaborate evasive courtesies and by farming out the politics to Huxley. Heisenberg’s uncertainty was political — he did nuclear-fission research for Hitler — as well as quantum-mechanical. Science demands heroic minds, but not heroic morals. It’s one of the things that make it move.

Out of This World

Original article in The New Yorker.

Out of this World

Another day, another party 🙂

Little bears are celebrating the Day of Beary Space Flight 🙂

It’s Lego play time!
Yay! We have a new Lego set!

In 2011, 12 April was declared the International Day of Human Space Flight, commemorating the first manned space flight, made on 12 April 1961 by the 27-year-old Soviet cosmonaut Yuri Gagarin. Gagarin circled the Earth for 1 hour and 48 minutes aboard the Vostok 1 spacecraft.

Three-quarter profile head-and-shoulders view of Soviet cosmonaut Major Yuri Alexeyevich Gagarin in pressure suit and helmet (faceplate raised), probably on or about April 12, 1961, when he made his orbital space flight in Vostok 1.
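The flight time squares nicely with basic orbital mechanics. A minimal sketch, assuming a representative circular low-Earth orbit at roughly 300 km altitude (Vostok 1’s real orbit was slightly elliptical):

# Rough period of a circular low Earth orbit, for comparison with Vostok 1's
# 108-minute flight, which also included the launch and re-entry phases.
import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m
ALTITUDE = 300e3      # assumed representative orbital altitude, m

r = R_EARTH + ALTITUDE
period_s = 2 * math.pi * math.sqrt(r ** 3 / MU_EARTH)
print(period_s / 60)  # ~90 minutes for one complete orbit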

On 12 April 1981, exactly 20 years after Vostok 1, the Space Shuttle Columbia was launched on the Shuttle programme’s first orbital flight, although this was a coincidence, as the launch had been delayed by two days.

Space shuttle astronauts John Young and Robert Crippen (in tan space suits) are greeted by members of the ground crew moments after stepping off the shuttle Columbia following its maiden flight.
OV-102 Columbia
Columbia and its crew of seven astronauts tragically perished during atmospheric re-entry at the end of mission STS-107 on February 1, 2003.

Since 2001, Yuri’s Night, also known as the World’s Space Party, has been held every 12 April worldwide to commemorate milestones in space exploration.

These cocktails are out of this world!

Little bears are going to watch their favourite space exploration film, The Martian 🙂

A teddy bear on Mars

Definitely!

Fourier’s 250th Anniversary

The 250th anniversary of Joseph Fourier’s birth has been added to the French national commemorations of 2018 by the High Committee of the French Academy.

Portrait of Joseph Fourier, mathematician

March 21 marks the 250th birthday of one of the most influential mathematicians in history. He accompanied Napoleon on his expedition to Egypt, revolutionized science’s understanding of heat transfer, developed the mathematical tools used today to create CT and MRI scan images, and discovered the greenhouse effect.

He wrote of mathematics: “There cannot be a language more universal and more simple, more free from errors and obscurities … Mathematical analysis is as extensive as nature itself, and it defines all perceptible relations.”
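The mathematical tools in question are Fourier series and the Fourier transform: the insight that almost any signal can be rebuilt from sinusoids. Here is a minimal numerical illustration in Python using numpy’s FFT; the test frequencies and amplitudes are arbitrary choices.

# Decompose a signal into its sinusoidal ingredients with the FFT.
# The same idea, suitably elaborated, underlies CT and MRI image reconstruction.
import numpy as np

FS = 1000                            # sampling rate, Hz
t = np.arange(0, 1, 1 / FS)          # one second of samples
# Arbitrary test signal: a 5 Hz and a 50 Hz sinusoid with different amplitudes.
signal = 2.0 * np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
amplitudes = 2 * np.abs(spectrum) / len(signal)

for f in (5, 50):
    idx = int(np.argmin(np.abs(freqs - f)))
    print(f, round(float(amplitudes[idx]), 2))   # recovers 5 Hz -> 2.0 and 50 Hz -> 0.5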

Jean-Baptiste-Joseph Fourier is the most illustrious citizen of Auxerre, the principal city of western Burgundy, where he was born on March 21, 1768. Both his father Joseph, a master tailor originally from Lorraine, and his mother Edmie died before he was ten years old. Fortunately, certain local citizens took an interest in the boy’s education and secured him a place in the progressive École Royale Militaire, one of a number run by the Benedictine and other monastic orders. Science and mathematics were taught there, among other subjects, and, while the boy displayed all-round ability, he had a special gift for mathematics. He went on from there to complete his studies in Paris at the Collège Montagu. His aim was to join either the artillery or the engineers, the branches of the army supposedly open to all classes of society, but when he applied he was turned down, despite a strong recommendation from Legendre, who was an inspector of the Écoles Militaires. Although he could have been rejected on medical grounds, the reason given by the minister was that only candidates of noble birth were acceptable.

After this setback Fourier embarked on a career in the church. He became a novice at the famous Benedictine Abbey of St. Benoît-sur-Loire, where he was called on to teach elementary mathematics to the other novices. After taking monastic vows he became known as Abbé Fourier (Father or Reverend), but instead of pursuing a career in the church he returned to Auxerre to teach at the École Militaire. By this time he was twenty-one and had already read a research paper at a meeting of the Paris Academy.

During the first years of the Revolution, Fourier was prominent in local affairs. His courageous defence of victims of the Terror led to his arrest by order of the Committee of Public Safety in 1794. A personal appeal to Robespierre was unsuccessful, but he was released after Robespierre himself was guillotined. Fourier then went as a student to the short-lived École Normale. The innovative teaching methods used there made a strong impression on him, and the school gave him the opportunity to meet some of the foremost mathematicians of the day, including Lagrange, Laplace and Monge. Fourier was amused when it emerged that, due to an administrative error, the proud Laplace had been enrolled as a student rather than a professor. The next year, when the École Polytechnique opened its doors, under its original name of the École Centrale des Travaux Publiques, Fourier was appointed assistant lecturer to support the teaching of Lagrange and Monge. However, before long he fell victim to the reaction against the previous regime and was arrested again. He had an anxious time in prison, but his colleagues at the École successfully sought his release.

In 1798 he was selected to join an expedition to an undisclosed destination. This proved to be Napoleon’s Egyptian adventure, Campagne d’Égypte. Once the newly formed Institut d’Égypte was established in Cairo, with Monge as its president and Fourier as permanent secretary, the cultural arm of the expedition set to work studying the antiquities, some of which were appropriated. On top of this activity Fourier was also entrusted with some negotiations of a diplomatic nature, and he even found time to think about mathematics. He proposed that a report be published on the work of the Institut d’Égypte, and on his return to France was consulted as to its organisation and deputed to write a historical preface describing the rediscovery of the wonders of the ancient civilisation. When the Description de l’Égypte (a twelve-volume report which founded modern Egyptology) was published, Fourier’s elegant preface, somewhat edited by Napoleon, appeared at the front of it.

Meanwhile Fourier had resumed his work at the École Polytechnique. Before long, however, Napoleon, who had been impressed by his capacity for administration, decided to appoint him prefect of the Département of Isère, based at Grenoble and extending to what was then the Italian border. The office of prefect is a demanding one, but it was during this period that Fourier wrote his classic monograph on heat diffusion, entitled On the propagation of heat in solid bodies, and presented it to the Paris Academy in 1807. It was examined by Lagrange, Laplace, Lacroix and Monge. Lagrange was adamant in his rejection of several of its features (essentially the central concept of trigonometric series or, as we now say, Fourier series) and so its publication in full was blocked; only an inadequate five-page summary appeared, written by Poisson. Outclassed as rivals in the theory of heat diffusion, Poisson and Biot tried for years to belittle Fourier’s achievements. Fourier later received a prize from the Academy for the work, but it was not until 1822 that his theory of heat diffusion was published.

Pierre-Simon de Laplace (L) and Joseph Louis Lagrange (R) were not initially convinced by Fourier’s work.

To quote from the preface to the Théorie Analytique de la Chaleur, this ‘great mathematical poem’ as Clerk Maxwell described it:

First causes are not known to us, but they are subjected to simple and constant laws that can be studied by observation and whose study is the goal of Natural Philosophy… Heat penetrates, as does gravity, all the substances of the universe; its rays occupy all regions of space. The aim of our work is to expose the mathematical laws that this element follows… But whatever the extent of the mechanical theories, they do not apply at all to the effects of heat. They constitute a special order of phenomena that cannot be explained by principles of movement and of equilibrium… The differential equations for the propagation of heat express the most general conditions and reduce physical questions to problems in pure Analysis that is properly the object of the theory.

One major novelty of his work was the systematic use of a decomposition of a general ‘signal’ (think of the sound of a violin) into the sum of many simpler ‘signals’ (think of the sound of many tuning forks). One of the British physicists who took up Fourier’s ideas and ran with them was William Thomson (later Lord Kelvin) of Thomson and Tait’s Treatise on Natural Philosophy. Thomson used Fourier’s ideas to understand why the first Atlantic telegraph cable failed and to ensure that the second cable succeeded.
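
For little bears who like to tinker, the tuning-fork idea can be tried out in a few lines of Python. The sketch below is ours, not Fourier’s or Thomson’s: it builds a signal from two “tuning fork” sinusoids and uses NumPy’s discrete Fourier transform to recover their frequencies. The sampling rate and pitches are just illustrative choices.

# Decompose a "violin-like" signal into its "tuning fork" components
# using NumPy's discrete Fourier transform (illustrative values throughout).
import numpy as np

fs = 1000                         # sampling rate in samples per second
t = np.arange(0, 1, 1 / fs)       # one second of time samples

# two tuning forks: 220 Hz and a quieter 440 Hz
signal = 1.0 * np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)

spectrum = np.abs(np.fft.rfft(signal))          # strength of each frequency
freqs = np.fft.rfftfreq(len(signal), 1 / fs)    # the frequencies themselves

# the two strongest components are the original tuning forks
print(sorted(freqs[np.argsort(spectrum)[-2:]]))   # [220.0, 440.0]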

As prefect, Fourier’s administrative achievements included securing the agreement of thirty-seven different communities to the drainage of a huge area of marshland to make valuable agricultural land, and the planning of a spectacular highway between Grenoble and Turin, of which only the French section was built. Napoleon conferred on him the title of baron, in recognition of his excellent work as prefect.

Fourier was still at Grenoble in 1814 when Napoleon fell from power. The city happened to be directly on the route of the party escorting the Emperor from Paris to the south and thence to Elba; to avoid an embarrassing meeting with his former chief, Fourier negotiated a detour in the route. But no such detour was possible when Napoleon returned on his march to Paris in 1815, and so Fourier compromised, fulfilling his duties as prefect by ordering the preparation of the defences – which he knew to be futile – and then leaving the town by one gate as Napoleon entered by another. His handling of this awkward situation did not affect their relationship. In fact the Emperor promptly gave him the title of count and appointed him prefect of the neighbouring Département of the Rhône, based at Lyon. However, before the end of the Hundred Days Fourier had resigned his new title and appointment in protest against the severities of the regime and returned to Paris to concentrate on scientific work.

This was the low point in Fourier’s life. For a short while he was without employment, subsisting on a small pension, and out of favour politically. However a former student at the École Polytechnique and companion in Egypt was now prefect of the Département of the Seine. He appointed Fourier director of the Statistical Bureau of the Seine, a post without arduous duties but with a salary sufficient for his needs.

Fourier’s last burst of creative activity came in 1817/18 when he achieved an effective insight into the relation between integral-transform solutions to differential equations and the operational calculus. There was at that time a three-cornered race in progress between Fourier, Poisson and Cauchy to develop such techniques. In a crushing response to a criticism by Poisson, Fourier exhibited integral-transform solutions of several equations which had long defied analysis, and paved the way for Cauchy to develop a systematic theory, en route to the calculus of residues.

In 1816 Fourier was elected to the reconstituted Académie des Sciences, but Louis XVIII could not forgive his acceptance of the Rhône prefecture from Napoleon and at first refused to approve the election. Diplomatic negotiation eventually resolved the situation and his renomination the next year was approved. He also had some trouble with the second edition of the Description de l’Égypte (for now his references to Napoleon needed revision) but in general his reputation was recovering rapidly. He was left in a position of strength after the decline of the Société d’Arcueil, and gained the support of Laplace against the enmity of Poisson. In 1822 he was elected to the powerful position of permanent secretary of the Académie des Sciences. In 1827, like d’Alembert and Laplace before him, he was elected to the literary Académie Française. Outside France he was elected to the Royal Society of London.

Fourier’s health was never robust, and towards the end of his life he began to display peculiar symptoms which are thought to have been due to a disease of the thyroid gland called myxoedema, possibly contracted in Egypt. As well as certain physical symptoms, the disorder can lead to a dulling of the memory, apparent in the rambling papers he wrote towards the end of his life. Perhaps it was also partly responsible for the unfortunate incident which occurred in February 1830 when he apparently mislaid the second paper on the solution of equations sent to the Academy by Galois for the competition for the Grand Prix de Mathématiques. The prize was awarded jointly to Niels Abel (posthumously) and Carl Jacobi for their work on elliptic functions.

Fourier was terminally ill by that time. Early in May 1830 he suffered a collapse and his condition deteriorated until he died on May 16, at the age of sixty-two. The funeral service took place at the church of St Jacques de Haut Pas and he was buried in the cemetery of Père Lachaise, close to the grave of Monge.

Today, Fourier’s name is inscribed on the Eiffel Tower. But more importantly, it is immortalized in Fourier’s law and the Fourier transform, enduring emblems of his belief that mathematics holds the key to the universe.

Visualisation of an approximation of a square wave by taking the first 1, 2, 3 and 4 terms of its Fourier series

This interactive animation will keep little bears occupied for hours 🙂
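
The square-wave picture above is easy to recreate at home. Here is a short Python sketch of the same idea, using the standard Fourier series of a square wave (only odd harmonics, each weighted by one over its frequency); the plotting details are our own choices.

# Partial sums of the Fourier series of a square wave:
#   4/pi * ( sin(x) + sin(3x)/3 + sin(5x)/5 + ... )
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 1000)

for n_terms in (1, 2, 3, 4):
    partial = sum(
        4 / np.pi * np.sin((2 * k - 1) * x) / (2 * k - 1)
        for k in range(1, n_terms + 1)
    )
    plt.plot(x, partial, label=f"{n_terms} term(s)")

plt.plot(x, np.sign(np.sin(x)), "k--", label="square wave")
plt.legend()
plt.show()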

Fourier’s law states that heat flows through a material at a rate proportional both to the temperature difference between its warmer and cooler regions and to the area across which the transfer takes place. For example, people who are overheated can cool off quickly by getting to a cool place and exposing as much of their body to it as possible.

Fourier’s work enables scientists to predict the future distribution of heat. Heat is transferred through different materials at different rates. For example, brass has a high thermal conductivity. Air, by contrast, is a poor conductor, which is why it is frequently used in insulation.
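
To see the law in action, here is a back-of-the-envelope calculation in Python. The conductivity figures are approximate textbook values (roughly 109 W/m·K for brass and 0.026 W/m·K for still air), and the slab dimensions are simply an example we made up.

# Fourier's law for a slab: heat flow q = k * A * dT / d
def heat_flow_watts(k, area_m2, delta_t_kelvin, thickness_m):
    """Steady-state heat flow through a slab of conductivity k."""
    return k * area_m2 * delta_t_kelvin / thickness_m

conductivity = {"brass": 109.0, "air": 0.026}   # W/(m·K), approximate values

for material, k in conductivity.items():
    q = heat_flow_watts(k, area_m2=1.0, delta_t_kelvin=20.0, thickness_m=0.01)
    print(f"{material}: about {q:,.0f} W")
# brass moves heat thousands of times faster than a layer of still air,
# which is why trapped air makes such good insulation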

Remarkably, Fourier’s equation applies widely to matter, whether in the form of solid, liquid or gas. It powerfully shaped scientists’ understanding of both electricity and the process of diffusion. It also transformed scientists’ understanding of flow in nature generally – from water’s passage through porous rocks to the movement of blood through capillaries.

Fourier applications

Modern medical imaging machines rely on another mathematical discovery of Fourier’s, the “Fourier transform”.

In CT scans, doctors send X-ray beams through a patient from multiple different directions. Some X-rays emerge from the other side, where they can be measured, while others are blocked by structures within the body.

With many such measurements taken at many different angles, it becomes possible to determine the degree to which each tiny block of tissue blocked the beam. For example, bone blocks most of the X-rays, while the lungs block very little. Through a complex series of computations, it’s possible to reconstruct the measurements into two-dimensional images of a patient’s internal anatomy.

Thanks to Fourier and today’s powerful computers, doctors can create almost instantaneous images of the brain, the pulmonary arteries, the appendix and other parts of the body. This in turn makes it possible to confirm or rule out the presence of issues such as blood clots in the pulmonary arteries or inflammation of the appendix.
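
For the curious, the whole pipeline can be simulated on a laptop. The sketch below uses scikit-image, our choice of library rather than anything from the original article: it takes a standard test image, computes the X-ray-like projections with the Radon transform, then reconstructs the image with filtered back-projection, whose filtering step is carried out with the Fourier transform.

# Simulated CT scan: project a test image at many angles, then reconstruct it.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

image = shepp_logan_phantom()                     # standard CT test image
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(image, theta=angles)             # the "measurements"
reconstruction = iradon(sinogram, theta=angles)   # filtered back-projection

print(f"mean reconstruction error: {np.abs(reconstruction - image).mean():.3f}")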

Fourier is also regarded as the first scientist to notice what we today call the greenhouse effect.

His interest was piqued when he observed that a planet as far away from the sun as Earth should be considerably cooler than the Earth actually is. He hypothesized that something about the Earth – in particular, its atmosphere – must enable it to trap solar radiation that would otherwise simply radiate back out into space.

Fourier created a model of the Earth involving a box with a glass cover. Over time, the temperature in the box rose above that of the surrounding air, suggesting that the glass continually trapped heat. Because his model resembled a greenhouse in some respects, this phenomenon came to be called the “greenhouse effect”.

Simply put: The earth is warmed by the sun’s radiation. The sun is very hot, so why is the earth not very hot? Because the earth reradiates heat. But if the earth radiates heat, why is it not much colder (as the moon is)? Because the atmosphere slows down the process of re-radiation.
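
The “not much colder” part can be made quantitative with a simple radiative balance. This is a modern textbook simplification, not Fourier’s own calculation, and the solar constant and albedo below are standard approximate values.

# Without an atmosphere, Earth's temperature settles where absorbed sunlight
# equals re-radiated heat: (1 - albedo) * S / 4 = sigma * T**4
S = 1361.0        # solar constant, W/m^2 (approximate)
albedo = 0.30     # fraction of sunlight reflected straight back (approximate)
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

T_bare = ((1 - albedo) * S / (4 * sigma)) ** 0.25
print(f"about {T_bare:.0f} K, or {T_bare - 273.15:.0f} degrees Celsius")
# roughly 255 K, far below the observed average of about 15 degrees Celsius;
# the difference is the warming Fourier attributed to the atmosphere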

Just as Fourier was the first to give an interesting answer to why the earth is the temperature it is, so John Tyndall (1820-1893) was the first to give an interesting answer to why the sky is blue.

John Tyndall (1820-1893).
Smithsonian Institution’s digital collection of portraits

His answer has undergone substantial modifications by Lord Rayleigh and Albert Einstein, but his general idea of atmospheric scattering has proved correct.

A keen Alpine climber, he was fascinated by glaciers and worked on their flow. Glaciers led him to ice ages and thence to the problem of the earth’s temperature.

By experiment, he was able to identify those gases, primarily water and carbon dioxide, whose presence interferes with the passage of heat radiation. Ice ages could, perhaps, be accounted for by relatively small changes in the chemical composition of the atmosphere.

It was not until the development of quantum theory that Tyndall’s discoveries could be understood from a theoretical perspective, and not until the middle of the 20th century that the complexities of the misnamed Greenhouse Effect were understood (what happens in greenhouses is rather different).

Mankind may now be in the position of a lobster in a very slowly warming pot but, thanks to people like Fourier, Tyndall and their successors, we do, at least, know what is happening to us.

Happy Pi Day

It’s Pi Day again, and so there are pie-eating contests, bickering over the merits of pi versus tau (pi times two), and throwdowns over who can recite more digits of π.

π does deserve a celebration, but for reasons that are rarely mentioned. In high school, we all learned that π is about circles. π is the ratio of a circle’s circumference (the distance around the circle, represented by the letter C) to its diameter (the distance across the circle at its widest point, represented by the letter d). That ratio, which is about 3.14, also appears in the formula for the area inside the circle, A = πr², where r is the circle’s radius (the distance from centre to rim).

Pi explained

The beauty of π is that it puts infinity within reach. The digits of π never end and never show a pattern. They go on forever, seemingly at random — except that they can’t possibly be random, because they embody the order inherent in a perfect circle. This tension between order and randomness is one of the most tantalizing aspects of π.

π touches infinity in other ways. For example, there are astonishing formulas in which an endless procession of smaller and smaller numbers adds up to π. One of the earliest such infinite series to be discovered says that π equals four times the sum 1 – 1/3 + 1/5 – 1/7 + 1/9 – 1/11 + ⋯. The appearance of this formula alone is cause for celebration. It connects all odd numbers to π, thereby also linking number theory to circles and geometry. In this way, π joins two seemingly separate mathematical universes, like a cosmic wormhole.
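
Little bears can watch this series creep towards π with a few lines of Python; the convergence is famously slow, which is part of its charm.

# pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...): partial sums of the series above
def leibniz_pi(n_terms):
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(n_terms))

for n in (10, 1_000, 100_000):
    print(n, leibniz_pi(n))
# 10      -> 3.0418...
# 1000    -> 3.1405...
# 100000  -> 3.1415...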

But wait, there’s more… After all, other famous irrational numbers, like e (the base of natural logarithms) and the square root of two, bridge different areas of mathematics, and they, too, have never-ending, seemingly random sequences of digits.

What distinguishes π from all other numbers is its connection to cycles. For those interested in the applications of mathematics to the real world, this makes π indispensable. Whenever we think about rhythms — processes that repeat periodically, with a fixed tempo, like a pulsing heart or a planet orbiting the sun — we inevitably encounter π. There it is in the formula for a Fourier series:

x(t) = a₀ + a₁cos(2πt/T) + b₁sin(2πt/T) + a₂cos(4πt/T) + b₂sin(4πt/T) + ⋯

That series is an all-encompassing representation of any process, x(t), that repeats every T units of time. The building blocks of the formula are π and the sine and cosine functions from trigonometry. Through the Fourier series, π appears in the math that describes the gentle breathing of a baby and the circadian rhythms of sleep and wakefulness that govern our bodies. When structural engineers need to design buildings to withstand earthquakes, π always shows up in their calculations. π is inescapable because cycles are the temporal cousins of circles; they are to time as circles are to space. π is at the heart of both.
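
The coefficients in the formula above are found by averaging the signal against each cosine and sine over one period. Here is a small numerical sketch of that recipe, with a test signal we invented just to show the idea.

# Recover Fourier coefficients a_n, b_n of a T-periodic signal by averaging
# it against cosines and sines over one period (test signal invented here).
import numpy as np

T = 1.0
t = np.linspace(0, T, 10_000, endpoint=False)
x = 2 * np.cos(2 * np.pi * t / T) + 0.5 * np.sin(6 * np.pi * t / T)

def fourier_coefficients(x, t, T, n):
    a_n = 2 * np.mean(x * np.cos(2 * np.pi * n * t / T))
    b_n = 2 * np.mean(x * np.sin(2 * np.pi * n * t / T))
    return a_n, b_n

for n in (1, 2, 3):
    print(n, fourier_coefficients(x, t, T, n))
# n = 1 recovers the cosine amplitude 2, n = 3 the sine amplitude 0.5,
# and n = 2 comes out (nearly) zero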

For this reason, π is intimately associated with waves, from the ebb and flow of the ocean’s tides to the electromagnetic waves that let us communicate wirelessly. At a deeper level, π appears in both the statement of Heisenberg’s uncertainty principle and the Schrödinger wave equation, which capture the fundamental behaviour of atoms and subatomic particles. In short, π is woven into our descriptions of the innermost workings of the universe.

So there is plenty to celebrate this Pi Day. With πe and πzza 🙂

Being rational is hard. Take the day off 🙂

Original article in The New Yorker.