Scientists reconstruct speech through soundproof glass by watching a bag of potato chips

Subtle vibrations can be translated back into audio

Your bag of potato chips can hear what you’re saying. Now, researchers from MIT have figured out a way to make that bag of chips tell them everything you said. By pointing a video camera at the bag while audio is playing or someone is speaking, the researchers can detect tiny vibrations in it that are caused by the sound. When later playing back that recording, MIT says it can read those vibrations and translate them back into music, speech, or seemingly any other sound.

A bag of chips is just one place this method can be put to work: MIT has also had success watching plant leaves and the surface of a glass of water. While the vibrations the camera picks up aren’t visible to the naked eye, seemingly any object a camera can observe can work here. For the most part the researchers used a high-speed camera, even using it to detect vibrations on a potato chip bag filmed 15 feet away and through a pane of soundproof glass. Even with a common digital camera, though, researchers were able to pick up basic audio information.
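To make the idea concrete, here is a minimal sketch of how footage of a vibrating object might be turned back into sound, assuming OpenCV, NumPy and SciPy are available. It is only an illustration: the MIT researchers analyse sub-pixel phase changes in a complex steerable pyramid, whereas this toy version uses raw intensity differences, and the frame rate and filenames below are made up.

```python
# A heavily simplified sketch of the "visual microphone" idea: treat tiny
# frame-to-frame changes in footage of a vibrating object as an audio
# signal sampled at the camera's frame rate. The real method analyses
# local phase in a complex steerable pyramid; this crude intensity-based
# proxy, the 2200 fps figure, and the filenames are all assumptions.
import numpy as np
import cv2                      # OpenCV, for decoding the video
from scipy.io import wavfile
from scipy.signal import butter, filtfilt

def recover_signal(video_path, fps=2200):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("could not read video")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY).astype(np.float64)
    samples = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
        # Mean signed intensity change: a crude stand-in for the
        # sub-pixel motion the real method measures.
        samples.append((gray - prev).mean())
        prev = gray
    cap.release()
    sig = np.asarray(samples)
    # Band-pass to the part of the speech range the frame rate can carry.
    nyq = fps / 2.0
    b, a = butter(3, [80 / nyq, 0.9], btype="band")
    sig = filtfilt(b, a, sig)
    return sig / (np.abs(sig).max() + 1e-9)   # normalise to [-1, 1]

sig = recover_signal("chip_bag.avi")          # hypothetical high-speed clip
wavfile.write("recovered.wav", 2200, (sig * 32767).astype(np.int16))
```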

“We’re scientists, and sometimes we watch these movies, like James Bond, and we think, ‘This is Hollywood theatrics. It’s not possible to do that. This is ridiculous.’ And suddenly, there you have it,” Alexei Efros, a University of California at Berkeley researcher, says in a statement. “This is totally out of some Hollywood thriller. You know that the killer has admitted his guilt because there’s surveillance footage of his potato chip bag vibrating.” The research is described in a paper to be presented at Siggraph, the computer graphics conference.

Where did life come from?

Natural selection explains how organisms that already exist evolve in response to changes in their environment. But Darwin’s theory is silent on how organisms came into being in the first place, which he considered a deep mystery. What creates life out of the inanimate compounds that make up living things? No one knows. How were the first organisms assembled? Nature hasn’t given us the slightest hint.

If anything, the mystery has deepened over time. After all, if life began unaided under primordial conditions in a natural system containing zero knowledge, then it should be possible – it should be easy – to create life in a laboratory today. But determined attempts have failed. International fame, a likely Nobel Prize, and $1 million from the Gene Emergence Project await the researcher who makes life on a lab bench. Still, no one has come close.

Experiments have created some basic materials of life. Famously, in 1952 Harold Urey and Stanley Miller mixed the elements thought to exist in Earth’s primordial atmosphere, exposed them to electricity to simulate lightning, and found that amino acids self-assembled in the researchers’ test tubes. Amino acids are essential to life. But the ones in the 1952 experiment did not come to life. Building-block compounds have been shown to result from many natural processes; they even float in huge clouds in space. But no test has given any indication of how they begin to live – or how, in early tentative forms, they could have resisted being frozen or fried by Earth’s harsh prehistoric conditions.

Some researchers have backed the hypothesis that an unknown primordial “soup” of naturally occurring chemicals was able to self-organize and become animate through a natural mechanism that no longer exists. Some advance the “RNA first” idea, which holds that RNA formed and lived on its own before DNA – but that doesn’t explain where the RNA came from. Others suppose life began around hot deep-sea vents, where very high temperatures and pressures cause a chemical maelstrom. Still others have proposed that some as-yet-unknown natural law causes complexity – and that when this natural law is discovered, the origin of life will become imaginable.

Did God or some other higher being create life? Did it begin on another world, to be transported later to ours? Until such time as a wholly natural origin of life is found, these questions have power. We’re improbable, we’re here, and we have no idea why. Or how.

Digital legacy: The fate of your online soul

We are the first people in history to create vast online records of our lives. How much of it will endure when we are gone?

Not long before my wife died, she asked me to do something for her. “Make sure people remember me,” she said. “Not the way I am now. The way I was.” Having spent most of her life as an assertive, ambitious and beautiful woman, Kathryn didn’t want people’s memories to be dominated by her final year, in which the ravages of disease and continual chemotherapy had taken her spirit, vitality and looks.

To me, the internet seemed to offer an obvious way to fulfil Kathryn’s wish – certainly more so than a dramatic headstone or funerary monument. So I built a memorial website to celebrate her life through carefully selected pictures and text. The decision was unorthodox at the time, and I suspect that some in our circle thought it tasteless.

Six years on, things are very different. As the internet’s population has grown and got older, memorial pages and tribute sites have become commonplace. But when you and I shuffle off this mortal coil, formal remembrances won’t be the only way we are remembered. I manage myriad websites and blogs, both personal and professional, as well as profiles on Facebook, Flickr, Twitter and more. All of those will be left behind, and many other people will leave a similar legacy.

We are creating digital legacies for ourselves every day – even, increasingly, every minute. More than a quarter of a million Facebook users will die this year alone. The information about ourselves that we record online is the sum of our relationships, interests and beliefs. It’s who we are. Hans-Peter Brondmo, head of social software and services at Nokia in San Francisco, calls this collection of data our “digital soul”.

Thanks to cheap storage and easy copying, our digital souls have the potential to be truly immortal. But do we really want everything we’ve done online – offhand comments, camera-phone snaps or embarrassing surfing habits – to be preserved for posterity? One school of thought, the “preservationists”, believes we owe it to our descendants. Another, the “deletionists”, think it’s vital the internet learns how to forget. These two groups are headed for a struggle over the future of the internet – and the fate of your digital soul is hanging in the balance.

As the internet has become seamlessly integrated with all our experiences, more and more of our everyday life is being documented online. Last year, two-thirds of all Americans stored personal data on a distant server in the cloud, while nearly half were active on social networks.

Today, that data is hoarded by internet companies. Google and Facebook are dedicated to storing as much of your data as possible for as long as possible. Even your “digital exhaust”, such as search requests and browsing history, is often recorded by companies who want to target you with personalised advertising.

All this data will prove fascinating to sociologists, archaeologists and anthropologists studying the dawn of the digital age. For them, everyday life can be just as interesting as epoch-defining moments. Whereas researchers have hitherto had to rely on whatever physical documents happen to survive, our vast digital legacies mean their successors could be spoiled for choice.

Nothing is definite, though: it’s far from certain that this information will endure. “Digital records are more like an oral tradition than traditional documents,” says Marc Weber of the Computer History Museum in Mountain View, California. “If you don’t copy them regularly, they simply disappear.” He is concerned that we are not doing enough. “If we’re not careful, our period could end up as a bit of a Dark Age. Everyone is putting material into digital formats, but not putting much effort into preserving it.”

Amateur archivists

A movement is now emerging to make sure our legacies persist – with amateur enthusiasts in the vanguard. One of those is Jason Scott, a film-maker who recently staged an effort to save Geocities, a vast collection of personal websites dating back to 1994.

Geocities allowed anyone to create a home page of their own, usually using cheesy clip art, excitable text effects and templates that look endearingly amateurish to modern eyes. Antique charm doesn’t count for much in the marketplace, and as slicker competitors emerged Geocities became deserted and spam-laden. After a decade’s forbearance (or neglect, some would say), the site’s owner, Yahoo, decided to pull the plug on the vast majority of pages in 2009.

The threat of the impending axe horrified Scott. He and his supporters hastily “scraped” as many Geocities pages as they could, creating a 641-gigabyte archive that initially circulated on file-sharing networks before being reposted at reocities.com.
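To give a flavour of what that “scraping” involves, here is a toy single-host crawler in the same spirit: a breadth-first fetch of pages under one domain, saved to disk. It is only a sketch; the actual rescue relied on distributed volunteer tooling, and the seed URL, page limit and naming scheme below are hypothetical.

```python
# Toy single-host crawler: breadth-first fetch of pages under one
# domain, saved to disk. Illustrative only; seed URL is hypothetical.
import os
import re
from collections import deque
from urllib.parse import urljoin, urlparse

import requests

def mirror(seed, out_dir="archive", limit=100):
    host = urlparse(seed).netloc
    queue, seen = deque([seed]), {seed}
    os.makedirs(out_dir, exist_ok=True)
    while queue and len(seen) <= limit:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        # Save the page under a filename derived from its path.
        name = urlparse(url).path.strip("/").replace("/", "_") or "index"
        with open(os.path.join(out_dir, name + ".html"), "w") as f:
            f.write(resp.text)
        # Enqueue same-host links found in the page.
        for href in re.findall(r'href="([^"]+)"', resp.text):
            link = urljoin(url, href)
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

mirror("http://example-host.test/")  # hypothetical seed URL
```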

The one question that gets asked most often, says Scott, is “Why bother to save this junk anyway?” His answer is that it’s not junk: it’s history. Geocities is a huge time capsule from the infancy of the World Wide Web. Its design values speak to the limitations of dial-up connections; its structure captures a time when no one had figured out how to navigate the web, where people built online homes in themed “neighbourhoods” called Hollywood or EnchantedForest. Its users’ interactions with each other – via email addresses and guestbooks published openly without fear of spam – offer valuable insights into the birth of online culture.

The fate of Geocities is relevant because the odds are that more sites will go the same way. History shows that even the most prominent technology companies can be rapidly overtaken by competitors or deserted by customers: think of IBM or Microsoft. Companies like Facebook provide you with free services and storage on their servers. In exchange, they track your online activities and sell advertising against the personal information you provide. But one day they may choose – or be forced – to look for new ways to make money. Those might not involve hosting pictures of your cat.

Last December, Yahoo announced plans to “sunset” several well-known services, including the pioneering social bookmarking service del.icio.us. Rumours soon began to circulate about an impending demise for its giant photo-sharing site Flickr. Yahoo has brushed aside suggestions that the site’s future is in question, but Flickr users remain concerned about what they see as a lack of commitment.

When such sites disappear, many users feel they are losing more than a photo album. Years of my personal photographs are stored on Flickr, and it is woven into Kathryn’s memorial site. I have backups, but the photos on Flickr are surrounded by a rich history of social interactions, from groups to which I belong to comments that friends have left about my photos. That feels just as much part of my digital soul as the images themselves. The same goes for anything we share on social networks: our friendships, likes and links are what’s really important.

Many preservationists feel that it is not safe to entrust information of sentimental value to companies with fickle agendas and fortunes, and are working on ways to give us greater control of our digital legacies. Over the past year, there has been a proliferation of tools that allow us to extract our data from the big social sites. There’s also a cottage industry that aims to ensure that our legacies are assembled and apportioned according to our wishes after we are gone. Many of those involved, including security specialists, virtual undertakers, data storage companies – and, inevitably, lawyers – will be meeting for a “Digital Death Day” in San Francisco in early May.

“Think about the appeal of family history,” says Jeremy Leighton John, curator of e-manuscripts at the British Library in London. “The idea of creating a personal archive for your descendants is very evocative.”

But assembling such a legacy is not simple. Facebook and its ilk put a lot of work into keeping your information neatly organised and readily accessible. That’s not something most of us are good at. John says around a third of us report having lost a digital file of personal value. “Imagine losing your memories of your children growing up,” he says. Most of us do little to mitigate that risk, though it is surely a concern for many people nowadays.

A new breed of social networking services might help us organise our data while also ensuring that we maintain control over it. Diaspora, based in San Francisco, is a fledgling social network which runs on servers maintained by its users. That’s in contrast to Facebook, which has its own servers and therefore controls everything on your profile. The downside with Diaspora and other “DIY” social networks is that you have to keep your server running; if you stop, your legacy could evaporate overnight.

Still, it might eventually be possible for us to assemble and bequeath our virtual estates with a few clicks – the internet equivalent of donating our personal letters and papers. The San Francisco-based Internet Archive, which has been curating a public collection of web pages and multimedia since 1996, hopes to accept such donations in the near future. Founder Brewster Kahle says he hopes it will inspire people to “endow a terabyte”. If that happens, our digital legacies may be preserved for posterity after all.

Yet should we be so quick to give in to the urge to preserve? “Forgetting is built into the human brain,” says Viktor Mayer-Schönberger of the Oxford Internet Institute in the UK, who studies internet governance and regulation. “So for thousands of years we’ve developed ways to preserve special memories.” Today, though, it is quicker and easier to save every bit of our vast digital trails than it is to sort and discard what we don’t want. In other words, we might be producing more memories than we can cope with.

We are often ill-equipped to deal with the consequences of total recall. For example, Facebook has been sporadically testing a “memorable stories” feature: every now and again, it will show old status updates written by you or a friend. The general reaction has been bafflement, with users unsure what to make of these blasts from the past. Sometimes it’s hard to tell what the vintage update is actually referring to; at others it’s unwanted, like a reminder of a bad break-up.

Forgetting gracefully

Occasionally, the resurgence of memories from long ago can be devastating. “A woman called in to a radio programme to tell me that her long-spent criminal conviction had been inadvertently revealed online,” says Mayer-Schönberger. “It had instantly destroyed her standing in the small community where she lived, the fresh start she had worked for years to achieve. This wasn’t even something she had posted: it was someone else.” It’s hard to forgive and forget if you can no longer forget.

There’s another persuasive reason why we might want to embrace online forgetfulness. If personal internet sites really did last forever, the web would start to fill up with “dead data” – a reflection of the truism that the dead outnumber the living. My memorial site for Kathryn is currently Google’s first result for her name. I’m not sure how her living namesakes feel about that.

In his 2009 book Delete, Mayer-Schönberger proposed that we should build technology that forgets gracefully. Files might come with expiry dates, he suggests, so that they simply vanish after a certain point. Or they might “digitally rust”, gradually becoming less faithful unless we make a concerted effort to preserve them. Perhaps files could become less accessible over time – like consigning old photos to a shoebox in the attic rather than displaying them on the wall.
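As a toy sketch of what “digital rust” could look like in practice, the snippet below re-saves an image at ever-lower JPEG quality the longer it goes without being deliberately refreshed. The decay schedule, filenames and use of the Pillow imaging library are my assumptions for illustration, not anything Mayer-Schönberger specifies.

```python
# Toy "digital rust": an image's fidelity decays with age unless it is
# deliberately refreshed. Decay rate and filenames are made up.
from datetime import datetime, timedelta
from PIL import Image  # Pillow imaging library

def rust(path, created, refresh_interval=timedelta(days=365)):
    periods = int((datetime.now() - created) / refresh_interval)
    quality = max(5, 95 - 10 * periods)  # lose ~10 quality points a year
    img = Image.open(path).convert("RGB")
    img.save(path, "JPEG", quality=quality)

rust("holiday.jpg", created=datetime(2005, 7, 1))  # hypothetical photo
```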

A few firms have put these ideas into practice. In January, a German start-up called X-Pire launched software that lets you add digital expiry dates to images uploaded to sites like Facebook. After a certain date the images become invisible, so your friends will be able to check out your debauched photos on the morning after the night before, but you won’t have to worry about them appearing when a potential employer looks you up a few years later.

The problem with such schemes is that if something can be seen on the web it can also be copied, albeit with a bit of effort. Human nature being what it is, that’s most likely to happen to something really exceptional. If we’re lucky, our finest achievements will be replicated; if we’re not, it will be our epic failures that become immortal. Another difficulty is that the providers of “forgetting” services are minnows in a very big pond. How likely is it that a plucky start-up will be able to pry your entire legacy from Google and Facebook?

Even if we can’t erase data, we might be able to hide it. This February, after a number of individuals complained to the country’s data protection agency, a Spanish court ordered Google to remove nearly 100 links from its database because they contained out-of-date information about these people. The links were mostly to newspaper articles and public records, and Google refused to comply, but with the “right to be forgotten” enshrined as a key objective of the European Union’s 2011 data protection strategy, more and bigger cases are likely to follow. The EU has a track record of changing the way that the internet is used: forgetfulness may be the next big frontier.

Right now, though, we are living through a truly unique period in human history. We have learned how to create vast digital legacies but we don’t yet know how to tidy them up for our successors. The generations that went before us left no digital trace, and the ones that go after might leave nothing but sanitised “authorised biographies”. We will be defined through piecemeal and haphazard collections of our finest and foulest moments.

The memories we are leaving behind now, in all their riotous glory – drunken tweets, ranting blog posts, bad-hair-day pictures and much more – may become a unique trove to be studied by historians for centuries to come. In fact, today’s web may offer the most truthful and comprehensive snapshot of the human race that will ever exist.

And perhaps, deep within that record, those historians will find an online memorial built by a grieving widower to a woman who died too young, at the dawn of the digital age.

Read more: Forever online: Your digital legacy

Sumit Paul-Choudhury is the editor of newscientist.com. He is happily remarried.


Are We Lab-Rats?

Are you a dreamer? Do you frequently find yourself gazing off into the distance getting lost in a world of “What If”?

Back to work. You can’t daydream forever.

But what if you could? What if you had the freedom to daydream when you felt like daydreaming? To work when you felt like working?

Humans aren’t supposed to spend their days in office buildings. We’re not supposed to spend large amounts of time moving ourselves from one place to another in giant hunks of metal while our bodies slowly deteriorate and our relationships slowly fade.

We’re not supposed to spend gargantuan amounts of time plopped down in front of electronic devices moving our fingers and eyelids, absorbing radiation, and spending more waking time in the virtual world than in the real one.

We’re not supposed to arrive at home and focus our attention on a box that has been pre-programmed to brainwash us while simultaneously allowing our bodies to atrophy.

We’re not supposed to spend most of our life communicating with our loved ones using a digital representation of their voices. “Raam to the Enterprise, I love you mom!”

Damn it Jim, we’re not digital lab rats! We’re humans!

We’re meant to be free!

We’re meant to move. To run. To breathe. We’re meant to interact. To communicate. To laugh. To smile. To learn.

We’re meant to live in the real world.

We’re meant to follow our desires; our passions; our dreams. We’re meant to coexist with nature and to nurture a sustainable, symbiotic relationship with it.

We’re meant to eat when we’re hungry and to sleep when we’re tired. We’re meant to have fun when we’re feeling playful and to daydream when we’re feeling dreamy.

We’re meant to work like humans, not like ants. We’re meant to be more than an address book entry, a Twitter username, or an Avatar on the screen. We’re meant to be creative. To explore. To think. We’re meant to dream. To have ideas. To create. To be unique.

We’re meant to be friendly. To give. To share. We’re meant to be kind. To help. To heal. We’re meant to care and to love.

We’re meant to feel the wind against our skin, smell the earth under our feet, and be inspired by the life and the natural beauty around us.

When was the last time you felt the earth under your bare feet? When did you last have fresh dirt underneath your nails?

Go! Experience the real world. Reconnect with your body. Develop a healthy relationship with Mother Nature. Listen. Breathe. See. Touch. Smell. Taste. Use your senses.

Walk barefoot. Use your legs, if you’re lucky enough to have them. Feel the individual muscles move. Use your voice. Move your eyes.

Embrace who you are. Find the courage to be yourself.

If you’re a talker, connect. If you’re a musician, create. If you’re a writer, share. If you’re an artist, do art. If you’re a builder, build. If you’re a healer, heal. If you’re a teacher, teach. If you’re an explorer, explore. If you’re a dreamer, dream.

You’re not who society tells you to be. You don’t have to be. You shouldn’t be! Break free of any self-imposed limitations and experience life to its fullest. Break new ground. Get out of the rut. Explore. Try something new.

If you find yourself daydreaming, then let yourself dream. If you find yourself in a position you don’t fully enjoy, or one that holds you back from experiencing life to its fullest, then make those tough decisions and change your life.

If you don’t wake up every single day and look forward to what’s ahead, then something isn’t right. You’re not living.

But you can change it!

You already have the freedom inside you. Living free is a conscious decision.

Make it!

But what if we really are lab-rats? What if someone is controlling our every thought as we speak, as we type, as we move, and so on? What if there exist multiple parallel universes? What if there are billions of Earths scattered across different time dimensions? What if in Earth A you are what you are, in Earth B you are someone you want to be, in Earth C you are an old man or woman, and so on and so forth?

What if everything we know is just another STORY? Perhaps we are all just a bunch of lab-rats in this whole blundering experiment called the Universe!

God may work in mysterious ways, but cognitive science is getting a handle on them

God came from an egg. At least, that’s how He came to me. Don’t get me wrong, it was a very fancy egg. More specifically, it was an ersatz Fabergé egg decorated with colorful scenes from the Orient. Now about two dozen years before the episode I’m about to describe, somewhere in continental Europe, this particular egg was shunted through the vent of an irritable hen, pierced with a needle and drained of its yolk, and held in the palm of a nimble artist who, for hours upon hours, painstakingly hand-painted it with elaborate images of a stereotypical Asian society. The artist, who specialized in such kitsch materials, then sold the egg along with similar wares to a local vendor, who placed it carefully in the front window of a side-street souvenir shop. Here it eventually caught the eye of a young German girl, who coveted it, purchased it, and after some time admiring it in her apartment against the backdrop of the Black Forest, wrapped it in layers of tissue paper, placed it in her purse, said a prayer for its safe transport, and took it on a transatlantic journey to a middle-class American neighborhood where she was to live with her new military husband. There, in the family room of her modest new home, on a bookshelf crammed with romance novels and knickknacks from her earlier life, she found a cozy little nook for the egg and propped it up on a miniature display stand. A year or so later she bore a son, Peter, who later befriended the boy across the street, who suffered me as a tagalong little brother, the boy who, one aimless summer afternoon, would enter the German woman’s family room, see the egg, become transfixed by this curiosity, and crush it accidentally in his seven-year-old hand.

The incident unobserved, I hastily put the fractured artifact back in its place, turned it at an angle so that its wound would be least noticeable, and, to this day, acted as though nothing had ever happened. Well, almost. A week later, I overheard Peter telling my brother that the crime had been discovered. His mother had a few theories about how her beloved egg had been irreparably damaged, he said—one being a very accurate and embarrassing deduction involving, of all people, me. When confronted with this scenario—through first insinuation and then full-blown accusations—and wary of the stern German matriarch’s wrath, I denied my guilt summarily. Then, to get them off my back, I did the unthinkable. I swore to God that I hadn’t done it.

Let’s put this in perspective. Somewhere on a quiet cul-de-sac, a second-grader secretly cracks a flashy egg owned by a woman who’s a little too infatuated with it to begin with, tells nobody for fear of being punished, and finally invokes God as a false witness to his egged innocence. It’s not exactly the crime of the century. But from my point of view, at that moment in time, the act was commensurate with the very worst of offenses against another human being. That I would dare to bring God into it only to protect myself was so unconscionable that the matter was never spoken of again.

Meanwhile, for weeks afterward, I had trouble sleeping and I lost my appetite; when I got a nasty splinter a few days later, I thought it was God’s wrath. I nearly offered up an unbidden confession to my parents. I was like a loathsome dog whimpering at God’s feet. Do with me as you will, I thought to myself; I’ve done wrong.

Such an overwhelming fear of a vindictive, disappointed God certainly wasn’t something that my parents had ever taught me. Of course, many parents do teach their children such things. If you’ve ever seen Jesus Camp (2006), a rather disturbing documentary about evangelically reared children in the American heartland, or if you’ve read Sam Harris’s The End of Faith (2004), you’ll know what I mean. But my family didn’t even own a copy of the Bible, and I doubt if I had ever even heard the word “sin” uttered before. The only serious religious talk I ever heard was when my mother—who as a girl was once held down by exuberant Catholic children sifting through her hair for the rudimentary devil horns their parents told them all Jews have—tried to vaccinate me against all things evangelical by explaining how silly Christians’ beliefs were. But even she was just a “secular Jew,” and my father, at best, a shoulder-shrugging Lutheran. Years later, when I was a teenager, my mother would be diagnosed with cancer, and then, too, I had the immediate sense that I had fallen out of favor with God. It felt as if my mother’s plight were somehow related to the shenanigans I’d been up to (nothing worse than most teenagers, I’m sure, but also certainly nothing to commit indelibly to print). The feeling that I had a bad essence welled up inside me; God was singling me out for special punishment.

The thing is, I would never have admitted to having these thoughts at the time. In fact, I didn’t even believe in God. I realized there was a logical biological explanation for the fact that my mother was dying. And if you had even alluded to the possibility that my mom’s ailing health was caused by some secret moral offense on my part or hers, you would have forced my intellectual gag reflex. I would probably have dismissed you as one of those people she had warned me about. In fact, I shook off the “God must really hate me” mentality as soon as it registered in my rational consciousness. But there’s also no mistaking that it was there in my mind and, for a few bizarre moments, it was clear as a whistle.

It was around that time that God struck me as being curiously similar to the Mafia, offering us “protection” and promising not to hurt us (or kill us) as long as we pay up in moral currency. But unlike a hammer to the shin or a baseball bat to the back of the head, God’s brand of punishment, at least here on earth, is distinctively symbolic, coming in the form of a limitless array of cruel vagaries thoughtfully designed for us, such as a splinter in our hands, our stocks tumbling into the financial abyss, a tumor in our brains, our ex-wives on the prowl for another man, an earthquake under our feet, and so on. For believers, the possibilities are endless.

Now, years later, one of the key motivators still driving the academic curiosity that fuels my career as an atheistic psychological scientist who studies religion is my own seemingly instinctual fear of being punished by God, and my tendency to think about God more generally. I wanted to know where in the world these ideas were coming from. Could it really be possible that they were innate? Is there perhaps something like a “belief instinct”?

In the chapters that follow, we will be exploring this question of the innateness of God beliefs, in addition to many related beliefs, such as souls, the afterlife, destiny, and meaning. You’re probably already well versed in the man in the street’s explanations for why people gravitate toward God in times of trouble. Almost all such stories are need-based accounts concerning human emotional well-being. For example, if I were to pose the question “Why do most people believe in God?” to my best friend from high school, or my Aunt Betty Sue in Georgia, or the pet store owner in my small village here in Northern Ireland, their responses would undoubtedly go something like this: “Well, that’s easy. It’s because people need . . . [fill in the blank here: to feel like there’s something bigger out there; to have a sense of purpose in their lives; to take comfort in religion; to reduce uncertainty; something to believe in].”

I don’t think these types of answers are entirely intellectually bankrupt actually, but I do think they just beg the question. They’re perfectly circular, leaving us scratching our heads over why we need to feel like there’s something bigger out there or to have a sense of purpose and so on to begin with. Do other animals have these same existential needs? And, if not, why don’t they? When looked at objectively, our behaviors in this domain are quite strange, at least from a cross-species, evolutionary perspective. As the Spanish author Miguel de Unamuno wrote,

The gorilla, the chimpanzee, the orangutan, and their kind, must look upon man as a feeble and infirm animal, whose strange custom it is to store up his dead. Wherefore?

Back when I was in graduate school, I spent several years conducting psychological research with chimpanzees. Our small group of seven study animals was housed in a very large, very sterile, and very boring biomedical facility, where hundreds of other great apes—our closest living relatives—were being warehoused for invasive testing purposes under pharmaceutical contracts. I saw too many scenes of these animals in distress, unsettling images that I try not to revisit these days. But it occurred to me that if humans were in comparably hopeless conditions as these chimpanzees, certainly the question of God—particularly, what God could possibly be thinking by allowing such cruel travesties—would be on a lot of people’s minds.

So what exactly is it that can account for that instantaneous bolus of “why” questioning secreted by our human brains in response to pain and misfortune, a question that implies a breach of some unspoken moral contract between ourselves, as individuals, and God? We might convince ourselves that it is misleading to ask such questions, that God “isn’t like that” or even that there is no God, but this is only in answer to the knee-jerk question arising in the first place.

To help us understand why our minds gravitate toward God in the wake of misfortune (as well as fortune), we will be drawing primarily from recent findings in the cognitive sciences. Investigators in the cognitive science of religion argue that religious thinking, like any other type of thinking, is something done by a brain that is occasionally prone to making mistakes. Superstitious thinking, such as seeing causal relations where none in fact exist, is portrayed as the product of an imperfectly evolved brain. Perhaps it’s understandable, then, that all but a handful of scholars in this area regard religion as an accidental byproduct of our mental evolution. Specifically, religious thought is usually portrayed by scholars as having no particular adaptive biological function in itself, but instead it’s viewed as a leftover of other psychological adaptations (sort of like male nipples being a useless leftover of the default human body plan). God is a happenstance muddle of other evolved mental parts. This is the position taken by the evolutionary biologist Richard Dawkins, for example, in The God Delusion (2006):

I am one of an increasing number of biologists who see religion as a byproduct of something else. Perhaps the feature we are interested in (religion in this case) doesn’t have a direct survival value of its own, but is a byproduct of something else that does . . . [Religious] behavior may be a misfiring, an unfortunate byproduct of an underlying psychological propensity which in other circumstances is, or once was, useful.

Evolutionary by-product theorists, however, may have been a bit hasty in dismissing the possibility that religion—and especially, the idea of a watchful, knowing, reactive God—uniquely helped our ancestors survive and reproduce. If so, then just as with any other evolved adaptation, we would expect concepts about supernatural agents such as God to have solved, or at least to have meaningfully addressed, a particular adaptive problem in the evolutionary past. And, indeed, after first examining the mechanics of belief, we’ll eventually explore in this book the possibility that God (and others like Him) evolved in human minds as an “adaptive illusion,” one that directly helped our ancestors solve the unique problem of human gossip.

With the evolution of language, the importance of behavioral inhibition became paramount for our ancestors because absent third parties could now find out about their behaviors days, even weeks, after an event. If they failed to bridle their selfish passions in the face of temptation, and if there was even a single human witness to their antisocial actions, our ancestors’ reputations—and hence their reproductive interests—were foolishly gambled away. The private perception of being intelligently designed, monitored, and known about by a God who actively punished and rewarded our intentions and behaviors would have helped stomp out the frequency and intensity of our ancestors’ immoral hiccups and would have been strongly favored by natural selection. God and other supernatural agents like Him needn’t actually exist to have caused such desired gene-salvaging effects, but—just as they do today—the mental biases we’ll be examining certainly gave our ancestors reason to think that they did.

One of the important, often unspoken, implications of the new cognitive science of religion is the possibility that we’ve been going about studying the God question completely wrong for a very long time. Perhaps the question of God’s existence is one that is more for psychologists than for philosophers, physicists, or even theologians. Put the scripture aside. Just as the scientist who studies the basic cognitive mechanisms of language acquisition isn’t especially concerned with the particular narrative plot in children’s bedtime stories, the cognitive scientist of religion isn’t much concerned about the details of the fantastic fables buried in religious texts. Instead, in picking apart the psychological bones of belief, we’re going to focus on some existential basics. Perceiving the supernatural isn’t magic, but something patently organic: a function of the brain.

I should warn you: I’ve always had trouble biting my tongue, and we’re going to address head-on some of life’s biggest questions. Is there really a God who cares about you? Is there really a special reason that you are here? Will your soul live on after you die? Or, alternatively, are God, souls, and destiny simply a set of seductive cognitive illusions, one that can be accounted for by the unusual evolution of the human brain? It seems Nature may have had a few tricks up her sleeve to ensure that we would fall hook, line, and sinker for these spectacular ruses.

Ultimately, of course, you must decide for yourself whether the subjective psychological effects created by your evolved cognitive biases reflect an objective reality, perhaps as evidence that God designed your mind to be so receptive to Him. Or, just maybe, you will come to acknowledge that, like the rest of us, you are a hopeless pawn in one of natural selection’s most successful hoaxes ever—and smile at the sheer ingenuity involved in pulling it off, at the very thought of such mindless cleverness. One can still enjoy the illusion of God, after all, without believing Him to be real.

Either way, our first order of business is to determine what kind of mind it takes to think about God’s mind in the first place, and one crucial factor—indeed, perhaps the only essential one—is the ability to think about other minds at all.

So, onward we go.

Buy the book or see the reviews at www.thebeliefinstinct.com

2010 Review: A year in brief

Busted well: 4.9 million

The number of barrels of crude oil that spilled into the Gulf of Mexico following the explosion that destroyed BP’s Deepwater Horizon rig. Eleven of its crew died. For three mishap-prone months the company tried, and repeatedly failed, to plug the runaway well. Meanwhile, the crude poured forth, wreaking havoc on deep-water fish, migrating baby sea turtles, and BP chief executive Tony Hayward’s career.

7 million

To keep oil off coastal marshes and, some allege, out of sight, BP released 7 million litres of chemicals to disperse and break up the oil at the well head 1500 metres down. Environmentalists balked. The Obama administration imposed a moratorium on deep-water drilling. Yet in Gulf coast communities, where fishers and oil workers may be the exact same people, the “drill-baby-drill” cry grew ever shriller.

20 million

With an average consumption of more than 20 million barrels of oil per day, industry and consumers across the US would have gobbled up the entire spill in just a few hours. Some denizens of the Gulf have almost as great an appetite for the oil: many deep-water microbes thrive on the stuff, and are probably still enjoying the unexpected feast.
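As a quick check on that claim, taking the two figures above at face value: 4.9 million barrels ÷ 20 million barrels per day ≈ 0.25 of a day, or roughly six hours of US consumption.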

Ice-free 2035

This is the fated year by which, according to the Intergovernmental Panel on Climate Change, the Himalayan glaciers could disappear. In January, the head of the IPCC was forced to apologise after it transpired that the panel’s prediction for the fate of this crucial source of south Asia’s water was almost certainly very wrong. It had sourced the erroneous date from non-peer-reviewed sources, highlighting the paucity of research on the regional effects of climate change.

Green volcano

Eyjafjallajökull, the volcano that closed Europe’s airspace and stumped English-speaking newscasters trying to pronounce its name, is estimated to have emitted between 150,000 and 300,000 tonnes of carbon dioxide a day. That’s less than the grounded flights would have emitted, making it the first carbon-negative volcano.
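For rough scale, using a widely cited outside estimate (an assumption, not a figure from this piece) of about 340,000 tonnes of CO2 per day for the grounded European flights, and the eruption’s midpoint of roughly 225,000 tonnes per day: 340,000 − 225,000 ≈ 115,000 tonnes of CO2 avoided for each day the planes stayed down.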

Cyberwar

Elegant and stealthy, the Stuxnet computer worm slipped undetected into key nuclear facilities in Iran, inflicting substantial damage. No one has claimed responsibility. The sophistication of the code suggests whoever is behind the worm had significant technical resources, leading Iran to point the finger at the Pentagon and Israel. What seems clear is that the first shot has been fired in a new era of cyber-warfare.

Those cursed climate emails

Thousands of them were hacked off the servers of the University of East Anglia, home to one of the UK’s leading climate research units, in November 2009. In 2010, their content was dissected, re-dissected, and then dissected some more, amid claims that some climate scientists had engaged in fraudulent behaviour. Four independent reviews exonerated them, and data sets were made public that were previously under lock and key. And, finally, the world moved on.

Life from life

Make a genome – check. Transplant it into an emptied cell to create the world’s first synthetic life form – check. Frenzied media coverage accusing the researchers concerned of “playing God” – check. So it was in May, when Craig Venter and his colleagues stitched together the genome of a goat pathogen from bits of synthetic DNA and inserted it into the empty cytoplasm of a related bacterium. The implanted genome booted up and divided over and over to make billions more synthetic cells in the image of the original. To confirm that the daughter cells were of the synthetic species, the researchers added coded watermarks to its genome – including a quotation from James Joyce’s A Portrait of the Artist as a Young Man: “To live, to err, to fall, to triumph, to recreate life out of life”.
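Venter’s team used its own, more elaborate codon-based watermark scheme, but the underlying trick of spelling text in DNA letters is easy to illustrate. The two-bits-per-base code below is a generic stand-in, not the actual Venter encoding.

```python
# Generic text-in-DNA watermark: two bits per base. This is NOT the
# scheme Venter's team used; it only illustrates the idea.
BASES = "ACGT"

def encode(text):
    bits = "".join(f"{b:08b}" for b in text.encode("ascii"))
    return "".join(BASES[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2))

def decode(dna):
    bits = "".join(f"{BASES.index(c):02b}" for c in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("ascii")

quote = "To live, to err, to fall, to triumph, to recreate life out of life"
assert decode(encode(quote)) == quote
```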

10 trillion °C

The highest temperature ever achieved in a scientific experiment, some 10^13 degrees, was reached on 7 November inside the Large Hadron Collider at CERN, near Geneva in Switzerland, when it started blasting lead ions together at near light speed. What remained after the smash-up was a quark-gluon plasma, the stuff thought to have made up the early universe. Quark-gluon plasmas had been made before, but earlier in 2010 physicists working on CERN’s CMS experiment recorded a mysterious, never-before-seen signal during the LHC’s proton-proton collisions. They are still scratching their heads trying to work out what caused it.

Kinect connection

Microsoft’s hands-free video controller sold over a million units in the 10 days after its 4 November release in the US. The Kinect makes a great toy for sure, but it is also turning out to be more than that. Its sophisticated depth-sensing camera and infrared scanner have made it a honeypot for hackers, who are using it to manipulate 3D images of themselves and their surroundings in mind-bending software applications. Scientists have gotten a whiff of what the controller can do, too, and are enthralled by its possible applications – which range from controlling robots to 3D mapping and video conferencing.

Homo sapiens neanderthalensis

The first draft of the Neanderthal genome, extracted from 44,000-year-old bones found in Croatia, revealed that the genome of all non-Africans is 1 to 4 per cent Neanderthal. In other words, humans and Neanderthals had sex and had hybrid offspring. The absence of Neanderthal genetic markers in modern Africans suggests that the interbreeding happened between 100,000 and 45,000 years ago, after the first humans left Africa but before they split into regional populations elsewhere.

Asteroid dust

All but given up for dead, the Hayabusa space probe finally made it home in June. After a bumpy landing on the asteroid Itokawa and a beleaguered return mission, the Japanese Aerospace Exploration Agency feared the probe had failed in its mission to bring asteroid dust back to Earth. It took five months for the answer to arrive: Hayabusa had snatched 1500 particles of extraterrestrial dust, which will be scrutinised for clues to how the solar system – and our own planet – formed.

Global nation of Facebook

Facebook welcomed its 500-millionth user in July, just six years after it was created in a Harvard University dorm room. The Facebook “nation” now stands as the third most populous in the world, ahead of the US.

Oscar’s new face

“Oscar”, a farmer who accidentally shot himself in the face, became the first recipient of a full face transplant in March. While all 10 previous transplants had replaced sections of a face only, Oscar was given new facial skin, muscles and nerves, nose, lips, palate, teeth, cheekbones and lower jaw by a surgical team at Vall d’Hebron University Hospital in Barcelona, Spain.

The images in my head

Hope is dawning for people with “locked in” syndrome. In February, an international team of neuroscientists announced that they had conversed with a 29-year-old man diagnosed as being in a vegetative state. By asking him to picture himself doing two distinct activities and monitoring the different patterns in a brain scan as he did so, they created a code for him to answer yes/no questions. Imagining himself playing tennis meant “yes”; moving around his home meant “no”.

Not just a notepad

Long-awaited, but not as coveted as was expected, Apple’s iPad came to market in April. Within 24 hours, Apple claimed it had sold 300,000 units, but then enthusiasm seemed to wane. By September, 4.2 million of the devices had left the mother ship, falling short of Apple’s 5 million projection.

[Source: http://www.newscientist.com/blogs/shortsharpscience/2010/12/2010-review-a-year-in-brief.html]

Stephen Hawking says there’s no theory of everything

Craig Callender, contributor

Three decades ago, Stephen Hawking famously declared that a “theory of everything” was on the horizon, with a 50 per cent chance of its completion by 2000. Now it is 2010, and Hawking has given up. But it is not his fault, he says: there may not be a final theory to discover after all. No matter; he can explain the riddles of existence without it.

The Grand Design, written with Leonard Mlodinow, is Hawking’s first popular science book for adults in almost a decade. It duly covers the growth of modern physics (quantum mechanics, general relativity, modern cosmology) sprinkled with the wild speculation about multiple universes that seems mandatory in popular works these days. Short but engaging and packed with colourful illustrations, the book is a natural choice for someone wanting a quick introduction to mind-bending theoretical physics.

Early on, the authors claim that they will be answering the ultimate riddles of existence – and that their answer won’t be “42”. Their starting point for this bold claim is superstring theory.

In the early 1990s, string theory was struggling with a multiplicity of distinct theories. Instead of a single theory of everything, there seemed to be five. Beginning in 1994, though, physicists noticed that, at low energies, some of these theories were “dual” to others – that is, a mathematical transformation makes one theory look like another, suggesting that they may just be two descriptions of the same thing. Then a bigger surprise came: one string theory was shown to be dual to 11-dimensional supergravity, a theory describing not only strings but membranes, too. Many physicists believe that this supergravity theory is one piece of a hypothetical ultimate theory, dubbed M-theory, of which all the different string theories offer us mere glimpses.

This multiplicity of distinct theories prompts the authors to declare that the only way to understand reality is to employ a philosophy called “model-dependent realism”. Having declared that “philosophy is dead”, the authors unwittingly develop a theory familiar to philosophers since the 1980s, namely “perspectivalism”. This radical theory holds that there doesn’t exist, even in principle, a single comprehensive theory of the universe. Instead, science offers many incomplete windows onto a common reality, one no more “true” than another. In the authors’ hands this position bleeds into an alarming anti-realism: not only does science fail to provide a single description of reality, they say, there is no theory-independent reality at all. If either stance is correct, one shouldn’t expect to find a final unifying theory like M-theory – only a bunch of separate and sometimes overlapping windows.

So I was surprised when the authors began to advocate M-theory. But it turns out they were unconventionally referring to the patchwork of string theories as “M-theory” too, in addition to the hypothetical ultimate theory about which they remain agnostic.

M-theory in either sense is far from complete. But that doesn’t stop the authors from asserting that it explains the mysteries of existence: why there is something rather than nothing, why this set of laws and not another, and why we exist at all. According to Hawking, enough is known about M-theory to see that God is not needed to answer these questions. Instead, string theory points to the existence of a multiverse, and this multiverse coupled with anthropic reasoning will suffice. Personally, I am doubtful.

Take life. We are lucky to be alive. Imagine all the ways physics might have precluded life: gravity could have been stronger, electrons could have been as big as basketballs and so on. Does this intuitive “luck” warrant the postulation of God? No. Does it warrant the postulation of an infinity of universes? The authors and many others think so. In the absence of theory, though, this is nothing more than a hunch doomed – until we start watching universes come into being – to remain untested. The lesson isn’t that we face a dilemma between God and the multiverse, but that we shouldn’t go off the rails at the first sign of coincidences.

Craig Callender is a philosopher of physics at the University of California, San Diego