Archive for the ‘science & technology’ Category

Ray Kurzweil is interviewed here by Nataly Kelly, who works for Common Sense Advisory, a company focused on language in the context of globalisation, which of course includes translation. While her questions are informed by her work context (and sometimes lead to repetition in Kurzweil’s answers), they nevertheless make Kurzweil reflect on whether and how computers will replace humans as translators.


Kurzweil predicted more than eleven years ago, in his book “The Age Of Spiritual Machines”, that computers would reach a high degree of almost human proficiency in translating written and spoken language by 2029. While he was formulating his prediction, Franz Och started to work on language translation algorithms; Och today is the man behind Google Translate, which is still very crude but has nevertheless reached the mainstream. Considering how rapidly technological progress happens, I wouldn’t be surprised if 2029 turns out to be a fairly conservative assumption.

But the video is not so much focused on time frames as on questions about the effects of computers on translation, for example whether we will live in societies without language barriers because computers will comprehensively convey meanings from one language to another in an instant. Kurzweil answers that question with a categorical ‘no’. Even if computer translation reaches a level of proficiency that allows us to use it for day-to-day conversation and communication, Kurzweil predicts that people will still gain immense benefits from learning other languages, the same way literature (in particular poetry) will still need humans for translation. Words do not just have literal meanings but are also carriers of cultural messages, are steeped in historical contexts and are the products of unique individual creative expression. Unless we create artificial intelligence that at least fully matches our own, machines relying on databases for translation won’t reflect and produce the complexity and richness that humans are able to tap into when extracting meaning from words in another language.

Another question Kurzweil addresses in this context is whether computer translation will wipe out the profession of human translators, for example in non-literary fields like commerce. Again he answers with a ‘no’. While referring back to the translation-quality argument just mentioned, he also suggests that new technologies bring diversification to traditional professions by providing people with new skill sets (Kurzweil points in this context to the introduction of synthesisers in the 1980s and the changes computers have since made to music creation).

While Kurzweil’s historical perspective seems verifiable over longer time periods, it certainly won’t hold true in the short term. After all, car factories are not filled with masses of multi-skilled workers but with robots and a handful of engineers and technicians, and those workers’ experience was just another repetition of what, for example, the weavers’ trade went through in the 19th century when the steam engine eliminated it. Therefore, and despite different avenues being opened for a new generation of language agents, countless translators will lose their professional career prospects forever – the same way people are losing their jobs in bookshops and record stores without being re-employed en masse by online companies. But I guess, if you’re a futurist with a fascination for technology, social justice aspects are outside your general frame of reference – which still makes Kurzweil’s reflections, within their own albeit limited context, quite interesting.

RESEARCHERS have long known that city dwellers are at greater risk of mental health problems than their country cousins. But the biology behind this was not known until now.

Results of an international study have shown for the first time how two regions of the brain responsible for regulating stress and emotion are affected by city living.

As part of the German study, 32 people were asked to do a tricky maths test while lying in a brain scanner.

Researchers at the University of Heidelberg introduced stress by imposing time constraints and relaying disapproving feedback from examiners.

They found that when exposed to stress the parts of the brain that process emotion – the amygdala and cingulate cortex – were more active among the students who lived in or had been raised in cities.

The findings, published in the journal Nature this week, establish that there is a connection between city living and sensitivity to social stress.

Results also showed a correlation between the activity levels in the amygdala region of the brain and the size of the city each student called home: people from cities of more than 100,000 people showed more activation of the amygdala region than those from towns of more than 10,000, and those in turn showed more activation than people from rural areas.

Previous research has found that growing up in a big city raises the risk of schizophrenia.

Originally published by The Age

Source: Asher Moses, The Age

Straight men are as aroused by penises as homosexuals and have fantasies of their wives sleeping with other men, but any fears about the negative or corrupting influence of pornography are misguided, neuroscientists have found in one of the largest studies of internet porn habits.

The study also gathered some revealing insights into women’s porn searches, finding that, unlike men, they generally prefer erotic stories to visuals. They were also turned on by stories about masculine men sharing their tender side and being intimate with each other.

For their new book A Billion Wicked Thoughts, neuroscientists Ogi Ogas and Sai Gaddam say they analysed as much data on internet pornography habits as they could find including more than a billion web searches, a million erotic stories, a half-million erotic videos, paid porn site subscription statistics, millions of personals ads, ten thousand digital romance novels, online data responses, the world’s most popular free porn sites and other data.


Women are clicking on internet porn more and more and getting addicted. One of the largest studies on internet porn habits has revealed some surprising insights. Photo: Phil Carrick

The study – which they dubbed the “world’s largest experiment” – found a group of about 20 sexual interests that accounted for 80 per cent of all the porn people watched and spent money on. The top five categories, in their words, are: youth, gays, MILFs (mothers), breasts and cheating wives.

Ogas said some of the interesting findings from his research included that men “prefer overweight women to underweight women” and that, while men generally prefer younger women, there was significant sexual interest in older women, including those in their 40s, 50s and 60s.

“There’s even an internationally popular genre of erotica known as granny porn,” said Ogas.

“The four body parts that both straight and gay men are wired to find sexually interesting are: chests, butts, feet and penises. Heterosexual men are very interested in looking at penises, especially large penises.”

Ogas said heterosexual men searched for penises almost as often as they searched for vaginas, and out of the top 35,000 most popular adult sites, roughly 1000 were devoted to large penises.

“This interest is perhaps inherited from our primate ancestry: chimpanzees, monkeys, and bonobos use their penis as a prominent and versatile social instrument, to signal aggression, to indicate dominance, to mark territory, and to indicate sexual interest,” he said.

The study found that the largest audience for “shemale” – male-to-female transsexual – porn was heterosexual men. Ogas described this as an “erotical illusion”.

“These are erotic stimuli that trick the perceptual machinery of the sexual brain by combining different sexual cues in novel combinations,” he said.

“A shemale has the body of a woman and a penis. The female body consists of female anatomical parts that trigger arousal (breasts and curves, for example)… but also has the cue of a penis, which is another sexual cue for men.”

Ogas and Gaddam’s research has been extensively covered everywhere from Time magazine to the Freakonomics website.

The researchers have dismissed moral panic that argues pornography is encouraging men to pursue degrading and perverse sexual habits. Ogas said porn was a reflection of male desires rather than a creator of male desires and erotica generally functioned to liberate and satisfy male sexual interests.

“There is an inverse correlation between the availability of pornography and rape: the more porn that’s available, the less rape,” said Ogas, adding extreme pornography was a rare indulgence and did not spill over into the viewer’s real life.

“Almost all fears about the negative or corrupting influence of pornography are misguided, and usually applied to other people’s sexual interests.”

Ogas said the easy, inexpensive availability of online erotica had been a boon to women and “sexual minorities”.

He said previously women lacked a way to safely and conveniently explore their sexuality, and being able to explore their desires in the security, anonymity and comfort of their own homes was preferable to travelling to a red light district or going into the back room of a video rental store.

“Similarly, for homosexuals, bisexuals and transsexuals living outside of urban areas, it was very difficult to access erotica, with the result that many sexual minorities felt isolated, alienated and ashamed,” said Ogas.

“Now, the internet allows these groups to explore their sexuality in safety and privacy and discover that many others share the same sexual interests that they do. Knowledge is power, especially when it comes to sex.”

As other researchers have found previously, Ogas said sexual cues that triggered arousal in women were mainly psychological, while for men they were overwhelmingly visual. Straight men were aroused by sexual dominance while most women, and gay men, were wired to be aroused by sexual submission.

One unusual result from the research was that erotica featuring cheating wives was very popular among straight men.

Ogas explained that he believed men were attracted to cheating wives due to a function biologists call a “sperm competition cue”.

“All across the animal kingdom, when a male sees another male mate with a female, this often triggers greater sexual arousal in the viewing male so that he might pursue sex more vigorously – and produce more sperm – than his competitor in order to increase his chances of impregnating the female,” he said.

“The sperm competition cue also explains why men are aroused by cuckold porn, also called cheating wife porn or cheating girlfriend porn, or about fantasies of their own wife or girlfriend cheating on them.

“Though men are most definitely designed to get jealous and furious at the thought of their partner cheating on them, they can also become simultaneously aroused.”

This reporter is on Twitter: @ashermoses

I recently posted an article by Burkhard Bilger, published in The New Yorker, on David Eagleman. The following post by Betsy Mason for the Wired Science blog looks at aspects of Eagleman’s work, such as the structure of the brain as a set of competing networks involved in decision-making, the role the illusion of time plays in schizophrenia, and the legal ramifications of neuroscience suggesting that we are not equal before the law.

Most people probably feel like they know their own brain reasonably well. After all, our thoughts form the core of who we are, or at least who we understand ourselves to be. But it turns out we know only a tiny portion of what our brains are doing and where our own thoughts come from.

Neuroscientist David Eagleman takes us on an enlightening tour of all that our brains are up to behind our backs in his new book Incognito: The Secret Lives of the Brain. Get a preview in the audio excerpt and learn more about the book and Eagleman’s current research in the interview below.

Wired.com: Your book deals a lot with the idea that we are totally unaware of most of what goes on in our brains. Is this why we get hunches that seem to just materialize from nowhere?

David Eagleman: The main thing that inspired me to write this book is looking at all the ways the conscious is just the littlest bit of what’s happening in the brain. Your brain does these massive computations under the hood all the time. And a hunch essentially is the result of all those computations. So it’s exactly like riding a bicycle, the way you don’t have to be consciously aware, in fact you cannot be consciously aware. Your consciousness has no access to the operations running under the hood that allow you to ride the bicycle, or for that matter that allow you to recognize somebody’s face. You don’t know how you recognize somebody’s face, you just do it effortlessly.

In both of these cases, it’s very hard to write computer programs to do this stuff, to ride a bicycle or recognize somebody’s face, because there’s massive computation going on there that’s required. Your brain does this all effortlessly and the hunch is when it serves up the end result of those computations.

Essentially the conscious mind is like a newspaper headline in the sense that all it ever wants is the summary, it doesn’t need to know all the details of how something happened, it just wants to know, Obama’s in China or whatever it is. It doesn’t need to know every bit of the background of American history and Chinese history, it just wants to know what’s happening right now. And so a hunch is a way of summarizing vast quantities of data. And you may not have any conscious access to how the computation was made.

Wired.com: If our brains are working without us being consciously aware of it, how does that affect the choices we make?

Eagleman: When people go through marriage registries, they find that people are more likely to marry other people whose first name begins with the first letter of their own first name, so Alex and Amy, Joel and Jenny, Donny and Daisy, these kind of things. And if your name is Dennis or Denise you’re statistically more likely to become a dentist. This can be verified by looking in the dentist professional registries.

Also, people whose birthday is Feb. 2, are disproportionately more likely to move to cities with the number two in their name, like Twin Lakes, Wisconsin. And people born on 3/3 are statistically overrepresented in places like Three Forks, Montana, and so on.

Anyway, the point of all this is that it’s a crazy reason to choose a life mate or a city to live in or a profession, and if you ask people about why they made these choices, that probably would not be included in their conscious narrative. And yet it’s statistically provable that these things do have an influence in very subtle ways on our choices. People like brands, for example, that begin with the same first letters as their first names, and they’ll be more likely to choose that brand just based on that, even though they’re not consciously aware they’re doing that.

Wired.com: So if we’re not consciously directing our own decision-making, how do our brains handle the process?

Eagleman: I make this argument about the brain being like a team of rivals. I synthesize a lot of data to show that you are not one thing, but instead your brain is made up of these competing networks that are all battling it out to control this single output channel of your behavior. And so your brain’s like a neural parliament, and you’ve got these different parties in there like the Democrats and Republicans and Libertarians, all of whom love their country and feel that they know the best way to steer the ship of state. But they have differing opinions on how to do it, and they have to fight it out.

This is why we can cuss at ourselves and cajole ourselves and get angry at ourselves, and this is why you can do behavior and look back and think, “Wow, how did I do that?” It’s because you are not one person, you are not one thing. As Walt Whitman said, “I am large, I contain multitudes.”

In the book I break down all these different competing parties in the brain. If you’re trying to understand yourself, and what kind of person am I, and why did I do that and so on, this gives you a much richer view of what’s actually happening under the hood so that you don’t suffer from the illusion that there’s a single “I” in there.

And once you start understanding this about yourself, then you can structure things so that you constrain your future behavior. For example, once you realize that your short-term instant-gratification circuits will be really tempted in a certain situation, you can then think about it in advance and make sure you don’t get yourself into that situation.

Wired.com: Besides short-term vs. long-term interests, what are some of the other warring parties in your brain?

Eagleman: Another team of rivals is closely related to this issue of emotion and reason. You have certain parts of your brain that really care about essentially math problems and just adding things up and deciding on something in a purely logical way, and then other networks in your brain that are involved in what we generally summarize as emotion. And these largely have to do with monitoring your internal state, instead of the external world, and they have to do with judging how things will pay off for the system, whether these things are good or bad.

And it turns out with neuroimaging you can see these things in competition when people are making a decision, let’s say a moral decision where logically you might feel like you want to go one way, but emotionally you feel like you want to go the other way. You see these battles in action.

I think an understanding of the teams of rivals in the brain allows us to really think more clearly about other people’s behavior. In the book I use an example of Mel Gibson, who made these antisemitic rants when he was drunk and then afterward wrote these letters of apology that seemed to be genuine. Everybody was arguing about whether he is an anti-semite or not an anti-semite. What I say in the book is, even though his behavior was despicable, are we obliged to think somebody is or is not something? Isn’t it possible for somebody to have both racist and non-racist parts of their brain that can be coexisting in a single person — where he might say things at one point and feel bad about it at another point?

To say that somebody has true colors — that there’s sort of one thing this person is — isn’t nuanced enough with our understanding of modern neuroscience. People are complex in this way. They contain multitudes.

Wired.com: How does the brain decide who wins?

Eagleman: One piece of advice that I give in the book is something that my mother advised me a long time ago: If you’re ever stuck between two options and you just can’t make a decision, flip a coin and then consult your gut feeling about whether you’re disappointed with the way the coin landed.

What you’re doing is a gut check on how the coin toss came out, and that immediately tells you what you need to know. You sort of pretend that you’re committing to it. And then if you go, “Oh shit!” then you know that you should just go with the other choice.

Wired.com: What does your current research about time teach us about how our brains work?

Eagleman: I study time perception and illusions of time, and one of the main questions that I look at is, as we understand time better, what does that tell us about how these systems can break?

One of the experiments we did a few years ago showed that if we inject a small delay between your motor act and some sensory feedback, then when we remove that delay, you’ll have the impression that the feedback happened before you did the act. So, if you press a button and that causes a flash of light to go off, you quickly learn that you are causing the flash of light. Now if we insert a small delay so that when you press the button, there’s a tenth of a second before the flash, you don’t notice that delay, but your brain starts to adjust for it. Then, if we suddenly remove that delay, you’ll hit the button, the flash of light will happen immediately, but you will swear the flash of light happened before you pressed the button.
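To make the logic of that recalibration concrete, here is a toy numerical sketch of the effect Eagleman describes. It is not his experimental code; the learning rate and number of training trials are arbitrary illustrative assumptions, and only the 0.1-second injected delay comes from his description.

```python
# Toy model of the delay-adaptation illusion described above. The brain
# keeps a running estimate of the usual lag between a button press and
# its visual consequence, and judges the flash's timing relative to that
# learned expectation. Learning rate and trial count are illustrative.

def adapt(learned_lag, observed_lag, rate=0.3):
    """Nudge the internal lag estimate toward the lag just observed."""
    return learned_lag + rate * (observed_lag - learned_lag)

def judged_interval(actual_lag, learned_lag):
    """Perceived press-to-flash interval: actual lag minus expected lag."""
    return actual_lag - learned_lag

learned = 0.0  # seconds; initially the brain expects no delay

# Training phase: a 0.1 s delay is injected between press and flash.
for _ in range(20):
    learned = adapt(learned, observed_lag=0.1)

# Test phase: the delay is suddenly removed, so the flash is immediate.
judgement = judged_interval(actual_lag=0.0, learned_lag=learned)
print(f"learned lag: {learned:.3f} s")
print(f"judged press-to-flash interval: {judgement:+.3f} s")
# A negative judged interval corresponds to the illusion that the flash
# happened before the button press.
```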

This is an illusory reversal of action and effect. What people say in this situation is, “It wasn’t me. The flash happened before I pressed the button.” And that struck me as very interesting because this is something that we see in schizophrenia. Schizophrenics will do what’s called credit misattribution where they will make an act, and they will claim that they are not the ones responsible for it.

So that immediately got me thinking that maybe what’s happening in schizophrenia is fundamentally a disorder of time perception. Because if you’re putting out actions into the world and you’re getting feedback, but you’re not getting the timing correct, then you will have cognitive fragmentation, which is what schizophrenics have.

We always talk to ourselves internally and listen to ourselves — we have an internal monologue going on. Now imagine that the talking and the hearing that’s going on internally, imagine that the order got reversed. Well, you would have to attribute that voice to somebody else. That’s an auditory hallucination, and that’s another thing that characterizes schizophrenia.

I’ve been running studies now at the Harris County Psychiatric Center with schizophrenic patients, and it does appear that they have problems in the time domain. So I’m pursuing this hypothesis right now about whether schizophrenia is fundamentally a disorder of time perception. If it is, that’s a big deal because it means that we might be able to develop rehabilitative strategies that involve playing a little video game in front of a computer instead of a pharmaceutical approach.

Wired.com: You argue that brain science could improve our legal system. How would that work?

Eagleman: As part of my day job I direct the Initiative on Neuroscience and Law. What I’m asking is, given the situation that much of what we do and think and act and believe is all generated by parts of our brain that we have no access to, what does this mean when we think about responsibility and blameworthiness when people are bad actors in society?

I’ve been working on this topic for years and it’s become clear to me that our legal system as it stands now is so broken and outdated. It essentially rests on this myth of equality, which says all brains are created equal. As long as you’re over 18 and you’re over an IQ of 70, then all brains are treated as though they have an equal capacity for decision making, for simulating possible futures for understanding consequences and so on. And it’s simply not true.

Along any axis that we measure brains, they are very different. There’s as much variation neurally as there is with people’s external physical characteristics. So an enlightened legal system, and one that’s also more humane and more cost effective, will, instead of treating everybody equally and treating incarceration as a one-size-fits-all solution, do customized sentencing and customized rehabilitation. It will try to understand people better, in terms of what can be done with them and for them. It can have better risk assessment to understand, “How dangerous is this person?”

We still have to take people who break the law off the streets to have a good society, so this doesn’t forgive anybody. But what it means is we have a forward-looking legal system that just worries about the probability of recidivism, or in other words, what is the probability that this person’s behavior will transfer to other future situations? That makes a forward-looking legal system instead of a backward-looking one like we have now, which is just a matter of blame and saying, “How blameworthy are you and we’re going to punish you for that.”

What’s happening more and more in courtrooms is defense lawyers will argue that their client has bad genes, that he was sexually abused as a child or he had in utero cocaine poisoning, so it wasn’t really his fault. It turns out that’s the wrong question to ask, because the interaction of genes and environment is so complex that we will never be able to say how somebody came to be who he is now and whether he had any real choice in the matter of whether he behaved this way or that way. So the only logical thing is to have a forward-looking system that says, “We can’t know the answer to that, all we need to know is how dangerous is this person into the future?”

Beyond that there’s this issue that our prison system has become our de facto mental health care system. The estimates now are that 30 percent of the prison population suffers from some sort of mental illness. It’s much more humane and enlightened and cost effective to have a system that deals with the mentally ill separately, deals with drug addicts separately, and so on. Incarceration is the right solution for some people because it will be a deterrent. But it doesn’t work if your brain’s not functioning properly. If you’re suffering from a psychosis, for example, putting you in prison isn’t going to fix that.

Wired.com: So do we need, alongside the jury of your peers, a jury of brain experts?

Eagleman: So here’s what I think. Trials have two phases. There’s the guilt phase, or the fact finding phase, and of course that should always remain with a jury of your peers, there are many reasons for that. But the sentencing phase should be done with statistics and sophisticated risk assessment instruments. And I should mention, these are already underway.

For example, with sex offenders, people have done very good studies where they have followed tens of thousands of sex offenders for years after they are released from prison, and they find out who recidivates and who doesn’t. Then they correlate that with all of these things they can measure about the person. And it turns out that that gives really good predictive power about who’s likely to recidivate and who’s not.

Now I need to specify here that we will never be able to know whether any individual will commit a crime again or not, because life’s too complicated and crime is often circumstantial. Nonetheless, it is the case that some people are more dangerous than others. And these statistical tests are incredibly powerful tools for understanding who on average is going to be more dangerous than whom and thereby how long we should sentence them for.
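What Eagleman is describing is ordinary actuarial modelling: follow a large cohort, record who reoffends, and fit a statistical model relating measurable characteristics to that outcome. The sketch below is a generic, hedged illustration using synthetic data and made-up predictors (number of prior offences and age at release); the actual instruments and variables used in the studies he mentions are not specified in the interview.

```python
# Generic sketch of actuarial risk assessment: fit a model on a followed
# cohort, then score new cases by estimated probability of reoffending.
# Data and predictors here are synthetic placeholders, not any real
# instrument or study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Made-up predictors recorded at release.
prior_offences = rng.poisson(2, n)
age_at_release = rng.normal(35, 10, n)
X = np.column_stack([prior_offences, age_at_release])

# Synthetic outcome, constructed only so the example contains signal:
# more priors raise the odds of reoffending, older age lowers them.
logit = 0.5 * prior_offences - 0.05 * (age_at_release - 35) - 1.0
reoffended = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, reoffended)

# Score a hypothetical new case: 4 prior offences, released at age 22.
risk = model.predict_proba([[4, 22]])[0, 1]
print(f"estimated probability of reoffending: {risk:.2f}")
```

Consistent with the caveat above, the output is a probability for a group of similar cases, not a verdict on any individual.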

Betsy Mason is the editor of Wired Science.
Follow @betsymason on Twitter.

Go to the Wired Science website to listen to excerpts from David Eagleman’s latest book, “Incognito: The Secret Lives of the Brain”.

Supplementing the previous post on the illusion of time, here are 10 examples of warped time provided by PsyBlog.

How time perception is warped by life-threatening situations, eye movements, tiredness, hypnosis, age, the emotions and more…

The mind does funny things to our experience of time. Just ask French cave expert Michel Siffre.

In 1962 Siffre went to live in a cave that was completely isolated from mechanical clocks and natural light. He soon began to experience a huge change in his experience of time.

When he tried to measure out two minutes by counting up to 120 at one-second intervals, it took him 5 minutes. After emerging from the cave he guessed the trip had lasted 34 days. He’d actually been down there for 59 days. His experience of time was rapidly changing. From an outside perspective he was slowing down, but the psychological experience for Siffre was that time was speeding up.
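As a back-of-envelope check, here is the arithmetic implied by those figures (nothing beyond the numbers quoted above):

```python
# Back-of-envelope arithmetic from the Siffre figures quoted above.
counted_units = 120        # he counted to 120 at what felt like one-second intervals
elapsed_seconds = 5 * 60   # ...but the count actually took 5 minutes
print(elapsed_seconds / counted_units)  # 2.5 real seconds per subjective "second"

guessed_days = 34
actual_days = 59
print(actual_days / guessed_days)       # ~1.7: clock time ran far ahead of felt time
```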

But you don’t have to hide out in a cave for a couple of months to warp time, it happens to us all the time. Our experience of time is flexible; it depends on attention, motivation, the emotions and more.

1. Life-threatening situations

People often report that time seems to slow down in life-threatening situations, like skydiving.

But are we really processing more information in these seconds when time seems to stretch? Is it like slow-motion cameras in sports which can actually see more details of the high-speed action?

To test this, Stetson et al. (2007) had people staring at a special chronometer while free-falling 50 metres into a net. What they found was that time resolution doesn’t increase: we’re not able to distinguish shorter periods of time when in danger. What happens is we remember the time as longer because we record more of the experience. Life-threatening experiences make us really pay attention but we don’t gain superhuman powers of perception.

2. Time doesn’t fly when you’re having fun

We’ve all experienced the fact that time seems to fly when we’re having fun. Or does it? What about when you’re listening to a fantastic uplifting piece of music? Does time seem to fly by, or conversely, does it seem to slow down?

When this was tested by Kellaris (1992), they found that when listeners enjoyed the music more, time seemed to slow down. This may be because when we enjoy music we listen more carefully, getting lost in it. Greater attention leads to perception of a longer interval of time.

The same thing happens when you have a really good, exciting day out. At the end of the day it can feel like you ate breakfast a lifetime ago. You enjoyed yourself enormously and yet time has stretched out.

The fact that we intuitively believe time flies when we’re having fun may have more to do with how time seems to slow when we’re not having fun. Boredom draws our attention to the passage of time which gives us the feeling that it’s slowing down.

Or—prepare yourself for a 180 degree about-face—it could all be the other way around. Perhaps you’re having fun when time flies. In other words, we assume we’ve been enjoying ourselves when we notice that time has passed quickly.

There’s evidence for this in a recent experiment by Sackett et al. (2010). Participants doing a boring task were tricked into thinking it had lasted half as long as it really had. They thought it was more enjoyable than those who had been doing exactly the same task but who hadn’t been tricked about how much time had passed.

Ultimately it may come down to how much you believe that time flies when you’re having fun. Sackett and colleagues tested this idea as well and found it was true. In their experiments, people who believed more strongly in the idea that time flies when you’re having fun were more likely to believe they were having fun when time flew. So, the whole thing could partly be a self-fulfilling prophecy.

3. The stopped clock illusion

The stopped clock illusion is a weird effect that you may have experienced. It happens when you look at an analogue watch and the second-hand seems to freeze for longer than a second before moving on.

I always thought this was because I just happened to look at it right at the start of the second, but this is actually an illusion.

What is happening is that when your eyes move from one point to another (a saccade), your perception of time stretches slightly (Yarrow et al., 2001). Weirdly, it stretches backwards. So your brain tells you that you’ve been looking at the watch for slightly longer than you really have. Hence the illusion that the second-hand is frozen for more than a second.

This happens every time our eyes move from one fixation point to the next, it’s just that we only notice it when looking at a watch. One explanation is that our brains are filling in the gap while our eyes move from looking at one thing to the next.
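A minimal way to formalise that explanation: treat the perceived duration of the first tick as the real one-second interval plus the time the eyes spent in flight, since the brain back-dates the new image to the start of the movement. The 80-millisecond saccade duration below is an assumed, illustrative value; the article gives no figure.

```python
# Minimal formalisation of the stopped-clock explanation above: the new
# image is back-dated to the start of the eye movement, so the first tick
# seems to last the real second plus the time the eyes were in flight.
# The saccade duration is an assumed, illustrative value.
saccade_duration = 0.08   # seconds the eyes spend moving (assumption)
real_tick = 1.0           # a second-hand tick really lasts one second

perceived_first_tick = real_tick + saccade_duration
print(f"perceived duration of the first tick: {perceived_first_tick:.2f} s")
# Anything noticeably longer than 1.0 s reads as the hand having "frozen".
```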

4. Too tired to tell the time

When things happen very close together in time, our brains fuse them together into a single snapshot of the present. For vision the shortest interval we can perceive is about 80 milliseconds. If two things happen closer together than that then we experience them as simultaneous.

The shortest possible gap in time we can distinguish across modalities (say visual and auditory) is between 20 and 60 milliseconds (Fink et al., 2006). That’s as little as a fiftieth of a second.

When we’re tired, though, our perception of time goes awry and we find it more difficult to distinguish between short spaces of time. This fact can be used to measure whether people are too tired to fly a plane, drive a truck or be a doctor. Indeed just such simple hand-held devices that quickly assess your tiredness are already being developed (Eagleman, 2009).
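The devices Eagleman mentions aren’t described in any detail here, but the underlying check is easy to sketch: measure the smallest audio-visual gap a person can reliably detect and flag fatigue when it drifts above the normal range quoted from Fink et al. (2006). The cutoff and the function below are illustrative assumptions, not the actual device logic.

```python
# Sketch of the temporal-discrimination idea behind such devices: measure
# the smallest audio-visual gap a person reliably detects and flag fatigue
# when it drifts above the normal range quoted from Fink et al. (2006).
# The cutoff is an illustrative assumption, not the actual device logic.

NORMAL_CROSS_MODAL_GAP_MS = (20, 60)  # normal just-noticeable gap, in milliseconds

def looks_fatigued(measured_gap_ms: float) -> bool:
    """Flag fatigue if the measured just-noticeable gap exceeds the normal range."""
    return measured_gap_ms > NORMAL_CROSS_MODAL_GAP_MS[1]

print(looks_fatigued(45))  # False: within the normal range
print(looks_fatigued(90))  # True: temporal discrimination has coarsened
```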

5. Self-regulation stretches time

The effort of trying to either suppress or enhance our emotional reactions seems to change our perception of time. Psychologists have found that when people are trying to regulate their emotions, time seems to drag on.

Vohs and Schmeichel (2003) had participants watch an 11 minute clip from the film Terms of Endearment. Some participants were asked to remain emotionally neutral while watching the clip and others were told to act naturally. Those who tried to suppress their emotions estimated the clip had lasted longer than it really had.

6. Altered states of consciousness

People report all sorts of weird experiences with time when taking drugs like psilocybin, peyote or LSD. Time can seem to speed up, slow down, go backwards, or even stop.

But you don’t need drugs to enter an altered state of consciousness, hypnosis will do the trick. People generally seem to underestimate the time that they’ve been under hypnosis. One study found this figure was around 40% (Bowers & Brenneman, 1979).

7. Does time speed up with age?

People often say the years pass more quickly as they get older. While youthful summers seemed to stretch on into infinity, the summers of your later years zip by in the blink of an eye.

A common explanation for this is that everything is new when we are young so we pay more attention; consequently it feels like time expands. With age, though, new experiences diminish and it tends to be more of the same, so time seems to pass more quickly.

Whether or not this is true, there is some psychological evidence that time passes quicker for older people. One study has found that people in their 20s are pretty accurate at guessing an interval of 3 minutes, but people in their 60s systematically overestimate it, suggesting time is passing about 20% more quickly for them (Mangan & Bolinsky, 1997).

8. The emotional experience of time

The emotions we feel in the moment directly affect our perception of time. Negative emotions in particular seem to bring time to people’s attention and so make it seem longer.

Research on anxious cancer patients, those with depression and boredom-prone individuals suggests time stretches out for them (reported in Wittmann, 2009). Just like life-threatening situations, negative emotions can concentrate our attention on the passage of time and so make it seem longer than it really is.

This effect may be made worse by our efforts to regulate these negative emotions (see number 5), which also has the effect of stretching time.

9. It’s getting hot in here

If you’ve ever had a fever then you’ll know that body temperature can have strange effects on time perception.

Experiments have found that when body temperature is raised our perception of time speeds up (Wearden & Penton-Voak, 1995). Conversely, when we are cooled down, our sense of time slows down.

10. What’s your tempo?

Setting aside emotions, age, drugs and all the rest, our experience of time is also affected by who we are. People seem to operate to different beats; we’ve all met people who work at a much slower or faster pace than we do. Psychologists have found that people who are impulsive and oriented towards the present tend to find that time moves faster for them than others (from O’Brien et al., 2011).

There’s little research on this but it’s likely that each of us has our own personal tempo. Research has found that when different people listen to metronomes the number of beats per minute they describe as comfortable ranges from as slow as 40 bpm up to a high of 200 bpm (Kir-Stimon, 1977). This is a large range and may help to explain why some people seem to operate at such a different pace to ourselves.

Time is relative

The last words on time come from two great thinkers; first Albert Einstein:

“Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour, and it seems like a minute. That’s relativity.”

And finally, Douglas Adams:

“Time is an illusion. Lunchtime doubly so.”

Image credit: Alice Lucchin

What Sugar Actually Does To Your Brain And Body

Source: Adam Dachis on Lifehacker

We consume an enormous amount of sugar, whether consciously or not, but it’s a largely misunderstood substance. There are different kinds and different ways your body processes them all. Some consider it poison and others believe it’s the sweetest thing on earth. Here’s a look at the different forms of sugar, the various ways they affect you, and how they play a role in healthy — and unhealthy — diets.

Of course, if you already know how sugar works and how your body uses it, feel free to skip down to the final section about healthier sugar consumption.

The Different Types of Sugar

There are too many types of sugar (and, of course, sugar substitutes) to tackle in a high-level overview like this one, so we’re really only going to look at the two (and a half) that you regularly encounter: glucose and fructose.

Glucose

Glucose is a simple sugar that your body likes. Your cells use it as a primary source of energy, so when you consume glucose, it’s actually helpful. When it’s transported into the body, it stimulates the pancreas to produce insulin. Your brain notices this increase, understands that it’s busy metabolising what you just ate, and tells you that you’re less hungry. The important thing to note here is that when you consume glucose, your brain knows to tell you to stop eating when you’ve had enough.

But glucose isn’t perfect. There are many processes involved when you consume glucose, but one that occurs in your liver produces something called very low density lipoprotein, or VLDL (http://en.wikipedia.org/wiki/Very_low_density_lipoprotein). You don’t want VLDL. It causes problems (like cardiovascular disease). Fortunately, only about 1 in 24 calories from glucose processed by the liver turns into VLDL. If glucose were the only thing you ate that produced VLDL, it would be a non-issue.

Sucrose and High Fructose Corn Syrup (HFCS)

For our purposes, high fructose corn syrup (HFCS) and sucrose are the same thing because they’re both highly sweet and they both contain a large amount of fructose. Sucrose is 50 per cent fructose and HFCS is 55 per cent fructose (which is high compared to normal corn syrup, but pretty normal when compared to cane sugar). The remainder of each is glucose, which we discussed above. In most cases, fructose is bad for you because of how it’s processed by the body. Fructose can only be metabolised by the liver, which is not a good thing. This means a greater number of calories — about three times more than glucose — are going through liver processes, and that results in a much higher production of VLDL (the bad cholesterol mentioned earlier) and fat. It also results in a higher production of uric acid and a lot of other things you don’t want, which is believed to lead to fun stuff like hypertension (high blood pressure).
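To put those percentages into numbers, here is a small sketch that estimates the fructose load of a sweetened product from the splits quoted above (50 per cent for sucrose, 55 per cent for HFCS). The 39-gram serving in the example is an illustrative figure, not taken from the article.

```python
# Rough fructose-load estimate using the splits quoted above: sucrose is
# about 50 per cent fructose, HFCS about 55 per cent (the rest is glucose).
FRUCTOSE_FRACTION = {"sucrose": 0.50, "hfcs": 0.55}

def fructose_grams(total_sugar_g, sweetener):
    """Grams of fructose in a product sweetened with the given sugar."""
    return total_sugar_g * FRUCTOSE_FRACTION[sweetener]

# Illustrative example: a can of soda listing 39 g of sugar as HFCS.
sugar_g = 39
print(f"fructose: {fructose_grams(sugar_g, 'hfcs'):.1f} g "
      f"(vs {fructose_grams(sugar_g, 'sucrose'):.1f} g if sweetened with sucrose)")
```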

On top of that, fructose consumption negatively changes the way your brain recognises your consumption. This is because your brain resists leptin, the protein that’s vital for regulating energy intake and expenditure (which includes keeping your appetite in check and your metabolism working efficiently). As a result, you keep eating without necessarily realising you’re full. For example, a soda containing high amounts of fructose (which is most non-diet sodas) will do little to make you think you’re full even though you’re taking in large amounts of calories. Your brain doesn’t get the message that you really consumed much of anything, and so it thinks you’re still hungry. This is a very, very basic look at part of how fructose is processed and doesn’t even touch upon many of its other problems, but it identifies the issue most people care about: fat production.

This isn’t to say fructose is all bad. It does have a practical purpose. If you’re a professional athlete, for example, it can actually be helpful. HFCS replenishes your glycogen supply faster, which is useful when you’re burning it off, so the use of HFCS in sports drinks has a practical purpose for those who can quickly burn it off. It’s not so helpful for those of us whose life focus is not physical activity — unless we find ourselves in a situation where we need fast energy that we’re going to quickly burn off.

Processed vs Unprocessed Foods

Fruit contains fructose, but as any food pyramid or suggested intake ratios will tell you, fruit is OK. How is that possible if fructose is almost always bad? This is because fruit, in its natural form, contains fibre. Fructose doesn’t provide a satiety alert to let your brain know to tell you to stop eating, but fibre does this to a high degree. This is why you can eat fruit — despite the fructose content — without experiencing the same problems as, say, drinking a sugary soft drink. This is why fruit can actually be beneficial. The same goes for processed sugar. Sugar doesn’t exist naturally as sparkly white crystals, but as a really tough stick called sugar cane. It isn’t until you process the sugar cane that you lose all the fibre it contains. Without the fibre, you only have the tasty but problematic part of the original food. That’s why processed sugars can cause problems.

So why not keep the fibre (or at least some of it)? Because when you process food, you’re not processing it for the purpose of eating immediately. Instead you’re processing it to ship all over the country, or even the world. To do this, you obviously can’t let the food expire or it will be useless when it arrives. Because fibre causes the food to go bad much faster, it needs to be removed.

Photo by Dee’s Illustrations

Additionally, many processed foods are even worse off because of their low fat content. Sure, low fat content sounds good, but just because you eat fat doesn’t mean you retain it. Your body can efficiently process and excrete fat, so fat intake isn’t a huge issue by default. Nonetheless, the past 40 years brought us a low-fat craze. Fresh food can still taste good without a higher fat content, but processing low-fat food makes it taste like crap. Companies understand this, and so they add a bunch of sugar (and often salt) to fix that problem. This process essentially exchanges fat your body can actually use for fructose-produced fat that it cannot.

These are the main reasons why processed food is often an enemy if you want to stay healthy. This isn’t always the case, but it is far more likely than not. Check the sugar content on the back of every package of processed food you own or see at the supermarket and you’ll see it for yourself.

Healthier Sugar Consumption

OK, so some sugar isn’t really bad for you but some sugar, like fructose in high amounts, is unhealthy. Since fructose is plentiful in many processed foods, how can you eat better and still enjoy the sweet things you like? What follows are some suggestions. Some require a bit of sacrifice and will be difficult — but more effective — and others are easy enough for anyone to incorporate in his or her diet. If you want to try and curb your sugar intake, be reasonable about what you can accomplish. Failure is a lot more likely if you try to pack in large amounts of change at once. When you cut back on anything slowly, it feels much easier and is more likely to stick.

Stop Drinking Sugared Beverages
Of anything you can do, this is the most important. Fructose-heavy soft drink is remarkably problematic because, for reasons discussed above, you can keep drinking it while your body isn’t recognising your sugar intake — so your body remains hungry. On top of that, a lot of soft drink (Coke is a great example) contains high amounts of sodium. Why would you want salt in your soda? You wouldn’t, but it makes you thirsty and prompts you to buy more soft drink, so it’s great for the companies that make it. It also makes you pee (as does caffeine if your drink of choice has that) so you’ll feel the need to drink more as well. This is masked by simply adding more fructose to the drink, which is another obvious problem.

All of that is bad, but what makes it so important to stop drinking soft drinks is that you get absolutely nothing else with them. While other sugary items — such as a slice of cake or a doughnut — are no shining examples of nutrition, they at least contain some nutrients that will help to alert your brain that you’re actually eating. Fructose-heavy soft drink won’t do this, so it’s best to just cut it out entirely. This is the hardest thing but the most important. Cutting it out will make it easier to stop eating too much sugar (or anything, really), because you’ll be taking in far fewer calories that go unnoticed by your brain.

What can you drink without issue? Water.

This may sound horrible to some people, but pretty much every other drink you can buy is a processed drink. This isn’t to say you can never have another sugared beverage again, but the more you drink them the harder it will be to control your appetite. If you want to incorporate sugared drinks and alcoholic beverages into your diet, try consuming them 20 minutes after you’ve eaten. You can use this same trick for desserts. (More on this in a minute.)

Eat Fibre with Your Sugar
As previously mentioned in the section about processed and unprocessed foods, fibre is very necessary in curbing sugar intake. It does what fructose can’t do: alert you that you’ve consumed calories and don’t need to eat any more. Basically, fibre and fructose need to work together. Fibre is fructose’s unattractive but brilliant friend. Fructose makes up for fibre’s lack of sweetness while fibre makes up for fructose’s uselessness.

So how do you eat fibre with your fructose? Don’t eat processed foods. Get your fructose from fruit or other sources that contain built-in fibre.

Avoid Processed Foods with High Amounts of Sugar
Cooking your own meals from unprocessed foods is almost always going to be a better option, but our busy lives make that difficult to accomplish for every single meal we eat. While avoiding processed foods altogether is a nice thought, it’s not very realistic. If you’re going to eat something processed, be sure to check the label for sugar content. If it is not a dessert food and the sugar count isn’t negligible, you should probably avoid it. If it contains HFCS early on in the ingredient list (or at all, really), you should probably avoid it. Buy whole wheat breads that are actually whole wheat. Avoid pre-packaged dinners whenever you can. Buy foods with more fibre. They’re likely to expire faster, which means more frequent trips to the grocery store, but that’s a pretty minor sacrifice to make.

Keep Sugar Products Out of the House
If you like dessert, don’t keep it at home. This is obvious, but it’s also one of the most effective options (you can’t eat something you don’t have). If you really want it, make yourself do a little work. Have dinner, and if you have a craving for dessert afterwards then go out and get some. Chances are it won’t take more than 20 minutes for that craving to die, as you’ll fill up and won’t want to eat anything else. In the event it doesn’t, go out and buy a reasonably sized dessert. As long as you’re not inclined to do this regularly, prolonging the decision to eat dessert should help you out.

Don’t Cut It Out Entirely

Photo by Nick Depree

If you’re currently eating quite a bit of sugar, or you really like it, cutting it out entirely is a bad idea. Not only is comfort food possibly good for your mental health, but it’s also believed that you can develop a dependency on sweet foods. As an experiment, I cut out sugar for a month before writing this post. While the physical cravings were easy to curb, the psychological ones were much more challenging. Angela Pirisi, writing for Psychology Today, points to a study conducted by psychologist Dr Bart Hoebel, who believes sugar creates an actual dependency:

Laboratory experiments with rats showed that signs of sugar dependence developed over the course of 10 days. This suggests that it does not take long before the starve-binge behaviour catches up with animals, making them dependent. There is something about this combination of heightened opioid and dopamine responses in the brain that leads to dependency. Without these neurotransmitters, the animal begins to feel anxious and wants to eat sweet food again.

Artificial sweeteners didn’t change the dependence, leading Hoebel to believe that the sweetness was the main factor and not the calories. While the study couldn’t identify why these cravings exist, it could identify a dependency. If you’re cutting down on sugar, take it slowly.

Get Moving
Your metabolism pretty much goes in the toilet when you don’t move around at all, making sitting the harbinger of death. We’re big on standing desks, which, for starters, help you burn far more calories than sitting. It’s just good for you all-around. As with any level of physical activity, from standing to walking to running, calorie burn is a poor focus to have. Going for a 20-minute run is about equal to two thin mint cookies (unless you’re really fast, in which case you might get a third cookie). Burning off a fast food meal would require exercising for most of your day. It’s just not feasible for anyone. Physical activity helps because it reduces stress (which reduces appetite) and improves the way your metabolism functions (so less fat is produced when food is processed by your body). These things are much more important than calorie burn.

Standing up is a good way to negate the effects of sitting down but you might not be able to do it all the time. If you can’t, make sure you get up and walk around at least every 30 minutes. If you just don’t want to stand up while you work, try doing it for only an hour a day. It’s a short amount of time and is better than nothing. Regardless of how much you sit, keep track of the time and try to engage in physical activity — even if it’s as mild as walking around — for as close to that amount of time as possible. Go for walks (or walk instead of drive), play a sport, exercise, clean the house, or do anything that keeps you moving around. Generally the entertainment you consume while sitting (television and movies) can still be consumed while you’re standing or moving around. This may not be your ideal situation, but it’s a good way to increase your physical activity without giving up a normally sedentary activity you enjoy.

Like with anything, sugar isn’t all that bad for you in moderation. The problem with sugar these days is that there’s a lot more of it in our food and it’s in practically everything. So long as you pay attention to what you’re eating and you don’t overdo it, sugar can be a pleasant part of your life with few to no issues. The important thing is that you know what you’re consuming and make good choices as a result. The answer to this problem isn’t groundbreaking, but just a matter of paying attention.

Want to learn more about sugar and how it works? You’ll find a lot of links within this post to other studies and additional information that’s worth reading, but you also should check out Dr Robert H. Lustig’s lecture on sugar (which was the initial reason for writing this), as well as Sweet Surprise, which is an HFCS advocacy website that argues against the claims that it is bad for you.

Source: Brandon Keim, Wired Science

Sea nettles at the Monterey Bay Aquarium. (jimg944/Flickr).

That waste is useful is one of the animal kingdom’s cardinal principles. One creature’s discards are another’s dinner, and so continues the circle of life. But jellyfish, it would seem, bend the rule.

Their waste is generally inedible, food mostly for a few odd species of bacteria that live just long enough to emit a whiff of CO2, then sink. All that nutrition and energy vanishes with barely a trace.

During a jellyfish bloom, food webs may thus be plucked and rearranged, configured to feed jellies that in turn feed almost nothing. Whether this represents the future of Earth’s oceans depends on whom you ask, but it’s an interesting phenomenon in itself.

“Jellyfish are consuming more or less everything that’s present in the food web,” said Robert Condon, a Virginia Institute of Marine Science biologist and co-author of a jellyfish-impact study published June 7 in Proceedings of the National Academy of Sciences. “They’re eating a lot of the food web, and turning it into gelatinous biomass. They’re essentially stealing a lot of the energy, then putting it away.”

Condon and his co-authors are part of a research community whose attention has been recently transfixed by jellyfish, which evolved more than 500 million years ago and once dominated Earth’s oceans, but until the late 20th century were of largely esoteric scientific interest.

In the 1990s, however, jellyfish populations exploded in the Bering Sea, rising by a factor of 40 in less than a decade. Fishermen nicknamed one region the “Slime Bank.”

By the time those blooms subsided, fishermen in the Sea of Japan were accustomed to 500-million-strong swarms of refrigerator-sized, ship-sinking Nomura jellyfish, their numbers unprecedented in recent memory. In the Mediterranean, once-seasonal jellies became a year-round fact of life, again wreaking fisheries havoc.

The blooms became a matter of popular and scientific fascination. Some researchers talked of a “rise of slime,” interpreting the blooms as portents of a “gelatinous future” in which overfished, overpolluted and rapidly overheating marine ecosystems are overrun by algae and jellies.

Such grim assessments may prove correct, though Condon thinks it’s too soon to know. Long-term datasets are few, and these seemingly apocalyptic blooms may represent a mix of local disturbance and natural cycling, not a global tipping point into ooze. But whatever the case may be, studying jellyfish is a sensible thing to do.

“They’re a big unknown,” said Condon, and one of the biggest unknowns is this: At an ecological level, exactly what happens during a jellyfish bloom, anyway?

In what may be the most comprehensive jellyfish study to date, Condon’s group spent nearly four years gathering data from Chesapeake Bay on Mnemiopsis leidyi and Chrysaora quinquecirrha, two species that have caused trouble elsewhere and are considered representative of jellyfish habits worldwide.

The researchers counted them at sea, measured the nutrients in surrounding water, and calculated the composition of nearby bacterial communities. In the lab, they observed how bacteria in seawater reacted to jellyfish, and tracked chemicals flowing through their aquariums.

They found that jellyfish, like many other marine species, excrete organic compounds as bodily wastes and as slime that covers their bodies. But whereas the excretions of other species are consumed by bacteria that form important parts of oceanic food webs, jellyfish excretions nourish gammaproteobacteria, a class of microbes that little else in the ocean likes to eat, and that produces little of further biological use.

“Lots of marine creatures make this dissolved organic matter that bacteria use to live. But the point of this paper is that the organic matter produced by jellies doesn’t make it back up the food web,” said study co-author Deborah Steinberg, also a Virginia Institute of Marine Science biologist. “When jellies are around, they’re shunting this energy into a form that’s just not very usable. They’re just shunting energy away from the rest of the food web.”


Model of the water-column food web before and after jellyfish blooms. Courtesy PNAS

Under normal conditions, gammaproteobacteria are rare. During jellyfish blooms, they may become ubiquitous. And though many questions remain unanswered — perhaps jellyfish and gammaproteobacteria end up as food in the open ocean, beyond the confines of this study — the implications are stark. Given time and numbers, jellyfish might be able to suck an ecosystem dry, converting its bounty to short-lived bacteria.

Even if it’s too soon to say that all Earth’s oceans are returning to some ancient, jellyfish-dominated state, it’s clear that in some areas people have made it easier for jellies, said Steinberg. Overfishing and pollution leave gaps that jellies have spent half a billion years evolving to exploit.

“We’re a long ways from jellyfish taking over the world, but humans are changing food webs in the ocean by our activities,” Steinberg said. “It’s an experiment, a big experiment, and we don’t know yet what the outcome is going to be. We need to be careful.”

Source: Brandon Keim, Wired Science

A visualization of a ribozyme. (University of California, Santa Barbara)

The transformation of raw genetic material on a laboratory bench has provided a rare empirical demonstration of processes that may be universally crucial to evolution, but are only beginning to be understood.

The processes, called cryptic variation and preadaptation, involve mutations that don’t affect an organism when they first occur and so are initially exempt from the pressures of natural selection. As they accumulate, however, they may at some later date combine to form the basis of complex, unpredictable new traits.

In the new study, the ability of evolving, chemical-crunching molecules called ribozymes to adapt in new environments proved directly related to earlier accumulations of cryptic mutations. The details are esoteric, but their implications involve the very essence of adaptation and evolution.

“It’s one of the more modern topics in evolutionary theory,” said mathematical biologist Joshua Plotkin of the University of Pennsylvania, author of a commentary on the experiment, which was described June 2 in Nature. “The idea has been around for a while, but direct evidence hasn’t been found until recently.”

The experiment was led by evolutionary biologists Eric Hayden and Andreas Wagner of Switzerland’s University of Zurich, who use ribozymes, molecules made from RNA (a single-stranded form of genetic material), to study evolutionary principles in the simplest possible way.

The principles of cryptic variation and preadaptation were first proposed in the mid-20th century and conceptually refined in the mid-1970s. They were logical answers to the question of how complex traits, seemingly far too complex to be explained by one or a few mutations, could arise.

But even as such leading thinkers as Stephen Jay Gould embraced the concept, it proved difficult to study in detail. The tools didn’t exist to interpret genetic data with the necessary rigor. The concept itself was also difficult to grasp, injecting long periods of accumulating, seemingly purposeless mutations into an evolutionary narrative supposedly driven by constant selection.

In recent years, however, with the advent of better tools and a growing appreciation for evolution’s sheer complexity, researchers’ attention has turned again to cryptic variation and preadaptation. Computer models and scattered observations in bacteria and yeast hinted at their importance. But definitive proof, combining exhaustive genetic observation with real-world evolution, was elusive.

“Cryptic variation addresses questions of innovation,” said Hayden. “How do new things come about in biology? There’s been a long history of this concept, but no concrete experimental demonstration.”

In the new study, Hayden and Wagner evolved ribozymes in test tubes of chemicals, then moved them to a new chemical substrate, a shift analogous to requiring animals to suddenly subsist on a new food source.

The ribozymes that flourished were those that had accumulated specific sets of cryptic mutations in their former environment. Those variations, seemingly irrelevant before, became the basis of newly useful adaptation. The researchers were able to measure every change in detail.

“It is a groundbreaking proof of principle,” said University of Arizona evolutionary biologist Joanna Masel, who wasn’t involved in the study. “This study is a clear demonstration that cryptic genetic variation can make evolution more effective.”

According to Plotkin, cryptic variation and preadaptation may be crucial to the evolution of drug resistance and immune system evasion in pathogens. Rather than looking for straightforward mutations, researchers could search for combinations, perhaps developing an “advance warning system” to flag seemingly innocuous changes.

Another application could be in genetic engineering. Whereas virus and bacteria designers tend to “accept any mutations that get them closer to their intended outcome,” said Plotkin, “it might be important to take lateral steps as well as uphill steps.”

Cryptic variation and preadaptation could also be important to the evolution of animals, from the origin of multicellularity to complex features like eyes and language. Plotkin would like to see studies revisiting the evolution of Charles Darwin’s famous finch beaks, but with an eye toward these newly described processes.

Masel said that better understanding cryptic variation and preadaptation could help programmers of evolving computer systems, and perhaps explain why some systems are better able than others to evolve. “Why are biological systems so evolvable?” she said. “This dynamic may or may not be the essence of evolvability. That’s certainly one of the hypotheses out there, and I am enthusiastic about it.”

These processes could also help interpret genomic studies that loosely link hundreds or thousands of genetic mutations to disease and development, frustrating geneticists searching for genetic patterns of heritability, said Masel and Hayden. And at a social level, they could be instructive to people interested in fostering innovation.

“My prediction is that it is good to foster lots of variation,” said Masel, who likened cryptic variation and preadaptation to Google’s famous requirement that employees spend 20 percent of their time on projects of personal whimsy. Rather than focusing narrowly on ideas that are obviously good, “Foster circumstances where lots of non-terrible ideas are floating around,” said Masel.

Citations: “Cryptic genetic variation promotes rapid evolutionary adaptation in an RNA enzyme.” By Eric J. Hayden, Evandro Ferrada & Andreas Wagner. Nature, Vol. 473 Issue: 7348, June 2, 2011.

“Hidden diversity sparks adaptation.” By Joshua Plotkin and Jeremy Draghi. Nature, Vol. 473 Issue: 7348, June 2, 2011.

Source: John C Abell, Wired Epicentre

There are no two ways about it: E-books are here to stay. Unless something as remarkable as Japan’s reversion to the sword occurs, digital books are the 21st century successor to print. And yet the e-book is fundamentally flawed. There are some aspects to print book culture that e-books can’t replicate (at least not easily) — yet.

Let’s put this into some context first. Amazon sparked the e-reader revolution with the first Kindle barely three-and-a-half years ago, and it already sells more e-books than all print books combined. Barnes & Noble, the century-old bricks-and-mortar bookseller, is being pursued by Liberty Media not because it has stores all over the place but because its Nook e-reader is the Kindle’s biggest competitor.

Reasonable arguments that the iPad would kill the e-reader seem laughable now, as both thrive and many people own one of each. One thing e-books and print books are equally good at: In their own ways, they’re both platform agnostic.

But for all of the benefit they clearly bring, e-books are still falling short of a promise to make us forget their paper analogs. For now, you still lose something by moving on.

It isn’t always that way with tech: We rejoice at cutting the phone cord, we don’t fret that texting causes lousy penmanship and we are ecstatic that our computers, tablets and phones are replacing the TV set.

I’m not resorting to variations on the ambiguous tactile argument (“The feel and smell of paper is an integral part of the reading experience….”) that one hears from some late-to-never adopters. And — full disclosure — I have never owned an e-book reader, because I have an ingrained opposition to single-purpose devices. But since getting an iPad on day one, I haven’t purchased a print edition of anything for myself.

I am hooked — completely one with the idea that books are legacy items that may never go away, but have been forever marginalized as a niche medium. With that in mind, however, here are five things about e-books that might give you pause about saying good riddance to the printed page.

Fix these problems, and there really will be no limits to the e-book’s growth.

1) An unfinished e-book isn’t a constant reminder to finish reading it.

Two months into 2011, New York Times tech reporter (and former Wired reporter) Jenna Wortham wrote excitedly that she had finally finished her first e-book — how is such technological tardiness possible for someone so plugged in? Wortham had an excellent explanation: She kept forgetting to pick up any e-book she had started reading. It took the solemn determination of a New Year’s resolution to break that spell.

E-books don’t exist in your peripheral vision. They do not taunt you to finish what you started. They do not serve as constant, embarrassing reminders of your poor reading habits. Even 1,001 digital books are out of sight, and thus out of mind. A possible solution? Notifications that pop up to remind you that you’ve been on page 47 of A Shore Thing for 17 days.

2) You can’t keep your books all in one place.

Books arranged on your bookshelves don’t care what store they came from. But on tablets and smartphones, the shelves are divided by app — you can’t see all the e-books you own from various vendors, all in one place. There is simply no app for that. (With e-readers, you are doubly punished, because you can’t buy anything outside the company store anyway).

Apple doesn’t allow developers to tap into root information, which would be needed to create what would amount to a single library on an iOS device. If that restriction disappeared, there would still be the matter of individual vendors agreeing to cooperate — not a given since they are competitors and that kind of leveling could easily lead to price wars, for one thing.

But the way we e-read is the reverse of how we read. To pick up our next physical book, we peruse bookshelves we’ve arranged and pick something out. In the digital equivalent, we would see everything we own, tap on a book and it would invoke the app it requires — Kindle, Nook, Borders, etc. With the current sequence — open up a reader app, pick a book — you can easily forget what you own. Trivial? Try to imagine Borders dictating the size and shape of your bookshelf, and enforcing a rule that it hold only books you bought from them, and see if that thought offends you even a little bit.

3) Notes in the margins help you think.

It’s not enough to be able to highlight something. A careful reader wants to argue with the author, or amplify a point, or jot down an insight inspired by something freshly read. And it has to be proximate to the original — a separate notebook is ridiculous, even with a clever indexing system that seems inventable but is yet to be invented.

Books don’t offer much white space for readers to riff in, but e-books offer none. And what about the serendipity of sharing your thoughts, and being informed by the thoughts of others, from the messages in shared books?

Replicating this experience will take a new standard, adopted universally, among competitors whose book tech, unlike paper, is proprietary. For a notion of what this might look like, check out OpenMargin.

4) E-books are positioned as disposable, but aren’t priced that way.

This one is simple, and also easy to oversimplify since people still have to get paid. But until e-books truly add new value, the way Hollywood did with DVD extras, it’s just annoying to plunk down $13 for what amounts to a rental. E-books cost virtually nothing to produce, and yet the baseline cover price, set by publishers, is only fractionally below the discount price for the print version of new releases.

E-books can’t be shared, donated to your local library or shelter, or re-sold. They don’t take up space, and so they never coax the conflicted feelings that arise when it’s time to weed some print books out. But because they aren’t social, even in the limited way that requires some degree of human contact in the physical world, they will also never be an extension of your personality. Which brings me to …

5) E-books can’t be used for interior design.

Before you roll your eyes at the shallowness of this gripe, consider this: When in your literate life did you not garnish your environment with books as a means of wordlessly introducing yourself to people in your circle? It probably began that time you toted The Cat in the Hat, trying not to be dispatched to bed during a grown-up dinner party.

It may be all about vanity, but books — how we arrange them, the ones we display in our public rooms, the ones we don’t keep — say a lot about what we want the world to think about us. Probably more than any other object in our homes, books are our coats of arms, our ice breakers, our calling cards. Locked in the dungeon of your digital reader, nobody can hear them speak on your behalf.

It’s a truism that no new medium kills the one it eclipses — we still have radio, which pre-dates the internet, television and movies. So it would be foolish to predict the death of books anytime soon. And we haven’t seen the end of creative business models — there is no “all access pass” in book publishing, as is now the trend for magazines and newspapers that have put up paywalls. Getting an e-book along with your print edition (or the other way around) could be the best of both worlds, or the worst.

It would certainly solve my unexpected home decor problem.

Photo: Anthropologie store window, New York City. (John C Abell/Wired.com)

Some molds cause allergic reactions and respiratory problems. And a few molds, in the right conditions, produce “mycotoxins,” poisonous substances that can make people sick. When you see mold on food, is it safe to cut off the moldy part and use the rest? To find the answer to that question, delve beneath the surface of food to where molds take root.

What Are Molds?
Molds are microscopic fungi that live on plant or animal matter. No one knows how many species of fungi exist, but estimates range from tens of thousands to perhaps 300,000 or more. Most are filamentous (threadlike) organisms and the production of spores is characteristic of fungi in general. These spores can be transported by air, water, or insects.

Unlike bacteria that are one-celled, molds are made of many cells and can sometimes be seen with the naked eye. Under a microscope, they look like skinny mushrooms. In many molds, the body consists of:

  • root threads that invade the food it lives on,
  • a stalk rising above the food, and
  • spores that form at the ends of the stalks.

The spores give mold the color you see. When airborne, the spores spread the mold from place to place like dandelion seeds blowing across a meadow.

Molds have branches and roots that are like very thin threads. The roots may be difficult to see when the mold is growing on food and may be very deep in the food. Foods that are moldy may also have invisible bacteria growing along with the mold.

Are Some Molds Dangerous?
Yes, some molds cause allergic reactions and respiratory problems. And a few molds, in the right conditions, produce “mycotoxins,” poisonous substances that can make you sick.

Are Molds Only on the Surface of Food?
No, you only see part of the mold on the surface of food — gray fur on forgotten bologna, fuzzy green dots on bread, white dust on Cheddar, coin-size velvety circles on fruits, and furry growth on the surface of jellies. When a food shows heavy mold growth, “root” threads have invaded it deeply. In dangerous molds, poisonous substances are often contained in and around these threads. In some cases, toxins may have spread throughout the food.

Where Are Molds Found?
Molds are found in virtually every environment and can be detected, both indoors and outdoors, year round. Mold growth is encouraged by warm and humid conditions. Outdoors, they can be found in shady, damp areas or places where leaves or other vegetation are decomposing. Indoors, they can be found where humidity levels are high.

Molds form spores which, when dry, float through the air and find suitable conditions where they can start the growth cycle again.

What Are Some Common Foodborne Molds?
Molds most often found on meat and poultry are Alternaria, Aspergillus, Botrytis, Cladosporium, Fusarium, Geotrichum, Monilia, Monascus, Mortierella, Mucor, Neurospora, Oidium, Oospora, Penicillium, Rhizopus and Thamnidium. These molds can also be found on many other foods.

What Are Mycotoxins?
Mycotoxins are poisonous substances produced by certain molds found primarily in grain and nut crops, but are also known to be on celery, grape juice, apples, and other produce. There are many of them and scientists are continually discovering new ones. The Food and Agriculture Organization (FAO) of the United Nations estimates that 25% of the world’s food crops are affected by mycotoxins, of which the most notorious are aflatoxins.

What is Aflatoxin?
Aflatoxin is a cancer-causing poison produced by certain fungi in or on foods and feeds, especially in field corn and peanuts. Aflatoxins are probably the best known and most intensively researched mycotoxins in the world, and they have been associated with various diseases, such as aflatoxicosis, in livestock, domestic animals, and humans throughout the world. Many countries try to limit exposure to aflatoxin by regulating and monitoring its presence on commodities intended for use as food and feed. The prevention of aflatoxin contamination is one of the most challenging toxicology issues of the present time.

How Does the U.S. Government Control Aflatoxins?
Aflatoxins are considered unavoidable contaminants of food and feed, even where good manufacturing practices have been followed. The U.S. Food and Drug Administration and the USDA monitor peanuts and field corn for aflatoxin and can remove any food or feed with unacceptable levels of it.

Is Mushroom Poisoning Caused by Molds?
No. Mushroom poisoning is caused by toxins that certain mushrooms (higher species of fungi, in the same family as molds) produce naturally, whether the mushrooms are eaten raw or cooked. The term “toadstool” is commonly applied to poisonous mushrooms, but there is no general rule of thumb for distinguishing edible mushrooms from poisonous toadstools. Most mushrooms that cause human poisoning cannot be made safe by cooking, canning, freezing, or any other processing; the only way to avoid poisoning is not to eat poisonous mushrooms.

Are Any Food Molds Beneficial?
Yes, molds are used to make certain kinds of cheeses and can be on the surface of cheese or be developed internally. Blue-veined cheeses such as Roquefort, blue, Gorgonzola, and Stilton are created by the introduction of Penicillium roqueforti spores. Cheeses such as Brie and Camembert have white surface molds. Other cheeses have both an internal and a surface mold. The molds used to manufacture these cheeses are safe to eat.

Why Can Mold Grow in the Refrigerator?
While most molds prefer warmer temperatures, they can grow at refrigerator temperatures, too. Molds also tolerate salt and sugar better than most other food invaders. Therefore, molds can grow in refrigerated jams and jelly and on cured, salty meats — ham, bacon, salami, and bologna.

How Can You Minimize Mold Growth?
Cleanliness is vital in controlling mold. Mold spores from affected food can build up in your refrigerator, dishcloths, and other cleaning utensils.

  • Clean the inside of the refrigerator every few months with 1 tablespoon of baking soda dissolved in a quart of water. Rinse with clear water and dry. Scrub visible mold (usually black) on rubber casings using 3 teaspoons of bleach in a quart of water.
  • Keep dishcloths, towels, sponges, and mops clean and fresh. A musty smell means they’re spreading mold around. Discard items you can’t clean or launder.
  • Keep the humidity level in the house below 40%.

Don’t Buy Moldy Foods
Examine food well before you buy it. Check food in glass jars, look at the stem areas on fresh produce, and avoid bruised produce. Notify the store manager about mold on foods!

Fresh meat and poultry are usually mold free, but cured and cooked meats may not be. Examine them carefully. Exceptions: Some salamis — San Francisco, Italian, and Eastern European types — have a characteristic thin, white mold coating which is safe to consume; however, they shouldn’t show any other mold. Dry-cured country hams normally have surface mold that must be scrubbed off before cooking.

Must Homemade Shelf-Stable Preserves be Water-Bath Processed?
Yes, molds can thrive in high-acid foods like jams, jellies, pickles, fruit, and tomatoes. But these microscopic fungi are easily destroyed by heat processing high-acid foods at a temperature of 212 °F in a boiling water canner for the recommended length of time. For more information about processing home-canned foods, go to the National Center for Home Food Preservation at: www.uga.edu/nchfp/.

How Can You Protect Food from Mold?

  • When serving food, keep it covered to prevent exposure to mold spores in the air. Use plastic wrap to cover foods you want to stay moist — fresh or cut fruits and vegetables, and green and mixed salads.
  • Empty opened cans of perishable foods into clean storage containers and refrigerate them promptly.
  • Don’t leave any perishables out of the refrigerator more than 2 hours.
  • Use leftovers within 3 to 4 days so mold doesn’t have a chance to grow.

How Should You Handle Food with Mold on It?
Buying small amounts and using food quickly can help prevent mold growth. But when you see moldy food:

  • Don’t sniff the moldy item. This can cause respiratory trouble.
  • If food is covered with mold, discard it. Put it into a small paper bag or wrap it in plastic and dispose in a covered trash can that children and animals can’t get into.
  • Clean the refrigerator or pantry at the spot where the food was stored.
  • Check nearby items the moldy food might have touched. Mold spreads quickly in fruits and vegetables.
  • See the attached chart “Moldy Food: When to Use, When to Discard.”
Molds on Food: When to Use, When to Discard

  • Luncheon meats, bacon, or hot dogs: Discard. Foods with high moisture content can be contaminated below the surface. Moldy foods may also have bacteria growing along with the mold.
  • Hard salami and dry-cured country hams: Use; scrub mold off the surface. It is normal for these shelf-stable products to have surface mold.
  • Cooked leftover meat and poultry: Discard. Foods with high moisture content can be contaminated below the surface. Moldy foods may also have bacteria growing along with the mold.
  • Cooked casseroles: Discard. Foods with high moisture content can be contaminated below the surface. Moldy foods may also have bacteria growing along with the mold.
  • Cooked grain and pasta: Discard. Foods with high moisture content can be contaminated below the surface. Moldy foods may also have bacteria growing along with the mold.
  • Hard cheese (not cheese where mold is part of the processing): Use. Cut off at least 1 inch around and below the mold spot, keeping the knife out of the mold itself so it will not cross-contaminate other parts of the cheese, and re-cover the cheese in fresh wrap after trimming. Mold generally cannot penetrate deep into the product.
  • Cheese made with mold (such as Roquefort, blue, Gorgonzola, Stilton, Brie, Camembert): Discard soft cheeses such as Brie and Camembert if they contain molds that are not part of the manufacturing process. If surface mold is on hard cheeses such as Gorgonzola and Stilton, cut off at least 1 inch around and below the mold spot and handle like hard cheese (above). Molds that are not part of the manufacturing process can be dangerous.
  • Soft cheese (such as cottage, cream cheese, Neufchatel, chevre, Bel Paese, etc.) and crumbled, shredded, or sliced cheeses (all types): Discard. Foods with high moisture content can be contaminated below the surface, and shredded, sliced, or crumbled cheese can be contaminated by the cutting instrument. Moldy soft cheese can also have bacteria growing along with the mold.
  • Yogurt and sour cream: Discard. Foods with high moisture content can be contaminated below the surface. Moldy foods may also have bacteria growing along with the mold.
  • Jams and jellies: Discard. The mold could be producing a mycotoxin; microbiologists recommend against scooping out the mold and using the remaining condiment.
  • Fruits and vegetables, FIRM (such as cabbage, bell peppers, carrots, etc.): Use. Cut off at least 1 inch around and below the mold spot, keeping the knife out of the mold itself so it will not cross-contaminate other parts of the produce. Small mold spots can be cut off firm fruits and vegetables with low moisture content; it is difficult for mold to penetrate dense foods.
  • Fruits and vegetables, SOFT (such as cucumbers, peaches, tomatoes, etc.): Discard. Soft fruits and vegetables with high moisture content can be contaminated below the surface.
  • Bread and baked goods: Discard. Porous foods can be contaminated below the surface.
  • Peanut butter, legumes, and nuts: Discard. Foods processed without preservatives are at high risk for mold.

Source: USDA Food Safety and Inspection Service, 2010