Posts Tagged ‘technology’

Ray Kurzweil is interviewed here by Nataly Kelly who works for Common Sense Advisory, a company focused on language in the context of globalisation, which of course includes translation. While her questions are informed by her work context (and sometimes lead to repetition in Kurzweil’s answers), they nevertheless make Kurzweil reflect on whether and how computers will replace humans as translators.

 

Kurzweil predicted more than eleven years ago, in his book “The Age Of Spiritual Machines”, that computers would reach a high degree of almost-human proficiency in translating written and spoken language by 2029. While he was formulating that prediction, Franz Och started working on language translation algorithms; Och today is the man behind Google Translate, which is still very crude but has nevertheless reached the mainstream. Considering how rapidly technological progress happens, I wouldn’t be surprised if 2029 turns out to be a fairly conservative assumption.

But the video is not so much focused on time frames as on the effects of computers on translation, for example whether we will live in societies without language barriers because computers will instantly and comprehensively convey meaning from one language to another. Kurzweil answers that question with a categorical ‘no’. Even if computer translation reaches a level of proficiency that allows us to use it for day-to-day conversation and communication, Kurzweil predicts that people will still gain immense benefits from learning other languages, the same way literature (in particular poetry) will still need humans for translation. Words do not just have literal meanings but are also carriers of cultural messages, are steeped in historical contexts and are the products of unique individual creative expression. Unless we create artificial intelligence that at least fully matches our own, machines relying on databases for translation won’t reflect and reproduce the complexity and richness that humans are able to tap into when extracting meaning from words in another language.

Another question Kurzweil addresses in this context is whether computer translations will wipe out the profession of human translators, for example in non-literary fields like that of commerce. Again he answers with a ‘no’. While making references to the just mentioned translation quality argument, he also suggests that new technologies bring diversification to traditional professions by providing people with new skill sets (and Kurzweil refers in this context to the introduction of synthesisers in the 1980s and the consequent changes computers have made to music creation).

While Kurzweil’s historical perspective seems verifiable over longer time periods, it certainly won’t hold true in the short term. After all, car factories are not filled with masses of multi-skilled workers but with robots and a handful of engineers and technicians, and those workers’ experience was just another repetition of what, for example, the weavers’ trade went through in the 19th century when the steam engine eliminated it. Therefore, and despite different avenues being opened for a new generation of language agents, countless translators will lose their professional career prospects forever – the same way people are losing their jobs in bookshops and record stores without being re-employed en masse by online companies. But I guess, if you’re a futurist with a fascination for technology, social justice aspects are outside your general frame of reference – which still makes Kurzweil’s reflection, within its own albeit limited context, quite interesting.


Source: John C Abell, Wired Epicentre

There are no two ways about it: E-books are here to stay. Unless something as remarkable as Japan’s reversion to the sword occurs, digital books are the 21st century successor to print. And yet the e-book is fundamentally flawed. There are some aspects to print book culture that e-books can’t replicate (at least not easily) — yet.

Let’s put this into some context first. Amazon sparked the e-reader revolution with the first Kindle a mere two-and-a-half years ago, and it now already sells more e-books than all print books combined. Barnes & Noble, the century-old bricks-and-mortar bookseller, is being pursued by Liberty Media not because it has stores all over the place but because its Nook e-reader is the Kindle’s biggest competitor.

Reasonable arguments that the iPad would kill the e-reader seem laughable now, as both thrive and many people own one of each. One thing e-books and books are equally good at: in their own ways, they’re both platform agnostic.

But for all of the benefit they clearly bring, e-books are still falling short of a promise to make us forget their paper analogs. For now, you still lose something by moving on.

It isn’t always that way with tech: We rejoice at cutting the phone cord, we don’t fret that texting causes lousy penmanship and we are ecstatic that our computers, tablets and phones are replacing the TV set.

I’m not resorting to variations on the ambiguous tactile argument (“The feel and smell of paper is an integral part of the reading experience….”) that one hears from some late-to-never adopters. And — full disclosure — I have never owned an e-book reader, because I have an ingrained opposition to single-purpose devices. But since getting an iPad on day one, I haven’t purchased a print edition of anything for myself.

I am hooked — completely one with the idea that books are legacy items that may never go away, but have been forever marginalized as a niche medium. With that in mind, however, here are five things about e-books that might give you pause about saying good riddance to the printed page.

Fix these problems, and there really will be no limits to the e-book’s growth.


1) An unfinished e-book isn’t a constant reminder to finish reading it.

Two months into 2011, New York Times tech reporter (and former Wired reporter) Jenna Wortham wrote excitedly that she had finally finished her first e-book — how is such technological tardiness possible for someone so plugged in? Wortham had an excellent explanation: she kept forgetting to pick up any e-book she had started reading. It took the solemn determination of a New Year’s resolution to break that spell.

E-books don’t exist in your peripheral vision. They do not taunt you to finish what you started. They do not serve as constant, embarrassing reminders of your poor reading habits. Even 1,001 digital books are out of sight, and thus out of mind. A possible solution? Notifications that pop up to remind you that you’ve been on page 47 of A Shore Thing for 17 days.

2) You can’t keep your books all in one place.

Books arranged on your bookshelves don’t care what store they came from. But on tablets and smartphones, the shelves are divided by app — you can’t see all the e-books you own from various vendors, all in one place. There is simply no app for that. (With e-readers, you are doubly punished, because you can’t buy anything outside the company store anyway).

Apple doesn’t allow developers to tap into root information, which would be needed to create what would amount to a single library on an iOS device. If that restriction disappeared, there would still be the matter of individual vendors agreeing to cooperate — not a given since they are competitors and that kind of leveling could easily lead to price wars, for one thing.

But the way we e-read is the reverse of how we read. To pick up our next physical book, we peruse bookshelves we’ve arranged and pick something out. In the digital equivalent, we would see everything we own, tap on a book and it would invoke the app it requires — Kindle, Nook, Borders, etc. With the current sequence — open up a reader app, pick a book — you can easily forget what you own. Trivial? Try to imagine Borders dictating the size and shape of your bookshelf, and enforcing a rule that it hold only books you bought from them, and see if that thought offends you even a little bit.

3) Notes in the margins help you think.

It’s not enough to be able to highlight something. A careful reader wants to argue with the author, or amplify a point, or jot down an insight inspired by something freshly read. And it has to be proximate to the original — a separate notebook is ridiculous, even with a clever indexing system that seems inventable but is yet to be invented.

Books don’t offer much white space for readers to riff in, but e-books offer none. And what about the serendipity of sharing your thoughts, and being informed by the thoughts of others, from the messages in shared books?

Replicating this experience will take a new standard, adopted universally, among competitors whose book tech, unlike paper, is proprietary. For a notion of what this might look like, check out OpenMargin.

4) E-books are positioned as disposable, but aren’t priced that way.

This one is simple, and also easy to oversimplify, since people still have to get paid. But until e-books truly add new value, the way Hollywood did with DVD extras, it’s just annoying to plunk down $13 for what amounts to a rental. E-books cost virtually nothing to produce, and yet the baseline cover price, set by publishers, is only fractionally below the discount price for the print version of new releases.

E-books can’t be shared, donated to your local library shelter, or re-sold. They don’t take up space, and thus coax conflicted feelings when it is time to weed some of them out. But because they aren’t social, even in the limited way that requires some degree of human contact in the physical world, they will also never be an extension of your personality. Which brings me to …

5) E-books can’t be used for interior design.

Before you roll your eyes at the shallowness of this gripe, consider this: when in your literate life did you not garnish your environment with books as a means of wordlessly introducing yourself to people in your circle? It probably began that time you toted The Cat in the Hat, trying not to be dispatched to bed during a grown-up dinner party.

It may be all about vanity, but books — how we arrange them, the ones we display in our public rooms, the ones we don’t keep — say a lot about what we want the world to think about us. Probably more than any other object in our homes, books are our coats of arms, our ice breakers, our calling cards. Locked in the dungeon of your digital reader, nobody can hear them speak on your behalf.

It’s a truism that no new medium kills the one that it eclipses — we still have radio, which pre-dates the internet, television and movies. So it would be foolish to predict the death of books anytime soon. And we haven’t seen the end of creative business models — there is no “all access pass” in book publishing, as is the trend now for magazines and the newspapers which have put up paywalls. Getting an e-book along with your print edition (or, the other way around) could be the best of both worlds, or the worst.

It would certainly solve my unexpected home decor problem.

Photo: Anthropologie store window, New York City. (John C Abell/Wired.com)

After having read today about the many advantages of using an Android phone rather than the iPhone (and yes, there are many arguments for the opposite case 😉 ), the Lifehacker post below could make a switch decision easier.

It may sound too good to be true, but it’s actually easy to sync an Android phone with your Mac just as seamlessly as you would an iPhone. Even better, the majority of syncing is done for you in the background, constantly, through Google.

The key to headache-free syncing between Mac and Android is to use Google’s cloud services as middle-men between the two. Your phone’s already designed around using Google for its mail, contact and calendar data, so we’re just going to show you how to sync those down to your Mac, which takes little to no effort. Once you’ve got it all set up, the only reason you’ll ever have for plugging your Android phone into your Mac will be to sync music (and if you really want to, you can sync that over Wi-Fi too).

Setting up Gmail in Mail

If you don’t use an email app on your Mac, but instead just use a browser to access a Gmail account, then you don’t need to worry about Mail at all. If you do use the app, you’ll be happy to know that syncing it with a Gmail account is so simple that it’s almost automated.

Simply open Mail’s preferences, click on Accounts in the top bar, then click the “+” symbol at the bottom-left of the window. After that, it’s as simple as entering your name, email address and login information. Mail automatically knows what settings to choose for Gmail.

Syncing Your Contacts with Mac’s Address Book

Open the Address Book, then open its preferences and click on Accounts in the top bar of the window. You should see an account called “On My Mac Local”, which should already be selected. To the right of that is a check-box labelled “Synchronize with Google”. Just check the box!

Afterwards, you should see a small sync icon in your Mac’s menu bar. Normally, that sync indicator would be used by MobileMe, but it’ll also work for other accounts. If you click on it, a menu will appear and you’ll have the option to “sync now”. Click that, and Address Book will begin to fill up with entries from your Google Contacts within a few seconds. Simple as that, your contacts are synced.

Syncing Your Calendars with iCal

In iCal, open preferences, then click on Accounts in the top bar. Click the “+” symbol at the bottom-left of the window. Leave Account Type set to “Automatic”, then enter your Gmail login information. After you click Create, iCal will sync your Google Calendar automatically. If you use reminders or alerts of any sort in Google Calendar, you’ll also want to click the Advanced tab in iCal’s preferences, and check the box marked “Turn off all alarms”. Otherwise, you’ll be receiving double reminders.

Syncing Your Music with DoubleTwist

If you’ve got a Google Music or Amazon Cloud Drive account, you can already upload your iTunes library and stream the music to your phone. Since not too many people use those services yet, and since they’re a different type of syncing than what we’re after today, we’ll use previously mentioned DoubleTwist.

DoubleTwist is called “iTunes for Android” for a reason. Not only does it look like a miniature iTunes, but it performs most of the same functions, only for your Android phone instead of an iPhone. With DoubleTwist, you can sync music to and from your phone, whether it’s song by song or a whole playlist. It’s fast, simple and free (which we like).

There’s also a DoubleTwist music player app available for Android, but it’s not required to play anything that you sync using the desktop app.

The DoubleTwist desktop app for Mac doesn’t only handle music: you can sync photos and videos just as easily with the built-in media browser.

If you really want to one-up your friends with iPhones, you can buy the DoubleTwist AirSync app to perform all your syncing over Wi-Fi. It costs $4.62, but it’s pretty fancy-pants to be able to sync without a cable. Note that it’ll definitely be slower, though, and is probably a bigger drain on your battery.

Sync Your Photos with Picasa Web Albums

Last but definitely not least are your photos. We’re going to use Picasa Web Albums, since it syncs beautifully with Android. Many Android phones should be able to sync with Picasa out of the box. Just double check that all your Google services are set to sync by checking in your phone’s settings under Accounts & Sync, and tapping on your Google account.

You don’t need every Google service on the list to sync if you’re not planning on using them. Google Books, for example, may be listed. If you don’t plan on using it, just leave it unchecked. It won’t affect anything when syncing with your Mac.

Unfortunately, this is only really available to users with phones running stock Android. Users on HTC Sense or other manufacturer UIs might notice that there is no Picasa entry in Accounts & Sync, but there will likely be a Flickr entry. In this case, you have two options: you can download the stock version of the Gallery app and install it on your phone, which will let you sync Picasa, or you can sync with Flickr, which is also iPhoto-friendly. Note that if you want to install the original Gallery app, you’ll need an account at XDA and you’ll need to check the “Unknown Sources” box in Settings > Applications on your phone.

On the Mac side, you can either use Google’s Picasa desktop software to sync Picasa albums, or you can use the iPhoto to Picasa uploader. This iPhoto plug-in will let you upload and manage Picasa albums by choosing File, then Export. If you’ve chosen to sync with Flickr, you can just hit the Flickr button in the bottom-right corner of iPhoto’s window to send any album to your Flickr account — it’s already built in.

Apple has actually made it very easy to sync your data with Google, so the setup should take you very little time, and by the end, you’ll have all your favourite Mac apps syncing data right to your phone. From here on out, your mail, address book, calendars and music libraries should look the same on both devices, without you having to lift a finger. Got your own tricks for keeping your Mac and Android phone in sync? Let us know in the comments.

I found it hard to believe, but it works. If you have problems with mobile phone reception, put your phone in a drinking glass and after a few seconds you get up to two bars more, which in my case makes the difference between no reception and being able to use my phone.

So a few years ago one of the waitresses there discovered (how?) that if you put a phone in an empty glass it dramatically improves the reception. The Pasta e Basta restaurant is basically stuck in a concrete basement so reception has always been awful. But since they found out about this trick they at least have had enough reception to make and receive calls.

The waiter gave me a glass, I put my iPhone in, reluctantly, and lo and behold: I got 3 bars, with no 3G but some GPRS. Not perfect, but a huge improvement on the ‘No signal’ message I got earlier.

Via Lifehacker and The Next Web

Wow – with this technology in place I’d come to the funerals of people I don’t know too! 😉 – wearing, of course, some reality-distorting glasses so as not to start crying when I see other people sobbing ….


You know what sucks about funerals? Everything. There’s not a single thing I like about them. And that’s not even considering how bad they are for the environment. Whatever happened to dumping bodies in a volcano or leaving them out for animals to gnaw on? You know, like the good ol’ days. FEED MY ASS TO LIONS I DON’T GIVE A FUUUUUUUUUU.

A Swedish company called Promessa has come up with a crazy new way of handling the remains of the deceased, and it’s straight out of science fiction. First, a body is chilled down to minus 18 degrees Celsius. Then it’s entirely submerged in liquid nitrogen, which freezes it solid, and makes it brittle enough that it can be shattered and pulverized into dust using high-power sound waves. Next, the dust (which is still about the same mass as the body was) is exposed to a vacuum which boils off all the moisture contained in the dust, reducing its mass by 70% or so. Lastly, all of the inorganic stuff that may be left over is removed with an electromagnet, and the dust is placed in a coffin made of corn starch, all ready for a shallow burial that’ll turn everything into compost within a year.

I’m not gonna lie, that would increase the entertainment value of funerals by at least a thousand-fold. Shit, add some dance music and a laser-light show and I’d pay to go to the funerals of people I don’t even know! Hey bro, got any E? I’m coming down already. It’s cool if you don’t but you could at least answer me. Come on dude, stop bein’ such a stiff. *CRASH!!* Oh shi-shi. *runs out rubbing nipples*

The latest in eco-funerals: Terminator-style nitrogen shattering [dvice]

Thanks to Martin, who doesn’t care how he’s buried just so long as it’s not alive. AMEN TO THAT, BROTHA!

[Pinched from Geekologie]

An interesting post by Gizmodo, warning of making specs the basis of purchasing decisions.

Bullshit button, photo by Tristan Nitot/Flickr.com

by Bryan Gardiner, Gizmodo.com

To measure is to know, said Lord Kelvin. But as marketing departments get more and more creative with their published specifications, what we’re left measuring — and by extension, knowing — about our gear is increasingly worthless.

With gadget-buying squarely in season, most of us will soon be turning to those ubiquitous columns of numbers, ratios, and percentages before making our final selections. Frequency responses will be consulted, dynamic contrast ratios compared, and color gamuts critiqued — all in an effort to gauge performance, determine value, and quickly pit one product against another. The only problem? In many cases, you’d be better off consulting chicken bones and fingernail clippings. Not only are a growing number of published specs misleading and/or overinflated, some have become downright meaningless. And it’s getting worse.

Remember how impressive something like Blast Processing sounded when you were 15? Made the Super Nintendo look downright wimpy, right? Well, spec cooking operates on more or less the same principle. Only instead of inventing empty marketing words, manufacturers plop a bunch of faux math in our laps.

These lies and fabrications happen for a few reasons. First, numbers have tremendous sway over the decisions we make. A recent study in the Journal of Consumer Research suggests that quantitative specifications are so powerful that, even when given the ability to directly test the attributes of a given product ourselves, we still tend to choose the thing with the longer list and bigger numbers (ahem, megapixels).

Another reason for the proliferation of BS specs? Rivalry.

“The gadget world is loaded with gimmicks and lies because it’s extremely competitive,” says Raymond Soneira, president of DisplayMate Technologies. Soneira, who penned what many consider the debunking Bible for display specifications over at MaximumPC, says that as technological complexity increases in the gadget world, it gives manufacturers and marketers even more leeway to futz with the numbers. And futz they do.

“Most consumers don’t understand the technologies anyway so they are easily misled, fooled and even swindled,” he says.

More than anything though, there’s one simple reason behind the rise of dubious specs: It’s become an industry necessity. The temptation to exaggerate is now so overwhelming that attempting to stay out of the gimmick game is now seen as akin to product suicide. Try to anchor your specifications in the real world (with meaningful numbers) and your product will look inferior. Don’t publish them at all, and you’ll look like you’re trying to hide something. It’s an insidious Catch-22 for anyone with an ounce of integrity, so manufacturers and marketers simply make the easy choice.

David Moulton, a veteran audio engineer, musician and producer, characterizes the gadget spec situation like this: “When engineers make a product they use specific tests to measure the performance. But when sales departments get a hold of those test measurements, they start using those numbers as describers of value. They become, in essence, sales arguments.”

So which “sales arguments” should you avoid, dismiss, or at the very least raise a skeptical eyebrow at? We’ve compiled a quick list of some of the more brazen spec gimmicks to be wary of this holiday season.

DISPLAYS

Color Gamut

What it is: This spec represents the range of colors a given display can produce, and is usually expressed as a percentage of a particular color standard, like Rec.709 (HDTVs) or sRGB (computers and digital cameras).
Why it’s bullshit: Manufacturers don’t tell you this, but the color gamut you actually want on all of your displays is the same one that was used when the content you’re viewing was created. If it’s different, you’ll see different colors than you’re supposed to see. Nevertheless, most companies are happy to exploit the common misconception that a wider color gamut is somehow indicative of a better display. So what’s up with those 145 percent color gamuts? Nothing special, really. Here’s what a larger gamut will do: make everything look oversaturated. Indeed, displays claiming to have more than 100 percent of any standard color gamut aren’t able to show colors that aren’t in the original source image, says Soneira.

Contrast Ratio

What it is: Divide the brightness of peak white by the brightness of black on a display (after it’s been properly calibrated) and, voilà, you’ll get what’s known as the contrast ratio.
Why it’s bullshit: In the real world, this measurement typically falls between 1,500:1 and 2,000:1. And that’s for the best LCDs, says Soneira. But those numbers are a thing of the past. The allure of bigger ratios has prompted manufacturers to bake this specification into a full-fledged nonsense soufflé. Today, we get what’s known as the “dynamic contrast ratio”. That’s reached by measuring blacks when a display’s video signal is entirely, well, black (when it’s in a standby mode). As you can imagine, that significantly reduces the light output of the unit and is obviously much darker than what’s actually used to determine the traditional contrast ratio with an actual picture present. Using this trick you’ll get, in some cases, astronomical contrast ratios like 5,000,000:1 or, in Sony’s case, “infinite”. While still technically true, this spec is utter nonsense and completely unhelpful in gauging real-world performance. The only information a dynamic contrast ratio can relay is how much brighter the whites can be than the blacks.
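The arithmetic behind both numbers is simple enough to sketch. The luminance values below are illustrative figures I have picked for the sake of the example (in cd/m²), not measurements of any real display:

```python
def contrast_ratio(white_luminance, black_luminance):
    """Contrast ratio: peak-white luminance divided by black luminance."""
    return white_luminance / black_luminance

# A calibrated LCD showing an actual picture (illustrative values):
print(contrast_ratio(500, 0.3))     # ~1667:1, inside the realistic 1,500:1-2,000:1 band

# The "dynamic" trick: measure black with the signal off, so the
# denominator collapses and the ratio balloons.
print(contrast_ratio(500, 0.0001))  # ~5,000,000:1
```

Same formula both times; only the measurement conditions for the denominator change, which is the whole trick.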

Response Time

What it is: Also referred to as latency or response rate, response time is a standard industry test that tries to quantify how much LCD motion blur you’ll see in fast-moving scenes. (It doesn’t apply much to plasma displays.) It’s determined by measuring the time it takes for one pixel to go from black to peak white and then back to black (rise-and-fall). And it’s not a particularly good indicator of real picture blur.
Why it’s bullshit: Consider this. In the span of five short years, display response times have gone from 25ms (milliseconds) to, in some cases, 1ms. How did this magic happen? Well, it kinda didn’t. The problem here, according to Soneira, is that most picture transitions involve much smaller, more subtle shades of gray-to-gray transitions, which usually take much longer (3-4 x) to complete. Those response times are far more important to a display’s ability to handle motion blur. But consumers often have no way of knowing which response time is being measured (gray-to-gray or rise-and-fall). Because the published specifications can have a considerable impact on sales, it is often more important for a manufacturer to reduce the black–to–peak-white–to–black response time value rather than improving the visually more important gray-to-gray transitions. The result? The LCD display with the fastest response time specifications may not have the least visual blur.

Viewing Angle

What it is: Pretty simple stuff: the maximum angle at which a display can be viewed with acceptable visual performance. Yes, there are generalities about viewing angle that everyone should know: a plasma display, for instance, will yield a wider viewing angle. But when it comes to the listed angles that manufacturers include in spec sheets, you can pretty much ignore them.
Why it’s bullshit: Today, it’s not uncommon to see 180-degree-plus (total) viewing-angle specifications for many displays. This has absolutely no bearing on the actual acceptable viewing angles, according to Soneira. What most consumers don’t realize is that the angular spec is based on where the contrast ratio falls to a level of 10:1, hardly an acceptable (or visually pleasing) figure. More realistically, an angle of ±45 degrees may reproduce an acceptable contrast ratio, but only with very bright and saturated colors. Pictures that include a wide range of intensities, hues and saturations will appear “significantly degraded” at much smaller viewing angles. Of course, no one tells you this.

AUDIO

Dynamic Range

What it is: In the audio realm, this spec is measured in decibels and describes the ratio of the softest sound to the loudest sound a musical instrument or piece of audio equipment can produce. Audio engineers started worrying about this back in the days of analog recording when tape noise — the inherent noise embedded in magnetic recording — was a big problem. Today, with digital recording, it’s pretty much irrelevant.
Why it’s bullshit: Dynamic ranges are almost always over-represented, says Moulton. The main thing consumers should know about dynamic range is that you’ll want it large enough that there are no annoying noise artifacts. And, mostly, in the realm of music and film, we’re just fine. Moulton explains: “Electronically, we can manufacture much greater dynamic range than is available in the real world. When somebody claims 120 dB dynamic range, that’s just silly. We don’t get there. In the real acoustic world in which we live, our usable range is about half that, or 60 dB. What that means is that the really soft stuff can’t be heard because of the sounds in the spaces that we’re in. And the really loud stuff is so loud that if we played it back at that level we’d probably generate complaints and legal action.”
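The decibel figures Moulton quotes come from the standard logarithmic formula for amplitude ratios; a quick sketch, with ratios chosen to match his 120 dB and 60 dB examples:

```python
import math

def dynamic_range_db(loudest, softest):
    """Dynamic range in decibels from a ratio of signal amplitudes."""
    return 20 * math.log10(loudest / softest)

# A claimed 120 dB range implies a million-to-one amplitude ratio...
print(dynamic_range_db(1_000_000, 1))  # 120.0
# ...while the ~60 dB usable in a real room is "only" a thousand-to-one.
print(dynamic_range_db(1_000, 1))      # 60.0
```

The logarithm is why a spec-sheet number that looks only twice as big describes a ratio a thousand times larger.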

Frequency Response/Bandwidth

What it is: There are two parts to this spec, really. First, there’s another word for it, which is bandwidth, or the width of the spectrum we are hearing. Our ears happen to have a very broad bandwidth—ten octaves to be precise (or ten doublings of frequency…or a ratio of 1000/1). The lowest frequency humans hear is about 20 Hz. The highest frequency is about 20 kHz. And for educational and musical purposes we divide that into 10 octaves. Each octave is a doubling of frequency.
Why it’s bullshit: When manufacturers make and sell audio gear, they cheat. Period. Today, it’s very common to specify 20 Hz – 20 kHz bandwidth, which is ridiculous. First, very little audio gear will do that in a really rigorous way. Second, your speakers definitely won’t — unless they cost you about as much as the house in which they’re installed. It’s just beyond the capabilities of all but the most expensive equipment. “Frequency response is something that’s kind of claimed and you have to take it with a grain of salt,” says Moulton. “Everybody is going to claim good frequency response and everybody has, more or less, poor frequency response.”
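The ten-octaves figure above is easy to verify from the doubling definition; a quick sketch:

```python
import math

def octaves(f_low_hz, f_high_hz):
    """Number of octaves between two frequencies (each octave doubles f)."""
    return math.log2(f_high_hz / f_low_hz)

print(octaves(20, 20_000))  # ~9.97, the "ten octaves" of human hearing
print(20_000 / 20)          # 1000.0, the 1000/1 ratio mentioned above
```

Ten doublings would be a ratio of 1024/1, so "ten octaves" and "1000/1" are both the same round-number approximation.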

Power Handling/Wattage

What it is: Crank it up! For many of us, beefy power handling equates to house-shaking sound. Yet when most of us listen to music we are actually using very little power — typically about 1 or 2 watts. Still, it’s hard to discount that gorgeous pair of 1,200-watt speakers, right?
Why it’s bullshit: Power is, more often than not, irrelevant to most people’s music-listening experience. Here’s a nice rule of thumb to think about power when you’re out shopping for a new sound system or speakers: each doubling of power is barely audible (~3 dB). Put another way, ten times the power will make a woofer or loudspeaker sound almost twice as loud. So the difference between a 300-watt and a 1,200-watt system…actually not so big.
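That rule of thumb follows from the standard decibel formula for power ratios; a quick sketch using the article’s own wattages:

```python
import math

def power_gain_db(p_new_watts, p_old_watts):
    """Level difference in decibels between two amplifier power levels."""
    return 10 * math.log10(p_new_watts / p_old_watts)

print(power_gain_db(600, 300))    # ~3 dB: doubling the power is barely audible
print(power_gain_db(1200, 300))   # ~6 dB: the whole 300 W vs. 1,200 W difference
print(power_gain_db(3000, 300))   # 10 dB: ten times the power, perceived as
                                  #        roughly "twice as loud"
```

Quadrupling the power buys you only about 6 dB, which is why the 1,200-watt sticker is mostly marketing.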

So if more and more specs are offering less and less useful information, what’s a gadget geek to do? When possible, it’s always a good idea to try out gear yourself. The other option? Find a site you trust that reviews and plays with gadgets daily. You happen to be looking at one now.

Send an email to Bryan Gardiner, the author of this post, at bgardiner@gizmodo.com.

Photo credit: Tristan Nitot/Flickr

This story originally appeared on Gizmodo.

 


The following Guardian article talks about another one of those half-cocked geo-engineering fixes: liming the world’s oceans. The guy who promotes the idea is a former management consultant, which immediately raises the question: what makes him qualified to design climate change solutions?

Apart from that minor detail, this band-aid, like all the others, makes no attempt to assess the possible consequences of lime dumping for the complex web of interactions within the oceanic environment, or between it and weather patterns.

Sometimes I feel these self-proclaimed geo-engineers are the modern equivalent of snake oil peddlers …


Just add lime (to the sea) – the latest plan to cut CO2 emissions

• Project ‘could turn back clock’ on carbon dioxide
• Guardian conference will select top 10 climate ideas

Duncan Clark
The Guardian

Putting lime into the oceans could stop or even reverse the accumulation of CO2 in the atmosphere, according to proposals unveiled at a conference on climate change solutions in Manchester today.

According to its advocates, the same technique could help fix one of the most dangerous side effects of man-made CO2 emissions: rising ocean acidity.

The project, known as Cquestrate, is the brainchild of Tim Kruger, a former management consultant. “This is an idea that can not only stop the clock on carbon dioxide, it can turn it back,” he said, although he conceded that tipping large quantities of lime into the sea would currently be illegal.

The oceans are a key part of the natural carbon cycle, in which carbon dioxide is circulated between the land, seas and atmosphere. About one-third of the CO2 released into the air by humans each year is soaked up by the oceans. This helps slow the rate of global warming but increases ocean acidity, posing a potentially disastrous threat to marine ecosystems.

Kruger’s scheme aims to boost the ability of the oceans to absorb CO2 but to do so in a way that helps reduce rather than increase ocean acidity. This is achieved by converting limestone into lime, in a process similar to those used in the cement industry, and adding the lime to seawater.

The lime reacts with CO2 dissolved in the water, converting it into bicarbonate ions, thereby decreasing the acidity of the water and enabling the oceans to absorb more CO2 from the air, so reducing global warming.
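In outline, the carbonate chemistry behind this is standard textbook material (the article doesn’t spell it out, so the equations below are my own summary):

```latex
\mathrm{CaCO_3} \xrightarrow{\text{heat}} \mathrm{CaO} + \mathrm{CO_2}
  \quad \text{(calcination: making lime releases one CO}_2\text{)} \\
\mathrm{CaO} + \mathrm{H_2O} \rightarrow \mathrm{Ca(OH)_2}
  \quad \text{(slaking in seawater)} \\
\mathrm{Ca(OH)_2} + 2\,\mathrm{CO_2} \rightarrow \mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^{-}}
  \quad \text{(absorbs up to two CO}_2\text{ as bicarbonate)}
```

Each unit of lime can thus take up roughly twice the CO2 released in producing it — which is why capturing the calcination CO2 at source is essential to the scheme’s carbon balance, as Kruger acknowledges later in the article.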

Kruger said: “It’s essential that we reduce our emissions, but that may not be enough. We need a plan B to actually reduce the amount of CO2 in the atmosphere. We need to research such concepts now – not just the science but also the legal, ethical and governance considerations.”

Kruger’s plan was one of 20 innovative schemes proposed at the Manchester Report, a two-day search for the best ideas to tackle climate change staged by the Guardian as part of the Manchester International Festival.

A panel of experts chaired by Lord Bingham, formerly Britain’s most senior judge, will select the 10 most promising ideas. These will be featured in a report that will be published in the Guardian next week and circulated to policymakers around the world.

Climate change secretary Ed Miliband told the conference the biggest danger faced by campaigners was creating a sense of defeatism. “We need to show people how they can aggregate their individual actions and be part of a bigger whole,” he said.

Cquestrate is one of a number of so-called “geo-engineering schemes” that have been proposed to intervene in the Earth’s systems in order to tackle climate change.

Kruger admits there are challenges to overcome: the world would need to mine and process about 10 cubic kilometres of limestone each year to soak up all the emissions the world produces, and the plan would only make sense if the CO2 resulting from lime production could be captured and buried at source.

Chris Goodall, one of the experts assessing the schemes, said of Cquestrate: “The basic concept looks good, though further research is needed into the feasibility.”

Another marine geo-engineering scheme was presented by Professor Stephen Salter, of Edinburgh University.

His proposal is to build a fleet of remote-controlled, energy-self-sufficient ships that would spray minuscule droplets of seawater into the air. The droplets would whiten and expand clouds, reflecting sunlight away from the Earth and into space.

Salter said 300 ships would increase cloud reflectivity enough to cancel out the temperature rise caused by man-made climate change so far, but 1,800 would be needed to offset a doubling of CO2, something expected within a few decades.

Further Reading

New Geoengineering Scheme Tackles Ocean Acidification, Too (Wired Science)