
Technology Futures

Commentaries on Directions That Will Impact the Future of Technology

Archive for the month “April, 2012”

Ultra HD: Worth the Wait?


Sharp 85" UltraHD Prototype TV

So you just bought the latest and greatest super-thin 3D HDTV, thinking you are now state-of-the-art. Think again. Here comes (maybe) Ultra HD, also known as Super Hi-Vision. This format, proposed by NHK (the Japanese equivalent of PBS), offers 16 times the resolution of today’s high-definition TV. That is 7680 x 4320 pixels, about the same as IMAX. A two-hour movie in this format will require about 24 terabytes of data without compression!
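
That 24-terabyte figure is easy to sanity-check. Here is a back-of-the-envelope calculation in Python; the 30 frames-per-second rate and 8-bit RGB color are my assumptions, not NHK’s published spec, but they land in the same ballpark:

```python
# Back-of-the-envelope check of the "~24 TB for a 2-hour movie" claim.
# Assumptions (mine, not NHK's): 30 frames/s, 8-bit RGB (24 bits per pixel),
# no chroma subsampling, no compression.

WIDTH, HEIGHT = 7680, 4320            # Ultra HD / Super Hi-Vision
FPS = 30                              # assumed frame rate
BYTES_PER_PIXEL = 3                   # assumed 8-bit RGB
RUNTIME_S = 2 * 60 * 60               # 2-hour movie

pixels_per_frame = WIDTH * HEIGHT                      # 33,177,600 pixels
bytes_per_frame = pixels_per_frame * BYTES_PER_PIXEL   # ~99.5 MB per frame
total_bytes = bytes_per_frame * FPS * RUNTIME_S

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{total_bytes / 1e12:.1f} TB uncompressed")     # ~21.5 TB, roughly the ~24 TB quoted
```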

In addition to the video, up to 24 audio channels can be used.  You might start thinking about where you are going to place those 24 speakers.

NHK expects to broadcast in UltraHD by 2020.  In the meantime, a few companies have prototyped systems that accommodate the format.  For example, Sharp has built an 85″ Ultra HD set.

To confuse matters further, some companies are looking into an in-between technology called QFHD (Quad Full High Definition). This technology increases the pixel count by a factor of four rather than the factor of 16 of Ultra HD; the resolution is 3840 x 2160 pixels. Samsung showed off a prototype QFHD display at CES, and Toshiba will be selling a 55″ QFHD set, the Regza 55X3, this year, priced at about $12,000. Although it is being sold in Japan and Europe on a limited basis, it is not clear that an American audience will pay that price.
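
The factor-of-four and factor-of-16 claims fall straight out of the pixel counts. A quick check (the format labels are mine):

```python
# Pixel-count ratios behind the "factor of four" and "factor of 16" claims.
formats = {
    "Full HD (1080p)": (1920, 1080),
    "QFHD / 4K":       (3840, 2160),
    "Ultra HD / 8K":   (7680, 4320),
}
base = 1920 * 1080
for name, (w, h) in formats.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.0f}x Full HD)")
# QFHD comes out at 4x Full HD, Ultra HD at 16x.
```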

That set, by the way, is autostereoscopic, meaning that you can see 3D without glasses. Although there is no native QFHD content yet, Toshiba puts the extra pixels to use by showing the two stereo signals needed for 3D, each in full 1080p resolution. People who have seen the display report being very impressed. In addition, the set is designed to accommodate multiple viewers across nine different viewing zones. The television uses extremely small lenses to split the video feed into two views at different angles, and the viewer can calibrate the views using face-tracking software built into the set. Supposedly, one can sit at many viewing angles and still see a clear picture.

Satellite TV provider DirecTV has announced that it will soon be able to broadcast QFHD signals and maybe Ultra HDTV signals by switching to a new generation of Ka-band satellites that offer significantly more bandwidth than the current Ku-band satellites.

Not to be outdone by the Japanese and Koreans, the Chinese television manufacturer TCL debuted the world’s largest 4K 3D LCD television this week at 110 inches. Offering a resolution of 4,096 x 2,160 pixels, the television requires active-shutter glasses to view 3D. In addition, it uses multi-touch technology to turn the front of the display into a touch screen, and it offers dynamic backlight technology as well. TCL is labeling the technology “China Star” and eventually plans to roll it out in smaller sets. There have been no price announcements, nor is it known whether the sets will be sold outside of China.

So get ready to toss that state-of-the-art TV you just bought.  The nature of the business is that state-of-the-art is a moving target.  Naturally, you will want to move with it.


IBM’s Uber Battery: Can It Be Real?


Battery 500 Project Demo System

A few weeks ago, I blogged about the Envia company and its claimed breakthrough in battery technology. As you would suspect, plenty of other people are working on battery technology with the aim of producing an all-electric car that can go 500 miles without needing to be recharged. One of the most promising efforts is IBM’s Battery 500 project.
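
To get a feel for what a 500-mile battery implies, here is a rough sizing sketch. No consumption figure appears above, so the watt-hours-per-mile values below are my own assumptions, typical of 2012-era electric cars:

```python
# Rough sizing of a battery pack for a 500-mile range.
# The per-mile consumption figures are my assumptions, not the project's.

RANGE_MILES = 500
for wh_per_mile in (250, 300, 350):
    pack_kwh = RANGE_MILES * wh_per_mile / 1000
    print(f"{wh_per_mile} Wh/mile -> {pack_kwh:.0f} kWh pack")
# 125-175 kWh: several times the capacity of contemporary EV packs,
# which is why a far more energy-dense chemistry is needed.
```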

The initial research began in 2009 at IBM’s Almaden research labs in California, and this past week IBM announced that it has built a prototype that demonstrates the efficacy of the technology. Wired Enterprise calls it the “Uber Battery,” a descriptor I stole for the title of this post. IBM is not doing this alone; it is collaborating with researchers in both Europe and Asia, along with universities and national laboratories in the US. Nevertheless, IBM is the driving force, and the project is an outgrowth of IBM’s well-publicized investment in nanotechnology.

It is difficult for a non-chemist to grasp the technology, but, briefly, the system works by using oxygen drawn from the air, much as a conventional combustion engine does. Inside the battery’s cells, the oxygen slips into tiny spaces that measure about an angstrom (0.0000000001 meters) across and reacts with lithium ions at the battery’s cathode. That reaction converts the lithium ions to lithium peroxide and drives electrons through the external circuit, generating electricity. For more information oriented to the layman, go to the following website and check out the videos.
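
For readers who want a little more of the arithmetic behind the excitement, the theoretical specific energy of the lithium-air reaction described above can be worked out from standard constants. The ~2.96 V cell potential and the convention of counting only the solid reactant mass are textbook assumptions of mine, not figures published by IBM:

```python
# Theoretical specific energy of the lithium-air discharge reaction
# 2 Li + O2 -> Li2O2, computed from standard constants.

F = 96485            # Faraday constant, C/mol
E_CELL = 2.96        # approximate standard cell potential, V (assumed)
ELECTRONS = 2        # electrons transferred per formula unit of Li2O2
M_LI = 6.94          # g/mol
M_LI2O2 = 45.88      # g/mol

energy_wh = ELECTRONS * F * E_CELL / 3600          # Wh per mole of Li2O2 formed
per_kg_li = energy_wh / (2 * M_LI / 1000)          # counting lithium mass only
per_kg_li2o2 = energy_wh / (M_LI2O2 / 1000)        # counting the discharge product mass

print(f"~{per_kg_li:,.0f} Wh/kg (lithium only)")    # ~11,400 Wh/kg
print(f"~{per_kg_li2o2:,.0f} Wh/kg (incl. Li2O2)")  # ~3,500 Wh/kg
# Either figure dwarfs the roughly 150-250 Wh/kg of conventional lithium-ion cells.
```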

IBM credits much of the research advancement to its Blue Gene supercomputers, which are being used to analyze electro-chemical reactions in search of alternative electrolyte materials that won’t degrade the battery while it recharges. These computers, located at Argonne National Laboratory and in Zurich, Switzerland, have rung up tens of millions of processor-hours on the project. The computer modeling is being used to determine how the ions and molecules of different materials will interact, in the hope of finding the optimum combination of materials that will permit commercialization of the technology.

The downside is that it is not expected to be commercialized until at least 2020.  In the meantime, auto manufacturers around the world are licking their collective chops.  If this technology is successful, it will signal the end of imported oil in the US.  The geopolitical implications are enormous.

Ultraviolet: Revolutionary or Yesterday’s News?


Ultraviolet Sticker

A couple of weeks ago, my stepson, the techie genius, asked me if I was tuned in to Ultraviolet. He was surprised to learn that I’d never heard of it. I am talking here about the service designed to peddle movies and such, not the part of the light spectrum beyond visible violet. For those of you who have never heard of it either, here is a quick synopsis:

In the cloud, there is a place to store movies and other forms of video entertainment. You buy the movies from any one of dozens of purveyors; instead of walking out of a store with a DVD in your hand, the movie gets uploaded to the cloud (or maybe it is already there). By entering a password, you can access your movie in several ways: you can stream it to any device equipped for streaming video, you can download it to a computer for later viewing, or you can make a DVD, although you are only allowed to make a single copy. Ultraviolet identifies itself as “a digital rights authentication and cloud-based licensing system.”
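
To make the rights-locker idea concrete, here is a toy sketch of how such a system might track a purchase, streaming access, and the single permitted DVD copy. It is purely illustrative; none of these class or method names come from Ultraviolet’s actual system:

```python
# Toy model of a cloud "rights locker"; purely illustrative, not Ultraviolet's API.
from dataclasses import dataclass, field

@dataclass
class Right:
    title: str
    dvd_burns_left: int = 1              # the one-copy DVD limit mentioned above

@dataclass
class Locker:
    rights: dict = field(default_factory=dict)

    def purchase(self, title: str):
        self.rights[title] = Right(title)  # retailer records the purchase in the locker

    def stream(self, title: str) -> bool:
        return title in self.rights        # streaming allowed for any owned title

    def burn_dvd(self, title: str) -> bool:
        r = self.rights.get(title)
        if r and r.dvd_burns_left > 0:
            r.dvd_burns_left -= 1
            return True
        return False                       # a second burn is refused

locker = Locker()
locker.purchase("Some Movie")
print(locker.stream("Some Movie"))    # True
print(locker.burn_dvd("Some Movie"))  # True
print(locker.burn_dvd("Some Movie"))  # False: only one copy allowed
```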

The entertainment industry loves the idea because it thinks it might limit piracy (talk about heads in the sand). Retailers, both brick-and-mortar and online, like the idea: they get the same price they would for a physical DVD, but they don’t have to carry any inventory. When you go to the Ultraviolet website, you will find a long list of some of the biggest names in video entertainment, and the places that sell it, identified as sponsors or supporters.

Despite the hype and the big-name support, neither the reviewers nor the public in general have had many nice things to say about the service. The Gigaom and Techdirt reviews are typical. The system is cumbersome for average folks to use, and it does not have the support of the 800-pound gorillas, Amazon and Apple.

Although Ultraviolet was announced in July 2010, it hasn’t made much progress.  Only Sony and Paramount have offered their films via Ultraviolet, and those offerings are limited to a small fraction of the movies in their portfolios.  So, despite the hype, the content providers are not exactly demonstrating a heavy commitment.  Without content, the system is doomed.

It is no secret that DVD and Blu-ray sales are decreasing. How many people need to buy movies when it is so much cheaper to rent them from Netflix or at kiosks? Another competitive threat is coming from the cable and satellite TV companies. Not only can you DVR movies for later viewing, but these companies are offering subscribers the ability to stream them at no extra cost. Unlike Netflix, which has to plead for the rights to stream movie content, the cable guys already have them. Then there are the illegal downloads that can be found easily with a simple Google search.

As I said above, I’d never even heard of Ultraviolet until now, and I am supposed to know about important technologies in the home entertainment business. I do not see a market-driving force suggesting that Ultraviolet, or any service requiring upfront money for movie content, has a chance of succeeding. Do you? Requiem for Blockbuster.


Exascale: The Faraway Frontier of Computing?


Those of you who follow this blog know that I write about technology trends that are, in general, not too far into the future. This post is a departure in that it takes a peek at a technology at least 10 years away, dubbed (for the moment) “Exascale Computing.” I’m writing about it now because the chances of my living long enough to see it come to fruition are somewhere between slim and none, and because I used to be very involved with the supercomputing industry, and absence makes the heart grow fonder.

Driving the development of this technology is a project known as SKA, the Square Kilometre Array, a multibillion-dollar radio telescope 100 times more sensitive than anything currently in existence. Construction will begin in five years and will not be completed until 2024. When it is operational, SKA will produce an exabyte of data every single day – 1 quintillion bytes of information, roughly twice the daily traffic of the entire Internet.
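
It helps to translate “an exabyte a day” into a sustained data rate:

```python
# What "an exabyte a day" means as a continuous data rate.
EXABYTE = 1e18                       # bytes (decimal definition)
SECONDS_PER_DAY = 24 * 60 * 60

rate_bytes_s = EXABYTE / SECONDS_PER_DAY
print(f"{rate_bytes_s / 1e12:.1f} TB/s sustained")   # ~11.6 TB/s, around the clock
print(f"{rate_bytes_s * 8 / 1e12:.0f} Tbit/s")       # ~93 Tbit/s of continuous ingest
```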

The SKA project is headed by ASTRON, the Netherlands Institute for Radio Astronomy. To meet the vast computing requirements of processing that much information, ASTRON awarded IBM a $40+ million contract to begin developing what will be the world’s most powerful computer, equivalent to the combined computing power of 100 million high-end PCs.
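
That “100 million high-end PCs” comparison is roughly where the “exascale” label comes from. Assuming, as I do here, that a high-end 2012 desktop sustains on the order of 10 gigaFLOPS:

```python
# Why "100 million high-end PCs" is shorthand for exascale.
# The per-PC figure is my assumption, not IBM's or ASTRON's.

PCS = 100_000_000
GFLOPS_PER_PC = 10                            # assumed sustained GFLOPS per PC

total_flops = PCS * GFLOPS_PER_PC * 1e9
print(f"{total_flops / 1e18:.0f} exaFLOPS")   # ~1 exaFLOPS, hence "exascale"
```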

IBM will be addressing three challenges: transport (moving data between computing elements), data storage, and analysis. Transport will be addressed using optical technology that is well understood today. Analysis will rely on massively parallel processing arrays with the power of a million-plus Watsons. Storage will rely on the development of new technology, most probably based on IBM’s research in nanotechnology applied to “phase-change memory.”

If you are interested in this project, I urge you to view a short video about the SKA and IBM’s involvement in the project at this website.  Trust me, you will be amazed at the scope of the project and the technology challenges that will have to be overcome.
