
Technology Futures

Commentaries on Directions That Will Impact the Future of Technology

WHDI vs WiHD? VHS vs Betamax again?


Wireless is one of today's hot buzzwords. My last post was about connecting the rear speakers in a surround system wirelessly (sort of). This post is about sending audio/video signals wirelessly, specifically the HD-quality signals normally associated with the HDMI (High Definition Multimedia Interface) wired connection prevalent in today's TVs, DVD players, and cable and satellite boxes.

The powers that be have been working on this issue since 2007 and have come up with approaches supported by some of the biggest names in the electronics business. Until recently, these solutions cost $1,000 or so. Now the price is down to around $200, and it promises to be much less in the not-too-distant future. In fact, solutions have been implemented at the chip level and will be embedded into equipment including TVs, receivers, computers, and tablets, where the incremental cost will hardly be noticed.

Unfortunately, there are two competing technologies, reminiscent of the VHS–Betamax confrontation of decades ago, with one big difference: the same companies are supporting both technologies, hedging their bets as it were. The two technologies are called WHDI (Wireless Home Digital Interface) and WiHD (Wireless High Definition). Neither has been adopted by any recognized international standards body (such as the IEEE); each is backed by a group going by the euphemistic label "Consortium." To further confuse the consumer, there is also something called Intel Wireless Display, or WiDi!

In any event, the objective is to eliminate the cable between an HDMI source device (cable receiver, DVD player, computer, etc.) and an HDMI display device (TV, tablet, etc.). This capability is especially useful when the TV is mounted in a location remote from the source equipment.

If you buy a kit to add to an existing equipment setup, you will get a transmitter that plugs into the source equipment's HDMI-out jack and a receiver that plugs into the TV's HDMI-in jack. One or both devices aren't needed if the technology is built into the equipment, assuming the sending and receiving devices use the same technology. Your next question (obviously) is: which technology should I choose?

That is a tough one to answer, just like the VHS–Betamax query of yore. WHDI was created by an Israeli semiconductor company called Amimon. The technology uses the 5–6 GHz carrier band, the same band used by some Wi-Fi networks, cordless telephones and other consumer devices. WiHD uses a 60 GHz carrier frequency, which is currently unoccupied by almost anything else one is likely to encounter in a home. Its adherents claim that nothing will interfere with its signals.

However, the laws of physics dictate that the higher the frequency, the shorter the range.  Early adopters report that WiHD works well for line-of-sight up to 30-odd feet, while WHDI devotees claim that the signal will go through walls and accommodate distances up to 100 feet.  Thus it would seem that the WHDI solution is more flexible.  Indeed, it is the technology used in the heavily advertised AT&T U-Verse receivers touted to be able to wirelessly broadcast a signal to up to four TVs in a single home.
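To put rough numbers on that physics, here is a back-of-envelope sketch (my own illustration, not anything published by either consortium) comparing free-space path loss at the two carrier frequencies; the 10-meter distance is an arbitrary assumption that roughly matches a living-room installation.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

for freq_ghz in (5, 60):
    loss = free_space_path_loss_db(10.0, freq_ghz * 1e9)  # 10 m is roughly 33 feet
    print(f"{freq_ghz:>2} GHz over 10 m: {loss:.0f} dB path loss")

# Roughly 66 dB at 5 GHz versus 88 dB at 60 GHz -- about 22 dB more loss at the
# higher frequency, and that is before any walls or furniture are considered.
```

That extra 22 dB or so is one reason the 60 GHz camp emphasizes line-of-sight installations while the 5 GHz camp talks about whole-home coverage.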

However, at this stage of the game, it may be dangerous to jump to conclusions. If you peruse the many online forums dealing with the subject, you will find that user experiences with either technology are all over the map, ranging from excellent to awful. Besides range and line-of-sight, other issues are latency and whether or not video compression is used. The watchword is "Don't buy a wireless HDMI system unless you can return it."

Back to WiDi.  This is nothing more than WHDI circuitry embedded in some Intel laptop chips.  The idea is that anything you can see on your laptop can be sent wirelessly to any device supporting the WHDI standard.  It is currently being promoted by Dell (as one example) under the banner “Connected Home.”

A note of caution: many of the wireless HDMI kits on the market come in packaging that does not state which technology is being used, nor does the advertising. Therefore, if you want one that supports WiHD, you might well end up with one that supports WHDI. As usual, do your homework before laying out your money. I also recommend perusing the forums (e.g., http://www.avsforum.com) on this subject and reading the product reviews by users and independent outlets like ZDNet and CNET.


Wireless Surround Sound: Kudos and Caveats


Speaker wire costs about $0.15 per foot for 16 gauge, $0.20 for 14 gauge and $0.25 for 12 gauge. These gauges will handle runs of roughly 20, 35 and 100 feet, respectively, into an 8-ohm load, and half those distances into a 4-ohm load. So, if you have two 8-ohm surround speakers located 30 feet from your receiver, the wire will cost about $12.00. Further, with this conventional wired setup, there will not be any discernible delay between the sound coming out of the front and rear speakers; they will be in sync. By the way, speaker wire carries very low frequencies – less than 20 kHz – and spending a lot for brand-name cable is a waste of money.
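For anyone who wants to check the arithmetic, here is a trivial sketch of the wire-cost calculation above; the prices are the rough per-foot figures just quoted, and the choice of 14 gauge for a 30-foot run follows the distance guidance above.

```python
# Speaker-wire cost, using the rough per-foot prices quoted in the text.
price_per_foot = {16: 0.15, 14: 0.20, 12: 0.25}  # dollars per foot, by gauge

run_length_ft = 30   # distance from the receiver to each surround speaker
speakers = 2
gauge = 14           # 16 gauge only reaches ~20 ft into 8 ohms, so step up to 14

cost = price_per_foot[gauge] * run_length_ft * speakers
print(f"{speakers} runs x {run_length_ft} ft of {gauge}-gauge wire: ${cost:.2f}")
# -> $12.00, versus $100-$500 for a wireless link
```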

Problem: You are not able to run wires through the ceilings or walls, and your significant other won't allow you to lay the wires on the floor where grandpa can trip on them. A solution: use a wireless link to connect the rear speakers to your surround-sound receiver. However, you need to consider the cost. A wireless connection can add $100–$500 to the cost of your 5.1 system, and twice that if you have a 7.1 system. If your speakers cost $1,000 apiece, the incremental cost of wireless may seem reasonable. If your speakers cost $100 each, the cost of adding wireless might seem exorbitant.

Be advised: I am not writing here about inexpensive Bluetooth speaker connections designed for iPods and the like. I'm talking about achieving CD-quality sound in a quality home entertainment system with true surround sound.

All of the commercially available wireless systems for home use rely on either the 2.4 or 5.8 GHz transmission band, the same bands used by other household devices including microwave ovens, cordless telephones and Wi-Fi routers. Thus, these systems may be subject to interference resulting in static and signal dropouts. The better systems claim to have resolved that problem by providing multiple transmission channels; the idea is that the system will find a channel that is not being used by any other appliance. In general, the success of this scheme is a function of price: the more expensive the system, the better job it does of avoiding collisions and the clearer the sound will be.
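Conceptually, the channel-avoidance idea works something like the toy sketch below. The channel list and the noise-measurement function are invented purely for illustration; real products do this in firmware, not in Python.

```python
import random

CHANNELS_GHZ = [5.725, 5.745, 5.765, 5.785, 5.805, 5.825]  # illustrative 5.8 GHz slots

def measure_noise_dbm(channel_ghz: float) -> float:
    """Stand-in for a real RF noise measurement on one channel."""
    return random.uniform(-95, -60)  # quieter channels report lower (more negative) dBm

def pick_quietest_channel(channels):
    """Pick the channel with the least interference from other appliances."""
    return min(channels, key=measure_noise_dbm)

print(f"Transmitting on the {pick_quietest_channel(CHANNELS_GHZ)} GHz channel")
```

A cheap system might scan like this once at power-up; a better one keeps measuring and hops away when a neighbor's Wi-Fi or a microwave oven fires up, which is roughly what "more expensive avoids collisions better" boils down to.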

Another problem is that the signal to the rear speakers will be delayed by some 15–20 milliseconds. Most people won't notice, but sensitive audiophiles might find that annoying. The best way to handle it is to use a receiver that allows the user to adjust the delay time for each speaker. Fortunately, many moderately priced receivers today have this capability.
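To see why a receiver's per-speaker delay (usually entered as a speaker distance) can absorb that latency, convert the milliseconds into feet of sound travel. The numbers below are my own arithmetic, not from any receiver manual.

```python
SPEED_OF_SOUND_FT_PER_MS = 1.13   # sound travels ~1130 ft/s at room temperature

def equivalent_extra_distance_ft(link_delay_ms: float) -> float:
    """Feet of apparent extra speaker distance created by a given link delay."""
    return link_delay_ms * SPEED_OF_SOUND_FT_PER_MS

for delay_ms in (15, 20):
    print(f"{delay_ms} ms of wireless latency looks like "
          f"~{equivalent_extra_distance_ft(delay_ms):.0f} extra feet of distance")

# 15-20 ms behaves like the rear speakers sitting roughly 17-23 feet farther back;
# adjusting the rear-speaker delay/distance setting by that amount re-aligns them.
```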

One company, Avnera (Portland, OR), has designed a chipset for the purpose it calls AudioMagic.  By reducing the component content to a couple of chips, the cost of wireless is drastically reduced, and we can expect to see the prices of these systems drop considerably.  The Avnera chips are used in the Rocketfish system sold by Best Buy, one of the cheapest on the market.

The following table summarizes most of the available wireless speaker links on the North American market.  Data comes from the manufacturers’ websites and may or may not be accurate.  Prices are average online prices as of May 2012.  Careful shopping will likely turn up lower prices than those shown.  All of the models listed can be found by website search.  I could not figure out how to include the links in the table – a WordPress issue I suspect.

Notice that Samsung and Sony offer systems wherein the transmitter is contained on a card that plugs into their receivers. The cards are much less expensive than separate transmitters. I believe that chipsets like Avnera's AudioMagic will soon be built into receivers, reducing the cost of the transmitter function to very little. The wireless receiving units, however, will still need to be relatively expensive because of the amplifiers they must contain. Some speakers can be purchased with built-in amplifiers, but they tend to be aimed at the low end of the market. If you want to choose your own rear speakers, you will need to buy separate receiving units.

Read Anywhere Low Power Displays: Not Far Off?


In the 1990s, I was introduced to a French scientist who, working alone, had developed a process for making displays using a reflective coating technology. He was related to the CEO of a client company that hired my firm to evaluate the money-making potential of the technology. My colleagues and I were blown away by the possibilities inherent in the technology. Not only did the displays look terrific, they cost almost nothing to produce and could be laid down on almost any substrate, even paper! To make a long story short, the technology never saw the light of day. The scientist, a Holocaust survivor, was terrified that someone was going to steal his ideas and was very difficult to deal with. Eventually, the company got tired of waiting for him to cooperate.

According to those who claim to know a lot about display technology, there are dozens, if not hundreds, of new display technologies put forward every year. Very few ever see the light of day. However, in the past few years, some new technologies with big money behind them have emerged. All of them purport to provide high-quality viewing with low power consumption.

You may recall the project called One Laptop Per Child. Its aim was to produce a laptop computer for $100, cheap enough that every kid in every third-world country could get one. The program has achieved some success: about 50 countries have adopted the device. The utility of these laptops is, of course, a function of many parameters, but one of the key ones is the display, which has to be easy to read outdoors and use so little power that the laptop can be recharged by sunlight, given that electrical outlets are rare in much of the developing world.

The technology that made it possible was developed by Mary Lou Jepsen at MIT's Media Lab. She has since gone on to found a company called Pixel Qi (pronounced "chee") that is developing next-generation display technology with an emphasis on low power consumption. The company just announced a new screen architecture that it claims matches the resolution, color saturation, contrast ratio and viewing angles of the Apple Retina display, but draws 3 to 100 times less power and is readable in bright sunlight.

Qualcomm, the big mobile-chip maker, acquired a startup called Pixtronix a couple of years ago that developed a display technology dubbed Mirasol. Again, high-quality viewing with low power consumption. Depending on whose article you read, Qualcomm is investing $1–2 billion in a plant in Taiwan to produce Mirasol displays. Samsung, which claims to be the world's largest consumer electronics company, acquired a company called Liquavista, a spinout from Royal Philips Electronics. Again, low power with the ability to read in any lighting condition.

Not to be completely left in the dust, the big-time LCD panel manufacturers like Sharp are working hard on new technologies. One of these is known as IGZO (Indium Gallium Zinc Oxide), which offers several advantages, including lower power and better resolution than conventional LCD technology. It was thought by many that Apple would use the technology in its iPad 3, but that didn't happen.

As you might expect, all of the foregoing are looking to replace conventional liquid crystal displays in hand-held and laptop devices.  A San Jose, CA company, Prysm, however, is specializing in low-power large format displays, based on its proprietary laser phosphor display technology.  Imagine an entire wall displaying high-quality video imagery using 75% less power than competing technologies, and you have the idea.

There isn’t enough room in this post to go into the various technologies that these companies are employing, nor the many other display technologies under development at places like the University of Cincinnati.  Follow some of the links in this post, and you’ll at least see what the manufacturers are willing to tell you.

Without picking winners and losers, I am convinced that low-power, high-resolution displays that can be read in any light will be hitting the market big time in the next few years. The impact on battery life and manufacturing cost will be truly significant. Think about a tablet or laptop you can read outdoors that charges its batteries without a cord. To quote Joe E. Brown in Some Like It Hot, "Zowie."

Olfactory Recognition and Synthesis: Does the Nose Know?


A digital dreamer's goal is to duplicate the five human senses: touch, sight, hearing, smell and taste, thereby creating a robot that can do, as the song says, "anything you can do, I can do better." (Sex hasn't figured into the dream – yet.) Advances have been made in taste (there are only four), touch, hearing (especially voice) and sight. Wall-E and Artoo aren't far off the reality mark.

The toughest sense to conquer appears to be that of smell. Most of you are too young to remember "Smell-O-Vision," a 1960s attempt by Mike Todd's son (Mike Todd was a famous movie producer, creator of Todd-AO and husband to Liz Taylor) to add smells to motion picture viewing. Approximately 30 odors were synthesized and pumped into the theater at appropriate times. This early attempt was unsuccessful and was quickly abandoned by the movie industry.

The applications for synthetically duplicating the olfactory sense, however, are very important and, to my way of thinking, justify a much larger research investment than is presently being made. Drug interdiction, security, disease detection, contaminant detection, gas-leak detection and crime prevention are but a few of the very important applications of the sense of smell. The best we have right now for these applications are dogs and vultures (for gas leaks)!

There are bazillions of molecules that can contribute to odors and more bazillions of permutations and combinations of those molecules.  It would be nice if we could make up a table of these molecules and simply look them up, but that approach simply isn’t in the cards.  The numbers are just too big.

A researcher in this field is Dr. Paul Rhodes.  This Stanford Visiting Scholar formed a company called Evolved Machines that is attempting to use neural network technology to identify odors.  Unfortunately, I don’t think he has gotten very far – his last press release was issued two years ago!

Given the enormous market potential of applications based on smell, it is a wonder that would-be entrepreneurs are not all over this one. The IEEE publishes a Sensors Journal that carries scholarly articles on the subject from time to time, but the apparent lack of serious research is astonishing to this observer. Do a Google search and be amazed at the paucity of informative hits on the subject.

So this subject remains an open question.  Will it have a future?  Maybe, even probably, but it looks like it will be a very long way off.

Ultra HD: Worth the Wait?


Sharp 85" UltraHD Prototype TV

So you just bought the latest and greatest super-thin 3D HDTV, thinking you are now state-of-the-art. Think again. Here comes (maybe) Ultra HD, also known as Super Hi-Vision. This format, proposed by NHK (the Japanese equivalent of PBS), offers 16 times the resolution of today's high-definition TV: 7680 x 4320 pixels, about the same as IMAX. A 2-hour movie in this format will require about 24 terabytes of data without compression!
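That tens-of-terabytes figure is easy to sanity-check. The frame rate and bit depth below are my own assumptions for illustration; raising either pushes the total even higher.

```python
width, height = 7680, 4320        # Ultra HD / Super Hi-Vision resolution
bits_per_pixel = 24               # assumed 8-bit RGB
frames_per_second = 30            # assumption; broadcast proposals go to 60 or 120
duration_s = 2 * 60 * 60          # a 2-hour movie

total_bytes = width * height * (bits_per_pixel / 8) * frames_per_second * duration_s
print(f"{total_bytes / 1e12:.1f} TB uncompressed")   # ~21.5 TB, i.e. tens of terabytes
```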

In addition to the video, up to 24 audio channels can be used.  You might start thinking about where you are going to place those 24 speakers.

NHK expects to broadcast in UltraHD by 2020.  In the meantime, a few companies have prototyped systems that accommodate the format.  For example, Sharp has built an 85″ Ultra HD set.

To confuse matters further, some companies are looking into an in-between technology called QFHD (Quad Full High Definition). This technology increases the pixel count by a factor of four rather than Ultra HD's factor of 16, for a resolution of 3840 x 2160 pixels. Samsung showed off a prototype QFHD TV display at CES, and Toshiba will be selling a 55″ QFHD set called the Regza 55X3 this year priced at about $12,000. Although it is being sold in Japan and Europe on a limited basis, it is not clear that an American audience will pay that price.

That set, by the way, is autostereoscopic, meaning that you can see 3D without glasses. Although there is no native content yet in the QFHD format, Toshiba puts the extra pixels to use by showing the two stereo signals needed for 3D, each in full 1080p resolution. People who have seen the display report being very impressed. In addition, the set is designed to accommodate multiple viewers within nine different regions. The television utilizes extremely small lenses to split the video feed into two views at different angles, and the user can calibrate the views using face-tracking software built into the television. Supposedly, one can sit at many viewing angles and still see a clear picture.

Satellite TV provider DirecTV has announced that it will soon be able to broadcast QFHD signals and maybe Ultra HDTV signals by switching to a new generation of Ka-band satellites that offer significantly more bandwidth than the current Ku-band satellites.

Not to be outdone by the Japanese and Koreans, the Chinese television manufacturer TCL debuted the world's largest 4K 3D LCD television this week, at 110 inches. Offering 4,096 x 2,160 pixels of resolution, the television requires active-shutter glasses to view 3D. In addition, it utilizes multi-touch technology to create a touch screen on the front of the display and offers dynamic backlight technology as well. TCL is labeling the technology "China Star" and eventually plans to roll it out in smaller sets. There have not been any price announcements, nor is it known if the sets will be sold outside of China.

So get ready to toss that state-of-the-art TV you just bought.  The nature of the business is that state-of-the-art is a moving target.  Naturally, you will want to move with it.


IBM’s Uber Battery: Can it be real?


Battery 500 Project Demo System

A few weeks ago, I blogged about the Envia company, and its claimed breakthrough in battery technology.  As you would suspect, lots of other people are working on battery technology with the aim of producing an all-electric car that will go 500 miles without needing to be recharged.  One of the most promising efforts is IBM’s Battery 500 project.

The initial research began in 2009 at IBM's Almaden research labs in California, and this past week IBM announced that it has built a prototype that demonstrates the efficacy of the technology. Wired Enterprise calls it the "Uber Battery," a descriptor I stole for the title of this post. IBM is not doing this alone; it is collaborating with researchers in both Europe and Asia, along with universities and national labs in the US. Nevertheless, IBM is the driving force, and the project is an outgrowth of IBM's well-publicized investment in nanotechnology.

It is difficult for a non-chemist to grasp the technology, but, briefly, the system works by using oxygen drawn from the air, much as it is drawn into a conventional combustion engine. Inside the battery's cells, the oxygen slips into tiny spaces that measure about an angstrom (one ten-billionth of a meter) and reacts with lithium ions situated on the battery's cathode. That reaction converts the lithium ions to lithium peroxide, releasing electrons and thus generating electricity. For more information oriented to the layman, go to the following website and check out the videos.

IBM credits much of the research advancement to the so-called Blue Gene supercomputers, used to analyze electrochemical reactions in order to find alternative electrolyte materials that won't degrade the battery while recharging. These computers, located at Argonne National Lab and in Zurich, Switzerland, have rung up tens of millions of processor-hours on the project. The computer modeling is being used to determine how the ions and molecules of different materials will interact. The hope is to find the optimum combination of materials that will permit commercialization of the technology.

The downside is that it is not expected to be commercialized until at least 2020.  In the meantime, auto manufacturers around the world are licking their collective chops.  If this technology is successful, it will signal the end of imported oil in the US.  The geopolitical implications are enormous.

Ultraviolet: Revolutionary or Yesterday’s News?


Ultraviolet Sticker

A couple of weeks ago, my stepson, the techy genius, asked me if I was tuned in to Ultraviolet. He was surprised to learn that I'd never heard of it. I am talking here about the service designed to peddle movies and such, not the part of the light spectrum beyond visible violet. For those of you who have never heard of it either, here is a quick synopsis:

In the cloud, there is a place to store movies and other forms of video entertainment.  One buys the movies from any one of dozens of purveyors.  Instead of walking out of a store with a DVD in your hand, the movie gets uploaded to the cloud (or maybe it is already there).  By entering a password, you can access your movie in several ways:  You can stream it to any device equipped for streaming video.  You can download it to a computer for later viewing.  You can make a DVD, although you are only allowed to make a single copy.  Ultraviolet identifies itself as “a digital rights authentication and cloud-based licensing system.”

The entertainment industry loves the idea because it thinks it might limit piracy (talk about heads in the sand).  Retailers, both brick-and-mortar and online, like the idea.  They get the same price they would for a physical DVD, but they don’t have to carry any inventory.  When you go to the Ultraviolet website, you  will find a long list of some of the biggest names in video entertainment and the places that sell it identified as sponsors or supporters.

Despite the hype and the big-name support, both the reviewers and the public in general have not said many nice things about the service.  The Gigaom and Techdirt reviews are typical.   The system is cumbersome to use for average folks, and it does not have the support of the 800-pound gorillas, Amazon and Apple.

Although Ultraviolet was announced in July 2010, it hasn’t made much progress.  Only Sony and Paramount have offered their films via Ultraviolet, and those offerings are limited to a small fraction of the movies in their portfolios.  So, despite the hype, the content providers are not exactly demonstrating a heavy commitment.  Without content, the system is doomed.

It is no secret that DVD and Blu-ray sales are decreasing.   How many people need to buy movies, when it is so much cheaper to rent them on Netflix or at kiosks?  Another competitive threat is coming from the cable and satellite TV companies.  Not only can you DVR movies for later viewing, but they are offering subscribers the ability to stream them at no extra cost.  Unlike Netflix, which is pleading for the authority to stream movie content, the cable guys already have it.  Then there are the illegal downloads that can be found easily by simply doing a Google search.

As I said above, I’d never even heard of Ultraviolet until now, and I am supposed to know about important technologies in the home entertainment business.  I do not see a market-driving force that suggests that Ultraviolet, or any service requiring upfront money for movie content, has a chance of succeeding.  Do you?  Requiem Blockbuster.


Exascale: The Faraway Frontier of Computing?


Those of you who follow this blog know that I write about technology trends that are, in general, not too far into the future.  This post is a departure in that it takes a peek at a technology at least 10 years away dubbed (for the moment) as “Exascale Computing.”  I’m writing about it now because the chances of my living long enough to see it come to fruition are somewhere between slim and none, and because I used to be very involved with the supercomputing industry and absence makes the heart grow fonder.

Driving the development of this technology is a project known as SKA, which stands for Square Kilometer Array, a multibillion-dollar radio telescope 100 times more sensitive than anything currently in existence. Construction will begin in five years and will not be completed until 2024. When it is operational, SKA will produce an exabyte of data every single day. To put that into perspective, that is twice the amount of data on the Internet worldwide – 1 quintillion bytes of information.
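An exabyte a day is an abstract number, so here is a quick conversion into sustained data rates; this is my own arithmetic, using 10^18 bytes per exabyte.

```python
EXABYTE = 1e18                       # bytes
SECONDS_PER_DAY = 24 * 60 * 60

bytes_per_second = EXABYTE / SECONDS_PER_DAY
print(f"{bytes_per_second / 1e12:.1f} TB per second, sustained")   # ~11.6 TB/s
print(f"{bytes_per_second * 8 / 1e12:.0f} terabits per second")     # ~93 Tb/s
# For scale, a fast 2012-era home broadband line is on the order of 0.00005 Tb/s.
```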

The SKA project is headed by ASTRON, the Netherlands Institute for Radio Astronomy.  To meet the vast computing requirements that will be needed to process that much information, ASTRON awarded IBM a $40+ million contract to begin developing what will be the world’s most powerful computer, equivalent to the combined computing power of 100 million high-end PCs.

There are three challenges IBM will be addressing: transport (moving data between computing elements), data storage and analysis. Transport will be addressed using optical technology that is well understood today. Analysis will rely on massively parallel processing arrays with the power of a million-plus Watsons. Storage will rely on the development of new technology, most probably based on IBM's nanotechnology research applied to "phase-change memory."

If you are interested in this project, I urge you to view a short video about the SKA and IBM’s involvement in the project at this website.  Trust me, you will be amazed at the scope of the project and the technology challenges that will have to be overcome.

Envia Systems: Savior of the Electric Car?


The holy grail of the electric car business is a competitively-priced vehicle that will travel several hundred miles on a single charge.  As shown in the graphic, we ain’t there yet.  One key to finding that grail is battery technology.  A little-known Silicon Valley company called Envia Systems claims to have made tremendous strides in developing a battery technology that will lead the industry directly to that grail – pass Go, collect $200.

The operative measure for vehicle battery technology is Wh/kg, or watt-hours per kilogram. Current technology, like that used in Tesla Motors' cars, runs around 240 Wh/kg and costs roughly $200 per kilowatt-hour. Envia promises to deliver 400 Wh/kg at $125 per kilowatt-hour. With those numbers, a $20,000 car could travel 300 miles before it needs to be recharged.
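Here is a rough pack-sizing sketch built on the figures above plus one assumption of my own, namely an electric car consuming about 300 Wh per mile; treat it as illustrative arithmetic, not engineering.

```python
WH_PER_MILE = 300            # assumed consumption; real cars vary with size and speed
TARGET_RANGE_MILES = 300

pack_kwh = TARGET_RANGE_MILES * WH_PER_MILE / 1000    # 90 kWh of storage needed

for name, wh_per_kg, usd_per_kwh in [("today's cells (per the text)", 240, 200),
                                     ("Envia's claim",                400, 125)]:
    mass_kg = pack_kwh * 1000 / wh_per_kg
    cost_usd = pack_kwh * usd_per_kwh
    print(f"{name}: {pack_kwh:.0f} kWh -> {mass_kg:.0f} kg of cells, ${cost_usd:,.0f}")

# Today:  ~375 kg and ~$18,000 of cells for a 300-mile pack.
# Envia:  ~225 kg and ~$11,250 -- which is why the claim matters so much.
```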

In general, battery technology improvement appears to be advancing at a rate of 5% a year.  If Envia’s claims are valid, its technology nearly doubles state-of-the-art energy density at half the cost.  Lots of folks are excited by the possibilities, including General Motors which has invested a bunch of money into the venture.

Envia began its journey in 2007 when it licensed some patents from Argonne National Laboratory (ANL). Although Envia execs claim that the company's technology was developed on its own, the Argonne patents gave it a start. The details of the patent agreement are secret, but apparently ANL will share in Envia's success if and when it happens.

Envia's batteries are lithium-ion (Li-ion), the same type used in cell phones and portable electric tools. Envia's technology is based on unique chemical compositions for the anodes and cathodes, notably silicon and carbon, and an electrolyte that is stable at high temperatures. Not all the problems that need to be solved have been solved, however. Current tests show that, although Envia batteries can still be recharged to 80% of capacity after 400 charge cycles, that number needs to reach 1,000 cycles to last 300,000 miles, considered to be the average lifetime of a car.

Envia's technology lends itself to conventional manufacturing processes. The company plans to license it rather than go into the manufacturing business itself. Possible licensees include General Motors and some other high-profile domestic and foreign firms.

Compared with Solyndra’s $500 million failed investment, Envia Systems looks  like a bargain basement deal.  The total investment in the company is probably less than $25 million including a $4 million grant from ARPA-e and $7 million from General Motors, part of a $17 million package.

As noted above, while battery technology is one key to mass acceptance of electric cars, it is not the only key.  Getting those great batteries charged may pose an even bigger barrier.   One only has to look at GM’s recent problems with the Volt to get an idea of the challenge.  I took my wife’s Chevy into a dealer a few days ago for service.  While there, I asked one of the sales guys how Volts are selling.  He looked at me and groaned.  He said that the tree-huggers were there in force the first few weeks, but now nobody is interested.  FYI, I live in an area heavily populated by tree-huggers!


The Personal Cloud: Truth or Dare?


Analysts love to coin new expressions. The one I'm most proud of is "Business Intelligence," which I thought was a lot sexier than "data mining" and "data warehousing," two earlier expressions that meant essentially the same thing. IBM and a couple of other big companies jumped on the Business Intelligence bandwagon and made it an IT proverb. I shoulda copyrighted or trademarked it, or registered the domain business-intelligence.com. Mighta made a few bucks. Water under the dam.

I was impressed the other day when I saw a reference to the “Personal Cloud.”  This expression is claimed by Gartner Group, possibly the world’s largest techy analyst company, and mentioned prominently in a recent press release entitled “Gartner Says the Personal Cloud Will Replace the Personal Computer as the Center of Users’ Digital Lives by 2014.”  (Unfortunately, personalcloud.com is already a registered domain name, indicating that Gartner isn’t any smarter than I was.)

In any event, I read the press release. I thought it made sense – sort of. Do you agree with the following quote?

“Gartner analysts said the personal cloud will begin a new era that will provide users with a new level of flexibility with the devices they use for daily activities, while leveraging the strengths of each device, ultimately enabling new levels of user satisfaction and productivity. However, it will require enterprises to fundamentally rethink how they deliver applications and services to users.”

The 46-word opening sentence aside (my composition professor would have slammed that writer), I agree that users will  potentially be more productive using the cloud, but I’m having a tough time with the prediction that enterprises will have to rethink how they deliver services to users.  In my experience, enterprises seldom rethink anything.  Rather, they change as little as possible so as not to freak out those users and make more work for themselves.

Don't get me wrong. I think the cloud is terrific. As I mentioned in an earlier post, the timeshare business of the late 60s to early 90s essentially offered cloud computing, but, in those days, networks were slow and storage capacity was limited. Today's broadband networks and virtually unlimited random-access storage enable a user to do almost anything online instead of on the desktop. The only real barrier to cloud computing is security, an issue that I believe will never get resolved.

If you are skeptical about the cloud, I’d like you to test out an app called Dropbox, a web-based file hosting service.  You can get a free single-user account or an account that can be shared by several users.  The learning curve is practically non-existent.  It’s a great example of cloud storage.

You can also try some cloud apps. If you use Google's Gmail or Calendar, you are already apping in the cloud! Others you can try for free are Quicken Online, WordPress (the blogging service I am using right now), and Adobe Photoshop Express. Check out this website, "10 cloud apps that slam-dunk their desktop counterparts."

Of course, these personal apps have little to do with enterprise applications, but will certainly give you a taste of the possible.
