
Technology Futures

Commentaries on Directions That Will Impact the Future of Technology

Archive for the month “November, 2011”

The Home Entertainment System Myth


I enjoy David Einstein’s weekly syndicated column in which he dispenses advice to anyone who writes in with a techie question. This week a lady wrote in stating that she had bought a “3D Smart TV,” a Blu-ray disc player and a 3D Blu-ray version of Cars 2. After setting everything up, she found that she couldn’t watch Cars 2 in 3D! She wanted to know why, since she thought she had bought those components so that her daughter could watch Cars 2 in 3D. I bet I can guess which retailer sold her that stuff.

David (not so nicely) told her that she would need a 3D Blu-ray player to play a 3D disc. He then explained that Cars 2 was available in 3D on cable on-demand, which would play on her 3D TV. He also gave her a few other options that I’m sure the retailer never thought of mentioning.

This squib is a great example of how the home entertainment industry has failed at every level – developer, manufacturer, distributor and retailer – to educate the average consumer. This is great news for the folks who attend CEDIA, the international trade association for companies that design and install electronic systems for the home. These folks make a living off the consumer’s lack of knowledge. If you are not a technical person and want a multi-component system to do what you want (e.g., play that 3D disc) at the press of a single button, you need to hire one of these CEDIA guys. Since the CEDIA guys charge a lot, most people don’t hire them. Without them, the chances are pretty good that you will a) buy the wrong stuff; b) pay too much for it; and c) use what you bought to maybe 20% of its potential.

This state of affairs has been going on for many years, and I simply don’t understand it.   All of the technology that is needed to make a home entertainment system “plug and play” has been available for years, yet no company has addressed the issue, although many purport to do so.  Several years ago, I wrote Steve Jobs a letter explaining how certain technologies that Apple had or was working on could address this problem, and, in the process, make Apple a lot of money.  In reply, I got a sharply-worded letter from a lawyer stating that Apple had a firm policy of rejecting any ideas from anyone not an employee of the company, and I was cordially invited to mind my own business.  (I thought about sending a similar missive to Bill Gates, but figured Microsoft would probably be even nastier than Apple.)

If you have a chance, attend the Consumer Electronics Show (CES) this coming January. Companies like Samsung, LG, Panasonic and Microsoft have booths measuring tens of thousands of square feet pushing thousands of products, but you won’t find much that tells the consumer how to integrate those products, or that even alludes to ease of use. The words are there in the hype, but none of these companies delivers a solution.

There is a place out there for a mass-market systems integrator.  The revenue potential is massive.  Why nobody has seized the opportunity remains one of the great mysteries.  For most people, a home entertainment SYSTEM is a myth.

Grouse, grouse, grouse . . . . .


Are We Headed to Subscription-based Cloud Computing?


This week, an Adobe exec demonstrated a suite of cloud-based applications designed to run with a touch interface. Adobe calls it the “Creative Cloud.” Its strategy is to shift from selling software in a box to selling software on a monthly subscription basis. The exec said that Adobe expects half its sales to be subscription-based by 2015.

I’m a big proponent of subscriptions. It doesn’t matter if it is a magazine or a renewable lease on a $100 million jet plane. Customers tend to renew subscriptions without giving the process a lot of thought, so subscriptions tend to guarantee an ongoing revenue stream without a lot of extra sales effort. Most sales managers will tell you that it costs 4-5 times as much to get a new customer as it does to renew an old one. One only has to look at the example set by Netflix to see what kind of success the subscription business model can engender.
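To put rough numbers on that rule of thumb, here is a minimal sketch of how the 4-5x acquisition-versus-renewal cost ratio compounds over a few years of subscriptions. All the dollar figures and rates are my own illustrative assumptions, not Adobe’s or Netflix’s:

```python
# Toy model: five-year profit from one subscriber, given that renewing
# a customer costs far less than acquiring one. Figures are illustrative.

ACQUIRE_COST = 100.0                 # assumed cost to win a new customer
RENEW_COST = ACQUIRE_COST / 4.5      # the "4-5x" claim: renewals cost ~1/4.5 as much
ANNUAL_FEE = 240.0                   # e.g., a $20/month subscription

def five_year_profit(renewal_rate: float) -> float:
    """Expected profit from one customer over five years."""
    profit = ANNUAL_FEE - ACQUIRE_COST   # year 1: pay the acquisition cost
    retained = 1.0
    for _ in range(4):                   # years 2-5: cheap renewals
        retained *= renewal_rate
        profit += retained * (ANNUAL_FEE - RENEW_COST)
    return profit

for rate in (0.6, 0.8, 0.9):
    print(f"renewal rate {rate:.0%}: 5-year profit per customer ${five_year_profit(rate):.0f}")
```

Even at a mediocre 60% renewal rate, the cheap renewals quickly dwarf the one-time acquisition cost, which is the whole appeal of the model.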

The touch interface will be a big driver for cloud-based software.  Cloud computing means that any device can easily connect to software or services, and that certainly includes tablets and phones, which are essentially touch-interface devices.  Further, there is a lot more touch-based technology coming.  If you haven’t seen the TV shows Hawaii Five-O, NCIS: LA or John King’s show on CNN, watch an episode or two just to see what gigenormous touch screens can bring to the party.  Also, Google “huge touch screens”.  You’ll find dozens of YouTube and other videos showing off these devices.

Finally, there is the reliability factor to consider. A few minutes ago, while I was typing this blog entry, my computer inexplicably crashed – naturally, just as the entry was almost finished. I had to copy the contents by hand, reboot my computer and re-enter the article. Grrrrrr… If I were in the Cloud, maybe I wouldn’t have to put up with that c–p. I would happily pay a (reasonable) monthly subscription fee if I could be sure that I’d never again have to deal with another crash.

In addition, subscriptions could eliminate some cheaters. I know a person who buys the latest version of a software package from Costco every year. He installs it on his computer, registers it with the manufacturer and then returns it to Costco, taking advantage of Costco’s generous return policy! Although I’m no expert on the subject, it seems to me that a lot of software piracy could be eliminated, or at least curtailed, if one had to have a subscription and the software were available only in the cloud.

In conclusion, I believe that subscription-based computing in the cloud will be a big deal, and that it will happen fast, certainly within the next five years.

 

Is an Ultrabook in Your Future?


The latest entry into the universe of personal computing goes by the moniker “Ultrabook.” “What the heck is that?” you may well ask. In a nutshell, an Ultrabook is a skinny laptop computer that meets guidelines laid down by Intel Corporation. Those guidelines include a thickness under 21 mm (0.83 inches), a solid-state “hard drive,” a minimum of 5 hours of battery life, an Intel “ultra low voltage” Core i-series processor, Intel Rapid Start technology, and Intel’s anti-theft and ID-protection technologies. Computers that meet these guidelines are about to hit the shelves this quarter, first from Lenovo, Asus, Acer and Toshiba, to be followed by other big-name PC companies like HP and Dell. If you would like to see how the early models stack up, go to http://asia.cnet.com/which-ultrabook-should-you-get-62211773.htm.

Ultrabooks are obviously inspired by the “MacBook Air,” which was introduced nearly three years ago. It is now an established product line, and Apple expects to sell nearly 3 million of them this year at prices ranging from $1,000 to $2,000. While not specifically meeting Intel’s Ultrabook guidelines, the MacBook Air uses Intel processors and de facto meets those guidelines in many respects.

Ultrabook wannabes hope to take a big bite out of the space that Apple has had to itself for the better part of three years. Their success (or lack thereof) will undoubtedly boil down to price. Initial Ultrabook prices will start near $1,000 – the same as a low-end MacBook Air – but the industry is projecting a $600-700 price point within 6-12 months, which would be a significant price advantage over Apple.

In addition, the plan is to equip Ultrabooks with touch screens running Windows 8. Personally, I don’t see much benefit to having a touch screen on a laptop, except for some specialized applications. One buys a laptop because it has a keyboard and a mouse-equivalent. I dunno, maybe it will turn on the gamers, but for anything involving text and/or data, gimme a keyboard and a mouse with a scroll wheel.

The “Netbook” is an almost forgotten category. Like an Ultrabook, it is light and small and has a keyboard. It is also cheap. A brand-new, loaded 10” model with standard (not Starter edition) Windows 7 can be had for $250, while a used one can be found for $100. In the traditional laptop market, $500 will get you a new machine from a major vendor running Windows 7 Professional with a 15.6” screen, a 320GB drive and a DVD burner. If you are willing to buy used or refurbished, you can get something similar for $200. Of course, if you don’t need a keyboard, a tablet will probably do you fine. You can pick up a good one today for less than $350, and, if you need it, some models offer a docking station that will accommodate a keyboard and mouse and a large display, and provide connectivity for most peripherals.

In conclusion, I believe there are only two compelling reasons to buy an Ultrabook: 1) sex appeal; and 2) a 5-pound laptop is too heavy to carry. Note that, weight aside, an Ultrabook still requires the same size carrying case as a laptop.

That said, I never underestimate the power of a well-financed PR campaign, and Intel/Microsoft and their customers will be spending billions to convince you that you can’t survive without an Ultrabook.  Too early to tell if it will go the way of the Edsel or change the nature of the computing landscape.

Is a Chinese Clone in Your Future?


[Image: a typical Chinese clone tablet]

The picture above shows a typical so-called Chinese clone tablet. These Android-based tablets typically sell for around $200, inclusive of shipping, if purchased online directly from a Chinese vendor, and about 10% more from a US retailer. See www.chinagrabber.com and www.mhz.biz for examples. Compared with an iPad, Motorola Xoom or Samsung Galaxy, which typically cost $500 or more, they seem like a bargain at first glance.
With the tablet market heating up, it might seem that the Chinese clone makers are well-poised to grab a significant share of the worldwide market, given their price advantage.  However, if one looks under the hood, one may find some drawbacks, which, I believe, will ultimately consign them to the technoeconomic dying room.  For example:
  • These devices come with virtually no useful documentation, and no technical support from the manufacturers.  Instead, buyers are expected to get support from some Internet forums populated by techno-geeks with varying degrees of expertise.   See www.apad.tv and www.androidforums.com for examples.
  • It turns out that Android is not Android is not Android. Google chose to make Android open source, enabling anyone to create their own version. Google has a certification process designed to ensure that a given version of Android meets certain criteria. A manufacturer may choose to go through that procedure – or not. The clone manufacturers have opted out of that process. As a result, the clone versions of Android are denied access to some important features. For example, clone owners are not permitted to download many free apps and all paid apps from the Android Market. While it is often possible to get around this limitation, the process may be painful – and may be illegal as well.
  • The manufacturers may or may not provide upgrades to the system software. Even if they do, the upgrades are not likely to be state-of-the-art. Taking up the slack, a few Linux gurus (Linux is the foundation of Android) have taken it upon themselves to issue system software upgrades, apparently out of the goodness of their hearts. These are published on one or more of the aforementioned forums, but the upgrade process generally requires a tech-savvy user to implement.
  • At least some of the components used in these clones are inexpensive, and that translates to slow performance for some operations. Scrolling can be jerky; although video files play OK, streaming video from sources such as Netflix won’t run; web pages sometimes take a long time to load; and so on.

In fairness, some of the clones have nice features. For example, the Flytouch, one of the more popular clones, has Ethernet, two full-size USB ports, HDMI output, and a connection for an external GPS antenna. With care, the case can be popped open so the user can change the battery himself, and internal storage can be increased by simply inserting a larger SD card.

However, the tablet market is heating up fast, and prices are on the way down. The 10″ Toshiba Thrive, one of the newer tablets on the market, is selling online for as little as $300 (MSRP is $400). The MSRP for the new Amazon Kindle Fire is $200, and the Barnes & Noble Nook Tablet is $250. All of these devices are Google-certified and backed by big international companies with reputations to protect.

These prices are US-based.  Fortunately for the Chinese cloners, prices are much higher elsewhere in the world, so the price advantage may well stick around for some time in parts of Europe and Asia.  In the end game, it may make no difference.  The Chinese are building virtually all of the world’s tablets, the iPad included, and would thus appear to be in a win-win situation.

Industrial Research, Alive but not Well in the Tech Sector


This is about electronics-based industry, not biotech or pharmaceuticals.

Fifty years ago, the leading US tech companies had the finest pure research laboratories in the world.  Since they paid their employees more than most universities, they tended to get the best and the brightest, who were often given blank checks to follow their interests without interference from a bureaucracy.  This resulted in the development of such things as UNIX (Bell Labs), nanotechnology (IBM) and the iconic GUI (Xerox developed it, Apple stole it).

The granddaddy of them all was Bell Labs, followed by IBM, Xerox, HP, and a few others. Then an a–h–le named Judge Greene decided that AT&T, which had created the greatest telecommunications infrastructure in the world, was a no-goodnik, and busted up the company. Besides dropping the US telecommunications industry about 20 years behind Europe, Japan and Korea, the breakup effectively gutted Bell Labs. Bell Labs, you may remember, developed the laser, the transistor, cellular networks and solar cells, among thousands of lesser-known inventions that have significantly impacted the lives of 90% of the world’s population.

In the 1970s, Wall Street exerted tremendous pressure on companies to improve short-term profits. As a result, pure research activities were severely curtailed at most American companies. Research projects that held no promise of short-term financial returns were dropped, and thousands of top-notch scientists and engineers went looking for jobs. Pure research funding came mostly from the government, and most of that money went to universities. To make matters worse, most government-funded projects have government strings attached. Now, I am not knocking university research, but the decline of our industrial research complex has cost the US big time, and it has a lot to do with the fact that the US’ once unquestioned technology leadership is being challenged throughout the world.

Unfortunately, the people who follow this subject talk about R&D, not R. Thus, if you look at tables of R&D expenditures published by various consulting firms and universities, you will find that most of the leaders are car and oil companies. These companies are spending a lot on D and very little on R. I dismiss them as irrelevant to this discussion.

In the electronics/computing/telecom sector, the three leaders in annual R&D expenditures are Microsoft ($9.2B), Intel ($7.7B) and IBM ($6.3B).  Although none of these companies publicly break out R from D, these industry leaders are indeed spending a lot on basic research.  Thank goodness.

While our government leaders give a lot of speeches about restoring and/or increasing America’s leadership in technology, I don’t hear a lot of talk about how government can incentivize our industry to get back into the basic research business.  If we don’t do that, the technologically emerging countries are going to run right by us.

Are They Rain or Cumulus Clouds?


No one doubts that so-called “cloud computing” is on fire these days.  Cloud computing vendors would have us believe that all is beautiful out there – nice white fluffy cumulus clouds.

Once upon a time (the 1960s) there was a business known as “timesharing,” thought by many to be the nirvana of all nirvanae (I made that word up). Timesharing is nothing more than cloud computing. One had on one’s desk a “dumb terminal” consisting of a screen and a keyboard. The actual computing was done on computers (called servers today) located in remote sites managed by professional IT people, offering the user access to much more computing power and storage than he could otherwise afford. Timesharing ultimately died partly because the acoustic-coupled modems of the day and the supporting networks were very slow, and then the PC came along and finished it off. But what really knocked it off was the lousy security. In the beginning there weren’t any hackers, so security wasn’t of much concern, but as hacking into networks became the vocation of thousands, security went to hell in a handbasket.

Replace the phrase “timesharing” with “cloud computing.” It’s the same business. And while progress in the cost of storage and networking has changed a lot, the Achilles heel of the business – security – hasn’t changed. In my opinion, cloud computing is downright dangerous unless all you want to do is save your grandkids’ pictures in the ether – and even that can be dangerous. Unless the security issue is addressed, businesses will reject it. Thems are rain clouds out there.

Unless you are the NSA, security is one of those things that everyone has always given lip service to but won’t pay for. There are some hopeful signs, however, that, if brought to fruition, may actually make cloud computing everyone’s MO for the next couple of decades. One of those signs is a development by Harris Corporation, which is building what it calls the “Trusted Cloud.” If you have never heard of Harris, I’m not surprised. Although it is a $6 billion company, it has historically done secret work for spook and military agencies. (Over the years it gave commercial markets a shot – minicomputers, for instance – but never hit any home runs in that milieu.)

I’m not going to go into the details in this blog, but, suffice it to say, Harris may be on the cusp of making the cloud practical for a lot of folks who won’t touch it today.  If you want to delve into this further, I refer you to another quasi-technical blog, http://chucksblog.emc.com/chucks_blog/2011/06/harris-what-it-takes-to-build-a-trusted-cloud.html.   It was written by a marketing guy, so you have to put it in that context, but the basic concepts are explained pretty well.  Happy reading.

The Return of the ASIC


Once upon a time, ASICs (Application Specific Integrated Circuits) were one of the hottest topics in Silicon Valley’s VC community. Just ’cause I was curious, I Googled “ASIC” the other day and found 1) a manufacturer of athletic shoes, 2) the Australian Securities and Investments Commission, and 3) the American Society of Irrigation Consultants. To be sure, the ubiquitous Wikipedia site did explain what an ASIC really is, but there wasn’t much else to be found.

I bring this up because I think ASICs are on their way back and will become an increasingly hot product category over the next several years. ASICs fell out of favor in some quarters because general-purpose processing chips got cheap enough to compete on price. But using a general-purpose processor for a specific application wastes a lot of transistors, and, because the chip is not optimized for the application it serves, it is inherently inefficient. Now, however, we have electronic products like mobile phones and tablets being made in the millions of units, and that changes the economics significantly.
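The economics are easy to see with a back-of-the-envelope model: an ASIC carries a big one-time design (NRE) cost but a lower unit cost than a general-purpose chip, so there is a break-even volume above which the ASIC wins. The dollar figures below are illustrative assumptions of mine, not industry data:

```python
# Break-even volume for an ASIC vs. a general-purpose processor.
# All dollar figures are illustrative assumptions.

NRE_ASIC = 2_000_000.0   # one-time design/mask cost for the ASIC
UNIT_ASIC = 3.0          # assumed per-unit cost of the ASIC
UNIT_GP = 8.0            # assumed per-unit cost of a general-purpose chip

# Total costs cross where NRE_ASIC + UNIT_ASIC * v == UNIT_GP * v
break_even = NRE_ASIC / (UNIT_GP - UNIT_ASIC)
print(f"ASIC pays off above {break_even:,.0f} units")   # 400,000 units

# At phone/tablet volumes the per-unit savings dwarf the NRE:
volume = 10_000_000
savings = (UNIT_GP - UNIT_ASIC) * volume - NRE_ASIC
print(f"At {volume:,} units, net savings: ${savings:,.0f}")  # $48,000,000
```

At a few hundred thousand units the ASIC is a toss-up; at ten million units it is a no-brainer, which is exactly the volume regime phones and tablets now occupy.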

So, while I’m not forecasting Intel’s imminent demise, I am forecasting excellent growth opportunities in the ASIC business over the next decade. I found a forecast from a market research company (which will remain unnamed) stating that the ASIC compound annual growth rate over the next few years will be less than 3.8%. I haven’t done a formal study, so I’m not prepared to cite any numbers of my own, but I think that forecast is extremely low. As high-volume production of electronic devices proliferates (anyone want to argue that?), ASICs have got to be winners.
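For reference, here is what a 3.8% compound annual growth rate actually implies, using the standard CAGR relation (end = start × (1 + r)^n):

```python
# What a 3.8% CAGR implies: market size relative to today after n years.
rate = 0.038
for years in (5, 10):
    growth = (1 + rate) ** years
    print(f"After {years} years: {growth:.2f}x today's market")
# ~1.21x after 5 years and ~1.45x after 10 -- modest indeed for a
# business riding the high-volume mobile device boom.
```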

Thoughts on Natural Language Input


Anyone who has seen a recent Apple ad or a newsclip on its new iPhone 4S knows that Apple is pushing voice input as a revolutionary way not only to input commands but also to get answers to queries. I’ve not had a chance to try out the 4S, but I own an Android phone and an Android tablet, both of which support voice input that works pretty well. If you are interested in a comparison between Apple’s “Siri” and Android’s “Voice Actions,” take a look at the excellent PC World article on this subject at http://www.pcworld.com/article/242198/apples_siri_vs_androids_voice_actions_feature_showdown.html.

All of this current technology is very nice, but the promise of what is to come in natural language processing is what gets my adrenaline pumping. (OK, so I sometimes exaggerate.) If you are skeptical, I recommend you see the recent Jeopardy TV shows that pitted IBM’s “Watson” parallel-processing computer against the two best-ever Jeopardy contestants. Suffice it to say that Watson was a big winner. Not only did Watson beat the c–p out of its opponents, its answers were given in the form of synthesized speech, and it made its bets based on complex probability algorithms.
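IBM hasn’t published Watson’s wagering logic in a form I can vouch for, but the flavor of confidence-weighted betting is easy to sketch. The classic Kelly criterion, used below purely as an illustrative stand-in for whatever Watson actually does, sizes a bet from an estimated win probability and the payoff odds:

```python
# Kelly criterion: bet fraction f* = (b*p - q) / b, where p is the
# estimated probability of being right, q = 1 - p, and b is the net
# payoff per dollar wagered. Illustrative only -- not IBM's algorithm.

def kelly_fraction(p: float, b: float) -> float:
    q = 1.0 - p
    return max(0.0, (b * p - q) / b)   # never bet on a negative edge

bankroll = 10_000.0
for p in (0.5, 0.7, 0.9):              # the machine's confidence in its answer
    f = kelly_fraction(p, b=1.0)       # assume an even-money payoff
    print(f"confidence {p:.0%}: wager ${bankroll * f:,.0f}")
```

The point is that the bet scales with confidence: at 50% confidence the machine wagers nothing, at 90% it bets most of the bankroll.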

I think the day is not far away when one will be able to ask one’s phone what bets to make on the third race at Pimlico, and get a (whispered) answer with far more reliability than the typical tout sheet!

Taking off from my earlier comments about using smart pads and phones as remote controls for home entertainment, security and climate control, the day should not be too far off when voice will replace buttons. I can’t think of anything that would give a bigger boost to the home entertainment biz. At the risk of being accused of being a male chauvinist pig, a lot of women who are terrified of controlling anything more complicated than a stand-alone TV set will become ardent home-theater enthusiasts if all they have to do is say “Record Dancing with the Stars,” or “Show me a list of all the movies starring Gregory Peck available on cable this week,” or “Turn down the volume,” or “What’s next on my Netflix queue?”, or “Turn the heat up to 72 degrees.” I’m sure you get the concept.
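The dispatch logic behind such commands is simple enough to sketch. Below is a toy keyword matcher in Python; the device responses are made up for illustration, and a real product would use full natural language understanding rather than string matching:

```python
# Toy intent dispatcher: map recognized speech to home-control actions.
# Device names and responses are hypothetical illustrations.

import re

def handle(utterance: str) -> str:
    text = utterance.lower()
    if text.startswith("record "):
        return f"DVR: schedule recording of '{utterance[7:]}'"
    if "volume" in text and "down" in text:
        return "Receiver: volume -5 dB"
    m = re.search(r"heat .*?(\d+)", text)
    if m:
        return f"Thermostat: set to {m.group(1)} degrees"
    return "Sorry, I didn't understand that."

for phrase in ("Record Dancing with the Stars",
               "Turn down the volume",
               "Turn the heat up to 72 degrees"):
    print(handle(phrase))
```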

Philips, one of the world’s largest consumer electronic companies, used to be the leading provider of programmable remote control devices.  Not long ago, it closed down the division responsible for that product line.  Do ya think it saw the handwriting (oops voicewriting) on the wall?

Home Entertainment Remote Controls Moving to Tablets and Phones


I had a side business once customizing programmable touch-screen remote controls for home entertainment systems. Although the results could be magical, the devices were expensive to buy and often more expensive to program. Not many people could afford them.

The latest trend in remote controls is to use tablets, smart phones and even iPods as interface devices to control home entertainment, security and climate-control systems. This makes a lot of sense. The devices have beautiful high-resolution screens and can easily be adapted to communicate with equipment via traditional infrared, Ethernet, several RF standards and directly via WiFi, which is being incorporated into an increasing number of electronic components. The professionals (the guys who install the $250,000 home theater systems) have embraced this new technology, but its true beauty, in my opinion, is that, with clever software, it can be adapted for Joe Couch Potato at a price he can afford, and it can offer far more functionality than one can get with a Logitech Harmony, today’s most popular universal remote control.

A new Silicon Valley startup, Cyphersoft LLC, just unveiled its “Roomie” remote control (www.roomieremote.com). To use it, you need an iPad, iPhone or iPod; an app you buy from the Apple App Store; and, depending on the equipment you need to control, possibly one or more adapters. (An adapter converts, for example, the WiFi IP signal from the remote device into the infrared signal needed to operate the equipment.) Although Roomie is in its first incarnation and has limited functionality, just about anything can be done as the software evolves, without requiring additional hardware.
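To make the adapter idea concrete, here is a minimal sketch of what sending a command over the LAN to an IP-to-IR bridge might look like. The address, port and message format below are entirely hypothetical; Roomie’s actual protocol and supported adapters are whatever the vendor documents:

```python
# Sketch: a controller app sends a command over WiFi/Ethernet to an
# adapter box, which re-emits it as an infrared signal the TV understands.
# The address, port, and message format below are hypothetical.

import socket

ADAPTER_ADDR = ("192.168.1.50", 4998)   # hypothetical IP-to-IR adapter

def send_ir_command(command: str) -> None:
    """Send one newline-terminated command string to the adapter."""
    with socket.create_connection(ADAPTER_ADDR, timeout=2.0) as sock:
        sock.sendall((command + "\r\n").encode("ascii"))

# e.g., tell the adapter to blast a stored "power on" IR code at the TV
send_ir_command("sendir,tv,power_on")
```

The nice design property, as noted above, is that all the intelligence lives in the app; the adapter is a dumb translator, so new features arrive as software updates rather than new hardware.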

Despite the efforts of the various “Occupy” protests around the country, the growing population of billionaires will undoubtedly keep the pros in business for the foreseeable future.  Nevertheless, I predict that we are sitting on the leading edge of a revolution in home electronic system control, one that will bring sophisticated, easy-to-use functionality to the average household at a price that most homeowners (and renters) will be comfortable with.
