Technology Futures

Commentaries on Directions That Will Impact the Future of Technology

Archive for the tag “IBM”

IBM’s Uber Battery: Can it be real?

Battery 500 Project Demo System

A few weeks ago, I blogged about the Envia company, and its claimed breakthrough in battery technology.  As you would suspect, lots of other people are working on battery technology with the aim of producing an all-electric car that will go 500 miles without needing to be recharged.  One of the most promising efforts is IBM’s Battery 500 project.

With the initial research begun in 2009 at IBM’s Almaden research labs in California, this past week IBM announced that it has built a prototype that demonstrates the efficacy of the technology.  Wired Enterprise calls it the “Uber Battery,” a descriptor I stole for the title of this post.  IBM is not doing this alone.  It is collaborating with researchers in both Europe and Asia, along with universities and National Labs in the US.  Nevertheless, IBM is the driving force, and the project is an outgrowth of IBM’s well-publicized investment in nanotechnology.

It is difficult for a non-chemist to grasp the technology, but, briefly, the system works by drawing oxygen from the air, much as a conventional combustion engine does.  Inside the battery’s cells, the oxygen slips into tiny spaces that measure about an angstrom (0.0000000001 meters) and reacts with lithium ions at the battery’s cathode.  That reaction turns the lithium into lithium peroxide, and the electrons given up by the lithium flow through the external circuit, generating electricity.  For more information oriented to the layman, go to the project’s website and check out the videos.
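As I understand it (and I am not a chemist), this matches the textbook lithium-air discharge chemistry, which can be written as:

```latex
% Lithium gives up electrons at the anode; they travel through the
% external circuit (doing useful work) to the cathode:
2\,\mathrm{Li} \longrightarrow 2\,\mathrm{Li}^{+} + 2\,e^{-}
% At the cathode, oxygen drawn from the air combines with the lithium ions:
2\,\mathrm{Li}^{+} + 2\,e^{-} + \mathrm{O}_2 \longrightarrow \mathrm{Li}_2\mathrm{O}_2
% Overall cell reaction during discharge:
2\,\mathrm{Li} + \mathrm{O}_2 \longrightarrow \mathrm{Li}_2\mathrm{O}_2
```

Recharging runs the overall reaction in reverse, which is exactly where the troublesome electrolyte degradation comes in.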

IBM credits much of the research advancement to its Blue Gene supercomputers, used to analyze electrochemical reactions in search of alternative electrolyte materials that won’t degrade the battery during recharging.  These computers, located at Argonne National Lab and in Zurich, Switzerland, have rung up tens of millions of processor-hours on the project.  The computer modeling is being used to determine how the ions and molecules of different materials will interact.  The hope is to find the optimum combination of materials that will permit commercialization of the technology.

The downside is that the technology is not expected to be commercialized until at least 2020.  In the meantime, auto manufacturers around the world are licking their collective chops.  If this technology is successful, it will signal the end of imported oil in the US.  The geopolitical implications are enormous.


Exascale: The Faraway Frontier of Computing?

Those of you who follow this blog know that I write about technology trends that are, in general, not too far into the future.  This post is a departure in that it takes a peek at a technology at least 10 years away, dubbed (for the moment) “Exascale Computing.”  I’m writing about it now because the chances of my living long enough to see it come to fruition are somewhere between slim and none – and because I used to be very involved with the supercomputing industry, and absence makes the heart grow fonder.

Driving the development of this technology is a project known as SKA, which stands for Square Kilometre Array, a multibillion-dollar radio telescope 100 times more sensitive than anything currently in existence.  Construction will begin in five years and will not be completed until 2024.  When it is operational, SKA will produce an exabyte of data – 1 quintillion bytes – every single day.  To put that into perspective, that is twice the amount of data flowing across the entire Internet.
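An exabyte a day is hard to visualize, so here is a quick back-of-the-envelope calculation (assuming the decimal definition, 1 EB = 10^18 bytes) of what it works out to as a sustained data rate:

```python
# Back-of-the-envelope: what does one exabyte per day mean as a data rate?
EXABYTE = 10**18            # bytes (decimal definition)
SECONDS_PER_DAY = 24 * 60 * 60

rate_bytes_per_sec = EXABYTE / SECONDS_PER_DAY
print(f"{rate_bytes_per_sec / 10**12:.1f} TB/s sustained")  # ~11.6 TB/s

# How many 1 TB consumer hard drives would one day of SKA data fill?
drives_per_day = EXABYTE / 10**12
print(f"{drives_per_day:,.0f} one-terabyte drives per day")  # 1,000,000
```

In other words, the instrument would fill a million consumer hard drives every single day – which is why the computing challenge is getting attention years before the telescope is built.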

The SKA project is headed by ASTRON, the Netherlands Institute for Radio Astronomy.  To meet the vast computing requirements that will be needed to process that much information, ASTRON awarded IBM a $40+ million contract to begin developing what will be the world’s most powerful computer, equivalent to the combined computing power of 100 million high-end PCs.

There are three challenges that IBM will be addressing: transport (moving data between computing elements), data storage, and analysis.  Transport will be addressed using optical technology that is well understood today.  Analysis will rely on massively parallel processing arrays with the power of a million-plus Watsons.  Storage will rely on the development of new technology, most probably based on IBM’s nanotechnology research applied to “phase-change memory.”

If you are interested in this project, I urge you to view a short video about the SKA and IBM’s involvement in the project at this website.  Trust me, you will be amazed at the scope of the project and the technology challenges that will have to be overcome.

The Personal Cloud: Truth or Dare?

Analysts love to coin new expressions.  The one I’m most proud of is “Business Intelligence,” which I thought was a lot sexier than “data mining” and “data warehousing,” two earlier expressions that meant essentially the same thing.  IBM and a couple of other big companies jumped on the Business Intelligence bandwagon and made it an IT proverb.  I shoulda trademarked it, or registered the domain business-intelligence.com.  Mighta made a few bucks.  Water under the bridge.

I was impressed the other day when I saw a reference to the “Personal Cloud.”  This expression is claimed by Gartner Group, possibly the world’s largest techy analyst company, and mentioned prominently in a recent press release entitled “Gartner Says the Personal Cloud Will Replace the Personal Computer as the Center of Users’ Digital Lives by 2014.”  (Unfortunately, the obvious domain name is already registered, indicating that Gartner isn’t any smarter than I was.)

In any event, I read the press release.  I thought it made sense – sort of.  Do you agree with the following quote?

“Gartner analysts said the personal cloud will begin a new era that will provide users with a new level of flexibility with the devices they use for daily activities, while leveraging the strengths of each device, ultimately enabling new levels of user satisfaction and productivity. However, it will require enterprises to fundamentally rethink how they deliver applications and services to users.”

The 46-word opening sentence aside (my composition professor would have slammed that writer), I agree that users will potentially be more productive using the cloud, but I’m having a tough time with the prediction that enterprises will have to rethink how they deliver services to users.  In my experience, enterprises seldom rethink anything.  Rather, they change as little as possible so as not to freak out their users and make more work for themselves.

Don’t get me wrong.  I think the cloud is terrific.  As I mentioned in an earlier post, the timeshare business of the late ’60s through the early ’90s essentially offered cloud computing, but, in those days, networks were slow and storage capacity limited.  Today’s broadband networks and virtually unlimited random-access storage enable a user to do almost anything online instead of on the desktop.  The only real barrier to cloud computing is security, an issue that I believe will never get resolved.

If you are skeptical about the cloud, I’d like you to test out an app called Dropbox, a web-based file hosting service.  You can get a free single-user account or an account that can be shared by several users.  The learning curve is practically non-existent.  It’s a great example of cloud storage.

You can also try some cloud apps.  If you use Google’s Gmail or Calendar, you are already apping in the cloud!  Others you can try for free are Quicken Online, WordPress (the blogging service that I am using right now), and Adobe Photoshop Express.  Check out this website: “10 cloud apps that slam-dunk their desktop counterparts.”

Of course, these personal apps have little to do with enterprise applications, but will certainly give you a taste of the possible.

The IBM Tech Trends Report: Can 4000 IT Pros Be Right?


Last year, IBM conducted a survey of 4,000 IT professionals in an attempt to identify the most important technology trends.  The survey population included IT pros from 93 countries and 25 industries.  The US, China, Russia, India and Brazil contributed the most responses.  The results are published in a publicly available report titled The IBM Tech Trends Report.

This blog post shamefacedly plagiarizes from the IBM report.  For many years, I conducted similar types of surveys and published similar reports, but always for individual clients who would never let the competition see the results.  It is an interesting statement that IBM is willing to share its findings with anyone willing to take the time to read the material.  Even more interesting (to me) is that IBM also published the survey data.  Thus, if you don’t like IBM’s conclusions, you can formulate your own using its data!  Very cool indeed.

The data is in SPSS format.  Since not many people have SPSS, IBM cut a deal so that individuals can download and install a 14-day trial version, plenty of time to analyze the information.  I’ve already done that.  In future blog posts, I’ll present some of my conclusions, but I’m not going to tell you that they are based on this data, since that would be only one input that I would be inclined to use.

IBM’s study focused on four areas: business analytics, mobile computing, the cloud and social business.  According to IBM, these are four critical and interconnected areas that developers must concentrate on to build what IBM’s PR folks call “The Smarter Planet.”

I’m not going to regurgitate the findings of the study – you can read that for yourself – but there are a few things that came out of the study that I think are noteworthy:

  • There is less interest in automation in the US than there is in the other large countries.  Does this suggest that the US is so far ahead that it doesn’t think it needs it, or does the US have its collective head up its collective a-s?
  • Developers of mobile computing applications would be wise to concentrate on the Android platform.  Although iOS is very popular in the US and developed countries, Android offers a much shorter learning curve, and will be more appealing to the rest of the world.
  • Cloud Computing offers new opportunities for building and delivering applications and can lead to new ways of conceptualizing business models.
  • The popularity of social networking in the business environment is very closely tied to culture.  For example, social networking is embraced in India and spurned in Russia.  The US loves it, but is worried about security and privacy.

Most of the tech trends info that is on the web is written by reporters who get most of their information from interviews with people in the booths at the latest techy conference.  It’s nice to see real study results based on meaningful statistical data.



The Paperless Society: Myth or Reality?

CBS syndicated columnist Dave Ross recently said “The best thing about the Internet is that there is no paper.  The worst thing about the Internet is that there is no paper.”  I think Mr. Ross has succinctly captured the essence of the dichotomy.

Several years ago, the Aetna Insurance Company hired IBM to do an extensive $3 million study with the objective of reducing paper handling throughout Aetna by 10%.  When the study was completed, and recommendations made, Aetna management rejected it, concluding that the trauma caused by doing things differently would cost the company far more than the anticipated savings!

In 2004, FedEx paid $2.4 billion to acquire 1,300 Kinko’s stores.  What services does Kinko’s (now called “FedEx Office”) provide?  Primarily printing and copying paper.  Kinko’s also provides shipping services.  What do they ship?  You guessed it, mostly documents.  Is FedEx nuts?  Probably not.

The US Post Office is on the ropes because its paper shipping business has declined to the point where the existing infrastructure is bigger than it needs to be.

The Kindle, the Nook and the iPad have obviated the need to read books on paper.

Several years ago, I visited the Head of HP’s Workstation Division.  When I asked him how they were making any money given the current competitive situation, he replied:  “Every morning, before office hours, we all gather in the parking lot, face Boise, bow down and utter a prayer of thanks.”  If you didn’t already know it, Boise is the HQ of HP’s Printer Division!

The newspaper business is on its last legs, given that an increasing number of people get their news over the Internet or on TV.

Visit Costco, Staples, Office Max and Office Depot and see rows and rows of paper for sale:  Copy paper, printer paper, card stock, colored paper, envelopes, etc., etc.

Reading a Magazine in a Barber shop

Who among you men can imagine waiting in a barber shop reading Playboy on a tablet?  Who among you women can imagine reading Vogue on a tablet while waiting for your manicure?

Although paper consumption in North America has declined by 25% in recent years, demand is on the upswing due to improvement in the economy, the introduction of new products and huge growth from Asia, especially China.

Here are two oft-repeated statements: “Finding specific information in a stack of paper can be time-consuming and often frustrating.  Finding specific information in digital data is quick and easy.”  My response to that is a big MAYBE.  The statements may be true IF the data has been organized efficiently and the reader knows the right keywords to search on.  But suppose you are reading a lengthy document and want to refer back to something you read earlier.  If you are sitting in front of a screen, how easy is it to do that?  Often, it is much faster to riffle through the pages of a paper document than to do a computer search when you might not even remember the searchable keywords.
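The keyword problem is easy to demonstrate with a few lines of code.  Here’s a toy illustration (the document text and search terms are invented for the example):

```python
# Toy illustration: a keyword search only finds what you remember to ask for.
document = (
    "The committee approved the fiscal plan in March. "
    "Funding for the bridge retrofit was deferred to next year."
)

def find_keyword(text: str, keyword: str) -> bool:
    """Case-insensitive whole-text search, like Ctrl+F."""
    return keyword.lower() in text.lower()

print(find_keyword(document, "retrofit"))  # True - you remembered the exact word
print(find_keyword(document, "repair"))    # False - same idea, wrong keyword
```

A human skimming the page would spot the deferred bridge work either way; the search box only finds the exact words you feed it.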

So what is one to make of these seemingly contradictory observations?  I think it is safe to say that paper and electrons will coexist for a very long time.  Paper will be replaced where it makes sense, and used where it makes sense.

By one oft-cited estimate, each person in the United States uses 749 pounds of paper per year!  (Yes, that includes toilet paper.)  We are not going to see that consumption decline significantly for a very long time.

Voice Input: Mind Over Matter?


A few nights ago, I happened to watch an episode of the TV comedy show “The Big Bang Theory,” in which one of the lead characters carries on conversations with Siri.  For example:

Character (speaking with an Indian accent):  “Hello”

Siri (speaking with a sexy woman’s voice):  “Hello”

Character:  “What’s your name?”

Siri: “My name is Siri.”

Character:  “Are you single?”

Siri: “I don’t have a marital status.”

Character:  “How about a cup of coffee?”

Siri: “I found 6 coffee shops, 3 of them near you.”

Another example, this one truer to life, was the Jeopardy contest between IBM’s Watson supercomputer and the show’s most successful contestants.  Questions were asked by host Alex Trebek, and Watson’s answers were given in staccato-sounding, computer-generated English.

Speech recognition and speech synthesis are technologies that have been studied and under development for a long time.  IBM was one of the pioneers of this research, and the company continues to pursue it in labs all over the world.  IBM groups various voice-related technologies under the umbrella phrase Human Language Technologies.  Clicking on this link will bring up a page that will direct you to a layman’s overview of IBM’s many research projects, patents and related information.

There are two main parts to the speech field as it relates to computers:  Speech Recognition and Speech Synthesis.  Beyond that, subsidiary technologies include Speech-to-Text, Natural Language-to-Formal Language Translation, Speaker Recognition and Speaker Verification.  Besides IBM, other major companies, including Microsoft and Google, are investing in speech-related research.  DARPA (the Defense Advanced Research Projects Agency) is particularly interested in recognizing speech in noisy environments, and has funded research in this field for 40 years through SRI International’s Artificial Intelligence Center.  That research formed the basis for the aforementioned Siri.  Nuance, the company known for its “Dragon” speech recognition software for PCs, does its own research and collaborates with IBM.  Nuance licenses its technology to Apple.
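To give a flavor of just one of these subsidiary technologies, here is a toy sketch of speaker verification – confirming that a voice sample really belongs to the person it claims to.  Real systems extract acoustic features from audio and use far more sophisticated models; the four-number “feature vectors” below are invented purely for illustration:

```python
import math

# Toy sketch of speaker verification: compare a voice sample's feature
# vector against the claimed speaker's enrolled template, and accept
# only if the two are similar enough.

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two feature vectors, 1.0 meaning identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(sample: list[float], template: list[float],
           threshold: float = 0.95) -> bool:
    return cosine_similarity(sample, template) >= threshold

enrolled = [0.9, 0.1, 0.4, 0.7]       # claimed speaker's stored template
genuine  = [0.88, 0.12, 0.41, 0.69]   # new sample from the same speaker
impostor = [0.1, 0.9, 0.7, 0.2]       # sample from a different speaker

print(verify(genuine, enrolled))   # True
print(verify(impostor, enrolled))  # False
```

The hard part, of course, is everything this sketch skips: turning raw, noisy audio into feature vectors that stay stable across head colds, bad microphones and background chatter – which is precisely why DARPA and the big labs keep funding the work.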

It is difficult to pinpoint the level of investment in speech-related technologies, but with big companies and government agencies heavily involved, together with the push from the hugely competitive mobile market, we will see continuing investment and great accomplishments in the years to come.

Eventually, these technologies will lead to nothing short of a computing revolution.  Chuck the keyboard, the mouse and the pad.  We were all born with the I/O of the future.



IBM Atomic Memory Breakthrough: A Computing Revolution

Today, it takes approximately 1 million atoms to store a single bit (0 or 1) of information using conventional magnetic storage technology.  Researchers at IBM’s Almaden Laboratory in San Jose, California, led by Dr. Andreas Heinrich, have accomplished the same feat with only 12 atoms!

Before we get too excited, note that this was done by reducing the temperature to near absolute zero (about -459 degrees F), which is a bit impractical for ordinary use.  Nevertheless, the researchers think that stable storage can be accomplished with as few as 150 atoms at room temperature.

If you are interested in the details of the technology, they have been published in Science, the journal of the American Association for the Advancement of Science, one of the world’s top scientific publications.  Suffice it to say that this discovery may have enormous implications for the future of computing.  Not only will the density of storage be reduced by orders of magnitude, but power requirements will follow suit.

For decades, the computer industry has followed the dictates of Moore’s Law, which says that the transistor count on integrated circuits will double every two years.  If IBM’s research becomes practical reality, Moore’s Law will go the way of the dodo.  Atomic-scale memory is 100x denser than hard disk drives, 160x denser than NAND flash chips, 417x denser than DRAM components, and 10,000x denser than SRAM chips.  This is truly a game changer.
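Just to put a number on the headline result, here is the atoms-per-bit arithmetic, using the figures quoted above:

```python
# Atoms needed to store one bit: conventional magnetic storage vs. IBM's demo
conventional_atoms_per_bit = 1_000_000
ibm_demo_atoms_per_bit = 12

improvement = conventional_atoms_per_bit / ibm_demo_atoms_per_bit
print(f"~{improvement:,.0f}x fewer atoms per bit")  # ~83,333x

# Even the projected room-temperature figure is a huge leap
room_temp_atoms_per_bit = 150
room_temp_improvement = conventional_atoms_per_bit / room_temp_atoms_per_bit
print(f"~{room_temp_improvement:,.0f}x at room temperature")  # ~6,667x
```

Even if the practical room-temperature version needs more than an order of magnitude more atoms than the cryogenic demo, it would still shrink the atoms-per-bit count by a factor of thousands.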

Practical implementation of this “nanomemory” will require the discovery of new materials that don’t presently exist.  IBM researchers think that will happen, but that it could take 5 to 10 years.  Fortunately, IBM is making a full-court press.  It has been investing upwards of $100 million per year in nanotechnology research, and intends to continue investing at that rate.

IBM has “opened its kimono” a bit on the subject.  Besides the Science article, which is geared to scientists, IBM has tried to explain what this is all about in terms most lay persons can understand.  If you are interested, go to this website and this one.

Memory Array Made up of 12 Atoms

