Fast Company

The Future of Work

IBM's $3 Billion Investment In Synthetic Brains And Quantum Computing

IBM thinks the future belongs to computers that mimic the human brain and use quantum physics...and it's betting $3 billion on it.

IBM is unveiling a massive $3 billion research and development round on Wednesday, investing in weird, science fiction-like technologies—and, in the process, essentially staking Big Blue’s long-term survival on big data and cognitive computing.

Over the next five years, IBM will invest a significant amount of its total revenue in technologies like non-silicon computer chips, quantum computing research, and computers that mimic the human brain.

The $3 billion funding round will go toward a variety of projects designed to catapult semiconductor manufacturing past what IBM physical sciences director Supratik Guha calls the "end of silicon scaling" in microchips. Essentially, IBM believes there will come a point in the medium-term future when microchips will no longer be made of silicon, because other materials will allow for faster and more complex computation. In a telephone conversation, Guha told Fast Company that his company sees an end to silicon scaling within the next three to four tech generations.

The new R&D initiatives fall into two categories: developing nanotech components for silicon chips for big data and cloud systems, and experimenting with "post-silicon" microchips. The latter will include research into quantum computers, which operate on quantum bits rather than classical binary code; neurosynaptic computers, which mimic the behavior of living brains; carbon nanotubes; graphene tools; and a variety of other technologies.
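To make the binary contrast concrete, here is a toy Python sketch (a standard textbook illustration, not IBM's technology) of why a quantum bit isn't simply a 0 or a 1: a qubit is described by two amplitudes, and a gate like the Hadamard can put it into an equal superposition of both values at once.

```python
import math

def hadamard(a, b):
    """Apply the Hadamard gate to a qubit with amplitudes (a, b).

    A classical bit is either 0 or 1; a qubit's state is a pair of
    amplitudes where |a|^2 + |b|^2 = 1, and measuring it yields 0 with
    probability |a|^2 and 1 with probability |b|^2.
    """
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

# Start in the definite classical state |0>, i.e. amplitudes (1, 0)...
a, b = hadamard(1.0, 0.0)

# ...and the qubit is now equally likely to be measured as 0 or 1.
print(round(abs(a) ** 2, 3), round(abs(b) ** 2, 3))  # prints: 0.5 0.5
```

A real quantum computer exploits many such superposed qubits interfering with each other, which is what lets it explore certain problem spaces in ways a binary machine cannot.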

IBM’s investment is one of the largest in quantum computing to date; the company is one of the biggest researchers in the field, along with D-Wave, a Canadian company partnering with Google and NASA to develop quantum computing systems.

The news of the funding round is surprising to some IBM watchers: for the past year or so, rumors have flown that IBM was considering an exit from the microchip business. The investment doesn't contradict those rumors so much as reframe them. It still looks likely that IBM will sell its chip manufacturing unit (the New York Times cites GlobalFoundries as a likely buyer), but the spending signals a shift in strategy: IBM sees a new way to make money from chips, through long-term returns on valuable, potentially lucrative patents and intellectual property.

"The point of this announcement is to underscore our commitment to the future of computing," Guha told Fast Company. "As you probably know, silicon technology has taken us a long way. A lot of stuff you see around you is a result of our ability to scale silicon tech, but the community at large realizes the end of silicon scaling is coming. However, performance scaling in computer system will continue in various ways; our R&D efforts are focused on different ways and means by which we do so."

Of all the investments announced in the round, neurosynaptic chips are the most novel. These are essentially low-power microchips designed to mimic the behavior of the human brain, and IBM has been researching the feasibility of such brain-like technology for years. IBM is believed to be building a new programming language around the chips, which will be used for machine learning and cognitive computing systems like Watson. Proof-of-concept neurosynaptic computing projects IBM has announced previously include oral thermometers that identify bacteria by their odor and "conversation flowers" placed on tables that automatically identify speakers by voice and generate real-time transcripts of conversations, potentially rendering transcriptionists obsolete.

[Image: Flickr user Sarah]