Artificial intelligence (AI), one of 20 core technologies I identified back in 1983 as the drivers of exponential economic value creation, is rapidly working its way into our lives, from Amazon’s Alexa and Facebook’s M to Google Now and Apple’s Siri. But it’s much bigger than that. From this point forward, it would be a good idea to keep a closer eye on AI’s rapid development and look for both predictable problems and amazing opportunities.
An example of how far AI has come is the recent news that a Google supercomputer, running its advanced AI software, clinched a stunning man vs. machine victory by winning the first three games of its match against Go grandmaster Lee Sedol, one of the game’s all-time champions. For those not familiar with Go, it is a 3,000-year-old game widely considered the most complex game ever invented: it is reported to have more possible board configurations than there are atoms in the universe. Until just a few months ago, the game’s complexity led many to believe that a computer could not defeat a human grandmaster for at least another decade. Everyone was wrong!
Why is this important to you? Because game-playing is a crucial way to measure AI’s ability to execute a certain “intellectual” task better than a human. So this was a big win for AI.
How did Google’s AlphaGo program advance so much faster than many expected? First, it illustrates the power of the “Three Digital Accelerators” – the exponential growth of processing power (Moore’s Law), bandwidth, and digital storage – that I first identified back in 1983. These accelerators have finally reached a tipping point that will drive explosive growth going forward. And second, thanks to reaching that tipping point, Google’s AlphaGo program was able to partly teach itself. By playing millions of games against itself to hone its tactics through trial and error, AlphaGo learned much faster than expected.
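AlphaGo’s actual training combined deep neural networks with Monte Carlo tree search at enormous scale, but the core idea of improving through trial-and-error self-play can be illustrated on a much smaller scale. The sketch below is purely illustrative (the game, function names, and parameters are my own, not anything from AlphaGo): an agent learns a simple take-away game from scratch using tabular Q-learning, playing only against itself.

```python
import random

def train(pile_start=12, episodes=50000, alpha=0.5, eps=0.2, seed=0):
    """Learn a take-away game purely by self-play trial and error.

    Rules: two players alternate removing 1-3 stones from a pile;
    whoever takes the last stone wins. Tabular Q-learning with a
    negamax-style target (the opponent is assumed to play its own
    current best move, since both sides share the same table).
    """
    rng = random.Random(seed)

    def legal(pile):
        return [a for a in (1, 2, 3) if a <= pile]

    # Q[(pile, action)] = estimated value of taking `action` with `pile` stones left
    Q = {(p, a): 0.0 for p in range(1, pile_start + 1) for a in legal(p)}

    for _ in range(episodes):
        pile = pile_start
        while pile > 0:
            acts = legal(pile)
            # epsilon-greedy: mostly exploit the current policy, sometimes explore
            if rng.random() < eps:
                a = rng.choice(acts)
            else:
                a = max(acts, key=lambda x: Q[(pile, x)])
            nxt = pile - a
            if nxt == 0:
                target = 1.0  # we took the last stone: a win
            else:
                # the opponent moves next, so our value is minus their best value
                target = -max(Q[(nxt, b)] for b in legal(nxt))
            Q[(pile, a)] += alpha * (target - Q[(pile, a)])
            pile = nxt
    return Q

def best_move(Q, pile):
    """Greedy move from the learned table."""
    return max((a for a in (1, 2, 3) if a <= pile), key=lambda a: Q[(pile, a)])
```

After enough self-play episodes, the greedy policy rediscovers the known optimal strategy for this game (always leave the opponent a multiple of four stones) without ever being told the rules of good play – the same trial-and-error principle AlphaGo applied at vastly greater scale.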
Another AI system that has been getting a lot of press is IBM’s Watson. Watson is a cognitive computer that learns over time. This cognitive AI technology can process information much more like a smart human than a smart computer. You may recall how IBM Watson first shot to fame back in 2011 by beating two of Jeopardy’s greatest champions on TV. Since then it has been applied in an ever-growing list of fields, thanks to three unique capabilities: natural language processing, hypothesis generation and evaluation, and dynamic learning.
Today, cognitive computing is being used in a wide variety of applications including healthcare, travel, and weather forecasting. When IBM strengthened its cloud capabilities by acquiring the Weather Company, the online community and newspaper headline writers were quick to voice their amusement. However, IBM soon had the last laugh when people learned that the Weather Company’s cloud-based services easily handle over 26 million inquiries every day on its website and mobile app, and that Watson can learn not only from the daily changes in weather but also from the questions being asked.
This colossal amount of data from the fourth most-used mobile app would whet the appetite of even the permanently ravenous IBM Watson and would enable IBM to increase the level of analytics for its all-important business clients.
Weather is estimated to cost businesses $500 billion a year. Pharmaceutical companies increasingly rely on accurate forecasts to predict rising demand for allergy medication. So do farmers, whose livelihoods often depend on what Mother Nature has around the corner, not only where they grow their crops but also in the markets around the world where they sell their harvest. When coupled with the news that IBM also snapped up Merge Healthcare Inc. for a cool $1 billion in order to integrate its imaging management platform into our old friend Watson, it becomes instantly clear where Watson’s future is heading.
Analyzing in real time the online white noise that surrounds us all, and translating it into meaningful, actionable reports, is incredibly powerful.
With Watson’s learning capabilities, it’s not beyond the realm of possibility that Watson, after ingesting the entire history of scientific data and research, will come to know more about science than any individual scientist.
How about fields like auditing and accounting? A few years ago, when I was the keynote speaker at KPMG’s annual partner meeting, I suggested that they consider partnering with IBM to have Watson learn all of the global accounting regulations so that they could transform their audit and tax practice and gain a huge advantage. After doing their own research on the subject, the KPMG team just announced that they are forming an alliance with IBM’s Watson unit to develop high-tech tools for auditing, as well as for KPMG’s other lines of business.
I have also worked with Deloitte & Touche, Ernst & Young, and PricewaterhouseCoopers, and I can assure you that they are also pouring hundreds of millions of dollars into using advanced AI and analytics to make audit and tax services far more accurate and comprehensive.
So if the Big Four firms are all using advanced tools like these, where is the advantage? And how can a smaller firm gain an advantage like this? Thanks to the cloud and the virtualization of services, you don’t have to own the tools in order to have access to them. In other words, it all comes back to us humans and how creatively we use the new tools. It’s not the tool – it’s how you use it!
IBM’s Watson, along with advanced AI and analytics from Google, Facebook, and others, will mine cognitive insights and real-time advice from the ever-growing mountains of data generated by our connected world of devices, machines, and sensors (the Internet of Things), revolutionizing every industry.
Ultimately, advanced AI promises almost limitless possibilities that will enable businesses in every field to make better decisions in far less time. But at what price? Many will point to the aggressive shake-up at IBM responsible for its recent massive job cuts and suggest that technology is making much of the human race redundant.
It is crucial to recognize how the technological landscape is evolving before our eyes during this digital transformation. Yes, it is true that hundreds of traditional jobs are disappearing (leaving many out of work), but it’s also important to recognize the wealth of new roles and employment opportunities arriving that are needed to help us progress further.
The so-called rise of the machines started by removing mundane and repetitive tasks, and it is now moving into what are often referred to as white-collar jobs. The key for us humans is to go beyond just reacting to change and start getting ahead of it by paying attention to what I call the “Hard Trends” – the facts that are shaping the future – so that you can anticipate the problems and new opportunities ahead of you. Focus on excelling in the areas that computers have great difficulty with, including collaboration, communication, problem solving, and much more.
Making yourself increasingly valuable and relevant in the workplace will require you to learn new things on an ongoing basis as well as unlearn the old ways that are now holding you back. Remember, we live in a world filled with technology, but we live in a human world where relationships are all-important.
We need to become aware of the new tools available to us, and then creatively apply them to turn the impossible into the possible. By acquiring new knowledge, developing your creativity and problem-solving skills, and honing your interpersonal, social, and communication skills regardless of your age, you can thrive in a world of transformational change.
There is an old saying: You can’t teach an old dog new tricks. The good news is, we aren’t dogs!
©2015 Burrus Research, Inc. All Rights Reserved. Used with permission.
DANIEL BURRUS is considered one of the world’s leading technology forecasters and innovation experts, and is the founder and CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology-driven trends to help clients understand how technological, social, and business forces are converging to create enormous untapped opportunities. He is the author of six books, including the New York Times bestseller Flash Foresight.