This article is authored by Kimberly Chulis, Director of the Business Analytics Certificate program at The University of Chicago Graham School. She is an applied advanced analytics leader with more than 20 years of experience at Fortune 100 companies and high-tech firms, and has taught for institutions including Purdue University and Northwestern University. In her current capacity, she is also an instructor for the Postgraduate Program in Data Science & Machine Learning (PGPDM).
The advent of Artificial Intelligence, or AI as it is more commonly called, has been accompanied to a large extent by high-profile predictions of pessimism and doom. We read that the age of automation will mean no worker is safe from replacement by a machine and that robots are coming for our jobs. There are even survival guides warning us to put strategies in place to profit rather than perish when the inevitable happens. We hear from tech moguls such as Elon Musk, who publicly critiques AI and warns government officials that AI poses a greater disruptive risk to the world than a North Korean nuclear attack.
Other members of the advanced analytics community take a different view. Artificial Intelligence is one of the hottest topics on the data science radar. AI, combined with IoT, cloud solutions, deep learning, GPU processing, and image and video analytics, commands global attention. Speaking as a practitioner of data science and advanced analytics for over two decades, it has been fascinating to participate firsthand in the evolution of AI. Most companies today are starting to explore how they can use AI solutions to automate processes that until recently relied on more manual machine learning approaches. In fact, with so many advancements over the past decade, it is difficult to pinpoint exactly how and when data science surged in general popularity.
For decades, statisticians worked outside the spotlight, performing important but somewhat mundane analyses to drive insights and forecasts for industries and enterprises. Then, in the space of a few years, the same work, with a slightly different twist and a focus on bigger volumes and more complex varieties of data (as technology advanced to the point where storing and managing it became cost-effective and democratized), became what Tom Davenport and DJ Patil dubbed in 2012 the ‘Sexiest Job of the 21st Century’.
In parallel with the emergence and adoption of various types of NoSQL databases, a pivotal development was the introduction of Spark. Spark’s arrival heralded a migration away from the dramatically slower MapReduce, making it possible to harness sensor and streaming data in an affordable, secure cloud environment and to apply what has become a new paradigm for enterprises seeking to execute on their IoT and AI strategies.
Tech companies such as Google and Yahoo were incubators for what, a few short years later, would become open source communities generating freely available platforms and file systems built to manage petabytes of data in real time, supporting streaming analytics, natural language processing, and machine learning on data of all varieties, structured and unstructured. A recent list of Chicago startups to watch in 2018 features heavy representation by analytics, machine learning, and artificial intelligence vendors across industries.
The pace of growth in the technology space around analytics seems to move at the speed of light. The advancements are so rapid that entire leadership and project management methodologies have had to shift to agile hybrids in an attempt to harness these new platforms. It is de rigueur today for enterprises to rent cloud storage and computing power on a pay-as-you-go basis, without committing to longer-term on-premises infrastructure and licenses. All of these factors and forces have paved the way to the AI revolution whose nascent stages we are witnessing today.
How much promise, or threat, does widespread adoption of AI hold for all of us? How is AI different from IoT? Which industries stand to be impacted the most by AI? Which jobs stand to be replaced? Will industry move toward total eventual automation, and how would this affect our collective well-being, sense of purpose and identity, stability, happiness, security, professions, and quality of life?
To address some of these questions, I searched for what notable experts in the industry are predicting about AI. In particular, a collection of recent YouTube commentaries by tech luminaries helps us understand how pervasively and rapidly we can expect AI and analytic automation to enter our lives and businesses. To start, none of this is entirely new; rather, it is a continuation of the analytics evolution. Technology that collects volumes of data into useful panel or time series data sets has been around for a while. We have been generating these data in our automobiles and in the elevators we take to our offices, and they form the basis for the recommendations our personal apps provide on traffic route optimization, where to dine, and whose updates we are likely to find most captivating on Facebook. Most of us are by now familiar and comfortable with asking Siri for the time, or asking Alexa to set an alarm for the next morning or tell us the weather forecast. By and large, human beings are adaptive; once the initial frustration and learning curve are overcome, we become quite comfortable with, and later completely reliant on, the new functionality.
Overwhelmingly, almost every expert review was positive. Our tech leaders, those guiding the largest and most promising technology firms and ventures in the world, are generally extremely positive about the enormous benefits humanity will reap from machine learning, AI, and the automation of business intelligence in industries such as healthcare, finance, agriculture, and education. We will move to an individualized level of precision medicine, where the prescription, robot-assisted surgery, or gene-editing therapy associated with the highest likelihood of success is selected for each of us based on a myriad of data points reflecting our age, genomic information, diet and exercise, and bioinformatics and clinical data.
Bill Gates discusses developments in AI in terms of digital advancement, and illustrates the impact this will have on the global financial industry in developing countries, where mobile phones in rural locations eliminate the need for physical bank branches. He describes how phones will become the equivalent of debit cards, supporting banking for the poor through microfinance options such as M-Pesa in Kenya or bKash in Bangladesh. He talks about how drought- and disease-resistant crops will become abundant as precision agriculture, through the use of drones, detects real-time soil, crop, weather, and moisture changes and ties them to automated prescriptive interventions that ensure the highest and most prolific crop yield possible. Students will benefit from precision educational strategies, matched with the instructors, materials, and techniques that maximize their learning potential. Aspects of the student experience will be automatically measured through speech, text, mood, sentiment, and other indicators that gauge ongoing performance, satisfaction, and achievement in classes.
Social media sites will continue to automatically detect and block communications from terrorist groups, as they are doing now with ISIS communications on Facebook, Twitter, and YouTube. These automated systems identify and remove most such content, often before users can even flag the posts for removal. And let’s not forget how AI is being used within analytics itself: in data quality assessment, data cleansing, and model selection, and in monitoring for degradation in deployed machine learning and predictive models.
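As an illustrative sketch of what monitoring a deployed model for degradation can look like (this is a generic technique, not any specific vendor’s system, and all names here are hypothetical), one common approach is to compare the distribution of an incoming feature against its training-time baseline using a population stability index (PSI):

```python
import math

def population_stability_index(expected, actual, bins=10):
    """Bin the expected (training) sample into equal-width bins and
    measure how far the actual (production) sample's bin proportions
    have drifted from the baseline."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0

    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            i = min(max(int((x - lo) / width), 0), bins - 1)
            counts[i] += 1
        # Smooth empty bins so the log term stays defined.
        return [(c + 0.5) / (len(sample) + 0.5 * bins) for c in counts]

    e, a = proportions(expected), proportions(actual)
    return sum((ap - ep) * math.log(ap / ep) for ep, ap in zip(e, a))

# A common rule of thumb: PSI below 0.1 means little shift; above
# roughly 0.25 the feature has drifted enough to warrant retraining.
train = [0.1 * i for i in range(100)]             # baseline feature values
live_ok = [0.1 * i for i in range(100)]           # unchanged distribution
live_shifted = [0.1 * i + 5 for i in range(100)]  # distribution has moved

print(population_stability_index(train, live_ok))       # near zero
print(population_stability_index(train, live_shifted))  # well above 0.25
```

In practice a monitoring job would compute this per feature on a schedule and raise an alert, or trigger retraining, when the index crosses a chosen threshold.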
Healthcare represents an opportunity for widespread global social welfare impact. It is one of the industries still most underserved by analytics and IoT, in part because of barriers such as heavy data privacy regulation, an archaic and disjointed landscape of legacy databases, and a cultural shift among providers and patients that must still take hold before the benefits AI can deliver on the precision medicine front are fully realized.
The ability to process and make sense of staggering amounts of data of all types holds tremendous potential in every industry. Most companies are still very early on the AI maturity path, at a stage where CIOs, strategists, and practitioners are beginning to grasp the possibilities and to consider applications they might execute as siloed, use-case-based projects.
A slightly different view of what is coming with AI is put forth in technology analyst forecasts. According to the analyst firm Gartner, by 2020 at least 20% of individuals in developed nations will rely on AI in the form of virtual assistants to help with tasks, and by 2022, 40% of government workers and client-facing workers will rely on AI support to make decisions and drive processes.
The other major trend Gartner predicts for this timeframe is in the realm of fake news. AI-generated false information will proliferate to the point that by 2020 consumers will be exposed to more false content than true. This trend will breed general distrust of digital content, as the AI systems creating such content outpace the ability of AI to detect fraudulent digital information. Over 85% of CIOs will introduce AI pilot programs, and there will be growing demand for transparent AI systems.
Finally, as Director of the Business Analytics Certificate, as an instructor in the PGPDM teaching Analytics Management, Project Planning, Leadership and Entrepreneurship, and as a longtime practitioner in the dynamic data science space, I have seen a kaleidoscope of analytics and AI applications. These include systems designed to optimize farm-to-table local food supply chain networks; frameworks for optimizing and automating trucking routes, which cut costs and waste while increasing driver satisfaction and tenure; systems rolled out in ICUs that dramatically reduce surgical site infections through predictive analytics; and automated predictive systems that detect critical events in emergency rooms in real time and triage messages to resource management systems, enabling a proactive response to the fluctuating risk levels in the patient mix.
Other interesting projects I have seen include cryptocurrency forecasts for investors rooted in social media sentiment, pitches to oil and gas companies on machine-learning-based predictive maintenance for highly specialized equipment, and, of course, the precision recommendations each of us encounters on Netflix and in our online shopping experiences on Amazon. My dissertation research surfaced studies finding that public health epidemics can be identified from Twitter data up to two weeks earlier than they become visible in CDC data. It seems there is no limit to the applications of automation and analytics.
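The leading-indicator idea behind both the sentiment-driven forecasts and the Twitter-ahead-of-CDC finding can be sketched simply: compute the correlation between a social signal today and the target series some number of periods later, and look for the lag where that correlation peaks. The toy series below are synthetic, purely for illustration:

```python
import math

def lagged_correlation(signal, target, lag):
    """Pearson correlation between signal[t] and target[t + lag]; a peak
    at a positive lag suggests the signal leads the target series."""
    xs = signal[:len(signal) - lag]
    ys = target[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: the "target" (e.g. case counts or price) is the sentiment
# series delayed by 2 periods, so the correlation should peak at lag=2.
sentiment = [math.sin(t / 3) for t in range(60)]
target = [0.0, 0.0] + sentiment[:-2]

best_lag = max(range(5), key=lambda k: lagged_correlation(sentiment, target, k))
print(best_lag)  # 2
```

A real application would of course add sentiment scoring of the raw posts, significance testing, and controls for confounders; this only illustrates why a social signal that leads its target by a stable lag is useful for early warning.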
The general consensus, however, is that we are still very early in the process of adapting to AI, and we do not yet have enough practice at it. There are enough barriers and constraints in place that wide-scale adoption and full automation are still closer to a vision than an imminent reality.
First of all, there is a talent shortage in the AI space, just as there was six years ago in the data scientist talent pool, when McKinsey’s reporting called attention to the looming shortage of analytics talent. The International Institute for Analytics, in its predictions-for-2018 presentation, estimates that there are fewer than 1,000 AI specialists globally.
One of the best responses to the expected impact of AI on humankind came from Michael Levitt of Stanford University, the 2013 Nobel Laureate in Chemistry, in a panel session where each participant gave insightful views on the topic. Levitt sums it up: ‘Machinery doesn’t make people lazy. It just redirects their energy, usually in more productive ways.’