AI in 2019: Two Experts Weigh In
- By John K. Waters
If 2018 was the year "AI" found its way into the popular lexicon (and the ever-accelerating swirl of anxiety-producing tech buzzwords), 2019 looks to be the year AI, machine learning (ML) and deep learning (DL) are on everybody's mind.
Author, speaker and all-around tech maven Ahmed Banafa, who serves on the general engineering faculty at San Jose State University in San Jose, Calif., sees two especially noteworthy trends surfacing in the coming year: one good and more or less expected; one not so good.
First, the good: AI/ML and IoT are going to get close in 2019. The former will help the latter with data analysis in a range of implementations, Banafa explained in an email. He offered a few examples:
- Data Preparation: Defining pools of data and cleaning them, which will take us to concepts like Dark Data and Data Lakes.
- Data Discovery: Finding useful data in defined pools of data.
- Visualization of Streaming Data: Handling streaming data on the fly by defining, discovering and visualizing it in smart ways, so decisions can be made without delay.
- Time Series Accuracy of Data: Maintaining a high level of confidence in collected data by ensuring its accuracy and integrity over time.
- Predictive and Advanced Analytics: Making decisions based on data collected, discovered and analyzed.
- Real-Time Geospatial and Location (Logistical Data): Keeping the flow of logistical data smooth and under control.
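The streaming-data and time-series items above can be made concrete with a minimal sketch (not from Banafa): a rolling z-score check that flags anomalous readings in a simulated IoT sensor stream on the fly. The sensor values, window size and threshold here are illustrative assumptions:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=10, threshold=3.0):
    """Flag readings that deviate from a rolling-window mean by more
    than `threshold` standard deviations (a simple z-score check)."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies

# Simulated temperature stream with one faulty spike at index 11
stream = [20.0, 20.1, 19.9, 20.2, 20.0, 19.8, 20.1, 20.0, 19.9, 20.1,
          20.0, 45.0, 20.1, 19.9]
print(detect_anomalies(stream))  # → [(11, 45.0)]
```

Because the window only keeps recent readings, the check adapts as the stream drifts -- which is the "on-the-fly" quality Banafa's list describes.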
Among other things, this integration will support intelligent speakers and cameras (Google Duplex, for example), which will serve as hubs for IoT devices. It will also support the striking rise of the Voice User Interface (VUI) in the coming year.
Now, the not-so-good: Expect to see more AI-powered cybersecurity attacks in 2019. Banafa cited a study published by security firm Webroot in 2017, which found that, although AI is used by approximately 87 percent of U.S. cybersecurity professionals, 91 percent of those security pros are concerned that hackers will use AI to launch even more sophisticated cyberattacks. For example, AI can be used to automate the collection of certain information -- say, info relating to a specific organization -- which may be sourced from support forums, code repositories, social media platforms and more. Additionally, AI may be able to assist hackers when it comes to cracking passwords by narrowing down the number of probable passwords based on geography, demographics and other such factors.
Siddhartha Agarwal, Oracle's VP of product management and strategy, usually has a savvy prediction or two at this time of year. (There's a reason I once dubbed him "Oracle's Oracle.") This year was no exception; he sent me his latest list. But two in particular stand out for AI/ML-watchers: "The button disappears: AI becomes the app interface;" and "Machine learning takes a leap ahead in practical, domain-specific uses."
"AI becomes the UI, meaning that the pull-based/request-response model of using apps and services gradually disappears," Agarwal wrote. "Smartphones are still low IQ, because for the most part you have to pick them up, launch them and ask something, and then get a response back. In better-designed apps, however, the app initiates interactions via push notifications. Let's take this a step further so that an app, bot, or a virtual personal assistant using artificial intelligence will know what to do, when, why, where and how. And just do it."
He offered two examples:
- Consider an expense approvals app that watches your pattern of approving, so that it can eventually auto-approve 99 percent of expense reports and flag only the rare one that genuinely requires your attention.
- Think of business users who do analytics every day. What if the analytics app could each day provide a new insight that an analyst might not have thought of, by considering the data she's been looking at, the past set of questions she asked and even what other analysts are asking?
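Agarwal doesn't specify an implementation, but the expense-approvals example can be sketched with a toy rule learner: auto-approve a report when its amount falls within what the user has historically approved for that category, and flag everything else. The categories, amounts and threshold logic below are invented purely for illustration:

```python
from collections import defaultdict

def learn_thresholds(history):
    """Learn, per expense category, the largest amount the user has
    previously approved; these become auto-approval thresholds."""
    thresholds = defaultdict(float)
    for category, amount, approved in history:
        if approved:
            thresholds[category] = max(thresholds[category], amount)
    return dict(thresholds)

def triage(report, thresholds):
    """Auto-approve a report within the learned threshold for its
    category; otherwise flag it for human review."""
    category, amount = report
    if amount <= thresholds.get(category, 0.0):
        return "auto-approved"
    return "needs review"

# Hypothetical approval history: (category, amount, was_approved)
history = [("travel", 450.0, True), ("travel", 1200.0, True),
           ("meals", 60.0, True), ("meals", 300.0, False)]
rules = learn_thresholds(history)
print(triage(("travel", 800.0), rules))  # within past approvals
print(triage(("meals", 250.0), rules))   # above anything approved before
```

A production system would use a real classifier rather than a max-per-category rule, but the shape is the same: learn from past decisions, act on the routine cases, surface only the exceptions.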
"Developers need to figure out what data is really important to their business application," he added, "how to watch and learn from transactions, what business decisions would most benefit from this kind of proactive AI, and start experimenting. Embedded AI can predict what you need, deliver info and functionality via the right medium at the right place and time, including before you need it, and automate many tasks you do manually today."
When it comes to machine learning, Agarwal wrote, the advances we'll see in 2019 are part of a logical evolution of that technology. "The most valuable data comes with context," he said, "what you've done before, what questions you've asked, what other people are doing, what's normal versus odd activity. And the best understanding comes from the depth of data in domain-specific use cases, such as manufacturing, marketing campaigns, e-commerce sites, or IT operations center.
"At the same time, the volume of data being created in those scenarios scales beyond human capacity to understand or to react in real time. So, look for machine learning to take a big role this coming year in these domain-specific data challenges. On a factory floor, for example, expect Internet-of-things applications to deliver on the long-held promise of preventive maintenance -- predicting from operating data when a machine will fail. In marketing, apps will leverage all online and offline transactions along with social interactions to recommend which campaign to promote to which segment of customers, and when. Machine learning doesn't eliminate human judgment, but it will challenge decision makers in 2019 to consider ideas they wouldn't have before."
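As a rough illustration of the preventive-maintenance idea (not any specific vendor's approach), here's a toy risk score that flags a machine for maintenance when recent vibration and temperature readings trend toward their operating limits. The limits, window size and alert threshold are all hypothetical:

```python
def failure_risk(vibration_history, temp_history,
                 vib_limit=7.0, temp_limit=85.0):
    """Crude preventive-maintenance heuristic: the score rises as the
    average of the last five vibration and temperature readings
    approaches each signal's assumed operating limit."""
    recent_vib = sum(vibration_history[-5:]) / min(5, len(vibration_history))
    recent_temp = sum(temp_history[-5:]) / min(5, len(temp_history))
    # Normalize each signal against its limit and average the two
    score = 0.5 * (recent_vib / vib_limit) + 0.5 * (recent_temp / temp_limit)
    return min(score, 1.0)

def maintenance_due(vib, temp, threshold=0.9):
    return failure_risk(vib, temp) >= threshold

healthy = maintenance_due([2.0] * 10, [60.0] * 10)
worn = maintenance_due([2.0, 3.5, 5.0, 6.5, 6.8, 6.9],
                       [70.0, 78.0, 82.0, 84.0, 86.0, 88.0])
print(healthy, worn)  # → False True
```

Real deployments learn failure signatures from labeled run-to-failure data rather than fixed limits, but the payoff Agarwal describes is the same: scheduling maintenance before the breakdown, not after.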
John has been covering the high-tech beat from Silicon Valley and the San Francisco Bay Area for nearly two decades. He serves as Editor-at-Large for Application Development Trends (www.ADTMag.com) and contributes regularly to Redmond Magazine, The Technology Horizons in Education Journal, and Campus Technology. He is the author of more than a dozen books, including The Everything Guide to Social Media; The Everything Computer Book; Blobitecture: Waveform Architecture and Digital Design; John Chambers and the Cisco Way; and Diablo: The Official Strategy Guide.