To hear technologists tell it, artificial intelligence (AI) is just around the corner, soon to do humankind’s bidding, and we already have an example of how this newfangled way of mixing thinking and doing will affect us. Siri.
Alright, I’m stretching it a bit, but Siri can be considered one of many spokespersons for how we’ll use AI in the future to communicate with devices and systems. Siri isn’t all that bright. Yet. Some say Google Assistant is the smartest of the bunch, perhaps Alexa is close behind, and certainly Microsoft’s Cortana is up there, but to be honest about this whole AI future, I think I like Apple’s artificial intelligence, as embodied by Siri in devices that integrate well with one another, better.
Cliff Kuang asked one of the best questions in “Can A.I. Be Taught To Explain Itself?” and adjusted what we think we know about AI.
“Artificial intelligence” is a misnomer, an airy and evocative term that can be shaded with whatever notions we might have about what “intelligence” is in the first place. Researchers today prefer the term “machine learning,” which better describes what makes such algorithms powerful.
OK, so now it’s machine learning. Fair enough. But can machines learn by themselves, or, like Siri, do they need some kind of human interaction to start and keep the learning process going? And where does all the information come from that Siri, or any interface to a machine that sifts through mountains of data, relies on, and who decides how it should be used?
For example, can I ask Siri if I qualify for a car or home loan? Yes. But Siri does not have enough information to provide a qualified response.
Let’s say that a computer program is deciding whether to give you a loan. It might start by comparing the loan amount with your income; then it might look at your credit history, marital status or age; then it might consider any number of other data points. After exhausting this “decision tree” of possible variables, the computer will spit out a decision.
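The “decision tree” Kuang describes can be sketched in a few lines of code. Everything here is hypothetical: the field names, the thresholds, and the order of checks are invented for illustration, not taken from any real lender’s rules.

```python
# A minimal sketch of the hand-coded "decision tree" described above.
# All thresholds and fields are hypothetical, chosen only for illustration.

def loan_decision(amount, income, credit_score, age):
    """Walk a fixed decision tree and return 'approve' or 'deny'."""
    if amount > income * 5:       # first, compare loan amount with income
        return "deny"
    if credit_score < 600:        # then look at credit history
        return "deny"
    if age < 18:                  # then any number of other data points
        return "deny"
    return "approve"              # exhausted the tree: spit out a decision

print(loan_decision(amount=20_000, income=50_000, credit_score=700, age=35))
# → approve
```

The point of the quoted passage is that each branch is explicit: a human wrote every threshold, so a human can explain every decision. That is exactly what stops being true once the machine tunes the thresholds itself.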
This is where it gets interesting. Machine learning seems like a softer, less invasive, not-quite-so-scary term than artificial intelligence. That may explain why Apple, Google, Amazon, Microsoft, and others are giving their AI and machine learning efforts a human-like persona; one that appears willing and able to learn, but not too much too quickly.
Siri knows the weather. Siri knows how the nearby Tampa Bay Buccaneers did in their most recent game, and over time may understand that I have interests here but not interests there. I’m not so sure that’s learning, per se, but more just a good associative memory.
If the program were built with only a few examples to reason from, it probably wouldn’t be very accurate. But given millions of cases to consider, along with their various outcomes, a machine-learning algorithm could tweak itself — figuring out when to, say, give more weight to age and less to income — until it is able to handle a range of novel situations and reliably predict how likely each loan is to default.
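That “tweak itself” step, adjusting how much weight to give age versus income, is the heart of machine learning, and it can be sketched with a toy model. The data below is synthetic and the default rule is invented; a simple logistic regression trained by gradient descent stands in for whatever a real lender would use.

```python
import math
import random

random.seed(0)

# Hypothetical synthetic loan outcomes: features are scaled age and income,
# and a borrower "defaults" (label 1) when the two together are low.
def make_example():
    age = random.uniform(18, 70) / 70
    income = random.uniform(20_000, 120_000) / 120_000
    default = 1 if age + income < 0.9 else 0
    return (age, income), default

data = [make_example() for _ in range(1_000)]

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1 / (1 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1 + ez)

# Logistic regression trained by stochastic gradient descent: the model
# "tweaks itself," adjusting how much weight to give age vs. income
# until its predictions fit the observed outcomes.
w_age, w_income, bias = 0.0, 0.0, 0.0
lr = 0.5
for _ in range(50):
    for (age, income), y in data:
        p = sigmoid(w_age * age + w_income * income + bias)
        err = p - y
        w_age -= lr * err * age
        w_income -= lr * err * income
        bias -= lr * err

correct = sum(
    (sigmoid(w_age * a + w_income * i + bias) > 0.5) == bool(y)
    for (a, i), y in data
)
print(f"weights: age={w_age:.2f}, income={w_income:.2f}; "
      f"training accuracy: {correct / len(data):.2f}")
```

Notice that nobody typed in the learned weights; they fell out of a million tiny corrections. That is why the resulting model can predict reliably yet struggle to explain itself in the way the hand-built tree above could.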
You can almost see Siri ready to build some kind of personal database about each of us, fully programmed to segregate what it stores and learns about us from other users. But note that Apple keeps Siri’s knowledge generalized rather than giving Siri access to information that is personal to each of us.
I like that approach, but I suspect it’s merely the first step on a long road: first toward Siri being able to explain, in detail, what it actually does and why, and then toward Siri learning about us without our giving any additional voice-to-voice (or face-to-face; Apple must be working on an animated Animoji Siri interface) information.
Machine learning isn’t just one technique. It encompasses entire families of them, from “boosted decision trees,” which allow an algorithm to change the weighting it gives to each data point, to “random forests,” which average together many thousands of randomly generated decision trees. The sheer proliferation of different techniques, none of them obviously better than the others, can leave researchers flummoxed over which one to choose.
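The “random forest” idea from the quote, averaging together many randomly generated trees, can be caricatured in plain Python. This is a deliberately crude sketch: real random forests use bootstrap samples and greedy splits, while here each “tree” is just a one-level stump with a randomly chosen feature and threshold, and the forest is their majority vote. The loan data and its default rule are invented.

```python
import random

random.seed(1)

# Hypothetical 2-feature loan data in [0, 1] x [0, 1]; label 1 = default.
def label(age, income):
    return 1 if age + income < 0.9 else 0

points = [(random.random(), random.random()) for _ in range(500)]
data = [((a, i), label(a, i)) for a, i in points]

# Each "tree" here is a randomly generated stump: pick a feature and a
# threshold at random, then set each leaf to the majority label it sees.
def random_stump(train):
    feat = random.randrange(2)
    thresh = random.random()
    left = [y for (x, y) in train if x[feat] < thresh]
    right = [y for (x, y) in train if x[feat] >= thresh]
    left_label = round(sum(left) / len(left)) if left else 0
    right_label = round(sum(right) / len(right)) if right else 0
    return lambda x: left_label if x[feat] < thresh else right_label

forest = [random_stump(data) for _ in range(1_000)]

# "Averaging together many thousands of randomly generated decision trees":
# the forest's prediction is the majority vote of its stumps.
def predict(x):
    votes = sum(tree(x) for tree in forest)
    return 1 if votes > len(forest) / 2 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(f"forest accuracy on training data: {accuracy:.2f}")
```

No single stump is any good on its own; the averaging is what produces a usable prediction. Which also illustrates Kuang’s explainability problem: there is no one branch you can point to and say “that’s why you were denied.”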
For now, whatever our AI or machine learning friends learn is based upon what programmers can pull from collected data. The less data collected, the less smarts AI and machine learning faces like Siri, Alexa, or Google’s poorly named Assistant will have. I want Siri to be smarter, yes, but not too smart too quickly, and not to go around stealing information about me the way Alexa and Google Assistant already do.
I like Apple’s artificial intelligence operation better.