AI, the new kid in class, has more LinkedIn connections than Donald H Taylor and is small enough to fit into most of our pockets. So, just how impressive is AI at learning relative to us? Well, here are 8 reasons inspired by another Donald, this time with the last name Clark.
First, I think it is essential to address the elephant in the room: AI is nowhere close to replacing the human workforce to the extent that industrialisation did, especially in L&D. For all its awesomeness, this technology is still limited to specific tasks and currently lacks the versatility humans provide. So, this is not a piece written to create panic, along the lines of “Will AI take our jobs?”. The Internet Trends 2019 slides released by Mary Meeker also support this, for those looking for empirical evidence on the state of AI in today's workforce.
Therefore, this piece aims to share some contrasts between our human capabilities and those of AI. Hopefully, this will provide some perspective of where and how this technology could best serve us in the future.
What is AI?
Before we go any further, let's take a second to consult Wikipedia on some of the definitions of AI. Yes, there is more than one 🤯 AI is a broad field of computer science, and one that keeps evolving, so I guess that makes sense 🤷🏾♂️
In computer science, artificial intelligence (AI), sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans. Leading AI textbooks define the field as the study of "intelligent agents": any device that perceives its environment and takes actions that maximise its chance of successfully achieving its goals (Poole, Mackworth & Goebel, 1998); (Nilsson, 1998). Colloquially, the term "artificial intelligence" is often used to describe machines (or computers) that mimic "cognitive" functions that humans associate with the human mind, such as "learning" and "problem-solving" (Russell & Norvig, 2003).
Without further ado…
1. Time to Educate
For most white-collar employees, or those in the knowledge market, it takes about 18 years to finish school, followed by approximately 3 to 7 years of post-secondary education depending on the field of study.
AI, on the other hand, is constrained only by how much data it is fed, so its time to learn is almost negligible in comparison to ours.
When looking at these differences, you can see it takes a while before educated people can actually learn enough to contribute to the workplace.
2. Attention and Cognitive Overload
We are easily distracted and can only take in so much information, or juggle so many tasks, before we are unable to process any more. There is vast research into, and speculation about, the attention span of humans. Gazzaley and Rosen, a neuroscientist and a psychologist respectively, explain our flawed ability to focus when multitasking in their book “The Distracted Mind”. They state that the human brain is limited in its ability to pay attention and that we don't actually multitask, but rather switch rapidly between tasks.
Contrarily, the attention span of an AI is unwavering. AIs do not experience cognitive overload, and computers have long been able to multitask using their CPUs (the brain of the computer). While they may slow down with additional tasks, this can always be solved with better algorithms or hardware (processors).
3. Memory and Forgetting
It is well understood within the L&D community how fast humans forget. In a study conducted by the University of East Anglia, researchers found that first-year university students forget up to 60% of basic concepts learned the year before. This phenomenon was well characterised by Hermann Ebbinghaus in his memory studies over 100 years ago, when he discovered the “Forgetting Curve”.
Like the internet, computers (and AI) never forget. They can store vast amounts of data locally or in cloud storage.
Although forgetting is a biological flaw of being human, all is not lost. There are various methods we can use to counteract the rate at which we forget. Spaced repetition with active recall is a widespread technique developed to combat forgetting. If this is something you might be interested in learning more about or trying, I would recommend checking out a free, open-source app called Anki. It has fantastic content on getting started and a large community of people who share useful resources for learning almost anything 🤓
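To make the idea concrete, here is a minimal sketch of the forgetting curve and of why spaced repetition helps. It assumes Ebbinghaus-style exponential decay, retention = e^(−t/S), and a toy rule (my own simplification, not Anki's actual scheduler) where each review resets retention and makes memory more stable:

```python
import math

def retention(t_days, stability):
    """Ebbinghaus-style forgetting curve: retention decays
    exponentially with time; higher 'stability' = slower forgetting."""
    return math.exp(-t_days / stability)

def spaced_reviews(days, review_days, base_stability=2.0, boost=2.0):
    """Toy spaced-repetition model: each review resets retention
    to 1.0 and multiplies stability, so we forget more slowly."""
    stability = base_stability
    last_review = 0
    curve = []
    for day in range(days + 1):
        if day in review_days and day > 0:
            stability *= boost
            last_review = day
        curve.append(retention(day - last_review, stability))
    return curve

# Without reviews, retention collapses within days;
# with reviews on days 1, 3 and 7 it stays high.
no_reviews = spaced_reviews(10, set())
with_reviews = spaced_reviews(10, {1, 3, 7})
print(f"day 10, no reviews:   {no_reviews[-1]:.2f}")
print(f"day 10, with reviews: {with_reviews[-1]:.2f}")
```

The exact numbers (stability of 2 days, doubling per review) are illustrative assumptions; the shape of the contrast is the point.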
4. Motivation
The study of human motivation is well researched and abundant. So much so that our first introduction to motivation theory can come as early as grade school, when we learn about Maslow's hierarchy of needs. With that said, I see no need to hijack this post with a literature review on the subject. Suffice it to say, the research makes it evident that how we learn is significantly influenced by our motivation, or lack thereof.
For AI, this point can be addressed relatively fast, since AI does not need any motivation to keep learning or executing tasks.
5. Bias
While empirical studies measuring the extent of our biases remain contentious, we can all agree that humans possess biases, whether based on race, gender, sexuality, etc.
So, does AI also suffer from bias? The short answer is yes, and the reason is a little unusual. If you are familiar with the concept of nature vs nurture, it may surprise you that nurture plays a significant role in how AIs learn. AIs need training data to learn, similar to how children initially absorb information from their parents and start to mimic them in one way or another. So when we teach an AI, we often use data derived from human observations or interpretations. Therefore, as freaky as this sounds: we are also the parents of AI, and our biases can sometimes be inherited by our children.
The most notable example of how AI can inherit biases comes from the American judicial system. In response to overwhelmed courts, the US turned to AI to improve the efficiency of sentencing. AI algorithms were used to determine recidivism scores—a single number estimating the likelihood that a person would re-offend. Since the algorithms were trained on historical sentencing data, they inherited any biases that data came with. This resulted in low-income minority groups receiving higher recidivism scores, and therefore more severe sentences, than higher-income Caucasian offenders convicted of more serious crimes. The MIT Technology Review covered this topic well in its article “AI is sending people to jail—and getting it wrong”, for those who want to dig deeper.
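The mechanism is simple enough to show in a few lines. Below is a deliberately naive sketch with made-up data (the groups, outcomes and counts are all hypothetical, not from the COMPAS case): a "model" that just learns the majority outcome per group can only ever reflect whatever skew its training data contains.

```python
from collections import Counter

# Hypothetical, deliberately skewed "historical" data:
# (group, outcome) pairs where group B was flagged high-risk
# far more often, regardless of actual behaviour.
training_data = (
    [("A", "low")] * 90 + [("A", "high")] * 10
    + [("B", "low")] * 40 + [("B", "high")] * 60
)

def train(data):
    """A naive model that learns the majority outcome per group;
    it can only reproduce the patterns already in the data."""
    counts = {}
    for group, outcome in data:
        counts.setdefault(group, Counter())[outcome] += 1
    return {g: c.most_common(1)[0][0] for g, c in counts.items()}

model = train(training_data)
print(model)  # the skew in the data becomes the model's "judgement"
```

Real recidivism models are far more sophisticated, but the failure mode is the same: garbage (or bias) in, bias out.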
6. Sleep
Humans need a lot of sleep to function. The average human spends approximately 26 years of their life sleeping, and an additional seven years trying to sleep. That's loads of time not spent learning.
Although our electronic counterparts do not need sleep, in the spirit of being petty ☝🏾, I will mention their reliance on electricity to function.
7. Networking
Computer networking is the ability of computers to share resources between nodes over a digital telecommunications network. In simple English, it's a way for computers to share information. While humans can also share information, we have a few constraints that computers don't:
- Our proximity to each other affects the speed of the information exchange.
- If the information is too large and has to be retained in memory, it can become inconsistent.
These limitations are practically illustrated by the children's game "broken telephone", or "viskleken" for my fellow Swedes. During the game, you learn how difficult it is to share consistent information through multiple people. In the workplace, consistent and rapid information sharing is a necessity.
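The contrast can be played out in code. This is a toy simulation (the message, error rate and chain length are all arbitrary choices of mine): a human relay garbles a few characters at every retelling, while a digital relay forwards an exact copy at every hop.

```python
import random

def human_relay(message, people, error_rate=0.1, seed=42):
    """Toy 'broken telephone': each person re-tells the message,
    randomly garbling some letters along the way."""
    rng = random.Random(seed)
    letters = "abcdefghijklmnopqrstuvwxyz"
    for _ in range(people):
        message = "".join(
            rng.choice(letters)
            if ch.isalpha() and rng.random() < error_rate
            else ch
            for ch in message
        )
    return message

def computer_relay(message, nodes):
    """Digital forwarding: every node passes on an exact copy,
    so the message arrives unchanged no matter how many hops."""
    for _ in range(nodes):
        message = str(message)  # exact copy at each hop
    return message

original = "the meeting moved to friday at noon"
print(human_relay(original, people=10))    # almost certainly garbled
print(computer_relay(original, nodes=10))  # identical to the original
```

After ten human retellings the message is, with near certainty, no longer the original; after ten network hops it is bit-for-bit identical.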
8. Lifelong learning
All life comes to an end unless it is artificial. Not only will this technology outlive us all, but it will keep getting better at learning over time. Talk about taking the concept of lifelong learning seriously.
Graduation Day - General Intelligence
Yip, it may be that AI is overhyped, but one cannot deny the potential it holds. Every day, the best researchers from Silicon Valley to Shenzhen work to improve its aptitude. It's no surprise that this wonder kid has almost every industry trying to recruit it. While it's not clear whether AI will ever stop learning, it's inevitable that graduation day will be the day it reaches general intelligence.
No, I didn't forget!
Everyone knows no article about AI is complete without a Max Tegmark quote. So, here you go 😉
“… ‘Life 1.0’: life where both the hardware and software are evolved rather than designed. You and I, on the other hand, are examples of ‘Life 2.0’: life whose hardware is evolved, but whose software is largely designed. By your software, I mean all the algorithms and knowledge that you use to process the information from your senses and decide what to do—everything from the ability to recognize your friends when you see them to your ability to walk, read, write, calculate, sing and tell jokes.”