AI stands for “artificial intelligence.” The term was coined in 1956 by computer scientist John McCarthy, one of the field’s founders. He defined AI as the science and engineering of making intelligent computer programs that accomplish complex tasks by mimicking human learning. The concept of intelligent machines, however, can be traced back to English mathematician Alan Turing. His 1950 paper, “Computing Machinery and Intelligence,” introduced the possibility of such machines, along with a standard by which their intelligence could be measured (Haenlein & Kaplan, 2019).
Although recent AI developments have taken the headlines by storm, AI technology has been part of common, everyday apps and websites for quite some time. Did you know that artificial intelligence is built into your fitness tracker? It also powers facial recognition software, the purchase recommendations Amazon makes based on your previous orders, and the email services that flag incoming messages as spam. The companies behind these products rely on algorithms that make predictions based on data collected from the user.
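To make that last idea concrete, here is a minimal, hypothetical sketch of how a prediction from user data might work, using spam flagging as the example. It simply counts the words in messages a user has already labeled and scores new messages against those counts; it is an illustration of the general idea, not how any particular email service actually works.

```python
# Hypothetical illustration: learn word counts from messages the user has
# already labeled, then predict whether a new message looks more like spam.
from collections import Counter

def train(messages):
    """Count how often each word appears in spam and non-spam ("ham") messages."""
    counts = {"spam": Counter(), "ham": Counter()}
    for text, label in messages:
        counts[label].update(text.lower().split())
    return counts

def predict(counts, text):
    """Flag the message with whichever label's vocabulary it overlaps more."""
    words = text.lower().split()
    spam_score = sum(counts["spam"][w] for w in words)
    ham_score = sum(counts["ham"][w] for w in words)
    return "spam" if spam_score > ham_score else "ham"

# Data the user has implicitly labeled by marking past messages as spam.
history = [
    ("win a free prize now", "spam"),
    ("claim your free reward", "spam"),
    ("meeting notes from today", "ham"),
    ("lunch tomorrow with the team", "ham"),
]

model = train(history)
print(predict(model, "free prize waiting"))      # -> spam
print(predict(model, "notes from the meeting"))  # -> ham
```

Real systems use far more sophisticated models and far more data, but the basic loop is the same: collect examples from the user, learn patterns from them, and use those patterns to make predictions about new input.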
Although artificial intelligence has not yet reached human-level cognition, AI systems are evolving so quickly that it may well become a reality within decades.