Thanks to science fiction, many people probably feel like they have a good grasp on artificial intelligence. The image that comes to mind is some kind of computer, maybe in the form of a robot, that can communicate and make decisions just like a human. Thanks to modern advances, certain technologies on the market today bear some resemblance to this imagined ideal, making it seem as though AI will soon be able to perfectly replicate human learning and thought processes.
The reality is a little different. For starters, artificial intelligence can be thought of as a very broad spectrum of capability. If the basic definition is “computing that mimics human intelligence,” then most of computing history is actually the story of artificial intelligence evolving, and many of today’s standard IT practices were yesterday’s cutting-edge AI. Second, there is a significant difference between human-like behavior in a specific use case and the ability to exhibit that behavior across a wide variety of situations. By understanding what AI fundamentally is and how today’s AI differs from previous models, people can be better informed about how this trend will fit into business and society.