Artificial intelligence is usually associated with the future, but as a scientific concept it dates back roughly to World War II. Since then, the field has gone through periodic booms and, more often, busts. Now, after decades of ups and downs, AI is getting smarter. The credit goes to advances in machine learning and deep learning, along with increasingly powerful chips, an explosion in available data, and supercomputers capable of analyzing it.


[Infographic: AI timeline]

Machine learning is a field of study that gives computers the ability to learn without being explicitly programmed, a definition usually credited to computing pioneer Arthur Samuel. It enables features such as Amazon’s recommendation engine.
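
To make “learning without being explicitly programmed” concrete, here is a minimal sketch in the spirit of a recommendation feature. This is not Amazon’s actual algorithm; the product data and the `recommend` helper are invented for illustration. Rather than hand-coding rules such as “if a customer buys a tent, suggest a sleeping bag,” the program counts which items co-occur in past purchases and lets the data drive the suggestions:

```python
# Learning from data instead of hand-coded rules: the suggestions
# below come entirely from the (invented) purchase history.
from collections import Counter

purchases = [
    {"tent", "sleeping bag", "lantern"},
    {"tent", "sleeping bag"},
    {"tent", "hiking boots"},
    {"lantern", "batteries"},
]

def recommend(item, history, top_n=2):
    """Suggest the items that most often appear alongside `item`."""
    co_occurrences = Counter()
    for basket in history:
        if item in basket:
            co_occurrences.update(basket - {item})
    return [product for product, _ in co_occurrences.most_common(top_n)]

print(recommend("tent", purchases))  # ['sleeping bag', ...]
```

Add more purchase records and the suggestions change on their own; no programmer has to update any rules.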

Machine learning is related to data mining: both comb through data to detect patterns. But machine learning goes a step further, using those patterns to adjust the program’s own actions; in effect, the machine draws inferences from the data. Facebook’s News Feed, for example, uses machine learning to personalize each member’s feed, adapting as it is exposed to new data.
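
The “adjust program actions when exposed to new data” step can be sketched as a simple online learner. This is not how Facebook’s News Feed works internally; it is a hypothetical feed ranker whose weights shift a little every time a user clicks or skips a story. The features, observations, and learning rate are all illustrative assumptions:

```python
# A toy online learner: score stories with a weighted sum of features,
# then nudge the weights toward each observed click (1) or skip (0).

def predict(weights, features):
    """Score a story; higher means 'more likely to be clicked'."""
    return sum(w * f for w, f in zip(weights, features))

def update(weights, features, clicked, learning_rate=0.1):
    """Shift each weight in proportion to the prediction error."""
    error = (1.0 if clicked else 0.0) - predict(weights, features)
    return [w + learning_rate * error * f for w, f in zip(weights, features)]

# Features per story: [is_from_close_friend, has_photo, is_news_article]
weights = [0.0, 0.0, 0.0]
observations = [
    ([1, 1, 0], True),   # friend's photo: clicked
    ([0, 0, 1], False),  # news article: skipped
    ([1, 0, 0], True),   # friend's text post: clicked
]
for features, clicked in observations:
    weights = update(weights, features, clicked)

print(weights)  # the largest weight now favors posts from close friends
```

Each new observation is an exposure to new data, and the model’s behavior (which stories it ranks highest) changes accordingly.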

Deep learning is a type of machine learning built on artificial neural networks that loosely mimic how our brains work. The machine processes data in layers, with each successive layer analyzing the previous layer’s output at a higher level of complexity and abstraction. Deep learning requires tremendous amounts of data and processing power, neither of which was widely available until the era of big data and cloud computing. Now, thanks to deep learning, AI is moving closer to delivering the human-level capabilities envisioned decades ago.
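
The “thinking in layers” idea can be sketched as a forward pass through a tiny two-layer network: each layer transforms the previous layer’s output, so later layers operate on progressively more abstract representations. This is a minimal sketch with made-up weights and no training step; real networks learn their weights from data:

```python
import math

def layer(inputs, weights, biases):
    """One layer: weighted sums of the previous layer's outputs,
    passed through a sigmoid nonlinearity."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
        for row, b in zip(weights, biases)
    ]

# Two stacked layers: raw inputs -> intermediate features -> final output.
inputs = [0.5, 0.8, 0.1]
hidden = layer(inputs, weights=[[0.2, 0.4, -0.5],
                                [0.7, -0.3, 0.1]], biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
print(output)  # a single activation distilled through two layers of features
```

A real deep network stacks dozens of such layers with millions of learned weights, which is why the data and computing demands are so high.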
