Machine learning (ML) refers to systems that can improve their performance on a specific task through experience with, or exposure to, data—without being explicitly reprogrammed. In layman's terms, machine learning is what allows applications to become predictive by categorizing and recognizing patterns in historical and incoming data.
Machine learning has become the de facto technology for making business and consumer applications—both existing and new—smarter. It allows a computer to recognize patterns, infer general rules from them, and use those rules to organize data or content and present it to users.
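As a minimal sketch of this idea (the classifier, data, and labels below are all hypothetical, not drawn from any particular product), a simple nearest-neighbor model illustrates how a system's predictions improve as it is exposed to more labeled data, with no change to its code:

```python
from math import dist

class NearestNeighborClassifier:
    """Predicts the label of the closest previously seen example."""

    def __init__(self):
        self.examples = []  # list of (features, label) pairs

    def learn(self, features, label):
        # "Experience": store a labeled example; no reprogramming needed.
        self.examples.append((features, label))

    def predict(self, features):
        # Recognize the stored pattern closest to the new data point.
        nearest = min(self.examples, key=lambda ex: dist(ex[0], features))
        return nearest[1]

# Hypothetical historical data with two pattern classes.
model = NearestNeighborClassifier()
model.learn((1.0, 1.0), "A")
model.learn((9.0, 9.0), "B")

print(model.predict((2.0, 2.0)))  # → A (closest known example is class A)

# Exposure to one more example shifts the prediction for the same input:
model.learn((2.5, 2.5), "B")
print(model.predict((2.0, 2.0)))  # → B
```

The point is not the algorithm itself but the behavior: the same unchanged program gives better-informed answers as its pool of experience grows.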
Properly deployed—often in combination—machine learning algorithms enable software to predict when events will occur and to recommend ways of responding to them. The potential is for software to act more like a real-time virtual assistant, both to humans and to other computer services.
Like cognitive computing, machine learning sits at the cutting edge of computer science. The domain is so broad and fundamental that it will take years to fully explore its potential applications.