
Artificial Intelligence (AI)

Artificial Intelligence (AI) extends human capabilities: it can search, read, understand, and categorize large sets of unstructured data faster than any person or team could. As a computer science discipline, AI traces its roots to the 1956 Dartmouth Summer Research Project on Artificial Intelligence, which brought together leading academic and industry researchers to attack a simply stated problem with a bold premise:

“An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves. We think that a significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for a summer.”

From the original research funding proposal by McCarthy (Dartmouth), Minsky (Harvard), Rochester (IBM), and Shannon (Bell Labs).

Related Research

[Special Report] Planning for 2018: Artificial Intelligence in Your Enterprise

In this Special Report, business leaders will learn how to address the dilemma of when and how to evaluate Artificial Intelligence technologies. As business complexity continues to grow beyond standard processes and applications, leaders responsible for planning need a clear view of the transformative potential of AI, which could become a powerful competitive advantage or a major strategic misstep.

Related Webinars

How to Manage Enterprise Data for Cognitive Computing

How you capture, organize, and manage data has a direct impact on the types of problems you can solve and how efficiently you can solve them. What works well for traditional transaction systems may make real-time cognitive computing difficult or impossible, so it is important to plan ahead. Watch this on-demand webinar to identify the right strategy for your enterprise to support your unique cognitive computing requirements.