Artificial intelligence, the major plot device in countless sci-fi films, is making it big in the entertainment industry. Recent breakthroughs in artificial intelligence and machine learning are helping Hollywood get even smarter about how it captures an audience, and the implications for your entertainment are staggering. In the brave new world of big data, neural networks, and human-computer interaction, there’s always something good on TV.
AI and machine learning
You’ve probably seen the terms “artificial intelligence” and “machine learning” used interchangeably. The former concept has its roots in the earliest days of computing, when the idea of a self-teaching, self-programming computer was first postulated. Hollywood became obsessed with the concept, producing classics like The Terminator, WarGames, Blade Runner, and The Matrix. We haven’t reached our promised dystopian future just yet, but computer scientists are closer than they’ve ever been to realizing true AI.
Machine learning is the newer buzzword, often used to describe the actual application of “learning” software: software that improves as it encounters new information or situations. This technology exists, and it’s progressing rapidly. “It’s only been in the last few years that we’ve seen a fundamental transformation in this technology,” says Stephen Gold, CMO and VP of business development and partner programs for IBM’s Watson project, which created the computer brain that beat Jeopardy legend Ken Jennings.
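To make “software that improves as it encounters new information” concrete, here is a minimal, hypothetical sketch of one of the oldest learning algorithms, a perceptron, trained on a toy task (learning the logical OR function). The data, function names, and learning rate are all illustrative assumptions, not anything from IBM or Watson:

```python
# Minimal sketch of "learning" software: a perceptron whose weights
# improve each time it sees another labeled example (toy data only).

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights for a two-input binary classifier from (inputs, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # 0 when correct; +/-1 when wrong
            w[0] += lr * err * x1       # nudge weights toward the right answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Toy task: learn logical OR purely from examples, not explicit rules.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
```

The point of the sketch is the update step: nobody programs the OR rule in; the program starts out wrong and adjusts its own weights after every mistake, which is the core idea behind the far larger systems discussed here.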
Big steps forward in natural language and image recognition have allowed computers to sort through exabytes of user data. The computers use these caches of big data to learn how to interact with users and optimize digital experiences. Tech giants like Google, Facebook, Amazon, and Apple push the tech further each year, and now routinely use uploaded photos, social media profiles, and web activity as massive data pools.