Artificial Intelligence – What’s in a name?
As the tech industry hype cycle continues to churn through my inbox daily, I find myself pondering the “artificial intelligence” meme of the day. My first reaction to overhyped terms is to resist giving them more credit than they deserve, and seeing “AI” attached to just about everything feeds that reluctance. In one way or another, every new product or service is now related to “AI”: AI-based, AI-powered, AI-driven (oh come on!). Whether the product or service actually includes AI technology under the hood is debatable; having an “if” statement in the underlying code does not make an AI system. That’s what motivated one of my regular columns here at insideBIGDATA, “AI Under the Hood”. I wanted to challenge companies to say exactly how AI is being used. Some companies take the bait, while others… to be charitable, consider their use of AI “proprietary”.
But as I reflect more on the AI frenzy of 2022, I view this accelerating trend as more than just a fad. One of the main reasons is that, unbeknownst to many, AI is actually a very old term dating back to 1956 (although the discipline has transformed considerably since its beginnings). That was the year the Dartmouth Summer Research Project on Artificial Intelligence took place in New Hampshire. The seminal meeting drew a small group of scientists together to discuss new ideas, and it was there that the term “artificial intelligence”, coined by American academic John McCarthy, first took hold. In the decades since, AI has survived a number of so-called “AI winters,” and that long journey has delivered the technology in its current form today. There could be something to this AI thing! AI, as an umbrella term, encompasses both “machine learning” and “deep learning”. The slide below, from an NVIDIA GTC opening keynote, nicely sums up the lineage.
AI, it seems, has real longevity. It should command some respect, and that’s a good reason why I’m embracing the term today, albeit for different reasons than some of my PR friends. As a data scientist, I know the long history of this field, and I’m happy, maybe even energized, to give it a pass. However, I’m still on the fence about other hyper-hyped terms, like “metaverse”, “web3”, “cryptocurrency” and “blockchain”. I’ll need to see a lot more before I’m convinced they can live up to their hype.
Giving credit where credit is due, NVIDIA helped propel AI out of its last “winter” around 2010, when the company wisely realized that the GPU technology that had found such success in gaming could be applied to data science and machine learning. Suddenly, the limitations that had held back AI, and deep learning in particular, in terms of computation and high-capacity data storage were no longer a concern. GPUs could take the linear algebra operations at the heart of contemporary algorithms and run them in parallel. Accelerated computing was born. Training times for larger models dropped from weeks or months to hours or days. Astonishing! Personally, I don’t think we’ll see any more AI winters unless one is triggered by a lack of progress toward AGI.
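To make that parallelism concrete, here is a minimal sketch, assuming PyTorch and a CUDA-capable GPU (neither of which is named in this article), of the same large matrix multiplication run first on the CPU and then on the GPU. It is illustrative only, but it captures why training times collapsed the way they did.

# Minimal sketch: the same linear algebra on CPU vs. GPU.
# Assumes PyTorch is installed; matrix size and timing approach are illustrative only.
import time
import torch

n = 4096
a = torch.randn(n, n)
b = torch.randn(n, n)

# CPU baseline
start = time.perf_counter()
c_cpu = a @ b
cpu_time = time.perf_counter() - start

if torch.cuda.is_available():
    # Same matrix multiplication, executed in parallel on the GPU
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()
    start = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()  # wait for the asynchronous GPU kernel to finish
    gpu_time = time.perf_counter() - start
    print(f"CPU: {cpu_time:.3f}s  GPU: {gpu_time:.3f}s")
else:
    print(f"CPU: {cpu_time:.3f}s  (no CUDA GPU detected)")

On large matrices like these, the GPU version typically finishes orders of magnitude faster, which is the whole story of accelerated computing in one line of arithmetic.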
As a data scientist, even I have qualms about the term “data science” and whether it will stand the test of time. When I was in graduate school, my field of study was “computer science and applied statistics”, because “data science” was not yet a thing. I’m happy today because I can now tell people what I do and, for the most part, earn their interest and respect. If I’m at a cocktail party and someone asks the proverbial “So, what do you do?” I can proudly answer “I’m a data scientist”, and most people will understand. Before, saying I worked in computer science and applied statistics drew quick glances and quicker exits. Now I get questions, and sometimes I even draw a circle of people around me to hear what I have to say about tech trends. Suddenly I’m cool, with a cool professional designation.
Not everyone is convinced of the longevity of “data science”, though. A few years ago, my alma mater UCLA was designing a new graduate program to meet the growing interest in what people were calling data science. At the time, many schools were racing to create new “Master’s in Data Science” programs, but UCLA took a more strategic, long-term approach. After careful consideration, they opted for a “Master’s in Applied Statistics”. It’s essentially the same thing; it’s all in the name. I understand their decision. Who knows what my field will be called in 10 years, but “Applied Statistics” will be around for the foreseeable future!
For a detailed history of the AI field, please see the compelling general-audience book “Artificial Intelligence: A Guide for Thinking Humans” by Melanie Mitchell, professor of computer science at Portland State. A wonderful read!

Contributed by Daniel D. Gutierrez, Editor and Resident Data Scientist for insideBIGDATA. Besides being a technology journalist, Daniel is also a data science consultant, author, educator and sits on several advisory boards for various start-up companies.