Your AI Roadmap

Why is AI taking off now? Technical Trifecta for AI: GPUs, Big Data, and Machine Learning

Dr. Joan Palmiter Bajorek Season 2 Episode 17

Why is AI taking off now? Dr. Joan Palmiter Bajorek answers a common question given the rapid advances around us in AI. This episode ties previous episodes together.

The current hype surrounding AI technology rests on a trifecta of factors driving its growth: advancements in hardware (GPUs), the size and complexity of data sets, and improvements in machine learning algorithms. Joan emphasizes the significant investment in the sector and the implications for future AI products and technologies.

Episodes Cited that pair with this episode

Deepseek "BONUS DeepSeek Shocks the Market! Competitive Strengths and Weaknesses: Link: https://yourairoadmap.buzzsprout.com/2358279/episodes/16522850-bonus-deepseek-shocks-the-market-competitive-strengths-and-weaknesses

LLMs and Multimodal AI with Dr. Stefania Druga of Google: "My data set is the internet" 📊 Data Set Sizes: "You can think of what does it mean to scrape the entire internet and that's pretty much it...all the data that was ever created digitally and has the right permissions." Link: https://yourairoadmap.buzzsprout.com/2358279/episodes/14972812-llms-and-multimodal-ai-with-dr-stefania-druga-of-google


Takeaways
🤖 There's a lot of hype around AI technology right now.
💻 Advancements in GPUs are crucial for AI development.
📈 The size and complexity of data sets are increasing.
🧠 Breakthroughs in machine learning algorithms, especially transformers, are making models more sophisticated.
💰 Investment in AI technology is at an all-time high.
🚀 OpenAI and NVIDIA are leading players in the market.
🌐 AI products are becoming more accessible to the public.
🤔 Understanding the trifecta of AI can provide new insights.
🤖 The future of AI includes personalized products and robotics.

Support the show

Learn More

YouTube! Watch the episode live @YourAIRoadmap
Connect with Joan on LinkedIn

✨📘 Buy the Bestselling Wiley Book: Your AI Roadmap: Actions to Expand Your Career, Money, and Joy. Featured in Forbes!

Who is Joan?

Ranked the #4 Voice AI Influencer, Dr. Joan Palmiter Bajorek is the CEO of Clarity AI, Founder of Women in Voice, & Host of Your AI Roadmap. With a decade in software & AI, she has worked at Nuance, VERSA Agency, & OneReach.ai in data & analysis, product, & digital transformation. She's an investor & technical advisor to startups & enterprises. A CES & VentureBeat speaker & Harvard Business Review published author, she has a PhD & is based in Seattle.

Disclaimer: Our links may have affiliate codes. This is an educational podcast and not intended as legal, career, or financial advice. Seek professional guidance.

Hey folks, welcome back to another episode of Your AI Roadmap. I wanted to talk to you today about some technical stuff. Specifically, people have been asking me a lot of questions about: why now? Right? There's all this hype going on, DeepSeek this and NVIDIA that, and this question of what's going on that it's blowing up right now. So I wanted to speak to that. I think about this as a trifecta, though there's really a fourth factor, money, that plays in keenly too. But let me talk about the trifecta the way I think about it.

The first piece is GPUs and hardware. Think back in the day, if you'll go with me a little bit: when I was in middle school, we had what are called computer labs. You ever been in one of these? We had those iMacs that were the size of a big box. They were really colorful, right? These iMacs were huge, they ran relatively slowly, and they didn't have that fancy of graphics, but that's what computers were quite a while ago. We've seen this advancement since then. If y'all remember, if you're as old as me, think of BlackBerrys, and of how the iPod and the iPhone have radically changed in just the last decade. Now, instead of those huge boxes, I have an iMac at home and it is gorgeously thin, a silvery perfection that runs really, really fast and handles a lot, a lot of stuff.

So when we think about advances in hardware on the processing-power side, I think about it as muscles. These are literally mechanical muscles processing what's going on in the computer. If you know anything about NVIDIA or the other chip and hardware companies, we're talking about the muscles behind what's going on in the computer and how good they are. This might be a deep cut for those of you listening, but NVIDIA's CEO hand-delivered one of its earliest AI supercomputers to OpenAI in the really early days, and Musk was in the room. I think there's a photo of this on Twitter; I present it sometimes, but I couldn't dig it up. Anyway, the ability to have these processors is a key component of unlocking the large language models and ChatGPT-style products we have on the market today. So one piece of the trifecta: graphics processing units, GPUs, are computer muscles.
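To make the "computer muscles" analogy concrete, here's a minimal sketch, assuming PyTorch is installed, that times the same matrix multiplication on the CPU and, if a CUDA GPU is available, on the GPU. The matrix size is an arbitrary round number, not anything from the episode.

```python
# Minimal sketch: time the same large matrix multiplication on CPU vs. GPU.
# Assumes PyTorch is installed; the GPU path only runs if CUDA hardware exists.
import time

import torch

N = 4096  # arbitrary round size, big enough that the muscles matter
a = torch.rand(N, N)
b = torch.rand(N, N)

start = time.perf_counter()
_ = a @ b  # runs on the CPU
cpu_s = time.perf_counter() - start
print(f"CPU: {cpu_s:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()  # wait for the copies before timing
    start = time.perf_counter()
    _ = a_gpu @ b_gpu  # same math, spread across thousands of cores at once
    torch.cuda.synchronize()  # GPU kernels are asynchronous; wait for the result
    gpu_s = time.perf_counter() - start
    print(f"GPU: {gpu_s:.3f} s (~{cpu_s / gpu_s:.0f}x faster)")
else:
    print("No CUDA GPU found; skipping the GPU timing.")
```

The point of the sketch is only the shape of the comparison: identical math, wildly different wall-clock time once the hardware can do many multiplications in parallel.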
The second piece, which I think can be a little harder to grasp than it used to be, is the size of our data sets. If you've listened to this podcast, specifically the episode with my friend Dr. Stefania Druga of Google, she talks about how her data set is the internet, right? The size of these datasets cannot be overstated, and it's pretty, pretty wild. When we think about big datasets, we're not just thinking about text; we're also thinking about images and about how people interact and click. Datasets can be very complicated these days. As more of you have heard me talk about my agriculture customer, datasets are not just text, not just huge repos of emails, and not just images. Think of the CAPTCHA things we click on: is that a toaster or not a toaster? Which of these photos have motorcycles? You click on them, right? So, lots of images, and lots of ways we aggregate and think about datasets of images.

So the second... excuse me, that's Luna barking; we're on a walk with Luna. So the second part of the trifecta is datasets: the size of datasets, and the availability of datasets that you can download. Just the other day I downloaded a dataset of three million Bluesky posts that you can simply download and do research on. I'm not going to tell you what I did with those three million Bluesky posts, but anyway, these are the datasets.

All right, so we've got muscle, and we've got data. The third piece is the patterns we can find among them: advances in machine learning, neural networks, and attention. We could talk about transformers, which gets quite complicated, but basically what we see overall is that we have a huge amount of data, trillions of tokens, ginormous internet-sized datasets, and now we have the muscles to actually process them effectively, leverage the patterns, and refine models based on these datasets. So as people in labs, coming out of Microsoft, Google, Stanford, et cetera, got access to processors, proprietary datasets, and so forth, they could make these models better and better, especially over the last few years.
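Since attention and transformers come up here without being unpacked, here's a minimal from-scratch sketch of scaled dot-product attention, the core pattern-finding operation inside transformer models. It's an illustration of the mechanism only, again assuming PyTorch; real models add learned projections and stack many of these layers.

```python
# Minimal from-scratch sketch of scaled dot-product attention
# (Vaswani et al., 2017), the core operation inside transformers.
# Illustrative only; real models add learned projections and many layers.
import math

import torch

def attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """q, k, v: (sequence_length, d) queries, keys, and values."""
    d = q.shape[-1]
    # Score how relevant every position is to every other position.
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)  # (seq, seq)
    weights = torch.softmax(scores, dim=-1)          # each row sums to 1
    # Each output is a relevance-weighted blend of the values.
    return weights @ v                               # (seq, d)

seq_len, d_model = 8, 16
x = torch.rand(seq_len, d_model)
out = attention(x, x, x)  # self-attention: the sequence attends to itself
print(out.shape)          # torch.Size([8, 16])
```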
Behind this whole trifecta, it has to be said, there's a huge amount of money; the amount of funding in this sector is just a lot. The amount of money Microsoft has dumped into OpenAI is no joke. I also think it's really interesting, if you listen to my episode about DeepSeek, that people say they spent only about $5 million, as compared to the billions in capital OpenAI has access to.

So if we really think about the why now, we see these advances in our hardware, in the size of our datasets and our access to them, and in the machine learning models. It's no surprise, I think, that eventually we got products on the market where you and I can just go to OpenAI's website; literally, the homepage today lets you type in what you want and interact with a large language model for free, at least here in the United States. International folks, I know not everyone has as much access. It's also really cool to be doing tutorials for Pluralsight these days: I'm using Operator, where you can ask it to do something and watch the computer attempt to do it, opening up different browsers. It's pretty cool.

So I guess I just wanted to help you connect the dots, think about the why now of AI, and think about how all these things interact. I also just saw a really cool announcement from Figure, a robotics company, about how they're thinking about bringing robotics more into the home space. Okay, I know what I said is a lot and may take time to process. When I drop this episode, I'll link the DeepSeek episode I talked about and the Stefania Druga episode about Google, large language models, and multimodal datasets, so you can go back and listen. But I really want you to think about the investment, the why now, and what's coming next. Given the investment in this sector, I think a lot of different products personalized to different things are coming.

Hopefully, if you hadn't heard it framed this way before, the muscles, the data, and the patterns give you a new perspective on the things you see in the news: people wondering whether we'll need as many NVIDIA GPUs because of some new product, or whether, if there are huge efficiencies in models, we'll need as much processing power. At scale, I think it's all a little bit mind-boggling. Hopefully you take a minute after this one, drink some tea, and think about the trifecta. Maybe you're thinking, hey, Joan is missing a huge element I wanted to hear more about. I have been dropping more and more episodes about career stuff and money stuff, which I think you're digging; people around the world are downloading them. But I also want to remind you that there are technical parts of this podcast. I've been slammed at work and have dropped fewer of those technical interview episodes, but I'm excited to release more in the next few weeks. Thinking of you all and hoping this podcast gives you hope, inspiration, and education. And yeah, it's delightful to be in community with you.

Okay, back to the summary recap. We're talking about the trifecta behind the why now of AI. Can you remember what the pieces are? Pop quiz: muscles, the hardware, the GPUs; big data, and the ability for companies, and for us, to get our hands on more and more datasets; and the third one, what is it? That's right: machine learning, all the transformers and advances in spotting patterns across these datasets, which we can now understand and process faster with that hardware. Okay, I'll see you on another episode of Your AI Roadmap. Have a good day, bye.
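One footnote on the processing-power question in the wrap-up: a common rule of thumb from the scaling-law literature is that training a transformer costs roughly 6 × parameters × tokens floating-point operations, which is why model efficiency and GPU demand are so tightly linked. The sketch below is a back-of-envelope calculation with made-up round numbers, not figures from the episode.

```python
# Back-of-envelope training compute using the rough ~6 * params * tokens
# rule of thumb from the scaling-law literature. All numbers below are
# made-up round figures for illustration, not claims about any real model.
params = 70e9   # hypothetical 70-billion-parameter model
tokens = 2e12   # hypothetical 2 trillion training tokens
flops = 6 * params * tokens
print(f"~{flops:.1e} FLOPs of training compute")  # ~8.4e+23

gpu_flops_per_s = 1e15  # assume ~1 PFLOP/s effective throughput per accelerator
num_gpus = 1000
seconds = flops / (gpu_flops_per_s * num_gpus)
print(f"~{seconds / 86400:.0f} days on {num_gpus} such accelerators")  # ~10 days
```

Even rough numbers like these show why the efficiency question raised by DeepSeek and the demand for NVIDIA hardware pull on the same thread.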


Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.


Hello Seven Podcast

Rachel Rodgers

Your First Million

Arlan Hamilton