NVIDIA just raked in a whopping $30 billion last quarter, thanks to the AI industry’s #GPU addiction. These GPUs are the lifeblood of AI, crunching math problems like it's a sport. Nvidia is the reigning champ, but a new player, TensorWave, wants to crash the party by offering AMD chips instead. Founded after a pickleball match (seriously), TensorWave's mission is to end Nvidia’s monopoly, promising cheaper prices and better AI performance with AMD’s MI300X. Based in Las Vegas (because why not gamble?), they’ve already hit $3 million in revenue and are aiming for $25 million. Investors are all-in, too. It’s David vs. Goliath, but with a lot more math. Jeff Tatarchuk 🌊 Darrick Horton 🌊 #datacentre #aichips #aiforall
Tony Skurr’s Post
More Relevant Posts
-
"Bhai Agar tune us time NVIDIA me 10000rs Lagaye hote to ajj...." .. .. Remember when AI was just a buzzword and NVIDIA was hardly on any of us regular folk’s radar? Well, flash forward to now, and it's like they're everywhere in AI! 🤖💥 Back in 2020 when I was just dipping my toes into the tech world, I barely heard NVIDIA’s name. Fast forward, and boom—they're the big talk in AI chips! So, what’s the deal? Are we looking at a monopoly here or what? 🧐 NVIDIA's GPUs are the go-to for anything AI—from gaming to deep learning. It's like they've got a secret sauce or something. 🎮🔍 But hold up, they're not the only players on the field. We’ve got AMD, Intel, and those cool cats at Google stirring the pot too. But here’s the kicker: every time there's a big AI breakthrough or some wild new tech, it seems NVIDIA’s tech is lurking somewhere in the shadows. Spooky, right? 🕵️♂️💡 I mean, if we’re talking domination, NVIDIA is kinda like that one player in Monopoly who owns all the prime properties—yeah, Boardwalk and Park Place, I’m looking at you! 🎩🏠 What do you think? Is NVIDIA’s grip on the AI market setting us up for an innovation boom, or are we edging towards a tech monopoly? And hey, have any of you been on team NVIDIA since the get-go, or are there some hidden gems out there I should know about? Drop your thoughts, wild experiences, or even your favorite AI conspiracy theories down below! ⬇️👽 Let’s make this chat as lively as a GPU at max clock speed! 🎢🚀 #NVIDIA #AI #TechTalk #Innovation #MonopolyOrNot #TechConspiracies
-
The impact of gen AI on media and entertainment workloads 🤖 #theCUBE is live from the 2nd session of #OpenStorageSummit -- powered by Supermicro, Intel Corporation, AMD and NVIDIA -- where we speak with Sherry Lin, storage solution product manager at Supermicro, on how gen AI has revolutionized media and entertainment workloads. “Number one, we see higher performance needs as gen AI is used in content creation and post-production editing like animation, real-time rendering on virtual stages, virtual actors, etc. The storage server must be able to handle these intensive gen AI workloads,” Lin shares. “Number two is scalability. As imagery resolution increases and AR/VR becomes more common, the storage server has to scale up or out easily to accommodate the growing data volumes and more complex projects. Number three is the storage media solutions. So, again, we not only need high-performance storage, but also high-capacity storage,” she adds. 📺 Catch the interview live: https://lnkd.in/gwzsz8-S
-
Again, the old adage holds true: where there is a gold rush, the most money is made by those who sell shovels. Using the AI hype as an example, it looks like Nvidia is beginning to steer the course of the US stock market. Since the beginning of the year, shares of Nvidia, the supplier of the GPU chips that power the AI revolution, have risen more than 40%. Last week, we all waited with bated breath for Nvidia's financial results to see if the stock market was going up or down. Does the tail wag the dog? #nvidia #chips #ai
-
Super Micro shares surge as AI boom drives 100,000 quarterly GPU shipments - https://lnkd.in/gED_CycM #supermicro #aiservermaker #supportaifactories #100000gpusshipmentquarterly #liquidcoolingsolutionfeatured #liquidcoolingsolutiondlc #newdlcproductintroduction #highestgpuperrackdensity #keydifferentiator #energysavingspacesavingforaiinfrastructuredeployment
-
NVIDIA just announced a banner quarter driven in large part by their #AI related GPU technology. A gap analysis comparison of NVIDIA and Intel Corporation #patent portfolios shows that despite Intel's overall lead in US patenting, NVIDIA leads in a key aspect of ray-tracing technology. Using dashboarding technology to perform competitive analysis has been a game changer for my clients. Being able to compare their portfolio with a competitor's using technology classifications in a few clicks saves countless hours of sorting through spreadsheets or compiling views from search tools. How are you using patent data to perform competitive portfolio analysis and comparisons?
-
Rocky Berndsen Yes, artificial intelligence (AI) can do many things, but NOT everything. With our intellectual property (IP), we are doing what AI can NOT do. Without metadata, especially multilingual metadata, NO multilingual data can be found or retrieved for intelligence analysis, the Internet of Things (IoT), etc., even by the most advanced technologies, like AI, quantum computers, etc. https://lnkd.in/g-aJFnXR When we used the following questions to test ChatGPT, an icon of AI, the experiment showed that it can NOT answer them, while we can, with our IP; for example, a pure English question like: "Who, in England of UK, has new US patents granted on February 20th, 2024?" or a Chinese-English multilingual question like: "Who, in the '湖北' province of mainland China, has new US patents granted on February 20th, 2024?" Our IP is copyrighted multilingual metadata. It can distill real-time (2024.02.20) information about US patent holders, based on the multilingual census geographical locations of Canada, mainland China, Hong Kong, Macao, Taiwan, the Middle East (Israel, Saudi Arabia, UAE) and Europe (Austria, Germany, Switzerland, UK). Does our IP ring a bell with you? Thanks.
-
xAI's Memphis Supercluster is live. It's equipped with an incredible 100,000 Nvidia H100 GPUs (they cost about $30,000 each). I believe this makes it the largest and most powerful computer installation of any kind ever created. The sheer scale of it is mind-boggling. To give you an idea, training GPT-3, which had 175 billion parameters, would require about 0.3% of those GPUs for one month. For larger models, with up to a trillion parameters, up to 3% of those GPUs might be needed to achieve effective training within a reasonable timeframe. While supercomputers like Frontier at Oak Ridge National Laboratory and Aurora at Argonne National Laboratory are enormously powerful, xAI’s Memphis facility is dedicated to AI and already up and running. It’s not just about building future capacity; it’s about putting that capacity to work now to accelerate real-world applications. What does this mean for the future of AI? With such computational power available, xAI is well positioned to deliver significant breakthroughs in AI applications and technologies. #TechNews #Innovation #xAI #Nvidia #AI #GPU #H100 #SuperComputer #MemphisSupercluster
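For anyone who wants to sanity-check that "0.3% of the GPUs for one month" figure, here is a rough back-of-envelope sketch using the common ~6 × parameters × tokens rule of thumb for training FLOPs. The token count, per-GPU throughput and utilization below are illustrative assumptions, not xAI's or OpenAI's actual numbers.

```python
# Back-of-envelope check of the "0.3% of the GPUs for one month" claim.
# All numbers here are assumptions for illustration: a GPT-3-scale run
# (175B parameters, ~300B training tokens), ~1e15 FLOP/s per H100,
# ~40% sustained utilization, and the ~6 * N * D training-FLOPs rule of thumb.

params = 175e9                         # model parameters (N)
tokens = 300e9                         # training tokens (D), assumed
train_flops = 6 * params * tokens      # ~6 * N * D rule of thumb

h100_flops = 1e15                      # assumed per-GPU peak throughput, FLOP/s
utilization = 0.40                     # assumed fraction of peak actually sustained
month_seconds = 30 * 24 * 3600

gpus_needed = train_flops / (h100_flops * utilization * month_seconds)
print(f"GPUs for a one-month run: ~{gpus_needed:.0f}")
print(f"Share of a 100,000-GPU cluster: ~{gpus_needed / 100_000:.1%}")
```

With those assumed numbers the estimate lands at roughly 300 GPUs, about 0.3% of a 100,000-GPU cluster, which lines up with the figure in the post.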
-
🚀 AI Advancements You Don’t Want to Miss: Gemini's Image Generator, AMD’s New AI Chips, and OpenAI’s o1 Benchmark! 🔥 🎨 Gemini Opens Its Doors: All Gemini users can now access the powerful Imagen 3 text-to-image model! Whether you’re on mobile or desktop, you can create stunning visuals with ease. Advanced users, get ready to generate images of people, a game-changer in AI-generated art! 🌟 💥 AMD vs. Nvidia: AMD is stepping up to challenge Nvidia’s dominance in the GPU market with its latest MI325X AI chip. With demand for Nvidia chips soaring and waitlists growing, AMD’s alternative arrives right on time for Meta, Microsoft, and the many others looking for top-tier AI hardware. 💡 🏅 OpenAI’s o1 Benchmark: OpenAI’s latest frontier model, o1, is already making waves. On a new benchmark designed to test a model’s ability to build other LLMs, o1 earned medals in 17% of the challenges, a notable achievement on complex, human-level tasks! 🧠 👇 Which of these advancements excites you the most? Let's discuss how they’ll shape the future! 🌐 #AI #GeminiAI #Imagen3 #AMD #Nvidia #AIChips #OpenAI #AIAdvances #LLM #TechInnovation #MachineLearning
-
🤖 Have you heard the exciting news about Groq? This innovative AI startup just launched the world's fastest chatbot, capable of returning answers in mere milliseconds! 🤯 Founded by former Google engineers, Groq developed proprietary hardware called Language Processing Units (LPUs) to power its ultra-fast AI. Unlike traditional GPUs built for graphics, Groq's custom LPUs are optimized specifically for AI inference workloads. ⚡️The results are mind-blowing - we're talking nearly 500 trillion operations per second! This unprecedented speed can supercharge large language models and enable developers to build the next generation of real-time AI applications. Most importantly, Groq's technology signals a potential breakthrough in AI hardware and the first real challenger to NVIDIA's dominance. Their LPUs offer a compelling alternative to GPUs for AI acceleration. This is a total game-changer! Kudos to the brilliant minds at Groq for this revolutionary advance in AI speed. 🥕If only I could figure out how to bake a carrot cake that fast! The future is looking very bright! #Groq #AI #LPU #GPU #Nvidia #GoogleAlumni #Chatbot #LLM
-
Can Groq overtake NVIDIA? Groq is a semiconductor company that designs Language Processing Units (LPUs), a type of chip built for processing language, i.e. the "inference" part of the AI workflow. Nvidia designs GPUs that are better optimised for AI training, and its chips are used in large numbers by companies such as Meta, Google, Tesla and Baidu to train their LLMs. Nvidia also has a far larger market presence, with 500,000 H100 GPUs deployed in 2023 compared to Groq's projected deployment of 100,000 LPUs by the end of 2024. But the USPs of Groq's chips are simplicity, energy efficiency and lower computing costs. LLMs running on Groq's chips are known for low-latency operation and quick replies. With these advantages, there is a real chance that Groq can overtake Nvidia in the AI processing chip space. #ai #groq #gpu #nvidia #meta #google
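A rough way to see where the low-latency claim comes from: when a model generates one token at a time, every weight has to be streamed from memory for each token, so the single-stream ceiling is roughly memory bandwidth divided by model size. The sketch below is a simplified illustration with assumed, rounded numbers, not measured specs for any particular GPU or LPU deployment.

```python
# Simplified single-stream decoding model: generating each token requires streaming
# all model weights through the compute units, so the ceiling on tokens/sec is
# roughly (effective memory bandwidth) / (model size in bytes).
# All figures below are illustrative assumptions, not measured vendor specs.

def max_tokens_per_second(model_bytes: float, bandwidth_bytes_per_s: float) -> float:
    """Bandwidth-bound upper limit on single-stream decode speed."""
    return bandwidth_bytes_per_s / model_bytes

model_bytes = 70e9        # e.g. a 70B-parameter model at 8-bit weights (assumed)
hbm_bandwidth = 3e12      # assumed HBM-class bandwidth of a single GPU, bytes/s
sram_bandwidth = 50e12    # assumed aggregate on-chip SRAM bandwidth across an LPU rack, bytes/s

print(f"HBM-style estimate:  ~{max_tokens_per_second(model_bytes, hbm_bandwidth):.0f} tokens/s")
print(f"SRAM-style estimate: ~{max_tokens_per_second(model_bytes, sram_bandwidth):.0f} tokens/s")
```

The comparison is only directional: keeping weights in very fast on-chip memory raises that single-stream ceiling, which is the effect behind the quick replies described above.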