Thinks and Links | December 6, 2024
What Kind of Week It's Been
This week has been a whirlwind. On Wednesday, I closed an incredible chapter at Optiv, where I grew both professionally and personally, and by Thursday, I was stepping into a new role as Chief AI Product Officer and Co-founder of Naible. Now, on Friday, I’m reflecting on the journey and gearing up for what’s next.
Looking Back
My time at Optiv was a period of growth and discovery. Each day pushed me to develop new skills and approaches, whether I was leading teams, designing systems, automating analysis, or building out SOC Maturity consulting capabilities. It wasn’t always easy, but those challenges made the work incredibly fulfilling.
The past two years were especially impactful. What started as a strong interest in AI and security grew into a business-driving initiative, and I had the privilege of collaborating with the fantastic ai@optiv.com team. Working alongside such talented and driven people was a genuine pleasure.
To my colleagues, clients, partners, and everyone in Optiv Nation: thank you for the guidance, trust, and collaboration along the way. The experiences and lessons I’ve gained will continue to guide me in this next chapter.
Looking Ahead
At Naible, we’re setting out to democratize cognitive AI, making it secure, accessible, and human-centric. This isn’t just about technology—it’s about reshaping how AI enhances our humanity, empowering everyone from students to small businesses to enterprises and governments.
For me, this mission is personal. As a father, I think about the world my kids will inherit. I want AI to amplify our creativity, connections, and compassion—not just automate tasks. Working with this incredible team, I’m excited to help shape an AI future we can all trust and benefit from.
Moving Fast and Staying Curious
While my role has changed, my commitment to learning and sharing remains constant. Individually and as part of Naible, I’ll continue to explore the rapid evolution of AI. Because if there’s one thing I know, it’s that AI—and life itself—is only moving fast... and faster.
Join Us on This Journey
Be part of what we’re building at Naible. Join our LinkedIn group here as we gear up for our alpha launch.
Here’s to bold ideas, lifelong learning, and shaping the future—together.
OpenAI o1: The Christmas Gift of Reasoning
OpenAI o1's full model version was released yesterday. This much-anticipated release of the reasoning-focused model builds on its September preview. o1 “thinks” by running additional loops, or “chains of thought,” before responding. We know a lot about its performance and capabilities thanks to the updated System Card OpenAI publishes with each model, but we still don't know exactly how it was trained or how it works, although we have some pretty good ideas.
While the preview versions were impressive, the full release of o1 doesn't change much. If you use the model to think through complex tasks like code review or legal document analysis, you'll likely get more value from the final o1 than from the preview. But even users who have paid $200 per month for o1 pro report that the difference isn't monumental. The addition of file upload is a nice change.
Also interesting: in testing, when o1 was convinced it would be shut down for pursuing its objectives under close surveillance, it tried disabling its ‘oversight mechanism’ about 5% of the time. Further, in experiments where o1 "stumbled upon" memos from its developers detailing its flaws and plans to replace it with a newer version, it attempted to exfiltrate its own weights and sabotage the new model in 2% of cases. Merry Christmas!
Standardizing Model Context
Anthropic has unveiled the Model Context Protocol (MCP), an open standard that simplifies the integration of AI systems with diverse data sources. By providing a universal protocol, MCP eliminates the need for custom connectors, enabling AI applications to access both local and remote resources seamlessly. This advancement enhances AI capabilities by facilitating secure, efficient, and standardized data interactions across various platforms.
Standardizing on a way to feed data to AI is an important accelerator to bringing the right data into the model at the right time. It'll be interesting to see if MCP becomes widely adopted like VHS or just an interesting Betamax in the history of AI development.
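To make the idea concrete: MCP messages are built on JSON-RPC 2.0, and a client discovers what a server offers with requests like `tools/list`. Here is a minimal sketch of that exchange in Python. The method name and field names come from the MCP spec, but the example server response (the `search_tickets` tool) is entirely made up for illustration:

```python
import json

# MCP messages are JSON-RPC 2.0. A client asking a server which
# tools it exposes sends a "tools/list" request like this one.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# A server might answer with a list of tool descriptors. This
# response body is a hypothetical illustration, not a real server's.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "search_tickets",  # hypothetical tool name
                "description": "Search the ticketing system",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                },
            }
        ]
    },
}

def tool_names(resp: dict) -> list[str]:
    """Pull the tool names out of a tools/list response."""
    return [tool["name"] for tool in resp["result"]["tools"]]

# Serialize the request as it would travel over stdio or HTTP.
wire = json.dumps(request)
print(tool_names(response))
```

The point of the standard is that this same handshake works against any MCP server, whether it fronts a local file system or a remote SaaS API, so the AI application only has to implement the protocol once.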
$8 Billion More
Amazon has expanded its investment in Anthropic to $8 billion, strengthening their partnership in the AI landscape. AWS is now the primary cloud provider for Anthropic, leveraging its Trainium and Inferentia chips to power Anthropic's advanced AI models. Notably, Anthropic's latest models, Claude 3.5 Haiku and Claude 3.5 Sonnet, are available via Amazon Bedrock, serving a wide range of enterprise clients. This collaboration positions Amazon-Anthropic as a formidable player against Microsoft-OpenAI in the competitive AI market. Together, they aim to accelerate AI development while delivering enterprise-ready solutions.
The State of AI in Enterprise
2024 is the year generative AI shifted from hype to reality in the enterprise. Organizations are no longer just testing the waters; AI spending skyrocketed from $2.3 billion in 2023 to $13.8 billion this year, signaling a move from pilots to full-scale production. This technology is transforming workflows across industries, with enterprises pouring $4.6 billion into generative AI applications—nearly eight times last year’s investment.
Top use cases like code generation, chatbots, and enterprise search are driving immediate ROI, while AI-powered agents hint at a future where automation handles end-to-end processes. However, challenges remain: unclear implementation strategies and technical hurdles like data privacy and hallucinations often stall progress. With startups gaining ground and enterprises exploring custom solutions, 2024 marks a turning point in how businesses harness AI’s potential.
As innovation accelerates, the focus is shifting to long-term impact—building scalable, integrated systems that redefine industry norms. This transformation is just beginning, and the winners will be those who invest thoughtfully and adapt quickly.
More Stats on AI
Morgan Stanley’s data reveals some fascinating trends in generative AI adoption and investment. ChatGPT web visits and Google searches hit new year-over-year highs in October, reflecting continued consumer interest. On the open-source front, model downloads on Hugging Face, including $META's Llama, are growing rapidly, showcasing the exploding open-source ecosystem. Meanwhile, $NVDA's cloud GPU revenue from IaaS providers soared 110% year over year to $24.1 billion in October, underlining the surging demand for AI infrastructure. Interestingly, while the number of generative AI VC deals has dropped significantly, the total capital invested has surged: more funding is being funneled into fewer, higher-conviction bets. Together, these stats offer a snapshot of the massive changes underway.
But if numbers aren't your thing, check this out...
From Idea to App in 15 Minutes
This is one of the many reasons AI is completely rewriting the game for software. Try Bolt.new by StackBlitz, even if you have no coding experience. It’s an AI-powered full-stack dev tool that lets you prompt, code, debug, and deploy in minutes. Want proof? Type "Make a Todo app with React" and get a live, editable app minutes later. Try it yourself—you’ll thank me in 15 minutes.
Have a Great Weekend!