Weekly Wrap Sheet (11/21/2025): Bubbles, Benchmarks, Bigamy, Blackwell, and Bananas
Rational bets fuel collective excess, Gemini 3 resets the frontier, Anthropic multiplies power partners, Nvidia posts blockbuster quarter, Nano Banana Pro powers a step change in image gen
🎮 TL;DR
Rational actors trigger collective excess, inflating a capex cycle that looks prudent up close and wildly unsustainable in the wide shot
After weeks of suspense, Gemini 3 finally drops - and for once, the model lives up to the murmurs, wrecking benchmarks and launching with its own AI-native IDE
Anthropic borrows OpenAI’s dealmaking playbook - inking GPU deals across the stack and forming a $350B triad with Microsoft and NVIDIA. Monogamy is for amateurs
Nvidia Q3 earnings reassured skeptics and quieted the bubble talk with a reminder that AI demand is not only real - Nvidia can barely keep up with it
Google drops Nano Banana Pro with a wink and a watermark, elevating image gen from creative toy to monetizable infrastructure
🫧 The Bubble You Can Justify in Excel
The conversation around AI is starting to feel predictable. Every quarter, the tech giants step up to the mic, clear their throats, and drop another CapEx number that sounds like GDP. Analysts frown. Pundits warn of froth. Twitter shouts “bubble.” From a distance, it does look absurd - a bonfire of capital built on hope, hype, and a handful of GPU purchase agreements.
But zoom in, and each company’s behavior feels… rational. If you’re Microsoft, Amazon, Meta, Google - or OpenAI, Anthropic, xAI - the logic is brutally clear. If you believe that we are seeing a paradigm shift, then under-investing isn’t prudence; it’s strategic negligence. Being early and wrong costs money. Being late and wrong costs the company.
The tension is that when everyone behaves rationally according to their own incentives, the collective outcome can start to look irrational.
This is how infrastructure bubbles happen. Not because companies are dumb, but because they’re smart in the same way, at the same time, and under the same pressure. In the 90s, the telecom industry built enough fiber to wrap the Earth in glowing glass spaghetti. They were right about the future but wrong about the timing. Companies went bankrupt while the infrastructure they overbuilt became the foundation of the modern internet.
The truth is uncomfortably nuanced: we are watching a rational arms race inflate what may eventually be judged, in hindsight, as a spectacular overshoot.
It’s not pure FOMO - the underlying shift is real.
It’s not pure rationality - the economics are highly speculative.
It’s not pure mania - there is genuine technological inevitability.
It’s not pure efficiency - over-capacity will emerge.
This is a game with positive expected value for the winners and negative ROI for the collective. In moments of technological discontinuity, optimal strategies at the firm level and sensible outcomes at the macro level often diverge. That’s the paradox of AI capex: individually prudent, collectively excessive.
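The firm-versus-collective divergence can be made concrete with a toy expected-value model. All of the numbers below are illustrative assumptions, not forecasts - the point is only the shape of the game: building can dominate waiting for every individual firm even while aggregate spend exceeds the aggregate expected payoff.

```python
# Toy expected-value model of the AI capex game (all numbers illustrative).
# Each of N firms chooses to build or wait. If the paradigm shift is real,
# only a couple of builders capture the prize, and firms that waited lose
# their market. If the shift isn't real, builders simply eat the capex.

N_FIRMS = 5
N_WINNERS = 2          # assumed: only a couple of builders win the prize
P_SHIFT = 0.5          # assumed probability the shift pays off
PRIZE = 300.0          # payoff per winner if the shift is real ($B)
CAPEX = 100.0          # cost of building ($B)
LOSE_MARKET = -250.0   # cost of waiting while rivals build ($B)

# A builder's EV: a small chance at the prize, a certain capex bill.
ev_build = P_SHIFT * ((N_WINNERS / N_FIRMS) * PRIZE - CAPEX) + (1 - P_SHIFT) * (-CAPEX)
# A waiter's EV: lose the market if the shift turns out to be real.
ev_wait = P_SHIFT * LOSE_MARKET

# Building "wins" even at negative EV, because waiting is worse...
print(f"EV(build) = {ev_build:+.1f}  EV(wait) = {ev_wait:+.1f}")

# ...while the collective spends more than it can expect to recoup.
total_spend = N_FIRMS * CAPEX
expected_payoff = P_SHIFT * N_WINNERS * PRIZE
print(f"collective spend = {total_spend:.0f}, expected payoff = {expected_payoff:.0f}")
```

Under these assumptions EV(build) is negative, yet still strictly better than EV(wait) - which is exactly the "individually prudent, collectively excessive" paradox.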
🧠 America’s Next Top Model: Gemini 3
Google has delivered a model that isn’t just competitive - it’s dominant. On almost every major benchmark, it posts not marginal gains but wide leads, sometimes by a factor of two or three. Only Anthropic keeps it from a clean sweep, holding onto a narrow 1-point edge on SWE-bench, a hard-won technical outpost in an otherwise total rout.
Under the hood, Gemini 3 Pro is a Mixture-of-Experts transformer built from scratch - not a fine-tune or continuation of Gemini 2.5. It supports 1M-token inputs, 64K outputs, and native multimodality across text, code, images, audio, and video.
The key trick - sparse activation - decouples capacity from cost. Instead of running all parameters for every token, Gemini activates only the subset most relevant to the task. That efficiency lets Google run frontier-level reasoning for mid-tier prices ($2 in / $12 out per million tokens for contexts under 200K).
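The sparse-activation idea can be sketched in a few lines. This is a generic top-k Mixture-of-Experts router - not Google’s actual implementation - and every shape, name, and parameter here is illustrative:

```python
import numpy as np

def moe_forward(x, experts, gate_w, k=2):
    """Route a token through only its top-k experts (sparse activation).

    x        : (d,) token representation
    experts  : list of (d, d) weight matrices, one per expert
    gate_w   : (d, n_experts) gating weights
    k        : number of experts actually run for this token
    """
    logits = x @ gate_w                      # score every expert cheaply...
    top_k = np.argsort(logits)[-k:]          # ...but keep only the best k
    weights = np.exp(logits[top_k])
    weights /= weights.sum()                 # softmax over the chosen experts
    # Only k expert matmuls execute; the remaining parameters stay idle.
    # That is what decouples model capacity from per-token compute cost.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top_k))

rng = np.random.default_rng(0)
d, n_experts = 8, 16
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))
y = moe_forward(rng.normal(size=d), experts, gate_w, k=2)
print(y.shape)  # the token comes out the same size, having touched 2 of 16 experts
```

With 16 experts and k=2, the model carries 16 experts’ worth of capacity but pays for only 2 per token - the capacity/cost decoupling described above.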
As strong as the numbers are, they’re only half the story. The real play is Antigravity, Google’s new IDE for agent-first dev workflows. It allows multiple agents to operate an editor, terminal, and browser in parallel while producing Artifacts - self-verifying logs of what they did and why. Think mission control for autonomous dev agents, complete with feedback loops and memory. It’s the culmination of Google’s Windsurf acquisition and a formidable competitor for everyone in the coding-copilot stack.
America’s Next Top Model has arrived - and this one might just keep its crown through year-end.
➰ Anthropic’s Strategic Entanglements
The deeper we get into the AI boom, the clearer one thing becomes: no one at frontier scale can afford exclusivity. The industry has slid into an era of open relationships. Everyone is working with everyone. Everyone is “strategically aligned” with everyone. Call it the Silicon Valley polycule of compute.
For years, Anthropic was to AWS what OpenAI was to Microsoft: the prized tenant, the strategic jewel, the AI lab that justified billions in capex and a few extra megawatts on the grid. But training frontier models is now an industrial activity - you can’t build AGI on good intentions and one hyperscaler. So Anthropic started expanding the roster.
First came Google: Anthropic signed a deal with Google Cloud to access a large block of their Tensor Processing Units (TPUs) - the public figure is “up to 1 million TPUs” and targeting “well over a gigawatt of compute capacity by 2026.” It was also the moment everyone suddenly remembered Google already owns 14% of Anthropic and is therefore very much in the room, even when the Amazon-Anthropic story gets most of the airtime.
Then came Fluidstack: Anthropic announced a $50B build-out of custom datacenters in Texas and New York - effectively saying, we love you all, but we’d like a place of our own. By designing datacenters around its own model architectures and training cadence, Anthropic can optimize utilization, cooling, and interconnect design for its specific workloads - lowering per-token training costs while deepening long-term defensibility.
And now, the big one.
The Microsoft × NVIDIA × Anthropic triad: Anthropic’s new pact with Microsoft and NVIDIA is basically a compute-for-capital handshake at gigawatt scale. Anthropic commits to buying $30B of Azure GPU power, much of it on NVIDIA’s Blackwell and Rubin systems, while building Fluidstack on the side. In return, NVIDIA invests up to $10B and Microsoft up to $5B, valuing Anthropic near $350B.
Triads - model lab × hyperscaler × chipmaker - have quietly become the atomic unit of power in AI.
OpenAI × Microsoft × NVIDIA
xAI × Oracle × AMD
Anthropic × Microsoft × NVIDIA
And then there’s Google × Google × Google - the last true monogamist in a world of open relationships.
Frontier AI is too uncertain, too capital-intensive, and too politically exposed for most organizations to operate alone - or even to tie themselves to just one partner. The age of exclusivity has been replaced by the age of entangled alliances - AI’s great polyamorous détente.
🔂 Nvidia’s Quarterly Reminder
Over the past few weeks, AI bubble talk had been gaining steam. Stocks wobbled. Analysts hedged. Then NVIDIA reported earnings and politely suggested we stop being dramatic.
When the company that supplies ~90% of the world’s advanced AI chips reports results, it doubles as a wellness check for the entire sector. And for now, the pulse looks strong, caffeinated, and entirely convinced of its own destiny.
NVIDIA reported $57B in quarterly revenue, +62% YoY and up about $10B sequentially. This was comfortably above consensus (~$54.9B) and well ahead of the ~55% growth expectation. Net income soared ~65% to ~$31.9B.
Forward guidance for Q4: around $65 billion in revenue. Analysts had expected less, but analysts have been expecting less for seven straight quarters now.
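The headline figures hang together, as a quick back-of-the-envelope check shows (numbers from the quarter as reported above, rounded):

```python
# Sanity check on NVIDIA's reported Q3 figures (rounded, in $B).
revenue = 57.0
yoy_growth = 0.62
sequential_gain = 10.0

implied_year_ago = revenue / (1 + yoy_growth)      # revenue a year ago
implied_last_quarter = revenue - sequential_gain   # revenue last quarter

net_income = 31.9
ni_growth = 0.65
implied_year_ago_ni = net_income / (1 + ni_growth) # net income a year ago

print(f"revenue a year ago ~ {implied_year_ago:.1f}B, last quarter ~ {implied_last_quarter:.1f}B")
print(f"net income a year ago ~ {implied_year_ago_ni:.1f}B")
```

In other words, the company added roughly a year-ago-NVIDIA’s worth of net income in a single year - which is why guidance beats have become routine.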
Jensen Huang made it plain: this isn’t speculative spend. It’s structural. “The clouds are sold out.” Every hyperscaler is buying. Every GPU is spoken for. The waitlist now includes next-gen chips. If this is a bubble, it’s the kind where every participant has a receipt. He broke the capex cycle into three rational drivers:
Cost savings - Accelerated computing is now the cheapest way to lower cost per compute unit after a decade of Moore’s Law stagnation. GPUs are replacing CPUs in a post-Moore world.
Revenue gains - Generative AI is replacing classical ML in recommender and ad systems. Meta’s new GenAI ad models increased conversions +5% on Instagram and +3% on Facebook Feed - billions in incremental ad revenue, no new users required.
Net-new markets - Agentic AI is creating categories we haven’t even named yet.
The takeaway: this is the most economically grounded compute cycle in decades.
Critics see Nvidia’s investments in OpenAI, Anthropic, Mistral, and others as “circular” - funding customers who then buy Nvidia chips. Management calls it CUDA distribution with equity upside in “once in a generation” assets. Jensen is building the ecosystem, seeding demand, and capturing value at every layer. And he expects those bets to “translate to extraordinary returns.”
And the best quote of the call: “How could anything be easy? What we’re doing has never been done before.” Exactly.
🍌 The Banana Means Business
It’s Google’s world, we’re just generating in it. The phoenix that rose from the ashes of “Bard” is now doing victory laps around the stadium, waving a fruit-shaped baton called Nano Banana Pro.
Their new image model goes right at the three core failures of generative visuals - broken text, aesthetic incoherence, and hallucinated nonsense:
Text That Doesn’t Look Like a Cry for Help. Historically, asking an AI model to write “Sale 50% Off” inside an image produced “Sdlq 50% 0ff.” This model finally gives us clean, legible text, multi-language typography, and diacritics that don’t look like accidental sprinkles. This upgrade unlocks ads, posters, explainers, product shots, social campaigns - actual commercial use, not just digital doodles.
Consistency by Design. Nano Banana Pro finally cracks the hardest part of visual storytelling - consistency. Blend up to 14 reference images while maintaining the look of 5 distinct people. Go from sketch to product render, blueprint to 3D realism, moodboard to campaign - all while preserving your brand’s visual DNA. We’ve moved from “generate a pretty picture” to “generate a coherent campaign.”
Real-World Knowledge. Built on Gemini 3 Pro, Nano Banana Pro fuses image generation with Search - pulling live data, embedding real facts, and updating as the world changes. This activates Google’s core moat - the canonical global database of facts and the crawler that continuously updates it, flipping the entire image-gen paradigm from “creative toy” to “visual knowledge engine”. Google is turning Search from a consumer product into a cognitive substrate for generative models. This may be their most durable long-term moat.
The Workflow Invasion. Nano Banana is burrowing into the workflow layer:
Slides → Beautify this deck.
Vids → Rewrite the shot list.
Gemini → Make me a heroic banana figurine.
Vertex AI → Pipeline this into content ops.
The real competition isn’t Midjourney or OpenAI - it’s Canva, Adobe Express, Figma, and the $300B marketing-design stack.
Watermarking. On paper, watermarking signals transparency. In practice, it’s a business model wearing a halo. By tagging free-tier images, Google creates a tiered rights system that maps neatly onto monetization: enterprises pay for the clean tier, consumers self-label as AI-native, and Google defines what “professional” looks like.
Nano Banana Pro is Google’s attempt to rebuild the entire global creative-production stack inside its ecosystem from consumer fun → to creator tooling → to enterprise workflows → to cloud monetization.
And the best part? They’re doing it under the cheerful, meme-ready name Nano Banana Pro, because in 2025, even strategic plays need a sense of humor.