This past week felt like a decade compressed into a news cycle. You know the saying: “There are decades where nothing happens, and weeks where decades happen.”
Ironic to quote a notorious communist in a newsletter about funding the future, but here we are.
Musk’s mega-merger sucked the oxygen out of the room, but venture checks still cleared: from $16B for robotaxis to $1B for alt-silicon, AI’s capital cycle keeps flexing.
Let's dive in.
Want to get the most out of ChatGPT?
ChatGPT is a superpower if you know how to use it correctly.
Discover how HubSpot's guide to AI can elevate both your productivity and creativity to get more things done.
Learn to automate tasks, enhance decision-making, and foster innovation with the power of AI.
🔥 Biggest AI Funding Rounds This Week
Waymo – $16B (Private financing) – Mountain View, CA
Robotaxi heavyweight reloads for global AV expansion and operations scale.
Cerebras Systems – $1B (Late-stage) – Sunnyvale, CA
Wafer‑scale AI compute to ease the Nvidia bottleneck and build out clusters.
ElevenLabs – $500M (Series D) – London, UK
Text‑to‑speech leader turns voice into an agent platform; Sequoia leads.
Positron – $230M (Series B) – Reno, NV
High‑speed memory for AI chips to cut inference power and cost; QIA joins round.
Resolve AI – $125M (Series A) – San Francisco, CA
AI SRE copilot that auto‑triages outages and speeds incident response.
🦄 Startups to Watch:
Linq – $20M (Series A) – Birmingham, AL
Embeds AI assistants directly into iMessage/RCS so brands can chat and transact natively.
EnFi – $15M (Seed) – Boston, MA
Agentic credit analysts for banks to underwrite faster with explainability; if the guardrails hold, a sticky system of record for regulated lending.
Sapiom (Seed) - Building rails for AI agents to buy their own SaaS/infra; if agentic procurement scales, this is the checkout layer.
Loop AI (Series A) - Pragmatic “AI agents for workflows” posture; early traction suggests bottoms‑up land‑and‑expand in enterprises.
Breezy (Pre‑seed) - Automates AR/invoicing for SMBs; crisp wedge with immediate ROI in cash‑flow ops.
🗓️ Upcoming Conferences:
PS: We updated our list of upcoming conferences; check out the full list here!
📡 Signals – What This Week Tells Us:
Mega‑rounds are back in force: AV and AI‑chips dominated with Waymo ($16B) and Cerebras ($1B).
Voice is a platform play as ElevenLabs arms agents and media with $500M.
“Alt‑silicon” and sovereign compute narratives strengthen (Positron raise; OpenAI shopping beyond Nvidia).
Agentic fintech/ops is getting real: EnFi (credit agents) and Resolve (AI SRE) target regulated, reliability‑critical workflows.
🤝 Mergers & Acquisitions:
SpaceX acquired xAI, creating a $1.25T AI‑space juggernaut.
Varonis to acquire AI‑security firm AllTrue for ~$125M (cash).
Brookfield to buy Peakstone Realty Trust for ~$1.2B as infra players chase AI‑adjacent demand.
M&A Pulse: From orbit to warehouses, infra consolidation is accelerating while governance/security tuck‑ins pick up.
Beyond the Feed: AI Chips That Think at the Speed of Light
Researchers crossed a line that silicon has been tiptoeing around for decades.
A team at MIT and Lightmatter demonstrated a fully programmable photonic AI processor that performs core machine-learning operations using light instead of electricity.
No electrons moving through wires or heat-choked transistors.
Just photons bouncing through nanoscale waveguides.
The Wild Part:
These chips compute by interfering beams of light with each other. Matrix multiplication, the bottleneck behind every large model, happens almost instantly as light passes through the chip. There’s no clock speed in the traditional sense. The computation finishes at, quite literally, the speed of light.
In lab tests, the system executed inference tasks orders of magnitude faster than comparable GPU setups while consuming dramatically less power. The chip doesn’t “run” code so much as reshape light to become the computation.
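To make the interference idea concrete, here is a toy numpy sketch, not the MIT/Lightmatter design: a mesh of programmable beam splitters behaves like a fixed matrix, and light amplitudes passing through it emerge already multiplied by that matrix. The mesh layout and angle count here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def beam_splitter(theta):
    """A 2x2 coupler that interferes two optical modes."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def mesh(angles, n):
    """Compose stages of adjacent-mode couplers into one transform U,
    the way a Mach-Zehnder interferometer array is programmed by its
    phase-shifter angles."""
    U = np.eye(n)
    for i, theta in enumerate(angles):
        stage = np.eye(n)
        j = i % (n - 1)          # which adjacent mode pair this stage mixes
        stage[j:j + 2, j:j + 2] = beam_splitter(theta)
        U = stage @ U
    return U

n = 4
angles = rng.uniform(0, 2 * np.pi, size=12)
U = mesh(angles, n)

x = rng.normal(size=n)           # input light amplitudes
y = U @ x                        # what the output detectors would read

# The mesh is passive and lossless (orthogonal here), so the "compute
# time" for this matrix multiply is just the light's transit time.
assert np.allclose(U.T @ U, np.eye(n))
print(y)
```

The point of the sketch: there is no loop over multiply-accumulates at runtime. The matrix is baked into the optics, and the multiplication happens as a single pass of light through the chip.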
The Breakthrough:
The processor is programmable, stackable, and designed to plug into existing data-center infrastructure. Engineers figured out how to correct for optical noise, temperature drift, and manufacturing imperfections that have plagued photonic computing for years.
Translation: this is the first time photonic AI looks deployable, not just impressive.
Why It Matters:
AI’s biggest constraint is power (no surprise there).
As models scale, data centers are slamming into energy ceilings, cooling limits, and grid bottlenecks. Photonic chips sidestep many of those constraints entirely. If this tech scales, it doesn’t just make AI faster. It reshapes where AI can live: edge devices, satellites, autonomous systems, and power-constrained environments where GPUs simply do not fit.
Signal to Watch:
If hyperscalers start pairing photonic inference chips with traditional GPUs, that’s the tell. This becomes infrastructure, not experiment.
Meme of the Week:

📬 Enjoyed the intel?
See you next week.
Until then: ship fast, raise smart, stay curious.
Feed The AI




