Large AI labs are locking in long-term compute through multi-year infrastructure deals. At the same time, open-weight models are being built to run without NVIDIA hardware.
Ownership at the tool layer is also becoming less stable: the same labs funding infrastructure compete with the developer tools that depend on their models. These shifts are unfolding in parallel.
DeepSeek released open models designed for non-NVIDIA hardware
On April 24, DeepSeek released V4-Pro and V4-Flash as open-weight models with 1M token context windows. The models are optimized for Huawei Ascend chips, and DeepSeek released the weights with a technical paper.
Independent reports describe this as a frontier-level open release built from the start for non-NVIDIA hardware. That challenges the default assumption that open deployments need an NVIDIA-first stack.
Teams that rely on open models now have a stronger non-NVIDIA path at the frontier level.
OpenAI released GPT-5.5 with higher coding benchmark scores
OpenAI released GPT-5.5 around April 23 and reported higher scores than GPT-5.4: 82.7% vs 75.1% on Terminal-Bench 2.0 and 58.6% vs 57.7% on SWE-Bench Pro. OpenAI listed pricing at $5 input and $30 output per million tokens with a 1M token context window.
The reported numbers point to better coding and long-context performance. The benchmark figures are self-reported by OpenAI, and no independent validation has been published in these sources.
The model appears stronger on coding benchmarks, but it also carries a higher output-token price than the prior version.
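The listed rates make per-request cost easy to estimate. A minimal sketch, assuming the pricing above ($5 input, $30 output per million tokens); the token counts in the example are illustrative, not from any benchmark.

```python
# Cost sketch at the listed GPT-5.5 rates: $5 per 1M input tokens,
# $30 per 1M output tokens. Token counts below are illustrative.
INPUT_RATE = 5.00 / 1_000_000    # dollars per input token
OUTPUT_RATE = 30.00 / 1_000_000  # dollars per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single request at the listed per-token rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A long-context coding request: 200k tokens in, 5k tokens out.
print(f"${request_cost(200_000, 5_000):.2f}")  # $1.15
```

Note how the output side dominates for generation-heavy work: at these rates, 5k output tokens cost $0.15 while 30k input tokens cost the same, which is why the higher output price matters most for agentic coding loops.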
Google and Amazon committed about $65 billion to Anthropic
On April 24, Google committed up to $40 billion to Anthropic, with $10 billion committed immediately at a $380 billion valuation. Amazon had committed up to $25 billion the week before. Together, the announced commitments total about $65 billion in roughly ten days.
Claude now sits on both AWS and Google Cloud with explicit long-term backing from both providers. This looks like structural infrastructure positioning, not a short-term partnership.
Anthropic now has unusually deep cloud support from both major providers at the same time.
SpaceX disclosed an option to acquire Cursor for $60 billion
SpaceX announced a partnership with Cursor to build a next-generation coding AI on the Colossus supercomputer. SpaceX also disclosed an option to acquire Cursor for $60 billion, or alternatively to pay $10 billion for the development work. Two senior Cursor engineering leads had already moved to xAI the prior month.
Cursor now faces uncertainty on ownership, model access, and roadmap control. Anthropic and OpenAI, whose models currently power Cursor, also ship competing coding products.
Developers using Cursor may face strategic volatility in both ownership and model-supply alignment.
Deep Research Max (Google Gemini API)
Google launched Deep Research Max on April 21 as an autonomous research agent in the Gemini API. A single request can search the open web, connected file stores, and MCP servers together. Google positions this as the replacement for its December 2025 preview.
Use this when you need one workflow that combines MCP-compatible internal data with public web data, without building separate source integrations.
Source: Google Blog →
Restricted AI models can be found by guessing the API URL
On launch day, unauthorized users accessed Anthropic's restricted Claude Mythos Preview. Reports say they guessed the API endpoint from known Anthropic URL patterns and used a credential held by a third-party contractor. The broader issue is endpoint enumeration: predictable URL patterns can expose restricted endpoints even without direct credential theft. As labs ship higher-access restricted models, third-party credential chains become a distinct attack surface.
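The enumeration mechanic is simple to illustrate: if endpoint paths follow a predictable template, candidate URLs for unreleased models can be generated mechanically and probed with any working credential. The path template and name fragments below are hypothetical, not Anthropic's actual URL scheme.

```python
from itertools import product

# Hypothetical path template and name fragments; real providers'
# URL schemes differ, but the enumeration mechanic is the same.
TEMPLATE = "https://api.example.com/v1/models/{family}-{variant}-{tag}"
families = ["claude"]
variants = ["mythos", "opus", "haiku"]
tags = ["preview", "latest"]

# Generate candidate endpoints by filling the template combinatorially.
candidates = [
    TEMPLATE.format(family=f, variant=v, tag=t)
    for f, v, t in product(families, variants, tags)
]
print(len(candidates))  # 6 candidates from 1 x 3 x 2 fragments
```

A leaked third-party credential tried against each candidate would reveal which restricted endpoints exist and respond, which is why unguessable endpoint identifiers and per-endpoint authorization, not just credential secrecy, are the relevant defenses.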
Source: TechCrunch →
$30 billion
This was Anthropic's annualized revenue run rate when the Amazon deal was announced, up from around $9 billion at the end of 2025. The scale helps explain why roughly $65 billion in commitments can be framed as competitive positioning.
Source: CNBC →
The same labs that supply core models are now moving to control more of the tool layer. Compute deals lock in infrastructure, while tool partnerships and acquisition options pressure the application layer. The stack is consolidating from both directions.
- 01 OpenAI Blog — Introducing GPT-5.5
- 02 TechCrunch — GPT-5.5 / OpenAI super app
- 03 Google Blog — Deep Research Max
- 04 CNBC — Google $40B Anthropic investment
- 05 Reuters — Google $40B Anthropic investment
- 06 Anthropic Blog — Amazon compute deal
- 07 MarkTechPost — DeepSeek-V4
- 08 MIT Technology Review — DeepSeek-V4
- 09 TechCrunch — SpaceX / Cursor
- 10 TechCrunch — Anthropic Mythos unauthorized access
- 11 Bloomberg — Anthropic Mythos unauthorized access