
AI Power Plays: Big Tech, Big Chips, and Big Lawsuits Reshape the Future of Intelligence

NYT vs. Perplexity: A New Flashpoint in the AI–News Wars

The New York Times has filed a lawsuit against AI search startup Perplexity, accusing it of copyright infringement and of unfairly substituting its products for the newspaper’s own work. The Times says Perplexity’s retrieval-augmented generation tools pull from websites and databases — including Times articles, some behind a paywall — and then repackage that material in responses that can be verbatim or near-verbatim copies of the original reporting, without permission or payment. The paper also claims Perplexity’s system has hallucinated information and wrongly attributed it to the Times, damaging its brand.
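For readers unfamiliar with the technique at the center of the complaint, here is a minimal, purely illustrative sketch of the retrieval-augmented generation pattern: fetch the passages most relevant to a query, then hand them to a language model as context for its answer. The toy corpus, word-overlap scoring, and prompt format below are stand-ins of our own, not Perplexity’s actual pipeline.

```python
# Minimal RAG sketch: retrieve the best-matching passages, then build a
# grounded prompt. Everything here is a toy placeholder for illustration.

def score(query: str, passage: str) -> int:
    """Toy relevance score: number of lowercase words the query and passage share."""
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k passages that best match the query under the toy score."""
    return sorted(corpus, key=lambda p: score(query, p), reverse=True)[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Stitch retrieved passages into a prompt a language model would answer from."""
    sources = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only the sources below.\n\nSources:\n{sources}\n\nQuestion: {query}"

corpus = [
    "Toy passage about an AI chip announcement.",
    "Toy passage about a copyright lawsuit over AI and news articles.",
    "Toy passage about an open model for self-driving cars.",
]

query = "Who is involved in the copyright lawsuit over AI?"
prompt = build_prompt(query, retrieve(query, corpus))
print(prompt)  # A real system would send this prompt to an LLM for the final answer.
```

Whether the summaries such a pipeline produces count as fair use or as substitution for the underlying journalism is precisely what the courts are now being asked to decide.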

The suit lands as publishers test different strategies with AI: some have signed licensing deals (The Times itself with Amazon, other outlets with companies like OpenAI), while others are going to court to gain leverage and push AI firms toward paid agreements. Perplexity, for its part, has tried to address these concerns with a publisher revenue-sharing program, a subscription tier that shares most of its fees with participating outlets, and a licensing deal with Getty Images — but several major media organizations have already lined up against it. The Times is asking the court to stop Perplexity from using its content and to award damages, adding fresh legal pressure to the ongoing fight over whether AI companies can freely train on and summarize journalism, or must pay to use it.

Apple Swaps AI Chiefs as It Scrambles to Catch Up

Apple’s longtime AI boss John Giannandrea is stepping down after seven years, staying on only as an adviser through spring 2026. He’s being replaced by Amar Subramanya, a seasoned Microsoft executive who previously spent 16 years at Google and led engineering for the Gemini Assistant. The hire gives Apple an AI leader who knows its biggest rivals’ technology and strategy from the inside.

The shake-up comes after a rocky rollout of Apple Intelligence, which has faced harsh reviews, embarrassing hallucinations in notification summaries, delayed Siri upgrades, and even class-action lawsuits from iPhone 16 buyers. Reports have described internal dysfunction, talent losses to competitors, and Apple’s growing reliance on Google’s Gemini to power the next Siri — an ironic twist given their long-running rivalry. Subramanya now has the job of turning that around and showing whether Apple’s privacy-first, on-device AI approach can compete with the massive cloud models driving the rest of the industry.

Nvidia’s New ‘Brain’ for Self-Driving Cars Goes Open Source

Nvidia has unveiled Alpamayo-R1, a new open reasoning vision–language model designed specifically for autonomous driving research. Built on the company’s Cosmos-Reason model family, Alpamayo-R1 can process both images and text, helping AI systems “see” the road, interpret what’s happening, and think through driving decisions step by step instead of just reacting. Nvidia says this kind of reasoning is key to reaching Level 4 autonomy, where vehicles can drive themselves in defined areas and conditions with minimal human intervention.

The model, along with related tools, is being released on GitHub and Hugging Face, backed by a new “Cosmos Cookbook” of guides, inference resources, and post-training workflows to help researchers customize it for their own robots and vehicles. Nvidia is positioning all this as part of its bigger push into “physical AI” — robots and autonomous machines powered by its GPUs — with leaders like CEO Jensen Huang arguing that building the “brains” for real-world robots is the next major wave of AI.
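As a rough sketch of what experimenting with the release might look like, the snippet below loads an open vision-language model from Hugging Face with the transformers library and asks it to reason about a driving scene. The repository id, the generic Auto classes, and the prompt are assumptions on our part; the Cosmos Cookbook is the authoritative reference for the real model id, dependencies, and prompt format.

```python
# Hedged sketch of trying an open driving VLM from Hugging Face.
# The repo id below is a placeholder guess, not a confirmed listing.
from PIL import Image
from transformers import AutoProcessor, AutoModelForVision2Seq

model_id = "nvidia/Alpamayo-R1"  # placeholder repo id; check Nvidia's actual Hugging Face page
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForVision2Seq.from_pretrained(model_id, device_map="auto")

image = Image.open("dashcam_frame.jpg")  # a single front-camera frame
prompt = "Describe the scene and reason step by step about what the vehicle should do next."

# Encode the image-plus-text query, generate a reasoning trace, and decode it.
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

The appeal of this step-by-step reasoning output, rather than a bare steering command, is that researchers can inspect why the model proposes a maneuver before wiring it into a driving stack.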

Amazon’s Trainium3 Turbocharges AI — and Makes Room for Nvidia

Amazon Web Services has unveiled Trainium3, a new 3nm AI training chip powering its Trainium3 UltraServer system. Announced at AWS re:Invent 2025, the third-gen hardware delivers over 4x the performance and 4x the memory of its predecessor for both training and inference, with up to 1 million chips linkable across thousands of servers. AWS also claims about 40% better energy efficiency, pitching the platform as a way for customers to cut both costs and power usage as AI workloads explode. Big AI users like Anthropic and Japan’s Karakuri are already using the new system and report significantly lower inference costs.

Looking ahead, AWS teased Trainium4, which is already in development and will support Nvidia’s NVLink Fusion high-speed interconnect. That means future AWS systems could tightly couple Amazon’s cheaper, homegrown chips with Nvidia GPUs while still fitting into Amazon-designed racks. By embracing Nvidia’s ecosystem — including CUDA, the dominant standard for AI apps — Amazon is positioning Trainium not as an island, but as a more affordable extension of Nvidia-based AI clusters running in its cloud.



Don't Want to Miss Anything?

Sign up for our weekly newsletter

A once-a-week situation report on everything you need to know from this week in AI.
