Sep 8, 2025

Hey, it’s Abe. If you don’t have time to scroll through another 20-tab Twitter thread or decode yet another “AI is coming for your job” headline, don’t worry — I’ve got you.
Let’s get into what’s actually useful this week:
OpenAI Thinks It Solved Hallucinations — Sort Of
Abe’s take: This is the first honest approach to hallucinations that doesn’t feel like PR spin. Training needs to change — period. This paper actually explains how.
OpenAI just dropped a research paper claiming the reason chatbots hallucinate isn’t some deep mystery — it’s because their training rewards them for confidently guessing, even when they’re wrong. Saying “I don’t know” literally gets them zero credit.
What they found:
Models get full marks for lucky guesses and none for uncertainty.
So, guess what? They always guess. Confidently. Even when wrong.
They tested this by asking for birthdays and dissertation titles — the models made stuff up every time, but sounded sure.
Their fix? Penalize wrong guesses more than “I don’t know” answers.
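The incentive problem above is just arithmetic, and a tiny sketch makes it concrete. This is my own toy illustration of the scoring logic the paper describes, not OpenAI's actual benchmark code; all names are invented.

```python
# Toy model of the grading incentive: expected score for a model that
# guesses vs. one that abstains, under two scoring schemes.
# (Illustrative only -- not OpenAI's benchmark code.)

def expected_score(p_correct, reward_correct, penalty_wrong, abstain_score, guess):
    """Expected score on one question.

    p_correct: the model's chance of guessing right.
    guess: True to answer anyway, False to say "I don't know".
    """
    if not guess:
        return abstain_score
    return p_correct * reward_correct + (1 - p_correct) * penalty_wrong

p = 0.25  # an unsure model: only a 25% chance its guess is right

# Today's grading: right = 1, wrong = 0, "I don't know" = 0.
# Guessing always weakly dominates abstaining, so models always guess.
guess_today   = expected_score(p, 1, 0, 0, guess=True)   # 0.25
abstain_today = expected_score(p, 1, 0, 0, guess=False)  # 0.0

# The proposed fix: wrong answers cost more than abstaining (wrong = -1).
# Now guessing only pays when the model is genuinely confident.
guess_fixed   = expected_score(p, 1, -1, 0, guess=True)   # -0.5
abstain_fixed = expected_score(p, 1, -1, 0, guess=False)  # 0.0
```

Under the first scheme the confident guess "wins" even at 25% accuracy; under the second, abstaining does, which is exactly the behavioral shift the paper is after.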
Why it matters: This could finally shift how AI models behave — especially in high-stakes use cases like healthcare, finance, legal, or even customer service. Accuracy is great, but reliability is what builds trust.
Anthropic to Authors: We’ll Pay $1.5B
Abe’s take: This isn’t just a copyright lawsuit — it’s a signal that AI companies can’t treat “data” like a free-for-all anymore.
Anthropic just agreed to pay at least $1.5B to settle a class-action suit brought by authors whose work it used to train Claude without permission. The kicker? Most of that content came from pirated shadow libraries like LibGen.
Quick hits:
About 500,000 books were involved.
$3,000 per work is the ballpark settlement number.
Court says legally purchased books = fair game, pirated ones = nope.
Anthropic also has to delete all the pirated material, permanently.
Why it matters: Legal lines are finally being drawn. If you’re building internal tools or using off-the-shelf models, start thinking about where the data actually comes from. Compliance isn’t optional anymore.
Automate Web Monitoring with Yutori Scouts
Abe’s take: This is one of the simplest ways to save 20 minutes a day. No more obsessively refreshing that page or hunting for news manually.
Yutori Scouts is a tool that watches websites for updates and emails you when something changes. It’s basically an AI intern who doesn’t sleep.
Use it for:
Spotting RFPs or press releases the moment they go live
Keeping tabs on job postings or grant openings
Tracking competitor changes (without paying for enterprise tools)
Getting alerts when a product restocks or a page updates
You set what to watch, how often you want updates, and Yutori handles the rest.
Pro tip: Pair this with Make or Zapier to push those alerts into Slack, Notion, or a CRM. That’s when it really clicks.
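If you're curious what a "scout" is doing under the hood, the core loop is simple: fingerprint the page, compare against the last fingerprint, alert on change. Here's a minimal DIY sketch of that idea. To be clear, this is not Yutori's API; every name here is invented, and in real use `fetch` would be something like `lambda u: requests.get(u, timeout=10).text`.

```python
# Minimal DIY page-change watcher: hash the page content, compare with
# the last seen hash, and report a change. (Not Yutori's API -- just an
# illustration of the concept; all names are invented.)
import hashlib

def page_fingerprint(html: str) -> str:
    """Stable fingerprint of a page's content."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def check_for_update(url: str, last_seen: dict, fetch) -> bool:
    """Return True if the page changed since the last check.

    last_seen: dict mapping url -> last fingerprint (mutated in place).
    fetch: callable url -> html string (injected so it's easy to test).
    Note: the very first check of a url counts as a change.
    """
    current = page_fingerprint(fetch(url))
    changed = last_seen.get(url) != current
    last_seen[url] = current
    return changed
```

Run it on a cron schedule and pipe `changed == True` into a Slack webhook or email, and you've rebuilt the free tier of this category, though the hosted tools add smarter diffing and summaries.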
Popcorn: AI Video Tool That Goes Beyond “Clips”
Abe’s take: This isn’t just text-to-video. This is text-to-mini-movie. Feels like what Runway should have been by now.
Popcorn lets you drop in one prompt and get a full 1–3 minute video back — complete with a plot, characters, synced voices, and edits. No stitching scenes or dragging timelines.
It handles:
Full ideation-to-output video creation
Dialogue, soundtrack, sound FX, and scene transitions
Consistent characters and styles
Instant genre or narrative swaps
Why it matters: You don’t need a full creative team to storyboard ideas anymore. Brands can now visualize ad concepts, explainers, or pitch decks in video — without hiring a single editor. And creators? It’s a new toy with real teeth.
Use code AGENTIC for a free test run.
OpenAI Is Building Its Own Chips
Abe’s take: This is OpenAI’s next power move — and it’s about control. Less dependency on Nvidia means faster rollout and lower cost at scale.
OpenAI is working with Broadcom to build custom AI chips, reportedly placing a $10B order. These chips will fuel internal use — not go to market.
Key points:
These aren’t generic GPUs — they’re built for OpenAI’s infrastructure.
They want to double compute capacity in the next 5 months.
This move helps power GPT-5 and beyond while dodging GPU shortages.
Other giants (Meta, Google, Amazon) are already doing the same.
Why it matters: Owning the compute stack is the new moat. If you’re an AI company or heavy API user, this could shift pricing, speed, and service availability — especially if OpenAI starts prioritizing internal usage over third-party access.
Tool of the Week: Gumloop
Abe’s take: If you spend hours turning raw research into readable content, Gumloop is your new best friend.
It automates workflows for blog creation, turning sources like RSS feeds, transcripts, or scraped content into ready-to-publish posts or summaries.
What it’s good for:
Weekly digest creation
SEO content pipelines
Market research repackaging
Internal reports that don’t suck
Real talk: This replaces a VA or junior marketer for teams already overloaded. Set up a flow once and let it run in the background while you build.
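The weekly-digest flow above is the easiest one to picture. Here's a stripped-down toy version of that kind of pipeline: pull items out of an RSS feed and flatten them into a paste-ready digest. This is my own sketch, not Gumloop's actual workflow; the real product layers in scraping, LLM summarization, and publishing steps.

```python
# Toy RSS -> digest pipeline, the bare bones of what a tool like Gumloop
# automates. (My own sketch, not Gumloop's workflow.)
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<rss version="2.0"><channel>
  <title>AI News</title>
  <item><title>OpenAI paper on hallucinations</title>
        <link>https://example.com/1</link></item>
  <item><title>Anthropic settles for $1.5B</title>
        <link>https://example.com/2</link></item>
</channel></rss>"""

def rss_to_digest(rss_xml: str) -> str:
    """Flatten an RSS feed into a bulleted, markdown-style digest."""
    channel = ET.fromstring(rss_xml).find("channel")
    lines = [f"## {channel.findtext('title')} weekly digest"]
    for item in channel.iter("item"):
        lines.append(f"- [{item.findtext('title')}]({item.findtext('link')})")
    return "\n".join(lines)

print(rss_to_digest(SAMPLE_RSS))
```

Swap the sample string for a real feed fetch and add a summarization step per item, and you have the skeleton of a set-and-forget content pipeline.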
Final Thought
There’s a difference between tools that wow you for 5 minutes and tools that quietly save you hours every week. This week, we saw a few of the latter.
If you’re building systems — not just content — these are the shifts to watch.
Want to stay ahead of AI — not the hype, just the real tech quietly changing how we live, work, and parent? Join our free AI Advantage community here: The AI Advantage Community

Thanks for reading,
Abe