Your agents work. You watch.

Binge while you build.

Your AI agents run for minutes at a time. AIdle auto-pauses your shows when you need to pay attention, and resumes them the second your agent starts working again. Netflix, YouTube, Hulu, Prime, Audible — all handled.

HOW IT WORKS

Code, watch, repeat.

You hit play on anything

Netflix, YouTube, Hulu, Prime Video, Audible — whatever keeps you entertained between code reviews.

Your agent needs you

Cursor finishes a task, or Claude responds with something to review. AIdle pauses your media and brings your workspace forward.

Agent starts again. Show resumes.

You kick off the next task, your agent gets back to work, and your show picks up right where it left off. No remote needed.

Quick AI replies under 4 seconds? AIdle doesn’t even blink.
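One plausible reading of that 4-second rule, sketched below: AIdle only treats an agent run as worth pausing for if it lasts longer than the threshold. The names (`shouldPause`, `THRESHOLD_MS`) and the exact trigger semantics are assumptions for illustration, not AIdle's actual internals.

```typescript
// Hypothetical sketch of AIdle's 4-second threshold.
// Names and semantics are assumed, not taken from AIdle's source.

const THRESHOLD_MS = 4000; // AIdle's default 4-second threshold

// Decide whether an agent run should interrupt playback, given when the
// agent went busy and when it came back needing you. Quick replies, i.e.
// runs shorter than the threshold, are ignored and the show keeps playing.
function shouldPause(busySinceMs: number, needsYouAtMs: number): boolean {
  return needsYouAtMs - busySinceMs >= THRESHOLD_MS;
}

shouldPause(0, 2500);   // a quick reply: don't touch the show
shouldPause(0, 180000); // a 3-minute task: pause and surface the workspace
```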

WORKS WITH

Stream anything. Code with anything.

Your shows — what AIdle pauses & resumes

YouTube
Netflix
Hulu
Prime Video
Audible

Your AI tools — what triggers AIdle

Claude.ai
ChatGPT
Gemini
Cursor
THE OPPORTUNITY

That idle time adds up.

Modern AI agents run for minutes at a time. Most developers alt-tab to their phone. What if you could watch a real show instead — and have it seamlessly pause the moment your agent needs you back?

87%
of developers reach for their phone while waiting on AI agents
Stack Overflow Developer Survey, 2025
3-10 min
average agent task length in Cursor — plenty of time for a scene
Cursor changelog, 2025
4 sec
threshold — fast replies never interrupt your show
AIdle default configuration
GET STARTED

Running in three minutes.

Chrome Extension

  1. Download the .zip from GitHub
  2. Open chrome://extensions
  3. Enable Developer Mode → Load unpacked
  4. Enter your Server URL + User ID

Cursor Extension

  1. Download the .vsix from GitHub
  2. Open Cursor → Extensions → ··· → Install from VSIX
  3. Enter the same Server URL + User ID in settings

Need a server? Deploy the included Next.js app to Railway for free in one click, or point both extensions at our hosted endpoint.
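Since both extensions share one Server URL and User ID, the server presumably relays agent state between them. Here is a minimal sketch of that relay under stated assumptions: the in-memory store, the function names, and the state values are hypothetical, not AIdle's actual API.

```typescript
// Hypothetical relay core shared by the two extensions.
// Payload shapes and state names are assumptions, not AIdle's real server API.

type AgentState = "working" | "needs_you";

// In-memory store: User ID -> last reported agent state.
const states = new Map<string, AgentState>();

// Called when the Cursor-side extension reports a state change.
function reportState(userId: string, state: AgentState): void {
  states.set(userId, state);
}

// Called by the Chrome-side extension on each poll. Defaults to "needs_you"
// so media stays paused until an agent actually checks in as working.
function pollState(userId: string): AgentState {
  return states.get(userId) ?? "needs_you";
}

reportState("dev-42", "working");
pollState("dev-42"); // "working": the Chrome extension lets the show play
```

A real deployment would put this behind HTTP routes in the Next.js app, but the pattern stays the same: one writer (the editor extension), one poller (the browser extension), keyed by User ID.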