Type "drone shot over coastline" or "two people talking at a table" and Rushes finds matching clips from your library — instantly, privately, on your Mac.
How It Works
You have hundreds — maybe thousands — of video clips across projects. Finding one specific shot means scrubbing through timelines, remembering file names, or hoping your folder structure was good enough. It's slow, and it breaks your creative flow.
Rushes generates a semantic embedding for every clip in your library using MobileCLIP, Apple's efficient vision-language model. When you type a query like "sunset over water" or "person on phone indoors," Rushes embeds that text into the same vector space, scores every clip's embedding against it, and returns the best matches, ranked by similarity.
This isn't keyword matching. It understands concepts. Search for "celebration" and it finds clips of people cheering, confetti, clinking glasses — even if none of those words appear in any tag.
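Under the hood this is standard vector retrieval: embed the query, score each clip by cosine similarity, sort. Here's a minimal Swift sketch of that ranking step. `embedText` and the precomputed `clipEmbeddings` dictionary are stand-ins for Rushes' internals, not its actual API:

```swift
import Foundation

/// Placeholder for the MobileCLIP text encoder. The real app would run
/// a Core ML model here and return the query's embedding vector.
func embedText(_ query: String) -> [Float] {
    [Float](repeating: 0, count: 512)  // stub output; 512 dims assumed
}

/// Cosine similarity between two equal-length embedding vectors.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    var dot: Float = 0, normA: Float = 0, normB: Float = 0
    for i in 0..<a.count {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (normA.squareRoot() * normB.squareRoot() + 1e-8)
}

/// Embed the query, score every clip against it, return the top matches.
func search(query: String,
            in clipEmbeddings: [URL: [Float]],
            topK: Int = 20) -> [(clip: URL, score: Float)] {
    let q = embedText(query)
    let ranked = clipEmbeddings
        .map { (clip: $0.key, score: cosineSimilarity(q, $0.value)) }
        .sorted { $0.score > $1.score }
    return Array(ranked.prefix(topK))
}
```

Because the clip embeddings are computed once up front, a search is just one text-encoder pass plus a dot product per clip, which is why results come back instantly even over large libraries.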
All processing runs locally on your Mac, on the Apple Neural Engine where available. No footage is uploaded anywhere. No cloud API calls. Your clips stay private, and search works offline.
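Keeping inference on-device comes down to how the model is loaded. A sketch of the Core ML setup this implies, assuming a bundled compiled model named `MobileCLIP.mlmodelc` (an illustrative name, not a confirmed detail of the app):

```swift
import Foundation
import CoreML

/// Load the bundled encoder so all inference stays on-device.
func loadEncoder() throws -> MLModel {
    let config = MLModelConfiguration()
    // Prefer the Neural Engine; Core ML falls back to the CPU
    // automatically on machines without one (e.g. Intel Macs).
    config.computeUnits = .cpuAndNeuralEngine

    guard let url = Bundle.main.url(forResource: "MobileCLIP",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    // Predictions on this model run entirely locally; no network involved.
    return try MLModel(contentsOf: url, configuration: config)
}
```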
Found the clip you need? Drag it straight from Rushes into Final Cut Pro, DaVinci Resolve, Premiere Pro, or any editor that accepts file drops. No export step, no re-importing — your originals are never copied or moved.
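For the curious, this is roughly what drag-out looks like in SwiftUI. The `ClipRow` view and its properties are illustrative, but handing the receiving app the clip's original file URL via `NSItemProvider` is the standard mechanism that avoids exporting or copying anything on Rushes' side:

```swift
import SwiftUI

/// A search result row that can be dragged into any app accepting file drops.
struct ClipRow: View {
    let title: String
    let fileURL: URL  // the clip's original location on disk

    var body: some View {
        Label(title, systemImage: "film")
            .onDrag {
                // Provide the original file's URL, not its data:
                // the editor references the file in place, so nothing
                // is exported, copied, or moved by Rushes.
                NSItemProvider(object: fileURL as NSURL)
            }
    }
}
```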
Requirements
macOS 15 (Sequoia) or later. Apple Silicon (M1 or later) is recommended for best performance; Rushes runs on Intel Macs, but analysis is slower.