Feature Deep Dive

Search your footage
with words, not tags.

Type "drone shot over coastline" or "two people talking at a table" and Rushes finds matching clips from your library — instantly, privately, on your Mac.

AI-powered natural language video search

The Problem

You have hundreds — maybe thousands — of video clips across projects. Finding one specific shot means scrubbing through timelines, remembering file names, or hoping your folder structure was good enough. It's slow, and it breaks your creative flow.

How Semantic Search Solves It

Rushes generates a semantic embedding for every clip in your library using MobileCLIP, Apple's efficient vision-language model. When you type a query like "sunset over water" or "person on phone indoors," Rushes compares your text against every clip's embedding and returns the best matches, ranked by relevance.
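Under the hood, this kind of ranking boils down to comparing vectors. The sketch below is illustrative only, not Rushes' actual code: a CLIP-style model (such as MobileCLIP) maps both text and video frames into a shared vector space, and the best matches are simply the clips whose embeddings point in the most similar direction to the query's. The clip names and tiny 4-dimensional vectors here are made up for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_clips(query_embedding, library):
    """Return (clip_name, score) pairs sorted best match first."""
    scored = [(name, cosine_similarity(query_embedding, emb))
              for name, emb in library.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy library: each clip has a precomputed embedding (in reality, hundreds
# of dimensions produced by the vision encoder — these values are invented).
library = {
    "beach_drone.mov":      [0.9, 0.1, 0.0, 0.1],
    "office_interview.mov": [0.0, 0.8, 0.5, 0.1],
    "sunset_pier.mov":      [0.8, 0.2, 0.1, 0.0],
}

# A query like "drone shot over coastline" would be run through the text
# encoder to get a vector in the same space; we fake one near the beach clips.
query = [1.0, 0.1, 0.0, 0.1]

for name, score in rank_clips(query, library):
    print(f"{name}: {score:.3f}")
```

Because both text and images land in the same space, a query never has to match a filename or tag — only the visual concept the clip's embedding encodes.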

This isn't keyword matching. It understands concepts. Search for "celebration" and it finds clips of people cheering, confetti, clinking glasses — even if none of those words appear in any tag.

Completely On-Device

All processing runs locally on your Mac using Apple's Neural Engine. No footage is uploaded anywhere. No cloud API calls. Your clips stay private and the search works offline.

Built for Video Editors

Found the clip you need? Drag it straight from Rushes into Final Cut Pro, DaVinci Resolve, Premiere Pro, or any editor that accepts file drops. No export step, no re-importing — your originals are never copied or moved.

What You Can Search For

  • Scenes: "interview in an office," "car driving on highway," "crowded street at night"
  • Actions: "person running," "hands typing on keyboard," "dog playing in park"
  • Moods & Aesthetics: "moody blue lighting," "bright sunny day," "rainy window"
  • Objects: "coffee cup on table," "drone," "guitar"

System Requirements

macOS 15 (Sequoia) or later. Apple Silicon (M1 or later) is recommended for best performance; Rushes runs on Intel Macs, but clip analysis will be slower.