macOS App

Your footage,
instantly organized.

Rushes uses on-device ML to automatically tag, categorize, and make your raw video footage searchable in seconds. No cloud. No subscriptions.

See Rushes in action

Everything you need to wrangle your footage

Built for filmmakers, editors, and content creators who work with hours of raw video.

ML-Powered Tagging

Automatically classifies footage as talking head, interview, B-roll, outdoor, food, screen recording, and more — all on-device.

Semantic Search

Search with natural language like "person walking in the rain" and find matching clips instantly using MobileCLIP embeddings.
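Semantic search of this kind typically works by embedding both the query text and each clip into a shared vector space, then ranking clips by cosine similarity. Here is a minimal illustrative sketch of that idea — not Rushes's actual implementation, and the toy 3-dimensional vectors stand in for real MobileCLIP embeddings:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_embedding, clip_embeddings, top_k=3):
    """Rank clips by similarity to the query embedding."""
    scored = [(name, cosine_similarity(query_embedding, emb))
              for name, emb in clip_embeddings.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy embeddings standing in for MobileCLIP's output.
clips = {
    "rain_walk.mov":  [0.9, 0.1, 0.0],
    "interview.mp4":  [0.1, 0.9, 0.2],
    "broll_food.mov": [0.0, 0.2, 0.9],
}
query = [1.0, 0.0, 0.1]  # embedding of "person walking in the rain"
print(search(query, clips, top_k=1))  # top match: rain_walk.mov
```

Because both text and video frames land in the same space, no manual tagging is needed for a query to find a visually matching clip.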

Smart Collections

Build dynamic filters with tag, duration, date, location, and favorite rules. Combine with AND/OR logic for powerful saved views.
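Rule-based smart collections generally work by composing per-clip predicates with boolean logic. The sketch below shows how AND/OR rules of this kind can combine — it is a hypothetical illustration, and the metadata field names (`tags`, `duration`, `favorite`) are made up, not Rushes's actual schema:

```python
def matches(clip, rules, mode="AND"):
    """Evaluate a list of predicate rules against one clip.

    mode="AND" requires every rule to pass; mode="OR" requires any.
    """
    results = [rule(clip) for rule in rules]
    return all(results) if mode == "AND" else any(results)

# Hypothetical clip metadata records.
clips = [
    {"tags": {"b-roll", "outdoor"}, "duration": 42, "favorite": True},
    {"tags": {"interview"}, "duration": 1800, "favorite": False},
]

# Saved view: clips tagged "b-roll" AND shorter than 60 seconds.
rules = [
    lambda c: "b-roll" in c["tags"],
    lambda c: c["duration"] < 60,
]

short_broll = [c for c in clips if matches(c, rules, mode="AND")]
print(len(short_broll))  # 1: only the outdoor b-roll clip qualifies
```

Because the rules are evaluated on demand, such a collection stays current as new footage is imported and analyzed.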

Map View

See where your footage was filmed on an interactive map. Select areas to browse clips by location with filmstrip previews.

Non-Destructive

Your originals are never copied or moved. Rushes stores lightweight references so your files stay exactly where they are.

Drag & Drop

Import clips by dropping files or folders onto the library. Drag clips out to Finder or your editing app to start working immediately.

From raw footage to organized library in minutes

1

Import

Drop your video files or folders into Rushes. Supports MP4, MOV, and MKV. Files are referenced in place — nothing is copied.

2

ML Analyzes

On-device ML automatically scans every clip, generating tags, scene descriptions, and semantic embeddings — all private, no cloud required.

3

Organize & Find

Browse by tags, search with natural language, filter with smart collections, or explore by location on the map.

From the maker

Hi all, I'm Edward, the solo developer behind Rushes. I built this app to solve my own problem: managing thousands of hours of raw footage across multiple projects and devices. I also wanted it to be completely offline, with no videos ever uploaded to the cloud, for privacy.

Below are some notes I've compiled:

System Requirements

  • macOS 15 (Sequoia) or later — the app uses SwiftUI APIs and Vision framework features that require macOS 15+
  • Apple Silicon (M1 or later) recommended — Core ML runs the CLIP models on the Neural Engine. Intel Macs will work but analysis will be significantly slower
  • RAM: 8GB minimum — the ML models load into memory (~30MB for MobileCLIP + Vision framework overhead)
  • Storage: ~150MB for the app — mostly the ML models. Video files are not copied, so no additional storage is needed beyond your existing footage

Good to Know

  • Supported formats: MP4, MOV, MKV
  • GPS metadata: Map view only works with clips filmed on GPS-enabled devices (iPhone, DJI drones, etc.). Screen recordings and downloaded videos won't appear on the map
  • Privacy: fully offline — all ML processing runs on-device. No data is sent to any server
  • Files are referenced, not copied — if you move or delete the original video files, Rushes will show a broken reference warning
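The reference-not-copy model can be pictured as storing a lightweight pointer to the original file and checking that the path still resolves at load time. An illustrative sketch under that assumption (not the app's actual mechanism, which would likely use macOS security-scoped bookmarks rather than plain paths):

```python
import os
import tempfile
from pathlib import Path

class ClipReference:
    """Lightweight pointer to an original video file; nothing is copied."""
    def __init__(self, path):
        self.path = Path(path)

    def is_broken(self):
        """True when the original has been moved or deleted."""
        return not self.path.exists()

# Demo: create a stand-in "video" file, reference it, then delete it.
fd, name = tempfile.mkstemp(suffix=".mov")
os.close(fd)
ref = ClipReference(name)
print(ref.is_broken())   # False: original still on disk
os.remove(name)
print(ref.is_broken())   # True: would trigger a broken-reference warning
```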

Beta Notes

  • 3-day beta trial — the app disables itself 3 days after first launch
  • Signed & notarized — the app is signed with an Apple Developer certificate and notarized by Apple. Just download, unzip, and open.

Feedback

Please email me at edwardsungswe@gmail.com with any problems you run into or suggestions you'd like to see implemented.