
Advanced: Professional File Organization & Archiving


Managing a multi-terabyte recording library requires more than just a large drive. To keep your archive searchable and stable, you need to implement automated naming conventions and structured archival workflows.

1) Optimized Naming Schemes

CaptureGem allows you to use variables to automatically sort files into folders. A professional-grade naming scheme ensures that no two files ever collide and that they remain organized even if the app’s database is lost.

Recommended Scheme: {Site}/{Model}/{Year}-{Month}-{Day}_{Hour}-{Minute}_{Model}_{Resolution}.mp4

  • Why this works: Sorting by Site and then Model keeps your folder counts manageable. Including the Resolution in the filename helps you identify high-quality clips without opening them.
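To make the expansion concrete, here is a minimal sketch of how such a scheme resolves into a real path. The build_path function and its arguments are illustrative only, not part of CaptureGem's API; the app substitutes these variables for you.

```python
from datetime import datetime

def build_path(site, model, resolution, when):
    """Expand the recommended naming scheme into a relative path.

    All parameters are hypothetical stand-ins for CaptureGem's
    {Site}, {Model}, {Resolution}, and timestamp variables.
    """
    return f"{site}/{model}/{when:%Y-%m-%d_%H-%M}_{model}_{resolution}.mp4"

print(build_path("MySite", "alice", "1080p", datetime(2024, 3, 5, 14, 30)))
# MySite/alice/2024-03-05_14-30_alice_1080p.mp4
```

Because the timestamp and model name are baked into the filename itself, every clip remains identifiable even if it is later moved out of its folder.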

2) Managing Sidecar Files

Every recording generates auxiliary data that you should keep alongside the video file.

  • Metadata (.json): Contains the exact start/end times and site-specific info.
  • Thumbnails (.jpg): Essential for fast browsing in the Review Tab.
  • Log Files: Useful for debugging if a specific segment was dropped.

Pro Tip: Keep these in the same folder as the .mp4. If you move a recording manually, ensure you move the associated sidecars to maintain the link in the CaptureGem UI.
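If you script your moves, it is easy to carry the sidecars along automatically. The sketch below assumes sidecars share the recording's base name and differ only by extension; move_with_sidecars and the extension list are hypothetical names, not CaptureGem features.

```python
import shutil
from pathlib import Path

# Assumed sidecar extensions; adjust to match what your recordings produce.
SIDECAR_EXTS = (".json", ".jpg", ".log")

def move_with_sidecars(mp4_path, dest_dir):
    """Move a recording plus any sidecars sharing its stem into dest_dir."""
    mp4 = Path(mp4_path)
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for candidate in (mp4, *(mp4.with_suffix(ext) for ext in SIDECAR_EXTS)):
        if candidate.exists():
            shutil.move(str(candidate), str(dest / candidate.name))
```

Moving the group as a unit means the thumbnail and metadata stay discoverable next to the video, so the Review Tab link is preserved.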

3) Automated Move-to-Archive

Active recordings should always be written to a fast local SSD. Once a week, move older content to slower, cheaper NAS storage.

Simple Python Archiver:

import os
import shutil
from datetime import datetime, timedelta

SOURCE = "/path/to/ssd/recordings"
DEST = "/path/to/nas/archive"
AGE_DAYS = 7

# Compute the cutoff once, so every file is judged against the same moment
cutoff = datetime.now() - timedelta(days=AGE_DAYS)

for root, dirs, files in os.walk(SOURCE):
    for f in files:
        path = os.path.join(root, f)
        mtime = datetime.fromtimestamp(os.path.getmtime(path))
        if mtime < cutoff:
            # Recreate the Site/Model folder structure on the NAS
            rel_path = os.path.relpath(path, SOURCE)
            dest_path = os.path.join(DEST, rel_path)
            os.makedirs(os.path.dirname(dest_path), exist_ok=True)
            shutil.move(path, dest_path)

4) Verifying Integrity with Checksums

When moving terabytes of data, “bit rot” or transfer errors can occur. Use tools like Hasher (Windows) or md5sum (Linux/macOS) to verify that the file on the NAS is a bit-perfect match of the original on the SSD.
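If you prefer to script the check, the same idea can be sketched in a few lines of Python using the standard-library hashlib module. The file_digest and verify_copy helpers are hypothetical names introduced here for illustration.

```python
import hashlib

def file_digest(path, algo="sha256", chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so multi-GB clips never load fully into RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as fh:
        for block in iter(lambda: fh.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

def verify_copy(original, copy):
    """Return True only if both files hash to the identical digest."""
    return file_digest(original) == file_digest(copy)
```

A sensible workflow: hash the SSD copy before the transfer, hash the NAS copy after, and only delete the original once verify_copy returns True.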

Summary

A professional archive is defined by its organization. By using strict Naming Schemes and Automated Archiving, you can scale your library to 100TB+ while keeping every single clip instantly accessible.
