This directory contains the SQLite databases used by the Watch Finished system.
```mermaid
erDiagram
    FILES ||--o{ TASKS : "processed-by"
    SETTINGS ||--o{ DATASETS : "configures"
    FILES {
        string dataset PK
        string input PK
        string output
        string status
        string date
    }
    TASKS {
        integer id PK
        string dataset
        string input
        string output
        string preset
        string status
        integer progress
        string created_at
        string updated_at
    }
    SETTINGS {
        string key PK
        string value
    }
    DATASETS {
        string name PK
        boolean enabled
        string destination
        string exts
        string ext
        string preset
        string clean
    }
```
`database.db` - Main application database containing:

- `files` table: Processed video files with metadata
- `tasks` table: Video processing queue and task status
- `settings` table: Application configuration and dataset settings

`database.db.bak` - Backup of the main database (created during migrations)
```sql
CREATE TABLE files (
    dataset TEXT,   -- Dataset name (e.g., 'movies', 'tvshows')
    input TEXT,     -- Original file path
    output TEXT,    -- Processed file path
    status TEXT,    -- Processing status ('pending', 'processing', 'success', 'failed')
    date TEXT,      -- ISO timestamp of last update
    PRIMARY KEY (dataset, input)
);
```
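The composite primary key `(dataset, input)` means each source file is tracked at most once per dataset, so a re-run can record its result with an UPSERT instead of a duplicate row. A minimal sketch of that idea in Python (illustrative only, not the application's own code, which is a Node service):

```python
import sqlite3

# Sketch: UPSERT against the composite (dataset, input) primary key.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE files (
        dataset TEXT, input TEXT, output TEXT, status TEXT, date TEXT,
        PRIMARY KEY (dataset, input)
    )""")

def upsert_file(dataset, input_path, output, status, date):
    # ON CONFLICT targets the composite key, updating the existing row in place
    con.execute(
        """INSERT INTO files (dataset, input, output, status, date)
           VALUES (?, ?, ?, ?, ?)
           ON CONFLICT (dataset, input) DO UPDATE SET
               output = excluded.output, status = excluded.status, date = excluded.date""",
        (dataset, input_path, output, status, date),
    )

upsert_file("movies", "/in/film.mkv", None, "pending", "2024-01-01T00:00:00Z")
upsert_file("movies", "/in/film.mkv", "/out/film.mp4", "success", "2024-01-02T00:00:00Z")
rows = con.execute("SELECT status, date FROM files").fetchall()  # one row, updated in place
```

Both calls touch the same logical file, so only a single row exists afterwards, carrying the latest status and timestamp.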
```sql
CREATE TABLE tasks (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    dataset TEXT,       -- Target dataset
    input TEXT,         -- Input file path
    output TEXT,        -- Output file path
    preset TEXT,        -- HandBrake preset used
    status TEXT,        -- Task status
    progress INTEGER,   -- Processing progress (0-100)
    created_at TEXT,    -- Creation timestamp
    updated_at TEXT     -- Last update timestamp
);
```
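The `status` and `progress` columns support a simple queue pattern: claim the oldest pending task, then update its progress as encoding advances. A hedged sketch of that pattern (the claim query is an assumption for illustration, not the application's actual worker logic):

```python
import sqlite3

# Sketch of the queue pattern the tasks table supports (illustrative only).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE tasks (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        dataset TEXT, input TEXT, output TEXT, preset TEXT,
        status TEXT, progress INTEGER, created_at TEXT, updated_at TEXT
    )""")
con.execute(
    "INSERT INTO tasks (dataset, input, status, progress, created_at) "
    "VALUES ('movies', '/in/film.mkv', 'pending', 0, '2024-01-01T00:00:00Z')"
)

# Claim the oldest pending task...
task_id = con.execute(
    "SELECT id FROM tasks WHERE status = 'pending' ORDER BY created_at LIMIT 1"
).fetchone()[0]
# ...then report progress (0-100) as the encode runs
con.execute(
    "UPDATE tasks SET status = 'processing', progress = 42, "
    "updated_at = '2024-01-01T00:05:00Z' WHERE id = ?",
    (task_id,),
)
state = con.execute("SELECT status, progress FROM tasks WHERE id = ?", (task_id,)).fetchone()
```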
```sql
CREATE TABLE settings (
    key TEXT PRIMARY KEY,
    value TEXT   -- JSON-encoded setting value
);
```
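Since `value` holds JSON-encoded data, reading a setting is a serialize/deserialize round-trip. A short sketch (the key name and payload fields are taken from this README, not from the application source):

```python
import json
import sqlite3

# Sketch: JSON round-trip through the settings.value column.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")
con.execute(
    "INSERT INTO settings (key, value) VALUES (?, ?)",
    ("datasets/tvshows", json.dumps({"enabled": True, "preset": "Fast 1080p30"})),
)
raw = con.execute(
    "SELECT value FROM settings WHERE key = ?", ("datasets/tvshows",)
).fetchone()[0]
cfg = json.loads(raw)  # back to a dict with the original fields
```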
Dataset settings are stored in the settings table with keys like:

- `datasets/kids` - Kids movies dataset configuration
- `datasets/pr0n` - Adult content dataset configuration
- `datasets/tvshows` - TV shows dataset configuration

Each dataset configuration includes:

- `enabled`: Whether the dataset is active for watching
- `destination`: Output directory for processed files
- `exts`: File extensions to process
- `ext`: Output file extension
- `preset`: HandBrake encoding preset
- `clean`: Filename cleaning rules

Back up `database.db` before major changes. A copy is written to `database.db.bak` during migrations; to restore it, run `cp database.db.bak database.db`.

The application uses a database migration system to manage schema changes, so changes can be versioned and applied consistently across different environments.
Migration files are stored in data/migrations/ and are named with timestamps: YYYY-MM-DDTHH-MM-SS_migration_name.sql.
Migrations are automatically applied when the service starts. You can also run them manually:
```bash
# Check migration status
pnpm run migrate:status

# Apply pending migrations
pnpm run migrate:up

# Create a new migration
pnpm run migrate:create <migration_name>
```
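Because migration file names begin with a sortable timestamp, lexical order equals chronological order, and already-applied files can be skipped by recording their names. A minimal sketch of that mechanism (the real runner is the pnpm script above; the `migrations` bookkeeping table and file name below are assumptions for illustration):

```python
import pathlib
import sqlite3
import tempfile

def apply_migrations(con, migrations_dir):
    """Apply any *.sql files not yet recorded, in timestamp (lexical) order."""
    con.execute("CREATE TABLE IF NOT EXISTS migrations (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in con.execute("SELECT name FROM migrations")}
    for path in sorted(pathlib.Path(migrations_dir).glob("*.sql")):
        if path.name not in applied:
            con.executescript(path.read_text())
            con.execute("INSERT INTO migrations (name) VALUES (?)", (path.name,))

# Demo: one migration file, applied exactly once even when the runner is invoked twice
d = pathlib.Path(tempfile.mkdtemp())
(d / "2024-01-01T00-00-00_create_files.sql").write_text(
    "CREATE TABLE files (dataset TEXT, input TEXT, PRIMARY KEY (dataset, input));"
)
con = sqlite3.connect(":memory:")
apply_migrations(con, d)
apply_migrations(con, d)  # idempotent: already-applied files are skipped
count = con.execute("SELECT count(*) FROM migrations").fetchone()[0]
```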
When you need to make schema changes:

1. Create a new migration file:

   ```bash
   pnpm run migrate:create add_new_table
   ```

2. Edit the generated SQL file in `data/migrations/` with your schema changes.

3. Test the migration by running it:

   ```bash
   pnpm run migrate:up
   ```

4. Commit both the migration file and any code changes that depend on the new schema.
Since `database.db` is tracked in git to ensure schema consistency, follow this workflow for database changes:

- Use `database-test.db` for testing to avoid committing test data
- Change `database.db` only via schema migrations

If you accidentally modify `database.db` with data changes:

```bash
git restore data/database.db
```

This ensures the git commit flow remains clean and migrations can bring any database to the correct state.