mirror of
https://github.com/Dictionarry-Hub/profilarr.git
synced 2026-01-27 21:20:53 +01:00
fix: require media management settings before quality profile sync
# Automatic Entity Release Import

**Status: Complete**

## Summary

Implemented a feature to bulk-import releases from connected Arr instances into
the PCD for entity testing. Key components:

- **API endpoints**: `/api/v1/arr/library` and `/api/v1/arr/releases`
- **Client methods**: `RadarrClient.getReleases()`, `SonarrClient.getSeries()`,
  `SonarrClient.getReleases()`, `SonarrClient.getSeasonPackReleases()`
- **Utilities**: Flag normalization, indexer name sanitization, title similarity
  (Dice coefficient), release grouping/deduplication
  (`src/lib/server/utils/arr/releaseImport.ts`)
- **UI**: `ImportReleasesModal.svelte` - two-step flow (library selection →
  release selection with search/sort/bulk-select)
- **Entry point**: Import button in entity table row actions

---

## Overview

Adding entity test releases one by one is slow. This feature allows users to
conduct an interactive search directly from Profilarr to connected Arrs, find
releases for an entity, and automatically add them to the PCD.

## User Flow

1. User clicks "add test release" on an entity
2. User chooses between manual entry OR import from an Arr
3. For import: select an Arr instance (Radarr for movies, Sonarr for TV)
4. Search/select the matching title in that Arr's library
5. Profilarr triggers an interactive search and fetches results
6. Results are deduplicated/grouped by similar releases
7. User reviews and can edit before confirming
8. Releases are bulk-written to the PCD

## UI Considerations

- Entry point options: tabs in modal, separate buttons, or action button in
  entity table?
- Need a review stage where users can see raw vs. transformed results

---

## API Research

### Radarr - Release Endpoint

**Endpoint:** `GET /api/v3/release?movieId={id}`

This triggers an interactive search across all configured indexers and returns
the results. It does NOT download anything - it just returns available releases.

Note: This is different from `POST /api/v3/command` with `MoviesSearch`, which
triggers a background search that may auto-grab releases.

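As a sketch, a client call for this endpoint might look like the following. The real `RadarrClient.getReleases()` lives elsewhere in the codebase and will differ; the `X-Api-Key` header is the standard Arr auth convention, and the helper names here are illustrative:

```typescript
// Sketch only - the actual RadarrClient.getReleases() implementation may differ.
function buildReleaseSearchUrl(baseUrl: string, movieId: number): string {
  // Normalize trailing slashes so "http://host:7878/" and "http://host:7878" both work
  return `${baseUrl.replace(/\/+$/, '')}/api/v3/release?movieId=${movieId}`;
}

async function searchReleases(
  baseUrl: string,
  apiKey: string,
  movieId: number
): Promise<unknown[]> {
  const res = await fetch(buildReleaseSearchUrl(baseUrl, movieId), {
    headers: { 'X-Api-Key': apiKey } // standard Arr auth header
  });
  if (!res.ok) throw new Error(`Radarr release search failed: ${res.status}`);
  return res.json() as Promise<unknown[]>;
}
```

Note that this call can take a while - it fans out to every configured indexer before responding.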
**Example Response:**

```json
{
  "guid": "PassThePopcorn-1345649",
  "title": "Beetlejuice.Beetlejuice.2024.Hybrid.1080p.BluRay.DDP7.1.x264-ZoroSenpai",
  "size": 13880140407,
  "indexer": "PassThePopcorn (Prowlarr)",
  "indexerId": 18,
  "languages": [{ "id": 1, "name": "English" }],
  "indexerFlags": ["G_Halfleech", "G_Internal"],
  "quality": {
    "quality": {
      "id": 7,
      "name": "Bluray-1080p",
      "source": "bluray",
      "resolution": 1080,
      "modifier": "none"
    }
  },
  "customFormats": [
    { "id": 1474, "name": "1080p" },
    { "id": 1424, "name": "1080p Bluray" }
  ],
  "customFormatScore": 225600,
  "releaseGroup": "ZoroSenpai",
  "seeders": 204,
  "leechers": 0,
  "protocol": "torrent",
  "age": 426,
  "approved": false,
  "rejected": true,
  "rejections": ["Existing file on disk has equal or higher Custom Format score"]
}
```

**Fields we need for test releases:**

| API Field      | Maps To       | Notes                              |
| -------------- | ------------- | ---------------------------------- |
| `title`        | `title`       | Full release name                  |
| `size`         | `size_bytes`  | Already in bytes                   |
| `indexer`      | `indexers[]`  | Needs sanitization (remove suffix) |
| `languages`    | `languages[]` | Extract `.name` from each object   |
| `indexerFlags` | `flags[]`     | Already an array of strings        |

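As a sketch, the table's mapping could be a single transform. This is illustrative only - the actual logic lives in `src/lib/server/utils/arr/releaseImport.ts`, and the sanitization/flag-normalization rules it inlines are described later in this document:

```typescript
// Illustrative transform from raw Radarr release fields to the test-release
// fields in the table above (not the actual releaseImport.ts implementation).
interface RawRadarrFields {
  title: string;
  size: number;
  indexer: string;
  languages: Array<{ id: number; name: string }>;
  indexerFlags: string[];
}

function mapRadarrFields(r: RawRadarrFields) {
  return {
    title: r.title,                            // full release name
    size_bytes: r.size,                        // already in bytes
    indexers: [r.indexer.replace(/\s*\(Prowlarr\)$/i, '').trim()], // strip suffix
    languages: r.languages.map((l) => l.name), // extract .name
    flags: r.indexerFlags.map((f) => f.replace(/^G_/i, '').toLowerCase())
  };
}
```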
### Sonarr - Release Endpoint

**Endpoint:** `GET /api/v3/release?seriesId={id}&seasonNumber={season}`

Both `seriesId` AND `seasonNumber` are required to get filtered results. Without
`seasonNumber`, the endpoint returns RSS feed releases from all series.

For season packs, filter results by `fullSeason: true`.

**Getting seasons from series:**

Use `GET /api/v3/series/{id}` to get season info:

```json
{
  "seasons": [
    {
      "seasonNumber": 1,
      "monitored": true,
      "statistics": {
        "episodeFileCount": 22,
        "episodeCount": 22,
        "totalEpisodeCount": 22,
        "percentOfEpisodes": 100
      }
    },
    {
      "seasonNumber": 2,
      "monitored": true,
      "statistics": {
        "nextAiring": "2026-02-27T01:00:00Z",
        "episodeCount": 10,
        "totalEpisodeCount": 14,
        "percentOfEpisodes": 100
      }
    }
  ]
}
```

**Filter for finished seasons only** (don't search ongoing seasons):

```typescript
const finishedSeasons = series.seasons.filter(
  (s) => s.statistics.episodeCount === s.statistics.totalEpisodeCount
);
```

**Example Response:**

```json
{
  "guid": "BTN-2167645",
  "title": "Georgie.and.Mandys.First.Marriage.S01.1080p.AMZN.WEB-DL.DDP5.1.H.264-NTb",
  "size": 28025938746,
  "indexer": "BroadcasTheNet (Prowlarr)",
  "indexerId": 5,
  "languages": [{ "id": 1, "name": "English" }],
  "indexerFlags": 9,
  "fullSeason": true,
  "seasonNumber": 1,
  "seriesTitle": "Georgie & Mandy's First Marriage",
  "episodeNumbers": [],
  "quality": {
    "quality": {
      "id": 3,
      "name": "WEBDL-1080p",
      "source": "web",
      "resolution": 1080
    }
  },
  "customFormats": [{ "id": 808, "name": "2160p WEB-DL" }],
  "customFormatScore": 468100,
  "seeders": 258,
  "leechers": 4,
  "protocol": "torrent"
}
```

**Key differences from Radarr:**

| Field          | Radarr                        | Sonarr                       |
| -------------- | ----------------------------- | ---------------------------- |
| `indexerFlags` | String array `["G_Internal"]` | Integer bitmask `9`          |
| Query params   | `movieId` only                | `seriesId` + `seasonNumber`  |
| Extra fields   | -                             | `fullSeason`, `seasonNumber` |

### Indexer Flags

Radarr returns flags as a string array, Sonarr as an integer bitmask. We need
to normalize both to a common format.

**Radarr flags (string array):**

```
["G_Freeleech", "G_Internal"] -> ["freeleech", "internal"]
```

**Sonarr bitmask (from `src/lib/server/sync/mappings.ts`):**

```
freeleech:     1   (0b00000001)
halfleech:     2   (0b00000010)
double_upload: 4   (0b00000100)
internal:      8   (0b00001000)
scene:         16  (0b00010000)
freeleech_75:  32  (0b00100000)
freeleech_25:  64  (0b01000000)
nuked:         128 (0b10000000)
```

**Examples:**

- `9` = freeleech (1) + internal (8)
- `17` = freeleech (1) + scene (16)
- `2` = halfleech

**Decoding function:**

```typescript
import { INDEXER_FLAGS } from '$lib/server/sync/mappings';

function decodeSonarrFlags(bitmask: number): string[] {
  const flags: string[] = [];
  const sonarrFlags = INDEXER_FLAGS.sonarr;

  for (const [name, value] of Object.entries(sonarrFlags)) {
    if (bitmask & value) {
      flags.push(name);
    }
  }

  return flags;
}

function normalizeRadarrFlags(flags: string[]): string[] {
  // Remove the "G_" prefix and lowercase
  return flags.map((f) => f.replace(/^G_/i, '').toLowerCase());
}
```

---

## TypeScript Types

```typescript
// Radarr release response
interface RadarrRelease {
  guid: string;
  title: string;
  size: number;
  indexer: string;
  indexerId: number;
  languages: Array<{ id: number; name: string }>;
  indexerFlags: string[]; // String array like ["G_Internal", "G_Freeleech"]
  quality: {
    quality: {
      id: number;
      name: string;
      source: string;
      resolution: number;
      modifier: string;
    };
  };
  customFormats: Array<{ id: number; name: string }>;
  customFormatScore: number;
  releaseGroup: string | null;
  seeders: number | null;
  leechers: number | null;
  protocol: 'torrent' | 'usenet';
  age: number;
  approved: boolean;
  rejected: boolean;
  rejections: string[];
}

// Sonarr release response
interface SonarrRelease {
  guid: string;
  title: string;
  size: number;
  indexer: string;
  indexerId: number;
  languages: Array<{ id: number; name: string }>;
  indexerFlags: number; // Integer bitmask
  fullSeason: boolean;
  seasonNumber: number;
  seriesTitle: string;
  episodeNumbers: number[];
  quality: {
    quality: {
      id: number;
      name: string;
      source: string;
      resolution: number;
    };
  };
  customFormats: Array<{ id: number; name: string }>;
  customFormatScore: number;
  releaseGroup: string | null;
  seeders: number | null;
  leechers: number | null;
  protocol: 'torrent' | 'usenet';
  age: number;
  approved: boolean;
  rejected: boolean;
  rejections: string[];
}

// Transformed for grouping/deduplication
interface GroupedRelease {
  title: string;       // Canonical title (from first occurrence)
  size: number;        // Size of the first occurrence (grouped sizes are within ±5%)
  indexers: string[];  // All indexers that have this release
  languages: string[]; // Union of all languages
  flags: string[];     // Union of all flags
  occurrences: number; // How many raw releases were grouped
}

// Final shape for the PCD test_releases table
interface TestReleaseInput {
  entityId: number;
  title: string;
  size_bytes: number | null;
  languages: string[];
  indexers: string[];
  flags: string[];
}
```

---

## Data Transformation

### Indexer Name Sanitization

Indexer names from Prowlarr include a suffix like `(Prowlarr)` that we should
strip:

```
"PassThePopcorn (Prowlarr)" -> "PassThePopcorn"
"HDBits (Prowlarr)"         -> "HDBits"
"BeyondHD (Prowlarr)"       -> "BeyondHD"
```

**Implementation:**

```typescript
function sanitizeIndexerName(name: string): string {
  return name.replace(/\s*\(Prowlarr\)$/i, '').trim();
}
```

### Release Deduplication Strategy

The same release often appears from multiple indexers with slight variations:

**Example from real data:**

```
Title: Beetlejuice.Beetlejuice.2024.Hybrid.1080p.BluRay.DDP7.1.x264-ZoroSenpai
- PassThePopcorn: 13,880,140,407 bytes, flags: []
- BeyondHD:       13,880,140,407 bytes, flags: []
- HDBits:         13,880,140,407 bytes, flags: [G_Halfleech, G_Internal]
- Blutopia:       13,880,140,800 bytes, flags: [] (393 byte difference!)
```

**Grouping criteria:**

1. **Title similarity > 90%** - Handles minor formatting differences (dots vs.
   spaces)
2. **Size within ±5%** - The same release should have a nearly identical size

**Grouping algorithm:**

```typescript
// `releases` are assumed to be normalized first (flags decoded to strings).
// titleSimilarity() and sanitizeIndexerName() are defined elsewhere in this
// document.
function union<T>(a: T[], b: T[]): T[] {
  return [...new Set([...a, ...b])];
}

function groupReleases(releases: ArrRelease[]): GroupedRelease[] {
  const groups: GroupedRelease[] = [];

  for (const release of releases) {
    const match = groups.find(
      (g) =>
        titleSimilarity(g.title, release.title) > 0.9 &&
        Math.abs(g.size - release.size) / g.size < 0.05
    );

    if (match) {
      // Add to the existing group
      match.indexers = union(match.indexers, [sanitizeIndexerName(release.indexer)]);
      match.languages = union(match.languages, release.languages.map((l) => l.name));
      match.flags = union(match.flags, release.indexerFlags);
      match.occurrences++;
    } else {
      // Create a new group
      groups.push({
        title: release.title,
        size: release.size,
        indexers: [sanitizeIndexerName(release.indexer)],
        languages: release.languages.map((l) => l.name),
        flags: [...release.indexerFlags],
        occurrences: 1,
      });
    }
  }

  return groups;
}
```

**Title similarity options:**

- Normalized Levenshtein distance
- Dice coefficient
- Or, simpler: normalize both titles (lowercase, replace separators) and compare

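Since the summary notes the shipped implementation uses the Dice coefficient, a bigram-based sketch on normalized titles might look like this (the exact normalization rules in `releaseImport.ts` are an assumption here):

```typescript
// Normalize separators so "Movie.2024" and "Movie 2024" compare equal.
function normalizeTitle(t: string): string {
  return t.toLowerCase().replace(/[.\-_]+/g, ' ').replace(/\s+/g, ' ').trim();
}

// Multiset of character bigrams, e.g. "abc" -> { ab: 1, bc: 1 }.
function bigrams(s: string): Map<string, number> {
  const m = new Map<string, number>();
  for (let i = 0; i < s.length - 1; i++) {
    const g = s.slice(i, i + 2);
    m.set(g, (m.get(g) ?? 0) + 1);
  }
  return m;
}

// Dice coefficient: 2 * |overlap| / (|A| + |B|), in [0, 1].
function titleSimilarity(a: string, b: string): number {
  const ga = bigrams(normalizeTitle(a));
  const gb = bigrams(normalizeTitle(b));
  let overlap = 0;
  let total = 0;
  for (const [g, n] of ga) {
    overlap += Math.min(n, gb.get(g) ?? 0);
    total += n;
  }
  for (const n of gb.values()) total += n;
  return total === 0 ? 1 : (2 * overlap) / total;
}
```

With this normalization, dot- and space-separated variants of the same title score 1.0, so the 0.9 grouping threshold only has to absorb genuine spelling differences.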
---

## Implementation Checklist

- [x] Add Radarr client method: `getReleases(movieId: number)`
- [x] Add Sonarr client method: `getSeries(seriesId: number)` (for season info)
- [x] Add Sonarr client method: `getReleases(seriesId: number, seasonNumber: number)`
- [x] Add helper to get finished seasons from series data
- [x] Add flag normalization utilities (Radarr string array, Sonarr bitmask)
- [x] Create release transformation utilities (sanitize indexer names, group, dedupe)
- [x] Add API endpoint for fetching/transforming releases
- [x] Update ReleaseModal with import option
- [x] Create import UI flow (Arr selection, title search, results review)
- [x] Add bulk release creation to PCD writer
- [x] Test with real data

---

## Open Questions

1. ~~Should we show both raw and grouped results in the review UI?~~ **Resolved:
   Show only grouped results. The `occurrences` count indicates how many were
   merged.**
2. ~~How to handle releases that don't group well (unique to a single indexer)?~~
   **Resolved: They appear as-is with `occurrences: 1`.**
3. ~~Should we store which indexers a release came from for reference?~~
   **Resolved: Yes, stored in the `indexers[]` array.**
4. ~~Sonarr: do we search by series or by specific episode/season?~~ **Resolved:
   Search by series + season number. Filter by `fullSeason: true` for season
   packs.**

---

## Future Considerations

- **Refactor modal**: `ImportReleasesModal.svelte` is large (~640 lines). Could
  be split into separate components (e.g., `LibraryStep.svelte`,
  `ReleasesStep.svelte`, `SeasonSelector.svelte`).

# Manual Pull Handling for Databases

**Status: Complete**

## Summary

When `auto_pull = 0`, users receive notifications that updates are available but
have no way to review or pull them. This document plans extending the existing
`/databases/[id]/changes` page to support incoming changes (pull) alongside
outgoing changes (push).

---

## Current State

### The `/databases/[id]/changes` Page

Currently this page:

- Is only accessible to developers (requires `personal_access_token`)
- Shows **outgoing changes** (uncommitted local ops)
- Allows: select files, write commit message, push to remote
- Allows: discard local changes
- Allows: switch branches

### The Gap

When `auto_pull = 0` and updates are found:

1. User receives a notification "Updates available for X"
2. User has no way to see what those updates contain
3. User has no way to pull them from the UI

---

## Proposed Solution

Extend the changes page to show both directions:

| Section          | Who Sees It | Description                   |
| ---------------- | ----------- | ----------------------------- |
| Incoming Changes | Everyone    | Commits available to pull     |
| Outgoing Changes | Developers  | Uncommitted local ops to push |

### User Flow

**For regular users (no PAT):**

1. Navigate to `/databases/[id]/changes`
2. See the "Incoming Changes" section with commits behind
3. Review the changes (files modified in each commit)
4. Click "Pull Updates" to sync

**For developers (with PAT):**

1. Same as above, plus...
2. See the "Outgoing Changes" section with uncommitted ops
3. Full commit/push/discard functionality

---

## Implementation Plan

### 1. Remove PAT Requirement for Page Access

**File:** `src/routes/databases/[id]/changes/+page.server.ts`

```typescript
export const load: PageServerLoad = async ({ parent }) => {
  const { database } = await parent();
  // Remove the PAT check - the page is now accessible to everyone;
  // a PAT is only needed for push actions
  return {
    isDeveloper: !!database.personal_access_token
  };
};
```

### 2. Update API to Return Incoming Changes

**File:** `src/routes/api/databases/[id]/changes/+server.ts`

Remove the PAT requirement for GET. Add incoming changes data:

```typescript
export const GET: RequestHandler = async ({ params }) => {
  const id = parseInt(params.id ?? '', 10);
  const database = databaseInstancesQueries.getById(id);
  const git = new Git(database.local_path);

  // Fetch for everyone
  const [status, incomingChanges, branches, repoInfo] = await Promise.all([
    git.status(),
    git.getIncomingChanges(),
    git.getBranches(),
    getRepoInfo(database.repository_url, database.personal_access_token)
  ]);

  // Only fetch outgoing changes for developers
  let uncommittedOps = null;
  if (database.personal_access_token) {
    uncommittedOps = await git.getUncommittedOps();
  }

  return json({
    status,
    incomingChanges,
    branches,
    repoInfo,
    uncommittedOps
  });
};
```

### 3. Add Git Function for Incoming Changes

**File:** `src/lib/server/utils/git/status.ts`

```typescript
export interface IncomingCommit {
  hash: string;
  shortHash: string;
  message: string;
  author: string;
  date: string;
  files: string[];
}

export interface IncomingChanges {
  hasUpdates: boolean;
  commitsBehind: number;
  commits: IncomingCommit[];
}

export async function getIncomingChanges(repoPath: string): Promise<IncomingChanges> {
  // Fetch the latest refs from the remote
  await execGitSafe(['fetch'], repoPath);

  const branch = await getBranch(repoPath);
  const remoteBranch = `origin/${branch}`;

  // Count commits behind
  const countOutput = await execGitSafe(
    ['rev-list', '--count', `HEAD..${remoteBranch}`],
    repoPath
  );
  const commitsBehind = parseInt(countOutput || '0', 10) || 0;

  if (commitsBehind === 0) {
    return { hasUpdates: false, commitsBehind: 0, commits: [] };
  }

  // Get commit details for incoming commits
  // (note: splitting on '|' assumes commit subjects don't contain '|')
  const logOutput = await execGitSafe(
    ['log', '--format=%H|%h|%s|%an|%aI', `HEAD..${remoteBranch}`],
    repoPath
  );

  const commits: IncomingCommit[] = [];
  for (const line of logOutput.split('\n').filter(Boolean)) {
    const [hash, shortHash, message, author, date] = line.split('|');

    // Get the files changed in this commit
    const filesOutput = await execGitSafe(
      ['diff-tree', '--no-commit-id', '--name-only', '-r', hash],
      repoPath
    );
    const files = filesOutput.split('\n').filter(Boolean);

    commits.push({ hash, shortHash, message, author, date, files });
  }

  return { hasUpdates: true, commitsBehind, commits };
}
```

### 4. Add Pull Action

**File:** `src/routes/databases/[id]/changes/+page.server.ts`

```typescript
export const actions: Actions = {
  // ... existing actions ...

  pull: async ({ params }) => {
    const id = parseInt(params.id || '', 10);
    const database = databaseInstancesQueries.getById(id);

    if (!database) {
      return { success: false, error: 'Database not found' };
    }

    try {
      const result = await pcdManager.sync(id);
      return result;
    } catch (err) {
      return {
        success: false,
        error: err instanceof Error ? err.message : 'Failed to pull'
      };
    }
  }
};
```

### 5. Update Page UI

**File:** `src/routes/databases/[id]/changes/+page.svelte`

```svelte
<script lang="ts">
  export let data: PageData;

  let incomingChanges: IncomingChanges | null = null;
  // ... existing state ...

  $: isDeveloper = data.isDeveloper;
</script>

<!-- Incoming Changes Section (visible to everyone) -->
<section>
  <h2>Incoming Changes</h2>

  {#if incomingChanges?.hasUpdates}
    <p>{incomingChanges.commitsBehind} commits available</p>

    <!-- Expandable table showing commits -->
    <ExpandableTable data={incomingChanges.commits} ...>
      <!-- Show commit message, author, date -->
      <!-- Expanded: show files changed -->
    </ExpandableTable>

    <Button on:click={handlePull}>Pull Updates</Button>
  {:else}
    <p>Up to date</p>
  {/if}
</section>

<!-- Outgoing Changes Section (developers only) -->
{#if isDeveloper}
  <section>
    <h2>Outgoing Changes</h2>
    <!-- Existing uncommitted ops table -->
    <!-- Existing commit message + push UI -->
  </section>
{/if}
```

---

## Sync Job Integration

When the sync job runs with `auto_pull = 0`:

1. `checkForUpdates()` already fetches and counts commits behind
2. This data is already available via `git.status()` (the `behind` field)
3. The `/api/databases/[id]/changes` endpoint will show this when the user visits

No additional "store" is needed - the git state IS the store. Each time the user
visits the changes page, we fetch fresh data from git.

---

## Files to Create/Modify

### Modified Files

- `src/routes/databases/[id]/changes/+page.server.ts` - Remove PAT requirement
  for load, add pull action
- `src/routes/databases/[id]/changes/+page.svelte` - Add incoming changes UI
- `src/routes/api/databases/[id]/changes/+server.ts` - Remove PAT requirement
  for GET, add incoming changes data
- `src/lib/server/utils/git/status.ts` - Add `getIncomingChanges()` function
- `src/lib/server/utils/git/types.ts` - Add `IncomingChanges` types
- `src/lib/server/utils/git/Git.ts` - Expose `getIncomingChanges()`

### New Components (optional)

- `src/routes/databases/[id]/changes/components/IncomingChangesTable.svelte`
- `src/routes/databases/[id]/changes/components/OutgoingChangesTable.svelte`

Could extract the existing table into `OutgoingChangesTable` and create a
matching `IncomingChangesTable` for consistency.

---

## Edge Cases

1. **No incoming changes**: Show an "Up to date" message
2. **Pull fails**: Show the error, allow retry
3. **Conflicts**: Shouldn't happen since user_ops are gitignored, but handle it
   gracefully if it does
4. **Large number of commits**: Paginate or limit to the most recent N commits

---

## UI Considerations

- StatusCard (repo info, branch switcher) visible to everyone
- Use consistent table styling between incoming/outgoing
- Incoming table is read-only (no checkboxes)
- Clear visual separation between sections
- Consider showing the incoming changes count in the tab/nav when updates are available

# Renaminatorr

**Status: Implemented**

## Summary

Bulk rename module that triggers Radarr/Sonarr's built-in rename functionality
across your entire library. The Arr apps already know how to rename files based
on your naming format settings — Renaminatorr just triggers that command in bulk.

## Overview

Unlike the upgrade system, which needs filters and selectors (searches are
expensive, hit indexers, and take time), renames are fast local file operations.
There's no need for complex filtering — if your naming format is correct, you
want everything named correctly.

**Key insight:** The Arr's rename command is idempotent. If a file already
matches the naming format, nothing happens. So we can just trigger a rename on
everything.

## Settings

| Setting | Type | Default | Description |
|---------|------|---------|-------------|
| **Dry Run** | boolean | true | Preview what would change without making changes |
| **Rename Folders** | boolean | false | Also rename containing folders, not just files |
| **Ignore Tag** | string | null | Tag name to skip (items with this tag won't be renamed) |
| **Enabled** | boolean | false | Enable scheduled rename job |
| **Schedule** | integer | 1440 | Run interval in minutes (default 24 hours) |

**Ignore Tag usage:**

- User sets the ignore tag to e.g. `profilarr-no-rename` in Profilarr
- User tags specific movies/series in the Arr UI with that tag
- Rename runs and skips anything with that tag
- Useful for items with custom folder structures you want to preserve

**Job scheduling:**

- When enabled, the rename job runs on the configured schedule
- Similar to upgrades: a manager job runs every 30 minutes and checks which configs are due
- A manual rename can still be triggered anytime from the UI

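The settings above could be modeled roughly as follows (the field names are assumptions for illustration, not the actual Profilarr schema):

```typescript
// Rough model of the settings table (names are assumptions).
interface RenameConfig {
  dryRun: boolean;          // preview only, no changes
  renameFolders: boolean;   // also rename containing folders
  ignoreTag: string | null; // tag name to skip
  enabled: boolean;         // scheduled job on/off
  scheduleMinutes: number;  // run interval in minutes
}

const DEFAULT_RENAME_CONFIG: RenameConfig = {
  dryRun: true,
  renameFolders: false,
  ignoreTag: null,
  enabled: false,
  scheduleMinutes: 1440 // 24 hours
};
```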
## Process Flow

### 1. Fetch Media

Get all movies/series from the Arr instance.

```
GET /api/v3/movie  (Radarr)
GET /api/v3/series (Sonarr)
```

### 2. Filter by Ignore Tag

If an ignore tag is configured:

- Look up the tag ID from the tag name
- Filter out any media items that have this tag in their `tags[]` array

```typescript
const tagId = await client.getOrCreateTag(ignoreTag);
const filteredMedia = allMedia.filter(item => !item.tags.includes(tagId));
```

### 3. Check Dry Run Mode

**If dry run is ON:**

- For each filtered media item, call the rename preview API
- Aggregate all results showing `existingPath → newPath`
- Return the preview to the UI — no changes made

```
GET /api/v3/rename?movieId={id}  (Radarr)
GET /api/v3/rename?seriesId={id} (Sonarr)
```

**If dry run is OFF:**

- Proceed to the actual rename

### 4. Execute Rename

Trigger the rename command with all filtered media IDs.

```
POST /api/v3/command
{ "name": "RenameMovie", "movieIds": [...] }    // Radarr
{ "name": "RenameSeries", "seriesIds": [...] }  // Sonarr
```

### 5. Wait for Completion

Poll the command status until complete.

```
GET /api/v3/command/{commandId}
// Poll every 5s until status === "completed" or "failed"
```

### 6. Rename Folders (Optional)

If rename folders is enabled:

- Group media by root folder path
- For each root folder, trigger a folder rename

```
PUT /api/v3/movie/editor
{ "movieIds": [...], "moveFiles": true, "rootFolderPath": "/movies" }
```

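The "group media by root folder path" step could be sketched as below. `rootFolderPath` is the field exposed on Arr movie/series resources; the helper name is illustrative:

```typescript
// Group media IDs by their root folder so each folder rename call
// receives only the IDs that live under it.
function groupByRootFolder<T extends { id: number; rootFolderPath: string }>(
  items: T[]
): Map<string, number[]> {
  const groups = new Map<string, number[]>();
  for (const item of items) {
    const ids = groups.get(item.rootFolderPath) ?? [];
    ids.push(item.id);
    groups.set(item.rootFolderPath, ids);
  }
  return groups;
}
```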
### 7. Refresh Metadata

Trigger a refresh so the Arr picks up the new paths.

```
POST /api/v3/command
{ "name": "RefreshMovie", "movieIds": [...] }
```

### 8. Return Results

Return a summary to the UI:

- Total items processed
- Files renamed (with before/after paths)
- Folders renamed (if enabled)
- Any errors

---

## API Research

### Radarr

| Operation | Method | Endpoint | Payload |
|-----------|--------|----------|---------|
| Get all movies | `GET` | `/api/v3/movie` | — |
| Rename preview | `GET` | `/api/v3/rename?movieId={id}` | — |
| Rename files | `POST` | `/api/v3/command` | `{"name": "RenameMovie", "movieIds": [...]}` |
| Rename folders | `PUT` | `/api/v3/movie/editor` | `{"movieIds": [...], "moveFiles": true, "rootFolderPath": "..."}` |
| Refresh | `POST` | `/api/v3/command` | `{"name": "RefreshMovie", "movieIds": [...]}` |
| Poll command | `GET` | `/api/v3/command/{id}` | — |

**Rename preview response:**

```json
[
  {
    "movieId": 123,
    "movieFileId": 456,
    "existingPath": "Movie (2024)/Movie.2024.1080p.BluRay.x264-GROUP.mkv",
    "newPath": "Movie (2024)/Movie (2024) [Bluray-1080p].mkv"
  }
]
```

### Sonarr

| Operation | Method | Endpoint | Payload |
|-----------|--------|----------|---------|
| Get all series | `GET` | `/api/v3/series` | — |
| Rename preview | `GET` | `/api/v3/rename?seriesId={id}` | — |
| Rename files | `POST` | `/api/v3/command` | `{"name": "RenameSeries", "seriesIds": [...]}` |
| Rename folders | `PUT` | `/api/v3/series/editor` | `{"seriesIds": [...], "moveFiles": true, "rootFolderPath": "..."}` |
| Refresh | `POST` | `/api/v3/command` | `{"name": "RefreshSeries", "seriesIds": [...]}` |
| Poll command | `GET` | `/api/v3/command/{id}` | — |

### Command Polling

Commands are async. Poll `GET /api/v3/command/{id}` until:

- `status: "completed"` → success
- `status: "failed"` → failure

Poll interval: ~5 seconds. Timeout after ~10 minutes.

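The polling loop above is what the `waitForCommand()` helper listed under "Need to add" would implement. A sketch follows - the status fetcher and sleep are injected here so the loop can be exercised without a live Arr instance; the real `BaseArrClient` method will differ:

```typescript
// Minimal command-status shape needed for polling.
interface CommandStatus {
  status: 'queued' | 'started' | 'completed' | 'failed';
}

// Poll until the command reaches a terminal state or the timeout elapses.
async function waitForCommand(
  getCommand: () => Promise<CommandStatus>,
  intervalMs = 5000,          // "Poll interval: ~5 seconds"
  timeoutMs = 10 * 60 * 1000, // "Timeout after ~10 minutes"
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms))
): Promise<CommandStatus> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const cmd = await getCommand();
    if (cmd.status === 'completed' || cmd.status === 'failed') return cmd;
    await sleep(intervalMs);
  }
  throw new Error('Timed out waiting for Arr command to finish');
}
```

Returning the terminal status (rather than throwing on `"failed"`) lets the caller decide how to report a failed rename.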
---
|
||||
|
||||
## Existing Client Methods
|
||||
|
||||
**Already have in `src/lib/server/utils/arr/`:**
|
||||
|
||||
| Method | Location | Notes |
|
||||
|--------|----------|-------|
|
||||
| `getMovies()` | `RadarrClient` | Get all movies |
|
||||
| `getAllSeries()` | `SonarrClient` | Get all series |
|
||||
| `getTags()` | `BaseArrClient` | Get tags |
|
||||
| `createTag()` | `BaseArrClient` | Create tag |
|
||||
| `getOrCreateTag()` | `RadarrClient` | Get or create tag |

**Need to add:**

| Method | Location | Endpoint |
|--------|----------|----------|
| `getRenamePreview(id)` | `RadarrClient` | `GET /rename?movieId={id}` |
| `getRenamePreview(id)` | `SonarrClient` | `GET /rename?seriesId={id}` |
| `renameMovies(ids)` | `RadarrClient` | `POST /command` → `RenameMovie` |
| `renameSeries(ids)` | `SonarrClient` | `POST /command` → `RenameSeries` |
| `refreshMovies(ids)` | `RadarrClient` | `POST /command` → `RefreshMovie` |
| `refreshSeries(ids)` | `SonarrClient` | `POST /command` → `RefreshSeries` |
| `renameMovieFolders(ids, rootPath)` | `RadarrClient` | `PUT /movie/editor` |
| `renameSeriesFolders(ids, rootPath)` | `SonarrClient` | `PUT /series/editor` |
| `getCommand(id)` | `BaseArrClient` | `GET /command/{id}` |
| `waitForCommand(id)` | `BaseArrClient` | Poll until complete |

---

## TypeScript Types

```typescript
// Rename preview response item
interface RenamePreviewItem {
  movieId?: number; // Radarr
  seriesId?: number; // Sonarr
  seasonNumber?: number; // Sonarr
  episodeNumbers?: number[]; // Sonarr
  movieFileId?: number; // Radarr
  episodeFileId?: number; // Sonarr
  existingPath: string;
  newPath: string;
}

// Command response
interface ArrCommand {
  id: number;
  name: string;
  status: 'queued' | 'started' | 'completed' | 'failed';
  queued: string;
  started?: string;
  ended?: string;
  message?: string;
}

// Rename result for UI
interface RenameResult {
  mediaId: number;
  title: string;
  year: number;
  filesRenamed: Array<{
    existingPath: string;
    newPath: string;
  }>;
  folderRenamed?: {
    existingPath: string;
    newPath: string;
  };
}

// Structured log for each rename run
interface RenameJobLog {
  instanceId: number;
  instanceName: string;
  startedAt: string;
  completedAt: string;
  status: 'success' | 'failed';

  config: {
    dryRun: boolean;
    renameFolders: boolean;
    ignoreTag: string | null;
  };

  library: {
    totalItems: number;
    skippedByTag: number;
    itemsToProcess: number;
  };

  results: {
    filesRenamed: number;
    foldersRenamed: number;
    errors: string[];
  };
}
```

---

## Notifications & Logging

### Notification Types

Add to `src/lib/server/notifications/types.ts`:

```typescript
// Rename
RENAME_SUCCESS: 'rename.success',
RENAME_FAILED: 'rename.failed'
```

**Notification content:**

- Instance name
- Dry run indicator
- Files renamed count
- Folders renamed count (if enabled)
- Items skipped (due to ignore tag)
- Errors if any
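
These fields map directly onto `RenameJobLog`. A sketch of a formatter that assembles the notification body from a run's log; the function name, wording, and trimmed log type are illustrative:

```typescript
// Subset of RenameJobLog needed by the formatter.
interface RenameRunSummary {
  instanceName: string;
  config: { dryRun: boolean; renameFolders: boolean };
  library: { skippedByTag: number };
  results: { filesRenamed: number; foldersRenamed: number; errors: string[] };
}

// Build the notification body, omitting lines that do not apply.
function formatRenameNotification(log: RenameRunSummary): string {
  const lines = [
    `Instance: ${log.instanceName}${log.config.dryRun ? ' (dry run)' : ''}`,
    `Files renamed: ${log.results.filesRenamed}`
  ];
  if (log.config.renameFolders) {
    lines.push(`Folders renamed: ${log.results.foldersRenamed}`);
  }
  if (log.library.skippedByTag > 0) {
    lines.push(`Skipped by tag: ${log.library.skippedByTag}`);
  }
  if (log.results.errors.length > 0) {
    lines.push(`Errors: ${log.results.errors.join('; ')}`);
  }
  return lines.join('\n');
}
```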

### Structured Logger

Create `src/lib/server/rename/logger.ts` with helpers:

- `logRenameRun(log: RenameJobLog)` — log a completed run with metrics
- `logRenameSkipped(instanceId, instanceName, reason)` — log when a run is skipped
- `logRenameStart(instanceId, instanceName)` — log when a run starts
- `logRenameError(instanceId, instanceName, error)` — log errors

---

## Implementation Checklist

### Database

- [x] Create migration for `arr_rename_settings` table
- [x] Add queries file (`src/lib/server/db/queries/arrRenameSettings.ts`)

**Table schema:**

```sql
CREATE TABLE arr_rename_settings (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    arr_instance_id INTEGER NOT NULL UNIQUE,

    -- Settings
    dry_run INTEGER NOT NULL DEFAULT 1,
    rename_folders INTEGER NOT NULL DEFAULT 0,
    ignore_tag TEXT,

    -- Job scheduling
    enabled INTEGER NOT NULL DEFAULT 0,
    schedule INTEGER NOT NULL DEFAULT 1440, -- Run interval in minutes (default 24 hours)
    last_run_at DATETIME,

    -- Metadata
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,

    FOREIGN KEY (arr_instance_id) REFERENCES arr_instances(id) ON DELETE CASCADE
);
```

**Queries needed:**

- `getByInstanceId(instanceId)` — get settings for an instance
- `upsert(instanceId, settings)` — create or update settings
- `getDueConfigs()` — get configs where `enabled = 1` and due to run
- `updateLastRunAt(instanceId)` — update `last_run_at` after a job runs
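
The "due to run" condition behind `getDueConfigs()` reduces to a small predicate over a settings row: never run, or `schedule` minutes elapsed since `last_run_at`. A sketch, assuming ISO timestamps; the helper name and row shape are hypothetical:

```typescript
// Mirrors the relevant columns of arr_rename_settings.
interface RenameConfig {
  enabled: boolean;
  schedule: number; // run interval in minutes
  lastRunAt: string | null; // ISO timestamp, or null if never run
}

// A config is due when enabled and either never run or past its interval.
function isDue(config: RenameConfig, now: Date = new Date()): boolean {
  if (!config.enabled) return false;
  if (config.lastRunAt === null) return true;
  const elapsedMs = now.getTime() - new Date(config.lastRunAt).getTime();
  return elapsedMs >= config.schedule * 60_000;
}
```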

### Arr Client Methods

- [x] Add `getCommand(id)` to `BaseArrClient`
- [x] Add `waitForCommand(id)` to `BaseArrClient` (poll helper)
- [x] Add `getRenamePreview(movieId)` to `RadarrClient`
- [x] Add `renameMovies(movieIds)` to `RadarrClient`
- [x] Add `refreshMovies(movieIds)` to `RadarrClient`
- [x] Add `renameMovieFolders(movieIds, rootPath)` to `RadarrClient`
- [x] Add `getRenamePreview(seriesId)` to `SonarrClient`
- [x] Add `renameSeries(seriesIds)` to `SonarrClient`
- [x] Add `refreshSeries(seriesIds)` to `SonarrClient`
- [x] Add `renameSeriesFolders(seriesIds, rootPath)` to `SonarrClient`

### Backend Logic

- [x] Create rename processor (`src/lib/server/rename/processor.ts`)
- [x] Create `+page.server.ts` with form actions:
  - `save` — create/update settings
  - `run` — trigger manual rename

### Job & Notifications

- [x] Create job definition (`src/lib/server/jobs/definitions/renameManager.ts`)
- [x] Create job logic (`src/lib/server/jobs/logic/renameManager.ts`)
- [x] Register job in `src/lib/server/jobs/init.ts`
- [x] Add notification types to `src/lib/server/notifications/types.ts`
- [x] Create structured logger (`src/lib/server/rename/logger.ts`)

### Frontend

- [x] Create rename page under `/arr/[id]/rename/`
- [x] Settings form:
  - Dry run toggle
  - Rename folders toggle
  - Ignore tag input
  - Enable scheduled job toggle
  - Schedule interval input
- [x] "Test Run" button (triggers `run` action, only in dry run mode)
- [x] Progress/loading state during rename
- [x] Results display (alert with summary)

### Optional Enhancements

- [ ] Count limit (process N items per run) for very large libraries

---

## Resolved Questions

1. **Where to put rename?** → Page under each Arr instance (`/arr/[id]/rename/`)
2. **Per-instance or all instances?** → Per-instance only (consistent with upgrades)
3. **Job scheduler?** → Yes, added to the job scheduler for automated runs

---

## Reference

- Original implementation from daps: `dist/daps/modules/renameinatorr.py`
- API client: `dist/daps/util/arrpy.py`

# Link Sync Settings

**Status: Planning**

## Summary

When a user enables quality profile sync, require media management sync to also be
enabled. This ensures users get the complete configuration: profiles alone, without
proper naming formats, lead to inconsistent results.

## Problem

Users can currently enable quality profile sync without enabling media management
sync. This creates issues:

- Quality profiles define *what* quality to grab
- Media management defines *how* files are named and organized
- Without both, the naming format may not match what the profile expects
- This leads to confusion and support requests

## Solution

Client-side validation on the sync settings page. Don't let users save if:

1. Any quality profiles are selected for sync, AND
2. Media management settings (naming, quality definitions) are not configured

## Implementation

**Location:** `src/routes/arr/[id]/sync/+page.svelte`

**Logic:**

```typescript
// Check if any quality profiles are selected
const hasQualityProfilesSelected = Object.values(qualityProfileState)
  .some(db => Object.values(db).some(selected => selected));

// Check if media management is configured
const hasMediaManagement =
  mediaManagementState.namingDatabaseId !== null ||
  mediaManagementState.qualityDefinitionsDatabaseId !== null;

// Validation
const canSave = !hasQualityProfilesSelected || hasMediaManagement;
```
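
The same check can be extracted as a pure helper so it is unit-testable outside the Svelte page. A sketch; the function name and state shapes are illustrative, not the actual page code:

```typescript
// Instance name -> profile name -> selected, mirroring qualityProfileState.
type SelectionMap = Record<string, Record<string, boolean>>;

interface MediaManagementState {
  namingDatabaseId: number | null;
  qualityDefinitionsDatabaseId: number | null;
}

// Saving is allowed unless profiles are selected without any media management.
function canSaveSync(
  qualityProfileState: SelectionMap,
  mediaManagementState: MediaManagementState
): boolean {
  const hasQualityProfilesSelected = Object.values(qualityProfileState).some(
    (db) => Object.values(db).some((selected) => selected)
  );
  const hasMediaManagement =
    mediaManagementState.namingDatabaseId !== null ||
    mediaManagementState.qualityDefinitionsDatabaseId !== null;
  return !hasQualityProfilesSelected || hasMediaManagement;
}
```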

**UX:**

- Show a warning message when quality profiles are selected but no media management
  is configured
- Disable save buttons on the QualityProfiles component until valid
- Message: "Quality profiles require media management settings. Please configure
  naming or quality definitions to ensure consistent file naming."

## Checklist

- [ ] Add validation logic to `+page.svelte`
- [ ] Pass `canSave` state to `QualityProfiles.svelte`
- [ ] Show warning alert when invalid
- [ ] Disable save button when invalid