diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..9b481b7 --- /dev/null +++ b/LICENSE @@ -0,0 +1,211 @@ +GNU AFFERO GENERAL PUBLIC LICENSE +Version 3, 19 November 2007 + +Copyright (C) 2007 Free Software Foundation, Inc. + +Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. + + Preamble + +The GNU Affero General Public License is a free, copyleft license for software and other kinds of works, specifically designed to ensure cooperation with the community in the case of network server software. + +The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, our General Public Licenses are intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. + +When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things. + +Developers that use our General Public Licenses protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License which gives you legal permission to copy, distribute and/or modify the software. + +A secondary benefit of defending all users' freedom is that improvements made in alternate versions of the program, if they receive widespread use, become available for other developers to incorporate. Many developers of free software are heartened and encouraged by the resulting cooperation. However, in the case of software used on network servers, this result may fail to come about. 
The GNU General Public License permits making a modified version and letting the public access it on a server without ever releasing its source code to the public. + +The GNU Affero General Public License is designed specifically to ensure that, in such cases, the modified source code becomes available to the community. It requires the operator of a network server to provide the source code of the modified version running there to the users of that server. Therefore, public use of a modified version, on a publicly accessible server, gives the public access to the source code of the modified version. + +An older license, called the Affero General Public License and published by Affero, was designed to accomplish similar goals. This is a different license, not a version of the Affero GPL, but Affero has released a new version of the Affero GPL which permits relicensing under this license. + +The precise terms and conditions for copying, distribution and modification follow. + + TERMS AND CONDITIONS + +0. Definitions. + +"This License" refers to version 3 of the GNU Affero General Public License. + +"Copyright" also means copyright-like laws that apply to other kinds of works, such as semiconductor masks. + +"The Program" refers to any copyrightable work licensed under this License. Each licensee is addressed as "you". "Licensees" and "recipients" may be individuals or organizations. + +To "modify" a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a "modified version" of the earlier work or a work "based on" the earlier work. + +A "covered work" means either the unmodified Program or a work based on the Program. + +To "propagate" a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. 
Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well. + +To "convey" a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying. + +An interactive user interface displays "Appropriate Legal Notices" to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion. + +1. Source Code. +The "source code" for a work means the preferred form of the work for making modifications to it. "Object code" means any non-source form of a work. + +A "Standard Interface" means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language. + +The "System Libraries" of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. 
A "Major Component", in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it. + +The "Corresponding Source" for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work. + +The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source. + +The Corresponding Source for a work in source code form is that same work. + +2. Basic Permissions. +All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law. + +You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. 
You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you. + +Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary. + +3. Protecting Users' Legal Rights From Anti-Circumvention Law. +No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures. + +When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures. + +4. Conveying Verbatim Copies. +You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program. 
+ +You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee. + +5. Conveying Modified Source Versions. +You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions: + + a) The work must carry prominent notices stating that you modified it, and giving a relevant date. + + b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to "keep intact all notices". + + c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it. + + d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so. + +A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate. + +6. Conveying Non-Source Forms. 
+You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways: + + a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange. + + b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge. + + c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b. + + d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. 
If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements. + + e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d. + +A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work. + +A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, "normally used" refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product. + +"Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. 
The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made. + +If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM). + +The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network. + +Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying. + +7. Additional Terms. +"Additional permissions" are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. 
If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions. + +When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission. + +Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms: + + a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or + + b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or + + c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or + + d) Limiting the use for publicity purposes of names of licensors or authors of the material; or + + e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or + + f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors. + +All other non-permissive additional terms are considered "further restrictions" within the meaning of section 10. 
If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying. + +If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms. + +Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way. + +8. Termination. + +You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11). + +However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation. + +Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice. 
+ +Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10. + +9. Acceptance Not Required for Having Copies. + +You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so. + +10. Automatic Licensing of Downstream Recipients. + +Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License. + +An "entity transaction" is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts. + +You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. 
For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it. + +11. Patents. + +A "contributor" is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's "contributor version". + +A contributor's "essential patent claims" are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, "control" includes the right to grant patent sublicenses in a manner consistent with the requirements of this License. + +Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version. + +In the following three paragraphs, a "patent license" is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To "grant" such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party. 
+ +If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. "Knowingly relying" means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid. + +If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it. + +A patent license is "discriminatory" if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. 
You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007. + +Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law. + +12. No Surrender of Others' Freedom. + +If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program. + +13. Remote Network Interaction; Use with the GNU General Public License. 
+ +Notwithstanding any other provision of this License, if you modify the Program, your modified version must prominently offer all users interacting with it remotely through a computer network (if your version supports such interaction) an opportunity to receive the Corresponding Source of your version by providing access to the Corresponding Source from a network server at no charge, through some standard or customary means of facilitating copying of software. This Corresponding Source shall include the Corresponding Source for any work covered by version 3 of the GNU General Public License that is incorporated pursuant to the following paragraph. + +Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the work with which it is combined will remain governed by version 3 of the GNU General Public License. + +14. Revised Versions of this License. + +The Free Software Foundation may publish revised and/or new versions of the GNU Affero General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. + +Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU Affero General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU Affero General Public License, you may choose any version ever published by the Free Software Foundation. 
+ +If the Program specifies that a proxy can decide which future versions of the GNU Affero General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program. + +Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version. + +15. Disclaimer of Warranty. + +THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. + +16. Limitation of Liability. + +IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. + +17. Interpretation of Sections 15 and 16. 
+ +If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee. + +END OF TERMS AND CONDITIONS diff --git a/README.md b/README.md index b18148f..b0f3639 --- a/README.md +++ b/README.md @@ -1,19 +1,108 @@ -# Profilarr 2.0
Profilarr +

Profilarr

+

+ GitHub release + Docker Pulls + License + Website + Discord + Buy Me A Coffee + GitHub Sponsors +

+

Manage quality profiles, custom formats, and release profiles across your Radarr and Sonarr instances. Define your profiles once with a Git-backed configuration database, then sync them to any number of \*arr instances.

+
-> Complete rebuild of Profilarr - Configuration Management Platform for -> Radarr/Sonarr +## Features -⚠️ **WORK IN PROGRESS** - This is a ground-up rewrite and is NOT ready for -production use. For the stable version, see -[Profilarr 1.1](https://github.com/Dictionarry-Hub/profilarr). +**Core** -## Development Setup +- **Link** - Connect to configuration databases like the + [Dictionarry database](https://github.com/Dictionarry-Hub/db) or any Profilarr + Compliant Database (PCD) +- **Bridge** - Add your Radarr and Sonarr instances by URL and API key +- **Sync** - Push configurations to your instances. Profilarr compiles + everything to the right format automatically -### Prerequisites +**For Users** -- [Deno](https://deno.com/) 2.x or higher +- **Ready-to-Use Configurations** - Stop spending hours piecing together + settings from forum posts. Get complete, tested quality profiles, custom + formats, and media settings designed around specific goals +- **Stay Updated** - Make local tweaks that persist across upstream updates. + View changelogs, diffs, and revert changes when needed. Merge conflicts are + handled transparently +- **Automated Upgrades** - The arrs don't search for the best release; they grab + the first RSS item that qualifies. Profilarr triggers intelligent searches + based on filters and selectors -## Contributing +**For Developers** -Not accepting contributions yet - this is in active early development. Check -back later! +- **Unified Architecture** - One configuration language that compiles to + Radarr/Sonarr-specific formats on sync. No more maintaining separate configs + for each app +- **Reusable Components** - Regular expressions are separate entities shared + across custom formats. Change once, update everywhere +- **OSQL** - Configurations stored as append-only SQL operations. Readable, + auditable, diffable.
Git-native version control with complete history +- **Testing** - Validate regex patterns, custom format conditions, and quality + profile behavior before syncing + +## Documentation + +See **[dictionarry.dev](https://dictionarry.dev/)** for complete installation, +usage, and API documentation. + +## Getting Started + +### Production + +TODO: + +- Deno binaries (Linux, macOS, Windows) +- Docker image build process +- Publishing to Docker Hub and ghcr.io +- Example `compose.yml` + +> [!NOTE] +> The parser service is only required for custom format and quality profile +> testing. Linking, syncing, and all other features work without it. + +### Development + +**Prerequisites** + +- [Deno](https://deno.com/) 2.x +- [.NET SDK](https://dotnet.microsoft.com/) 8.0+ + +```bash +git clone https://github.com/Dictionarry-Hub/profilarr.git +cd profilarr +deno task dev +``` + +This runs the parser service and Vite dev server concurrently. See +[CONTRIBUTING.md](docs/CONTRIBUTING.md) for architecture documentation. + +### Environment Variables + +| Variable | Default | Description | +| --------------- | -------------------- | --------------------------------- | +| `PORT` | `6868` | Web UI port | +| `HOST` | `0.0.0.0` | Bind address | +| `APP_BASE_PATH` | Executable directory | Base path for data, logs, backups | +| `TZ` | System timezone | Timezone for scheduling | +| `PARSER_HOST` | `localhost` | Parser service host | +| `PARSER_PORT` | `5000` | Parser service port | + +## License + +[AGPL-3.0](LICENSE) + +Profilarr is free and open source. You do not need to pay anyone to use it. If +someone is charging you for access to Profilarr, they are violating the spirit +of this project.
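Until the Docker publishing work above lands, here is a sketch of what a `compose.yml` could look like, wiring up the environment variables from the table. The image name, tag, and volume layout are assumptions, not a published artifact:

```yaml
# Hypothetical compose.yml — the image name/tag are placeholders until
# official images are published (see the Production TODO list above).
services:
  profilarr:
    image: ghcr.io/dictionarry-hub/profilarr:latest # assumption, not yet published
    ports:
      - "6868:6868" # PORT
    environment:
      TZ: Etc/UTC
      APP_BASE_PATH: /config # data, logs, and backups live here
      PARSER_HOST: parser # assumption: parser runs as a sibling service
      PARSER_PORT: "5000"
    volumes:
      - ./config:/config
```

Remember the parser service is optional; drop `PARSER_HOST`/`PARSER_PORT` if you skip it.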
diff --git a/docs/0.schema.sql b/docs/0.schema.sql deleted file mode 100644 index 2e30bf7..0000000 --- a/docs/0.schema.sql +++ /dev/null @@ -1,389 +0,0 @@ --- ============================================================================ --- PCD SCHEMA v1 --- ============================================================================ - --- ============================================================================ --- CORE ENTITY TABLES (Independent - No Foreign Key Dependencies) --- ============================================================================ --- These tables form the foundation and can be populated in any order - --- Tags are reusable labels that can be applied to multiple entity types -CREATE TABLE tags ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(50) UNIQUE NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Languages used for profile configuration and custom format conditions -CREATE TABLE languages ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(30) UNIQUE NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Regular expressions used in custom format pattern conditions -CREATE TABLE regular_expressions ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(100) UNIQUE NOT NULL, - pattern TEXT NOT NULL, - regex101_id VARCHAR(50), -- Optional link to regex101.com for testing - description TEXT, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Individual quality definitions (e.g., "1080p Bluray", "2160p REMUX") -CREATE TABLE qualities ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(100) UNIQUE NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Maps Profilarr canonical qualities to arr-specific API names --- Absence of a row means the quality doesn't exist for that arr -CREATE TABLE 
quality_api_mappings ( - quality_id INTEGER NOT NULL, - arr_type VARCHAR(20) NOT NULL, -- 'radarr', 'sonarr' - api_name VARCHAR(100) NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - PRIMARY KEY (quality_id, arr_type), - FOREIGN KEY (quality_id) REFERENCES qualities(id) ON DELETE CASCADE -); - --- Custom formats define patterns and conditions for media matching -CREATE TABLE custom_formats ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(100) UNIQUE NOT NULL, - description TEXT, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- ============================================================================ --- DEPENDENT ENTITY TABLES (Depend on Core Entities) --- ============================================================================ - --- Quality profiles define complete media acquisition strategies -CREATE TABLE quality_profiles ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(100) UNIQUE NOT NULL, - description TEXT, - upgrades_allowed INTEGER NOT NULL DEFAULT 1, - minimum_custom_format_score INTEGER NOT NULL DEFAULT 0, - upgrade_until_score INTEGER NOT NULL DEFAULT 0, - upgrade_score_increment INTEGER NOT NULL DEFAULT 1 CHECK (upgrade_score_increment > 0), - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Quality groups combine multiple qualities treated as equivalent --- Each group is specific to a quality profile (profiles do not share groups) -CREATE TABLE quality_groups ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - quality_profile_id INTEGER NOT NULL, - name VARCHAR(100) NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - UNIQUE(quality_profile_id, name), - FOREIGN KEY (quality_profile_id) REFERENCES quality_profiles(id) ON DELETE CASCADE -); - --- Conditions define the matching logic for custom formats --- Each condition has a 
type and corresponding data in a type-specific table -CREATE TABLE custom_format_conditions ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - custom_format_id INTEGER NOT NULL, - name VARCHAR(100) NOT NULL, - type VARCHAR(50) NOT NULL, - arr_type VARCHAR(20) NOT NULL, -- 'radarr', 'sonarr', 'all' - negate INTEGER NOT NULL DEFAULT 0, - required INTEGER NOT NULL DEFAULT 0, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - FOREIGN KEY (custom_format_id) REFERENCES custom_formats(id) ON DELETE CASCADE -); - --- ============================================================================ --- JUNCTION TABLES (Many-to-Many Relationships) --- ============================================================================ - --- Link regular expressions to tags -CREATE TABLE regular_expression_tags ( - regular_expression_id INTEGER NOT NULL, - tag_id INTEGER NOT NULL, - PRIMARY KEY (regular_expression_id, tag_id), - FOREIGN KEY (regular_expression_id) REFERENCES regular_expressions(id) ON DELETE CASCADE, - FOREIGN KEY (tag_id) REFERENCES tags(id) ON DELETE CASCADE -); - --- Link custom formats to tags -CREATE TABLE custom_format_tags ( - custom_format_id INTEGER NOT NULL, - tag_id INTEGER NOT NULL, - PRIMARY KEY (custom_format_id, tag_id), - FOREIGN KEY (custom_format_id) REFERENCES custom_formats(id) ON DELETE CASCADE, - FOREIGN KEY (tag_id) REFERENCES tags(id) ON DELETE CASCADE -); - --- Link quality profiles to tags -CREATE TABLE quality_profile_tags ( - quality_profile_id INTEGER NOT NULL, - tag_id INTEGER NOT NULL, - PRIMARY KEY (quality_profile_id, tag_id), - FOREIGN KEY (quality_profile_id) REFERENCES quality_profiles(id) ON DELETE CASCADE, - FOREIGN KEY (tag_id) REFERENCES tags(id) ON DELETE CASCADE -); - --- Link quality profiles to languages with type modifiers --- Type can be: 'must', 'only', 'not', or 'simple' (default language preference) -CREATE TABLE quality_profile_languages ( - quality_profile_id INTEGER NOT 
NULL, - language_id INTEGER NOT NULL, - type VARCHAR(20) NOT NULL DEFAULT 'simple', -- 'must', 'only', 'not', 'simple' - PRIMARY KEY (quality_profile_id, language_id), - FOREIGN KEY (quality_profile_id) REFERENCES quality_profiles(id) ON DELETE CASCADE, - FOREIGN KEY (language_id) REFERENCES languages(id) ON DELETE CASCADE -); - --- Define which qualities belong to which quality groups --- All qualities in a group are treated as equivalent -CREATE TABLE quality_group_members ( - quality_group_id INTEGER NOT NULL, - quality_id INTEGER NOT NULL, - PRIMARY KEY (quality_group_id, quality_id), - FOREIGN KEY (quality_group_id) REFERENCES quality_groups(id) ON DELETE CASCADE, - FOREIGN KEY (quality_id) REFERENCES qualities(id) ON DELETE CASCADE -); - --- Define the quality list for a profile (ordered by position) --- Each item references either a single quality OR a quality group (never both) --- Every quality must be represented (either directly or in a group) --- The enabled flag controls whether the quality/group is active -CREATE TABLE quality_profile_qualities ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - quality_profile_id INTEGER NOT NULL, - quality_id INTEGER, -- References a single quality - quality_group_id INTEGER, -- OR references a quality group - position INTEGER NOT NULL, -- Display order in the profile - enabled INTEGER NOT NULL DEFAULT 1, -- Whether this quality/group is enabled - upgrade_until INTEGER NOT NULL DEFAULT 0, -- Stop upgrading at this quality - CHECK ((quality_id IS NOT NULL AND quality_group_id IS NULL) OR (quality_id IS NULL AND quality_group_id IS NOT NULL)), - FOREIGN KEY (quality_profile_id) REFERENCES quality_profiles(id) ON DELETE CASCADE, - FOREIGN KEY (quality_id) REFERENCES qualities(id) ON DELETE CASCADE, - FOREIGN KEY (quality_group_id) REFERENCES quality_groups(id) ON DELETE CASCADE -); - --- Assign custom formats to quality profiles with scoring --- Scores determine upgrade priority and filtering behavior -CREATE TABLE 
quality_profile_custom_formats ( - quality_profile_id INTEGER NOT NULL, - custom_format_id INTEGER NOT NULL, - arr_type VARCHAR(20) NOT NULL, -- 'radarr', 'sonarr', 'all', - score INTEGER NOT NULL, - PRIMARY KEY (quality_profile_id, custom_format_id, arr_type), - FOREIGN KEY (quality_profile_id) REFERENCES quality_profiles(id) ON DELETE CASCADE, - FOREIGN KEY (custom_format_id) REFERENCES custom_formats(id) ON DELETE CASCADE -); - --- ============================================================================ --- CUSTOM FORMAT CONDITION TYPE TABLES --- ============================================================================ --- Each condition type has a dedicated table storing type-specific data --- A condition_id should only appear in ONE of these tables, matching its type - --- Pattern-based conditions (release_title, release_group, edition) --- Each pattern condition references exactly one regular expression -CREATE TABLE condition_patterns ( - custom_format_condition_id INTEGER PRIMARY KEY, - regular_expression_id INTEGER NOT NULL, - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE, - FOREIGN KEY (regular_expression_id) REFERENCES regular_expressions(id) ON DELETE CASCADE -); - --- Language-based conditions -CREATE TABLE condition_languages ( - custom_format_condition_id INTEGER PRIMARY KEY, - language_id INTEGER NOT NULL, - except_language INTEGER NOT NULL DEFAULT 0, -- Match everything EXCEPT this language - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE, - FOREIGN KEY (language_id) REFERENCES languages(id) ON DELETE CASCADE -); - --- Indexer flag conditions (e.g., "Scene", "Freeleech") -CREATE TABLE condition_indexer_flags ( - custom_format_condition_id INTEGER PRIMARY KEY, - flag VARCHAR(100) NOT NULL, - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- Source conditions (e.g., "Bluray", "Web", 
"DVD") -CREATE TABLE condition_sources ( - custom_format_condition_id INTEGER PRIMARY KEY, - source VARCHAR(100) NOT NULL, - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- Resolution conditions (e.g., "1080p", "2160p") -CREATE TABLE condition_resolutions ( - custom_format_condition_id INTEGER PRIMARY KEY, - resolution VARCHAR(100) NOT NULL, - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- Quality modifier conditions (e.g., "REMUX", "WEBDL") -CREATE TABLE condition_quality_modifiers ( - custom_format_condition_id INTEGER PRIMARY KEY, - quality_modifier VARCHAR(100) NOT NULL, - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- Size-based conditions with min/max bounds in bytes -CREATE TABLE condition_sizes ( - custom_format_condition_id INTEGER PRIMARY KEY, - min_bytes INTEGER, -- Null means no minimum - max_bytes INTEGER, -- Null means no maximum - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- Release type conditions (e.g., "Movie", "Episode") -CREATE TABLE condition_release_types ( - custom_format_condition_id INTEGER PRIMARY KEY, - release_type VARCHAR(100) NOT NULL, - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- Year-based conditions with min/max bounds -CREATE TABLE condition_years ( - custom_format_condition_id INTEGER PRIMARY KEY, - min_year INTEGER, -- Null means no minimum - max_year INTEGER, -- Null means no maximum - FOREIGN KEY (custom_format_condition_id) REFERENCES custom_format_conditions(id) ON DELETE CASCADE -); - --- ============================================================================ --- MEDIA MANAGEMENT TABLES --- ============================================================================ - --- Radarr quality size definitions -CREATE 
TABLE radarr_quality_definitions ( - quality_id INTEGER PRIMARY KEY, - min_size INTEGER NOT NULL DEFAULT 0, - max_size INTEGER NOT NULL, - preferred_size INTEGER NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - FOREIGN KEY (quality_id) REFERENCES qualities(id) ON DELETE CASCADE -); - --- Sonarr quality size definitions -CREATE TABLE sonarr_quality_definitions ( - quality_id INTEGER PRIMARY KEY, - min_size INTEGER NOT NULL DEFAULT 0, - max_size INTEGER NOT NULL, - preferred_size INTEGER NOT NULL, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - FOREIGN KEY (quality_id) REFERENCES qualities(id) ON DELETE CASCADE -); - --- Radarr naming configuration -CREATE TABLE radarr_naming ( - id INTEGER PRIMARY KEY CHECK (id = 1), - rename INTEGER NOT NULL DEFAULT 1, - movie_format TEXT NOT NULL, - movie_folder_format TEXT NOT NULL, - replace_illegal_characters INTEGER NOT NULL DEFAULT 0, - colon_replacement_format VARCHAR(20) NOT NULL DEFAULT 'smart', - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Sonarr naming configuration -CREATE TABLE sonarr_naming ( - id INTEGER PRIMARY KEY CHECK (id = 1), - rename INTEGER NOT NULL DEFAULT 1, - standard_episode_format TEXT NOT NULL, - daily_episode_format TEXT NOT NULL, - anime_episode_format TEXT NOT NULL, - series_folder_format TEXT NOT NULL, - season_folder_format TEXT NOT NULL, - replace_illegal_characters INTEGER NOT NULL DEFAULT 0, - colon_replacement_format INTEGER NOT NULL DEFAULT 4, - custom_colon_replacement_format TEXT, - multi_episode_style INTEGER NOT NULL DEFAULT 5, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Radarr general media settings -CREATE TABLE radarr_media_settings ( - id INTEGER PRIMARY KEY CHECK (id = 1), - propers_repacks VARCHAR(50) NOT NULL 
DEFAULT 'doNotPrefer', - enable_media_info INTEGER NOT NULL DEFAULT 1, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- Sonarr general media settings -CREATE TABLE sonarr_media_settings ( - id INTEGER PRIMARY KEY CHECK (id = 1), - propers_repacks VARCHAR(50) NOT NULL DEFAULT 'doNotPrefer', - enable_media_info INTEGER NOT NULL DEFAULT 1, - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP -); - --- ============================================================================ --- DELAY PROFILES --- ============================================================================ - --- Delay profiles control download timing preferences -CREATE TABLE delay_profiles ( - id INTEGER PRIMARY KEY AUTOINCREMENT, - name VARCHAR(100) UNIQUE NOT NULL, - preferred_protocol VARCHAR(20) NOT NULL CHECK ( - preferred_protocol IN ('prefer_usenet', 'prefer_torrent', 'only_usenet', 'only_torrent') - ), - usenet_delay INTEGER, -- minutes, NULL if only_torrent - torrent_delay INTEGER, -- minutes, NULL if only_usenet - bypass_if_highest_quality INTEGER NOT NULL DEFAULT 0, - bypass_if_above_custom_format_score INTEGER NOT NULL DEFAULT 0, - minimum_custom_format_score INTEGER, -- Required when bypass_if_above_custom_format_score = 1 - created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP, - -- Enforce usenet_delay is NULL only when only_torrent - CHECK ( - (preferred_protocol = 'only_torrent' AND usenet_delay IS NULL) OR - (preferred_protocol != 'only_torrent' AND usenet_delay IS NOT NULL) - ), - -- Enforce torrent_delay is NULL only when only_usenet - CHECK ( - (preferred_protocol = 'only_usenet' AND torrent_delay IS NULL) OR - (preferred_protocol != 'only_usenet' AND torrent_delay IS NOT NULL) - ), - -- Enforce minimum_custom_format_score required when bypass enabled - CHECK ( - (bypass_if_above_custom_format_score = 0 AND 
minimum_custom_format_score IS NULL) OR - (bypass_if_above_custom_format_score = 1 AND minimum_custom_format_score IS NOT NULL) - ) -); - --- Link delay profiles to tags (at least 1 required - enforced at application level) -CREATE TABLE delay_profile_tags ( - delay_profile_id INTEGER NOT NULL, - tag_id INTEGER NOT NULL, - PRIMARY KEY (delay_profile_id, tag_id), - FOREIGN KEY (delay_profile_id) REFERENCES delay_profiles(id) ON DELETE CASCADE, - FOREIGN KEY (tag_id) REFERENCES tags(id) ON DELETE CASCADE -); - --- ============================================================================ --- INDEXES AND CONSTRAINTS --- ============================================================================ - --- Ensure only one quality item per profile can be marked as upgrade_until -CREATE UNIQUE INDEX idx_one_upgrade_until_per_profile -ON quality_profile_qualities(quality_profile_id) -WHERE upgrade_until = 1; diff --git a/docs/1.languages.sql b/docs/1.languages.sql deleted file mode 100644 index f262d82..0000000 --- a/docs/1.languages.sql +++ /dev/null @@ -1,65 +0,0 @@ --- Languages -INSERT INTO languages (name) VALUES -('Unknown'), -('English'), -('French'), -('Spanish'), -('German'), -('Italian'), -('Danish'), -('Dutch'), -('Japanese'), -('Icelandic'), -('Chinese'), -('Russian'), -('Polish'), -('Vietnamese'), -('Swedish'), -('Norwegian'), -('Finnish'), -('Turkish'), -('Portuguese'), -('Flemish'), -('Greek'), -('Korean'), -('Hungarian'), -('Hebrew'), -('Lithuanian'), -('Czech'), -('Hindi'), -('Romanian'), -('Thai'), -('Bulgarian'), -('Portuguese (Brazil)'), -('Arabic'), -('Ukrainian'), -('Persian'), -('Bengali'), -('Slovak'), -('Latvian'), -('Spanish (Latino)'), -('Catalan'), -('Croatian'), -('Serbian'), -('Bosnian'), -('Estonian'), -('Tamil'), -('Indonesian'), -('Telugu'), -('Macedonian'), -('Slovenian'), -('Malayalam'), -('Kannada'), -('Albanian'), -('Afrikaans'), -('Marathi'), -('Tagalog'), -('Urdu'), -('Romansh'), -('Mongolian'), -('Georgian'), -('Azerbaijani'), 
-('Uzbek'), -('Malay'), -('Any'), -('Original'); \ No newline at end of file diff --git a/docs/2.qualities.sql b/docs/2.qualities.sql deleted file mode 100644 index e96ac95..0000000 --- a/docs/2.qualities.sql +++ /dev/null @@ -1,61 +0,0 @@ --- Qualities -INSERT INTO qualities (name) VALUES -('Unknown'), -('WORKPRINT'), -('CAM'), -('TELESYNC'), -('TELECINE'), -('DVDSCR'), -('REGIONAL'), -('SDTV'), -('DVD'), -('DVD-R'), -('HDTV-480p'), -('HDTV-720p'), -('HDTV-1080p'), -('HDTV-2160p'), -('WEBDL-480p'), -('WEBDL-720p'), -('WEBDL-1080p'), -('WEBDL-2160p'), -('WEBRip-480p'), -('WEBRip-720p'), -('WEBRip-1080p'), -('WEBRip-2160p'), -('Bluray-480p'), -('Bluray-576p'), -('Bluray-720p'), -('Bluray-1080p'), -('Bluray-2160p'), -('Remux-1080p'), -('Remux-2160p'), -('BR-DISK'), -('Raw-HD'); - --- Radarr mappings (30 qualities) -INSERT INTO quality_api_mappings (quality_id, arr_type, api_name) -SELECT id, 'radarr', name FROM qualities WHERE name IN ( - 'Unknown', 'WORKPRINT', 'CAM', 'TELESYNC', 'TELECINE', 'DVDSCR', 'REGIONAL', - 'SDTV', 'DVD', 'DVD-R', 'HDTV-720p', 'HDTV-1080p', 'HDTV-2160p', - 'WEBDL-480p', 'WEBDL-720p', 'WEBDL-1080p', 'WEBDL-2160p', - 'WEBRip-480p', 'WEBRip-720p', 'WEBRip-1080p', 'WEBRip-2160p', - 'Bluray-480p', 'Bluray-576p', 'Bluray-720p', 'Bluray-1080p', 'Bluray-2160p', - 'Remux-1080p', 'Remux-2160p', 'BR-DISK', 'Raw-HD' -); - --- Sonarr mappings (20 exact matches + 2 remuxes with different names) -INSERT INTO quality_api_mappings (quality_id, arr_type, api_name) -SELECT id, 'sonarr', name FROM qualities WHERE name IN ( - 'Unknown', 'SDTV', 'DVD', 'HDTV-720p', 'HDTV-1080p', 'HDTV-2160p', - 'WEBDL-480p', 'WEBDL-720p', 'WEBDL-1080p', 'WEBDL-2160p', - 'WEBRip-480p', 'WEBRip-720p', 'WEBRip-1080p', 'WEBRip-2160p', - 'Bluray-480p', 'Bluray-576p', 'Bluray-720p', 'Bluray-1080p', 'Bluray-2160p', - 'Raw-HD' -); - --- Sonarr remux mappings (different names) -INSERT INTO quality_api_mappings (quality_id, arr_type, api_name) -SELECT id, 'sonarr', 'Bluray-1080p Remux' 
FROM qualities WHERE name = 'Remux-1080p'; - -INSERT INTO quality_api_mappings (quality_id, arr_type, api_name) -SELECT id, 'sonarr', 'Bluray-2160p Remux' FROM qualities WHERE name = 'Remux-2160p'; diff --git a/docs/ARCHITECTURE.md b/docs/ARCHITECTURE.md deleted file mode 100644 index 3074a56..0000000 --- a/docs/ARCHITECTURE.md +++ /dev/null @@ -1,386 +0,0 @@ -# Profilarr Architecture Guide - -Quick reference for AI assistants working with this codebase. - ---- - -## Overview - -**Profilarr** is a SvelteKit + Deno application for managing configurations across *arr applications (Radarr, Sonarr, etc.). It compiles to standalone binaries. - -**Stack:** SvelteKit 2, Svelte 5 (using Svelte 4 syntax), TypeScript, Tailwind CSS 4, Deno 2, SQLite, Kysely - -**Svelte Syntax:** This codebase uses **Svelte 4 syntax** (`export let` for props, `$:` for reactivity, `createEventDispatcher` for events). Do not use Svelte 5 runes (`$state`, `$props`, `$derived`) except where explicitly noted. - -**Key Paths:** -- `src/routes/` - SvelteKit pages and API routes -- `src/lib/server/` - Backend logic -- `src/lib/client/` - Frontend stores and components -- `deno.json` - Import aliases and tasks - ---- - -## Backend - -### 1. Database Workflow - -SQLite database with WAL mode. `DatabaseManager` is a singleton handling connections, transactions, and health checks. - -**Files:** -- `src/lib/server/db/db.ts` - DatabaseManager singleton, transaction support, parametrized queries -- `src/lib/server/db/queries/` - Query files per entity (jobs.ts, arrInstances.ts, notificationServices.ts, etc.) - -**Pattern:** Each entity has a queries file exporting `{entity}Queries` with `create()`, `getById()`, `getAll()`, `update()`, `delete()` methods. - ---- - -### 2. Migration System - -`MigrationRunner` manages schema changes. Migrations are tracked in a `migrations` table with version, name, and applied_at. 
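A hedged sketch of the tracking table implied by that description — the real DDL lives in the migration runner and may differ:

```sql
-- Hypothetical shape of the migrations tracking table; column names follow
-- the description above (version, name, applied_at).
CREATE TABLE migrations (
    version    INTEGER PRIMARY KEY,                   -- e.g. 1..16
    name       VARCHAR(100) NOT NULL,                 -- e.g. 'create_jobs'
    applied_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
);
```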
- -**Files:** -- `src/lib/server/db/migrations.ts` - MigrationRunner class, up/down support -- `src/lib/server/db/migrations/` - 16 migration files (001-016) - -**Key migrations:** 004 (jobs), 007 (notifications), 008 (database instances), 011-013 (upgrade configs), 014 (AI settings), 015-016 (sync configs) - ---- - -### 3. Jobs System - -Background job scheduler checking for due jobs every 60 seconds. Jobs are registered at startup and stored in the database. - -**Files:** -- `src/lib/server/jobs/scheduler.ts` - Scheduler singleton, 60s interval, prevents concurrent execution -- `src/lib/server/jobs/runner.ts` - Executes jobs, calculates next run time, records history -- `src/lib/server/jobs/registry.ts` - In-memory registry mapping job names to definitions -- `src/lib/server/jobs/init.ts` - Registers all jobs at startup -- `src/lib/server/jobs/definitions/` - Job definitions (syncDatabases.ts, syncArr.ts, upgradeManager.ts, etc.) - -**Schedule formats:** "daily", "hourly", "*/N minutes" - ---- - -### 4. Notifications System - -Pluggable notification system with builder pattern. Sends to enabled services in parallel, records all attempts in history. - -**Files:** -- `src/lib/server/notifications/NotificationManager.ts` - Central orchestrator, fire-and-forget pattern -- `src/lib/server/notifications/builder.ts` - Fluent API: `notify(type).title().message().meta().send()` -- `src/lib/server/notifications/types.ts` - Notification type constants (job.success, pcd.linked, upgrade.failed, etc.) -- `src/lib/server/notifications/notifiers/DiscordNotifier.ts` - Discord webhook implementation with embeds - ---- - -### 5. PCDs (Profilarr Compliant Databases) - -External git repositories containing configuration profiles. Uses layered SQL operations compiled into an in-memory SQLite cache. 
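In concrete terms, a user-layer op is just an appended SQL statement compiled into the cache after the base and tweak layers. A hedged sketch — the profile name and score are invented, and `qp()` is the name-lookup helper the PCD layer provides:

```sql
-- Hypothetical user_ops/ tweak: raise one profile's minimum custom format
-- score. The profile name is illustrative; qp() resolves a quality
-- profile's id by name.
UPDATE quality_profiles
SET minimum_custom_format_score = 10
WHERE id = qp('1080p Transparent');
```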
- -**Files:** -- `src/lib/server/pcd/pcd.ts` - PCDManager singleton (link, unlink, sync, checkForUpdates) -- `src/lib/server/pcd/manifest.ts` - Validates pcd.json manifests -- `src/lib/server/pcd/cache.ts` - PCDCache class, compiles SQL from layers, file watching -- `src/lib/server/pcd/schema.ts` - Kysely schema for PCD database tables -- `src/lib/server/pcd/ops.ts` - Loads .sql files from layer directories -- `src/lib/server/pcd/deps.ts` - Dependency resolution for PCD repos -- `src/lib/server/pcd/types.ts` - TypeScript types for profiles, custom formats, etc. - -**Layers (in order):** schema (from deps) → base (ops/) → tweaks (tweaks/) → user (user_ops/) - -**SQL helpers:** `qp(name)` (quality profile), `cf(name)` (custom format), `dp(name)` (delay profile), `tag(name)` - ---- - -### 6. Sync Engine - -Pushes profiles from PCDs to ARR instances. Handles quality profiles, delay profiles, and media management settings. - -**Files:** -- `src/lib/server/sync/` - Sync engine directory -- `src/lib/server/jobs/definitions/syncArr.ts` - Scheduled sync job -- `src/lib/server/db/queries/syncConfigs.ts` - Sync configuration queries -- `src/lib/server/db/migrations/015_create_arr_sync.ts` - Sync config schema -- `src/lib/server/db/migrations/016_add_arr_sync_columns.ts` - Additional sync columns - ---- - -### 7. ARR API Client - -3-tier class hierarchy: BaseHttpClient → BaseArrClient → App-specific clients (RadarrClient, SonarrClient, etc.) 
- -**Files:** -- `src/lib/server/utils/http/client.ts` - Base HTTP client with retry logic, connection pooling, exponential backoff -- `src/lib/server/utils/arr/base.ts` - Base ARR client, X-Api-Key auth, connection testing, delay profiles, tags -- `src/lib/server/utils/arr/factory.ts` - Factory function to create client by type -- `src/lib/server/utils/arr/types.ts` - TypeScript types for all ARR APIs -- `src/lib/server/utils/arr/clients/radarr.ts` - Full Radarr implementation (movies, quality profiles, files, search, library) -- `src/lib/server/utils/arr/clients/sonarr.ts` - Sonarr client stub -- `src/lib/server/utils/arr/clients/lidarr.ts` - Lidarr client (API v1) - -**Features:** 30s timeout, 3 retries on 5xx, batch operations, custom format scoring - ---- - -### 8. Upgrade Manager - -Scheduled job that searches for movies based on filters and selection strategies. Uses tag-based cooldowns. - -**Files:** -- `src/lib/server/upgrades/processor.ts` - Main orchestrator for upgrade workflow -- `src/lib/server/upgrades/normalize.ts` - Converts Radarr responses to normalized format -- `src/lib/server/upgrades/cooldown.ts` - Tag-based cooldown (profilarr-searched-YYYY-MM-DD) -- `src/lib/server/upgrades/logger.ts` - Structured logging for upgrade runs -- `src/lib/server/upgrades/types.ts` - Type definitions -- `src/lib/server/jobs/definitions/upgradeManager.ts` - Job definition (every 30 minutes) -- `src/lib/server/jobs/logic/upgradeManager.ts` - Job logic, fetches due configs -- `src/lib/server/db/queries/upgradeConfigs.ts` - Upgrade config CRUD - -**Filter modes:** round_robin (cycles through filters), random (picks random filter) - -**Dry run:** Tests workflow without triggering actual searches - ---- - -### 9. AI Integration - -AI-powered commit message generation using OpenAI-compatible APIs. Generates semantic messages from file diffs. 
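The request shape for an OpenAI-compatible chat-completions endpoint can be sketched as a pure payload builder. The function name and prompt wording are illustrative, not the actual client code in `client.ts`:

```typescript
// Sketch of the request body an OpenAI-compatible endpoint expects for
// commit message generation. Prompt wording and the builder's name are
// assumptions; the real implementation lives in src/lib/server/utils/ai/client.ts.
interface ChatMessage {
  role: "system" | "user";
  content: string;
}

function buildCommitMessagePayload(
  diff: string,
  model: string,
): { model: string; messages: ChatMessage[] } {
  return {
    model, // e.g. the configured api model from ai_settings
    messages: [
      {
        role: "system",
        content:
          "Write a concise, semantic commit message for the following diff.",
      },
      { role: "user", content: diff },
    ],
  };
}
```

Because the endpoint contract is the plain chat-completions shape, the same payload works against OpenAI, Ollama, or LM Studio.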
- -**Files:** -- `src/lib/server/utils/ai/client.ts` - Core client: `isAIEnabled()`, `generateCommitMessage(diff)` -- `src/lib/server/db/queries/aiSettings.ts` - Singleton settings (get, update, reset) -- `src/lib/server/db/migrations/014_create_ai_settings.ts` - Schema (enabled, api_url, api_key, model) -- `src/routes/api/ai/status/+server.ts` - Check if AI is enabled -- `src/routes/api/databases/[id]/generate-commit-message/+server.ts` - Generate commit message endpoint -- `src/routes/settings/general/components/AISettings.svelte` - Settings UI - -**Supports:** OpenAI, Ollama, LM Studio, Claude API (any OpenAI-compatible endpoint) - ---- - -### 10. Git Integration - -Git operations for PCD repositories. - -**Files:** -- `src/lib/server/utils/git/` - Git utilities -- Key operations: clone, pull, checkout, status, update detection, private repo support (PAT) - ---- - -### 11. Configuration System - -Centralized configuration with paths and environment variables. - -**Files:** -- `src/lib/server/utils/config/config.ts` - Paths (database, data, backups, logs), timezone support - ---- - -### 12. Logging - -Structured logging with source tracking and log levels. - -**Files:** -- `src/lib/server/utils/logger/` - Logger utilities with levels (info, warn, error, debug) - ---- - -## Frontend - -### 13. Accent Variable System - -7 color palettes with 11 shades each (50-950). Uses CSS custom properties dynamically set via JavaScript. - -**Files:** -- `src/app.css` - CSS variables for accent colors (lines 7-43), @theme block for Tailwind -- `src/lib/client/stores/accent.ts` - AccentColor type, color palettes, localStorage persistence, `applyAccentColors()` -- `src/lib/client/ui/navigation/navbar/accentPicker.svelte` - Color picker UI - -**Colors:** blue, yellow, green, orange, teal, purple, rose - -**Usage:** `bg-accent-600`, `text-accent-500`, `border-accent-300` in Tailwind classes - ---- - -### 14. Theme System - -Light/dark mode via class-based Tailwind. 
Uses View Transitions API for smooth switching. - -**Files:** -- `src/lib/client/stores/theme.ts` - Theme store, localStorage persistence, applies class to document.documentElement -- `src/lib/client/ui/navigation/navbar/themeToggle.svelte` - Toggle button with Sun/Moon icons - -**Usage:** `dark:bg-neutral-900`, `dark:text-white` prefixes in Tailwind classes - ---- - -### 15. Reusable UI Components - -30+ components organized by functionality in `src/lib/client/ui/`. - -**Buttons:** -- `src/lib/client/ui/button/Button.svelte` - Variants (primary, secondary, danger), sizes (sm, md), icon support - -**Forms:** -- `src/lib/client/ui/form/FormInput.svelte` - Text input + textarea with label, description -- `src/lib/client/ui/form/NumberInput.svelte` - Numeric with increment/decrement buttons -- `src/lib/client/ui/form/TagInput.svelte` - Tag entry with Enter key, backspace deletion -- `src/lib/client/ui/form/IconCheckbox.svelte` - Checkbox with icon, 3 shapes, custom colors - -**Tables:** -- `src/lib/client/ui/table/Table.svelte` - Generic table with TypeScript generics, sorting, custom cell renderers -- `src/lib/client/ui/table/ExpandableTable.svelte` - Table with row expansion -- `src/lib/client/ui/table/ReorderableList.svelte` - Drag-and-drop list -- `src/lib/client/ui/table/types.ts` - Column definitions - -**Modals:** -- `src/lib/client/ui/modal/Modal.svelte` - Base modal with backdrop, Escape key, confirm/cancel -- `src/lib/client/ui/modal/SaveTargetModal.svelte` - Two-option modal (User/Base layer) -- `src/lib/client/ui/modal/UnsavedChangesModal.svelte` - Unsaved changes warning -- `src/lib/client/ui/modal/InfoModal.svelte` - Information-only modal - -**Navigation:** -- `src/lib/client/ui/navigation/navbar/navbar.svelte` - Top navbar with logo, accent picker, theme toggle -- `src/lib/client/ui/navigation/pageNav/pageNav.svelte` - Left sidebar with hierarchical groups -- `src/lib/client/ui/navigation/tabs/Tabs.svelte` - Tab navigation - -**Actions:** -- 
`src/lib/client/ui/actions/ActionsBar.svelte` - Container for inline action buttons -- `src/lib/client/ui/actions/ActionButton.svelte` - Icon button for toolbars -- `src/lib/client/ui/actions/SearchAction.svelte` - Search input with debouncing -- `src/lib/client/ui/actions/ViewToggle.svelte` - Table/card view toggle - -**Dropdowns:** -- `src/lib/client/ui/dropdown/Dropdown.svelte` - Positioned dropdown menu -- `src/lib/client/ui/dropdown/DropdownItem.svelte` - Menu item - -**State:** -- `src/lib/client/ui/state/EmptyState.svelte` - Empty data placeholder - ---- - -### 16. Colocation Strategy - -Route-based colocation: page-specific components live in route folders, shared UI in `src/lib/client/ui/`. - -**Pattern:** -``` -src/routes/delay-profiles/[databaseId]/ -├── +page.svelte # Page component -├── +page.server.ts # Server-side logic -├── components/ # Page-specific components -│ └── DelayProfileForm.svelte -└── views/ # View variations - ├── CardView.svelte - └── TableView.svelte -``` - -**Examples:** -- `src/routes/delay-profiles/[databaseId]/components/DelayProfileForm.svelte` -- `src/routes/databases/components/InstanceForm.svelte` -- `src/routes/quality-profiles/[databaseId]/views/CardView.svelte` - ---- - -### 17. State Management - -Svelte stores for global state with localStorage persistence. 
- -**Files:** -- `src/lib/client/stores/theme.ts` - Theme (light/dark) -- `src/lib/client/stores/accent.ts` - Accent color -- `src/lib/client/stores/search.ts` - Search state with debouncing, filters, derived stores -- `src/lib/client/stores/dataPage.ts` - `createDataPageStore()` combining search + view toggle + filtering -- `src/lib/client/stores/libraryCache.ts` - Client-side Radarr library cache -- `src/lib/client/alerts/store.ts` - Toast notifications (success, error, warning, info) - -**Data Page Pattern:** -```typescript -const { search, view, filtered, setItems } = createDataPageStore(data, { - storageKey: 'delayProfilesView', - searchKeys: ['name'] -}); -``` - ---- - -### 18. Form Enhancement - -SvelteKit `enhance` directive for progressive enhancement with loading states and alerts. - -**Pattern:** -```svelte -
<form method="POST" use:enhance={() => { - isLoading = true; - return async ({ result, update }) => { - if (result.type === 'failure') alertStore.add('error', msg); - else if (result.type === 'redirect') alertStore.add('success', msg); - await update(); - isLoading = false; - }; -}}> -``` - ---- - -### 19. Unsaved Changes Detection - -Tracks dirty state and prompts user before navigation. - -**Files:** -- `src/lib/client/utils/unsavedChanges.svelte.ts` - Exception: uses the Svelte 5 `$state` rune; provides `markDirty()`, `confirmNavigation()`, `confirmDiscard()`, `cancelDiscard()` - ---- - -## Architecture - -### 20. Import Aliases - -Defined in `deno.json` for clean imports: - -| Alias | Path | -|-------|------| -| `$lib/` | `src/lib/` | -| `$config` | `src/lib/server/utils/config/config.ts` | -| `$logger/` | `src/lib/server/utils/logger/` | -| `$db/` | `src/lib/server/db/` | -| `$jobs/` | `src/lib/server/jobs/` | -| `$arr/` | `src/lib/server/utils/arr/` | -| `$notifications/` | `src/lib/server/notifications/` | -| `$pcd/` | `src/lib/server/pcd/` | -| `$stores/` | `src/lib/client/stores/` | -| `$ui/` | `src/lib/client/ui/` | - ---- - -### 21. Server Initialization - -Boot sequence in `src/hooks.server.ts`: - -1. Initialize configuration (env vars, paths) -2. Log startup banner -3. Initialize database connection -4. Run database migrations -5. Load log settings from database -6. Initialize PCD caches -7. Initialize job system -8. Start job scheduler - ---- - -### 22.
Build & Deployment - -**Development:** -```bash -deno run -A npm:vite dev # Port 6969 -``` - -**Production:** -```bash -deno run -A npm:vite build -deno compile --target x86_64-unknown-linux-gnu --output dist/linux/profilarr dist/build/mod.ts -deno compile --target x86_64-pc-windows-msvc --output dist/windows/profilarr.exe dist/build/mod.ts -``` - -**Output:** Standalone binaries in `dist/linux/` and `dist/windows/` diff --git a/docs/CONTRIBUTING.md b/docs/CONTRIBUTING.md index 0ec5db7..3f35ae2 100644 --- a/docs/CONTRIBUTING.md +++ b/docs/CONTRIBUTING.md @@ -1,151 +1,1059 @@ # Contributing to Profilarr -Profilarr is a work-in-progress rewrite, so please coordinate larger changes first. This guide explains how the repo is organized and the expected contribution workflows. +## About -## Project Overview +Profilarr is a configuration management tool for Radarr and Sonarr. -Profilarr is a SvelteKit + Deno app that manages and syncs configurations across \*arr apps using Profilarr Compliant Databases (PCDs). It compiles to standalone binaries. +Setting up media automation properly means creating dozens of custom formats to +identify things like 4K releases, preferred encoders, and quality thresholds. +Then you need to orchestrate all those formats into quality profiles that +actually work together. Most people spend hours piecing this together from forum +posts and guides. -- **Frontend:** `src/routes/`, `src/lib/client/` -- **Backend:** `src/lib/server/` -- **PCDs:** git repositories cloned under `data/databases/` and compiled into an in-memory SQLite cache +Profilarr lets users pull from shared configuration databases instead of +building everything from scratch. You link a database, connect your arr +instances, and sync. It compiles the configurations, pushes them to your apps, +preserves any local modifications you've made, and tracks everything with git so +you can see what changed. 
-## Prerequisites +### Users and Developers -- **Deno 2.x** -- **Node + npm** only if you want to run ESLint/Prettier (`deno task lint` or `deno task format`). -- **.NET 8** only if you work on the parser microservice in `services/parser/`. +Profilarr serves two audiences. End users link external databases and sync +configurations to their Arr instances. Developers create and maintain those +databases. -## Development Commands +The editing interface serves both. End users can make custom tweaks to profiles +or formats after syncing—these local modifications persist across future syncs. +Developers use the same editors to build databases from scratch, test their +configurations, and iterate on profiles before publishing. -- `deno task dev` (default port 6969) -- `deno task test` -- `deno task lint` -- `deno task format` +## Stack -Useful environment variables: +SvelteKit with Svelte 5 running on Deno. We don't use runes—event handlers use +`onclick` syntax, but no `$state`, `$derived`, or other rune primitives. +Tailwind CSS 4 for styling. Lucide and Simple Icons for iconography. -- `APP_BASE_PATH` (defaults to the compiled binary location) -- `PARSER_HOST`, `PARSER_PORT` (C# parser microservice) -- `PORT`, `HOST` +Both `deno.json` and `package.json` exist in the project: -## Repo Tour +- **deno.json** defines import maps (path aliases like `$lib/`, `$stores/`), + Deno-specific imports from JSR (`jsr:@std/yaml`), and all runnable tasks + (`deno task dev`, `deno task build`, etc.) -- `docs/ARCHITECTURE.md` — system overview -- `docs/PCD SPEC.md` — operational SQL & layering model -- `docs/manifest.md` — `pcd.json` schema -- `docs/PARSER_PORT_DESIGN.md` — parser microservice -- `services/parser/` — C# parser microservice +- **package.json** provides npm dependencies that Vite and SvelteKit need during + the build process. The `@deno/vite-plugin` and `sveltekit-adapter-deno` + packages bridge these two ecosystems. 
-## App Database vs PCD Databases +When you run `deno task dev`, Deno resolves imports through `deno.json` while +Vite pulls dependencies from `node_modules`. Both files are required—`deno.json` +for runtime resolution and tasks, `package.json` for the Vite build toolchain. -**Profilarr app database** +### Frontend -- SQLite file: `data/profilarr.db` -- Boot sequence initializes config, opens DB, runs migrations, starts job system. -- Migrations live in `src/lib/server/db/migrations/` and are run on startup. +The UI lets users: -**PCD databases** +- **Link databases** — Connect external GitHub repositories containing + Profilarr-compliant configurations. Supports public and private repos with + auth tokens. -- Git repos cloned into `data/databases/`. -- Compiled into an in-memory SQLite cache (`PCDCache`) using ordered SQL operations. -- Layers in order: `schema` → `base` → `tweaks` → `user`. -- SQL helper functions available inside PCD ops: `qp`, `cf`, `dp`, `tag`. +- **Connect Arr instances** — Add Radarr, Sonarr, Lidarr, or Chaptarr instances + by URL and API key. -## Adding a Migration +- **Manage entities** — Create and edit quality profiles, custom formats, delay + profiles, media management settings, and regular expressions. Each entity type + has its own editor with testing capabilities. -1. Copy `src/lib/server/db/migrations/_template.ts` to a new file like `021_add_foo.ts`. -2. Update `version` and `name`, then fill out `up` SQL and (ideally) `down` SQL. -3. Add a static import in `src/lib/server/db/migrations.ts`. -4. Add the new migration to `loadMigrations()` (keep sequential ordering). +- **Configure sync** — Set up how and when configurations push to each Arr + instance. Strategies include manual, scheduled (cron), on-pull, and on-change. + Dependencies like custom formats auto-sync with their parent profiles. -Notes: +- **Browse libraries** — View downloaded media with filtering, sorting, and bulk + profile reassignment (Radarr only currently). 
-- Versions must be unique and sequential. -- Never edit an applied migration; create a new one instead. -- Migrations run automatically on server startup. +- **Manage upgrades** — Configure automatic quality upgrades with filters, + schedules, and dry-run testing. -## Working with PCDs +- **Settings** — Notifications (Discord/Slack/Email), backups, logging, theming, + and background job management. -**PCD layout** +#### Routes, Not Modals + +Prefer routes over modals. Modals should only be used for things requiring +immediate attention—confirmations like "you have unsaved changes" or "are you +sure you want to delete this?" They can also display supplementary information +about a page that wouldn't fit in the layout otherwise. + +In rare cases, modals can be used for one-time forms. Use this sparingly and +only when a route would be excessively nested. The only place we do this is for +adding test entities and releases to those entities. Without modals there, we'd +be 5-6 routes deep and the breadcrumbs become confusing. + +Examples: + +- `src/routes/databases/+page.svelte` — Confirmation modal for unlinking a + database. Warns about data loss before proceeding. +- `src/routes/arr/+page.svelte` — Confirmation modal for deleting an Arr + instance. +- `src/routes/settings/backups/+page.svelte` — Confirmation modals for both + deleting and restoring backups. +- `src/routes/arr/[id]/upgrades/components/UpgradesInfoModal.svelte` — Info + modal explaining how the upgrades module works. Too much content for the page + itself. +- `src/routes/quality-profiles/entity-testing/[databaseId]/components/AddEntityModal.svelte` + — One-time form exception. Searches TMDB and adds test entities. A route here + would be 5+ levels deep. +- `src/routes/quality-profiles/entity-testing/[databaseId]/components/ReleaseModal.svelte` + — One-time form exception. Adds test releases to entities. + +#### Alerts + +Users need feedback when they take an action. 
Use the alert system in +`src/lib/client/alerts/` to show success, error, warning, or info messages. +Import `alertStore` and call `alertStore.add(type, message)`. + +Examples: + +- `src/routes/arr/components/InstanceForm.svelte` — Shows success/error after + testing connection, creating, or updating an Arr instance. +- `src/routes/databases/components/InstanceForm.svelte` — Shows success/error + when linking, updating, or unlinking databases. +- `src/routes/settings/general/components/TMDBSettings.svelte` — Shows + success/error after testing TMDB API connection. +- `src/routes/settings/jobs/components/JobCard.svelte` — Shows success/error + when triggering or toggling background jobs. +- `src/routes/quality-profiles/entity-testing/[databaseId]/+page.svelte` — Shows + a persistent warning (duration: 0) when the parser service is unavailable. + +#### Dirty Tracking + +The dirty store (`src/lib/client/stores/dirty.ts`) tracks whether a form has +unsaved changes by comparing current state against an original snapshot. This +serves two purposes: disabling save buttons when nothing has changed (avoiding +unnecessary requests and file writes), and warning users before they navigate +away from unsaved work via `DirtyModal.svelte`. + +How it works: + +1. **Initialize** — Call `initEdit(serverData)` for existing records or + `initCreate(defaults)` for new ones. This captures the original snapshot. +2. **Update** — Call `update(field, value)` when a field changes. The store + compares current state against the snapshot using deep equality. +3. **Check** — Subscribe to `$isDirty` to enable/disable save buttons or show + warnings. +4. **Reset** — Call `initEdit(newServerData)` after a successful save to capture + the new baseline. + +Examples: + +- `src/routes/quality-profiles/[databaseId]/[id]/languages/+page.svelte` — + Tracks language selection changes. Save button disabled until dirty. 
+- `src/routes/quality-profiles/[databaseId]/[id]/qualities/+page.svelte` — + Tracks drag-and-drop reordering of quality tiers. Shows a sticky save bar only + when dirty. +- `src/routes/custom-formats/[databaseId]/components/GeneralForm.svelte` — + Handles both create and edit modes. Uses `initCreate()` for new formats, + `initEdit()` for existing ones. +- `src/routes/arr/[id]/sync/+page.svelte` — + Aggregates dirty state from three + child components (QualityProfiles, DelayProfiles, MediaManagement). Prevents + syncing while there are unsaved changes. +- `src/lib/client/ui/modal/DirtyModal.svelte` — Global navigation guard. Uses + `beforeNavigate` to intercept route changes and prompt the user if dirty. + +#### Actions Bar + +Entity list pages use a horizontal toolbar for filters, search, and view +toggles. The components live in `src/lib/client/ui/actions/`: + +- **ActionsBar** — Container that groups child components. Uses negative margins + and CSS to make buttons appear connected (shared borders, rounded corners only + on the ends). +- **ActionButton** — Icon button with optional hover dropdown. Can be square or + variable width. +- **SearchAction** — Search input with debounce, integrates with a search store. +- **ViewToggle** — Dropdown to switch between card and table views. + +**Do not add custom margins, gaps, or wrapper divs between items inside +ActionsBar.** The component relies on direct children to calculate border +radius. Adding spacing breaks the connected appearance. + +```svelte +<!-- Illustrative reconstruction: the original tags were lost in transit; --> +<!-- the exact props vary by page. --> +<ActionsBar> + <SearchAction store={search} placeholder="Search..." /> + <ActionButton icon={Plus} label="Add" onclick={openCreate} /> + <ViewToggle store={view} /> +</ActionsBar>
+``` + +Examples: + +- `src/routes/quality-profiles/[databaseId]/+page.svelte` +- `src/routes/custom-formats/[databaseId]/+page.svelte` +- `src/routes/arr/[id]/library/components/LibraryActionBar.svelte` + +#### Dropdowns + +Hover-triggered dropdown menus live in `src/lib/client/ui/dropdown/`: + +- **Dropdown** — Positioned container for dropdown content. Has an invisible + hover bridge so the menu stays open when moving the mouse from trigger to + content. Supports `left`, `right`, or `middle` positioning. +- **DropdownItem** — Individual menu item with icon, label, and optional + `selected` checkmark. Supports `disabled` and `danger` variants. +- **CustomGroupManager** — Specialized component for managing user-defined + filter groups (used in library filtering). + +Dropdowns are typically placed inside an `ActionButton` with +`hasDropdown={true}` using the `slot="dropdown"` pattern. See +`ViewToggle.svelte` for a simple example. + +#### Tables + +Data tables live in `src/lib/client/ui/table/`: + +- **Table** — Generic data table with typed column definitions. Supports sorting + (click column headers to cycle asc/desc/none), custom cell renderers, row + click handlers, compact mode, and an `actions` slot for row-level buttons. + Column definitions include `key`, `header`, `sortable`, `align`, `width`, and + optional `cell` render function. + +- **ExpandableTable** — Rows expand on click to reveal additional content via + the `expanded` slot. Chevron indicators show expand state. Supports + `chevronPosition` (left/right), `flushExpanded` for edge-to-edge content, and + external control of `expandedRows`. + +- **ReorderableList** — Drag-and-drop list for reordering items. Uses a + sensitivity threshold to prevent flickering during drags. Calls `onReorder` + with the new array after each move. + +Column types are defined in `types.ts`. 
Key properties: + +- `sortable` — enables click-to-sort on the column header +- `sortAccessor` — function to extract the sort value (useful when display + differs from sort order) +- `sortComparator` — custom comparison function for complex sorting +- `cell` — render function returning a string, HTML object, or Svelte component + +#### Stores + +Svelte stores live in `src/lib/client/stores/`. Two patterns exist: factory +functions that create store instances, and singleton stores. + +**Factory Stores** + +Use factory functions when each page needs its own isolated state: + +- `createSearchStore()` — Debounced search with filters. Returns methods for + `setQuery()`, `setFilter()`, `clear()`, and a `filterItems()` helper. +- `createDataPageStore()` — Combines search with view toggle (table/cards). + Persists view mode to localStorage. Returns `search`, `view`, and `filtered` + derived store. + +```typescript +import { createDataPageStore } from "$stores/dataPage"; + +const { search, view, filtered } = createDataPageStore(data.profiles, { + storageKey: "qualityProfilesView", + searchKeys: ["name", "description"], +}); + +// Use in template +{#each $filtered as profile} +``` + +**Singleton Stores** + +Export instances directly for app-wide state: + +- `themeStore` — Dark/light mode +- `accentStore` — Accent color +- `sidebarCollapsed` — Sidebar state +- `alertStore` — Global alerts (imported via `$alerts/store`) +- `libraryCache` — Per-instance library data cache + +**Dirty Store** + +The dirty store (`dirty.ts`) is documented above in Dirty Tracking. It's a +singleton but with methods that make it behave like a state machine for form +change detection. + +### Backend + +Server-side code lives in `src/lib/server/`. Profilarr uses two separate data +stores: the main SQLite database for application state (Arr connections, +settings, job history), and PCD git repositories for versioned configuration +(profiles, formats, media settings). 
The main database tracks _which_ PCDs are +linked and _how_ to sync them. The PCDs contain _what_ gets synced. + +Key directories: + +- **db/** — Main SQLite database (app state, settings, job history) +- **pcd/** — PCD cache management (compile, watch, query) +- **jobs/** — Background job scheduler and definitions +- **sync/** — Logic for pushing configs to Arr instances +- **upgrades/** — Automatic quality upgrade processing +- **notifications/** — Discord/Slack/Email delivery +- **utils/** — Shared utilities (arr clients, git, http, logger, config, cache) + +#### Utils + +Shared utilities live in `src/lib/server/utils/`. These are foundational modules +used throughout the backend. + +**Config** + +The config singleton (`config/config.ts`) is the most important utility. It +centralizes all application paths and environment configuration. Import it via +`$config`. + +```typescript +import { config } from "$config"; + +// Paths +config.paths.base; // Application root +config.paths.logs; // Log directory +config.paths.data; // Data directory +config.paths.database; // SQLite database file +config.paths.databases; // PCD repositories directory +config.paths.backups; // Backup directory + +// Server +config.port; // HTTP port (default: 6868) +config.host; // Bind address (default: 0.0.0.0) +config.serverUrl; // Display URL (http://localhost:6868) + +// Services +config.parserUrl; // Parser microservice URL +config.timezone; // System timezone +``` + +The base path defaults to the executable's directory but can be overridden via +`APP_BASE_PATH`. Call `config.init()` on startup to create required directories. + +**Logger** + +The logger (`logger/logger.ts`) handles console and file output with daily +rotation. Import via `$logger/logger.ts`. 
+ +```typescript +import { logger } from "$logger/logger.ts"; + +await logger.debug("Cache miss", { source: "PCD", meta: { id: 1 } }); +await logger.info("Sync completed", { source: "Sync" }); +await logger.warn("Rate limited", { source: "GitHub" }); +await logger.error("Connection failed", { source: "Arr", meta: error }); +``` + +Log levels: DEBUG → INFO → WARN → ERROR. Users configure the minimum level in +settings. File logs are JSON (one entry per line), console logs are colored. + +**Logging guidelines:** + +- **DEBUG** — Internal state, cache hits/misses, detailed flow. Developers only. + Use liberally during development but ensure production logs aren't flooded. +- **INFO** — User-relevant events: sync completed, backup created, job finished. + Think of these as feedback for the user, similar to alerts in the frontend. + Keep them concise and actionable. +- **WARN** — Recoverable issues: rate limits, missing optional config, fallback + behavior triggered. +- **ERROR** — Failures requiring attention: connection errors, invalid data, + unhandled exceptions. + +Good logs are concise and contextual. Include `source` to identify the +subsystem. Include `meta` for structured data that helps debugging. Avoid +verbose messages or logging the same event multiple times. + +```typescript +// Good +await logger.info("Synced 5 profiles to Radarr", { source: "Sync" }); + +// Bad - too verbose, no source +await logger.info("Starting to sync profiles now..."); +await logger.info("Found 5 profiles to sync"); +await logger.info("Syncing profile 1..."); +await logger.info("Syncing profile 2..."); +``` + +**HTTP** + +The HTTP client (`http/client.ts`) provides a base class with connection pooling +and retry logic. Arr clients extend this. 
+ +```typescript +import { BaseHttpClient } from "$http/client.ts"; + +class MyClient extends BaseHttpClient { + constructor(url: string) { + super(url, { + timeout: 30000, // Request timeout (ms) + retries: 3, // Retry count for 5xx errors + retryDelay: 500, // Base delay (exponential backoff) + headers: { Authorization: "Bearer token" }, + }); + } +} + +const client = new MyClient("https://api.example.com"); +const data = await client.get("/endpoint"); +client.close(); // Release connection pool +``` + +Features: + +- Connection pooling via `Deno.createHttpClient()` +- Automatic retries with exponential backoff for 500/502/503/504 +- Configurable timeout with AbortController +- JSON request/response handling +- `HttpError` class with status code and response body + +Always call `close()` when done to release pooled connections. + +**Git** + +Git operations (`git/`) wrap command-line git for PCD repository management. + +The `Git` class (`Git.ts`) provides a clean interface per repository: + +```typescript +import { Git } from "$utils/git/index.ts"; + +const git = new Git("/path/to/repo"); + +// Repository operations +await git.fetch(); +await git.pull(); +await git.push(); +await git.checkout("main"); +await git.resetToRemote(); + +// Status queries +const branch = await git.getBranch(); +const status = await git.status(); +const updates = await git.checkForUpdates(); +const commits = await git.getCommits(10); + +// PCD operation files +const uncommitted = await git.getUncommittedOps(); +const maxOp = await git.getMaxOpNumber(); +await git.discardOps(filepaths); +await git.addOps(filepaths, "commit message"); +``` + +Key modules: + +- `repo.ts` — Clone, fetch, pull, push, checkout, reset, stage, commit +- `status.ts` — Branch info, status, update checks, commit history, diffs +- `ops.ts` — PCD-specific operations: parse operation metadata, get uncommitted + ops, renumber and commit ops + +The `clone()` function in `repo.ts` validates GitHub URLs via API before 
+cloning, detects private repositories, and handles PAT authentication. + +**Cache** + +Simple in-memory cache with TTL (`cache/cache.ts`): + +```typescript +import { cache } from "$cache/cache.ts"; + +cache.set("key", data, 300); // TTL in seconds +const value = cache.get("key"); // Returns undefined if expired +cache.delete("key"); +cache.deleteByPrefix("library:"); // Clear related entries +cache.clear(); +``` + +Used for expensive computations like library data fetching. Not persisted across +restarts. + +**AI** + +Optional AI integration (`ai/client.ts`) for generating commit messages from +diffs. Supports OpenAI-compatible APIs including local models. + +```typescript +import { isAIEnabled, generateCommitMessage } from "$utils/ai/client.ts"; + +if (isAIEnabled()) { + const message = await generateCommitMessage(diffText); +} +``` + +Configured via settings UI (API URL, model, optional API key). Uses Chat +Completions API for most models, Responses API for GPT-5. + +#### Main Database + +SQLite database in `src/lib/server/db/`. No ORM—raw SQL with typed wrappers. + +Migrations live in `migrations/` as numbered TypeScript files. Each exports a +`migration` object with `version`, `name`, `up` (SQL), and optional `down`. New +migrations must be imported and added to the array in `migrations.ts`, and +`schema.sql` must be updated to reflect the current schema. Migrations run +automatically on startup in order. + +Examples: + +- `src/lib/server/db/migrations/001_create_arr_instances.ts` +- `src/lib/server/db/migrations/007_create_notification_tables.ts` +- `src/lib/server/db/schema.sql` + +All queries live in `queries/`, one file per table. Each file exports a query +object (e.g., `arrInstancesQueries`) with typed methods for CRUD operations. +**Queries are not written anywhere else in the codebase**—route handlers and +other code import from `queries/` rather than writing SQL inline. 
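As a sketch of that convention, a hypothetical `notes` table would get its own file in `queries/` exporting a single typed query object. The `Db` interface, the `notes` table, and every name below are invented for illustration; real files follow the shape of `arrInstancesQueries`.

```typescript
// Minimal stand-in for the database handle; the real code wraps SQLite.
interface Db {
  query<T>(sql: string, params?: unknown[]): T[];
  run(sql: string, params?: unknown[]): void;
}

export interface Note {
  id: number;
  title: string;
}

// One exported query object per table; no SQL anywhere else in the codebase.
export function createNotesQueries(db: Db) {
  return {
    getAll(): Note[] {
      return db.query<Note>("SELECT id, title FROM notes ORDER BY id");
    },
    getById(id: number): Note | undefined {
      return db.query<Note>("SELECT id, title FROM notes WHERE id = ?", [id])[0];
    },
    create(title: string): void {
      db.run("INSERT INTO notes (title) VALUES (?)", [title]);
    },
  };
}
```

Route handlers then import `createNotesQueries` (or, in the real code, the per-table query object) instead of writing SQL inline.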
+ +Examples: + +- `src/lib/server/db/queries/arrInstances.ts` +- `src/lib/server/db/queries/jobs.ts` + +#### PCD (Profilarr Compliant Database) + +PCDs are git repositories containing versioned configuration data—quality +profiles, custom formats, delay profiles, media management settings, and regular +expressions. Unlike the main database which stores application state directly, +PCDs store _operations_: append-only SQL files that are replayed to build an +in-memory database. This design enables git-based versioning, conflict-free +merging, and layered customization. + +Every PCD depends on a shared schema repository +([github.com/Dictionarry-Hub/schema](https://github.com/Dictionarry-Hub/schema)) +that defines the base tables. The official database is +[github.com/Dictionarry-Hub/db](https://github.com/Dictionarry-Hub/db). + +**Operational SQL (OSQL)** + +PCDs use an append-only approach where each change is written as a numbered SQL +file. Instead of mutating rows directly, you append INSERT, UPDATE, or DELETE +statements. When the cache compiles, it replays all operations in order to build +the current state. This makes every change trackable in git history and enables +non-destructive layering. + +**Layers** + +Operations are loaded and executed in a specific order: + +1. **Schema** (`deps/schema/ops/`) — Table definitions and seed data from the + schema dependency. Creates the database structure. + +2. **Base** (`ops/`) — The PCD's main configuration data. Quality profiles, + custom formats, and other entities maintained by the database author. + +3. **Tweaks** (`tweaks/`) — Optional adjustments that apply on top of base. + Useful for variant configurations or environment-specific overrides. + +4. **User** (`user_ops/`) — Local modifications made by the end user. These stay + on the user's machine and persist across pulls from upstream. + +Files within each layer are sorted by numeric prefix (`1.initial.sql`, +`2.add-formats.sql`, etc.) and executed in order. 
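Note that the prefix sort is numeric, not lexicographic, so `10.x.sql` runs after `2.y.sql`. A minimal sketch of the ordering rule (helper names are invented here; the real loading happens in the cache's `loadAllOperations()`):

```typescript
// Sort .sql operation files by numeric prefix: "1.initial.sql" -> 1.
export function sortOps(files: string[]): string[] {
  return [...files].sort((a, b) => parseInt(a, 10) - parseInt(b, 10));
}

// Layers replay in a fixed order; each layer sees the state left by the last.
const LAYER_ORDER = ["schema", "base", "tweaks", "user"] as const;

export function orderAllOps(layers: Record<string, string[]>): string[] {
  return LAYER_ORDER.flatMap((layer) =>
    sortOps(layers[layer] ?? []).map((f) => `${layer}/${f}`),
  );
}
```

A plain string sort would replay `10.x.sql` before `2.y.sql`, which is why the prefix is parsed as a number.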
+ +**Repository Layout** ``` my-pcd/ -├── pcd.json -├── ops/ -└── tweaks/ +├── pcd.json # Manifest file +├── ops/ # Base layer operations +│ ├── 1.initial.sql +│ └── 2.custom-formats.sql +├── deps/ +│ └── schema/ # Schema dependency (git submodule) +│ └── ops/ +├── tweaks/ # Optional tweaks layer +└── user_ops/ # User modifications (gitignored) ``` -**Authoring operations** +**Manifest** -- Follow the append-only Operational SQL approach. -- Use expected-value guards in `UPDATE` statements to surface conflicts. -- New ops go in `ops/` or `tweaks/` depending on intent. +Every PCD requires a `pcd.json` manifest: -**User ops** +```json +{ + "name": "my-database", + "version": "1.0.0", + "description": "Custom Arr configurations", + "dependencies": { + "https://github.com/Dictionarry-Hub/schema": "main" + }, + "arr_types": ["radarr", "sonarr"], + "profilarr": { + "minimum_version": "2.0.0" + } +} +``` -Profilarr writes user edits via `src/lib/server/pcd/writer.ts` into `user_ops/`, rebuilding the in-memory cache after write. +**Cache Compilation** -## Client UI Components +When Profilarr loads a PCD, it creates an in-memory SQLite database and replays +all operations in layer order. The `PCDCache` class in +`src/lib/server/pcd/cache.ts` handles this: -Shared UI lives in `src/lib/client/ui/`. Route-specific components live next to their routes. +1. Creates an in-memory SQLite database +2. Registers helper functions (`qp()`, `cf()`, `dp()`, `tag()`) for entity + lookups +3. Loads operations from all layers via `loadAllOperations()` +4. Executes each SQL file in order +5. Exposes the compiled database through Kysely for type-safe queries -**Alerts and toasts** +File watchers monitor the ops directories. When a `.sql` file changes, the cache +automatically recompiles after a short debounce. -- Store: `src/lib/client/alerts/store.ts` -- Use the alert store for success/error/info toasts in `enhance` actions and API responses. 
+**Writing Operations** -**Actions and toolbars** +When users edit entities through the frontend, changes are not applied directly +to the in-memory cache. Instead, `src/lib/server/pcd/writer.ts` generates SQL +files and writes them to the appropriate layer: -- `src/lib/client/ui/actions/ActionsBar.svelte` -- `src/lib/client/ui/actions/ActionButton.svelte` -- `src/lib/client/ui/actions/SearchAction.svelte` -- `src/lib/client/ui/actions/ViewToggle.svelte` +- **Base layer** (`ops/`) — For database maintainers with push access. Requires + a personal access token. +- **User layer** (`user_ops/`) — For local modifications. No authentication + required. -**Dropdowns** +The writer converts Kysely queries to executable SQL, assigns the next sequence +number, and writes the file. After writing, it triggers a cache recompile so +changes appear immediately. -- `src/lib/client/ui/dropdown/Dropdown.svelte` -- `src/lib/client/ui/dropdown/DropdownItem.svelte` +```typescript +// Example: writer converts this Kysely query to a .sql file +await writeOperation({ + databaseId: 1, + layer: "user", + description: "update-profile-score", + queries: [compiledKyselyQuery], + metadata: { + operation: "update", + entity: "quality_profile", + name: "HD Bluray + WEB", + }, +}); +``` -**Buttons** +This writes `user_ops/5.update-profile-score.sql` with the SQL and metadata +header, then recompiles the cache. -- `src/lib/client/ui/button/Button.svelte` (variants + sizes) +**Queries** -**Forms** +PCD queries live in `src/lib/server/pcd/queries/`, organized by entity type. 
+Each query file exports functions that use the `PCDCache` instance to read +compiled data: -- `FormInput`, `NumberInput`, `TagInput`, `IconCheckbox` +- `src/lib/server/pcd/queries/qualityProfiles/` — List, get, create, update +- `src/lib/server/pcd/queries/customFormats/` — List, get, conditions, tests +- `src/lib/server/pcd/queries/delayProfiles/` +- `src/lib/server/pcd/queries/regularExpressions/` +- `src/lib/server/pcd/queries/mediaManagement/` -**Tables and lists** +#### Sync -- `Table`, `ExpandableTable`, `ReorderableList` +The sync module (`src/lib/server/sync/`) pushes compiled PCD configurations to +Arr instances. It reads from the PCD cache, transforms data to match each Arr's +API format, and creates or updates entities by name. -**Modals** +**Architecture** -- `Modal`, `SaveTargetModal`, `UnsavedChangesModal`, `InfoModal` +Syncers extend `BaseSyncer`, which provides a fetch → transform → push pattern: -**Navigation** +1. **Fetch** — Read entities from the PCD cache +2. **Transform** — Convert PCD data to Arr API format using transformers +3. **Push** — Create or update entities in the Arr instance (matched by name) -- `navbar`, `pageNav`, `tabs` +Three syncer implementations handle different entity types: -**State and empty views** +- `QualityProfileSyncer` — Syncs quality profiles and their dependent custom + formats. Custom formats sync first so profile references resolve correctly. +- `DelayProfileSyncer` — Syncs delay profiles with protocol preferences and + bypass settings. +- `MediaManagementSyncer` — Syncs naming conventions, quality definitions, and + media settings. -- `EmptyState` +**Triggers** -## Svelte Conventions +Syncs are triggered by `should_sync` flags in the main database. The processor +evaluates these flags and runs appropriate syncers: -- Use Svelte 4 syntax (`export let`, `$:`) even though Svelte 5 is installed. -- Avoid Svelte 5 runes unless explicitly used in that module. 
-- Route-specific components should be colocated under their route directory. +- **Manual** — User clicks "Sync Now" in the UI +- **on_pull** — Triggered after pulling updates from a database repository +- **on_change** — Triggered when PCD files change (detected by file watcher) +- **schedule** — Cron expressions evaluated periodically; marks configs for sync + when the schedule matches -## Tests +The `processPendingSyncs()` function in `processor.ts` orchestrates all pending +syncs, iterating through flagged instances and running the appropriate syncers. -- Tests live in `src/tests/` and run with `deno task test`. -- Base test utilities are in `src/tests/base/BaseTest.ts`. -- Many tests create temp dirs under `/tmp/profilarr-tests`. +**Transformers** -## Parser Microservice (Optional) +Transformers in `transformers/` convert PCD data structures to Arr API payloads. +They handle differences between Radarr and Sonarr APIs: -If you touch parser-related code, see `docs/PARSER_PORT_DESIGN.md` and `services/parser/`. +- `customFormat.ts` — Transforms custom format conditions to API specifications. + Maps condition types (release_title, source, resolution) to their API + implementations and converts values using mappings. +- `qualityProfile.ts` — Transforms quality tiers, language settings, and format + scores. Handles quality name differences between apps. -- `dotnet run` from `services/parser/` -- Configure `PARSER_HOST` / `PARSER_PORT` in Profilarr +**Mappings** + +`mappings.ts` contains constants for translating between PCD values and Arr API +values. This includes indexer flags, sources, resolutions, quality definitions, +and languages. Each constant has separate mappings for Radarr and Sonarr where +their APIs differ. + +#### Jobs + +The job system (`src/lib/server/jobs/`) runs background tasks on schedules. Jobs +handle recurring operations like syncing databases, creating backups, cleaning +up logs, and processing upgrades. 
+
+**Components**
+
+- **Registry** (`registry.ts`) — Stores job definitions in memory. Jobs register
+  on startup and can be looked up by name.
+- **Scheduler** (`scheduler.ts`) — Checks for due jobs every minute and triggers
+  execution. Prevents concurrent runs of the same check cycle.
+- **Runner** (`runner.ts`) — Executes a job's handler, records the run in the
+  database, calculates the next run time, and sends notifications on
+  success/failure.
+- **Init** (`init.ts`) — Registers all job definitions and syncs them with the
+  database on startup.
+
+**Defining Jobs**
+
+Job definitions live in `definitions/`. Each exports a `JobDefinition` with
+name, description, schedule (cron expression), and handler function:
+
+```typescript
+export const myJob: JobDefinition = {
+  name: "my_job",
+  description: "Does something useful",
+  schedule: "0 * * * *", // Every hour
+  handler: async (): Promise<{ success: boolean; output?: string }> => {
+    // Job logic here
+    return { success: true, output: "Done" };
+  },
+};
+```
+
+Register the job in `init.ts` by importing and calling
+`jobRegistry.register(myJob)`.
+
+**Built-in Jobs**
+
+- `sync_arr` — Processes pending syncs to Arr instances (every minute)
+- `sync_databases` — Pulls updates from linked database repositories
+- `create_backup` — Creates application backups
+- `cleanup_backups` — Removes old backups based on retention settings
+- `cleanup_logs` — Prunes old log entries
+- `upgrade_manager` — Processes automatic quality upgrades
+
+**Job Logic**
+
+Complex job logic lives in `logic/`. Definition files stay thin—they just wire
+up the handler to the logic function. This keeps definitions readable and logic
+testable.
+
+#### Notifications
+
+The notification system (`src/lib/server/notifications/`) sends alerts to
+external services like Discord. It's fire-and-forget: failures are logged but
+don't interrupt the calling code.
+
+**Components**
+
+- **NotificationManager** (`NotificationManager.ts`) — Central orchestrator.
+  Queries enabled services from the database, filters by notification type, and
+  dispatches to appropriate notifiers. Records all attempts in history.
+- **Builder** (`builder.ts`) — Fluent API for constructing notifications. Chain
+  `.title()`, `.message()`, `.meta()`, and call `.send()`.
+- **Notifiers** (`notifiers/`) — Service-specific implementations. Each extends
+  `BaseHttpNotifier` and formats payloads for its API.
+
+**Usage**
+
+```typescript
+import { notify } from "$notifications/builder.ts";
+import { NotificationTypes } from "$notifications/types.ts";
+
+await notify(NotificationTypes.PCD_SYNC_SUCCESS)
+  .title("Sync Complete")
+  .message("Synced 5 profiles to Radarr")
+  .meta({ instanceId: 1, profileCount: 5 })
+  .send();
+```
+
+**Notification Types**
+
+`types.ts` defines type constants for categorizing notifications:
+
+- `job.<name>.success` / `job.<name>.failed` — Job completion status
+- `pcd.linked` / `pcd.unlinked` — Database connection changes
+- `pcd.sync_success` / `pcd.sync_failed` — Sync results
+- `upgrade.success` / `upgrade.partial` / `upgrade.failed` — Upgrade results
+
+Users configure which types each service receives in the settings UI.
+
+**Planned Services**
+
+Currently only Discord is implemented. Planned additions:
+
+- Telegram
+- Slack
+- Ntfy
+- Apprise
+- SMTP (email)
+- Generic webhooks
+
+**Adding Notifiers**
+
+To add a new notification service:
+
+1. Create a config interface in `types.ts`
+2. Create a notifier class in `notifiers/` extending `BaseHttpNotifier`
+3. Implement `getName()`, `getWebhookUrl()`, and `formatPayload()`
+4. Add the case to `NotificationManager.createNotifier()`
+
+#### Arr Clients
+
+The arr utilities (`src/lib/server/utils/arr/`) provide typed HTTP clients for
+communicating with Radarr, Sonarr, Lidarr, and Chaptarr instances.
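The factory pattern this module uses can be pictured as a construction helper that tags each client with its app type. A hypothetical sketch (illustrative shapes only; the real `createArrClient` and client classes live in `src/lib/server/utils/arr/`):

```typescript
// Hypothetical, minimal stand-in for the real client classes. The actual
// clients extend BaseArrClient and add app-specific operations.
type ArrType = "radarr" | "sonarr" | "lidarr" | "chaptarr";

class SketchArrClient {
  constructor(
    readonly type: ArrType,
    readonly baseUrl: string,
    private readonly apiKey: string,
  ) {}

  // Arr APIs authenticate via the X-Api-Key header on every request.
  headers(): Record<string, string> {
    return { "X-Api-Key": this.apiKey };
  }
}

function createArrClient(type: ArrType, url: string, apiKey: string): SketchArrClient {
  // The real factory returns RadarrClient, SonarrClient, etc. based on type.
  return new SketchArrClient(type, url.replace(/\/+$/, ""), apiKey);
}
```

Callers never pick a concrete class themselves; they pass the stored instance type and let the factory choose.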
+ +**Base Client** + +`BaseArrClient` extends `BaseHttpClient` with arr-specific methods: connection +testing, delay profiles, tags, media management config, naming config, quality +definitions, custom formats, and quality profiles. All arr clients inherit from +this base. + +**App-Specific Clients** + +Each arr has its own client in `clients/` that extends `BaseArrClient` with +app-specific functionality: + +- `RadarrClient` — Adds movie operations, library fetching with computed custom + format scores, search commands, and tag management. +- `SonarrClient` — Series and episode operations. +- `LidarrClient` — Artist and album operations. +- `ChaptarrClient` — Chapter-specific operations. + +**Factory** + +`createArrClient(type, url, apiKey)` returns the appropriate client instance +based on the arr type. Used throughout the codebase when interacting with arr +instances. + +**Library Browser** + +The library browser (`src/routes/arr/[id]/library/`) displays downloaded media +with computed custom format scores and cutoff progress. + +**Supported:** Radarr only. **TODO:** Sonarr library views. + +The page fetches library data via API, which calls `RadarrClient.getLibrary()`. +This pulls movies, quality profiles, and movie files in parallel, then computes: + +- **Custom format score** — Sum of matched format scores from the profile +- **Cutoff progress** — Score as percentage of cutoff (0% to 100%+) +- **Score breakdown** — Individual format contributions shown on row expand + +Features: + +- **Filtering** — Filter by quality name or profile. Multiple filters use OR + within the same field, AND across fields. +- **Search** — Debounced title search. +- **Column visibility** — Toggle columns on/off, persisted to localStorage. +- **Profilarr profile detection** — Movies using profiles synced from Profilarr + databases show a blue badge; others show amber with a warning icon. 
+- **Expandable rows** — Click a row to see filename and score breakdown with + each format's contribution color-coded (green positive, red negative). +- **Client-side caching** — Library data cached per instance to avoid refetching + on navigation. Refresh button clears cache and refetches. + +#### Upgrades + +The upgrade system (`src/lib/server/upgrades/`) solves a fundamental limitation +of how Radarr and Sonarr work. The arrs don't search for the _best_ release—they +monitor RSS feeds and grab the first thing that qualifies as an upgrade. To +actually get optimal releases, you need to trigger manual searches. + +Profilarr's upgrade module automates this with configurable filters and +selectors, similar to +[Upgradinatorr](https://github.com/angrycuban13/Just-A-Bunch-Of-Starr-Scripts/blob/main/Upgradinatorr/README.md) +but built directly into the app. + +**Shared Types** + +Filter and selector logic lives in `src/lib/shared/` so both frontend and +backend use the same definitions: + +- `filters.ts` — Filter field definitions (monitored, cutoff_met, year, + popularity, tmdb_rating, etc.), operators (boolean, number, text, date), + rule/group types, and the `evaluateGroup()` function that recursively + evaluates nested AND/OR logic. +- `selectors.ts` — Selector definitions (random, oldest, newest, lowest_score, + most_popular, least_popular) with their `select()` functions. Each selector + sorts/shuffles items and returns the top N. + +**Processing Flow** + +The upgrade processor (`processor.ts`) orchestrates each run: + +1. **Fetch** — Pull the entire library from the arr instance along with quality + profiles and movie files. +2. **Normalize** — Convert arr data to a flat structure with fields matching + filter rule names (`monitored`, `cutoff_met`, `size_on_disk`, `tmdb_rating`, + `popularity`, etc.). +3. **Filter** — Call `evaluateGroup()` from `$shared/filters.ts` to evaluate + rules using AND/OR group logic. 
Supports nested groups and operators + appropriate to each field type. +4. **Cooldown** — Remove items that were searched recently. The system uses + date-based tags (e.g., `profilarr-searched-2026-01-15`) to track when items + were last searched. +5. **Select** — Call `getSelector()` from `$shared/selectors.ts` to pick which + items get searched. Options: random, oldest, newest, lowest CF score, most + popular, least popular. +6. **Search** — Trigger searches via the arr's command API. Tag searched items + with today's date for cooldown tracking. + +**Filter Modes** + +When multiple filters are configured: + +- **Round Robin** — Cycles through filters in order, one filter per scheduled + run. Filter index persists across runs. +- **Random** — Picks a random enabled filter each run. + +**Dry Run** + +Configs can be set to dry run mode, which executes the full filter/select +pipeline but skips the actual search and tagging. Useful for testing filter +logic before enabling real searches. + +**Structured Logging** + +Each upgrade run produces an `UpgradeJobLog` with detailed metrics: library +size, filter match counts, cooldown effects, selection details, search results. +The logger (`logger.ts`) formats these for the application log. + +**Rename (TODO)** + +A future rename module will use the same architecture but simpler: instead of +triggering searches, it will trigger rename commands for items matching filters. +Same filter/select flow, different action. + +### Microservices + +#### Parser + +A C# parser module lives in `src/services/parser`. This is a direct port of +Radarr/Sonarr's parsing logic, packaged under a single unified endpoint that +Profilarr uses for its testing functionality. It runs as a separate service and +communicates with the main app over HTTP. + +- **.NET 8.0** (`net8.0`) + +### API + +API routes live in `src/routes/api/`. The API is documented using OpenAPI 3.1 +specification in `docs/api/v1/`. 
+ +**Documentation Requirement** + +When adding a new API endpoint, you must document it in the OpenAPI spec. This +is not optional. The spec serves as the source of truth for API consumers and +generates TypeScript types via `deno task generate:api-types`. + +**Spec Structure** + +``` +docs/api/v1/ +├── openapi.yaml # Main spec file, references paths and schemas +├── paths/ # Endpoint definitions grouped by domain +│ └── system.yaml # Example: health, openapi endpoints +└── schemas/ # Reusable type definitions + ├── common.yaml # Shared types (ComponentStatus, etc.) + └── health.yaml # Domain-specific types +``` + +**Adding an Endpoint** + +1. Create or update a path file in `docs/api/v1/paths/`: + +```yaml +# paths/databases.yaml +list: + get: + operationId: listDatabases + summary: List all databases + description: Returns all linked PCD databases + tags: + - Databases + responses: + "200": + description: List of databases + content: + application/json: + schema: + type: array + items: + $ref: "../schemas/database.yaml#/Database" +``` + +2. Reference it in `openapi.yaml`: + +```yaml +paths: + /databases: + $ref: "./paths/databases.yaml#/list" +``` + +3. Add any new schemas to `schemas/`: + +```yaml +# schemas/database.yaml +Database: + type: object + required: + - id + - name + - repositoryUrl + properties: + id: + type: integer + name: + type: string + repositoryUrl: + type: string + format: uri +``` + +4. Run `deno task generate:api-types` to regenerate TypeScript types. + +**Route Conventions** + +- Return JSON with consistent shapes: + - Success: `{ success: true, data?: ... 
}` or just the data + - Error: `{ success: false, error: "message" }` +- Use appropriate status codes: 200 OK, 201 Created, 400 Bad Request, 404 Not + Found, 500 Internal Server Error +- Validate input early with guard clauses +- Wrap operations in try-catch, return 500 with error message for unexpected + failures + +**Viewing Docs** + +The OpenAPI spec is served at `/api/v1/openapi.json` when the app is running. +You can load this into Swagger UI or other OpenAPI tools to browse the API +interactively diff --git a/docs/PARSER_PORT_DESIGN.md b/docs/PARSER_PORT_DESIGN.md deleted file mode 100644 index 7628ec2..0000000 --- a/docs/PARSER_PORT_DESIGN.md +++ /dev/null @@ -1,293 +0,0 @@ -# Unified Release Title Parser - C# Microservice - -Parser microservice for release title parsing, using native .NET regex for exact Radarr/Sonarr parity. - ---- - -## Goal - -Enable testing of custom format conditions against release titles without requiring a connected arr instance. Uses a C# microservice with regex patterns copied directly from Radarr/Sonarr source. 
- ---- - -## Architecture - -``` -┌─────────────────┐ HTTP ┌─────────────────────┐ -│ │ POST │ │ -│ Profilarr UI │ ───────────> │ Parser Service │ -│ (SvelteKit) │ /parse │ (C# / .NET 8) │ -│ │ <─────────── │ │ -└─────────────────┘ JSON └─────────────────────┘ -``` - -**Why C# microservice?** -- Native .NET regex - exact parity with Radarr/Sonarr -- Copy parser classes verbatim from source -- Fast (~1-5ms per parse) -- Easy to sync with upstream changes - ---- - -## Current Status - -### Completed (Phase 1-6) - -- [x] C# microservice scaffolded (`services/parser/`) -- [x] QualityParser ported from Radarr -- [x] TypeScript client in Profilarr -- [x] Config for `PARSER_HOST` / `PARSER_PORT` -- [x] LanguageParser ported from Radarr (58 languages supported) -- [x] ReleaseGroupParser ported from Radarr -- [x] TitleParser ported from Radarr (title, year, edition, IMDB/TMDB IDs) -- [x] EpisodeParser ported from Sonarr (ReleaseType, season/episode detection) - -### Remaining (Phase 7+) - -- [ ] Custom format testing UI integration - ---- - -## File Structure - -### C# Microservice - -``` -services/parser/ -├── Parser.csproj -├── Program.cs # Minimal API (POST /parse, GET /health) -├── Dockerfile -├── docker-compose.yml # Standalone docker compose -└── Core/ - ├── Types.cs # QualitySource, Resolution, QualityModifier enums - ├── Language.cs # Language enum (58 languages) - ├── RegexReplace.cs # Helper for regex replacement - ├── ParserCommon.cs # Shared regex patterns - ├── QualityParser.cs # Ported from Radarr (regex + decision tree) - ├── LanguageParser.cs # Ported from Radarr (language detection) - ├── ReleaseGroupParser.cs # Ported from Radarr (release group extraction) - ├── TitleParser.cs # Ported from Radarr (title, year, edition, IDs) - └── EpisodeParser.cs # Ported from Sonarr (season/episode, ReleaseType) -``` - -### TypeScript Client - -``` -src/lib/server/utils/arr/parser/ -├── index.ts # Exports -├── types.ts # Matching TypeScript enums -└── client.ts # HTTP 
client (uses config.parserUrl) -``` - -### Configuration - -``` -src/lib/server/utils/config/config.ts -``` - -Environment variables: -- `PARSER_HOST` (default: `localhost`) -- `PARSER_PORT` (default: `5000`) - ---- - -## API - -### POST /parse - -Request: -```json -{ "title": "Movie.Name.2024.1080p.BluRay.REMUX-GROUP" } -``` - -Response (movie): -```json -{ - "title": "Movie.Name.2024.1080p.BluRay.REMUX-GROUP", - "source": "Bluray", - "resolution": 1080, - "modifier": "Remux", - "revision": { - "version": 1, - "real": 0, - "isRepack": false - }, - "languages": ["Unknown"], - "releaseGroup": "GROUP", - "movieTitles": ["Movie Name"], - "year": 2024, - "edition": null, - "imdbId": null, - "tmdbId": 0, - "hardcodedSubs": null, - "releaseHash": null, - "episode": null -} -``` - -Response (TV series): -```json -{ - "title": "Show.Name.S01E05.Episode.Title.1080p.WEB-DL-GROUP", - "source": "WebDL", - "resolution": 1080, - "modifier": "None", - "revision": { "version": 1, "real": 0, "isRepack": false }, - "languages": ["Unknown"], - "releaseGroup": "GROUP", - "movieTitles": [], - "year": 0, - "edition": null, - "imdbId": null, - "tmdbId": 0, - "hardcodedSubs": null, - "releaseHash": null, - "episode": { - "seriesTitle": "Show Name", - "seasonNumber": 1, - "episodeNumbers": [5], - "absoluteEpisodeNumbers": [], - "airDate": null, - "fullSeason": false, - "isPartialSeason": false, - "isMultiSeason": false, - "isMiniSeries": false, - "special": false, - "releaseType": "SingleEpisode" - } -} -``` - -### GET /health - -Response: -```json -{ "status": "healthy" } -``` - ---- - -## Enums - -### QualitySource -```csharp -Unknown, Cam, Telesync, Telecine, Workprint, DVD, TV, WebDL, WebRip, Bluray -``` - -### Resolution -```csharp -Unknown = 0, R360p = 360, R480p = 480, R540p = 540, R576p = 576, -R720p = 720, R1080p = 1080, R2160p = 2160 -``` - -### QualityModifier -```csharp -None, Regional, Screener, RawHD, BRDisk, Remux -``` - -### ReleaseType -```csharp -Unknown, SingleEpisode, 
MultiEpisode, SeasonPack -``` - -### Language (58 supported) -```csharp -Unknown, English, French, Spanish, German, Italian, Danish, Dutch, Japanese, -Icelandic, Chinese, Russian, Polish, Vietnamese, Swedish, Norwegian, Finnish, -Turkish, Portuguese, Flemish, Greek, Korean, Hungarian, Hebrew, Lithuanian, -Czech, Hindi, Romanian, Thai, Bulgarian, PortugueseBR, Arabic, Ukrainian, -Persian, Bengali, Slovak, Latvian, SpanishLatino, Catalan, Croatian, Serbian, -Bosnian, Estonian, Tamil, Indonesian, Telugu, Macedonian, Slovenian, Malayalam, -Kannada, Albanian, Afrikaans, Marathi, Tagalog, Urdu, Romansh, Mongolian, -Georgian, Original -``` - ---- - -## Running the Service - -### Local Development - -```bash -cd services/parser -dotnet run -``` - -### Docker - -```bash -docker build -t profilarr-parser services/parser -docker run -p 5000:5000 profilarr-parser -``` - -### Docker Compose (standalone) - -```bash -cd services/parser -docker compose up -d -``` - -This uses the `services/parser/docker-compose.yml` which builds and runs the parser service. 
- -### Docker Compose (integrate with Profilarr) - -Add to your main docker-compose: - -```yaml -services: - parser: - build: ./services/parser - ports: - - "5000:5000" -``` - -Set in Profilarr environment: -``` -PARSER_HOST=parser -PARSER_PORT=5000 -``` - ---- - -## Source Reference - -Radarr parser source (cloned to `dist/parser-research/Radarr/`): - -| File | Purpose | Status | -|------|---------|--------| -| `QualityParser.cs` | Source, resolution, modifier detection | ✅ Ported | -| `LanguageParser.cs` | Language detection (58 languages) | ✅ Ported | -| `ReleaseGroupParser.cs` | Release group extraction | ✅ Ported | -| `Parser.cs` | Title/year/edition extraction | ✅ Ported | - -Sonarr additions (cloned to `dist/parser-research/Sonarr/`): - -| File | Purpose | Status | -|------|---------|--------| -| `Parser.cs` | Episode/season detection (40+ regex patterns) | ✅ Ported | -| `Model/ReleaseType.cs` | SingleEpisode, MultiEpisode, SeasonPack | ✅ Ported | -| `Model/ParsedEpisodeInfo.cs` | Episode info structure | ✅ Ported | - ---- - -## Next Steps - -1. **UI integration** - Custom format testing component - ---- - -## Maintenance - -To sync with upstream Radarr/Sonarr changes: - -```bash -cd dist/parser-research/Radarr -git pull -git diff HEAD~50 src/NzbDrone.Core/Parser/ - -cd dist/parser-research/Sonarr -git pull -git diff HEAD~50 src/NzbDrone.Core/Parser/ -``` - -Copy updated regex patterns and logic to `services/parser/Core/`. diff --git a/docs/PCD SPEC.md b/docs/PCD SPEC.md deleted file mode 100644 index cac67b3..0000000 --- a/docs/PCD SPEC.md +++ /dev/null @@ -1,102 +0,0 @@ -# Profile Compliant Databases (PCDs) - -## 1. Purpose - -PCDs describe a database as a sequence of SQL operations, not as final data. The -stored artifact is **how to build the state**, not **the state** itself. We -describe this as _operational_, instead of the traditional _stateful_. - -## 2. Operational SQL (OSQL) - -PCDs use SQL in an append-only, ordered way. 
Call this **Operational SQL -(OSQL)**. - -1. **Append-only**: once an operation exists, it is never edited or deleted. -2. **Ordered**: operations run in a defined order; later operations can override - the effects of earlier ones. -3. **Replayable**: anyone can rebuild the database by replaying operations in - order. -4. **Relational**: operations target real tables/columns/rows, so constraints - (FKs) still apply. - -This gives "Mutable Immutability": history is immutable; results are mutable -because new ops (operations) can be added. - -## 3. Change-Driven Development (CDD) - -CDD is the workflow for producing operations. - -1. Start from a change: "profile `1080p Quality HDR` should give `Dolby Atmos` a - higher score". -2. Express it as a single SQL operation: - -```sql -UPDATE quality_profile_custom_formats -SET score = 1200 -WHERE profile_id = qp('1080p Quality HDR') -AND custom_format_id = cf('Dolby Atmos') -AND score = 400; -- expected previous value -``` - -3. Append it to the appropriate layer (see Layers below) -4. Recompose. - -The expected-value guard (`AND score = 400`) is what makes conflicts explicit. - -## 4. Layers - -PCDs run in layers. Every layer is append-only, but later layers can override -the effect of earlier ones. - -1. **Schema**\ - Core DDL for the PCD. Created and maintained by Profilarr. Creates tables, - FKs, indexes. **No data.** - -2. **Dependencies**\ - Reserved for future use. Will allow PCDs to compose with other PCDs. - -3. **Base**\ - The actual shipped database content (profiles, quality lists, format - definitions) for this PCD/version. - -4. **Tweaks**\ - Optional, append-only operations that adjust behaviour (allow DV, allow CAMS, - disable group Z). - -5. **User Ops**\ - User changes created for a specific instantiation of a database. Heavy value - guards to detect conflicts and alert users when upstream changes. - -## 5. 
Repository Layout - -A PCD repository has a manifest, an operations folder, and an optional tweaks -folder. - -```text -my-pcd/ -├── pcd.json -├── ops/ -│ ├── 1.create-1080p-Efficient.sql -└── tweaks/ - ├── allow-DV-no-fallback.sql - └── ban-megusta.sql -``` - -In the case of the schema, it's the same layout, with only the DDL in `ops/` and -no tweaks: - -```text -schema-pcd/ -├── pcd.json -└── ops/ - └── 0.schema.sql -``` - -## 6. Dependencies (Post-2.0) - -**Dependencies are not part of 2.0.** At current scale (~10 in use databases), -forking solves shared-code needs without the complexity of dependency -resolution, version conflicts, and circular dependency detection. The layer -system supports adding dependencies in 2.1+ without breaking existing PCDs. -We'll build dependency support when clear duplication patterns emerge and -forking proves insufficient. diff --git a/docs/manifest.md b/docs/manifest.md deleted file mode 100644 index eef1f21..0000000 --- a/docs/manifest.md +++ /dev/null @@ -1,63 +0,0 @@ -# Manifest Specification - -Every Profilarr Compliant Database must include a `pcd.json` manifest file in -its root directory. This file defines the database's identity, compatibility, -and dependencies. - -## Required Fields - -| Field | Description | -| --------------------------- | --------------------------------------------------------------------------------------------- | -| `name` | Unique identifier for the database (lowercase, hyphens preferred) | -| `version` | Semantic version of the database (MAJOR.MINOR.PATCH) | -| `description` | Short summary of what the database provides | -| `dependencies` | Object mapping dependency names to semver ranges. 
All PCDs must depend on `schema` at minimum | -| `profilarr.minimum_version` | Minimum Profilarr version required to use this database | - -## Optional Fields - -| Field | Description | -| -------------- | ---------------------------------------------------------------------------------------------------------------------------------- | -| `arr_types` | Array of supported arr applications (`["radarr"]`, `["sonarr"]`, or `["radarr", "sonarr"]`). If omitted, assumes all are supported | -| `authors` | Array of contributor objects with name and optional email | -| `license` | SPDX license identifier | -| `repository` | Git repository URL | -| `dependencies` | Can include other PCDs in addition to the schema, enabling layered databases | -| `tags` | Array of descriptive keywords for discovery | -| `links` | External resource URLs (homepage, documentation, issues) | - -## Example - -```json -{ - "name": "db", - "version": "2.1.35", - "description": "Seraphys' OCD Playground", - "arr_types": ["radarr", "sonarr", "whisparr"], - - "dependencies": { - "schema": "^1.1.0" - }, - - "authors": [ - { - "name": "Dictionarry Team", - "email": "team@dictionarry.dev" - } - ], - - "license": "MIT", - "repository": "https://github.com/dictionarry-hub/database", - - "tags": ["4k", "hdr", "remux", "quality", "archival"], - - "links": { - "homepage": "https://dictionarry.dev", - "issues": "https://github.com/dictionarry-hub/db/issues" - }, - - "profilarr": { - "minimum_version": "2.0.0" - } -} -```