Compare commits

...

36 Commits

Author SHA1 Message Date
fallenbagel
b30794dd20 fix(base-scanner): derive media availability from actual season state instead of scanner input
The media-level status was incorrectly set to AVAILABLE when only some seasons were available. The
issue: isAllStandardSeasons/isAll4kSeasons checked the scanner's input array, which only contains
the seasons found by the current scanner (e.g. a single season in Jellyfin). If that single season
was fully available, the check passed and the media was marked AVAILABLE regardless of how many
seasons actually existed. On top of that, shouldStayAvailable (and its 4k counterpart) locked in the
AVAILABLE status once set, so as long as no new seasons were added, the incorrect status never
corrected itself.
2026-02-14 00:46:25 +08:00
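A minimal TypeScript sketch of the fix described above, for illustration only: the media status is derived from the seasons stored on the media entity rather than from the scanner's input array. All names here (MediaStatus, Season, Media, totalSeasonCount) are stand-ins, not the actual base-scanner code.

```typescript
// Simplified stand-ins for the real Seerr entities.
enum MediaStatus {
  UNKNOWN = 1,
  PARTIALLY_AVAILABLE,
  AVAILABLE,
}

interface Season {
  seasonNumber: number;
  status: MediaStatus;
}

interface Media {
  seasons: Season[];
  totalSeasonCount: number; // all seasons known for the show, not just the scanned ones
}

// Broken approach (simplified): checking only the scanner's input array.
// If Jellyfin returned a single fully available season, `every` passes and
// the whole show is marked AVAILABLE.
function statusFromScannerInput(scannedSeasons: Season[]): MediaStatus {
  return scannedSeasons.every((s) => s.status === MediaStatus.AVAILABLE)
    ? MediaStatus.AVAILABLE
    : MediaStatus.PARTIALLY_AVAILABLE;
}

// Fixed approach (simplified): derive the status from the persisted season
// state of the media itself, compared against the total season count.
function statusFromSeasonState(media: Media): MediaStatus {
  const available = media.seasons.filter(
    (s) => s.status === MediaStatus.AVAILABLE
  ).length;

  if (available === 0) {
    return MediaStatus.UNKNOWN;
  }
  return available >= media.totalSeasonCount
    ? MediaStatus.AVAILABLE
    : MediaStatus.PARTIALLY_AVAILABLE;
}
```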
fallenbagel
1ed86c14c0 fix(media-request-subscriber): prevent mediaId nullification from cascade saves (#2356) 2026-02-13 15:02:22 +05:00
fallenbagel
91261f6a61 fix(settings): DNS cache UI consistency, validation, and conditional rendering (#2382) 2026-02-13 00:16:10 +01:00
Gauthier
3dea58eead fix(overriderules): display the users of an override rule (#2410) 2026-02-12 23:20:16 +01:00
Gauthier
3eea8ee98e fix(watchlist): remove error log when a media from the watchlist is blacklisted (#2407) 2026-02-12 13:16:56 +05:00
Ludovic Ortega
7cd3521cfd docs(docker): document available image tags and their usage (#2402)
Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2026-02-11 19:17:03 +05:00
Gauthier
e53c2a34dc docs(blog): update authors description and add Discord link (#2405)
Co-authored-by: fallenbagel <98979876+fallenbagel@users.noreply.github.com>
2026-02-11 18:42:52 +05:00
Gauthier
095784bf62 docs(blog): add Seerr release blog post (#2401)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-11 13:50:28 +01:00
fallenbagel
4f7819f028 fix: add IF EXISTS to SQLite migration DROP INDEX statements (#2398) 2026-02-10 14:25:36 +01:00
Gauthier
028c7c2434 fix(overriderules): test service using the right endpoint (#2399) 2026-02-10 13:18:56 +01:00
0xsysr3ll
e3dc1c302d fix(webpush): avoid querying push subs with empty user list (#2380) 2026-02-09 15:34:13 +01:00
fallenbagel
a44a3b1e14 perf: add database indexes & minor frontend/backend improvements (#2396) 2026-02-09 17:06:53 +05:00
fallenbagel
186998b888 chore(workflows): remove redundant Hugging Face model cache (#2397) 2026-02-09 12:01:55 +01:00
0xsysr3ll
df54fb9451 ci(workflow): add i18n label check to avoid duplicate comments (#2390) 2026-02-09 10:30:25 +01:00
seerr-weblate-bot
673f3f2939 chore(i18n): update translations from Weblate (#2395)
Co-authored-by: Weblate <noreply@weblate.org>
Co-authored-by: Ulrik J <ulrik.johansen@me.com>
Co-authored-by: fallenbagel <98979876+fallenbagel@users.noreply.github.com>
2026-02-09 07:17:17 +05:00
fallenbagel
3cd66589ca chore(i18n): rescue translations from #2384 (#2394)
Co-authored-by: Ulrik J <ulrik.johansen@me.com>
2026-02-09 07:07:11 +05:00
fallenbagel
dbee2fdf9f ci(duplicate-issues): migrate to pnpm from npm (#2388) 2026-02-08 12:28:02 +00:00
fallenbagel
0ffe3e8067 ci(issues): add LLM-driven duplicate issue detection (#2381) 2026-02-08 15:45:49 +05:00
seerr-weblate-bot
2dac679f1b chore(i18n): update translations from Weblate (#2378)
Co-authored-by: Anonymous <noreply@weblate.org>
Co-authored-by: Oleksandr Yurov <oyurov@icloud.com>
Co-authored-by: 宿命 <331874545@qq.com>
Co-authored-by: lauantaimakkara <a.lj.unma.va.l@googlemail.com>
Co-authored-by: Thadah <thadahdenyse+borgcube@protonmail.com>
Co-authored-by: Filip Zalitchi <nyt.g777@gmail.com>
Co-authored-by: Gökhan GÜRBÜZ <gkhn.gurbuz@hotmail.com>
Co-authored-by: HanaO00 <greenmalkak@gmail.com>
Co-authored-by: sephrat <sephrat.flo@gmail.com>
Co-authored-by: 0xsysr3ll <0xsysr3ll@pm.me>
Co-authored-by: Kyalarys <charli.pn@proton.me>
Co-authored-by: Mikael Wessel <post@mikaelkw.online>
Co-authored-by: Bas <910100490+weblate@proton.me>
Co-authored-by: Senne <senne@is.soms.moe>
Co-authored-by: ugyes <ferenc.bodi@live.com>
Co-authored-by: Kiss-Pusztai Balázs <balazs.movie@gmail.com>
Co-authored-by: NilsKarlssonPyssling <nisse@users.noreply.translate.jellyseerr.dev>
Co-authored-by: Jonas <jonaasjac@gmail.com>
Co-authored-by: Jamal R. <jamal2362@googlemail.com>
Co-authored-by: Christian <christian_thalmann@bluewin.ch>
Co-authored-by: Ulrik J <ulrikj@users.noreply.translate.jellyseerr.dev>
Co-authored-by: Fallenbagel <jellyseerr@borgcube.de>
2026-02-06 21:45:45 +01:00
fallenbagel
faa2c0a005 fix(servarr): add timeout to Radarr/Sonarr API requests to prevent infinite loading (#2375)
* fix(servarr): add timeout to Radarr/Sonarr API requests to prevent infinite loading

Adds a 5-second timeout to all Radarr/Sonarr API requests and displays a warning banner when
services are unreachable. This prevents the Recent Requests section and request list pages from
hanging indefinitely when a configured service has connection issues.

fix #2374

* fix(requests): only show service error banner to users with advanced permissions
2026-02-06 21:38:21 +01:00
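A hedged sketch of the timeout behaviour this commit describes; the URL, header, and error handling are illustrative assumptions rather than the actual servarr client code.

```typescript
const SERVARR_TIMEOUT_MS = 5_000;

// Illustrative helper: abort a Radarr/Sonarr API call after 5 seconds so an
// unreachable service cannot hang the Recent Requests section indefinitely.
async function servarrGet<T>(
  baseUrl: string,
  path: string,
  apiKey: string
): Promise<T> {
  const res = await fetch(`${baseUrl}${path}`, {
    headers: { 'X-Api-Key': apiKey },
    // AbortSignal.timeout() (Node 18+) rejects the pending fetch with a
    // TimeoutError once the deadline passes.
    signal: AbortSignal.timeout(SERVARR_TIMEOUT_MS),
  });

  if (!res.ok) {
    throw new Error(`Servarr request failed: ${res.status}`);
  }
  return (await res.json()) as T;
}
```

When the call rejects, the caller can render the warning banner described above instead of waiting on a dead service.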
Ludovic Ortega
a0a784b976 docs: add missing migration steps (#2376) 2026-02-06 13:55:46 +01:00
0xsysr3ll
0d270ac871 ci(workflow): validate i18n locale files are synchronized (#2347) 2026-02-05 07:43:18 +01:00
fallenbagel
8fc68c3888 revert(media-request): revert #2316 explicitly setting the mediaId when creating request (#2372)
This just reverts #2316. A further description is not needed. **TYPEORM**, that's all that needs to
be said.
2026-02-04 17:51:59 +01:00
fallenbagel
8b41685b31 chore(deps): upgrade prettier, and tailwind (#2351) 2026-01-29 07:48:34 +01:00
renovate[bot]
5bd31040c0 chore(deps): update dependency pg to v8.17.2 (#2011)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-28 02:58:56 +05:00
renovate[bot]
127a91ca9c ci(actions): update github actions (#2346)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-27 20:27:51 +01:00
renovate[bot]
7d2e24a528 build(docker): update node.js to v22.22.0 (#2057)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-27 20:21:09 +01:00
fallenbagel
ddf347994a chore(deps): update dependencies and fix security vulnerabilities (#2342)
* chore(deps): update dependencies and fix security vulnerabilities

Update TypeScript 4.9 → 5.4. Update Zod 3 → 4. Update nodemailer 6 → 7. Update @typescript-eslint
packages to v7. Update xml2js, undici, lodash, axios, swr, and winston. Add pnpm.overrides for
transitive dependency vulnerabilities.

* chore: fix import ordering for TypeScript 5.4 compatibility

prettier-plugin-organize-imports behaves differently with TypeScript 5.4 vs 4.9, causing CI
formatting checks to fail. This reformats imports to match the ordering expected by the plugin with
the upgraded TS version.
2026-01-27 19:00:42 +01:00
fallenbagel
0f7d29624b fix(availability-sync): handle resolution check for single-server setups (#2334)
PR #1543 introduced resolution checking to distinguish 4k from non-4k media when users have both
server types configured with the same service. However, this causes false deletions for users with
only a single non-4k service when Radarr upgrades a file to 4k resolution. This fix only applies the
resolution check when both 4k and non-4k servers are configured; otherwise, if the file exists, it
counts as available.
2026-01-26 20:58:24 +01:00
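A small TypeScript sketch of the gating logic this commit describes, with illustrative field names; it is not the actual availability-sync implementation.

```typescript
interface AvailabilityInput {
  fileExists: boolean;
  fileIs4k: boolean;
  has4kServer: boolean;
  hasNon4kServer: boolean;
}

function isAvailableNon4k(input: AvailabilityInput): boolean {
  const bothServerTypesConfigured = input.has4kServer && input.hasNon4kServer;

  if (!bothServerTypesConfigured) {
    // Single-server setup: a file that Radarr upgraded to 4K must still count
    // as available for the non-4K entry instead of triggering a deletion.
    return input.fileExists;
  }

  // Dual-server setup: keep the resolution check from #1543 so 4K files are
  // attributed to the 4K entry rather than the non-4K one.
  return input.fileExists && !input.fileIs4k;
}
```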
fallenbagel
f627a8e9db refactor(api): replace plex-api package with internal implementation (#2335)
Removes the plex-api dependency and its type declarations, then extends the ExternalApi class for
the PlexAPI implementation to mimic the exact same old behaviour. This should resolve the security
vulnerabilities in transitive dependencies: form-data (critical), request (moderate, deprecated),
tough-cookie (moderate), and xml2js (moderate). plex-api itself is also no longer maintained.
2026-01-26 20:52:44 +01:00
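A rough sketch, under assumptions, of the approach described above: a thin Plex client extending an ExternalApi base class instead of pulling in the unmaintained plex-api package. The ExternalApi shape, method names, and query-parameter auth shown here are assumptions for illustration, not the real implementation.

```typescript
// Assumed shape of a minimal external-API base class.
class ExternalApi {
  constructor(
    protected baseUrl: string,
    protected params: Record<string, string>
  ) {}

  protected async get<T>(path: string): Promise<T> {
    const qs = new URLSearchParams(this.params).toString();
    const res = await fetch(`${this.baseUrl}${path}?${qs}`, {
      headers: { Accept: 'application/json' },
    });
    if (!res.ok) {
      throw new Error(`Plex request failed: ${res.status}`);
    }
    return (await res.json()) as T;
  }
}

// Thin Plex client replacing the old plex-api package surface.
class PlexApi extends ExternalApi {
  constructor(hostname: string, port: number, token: string) {
    super(`http://${hostname}:${port}`, { 'X-Plex-Token': token });
  }

  // Mirrors the old plex-api `query()` call shape for existing callers.
  public query<T>(path: string): Promise<T> {
    return this.get<T>(path);
  }
}
```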
Gauthier
6031fab3b4 fix(collection): allow re-request of deleted items in a collection (#2339)
Fixes an issue where re-requesting an entire collection, or some items of a collection, was not
possible if those items had been deleted.

fix #1749
2026-01-26 16:15:47 +01:00
fallenbagel
e1d3f29383 chore(deps): update next, openpgp, bcrypt, cypress & eslint (#2336)
* chore(deps): update next, openpgp, bcrypt, cypress & eslint

next: ^14.2.25 → ^14.2.35 (fixes DoS, SSRF, cache key confusion)
openpgp: 5.11.2 → 6.3.0 (v5 EOL, v6 has active security support)
bcrypt: 5.1.0 → 6.0.0 (eliminates vulnerable tar dependency chain)
@types/bcrypt: 5.0.0 → 6.0.0 (matches bcrypt major version)
cypress: 14.1.0 → 14.5.4 (fixes form-data vulnerability)
eslint: 8.35.0 → 8.57.1 (latest supported 8.x release)

No code changes required, as all APIs remain compatible.

* chore(deps): remove unused @formatjs/swc-plugin-experimental
2026-01-26 16:13:03 +01:00
fallenbagel
f8f90cb903 fix(deps): upgrade typeorm to 0.3.28 to address security vulnerabilities (#2333)
Upgrades typeorm from 0.3.12 to 0.3.28 to resolve multiple security vulnerabilities, including a
high-severity SQL injection vulnerability (CVE present in <0.3.26). Removes the Windows-specific
postinstall workaround that's no longer needed. The fix for #478 was a workaround and is now
resolved upstream (see https://github.com/typeorm/typeorm/issues/9766); the issue was specifically
with TypeORM 0.3.12's glob pattern handling on Windows.

fix #478
2026-01-26 09:03:37 +01:00
fallenbagel
65844a2f23 chore(deps): migrate to @seerr-team/react-tailwindcss-datepicker (#2330)
Migrates from `react-tailwindcss-datepicker-sct` to `@seerr-team/react-tailwindcss-datepicker`, our
own fork published on npm. This fork includes a fix for keyboard input not working in single date
mode (typing a date and pressing enter now properly applies the filter).

fix #1585
2026-01-25 17:09:05 +01:00
0xsysr3ll
62755692e9 fix(availability-sync): fix 4K media availability detection (#2298) 2026-01-23 12:26:07 +01:00
fallenbagel
beba2ea099 fix(mediarequest): explicitly set mediaId when creating request (#2316)
* fix(mediarequest): explicitly set mediaId when creating

Fixes an intermittent issue where media_request records were created with mediaId = NULL, causing a
TypeError when accessing request.media.tmdbId on the profile page. TypeORM's implicit
relation-to-foreign-key mapping was failing intermittently. This sets the mediaId column explicitly
and adds a guard to fail fast if media.id is not populated after save.

fix #2315

* refactor: better logging when media id not found
2026-01-23 14:32:46 +05:00
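For illustration only, a simplified TypeORM sketch of the change this commit describes (explicit foreign-key assignment plus a fail-fast guard). Entity names and the './entities' import are hypothetical, and note that this approach was later reverted in #2372 above.

```typescript
import { DataSource } from 'typeorm';
// Hypothetical entities standing in for Seerr's Media / MediaRequest.
import { Media, MediaRequest } from './entities';

async function createRequest(dataSource: DataSource, media: Media) {
  const savedMedia = await dataSource.getRepository(Media).save(media);

  if (!savedMedia.id) {
    // Guard described in the commit: surface the problem immediately instead
    // of letting a NULL mediaId produce a TypeError later on the profile page.
    throw new Error('Media was saved without an id; aborting request creation');
  }

  const request = new MediaRequest();
  request.media = savedMedia;
  request.mediaId = savedMedia.id; // explicit foreign-key assignment

  return dataSource.getRepository(MediaRequest).save(request);
}
```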
264 changed files with 23871 additions and 6500 deletions

View File

@@ -16,6 +16,7 @@ module.exports = {
},
},
rules: {
'@typescript-eslint/no-explicit-any': 'warn', // disable the rule for now to replicate previous behavior
'@typescript-eslint/camelcase': 0,
'@typescript-eslint/no-use-before-define': 0,
'jsx-a11y/no-noninteractive-tabindex': 0,

View File

@@ -22,14 +22,78 @@ concurrency:
cancel-in-progress: true
jobs:
i18n:
name: i18n Check
if: github.event_name == 'pull_request'
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GH_REPO: ${{ github.repository }}
NUMBER: ${{ github.event.pull_request.number }}
steps:
- name: Checkout
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
- name: Pnpm Setup
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Set up Node.js
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: 'package.json'
- name: Get pnpm store directory
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install dependencies
env:
CI: true
run: pnpm install
- name: i18n Check
shell: bash
env:
I18N_LABEL: i18n-out-of-sync
BODY: |
The i18n check failed because translation messages are out of sync.
This usually happens when you've added or modified translation strings in your code but haven't updated the translation file.
Please run `pnpm i18n:extract` and commit the changes.
run: |
retry() { n=0; until "$@"; do n=$((n+1)); [ $n -ge 3 ] && break; echo "retry $n: $*" >&2; sleep 2; done; }
check_failed=0; node bin/check-i18n.js || check_failed=$?
pr_labels=$(gh pr view "$NUMBER" -R "$GH_REPO" --json labels -q '.labels[].name' 2>/dev/null) || true
has_label=0
while IFS= read -r name; do [ -n "$name" ] && [ "$name" = "$I18N_LABEL" ] && has_label=1 && break; done <<< "$pr_labels"
if [ "$check_failed" -eq 1 ]; then
[ "$has_label" -eq 0 ] && { retry gh pr edit "$NUMBER" -R "$GH_REPO" --add-label "$I18N_LABEL" || true; retry gh pr comment "$NUMBER" -R "$GH_REPO" -b "$BODY" || true; }
else
[ "$has_label" -eq 1 ] && retry gh pr edit "$NUMBER" -R "$GH_REPO" --remove-label "$I18N_LABEL" || true
fi
exit $check_failed
test:
name: Lint & Test Build
if: github.event_name == 'pull_request'
runs-on: ubuntu-24.04
container: node:22.20.0-alpine3.22@sha256:dbcedd8aeab47fbc0f4dd4bffa55b7c3c729a707875968d467aaaea42d6225af
container: node:22.22.0-alpine3.22@sha256:0c49915657c1c77c64c8af4d91d2f13fe96853bbd957993ed00dd592cbecc284
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -78,7 +142,7 @@ jobs:
runs-on: ${{ matrix.runner }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -87,7 +151,7 @@ jobs:
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Warm cache (no push) — ${{ matrix.platform }}
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
@@ -114,7 +178,7 @@ jobs:
id-token: write
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -123,7 +187,7 @@ jobs:
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Log in to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
@@ -140,7 +204,7 @@ jobs:
- name: Extract metadata
id: meta
uses: docker/metadata-action@c1e51972afc2121e065aed6d45c65596fe445f3f # v5.8.0
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: |
${{ env.DOCKER_HUB }}

View File

@@ -37,20 +37,20 @@ jobs:
language: [actions, javascript]
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
uses: github/codeql-action/init@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
with:
languages: ${{ matrix.language }}
queries: +security-and-quality
- name: Autobuild
uses: github/codeql-action/autobuild@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
uses: github/codeql-action/autobuild@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
uses: github/codeql-action/analyze@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
with:
category: '/language:${{ matrix.language }}'

View File

@@ -37,7 +37,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -67,7 +67,7 @@ jobs:
run: pnpm exec cypress install
- name: Cypress run
uses: cypress-io/github-action@b8ba51a856ba5f4c15cf39007636d4ab04f23e3c # v6.10.2
uses: cypress-io/github-action@f790eee7a50d9505912f50c2095510be7de06aa7 # v6.10.9
with:
install: false
build: pnpm cypress:build

83
.github/workflows/detect-duplicate.yml vendored Normal file
View File

@@ -0,0 +1,83 @@
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Duplicate Issue Detector
on:
issues:
types: [opened]
permissions: {}
env:
EMBEDDING_MODEL: ${{ vars.EMBEDDING_MODEL }}
GROQ_MODEL: ${{ vars.GROQ_MODEL }}
GROQ_API_KEY: ${{ secrets.GROQ_API_KEY }}
jobs:
detect-duplicate:
runs-on: ubuntu-24.04
if: ${{ !github.event.issue.pull_request }}
permissions:
issues: write
actions: read
contents: read
steps:
- name: Checkout repository
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
- name: Pnpm Setup
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Set up Node.js
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: 'package.json'
- name: Get pnpm store directory
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install dependencies
working-directory: bin/duplicate-detector
env:
CI: true
run: pnpm install --frozen-lockfile
- name: Download issue index
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
name: issue-index
workflow: rebuild-issue-index.yml
path: bin/duplicate-detector
search_artifacts: true
if_no_artifact_found: warn
- name: Build index if missing
working-directory: bin/duplicate-detector
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPOSITORY: ${{ github.repository }}
INDEX_PATH: issue_index.json
run: |
if [ ! -f issue_index.json ]; then
echo "No index found — building from scratch..."
node build-index.mjs
fi
- name: Detect duplicates
working-directory: bin/duplicate-detector
continue-on-error: true
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPOSITORY: ${{ github.repository }}
ISSUE_NUMBER: ${{ github.event.issue.number }}
INDEX_PATH: issue_index.json
run: node detect.mjs

View File

@@ -23,7 +23,7 @@ jobs:
name: Build Docusaurus
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false

View File

@@ -36,13 +36,13 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
- name: Run Lychee link checker
uses: lycheeverse/lychee-action@885c65f3dc543b57c898c8099f4e08c8afd178a2 # v2.6.1
uses: lycheeverse/lychee-action@a8c4c7cb88f0c7386610c35eb25108e448569cb0 # v2.7.0
with:
fail: false
args: >-

View File

@@ -28,7 +28,7 @@ jobs:
has_artifacts: ${{ steps.check-artifacts.outputs.has_artifacts }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
@@ -93,7 +93,7 @@ jobs:
if: needs.package-helm-chart.outputs.has_artifacts == 'true'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
@@ -151,7 +151,7 @@ jobs:
contents: read
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false

View File

@@ -28,7 +28,7 @@ jobs:
contents: read
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
@@ -37,7 +37,7 @@ jobs:
uses: azure/setup-helm@1a275c3b69536ee54be43f2070a358922e12c8d4 # v4.3.1
- name: Set up chart-testing
uses: helm/chart-testing-action@0d28d3144d3a25ea2cc349d6e59901c4ff469b3b # v2.7.0
uses: helm/chart-testing-action@6ec842c01de15ebb84c8627d2744a0c2f2755c9f # v2.8.0
- name: Ensure documentation is updated
uses: docker://jnorwood/helm-docs:v1.14.2@sha256:7e562b49ab6b1dbc50c3da8f2dd6ffa8a5c6bba327b1c6335cc15ce29267979c

View File

@@ -33,7 +33,7 @@ jobs:
runs-on: ${{ matrix.runner }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -42,7 +42,7 @@ jobs:
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Derive preview version from tag
id: ver
@@ -79,7 +79,7 @@ jobs:
id-token: write
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -88,7 +88,7 @@ jobs:
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Log in to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
@@ -115,7 +115,7 @@ jobs:
- name: Extract metadata
id: meta
uses: docker/metadata-action@c1e51972afc2121e065aed6d45c65596fe445f3f # v5.8.0
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: |
${{ env.DOCKER_HUB }}

View File

@@ -0,0 +1,65 @@
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Rebuild Issue Index
on:
schedule:
- cron: "0 3 * * *"
workflow_dispatch:
permissions: {}
env:
EMBEDDING_MODEL: ${{ vars.EMBEDDING_MODEL }}
jobs:
build-index:
runs-on: ubuntu-24.04
permissions:
issues: read
actions: write
contents: read
steps:
- name: Checkout repository
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
- name: Pnpm Setup
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Set up Node.js
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
node-version-file: 'package.json'
- name: Get pnpm store directory
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install dependencies
working-directory: bin/duplicate-detector
env:
CI: true
run: pnpm install --frozen-lockfile
- name: Build issue index
working-directory: bin/duplicate-detector
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPOSITORY: ${{ github.repository }}
INDEX_PATH: issue_index.json
run: node build-index.mjs
- name: Upload index artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: issue-index
path: bin/duplicate-detector/issue_index.json
retention-days: 7

View File

@@ -27,14 +27,14 @@ jobs:
release_body: ${{ steps.git-cliff.outputs.content }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
- name: Generate changelog
id: git-cliff
uses: orhun/git-cliff-action@d77b37db2e3f7398432d34b72a12aa3e2ba87e51 # v4.6.0
uses: orhun/git-cliff-action@e16f179f0be49ecdfe63753837f20b9531642772 # v4.7.0
with:
config: .github/cliff.toml
args: -vv --current
@@ -50,7 +50,7 @@ jobs:
needs: changelog
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -76,7 +76,7 @@ jobs:
VERSION: ${{ github.ref_name }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -85,7 +85,7 @@ jobs:
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Warm cache [${{ matrix.platform }}]
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
@@ -115,7 +115,7 @@ jobs:
VERSION: ${{ github.ref_name }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -124,7 +124,7 @@ jobs:
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Log in to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
@@ -141,7 +141,7 @@ jobs:
- name: Extract metadata
id: meta
uses: docker/metadata-action@c1e51972afc2121e065aed6d45c65596fe445f3f # v5.8.0
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: |
${{ env.DOCKER_HUB }}
@@ -201,7 +201,7 @@ jobs:
COSIGN_YES: 'true'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
persist-credentials: false
@@ -209,7 +209,7 @@ jobs:
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
- name: Install Trivy
uses: aquasecurity/setup-trivy@e6c2c5e321ed9123bda567646e2f96565e34abe1 # v0.2.4
uses: aquasecurity/setup-trivy@3fb12ec12f41e471780db15c232d5dd185dcb514 # v0.2.5
- name: Log in to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0

View File

@@ -25,19 +25,19 @@ jobs:
if: github.actor == 'renovate[bot]'
steps:
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
- uses: actions/create-github-app-token@67018539274d69449ef7c02e8e71183d1719ab42 # v2.1.4
- uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
id: app-token
with:
app-id: 2138788
private-key: ${{ secrets.APP_SEERR_HELM_PRIVATE_KEY }}
- name: Set up chart-testing
uses: helm/chart-testing-action@0d28d3144d3a25ea2cc349d6e59901c4ff469b3b # v2.7.0
uses: helm/chart-testing-action@6ec842c01de15ebb84c8627d2744a0c2f2755c9f # v2.8.0
- name: Run chart-testing (list-changed)
id: list-changed

View File

@@ -21,7 +21,7 @@ jobs:
issues: write
pull-requests: write
steps:
- uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
- uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # v10.1.1
with:
any-of-labels: "pending author's response"
exempt-issue-labels: 'confirmed'

View File

@@ -24,7 +24,7 @@ jobs:
permissions:
contents: read
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false

View File

@@ -34,7 +34,7 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
with:
fetch-depth: 0
persist-credentials: false
@@ -56,6 +56,6 @@ jobs:
ignore-unfixed: true
- name: Upload SARIF to code scanning
uses: github/codeql-action/upload-sarif@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
uses: github/codeql-action/upload-sarif@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
with:
sarif_file: trivy.sarif

View File

@@ -4,15 +4,17 @@ dist/
config/
pnpm-lock.yaml
cypress/config/settings.cypress.json
.github
.vscode
# assets
src/assets/
public/
!public/sw.js
docs/
!/public/
/public/*
!/public/sw.js
public/*
!public/sw.js
# helm charts
**/charts
# Prettier breaks GitHub alert syntax in markdown
*.md

View File

@@ -1,5 +1,5 @@
module.exports = {
plugins: [require('./merged-prettier-plugin.js')],
plugins: ['prettier-plugin-organize-imports', 'prettier-plugin-tailwindcss'],
singleQuote: true,
trailingComma: 'es5',
overrides: [
@@ -27,5 +27,11 @@ module.exports = {
rangeEnd: 0,
},
},
{
files: 'public/offline.html',
options: {
rangeEnd: 0,
},
},
],
};

View File

@@ -1,4 +1,4 @@
FROM node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7 AS base
FROM node:22.22.0-alpine3.22@sha256:0c49915657c1c77c64c8af4d91d2f13fe96853bbd957993ed00dd592cbecc284 AS base
ARG SOURCE_DATE_EPOCH
ARG TARGETPLATFORM
ENV TARGETPLATFORM=${TARGETPLATFORM:-linux/amd64}
@@ -33,7 +33,7 @@ RUN pnpm build
RUN rm -rf .next/cache
FROM node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7
FROM node:22.22.0-alpine3.22@sha256:0c49915657c1c77c64c8af4d91d2f13fe96853bbd957993ed00dd592cbecc284
ARG SOURCE_DATE_EPOCH
ARG COMMIT_TAG
ENV NODE_ENV=production

View File

@@ -1,4 +1,4 @@
FROM node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7
FROM node:22.22.0-alpine3.22@sha256:0c49915657c1c77c64c8af4d91d2f13fe96853bbd957993ed00dd592cbecc284
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"

39
bin/check-i18n.js Normal file
View File

@@ -0,0 +1,39 @@
#!/usr/bin/env node
/**
* Check that i18n locale files are in sync with extracted messages.
* Runs `pnpm i18n:extract` and compares en.json; exits 1 if they differ.
*/
const { execSync } = require('child_process');
const fs = require('fs');
const path = require('path');
const localePath = path.join(
__dirname,
'..',
'src',
'i18n',
'locale',
'en.json'
);
const backupPath = `${localePath}.bak`;
try {
fs.copyFileSync(localePath, backupPath);
execSync('pnpm i18n:extract', { stdio: 'inherit' });
const original = fs.readFileSync(backupPath, 'utf8');
const extracted = fs.readFileSync(localePath, 'utf8');
fs.unlinkSync(backupPath);
if (original !== extracted) {
console.error(
"i18n messages are out of sync. Please run 'pnpm i18n:extract' and commit the changes."
);
process.exit(1);
}
} catch (err) {
if (fs.existsSync(backupPath)) {
fs.unlinkSync(backupPath);
}
throw err;
}

1
bin/duplicate-detector/.gitignore vendored Normal file
View File

@@ -0,0 +1 @@
node_modules/

View File

@@ -0,0 +1,120 @@
#!/usr/bin/env node
/**
* Build Issue Embedding Index
*
* Fetches all open issues and recently closed ones,
* generates embeddings using a local ONNX transformer model,
* and saves them as a JSON artifact for the duplicate detector.
*/
import { pipeline } from '@huggingface/transformers';
import { mkdirSync, writeFileSync } from 'node:fs';
import { dirname } from 'node:path';
import { fetchIssues, issueText } from './utils.mjs';
const MODEL_NAME = process.env.EMBEDDING_MODEL || 'Xenova/all-MiniLM-L6-v2';
const OUTPUT_PATH = 'issue_index.json';
const INCLUDE_CLOSED_DAYS = 90;
const MAX_ISSUES = 5000;
const BATCH_SIZE = 64;
async function main() {
console.log('Fetching open issues...');
const openIssues = await fetchIssues({
state: 'open',
maxIssues: MAX_ISSUES,
});
console.log(`Fetched ${openIssues.length} open issues`);
const since = new Date(
Date.now() - INCLUDE_CLOSED_DAYS * 24 * 60 * 60 * 1000
).toISOString();
console.log(
`Fetching closed issues from last ${INCLUDE_CLOSED_DAYS} days...`
);
const closedIssues = await fetchIssues({
state: 'closed',
since,
maxIssues: MAX_ISSUES,
});
console.log(`Fetched ${closedIssues.length} closed issues`);
let allIssues = [...openIssues, ...closedIssues];
const seen = new Set();
allIssues = allIssues.filter((issue) => {
if (seen.has(issue.number)) return false;
seen.add(issue.number);
return true;
});
console.log(`Total unique issues to index: ${allIssues.length}`);
if (allIssues.length === 0) {
console.warn('No issues found - writing empty index');
writeFileSync(OUTPUT_PATH, JSON.stringify({ issues: [], embeddings: [] }));
return;
}
console.log(`Loading model: ${MODEL_NAME}`);
const extractor = await pipeline('feature-extraction', MODEL_NAME, {
dtype: 'fp32',
});
const texts = allIssues.map((issue) => issueText(issue.title, issue.body));
const allEmbeddings = [];
console.log(`Generating embeddings for ${texts.length} issues...`);
for (let i = 0; i < texts.length; i += BATCH_SIZE) {
const batch = texts.slice(i, i + BATCH_SIZE);
const output = await extractor(batch, {
pooling: 'mean',
normalize: true,
});
const vectors = output.tolist();
allEmbeddings.push(...vectors);
const progress = Math.min(i + BATCH_SIZE, texts.length);
console.log(` ${progress}/${texts.length}`);
}
const issueMetadata = allIssues.map((issue) => {
const body = (issue.body || '').trim();
return {
number: issue.number,
title: issue.title,
state: issue.state,
url: issue.html_url,
body_preview: body.slice(0, 500) || '',
labels: (issue.labels || []).map((l) => l.name),
created_at: issue.created_at,
updated_at: issue.updated_at,
};
});
const indexData = {
issues: issueMetadata,
embeddings: allEmbeddings,
model: MODEL_NAME,
issue_count: issueMetadata.length,
built_at: new Date().toISOString(),
};
const dir = dirname(OUTPUT_PATH);
if (dir && dir !== '.') mkdirSync(dir, { recursive: true });
writeFileSync(OUTPUT_PATH, JSON.stringify(indexData));
const sizeMb = (
Buffer.byteLength(JSON.stringify(indexData)) /
(1024 * 1024)
).toFixed(1);
console.log(
`Index saved to ${OUTPUT_PATH} (${sizeMb} MB, ${issueMetadata.length} issues)`
);
}
main().catch((err) => {
console.error(err);
process.exit(1);
});

View File

@@ -0,0 +1,274 @@
#!/usr/bin/env node
/**
* Duplicate Issue Detector
*
* Triggered on new issue creation. Compares the new issue against an
* existing embedding index, then uses an LLM to
* confirm duplicates before posting a comment for maintainer review.
*/
import { pipeline } from '@huggingface/transformers';
import { existsSync, readFileSync } from 'node:fs';
import {
addLabel,
dotProduct,
fetchIssues,
getIssue,
issueText,
postComment,
} from './utils.mjs';
const SIMILARITY_THRESHOLD = 0.55;
const TOP_K = 5;
const MAX_COMMENT_CANDIDATES = 3;
const MODEL_NAME = process.env.EMBEDDING_MODEL || 'Xenova/all-MiniLM-L6-v2';
const GROQ_MODEL = process.env.GROQ_MODEL || 'llama-3.3-70b-versatile';
const INDEX_PATH = 'issue_index.json';
const LABEL_NAME = 'possible-duplicate';
const GROQ_API_KEY = process.env.GROQ_API_KEY || '';
const ISSUE_NUMBER = parseInt(process.env.ISSUE_NUMBER, 10);
function loadIndex(path) {
if (!existsSync(path)) {
console.error(
`Index file not found at ${path}. Run build-index.mjs first.`
);
process.exit(1);
}
const data = JSON.parse(readFileSync(path, 'utf-8'));
console.log(`Loaded index with ${data.issues.length} issues`);
return data;
}
function findSimilar(
queryEmbedding,
index,
{ topK = TOP_K, threshold = SIMILARITY_THRESHOLD, excludeNumber } = {}
) {
const { issues, embeddings } = index;
if (!issues.length) return [];
const scored = issues.map((issue, i) => ({
...issue,
score: dotProduct(queryEmbedding, embeddings[i]),
}));
return scored
.sort((a, b) => b.score - a.score)
.filter(
(c) =>
c.score >= threshold && (!excludeNumber || c.number !== excludeNumber)
)
.slice(0, topK);
}
const CONFIRM_SYSTEM_PROMPT = `You are a GitHub issue triage assistant. You will be given a NEW issue and one \
or more CANDIDATE issues that may be duplicates.
For each candidate, determine if the new issue is truly a duplicate (same root \
problem/request) or merely related (similar area but different issue).
Respond ONLY with a JSON array of objects, each with:
- "number": the candidate issue number
- "duplicate": true or false
- "reason": one-sentence explanation
Example:
[{"number": 123, "duplicate": true, "reason": "Both report the same crash when ..."}]`;
async function confirmWithLlm(newIssue, candidates) {
if (!GROQ_API_KEY) {
console.warn('GROQ_API_KEY not set — skipping LLM confirmation');
return candidates;
}
const candidateText = candidates
.map(
(c) =>
`### Candidate #${c.number} (similarity: ${c.score.toFixed(2)})\n` +
`**Title:** ${c.title}\n` +
`**State:** ${c.state}\n` +
`**Body preview:** ${(c.body_preview || 'N/A').slice(0, 500)}`
)
.join('\n\n');
const userPrompt =
`## NEW ISSUE #${newIssue.number}\n` +
`**Title:** ${newIssue.title}\n` +
`**Body:**\n${(newIssue.body || 'No body').slice(0, 1500)}\n\n` +
`---\n\n` +
`## CANDIDATES\n${candidateText}`;
try {
const resp = await fetch(
'https://api.groq.com/openai/v1/chat/completions',
{
method: 'POST',
headers: {
Authorization: `Bearer ${GROQ_API_KEY}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({
model: GROQ_MODEL,
messages: [
{ role: 'system', content: CONFIRM_SYSTEM_PROMPT },
{ role: 'user', content: userPrompt },
],
temperature: 0.1,
max_tokens: 1024,
}),
signal: AbortSignal.timeout(30_000),
}
);
if (!resp.ok) {
const text = await resp.text();
throw new Error(`Groq API error ${resp.status}: ${text}`);
}
let content = (await resp.json()).choices[0].message.content.trim();
if (content.startsWith('```')) {
content = content
.split('\n')
.slice(1)
.join('\n')
.replace(/```\s*$/, '')
.trim();
}
const verdicts = JSON.parse(content);
if (!Array.isArray(verdicts)) {
throw new Error('Invalid LLM response format - expected array');
}
const verdictMap = new Map(verdicts.map((v) => [v.number, v]));
const confirmed = [];
for (const c of candidates) {
const verdict = verdictMap.get(c.number);
if (verdict?.duplicate) {
c.llm_reason = verdict.reason || '';
confirmed.push(c);
} else {
const reason = verdict?.reason || 'not evaluated';
console.log(` #${c.number} ruled out by LLM: ${reason}`);
}
}
return confirmed;
} catch (err) {
console.warn(
`LLM confirmation failed: ${err.message} - falling back to all candidates`
);
return candidates;
}
}
function formatComment(candidates) {
const lines = [
'**Possible duplicate detected**',
'',
'This issue may be a duplicate of the following (detected via semantic similarity + LLM review):',
'',
];
for (const c of candidates.slice(0, MAX_COMMENT_CANDIDATES)) {
const confidence = `${(c.score * 100).toFixed(0)}%`;
let line = `- #${c.number} (${confidence} match) — ${c.title}`;
if (c.llm_reason) {
line += `\n > *${c.llm_reason}*`;
}
lines.push(line);
}
lines.push(
'',
'A maintainer will review this. If this is **not** a duplicate, no action is needed.',
'',
`<!-- duplicate-bot: candidates=${candidates.map((c) => c.number).join(',')} -->`
);
return lines.join('\n');
}
async function main() {
if (!ISSUE_NUMBER) {
console.error('ISSUE_NUMBER not set');
process.exit(1);
}
console.log(`Processing issue #${ISSUE_NUMBER}`);
const issue = await getIssue(ISSUE_NUMBER);
const oneHourAgo = new Date(Date.now() - 60 * 60 * 1000).toISOString();
const recentIssues = await fetchIssues({
creator: issue.user.login,
since: oneHourAgo,
state: 'all',
});
if (recentIssues.length > 10) {
console.log(
`User ${issue.user.login} created ${recentIssues.length} issues in the last hour - skipping to prevent spam`
);
return;
}
if (issue.pull_request) {
console.log('Skipping - this is a pull request');
return;
}
if (issue.user.type === 'Bot') {
console.log('Skipping - issue created by bot');
return;
}
console.log(`Loading model: ${MODEL_NAME}`);
const extractor = await pipeline('feature-extraction', MODEL_NAME, {
dtype: 'fp32',
});
const index = loadIndex(INDEX_PATH);
const text = issueText(issue.title, issue.body);
const output = await extractor(text, { pooling: 'mean', normalize: true });
const queryEmbedding = output.tolist()[0];
let candidates = findSimilar(queryEmbedding, index, {
topK: TOP_K,
threshold: SIMILARITY_THRESHOLD,
excludeNumber: issue.number,
});
if (!candidates.length) {
console.log('No similar issues found above threshold - done');
return;
}
console.log(`Found ${candidates.length} candidates above threshold:`);
for (const c of candidates) {
console.log(` #${c.number} (${c.score.toFixed(3)}) - ${c.title}`);
}
console.log('Running LLM confirmation via Groq...');
candidates = await confirmWithLlm(issue, candidates);
if (!candidates.length) {
console.log('LLM ruled out all candidates - done');
return;
}
const comment = formatComment(candidates);
await postComment(ISSUE_NUMBER, comment);
await addLabel(ISSUE_NUMBER, LABEL_NAME);
console.log('Done!');
}
main().catch((err) => {
console.error(err);
process.exit(1);
});

View File

@@ -0,0 +1,17 @@
{
"name": "duplicate-detector",
"version": "1.0.0",
"private": true,
"type": "module",
"packageManager": "pnpm@10.17.1",
"scripts": {
"build-index": "node build-index.mjs",
"detect": "node detect.mjs"
},
"dependencies": {
"@huggingface/transformers": "^3.8.1"
},
"engines": {
"node": ">=22.0"
}
}

655
bin/duplicate-detector/pnpm-lock.yaml generated Normal file
View File

@@ -0,0 +1,655 @@
lockfileVersion: '9.0'
settings:
autoInstallPeers: true
excludeLinksFromLockfile: false
importers:
.:
dependencies:
'@huggingface/transformers':
specifier: ^3.8.1
version: 3.8.1
packages:
'@emnapi/runtime@1.8.1':
resolution: {integrity: sha512-mehfKSMWjjNol8659Z8KxEMrdSJDDot5SXMq00dM8BN4o+CLNXQ0xH2V7EchNHV4RmbZLmmPdEaXZc5H2FXmDg==}
'@huggingface/jinja@0.5.5':
resolution: {integrity: sha512-xRlzazC+QZwr6z4ixEqYHo9fgwhTZ3xNSdljlKfUFGZSdlvt166DljRELFUfFytlYOYvo3vTisA/AFOuOAzFQQ==}
engines: {node: '>=18'}
'@huggingface/transformers@3.8.1':
resolution: {integrity: sha512-tsTk4zVjImqdqjS8/AOZg2yNLd1z9S5v+7oUPpXaasDRwEDhB+xnglK1k5cad26lL5/ZIaeREgWWy0bs9y9pPA==}
'@img/colour@1.0.0':
resolution: {integrity: sha512-A5P/LfWGFSl6nsckYtjw9da+19jB8hkJ6ACTGcDfEJ0aE+l2n2El7dsVM7UVHZQ9s2lmYMWlrS21YLy2IR1LUw==}
engines: {node: '>=18'}
'@img/sharp-darwin-arm64@0.34.5':
resolution: {integrity: sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [darwin]
'@img/sharp-darwin-x64@0.34.5':
resolution: {integrity: sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [darwin]
'@img/sharp-libvips-darwin-arm64@1.2.4':
resolution: {integrity: sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==}
cpu: [arm64]
os: [darwin]
'@img/sharp-libvips-darwin-x64@1.2.4':
resolution: {integrity: sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==}
cpu: [x64]
os: [darwin]
'@img/sharp-libvips-linux-arm64@1.2.4':
resolution: {integrity: sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==}
cpu: [arm64]
os: [linux]
'@img/sharp-libvips-linux-arm@1.2.4':
resolution: {integrity: sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==}
cpu: [arm]
os: [linux]
'@img/sharp-libvips-linux-ppc64@1.2.4':
resolution: {integrity: sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==}
cpu: [ppc64]
os: [linux]
'@img/sharp-libvips-linux-riscv64@1.2.4':
resolution: {integrity: sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==}
cpu: [riscv64]
os: [linux]
'@img/sharp-libvips-linux-s390x@1.2.4':
resolution: {integrity: sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==}
cpu: [s390x]
os: [linux]
'@img/sharp-libvips-linux-x64@1.2.4':
resolution: {integrity: sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==}
cpu: [x64]
os: [linux]
'@img/sharp-libvips-linuxmusl-arm64@1.2.4':
resolution: {integrity: sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==}
cpu: [arm64]
os: [linux]
'@img/sharp-libvips-linuxmusl-x64@1.2.4':
resolution: {integrity: sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==}
cpu: [x64]
os: [linux]
'@img/sharp-linux-arm64@0.34.5':
resolution: {integrity: sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [linux]
'@img/sharp-linux-arm@0.34.5':
resolution: {integrity: sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm]
os: [linux]
'@img/sharp-linux-ppc64@0.34.5':
resolution: {integrity: sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [ppc64]
os: [linux]
'@img/sharp-linux-riscv64@0.34.5':
resolution: {integrity: sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [riscv64]
os: [linux]
'@img/sharp-linux-s390x@0.34.5':
resolution: {integrity: sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [s390x]
os: [linux]
'@img/sharp-linux-x64@0.34.5':
resolution: {integrity: sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [linux]
'@img/sharp-linuxmusl-arm64@0.34.5':
resolution: {integrity: sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [linux]
'@img/sharp-linuxmusl-x64@0.34.5':
resolution: {integrity: sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [linux]
'@img/sharp-wasm32@0.34.5':
resolution: {integrity: sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [wasm32]
'@img/sharp-win32-arm64@0.34.5':
resolution: {integrity: sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [arm64]
os: [win32]
'@img/sharp-win32-ia32@0.34.5':
resolution: {integrity: sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [ia32]
os: [win32]
'@img/sharp-win32-x64@0.34.5':
resolution: {integrity: sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
cpu: [x64]
os: [win32]
'@isaacs/fs-minipass@4.0.1':
resolution: {integrity: sha512-wgm9Ehl2jpeqP3zw/7mo3kRHFp5MEDhqAdwy1fTGkHAwnkGOVsgpvQhL8B5n1qlb01jV3n/bI0ZfZp5lWA1k4w==}
engines: {node: '>=18.0.0'}
'@protobufjs/aspromise@1.1.2':
resolution: {integrity: sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ==}
'@protobufjs/base64@1.1.2':
resolution: {integrity: sha512-AZkcAA5vnN/v4PDqKyMR5lx7hZttPDgClv83E//FMNhR2TMcLUhfRUBHCmSl0oi9zMgDDqRUJkSxO3wm85+XLg==}
'@protobufjs/codegen@2.0.4':
resolution: {integrity: sha512-YyFaikqM5sH0ziFZCN3xDC7zeGaB/d0IUb9CATugHWbd1FRFwWwt4ld4OYMPWu5a3Xe01mGAULCdqhMlPl29Jg==}
'@protobufjs/eventemitter@1.1.0':
resolution: {integrity: sha512-j9ednRT81vYJ9OfVuXG6ERSTdEL1xVsNgqpkxMsbIabzSo3goCjDIveeGv5d03om39ML71RdmrGNjG5SReBP/Q==}
'@protobufjs/fetch@1.1.0':
resolution: {integrity: sha512-lljVXpqXebpsijW71PZaCYeIcE5on1w5DlQy5WH6GLbFryLUrBD4932W/E2BSpfRJWseIL4v/KPgBFxDOIdKpQ==}
'@protobufjs/float@1.0.2':
resolution: {integrity: sha512-Ddb+kVXlXst9d+R9PfTIxh1EdNkgoRe5tOX6t01f1lYWOvJnSPDBlG241QLzcyPdoNTsblLUdujGSE4RzrTZGQ==}
'@protobufjs/inquire@1.1.0':
resolution: {integrity: sha512-kdSefcPdruJiFMVSbn801t4vFK7KB/5gd2fYvrxhuJYg8ILrmn9SKSX2tZdV6V+ksulWqS7aXjBcRXl3wHoD9Q==}
'@protobufjs/path@1.1.2':
resolution: {integrity: sha512-6JOcJ5Tm08dOHAbdR3GrvP+yUUfkjG5ePsHYczMFLq3ZmMkAD98cDgcT2iA1lJ9NVwFd4tH/iSSoe44YWkltEA==}
'@protobufjs/pool@1.1.0':
resolution: {integrity: sha512-0kELaGSIDBKvcgS4zkjz1PeddatrjYcmMWOlAuAPwAeccUrPHdUqo/J6LiymHHEiJT5NrF1UVwxY14f+fy4WQw==}
'@protobufjs/utf8@1.1.0':
resolution: {integrity: sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw==}
'@types/node@25.2.2':
resolution: {integrity: sha512-BkmoP5/FhRYek5izySdkOneRyXYN35I860MFAGupTdebyE66uZaR+bXLHq8k4DirE5DwQi3NuhvRU1jqTVwUrQ==}
boolean@3.2.0:
resolution: {integrity: sha512-d0II/GO9uf9lfUHH2BQsjxzRJZBdsjgsBiW4BvhWk/3qoKwQFjIDVN19PfX8F2D/r9PCMTtLWjYVCFrpeYUzsw==}
deprecated: Package no longer supported. Contact Support at https://www.npmjs.com/support for more info.
chownr@3.0.0:
resolution: {integrity: sha512-+IxzY9BZOQd/XuYPRmrvEVjF/nqj5kgT4kEq7VofrDoM1MxoRjEWkrCC3EtLi59TVawxTAn+orJwFQcrqEN1+g==}
engines: {node: '>=18'}
define-data-property@1.1.4:
resolution: {integrity: sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A==}
engines: {node: '>= 0.4'}
define-properties@1.2.1:
resolution: {integrity: sha512-8QmQKqEASLd5nx0U1B1okLElbUuuttJ/AnYmRXbbbGDWh6uS208EjD4Xqq/I9wK7u0v6O08XhTWnt5XtEbR6Dg==}
engines: {node: '>= 0.4'}
detect-libc@2.1.2:
resolution: {integrity: sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ==}
engines: {node: '>=8'}
detect-node@2.1.0:
resolution: {integrity: sha512-T0NIuQpnTvFDATNuHN5roPwSBG83rFsuO+MXXH9/3N1eFbn4wcPjttvjMLEPWJ0RGUYgQE7cGgS3tNxbqCGM7g==}
es-define-property@1.0.1:
resolution: {integrity: sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==}
engines: {node: '>= 0.4'}
es-errors@1.3.0:
resolution: {integrity: sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==}
engines: {node: '>= 0.4'}
es6-error@4.1.1:
resolution: {integrity: sha512-Um/+FxMr9CISWh0bi5Zv0iOD+4cFh5qLeks1qhAopKVAJw3drgKbKySikp7wGhDL0HPeaja0P5ULZrxLkniUVg==}
escape-string-regexp@4.0.0:
resolution: {integrity: sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==}
engines: {node: '>=10'}
flatbuffers@25.9.23:
resolution: {integrity: sha512-MI1qs7Lo4Syw0EOzUl0xjs2lsoeqFku44KpngfIduHBYvzm8h2+7K8YMQh1JtVVVrUvhLpNwqVi4DERegUJhPQ==}
global-agent@3.0.0:
resolution: {integrity: sha512-PT6XReJ+D07JvGoxQMkT6qji/jVNfX/h364XHZOWeRzy64sSFr+xJ5OX7LI3b4MPQzdL4H8Y8M0xzPpsVMwA8Q==}
engines: {node: '>=10.0'}
globalthis@1.0.4:
resolution: {integrity: sha512-DpLKbNU4WylpxJykQujfCcwYWiV/Jhm50Goo0wrVILAv5jOr9d+H+UR3PhSCD2rCCEIg0uc+G+muBTwD54JhDQ==}
engines: {node: '>= 0.4'}
gopd@1.2.0:
resolution: {integrity: sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==}
engines: {node: '>= 0.4'}
guid-typescript@1.0.9:
resolution: {integrity: sha512-Y8T4vYhEfwJOTbouREvG+3XDsjr8E3kIr7uf+JZ0BYloFsttiHU0WfvANVsR7TxNUJa/WpCnw/Ino/p+DeBhBQ==}
has-property-descriptors@1.0.2:
resolution: {integrity: sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg==}
json-stringify-safe@5.0.1:
resolution: {integrity: sha512-ZClg6AaYvamvYEE82d3Iyd3vSSIjQ+odgjaTzRuO3s7toCdFKczob2i0zCh7JE8kWn17yvAWhUVxvqGwUalsRA==}
long@5.3.2:
resolution: {integrity: sha512-mNAgZ1GmyNhD7AuqnTG3/VQ26o760+ZYBPKjPvugO8+nLbYfX6TVpJPseBvopbdY+qpZ/lKUnmEc1LeZYS3QAA==}
matcher@3.0.0:
resolution: {integrity: sha512-OkeDaAZ/bQCxeFAozM55PKcKU0yJMPGifLwV4Qgjitu+5MoAfSQN4lsLJeXZ1b8w0x+/Emda6MZgXS1jvsapng==}
engines: {node: '>=10'}
minipass@7.1.2:
resolution: {integrity: sha512-qOOzS1cBTWYF4BH8fVePDBOO9iptMnGUEZwNc/cMWnTV2nVLZ7VoNWEPHkYczZA0pdoA7dl6e7FL659nX9S2aw==}
engines: {node: '>=16 || 14 >=14.17'}
minizlib@3.1.0:
resolution: {integrity: sha512-KZxYo1BUkWD2TVFLr0MQoM8vUUigWD3LlD83a/75BqC+4qE0Hb1Vo5v1FgcfaNXvfXzr+5EhQ6ing/CaBijTlw==}
engines: {node: '>= 18'}
object-keys@1.1.1:
resolution: {integrity: sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA==}
engines: {node: '>= 0.4'}
onnxruntime-common@1.21.0:
resolution: {integrity: sha512-Q632iLLrtCAVOTO65dh2+mNbQir/QNTVBG3h/QdZBpns7mZ0RYbLRBgGABPbpU9351AgYy7SJf1WaeVwMrBFPQ==}
onnxruntime-common@1.22.0-dev.20250409-89f8206ba4:
resolution: {integrity: sha512-vDJMkfCfb0b1A836rgHj+ORuZf4B4+cc2bASQtpeoJLueuFc5DuYwjIZUBrSvx/fO5IrLjLz+oTrB3pcGlhovQ==}
onnxruntime-node@1.21.0:
resolution: {integrity: sha512-NeaCX6WW2L8cRCSqy3bInlo5ojjQqu2fD3D+9W5qb5irwxhEyWKXeH2vZ8W9r6VxaMPUan+4/7NDwZMtouZxEw==}
os: [win32, darwin, linux]
onnxruntime-web@1.22.0-dev.20250409-89f8206ba4:
resolution: {integrity: sha512-0uS76OPgH0hWCPrFKlL8kYVV7ckM7t/36HfbgoFw6Nd0CZVVbQC4PkrR8mBX8LtNUFZO25IQBqV2Hx2ho3FlbQ==}
platform@1.3.6:
resolution: {integrity: sha512-fnWVljUchTro6RiCFvCXBbNhJc2NijN7oIQxbwsyL0buWJPG85v81ehlHI9fXrJsMNgTofEoWIQeClKpgxFLrg==}
protobufjs@7.5.4:
resolution: {integrity: sha512-CvexbZtbov6jW2eXAvLukXjXUW1TzFaivC46BpWc/3BpcCysb5Vffu+B3XHMm8lVEuy2Mm4XGex8hBSg1yapPg==}
engines: {node: '>=12.0.0'}
roarr@2.15.4:
resolution: {integrity: sha512-CHhPh+UNHD2GTXNYhPWLnU8ONHdI+5DI+4EYIAOaiD63rHeYlZvyh8P+in5999TTSFgUYuKUAjzRI4mdh/p+2A==}
engines: {node: '>=8.0'}
semver-compare@1.0.0:
resolution: {integrity: sha512-YM3/ITh2MJ5MtzaM429anh+x2jiLVjqILF4m4oyQB18W7Ggea7BfqdH/wGMK7dDiMghv/6WG7znWMwUDzJiXow==}
semver@7.7.4:
resolution: {integrity: sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==}
engines: {node: '>=10'}
hasBin: true
serialize-error@7.0.1:
resolution: {integrity: sha512-8I8TjW5KMOKsZQTvoxjuSIa7foAwPWGOts+6o7sgjz41/qMD9VQHEDxi6PBvK2l0MXUmqZyNpUK+T2tQaaElvw==}
engines: {node: '>=10'}
sharp@0.34.5:
resolution: {integrity: sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
sprintf-js@1.1.3:
resolution: {integrity: sha512-Oo+0REFV59/rz3gfJNKQiBlwfHaSESl1pcGyABQsnnIfWOFt6JNj5gCog2U6MLZ//IGYD+nA8nI+mTShREReaA==}
tar@7.5.7:
resolution: {integrity: sha512-fov56fJiRuThVFXD6o6/Q354S7pnWMJIVlDBYijsTNx6jKSE4pvrDTs6lUnmGvNyfJwFQQwWy3owKz1ucIhveQ==}
engines: {node: '>=18'}
tslib@2.8.1:
resolution: {integrity: sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==}
type-fest@0.13.1:
resolution: {integrity: sha512-34R7HTnG0XIJcBSn5XhDd7nNFPRcXYRZrBB2O2jdKqYODldSzBAqzsWoZYYvduky73toYS/ESqxPvkDf/F0XMg==}
engines: {node: '>=10'}
undici-types@7.16.0:
resolution: {integrity: sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw==}
yallist@5.0.0:
resolution: {integrity: sha512-YgvUTfwqyc7UXVMrB+SImsVYSmTS8X/tSrtdNZMImM+n7+QTriRXyXim0mBrTXNeqzVF0KWGgHPeiyViFFrNDw==}
engines: {node: '>=18'}
snapshots:
'@emnapi/runtime@1.8.1':
dependencies:
tslib: 2.8.1
optional: true
'@huggingface/jinja@0.5.5': {}
'@huggingface/transformers@3.8.1':
dependencies:
'@huggingface/jinja': 0.5.5
onnxruntime-node: 1.21.0
onnxruntime-web: 1.22.0-dev.20250409-89f8206ba4
sharp: 0.34.5
'@img/colour@1.0.0': {}
'@img/sharp-darwin-arm64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-darwin-arm64': 1.2.4
optional: true
'@img/sharp-darwin-x64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-darwin-x64': 1.2.4
optional: true
'@img/sharp-libvips-darwin-arm64@1.2.4':
optional: true
'@img/sharp-libvips-darwin-x64@1.2.4':
optional: true
'@img/sharp-libvips-linux-arm64@1.2.4':
optional: true
'@img/sharp-libvips-linux-arm@1.2.4':
optional: true
'@img/sharp-libvips-linux-ppc64@1.2.4':
optional: true
'@img/sharp-libvips-linux-riscv64@1.2.4':
optional: true
'@img/sharp-libvips-linux-s390x@1.2.4':
optional: true
'@img/sharp-libvips-linux-x64@1.2.4':
optional: true
'@img/sharp-libvips-linuxmusl-arm64@1.2.4':
optional: true
'@img/sharp-libvips-linuxmusl-x64@1.2.4':
optional: true
'@img/sharp-linux-arm64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-arm64': 1.2.4
optional: true
'@img/sharp-linux-arm@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-arm': 1.2.4
optional: true
'@img/sharp-linux-ppc64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-ppc64': 1.2.4
optional: true
'@img/sharp-linux-riscv64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-riscv64': 1.2.4
optional: true
'@img/sharp-linux-s390x@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-s390x': 1.2.4
optional: true
'@img/sharp-linux-x64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linux-x64': 1.2.4
optional: true
'@img/sharp-linuxmusl-arm64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linuxmusl-arm64': 1.2.4
optional: true
'@img/sharp-linuxmusl-x64@0.34.5':
optionalDependencies:
'@img/sharp-libvips-linuxmusl-x64': 1.2.4
optional: true
'@img/sharp-wasm32@0.34.5':
dependencies:
'@emnapi/runtime': 1.8.1
optional: true
'@img/sharp-win32-arm64@0.34.5':
optional: true
'@img/sharp-win32-ia32@0.34.5':
optional: true
'@img/sharp-win32-x64@0.34.5':
optional: true
'@isaacs/fs-minipass@4.0.1':
dependencies:
minipass: 7.1.2
'@protobufjs/aspromise@1.1.2': {}
'@protobufjs/base64@1.1.2': {}
'@protobufjs/codegen@2.0.4': {}
'@protobufjs/eventemitter@1.1.0': {}
'@protobufjs/fetch@1.1.0':
dependencies:
'@protobufjs/aspromise': 1.1.2
'@protobufjs/inquire': 1.1.0
'@protobufjs/float@1.0.2': {}
'@protobufjs/inquire@1.1.0': {}
'@protobufjs/path@1.1.2': {}
'@protobufjs/pool@1.1.0': {}
'@protobufjs/utf8@1.1.0': {}
'@types/node@25.2.2':
dependencies:
undici-types: 7.16.0
boolean@3.2.0: {}
chownr@3.0.0: {}
define-data-property@1.1.4:
dependencies:
es-define-property: 1.0.1
es-errors: 1.3.0
gopd: 1.2.0
define-properties@1.2.1:
dependencies:
define-data-property: 1.1.4
has-property-descriptors: 1.0.2
object-keys: 1.1.1
detect-libc@2.1.2: {}
detect-node@2.1.0: {}
es-define-property@1.0.1: {}
es-errors@1.3.0: {}
es6-error@4.1.1: {}
escape-string-regexp@4.0.0: {}
flatbuffers@25.9.23: {}
global-agent@3.0.0:
dependencies:
boolean: 3.2.0
es6-error: 4.1.1
matcher: 3.0.0
roarr: 2.15.4
semver: 7.7.4
serialize-error: 7.0.1
globalthis@1.0.4:
dependencies:
define-properties: 1.2.1
gopd: 1.2.0
gopd@1.2.0: {}
guid-typescript@1.0.9: {}
has-property-descriptors@1.0.2:
dependencies:
es-define-property: 1.0.1
json-stringify-safe@5.0.1: {}
long@5.3.2: {}
matcher@3.0.0:
dependencies:
escape-string-regexp: 4.0.0
minipass@7.1.2: {}
minizlib@3.1.0:
dependencies:
minipass: 7.1.2
object-keys@1.1.1: {}
onnxruntime-common@1.21.0: {}
onnxruntime-common@1.22.0-dev.20250409-89f8206ba4: {}
onnxruntime-node@1.21.0:
dependencies:
global-agent: 3.0.0
onnxruntime-common: 1.21.0
tar: 7.5.7
onnxruntime-web@1.22.0-dev.20250409-89f8206ba4:
dependencies:
flatbuffers: 25.9.23
guid-typescript: 1.0.9
long: 5.3.2
onnxruntime-common: 1.22.0-dev.20250409-89f8206ba4
platform: 1.3.6
protobufjs: 7.5.4
platform@1.3.6: {}
protobufjs@7.5.4:
dependencies:
'@protobufjs/aspromise': 1.1.2
'@protobufjs/base64': 1.1.2
'@protobufjs/codegen': 2.0.4
'@protobufjs/eventemitter': 1.1.0
'@protobufjs/fetch': 1.1.0
'@protobufjs/float': 1.0.2
'@protobufjs/inquire': 1.1.0
'@protobufjs/path': 1.1.2
'@protobufjs/pool': 1.1.0
'@protobufjs/utf8': 1.1.0
'@types/node': 25.2.2
long: 5.3.2
roarr@2.15.4:
dependencies:
boolean: 3.2.0
detect-node: 2.1.0
globalthis: 1.0.4
json-stringify-safe: 5.0.1
semver-compare: 1.0.0
sprintf-js: 1.1.3
semver-compare@1.0.0: {}
semver@7.7.4: {}
serialize-error@7.0.1:
dependencies:
type-fest: 0.13.1
sharp@0.34.5:
dependencies:
'@img/colour': 1.0.0
detect-libc: 2.1.2
semver: 7.7.4
optionalDependencies:
'@img/sharp-darwin-arm64': 0.34.5
'@img/sharp-darwin-x64': 0.34.5
'@img/sharp-libvips-darwin-arm64': 1.2.4
'@img/sharp-libvips-darwin-x64': 1.2.4
'@img/sharp-libvips-linux-arm': 1.2.4
'@img/sharp-libvips-linux-arm64': 1.2.4
'@img/sharp-libvips-linux-ppc64': 1.2.4
'@img/sharp-libvips-linux-riscv64': 1.2.4
'@img/sharp-libvips-linux-s390x': 1.2.4
'@img/sharp-libvips-linux-x64': 1.2.4
'@img/sharp-libvips-linuxmusl-arm64': 1.2.4
'@img/sharp-libvips-linuxmusl-x64': 1.2.4
'@img/sharp-linux-arm': 0.34.5
'@img/sharp-linux-arm64': 0.34.5
'@img/sharp-linux-ppc64': 0.34.5
'@img/sharp-linux-riscv64': 0.34.5
'@img/sharp-linux-s390x': 0.34.5
'@img/sharp-linux-x64': 0.34.5
'@img/sharp-linuxmusl-arm64': 0.34.5
'@img/sharp-linuxmusl-x64': 0.34.5
'@img/sharp-wasm32': 0.34.5
'@img/sharp-win32-arm64': 0.34.5
'@img/sharp-win32-ia32': 0.34.5
'@img/sharp-win32-x64': 0.34.5
sprintf-js@1.1.3: {}
tar@7.5.7:
dependencies:
'@isaacs/fs-minipass': 4.0.1
chownr: 3.0.0
minipass: 7.1.2
minizlib: 3.1.0
yallist: 5.0.0
tslib@2.8.1:
optional: true
type-fest@0.13.1: {}
undici-types@7.16.0: {}
yallist@5.0.0: {}

View File

@@ -0,0 +1,116 @@
const GITHUB_API = 'https://api.github.com';
const GITHUB_TOKEN = process.env.GITHUB_TOKEN;
const GITHUB_REPOSITORY = process.env.GITHUB_REPOSITORY;
function ghHeaders() {
return {
Authorization: `token ${GITHUB_TOKEN}`,
Accept: 'application/vnd.github+json',
};
}
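// Fetch open issues via the paginated list endpoint, newest-updated first,
// skipping pull requests (the GitHub issues API returns PRs as well).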
export async function fetchIssues({
state = 'open',
since,
maxIssues = 5000,
} = {}) {
const issues = [];
let page = 1;
const perPage = 100;
while (issues.length < maxIssues) {
const params = new URLSearchParams({
state,
per_page: String(perPage),
page: String(page),
sort: 'updated',
direction: 'desc',
});
if (since) params.set('since', since);
const url = `${GITHUB_API}/repos/${GITHUB_REPOSITORY}/issues?${params}`;
const resp = await fetch(url, { headers: ghHeaders() });
if (!resp.ok) {
throw new Error(`GitHub API error: ${resp.status} ${resp.statusText}`);
}
const batch = await resp.json();
if (!batch.length) break;
for (const item of batch) {
if (!item.pull_request) {
issues.push(item);
}
}
page++;
if (batch.length < perPage) break;
}
return issues.slice(0, maxIssues);
}
export async function getIssue(issueNumber) {
const url = `${GITHUB_API}/repos/${GITHUB_REPOSITORY}/issues/${issueNumber}`;
const resp = await fetch(url, { headers: ghHeaders() });
if (!resp.ok) {
throw new Error(`GitHub API error: ${resp.status} ${resp.statusText}`);
}
return resp.json();
}
export async function postComment(issueNumber, body) {
const url = `${GITHUB_API}/repos/${GITHUB_REPOSITORY}/issues/${issueNumber}/comments`;
const resp = await fetch(url, {
method: 'POST',
headers: { ...ghHeaders(), 'Content-Type': 'application/json' },
body: JSON.stringify({ body }),
});
if (!resp.ok) {
throw new Error(
`Failed to post comment: ${resp.status} ${resp.statusText}`
);
}
console.log(`Posted comment on #${issueNumber}`);
}
export async function addLabel(issueNumber, label) {
const url = `${GITHUB_API}/repos/${GITHUB_REPOSITORY}/issues/${issueNumber}/labels`;
const resp = await fetch(url, {
method: 'POST',
headers: { ...ghHeaders(), 'Content-Type': 'application/json' },
body: JSON.stringify({ labels: [label] }),
});
if (resp.status === 404) {
console.warn(
`Label '${label}' does not exist - skipping. Create it manually.`
);
return;
}
if (!resp.ok) {
throw new Error(`Failed to add label: ${resp.status} ${resp.statusText}`);
}
console.log(`Added label '${label}' to #${issueNumber}`);
}
export function issueText(title, body) {
body = (body || '').trim();
if (body.length > 2000) body = body.slice(0, 2000) + '...';
return body ? `${title}\n\n${body}` : title;
}
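// Dot product of two equal-length vectors. With unit-length embeddings this
// equals cosine similarity, presumably how issue similarity is scored here.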
export function dotProduct(a, b) {
let sum = 0;
for (let i = 0; i < a.length; i++) {
sum += a[i] * b[i];
}
return sum;
}

View File

@@ -13,6 +13,12 @@ Refer to [Configuring Databases](/extending-seerr/database-config#postgresql-opt
:::info
An alternative Docker image is available on Docker Hub for this project. You can find it at [Docker Hub Repository Link](https://hub.docker.com/r/seerr/seerr)
Our Docker images are available with the following tags:
- `latest`: Always points to the most recent stable release.
- Version tags (e.g., `v3.0.0`): For specific stable versions.
- `develop`: Rolling release/nightly builds for using the latest changes (use with caution).
:::
:::info
@@ -38,6 +44,13 @@ For details on the Docker CLI, please [review the official `docker run` document
#### Installation:
```bash
# Create the appdata folder
mkdir /path/to/appdata/config
# Chown the folder as the container runs as the `node` user (UID 1000).
chown -R 1000:1000 /path/to/appdata/config
```
```bash
docker run -d \
--name seerr \
@@ -48,20 +61,16 @@ docker run -d \
-p 5055:5055 \
-v /path/to/appdata/config:/app/config \
--restart unless-stopped \
ghcr.io/seerr-team/seerr:latest
```
The argument `-e PORT=5055` is optional.
If you want to add a healthcheck to the above command, you can add the following flags:
```
--health-cmd "wget --no-verbose --tries=1 --spider http://localhost:5055/api/v1/status || exit 1" \
--health-start-period 20s \
--health-timeout 3s \
--health-interval 15s \
--health-retries 3 \
ghcr.io/seerr-team/seerr:latest
```
The argument `-e PORT=5055` is optional.
To run the container as a specific user/group, you may optionally add `--user=[ user | user:group | uid | uid:gid | user:gid | uid:group ]` to the above command.
#### Updating:
@@ -115,6 +124,13 @@ services:
restart: unless-stopped
```
```bash
# Create the appdata folder
mkdir /path/to/appdata/config
# Chown the folder as the container runs as the `node` user (UID 1000).
chown -R 1000:1000 /path/to/appdata/config
```
Then, start all services defined in the Compose file:
```bash
docker compose up -d
@@ -129,9 +145,6 @@ Then, restart all services defined in the Compose file:
```bash
docker compose up -d
```
:::tip
You may alternatively use a third-party mechanism like [dockge](https://github.com/louislam/dockge) to manage your docker compose files.
:::
</TabItem>
</Tabs>
@@ -166,20 +179,16 @@ docker run -d \
-p 5055:5055 \
-v seerr-data:/app/config \
--restart unless-stopped \
ghcr.io/seerr-team/seerr:latest
```
The argument `-e PORT=5055` is optional.
If you want to add a healthcheck to the above command, you can add the following flags:
```
--health-cmd "wget --no-verbose --tries=1 --spider http://localhost:5055/api/v1/status || exit 1" \
--health-start-period 20s \
--health-timeout 3s \
--health-interval 15s \
--health-retries 3 \
ghcr.io/seerr-team/seerr:latest
```
The argument `-e PORT=5055` is optional.
#### Updating:
Pull the latest image:
```bash

View File

@@ -7,5 +7,9 @@ import DocCardList from '@theme/DocCardList';
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
:::info
Want to add a third-party installation method? Contributions are welcome! Feel free to open a pull request.
:::
<DocCardList />

View File

@@ -10,8 +10,21 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Nix Package Manager
:::danger
This method has not yet been updated for Seerr and is currently a work in progress.
You can follow the ongoing work on these pull requests:
- https://github.com/NixOS/nixpkgs/pull/450096
- https://github.com/NixOS/nixpkgs/pull/450093
:::
<!--
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
:::warning
This method is not recommended for most users. It is intended for advanced users who are using the NixOS distribution.
:::
Refer to [NixOS documentation](https://search.nixos.org/options?channel=25.05&query=seerr)
-->

View File

@@ -0,0 +1,20 @@
---
title: TrueNAS (Advanced)
description: Install Seerr using TrueNAS
sidebar_position: 4
---
# TrueNAS
:::danger
This method has not yet been updated for Seerr and is currently a work in progress.
You can follow the ongoing work on this issue https://github.com/truenas/apps/issues/3374.
:::
<!--
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
:::warning
This method is not recommended for most users. It is intended for advanced users who are using the TrueNAS distribution.
:::
-->

View File

@@ -5,6 +5,12 @@ sidebar_position: 3
---
# Unraid
:::danger
This method has not yet been updated for Seerr and is awaiting a community contribution.
Feel free to open a pull request on GitHub to update this installation method.
:::
<!--
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
@@ -18,3 +24,4 @@ This method is not recommended for most users. It is intended for advanced users
3. Click the **Install Button**.
4. On the following **Add Container** screen, make changes to the **Host Port** and **Host Path 1** \(Appdata\) as needed.
5. Click apply and access "Seerr" at your `<ServerIP:HostPort>` in a web browser.
-->

View File

@@ -5,26 +5,53 @@ title: Migration guide
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
:::important
Read our [release announcement](/blog/seerr-release) to learn what Seerr means for Jellyseerr and Overseerr users.
:::
Whether you come from Overseerr or Jellyseerr, you don't need to perform any manual migration steps; your instance will be migrated to Seerr automatically.
This migration runs automatically the first time you start your instance on the Seerr codebase (Docker image, source build, Kubernetes, etc.).
An additional migration will run for Overseerr users to bring their configuration over to the new codebase.
:::warning
:::danger
Before doing anything you should backup your existing instance so that you can rollback in case something goes wrong.
See [Backups](/using-seerr/backups) for details on how to properly backup your instance.
:::
:::warning
Installation methods are now divided into two categories: official and third-party methods.
The Seerr team is only responsible for official installation methods, while third-party methods are maintained by the community.
Some methods are currently not maintained, but this does not mean they are permanently discontinued. The community may restore and support them if they choose to do so.
- **Unraid app:** Not maintained
- **Snap package:** Not maintained
:::
## Docker
Refer to the [Seerr Docker Documentation](/getting-started/docker); all of our examples have been updated to reflect the changes below.
:::info
Seerr provides a secure, fully featured image with everything you need included.
We sincerely appreciate the past contributions from third-party maintainers, which helped enhance this image and its capabilities.
To maintain consistency and security, we encourage everyone to use the features available in the official Seerr image.
If you feel something is missing, please submit a feature request—your feedback is always welcome!
Our Docker images are available with the following tags:
- `latest`: Always points to the most recent stable release.
- Version tags (e.g., `v3.0.0`): For specific stable versions.
- `develop`: Rolling release/nightly builds for using the latest changes (use with caution).
:::
Changes:
- Renamed all references from `overseerr` or `jellyseerr` to `seerr`.
- The container image reference has been updated.
- The container can now be run as a non-root user (`node` user); remove the `user` directive if you have configured it.
- The container no longer provides an init process, so you must configure it by adding `init: true` for Docker Compose or `--init` for the Docker CLI.
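Taken together, a minimal Compose service reflecting these changes might look like the sketch below (a sketch only; adapt paths, ports, and tags to your setup):
```yaml
services:
  seerr:
    image: ghcr.io/seerr-team/seerr:latest
    init: true # the image no longer ships its own init process
    # no `user:` directive needed; the container already runs as the node user
    ports:
      - 5055:5055
    volumes:
      - /path/to/appdata/config:/app/config
    restart: unless-stopped
```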
#### Config folder permissions
:::info
**Config folder permissions**: Since the container now runs as the `node` user (UID 1000), you must ensure your config folder has the correct permissions. The `node` user must have read and write access to the `/app/config` directory.
Since the container now runs as the `node` user (UID 1000), you must ensure your config folder has the correct permissions. The `node` user must have read and write access to the `/app/config` directory.
If you're migrating from a previous installation, you may need to update the ownership of your config folder:
```bash
@@ -126,6 +153,12 @@ Summary of changes :
</TabItem>
</Tabs>
## Build From Source
Refer to the [Seerr Build From Source Documentation](/getting-started/buildfromsource); all of our examples have been updated to reflect the changes below.
Install from scratch by following the documentation, restore your data as described in [Backups](/using-seerr/backups), and then start Seerr. No additional steps are required.
## Kubernetes
Refer to the [Seerr Kubernetes Documentation](/getting-started/kubernetes); all of our examples have been updated to reflect the changes below.
@@ -166,3 +199,15 @@ Summary of changes :
```
</TabItem>
</Tabs>
### Nix (Third-party installation methods)
Waiting for https://github.com/NixOS/nixpkgs/pull/450096 and https://github.com/NixOS/nixpkgs/pull/450093
### AUR (Third-party installation methods)
See https://aur.archlinux.org/packages/seerr
### TrueNAS (Third-party installation methods)
Waiting for https://github.com/truenas/apps/issues/3374

View File

@@ -7,7 +7,7 @@ Seerr docs will be available at [docs.seerr.dev](https://docs.seerr.dev).
### Installation
```
$ pnpm
$ pnpm install
```
### Local Development

View File

@@ -4,7 +4,7 @@ description: The official Seerr blog for release notes, technical updates, and c
slug: welcome
authors: [fallenbagel, gauthier-th]
tags: [announcement, seerr, blog]
image: https://raw.githubusercontent.com/seerr-team/seerr/refs/heads/develop/gen-docs/static/img/logo.svg
image: https://raw.githubusercontent.com/seerr-team/seerr/refs/heads/develop/gen-docs/static/img/logo_full.svg
hide_table_of_contents: false
---

Binary file not shown.

After

Width:  |  Height:  |  Size: 36 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 166 KiB

View File

@@ -0,0 +1,127 @@
---
title: "Seerr Release: Unifying Overseerr and Jellyseerr"
description: "Overseerr and Jellyseerr are merging into a unified project: Seerr"
slug: seerr-release
authors: [seerr-team]
image: https://raw.githubusercontent.com/seerr-team/seerr/refs/heads/develop/gen-docs/static/img/logo_full.svg
hide_table_of_contents: false
---
We're excited to announce a major update: the Jellyseerr and Overseerr teams are officially merging into a single team called **Seerr**. This unification marks an important step forward as we bring our efforts together under one banner.
For users, this means one shared codebase combining all existing Overseerr functionalities with the latest Jellyseerr features, along with Jellyfin and Emby support, allowing us to deliver updates more efficiently and keep the project moving forward.
Please check how to migrate to Seerr in our [migration guide](https://docs.seerr.dev/migration-guide) and stay tuned for more updates on the project!
<!--truncate-->
## What's new in Seerr for Overseerr users
Seerr brings several features that were previously available in Jellyseerr but missing from Overseerr. These additions improve flexibility, performance, and overall control for admins and power users:
* **Alternative media solution:** Added support for Jellyfin and Emby in addition to the existing Plex integration.
* **PostgreSQL support**: In addition to SQLite, you can now opt in to using a PostgreSQL database.
* **Blacklist for movies, series, and tags**: Allows permitted users to hide movies, series, or tags from regular users.
* **Override rules**: Adjust default request settings based on conditions such as user, tag, or other criteria.
* **TVDB metadata**: Option to use TheTVDB metadata for series (as in Sonarr) instead of TMDB.
* **DNS caching**: Reduces lookup times and external requests, especially useful when using systems like Pi-Hole/Adguard Home.
* **Helm chart included**: Enables easier installation and maintenance in Kubernetes environments.
* **ntfy.sh notifications**: Support for sending notifications via ntfy.sh.
* **Disable special seasons:** Adds a setting to prevent special seasons from being shown or requested.
* **New languages**: Turkish and Basque.
## What's new since the previous Jellyseerr release
This release also brings several important improvements and long-requested features, including **TheTVDB metadata support**, **DNS caching**, and **dynamic webhook placeholders**, along with a few quality-of-life improvements for developers and users alike.
### PNPM v10 Upgrade
We're updating Seerr to **PNPM v10** to keep our development tooling up to date. If you build Seerr from source or contribute to Seerr, you'll need to **update your local PNPM installation** before working on the project.
This doesn't affect you if you're using Docker.
To update, run the following command:
`pnpm self-update`
After updating, verify your version with:
`pnpm -v`
You should see version **10.x.x**.
### TVDB Metadata Provider (Experimental)
We're excited to introduce support for **TheTVDB** as a new metadata provider!
Previously, Seerr relied solely on **TMDB** for movie and TV show information, which sometimes led to discrepancies in season and episode numbering when working with **Sonarr**, since Sonarr uses **TheTVDB** as its metadata source.
With this new integration, Seerr can now use **the same data source as Sonarr** for series and anime, ensuring consistent and accurate season and episode information across both platforms.
You can try this new experimental feature in the new “Metadata Providers” tab of the settings page:
![Metadata Providers](./metadata-providers.png)
### DNS Caching (Experimental)
By default, Node.js doesn't cache any DNS lookups. Our DNS cache manager addresses the problems caused by extremely high DNS query rates, particularly for large Jellyfin libraries, where each HTTP request also resulted in another DNS query. By caching these lookups, **Seerr will now reduce stress on DNS servers** and avoid the rate limiting or blocking encountered with services like **Pi-Hole**/**Adguard Home**.
We will post another blog post soon on all the issues we encountered with DNS caching in Node.js.
You can enable this by checking the “DNS Cache” setting in the network tabs of the Seerr settings:
![DNS Cache](./dns-cache.png)
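To give a feel for the idea, the core of a DNS cache is small: remember resolved addresses for a short time so repeated requests to the same host skip the resolver. The sketch below is a simplified stand-in, not the actual implementation shipped in Seerr:
```ts
import { lookup } from 'node:dns/promises';

// Minimal illustrative DNS cache: remembers lookups for a short TTL so
// repeated HTTP requests to the same host don't trigger new DNS queries.
const TTL_MS = 60_000;
const cache = new Map<
  string,
  { address: string; family: number; expires: number }
>();

export async function cachedLookup(hostname: string) {
  const hit = cache.get(hostname);
  if (hit && hit.expires > Date.now()) {
    return { address: hit.address, family: hit.family };
  }
  const { address, family } = await lookup(hostname);
  cache.set(hostname, { address, family, expires: Date.now() + TTL_MS });
  return { address, family };
}
```
The built-in feature handles the details (expiry, toggling from the UI, integration with Seerr's HTTP clients); the sketch only shows why caching removes the per-request DNS round trip.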
### AniDB for Jellyfin Libraries
This new version also brings additional metadata to Jellyfin-managed collections. When there's no provider ID from TMDB or TVDB, Seerr will automatically **fall back on AniDB**, expanding coverage for lesser-known or region-specific anime.
### Dynamic Placeholders in Webhook URLs
Webhook notifications are now more powerful and adaptable with **dynamic placeholder support in webhook URLs**. This allows Seerr to automatically replace placeholders in the webhook URL with real values at runtime.
For example, you can include the requester's username directly in your webhook URL to better integrate with third-party services or user-specific endpoints.
This feature can be enabled from the **Notifications** settings page, where available placeholders are listed for reference. It's currently marked as **experimental**, and we welcome community feedback to help refine and expand support for additional placeholders in future releases.
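To give a concrete idea of what happens at runtime, here is a minimal sketch of the substitution (the function and placeholder names are illustrative, not Seerr's exact internals):
```ts
// Replace {{placeholder}} tokens in a webhook URL with runtime values.
// The placeholder names and the values object are illustrative.
function applyWebhookPlaceholders(
  urlTemplate: string,
  values: Record<string, string>
): string {
  return urlTemplate.replace(/{{(\w+)}}/g, (match, key: string) => {
    const value = values[key];
    return value !== undefined ? encodeURIComponent(value) : match;
  });
}

// For example, including the requester's username in the URL:
applyWebhookPlaceholders('https://hooks.example.com/{{requestedBy_username}}', {
  requestedBy_username: 'alice',
});
// -> 'https://hooks.example.com/alice'
```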
### Optional Images in Notifications
Another small feature: **images in notifications are now optional** (but still enabled by default). Previous versions always included images in notifications, which could lead to broken links or failed requests if images were missing or unavailable.
### Security improvement
Some outdated dependencies have been updated (some work is still in progress). Helm charts and containers are now cryptographically signed and can be verified and enforced client-side. Containers now run as a non-root user. Workflows have been completely reworked to minimize third-party actions. Permissions have been tightened, and actions are now pinned to specific hashes for better traceability. The release process has been updated to remove many outdated plugins and dependencies, replacing them with more standard industry solutions.
:::important
## Note for PostgreSQL users (optional)
If you're migrating PostgreSQL from version 17 to 18 in Docker, note that the data mount point has changed: instead of `/var/lib/postgresql/data`, the correct mount path is now `/var/lib/postgresql`. Updating the mount point is required for the container to function correctly after the upgrade.
:::
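For example, an updated Compose volume mapping might look like this (the host path is a placeholder):
```yaml
services:
  postgres:
    image: postgres:18
    volumes:
      - /path/to/postgres-data:/var/lib/postgresql
```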
## Conclusion
Seerr is built and maintained by dedicated volunteer contributors, whose skills and commitment make it all possible. Many thanks to everyone who contributed to this version:
* [0xsysr3ll](https://github.com/0xSysR3ll)
* [ale183](https://github.com/ale183)
* [Brandon Cohen](https://github.com/OwsleyJr)
* [Disparate2761](https://github.com/Disparate2761)
* [fallenbagel](https://github.com/fallenbagel)
* [Gauthier](https://github.com/gauthier-th)
* [Gauvain](https://github.com/Gauvino)
* [Georgy](https://github.com/tarasverq)
* [Ishan Jain](https://github.com/ishanjain28)
* [James Kruger](https://github.com/theGunner295)
* [Joe Harrison](https://github.com/sudo-kraken)
* [J. Winters-Brown](https://github.com/ofgrenudo)
* [Ludovic Ortega](https://github.com/M0NsTeRRR)
* [RolliePollie18](https://github.com/RolliePollie18)
* [Ryan Cohen](https://github.com/sct)
* [salty](https://github.com/saltydk)
* [samohtxotom](https://github.com/samohtxotom)
* [Sergii Bogomolov](https://github.com/sbogomolov)
* [Someone](https://github.com/InterN0te)
* [TacoCake](https://github.com/TacoCake)
* [Terry Sposato](https://github.com/tsposato)
* [TheCatLady](https://github.com/TheCatLady)
* [Thibaut Noah](https://github.com/tirrorex)
* [THOMAS B](https://github.com/TOomaAh)
Keep an eye on our blog for in-depth looks at our work and upcoming releases!

View File

@@ -1,8 +1,8 @@
fallenbagel:
name: Fallenbagel
page: true
title: Developer & Maintainer of Jellyseerr
description: Core Maintainer & Developer of Jellyseerr | Full-Stack Software Engineer | MSc Software Engineering Candidate.
title: Developer & Maintainer of Seerr
description: Core Maintainer & Developer of Seerr | Full-Stack Software Engineer | MSc Software Engineering Student.
url: https://github.com/fallenbagel
image_url: https://github.com/fallenbagel.png
email: hello@fallenbagel.com
@@ -12,10 +12,18 @@ fallenbagel:
gauthier-th:
name: Gauthier
page: true
title: Co-Developer & Co-Maintainer of Jellyseerr
description: Co-Maintainer & Developer of Jellyseerr | PhD Student in AI at ICB, Dijon
title: Developer & Maintainer of Seerr
description: Core Maintainer & Developer of Seerr | PhD Student in AI at ICB, Dijon
url: https://gauthierth.fr
image_url: https://github.com/gauthier-th.png
email: mail@gauthierth.fr
socials:
github: gauthier-th
seerr-team:
name: Seerr Team
title: The team behind Seerr, formerly known as the Jellyseerr and Overseerr teams.
url: https://seerr.dev
image_url: https://github.com/seerr-team.png
socials:
github: seerr-team

View File

@@ -64,7 +64,7 @@ const config: Config = {
navbar: {
logo: {
alt: 'Seerr',
src: 'img/logo.svg',
src: 'img/logo_full.svg',
},
items: [
{
@@ -72,6 +72,11 @@ const config: Config = {
label: 'Blog',
position: 'right',
},
{
href: 'https://discord.gg/seerr',
label: 'Discord',
position: 'right',
},
{
href: 'https://github.com/seerr-team/seerr',
label: 'GitHub',

View File

@@ -60,12 +60,12 @@
.table-of-contents__link--active,
a:not(
.card,
.menu__link,
.menu__link--sublist,
.menu__link--sublist-item,
.table-of-contents__link
) {
.card,
.menu__link,
.menu__link--sublist,
.menu__link--sublist-item,
.table-of-contents__link
) {
/* color: #793ae8; */
color: #6366f1;
}

View File

Before

Width:  |  Height:  |  Size: 25 KiB

After

Width:  |  Height:  |  Size: 25 KiB

File diff suppressed because one or more lines are too long

After

Width:  |  Height:  |  Size: 22 KiB

View File

@@ -1,21 +0,0 @@
/* eslint-disable */
const tailwind = require('prettier-plugin-tailwindcss');
const organizeImports = require('prettier-plugin-organize-imports');
const combinedFormatter = {
...tailwind,
parsers: {
...tailwind.parsers,
...Object.keys(organizeImports.parsers).reduce((acc, key) => {
acc[key] = {
...tailwind.parsers[key],
preprocess(code, options) {
return organizeImports.parsers[key].preprocess(code, options);
},
};
return acc;
}, {}),
},
};
module.exports = combinedFormatter;

View File

@@ -24,6 +24,6 @@ module.exports = {
},
experimental: {
scrollRestoration: true,
largePageDataBytes: 256000,
largePageDataBytes: 512 * 1000,
},
};

View File

@@ -5,7 +5,6 @@
"packageManager": "pnpm@10.24.0",
"scripts": {
"preinstall": "npx only-allow pnpm",
"postinstall": "node postinstall-win.js",
"dev": "nodemon -e ts --watch server --watch seerr-api.yml -e .json,.ts,.yml -x ts-node -r tsconfig-paths/register --files --project server/tsconfig.json server/index.ts",
"build:server": "tsc --project server/tsconfig.json && copyfiles -u 2 server/templates/**/*.{html,pug} dist/templates && tsc-alias -p server/tsconfig.json",
"build:next": "next build",
@@ -17,7 +16,7 @@
"migration:generate": "ts-node -r tsconfig-paths/register --project server/tsconfig.json ./node_modules/typeorm/cli.js migration:generate -d server/datasource.ts",
"migration:create": "ts-node -r tsconfig-paths/register --project server/tsconfig.json ./node_modules/typeorm/cli.js migration:create -d server/datasource.ts",
"migration:run": "ts-node -r tsconfig-paths/register --project server/tsconfig.json ./node_modules/typeorm/cli.js migration:run -d server/datasource.ts",
"format": "prettier --loglevel warn --write --cache .",
"format": "prettier --log-level warn --write --cache .",
"format:check": "prettier --check --cache .",
"typecheck": "pnpm typecheck:server && pnpm typecheck:client",
"typecheck:server": "tsc --project server/tsconfig.json --noEmit",
@@ -38,18 +37,18 @@
"@formatjs/intl-locale": "3.1.1",
"@formatjs/intl-pluralrules": "5.4.6",
"@formatjs/intl-utils": "3.8.4",
"@formatjs/swc-plugin-experimental": "^0.4.0",
"@headlessui/react": "1.7.12",
"@heroicons/react": "2.2.0",
"@seerr-team/react-tailwindcss-datepicker": "^1.3.4",
"@supercharge/request-ip": "1.2.0",
"@svgr/webpack": "6.5.1",
"@tanem/react-nprogress": "5.0.56",
"@types/ua-parser-js": "^0.7.36",
"@types/wink-jaro-distance": "^2.0.2",
"ace-builds": "1.43.4",
"axios": "1.13.2",
"axios": "1.13.3",
"axios-rate-limit": "1.4.0",
"bcrypt": "5.1.0",
"bcrypt": "6.0.0",
"bowser": "2.13.1",
"connect-typeorm": "1.1.4",
"cookie-parser": "1.4.7",
@@ -57,7 +56,6 @@
"country-flag-icons": "1.6.4",
"cronstrue": "2.23.0",
"date-fns": "2.29.3",
"dayjs": "1.11.19",
"dns-caching": "^0.2.7",
"email-templates": "12.0.3",
"express": "4.21.2",
@@ -68,16 +66,15 @@
"gravatar-url": "3.1.0",
"http-proxy-agent": "^7.0.2",
"https-proxy-agent": "^7.0.6",
"lodash": "4.17.21",
"lodash": "4.17.23",
"mime": "3",
"next": "^14.2.25",
"next": "^14.2.35",
"node-cache": "5.1.2",
"node-gyp": "9.3.1",
"node-schedule": "2.1.1",
"nodemailer": "6.10.0",
"openpgp": "5.11.2",
"pg": "8.16.3",
"plex-api": "5.3.2",
"nodemailer": "7.0.12",
"openpgp": "6.3.0",
"pg": "8.17.2",
"pug": "3.0.3",
"react": "^18.3.1",
"react-ace": "10.1.0",
@@ -90,7 +87,6 @@
"react-popper-tooltip": "4.4.2",
"react-select": "5.10.2",
"react-spring": "9.7.1",
"react-tailwindcss-datepicker-sct": "1.3.4",
"react-toast-notifications": "2.5.1",
"react-transition-group": "^4.4.5",
"react-truncate-markup": "5.1.2",
@@ -101,28 +97,28 @@
"sharp": "^0.33.4",
"sqlite3": "5.1.7",
"swagger-ui-express": "4.6.2",
"swr": "2.3.7",
"swr": "2.3.8",
"tailwind-merge": "^2.6.0",
"typeorm": "0.3.12",
"typeorm": "0.3.28",
"ua-parser-js": "^1.0.35",
"undici": "^7.16.0",
"undici": "^7.18.2",
"validator": "^13.15.23",
"web-push": "3.6.7",
"wink-jaro-distance": "^2.0.0",
"winston": "3.18.3",
"winston": "3.19.0",
"winston-daily-rotate-file": "4.7.1",
"xml2js": "0.4.23",
"xml2js": "0.5.0",
"yamljs": "0.3.0",
"yup": "0.32.11",
"zod": "3.24.2"
"zod": "4.3.6"
},
"devDependencies": {
"@commitlint/cli": "17.4.4",
"@commitlint/config-conventional": "17.4.4",
"@tailwindcss/aspect-ratio": "0.4.2",
"@tailwindcss/forms": "0.5.10",
"@tailwindcss/typography": "0.5.16",
"@types/bcrypt": "5.0.0",
"@tailwindcss/aspect-ratio": "^0.4.2",
"@tailwindcss/forms": "^0.5.10",
"@tailwindcss/typography": "^0.5.16",
"@types/bcrypt": "6.0.0",
"@types/cookie-parser": "1.4.10",
"@types/country-flag-icons": "1.2.2",
"@types/csurf": "1.11.5",
@@ -133,7 +129,7 @@
"@types/mime": "3",
"@types/node": "22.10.5",
"@types/node-schedule": "2.1.8",
"@types/nodemailer": "6.4.7",
"@types/nodemailer": "7",
"@types/react": "^18.3.3",
"@types/react-dom": "^18.3.0",
"@types/react-transition-group": "4.4.12",
@@ -142,20 +138,20 @@
"@types/swagger-ui-express": "4.1.8",
"@types/validator": "^13.15.10",
"@types/web-push": "3.6.4",
"@types/xml2js": "0.4.11",
"@types/xml2js": "0.4.14",
"@types/yamljs": "0.2.31",
"@types/yup": "0.29.14",
"@typescript-eslint/eslint-plugin": "5.54.0",
"@typescript-eslint/parser": "5.54.0",
"autoprefixer": "10.4.22",
"@typescript-eslint/eslint-plugin": "7.18.0",
"@typescript-eslint/parser": "7.18.0",
"autoprefixer": "^10.4.23",
"baseline-browser-mapping": "^2.8.32",
"commitizen": "4.3.1",
"copyfiles": "2.4.1",
"cy-mobile-commands": "0.3.0",
"cypress": "14.1.0",
"cypress": "14.5.4",
"cz-conventional-changelog": "3.3.0",
"eslint": "8.35.0",
"eslint-config-next": "^14.2.4",
"eslint": "8.57.1",
"eslint-config-next": "^14.2.35",
"eslint-config-prettier": "8.6.0",
"eslint-plugin-formatjs": "4.9.0",
"eslint-plugin-jsx-a11y": "6.10.2",
@@ -166,24 +162,20 @@
"husky": "8.0.3",
"lint-staged": "13.1.2",
"nodemon": "3.1.11",
"postcss": "8.5.6",
"prettier": "2.8.4",
"prettier-plugin-organize-imports": "3.2.2",
"prettier-plugin-tailwindcss": "0.2.3",
"tailwindcss": "3.2.7",
"postcss": "^8.5.6",
"prettier": "3.8.1",
"prettier-plugin-organize-imports": "4.3.0",
"prettier-plugin-tailwindcss": "0.6.14",
"tailwindcss": "3.4.19",
"ts-node": "10.9.2",
"tsc-alias": "1.8.16",
"tsconfig-paths": "4.2.0",
"typescript": "4.9.5"
"typescript": "5.4.5"
},
"engines": {
"node": "^22.0.0",
"pnpm": "^10.0.0"
},
"overrides": {
"sqlite3/node-gyp": "8.4.1",
"@types/express-session": "1.18.2"
},
"config": {
"commitizen": {
"path": "./node_modules/cz-conventional-changelog"
@@ -210,6 +202,10 @@
"cypress",
"sharp",
"sqlite3"
]
],
"overrides": {
"sqlite3>node-gyp": "8.4.1",
"@types/express-session": "1.18.2"
}
}
}

6209
pnpm-lock.yaml generated

File diff suppressed because it is too large

View File

@@ -1,13 +0,0 @@
/* eslint-disable @typescript-eslint/no-var-requires */
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');
if (process.platform === 'win32') {
const typeormPath = path.resolve('node_modules/typeorm');
if (fs.existsSync(typeormPath)) {
process.stdout.write('> Installing typeorm@0.3.11 for Windows\n');
execSync('pnpm add typeorm@0.3.11', { stdio: 'inherit' });
}
}

View File

@@ -13,6 +13,7 @@ const DEFAULT_ROLLING_BUFFER = 10000;
export interface ExternalAPIOptions {
nodeCache?: NodeCache;
headers?: Record<string, unknown>;
timeout?: number;
rateLimit?: {
maxRPS: number;
maxRequests: number;
@@ -32,6 +33,7 @@ class ExternalAPI {
this.axios = axios.create({
baseURL: baseUrl,
params,
timeout: options.timeout,
headers: {
'Content-Type': 'application/json',
Accept: 'application/json',

View File

@@ -57,7 +57,7 @@ interface GithubCommit {
sha: string;
url: string;
html_url: string;
}
},
];
}

View File

@@ -420,7 +420,7 @@ class JellyfinAPI extends ExternalAPI {
}
public async getEpisodes<
T extends { includeMediaInfo?: boolean } | undefined = undefined
T extends { includeMediaInfo?: boolean } | undefined = undefined,
>(
seriesID: string,
seasonID: string,

View File

@@ -1,7 +1,14 @@
import ExternalAPI from '@server/api/externalapi';
import type { Library, PlexSettings } from '@server/lib/settings';
import { getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import NodePlexAPI from 'plex-api';
interface PlexStatusResponse {
MediaContainer: {
machineIdentifier: string;
friendlyName: string;
};
}
export interface PlexLibraryItem {
ratingKey: string;
@@ -84,9 +91,7 @@ interface PlexMetadataResponse {
};
}
class PlexAPI {
private plexClient: NodePlexAPI;
class PlexAPI extends ExternalAPI {
constructor({
plexToken,
plexSettings,
@@ -97,48 +102,33 @@ class PlexAPI {
timeout?: number;
}) {
const settings = getSettings();
let settingsPlex: PlexSettings | undefined;
plexSettings
? (settingsPlex = plexSettings)
: (settingsPlex = getSettings().plex);
const settingsPlex = plexSettings ?? settings.plex;
this.plexClient = new NodePlexAPI({
hostname: settingsPlex.ip,
port: settingsPlex.port,
https: settingsPlex.useSsl,
timeout: timeout,
token: plexToken ?? undefined,
authenticator: {
authenticate: (
_plexApi,
cb: (err?: string, token?: string) => void
) => {
if (!plexToken) {
return cb('Plex Token not found!');
}
cb(undefined, plexToken);
const protocol = settingsPlex.useSsl ? 'https' : 'http';
const baseUrl = `${protocol}://${settingsPlex.ip}:${settingsPlex.port}`;
super(
baseUrl,
{},
{
timeout,
headers: {
'X-Plex-Token': plexToken ?? '',
'X-Plex-Client-Identifier': settings.clientId,
'X-Plex-Product': 'Seerr',
'X-Plex-Device-Name': 'Seerr',
'X-Plex-Platform': 'Seerr',
},
},
// requestOptions: {
// includeChildren: 1,
// },
options: {
identifier: settings.clientId,
product: 'Seerr',
deviceName: 'Seerr',
platform: 'Seerr',
},
});
}
);
}
public async getStatus() {
return await this.plexClient.query('/');
public async getStatus(): Promise<PlexStatusResponse> {
return await this.get('/');
}
public async getLibraries(): Promise<PlexLibrary[]> {
const response = await this.plexClient.query<PlexLibrariesResponse>(
'/library/sections'
);
const response = await this.get<PlexLibrariesResponse>('/library/sections');
return response.MediaContainer.Directory;
}
@@ -187,13 +177,15 @@ class PlexAPI {
id: string,
{ offset = 0, size = 50 }: { offset?: number; size?: number } = {}
): Promise<{ totalSize: number; items: PlexLibraryItem[] }> {
const response = await this.plexClient.query<PlexLibraryResponse>({
uri: `/library/sections/${id}/all?includeGuids=1`,
extraHeaders: {
'X-Plex-Container-Start': `${offset}`,
'X-Plex-Container-Size': `${size}`,
},
});
const response = await this.get<PlexLibraryResponse>(
`/library/sections/${id}/all?includeGuids=1`,
{
headers: {
'X-Plex-Container-Start': `${offset}`,
'X-Plex-Container-Size': `${size}`,
},
}
);
return {
totalSize: response.MediaContainer.totalSize,
@@ -205,7 +197,7 @@ class PlexAPI {
key: string,
options: { includeChildren?: boolean } = {}
): Promise<PlexMetadata> {
const response = await this.plexClient.query<PlexMetadataResponse>(
const response = await this.get<PlexMetadataResponse>(
`/library/metadata/${key}${
options.includeChildren ? '?includeChildren=1' : ''
}`
@@ -215,7 +207,7 @@ class PlexAPI {
}
public async getChildrenMetadata(key: string): Promise<PlexMetadata[]> {
const response = await this.plexClient.query<PlexMetadataResponse>(
const response = await this.get<PlexMetadataResponse>(
`/library/metadata/${key}/children`
);
@@ -229,15 +221,17 @@ class PlexAPI {
},
mediaType: 'movie' | 'show'
): Promise<PlexLibraryItem[]> {
const response = await this.plexClient.query<PlexLibraryResponse>({
uri: `/library/sections/${id}/all?type=${
const response = await this.get<PlexLibraryResponse>(
`/library/sections/${id}/all?type=${
mediaType === 'show' ? '4' : '1'
}&sort=addedAt%3Adesc&addedAt>>=${Math.floor(options.addedAt / 1000)}`,
extraHeaders: {
'X-Plex-Container-Start': `0`,
'X-Plex-Container-Size': `500`,
},
});
{
headers: {
'X-Plex-Container-Start': '0',
'X-Plex-Container-Size': '500',
},
}
);
return response.MediaContainer.Metadata;
}

View File

@@ -21,7 +21,7 @@ export const mapSounds = (sounds: {
({
name,
description,
} as PushoverSound)
}) as PushoverSound
);
class PushoverAPI extends ExternalAPI {

View File

@@ -157,8 +157,8 @@ class RottenTomatoes extends ExternalAPI {
criticsRating: movie.rottenTomatoes.certifiedFresh
? 'Certified Fresh'
: movie.rottenTomatoes.criticsScore >= 60
? 'Fresh'
: 'Rotten',
? 'Fresh'
: 'Rotten',
criticsScore: movie.rottenTomatoes.criticsScore,
audienceRating:
movie.rottenTomatoes.audienceScore >= 60 ? 'Upright' : 'Spilled',

View File

@@ -92,11 +92,13 @@ class ServarrBase<QueueItemAppendT> extends ExternalAPI {
apiKey,
cacheName,
apiName,
timeout = 5000,
}: {
url: string;
apiKey: string;
cacheName: AvailableCacheIds;
apiName: string;
timeout?: number;
}) {
super(
url,
@@ -105,6 +107,7 @@ class ServarrBase<QueueItemAppendT> extends ExternalAPI {
},
{
nodeCache: cacheManager.getCache(cacheName).data,
timeout,
}
);

View File

@@ -64,8 +64,16 @@ export interface RadarrMovie {
}
class RadarrAPI extends ServarrBase<{ movieId: number }> {
constructor({ url, apiKey }: { url: string; apiKey: string }) {
super({ url, apiKey, cacheName: 'radarr', apiName: 'Radarr' });
constructor({
url,
apiKey,
timeout,
}: {
url: string;
apiKey: string;
timeout?: number;
}) {
super({ url, apiKey, cacheName: 'radarr', apiName: 'Radarr', timeout });
}
public getMovies = async (): Promise<RadarrMovie[]> => {

View File

@@ -111,8 +111,16 @@ class SonarrAPI extends ServarrBase<{
episodeId: number;
episode: EpisodeResult;
}> {
constructor({ url, apiKey }: { url: string; apiKey: string }) {
super({ url, apiKey, apiName: 'Sonarr', cacheName: 'sonarr' });
constructor({
url,
apiKey,
timeout,
}: {
url: string;
apiKey: string;
timeout?: number;
}) {
super({ url, apiKey, apiName: 'Sonarr', cacheName: 'sonarr', timeout });
}
public async getSeries(): Promise<SonarrSeries[]> {

View File

@@ -269,8 +269,8 @@ class TautulliAPI {
recordA.grandparent_rating_key && recordB.grandparent_rating_key
? recordA.grandparent_rating_key === recordB.grandparent_rating_key
: recordA.parent_rating_key && recordB.parent_rating_key
? recordA.parent_rating_key === recordB.parent_rating_key
: recordA.rating_key === recordB.rating_key
? recordA.parent_rating_key === recordB.parent_rating_key
: recordA.rating_key === recordB.rating_key
);
start += take;

View File

@@ -536,8 +536,8 @@ class TheMovieDb extends ExternalAPI implements TvShowProvider {
originalLanguage && originalLanguage !== 'all'
? originalLanguage
: originalLanguage === 'all'
? undefined
: this.originalLanguage,
? undefined
: this.originalLanguage,
// Set our release date values, but check if one is set and not the other,
// so we can force a past date or a future date. TMDB Requires both values if one is set!
'primary_release_date.gte':
@@ -630,8 +630,8 @@ class TheMovieDb extends ExternalAPI implements TvShowProvider {
originalLanguage && originalLanguage !== 'all'
? originalLanguage
: originalLanguage === 'all'
? undefined
: this.originalLanguage,
? undefined
: this.originalLanguage,
include_null_first_air_dates: includeEmptyReleaseDate,
with_genres: genre,
with_networks: network,

View File

@@ -392,8 +392,10 @@ export interface TmdbPersonCombinedCredits {
crew: TmdbPersonCreditCrew[];
}
export interface TmdbSeasonWithEpisodes
extends Omit<TmdbTvSeasonResult, 'episode_count'> {
export interface TmdbSeasonWithEpisodes extends Omit<
TmdbTvSeasonResult,
'episode_count'
> {
episodes: TmdbTvEpisodeResult[];
external_ids: TmdbExternalIds;
}

View File

@@ -26,6 +26,7 @@ import { MediaRequest } from './MediaRequest';
import Season from './Season';
@Entity()
@Index(['tmdbId', 'mediaType'])
class Media {
public static async getRelatedMedia(
user: User | undefined,
@@ -101,9 +102,11 @@ class Media {
public imdbId?: string;
@Column({ type: 'int', default: MediaStatus.UNKNOWN })
@Index()
public status: MediaStatus;
@Column({ type: 'int', default: MediaStatus.UNKNOWN })
@Index()
public status4k: MediaStatus;
@OneToMany(() => MediaRequest, (request) => request.media, {

View File

@@ -21,6 +21,7 @@ import {
AfterUpdate,
Column,
Entity,
Index,
ManyToOne,
OneToMany,
PrimaryGeneratedColumn,
@@ -513,6 +514,7 @@ export class MediaRequest {
public id: number;
@Column({ type: 'integer' })
@Index()
public status: MediaRequestStatus;
@ManyToOne(() => Media, (media) => media.requests, {

View File

@@ -5,11 +5,11 @@ import { Watchlist } from '@server/entity/Watchlist';
import type { QuotaResponse } from '@server/interfaces/api/userInterfaces';
import PreparedEmail from '@server/lib/email';
import type { PermissionCheckOptions } from '@server/lib/permissions';
import { hasPermission, Permission } from '@server/lib/permissions';
import { Permission, hasPermission } from '@server/lib/permissions';
import { getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import { AfterDate } from '@server/utils/dateHelpers';
import { DbAwareColumn } from '@server/utils/DbColumnHelper';
import { AfterDate } from '@server/utils/dateHelpers';
import bcrypt from 'bcrypt';
import { randomUUID } from 'crypto';
import path from 'path';
@@ -271,7 +271,7 @@ export class User {
});
const movieQuotaLimit = !canBypass
? this.movieQuotaLimit ?? defaultQuotas.movie.quotaLimit
? (this.movieQuotaLimit ?? defaultQuotas.movie.quotaLimit)
: 0;
const movieQuotaDays = this.movieQuotaDays ?? defaultQuotas.movie.quotaDays;
@@ -295,7 +295,7 @@ export class User {
: 0;
const tvQuotaLimit = !canBypass
? this.tvQuotaLimit ?? defaultQuotas.tv.quotaLimit
? (this.tvQuotaLimit ?? defaultQuotas.tv.quotaLimit)
: 0;
const tvQuotaDays = this.tvQuotaDays ?? defaultQuotas.tv.quotaDays;

View File

@@ -165,12 +165,15 @@ app
try {
const descriptor = Object.getOwnPropertyDescriptor(req, 'ip');
if (descriptor?.writable === true) {
(req as any).ip = getClientIp(req) ?? '';
Object.defineProperty(req, 'ip', {
...descriptor,
value: getClientIp(req) ?? '',
});
}
} catch (e) {
logger.error('Failed to attach the ip to the request', {
label: 'Middleware',
message: e.message,
message: (e as Error).message,
});
} finally {
next();

View File

@@ -7,6 +7,10 @@ export interface RequestResultsResponse extends PaginatedResponse {
profileName?: string;
canRemove?: boolean;
})[];
serviceErrors: {
radarr: { id: number; name: string }[];
sonarr: { id: number; name: string }[];
};
}
export type MediaRequestBody = {

View File

@@ -513,8 +513,8 @@ class AvailabilitySync {
mediaServerType === MediaServerType.PLEX
? 'plex'
: mediaServerType === MediaServerType.JELLYFIN
? 'jellyfin'
: 'emby'
? 'jellyfin'
: 'emby'
} instance. Status will be changed to deleted.`,
{ label: 'AvailabilitySync' }
);
@@ -588,8 +588,8 @@ class AvailabilitySync {
mediaServerType === MediaServerType.PLEX
? 'plex'
: mediaServerType === MediaServerType.JELLYFIN
? 'jellyfin'
: 'emby'
? 'jellyfin'
: 'emby'
} instance. Status will be changed to deleted.`,
{ label: 'AvailabilitySync' }
);
@@ -612,6 +612,13 @@ class AvailabilitySync {
): Promise<boolean> {
let existsInRadarr = false;
const hasSameServerInBothModes = this.radarrServers.some((a) =>
this.radarrServers.some(
(b) =>
a.is4k !== b.is4k && a.hostname === b.hostname && a.port === b.port
)
);
// Check for availability in all of the available radarr servers
// If any find the media, we will assume the media exists
for (const server of this.radarrServers.filter(
@@ -642,7 +649,14 @@ class AvailabilitySync {
radarr?.movieFile?.mediaInfo?.resolution?.split('x');
const is4kMovie =
resolution?.length === 2 && Number(resolution[0]) >= 2000;
existsInRadarr = is4k ? is4kMovie : !is4kMovie;
if (hasSameServerInBothModes && resolution?.length === 2) {
// Same server in both modes then use resolution to distinguish
existsInRadarr = is4k ? is4kMovie : !is4kMovie;
} else {
// One server type and if file exists, count it
existsInRadarr = true;
}
}
} catch (ex) {
if (!ex.message.includes('404')) {
@@ -658,6 +672,8 @@ class AvailabilitySync {
);
}
}
if (existsInRadarr) break;
}
return existsInRadarr;
@@ -816,6 +832,50 @@ class AvailabilitySync {
this.plexSeasonsCache[ratingKey4k] =
await this.plexClient?.getChildrenMetadata(ratingKey4k);
}
if (plexMedia) {
if (ratingKey === ratingKey4k) {
plexMedia = undefined;
}
if (
plexMedia &&
media.mediaType === 'movie' &&
!plexMedia.Media?.some(
(mediaItem) => (mediaItem.width ?? 0) >= 2000
)
) {
plexMedia = undefined;
}
if (plexMedia && media.mediaType === 'tv') {
const cachedSeasons = this.plexSeasonsCache[ratingKey4k];
if (cachedSeasons?.length) {
let has4kInAnySeason = false;
for (const season of cachedSeasons) {
try {
const episodes = await this.plexClient?.getChildrenMetadata(
season.ratingKey
);
const has4kEpisode = episodes?.some((episode) =>
episode.Media?.some(
(mediaItem) => (mediaItem.width ?? 0) >= 2000
)
);
if (has4kEpisode) {
has4kInAnySeason = true;
break;
}
} catch {
// If we can't fetch episodes for a season, continue checking other seasons
}
}
if (!has4kInAnySeason) {
plexMedia = undefined;
}
}
}
}
}
if (plexMedia) {

View File

@@ -2,12 +2,12 @@ import { IssueStatus, IssueTypeName } from '@server/constants/issue';
import { getRepository } from '@server/datasource';
import { User } from '@server/entity/User';
import type { NotificationAgentDiscord } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import { NotificationAgentKey, getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import {
hasNotificationType,
Notification,
hasNotificationType,
shouldSendAdminNotification,
} from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
@@ -209,8 +209,8 @@ class DiscordAgent
? payload.issue
? `${applicationUrl}/issues/${payload.issue.id}`
: payload.media
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
: undefined;
return {

View File

@@ -4,7 +4,7 @@ import { getRepository } from '@server/datasource';
import { User } from '@server/entity/User';
import PreparedEmail from '@server/lib/email';
import type { NotificationAgentEmail } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import { NotificationAgentKey, getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import type { EmailOptions } from 'email-templates';
import path from 'path';

View File

@@ -3,7 +3,7 @@ import type { NotificationAgentGotify } from '@server/lib/settings';
import { getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import { hasNotificationType, Notification } from '..';
import { Notification, hasNotificationType } from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
import { BaseAgent } from './agent';

View File

@@ -3,7 +3,7 @@ import type { NotificationAgentNtfy } from '@server/lib/settings';
import { getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import { hasNotificationType, Notification } from '..';
import { Notification, hasNotificationType } from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
import { BaseAgent } from './agent';

View File

@@ -3,12 +3,12 @@ import { MediaStatus } from '@server/constants/media';
import { getRepository } from '@server/datasource';
import { User } from '@server/entity/User';
import type { NotificationAgentPushbullet } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import { NotificationAgentKey, getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import {
hasNotificationType,
Notification,
hasNotificationType,
shouldSendAdminNotification,
} from '..';
import type { NotificationAgent, NotificationPayload } from './agent';

View File

@@ -3,12 +3,12 @@ import { MediaStatus } from '@server/constants/media';
import { getRepository } from '@server/datasource';
import { User } from '@server/entity/User';
import type { NotificationAgentPushover } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import { NotificationAgentKey, getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import {
hasNotificationType,
Notification,
hasNotificationType,
shouldSendAdminNotification,
} from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
@@ -158,8 +158,8 @@ class PushoverAgent
? payload.issue
? `${applicationUrl}/issues/${payload.issue.id}`
: payload.media
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
: undefined;
const url_title = url
? `View ${payload.issue ? 'Issue' : 'Media'} in ${applicationTitle}`

View File

@@ -3,7 +3,7 @@ import type { NotificationAgentSlack } from '@server/lib/settings';
import { getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import { hasNotificationType, Notification } from '..';
import { Notification, hasNotificationType } from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
import { BaseAgent } from './agent';
@@ -183,8 +183,8 @@ class SlackAgent
? payload.issue
? `${applicationUrl}/issues/${payload.issue.id}`
: payload.media
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
: undefined;
if (url) {

View File

@@ -3,12 +3,12 @@ import { MediaStatus } from '@server/constants/media';
import { getRepository } from '@server/datasource';
import { User } from '@server/entity/User';
import type { NotificationAgentTelegram } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import { NotificationAgentKey, getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import {
hasNotificationType,
Notification,
hasNotificationType,
shouldSendAdminNotification,
} from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
@@ -133,8 +133,8 @@ class TelegramAgent
? payload.issue
? `${applicationUrl}/issues/${payload.issue.id}`
: payload.media
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
? `${applicationUrl}/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined
: undefined;
if (url) {

View File

@@ -5,7 +5,7 @@ import { getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import axios from 'axios';
import { get } from 'lodash';
import { hasNotificationType, Notification } from '..';
import { Notification, hasNotificationType } from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
import { BaseAgent } from './agent';
@@ -122,7 +122,7 @@ class WebhookAgent
`{{${keymapKey}}}`,
typeof keymapValue === 'function'
? keymapValue(payload, type)
: get(payload, keymapValue) ?? ''
: (get(payload, keymapValue) ?? '')
);
});
} else if (finalPayload[key] && typeof finalPayload[key] === 'object') {
@@ -186,8 +186,8 @@ class WebhookAgent
type === Notification.TEST_NOTIFICATION
? 'test'
: typeof keymapValue === 'function'
? keymapValue(payload, type)
: get(payload, keymapValue) || 'test';
? keymapValue(payload, type)
: get(payload, keymapValue) || 'test';
webhookUrl = webhookUrl.replace(
new RegExp(`{{${keymapKey}}}`, 'g'),
encodeURIComponent(variableValue)

View File

@@ -5,7 +5,7 @@ import MediaRequest from '@server/entity/MediaRequest';
import { User } from '@server/entity/User';
import { UserPushSubscription } from '@server/entity/UserPushSubscription';
import type { NotificationAgentConfig } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import { NotificationAgentKey, getSettings } from '@server/lib/settings';
import logger from '@server/logger';
import webpush from 'web-push';
import { Notification, shouldSendAdminNotification } from '..';
@@ -128,8 +128,8 @@ class WebPushAgent
const actionUrl = payload.issue
? `/issues/${payload.issue.id}`
: payload.media
? `/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined;
? `/${payload.media.mediaType}/${payload.media.tmdbId}`
: undefined;
const actionUrlTitle = actionUrl
? `View ${payload.issue ? 'Issue' : 'Media'}`
@@ -260,13 +260,16 @@ class WebPushAgent
shouldSendAdminNotification(type, user, payload)
);
const allSubs = await userPushSubRepository
.createQueryBuilder('pushSub')
.leftJoinAndSelect('pushSub.user', 'user')
.where('pushSub.userId IN (:...users)', {
users: manageUsers.map((user) => user.id),
})
.getMany();
const allSubs =
manageUsers.length > 0
? await userPushSubRepository
.createQueryBuilder('pushSub')
.leftJoinAndSelect('pushSub.user', 'user')
.where('pushSub.userId IN (:...users)', {
users: manageUsers.map((user) => user.id),
})
.getMany()
: [];
// We only want to send the custom notification when type is approved or declined
// Otherwise, default to the normal notification
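
The guard added above avoids building a query whose IN (...) list is empty: with no matching admin users, the previous unconditional query builder would interpolate an empty array, and databases such as SQLite and PostgreSQL reject "IN ()" as a syntax error. A minimal standalone sketch of the same pattern (hypothetical helper name; datasource and entity imports as used elsewhere in this diff):

import { getRepository } from '@server/datasource';
import { UserPushSubscription } from '@server/entity/UserPushSubscription';

// Hypothetical helper: load push subscriptions for a set of user ids,
// short-circuiting so we never emit an "IN ()" clause for an empty list.
const getSubscriptionsForUsers = async (userIds: number[]) => {
  if (userIds.length === 0) {
    return [];
  }
  return getRepository(UserPushSubscription)
    .createQueryBuilder('pushSub')
    .leftJoinAndSelect('pushSub.user', 'user')
    .where('pushSub.userId IN (:...userIds)', { userIds })
    .getMany();
};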

View File

@@ -118,8 +118,8 @@ class BaseScanner<T> {
existing[is4k ? 'status4k' : 'status'] = !processing
? MediaStatus.AVAILABLE
: existing[is4k ? 'status4k' : 'status'] === MediaStatus.DELETED
? MediaStatus.DELETED
: MediaStatus.PROCESSING;
? MediaStatus.DELETED
: MediaStatus.PROCESSING;
if (mediaAddedAt) {
existing.mediaAddedAt = mediaAddedAt;
}
@@ -200,14 +200,14 @@ class BaseScanner<T> {
!is4k && !processing
? MediaStatus.AVAILABLE
: !is4k && processing
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN;
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN;
newMedia.status4k =
is4k && this.enable4kMovie && !processing
? MediaStatus.AVAILABLE
: is4k && this.enable4kMovie && processing
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN;
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN;
newMedia.mediaType = MediaType.MOVIE;
newMedia.serviceId = !is4k ? serviceId : undefined;
newMedia.serviceId4k = is4k ? serviceId : undefined;
@@ -327,17 +327,17 @@ class BaseScanner<T> {
existingSeason.status === MediaStatus.AVAILABLE
? MediaStatus.AVAILABLE
: season.episodes > 0
? MediaStatus.PARTIALLY_AVAILABLE
: !season.is4kOverride &&
season.processing &&
existingSeason.status !== MediaStatus.DELETED
? MediaStatus.PROCESSING
: !season.is4kOverride &&
!season.processing &&
season.episodes === 0 &&
existingSeason.status === MediaStatus.PROCESSING
? MediaStatus.UNKNOWN
: existingSeason.status;
? MediaStatus.PARTIALLY_AVAILABLE
: !season.is4kOverride &&
season.processing &&
existingSeason.status !== MediaStatus.DELETED
? MediaStatus.PROCESSING
: !season.is4kOverride &&
!season.processing &&
season.episodes === 0 &&
existingSeason.status === MediaStatus.PROCESSING
? MediaStatus.UNKNOWN
: existingSeason.status;
// Same thing here, except we only do updates if 4k is enabled
existingSeason.status4k =
@@ -347,17 +347,17 @@ class BaseScanner<T> {
existingSeason.status4k === MediaStatus.AVAILABLE
? MediaStatus.AVAILABLE
: this.enable4kShow && season.episodes4k > 0
? MediaStatus.PARTIALLY_AVAILABLE
: season.is4kOverride &&
season.processing &&
existingSeason.status4k !== MediaStatus.DELETED
? MediaStatus.PROCESSING
: season.is4kOverride &&
!season.processing &&
season.episodes4k === 0 &&
existingSeason.status4k === MediaStatus.PROCESSING
? MediaStatus.UNKNOWN
: existingSeason.status4k;
? MediaStatus.PARTIALLY_AVAILABLE
: season.is4kOverride &&
season.processing &&
existingSeason.status4k !== MediaStatus.DELETED
? MediaStatus.PROCESSING
: season.is4kOverride &&
!season.processing &&
season.episodes4k === 0 &&
existingSeason.status4k === MediaStatus.PROCESSING
? MediaStatus.UNKNOWN
: existingSeason.status4k;
} else {
newSeasons.push(
new Season({
@@ -366,45 +366,25 @@ class BaseScanner<T> {
season.totalEpisodes === season.episodes && season.episodes > 0
? MediaStatus.AVAILABLE
: season.episodes > 0
? MediaStatus.PARTIALLY_AVAILABLE
: !season.is4kOverride && season.processing
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
? MediaStatus.PARTIALLY_AVAILABLE
: !season.is4kOverride && season.processing
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
status4k:
this.enable4kShow &&
season.totalEpisodes === season.episodes4k &&
season.episodes4k > 0
? MediaStatus.AVAILABLE
: this.enable4kShow && season.episodes4k > 0
? MediaStatus.PARTIALLY_AVAILABLE
: season.is4kOverride && season.processing
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
? MediaStatus.PARTIALLY_AVAILABLE
: season.is4kOverride && season.processing
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
})
);
}
}
// We want to skip specials when checking if a show is available
const isAllStandardSeasons =
seasons.length &&
seasons
.filter((season) => season.seasonNumber !== 0)
.every(
(season) =>
season.episodes === season.totalEpisodes && season.episodes > 0
);
const isAll4kSeasons =
seasons.length &&
seasons
.filter((season) => season.seasonNumber !== 0)
.every(
(season) =>
season.episodes4k === season.totalEpisodes &&
season.episodes4k > 0
);
if (media) {
media.seasons = [...media.seasons, ...newSeasons];
@@ -464,62 +444,73 @@ class BaseScanner<T> {
externalServiceSlug;
}
// If the show is already available, and there are no new seasons, don't adjust
// the status. Skip specials when performing availability check
const shouldStayAvailable =
media.status === MediaStatus.AVAILABLE &&
newSeasons.filter(
(season) =>
season.status !== MediaStatus.UNKNOWN &&
season.status !== MediaStatus.DELETED &&
season.seasonNumber !== 0
).length === 0;
const shouldStayAvailable4k =
media.status4k === MediaStatus.AVAILABLE &&
newSeasons.filter(
(season) =>
season.status4k !== MediaStatus.UNKNOWN &&
season.status4k !== MediaStatus.DELETED &&
season.seasonNumber !== 0
).length === 0;
media.status =
isAllStandardSeasons || shouldStayAvailable
? MediaStatus.AVAILABLE
: media.seasons.some(
const nonSpecialSeasons = media.seasons.filter(
(s) => s.seasonNumber !== 0
);
// Check the actual season objects instead of the scanner input
// to determine overall availability status
const isAllStandardSeasonsAvailable =
nonSpecialSeasons.length > 0 &&
nonSpecialSeasons.every((s) => s.status === MediaStatus.AVAILABLE);
const isAll4kSeasonsAvailable =
nonSpecialSeasons.length > 0 &&
nonSpecialSeasons.every((s) => s.status4k === MediaStatus.AVAILABLE);
media.status = isAllStandardSeasonsAvailable
? MediaStatus.AVAILABLE
: media.seasons.some(
(season) =>
season.status === MediaStatus.PARTIALLY_AVAILABLE ||
season.status === MediaStatus.AVAILABLE
)
? MediaStatus.PARTIALLY_AVAILABLE
: (!seasons.length && media.status !== MediaStatus.DELETED) ||
media.seasons.some(
(season) => season.status === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: media.status === MediaStatus.DELETED
? MediaStatus.DELETED
: MediaStatus.UNKNOWN;
media.seasons.some(
(season) => season.status === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: media.status === MediaStatus.DELETED
? MediaStatus.DELETED
: MediaStatus.UNKNOWN;
media.status4k =
(isAll4kSeasons || shouldStayAvailable4k) && this.enable4kShow
isAll4kSeasonsAvailable && this.enable4kShow
? MediaStatus.AVAILABLE
: this.enable4kShow &&
media.seasons.some(
(season) =>
season.status4k === MediaStatus.PARTIALLY_AVAILABLE ||
season.status4k === MediaStatus.AVAILABLE
)
? MediaStatus.PARTIALLY_AVAILABLE
: (!seasons.length && media.status4k !== MediaStatus.DELETED) ||
media.seasons.some(
(season) => season.status4k === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: media.status4k === MediaStatus.DELETED
? MediaStatus.DELETED
: MediaStatus.UNKNOWN;
media.seasons.some(
(season) =>
season.status4k === MediaStatus.PARTIALLY_AVAILABLE ||
season.status4k === MediaStatus.AVAILABLE
)
? MediaStatus.PARTIALLY_AVAILABLE
: (!seasons.length && media.status4k !== MediaStatus.DELETED) ||
media.seasons.some(
(season) => season.status4k === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: media.status4k === MediaStatus.DELETED
? MediaStatus.DELETED
: MediaStatus.UNKNOWN;
await mediaRepository.save(media);
this.log(`Updating existing title: ${title}`);
} else {
// For new media, check actual newSeasons objects instead of scanner
// input to determine overall availability status
const nonSpecialNewSeasons = newSeasons.filter(
(s) => s.seasonNumber !== 0
);
const isAllStandardSeasonsAvailable =
nonSpecialNewSeasons.length > 0 &&
nonSpecialNewSeasons.every((s) => s.status === MediaStatus.AVAILABLE);
const isAll4kSeasonsAvailable =
nonSpecialNewSeasons.length > 0 &&
nonSpecialNewSeasons.every(
(s) => s.status4k === MediaStatus.AVAILABLE
);
const newMedia = new Media({
mediaType: MediaType.TV,
seasons: newSeasons,
@@ -564,34 +555,34 @@ class BaseScanner<T> {
)
? jellyfinMediaId
: undefined,
status: isAllStandardSeasons
status: isAllStandardSeasonsAvailable
? MediaStatus.AVAILABLE
: newSeasons.some(
(season) =>
season.status === MediaStatus.PARTIALLY_AVAILABLE ||
season.status === MediaStatus.AVAILABLE
)
? MediaStatus.PARTIALLY_AVAILABLE
: newSeasons.some(
(season) => season.status === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
status4k:
isAll4kSeasons && this.enable4kShow
? MediaStatus.AVAILABLE
: this.enable4kShow &&
newSeasons.some(
(season) =>
season.status4k === MediaStatus.PARTIALLY_AVAILABLE ||
season.status4k === MediaStatus.AVAILABLE
season.status === MediaStatus.PARTIALLY_AVAILABLE ||
season.status === MediaStatus.AVAILABLE
)
? MediaStatus.PARTIALLY_AVAILABLE
: newSeasons.some(
(season) => season.status4k === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
(season) => season.status === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
status4k:
isAll4kSeasonsAvailable && this.enable4kShow
? MediaStatus.AVAILABLE
: this.enable4kShow &&
newSeasons.some(
(season) =>
season.status4k === MediaStatus.PARTIALLY_AVAILABLE ||
season.status4k === MediaStatus.AVAILABLE
)
? MediaStatus.PARTIALLY_AVAILABLE
: newSeasons.some(
(season) => season.status4k === MediaStatus.PROCESSING
)
? MediaStatus.PROCESSING
: MediaStatus.UNKNOWN,
});
await mediaRepository.save(newMedia);
this.log(`Saved ${title}`);
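
As the new comments above state, the media-level status for shows is now derived from the actual season records (ignoring specials) rather than from whatever subset of seasons the current scan happened to return. Stripped of the DELETED and 4K branches, the derivation reduces to roughly this sketch (hypothetical helper and simplified season shape, not the project's code):

import { MediaStatus } from '@server/constants/media';

// Simplified stand-in for a persisted season record.
interface SeasonLike {
  seasonNumber: number;
  status: MediaStatus;
}

// Derive an overall show status from the persisted seasons, skipping specials
// (season 0), so availability reflects every known season rather than only
// the seasons the current scan returned.
const deriveShowStatus = (seasons: SeasonLike[]): MediaStatus => {
  const regular = seasons.filter((s) => s.seasonNumber !== 0);
  if (
    regular.length > 0 &&
    regular.every((s) => s.status === MediaStatus.AVAILABLE)
  ) {
    return MediaStatus.AVAILABLE;
  }
  if (
    regular.some(
      (s) =>
        s.status === MediaStatus.AVAILABLE ||
        s.status === MediaStatus.PARTIALLY_AVAILABLE
    )
  ) {
    return MediaStatus.PARTIALLY_AVAILABLE;
  }
  if (regular.some((s) => s.status === MediaStatus.PROCESSING)) {
    return MediaStatus.PROCESSING;
  }
  return MediaStatus.UNKNOWN;
};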

View File

@@ -59,12 +59,11 @@ searchProviders.push({
const successfulResponses = responses.filter(
(r) => r.status === 'fulfilled'
) as
| (
| PromiseFulfilledResult<TmdbMovieDetails>
| PromiseFulfilledResult<TmdbTvDetails>
| PromiseFulfilledResult<TmdbPersonDetails>
)[];
) as (
| PromiseFulfilledResult<TmdbMovieDetails>
| PromiseFulfilledResult<TmdbTvDetails>
| PromiseFulfilledResult<TmdbPersonDetails>
)[];
const results: (TmdbMovieResult | TmdbTvResult | TmdbPersonResult)[] = [];
@@ -185,11 +184,10 @@ searchProviders.push({
const successfulResponses = responses.filter(
(r) => r.status === 'fulfilled'
) as
| (
| PromiseFulfilledResult<TmdbSearchMovieResponse>
| PromiseFulfilledResult<TmdbSearchTvResponse>
)[];
) as (
| PromiseFulfilledResult<TmdbSearchMovieResponse>
| PromiseFulfilledResult<TmdbSearchTvResponse>
)[];
const results: (TmdbMovieResult | TmdbTvResult)[] = [];
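
The two hunks above only reshape the "as" assertion around the fulfilled results of Promise.allSettled. For illustration only (not the change made here), the cast can be avoided entirely with a type-predicate filter; this sketch assumes nothing beyond the standard TypeScript lib types:

// Hypothetical narrowing helper: filtering with a type predicate lets the
// compiler infer PromiseFulfilledResult<T>[] without an assertion.
const onlyFulfilled = <T>(
  results: PromiseSettledResult<T>[]
): PromiseFulfilledResult<T>[] =>
  results.filter(
    (r): r is PromiseFulfilledResult<T> => r.status === 'fulfilled'
  );

// Usage sketch with arbitrary promises:
// const settled = await Promise.allSettled([fetchA(), fetchB()]);
// const values = onlyFulfilled(settled).map((r) => r.value);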

View File

@@ -3,6 +3,7 @@ import { MediaStatus, MediaType } from '@server/constants/media';
import { getRepository } from '@server/datasource';
import Media from '@server/entity/Media';
import {
BlacklistedMediaError,
DuplicateMediaRequestError,
MediaRequest,
NoSeasonsAvailableError,
@@ -144,6 +145,9 @@ class WatchlistSync {
errorMessage: e.message,
});
break;
// Blacklisted media should be silently ignored during watchlist sync to avoid spam
case BlacklistedMediaError:
break;
default:
logger.error('Failed to create media request from watchlist', {
label: 'Watchlist Sync',
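
The added case above makes blacklisted watchlist items a silent no-op instead of an error log. Assuming the surrounding switch dispatches on the error's constructor (as the existing cases suggest) and that the error class is exported from the MediaRequest entity module, the handling reduces to roughly this sketch:

import { BlacklistedMediaError } from '@server/entity/MediaRequest'; // import path assumed
import logger from '@server/logger';

// Simplified per-item error handling for a watchlist sync run.
const handleWatchlistError = (e: Error, tmdbId: number) => {
  switch (e.constructor) {
    case BlacklistedMediaError:
      // Blacklisted media is expected to be skipped, so logging it would only add noise.
      break;
    default:
      logger.error('Failed to create media request from watchlist', {
        label: 'Watchlist Sync',
        errorMessage: e.message,
        tmdbId,
      });
  }
};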

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddTelegramMessageThreadId1734786596045
implements MigrationInterface
{
export class AddTelegramMessageThreadId1734786596045 implements MigrationInterface {
name = 'AddTelegramMessageThreadId1734786596045';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUserAvatarCacheFields1743107707465
implements MigrationInterface
{
export class AddUserAvatarCacheFields1743107707465 implements MigrationInterface {
name = 'AddUserAvatarCacheFields1743107707465';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUniqueConstraintToPushSubscription1765233385034
implements MigrationInterface
{
export class AddUniqueConstraintToPushSubscription1765233385034 implements MigrationInterface {
name = 'AddUniqueConstraintToPushSubscription1765233385034';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -0,0 +1,35 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddPerformanceIndexes1770627987304 implements MigrationInterface {
name = 'AddPerformanceIndexes1770627987304';
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(
`CREATE INDEX "IDX_4c696e8ed36ae34fe18abe59d2" ON "media_request" ("status") `
);
await queryRunner.query(
`CREATE INDEX "IDX_c730c2d67f271a372c39a07b7e" ON "media" ("status") `
);
await queryRunner.query(
`CREATE INDEX "IDX_5d6218de4f547909391a5c1347" ON "media" ("status4k") `
);
await queryRunner.query(
`CREATE INDEX "IDX_f8233358694d1677a67899b90a" ON "media" ("tmdbId", "mediaType") `
);
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(
`DROP INDEX "public"."IDX_f8233358694d1677a67899b90a"`
);
await queryRunner.query(
`DROP INDEX "public"."IDX_5d6218de4f547909391a5c1347"`
);
await queryRunner.query(
`DROP INDEX "public"."IDX_c730c2d67f271a372c39a07b7e"`
);
await queryRunner.query(
`DROP INDEX "public"."IDX_4c696e8ed36ae34fe18abe59d2"`
);
}
}
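
The down() above assumes every index still exists when it runs. A more defensive variant, sketched here purely as an illustration (hypothetical class name; IF EXISTS / IF NOT EXISTS are accepted by both SQLite and PostgreSQL), tolerates a missing or already-dropped index:

import type { MigrationInterface, QueryRunner } from 'typeorm';

// Hypothetical defensive variant of an index migration, not the file above.
export class AddMediaStatusIndexDefensive1770627987305
  implements MigrationInterface
{
  name = 'AddMediaStatusIndexDefensive1770627987305';

  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(
      `CREATE INDEX IF NOT EXISTS "IDX_c730c2d67f271a372c39a07b7e" ON "media" ("status")`
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(
      `DROP INDEX IF EXISTS "IDX_c730c2d67f271a372c39a07b7e"`
    );
  }
}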

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUserRequestDeleteCascades1608219049304
implements MigrationInterface
{
export class AddUserRequestDeleteCascades1608219049304 implements MigrationInterface {
name = 'AddUserRequestDeleteCascades1608219049304';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddLastSeasonChangeMedia1608477467935
implements MigrationInterface
{
export class AddLastSeasonChangeMedia1608477467935 implements MigrationInterface {
name = 'AddLastSeasonChangeMedia1608477467935';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class ForceDropImdbUniqueConstraint1608477467935
implements MigrationInterface
{
export class ForceDropImdbUniqueConstraint1608477467935 implements MigrationInterface {
name = 'ForceDropImdbUniqueConstraint1608477467936';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class RemoveTmdbIdUniqueConstraint1609236552057
implements MigrationInterface
{
export class RemoveTmdbIdUniqueConstraint1609236552057 implements MigrationInterface {
name = 'RemoveTmdbIdUniqueConstraint1609236552057';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddMediaAddedFieldToMedia1610522845513
implements MigrationInterface
{
export class AddMediaAddedFieldToMedia1610522845513 implements MigrationInterface {
name = 'AddMediaAddedFieldToMedia1610522845513';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class SonarrRadarrSyncServiceFields1611757511674
implements MigrationInterface
{
export class SonarrRadarrSyncServiceFields1611757511674 implements MigrationInterface {
name = 'SonarrRadarrSyncServiceFields1611757511674';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddResetPasswordGuidAndExpiryDate1612482778137
implements MigrationInterface
{
export class AddResetPasswordGuidAndExpiryDate1612482778137 implements MigrationInterface {
name = 'AddResetPasswordGuidAndExpiryDate1612482778137';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class UpdateUserSettingsRegions1613955393450
implements MigrationInterface
{
export class UpdateUserSettingsRegions1613955393450 implements MigrationInterface {
name = 'UpdateUserSettingsRegions1613955393450';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddTelegramSettingsToUserSettings1614334195680
implements MigrationInterface
{
export class AddTelegramSettingsToUserSettings1614334195680 implements MigrationInterface {
name = 'AddTelegramSettingsToUserSettings1614334195680';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class CreateTagsFieldonMediaRequest1617624225464
implements MigrationInterface
{
export class CreateTagsFieldonMediaRequest1617624225464 implements MigrationInterface {
name = 'CreateTagsFieldonMediaRequest1617624225464';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUserSettingsNotificationAgentsField1617730837489
implements MigrationInterface
{
export class AddUserSettingsNotificationAgentsField1617730837489 implements MigrationInterface {
name = 'AddUserSettingsNotificationAgentsField1617730837489';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class CreateUserPushSubscriptions1618912653565
implements MigrationInterface
{
export class CreateUserPushSubscriptions1618912653565 implements MigrationInterface {
name = 'CreateUserPushSubscriptions1618912653565';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUserSettingsNotificationTypes1619339817343
implements MigrationInterface
{
export class AddUserSettingsNotificationTypes1619339817343 implements MigrationInterface {
name = 'AddUserSettingsNotificationTypes1619339817343';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddPushbulletPushoverUserSettings1635079863457
implements MigrationInterface
{
export class AddPushbulletPushoverUserSettings1635079863457 implements MigrationInterface {
name = 'AddPushbulletPushoverUserSettings1635079863457';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddWatchlistSyncUserSetting1660632269368
implements MigrationInterface
{
export class AddWatchlistSyncUserSetting1660632269368 implements MigrationInterface {
name = 'AddWatchlistSyncUserSetting1660632269368';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddMediaRequestIsAutoRequestedField1660714479373
implements MigrationInterface
{
export class AddMediaRequestIsAutoRequestedField1660714479373 implements MigrationInterface {
name = 'AddMediaRequestIsAutoRequestedField1660714479373';
public async up(queryRunner: QueryRunner): Promise<void> {

View File

@@ -1,8 +1,6 @@
import type { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUserSettingsStreamingRegion1727907530757
implements MigrationInterface
{
export class AddUserSettingsStreamingRegion1727907530757 implements MigrationInterface {
name = 'AddUserSettingsStreamingRegion1727907530757';
public async up(queryRunner: QueryRunner): Promise<void> {

Some files were not shown because too many files have changed in this diff.