Compare commits

...

64 commits

Author SHA1 Message Date
cb70268966 Merge pull request 'Update sonarsource/sonarqube-scan-action action to v2.1.0' (#80) from renovate/sonarsource-sonarqube-scan-action-2.x into master
All checks were successful
check code / check-docs (push) Successful in 8s
check code / check-code-py38 (push) Successful in 20s
check code / check-code-py39 (push) Successful in 28s
check code / check-code-py310 (push) Successful in 26s
check code / check-code-py311 (push) Successful in 19s
check code / scan-code-py311 (push) Successful in 7m22s
run scheduled tests / check-code-py311 (push) Successful in 6m50s
Reviewed-on: #80
2024-06-01 09:39:09 +02:00
98b14838e8 Update sonarsource/sonarqube-scan-action action to v2.1.0
All checks were successful
build package and container / build-pypackage (pull_request) Successful in 15s
check code / check-docs (pull_request) Successful in 6s
create release / release-pypackage (pull_request) Successful in 19s
build package and container / build-container (pull_request) Successful in 4m11s
check code / check-code-py38 (pull_request) Successful in 6m56s
check code / scan-code-py311 (pull_request) Has been skipped
check code / check-code-py39 (pull_request) Successful in 6m52s
check code / check-code-py310 (pull_request) Successful in 6m58s
check code / check-code-py311 (pull_request) Successful in 7m7s
2024-05-23 20:17:07 +02:00
4911f02303 Merge pull request 'Update dependency shellcheck to v0.10.0' (#76) from renovate/shellcheck-0.x into master
All checks were successful
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 1m28s
check code / check-code-py39 (push) Successful in 2m55s
check code / check-code-py310 (push) Successful in 3m40s
check code / scan-code-py311 (push) Successful in 9m41s
check code / check-code-py311 (push) Successful in 21s
run scheduled tests / check-code-py311 (push) Successful in 6m53s
Reviewed-on: #76
2024-04-16 12:40:49 +02:00
5a04db3cd4 Merge pull request 'Update dependency just to v1.25.2' (#75) from renovate/just-1.x into master
Some checks failed
check code / check-code-py310 (push) Blocked by required conditions
check code / check-code-py311 (push) Blocked by required conditions
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 56s
check code / scan-code-py311 (push) Has been cancelled
check code / check-code-py39 (push) Has been cancelled
Reviewed-on: #75
2024-04-16 12:38:39 +02:00
fe8b36c705 Merge pull request 'Update sonarsource/sonarqube-scan-action action to v2.0.2' (#78) from renovate/sonarsource-sonarqube-scan-action-2.x into master
All checks were successful
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 33s
check code / check-code-py39 (push) Successful in 25s
check code / check-code-py310 (push) Successful in 25s
check code / check-code-py311 (push) Successful in 25s
check code / scan-code-py311 (push) Successful in 7m30s
run scheduled tests / check-code-py311 (push) Successful in 7m16s
Reviewed-on: #78
2024-04-08 12:19:20 +02:00
1ceaff2d67 Update sonarsource/sonarqube-scan-action action to v2.0.2
All checks were successful
check code / check-docs (pull_request) Successful in 7s
build package and container / build-pypackage (pull_request) Successful in 18s
create release / release-pypackage (pull_request) Successful in 30s
build package and container / build-container (pull_request) Successful in 3m2s
check code / check-code-py38 (pull_request) Successful in 6m59s
check code / scan-code-py311 (pull_request) Has been skipped
check code / check-code-py39 (pull_request) Successful in 7m14s
check code / check-code-py310 (pull_request) Successful in 7m14s
check code / check-code-py311 (pull_request) Successful in 7m15s
2024-04-04 20:14:07 +02:00
f8422f7670 Update dependency just to v1.25.2
All checks were successful
build package and container / build-pypackage (pull_request) Successful in 13s
check code / check-docs (pull_request) Successful in 12s
create release / release-pypackage (pull_request) Successful in 35s
build package and container / build-container (pull_request) Successful in 3m31s
check code / check-code-py38 (pull_request) Successful in 6m56s
check code / scan-code-py311 (pull_request) Has been skipped
check code / check-code-py39 (pull_request) Successful in 6m48s
check code / check-code-py310 (pull_request) Successful in 6m53s
check code / check-code-py311 (pull_request) Successful in 7m5s
2024-03-11 08:15:46 +01:00
f7fa583735 Update dependency shellcheck to v0.10.0
All checks were successful
build package and container / build-pypackage (pull_request) Successful in 14s
build package and container / build-container (pull_request) Successful in 3m4s
check code / check-docs (pull_request) Successful in 7s
create release / release-pypackage (pull_request) Successful in 29s
check code / check-code-py38 (pull_request) Successful in 6m58s
check code / scan-code-py311 (pull_request) Has been skipped
check code / check-code-py39 (pull_request) Successful in 6m47s
check code / check-code-py310 (pull_request) Successful in 7m3s
check code / check-code-py311 (pull_request) Successful in 6m49s
2024-03-08 08:18:50 +01:00
5afd538dcd fix pyproject coverage
All checks were successful
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 26s
check code / check-code-py39 (push) Successful in 22s
check code / check-code-py310 (push) Successful in 30s
check code / check-code-py311 (push) Successful in 22s
check code / scan-code-py311 (push) Successful in 7m11s
2024-02-21 13:38:37 +01:00
ee8a34b760 update release action
Some checks failed
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 26s
check code / check-code-py39 (push) Successful in 27s
check code / check-code-py310 (push) Successful in 22s
check code / check-code-py311 (push) Successful in 24s
check code / scan-code-py311 (push) Has been cancelled
2024-02-21 13:35:42 +01:00
0dc7e2f60d update ruff settings and fix lint violations 2024-02-21 13:35:42 +01:00
a1bd82778f Merge pull request 'Update dependency shfmt to v3.8.0' (#72) from renovate/shfmt-3.x into master
All checks were successful
check code / check-docs (push) Successful in 7s
check code / check-code-py38 (push) Successful in 22s
check code / check-code-py39 (push) Successful in 26s
check code / check-code-py310 (push) Successful in 30s
check code / check-code-py311 (push) Successful in 24s
check code / scan-code-py311 (push) Successful in 7m2s
Reviewed-on: #72
2024-02-13 08:18:17 +01:00
ae4b796469 Update dependency shfmt to v3.8.0
All checks were successful
build package and container / build-pypackage (pull_request) Successful in 13s
check code / check-docs (pull_request) Successful in 5s
create release / release-pypackage (pull_request) Successful in 22s
build package and container / build-container (pull_request) Successful in 4m0s
check code / check-code-py38 (pull_request) Successful in 6m49s
check code / scan-code-py311 (pull_request) Has been skipped
check code / check-code-py39 (pull_request) Successful in 7m6s
check code / check-code-py310 (pull_request) Successful in 6m51s
check code / check-code-py311 (pull_request) Successful in 6m53s
2024-02-12 20:14:36 +01:00
3950cd1927 Merge pull request 'Update dependency just to v1.24.0' (#73) from renovate/just-1.x into master
All checks were successful
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 20s
check code / check-code-py39 (push) Successful in 19s
check code / check-code-py310 (push) Successful in 19s
check code / check-code-py311 (push) Successful in 24s
check code / scan-code-py311 (push) Successful in 7m12s
Reviewed-on: #73
2024-02-12 14:43:02 +01:00
4a2e90ddda Update dependency just to v1.24.0
All checks were successful
build package and container / build-pypackage (pull_request) Successful in 13s
check code / check-docs (pull_request) Successful in 5s
create release / release-pypackage (pull_request) Successful in 30s
build package and container / build-container (pull_request) Successful in 2m51s
check code / check-code-py38 (pull_request) Successful in 6m48s
check code / scan-code-py311 (pull_request) Has been skipped
check code / check-code-py39 (pull_request) Successful in 7m1s
check code / check-code-py310 (pull_request) Successful in 6m47s
check code / check-code-py311 (pull_request) Successful in 7m10s
2024-02-12 08:16:05 +01:00
e1276b5be9 Fix pytz requirement
All checks were successful
check code / check-docs (push) Successful in 6s
check code / check-code-py38 (push) Successful in 26s
check code / check-code-py39 (push) Successful in 26s
check code / check-code-py310 (push) Successful in 25s
check code / check-code-py311 (push) Successful in 25s
check code / scan-code-py311 (push) Successful in 6m59s
2024-02-02 10:14:46 +01:00
8d652d6732 run ci tests only on PR
All checks were successful
check code / check-docs (push) Successful in 7s
check code / check-code-py38 (push) Successful in 23s
check code / check-code-py39 (push) Successful in 23s
check code / check-code-py310 (push) Successful in 22s
check code / check-code-py311 (push) Successful in 24s
check code / scan-code-py311 (push) Successful in 7m10s
2024-02-01 20:44:48 +01:00
0d0e45f800 run all tests only on PR
Some checks failed
check code / scan-code-py311 (push) Blocked by required conditions
check code / check-code-py39 (push) Blocked by required conditions
check code / check-code-py310 (push) Blocked by required conditions
check code / check-code-py311 (push) Blocked by required conditions
check code / check-docs (push) Successful in 7s
check code / check-code-py38 (push) Has been cancelled
2024-02-01 20:42:32 +01:00
01a56734f2 some ci fixes [skip ci]
All checks were successful
check code / check-docs (push) Successful in 7s
check code / check-code-py38 (push) Successful in 7m6s
check code / check-code-py39 (push) Successful in 7m7s
check code / check-code-py310 (push) Successful in 7m8s
check code / check-code-py311 (push) Successful in 6m52s
check code / scan-code-py311 (push) Successful in 7m2s
2024-02-01 16:02:22 +01:00
ed2dfd414c update badges in readme [skip ci] 2024-02-01 15:59:25 +01:00
85e57aec2e Bump version 2.4.0 → 2.4.1
Some checks failed
check code / check-docs (push) Successful in 7s
check code / scan-code-py311 (push) Successful in 6m55s
check code / check-code-py39 (push) Successful in 6m49s
check code / check-code-py38 (push) Successful in 6m59s
build package and container / build-pypackage (push) Successful in 19s
check code / check-code-py310 (push) Successful in 7m0s
create release / release-pypackage (push) Failing after 36s
build package and container / build-container (push) Successful in 4m38s
check code / check-code-py311 (push) Successful in 6m51s
2024-02-01 15:54:39 +01:00
b66cf11f95 update changelog 2024-02-01 15:54:33 +01:00
4eeaa4f603 add scheduled tests
Some checks failed
check code / check-code-py310 (push) Waiting to run
check code / check-code-py311 (push) Waiting to run
check code / check-docs (push) Successful in 6s
check code / scan-code-py311 (push) Has been cancelled
check code / check-code-py38 (push) Has been cancelled
check code / check-code-py39 (push) Has been cancelled
2024-02-01 15:52:39 +01:00
0e8f3768c2 update changelog 2024-02-01 15:52:39 +01:00
1f73c306bd fix release notes action 2024-02-01 15:52:39 +01:00
b20d442057 Merge pull request 'Release v2.4.0' (#55) from dev into master
Some checks failed
check code / check-code-py310 (push) Waiting to run
check code / check-code-py311 (push) Waiting to run
check code / check-docs (push) Successful in 7s
check code / scan-code-py311 (push) Has been cancelled
check code / check-code-py39 (push) Has been cancelled
check code / check-code-py38 (push) Has been cancelled
Reviewed-on: #55
2024-02-01 15:52:26 +01:00
9b83373450 fix coverage in sonarqube
All checks were successful
check code / scan-code-py311 (pull_request) Has been skipped
build package and container / build-pypackage (pull_request) Successful in 15s
check code / check-docs (pull_request) Successful in 7s
build package and container / build-container (pull_request) Successful in 4m54s
check code / check-code-py38 (pull_request) Successful in 6m51s
check code / check-code-py310 (pull_request) Successful in 6m46s
check code / check-code-py39 (pull_request) Successful in 6m58s
create release / release-pypackage (pull_request) Successful in 25s
check code / check-code-py311 (pull_request) Successful in 7m6s
2024-02-01 15:43:04 +01:00
7160e1b2a5 fix container build 2024-02-01 15:36:53 +01:00
9be6a07052 fix docker baseimage and CI errors
Some checks failed
build package and container / build-container (pull_request) Failing after 16s
check code / check-docs (pull_request) Successful in 6s
build package and container / build-pypackage (pull_request) Successful in 28s
check code / scan-code-py311 (pull_request) Successful in 17s
check code / check-code-py39 (pull_request) Successful in 6m40s
check code / check-code-py38 (pull_request) Successful in 7m2s
check code / check-code-py311 (pull_request) Successful in 6m40s
check code / check-code-py310 (pull_request) Successful in 6m56s
create release / release-pypackage (pull_request) Successful in 26s
2024-02-01 14:48:05 +01:00
9db6bb6f87 fix some CI errors
Some checks failed
build package and container / build-container (pull_request) Failing after 18s
check code / check-docs (pull_request) Successful in 6s
build package and container / build-pypackage (pull_request) Successful in 24s
check code / check-code-py311 (pull_request) Successful in 7m32s
check code / check-code-py39 (pull_request) Successful in 7m13s
check code / check-code-py310 (pull_request) Successful in 7m19s
check code / scan-code-py311 (pull_request) Successful in 18s
create release / release-pypackage (pull_request) Successful in 26s
check code / check-code-py38 (pull_request) Successful in 9m3s
2024-02-01 14:37:21 +01:00
89c7c1e386 Bump version 2.3.1 → 2.4.0
Some checks failed
create release / release-pypackage (push) Has been cancelled
build package and container / build-pypackage (push) Has been cancelled
build package and container / build-container (push) Has been cancelled
build package and container / build-container (pull_request) Failing after 18s
check code / check-docs (pull_request) Failing after 5s
check code / scan-code-py311 (pull_request) Failing after 4s
build package and container / build-pypackage (pull_request) Successful in 28s
check code / check-code-py311 (pull_request) Successful in 7m16s
create release / release-pypackage (pull_request) Successful in 28s
check code / check-code-py39 (pull_request) Successful in 7m39s
check code / check-code-py310 (pull_request) Successful in 7m41s
check code / check-code-py38 (pull_request) Successful in 8m24s
2024-02-01 14:23:23 +01:00
6fda875a48 fix tests and workflows 2024-02-01 14:22:25 +01:00
ea1eab403d update to ruff formatter and fix py3.8 compatibility 2024-02-01 13:59:45 +01:00
45dca15d39 Merge pull request 'Update dependency just to v1.23.0' (#53) from renovate/just-1.x into master
Reviewed-on: #53
2024-01-23 14:22:49 +01:00
9a709cc811 Update dependency just to v1.23.0 2024-01-13 08:10:57 +01:00
7166168850 update renovate path 2023-12-05 13:38:41 +01:00
236222b19b Merge pull request 'Update dependency just to v1.16.0' (#51) from renovate/just-1.x into master
Reviewed-on: #51
2023-11-09 09:27:46 +01:00
553a85b436 Update dependency just to v1.16.0 2023-11-09 08:08:31 +01:00
7a7ace9286 fix pytest fixtures
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-07-02 16:45:33 +02:00
873e6ab0e2 update pyright and some type annotations. also increase line length to 100 chars
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-07-02 16:41:00 +02:00
d7c5bd7d17 Merge pull request 'Update dependency just to v1.14.0' (#44) from renovate/just-1.x into master
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Reviewed-on: #44
2023-06-29 21:21:01 +02:00
d8947df817 Merge pull request 'Update dependency shfmt to v3.7.0' (#46) from renovate/shfmt-3.x into master
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Reviewed-on: #46
2023-06-29 21:20:07 +02:00
2cbd204204 Update dependency shfmt to v3.7.0
Some checks failed
ci/woodpecker/push/tests Pipeline failed
ci/woodpecker/pr/tests Pipeline failed
ci/woodpecker/pr/test_tox_arm64 unknown status
ci/woodpecker/pr/test_docker_amd64 unknown status
ci/woodpecker/pr/test_docker_arm64 unknown status
ci/woodpecker/pr/test_release unknown status
ci/woodpecker/pr/test_tox_amd64 unknown status
2023-06-18 18:06:08 +00:00
ee72e8b6d9 Update dependency just to v1.14.0
Some checks failed
ci/woodpecker/push/tests Pipeline failed
ci/woodpecker/pr/tests Pipeline failed
ci/woodpecker/pr/test_tox_arm64 unknown status
ci/woodpecker/pr/test_docker_amd64 unknown status
ci/woodpecker/pr/test_release unknown status
ci/woodpecker/pr/test_docker_arm64 unknown status
ci/woodpecker/pr/test_tox_amd64 unknown status
2023-06-03 06:05:51 +00:00
29fe262ef7 Merge pull request '[2.3.1] - 2023-03-12' (#41) from dev into master
All checks were successful
ci/woodpecker/tag/tests Pipeline was successful
ci/woodpecker/tag/publish_release Pipeline was successful
ci/woodpecker/tag/publish_docker_arm64 Pipeline was successful
ci/woodpecker/tag/publish_docker_amd64 Pipeline was successful
ci/woodpecker/tag/publish_docker_manifest Pipeline was successful
ci/woodpecker/push/tests Pipeline was successful
Reviewed-on: #41
2023-03-12 04:47:37 +01:00
8173b2a729 fix type annitation for py38
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
ci/woodpecker/pr/tests Pipeline was successful
ci/woodpecker/pr/test_docker_amd64 Pipeline was successful
ci/woodpecker/pr/test_docker_arm64 Pipeline was successful
ci/woodpecker/pr/test_release Pipeline was successful
ci/woodpecker/pr/test_tox_arm64 Pipeline was successful
ci/woodpecker/pr/test_tox_amd64 Pipeline was successful
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-03-12 01:51:51 +01:00
f7eebc2dec fix type hint for py38
Some checks failed
ci/woodpecker/push/tests Pipeline was successful
ci/woodpecker/pr/tests Pipeline was successful
ci/woodpecker/pr/test_docker_amd64 Pipeline was successful
ci/woodpecker/pr/test_docker_arm64 Pipeline was successful
ci/woodpecker/pr/test_release Pipeline was successful
ci/woodpecker/pr/test_tox_arm64 Pipeline failed
ci/woodpecker/pr/test_tox_amd64 Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-03-12 01:04:37 +01:00
987f72715c update release date [CI SKIP]
Some checks failed
ci/woodpecker/pr/tests Pipeline was successful
ci/woodpecker/pr/test_docker_amd64 Pipeline was successful
ci/woodpecker/pr/test_docker_arm64 Pipeline was successful
ci/woodpecker/pr/test_release Pipeline was successful
ci/woodpecker/pr/test_tox_arm64 Pipeline failed
ci/woodpecker/pr/test_tox_amd64 Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-03-12 00:32:37 +01:00
0ada98529a update readme links
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-22 21:41:23 +01:00
3d51869663 update CHANGELOG and bump version
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-22 21:05:25 +01:00
0f9e718e30 update readme
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-21 17:16:49 +01:00
9935c97f6c move custom types to own file
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
2023-02-20 14:38:09 +01:00
bde2b9ebe9 add typed dicts for type hinting
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
2023-02-20 14:03:40 +01:00
e2f0a8b41f fix linter paths
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-19 18:50:56 +01:00
1f244ef2d6 remove mypy
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-19 18:45:53 +01:00
32d5f8a9a1 update readme [CI SKIP]
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 16:36:52 +01:00
a53767bf74 update api template with type annotations [CI SKIP]
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 16:29:44 +01:00
830cfd48bb install deps before pyright
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 16:24:39 +01:00
03461b80bf switch to strict typing with pyright
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 16:21:03 +01:00
ef7a914869 update readme for ruff [CI SKIP]
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 12:52:34 +01:00
a8f4b25802 fix special character in test
All checks were successful
ci/woodpecker/push/tests Pipeline was successful
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 12:44:42 +01:00
b5c5b97b16 fix ci task name
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 12:32:30 +01:00
5e28cb1088 format with ruff
Some checks failed
ci/woodpecker/push/tests Pipeline failed
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 12:29:25 +01:00
2ad0c575a7 switch to ruff and update justfile
Signed-off-by: Ivan Schaller <ivan@schaller.sh>
2023-02-18 12:23:50 +01:00
61 changed files with 1125 additions and 1191 deletions

.envrc

@@ -1 +0,0 @@
use asdf

@@ -0,0 +1,99 @@
name: build package and container
on:
  push:
    tags:
      - "v*.*.*"
  pull_request:
    branches: [main, master]
jobs:
  build-pypackage:
    runs-on: python311
    env:
      HATCH_INDEX_REPO: main
      HATCH_INDEX_USER: __token__
      HATCH_INDEX_AUTH: ${{ secrets.PYPI_TOKEN }}
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: install hatch
        run: pip install -U hatch hatchling
      - name: build package
        run: hatch build --clean
      - name: publish package
        if: gitea.event_name != 'pull_request'
        run: hatch publish --yes --no-prompt
  build-container:
    runs-on: ubuntu-latest
    env:
      REGISTRY: docker.io
      AUTHOR: olofvndrhr
      IMAGE: manga-dlp
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: setup qemu
        uses: docker/setup-qemu-action@v2
      - name: setup docker buildx
        uses: docker/setup-buildx-action@v2
      - name: get container metadata
        uses: docker/metadata-action@v4
        id: metadata
        with:
          images: ${{ env.REGISTRY }}/${{ env.AUTHOR }}/${{ env.IMAGE }}
          flavor: |
            latest=auto
            prefix=
            suffix=
          tags: |
            type=schedule
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}
            type=sha
      - name: login to docker.io container registry
        uses: docker/login-action@v2
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.CR_USERNAME }}
          password: ${{ secrets.CR_PASSWORD }}
      - name: login to private container registry
        uses: docker/login-action@v2
        with:
          registry: git.44net.ch
          username: ${{ secrets.CR_PRIV_USERNAME }}
          password: ${{ secrets.CR_PRIV_PASSWORD }}
      - name: build and push docker image @amd64+arm64
        uses: docker/build-push-action@v4
        with:
          push: ${{ gitea.event_name != 'pull_request' }}
          platforms: linux/amd64,linux/arm64
          context: .
          file: docker/Dockerfile
          provenance: false
          tags: ${{ steps.metadata.outputs.tags }}
          labels: ${{ steps.metadata.outputs.labels }}
      - name: update dockerhub repo description
        uses: peter-evans/dockerhub-description@v3
        if: gitea.event_name != 'pull_request'
        with:
          repository: ${{ env.AUTHOR }}/${{ env.IMAGE }}
          short-description: ${{ github.event.repository.description }}
          enable-url-completion: true
          username: ${{ secrets.CR_USERNAME }}
          password: ${{ secrets.CR_PASSWORD }}
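For reference, the `type=semver` patterns in the metadata step expand a single git tag into several image tags. A minimal sketch of that expansion (`semver_tags` is a hypothetical helper illustrating the pattern, not the action's actual code):

```python
import re

def semver_tags(git_tag):
    # Mimic type=semver,pattern={{version}} / {{major}}.{{minor}} / {{major}}
    # for a tag shaped like "v2.4.1"; non-semver refs produce no semver tags.
    m = re.fullmatch(r"v(\d+)\.(\d+)\.(\d+)", git_tag)
    if not m:
        return []
    major, minor, patch = m.groups()
    return [f"{major}.{minor}.{patch}", f"{major}.{minor}", major]

print(semver_tags("v2.4.1"))  # ['2.4.1', '2.4', '2']
```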

@@ -0,0 +1,122 @@
name: check code
on:
  push:
    branches: [main, master]
  pull_request:
    branches: [main, master]
jobs:
  check-docs:
    runs-on: python311
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: "build docs"
        run: |
          python3 -m pip install mkdocs
          cd docs || exit 1
          mkdocs build --strict
  scan-code-py311:
    runs-on: python311
    if: gitea.event_name != 'pull_request'
    needs: [check-code-py38]
    steps:
      - name: checkout code
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: install hatch
        run: pip install -U hatch
      - name: get coverage (hatch)
        run: hatch run default:cov
      - name: run sonar-scanner
        uses: sonarsource/sonarqube-scan-action@v2.1.0
        env:
          SONAR_HOST_URL: ${{ secrets.SONARQUBE_HOST }}
          SONAR_TOKEN: ${{ secrets.SONARQUBE_TOKEN }}
  check-code-py38:
    runs-on: python38
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: install hatch
        run: pip install -U hatch
      - name: test codestyle
        run: hatch run +py=3.8 lint:style
      - name: test typing
        run: hatch run +py=3.8 lint:typing
      - name: run tests
        if: gitea.event_name == 'pull_request'
        run: hatch run default:test
  check-code-py39:
    runs-on: python39
    needs: [check-code-py38]
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: install hatch
        run: pip install -U hatch
      - name: test codestyle
        run: hatch run +py=3.9 lint:style
      - name: test typing
        run: hatch run +py=3.9 lint:typing
      - name: run tests
        if: gitea.event_name == 'pull_request'
        run: hatch run default:test
  check-code-py310:
    runs-on: python310
    needs: [check-code-py39]
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: install hatch
        run: pip install -U hatch
      - name: test codestyle
        run: hatch run +py=3.10 lint:style
      - name: test typing
        run: hatch run +py=3.10 lint:typing
      - name: run tests
        if: gitea.event_name == 'pull_request'
        run: hatch run default:test
  check-code-py311:
    runs-on: python311
    needs: [check-code-py310]
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: install hatch
        run: pip install -U hatch
      - name: test codestyle
        run: hatch run +py=3.11 lint:style
      - name: test typing
        run: hatch run +py=3.11 lint:typing
      - name: run tests
        if: gitea.event_name == 'pull_request'
        run: hatch run default:test
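The `needs:` entries above chain the per-version jobs sequentially (py38 → py39 → py310 → py311, with the sonar scan gated on py38), which matches the "Blocked by required conditions" statuses seen in the commit list when an early job is cancelled. A small stdlib sketch of that dependency resolution (illustrative only, not how the CI runner schedules jobs):

```python
from graphlib import TopologicalSorter

# "needs" relations as declared in the workflow above: job -> its prerequisites
needs = {
    "check-code-py39": ["check-code-py38"],
    "check-code-py310": ["check-code-py39"],
    "check-code-py311": ["check-code-py310"],
    "scan-code-py311": ["check-code-py38"],
}
# static_order() yields every prerequisite before the jobs that need it
order = list(TopologicalSorter(needs).static_order())
print(order)
```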

@@ -0,0 +1,56 @@
name: create release
on:
  push:
    tags:
      - "v*.*.*"
  pull_request:
    branches: [main, master]
jobs:
  release-pypackage:
    runs-on: python311
    env:
      HATCH_INDEX_REPO: main
      HATCH_INDEX_USER: __token__
      HATCH_INDEX_AUTH: ${{ secrets.PYPI_TOKEN }}
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: setup go
        uses: actions/setup-go@v4
        with:
          go-version: '>=1.20'
      - name: install hatch
        run: pip install -U hatch hatchling
      - name: build package
        run: hatch build --clean
      - name: get release notes
        id: release-notes
        uses: olofvndrhr/releasenote-gen@v1
      - name: create gitea release
        uses: https://gitea.com/actions/release-action@main
        if: gitea.event_name != 'pull_request'
        with:
          title: ${{ gitea.ref_name }}
          body: ${{ steps.release-notes.outputs.releasenotes }}
          files: |-
            dist/**
      - name: create github release
        uses: ncipollo/release-action@v1
        if: gitea.event_name != 'pull_request'
        with:
          token: ${{ secrets.GH_TOKEN }}
          owner: olofvndrhr
          repo: manga-dlp
          name: ${{ gitea.ref_name }}
          body: ${{ steps.release-notes.outputs.releasenotes }}
          artifacts: |-
            dist/**

@@ -0,0 +1,18 @@
name: run scheduled tests
on:
  schedule:
    - cron: "0 20 * * 6"
jobs:
  check-code-py311:
    runs-on: python311
    steps:
      - name: checkout code
        uses: actions/checkout@v3
      - name: install hatch
        run: pip install -U hatch
      - name: run tests
        run: hatch run default:test
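The `0 20 * * 6` schedule fires at 20:00 every Saturday (assuming standard five-field cron semantics, with day-of-week 0 = Sunday). A small stdlib sketch checking when the next run lands (`next_run` is a hypothetical helper, not part of the repo):

```python
from datetime import datetime, timedelta

def matches_cron(dt, minute, hour, dow):
    # cron day-of-week counts Sunday as 0; Python's weekday() counts Monday as 0
    cron_dow = (dt.weekday() + 1) % 7
    return dt.minute == minute and dt.hour == hour and cron_dow == dow

def next_run(after, minute=0, hour=20, dow=6):
    # scan forward minute by minute until all fields of "0 20 * * 6" match
    dt = after.replace(second=0, microsecond=0) + timedelta(minutes=1)
    while not matches_cron(dt, minute, hour, dow):
        dt += timedelta(minutes=1)
    return dt

print(next_run(datetime(2024, 2, 1, 12, 0)))  # 2024-02-03 20:00 (a Saturday)
```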

@@ -1,5 +1,4 @@
python 3.9.13 3.10.5 3.8.13
shellcheck 0.9.0
shfmt 3.6.0
direnv 2.32.2
just 1.13.0
shellcheck 0.10.0
shfmt 3.8.0
just 1.25.2
lefthook 1.4.6

@@ -1,36 +0,0 @@
#########################################
# build and publish docker images amd64 #
#########################################
# branch: master
# event: tag
platform: linux/amd64
depends_on:
  - tests
clone:
  git:
    image: woodpeckerci/plugin-git:v1.6.0
    when:
      event: tag
pipeline:
  # build and publish docker image for amd64 - x86
  build-amd64:
    image: plugins/docker
    pull: true
    when:
      event: tag
    settings:
      repo: olofvndrhr/manga-dlp
      platforms: linux/amd64
      dockerfile: docker/Dockerfile.amd64
      auto_tag: true
      auto_tag_suffix: linux-amd64
      build_args: BUILD_VERSION=${CI_COMMIT_TAG}
      username:
        from_secret: cr-dhub-username
      password:
        from_secret: cr-dhub-key

@@ -1,36 +0,0 @@
#########################################
# build and publish docker images arm64 #
#########################################
# branch: master
# event: tag
platform: linux/arm64
depends_on:
  - tests
clone:
  git:
    image: woodpeckerci/plugin-git:v1.6.0
    when:
      event: tag
pipeline:
  # build and publish docker image for arm64
  build-arm64:
    image: plugins/docker
    pull: true
    when:
      event: tag
    settings:
      repo: olofvndrhr/manga-dlp
      platforms: linux/arm64
      dockerfile: docker/Dockerfile.arm64
      auto_tag: true
      auto_tag_suffix: linux-arm64
      build_args: BUILD_VERSION=${CI_COMMIT_TAG}
      username:
        from_secret: cr-dhub-username
      password:
        from_secret: cr-dhub-key

@@ -1,36 +0,0 @@
###########################
# publish docker manifest #
###########################
# branch: master
# event: tag
platform: linux/amd64
depends_on:
  - publish_docker_amd64
  - publish_docker_arm64
clone:
  git:
    image: woodpeckerci/plugin-git:v1.6.0
    when:
      event: tag
      tag: "*[!-dev]"
pipeline:
  # publish docker manifest for automatic multi arch pulls
  publish-manifest:
    image: plugins/manifest
    pull: true
    when:
      event: tag
      tag: "*[!-dev]"
    settings:
      spec: docker/manifest.tmpl
      auto_tag: true
      ignore_missing: true
      username:
        from_secret: cr-dhub-username
      password:
        from_secret: cr-dhub-key

@@ -1,77 +0,0 @@
###################
# publish release #
###################
# branch: master
# event: tag
platform: linux/amd64
depends_on:
  - tests
clone:
  git:
    image: woodpeckerci/plugin-git:v1.6.0
    when:
      event: tag
pipeline:
  # build wheel and dist
  build-pypi:
    image: cr.44net.ch/ci-plugins/tests
    pull: true
    when:
      event: tag
    commands:
      - python3 -m hatch build --clean
  # create release-notes
  create-release-notes:
    image: cr.44net.ch/baseimages/debian-base
    pull: true
    when:
      event: tag
    commands:
      - bash get_release_notes.sh ${CI_COMMIT_TAG%%-dev}
  # publish release on github (github.com/olofvndrhr/manga-dlp)
  publish-release-github:
    image: woodpeckerci/plugin-github-release
    pull: true
    when:
      event: tag
    settings:
      api_key:
        from_secret: github-olofvndrhr-token
      files: dist/*
      title: ${CI_COMMIT_TAG}
      note: RELEASENOTES.md
  # publish release on gitea (git.44net.ch/olofvndrhr/manga-dlp)
  publish-release-gitea:
    image: woodpeckerci/plugin-gitea-release
    pull: true
    when:
      event: tag
    settings:
      api_key:
        from_secret: gitea-olofvndrhr-token
      base_url: https://git.44net.ch
      files: dist/*
      title: ${CI_COMMIT_TAG}
      note: RELEASENOTES.md
  # release pypi
  release-pypi:
    image: cr.44net.ch/ci-plugins/tests
    pull: true
    when:
      event: tag
    secrets:
      - source: pypi_username
        target: HATCH_INDEX_USER
      - source: pypi_token
        target: HATCH_INDEX_AUTH
    commands:
      - python3 -m hatch publish --no-prompt --yes

@@ -1,35 +0,0 @@
##################################
# test build docker images amd64 #
##################################
# branch: master
# event: pull_request
platform: linux/amd64
depends_on:
  - tests
clone:
  git:
    image: woodpeckerci/plugin-git:v1.6.0
    when:
      branch: master
      event: pull_request
pipeline:
  # build docker image for amd64 - x86
  test-build-amd64:
    image: plugins/docker
    pull: true
    when:
      branch: master
      event: pull_request
    settings:
      dry_run: true
      repo: olofvndrhr/manga-dlp
      platforms: linux/amd64
      dockerfile: docker/Dockerfile.amd64
      auto_tag: true
      auto_tag_suffix: linux-amd64-test
      build_args: BUILD_VERSION=test

@@ -1,35 +0,0 @@
##################################
# test build docker images arm64 #
##################################
# branch: master
# event: pull_request
platform: linux/arm64
depends_on:
  - tests
clone:
  git:
    image: woodpeckerci/plugin-git:v1.6.0
    when:
      branch: master
      event: pull_request
pipeline:
  # build docker image for arm64
  test-build-arm64:
    image: plugins/docker
    pull: true
    when:
      branch: master
      event: pull_request
    settings:
      dry_run: true
      repo: olofvndrhr/manga-dlp
      platforms: linux/arm64
      dockerfile: docker/Dockerfile.arm64
      auto_tag: true
      auto_tag_suffix: linux-arm64-test
      build_args: BUILD_VERSION=test

@@ -1,40 +0,0 @@
################
# test release #
################
# branch: master
# event: pull_request
platform: linux/amd64
depends_on:
- tests
clone:
git:
image: woodpeckerci/plugin-git:v1.6.0
when:
branch: master
event: pull_request
pipeline:
# build wheel and dist
test-build-pypi:
image: cr.44net.ch/ci-plugins/tests
pull: true
when:
branch: master
event: pull_request
commands:
- python3 -m hatch build --clean
# create release-notes
test-create-release-notes:
image: cr.44net.ch/baseimages/debian-base
pull: true
when:
branch: master
event: pull_request
commands:
- bash get_release_notes.sh latest
- cat RELEASENOTES.md

@@ -1,29 +0,0 @@
##################
# test tox amd64 #
##################
# branch: master
# event: pull_request
platform: linux/amd64
depends_on:
- tests
clone:
git:
image: woodpeckerci/plugin-git:v1.6.0
when:
branch: master
event: pull_request
pipeline:
# test code with different python versions - amd64
test-tox-amd64:
image: cr.44net.ch/ci-plugins/multipy
pull: true
when:
branch: master
event: pull_request
commands:
- python3 -m tox

@@ -1,32 +0,0 @@
##################
# test tox arm64 #
##################
# branch: master
# event: pull_request
platform: linux/arm64
depends_on:
- tests
clone:
git:
image: woodpeckerci/plugin-git:v1.6.0
when:
branch: master
event: pull_request
pipeline:
# test code with different python versions - arm64
test-tox-arm64:
image: cr.44net.ch/ci-plugins/multipy
pull: true
when:
branch: master
event: pull_request
commands:
- grep -v img2pdf contrib/requirements_dev.txt > contrib/requirements_dev_arm64.txt
- rm -f contrib/requirements_dev.txt
- mv contrib/requirements_dev_arm64.txt contrib/requirements_dev.txt
- python3 -m tox

@@ -1,105 +0,0 @@
##############################
# code testing and analysis #
#############################
# branch: all
# event: all
platform: linux/amd64
clone:
git:
image: woodpeckerci/plugin-git:v1.6.0
pipeline:
# check code style - shell
test-shfmt:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- shfmt -d -i 4 -bn -ci -sr .
# check code style - python
test-black:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m black --check --diff .
# check imports - python
test-isort:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m isort --check-only --diff .
# check unused and missing imports - python
test-autoflake:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m autoflake --remove-all-unused-imports -r -v mangadlp/
- python3 -m autoflake --check --remove-all-unused-imports -r -v mangadlp/
# check static typing - python
test-mypy:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m mypy --install-types --non-interactive mangadlp/
# mccabe, pycodestyle, pyflakes tests - python
test-pylama:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m pylama mangadlp/
# pylint test - python
test-pylint:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m pip install -r requirements.txt
- python3 -m pylint --fail-under 9 mangadlp/
# test mkdocs generation
test-mkdocs:
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m pip install mkdocs
- cd docs || exit 1
- python3 -m mkdocs build --strict
# test code with different python versions - python
test-tox-pytest:
when:
event: [ push ]
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m tox -e basic
# generate coverage report - python
test-tox-coverage:
when:
branch: master
event: [ pull_request ]
image: cr.44net.ch/ci-plugins/tests
pull: true
commands:
- python3 -m tox -e coverage
# analyse code with sonarqube and upload it
sonarqube-analysis:
when:
branch: master
event: [ pull_request ]
image: cr.44net.ch/ci-plugins/sonar-scanner
pull: true
settings:
sonar_host: https://sonarqube.44net.ch
sonar_token:
from_secret: sq-44net-token
usingProperties: true

@@ -9,6 +9,40 @@ to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
- Add support for more sites
## [2.4.1] - 2024-02-01
- same as 2.4.0
## [2.4.0] - 2024-02-01
### Fixed
- Some issues with Python3.8 compatibility
### Changed
- Moved build system from woodpecker-ci to gitea actions
- Updated some dependencies
- Updated the docker image
- Switched from formatter/linter `black` to `ruff`
- Switched typing from `pyright` to `mypy`
## [2.3.1] - 2023-03-12
### Added
- Added TypedDicts for type checkers and type annotation
### Fixed
- Fixed some typos in the README
### Changed
- Switched from pylint/pylama/isort/autoflake to ruff
- Switched from mypy to pyright and added strict type checking
- Updated the api template
## [2.3.0] - 2023-02-15
### Added

@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2021-2023 Ivan Schaller
Copyright (c) 2021-present Ivan Schaller <ivan@schaller.sh>
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

@@ -1,13 +1 @@
include *.json
include *.md
include *.properties
include *.py
include *.txt
include *.yml
include *.xml
recursive-include contrib *.py
recursive-include mangadlp *.py
recursive-include mangadlp *.xml
recursive-include tests *.py
recursive-include tests *.xml
recursive-include tests *.txt
graft src

@@ -2,48 +2,54 @@
> Full docs: https://manga-dlp.ivn.sh
CI/CD
[![status-badge](https://img.shields.io/drone/build/olofvndrhr/manga-dlp?label=tests&server=https%3A%2F%2Fci.44net.ch)](https://ci.44net.ch/olofvndrhr/manga-dlp)
[![Last Release](https://img.shields.io/github/release-date/olofvndrhr/manga-DLP?label=last%20release)](https://github.com/olofvndrhr/manga-dlp/releases)
[![Version](https://img.shields.io/github/v/release/olofvndrhr/manga-dlp?label=git%20release)](https://github.com/olofvndrhr/manga-dlp/releases)
[![Version PyPi](https://img.shields.io/pypi/v/manga-dlp?label=pypi%20release)](https://pypi.org/project/manga-dlp/)
Code Analysis
[![Quality Gate Status](https://sonarqube.44net.ch/api/project_badges/measure?project=olofvndrhr%3Amanga-dlp&metric=alert_status&token=f9558470580eea5b4899cf33f190eee16011346d)](https://sonarqube.44net.ch/dashboard?id=olofvndrhr%3Amanga-dlp)
[![Coverage](https://sonarqube.44net.ch/api/project_badges/measure?project=olofvndrhr%3Amanga-dlp&metric=coverage&token=f9558470580eea5b4899cf33f190eee16011346d)](https://sonarqube.44net.ch/dashboard?id=olofvndrhr%3Amanga-dlp)
[![Bugs](https://sonarqube.44net.ch/api/project_badges/measure?project=olofvndrhr%3Amanga-dlp&metric=bugs&token=f9558470580eea5b4899cf33f190eee16011346d)](https://sonarqube.44net.ch/dashboard?id=olofvndrhr%3Amanga-dlp)
[![Security](https://img.shields.io/snyk/vulnerabilities/github/olofvndrhr/manga-dlp)](https://app.snyk.io/org/olofvndrhr-t6h/project/aae9609d-a4e4-41f8-b1ac-f2561b2ad4e3)
[![Maintainability Rating](https://sonarqube.44net.ch/api/project_badges/measure?project=olofvndrhr%3Amanga-dlp&metric=sqale_rating&token=f9558470580eea5b4899cf33f190eee16011346d)](https://sonarqube.44net.ch/dashboard?id=olofvndrhr%3Amanga-dlp)
[![Reliability Rating](https://sonarqube.44net.ch/api/project_badges/measure?project=olofvndrhr%3Amanga-dlp&metric=reliability_rating&token=f9558470580eea5b4899cf33f190eee16011346d)](https://sonarqube.44net.ch/dashboard?id=olofvndrhr%3Amanga-dlp)
[![Security Rating](https://sonarqube.44net.ch/api/project_badges/measure?project=olofvndrhr%3Amanga-dlp&metric=security_rating&token=f9558470580eea5b4899cf33f190eee16011346d)](https://sonarqube.44net.ch/dashboard?id=olofvndrhr%3Amanga-dlp)
Meta
[![Code style](https://img.shields.io/badge/code%20style-black-black)](https://github.com/psf/black)
[![Linter](https://img.shields.io/badge/linter-pylint-yellowgreen)](https://pylint.pycqa.org/en/latest/)
[![Formatter](https://img.shields.io/badge/code%20style-ruff-black)](https://github.com/charliermarsh/ruff)
[![Linter](https://img.shields.io/badge/linter-ruff-red)](https://github.com/charliermarsh/ruff)
[![Types](https://img.shields.io/badge/types-mypy-blue)](https://github.com/python/mypy)
[![Imports](https://img.shields.io/badge/imports-isort-ef8336.svg)](https://github.com/pycqa/isort)
[![Tests](https://img.shields.io/badge/tests-pytest%20%7C%20tox-yellow)](https://github.com/pytest-dev/pytest/)
[![Coverage](https://img.shields.io/badge/coverage-coveragepy-green)](https://github.com/nedbat/coveragepy)
[![License](https://img.shields.io/badge/license-MIT-9400d3.svg)](https://snyk.io/learn/what-is-mit-license/)
[![Compatibility](https://img.shields.io/pypi/pyversions/manga-dlp)](https://pypi.org/project/manga-dlp/)
[![Compatibility](https://img.shields.io/badge/python-3.11-blue)]()
---
## Description
A manga download script written in python. It only supports [mangadex.org](https://mangadex.org/) for now. But support
for other sites is planned.
for other sites is _planned™_.
Before downloading a new chapter, the script always checks if there is already a chapter with the same name in the
download directory. If found the chapter is skipped. So you can run the script on a schedule to only download new
chapters without any additional setup.
The default behaviour is to pack the images to a [cbz archive](https://en.wikipedia.org/wiki/Comic_book_archive). If
you just want the folder with all the pictures use the flag `--nocbz`.
you just want the folder with all the pictures use the flag `--format ""`.
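The skip-if-present behaviour described above boils down to an existence check before each download. A minimal sketch — the helper name and file layout are illustrative, not manga-dlp's actual internals:

```python
from pathlib import Path


def chapter_exists(download_dir: str, chapter_name: str) -> bool:
    # skip a chapter if an archive or an image folder with the
    # same name is already present in the download directory
    archive = Path(download_dir) / f"{chapter_name}.cbz"
    folder = Path(download_dir) / chapter_name
    return archive.exists() or folder.is_dir()
```

Because the check is purely name-based, rerunning the script on a schedule only fetches chapters that are not on disk yet.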
## _Currently_ Supported sites
- [Mangadex.org](https://mangadex.org/)
## Features (not complete)
- Metadata support with [ComicInfo.xml](https://anansi-project.github.io/docs/comicinfo/intro)
- Json caching
- Custom hooks after/before each download
- Custom chapter name format
- Volume support
- Multiple archive formats supported (cbz,cbr,zip,none)
- Language selection
- Download all chapters directly
- And others...
## Usage
### Quick start
@@ -124,18 +130,18 @@ verbosity: [mutually_exclusive]
For suggestions for improvement, just open a pull request.
If you want to add support for a new site, there is an api [template file](./contrib/api_template.py) which you can use.
And more infos and tools in the contrib [README.md](contrib/README.md)
If you want to add support for a new site, there is an api [template file](contrib/api_template.py) which you can use.
And more infos and tools are in the contrib [README.md](contrib/README.md)
Otherwise, you can open am issue with the name of the site which you want support for. (not guaranteed to be
implemented)
Otherwise, you can open an issue with the name of the site which you want support for (not guaranteed to be
implemented).
If you encounter any bugs, also just open an issue with a description of the problem.
## TODO's
- <del>Make docker container for easy distribution</del>
--> [Dockerhub](https://hub.docker.com/repository/docker/olofvndrhr/manga-dlp)
--> [Dockerhub](https://hub.docker.com/r/olofvndrhr/manga-dlp)
- <del>Automate release</del>
--> Done with woodpecker-ci
- <del>Make pypi package</del>

@@ -1,9 +1,15 @@
from typing import Dict, List
from mangadlp.models import ChapterData, ComicInfo
# api template for manga-dlp
class YourAPI:
"""Your API Class.
Get infos for a manga from example.org
Get infos for a manga from example.org.
Args:
url_uuid (str): URL or UUID of the manga
@@ -22,10 +28,8 @@
api_base_url = "https://api.mangadex.org"
img_base_url = "https://uploads.mangadex.org"
def __init__(self, url_uuid, language, forcevol):
"""
get infos to initiate class
"""
def __init__(self, url_uuid: str, language: str, forcevol: bool):
"""get infos to initiate class."""
self.api_name = "Your API Name"
self.url_uuid = url_uuid
@@ -36,22 +40,24 @@
self.manga_uuid = "abc"
self.manga_title = "abc"
self.chapter_list = ["1", "2", "2.1", "5", "10"]
self.manga_chapter_data = { # example data
self.manga_chapter_data: Dict[str, ChapterData] = { # example data
"1": {
"uuid": "abc",
"volume": "1",
"chapter": "1",
"name": "test",
"pages": 2,
},
"2": {
"uuid": "abc",
"volume": "1",
"chapter": "2",
"name": "test",
"pages": 45,
},
}
# or with --forcevol
self.manga_chapter_data = {
self.manga_chapter_data: Dict[str, ChapterData] = {
"1:1": {
"uuid": "abc",
"volume": "1",
@@ -66,9 +72,8 @@
},
}
def get_chapter_images(chapter: str, download_wait: float) -> list:
"""
Get chapter images as a list (full links)
def get_chapter_images(self, chapter: str, wait_time: float) -> List[str]:
"""Get chapter images as a list (full links).
Args:
chapter: The chapter number (chapter data index)
@@ -77,7 +82,6 @@
Returns:
The list of urls of the page images
"""
# example
return [
"https://abc.def/image/123.png",
@@ -85,10 +89,10 @@
"https://abc.def/image/12345.png",
]
def create_metadata(self, chapter: str) -> dict:
"""
Get metadata with correct keys for ComicInfo.xml
Provide as much metadata as possible. empty/false values will be ignored
def create_metadata(self, chapter: str) -> ComicInfo:
"""Get metadata with correct keys for ComicInfo.xml.
Provide as much metadata as possible. empty/false values will be ignored.
Args:
chapter: The chapter number (chapter data index)
@@ -96,7 +100,6 @@
Returns:
The metadata as a dict
"""
# metadata types. have to be valid
# {key: (type, default value, valid values)}
{
@@ -155,7 +158,7 @@
# example
return {
"Volume": "abc",
"Volume": 1,
"LanguageISO": "en",
"Title": "test",
}
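The `ComicInfo` return type imported at the top of the template is a TypedDict. A plausible sketch of such a model — the field subset is assumed from the example return value, and the real `mangadlp.models.ComicInfo` may define more keys:

```python
from typing import Optional, TypedDict  # TypedDict: Python >= 3.8


class ComicInfo(TypedDict, total=False):
    # total=False: every key is optional; empty/false values are ignored
    Title: Optional[str]
    Volume: Optional[int]
    LanguageISO: Optional[str]


# mirrors the example return value of create_metadata()
metadata: ComicInfo = {"Volume": 1, "LanguageISO": "en", "Title": "test"}
```

Type checkers can then verify that `create_metadata()` only returns keys and value types that ComicInfo.xml understands.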

@@ -14,9 +14,7 @@ hatchling>=1.11.0
pytest>=7.0.0
coverage>=6.3.1
black>=22.1.0
isort>=5.10.0
pylint>=2.13.0
mypy>=0.940
tox>=3.24.5
autoflake>=1.4
pylama>=8.3.8
ruff>=0.0.247
pyright>=1.1.294

docker/Dockerfile (new file)
@@ -0,0 +1,39 @@
FROM git.44net.ch/44net/python311:11 AS builder
COPY pyproject.toml README.md /build/
COPY src /build/src
WORKDIR /build
RUN \
echo "**** building package ****" \
&& pip3 install hatch hatchling \
&& python3 -m hatch build --clean
FROM git.44net.ch/44net/debian-s6:11
LABEL maintainer="Ivan Schaller" \
description="A CLI manga downloader"
ENV PATH="/opt/python3/bin:${PATH}"
COPY --from=builder /opt/python3 /opt/python3
COPY --from=builder /build/dist/*.whl /build/dist/
COPY docker/rootfs /
RUN \
echo "**** creating folders ****" \
&& mkdir -p /app \
&& echo "**** updating pip ****" \
&& python3 -m pip install --upgrade pip setuptools wheel \
&& echo "**** install python packages ****" \
&& python3 -m pip install /build/dist/*.whl
RUN \
echo "**** cleanup ****" \
&& apt-get purge --auto-remove -y \
&& apt-get clean \
&& rm -rf \
/tmp/* \
/var/lib/apt/lists/* \
/var/tmp/*
WORKDIR /app

@@ -1,50 +0,0 @@
FROM cr.44net.ch/baseimages/debian-s6:11.6-linux-amd64
# set version label
ARG BUILD_VERSION
ENV IMAGE_VERSION=${BUILD_VERSION}
LABEL version="${BUILD_VERSION}"
LABEL maintainer="Ivan Schaller"
LABEL description="A CLI manga downloader"
# install packages
RUN \
echo "**** install base packages ****" \
&& apt-get update \
&& apt-get install -y --no-install-recommends \
python3 \
python3-pip
# prepare app
RUN \
echo "**** creating folders ****" \
&& mkdir -p /app \
&& echo "**** updating pip ****" \
&& python3 -m pip install --upgrade pip
# cleanup installation
RUN \
echo "**** cleanup ****" \
&& apt-get purge --auto-remove -y \
&& apt-get clean \
&& rm -rf \
/tmp/* \
/var/lib/apt/lists/* \
/var/tmp/*
# copy files to container
COPY docker/rootfs /
COPY mangadlp/ /app/mangadlp/
COPY \
manga-dlp.py \
requirements.txt \
LICENSE \
/app/
# install requirements
RUN pip install -r /app/requirements.txt
WORKDIR /app

@@ -1,52 +0,0 @@
FROM cr.44net.ch/baseimages/debian-s6:11.6-linux-arm64
# set version label
ARG BUILD_VERSION
ENV IMAGE_VERSION=${BUILD_VERSION}
LABEL version="${BUILD_VERSION}"
LABEL maintainer="Ivan Schaller"
LABEL description="A CLI manga downloader"
# install packages
RUN \
echo "**** install base packages ****" \
&& apt-get update \
&& apt-get install -y --no-install-recommends \
python3 \
python3-pip
# prepare app
RUN \
echo "**** creating folders ****" \
&& mkdir -p /app \
&& echo "**** updating pip ****" \
&& python3 -m pip install --upgrade pip
# cleanup installation
RUN \
echo "**** cleanup ****" \
&& apt-get purge --auto-remove -y \
&& apt-get clean \
&& rm -rf \
/tmp/* \
/var/lib/apt/lists/* \
/var/tmp/*
# copy files to container
COPY docker/rootfs /
COPY mangadlp/ /app/mangadlp/
COPY \
manga-dlp.py \
requirements.txt \
LICENSE \
/app/
# install requirements (without img2pdf)
RUN grep -v img2pdf /app/requirements.txt > /app/requirements-arm64.txt
RUN pip install -r /app/requirements-arm64.txt
WORKDIR /app

@@ -1,20 +0,0 @@
image: olofvndrhr/manga-dlp:{{#if build.tag}}{{trimPrefix "v" build.tag}}{{else}}dev{{/if}}
{{#if build.tags}}
tags:
{{#each build.tags}}
- {{this}}
{{/each}}
- "latest"
{{/if}}
manifests:
-
image: olofvndrhr/manga-dlp:{{#if build.tag}}{{trimPrefix "v" build.tag}}-{{else}}dev-{{/if}}linux-amd64
platform:
architecture: amd64
os: linux
-
image: olofvndrhr/manga-dlp:{{#if build.tag}}{{trimPrefix "v" build.tag}}-{{else}}dev-{{/if}}linux-arm64
platform:
architecture: arm64
os: linux
variant: v8

@@ -8,4 +8,3 @@ PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# "s6-setuidgid abc" is used to set the permissions
0 12 * * * root s6-setuidgid abc /app/schedules/daily.sh > /proc/1/fd/1 2>&1

@@ -17,30 +17,42 @@ Code Analysis
Meta
[![Code style](https://img.shields.io/badge/code%20style-black-black)](https://github.com/psf/black)
[![Linter](https://img.shields.io/badge/linter-pylint-yellowgreen)](https://pylint.pycqa.org/en/latest/)
[![Types](https://img.shields.io/badge/types-mypy-blue)](https://github.com/python/mypy)
[![Imports](https://img.shields.io/badge/imports-isort-ef8336.svg)](https://github.com/pycqa/isort)
[![Linter](https://img.shields.io/badge/linter-ruff-red)](https://github.com/charliermarsh/ruff)
[![Types](https://img.shields.io/badge/types-pyright-blue)](https://github.com/microsoft/pyright)
[![Tests](https://img.shields.io/badge/tests-pytest%20%7C%20tox-yellow)](https://github.com/pytest-dev/pytest/)
[![Coverage](https://img.shields.io/badge/coverage-coveragepy-green)](https://github.com/nedbat/coveragepy)
[![License](https://img.shields.io/badge/license-MIT-9400d3.svg)](https://snyk.io/learn/what-is-mit-license/)
[![Compatibility](https://img.shields.io/pypi/pyversions/manga-dlp)](https://pypi.org/project/manga-dlp/)
---
## Description
A manga download script written in python. It only supports [mangadex.org](https://mangadex.org/) for now. But support
for other sites is planned.
for other sites is _planned™_.
Before downloading a new chapter, the script always checks if there is already a chapter with the same name in the
download directory. If found the chapter is skipped. So you can run the script on a schedule to only download new
chapters without any additional setup.
The default behaviour is to pack the images to a [cbz archive](https://en.wikipedia.org/wiki/Comic_book_archive). If
you just want the folder with all the pictures use the flag `--nocbz`.
you just want the folder with all the pictures use the flag `--format ""`.
## _Currently_ Supported sites
- [Mangadex.org](https://mangadex.org/)
- [Mangadex.org](https://mangadex.org/)
## Features (not complete)
- Metadata support with [ComicInfo.xml](https://anansi-project.github.io/docs/comicinfo/intro)
- Json caching
- Custom hooks after/before each download
- Custom chapter name format
- Volume support
- Multiple archive formats supported (cbz,cbr,zip,none)
- Language selection
- Download all chapters directly
- And others...
## Usage
@@ -82,7 +94,7 @@ mangadlp <args> # call script directly
### With docker
See the docker [README](docker/)
See the docker [README](https://manga-dlp.ivn.sh/docker/)
## Options
@@ -122,22 +134,20 @@ verbosity: [mutually_exclusive]
For suggestions for improvement, just open a pull request.
If you want to add support for a new site, there is an
api [template file](https://github.com/olofvndrhr/manga-dlp/blob/master/contrib/api_template.py) which you can use.
And more infos and tools in the
contrib [README.md](https://github.com/olofvndrhr/manga-dlp/blob/master/contrib/README.md)
If you want to add support for a new site, there is an api [template file](https://github.com/olofvndrhr/manga-dlp/tree/master/contrib/api_template.py) which you can use.
And more infos and tools are in the contrib [README.md](https://github.com/olofvndrhr/manga-dlp/tree/master/contrib/README.md)
Otherwise, you can open am issue with the name of the site which you want support for. (not guaranteed to be
implemented)
Otherwise, you can open an issue with the name of the site which you want support for (not guaranteed to be
implemented).
If you encounter any bugs, also just open an issue with a description of the problem.
## TODO's
- <del>Make docker container for easy distribution</del>
--> [Dockerhub](https://hub.docker.com/repository/docker/olofvndrhr/manga-dlp)
- <del>Automate release</del>
--> Done with woodpecker-ci
- <del>Make pypi package</del>
--> Done with release [2.1.7](https://pypi.org/project/manga-dlp/)
- Add more supported sites
- <del>Make docker container for easy distribution</del>
--> [Dockerhub](https://hub.docker.com/r/olofvndrhr/manga-dlp)
- <del>Automate release</del>
--> Done with woodpecker-ci
- <del>Make pypi package</del>
--> Done with release [2.1.7](https://pypi.org/project/manga-dlp/)
- Add more supported sites

@@ -1,52 +0,0 @@
#!/bin/bash
# shellcheck disable=SC2016
# script to extract the release notes from the changelog
# show script help
function show_help() {
cat << EOF
Script to generate release-notes from a changelog (CHANGELOG.md)
Usage:
./get_release_notes.sh <new_version>
Example:
./get_release_notes.sh "2.0.5"
or
./get_release_notes.sh "latest"
EOF
exit 0
}
# create changelog for release
function get_release_notes() {
local l_version="${1}"
printf 'Creating release-notes\n'
# check for version
if [[ -z "${l_version}" ]]; then
printf 'You need to specify a version with $1\n'
exit 1
fi
if [[ ${l_version,,} == "latest" ]]; then
l_version="$(grep -o -E "^##\s\[[0-9]{1,2}.[0-9]{1,2}.[0-9]{1,2}\]" CHANGELOG.md | head -n 1 | grep -o -E "[0-9]{1,2}.[0-9]{1,2}.[0-9]{1,2}")"
fi
awk -v ver="[${l_version}]" \
'/^## / { if (p) { exit }; if ($2 == ver) { p=1 } } p && NF' \
'CHANGELOG.md' > 'RELEASENOTES.md'
printf 'Done\n'
}
# check options
case "${1}" in
'--help' | '-h' | 'help')
show_help
;;
*)
get_release_notes "${@}"
;;
esac
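The awk one-liner above prints everything from the matching `## [version]` heading up to (but not including) the next `## ` heading, skipping blank lines. A quick sketch against a sample changelog (file contents invented for illustration):

```shell
# build a tiny sample changelog
cat > CHANGELOG.md << 'EOF'
## [2.0.5] - 2024-01-01
- fix foo
## [2.0.4] - 2023-12-01
- add bar
EOF

# print the release notes for 2.0.5:
# start printing at its heading, stop at the next "## " heading
awk -v ver="[2.0.5]" \
    '/^## / { if (p) { exit }; if ($2 == ver) { p=1 } } p && NF' \
    CHANGELOG.md
```

This prints the `## [2.0.5] - 2024-01-01` heading and its `- fix foo` entry; the `p && NF` pattern doubles as the print rule while dropping empty lines inside the section.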

justfile
@@ -3,158 +3,73 @@
default: show_receipts
set shell := ["bash", "-uc"]
set dotenv-load := true
#set export
set dotenv-load
# aliases
alias s := show_receipts
alias i := show_system_info
alias p := prepare_workspace
alias l := lint
alias t := tests
alias f := tests_full
# variables
export asdf_version := "v0.10.2"
# default recipe to display help information
show_receipts:
@just --list
just --list
show_system_info:
@echo "=================================="
@echo "os : {{os()}}"
@echo "arch: {{arch()}}"
@echo "home: ${HOME}"
@echo "project dir: {{justfile_directory()}}"
@echo "justfile dir: {{justfile_directory()}}"
@echo "invocation dir: {{invocation_directory()}}"
@echo "running dir: `pwd -P`"
@echo "=================================="
check_asdf:
@if ! asdf --version; then \
just install_asdf \
;else \
echo "asdf already installed" \
;fi
just install_asdf_bins
install_asdf:
@echo "installing asdf"
@echo "asdf version: ${asdf_version}"
@git clone https://github.com/asdf-vm/asdf.git ~/.asdf --branch "${asdf_version}"
@echo "adding asdf to .bashrc"
@if ! grep -q ".asdf/asdf.sh" "${HOME}/.bashrc"; then \
echo -e '\n# source asdf' >> "${HOME}/.bashrc" \
;echo 'source "${HOME}/.asdf/asdf.sh"' >> "${HOME}/.bashrc" \
;echo -e 'source "${HOME}/.asdf/completions/asdf.bash"\n' >> "${HOME}/.bashrc" \
;fi
@echo "to load asdf either restart your shell or do: 'source \${HOME}/.bashrc'"
setup_asdf:
@echo "installing asdf bins"
# add plugins
@if ! asdf plugin add python; then :; fi
@if ! asdf plugin add shfmt; then :; fi
@if ! asdf plugin add shellcheck; then :; fi
@if ! asdf plugin add just https://github.com/franklad/asdf-just; then :; fi
@if ! asdf plugin add direnv; then :; fi
# install bins
@if ! asdf install; then :; fi
# setup direnv
@if ! asdf direnv setup --shell bash --version latest; then :; fi
setup:
asdf install
lefthook install
create_venv:
@echo "creating venv"
@python3 -m pip install --upgrade pip setuptools wheel
@python3 -m venv venv
python3 -m pip install --upgrade pip setuptools wheel
python3 -m venv venv
install_deps:
@echo "installing dependencies"
@pip3 install -r contrib/requirements_dev.txt
python3 -m hatch dep show requirements --project-only > /tmp/requirements.txt
pip3 install -r /tmp/requirements.txt
install_deps_dev:
@echo "installing dev dependencies"
python3 -m hatch dep show requirements --project-only > /tmp/requirements.txt
python3 -m hatch dep show requirements --env-only >> /tmp/requirements.txt
pip3 install -r /tmp/requirements.txt
create_reqs:
@echo "creating requirements"
pipreqs --force --savepath requirements.txt src/mangadlp/
test_shfmt:
@find . -type f \( -name "**.sh" -and -not -path "./venv/*" -and -not -path "./.tox/*" \) -exec shfmt -d -i 4 -bn -ci -sr "{}" \+;
find . -type f \( -name "**.sh" -and -not -path "./.**" -and -not -path "./venv**" \) -exec shfmt -d -i 4 -bn -ci -sr "{}" \+;
test_black:
@python3 -m black --check --diff .
test_isort:
@python3 -m isort --check-only --diff .
test_mypy:
@python3 -m mypy --install-types --non-interactive mangadlp/
test_pytest:
@python3 -m tox -e basic
test_autoflake:
@python3 -m autoflake --remove-all-unused-imports -r -v mangadlp/
@python3 -m autoflake --check --remove-all-unused-imports -r -v mangadlp/
test_pylama:
@python3 -m pylama --options tox.ini mangadlp/
test_pylint:
@python3 -m pylint --fail-under 9 mangadlp/
test_tox:
@python3 -m tox
test_tox_coverage:
@python3 -m tox -e coverage
test_build:
@python3 -m hatch build
test_ci_conf:
@woodpecker-cli lint .woodpecker/
test_docker_build:
@docker build . -f docker/Dockerfile.amd64 -t manga-dlp:test
# install dependencies and set everything up
prepare_workspace:
just show_system_info
just check_asdf
just setup_asdf
just create_venv
format_shfmt:
find . -type f \( -name "**.sh" -and -not -path "./.**" -and -not -path "./venv**" \) -exec shfmt -w -i 4 -bn -ci -sr "{}" \+;
lint:
just show_system_info
-just test_ci_conf
just test_shfmt
just test_black
just test_isort
just test_mypy
just test_autoflake
just test_pylama
just test_pylint
@echo -e "\n\033[0;32m=== ALL DONE ===\033[0m\n"
hatch run lint:style
hatch run lint:typing
tests:
format:
just show_system_info
-just test_ci_conf
just test_shfmt
just test_black
just test_isort
just test_mypy
just test_autoflake
just test_pylama
just test_pylint
just test_pytest
@echo -e "\n\033[0;32m=== ALL DONE ===\033[0m\n"
just format_shfmt
hatch run lint:fmt
tests_full:
just show_system_info
-just test_ci_conf
just test_shfmt
just test_black
just test_isort
just test_mypy
just test_autoflake
just test_pylama
just test_pylint
just test_build
just test_tox
just test_tox_coverage
just test_docker_build
@echo -e "\n\033[0;32m=== ALL DONE ===\033[0m\n"
check:
just lint
just format
test:
hatch run default:test
coverage:
hatch run default:cov
build:
hatch build --clean
run loglevel *flags:
hatch run mangadlp --loglevel {{loglevel}} {{flags}}

@@ -1,6 +1,7 @@
import sys
import mangadlp.cli
import src.mangadlp.cli
if __name__ == "__main__":
sys.exit(mangadlp.cli.main()) # pylint: disable=no-value-for-parameter
sys.exit(src.mangadlp.cli.main())

@@ -1 +0,0 @@
__version__ = "2.3.0"

@@ -1,6 +0,0 @@
import sys
import mangadlp.cli
if __name__ == "__main__":
sys.exit(mangadlp.cli.main()) # pylint: disable=no-value-for-parameter

@@ -1,22 +1,16 @@
[build-system]
requires = ["hatchling>=1.11.0"]
requires = ["hatchling>=1.18", "hatch-regex-commit>=0.0.3"]
build-backend = "hatchling.build"
[project]
dynamic = ["version"]
name = "manga-dlp"
description = "A cli manga downloader"
readme = "README.md"
license = "MIT"
requires-python = ">=3.8"
authors = [
{ name = "Ivan Schaller", email = "ivan@schaller.sh" },
]
keywords = [
"manga",
"downloader",
"mangadex",
]
dynamic = ["version"]
authors = [{ name = "Ivan Schaller", email = "ivan@schaller.sh" }]
keywords = ["manga", "downloader", "mangadex"]
classifiers = [
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
@@ -24,13 +18,16 @@ classifiers = [
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
]
dependencies = [
"requests>=2.28.0",
"loguru>=0.6.0",
"click>=8.1.3",
"click-option-group>=0.5.5",
"xmltodict>=0.13.0"
"xmltodict>=0.13.0",
"img2pdf>=0.4.4",
"pytz>=2022.1",
]
[project.urls]
@@ -44,70 +41,206 @@ mangadlp = "mangadlp.cli:main"
manga-dlp = "mangadlp.cli:main"
[tool.hatch.version]
path = "mangadlp/__about__.py"
[tool.hatch.build]
ignore-vcs = true
source = "regex_commit"
path = "src/mangadlp/__about__.py"
tag_sign = false
[tool.hatch.build.targets.sdist]
packages = ["mangadlp"]
packages = ["src/mangadlp"]
[tool.hatch.build.targets.wheel]
packages = ["mangadlp"]
packages = ["src/mangadlp"]
###
### envs
###
[tool.hatch.envs.default]
python = "3.11"
dependencies = [
"requests>=2.28.0",
"loguru>=0.6.0",
"click>=8.1.3",
"click-option-group>=0.5.5",
"pytest==7.4.3",
"coverage==7.3.2",
"xmltodict>=0.13.0",
"xmlschema>=2.2.1",
"img2pdf>=0.4.4",
"hatch>=1.6.0",
"hatchling>=1.11.0",
"pytest>=7.0.0",
"coverage>=6.3.1",
"black>=22.1.0",
"isort>=5.10.0",
"pylint>=2.13.0",
"mypy>=0.940",
"tox>=3.24.5",
"autoflake>=1.4",
"pylama>=8.3.8",
]
[tool.isort]
py_version = 39
skip_gitignore = true
line_length = 88
profile = "black"
multi_line_output = 3
include_trailing_comma = true
use_parentheses = true
[tool.hatch.envs.default.scripts]
test = "pytest {args:tests}"
test-cov = ["coverage erase", "coverage run -m pytest {args:tests}"]
cov-report = ["- coverage combine", "coverage report", "coverage xml"]
cov = ["test-cov", "cov-report"]
[[tool.hatch.envs.lint.matrix]]
python = ["3.8", "3.9", "3.10", "3.11"]
[tool.hatch.envs.lint]
detached = true
dependencies = [
"mypy==1.8.0",
"ruff==0.2.2",
]
[tool.hatch.envs.lint.scripts]
typing = "mypy --non-interactive --install-types {args:src/mangadlp}"
style = [
"ruff check --diff {args:.}",
"ruff format --check --diff {args:.}"
]
fmt = [
"ruff check --fix {args:.}",
"ruff format {args:.}",
"style"
]
all = ["style", "typing"]
###
### ruff
###
[tool.ruff]
target-version = "py38"
line-length = 100
indent-width = 4
fix = true
show-fixes = true
respect-gitignore = true
src = ["src", "tests"]
exclude = [
".direnv",
".git",
".mypy_cache",
".ruff_cache",
".svn",
".tox",
".nox",
".venv",
"venv",
"__pypackages__",
"build",
"dist",
"node_modules",
"venv",
"contrib"
]
[tool.ruff.lint]
select = [
"A",
"ARG",
"B",
"C",
"DTZ",
"E",
"EM",
"F",
"FBT",
"I",
"ICN",
"ISC",
"N",
"PLC",
"PLE",
"PLR",
"PLW",
"Q",
"RUF",
"S",
"T",
"TID",
"UP",
"W",
"YTT",
]
ignore-init-module-imports = true
ignore = ["E501", "D103", "D100", "D102", "PLR2004", "D403", "ISC001", "FBT001", "FBT002", "FBT003", "W505"]
unfixable = ["F401"]
[tool.ruff.format]
quote-style = "double"
indent-style = "space"
skip-magic-trailing-comma = false
line-ending = "lf"
docstring-code-format = true
[tool.ruff.lint.per-file-ignores]
"__init__.py" = ["D104"]
"__about__.py" = ["D104", "F841"]
"tests/**/*" = ["PLR2004", "S101", "TID252", "T201", "ARG001", "S603", "S605"]
[tool.ruff.lint.pyupgrade]
keep-runtime-typing = true
[tool.ruff.lint.isort]
lines-after-imports = 2
known-first-party = ["mangadlp"]
[tool.ruff.lint.flake8-tidy-imports]
ban-relative-imports = "all"
[tool.ruff.lint.pylint]
max-branches = 24
max-returns = 12
max-statements = 100
max-args = 15
allow-magic-value-types = ["str", "bytes", "complex", "float", "int"]
[tool.ruff.lint.mccabe]
max-complexity = 15
[tool.ruff.lint.pydocstyle]
convention = "google"
[tool.ruff.lint.pycodestyle]
max-doc-length = 100
###
### mypy
###
[tool.mypy]
python_version = "3.9"
disallow_untyped_defs = false
follow_imports = "normal"
ignore_missing_imports = true
warn_no_return = false
#plugins = ["pydantic.mypy"]
follow_imports = "silent"
warn_redundant_casts = true
warn_unused_ignores = true
show_error_context = true
disallow_any_generics = true
check_untyped_defs = true
no_implicit_reexport = true
warn_return_any = true
pretty = true
show_column_numbers = true
show_error_codes = true
no_implicit_optional = false
#[tool.pydantic-mypy]
#init_forbid_extra = true
#init_typed = true
#warn_required_dynamic_aliases = true
###
### pytest
###
[tool.pytest.ini_options]
pythonpath = ["."]
pythonpath = ["src"]
addopts = "--color=yes --exitfirst --verbose -ra"
filterwarnings = [
'ignore:Jupyter is migrating its paths to use standard platformdirs:DeprecationWarning',
]
###
### coverage
###
[tool.coverage.run]
source = ["mangadlp"]
source_pkgs = ["mangadlp", "tests"]
branch = true
command_line = "-m pytest --exitfirst"
parallel = true
omit = ["src/mangadlp/__about__.py"]
[tool.coverage.paths]
mangadlp = ["src/mangadlp", "*/manga-dlp/src/mangadlp"]
tests = ["tests", "*/manga-dlp/tests"]
[tool.coverage.report]
# Regexes for lines to exclude from consideration
@ -125,14 +258,7 @@ exclude_lines = [
"if __name__ == .__main__.:",
# Don't complain about abstract methods, they aren't run:
"@(abc.)?abstractmethod",
"no cov",
"if TYPE_CHECKING:",
]
ignore_errors = true
[tool.pylint.main]
py-version = "3.9"
[tool.pylint.logging]
logging-modules = ["logging", "loguru"]
disable = "C0301, C0114, C0116, W0703, R0902, R0913, E0401, W1203"
good-names = "r"
logging-format-style = "new"
# ignore_errors = true


@ -1,6 +1,4 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"local>44net-assets/docker-renovate-conf"
]
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": ["local>44net/renovate"]
}


@ -5,8 +5,8 @@ sonar.links.scm=https://github.com/olofvndrhr/manga-dlp
sonar.links.issue=https://github.com/olofvndrhr/manga-dlp/issues
sonar.links.ci=https://ci.44net.ch/olofvndrhr/manga-dlp
#
sonar.sources=mangadlp
sonar.tests=tests
sonar.exclusions=docker/**,contrib/**
sonar.python.version=3.9
sonar.sources=src/mangadlp
sonar.tests=tests
#sonar.exclusions=
sonar.python.coverage.reportPaths=coverage.xml


@ -0,0 +1 @@
__version__ = "2.4.1"

src/mangadlp/__main__.py Normal file

@ -0,0 +1,7 @@
import sys
import mangadlp.cli
if __name__ == "__main__":
sys.exit(mangadlp.cli.main())


@ -1,15 +1,18 @@
import re
from time import sleep
from typing import Any, Dict, List
import requests
from loguru import logger as log
from mangadlp import utils
from mangadlp.models import ChapterData, ComicInfo
class Mangadex:
"""Mangadex API Class.
Get infos for a manga from mangadex.org
Get infos for a manga from mangadex.org.
Args:
url_uuid (str): URL or UUID of the manga
@ -19,7 +22,7 @@ class Mangadex:
Attributes:
api_name (str): Name of the API
manga_uuid (str): UUID of the manga, without the url part
manga_data (dict): Infos of the manga. Name, title etc
manga_data (dict): Infos of the manga. Name, title etc.
manga_title (str): The title of the manga, sanitized for all file systems
manga_chapter_data (dict): All chapter data of the manga. Volumes, chapters, chapter uuids and chapter names
chapter_list (list): A list of all available chapters for the language
@ -54,9 +57,7 @@ class Mangadex:
# get the uuid for the manga
def get_manga_uuid(self) -> str:
# isolate id from url
uuid_regex = re.compile(
"[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12}"
)
uuid_regex = re.compile("[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12}")
# try to get uuid in string
try:
uuid = uuid_regex.search(self.url_uuid)[0] # type: ignore
@ -67,14 +68,12 @@ class Mangadex:
return uuid
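The UUID isolation above can be sketched as a standalone helper (the function name `extract_uuid` is hypothetical; the regex is the one from the diff):

```python
import re

# same UUID pattern as in Mangadex.get_manga_uuid()
UUID_RE = re.compile("[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12}")


def extract_uuid(url_uuid: str) -> str:
    # re.search() returns None when no UUID is present, so guard before indexing
    match = UUID_RE.search(url_uuid)
    if match is None:
        raise ValueError("No valid UUID found")
    return match[0]
```

Passing either a full URL or a bare UUID works, which is why the class accepts both via `url_uuid`.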
# make initial request
def get_manga_data(self) -> dict:
def get_manga_data(self) -> Dict[str, Any]:
log.debug(f"Getting manga data for: {self.manga_uuid}")
counter = 1
while counter <= 3:
try:
response = requests.get(
f"{self.api_base_url}/manga/{self.manga_uuid}", timeout=10
)
response = requests.get(f"{self.api_base_url}/manga/{self.manga_uuid}", timeout=10)
except Exception as exc:
if counter >= 3:
log.error("Maybe the MangaDex API is down?")
@ -84,12 +83,14 @@ class Mangadex:
counter += 1
else:
break
response_body: Dict[str, Dict[str, Any]] = response.json()
# check if manga exists
if response.json()["result"] != "ok":
if response_body["result"] != "ok":
log.error("Manga not found")
raise KeyError
return response.json()["data"]
return response_body["data"]
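The retry loop and the in-band `result` check above can be sketched generically (the helper name and the fixed three attempts are assumptions; the real method also logs and waits between tries):

```python
from typing import Any, Callable, Dict


def fetch_with_retries(fetch: Callable[[], Dict[str, Any]], tries: int = 3) -> Dict[str, Any]:
    """Sketch of the retry pattern in get_manga_data(): up to `tries` attempts."""
    counter = 1
    while True:
        try:
            body = fetch()
        except Exception:
            if counter >= tries:
                raise  # e.g. "Maybe the MangaDex API is down?"
            counter += 1
            continue
        break
    # the API signals errors in-band: check "result" before touching "data"
    if body.get("result") != "ok":
        raise KeyError("Manga not found")
    return body["data"]
```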
# get the title of the manga (and fix the filename)
def get_manga_title(self) -> str:
@ -97,32 +98,35 @@ class Mangadex:
attributes = self.manga_data["attributes"]
# try to get the title in requested language
try:
title = attributes["title"][self.language]
found_title = attributes["title"][self.language]
title = utils.fix_name(found_title)
except KeyError:
log.info("Manga title not found in requested language. Trying alt titles")
else:
log.debug(f"Language={self.language}, Title='{title}'")
return utils.fix_name(title)
return title # type: ignore
# search in alt titles
try:
log.debug(f"Alt titles: {attributes['altTitles']}")
for item in attributes["altTitles"]:
if item.get(self.language):
alt_title = item
alt_title_item = item
break
title = alt_title[self.language]
found_title = alt_title_item[self.language]
except (KeyError, UnboundLocalError):
log.warning(
"Manga title also not found in alt titles. Falling back to english title"
)
log.warning("Manga title also not found in alt titles. Falling back to english title")
else:
log.debug(f"Language={self.language}, Alt-title='{title}'")
return utils.fix_name(title)
title = utils.fix_name(found_title)
log.debug(f"Language={self.language}, Alt-title='{found_title}'")
return title # type: ignore
found_title = attributes["title"]["en"]
title = utils.fix_name(found_title)
title = attributes["title"]["en"]
log.debug(f"Language=en, Fallback-title='{title}'")
return utils.fix_name(title)
return title # type: ignore
# check if chapters are available in requested language
def check_chapter_lang(self) -> int:
@ -132,11 +136,9 @@ class Mangadex:
timeout=10,
)
try:
total_chapters = r.json()["total"]
total_chapters: int = r.json()["total"]
except Exception as exc:
log.error(
"Error retrieving the chapters list. Did you specify a valid language code?"
)
log.error("Error retrieving the chapters list. Did you specify a valid language code?")
raise exc
if total_chapters == 0:
log.error("No chapters available to download in specified language")
@ -146,13 +148,13 @@ class Mangadex:
return total_chapters
# get chapter data like name, uuid etc
def get_chapter_data(self) -> dict:
def get_chapter_data(self) -> Dict[str, ChapterData]:
log.debug(f"Getting chapter data for: {self.manga_uuid}")
api_sorting = "order[chapter]=asc&order[volume]=asc"
# check for chapters in specified lang
total_chapters = self.check_chapter_lang()
chapter_data = {}
chapter_data: Dict[str, ChapterData] = {}
last_volume, last_chapter = ("", "")
offset = 0
while offset < total_chapters: # if more than 500 chapters
@ -160,8 +162,9 @@ class Mangadex:
f"{self.api_base_url}/manga/{self.manga_uuid}/feed?{api_sorting}&limit=500&offset={offset}&{self.api_additions}",
timeout=10,
)
for chapter in r.json()["data"]:
attributes: dict = chapter["attributes"]
response_body: Dict[str, Any] = r.json()
for chapter in response_body["data"]:
attributes: Dict[str, Any] = chapter["attributes"]
# chapter infos from feed
chapter_num: str = attributes.get("chapter") or ""
chapter_vol: str = attributes.get("volume") or ""
@ -184,9 +187,7 @@ class Mangadex:
continue
# export chapter data as a dict
chapter_index = (
chapter_num if not self.forcevol else f"{chapter_vol}:{chapter_num}"
)
chapter_index = chapter_num if not self.forcevol else f"{chapter_vol}:{chapter_num}"
chapter_data[chapter_index] = {
"uuid": chapter_uuid,
"volume": chapter_vol,
@ -203,7 +204,7 @@ class Mangadex:
return chapter_data
# get images for the chapter (mangadex@home)
def get_chapter_images(self, chapter: str, wait_time: float) -> list:
def get_chapter_images(self, chapter: str, wait_time: float) -> List[str]:
log.debug(f"Getting chapter images for: {self.manga_uuid}")
athome_url = f"{self.api_base_url}/at-home/server"
chapter_uuid = self.manga_chapter_data[chapter]["uuid"]
@ -241,7 +242,7 @@ class Mangadex:
chapter_img_data = api_data["chapter"]["data"]
# get list of image urls
image_urls = []
image_urls: List[str] = []
for image in chapter_img_data:
image_urls.append(f"{self.img_base_url}/data/{chapter_hash}/{image}")
@ -250,9 +251,9 @@ class Mangadex:
return image_urls
# create list of chapters
def create_chapter_list(self) -> list:
def create_chapter_list(self) -> List[str]:
log.debug(f"Creating chapter list for: {self.manga_uuid}")
chapter_list = []
chapter_list: List[str] = []
for data in self.manga_chapter_data.values():
chapter_number: str = data["chapter"]
volume_number: str = data["volume"]
@ -263,15 +264,15 @@ class Mangadex:
return chapter_list
def create_metadata(self, chapter: str) -> dict:
def create_metadata(self, chapter: str) -> ComicInfo:
log.info("Creating metadata from api")
chapter_data = self.manga_chapter_data[chapter]
try:
volume = int(chapter_data.get("volume"))
volume = int(chapter_data["volume"])
except (ValueError, TypeError):
volume = None
metadata = {
metadata: ComicInfo = {
"Volume": volume,
"Number": chapter_data.get("chapter"),
"PageCount": chapter_data.get("pages"),


@ -1,7 +1,7 @@
import re
import shutil
from pathlib import Path
from typing import Any, Union
from typing import Any, Dict, List, Tuple, Union
from loguru import logger as log
@ -10,11 +10,12 @@ from mangadlp.api.mangadex import Mangadex
from mangadlp.cache import CacheDB
from mangadlp.hooks import run_hook
from mangadlp.metadata import write_metadata
from mangadlp.models import ChapterData
from mangadlp.utils import get_file_format
def match_api(url_uuid: str) -> type:
"""Match the correct api class from a string
"""Match the correct api class from a string.
Args:
url_uuid: url/uuid to check
@ -22,9 +23,8 @@ def match_api(url_uuid: str) -> type:
Returns:
The class of the API to use
"""
# apis to check
apis: list[tuple[str, re.Pattern, type]] = [
apis: List[Tuple[str, re.Pattern[str], type]] = [
(
"mangadex.org",
re.compile(
@ -53,6 +53,7 @@ def match_api(url_uuid: str) -> type:
class MangaDLP:
"""Download Mangas from supported sites.
After initialization, start the script with the function get_manga().
Args:
@ -72,7 +73,7 @@ class MangaDLP:
add_metadata: Flag to toggle creation & inclusion of metadata
"""
def __init__( # pylint: disable=too-many-locals
def __init__( # noqa
self,
url_uuid: str,
language: str = "en",
@ -108,7 +109,7 @@ class MangaDLP:
self.chapter_post_hook_cmd = chapter_post_hook_cmd
self.cache_path = cache_path
self.add_metadata = add_metadata
self.hook_infos: dict = {}
self.hook_infos: Dict[str, Any] = {}
# prepare everything
self._prepare()
@ -138,9 +139,7 @@ class MangaDLP:
# prechecks userinput/options
# no url and no readin list given
if not self.url_uuid:
log.error(
'You need to specify a manga url/uuid with "-u" or a list with "--read"'
)
log.error('You need to specify a manga url/uuid with "-u" or a list with "--read"')
raise ValueError
# checks if --list is not used
if not self.list_chapters:
@ -160,7 +159,7 @@ class MangaDLP:
raise ValueError
# once called per manga
def get_manga(self) -> None:
def get_manga(self) -> None: # noqa
print_divider = "========================================="
# show infos
log.info(f"{print_divider}")
@ -178,9 +177,7 @@ class MangaDLP:
if self.chapters.lower() == "all":
chapters_to_download = self.manga_chapter_list
else:
chapters_to_download = utils.get_chapter_list(
self.chapters, self.manga_chapter_list
)
chapters_to_download = utils.get_chapter_list(self.chapters, self.manga_chapter_list)
# show chapters to download
log.info(f"Chapters selected: {', '.join(chapters_to_download)}")
@ -191,9 +188,7 @@ class MangaDLP:
# prepare cache if specified
if self.cache_path:
cache = CacheDB(
self.cache_path, self.manga_uuid, self.language, self.manga_title
)
cache = CacheDB(self.cache_path, self.manga_uuid, self.language, self.manga_title)
cached_chapters = cache.db_uuid_chapters
log.info(f"Cached chapters: {cached_chapters}")
@ -223,8 +218,8 @@ class MangaDLP:
)
# get chapters
skipped_chapters: list[Any] = []
error_chapters: list[Any] = []
skipped_chapters: List[Any] = []
error_chapters: List[Any] = []
for chapter in chapters_to_download:
if self.cache_path and chapter in cached_chapters:
log.info(f"Chapter '{chapter}' is in cache. Skipping download")
@ -256,9 +251,7 @@ class MangaDLP:
{"Format": self.file_format[1:], **metadata},
)
except Exception as exc:
log.warning(
f"Can't write metadata for chapter '{chapter}'. Reason={exc}"
)
log.warning(f"Can't write metadata for chapter '{chapter}'. Reason={exc}")
# pack downloaded folder
if self.file_format:
@ -310,14 +303,12 @@ class MangaDLP:
# once called per chapter
def get_chapter(self, chapter: str) -> Path:
# get chapter infos
chapter_infos: dict = self.api.manga_chapter_data[chapter]
chapter_infos: ChapterData = self.api.manga_chapter_data[chapter]
log.debug(f"Chapter infos: {chapter_infos}")
# get image urls for chapter
try:
chapter_image_urls = self.api.get_chapter_images(
chapter, self.download_wait
)
chapter_image_urls = self.api.get_chapter_images(chapter, self.download_wait)
except KeyboardInterrupt as exc:
log.critical("Keyboard interrupt. Stopping")
raise exc
@ -352,7 +343,7 @@ class MangaDLP:
log.debug(f"Filename: '{chapter_filename}'")
# set download path for chapter (image folder)
chapter_path = self.manga_path / chapter_filename
chapter_path: Path = self.manga_path / chapter_filename
# set archive path with file format
chapter_archive_path = Path(f"{chapter_path}{self.file_format}")
@ -406,9 +397,7 @@ class MangaDLP:
# download images
try:
downloader.download_chapter(
chapter_image_urls, chapter_path, self.download_wait
)
downloader.download_chapter(chapter_image_urls, chapter_path, self.download_wait)
except KeyboardInterrupt as exc:
log.critical("Keyboard interrupt. Stopping")
raise exc
@ -440,7 +429,7 @@ class MangaDLP:
# check if image folder is existing
if not chapter_path.exists():
log.error(f"Image folder: {chapter_path} does not exist")
raise IOError
raise OSError
if self.file_format == ".pdf":
utils.make_pdf(chapter_path)
else:


@ -1,9 +1,11 @@
import json
from pathlib import Path
from typing import Dict, List, Union
from typing import List, Union
from loguru import logger as log
from mangadlp.models import CacheData, CacheKeyData
class CacheDB:
def __init__(
@ -26,12 +28,12 @@ class CacheDB:
if not self.db_data.get(self.db_key):
self.db_data[self.db_key] = {}
self.db_uuid_data: dict = self.db_data[self.db_key]
self.db_uuid_data: CacheKeyData = self.db_data[self.db_key]
if not self.db_uuid_data.get("name"):
self.db_uuid_data.update({"name": self.name})
self._write_db()
self.db_uuid_chapters: list = self.db_uuid_data.get("chapters") or []
self.db_uuid_chapters: List[str] = self.db_uuid_data.get("chapters") or []
def _prepare_db(self) -> None:
if self.db_path.exists():
@ -44,11 +46,11 @@ class CacheDB:
log.error("Can't create db-file")
raise exc
def _read_db(self) -> Dict[str, dict]:
def _read_db(self) -> CacheData:
log.info(f"Reading cache-db: {self.db_path}")
try:
db_txt = self.db_path.read_text(encoding="utf8")
db_dict: dict[str, dict] = json.loads(db_txt)
db_dict: CacheData = json.loads(db_txt)
except Exception as exc:
log.error("Can't load cache-db")
raise exc
@ -73,7 +75,7 @@ class CacheDB:
raise exc
def sort_chapters(chapters: list) -> List[str]:
def sort_chapters(chapters: List[str]) -> List[str]:
try:
sorted_list = sorted(chapters, key=float)
except Exception:
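Since chapter numbers are strings, the float-keyed sort above is what makes "10" sort after "9"; the fallback branch is elided in the hunk and is assumed here to return the input unchanged for forcevol entries like "1:2":

```python
from typing import List


def sort_chapters(chapters: List[str]) -> List[str]:
    # chapter numbers are strings, so sort numerically where possible
    try:
        sorted_list = sorted(chapters, key=float)
    except Exception:
        # forcevol entries like "1:2" can't be cast to float: keep input order (assumed)
        sorted_list = chapters
    return sorted_list
```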


@ -1,5 +1,6 @@
import sys
from pathlib import Path
from typing import Any, List
import click
from click_option_group import (
@ -15,7 +16,7 @@ from mangadlp.logger import prepare_logger
# read in the list of links from a file
def readin_list(_ctx, _param, value) -> list:
def readin_list(_ctx: click.Context, _param: str, value: str) -> List[str]:
if not value:
return []
@ -25,7 +26,8 @@ def readin_list(_ctx, _param, value) -> list:
url_str = list_file.read_text(encoding="utf-8")
url_list = url_str.splitlines()
except Exception as exc:
raise click.BadParameter("Can't get links from the file") from exc
msg = f"Reading in file '{list_file}'"
raise click.BadParameter(msg) from exc
# filter empty lines and remove them
filtered_list = list(filter(len, url_list))
@ -54,7 +56,7 @@ def readin_list(_ctx, _param, value) -> list:
"read_mangas",
is_eager=True,
callback=readin_list,
type=click.Path(exists=True, dir_okay=False),
type=click.Path(exists=True, dir_okay=False, path_type=str),
default=None,
show_default=True,
help="Path of file with manga links to download. One per line",
@ -227,14 +229,10 @@ def readin_list(_ctx, _param, value) -> list:
help="Enable/disable creation of metadata via ComicInfo.xml",
)
@click.pass_context
def main(ctx: click.Context, **kwargs) -> None:
"""
Script to download mangas from various sites
"""
def main(ctx: click.Context, **kwargs: Any) -> None:
"""Script to download mangas from various sites."""
url_uuid: str = kwargs.pop("url_uuid")
read_mangas: list[str] = kwargs.pop("read_mangas")
read_mangas: List[str] = kwargs.pop("read_mangas")
verbosity: int = kwargs.pop("verbosity")
# set log level to INFO if not set


@ -2,7 +2,7 @@ import logging
import shutil
from pathlib import Path
from time import sleep
from typing import Union
from typing import List, Union
import requests
from loguru import logger as log
@ -12,7 +12,7 @@ from mangadlp import utils
# download images
def download_chapter(
image_urls: list,
image_urls: List[str],
chapter_path: Union[str, Path],
download_wait: float,
) -> None:
@ -54,5 +54,4 @@ def download_chapter(
log.error("Can't write file")
raise exc
image_num += 1
sleep(download_wait)


@ -1,11 +1,15 @@
import os
import subprocess
from typing import Any
from loguru import logger as log
def run_hook(command: str, hook_type: str, **kwargs) -> int:
"""
def run_hook(command: str, hook_type: str, **kwargs: Any) -> int:
"""Run a command.
Run a command with subprocess.run and add kwargs to the environment.
Args:
command (str): command to run
hook_type (str): type of the hook
@ -14,7 +18,6 @@ def run_hook(command: str, hook_type: str, **kwargs) -> int:
Returns:
exit_code (int): exit code of command
"""
# check if hook commands are empty
if not command or command == "None":
log.debug(f"Hook '{hook_type}' empty. Not running")
@ -28,7 +31,7 @@ def run_hook(command: str, hook_type: str, **kwargs) -> int:
# running command
log.info(f"Hook '{hook_type}' - running command: '{command}'")
proc = subprocess.run(command_list, check=False, timeout=15, encoding="utf8")
proc = subprocess.run(command_list, check=False, timeout=15, encoding="utf8") # noqa
exit_code = proc.returncode
if exit_code == 0:
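A minimal sketch of the hook runner, assuming the extra kwargs are exposed to the command as environment variables (the variable naming and the `2` sentinel for skipped hooks are assumptions, not confirmed by the diff):

```python
import os
import shlex
import subprocess
from typing import Any


def run_hook_sketch(command: str, hook_type: str, **kwargs: Any) -> int:
    """Run a command, passing hook infos via the environment."""
    # empty hooks are skipped, mirroring run_hook()
    if not command or command == "None":
        return 2  # assumed sentinel for "nothing to run"
    # expose extra infos to the hook process (upper-cased names are an assumption)
    env = {**os.environ, **{k.upper(): str(v) for k, v in kwargs.items()}}
    command_list = shlex.split(command)
    proc = subprocess.run(command_list, check=False, timeout=15, env=env)
    return proc.returncode
```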


@ -1,18 +1,18 @@
import logging
import sys
from typing import Any, Dict
from loguru import logger
LOGURU_FMT = "{time:%Y-%m-%dT%H:%M:%S%z} | <level>[{level: <7}]</level> [{name: <10}] [{function: <20}]: {message}"
# from loguru docs
class InterceptHandler(logging.Handler):
"""
Intercept python logging messages and log them via loguru.logger
"""
"""Intercept python logging messages and log them via loguru.logger."""
def emit(self, record):
def emit(self, record: Any) -> None:
# Get corresponding Loguru level if it exists
try:
level = logger.level(record.levelname).name
@ -22,25 +22,19 @@ class InterceptHandler(logging.Handler):
# Find caller from where originated the logged message
frame, depth = logging.currentframe(), 2
while frame.f_code.co_filename == logging.__file__:
frame = frame.f_back
frame = frame.f_back # type: ignore
depth += 1
logger.opt(depth=depth, exception=record.exc_info).log(
level, record.getMessage()
)
logger.opt(depth=depth, exception=record.exc_info).log(level, record.getMessage())
# init logger with format and log level
def prepare_logger(loglevel: int = 20) -> None:
config: dict = {
"handlers": [
{
"sink": sys.stdout,
"level": loglevel,
"format": LOGURU_FMT,
}
],
stdout_handler: Dict[str, Any] = {
"sink": sys.stdout,
"level": loglevel,
"format": LOGURU_FMT,
}
logging.basicConfig(handlers=[InterceptHandler()], level=loglevel)
logger.configure(**config)
logger.configure(handlers=[stdout_handler])


@ -1,14 +1,17 @@
from pathlib import Path
from typing import Any, Dict, Tuple
from typing import Any, Dict, List, Tuple, Union
import xmltodict
from loguru import logger as log
from mangadlp.models import ComicInfo
METADATA_FILENAME = "ComicInfo.xml"
METADATA_TEMPLATE = Path("mangadlp/metadata/ComicInfo_v2.0.xml")
# define metadata types, defaults and valid values. an empty list means no value check
# {key: (type, default value, valid values)}
METADATA_TYPES: Dict[str, Tuple[type, Any, list]] = {
METADATA_TYPES: Dict[str, Tuple[Any, Union[str, int, None], List[Union[str, int, None]]]] = {
"Title": (str, None, []),
"Series": (str, None, []),
"Number": (str, None, []),
@ -59,23 +62,21 @@ METADATA_TYPES: Dict[str, Tuple[type, Any, list]] = {
}
def validate_metadata(metadata_in: dict) -> Dict[str, dict]:
def validate_metadata(metadata_in: ComicInfo) -> Dict[str, ComicInfo]:
log.info("Validating metadata")
metadata_valid: dict[str, dict] = {"ComicInfo": {}}
metadata_valid: Dict[str, ComicInfo] = {"ComicInfo": {}}
for key, value in METADATA_TYPES.items():
metadata_type, metadata_default, metadata_validation = value
# add default value if present
if metadata_default:
log.debug(
f"Setting default value for Key:{key} -> value={metadata_default}"
)
log.debug(f"Setting default value for Key:{key} -> value={metadata_default}")
metadata_valid["ComicInfo"][key] = metadata_default
# check if metadata key is available
try:
md_to_check = metadata_in[key]
md_to_check: Union[str, int, None] = metadata_in[key]
except KeyError:
continue
# check if provided metadata item is empty
@ -84,18 +85,14 @@ def validate_metadata(metadata_in: dict) -> Dict[str, dict]:
# check if metadata type is correct
log.debug(f"Key:{key} -> value={type(md_to_check)} -> check={metadata_type}")
if not isinstance(md_to_check, metadata_type): # noqa
log.warning(
f"Metadata has wrong type: {key}:{metadata_type} -> {md_to_check}"
)
if not isinstance(md_to_check, metadata_type):
log.warning(f"Metadata has wrong type: {key}:{metadata_type} -> {md_to_check}")
continue
# check if metadata is valid
log.debug(f"Key:{key} -> value={md_to_check} -> valid={metadata_validation}")
if (len(metadata_validation) > 0) and (md_to_check not in metadata_validation):
log.warning(
f"Metadata is invalid: {key}:{metadata_validation} -> {md_to_check}"
)
log.warning(f"Metadata is invalid: {key}:{metadata_validation} -> {md_to_check}")
continue
log.debug(f"Updating metadata: '{key}' = '{md_to_check}'")
@ -104,7 +101,7 @@ def validate_metadata(metadata_in: dict) -> Dict[str, dict]:
return metadata_valid
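The validation loop can be sketched with a hypothetical three-key subset of METADATA_TYPES; keys with the wrong type or a value outside the allow-list are dropped (the `Manga` values follow the ComicInfo v2.0 schema, the rest is illustrative):

```python
from typing import Any, Dict, List, Tuple

# subset of METADATA_TYPES: {key: (type, default, valid values)}; empty list = no value check
TYPES: Dict[str, Tuple[type, Any, List[Any]]] = {
    "Title": (str, None, []),
    "Volume": (int, None, []),
    "Manga": (str, None, ["Yes", "No", "YesAndRightToLeft"]),
}


def validate(metadata_in: Dict[str, Any]) -> Dict[str, Any]:
    valid: Dict[str, Any] = {}
    for key, (mtype, default, allowed) in TYPES.items():
        if default is not None:
            valid[key] = default
        if key not in metadata_in or metadata_in[key] is None:
            continue
        value = metadata_in[key]
        if not isinstance(value, mtype):
            continue  # wrong type: skipped with a warning in the real code
        if allowed and value not in allowed:
            continue  # value not in the allow-list
        valid[key] = value
    return valid
```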
def write_metadata(chapter_path: Path, metadata: dict) -> None:
def write_metadata(chapter_path: Path, metadata: ComicInfo) -> None:
if metadata["Format"] == "pdf":
log.warning("Can't add metadata for pdf format. Skipping")
return

src/mangadlp/models.py Normal file

@ -0,0 +1,59 @@
from typing import List, Optional, TypedDict
class ComicInfo(TypedDict, total=False):
"""ComicInfo.xml basic types.
Validation is done via metadata.validate_metadata()
All valid types and values are specified in metadata.METADATA_TYPES
"""
Title: Optional[str]
Series: Optional[str]
Number: Optional[str]
Count: Optional[int]
Volume: Optional[int]
AlternateSeries: Optional[str]
AlternateNumber: Optional[str]
AlternateCount: Optional[int]
Summary: Optional[str]
Notes: Optional[str]
Year: Optional[int]
Month: Optional[int]
Day: Optional[int]
Writer: Optional[str]
Colorist: Optional[str]
Publisher: Optional[str]
Genre: Optional[str]
Web: Optional[str]
PageCount: Optional[int]
LanguageISO: Optional[str]
Format: Optional[str]
BlackAndWhite: Optional[str]
Manga: Optional[str]
ScanInformation: Optional[str]
SeriesGroup: Optional[str]
AgeRating: Optional[str]
CommunityRating: Optional[int]
class ChapterData(TypedDict):
"""Basic chapter-data types.
All values have to be provided.
"""
uuid: str
volume: str
chapter: str
name: str
pages: int
class CacheKeyData(TypedDict):
chapters: List[str]
name: str
class CacheData(TypedDict):
__root__: CacheKeyData
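At runtime a TypedDict is a plain dict; the annotations only inform mypy. A quick illustration with ChapterData (the sample values are invented):

```python
from typing import TypedDict


class ChapterData(TypedDict):
    """Basic chapter-data types; all keys are required."""

    uuid: str
    volume: str
    chapter: str
    name: str
    pages: int


# constructed like any dict, but mypy flags missing or mistyped keys
chapter: ChapterData = {
    "uuid": "a96676e5-8ae2-425e-b549-7f15dd34a6d8",
    "volume": "1",
    "chapter": "1",
    "name": "A normal person",
    "pages": 8,
}
```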


@ -4,6 +4,7 @@ from pathlib import Path
from typing import Any, List
from zipfile import ZipFile
import pytz
from loguru import logger as log
@ -30,7 +31,7 @@ def make_pdf(chapter_path: Path) -> None:
raise exc
pdf_path = Path(f"{chapter_path}.pdf")
images: list[str] = []
images: List[str] = []
for file in chapter_path.iterdir():
images.append(str(file))
try:
@ -41,15 +42,15 @@ def make_pdf(chapter_path: Path) -> None:
# create a list of chapters
def get_chapter_list(chapters: str, available_chapters: list) -> List[str]:
def get_chapter_list(chapters: str, available_chapters: List[str]) -> List[str]:
# check if there are available chapter
chapter_list: list[str] = []
chapter_list: List[str] = []
for chapter in chapters.split(","):
# check if chapter list is with volumes and ranges (forcevol)
if "-" in chapter and ":" in chapter:
# split chapters and volumes apart for list generation
lower_num_fv: list[str] = chapter.split("-")[0].split(":")
upper_num_fv: list[str] = chapter.split("-")[1].split(":")
lower_num_fv: List[str] = chapter.split("-")[0].split(":")
upper_num_fv: List[str] = chapter.split("-")[1].split(":")
vol_fv: str = lower_num_fv[0]
chap_beg_fv: int = int(lower_num_fv[1])
chap_end_fv: int = int(upper_num_fv[1])
@ -70,7 +71,7 @@ def get_chapter_list(chapters: str, available_chapters: list) -> List[str]:
# select all chapters from the volume --> 1: == 1:1,1:2,1:3...
if vol_num and not chap_num:
regex: Any = re.compile(f"{vol_num}:[0-9]{{1,4}}")
vol_list: list[str] = [n for n in available_chapters if regex.match(n)]
vol_list: List[str] = [n for n in available_chapters if regex.match(n)]
chapter_list.extend(vol_list)
else:
chapter_list.append(chapter)
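The selection handling above can be sketched end to end (the helper name is hypothetical, and the plain "1-3" range branch is reconstructed from code the hunk elides):

```python
import re
from typing import List


def expand_selection(chapters: str, available: List[str]) -> List[str]:
    """Sketch of get_chapter_list(): expand comma, range and volume selections."""
    out: List[str] = []
    for chapter in chapters.split(","):
        if "-" in chapter and ":" in chapter:
            # forcevol range like "1:2-1:4" -> 1:2, 1:3, 1:4
            vol, beg = chapter.split("-")[0].split(":")
            end = int(chapter.split("-")[1].split(":")[1])
            out.extend(f"{vol}:{n}" for n in range(int(beg), end + 1))
        elif "-" in chapter:
            # plain range like "1-3" -> 1, 2, 3 (assumed from elided code)
            beg_n, end_n = (int(n) for n in chapter.split("-"))
            out.extend(str(n) for n in range(beg_n, end_n + 1))
        elif chapter.endswith(":"):
            # volume wildcard "1:" -> every available chapter of volume 1
            regex = re.compile(f"{chapter[:-1]}:[0-9]{{1,4}}")
            out.extend(n for n in available if regex.match(n))
        else:
            out.append(chapter)
    return out
```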
@ -160,7 +161,7 @@ def get_file_format(file_format: str) -> str:
def progress_bar(progress: float, total: float) -> None:
time = datetime.now().strftime("%Y-%m-%dT%H:%M:%S")
time = datetime.now(tz=pytz.timezone("Europe/Zurich")).strftime("%Y-%m-%dT%H:%M:%S")
percent = int(progress / (int(total) / 100))
bar_length = 50
bar_progress = int(progress / (int(total) / bar_length))
@ -168,9 +169,9 @@ def progress_bar(progress: float, total: float) -> None:
whitespace_texture = " " * (bar_length - bar_progress)
if progress == total:
full_bar = "" * bar_length
print(f"\r{time}{' '*6}| [BAR ] ❙{full_bar}❙ 100%", end="\n")
print(f"\r{time}{' '*6}| [BAR ] ❙{full_bar}❙ 100%", end="\n") # noqa
else:
print(
print( # noqa
f"\r{time}{' '*6}| [BAR ] ❙{bar_texture}{whitespace_texture}{percent}%",
end="\r",
)


@ -5,7 +5,9 @@ from mangadlp.app import MangaDLP
def test_check_api_mangadex():
url = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
test = MangaDLP(url_uuid=url, list_chapters=True, download_wait=2)
assert test.api_used == Mangadex


@ -1,8 +1,10 @@
import shutil
from pathlib import Path
from typing import List
import pytest
import requests
from pytest import MonkeyPatch
from mangadlp import downloader
@ -17,7 +19,7 @@ def test_downloader():
]
chapter_path = Path("tests/test_folder1")
chapter_path.mkdir(parents=True, exist_ok=True)
images = []
images: List[str] = []
downloader.download_chapter(urls, str(chapter_path), 2)
for file in chapter_path.iterdir():
images.append(file.name)
@ -28,7 +30,7 @@ def test_downloader():
shutil.rmtree(chapter_path, ignore_errors=True)
def test_downloader_fail(monkeypatch):
def test_downloader_fail(monkeypatch: MonkeyPatch):
images = [
"https://uploads.mangadex.org/data/f1117c5e7aff315bc3429a8791c89d63/A1-c111d78b798f1dda1879334a3478f7ae4503578e8adf1af0fcc4e14d2a396ad4.png",
"https://uploads.mangadex.org/data/f1117c5e7aff315bc3429a8791c89d63/A2-717ec3c83e8e05ed7b505941431a417ebfed6a005f78b89650efd3b088b951ec.png",


@ -16,13 +16,13 @@ def test_read_and_url():
def test_no_read_and_url():
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
link_file = "tests/testfile.txt"
language = "en"
chapters = "1"
file_format = "cbz"
download_path = "tests"
command_args = f"-l {language} -c {chapters} --path {download_path} --format {file_format} --debug"
command_args = (
f"-l {language} -c {chapters} --path {download_path} --format {file_format} --debug"
)
script_path = "manga-dlp.py"
assert os.system(f"python3 {script_path} {command_args}") != 0
@ -30,10 +30,11 @@ def test_no_read_and_url():
def test_no_chaps():
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
language = "en"
chapters = ""
file_format = "cbz"
download_path = "tests"
command_args = f"-u {url_uuid} -l {language} --path {download_path} --format {file_format} --debug"
command_args = (
f"-u {url_uuid} -l {language} --path {download_path} --format {file_format} --debug"
)
script_path = "manga-dlp.py"
assert os.system(f"python3 {script_path} {command_args}") != 0


@ -4,6 +4,7 @@ import time
from pathlib import Path
import pytest
from pytest import MonkeyPatch
@pytest.fixture
@ -18,7 +19,7 @@ def wait_20s():
time.sleep(20)
def test_manga_pre_hook(wait_10s):
def test_manga_pre_hook(wait_10s: MonkeyPatch):
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
manga_path = Path("tests/Shikimori's Not Just a Cutie")
language = "en"
@ -40,7 +41,7 @@ def test_manga_pre_hook(wait_10s):
manga_pre_hook,
]
script_path = "manga-dlp.py"
command = ["python3", script_path] + command_args
command = ["python3", script_path, *command_args]
assert subprocess.call(command) == 0
assert hook_file.is_file()
@ -50,7 +51,7 @@ def test_manga_pre_hook(wait_10s):
hook_file.unlink()
def test_manga_post_hook(wait_10s):
def test_manga_post_hook(wait_10s: MonkeyPatch):
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
manga_path = Path("tests/Shikimori's Not Just a Cutie")
language = "en"
@ -72,7 +73,7 @@ def test_manga_post_hook(wait_10s):
manga_post_hook,
]
script_path = "manga-dlp.py"
command = ["python3", script_path] + command_args
command = ["python3", script_path, *command_args]
assert subprocess.call(command) == 0
assert hook_file.is_file()
@ -82,7 +83,7 @@ def test_manga_post_hook(wait_10s):
hook_file.unlink()
def test_chapter_pre_hook(wait_10s):
def test_chapter_pre_hook(wait_10s: MonkeyPatch):
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
manga_path = Path("tests/Shikimori's Not Just a Cutie")
language = "en"
@ -104,7 +105,7 @@ def test_chapter_pre_hook(wait_10s):
chapter_pre_hook,
]
script_path = "manga-dlp.py"
command = ["python3", script_path] + command_args
command = ["python3", script_path, *command_args]
assert subprocess.call(command) == 0
assert hook_file.is_file()
@ -114,7 +115,7 @@ def test_chapter_pre_hook(wait_10s):
hook_file.unlink()
def test_chapter_post_hook(wait_10s):
def test_chapter_post_hook(wait_10s: MonkeyPatch):
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
manga_path = Path("tests/Shikimori's Not Just a Cutie")
language = "en"
@@ -136,7 +137,7 @@ def test_chapter_post_hook(wait_10s):
chapter_post_hook,
]
script_path = "manga-dlp.py"
command = ["python3", script_path] + command_args
command = ["python3", script_path, *command_args]
assert subprocess.call(command) == 0
assert hook_file.is_file()
@@ -146,7 +147,7 @@ def test_chapter_post_hook(wait_10s):
hook_file.unlink()
def test_all_hooks(wait_10s):
def test_all_hooks(wait_10s: MonkeyPatch):
url_uuid = "https://mangadex.org/title/0aea9f43-d4a9-4bf7-bebc-550a512f9b95/shikimori-s-not-just-a-cutie"
manga_path = Path("tests/Shikimori's Not Just a Cutie")
language = "en"
@@ -176,7 +177,7 @@ def test_all_hooks(wait_10s):
chapter_post_hook,
]
script_path = "manga-dlp.py"
command = ["python3", script_path] + command_args
command = ["python3", script_path, *command_args]
assert subprocess.call(command) == 0
assert Path("tests/manga-pre2.txt").is_file()
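The recurring edit in this file replaces list concatenation with in-place iterable unpacking when assembling the subprocess command (the pattern flagged by ruff's RUF005 rule). A minimal sketch with illustrative arguments, showing that both forms build the same list:

```python
script_path = "manga-dlp.py"
command_args = ["-u", "https://example.invalid/title/abc", "--debug"]  # illustrative

# old form: list concatenation
command_concat = ["python3", script_path] + command_args
# new form: iterable unpacking (ruff RUF005)
command_unpack = ["python3", script_path, *command_args]

assert command_concat == command_unpack
```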


@@ -6,7 +6,7 @@ from mangadlp.cache import CacheDB
def test_cache_creation():
cache_file = Path("cache.json")
cache = CacheDB(cache_file, "abc", "en", "test")
CacheDB(cache_file, "abc", "en", "test")
assert cache_file.exists()
cache_file.unlink()
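The hunk above drops an unused variable binding: only the constructor's side effect (creating the cache file) is asserted, so the instance needs no name. The same pattern sketched with a stand-in class (`FakeCacheDB` is illustrative, not the real `mangadlp.cache.CacheDB`):

```python
import json
from pathlib import Path

class FakeCacheDB:
    """Stand-in that mimics creating a cache file on construction."""
    def __init__(self, db_path: Path, uuid: str, lang: str, name: str) -> None:
        db_path.write_text(json.dumps({uuid: {"language": lang, "name": name}}))

cache_file = Path("cache.json")
FakeCacheDB(cache_file, "abc", "en", "test")  # side effect only, no binding needed
assert cache_file.exists()
cache_file.unlink()
```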


@@ -5,6 +5,7 @@ from pathlib import Path
import pytest
import xmlschema
from pytest import MonkeyPatch
from mangadlp.metadata import validate_metadata, write_metadata
@@ -110,8 +111,10 @@ def test_metadata_validation_values2():
}
def test_metadata_chapter_validity(wait_20s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
def test_metadata_chapter_validity(wait_20s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
manga_path = Path("tests/Tomo-chan wa Onna no ko")
metadata_path = manga_path / "Ch. 1 - Once In A Life Time Misfire/ComicInfo.xml"
language = "en"
@@ -130,10 +133,10 @@ def test_metadata_chapter_validity(wait_20s):
"",
"--debug",
]
schema = xmlschema.XMLSchema("mangadlp/metadata/ComicInfo_v2.0.xsd")
schema = xmlschema.XMLSchema("src/mangadlp/metadata/ComicInfo_v2.0.xsd")
script_path = "manga-dlp.py"
command = ["python3", script_path] + command_args
command = ["python3", script_path, *command_args]
assert subprocess.call(command) == 0
assert metadata_path.is_file()
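The hunk above updates the XSD location after the move to a `src/` layout; the test then validates the generated `ComicInfo.xml` with the `xmlschema` package. A stdlib-only sketch of the same validation idea (the field names follow the ComicInfo convention, the values are illustrative):

```python
import xml.etree.ElementTree as ET

comic_info = """<?xml version="1.0" encoding="utf-8"?>
<ComicInfo>
  <Title>Once In A Life Time Misfire</Title>
  <Number>1</Number>
  <LanguageISO>en</LanguageISO>
</ComicInfo>"""

root = ET.fromstring(comic_info)
# rough structural checks in place of full XSD validation
assert root.tag == "ComicInfo"
assert root.findtext("Number") == "1"
assert root.findtext("LanguageISO") == "en"
```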


@@ -1,11 +1,14 @@
import pytest
import requests
from pytest import MonkeyPatch
from mangadlp.api.mangadex import Mangadex
def test_uuid_link():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
@@ -33,7 +36,9 @@ def test_uuid_link_false():
def test_title():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
@@ -51,16 +56,20 @@ def test_alt_title():
def test_alt_title_fallback():
url_uuid = "https://mangadex.org/title/d7037b2a-874a-4360-8a7b-07f2899152fd/mairimashita-iruma-kun"
url_uuid = (
"https://mangadex.org/title/d7037b2a-874a-4360-8a7b-07f2899152fd/mairimashita-iruma-kun"
)
language = "fr"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
assert test.manga_title == "Iruma à l’école des démons"
assert test.manga_title == "Iruma à l’école des démons"  # noqa
def test_chapter_infos():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
@@ -79,7 +88,9 @@ def test_chapter_infos():
def test_non_existing_manga():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-999999999999/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-999999999999/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
@@ -88,12 +99,12 @@ def test_non_existing_manga():
assert e.type == KeyError
def test_api_failure(monkeypatch):
fail_url = (
"https://api.mangadex.nonexistant/manga/a96676e5-8ae2-425e-b549-7f15dd34a6d8"
)
def test_api_failure(monkeypatch: MonkeyPatch):
fail_url = "https://api.mangadex.nonexistant/manga/a96676e5-8ae2-425e-b549-7f15dd34a6d8"
monkeypatch.setattr(requests, "get", fail_url)
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
@@ -103,7 +114,9 @@ def test_api_failure(monkeypatch):
def test_chapter_lang_en():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
@@ -112,7 +125,9 @@ def test_chapter_lang_en():
def test_empty_chapter_lang():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "ch"
forcevol = False
@@ -122,7 +137,9 @@ def test_empty_chapter_lang():
def test_not_existing_lang():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "zz"
forcevol = False
@@ -132,9 +149,7 @@ def test_not_existing_lang():
def test_create_chapter_list():
url_uuid = (
"https://mangadex.org/title/6fef1f74-a0ad-4f0d-99db-d32a7cd24098/fire-punch"
)
url_uuid = "https://mangadex.org/title/6fef1f74-a0ad-4f0d-99db-d32a7cd24098/fire-punch"
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
@@ -160,15 +175,76 @@ def test_create_chapter_list():
"19",
"20",
"21",
"22",
"23",
"24",
"25",
"26",
"27",
"28",
"29",
"30",
"31",
"32",
"33",
"34",
"34.5",
"35",
"36",
"37",
"38",
"39",
"40",
"41",
"42",
"43",
"44",
"45",
"46",
"47",
"48",
"49",
"50",
"51",
"52",
"53",
"54",
"55",
"56",
"57",
"58",
"59",
"60",
"61",
"62",
"63",
"64",
"65",
"66",
"67",
"68",
"69",
"70",
"71",
"72",
"73",
"74",
"75",
"76",
"77",
"78",
"79",
"80",
"81",
"82",
"83",
]
assert test.create_chapter_list() == test_list
def test_create_chapter_list_forcevol():
url_uuid = (
"https://mangadex.org/title/6fef1f74-a0ad-4f0d-99db-d32a7cd24098/fire-punch"
)
url_uuid = "https://mangadex.org/title/6fef1f74-a0ad-4f0d-99db-d32a7cd24098/fire-punch"
language = "en"
forcevol = True
test = Mangadex(url_uuid, language, forcevol)
@@ -194,19 +270,83 @@ def test_create_chapter_list_forcevol():
"3:19",
"3:20",
"3:21",
"3:22",
"3:23",
"3:24",
"3:25",
"3:26",
"3:27",
"3:28",
"4:29",
"4:30",
"4:31",
"4:32",
"4:33",
"4:34",
"4:34.5",
"4:35",
"4:36",
"4:37",
"4:38",
"4:39",
"5:40",
"5:41",
"5:42",
"5:43",
"5:44",
"5:45",
"5:46",
"5:47",
"5:48",
"5:49",
"6:50",
"6:51",
"6:52",
"6:53",
"6:54",
"6:55",
"6:56",
"6:57",
"6:58",
"6:59",
"6:60",
"7:61",
"7:62",
"7:63",
"7:64",
"7:65",
"7:66",
"7:67",
"7:68",
"7:69",
"7:70",
"8:71",
"8:72",
"8:73",
"8:74",
"8:75",
"8:76",
"8:77",
"8:78",
"8:79",
"8:80",
"8:81",
"8:82",
"8:83",
]
assert test.create_chapter_list() == test_list
def test_get_chapter_images():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
img_base_url = "https://uploads.mangadex.org"
chapter_hash = "0752bc5db298beff6b932b9151dd8437"
chapter_uuid = "e86ec2c4-c5e4-4710-bfaa-7604f00939c7"
chapter_num = "1"
test_list = [
f"{img_base_url}/data/{chapter_hash}/x1-0deb4c9bfedd5be49e0a90cfb17cf343888239898c9e7451d569c0b3ea2971f4.jpg",
@@ -227,11 +367,11 @@ def test_get_chapter_images():
assert test.get_chapter_images(chapter_num, 2) == test_list
def test_get_chapter_images_error(monkeypatch):
fail_url = (
"https://api.mangadex.org/at-home/server/e86ec2c4-c5e4-4710-bfaa-999999999999"
def test_get_chapter_images_error(monkeypatch: MonkeyPatch):
fail_url = "https://api.mangadex.org/at-home/server/e86ec2c4-c5e4-4710-bfaa-999999999999"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
@@ -242,7 +382,9 @@ def test_get_chapter_images_error(monkeypatch):
def test_chapter_metadata():
url_uuid = "https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
url_uuid = (
"https://mangadex.org/title/a96676e5-8ae2-425e-b549-7f15dd34a6d8/komi-san-wa-komyushou-desu"
)
language = "en"
forcevol = False
test = Mangadex(url_uuid, language, forcevol)
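The failure tests above monkeypatch `requests.get` so the API call cannot succeed (note that patching it with a URL string, as the diff does, makes any call raise `TypeError`, since a string is not callable). The same idea sketched with stdlib `unittest.mock` and an illustrative stand-in class:

```python
from unittest import mock

class Api:
    """Stand-in client; the real tests patch requests.get instead."""
    @staticmethod
    def get(url: str) -> str:
        return "ok"

def fetch() -> str:
    return Api.get("https://api.example.invalid/manga/abc")

with mock.patch.object(Api, "get", side_effect=ConnectionError("unreachable")):
    try:
        fetch()
        raised = False
    except ConnectionError:
        raised = True

assert raised            # the patched call fails as intended
assert fetch() == "ok"   # the original is restored outside the context
```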


@@ -3,8 +3,10 @@ import platform
import shutil
import time
from pathlib import Path
from typing import List
import pytest
from pytest import MonkeyPatch
from mangadlp import app
@@ -21,11 +23,9 @@ def wait_20s():
time.sleep(20)
def test_full_api_mangadex(wait_20s):
def test_full_api_mangadex(wait_20s: MonkeyPatch):
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz")
mdlp = app.MangaDLP(
url_uuid="https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko",
language="en",
@@ -44,16 +44,16 @@ def test_full_api_mangadex(wait_20s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_input_cbz(wait_20s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
def test_full_with_input_cbz(wait_20s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
language = "en"
chapters = "1"
file_format = "cbz"
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz")
command_args = f"-u {url_uuid} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
script_path = "manga-dlp.py"
os.system(f"python3 {script_path} {command_args}")
@@ -64,16 +64,16 @@ def test_full_with_input_cbz(wait_20s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_input_cbz_info(wait_20s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
def test_full_with_input_cbz_info(wait_20s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
language = "en"
chapters = "1"
file_format = "cbz"
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz")
command_args = f"-u {url_uuid} -l {language} -c {chapters} --path {download_path} --format {file_format} --wait 2"
script_path = "manga-dlp.py"
os.system(f"python3 {script_path} {command_args}")
@@ -84,19 +84,17 @@ def test_full_with_input_cbz_info(wait_20s):
shutil.rmtree(manga_path, ignore_errors=True)
@pytest.mark.skipif(
platform.machine() != "x86_64", reason="pdf only supported on amd64"
)
def test_full_with_input_pdf(wait_20s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
@pytest.mark.skipif(platform.machine() != "x86_64", reason="pdf only supported on amd64")
def test_full_with_input_pdf(wait_20s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
language = "en"
chapters = "1"
file_format = "pdf"
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.pdf"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.pdf")
command_args = f"-u {url_uuid} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
script_path = "manga-dlp.py"
os.system(f"python3 {script_path} {command_args}")
@@ -107,16 +105,16 @@ def test_full_with_input_pdf(wait_20s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_input_folder(wait_20s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
def test_full_with_input_folder(wait_20s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
language = "en"
chapters = "1"
file_format = ""
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire")
metadata_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire/ComicInfo.xml"
)
@@ -131,16 +129,16 @@ def test_full_with_input_folder(wait_20s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_input_skip_cbz(wait_10s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
def test_full_with_input_skip_cbz(wait_10s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
language = "en"
chapters = "1"
file_format = "cbz"
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz")
command_args = f"-u {url_uuid} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
script_path = "manga-dlp.py"
manga_path.mkdir(parents=True, exist_ok=True)
@@ -153,22 +151,22 @@ def test_full_with_input_skip_cbz(wait_10s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_input_skip_folder(wait_10s):
url_uuid = "https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
def test_full_with_input_skip_folder(wait_10s: MonkeyPatch):
url_uuid = (
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
)
language = "en"
chapters = "1"
file_format = ""
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire"
)
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire")
command_args = f"-u {url_uuid} -l {language} -c {chapters} --path {download_path} --format '{file_format}' --debug --wait 2"
script_path = "manga-dlp.py"
chapter_path.mkdir(parents=True, exist_ok=True)
os.system(f"python3 {script_path} {command_args}")
found_files = []
found_files: List[str] = []
for file in chapter_path.iterdir():
found_files.append(file.name)
@@ -184,17 +182,15 @@ def test_full_with_input_skip_folder(wait_10s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_read_cbz(wait_20s):
def test_full_with_read_cbz(wait_20s: MonkeyPatch):
url_list = Path("tests/test_list2.txt")
language = "en"
chapters = "1"
file_format = "cbz"
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz"
)
command_args = f"--read {str(url_list)} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz")
command_args = f"--read {url_list!s} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
script_path = "manga-dlp.py"
url_list.write_text(
"https://mangadex.org/title/76ee7069-23b4-493c-bc44-34ccbf3051a8/tomo-chan-wa-onna-no-ko"
@@ -208,17 +204,15 @@ def test_full_with_read_cbz(wait_20s):
shutil.rmtree(manga_path, ignore_errors=True)
def test_full_with_read_skip_cbz(wait_10s):
def test_full_with_read_skip_cbz(wait_10s: MonkeyPatch):
url_list = Path("tests/test_list2.txt")
language = "en"
chapters = "1"
file_format = "cbz"
download_path = "tests"
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = Path(
"tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz"
)
command_args = f"--read {str(url_list)} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
chapter_path = Path("tests/Tomo-chan wa Onna no ko/Ch. 1 - Once In A Life Time Misfire.cbz")
command_args = f"--read {url_list!s} -l {language} -c {chapters} --path {download_path} --format {file_format} --debug --wait 2"
script_path = "manga-dlp.py"
manga_path.mkdir(parents=True, exist_ok=True)
chapter_path.touch()
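The `--read` hunks also swap `{str(url_list)}` for the equivalent `!s` conversion inside the f-string (the pattern ruff's RUF010 rule rewrites); both render the `Path` identically:

```python
from pathlib import Path

url_list = Path("tests/test_list2.txt")

old_args = f"--read {str(url_list)} -l en"  # explicit str() call
new_args = f"--read {url_list!s} -l en"     # !s conversion (ruff RUF010)

assert old_args == new_args
```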


@@ -4,6 +4,7 @@ import time
from pathlib import Path
import pytest
from pytest import MonkeyPatch
@pytest.fixture
@@ -18,7 +19,7 @@ def wait_20s():
time.sleep(20)
def test_full_with_all_flags(wait_20s):
def test_full_with_all_flags(wait_20s: MonkeyPatch):
manga_path = Path("tests/Tomo-chan wa Onna no ko")
chapter_path = manga_path / "Ch. 1 - Once In A Life Time Misfire.cbz"
cache_path = Path("tests/test_cache.json")

tox.ini

@@ -1,31 +0,0 @@
[tox]
envlist = py38, py39, py310
isolated_build = True
[testenv]
deps =
-rcontrib/requirements_dev.txt
commands =
pytest --verbose --exitfirst --basetemp="{envtmpdir}" {posargs}
[testenv:basic]
deps =
-rcontrib/requirements_dev.txt
commands =
pytest --verbose --exitfirst --basetemp="{envtmpdir}" {posargs}
[testenv:coverage]
deps =
-rcontrib/requirements_dev.txt
commands =
coverage erase
coverage run
coverage xml -i
[pylama]
format = pycodestyle
linters = mccabe,pycodestyle,pyflakes
ignore = E501,C901,C0301