Compare commits
1732 commits: v1.7.1 ... v1.12.0-be
[Commit table elided: 1732 commits between v1.7.1 and v1.12.0-be, from 454264a87f through 3c8196527f. Only the SHA1 column of the Author | SHA1 | Date table survived extraction; author, date, and commit-message data were lost.]
@@ -1,6 +1,6 @@
 {
     "qpdf": {
-        "version": "10.6.3"
+        "version": "11.2.0"
    },
     "jbig2enc": {
         "version": "0.29",
@@ -27,6 +27,9 @@ indent_style = space
 [*.md]
 indent_style = space
 
+[Pipfile.lock]
+indent_style = space
+
 # Tests don't get a line width restriction. It's still a good idea to follow
 # the 79 character rule, but in the interests of clarity, tests often need to
 # violate it.

.github/ISSUE_TEMPLATE/bug-report.yml (21 changes)
@@ -6,13 +6,14 @@ body:
   - type: markdown
     attributes:
       value: |
-        Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperless:adnidor.de).
+        Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperlessngx:matrix.org).
 
         Before opening an issue, please double check:
 
-        - [The troubleshooting documentation](https://paperless-ngx.readthedocs.io/en/latest/troubleshooting.html).
-        - [The installation instructions](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation).
+        - [The troubleshooting documentation](https://docs.paperless-ngx.com/troubleshooting/).
+        - [The installation instructions](https://docs.paperless-ngx.com/setup/#installation).
         - [Existing issues and discussions](https://github.com/paperless-ngx/paperless-ngx/search?q=&type=issues).
+        - Disable any customer container initialization scripts, if using any
 
         If you encounter issues while installing or configuring Paperless-ngx, please post in the ["Support" section of the discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=support).
   - type: textarea
@@ -41,7 +42,15 @@ body:
     id: logs
     attributes:
       label: Webserver logs
-      description: If available, post any logs from the web server related to your issue.
+      description: Logs from the web server related to your issue.
+      render: bash
+    validations:
+      required: true
+  - type: textarea
+    id: logs_browser
+    attributes:
+      label: Browser logs
+      description: Logs from the web browser related to your issue, if needed
       render: bash
   - type: input
     id: version
@@ -63,9 +72,11 @@ body:
     attributes:
       label: Installation method
       options:
-        - Docker
+        - Docker - official image
+        - Docker - linuxserver.io image
         - Bare metal
         - Other (please describe above)
+      description: Note there are significant differences from the official image and linuxserver.io, please check if your issue is specific to the third-party image.
     validations:
       required: true
   - type: input

.github/ISSUE_TEMPLATE/config.yml (2 changes)
@@ -4,7 +4,7 @@ contact_links:
     url: https://github.com/paperless-ngx/paperless-ngx/discussions
     about: This issue tracker is not for support questions. Please refer to our Discussions.
   - name: 💬 Chat
-    url: https://matrix.to/#/#paperless:adnidor.de
+    url: https://matrix.to/#/#paperlessngx:matrix.org
     about: Want to discuss Paperless-ngx with others? Check out our chat.
   - name: 🚀 Feature Request
     url: https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=feature-requests

.github/PULL_REQUEST_TEMPLATE.md (4 changes)
@@ -26,7 +26,7 @@ NOTE: Please check only one box!
 
 - [ ] I have read & agree with the [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).
 - [ ] If applicable, I have tested my code for new features & regressions on both mobile & desktop devices, using the latest version of major browsers.
-- [ ] If applicable, I have checked that all tests pass, see [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html#back-end-development).
-- [ ] I have run all `pre-commit` hooks, see [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html#code-formatting-with-pre-commit-hooks).
+- [ ] If applicable, I have checked that all tests pass, see [documentation](https://docs.paperless-ngx.com/development/#back-end-development).
+- [ ] I have run all `pre-commit` hooks, see [documentation](https://docs.paperless-ngx.com/development/#code-formatting-with-pre-commit-hooks).
 - [ ] I have made corresponding changes to the documentation as needed.
 - [ ] I have checked my modifications for any breaking changes.

.github/release-drafter.yml (41 changes)
@@ -1,7 +1,22 @@
+autolabeler:
+  - label: "bug"
+    branch:
+      - '/^fix/'
+    title:
+      - "/^fix/i"
+      - "/^Bugfix/i"
+  - label: "enhancement"
+    branch:
+      - '/^feature/'
+    title:
+      - "/^feature/i"
 categories:
   - title: 'Breaking Changes'
     labels:
       - 'breaking-change'
+  - title: 'Notable Changes'
+    labels:
+      - 'notable'
   - title: 'Features'
     labels:
       - 'enhancement'
@@ -9,15 +24,23 @@ categories:
     labels:
       - 'bug'
   - title: 'Documentation'
-    label: 'documentation'
+    labels:
+      - 'documentation'
   - title: 'Maintenance'
     labels:
       - 'chore'
       - 'deployment'
       - 'translation'
+      - 'ci-cd'
   - title: 'Dependencies'
     collapse-after: 3
-    label: 'dependencies'
+    labels:
+      - 'dependencies'
+  - title: 'All App Changes'
+    labels:
+      - 'frontend'
+      - 'backend'
+    collapse-after: 0
 include-labels:
   - 'enhancement'
   - 'bug'
@@ -25,12 +48,16 @@ include-labels:
   - 'deployment'
   - 'translation'
   - 'dependencies'
-replacers: # Changes "Feature: Update checker" to "Update checker"
-  - search: '/Feature:|Feat:|\[feature\]/gi'
-    replace: ''
-change-template: '- $TITLE @$AUTHOR (#$NUMBER)'
+  - 'documentation'
+  - 'frontend'
+  - 'backend'
+  - 'ci-cd'
+  - 'breaking-change'
+  - 'notable'
+category-template: '### $TITLE'
+change-template: '- $TITLE @$AUTHOR ([#$NUMBER]($URL))'
 change-title-escapes: '\<*_&#@'
 template: |
-  # Changelog
+  ## paperless-ngx $RESOLVED_VERSION
 
   $CHANGES
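The new autolabeler block applies labels from case-insensitive regular expressions matched against a PR's branch name and title. Roughly what the title rules do, sketched in Python for illustration (release-drafter evaluates the `/.../i` patterns itself; the sample titles are invented):

```python
import re

# Title patterns copied from the autolabeler entries above
BUG_TITLES = [r"^fix", r"^Bugfix"]
ENHANCEMENT_TITLES = [r"^feature"]

def autolabel(title: str) -> list:
    """Return the labels the autolabeler would apply for this PR title."""
    labels = []
    if any(re.match(p, title, re.IGNORECASE) for p in BUG_TITLES):
        labels.append("bug")
    if any(re.match(p, title, re.IGNORECASE) for p in ENHANCEMENT_TITLES):
        labels.append("enhancement")
    return labels

print(autolabel("Fix: crash while parsing barcodes"))  # ['bug']
print(autolabel("Feature: bulk edit"))                 # ['enhancement']
```

Because "bug" and "enhancement" also appear in include-labels, anything the autolabeler tags lands in the drafted changelog automatically.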
402
.github/scripts/cleanup-tags.py
vendored
Normal file
@@ -0,0 +1,402 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
import json
|
||||||
|
import logging
|
||||||
|
import os
|
||||||
|
import shutil
|
||||||
|
import subprocess
|
||||||
|
from argparse import ArgumentParser
|
||||||
|
from typing import Dict
|
||||||
|
from typing import Final
|
||||||
|
from typing import List
|
||||||
|
from typing import Optional
|
||||||
|
|
||||||
|
from common import get_log_level
|
||||||
|
from github import ContainerPackage
|
||||||
|
from github import GithubBranchApi
|
||||||
|
from github import GithubContainerRegistryApi
|
||||||
|
|
||||||
|
logger = logging.getLogger("cleanup-tags")
|
||||||
|
|
||||||
|
|
||||||
|
class DockerManifest2:
|
||||||
|
"""
|
||||||
|
Data class wrapping the Docker Image Manifest Version 2.
|
||||||
|
|
||||||
|
See https://docs.docker.com/registry/spec/manifest-v2-2/
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, data: Dict) -> None:
|
||||||
|
self._data = data
|
||||||
|
# This is the sha256: digest string. Corresponds to GitHub API name
|
||||||
|
# if the package is an untagged package
|
||||||
|
self.digest = self._data["digest"]
|
||||||
|
platform_data_os = self._data["platform"]["os"]
|
||||||
|
platform_arch = self._data["platform"]["architecture"]
|
||||||
|
platform_variant = self._data["platform"].get(
|
||||||
|
"variant",
|
||||||
|
"",
|
||||||
|
)
|
||||||
|
self.platform = f"{platform_data_os}/{platform_arch}{platform_variant}"
|
||||||
|
|
||||||
|
|
||||||
|
class RegistryTagsCleaner:
|
||||||
|
"""
|
||||||
|
This is the base class for the image registry cleaning. Given a package
|
||||||
|
name, it will keep all images which are tagged and all untagged images
|
||||||
|
referred to by a manifest. This results in only images which have been untagged
|
||||||
|
and cannot be referenced except by their SHA in being removed. None of these
|
||||||
|
images should be referenced, so it is fine to delete them.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
package_name: str,
|
||||||
|
repo_owner: str,
|
||||||
|
repo_name: str,
|
||||||
|
package_api: GithubContainerRegistryApi,
|
||||||
|
branch_api: Optional[GithubBranchApi],
|
||||||
|
):
|
||||||
|
self.actually_delete = False
|
||||||
|
self.package_api = package_api
|
||||||
|
self.branch_api = branch_api
|
||||||
|
self.package_name = package_name
|
||||||
|
self.repo_owner = repo_owner
|
||||||
|
self.repo_name = repo_name
|
||||||
|
self.tags_to_delete: List[str] = []
|
||||||
|
self.tags_to_keep: List[str] = []
|
||||||
|
|
||||||
|
# Get the information about all versions of the given package
|
||||||
|
# These are active, not deleted, the default returned from the API
|
||||||
|
self.all_package_versions = self.package_api.get_active_package_versions(
|
||||||
|
self.package_name,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get a mapping from a tag like "1.7.0" or "feature-xyz" to the ContainerPackage
|
||||||
|
# tagged with it. It makes certain lookups easy
|
||||||
|
self.all_pkgs_tags_to_version: Dict[str, ContainerPackage] = {}
|
||||||
|
for pkg in self.all_package_versions:
|
||||||
|
for tag in pkg.tags:
|
||||||
|
self.all_pkgs_tags_to_version[tag] = pkg
|
||||||
|
logger.info(
|
||||||
|
f"Located {len(self.all_package_versions)} versions of package {self.package_name}",
|
||||||
|
)
|
||||||
|
|
||||||
|
self.decide_what_tags_to_keep()
|
||||||
|
|
||||||
|
def clean(self):
|
||||||
|
"""
|
||||||
|
This method will delete image versions, based on the selected tags to delete
|
||||||
|
"""
|
||||||
|
for tag_to_delete in self.tags_to_delete:
|
||||||
|
package_version_info = self.all_pkgs_tags_to_version[tag_to_delete]
|
||||||
|
|
||||||
|
if self.actually_delete:
|
||||||
|
logger.info(
|
||||||
|
f"Deleting {tag_to_delete} (id {package_version_info.id})",
|
||||||
|
)
|
||||||
|
self.package_api.delete_package_version(
|
||||||
|
package_version_info,
|
||||||
|
)
|
||||||
|
|
||||||
|
else:
|
||||||
|
logger.info(
|
||||||
|
f"Would delete {tag_to_delete} (id {package_version_info.id})",
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.info("No tags to delete")
|
||||||
|
|
||||||
|
def clean_untagged(self, is_manifest_image: bool):
|
||||||
|
"""
|
||||||
|
This method will delete untagged images, that is those which are not named. It
|
||||||
|
handles if the image tag is actually a manifest, which points to images that look otherwise
|
||||||
|
untagged.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def _clean_untagged_manifest():
|
||||||
|
"""
|
||||||
|
|
||||||
|
Handles the deletion of untagged images, but where the package is a manifest, ie a multi
|
||||||
|
arch image, which means some "untagged" images need to exist still.
|
||||||
|
|
||||||
|
Ok, bear with me, these are annoying.
|
||||||
|
|
||||||
|
Our images are multi-arch, so the manifest is more like a pointer to a sha256 digest.
|
||||||
|
These images are untagged, but pointed to, and so should not be removed (or every pull fails).
|
||||||
|
|
||||||
|
So for each image getting kept, parse the manifest to find the digest(s) it points to. Then
|
||||||
|
remove those from the list of untagged images. The final result is the untagged, not pointed to
|
||||||
|
version which should be safe to remove.
|
||||||
|
|
||||||
|
Example:
|
||||||
|
Tag: ghcr.io/paperless-ngx/paperless-ngx:1.7.1 refers to
|
||||||
|
amd64: sha256:b9ed4f8753bbf5146547671052d7e91f68cdfc9ef049d06690b2bc866fec2690
|
||||||
|
armv7: sha256:81605222df4ba4605a2ba4893276e5d08c511231ead1d5da061410e1bbec05c3
|
||||||
|
arm64: sha256:374cd68db40734b844705bfc38faae84cc4182371de4bebd533a9a365d5e8f3b
|
||||||
|
each of which appears as untagged image, but isn't really.
|
||||||
|
|
||||||
|
So from the list of untagged packages, remove those digests. Once all tags which
|
||||||
|
are being kept are checked, the remaining untagged packages are actually untagged
|
||||||
|
with no referrals in a manifest to them.
|
||||||
|
"""
|
||||||
|
# Simplify the untagged data, mapping name (which is a digest) to the version
|
||||||
|
# At the moment, these are the images which APPEAR untagged.
|
||||||
|
untagged_versions = {}
|
||||||
|
for x in self.all_package_versions:
|
||||||
|
if x.untagged:
|
||||||
|
untagged_versions[x.name] = x
|
||||||
|
|
||||||
|
skips = 0
|
||||||
|
|
||||||
|
# Parse manifests to locate digests pointed to
|
||||||
|
for tag in sorted(self.tags_to_keep):
|
||||||
|
full_name = f"ghcr.io/{self.repo_owner}/{self.package_name}:{tag}"
|
||||||
|
logger.info(f"Checking manifest for {full_name}")
|
||||||
|
try:
|
||||||
|
proc = subprocess.run(
|
||||||
|
[
|
||||||
|
shutil.which("docker"),
|
||||||
|
"manifest",
|
||||||
|
"inspect",
|
||||||
|
full_name,
|
||||||
|
],
|
||||||
|
capture_output=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
manifest_list = json.loads(proc.stdout)
|
||||||
|
for manifest_data in manifest_list["manifests"]:
|
||||||
|
manifest = DockerManifest2(manifest_data)
|
||||||
|
|
||||||
|
if manifest.digest in untagged_versions:
|
||||||
|
logger.info(
|
||||||
|
f"Skipping deletion of {manifest.digest},"
|
||||||
|
f" referred to by {full_name}"
|
||||||
|
f" for {manifest.platform}",
|
||||||
|
)
|
||||||
|
del untagged_versions[manifest.digest]
|
||||||
|
skips += 1
|
||||||
|
|
||||||
|
except Exception as err:
|
||||||
|
self.actually_delete = False
|
||||||
|
logger.exception(err)
|
||||||
|
return
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
f"Skipping deletion of {skips} packages referred to by a manifest",
|
||||||
|
)
|
||||||
|
|
||||||
|
# Delete the untagged and not pointed at packages
|
||||||
|
logger.info(f"Deleting untagged packages of {self.package_name}")
|
||||||
|
for to_delete_name in untagged_versions:
|
||||||
|
to_delete_version = untagged_versions[to_delete_name]
|
||||||
|
|
||||||
|
if self.actually_delete:
|
||||||
|
logger.info(
|
||||||
|
f"Deleting id {to_delete_version.id} named {to_delete_version.name}",
|
||||||
|
)
|
||||||
|
self.package_api.delete_package_version(
|
||||||
|
to_delete_version,
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.info(
|
||||||
|
f"Would delete {to_delete_name} (id {to_delete_version.id})",
|
||||||
|
)
|
||||||
|
|
||||||
|
def _clean_untagged_non_manifest():
|
||||||
|
"""
|
||||||
|
If the package is not a multi-arch manifest, images without tags are safe to delete.
|
||||||
|
"""
|
||||||
|
|
||||||
|
for package in self.all_package_versions:
|
||||||
|
if package.untagged:
|
||||||
|
if self.actually_delete:
|
||||||
|
logger.info(
|
||||||
|
f"Deleting id {package.id} named {package.name}",
|
||||||
|
)
|
||||||
|
self.package_api.delete_package_version(
|
||||||
|
package,
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.info(
|
||||||
|
f"Would delete {package.name} (id {package.id})",
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.info(
|
||||||
|
f"Not deleting tag {package.tags[0]} of package {self.package_name}",
|
||||||
|
)
|
||||||
|
|
||||||
|
logger.info("Beginning untagged image cleaning")
|
||||||
|
|
||||||
|
if is_manifest_image:
|
||||||
|
_clean_untagged_manifest()
|
||||||
|
else:
|
||||||
|
_clean_untagged_non_manifest()
|
||||||
|
|
||||||
|
def decide_what_tags_to_keep(self):
|
||||||
|
"""
|
||||||
|
This method holds the logic to delete what tags to keep and there fore
|
||||||
|
what tags to delete.
|
||||||
|
|
||||||
|
By default, any image with at least 1 tag will be kept
|
||||||
|
"""
|
        # By default, keep anything which is tagged
        self.tags_to_keep = list(set(self.all_pkgs_tags_to_version.keys()))


class MainImageTagsCleaner(RegistryTagsCleaner):
    def decide_what_tags_to_keep(self):
        """
        Overrides the default logic for deciding what images to keep.  Images tagged as "feature-"
        will be removed if the corresponding branch no longer exists.
        """

        # Default to everything still being kept
        super().decide_what_tags_to_keep()

        # Locate the feature branches
        feature_branches = {}
        for branch in self.branch_api.get_branches(
            repo=self.repo_name,
        ):
            if branch.name.startswith("feature-"):
                logger.debug(f"Found feature branch {branch.name}")
                feature_branches[branch.name] = branch

        logger.info(f"Located {len(feature_branches)} feature branches")

        if not len(feature_branches):
            # Our work here is done, delete nothing
            return

        # Filter to packages which are tagged with feature-*
        packages_tagged_feature: List[ContainerPackage] = []
        for package in self.all_package_versions:
            if package.tag_matches("feature-"):
                packages_tagged_feature.append(package)

        # Map tags like "feature-xyz" to a ContainerPackage
        feature_pkgs_tags_to_versions: Dict[str, ContainerPackage] = {}
        for pkg in packages_tagged_feature:
            for tag in pkg.tags:
                feature_pkgs_tags_to_versions[tag] = pkg

        logger.info(
            f'Located {len(feature_pkgs_tags_to_versions)} versions of package {self.package_name} tagged "feature-"',
        )

        # All the feature tags minus all the feature branches leaves us feature tags
        # with no corresponding branch
        self.tags_to_delete = list(
            set(feature_pkgs_tags_to_versions.keys()) - set(feature_branches.keys()),
        )

        # All the tags minus the set of to-be-deleted tags leaves us the
        # tags which will be kept around
        self.tags_to_keep = list(
            set(self.all_pkgs_tags_to_version.keys()) - set(self.tags_to_delete),
        )
        logger.info(
            f"Located {len(self.tags_to_delete)} versions of package {self.package_name} to delete",
        )


class LibraryTagsCleaner(RegistryTagsCleaner):
    """
    Exists for the off chance that someday the installer library images
    will need their own logic
    """

    pass


def _main():
    parser = ArgumentParser(
        description="Using the GitHub API, locate and optionally delete container"
        " tags which no longer have an associated feature branch",
    )

    # Requires an affirmative command to actually do a delete
    parser.add_argument(
        "--delete",
        action="store_true",
        default=False,
        help="If provided, actually delete the container tags",
    )

    # When a tagged image is updated, the previous version remains, but it is no longer tagged
    # Add this option to remove them as well
    parser.add_argument(
        "--untagged",
        action="store_true",
        default=False,
        help="If provided, delete untagged containers as well",
    )

    # If given, the package is assumed to be a multi-arch manifest.  Cache packages are
    # not multi-arch, all other types are
    parser.add_argument(
        "--is-manifest",
        action="store_true",
        default=False,
        help="If provided, the package is assumed to be a multi-arch manifest following schema v2",
    )

    # Allows configuration of log level for debugging
    parser.add_argument(
        "--loglevel",
        default="info",
        help="Configures the logging level",
    )

    # Get the name of the package being processed this round
    parser.add_argument(
        "package",
        help="The package to process",
    )

    args = parser.parse_args()

    logging.basicConfig(
        level=get_log_level(args),
        datefmt="%Y-%m-%d %H:%M:%S",
        format="%(asctime)s %(levelname)-8s %(message)s",
    )

    # Must be provided in the environment
    repo_owner: Final[str] = os.environ["GITHUB_REPOSITORY_OWNER"]
    repo: Final[str] = os.environ["GITHUB_REPOSITORY"]
    gh_token: Final[str] = os.environ["TOKEN"]

    # Find all branches named feature-*
    # Note: Only relevant to the main application, but simpler to
    # leave in for all packages
    with GithubBranchApi(gh_token) as branch_api:
        with GithubContainerRegistryApi(gh_token, repo_owner) as container_api:
            if args.package in {"paperless-ngx", "paperless-ngx/builder/cache/app"}:
                cleaner = MainImageTagsCleaner(
                    args.package,
                    repo_owner,
                    repo,
                    container_api,
                    branch_api,
                )
            else:
                cleaner = LibraryTagsCleaner(
                    args.package,
                    repo_owner,
                    repo,
                    container_api,
                    None,
                )

            # Set if actually doing a delete vs dry run
            cleaner.actually_delete = args.delete

            # Clean images with tags
            cleaner.clean()

            # Clean images which are untagged
            cleaner.clean_untagged(args.is_manifest)


if __name__ == "__main__":
    _main()
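The selection logic in MainImageTagsCleaner boils down to set arithmetic over tag and branch names. A minimal, runnable sketch of the same idea, using invented tag and branch names rather than anything from a real registry:

    # Hypothetical tags and branches, illustrating decide_what_tags_to_keep
    feature_tags = {"feature-ocr", "feature-ui", "feature-old"}
    feature_branches = {"feature-ocr", "feature-ui"}  # the "feature-old" branch was deleted
    all_tags = feature_tags | {"latest", "dev", "v1.9.2"}

    # Feature tags with no corresponding branch become deletion candidates
    tags_to_delete = feature_tags - feature_branches

    # Everything not slated for deletion is kept
    tags_to_keep = all_tags - tags_to_delete

    assert tags_to_delete == {"feature-old"}
    assert tags_to_keep == {"feature-ocr", "feature-ui", "latest", "dev", "v1.9.2"}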
.github/scripts/common.py (vendored, 25 lines changed)
@@ -1,4 +1,5 @@
 #!/usr/bin/env python3
+import logging


 def get_image_tag(
@@ -9,7 +10,7 @@ def get_image_tag(
     """
     Returns a string representing the normal image for a given package
     """
-    return f"ghcr.io/{repo_name}/builder/{pkg_name}:{pkg_version}"
+    return f"ghcr.io/{repo_name.lower()}/builder/{pkg_name}:{pkg_version}"


 def get_cache_image_tag(
@@ -24,4 +25,24 @@ def get_cache_image_tag(
     Registry type caching is utilized for the builder images, to allow fast
     rebuilds, generally almost instant for the same version
     """
-    return f"ghcr.io/{repo_name}/builder/cache/{pkg_name}:{pkg_version}"
+    return f"ghcr.io/{repo_name.lower()}/builder/cache/{pkg_name}:{pkg_version}"
+
+
+def get_log_level(args) -> int:
+    """
+    Returns a logging level, based on the passed arguments
+    :param args:
+    :return:
+    """
+    levels = {
+        "critical": logging.CRITICAL,
+        "error": logging.ERROR,
+        "warn": logging.WARNING,
+        "warning": logging.WARNING,
+        "info": logging.INFO,
+        "debug": logging.DEBUG,
+    }
+    level = levels.get(args.loglevel.lower())
+    if level is None:
+        level = logging.INFO
+    return level
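The `.lower()` calls added above matter because container image references must be all lowercase, while `GITHUB_REPOSITORY` preserves whatever casing the owner registered. A small illustration; the repository name and version below are made up for the example:

    # Hypothetical mixed-case value, as GITHUB_REPOSITORY might supply it
    repo_name = "Some-Owner/Paperless-NGX"

    # Before the fix: mixed case, which registries reject as an image reference
    print(f"ghcr.io/{repo_name}/builder/qpdf:11.1.1")

    # After the fix: normalized to lowercase, a valid reference
    print(f"ghcr.io/{repo_name.lower()}/builder/qpdf:11.1.1")
    # ghcr.io/some-owner/paperless-ngx/builder/qpdf:11.1.1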
.github/scripts/get-build-json.py (vendored, 10 lines changed)
@@ -50,7 +50,6 @@ def _main():

     # Default output values
     version = None
-    git_tag = None
     extra_config = {}

     if args.package in pipfile_data["default"]:
@@ -59,12 +58,6 @@ def _main():
         pkg_version = pkg_data["version"].split("==")[-1]
         version = pkg_version

-        # Based on the package, generate the expected Git tag name
-        if args.package == "pikepdf":
-            git_tag = f"v{pkg_version}"
-        elif args.package == "psycopg2":
-            git_tag = pkg_version.replace(".", "_")
-
     # Any extra/special values needed
     if args.package == "pikepdf":
         extra_config["qpdf_version"] = build_json["qpdf"]["version"]
@@ -72,8 +65,6 @@ def _main():
     elif args.package in build_json:
         version = build_json[args.package]["version"]

-        if "git_tag" in build_json[args.package]:
-            git_tag = build_json[args.package]["git_tag"]
     else:
         raise NotImplementedError(args.package)

@@ -81,7 +72,6 @@ def _main():
     output = {
         "name": args.package,
         "version": version,
-        "git_tag": git_tag,
         "image_tag": get_image_tag(repo_name, args.package, version),
         "cache_tag": get_cache_image_tag(
             repo_name,
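With the `git_tag` plumbing removed above, the script's JSON output reduces to roughly the following shape. The values here are illustrative, not read from a real `Pipfile.lock` or `.build-config.json`:

    # Illustrative output shape of get-build-json.py after this change
    output = {
        "name": "qpdf",
        "version": "11.1.1",
        "image_tag": "ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:11.1.1",
        "cache_tag": "ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:11.1.1",
    }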
.github/scripts/github.py (vendored, new file, 274 lines)
#!/usr/bin/env python3
"""
This module contains some useful classes for interacting with the Github API.
The full documentation for the API can be found here: https://docs.github.com/en/rest

Mostly, this focuses on two areas, repo branches and repo packages, as the use case
is cleaning up container images which are no longer referred to.

"""
import functools
import logging
import re
import urllib.parse
from typing import Dict
from typing import List
from typing import Optional

import httpx

logger = logging.getLogger("github-api")


class _GithubApiBase:
    """
    A base class for interacting with the Github API.  It
    will handle the session and setting authorization headers.
    """

    def __init__(self, token: str) -> None:
        self._token = token
        self._client: Optional[httpx.Client] = None

    def __enter__(self) -> "_GithubApiBase":
        """
        Sets up the required headers for auth and response
        type from the API
        """
        self._client = httpx.Client()
        self._client.headers.update(
            {
                "Accept": "application/vnd.github.v3+json",
                "Authorization": f"token {self._token}",
            },
        )
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        """
        Ensures the authorization token is cleaned up no matter
        the reason for the exit
        """
        if "Accept" in self._client.headers:
            del self._client.headers["Accept"]
        if "Authorization" in self._client.headers:
            del self._client.headers["Authorization"]

        # Close the session as well
        self._client.close()
        self._client = None

    def _read_all_pages(self, endpoint):
        """
        Helper function to read all pages of an endpoint, utilizing the
        next.url until exhausted.  Assumes the endpoint returns a list
        """
        internal_data = []

        while True:
            resp = self._client.get(endpoint)
            if resp.status_code == 200:
                internal_data += resp.json()
                if "next" in resp.links:
                    endpoint = resp.links["next"]["url"]
                else:
                    logger.debug("Exiting pagination loop")
                    break
            else:
                logger.warning(f"Request to {endpoint} returned HTTP {resp.status_code}")
                resp.raise_for_status()

        return internal_data


class _EndpointResponse:
    """
    For all endpoint JSON responses, store the full
    response data, for ease of extending later, if need be.
    """

    def __init__(self, data: Dict) -> None:
        self._data = data


class GithubBranch(_EndpointResponse):
    """
    Simple wrapper for a repository branch, only extracts name information
    for now.
    """

    def __init__(self, data: Dict) -> None:
        super().__init__(data)
        self.name = self._data["name"]


class GithubBranchApi(_GithubApiBase):
    """
    Wrapper around branch API.

    See https://docs.github.com/en/rest/branches/branches
    """

    def __init__(self, token: str) -> None:
        super().__init__(token)

        self._ENDPOINT = "https://api.github.com/repos/{REPO}/branches"

    def get_branches(self, repo: str) -> List[GithubBranch]:
        """
        Returns all current branches of the given repository owned by the given
        owner or organization.
        """
        # The environment GITHUB_REPOSITORY already contains the owner in the correct location
        endpoint = self._ENDPOINT.format(REPO=repo)
        internal_data = self._read_all_pages(endpoint)
        return [GithubBranch(branch) for branch in internal_data]


class ContainerPackage(_EndpointResponse):
    """
    Data class wrapping the JSON response from the package related
    endpoints
    """

    def __init__(self, data: Dict):
        super().__init__(data)
        # This is a numerical ID, required for interactions with this
        # specific package, including deletion of it or restoration
        self.id: int = self._data["id"]

        # A string name.  This might be an actual name or it could be a
        # digest string like "sha256:"
        self.name: str = self._data["name"]

        # URL to the package, including its ID, can be used for deletion
        # or restoration without needing to build up a URL ourselves
        self.url: str = self._data["url"]

        # The list of tags applied to this image.  May be an empty list
        self.tags: List[str] = self._data["metadata"]["container"]["tags"]

    @functools.cached_property
    def untagged(self) -> bool:
        """
        Returns True if the image has no tags applied to it, False otherwise
        """
        return len(self.tags) == 0

    @functools.cache
    def tag_matches(self, pattern: str) -> bool:
        """
        Returns True if the image has at least one tag which matches the given regex,
        False otherwise
        """
        for tag in self.tags:
            if re.match(pattern, tag) is not None:
                return True
        return False

    def __repr__(self):
        return f"Package {self.name}"


class GithubContainerRegistryApi(_GithubApiBase):
    """
    Class wrapper to deal with the Github packages API.  This class only deals with
    container type packages, the only type published by paperless-ngx.
    """

    def __init__(self, token: str, owner_or_org: str) -> None:
        super().__init__(token)
        self._owner_or_org = owner_or_org
        if self._owner_or_org == "paperless-ngx":
            # https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-an-organization
            self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
            # https://docs.github.com/en/rest/packages#delete-package-version-for-an-organization
            self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
        else:
            # https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-the-authenticated-user
            self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
            # https://docs.github.com/en/rest/packages#delete-a-package-version-for-the-authenticated-user
            self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
        self._PACKAGE_VERSION_RESTORE_ENDPOINT = (
            f"{self._PACKAGE_VERSION_DELETE_ENDPOINT}/restore"
        )

    def get_active_package_versions(
        self,
        package_name: str,
    ) -> List[ContainerPackage]:
        """
        Returns all the versions of a given package (container images) from
        the API
        """

        package_type: str = "container"
        # Need to quote this for slashes in the name
        package_name = urllib.parse.quote(package_name, safe="")

        endpoint = self._PACKAGES_VERSIONS_ENDPOINT.format(
            ORG=self._owner_or_org,
            PACKAGE_TYPE=package_type,
            PACKAGE_NAME=package_name,
        )

        pkgs = []

        for data in self._read_all_pages(endpoint):
            pkgs.append(ContainerPackage(data))

        return pkgs

    def get_deleted_package_versions(
        self,
        package_name: str,
    ) -> List[ContainerPackage]:
        package_type: str = "container"
        # Need to quote this for slashes in the name
        package_name = urllib.parse.quote(package_name, safe="")

        endpoint = (
            self._PACKAGES_VERSIONS_ENDPOINT.format(
                ORG=self._owner_or_org,
                PACKAGE_TYPE=package_type,
                PACKAGE_NAME=package_name,
            )
            + "?state=deleted"
        )

        pkgs = []

        for data in self._read_all_pages(endpoint):
            pkgs.append(ContainerPackage(data))

        return pkgs

    def delete_package_version(self, package_data: ContainerPackage):
        """
        Deletes the given package version from the GHCR
        """
        resp = self._client.delete(package_data.url)
        if resp.status_code != 204:
            logger.warning(
                f"Request to delete {package_data.url} returned HTTP {resp.status_code}",
            )

    def restore_package_version(
        self,
        package_name: str,
        package_data: ContainerPackage,
    ):
        package_type: str = "container"
        endpoint = self._PACKAGE_VERSION_RESTORE_ENDPOINT.format(
            ORG=self._owner_or_org,
            PACKAGE_TYPE=package_type,
            PACKAGE_NAME=package_name,
            PACKAGE_VERSION_ID=package_data.id,
        )

        resp = self._client.post(endpoint)
        if resp.status_code != 204:
            logger.warning(
                f"Request to restore {endpoint} returned HTTP {resp.status_code}",
            )
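Taken together, the classes above compose into a small, session-scoped client. A usage sketch, not part of the repository itself; the package name is an example, and the real entry point is `cleanup-tags.py` above:

    import os

    from github import GithubContainerRegistryApi

    token = os.environ["TOKEN"]

    with GithubContainerRegistryApi(token, "paperless-ngx") as api:
        versions = api.get_active_package_versions("paperless-ngx/builder/qpdf")
        untagged = [v for v in versions if v.untagged]
        print(f"{len(untagged)} of {len(versions)} versions are untagged")
        # api.delete_package_version(untagged[0])  # destructive; gated behind --delete in the real script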
.github/stale.yml (vendored, 12 lines changed)
@@ -1,15 +1,23 @@
 # Number of days of inactivity before an issue becomes stale
 daysUntilStale: 30

 # Number of days of inactivity before a stale issue is closed
 daysUntilClose: 7
-onlyLabels:
-  - unconfirmed
+# Only issues or pull requests with all of these labels are checked if stale. Defaults to `[]` (disabled)
+onlyLabels: [cant-reproduce]

 # Label to use when marking an issue as stale
 staleLabel: stale

 # Comment to post when marking an issue as stale. Set to `false` to disable
 markComment: >
   This issue has been automatically marked as stale because it has not had
   recent activity. It will be closed if no further activity occurs. Thank you
   for your contributions.

 # Comment to post when closing a stale issue. Set to `false` to disable
 closeComment: false
+
+# See https://github.com/marketplace/stale for more info on the app
+# and https://github.com/probot/stale for the configuration docs
.github/workflows/ci.yml (vendored, 455 lines changed)
@@ -13,63 +13,230 @@ on:
   branches-ignore:
     - 'translations**'

+env:
+  # This is the version of pipenv all the steps will use
+  # If changing this, change Dockerfile
+  DEFAULT_PIP_ENV_VERSION: "2022.11.30"
+  # This is the default version of Python to use in most steps
+  # If changing this, change Dockerfile
+  DEFAULT_PYTHON_VERSION: "3.9"
+
 jobs:
+  pre-commit:
+    name: Linting Checks
+    runs-on: ubuntu-22.04
+    steps:
+      -
+        name: Checkout repository
+        uses: actions/checkout@v3
+      -
+        name: Install python
+        uses: actions/setup-python@v4
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
+      -
+        name: Check files
+        uses: pre-commit/action@v3.0.0
+
   documentation:
     name: "Build Documentation"
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
+    needs:
+      - pre-commit
     steps:
       -
         name: Checkout
         uses: actions/checkout@v3
-      -
-        name: Install pipenv
-        run: pipx install pipenv
       -
         name: Set up Python
-        uses: actions/setup-python@v3
+        id: setup-python
+        uses: actions/setup-python@v4
         with:
-          python-version: 3.9
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
           cache: "pipenv"
           cache-dependency-path: 'Pipfile.lock'
+      -
+        name: Install pipenv
+        run: |
+          pip install --user pipenv==${DEFAULT_PIP_ENV_VERSION}
       -
         name: Install dependencies
         run: |
-          pipenv sync --dev
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
+      -
+        name: List installed Python dependencies
+        run: |
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run pip list
       -
         name: Make documentation
         run: |
-          cd docs/
-          pipenv run make html
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run mkdocs build --config-file ./mkdocs.yml
       -
         name: Upload artifact
         uses: actions/upload-artifact@v3
         with:
           name: documentation
-          path: docs/_build/html/
+          path: site/

-  ci-backend:
-    uses: ./.github/workflows/reusable-ci-backend.yml
-
-  ci-frontend:
-    uses: ./.github/workflows/reusable-ci-frontend.yml
-
-  prepare-docker-build:
-    name: Prepare Docker Pipeline Data
-    if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
-    runs-on: ubuntu-20.04
+  documentation-deploy:
+    name: "Deploy Documentation"
+    runs-on: ubuntu-22.04
+    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
     needs:
       - documentation
-      - ci-backend
-      - ci-frontend
     steps:
       -
         name: Checkout
         uses: actions/checkout@v3
       -
-        name: Set up Python
-        uses: actions/setup-python@v3
+        name: Deploy docs
+        uses: mhausenblas/mkdocs-deploy-gh-pages@master
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          CUSTOM_DOMAIN: docs.paperless-ngx.com
+          CONFIG_FILE: mkdocs.yml
+          EXTRA_PACKAGES: build-base
+
+  tests-backend:
+    name: "Tests (${{ matrix.python-version }})"
+    runs-on: ubuntu-22.04
+    needs:
+      - pre-commit
+    strategy:
+      matrix:
+        python-version: ['3.8', '3.9', '3.10']
+      fail-fast: false
+    env:
+      # Enable Tika end to end testing
+      TIKA_LIVE: 1
+      # Enable paperless_mail testing against real server
+      PAPERLESS_MAIL_TEST_HOST: ${{ secrets.TEST_MAIL_HOST }}
+      PAPERLESS_MAIL_TEST_USER: ${{ secrets.TEST_MAIL_USER }}
+      PAPERLESS_MAIL_TEST_PASSWD: ${{ secrets.TEST_MAIL_PASSWD }}
+      # Skip Tests which require convert
+      PAPERLESS_TEST_SKIP_CONVERT: 1
+      # Enable Gotenberg end to end testing
+      GOTENBERG_LIVE: 1
+    steps:
+      -
+        name: Checkout
+        uses: actions/checkout@v3
         with:
-          python-version: "3.9"
+          fetch-depth: 0
+      -
+        name: Start containers
+        run: |
+          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml pull --quiet
+          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml up --detach
+      -
+        name: Set up Python
+        id: setup-python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "${{ matrix.python-version }}"
+          cache: "pipenv"
+          cache-dependency-path: 'Pipfile.lock'
+      -
+        name: Install pipenv
+        run: |
+          pip install --user pipenv==${DEFAULT_PIP_ENV_VERSION}
+      -
+        name: Install system dependencies
+        run: |
+          sudo apt-get update -qq
+          sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript libzbar0 poppler-utils
+      -
+        name: Install Python dependencies
+        run: |
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run python --version
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
+      -
+        name: List installed Python dependencies
+        run: |
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run pip list
+      -
+        name: Tests
+        run: |
+          cd src/
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run pytest -ra
+      -
+        name: Get changed files
+        id: changed-files-specific
+        uses: tj-actions/changed-files@v35
+        with:
+          files: |
+            src/**
+      -
+        name: List all changed files
+        run: |
+          for file in ${{ steps.changed-files-specific.outputs.all_changed_files }}; do
+            echo "${file} was changed"
+          done
+      -
+        name: Publish coverage results
+        if: matrix.python-version == ${{ env.DEFAULT_PYTHON_VERSION }} && steps.changed-files-specific.outputs.any_changed == 'true'
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        # https://github.com/coveralls-clients/coveralls-python/issues/251
+        run: |
+          cd src/
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run coveralls --service=github
+      -
+        name: Stop containers
+        if: always()
+        run: |
+          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml logs
+          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml down
+
+  tests-frontend:
+    name: "Tests Frontend"
+    runs-on: ubuntu-22.04
+    needs:
+      - pre-commit
+    strategy:
+      matrix:
+        node-version: [16.x]
+    steps:
+      - uses: actions/checkout@v3
+      -
+        name: Use Node.js ${{ matrix.node-version }}
+        uses: actions/setup-node@v3
+        with:
+          node-version: ${{ matrix.node-version }}
+      - run: cd src-ui && npm ci
+      - run: cd src-ui && npm run lint
+      - run: cd src-ui && npm run test
+      - run: cd src-ui && npm run e2e:ci
+
+  prepare-docker-build:
+    name: Prepare Docker Pipeline Data
+    if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
+    runs-on: ubuntu-22.04
+    # If the push triggered the installer library workflow, wait for it to
+    # complete here. This ensures the required versions for the final
+    # image have been built, while not waiting at all if the versions haven't changed
+    concurrency:
+      group: build-installer-library
+      cancel-in-progress: false
+    needs:
+      - documentation
+      - tests-backend
+      - tests-frontend
+    steps:
+      -
+        name: Set ghcr repository name
+        id: set-ghcr-repository
+        run: |
+          ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
+          echo "repository=${ghcr_name}" >> $GITHUB_OUTPUT
+      -
+        name: Checkout
+        uses: actions/checkout@v3
+      -
+        name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
       -
         name: Setup qpdf image
         id: qpdf-setup
@@ -78,7 +245,7 @@ jobs:

           echo ${build_json}

-          echo ::set-output name=qpdf-json::${build_json}
+          echo "qpdf-json=${build_json}" >> $GITHUB_OUTPUT
       -
         name: Setup psycopg2 image
         id: psycopg2-setup
@@ -87,7 +254,7 @@ jobs:

           echo ${build_json}

-          echo ::set-output name=psycopg2-json::${build_json}
+          echo "psycopg2-json=${build_json}" >> $GITHUB_OUTPUT
       -
         name: Setup pikepdf image
         id: pikepdf-setup
@@ -96,7 +263,7 @@ jobs:

           echo ${build_json}

-          echo ::set-output name=pikepdf-json::${build_json}
+          echo "pikepdf-json=${build_json}" >> $GITHUB_OUTPUT
       -
         name: Setup jbig2enc image
         id: jbig2enc-setup
@@ -105,10 +272,12 @@ jobs:

           echo ${build_json}

-          echo ::set-output name=jbig2enc-json::${build_json}
+          echo "jbig2enc-json=${build_json}" >> $GITHUB_OUTPUT

     outputs:

+      ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}

       qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}

       pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
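The recurring substitution in the hunks above (and throughout the rest of this workflow) replaces the deprecated `::set-output` command with an append to the file named by `GITHUB_OUTPUT`. The same mechanism works from a Python step; a minimal sketch with an invented key and value:

    import os

    # Modern equivalent of `echo ::set-output name=qpdf-json::{...}`:
    # append `key=value` to the file GitHub Actions names via GITHUB_OUTPUT
    with open(os.environ["GITHUB_OUTPUT"], "a") as f:
        f.write('qpdf-json={"version": "11.1.1"}\n')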
@@ -117,86 +286,39 @@ jobs:

       jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}

-  build-qpdf-debs:
-    name: qpdf
-    needs:
-      - prepare-docker-build
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.qpdf
-      build-json: ${{ needs.prepare-docker-build.outputs.qpdf-json }}
-      build-args: |
-        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
-
-  build-jbig2enc:
-    name: jbig2enc
-    needs:
-      - prepare-docker-build
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.jbig2enc
-      build-json: ${{ needs.prepare-docker-build.outputs.jbig2enc-json }}
-      build-args: |
-        JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
-
-  build-psycopg2-wheel:
-    name: psycopg2
-    needs:
-      - prepare-docker-build
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.psycopg2
-      build-json: ${{ needs.prepare-docker-build.outputs.psycopg2-json }}
-      build-args: |
-        PSYCOPG2_GIT_TAG=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).git_tag }}
-        PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
-
-  build-pikepdf-wheel:
-    name: pikepdf
-    needs:
-      - prepare-docker-build
-      - build-qpdf-debs
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.pikepdf
-      build-json: ${{ needs.prepare-docker-build.outputs.pikepdf-json }}
-      build-args: |
-        REPO=${{ github.repository }}
-        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
-        PIKEPDF_GIT_TAG=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).git_tag }}
-        PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
-
   # build and push image to docker hub.
   build-docker-image:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
     concurrency:
       group: ${{ github.workflow }}-build-docker-image-${{ github.ref_name }}
       cancel-in-progress: true
     needs:
       - prepare-docker-build
-      - build-psycopg2-wheel
-      - build-jbig2enc
-      - build-qpdf-debs
-      - build-pikepdf-wheel
     steps:
       -
         name: Check pushing to Docker Hub
         id: docker-hub
-        # Only push to Dockerhub from the main repo
+        # Only push to Dockerhub from the main repo AND the ref is either:
+        #  main
+        #  dev
+        #  beta
+        #  a tag
         # Otherwise forks would require a Docker Hub account and secrets setup
         run: |
-          if [[ ${{ github.repository }} == "paperless-ngx/paperless-ngx" ]] ; then
-            echo ::set-output name=enable::"true"
+          if [[ ${{ needs.prepare-docker-build.outputs.ghcr-repository }} == "paperless-ngx/paperless-ngx" && ( ${{ github.ref_name }} == "main" || ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
+            echo "Enabling DockerHub image push"
+            echo "enable=true" >> $GITHUB_OUTPUT
           else
-            echo ::set-output name=enable::"false"
+            echo "Not pushing to DockerHub"
+            echo "enable=false" >> $GITHUB_OUTPUT
           fi
       -
         name: Gather Docker metadata
         id: docker-meta
-        uses: docker/metadata-action@v3
+        uses: docker/metadata-action@v4
         with:
           images: |
-            ghcr.io/${{ github.repository }}
+            ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}
             name=paperlessngx/paperless-ngx,enable=${{ steps.docker-hub.outputs.enable }}
           tags: |
             # Tag branches with branch name
@@ -210,20 +332,20 @@ jobs:
         uses: actions/checkout@v3
       -
         name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v1
+        uses: docker/setup-buildx-action@v2
       -
         name: Set up QEMU
-        uses: docker/setup-qemu-action@v1
+        uses: docker/setup-qemu-action@v2
       -
         name: Login to Github Container Registry
-        uses: docker/login-action@v1
+        uses: docker/login-action@v2
         with:
           registry: ghcr.io
           username: ${{ github.actor }}
           password: ${{ secrets.GITHUB_TOKEN }}
       -
         name: Login to Docker Hub
-        uses: docker/login-action@v1
+        uses: docker/login-action@v2
         # Don't attempt to login if not pushing to Docker Hub
         if: steps.docker-hub.outputs.enable == 'true'
         with:
@@ -231,7 +353,7 @@ jobs:
           password: ${{ secrets.DOCKERHUB_TOKEN }}
       -
         name: Build and push
-        uses: docker/build-push-action@v2
+        uses: docker/build-push-action@v3
         with:
           context: .
           file: ./Dockerfile
@@ -247,11 +369,11 @@ jobs:
           # Get cache layers from this branch, then dev, then main
           # This allows new branches to get at least some cache benefits, generally from dev
           cache-from: |
-            type=registry,ref=ghcr.io/${{ github.repository }}/builder/cache/app:${{ github.ref_name }}
-            type=registry,ref=ghcr.io/${{ github.repository }}/builder/cache/app:dev
-            type=registry,ref=ghcr.io/${{ github.repository }}/builder/cache/app:main
+            type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
+            type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:dev
+            type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:main
           cache-to: |
-            type=registry,mode=max,ref=ghcr.io/${{ github.repository }}/builder/cache/app:${{ github.ref_name }}
+            type=registry,mode=max,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
       -
         name: Inspect image
         run: |
@@ -271,23 +393,32 @@ jobs:
   build-release:
     needs:
       - build-docker-image
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
     steps:
      -
         name: Checkout
         uses: actions/checkout@v3
       -
         name: Set up Python
-        uses: actions/setup-python@v3
+        id: setup-python
+        uses: actions/setup-python@v4
         with:
-          python-version: 3.9
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
+          cache: "pipenv"
+          cache-dependency-path: 'Pipfile.lock'
       -
-        name: Install dependencies
+        name: Install pipenv + tools
+        run: |
+          pip install --upgrade --user pipenv==${DEFAULT_PIP_ENV_VERSION} setuptools wheel
+      -
+        name: Install Python dependencies
+        run: |
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
+      -
+        name: Install system dependencies
         run: |
           sudo apt-get update -qq
           sudo apt-get install -qq --no-install-recommends gettext liblept5
-          pip3 install --upgrade pip setuptools wheel
-          pip3 install -r requirements.txt
       -
         name: Download frontend artifact
         uses: actions/download-artifact@v3
@@ -300,34 +431,38 @@ jobs:
         with:
           name: documentation
           path: docs/_build/html/
+      -
+        name: Generate requirements file
+        run: |
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} requirements > requirements.txt
+      -
+        name: Compile messages
+        run: |
+          cd src/
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run python3 manage.py compilemessages
+      -
+        name: Collect static files
+        run: |
+          cd src/
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run python3 manage.py collectstatic --no-input
       -
         name: Move files
         run: |
           mkdir dist
           mkdir dist/paperless-ngx
           mkdir dist/paperless-ngx/scripts
-          cp .dockerignore .env Dockerfile Pipfile Pipfile.lock LICENSE README.md requirements.txt dist/paperless-ngx/
+          cp .dockerignore .env Dockerfile Pipfile Pipfile.lock requirements.txt LICENSE README.md dist/paperless-ngx/
           cp paperless.conf.example dist/paperless-ngx/paperless.conf
           cp gunicorn.conf.py dist/paperless-ngx/gunicorn.conf.py
-          cp docker/ dist/paperless-ngx/docker -r
-          cp scripts/*.service scripts/*.sh dist/paperless-ngx/scripts/
-          cp src/ dist/paperless-ngx/src -r
-          cp docs/_build/html/ dist/paperless-ngx/docs -r
+          cp -r docker/ dist/paperless-ngx/docker
+          cp scripts/*.service scripts/*.sh scripts/*.socket dist/paperless-ngx/scripts/
+          cp -r src/ dist/paperless-ngx/src
+          cp -r docs/_build/html/ dist/paperless-ngx/docs
+          mv static dist/paperless-ngx
-      -
-        name: Compile messages
-        run: |
-          cd dist/paperless-ngx/src
-          python3 manage.py compilemessages
-      -
-        name: Collect static files
-        run: |
-          cd dist/paperless-ngx/src
-          python3 manage.py collectstatic --no-input
       -
         name: Make release package
         run: |
           cd dist
-          find . -name __pycache__ | xargs rm -r
           tar -cJf paperless-ngx.tar.xz paperless-ngx/
       -
         name: Upload release artifact
@@ -337,7 +472,11 @@ jobs:
           path: dist/paperless-ngx.tar.xz

   publish-release:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
+    outputs:
+      prerelease: ${{ steps.get_version.outputs.prerelease }}
+      changelog: ${{ steps.create-release.outputs.body }}
+      version: ${{ steps.get_version.outputs.version }}
     needs:
       - build-release
     if: github.ref_type == 'tag' && (startsWith(github.ref_name, 'v') || contains(github.ref_name, '-beta.rc'))
@@ -352,16 +491,16 @@ jobs:
         name: Get version
         id: get_version
         run: |
-          echo ::set-output name=version::${{ github.ref_name }}
+          echo "version=${{ github.ref_name }}" >> $GITHUB_OUTPUT
           if [[ ${{ contains(github.ref_name, '-beta.rc') }} == 'true' ]]; then
-            echo ::set-output name=prerelease::true
+            echo "prerelease=true" >> $GITHUB_OUTPUT
           else
-            echo ::set-output name=prerelease::false
+            echo "prerelease=false" >> $GITHUB_OUTPUT
           fi
       -
         name: Create Release and Changelog
         id: create-release
-        uses: release-drafter/release-drafter@v5
+        uses: paperless-ngx/release-drafter@master
         with:
           name: Paperless-ngx ${{ steps.get_version.outputs.version }}
           tag: ${{ steps.get_version.outputs.version }}
@@ -373,11 +512,71 @@ jobs:
       -
         name: Upload release archive
         id: upload-release-asset
-        uses: actions/upload-release-asset@v1
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        uses: shogo82148/actions-upload-release-asset@v1
         with:
+          github_token: ${{ secrets.GITHUB_TOKEN }}
           upload_url: ${{ steps.create-release.outputs.upload_url }}
           asset_path: ./paperless-ngx.tar.xz
           asset_name: paperless-ngx-${{ steps.get_version.outputs.version }}.tar.xz
           asset_content_type: application/x-xz
+
+  append-changelog:
+    runs-on: ubuntu-22.04
+    needs:
+      - publish-release
+    if: needs.publish-release.outputs.prerelease == 'false'
+    steps:
+      -
+        name: Checkout
+        uses: actions/checkout@v3
+        with:
+          ref: main
+      -
+        name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
+          cache: "pipenv"
+          cache-dependency-path: 'Pipfile.lock'
+      -
+        name: Install pipenv + tools
+        run: |
+          pip install --upgrade --user pipenv==${DEFAULT_PIP_ENV_VERSION} setuptools wheel
+      -
+        name: Append Changelog to docs
+        id: append-Changelog
+        working-directory: docs
+        run: |
+          git branch ${{ needs.publish-release.outputs.version }}-changelog
+          git checkout ${{ needs.publish-release.outputs.version }}-changelog
+          echo -e "# Changelog\n\n${{ needs.publish-release.outputs.changelog }}\n" > changelog-new.md
+          echo "Manually linking usernames"
+          sed -i -r 's|@(.+?) \(\[#|[@\1](https://github.com/\1) ([#|ig' changelog-new.md
+          CURRENT_CHANGELOG=`tail --lines +2 changelog.md`
+          echo -e "$CURRENT_CHANGELOG" >> changelog-new.md
+          mv changelog-new.md changelog.md
+          pipenv run pre-commit run --files changelog.md || true
+          git config --global user.name "github-actions"
+          git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
+          git commit -am "Changelog ${{ needs.publish-release.outputs.version }} - GHA"
+          git push origin ${{ needs.publish-release.outputs.version }}-changelog
+      -
+        name: Create Pull Request
+        uses: actions/github-script@v6
+        with:
+          script: |
+            const { repo, owner } = context.repo;
+            const result = await github.rest.pulls.create({
+              title: '[Documentation] Add ${{ needs.publish-release.outputs.version }} changelog',
+              owner,
+              repo,
+              head: '${{ needs.publish-release.outputs.version }}-changelog',
+              base: 'main',
+              body: 'This PR is auto-generated by CI.'
+            });
+            github.rest.issues.addLabels({
+              owner,
+              repo,
              issue_number: result.data.number,
              labels: ['documentation']
            });
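The `sed` call in the new `append-changelog` job rewrites release-drafter's `@username ([#...` credit lines into Markdown profile links before the changelog is committed. A rough Python equivalent of that substitution, applied to an invented changelog line (the username and PR number are made up):

    import re

    # Invented example of a release-drafter changelog line
    line = "- Fix tag filtering @someuser ([#1234](https://github.com/paperless-ngx/paperless-ngx/pull/1234))"

    linked = re.sub(
        r"@(.+?) \(\[#",
        r"[@\1](https://github.com/\1) ([#",
        line,
        flags=re.IGNORECASE,
    )
    print(linked)
    # - Fix tag filtering [@someuser](https://github.com/someuser) ([#1234](https://github.com/paperless-ngx/paperless-ngx/pull/1234))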
.github/workflows/cleanup-tags.yml (vendored, new file, 93 lines)
# This workflow runs on certain conditions to check for and potentially
# delete container images from the GHCR which no longer have an associated
# code branch.
# Requires a PAT with the correct scope set in the secrets.
#
# This workflow will not trigger runs on forked repos.

name: Cleanup Image Tags

on:
  delete:
  push:
    paths:
      - ".github/workflows/cleanup-tags.yml"
      - ".github/scripts/cleanup-tags.py"
      - ".github/scripts/github.py"
      - ".github/scripts/common.py"

concurrency:
  group: registry-tags-cleanup
  cancel-in-progress: false

jobs:
  cleanup-images:
    name: Cleanup Image Tags for ${{ matrix.primary-name }}
    if: github.repository_owner == 'paperless-ngx'
    runs-on: ubuntu-22.04
    strategy:
      matrix:
        include:
          - primary-name: "paperless-ngx"
            cache-name: "paperless-ngx/builder/cache/app"

          - primary-name: "paperless-ngx/builder/qpdf"
            cache-name: "paperless-ngx/builder/cache/qpdf"

          - primary-name: "paperless-ngx/builder/pikepdf"
            cache-name: "paperless-ngx/builder/cache/pikepdf"

          - primary-name: "paperless-ngx/builder/jbig2enc"
            cache-name: "paperless-ngx/builder/cache/jbig2enc"

          - primary-name: "paperless-ngx/builder/psycopg2"
            cache-name: "paperless-ngx/builder/cache/psycopg2"
    env:
      # Requires a personal access token with the OAuth scope delete:packages
      TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
      -
        name: Login to Github Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      -
        name: Install httpx
        run: |
          python -m pip install httpx
      #
      # Clean up primary package
      #
      -
        name: Cleanup for package "${{ matrix.primary-name }}"
        if: "${{ env.TOKEN != '' }}"
        run: |
          python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --untagged --is-manifest --delete "${{ matrix.primary-name }}"
      #
      # Clean up registry cache package
      #
      -
        name: Cleanup for package "${{ matrix.cache-name }}"
        if: "${{ env.TOKEN != '' }}"
        run: |
          python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --untagged --delete "${{ matrix.cache-name }}"
      #
      # Verify tags which are left still pull
      #
      -
        name: Check all tags still pull
        run: |
          ghcr_name=$(echo "ghcr.io/${GITHUB_REPOSITORY_OWNER}/${{ matrix.primary-name }}" | awk '{ print tolower($0) }')
          echo "Pulling all tags of ${ghcr_name}"
          docker pull --quiet --all-tags ${ghcr_name}
          docker image list
.github/workflows/codeql-analysis.yml (vendored, 4 lines changed)
@@ -23,7 +23,7 @@ on:
 jobs:
   analyze:
     name: Analyze
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     permissions:
       actions: read
       contents: read
@@ -38,7 +38,7 @@ jobs:

     steps:
     - name: Checkout repository
-      uses: actions/checkout@v2
+      uses: actions/checkout@v3

     # Initializes the CodeQL tools for scanning.
     - name: Initialize CodeQL
.github/workflows/installer-library.yml (vendored, new file, 171 lines)
|
|||||||
|
# This workflow will run to update the installer library of
|
||||||
|
# Docker images. These are the images which provide updated wheels
|
||||||
|
# .deb installation packages or maybe just some compiled library
|
||||||
|
|
||||||
|
name: Build Image Library
|
||||||
|
|
||||||
|
on:
|
||||||
|
push:
|
||||||
|
# Must match one of these branches AND one of the paths
|
||||||
|
# to be triggered
|
||||||
|
branches:
|
||||||
|
- "main"
|
||||||
|
- "dev"
|
||||||
|
- "library-*"
|
||||||
|
- "feature-*"
|
||||||
|
paths:
|
||||||
|
# Trigger the workflow if a Dockerfile changed
|
||||||
|
- "docker-builders/**"
|
||||||
|
# Trigger if a package was updated
|
||||||
|
- ".build-config.json"
|
||||||
|
- "Pipfile.lock"
|
||||||
|
# Also trigger on workflow changes related to the library
|
||||||
|
- ".github/workflows/installer-library.yml"
|
||||||
|
- ".github/workflows/reusable-workflow-builder.yml"
|
||||||
|
- ".github/scripts/**"
|
||||||
|
|
||||||
|
# Set a workflow level concurrency group so primary workflow
|
||||||
|
# can wait for this to complete if needed
|
||||||
|
# DO NOT CHANGE without updating main workflow group
|
||||||
|
concurrency:
|
||||||
|
group: build-installer-library
|
||||||
|
cancel-in-progress: false
|
||||||
|
|
||||||
|
jobs:
|
||||||
|
prepare-docker-build:
|
||||||
|
name: Prepare Docker Image Version Data
|
||||||
|
runs-on: ubuntu-22.04
|
||||||
|
steps:
|
      -
        name: Set ghcr repository name
        id: set-ghcr-repository
        run: |
          ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
          echo "repository=${ghcr_name}" >> $GITHUB_OUTPUT
      -
        name: Checkout
        uses: actions/checkout@v3
      -
        name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.9"
      -
        name: Install jq
        run: |
          sudo apt-get update
          sudo apt-get install jq
      -
        name: Setup qpdf image
        id: qpdf-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)

          echo ${build_json}

          echo "qpdf-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup psycopg2 image
        id: psycopg2-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)

          echo ${build_json}

          echo "psycopg2-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup pikepdf image
        id: pikepdf-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)

          echo ${build_json}

          echo "pikepdf-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup jbig2enc image
        id: jbig2enc-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)

          echo ${build_json}

          echo "jbig2enc-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup other versions
        id: cache-bust-setup
        run: |
          pillow_version=$(jq -r '.default.pillow.version | gsub("=";"")' Pipfile.lock)
          lxml_version=$(jq -r '.default.lxml.version | gsub("=";"")' Pipfile.lock)

          echo "Pillow is ${pillow_version}"
          echo "lxml is ${lxml_version}"

          echo "pillow-version=${pillow_version}" >> $GITHUB_OUTPUT
          echo "lxml-version=${lxml_version}" >> $GITHUB_OUTPUT
    outputs:
      ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
      qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
      pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
      psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
      jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json }}
      pillow-version: ${{ steps.cache-bust-setup.outputs.pillow-version }}
      lxml-version: ${{ steps.cache-bust-setup.outputs.lxml-version }}

  build-qpdf-debs:
    name: qpdf
    needs:
      - prepare-docker-build
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.qpdf
      build-platforms: linux/amd64
      build-json: ${{ needs.prepare-docker-build.outputs.qpdf-json }}
      build-args: |
        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}

  build-jbig2enc:
    name: jbig2enc
    needs:
      - prepare-docker-build
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.jbig2enc
      build-json: ${{ needs.prepare-docker-build.outputs.jbig2enc-json }}
      build-args: |
        JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}

  build-psycopg2-wheel:
    name: psycopg2
    needs:
      - prepare-docker-build
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.psycopg2
      build-json: ${{ needs.prepare-docker-build.outputs.psycopg2-json }}
      build-args: |
        PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}

  build-pikepdf-wheel:
    name: pikepdf
    needs:
      - prepare-docker-build
      - build-qpdf-debs
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.pikepdf
      build-json: ${{ needs.prepare-docker-build.outputs.pikepdf-json }}
      build-args: |
        REPO=${{ needs.prepare-docker-build.outputs.ghcr-repository }}
        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
        PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
        PILLOW_VERSION=${{ needs.prepare-docker-build.outputs.pillow-version }}
        LXML_VERSION=${{ needs.prepare-docker-build.outputs.lxml-version }}
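For reference, the jq filter in the cache-bust step above strips the `==` pin that Pipfile.lock stores in front of each version. A minimal sketch against a made-up lock fragment (the package and version here are illustrative):

```bash
# Pipfile.lock stores pins as "==9.3.0"; gsub("=";"") deletes every "=",
# leaving the bare version number for use as a cache-busting build arg.
echo '{"default": {"pillow": {"version": "==9.3.0"}}}' \
    | jq -r '.default.pillow.version | gsub("=";"")'
# prints: 9.3.0
```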
.github/workflows/project-actions.yml (vendored) — 26 changed lines

@@ -13,6 +13,9 @@ on:
       - main
       - dev
 
+permissions:
+  contents: read
+
 env:
   todo: Todo
   done: Done
@@ -21,11 +24,11 @@ env:
 jobs:
   issue_opened_or_reopened:
     name: issue_opened_or_reopened
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     if: github.event_name == 'issues' && (github.event.action == 'opened' || github.event.action == 'reopened')
     steps:
-      - name: Set issue status to ${{ env.todo }}
-        uses: leonsteinhaeuser/project-beta-automations@v1.2.1
+      - name: Add issue to project and set status to ${{ env.todo }}
+        uses: leonsteinhaeuser/project-beta-automations@v2.0.1
         with:
           gh_token: ${{ secrets.GH_TOKEN }}
           organization: paperless-ngx
@@ -34,14 +37,21 @@ jobs:
           status_value: ${{ env.todo }} # Target status
   pr_opened_or_reopened:
     name: pr_opened_or_reopened
-    runs-on: ubuntu-latest
-    if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened')
+    runs-on: ubuntu-22.04
+    permissions:
+      # write permission is required for autolabeler
+      pull-requests: write
+    if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request.user.login != 'dependabot'
     steps:
-      - name: Set PR status to ${{ env.in_progress }}
-        uses: leonsteinhaeuser/project-beta-automations@v1.2.1
+      - name: Add PR to project and set status to "Needs Review"
+        uses: leonsteinhaeuser/project-beta-automations@v2.0.1
         with:
           gh_token: ${{ secrets.GH_TOKEN }}
           organization: paperless-ngx
           project_id: 2
           resource_node_id: ${{ github.event.pull_request.node_id }}
-          status_value: ${{ env.in_progress }} # Target status
+          status_value: "Needs Review" # Target status
+      - name: Label PR with release-drafter
+        uses: release-drafter/release-drafter@v5
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
.github/workflows/reusable-ci-backend.yml (vendored) — deleted (108 lines); former contents:

name: Backend CI Jobs

on:
  workflow_call:

jobs:

  code-checks-backend:
    name: "Code Style Checks"
    runs-on: ubuntu-20.04
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
      -
        name: Install checkers
        run: |
          pipx install reorder-python-imports
          pipx install yesqa
          pipx install add-trailing-comma
          pipx install flake8
      -
        name: Run reorder-python-imports
        run: |
          find src/ -type f -name '*.py' ! -path "*/migrations/*" | xargs reorder-python-imports
      -
        name: Run yesqa
        run: |
          find src/ -type f -name '*.py' ! -path "*/migrations/*" | xargs yesqa
      -
        name: Run add-trailing-comma
        run: |
          find src/ -type f -name '*.py' ! -path "*/migrations/*" | xargs add-trailing-comma
      # black is placed after add-trailing-comma because it may format differently
      # if a trailing comma is added
      -
        name: Run black
        uses: psf/black@stable
        with:
          options: "--check --diff"
          version: "22.3.0"
      -
        name: Run flake8 checks
        run: |
          cd src/
          flake8 --max-line-length=88 --ignore=E203,W503

  tests-backend:
    name: "Tests (${{ matrix.python-version }})"
    runs-on: ubuntu-20.04
    needs:
      - code-checks-backend
    strategy:
      matrix:
        python-version: ['3.8', '3.9', '3.10']
      fail-fast: false
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 2
      -
        name: Install pipenv
        run: pipx install pipenv
      -
        name: Set up Python
        uses: actions/setup-python@v3
        with:
          python-version: "${{ matrix.python-version }}"
          cache: "pipenv"
          cache-dependency-path: 'Pipfile.lock'
      -
        name: Install system dependencies
        run: |
          sudo apt-get update -qq
          sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript optipng libzbar0 poppler-utils
      -
        name: Install Python dependencies
        run: |
          pipenv sync --dev
      -
        name: Tests
        run: |
          cd src/
          pipenv run pytest
      -
        name: Get changed files
        id: changed-files-specific
        uses: tj-actions/changed-files@v19
        with:
          files: |
            src/**
      -
        name: List all changed files
        run: |
          for file in ${{ steps.changed-files-specific.outputs.all_changed_files }}; do
            echo "${file} was changed"
          done
      -
        name: Publish coverage results
        if: matrix.python-version == '3.9' && steps.changed-files-specific.outputs.any_changed == 'true'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        # https://github.com/coveralls-clients/coveralls-python/issues/251
        run: |
          cd src/
          pipenv run coveralls --service=github
.github/workflows/reusable-ci-frontend.yml (vendored) — deleted (42 lines); former contents:

name: Frontend CI Jobs

on:
  workflow_call:

jobs:

  code-checks-frontend:
    name: "Code Style Checks"
    runs-on: ubuntu-20.04
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: '16'
      -
        name: Install prettier
        run: |
          npm install prettier
      -
        name: Run prettier
        run:
          npx prettier --check --ignore-path Pipfile.lock **/*.js **/*.ts *.md **/*.md

  tests-frontend:
    name: "Tests"
    runs-on: ubuntu-20.04
    needs:
      - code-checks-frontend
    strategy:
      matrix:
        node-version: [16.x]
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
      - run: cd src-ui && npm ci
      - run: cd src-ui && npm run test
      - run: cd src-ui && npm run e2e:ci
.github/workflows/reusable-workflow-builder.yml (vendored) — 16 changed lines

@@ -13,6 +13,10 @@ on:
       required: false
       default: ""
       type: string
+    build-platforms:
+      required: false
+      default: linux/amd64,linux/arm64,linux/arm/v7
+      type: string
 
 concurrency:
   group: ${{ github.workflow }}-${{ fromJSON(inputs.build-json).name }}-${{ fromJSON(inputs.build-json).version }}
@@ -21,32 +25,32 @@ concurrency:
 jobs:
   build-image:
     name: Build ${{ fromJSON(inputs.build-json).name }} @ ${{ fromJSON(inputs.build-json).version }}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:
       -
         name: Checkout
         uses: actions/checkout@v3
       -
         name: Login to Github Container Registry
-        uses: docker/login-action@v1
+        uses: docker/login-action@v2
         with:
           registry: ghcr.io
           username: ${{ github.actor }}
           password: ${{ secrets.GITHUB_TOKEN }}
       -
         name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v1
+        uses: docker/setup-buildx-action@v2
       -
         name: Set up QEMU
-        uses: docker/setup-qemu-action@v1
+        uses: docker/setup-qemu-action@v2
       -
         name: Build ${{ fromJSON(inputs.build-json).name }}
-        uses: docker/build-push-action@v2
+        uses: docker/build-push-action@v3
        with:
           context: .
           file: ${{ inputs.dockerfile }}
           tags: ${{ fromJSON(inputs.build-json).image_tag }}
-          platforms: linux/amd64,linux/arm64,linux/arm/v7
+          platforms: ${{ inputs.build-platforms }}
           build-args: ${{ inputs.build-args }}
           push: true
           cache-from: type=registry,ref=${{ fromJSON(inputs.build-json).cache_tag }}
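For anyone reproducing this job outside Actions, a rough command-line equivalent of the docker/build-push-action step might look like the sketch below; TAG and CACHE_REF are placeholders standing in for the workflow's image_tag and cache_tag values, which come from the build JSON:

```bash
# Approximate local equivalent of the build-image job (all values are placeholders)
TAG="ghcr.io/example/builder:local"
CACHE_REF="ghcr.io/example/builder/cache:dev"

docker buildx build \
    --file ./docker-builders/Dockerfile.qpdf \
    --platform linux/amd64,linux/arm64,linux/arm/v7 \
    --cache-from type=registry,ref="${CACHE_REF}" \
    --tag "${TAG}" \
    --push .
```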
.gitignore (vendored) — 15 changed lines

@@ -51,8 +51,8 @@ coverage.xml
 # Django stuff:
 *.log
 
-# Sphinx documentation
-docs/_build/
+# MkDocs documentation
+site/
 
 # PyBuilder
 target/
@@ -63,11 +63,14 @@ target/
 
 # VS Code
 .vscode
+/src-ui/.vscode
+/docs/.vscode
 
 # Other stuff that doesn't belong
 .virtualenv
 virtualenv
 /venv
+.venv/
 /docker-compose.env
 /docker-compose.yml
@@ -84,8 +87,12 @@ scripts/nuke
 /paperless.conf
 /consume/
 /export/
-/src-ui/.vscode
 
 # this is where the compiled frontend is moved to.
 /src/documents/static/frontend/
-/docs/.vscode/settings.json
+
+# mac os
+.DS_Store
+
+# celery schedule file
+celerybeat-schedule*
.hadolint.yml (new file) — 8 lines

failure-threshold: warning
ignored:
  # https://github.com/hadolint/hadolint/wiki/DL3008
  - DL3008
  # https://github.com/hadolint/hadolint/wiki/DL3013
  - DL3013
  # https://github.com/hadolint/hadolint/wiki/DL3003
  - DL3003
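With this file at the repository root, the hadolint CLI should pick the configuration up automatically; the explicit flags below are roughly equivalent:

```bash
# Reads .hadolint.yml from the working directory
hadolint Dockerfile

# Roughly the same run spelled out on the command line
hadolint --failure-threshold warning \
    --ignore DL3008 --ignore DL3013 --ignore DL3003 \
    Dockerfile
```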
.pre-commit-config.yaml — hook revision updates

@@ -5,7 +5,7 @@
 repos:
   # General hooks
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.2.0
+    rev: v4.4.0
     hooks:
       - id: check-docstring-first
       - id: check-json
@@ -27,7 +27,7 @@ repos:
       - id: check-case-conflict
       - id: detect-private-key
   - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: "v2.6.2"
+    rev: "v2.7.1"
     hooks:
       - id: prettier
         types_or:
@@ -37,33 +37,33 @@ repos:
         exclude: "(^Pipfile\\.lock$)"
   # Python hooks
   - repo: https://github.com/asottile/reorder_python_imports
-    rev: v3.1.0
+    rev: v3.9.0
     hooks:
       - id: reorder-python-imports
         exclude: "(migrations)"
   - repo: https://github.com/asottile/yesqa
-    rev: "v1.3.0"
+    rev: "v1.4.0"
     hooks:
       - id: yesqa
         exclude: "(migrations)"
   - repo: https://github.com/asottile/add-trailing-comma
-    rev: "v2.2.3"
+    rev: "v2.4.0"
     hooks:
       - id: add-trailing-comma
         exclude: "(migrations)"
-  - repo: https://gitlab.com/pycqa/flake8
-    rev: 3.9.2
+  - repo: https://github.com/PyCQA/flake8
+    rev: 6.0.0
     hooks:
       - id: flake8
         files: ^src/
         args:
          - "--config=./src/setup.cfg"
   - repo: https://github.com/psf/black
-    rev: 22.3.0
+    rev: 22.12.0
     hooks:
       - id: black
   - repo: https://github.com/asottile/pyupgrade
-    rev: v2.32.1
+    rev: v3.3.1
     hooks:
       - id: pyupgrade
         exclude: "(migrations)"
@@ -74,13 +74,6 @@ repos:
     rev: v2.10.0
     hooks:
       - id: hadolint
-        args:
-          - --ignore
-          - DL3008 # https://github.com/hadolint/hadolint/wiki/DL3008 (should probably do this at some point)
-          - --ignore
-          - DL3013 # https://github.com/hadolint/hadolint/wiki/DL3013 (should probably do this too at some point)
-          - --ignore
-          - DL3003 # https://github.com/hadolint/hadolint/wiki/DL3003 (seems excessive to use WORKDIR so much)
   # Shell script hooks
   - repo: https://github.com/lovesegfault/beautysh
     rev: v6.2.1
@@ -89,6 +82,6 @@ repos:
       args:
         - "--tab"
   - repo: https://github.com/shellcheck-py/shellcheck-py
-    rev: "v0.8.0.4"
+    rev: "v0.9.0.2"
     hooks:
       - id: shellcheck
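To exercise the bumped hook revisions locally, the standard pre-commit workflow should be enough:

```bash
pipx install pre-commit       # or: python3 -m pip install pre-commit
pre-commit run --all-files    # run every configured hook against the whole repo
pre-commit autoupdate         # what bumps like these rev: changes automate
```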
.python-version (new file) — 1 line

3.8.15
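.python-version is the single-line marker file that pyenv reads; assuming pyenv is installed, the file round-trips like this:

```bash
pyenv install 3.8.15    # build the pinned interpreter
pyenv local 3.8.15      # writes a .python-version containing exactly "3.8.15"
python --version        # inside the repo, now resolves to Python 3.8.15
```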
.readthedocs.yml — deleted (16 lines); former contents:

# .readthedocs.yml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/conf.py

# Optionally set the version of Python and requirements required to build your docs
python:
  version: "3.8"
  install:
    - requirements: docs/requirements.txt
CODEOWNERS

@@ -7,4 +7,3 @@
 /src/ @paperless-ngx/backend
 Pipfile* @paperless-ngx/backend
 *.py @paperless-ngx/backend
-requirements.txt @paperless-ngx/backend
CONTRIBUTING.md

@@ -27,11 +27,11 @@ Please format and test your code! I know it's a hassle, but it makes sure that y
 
 To test your code, execute `pytest` in the src/ directory. This also generates a html coverage report, which you can use to see if you missed anything important during testing.
 
-Before you can run `pytest`, ensure to [properly set up your local environment](https://paperless-ngx.readthedocs.io/en/latest/extending.html#initial-setup-and-first-start).
+Before you can run `pytest`, ensure to [properly set up your local environment](https://docs.paperless-ngx.com/development/#initial-setup-and-first-start).
 
 ## More info:
 
-... is available in the documentation. https://paperless-ngx.readthedocs.io/en/latest/extending.html
+... is available [in the documentation](https://docs.paperless-ngx.com/development).
 
 # Merging PRs
Dockerfile — 135 changed lines

@@ -26,19 +26,44 @@ COPY ./src-ui /src/src-ui
 WORKDIR /src/src-ui
 RUN set -eux \
   && npm update npm -g \
-  && npm ci --no-optional
+  && npm ci --omit=optional
 RUN set -eux \
   && ./node_modules/.bin/ng build --configuration production
 
+FROM --platform=$BUILDPLATFORM python:3.9-slim-bullseye as pipenv-base
+
+# This stage generates the requirements.txt file using pipenv
+# This stage runs once for the native platform, as the outputs are not
+# dependent on target arch
+# This way, pipenv dependencies are not left in the final image
+# nor can pipenv mess up the final image somehow
+# Inputs: None
+
+WORKDIR /usr/src/pipenv
+
+COPY Pipfile* ./
+
+RUN set -eux \
+  && echo "Installing pipenv" \
+  && python3 -m pip install --no-cache-dir --upgrade pipenv==2022.11.30 \
+  && echo "Generating requirement.txt" \
+  && pipenv requirements > requirements.txt
+
 FROM python:3.9-slim-bullseye as main-app
 
 LABEL org.opencontainers.image.authors="paperless-ngx team <hello@paperless-ngx.com>"
-LABEL org.opencontainers.image.documentation="https://paperless-ngx.readthedocs.io/en/latest/"
+LABEL org.opencontainers.image.documentation="https://docs.paperless-ngx.com/"
 LABEL org.opencontainers.image.source="https://github.com/paperless-ngx/paperless-ngx"
 LABEL org.opencontainers.image.url="https://github.com/paperless-ngx/paperless-ngx"
 LABEL org.opencontainers.image.licenses="GPL-3.0-only"
 
 ARG DEBIAN_FRONTEND=noninteractive
+# Buildx provided
+ARG TARGETARCH
+ARG TARGETVARIANT
+
+# Workflow provided
+ARG QPDF_VERSION
 
 #
 # Begin installation and configuration
@@ -53,39 +78,40 @@ COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/*.h /usr/local/include/
 
 # Packages need for running
 ARG RUNTIME_PACKAGES="\
+  # Python
+  python3 \
+  python3-pip \
+  python3-setuptools \
+  # General utils
   curl \
-  file \
+  # Docker specific
+  gosu \
+  # Timezones support
+  tzdata \
   # fonts for text file thumbnail generation
   fonts-liberation \
   gettext \
   ghostscript \
   gnupg \
-  gosu \
   icc-profiles-free \
   imagemagick \
-  media-types \
+  # Image processing
   liblept5 \
-  libpq5 \
-  libxml2 \
   liblcms2-2 \
   libtiff5 \
-  libxslt1.1 \
   libfreetype6 \
   libwebp6 \
   libopenjp2-7 \
   libimagequant0 \
   libraqm0 \
-  libgnutls30 \
   libjpeg62-turbo \
-  optipng \
-  python3 \
-  python3-pip \
-  python3-setuptools \
+  # PostgreSQL
+  libpq5 \
   postgresql-client \
+  # MySQL / MariaDB
+  mariadb-client \
   # For Numpy
   libatlas3-base \
-  # thumbnail size reduction
-  pngquant \
   # OCRmyPDF dependencies
   tesseract-ocr \
   tesseract-ocr-eng \
@@ -93,13 +119,23 @@ ARG RUNTIME_PACKAGES="\
   tesseract-ocr-fra \
   tesseract-ocr-ita \
   tesseract-ocr-spa \
-  tzdata \
   unpaper \
+  pngquant \
+  # pikepdf / qpdf
+  jbig2dec \
+  libxml2 \
+  libxslt1.1 \
+  libgnutls30 \
   # Mime type detection
+  file \
+  libmagic1 \
+  media-types \
   zlib1g \
   # Barcode splitter
   libzbar0 \
-  poppler-utils"
+  poppler-utils \
+  # RapidFuzz on armv7
+  libatomic1"
 
 # Install basic runtime packages.
 # These change very infrequently
@@ -122,20 +158,39 @@ COPY gunicorn.conf.py .
 # These change sometimes, but rarely
 WORKDIR /usr/src/paperless/src/docker/
 
-RUN --mount=type=bind,readwrite,source=docker,target=./ \
-    set -eux \
+COPY [ \
+  "docker/imagemagick-policy.xml", \
+  "docker/supervisord.conf", \
+  "docker/docker-entrypoint.sh", \
+  "docker/docker-prepare.sh", \
+  "docker/paperless_cmd.sh", \
+  "docker/wait-for-redis.py", \
+  "docker/env-from-file.sh", \
+  "docker/management_script.sh", \
+  "docker/flower-conditional.sh", \
+  "docker/install_management_commands.sh", \
+  "/usr/src/paperless/src/docker/" \
+]
+
+RUN set -eux \
   && echo "Configuring ImageMagick" \
-  && cp imagemagick-policy.xml /etc/ImageMagick-6/policy.xml \
+  && mv imagemagick-policy.xml /etc/ImageMagick-6/policy.xml \
   && echo "Configuring supervisord" \
   && mkdir /var/log/supervisord /var/run/supervisord \
-  && cp supervisord.conf /etc/supervisord.conf \
+  && mv supervisord.conf /etc/supervisord.conf \
   && echo "Setting up Docker scripts" \
-  && cp docker-entrypoint.sh /sbin/docker-entrypoint.sh \
+  && mv docker-entrypoint.sh /sbin/docker-entrypoint.sh \
   && chmod 755 /sbin/docker-entrypoint.sh \
-  && cp docker-prepare.sh /sbin/docker-prepare.sh \
+  && mv docker-prepare.sh /sbin/docker-prepare.sh \
   && chmod 755 /sbin/docker-prepare.sh \
-  && cp wait-for-redis.py /sbin/wait-for-redis.py \
+  && mv wait-for-redis.py /sbin/wait-for-redis.py \
   && chmod 755 /sbin/wait-for-redis.py \
+  && mv env-from-file.sh /sbin/env-from-file.sh \
+  && chmod 755 /sbin/env-from-file.sh \
+  && mv paperless_cmd.sh /usr/local/bin/paperless_cmd.sh \
+  && chmod 755 /usr/local/bin/paperless_cmd.sh \
+  && mv flower-conditional.sh /usr/local/bin/flower-conditional.sh \
+  && chmod 755 /usr/local/bin/flower-conditional.sh \
   && echo "Installing managment commands" \
   && chmod +x install_management_commands.sh \
   && ./install_management_commands.sh
@@ -148,27 +203,27 @@ RUN --mount=type=bind,from=qpdf-builder,target=/qpdf \
     --mount=type=bind,from=pikepdf-builder,target=/pikepdf \
     set -eux \
   && echo "Installing qpdf" \
-  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/libqpdf28_*.deb \
-  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/qpdf_*.deb \
+  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/libqpdf29_*.deb \
+  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/qpdf_*.deb \
   && echo "Installing pikepdf and dependencies" \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/pikepdf/wheels/packaging*.whl \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/pikepdf/wheels/lxml*.whl \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/pikepdf/wheels/Pillow*.whl \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/pikepdf/wheels/pyparsing*.whl \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/pikepdf/wheels/pikepdf*.whl \
-  && python -m pip list \
+  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/*.whl \
+  && python3 -m pip list \
   && echo "Installing psycopg2" \
-  && python3 -m pip install --no-cache-dir /psycopg2/usr/src/psycopg2/wheels/psycopg2*.whl \
-  && python -m pip list
+  && python3 -m pip install --no-cache-dir /psycopg2/usr/src/wheels/psycopg2*.whl \
+  && python3 -m pip list
+
+WORKDIR /usr/src/paperless/src/
 
 # Python dependencies
 # Change pretty frequently
-COPY requirements.txt ../
+COPY --from=pipenv-base /usr/src/pipenv/requirements.txt ./
 
 # Packages needed only for building a few quick Python
 # dependencies
 ARG BUILD_PACKAGES="\
   build-essential \
+  git \
+  default-libmysqlclient-dev \
   python3-dev"
 
 RUN set -eux \
@@ -177,7 +232,11 @@ RUN set -eux \
   && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
   && python3 -m pip install --no-cache-dir --upgrade wheel \
   && echo "Installing Python requirements" \
-  && python3 -m pip install --default-timeout=1000 --no-cache-dir -r ../requirements.txt \
+  && python3 -m pip install --default-timeout=1000 --no-cache-dir --requirement requirements.txt \
+  && echo "Installing NLTK data" \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" snowball_data \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" stopwords \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" punkt \
   && echo "Cleaning up image" \
   && apt-get -y purge ${BUILD_PACKAGES} \
   && apt-get -y autoremove --purge \
@@ -188,8 +247,6 @@ RUN set -eux \
   && rm -rf /var/cache/apt/archives/* \
   && truncate -s 0 /var/log/*log
 
-WORKDIR /usr/src/paperless/src/
-
 # copy backend
 COPY ./src ./
@@ -213,4 +270,4 @@ ENTRYPOINT ["/sbin/docker-entrypoint.sh"]
 
 EXPOSE 8000
 
-CMD ["/usr/local/bin/supervisord", "-c", "/etc/supervisord.conf"]
+CMD ["/usr/local/bin/paperless_cmd.sh"]
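The new qpdf .deb path above is keyed on the buildx-provided TARGETARCH/TARGETVARIANT args, so each platform resolves to its own directory. A sketch of how the path resolves, using the documented buildx platform values and a hypothetical qpdf version (neither is taken from this diff):

```bash
# linux/amd64  -> TARGETARCH=amd64 TARGETVARIANT=""  -> .../qpdf/<ver>/amd64/
# linux/arm64  -> TARGETARCH=arm64 TARGETVARIANT=""  -> .../qpdf/<ver>/arm64/
# linux/arm/v7 -> TARGETARCH=arm   TARGETVARIANT=v7  -> .../qpdf/<ver>/armv7/
QPDF_VERSION="11.1.1"   # illustrative only
TARGETARCH="arm"
TARGETVARIANT="v7"
echo "/qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/libqpdf29_*.deb"
```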
Pipfile — 62 changed lines

@@ -10,62 +10,74 @@ name = "piwheels"
 
 [packages]
 dateparser = "~=1.1"
-django = "~=4.0"
+django = "~=4.1"
 django-cors-headers = "*"
 django-extensions = "*"
-django-filter = "~=21.1"
-django-q = "~=1.3"
-djangorestframework = "~=3.13"
+django-filter = "~=22.1"
+djangorestframework = "~=3.14"
 filelock = "*"
-fuzzywuzzy = {extras = ["speedup"], version = "*"}
 gunicorn = "*"
-imap-tools = "~=0.54.0"
+imap-tools = "*"
 langdetect = "*"
 pathvalidate = "*"
-pillow = "~=9.1"
-# Any version update to pikepdf requires a base image update
-pikepdf = "~=5.1"
+pillow = "~=9.3"
+pikepdf = "*"
 python-gnupg = "*"
 python-dotenv = "*"
 python-dateutil = "*"
 python-magic = "*"
-# Any version update to psycopg2 requires a base image update
 psycopg2 = "*"
-redis = "*"
-# Pinned because aarch64 wheels and updates cause warnings when loading the classifier model.
-scikit-learn="==1.0.2"
-whitenoise = "~=6.0.0"
-watchdog = "~=2.1.0"
-whoosh="~=2.7.4"
+rapidfuzz = "*"
+redis = {extras = ["hiredis"], version = "*"}
+scikit-learn = "~=1.1"
+numpy = "*"
+whitenoise = "~=6.2"
+watchdog = "~=2.1"
+whoosh="~=2.7"
 inotifyrecursive = "~=0.3"
-ocrmypdf = "~=13.4"
+ocrmypdf = "~=14.0"
 tqdm = "*"
 tika = "*"
 # TODO: This will sadly also install daphne+dependencies,
 # which an ASGI server we don't need. Adds about 15MB image size.
 channels = "~=3.0"
-channels-redis = "*"
 uvicorn = {extras = ["standard"], version = "*"}
 concurrent-log-handler = "*"
 "pdfminer.six" = "*"
-"backports.zoneinfo" = {version = "*", markers = "python_version < '3.9'"}
-"importlib-resources" = {version = "*", markers = "python_version < '3.9'"}
-zipp = {version = "*", markers = "python_version < '3.9'"}
 pyzbar = "*"
+mysqlclient = "*"
+celery = {extras = ["redis"], version = "*"}
+django-celery-results = "*"
+setproctitle = "*"
+nltk = "*"
 pdf2image = "*"
+flower = "*"
+bleach = "*"
+
+#
+# Packages locked due to issues (try to check if these are fixed in a release every so often)
+#
+
+# Pin this until piwheels is building 1.9 (see https://www.piwheels.org/project/scipy/)
+scipy = "==1.8.1"
+
+# Newer versions aren't builting yet (see https://www.piwheels.org/project/cryptography/)
+cryptography = "==38.0.1"
+
+# Locked version until https://github.com/django/channels_redis/issues/332
+# is resolved
+channels-redis = "==3.4.1"
 
 [dev-packages]
 coveralls = "*"
 factory-boy = "*"
-pycodestyle = "*"
 pytest = "*"
 pytest-cov = "*"
 pytest-django = "*"
 pytest-env = "*"
 pytest-sugar = "*"
 pytest-xdist = "*"
-sphinx = "~=4.5.0"
-sphinx_rtd_theme = "*"
-tox = "*"
 black = "*"
 pre-commit = "*"
+imagehash = "*"
+mkdocs-material = "*"
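After dependency edits like these, the lockfile has to be regenerated and re-synced; the usual pipenv round-trip would be:

```bash
pipenv lock          # regenerate Pipfile.lock from the edited Pipfile
pipenv sync --dev    # install exactly the locked versions, including dev packages
pipenv requirements > requirements.txt   # export, as the Dockerfile's pipenv stage now does
```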
Pipfile.lock (generated) — 3110 changed lines (diff not shown)
README.md — 33 changed lines

@@ -1,8 +1,9 @@
 [](https://github.com/paperless-ngx/paperless-ngx/actions)
 [](https://crowdin.com/project/paperless-ngx)
-[](https://paperless-ngx.readthedocs.io/en/latest/?badge=latest)
+[](https://docs.paperless-ngx.com)
 [](https://coveralls.io/github/paperless-ngx/paperless-ngx?branch=master)
-[](https://matrix.to/#/#paperless:adnidor.de)
+[](https://matrix.to/#/%23paperlessngx%3Amatrix.org)
+[](https://demo.paperless-ngx.com)
 
 <p align="center">
   <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png#gh-light-mode-only" width="50%" />
@@ -32,13 +33,13 @@ A demo is available at [demo.paperless-ngx.com](https://demo.paperless-ngx.com)
 
 # Features
 
 
 
 
 - Organize and index your scanned documents with tags, correspondents, types, and more.
 - Performs OCR on your documents, adds selectable text to image only documents and adds tags, correspondents and document types to your documents.
 - Supports PDF documents, images, plain text files, and Office documents (Word, Excel, Powerpoint, and LibreOffice equivalents).
-  - Office document support is optional and provided by Apache Tika (see [configuration](https://paperless-ngx.readthedocs.io/en/latest/configuration.html#tika-settings))
+  - Office document support is optional and provided by Apache Tika (see [configuration](https://docs.paperless-ngx.com/configuration/#tika))
 - Paperless stores your documents plain on disk. Filenames and folders are managed by paperless and their format can be configured freely.
 - Single page application front end.
 - Includes a dashboard that shows basic statistics and has document upload.
@@ -56,7 +57,7 @@ A demo is available at [demo.paperless-ngx.com](https://demo.paperless-ngx.com)
 - Paperless-ngx learns from your documents and will be able to automatically assign tags, correspondents and types to documents once you've stored a few documents in paperless.
 - Optimized for multi core systems: Paperless-ngx consumes multiple documents in parallel.
 - The integrated sanity checker makes sure that your document archive is in good health.
-- [More screenshots are available in the documentation](https://paperless-ngx.readthedocs.io/en/latest/screenshots.html).
+- [More screenshots are available in the documentation](https://docs.paperless-ngx.com/#screenshots).
 
 # Getting started
@@ -68,19 +69,19 @@ If you'd like to jump right in, you can configure a docker-compose environment w
 bash -c "$(curl -L https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/install-paperless-ngx.sh)"
 ```
 
-Alternatively, you can install the dependencies and setup apache and a database server yourself. The [documentation](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation) has a step by step guide on how to do it.
+Alternatively, you can install the dependencies and setup apache and a database server yourself. The [documentation](https://docs.paperless-ngx.com/setup/#installation) has a step by step guide on how to do it.
 
-Migrating from Paperless-ng is easy, just drop in the new docker image! See the [documentation on migrating](https://paperless-ngx.readthedocs.io/en/latest/setup.html#migrating-from-paperless-ng) for more details.
+Migrating from Paperless-ng is easy, just drop in the new docker image! See the [documentation on migrating](https://docs.paperless-ngx.com/setup/#migrating-to-paperless-ngx) for more details.
 
 <!-- omit in toc -->
 
 ### Documentation
 
-The documentation for Paperless-ngx is available on [ReadTheDocs](https://paperless-ngx.readthedocs.io/).
+The documentation for Paperless-ngx is available at [https://docs.paperless-ngx.com](https://docs.paperless-ngx.com/).
 
 # Contributing
 
-If you feel like contributing to the project, please do! Bug fixes, enhancements, visual fixes etc. are always welcome. If you want to implement something big: Please start a discussion about that! The [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html) has some basic information on how to get started.
+If you feel like contributing to the project, please do! Bug fixes, enhancements, visual fixes etc. are always welcome. If you want to implement something big: Please start a discussion about that! The [documentation](https://docs.paperless-ngx.com/development/) has some basic information on how to get started.
 
 ## Community Support
@@ -102,18 +103,10 @@ For bugs please [open an issue](https://github.com/paperless-ngx/paperless-ngx/i
 
 Paperless has been around a while now, and people are starting to build stuff on top of it. If you're one of those people, we can add your project to this list:
 
-- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ngx.
+- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ng.
 - [Paperless Share](https://github.com/qcasey/paperless_share). Share any files from your Android application with paperless. Very simple, but works with all of the mobile scanning apps out there that allow you to share scanned documents.
 - [Scan to Paperless](https://github.com/sbrunner/scan-to-paperless): Scan and prepare (crop, deskew, OCR, ...) your documents for Paperless.
+- [Paperless Mobile](https://github.com/astubenbord/paperless-mobile): A modern, feature rich mobile application for Paperless.
-
-These projects also exist, but their status and compatibility with paperless-ngx is unknown.
-
-- [paperless-cli](https://github.com/stgarf/paperless-cli): A golang command line binary to interact with a Paperless instance.
-
-This project also exists, but needs updates to be compatible with paperless-ngx.
-
-- [Paperless Desktop](https://github.com/thomasbrueggemann/paperless-desktop): A desktop UI for your Paperless installation. Runs on Mac, Linux, and Windows.
-  Known issues on Mac: (Could not load reminders and documents)
 
 # Important Note
build-docker-image.sh

@@ -10,9 +10,9 @@
 # Example Usage:
 #	./build-docker-image.sh Dockerfile -t paperless-ngx:my-awesome-feature
 
-set -eux
+set -eu
 
-if ! command -v jq; then
+if ! command -v jq &> /dev/null ; then
 	echo "jq required"
 	exit 1
 elif [ ! -f "$1" ]; then
@@ -20,23 +20,62 @@ elif [ ! -f "$1" ]; then
 	exit 1
 fi
 
-# Parse what we can from Pipfile.lock
-pikepdf_version=$(jq ".default.pikepdf.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-psycopg2_version=$(jq ".default.psycopg2.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-# Read this from the other config file
-qpdf_version=$(jq ".qpdf.version" .build-config.json | sed 's/"//g')
-jbig2enc_version=$(jq ".jbig2enc.version" .build-config.json | sed 's/"//g')
 # Get the branch name (used for caching)
 branch_name=$(git rev-parse --abbrev-ref HEAD)
 
-# https://docs.docker.com/develop/develop-images/build_enhancements/
-# Required to use cache-from
-export DOCKER_BUILDKIT=1
+# Parse eithe Pipfile.lock or the .build-config.json
+jbig2enc_version=$(jq -r '.jbig2enc.version' .build-config.json)
+qpdf_version=$(jq -r '.qpdf.version' .build-config.json)
+psycopg2_version=$(jq -r '.default.psycopg2.version | gsub("=";"")' Pipfile.lock)
+pikepdf_version=$(jq -r '.default.pikepdf.version | gsub("=";"")' Pipfile.lock)
+pillow_version=$(jq -r '.default.pillow.version | gsub("=";"")' Pipfile.lock)
+lxml_version=$(jq -r '.default.lxml.version | gsub("=";"")' Pipfile.lock)
 
-docker build --file "$1" \
-	--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:"${branch_name}" \
-	--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:dev \
-	--build-arg JBIG2ENC_VERSION="${jbig2enc_version}" \
-	--build-arg QPDF_VERSION="${qpdf_version}" \
-	--build-arg PIKEPDF_VERSION="${pikepdf_version}" \
-	--build-arg PSYCOPG2_VERSION="${psycopg2_version}" "${@:2}" .
+base_filename="$(basename -- "${1}")"
+build_args_str=""
+cache_from_str=""
+
+case "${base_filename}" in
+
+	*.jbig2enc)
+		build_args_str="--build-arg JBIG2ENC_VERSION=${jbig2enc_version}"
+		cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/jbig2enc:${jbig2enc_version}"
+		;;
+
+	*.psycopg2)
+		build_args_str="--build-arg PSYCOPG2_VERSION=${psycopg2_version}"
+		cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/psycopg2:${psycopg2_version}"
+		;;
+
+	*.qpdf)
+		build_args_str="--build-arg QPDF_VERSION=${qpdf_version}"
+		cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:${qpdf_version}"
+		;;
+
+	*.pikepdf)
+		build_args_str="--build-arg QPDF_VERSION=${qpdf_version} --build-arg PIKEPDF_VERSION=${pikepdf_version} --build-arg PILLOW_VERSION=${pillow_version} --build-arg LXML_VERSION=${lxml_version}"
+		cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/pikepdf:${pikepdf_version}"
+		;;
+
+	Dockerfile)
+		build_args_str="--build-arg QPDF_VERSION=${qpdf_version} --build-arg PIKEPDF_VERSION=${pikepdf_version} --build-arg PSYCOPG2_VERSION=${psycopg2_version} --build-arg JBIG2ENC_VERSION=${jbig2enc_version}"
+		cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:${branch_name} --cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:dev"
+		;;
+
+	*)
+		echo "Unable to match ${base_filename}"
+		exit 1
+		;;
+esac
+
+read -r -a build_args_arr <<< "${build_args_str}"
+read -r -a cache_from_arr <<< "${cache_from_str}"
+
+set -eux
+
+docker buildx build --file "${1}" \
+	--progress=plain \
+	--output=type=docker \
+	"${cache_from_arr[@]}" \
+	"${build_args_arr[@]}" \
+	"${@:2}" .
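Usage follows the example in the script's own header comment: the basename of the Dockerfile argument selects the matching case branch, which picks the right build args and cache image. The tags below are illustrative:

```bash
# Main application image (hits the "Dockerfile" branch of the case statement)
./build-docker-image.sh Dockerfile -t paperless-ngx:my-awesome-feature

# A builder image (hits the "*.qpdf" branch)
./build-docker-image.sh docker-builders/Dockerfile.qpdf -t paperless-qpdf:local
```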
Deleted file (a Dockerfile that compiled the frontend) — 14 lines; former contents:

# This Dockerfile compiles the frontend
# Inputs: None

FROM node:16-bullseye-slim AS compile-frontend

COPY ./src /src/src
COPY ./src-ui /src/src-ui

WORKDIR /src/src-ui
RUN set -eux \
	&& npm update npm -g \
	&& npm ci --no-optional
RUN set -eux \
	&& ./node_modules/.bin/ng build --configuration production
docker-builders/Dockerfile.jbig2enc

@@ -7,6 +7,7 @@ FROM debian:bullseye-slim as main
 LABEL org.opencontainers.image.description="A intermediate image with jbig2enc built"
 
 ARG DEBIAN_FRONTEND=noninteractive
+ARG JBIG2ENC_VERSION
 
 ARG BUILD_PACKAGES="\
 	build-essential \
@@ -19,21 +20,16 @@ ARG BUILD_PACKAGES="\
 
 WORKDIR /usr/src/jbig2enc
 
-# As this is an base image for a multi-stage final image
-# the added size of the install is basically irrelevant
-RUN apt-get update --quiet \
-	&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
-	&& rm -rf /var/lib/apt/lists/*
-
-# Layers after this point change according to required version
-# For better caching, seperate the basic installs from
-# the building
-
-ARG JBIG2ENC_VERSION
-
 RUN set -eux \
-	&& git clone --quiet --branch $JBIG2ENC_VERSION https://github.com/agl/jbig2enc .
-RUN set -eux \
-	&& ./autogen.sh
-RUN set -eux \
-	&& ./configure && make
+	&& echo "Installing build tools" \
+	&& apt-get update --quiet \
+	&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
+	&& echo "Building jbig2enc" \
+	&& git clone --quiet --branch $JBIG2ENC_VERSION https://github.com/agl/jbig2enc . \
+	&& ./autogen.sh \
+	&& ./configure \
+	&& make \
+	&& echo "Cleaning up image" \
+	&& apt-get -y purge ${BUILD_PACKAGES} \
+	&& apt-get -y autoremove --purge \
+	&& rm -rf /var/lib/apt/lists/*
@@ -2,8 +2,7 @@
|
|||||||
# Inputs:
|
# Inputs:
|
||||||
# - REPO - Docker repository to pull qpdf from
|
# - REPO - Docker repository to pull qpdf from
|
||||||
# - QPDF_VERSION - The image qpdf version to copy .deb files from
|
# - QPDF_VERSION - The image qpdf version to copy .deb files from
|
||||||
# - PIKEPDF_GIT_TAG - The Git tag to clone and build from
|
# - PIKEPDF_VERSION - Version of pikepdf to build wheel for
|
||||||
# - PIKEPDF_VERSION - Used to force the built pikepdf version to match
|
|
||||||
|
|
||||||
# Default to pulling from the main repo registry when manually building
|
# Default to pulling from the main repo registry when manually building
|
||||||
ARG REPO="paperless-ngx/paperless-ngx"
|
ARG REPO="paperless-ngx/paperless-ngx"
|
||||||
@@ -17,13 +16,23 @@ FROM python:3.9-slim-bullseye as main
|
|||||||
|
|
||||||
LABEL org.opencontainers.image.description="A intermediate image with pikepdf wheel built"
|
LABEL org.opencontainers.image.description="A intermediate image with pikepdf wheel built"
|
||||||
|
|
||||||
|
# Buildx provided
|
||||||
|
ARG TARGETARCH
|
||||||
|
ARG TARGETVARIANT
|
||||||
|
|
||||||
ARG DEBIAN_FRONTEND=noninteractive
|
ARG DEBIAN_FRONTEND=noninteractive
|
||||||
|
# Workflow provided
|
||||||
|
ARG QPDF_VERSION
|
||||||
|
ARG PIKEPDF_VERSION
|
||||||
|
# These are not used, but will still bust the cache if one changes
|
||||||
|
# Otherwise, the main image will try to build thing (and fail)
|
||||||
|
ARG PILLOW_VERSION
|
||||||
|
ARG LXML_VERSION
|
||||||
|
|
||||||
ARG BUILD_PACKAGES="\
|
ARG BUILD_PACKAGES="\
|
||||||
build-essential \
|
build-essential \
|
||||||
python3-dev \
|
python3-dev \
|
||||||
python3-pip \
|
python3-pip \
|
||||||
git \
|
|
||||||
# qpdf requirement - https://github.com/qpdf/qpdf#crypto-providers
|
# qpdf requirement - https://github.com/qpdf/qpdf#crypto-providers
|
||||||
libgnutls28-dev \
|
libgnutls28-dev \
|
||||||
# lxml requrements - https://lxml.de/installation.html
|
# lxml requrements - https://lxml.de/installation.html
|
||||||
@@ -51,42 +60,43 @@ ARG BUILD_PACKAGES="\

 WORKDIR /usr/src

-COPY --from=qpdf-builder /usr/src/qpdf/*.deb ./
+COPY --from=qpdf-builder /usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/*.deb ./

 # As this is a base image for a multi-stage final image
 # the added size of the install is basically irrelevant

 RUN set -eux \
-    && apt-get update --quiet \
-    && apt-get install --yes --quiet --no-install-recommends $BUILD_PACKAGES \
-    && dpkg --install libqpdf28_*.deb \
-    && dpkg --install libqpdf-dev_*.deb \
-    && python3 -m pip install --no-cache-dir --upgrade \
-      pip \
-      wheel \
-    # https://pikepdf.readthedocs.io/en/latest/installation.html#requirements
-      pybind11 \
-    && rm -rf /var/lib/apt/lists/*
-# Layers after this point change according to required version
-# For better caching, separate the basic installs from
-# the building
-ARG PIKEPDF_GIT_TAG
-ARG PIKEPDF_VERSION
-RUN set -eux \
-    && echo "building pikepdf wheel" \
-    # Note the v in the tag name here
-    && git clone --quiet --depth 1 --branch "${PIKEPDF_GIT_TAG}" https://github.com/pikepdf/pikepdf.git \
-    && cd pikepdf \
-    # pikepdf seems to specify either a next version when built OR
-    # a post release tag.
-    # In either case, this won't match what we want from requirements.txt
-    # Directly modify the setup.py to set the version we just checked out of Git
-    && sed -i "s/use_scm_version=True/version=\"${PIKEPDF_VERSION}\"/g" setup.py \
-    # https://github.com/pikepdf/pikepdf/issues/323
-    && rm pyproject.toml \
-    && mkdir wheels \
-    && python3 -m pip wheel . --wheel-dir wheels \
-    && ls -ahl wheels
+    && echo "Installing build tools" \
+    && apt-get update --quiet \
+    && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
+    && echo "Installing qpdf" \
+    && dpkg --install libqpdf29_*.deb \
+    && dpkg --install libqpdf-dev_*.deb \
+    && echo "Installing Python tools" \
+    && python3 -m pip install --no-cache-dir --upgrade \
+      pip \
+      wheel \
+    # https://pikepdf.readthedocs.io/en/latest/installation.html#requirements
+      pybind11 \
+    && echo "Building pikepdf wheel ${PIKEPDF_VERSION}" \
+    && mkdir wheels \
+    && python3 -m pip wheel \
+    # Build the package at the required version
+      pikepdf==${PIKEPDF_VERSION} \
+    # Look to piwheels for additional pre-built wheels
+      --extra-index-url https://www.piwheels.org/simple \
+    # Output the *.whl into this directory
+      --wheel-dir wheels \
+    # Do not use a binary package for the package being built
+      --no-binary=pikepdf \
+    # Do use binary packages for dependencies
+      --prefer-binary \
+    # Don't cache build files
+      --no-cache-dir \
+    && ls -ahl wheels \
+    && echo "Gathering package data" \
+    && dpkg-query -f '${Package;-40}${Version}\n' -W > ./wheels/pkg-list.txt \
+    && echo "Cleaning up image" \
+    && apt-get -y purge ${BUILD_PACKAGES} \
+    && apt-get -y autoremove --purge \
+    && rm -rf /var/lib/apt/lists/*
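The `pip wheel` flags above form a reusable pattern: build only the named package from source while letting every dependency arrive as a pre-built wheel. A minimal sketch of the same invocation outside Docker (the pinned version and output directory are illustrative):

    # build one package from source, prefer binary wheels for its dependencies
    python3 -m pip wheel \
        pikepdf==5.6.0 \
        --no-binary=pikepdf \
        --prefer-binary \
        --no-cache-dir \
        --wheel-dir ./wheels
    ls -ahl ./wheels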

@@ -1,17 +1,16 @@
 # This Dockerfile builds the psycopg2 wheel
 # Inputs:
-#  - PSYCOPG2_GIT_TAG - The Git tag to clone and build from
-#  - PSYCOPG2_VERSION - Unused, kept for future possible usage
+#  - PSYCOPG2_VERSION - Version to build

 FROM python:3.9-slim-bullseye as main

 LABEL org.opencontainers.image.description="An intermediate image with psycopg2 wheel built"

+ARG PSYCOPG2_VERSION
 ARG DEBIAN_FRONTEND=noninteractive

 ARG BUILD_PACKAGES="\
   build-essential \
-  git \
   python3-dev \
   python3-pip \
   # https://www.psycopg.org/docs/install.html#prerequisites
@@ -23,23 +22,29 @@ WORKDIR /usr/src

 # the added size of the install is basically irrelevant

 RUN set -eux \
-    && apt-get update --quiet \
-    && apt-get install --yes --quiet --no-install-recommends $BUILD_PACKAGES \
-    && rm -rf /var/lib/apt/lists/* \
-    && python3 -m pip install --no-cache-dir --upgrade pip wheel
-# Layers after this point change according to required version
-# For better caching, separate the basic installs from
-# the building
-ARG PSYCOPG2_GIT_TAG
-ARG PSYCOPG2_VERSION
-RUN set -eux \
-    && echo "Building psycopg2 wheel" \
-    && cd /usr/src \
-    && git clone --quiet --depth 1 --branch ${PSYCOPG2_GIT_TAG} https://github.com/psycopg/psycopg2.git \
-    && cd psycopg2 \
-    && mkdir wheels \
-    && python3 -m pip wheel . --wheel-dir wheels \
-    && ls -ahl wheels/
+    && echo "Installing build tools" \
+    && apt-get update --quiet \
+    && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
+    && echo "Installing Python tools" \
+    && python3 -m pip install --no-cache-dir --upgrade pip wheel \
+    && echo "Building psycopg2 wheel ${PSYCOPG2_VERSION}" \
+    && cd /usr/src \
+    && mkdir wheels \
+    && python3 -m pip wheel \
+    # Build the package at the required version
+      psycopg2==${PSYCOPG2_VERSION} \
+    # Output the *.whl into this directory
+      --wheel-dir wheels \
+    # Do not use a binary package for the package being built
+      --no-binary=psycopg2 \
+    # Do use binary packages for dependencies
+      --prefer-binary \
+    # Don't cache build files
+      --no-cache-dir \
+    && ls -ahl wheels/ \
+    && echo "Gathering package data" \
+    && dpkg-query -f '${Package;-40}${Version}\n' -W > ./wheels/pkg-list.txt \
+    && echo "Cleaning up image" \
+    && apt-get -y purge ${BUILD_PACKAGES} \
+    && apt-get -y autoremove --purge \
+    && rm -rf /var/lib/apt/lists/*

@@ -1,53 +1,156 @@
-FROM debian:bullseye-slim as main
+#
+# Stage: pre-build
+# Purpose:
+#  - Installs common packages
+#  - Sets common environment variables related to dpkg
+#  - Acquires the qpdf source from bookworm
+# Useful Links:
+#  - https://qpdf.readthedocs.io/en/stable/installation.html#system-requirements
+#  - https://wiki.debian.org/Multiarch/HOWTO
+#  - https://wiki.debian.org/CrossCompiling
+#

-LABEL org.opencontainers.image.description="An intermediate image with qpdf built"
+FROM debian:bullseye-slim as pre-build

-ARG DEBIAN_FRONTEND=noninteractive
+ARG QPDF_VERSION

-ARG BUILD_PACKAGES="\
-  build-essential \
+ARG COMMON_BUILD_PACKAGES="\
+  cmake \
   debhelper \
   debian-keyring \
   devscripts \
-  equivs \
-  libtool \
-  # https://qpdf.readthedocs.io/en/stable/installation.html#system-requirements
-  libjpeg62-turbo-dev \
-  libgnutls28-dev \
+  dpkg-dev \
+  equivs \
   packaging-dev \
-  zlib1g-dev"
+  libtool"

+ENV DEB_BUILD_OPTIONS="terse nocheck nodoc parallel=2"

 WORKDIR /usr/src

-# As this is a base image for a multi-stage final image
-# the added size of the install is basically irrelevant
+RUN set -eux \
+    && echo "Installing common packages" \
+    && apt-get update --quiet \
+    && apt-get install --yes --quiet --no-install-recommends ${COMMON_BUILD_PACKAGES} \
+    && echo "Getting qpdf source" \
+    && echo "deb-src http://deb.debian.org/debian/ bookworm main" > /etc/apt/sources.list.d/bookworm-src.list \
+    && apt-get update --quiet \
+    && apt-get source --yes --quiet qpdf=${QPDF_VERSION}-1/bookworm

+#
+# Stage: amd64-builder
+# Purpose: Builds qpdf for x86_64 (native build)
+#
+FROM pre-build as amd64-builder

+ARG AMD64_BUILD_PACKAGES="\
+  build-essential \
+  libjpeg62-turbo-dev:amd64 \
+  libgnutls28-dev:amd64 \
+  zlib1g-dev:amd64"

+WORKDIR /usr/src/qpdf-${QPDF_VERSION}

 RUN set -eux \
-    && apt-get update --quiet \
-    && apt-get install --yes --quiet --no-install-recommends $BUILD_PACKAGES \
-    && rm -rf /var/lib/apt/lists/*
+    && echo "Beginning amd64" \
+    && echo "Install amd64 packages" \
+    && apt-get update --quiet \
+    && apt-get install --yes --quiet --no-install-recommends ${AMD64_BUILD_PACKAGES} \
+    && echo "Building amd64" \
+    && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean \
+    && echo "Removing debug files" \
+    && rm -f ../libqpdf29-dbgsym* \
+    && rm -f ../qpdf-dbgsym* \
+    && echo "Gathering package data" \
+    && dpkg-query -f '${Package;-40}${Version}\n' -W > ../pkg-list.txt

+#
+# Stage: armhf-builder
+# Purpose:
+#  - Sets armhf specific environment
+#  - Builds qpdf for armhf (cross compile)
+#
+FROM pre-build as armhf-builder

-# Layers after this point change according to required version
-# For better caching, separate the basic installs from
-# the building
+ARG ARMHF_PACKAGES="\
+  crossbuild-essential-armhf \
+  libjpeg62-turbo-dev:armhf \
+  libgnutls28-dev:armhf \
+  zlib1g-dev:armhf"

+WORKDIR /usr/src/qpdf-${QPDF_VERSION}

+ENV CXX="/usr/bin/arm-linux-gnueabihf-g++" \
+    CC="/usr/bin/arm-linux-gnueabihf-gcc"

+RUN set -eux \
+    && echo "Beginning armhf" \
+    && echo "Install armhf packages" \
+    && dpkg --add-architecture armhf \
+    && apt-get update --quiet \
+    && apt-get install --yes --quiet --no-install-recommends ${ARMHF_PACKAGES} \
+    && echo "Building armhf" \
+    && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean --host-arch armhf \
+    && echo "Removing debug files" \
+    && rm -f ../libqpdf29-dbgsym* \
+    && rm -f ../qpdf-dbgsym* \
+    && echo "Gathering package data" \
+    && dpkg-query -f '${Package;-40}${Version}\n' -W > ../pkg-list.txt

+#
+# Stage: aarch64-builder
+# Purpose:
+#  - Sets aarch64 specific environment
+#  - Builds qpdf for aarch64 (cross compile)
+#
+FROM pre-build as aarch64-builder

+ARG ARM64_PACKAGES="\
+  crossbuild-essential-arm64 \
+  libjpeg62-turbo-dev:arm64 \
+  libgnutls28-dev:arm64 \
+  zlib1g-dev:arm64"

+ENV CXX="/usr/bin/aarch64-linux-gnu-g++" \
+    CC="/usr/bin/aarch64-linux-gnu-gcc"

+WORKDIR /usr/src/qpdf-${QPDF_VERSION}

+RUN set -eux \
+    && echo "Beginning arm64" \
+    && echo "Install arm64 packages" \
+    && dpkg --add-architecture arm64 \
+    && apt-get update --quiet \
+    && apt-get install --yes --quiet --no-install-recommends ${ARM64_PACKAGES} \
+    && echo "Building arm64" \
+    && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean --host-arch arm64 \
+    && echo "Removing debug files" \
+    && rm -f ../libqpdf29-dbgsym* \
+    && rm -f ../qpdf-dbgsym* \
+    && echo "Gathering package data" \
+    && dpkg-query -f '${Package;-40}${Version}\n' -W > ../pkg-list.txt

+#
+# Stage: package
+# Purpose: Holds the compiled .deb files in arch/variant specific folders
+#
+FROM alpine:3.17 as package

+LABEL org.opencontainers.image.description="An image with qpdf installers stored in architecture & version specific folders"

-# This must match to pikepdf's minimum at least
 ARG QPDF_VERSION

-# In order to get the required version of qpdf, it is backported from bookworm
-# and then built from source
-RUN set -eux \
-    && echo "Building qpdf" \
-    && echo "deb-src http://deb.debian.org/debian/ bookworm main" > /etc/apt/sources.list.d/bookworm-src.list \
-    && apt-get update \
-    && mkdir qpdf \
-    && cd qpdf \
-    && apt-get source --yes --quiet qpdf=${QPDF_VERSION}-1/bookworm \
-    && rm -rf /var/lib/apt/lists/* \
-    && cd qpdf-$QPDF_VERSION \
-    # We don't need to build the tests (also don't run them)
-    && rm -rf libtests \
-    && DEBEMAIL=hello@paperless-ngx.com debchange --bpo \
-    && export DEB_BUILD_OPTIONS="terse nocheck nodoc parallel=2" \
-    && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes \
-    && ls -ahl ../*.deb
+WORKDIR /usr/src/qpdf/${QPDF_VERSION}/amd64
+
+COPY --from=amd64-builder /usr/src/*.deb ./
+COPY --from=amd64-builder /usr/src/pkg-list.txt ./
+
+# Note this is ${TARGETARCH}${TARGETVARIANT} for armv7
+WORKDIR /usr/src/qpdf/${QPDF_VERSION}/armv7
+
+COPY --from=armhf-builder /usr/src/*.deb ./
+COPY --from=armhf-builder /usr/src/pkg-list.txt ./
+
+WORKDIR /usr/src/qpdf/${QPDF_VERSION}/arm64
+
+COPY --from=aarch64-builder /usr/src/*.deb ./
+COPY --from=aarch64-builder /usr/src/pkg-list.txt ./
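Each builder stage is the same `dpkg-buildpackage` recipe with a different toolchain swapped in via `CC`/`CXX` and `--host-arch`. A condensed sketch of the armhf path under the same assumptions (run as root inside a Debian bullseye container, from an unpacked qpdf source tree; package versions illustrative):

    # cross-compile a Debian source package for armhf
    dpkg --add-architecture armhf
    apt-get update --quiet
    apt-get install --yes --no-install-recommends crossbuild-essential-armhf \
        libjpeg62-turbo-dev:armhf libgnutls28-dev:armhf zlib1g-dev:armhf
    export CC=/usr/bin/arm-linux-gnueabihf-gcc CXX=/usr/bin/arm-linux-gnueabihf-g++
    dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean --host-arch armhf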

docker/compose/docker-compose.ci-test.yml (new file, 25 lines)
@@ -0,0 +1,25 @@
+# docker-compose file for running paperless testing with actual gotenberg
+# and Tika containers for a more end-to-end test of the Tika related functionality
+# Can be used locally or by the CI to start the necessary containers with the
+# correct networking for the tests
+
+version: "3.7"
+services:
+  gotenberg:
+    image: docker.io/gotenberg/gotenberg:7.6
+    hostname: gotenberg
+    container_name: gotenberg
+    network_mode: host
+    restart: unless-stopped
+    # The gotenberg chromium route is used to convert .eml files. We do not
+    # want to allow external content like tracking pixels or even javascript.
+    command:
+      - "gotenberg"
+      - "--chromium-disable-javascript=true"
+      - "--chromium-allow-list=file:///tmp/.*"
+  tika:
+    image: ghcr.io/paperless-ngx/tika:latest
+    hostname: tika
+    container_name: tika
+    network_mode: host
+    restart: unless-stopped
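Per the header comments, the same file works outside CI; a plausible local invocation (path taken from the file header above):

    docker-compose --file docker/compose/docker-compose.ci-test.yml up --detach
    # ...run the Tika-related tests against localhost...
    docker-compose --file docker/compose/docker-compose.ci-test.yml down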

@@ -36,3 +36,7 @@
 # The default language to use for OCR. Set this to the language most of your
 # documents are written in.
 #PAPERLESS_OCR_LANGUAGE=eng
+
+# Set if accessing paperless via a domain subpath e.g. https://domain.com/PATHPREFIX and using a reverse-proxy like traefik or nginx
+#PAPERLESS_FORCE_SCRIPT_NAME=/PATHPREFIX
+#PAPERLESS_STATIC_URL=/PATHPREFIX/static/ # trailing slash required

docker/compose/docker-compose.mariadb-tika.yml (new file, 103 lines)
@@ -0,0 +1,103 @@
+# docker-compose file for running paperless from the Docker Hub.
+# This file contains everything paperless needs to run.
+# Paperless supports amd64, arm and arm64 hardware.
+#
+# All compose files of paperless configure paperless in the following way:
+#
+# - Paperless is (re)started on system boot, if it was running before shutdown.
+# - Docker volumes for storing data are managed by Docker.
+# - Folders for importing and exporting files are created in the same directory
+#   as this file and mounted to the correct folders inside the container.
+# - Paperless listens on port 8000.
+#
+# In addition to that, this docker-compose file adds the following optional
+# configurations:
+#
+# - Instead of SQLite (default), MariaDB is used as the database server.
+# - Apache Tika and Gotenberg servers are started with paperless and paperless
+#   is configured to use these services. These provide support for consuming
+#   Office documents (Word, Excel, Power Point and their LibreOffice counter-
+#   parts).
+#
+# To install and update paperless with this file, do the following:
+#
+# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
+#   and '.env' into a folder.
+# - Run 'docker-compose pull'.
+# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker-compose up -d'.
+#
+# For more extensive installation and update instructions, refer to the
+# documentation.
+
+version: "3.4"
+services:
+  broker:
+    image: docker.io/library/redis:7
+    restart: unless-stopped
+    volumes:
+      - redisdata:/data
+
+  db:
+    image: docker.io/library/mariadb:10
+    restart: unless-stopped
+    volumes:
+      - dbdata:/var/lib/mysql
+    environment:
+      MARIADB_HOST: paperless
+      MARIADB_DATABASE: paperless
+      MARIADB_USER: paperless
+      MARIADB_PASSWORD: paperless
+      MARIADB_ROOT_PASSWORD: paperless
+
+  webserver:
+    image: ghcr.io/paperless-ngx/paperless-ngx:latest
+    restart: unless-stopped
+    depends_on:
+      - db
+      - broker
+      - gotenberg
+      - tika
+    ports:
+      - 8000:8000
+    healthcheck:
+      test: ["CMD", "curl", "-f", "http://localhost:8000"]
+      interval: 30s
+      timeout: 10s
+      retries: 5
+    volumes:
+      - data:/usr/src/paperless/data
+      - media:/usr/src/paperless/media
+      - ./export:/usr/src/paperless/export
+      - ./consume:/usr/src/paperless/consume
+    env_file: docker-compose.env
+    environment:
+      PAPERLESS_REDIS: redis://broker:6379
+      PAPERLESS_DBENGINE: mariadb
+      PAPERLESS_DBHOST: db
+      PAPERLESS_DBUSER: paperless # only needed if non-default username
+      PAPERLESS_DBPASS: paperless # only needed if non-default password
+      PAPERLESS_DBPORT: 3306
+      PAPERLESS_TIKA_ENABLED: 1
+      PAPERLESS_TIKA_GOTENBERG_ENDPOINT: http://gotenberg:3000
+      PAPERLESS_TIKA_ENDPOINT: http://tika:9998
+
+  gotenberg:
+    image: docker.io/gotenberg/gotenberg:7.6
+    restart: unless-stopped
+    # The gotenberg chromium route is used to convert .eml files. We do not
+    # want to allow external content like tracking pixels or even javascript.
+    command:
+      - "gotenberg"
+      - "--chromium-disable-javascript=true"
+      - "--chromium-allow-list=file:///tmp/.*"
+
+  tika:
+    image: ghcr.io/paperless-ngx/tika:latest
+    restart: unless-stopped
+
+volumes:
+  data:
+  media:
+  dbdata:
+  redisdata:
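The install steps spelled out in the header comments translate to roughly this shell session (folder name illustrative):

    mkdir paperless-ngx && cd paperless-ngx
    # place this file here as docker-compose.yml, next to docker-compose.env and .env
    docker-compose pull
    docker-compose run --rm webserver createsuperuser
    docker-compose up -d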

docker/compose/docker-compose.mariadb.yml (new file, 81 lines)
@@ -0,0 +1,81 @@
+# docker-compose file for running paperless from the Docker Hub.
+# This file contains everything paperless needs to run.
+# Paperless supports amd64, arm and arm64 hardware.
+#
+# All compose files of paperless configure paperless in the following way:
+#
+# - Paperless is (re)started on system boot, if it was running before shutdown.
+# - Docker volumes for storing data are managed by Docker.
+# - Folders for importing and exporting files are created in the same directory
+#   as this file and mounted to the correct folders inside the container.
+# - Paperless listens on port 8000.
+#
+# In addition to that, this docker-compose file adds the following optional
+# configurations:
+#
+# - Instead of SQLite (default), MariaDB is used as the database server.
+#
+# To install and update paperless with this file, do the following:
+#
+# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
+#   and '.env' into a folder.
+# - Run 'docker-compose pull'.
+# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker-compose up -d'.
+#
+# For more extensive installation and update instructions, refer to the
+# documentation.
+
+version: "3.4"
+services:
+  broker:
+    image: docker.io/library/redis:7
+    restart: unless-stopped
+    volumes:
+      - redisdata:/data
+
+  db:
+    image: docker.io/library/mariadb:10
+    restart: unless-stopped
+    volumes:
+      - dbdata:/var/lib/mysql
+    environment:
+      MARIADB_HOST: paperless
+      MARIADB_DATABASE: paperless
+      MARIADB_USER: paperless
+      MARIADB_PASSWORD: paperless
+      MARIADB_ROOT_PASSWORD: paperless
+
+  webserver:
+    image: ghcr.io/paperless-ngx/paperless-ngx:latest
+    restart: unless-stopped
+    depends_on:
+      - db
+      - broker
+    ports:
+      - 8000:8000
+    healthcheck:
+      test: ["CMD", "curl", "-f", "http://localhost:8000"]
+      interval: 30s
+      timeout: 10s
+      retries: 5
+    volumes:
+      - data:/usr/src/paperless/data
+      - media:/usr/src/paperless/media
+      - ./export:/usr/src/paperless/export
+      - ./consume:/usr/src/paperless/consume
+    env_file: docker-compose.env
+    environment:
+      PAPERLESS_REDIS: redis://broker:6379
+      PAPERLESS_DBENGINE: mariadb
+      PAPERLESS_DBHOST: db
+      PAPERLESS_DBUSER: paperless # only needed if non-default username
+      PAPERLESS_DBPASS: paperless # only needed if non-default password
+      PAPERLESS_DBPORT: 3306
+
+
+volumes:
+  data:
+  media:
+  dbdata:
+  redisdata:

@@ -31,13 +31,13 @@
 version: "3.4"
 services:
   broker:
-    image: redis:6.0
+    image: docker.io/library/redis:7
     restart: unless-stopped
     volumes:
       - redisdata:/data

   db:
-    image: postgres:13
+    image: docker.io/library/postgres:13
     restart: unless-stopped
     volumes:
       - pgdata:/var/lib/postgresql/data

@@ -33,13 +33,13 @@
 version: "3.4"
 services:
   broker:
-    image: redis:6.0
+    image: docker.io/library/redis:7
     restart: unless-stopped
     volumes:
       - redisdata:/data

   db:
-    image: postgres:13
+    image: docker.io/library/postgres:13
     restart: unless-stopped
     volumes:
       - pgdata:/var/lib/postgresql/data

@@ -77,11 +77,15 @@ services:
       PAPERLESS_TIKA_ENDPOINT: http://tika:9998

   gotenberg:
-    image: gotenberg/gotenberg:7.4
+    image: docker.io/gotenberg/gotenberg:7.6
     restart: unless-stopped
+
+    # The gotenberg chromium route is used to convert .eml files. We do not
+    # want to allow external content like tracking pixels or even javascript.
     command:
       - "gotenberg"
-      - "--chromium-disable-routes=true"
+      - "--chromium-disable-javascript=true"
+      - "--chromium-allow-list=file:///tmp/.*"

   tika:
     image: ghcr.io/paperless-ngx/tika:latest

@@ -29,13 +29,13 @@
 version: "3.4"
 services:
   broker:
-    image: redis:6.0
+    image: docker.io/library/redis:7
     restart: unless-stopped
     volumes:
       - redisdata:/data

   db:
-    image: postgres:13
+    image: docker.io/library/postgres:13
     restart: unless-stopped
     volumes:
       - pgdata:/var/lib/postgresql/data

@@ -33,7 +33,7 @@
 version: "3.4"
 services:
   broker:
-    image: redis:6.0
+    image: docker.io/library/redis:7
     restart: unless-stopped
     volumes:
       - redisdata:/data

@@ -65,11 +65,15 @@ services:
       PAPERLESS_TIKA_ENDPOINT: http://tika:9998

   gotenberg:
-    image: gotenberg/gotenberg:7.4
+    image: docker.io/gotenberg/gotenberg:7.6
     restart: unless-stopped
+
+    # The gotenberg chromium route is used to convert .eml files. We do not
+    # want to allow external content like tracking pixels or even javascript.
     command:
       - "gotenberg"
-      - "--chromium-disable-routes=true"
+      - "--chromium-disable-javascript=true"
+      - "--chromium-allow-list=file:///tmp/.*"

   tika:
     image: ghcr.io/paperless-ngx/tika:latest

@@ -26,7 +26,7 @@
 version: "3.4"
 services:
   broker:
-    image: redis:6.0
+    image: docker.io/library/redis:7
     restart: unless-stopped
     volumes:
       - redisdata:/data

@@ -4,44 +4,119 @@ set -e

 # Source: https://github.com/sameersbn/docker-gitlab/
 map_uidgid() {
-    USERMAP_ORIG_UID=$(id -u paperless)
-    USERMAP_ORIG_GID=$(id -g paperless)
-    USERMAP_NEW_UID=${USERMAP_UID:-$USERMAP_ORIG_UID}
-    USERMAP_NEW_GID=${USERMAP_GID:-${USERMAP_ORIG_GID:-$USERMAP_NEW_UID}}
-    if [[ ${USERMAP_NEW_UID} != "${USERMAP_ORIG_UID}" || ${USERMAP_NEW_GID} != "${USERMAP_ORIG_GID}" ]]; then
-        echo "Mapping UID and GID for paperless:paperless to $USERMAP_NEW_UID:$USERMAP_NEW_GID"
-        usermod -o -u "${USERMAP_NEW_UID}" paperless
-        groupmod -o -g "${USERMAP_NEW_GID}" paperless
+    local -r usermap_original_uid=$(id -u paperless)
+    local -r usermap_original_gid=$(id -g paperless)
+    local -r usermap_new_uid=${USERMAP_UID:-$usermap_original_uid}
+    local -r usermap_new_gid=${USERMAP_GID:-${usermap_original_gid:-$usermap_new_uid}}
+    if [[ ${usermap_new_uid} != "${usermap_original_uid}" || ${usermap_new_gid} != "${usermap_original_gid}" ]]; then
+        echo "Mapping UID and GID for paperless:paperless to $usermap_new_uid:$usermap_new_gid"
+        usermod -o -u "${usermap_new_uid}" paperless
+        groupmod -o -g "${usermap_new_gid}" paperless
+    fi
+}
+
+map_folders() {
+    # Export these so they can be used in docker-prepare.sh
+    export DATA_DIR="${PAPERLESS_DATA_DIR:-/usr/src/paperless/data}"
+    export MEDIA_ROOT_DIR="${PAPERLESS_MEDIA_ROOT:-/usr/src/paperless/media}"
+    export CONSUME_DIR="${PAPERLESS_CONSUMPTION_DIR:-/usr/src/paperless/consume}"
+}
+
+custom_container_init() {
+    # Mostly borrowed from the LinuxServer.io base image
+    # https://github.com/linuxserver/docker-baseimage-ubuntu/tree/bionic/root/etc/cont-init.d
+    local -r custom_script_dir="/custom-cont-init.d"
+    # Tamper checking.
+    # Don't run files which are owned by anyone except root
+    # Don't run files which are writeable by others
+    if [ -d "${custom_script_dir}" ]; then
+        if [ -n "$(/usr/bin/find "${custom_script_dir}" -maxdepth 1 ! -user root)" ]; then
+            echo "**** Potential tampering with custom scripts detected ****"
+            echo "**** The folder '${custom_script_dir}' must be owned by root ****"
+            return 0
+        fi
+        if [ -n "$(/usr/bin/find "${custom_script_dir}" -maxdepth 1 -perm -o+w)" ]; then
+            echo "**** The folder '${custom_script_dir}' or some of its contents have write permissions for others, which is a security risk. ****"
+            echo "**** Please review the permissions and their contents to make sure they are owned by root, and can only be modified by root. ****"
+            return 0
+        fi
+
+        # Make sure custom init directory has files in it
+        if [ -n "$(/bin/ls -A "${custom_script_dir}" 2>/dev/null)" ]; then
+            echo "[custom-init] files found in ${custom_script_dir} executing"
+            # Loop over files in the directory
+            for SCRIPT in "${custom_script_dir}"/*; do
+                NAME="$(basename "${SCRIPT}")"
+                if [ -f "${SCRIPT}" ]; then
+                    echo "[custom-init] ${NAME}: executing..."
+                    /bin/bash "${SCRIPT}"
+                    echo "[custom-init] ${NAME}: exited $?"
+                elif [ ! -f "${SCRIPT}" ]; then
+                    echo "[custom-init] ${NAME}: is not a file"
+                fi
+            done
+        else
+            echo "[custom-init] no custom files found exiting..."
+        fi
+
     fi
 }

 initialize() {
+
+    # Setup environment from secrets before anything else
+    # Check for a version of this var with _FILE appended
+    # and convert the contents to the env var value
+    # Source it so export is persistent
+    # shellcheck disable=SC1091
+    source /sbin/env-from-file.sh
+
+    # Change the user and group IDs if needed
     map_uidgid

-    for dir in export data data/index media media/documents media/documents/originals media/documents/thumbnails; do
-        if [[ ! -d "../$dir" ]]; then
-            echo "Creating directory ../$dir"
-            mkdir ../$dir
+    # Check for overrides of certain folders
+    map_folders
+
+    local -r export_dir="/usr/src/paperless/export"
+
+    for dir in \
+        "${export_dir}" \
+        "${DATA_DIR}" "${DATA_DIR}/index" \
+        "${MEDIA_ROOT_DIR}" "${MEDIA_ROOT_DIR}/documents" "${MEDIA_ROOT_DIR}/documents/originals" "${MEDIA_ROOT_DIR}/documents/thumbnails" \
+        "${CONSUME_DIR}"; do
+        if [[ ! -d "${dir}" ]]; then
+            echo "Creating directory ${dir}"
+            mkdir "${dir}"
         fi
     done

-    echo "Creating directory /tmp/paperless"
-    mkdir -p /tmp/paperless
+    local -r tmp_dir="/tmp/paperless"
+    echo "Creating directory ${tmp_dir}"
+    mkdir -p "${tmp_dir}"

     set +e
     echo "Adjusting permissions of paperless files. This may take a while."
-    chown -R paperless:paperless /tmp/paperless
-    find .. -not \( -user paperless -and -group paperless \) -exec chown paperless:paperless {} +
+    chown -R paperless:paperless ${tmp_dir}
+    for dir in \
+        "${export_dir}" \
+        "${DATA_DIR}" \
+        "${MEDIA_ROOT_DIR}" \
+        "${CONSUME_DIR}"; do
+        find "${dir}" -not \( -user paperless -and -group paperless \) -exec chown paperless:paperless {} +
+    done
     set -e

-    gosu paperless /sbin/docker-prepare.sh
+    "${gosu_cmd[@]}" /sbin/docker-prepare.sh
+
+    # Leave this last thing
+    custom_container_init
+
 }

 install_languages() {
     echo "Installing languages..."

-    local langs="$1"
-    read -ra langs <<<"$langs"
+    read -ra langs <<<"$1"

     # Check that it is not empty
     if [ ${#langs[@]} -eq 0 ]; then

@@ -51,10 +126,6 @@ install_languages() {

     for lang in "${langs[@]}"; do
         pkg="tesseract-ocr-$lang"
-        # English is installed by default
-        #if [[ "$lang" == "eng" ]]; then
-        #  continue
-        #fi
-
         if dpkg -s "$pkg" &>/dev/null; then
             echo "Package $pkg already installed!"

@@ -76,6 +147,11 @@ install_languages() {

 echo "Paperless-ngx docker container starting..."

+gosu_cmd=(gosu paperless)
+if [ "$(id -u)" == "$(id -u paperless)" ]; then
+    gosu_cmd=()
+fi

 # Install additional languages if specified
 if [[ -n "$PAPERLESS_OCR_LANGUAGES" ]]; then
     install_languages "$PAPERLESS_OCR_LANGUAGES"
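Keeping the privilege-drop prefix in an array is what lets the later `exec` lines run unchanged whether the container starts as root or already as the paperless user: an empty array expands to zero words. A minimal sketch of the pattern with hypothetical names:

    # prefix is empty when we are already the target user (names illustrative)
    prefix=(gosu paperless)
    [ "$(id -u)" == "$(id -u paperless)" ] && prefix=()
    exec "${prefix[@]}" some-command --flag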

@@ -85,7 +161,7 @@ initialize

 if [[ "$1" != "/"* ]]; then
     echo Executing management command "$@"
-    exec gosu paperless python3 manage.py "$@"
+    exec "${gosu_cmd[@]}" python3 manage.py "$@"
 else
     echo Executing "$@"
     exec "$@"

@@ -3,16 +3,42 @@
 set -e

 wait_for_postgres() {
-    attempt_num=1
-    max_attempts=5
+    local attempt_num=1
+    local -r max_attempts=5

     echo "Waiting for PostgreSQL to start..."

-    host="${PAPERLESS_DBHOST:=localhost}"
-    port="${PAPERLESS_DBPORT:=5432}"
+    local -r host="${PAPERLESS_DBHOST:-localhost}"
+    local -r port="${PAPERLESS_DBPORT:-5432}"

-    while [ ! "$(pg_isready -h $host -p $port)" ]; do
+    # Disable warning, host and port can't have spaces
+    # shellcheck disable=SC2086
+    while [ ! "$(pg_isready -h ${host} -p ${port})" ]; do
+
+        if [ $attempt_num -eq $max_attempts ]; then
+            echo "Unable to connect to database."
+            exit 1
+        else
+            echo "Attempt $attempt_num failed! Trying again in 5 seconds..."
+        fi
+
+        attempt_num=$(("$attempt_num" + 1))
+        sleep 5
+    done
+}
+
+wait_for_mariadb() {
+    echo "Waiting for MariaDB to start..."
+
+    local -r host="${PAPERLESS_DBHOST:=localhost}"
+    local -r port="${PAPERLESS_DBPORT:=3306}"
+
+    local attempt_num=1
+    local -r max_attempts=5
+
+    # Disable warning, host and port can't have spaces
+    # shellcheck disable=SC2086
+    while ! true > /dev/tcp/$host/$port; do

     if [ $attempt_num -eq $max_attempts ]; then
         echo "Unable to connect to database."
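The MariaDB probe leans on a bash-only feature: redirecting to `/dev/tcp/HOST/PORT` attempts a TCP connection and fails if nothing is listening. A quick standalone check of the same idea (host and port illustrative):

    if timeout 1 bash -c 'true > /dev/tcp/localhost/3306' 2>/dev/null; then
        echo "port open"
    else
        echo "port closed"
    fi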

@@ -42,18 +68,25 @@ migrations() {
     # of the current container starts.
     flock 200
     echo "Apply database migrations..."
-    python3 manage.py migrate
-) 200>/usr/src/paperless/data/migration_lock
+    python3 manage.py migrate --skip-checks --no-input
+) 200>"${DATA_DIR}/migration_lock"
+}
+
+django_checks() {
+    # Explicitly run the Django system checks
+    echo "Running Django checks"
+    python3 manage.py check
 }

 search_index() {
-    index_version=1
-    index_version_file=/usr/src/paperless/data/.index_version
-
-    if [[ (! -f "$index_version_file") || $(<$index_version_file) != "$index_version" ]]; then
+    local -r index_version=1
+    local -r index_version_file=${DATA_DIR}/.index_version
+
+    if [[ (! -f "${index_version_file}") || $(<"${index_version_file}") != "$index_version" ]]; then
         echo "Search index out of date. Updating..."
-        python3 manage.py document_index reindex
-        echo $index_version | tee $index_version_file >/dev/null
+        python3 manage.py document_index reindex --no-progress-bar
+        echo ${index_version} | tee "${index_version_file}" >/dev/null
     fi
 }


@@ -64,7 +97,9 @@ superuser() {
 }

 do_work() {
-    if [[ -n "${PAPERLESS_DBHOST}" ]]; then
+    if [[ "${PAPERLESS_DBENGINE}" == "mariadb" ]]; then
+        wait_for_mariadb
+    elif [[ -n "${PAPERLESS_DBHOST}" ]]; then
         wait_for_postgres
     fi


@@ -72,6 +107,8 @@ do_work() {

     migrations

+    django_checks
+
     search_index

     superuser

docker/env-from-file.sh (new file, 38 lines)
@@ -0,0 +1,38 @@
+#!/usr/bin/env bash
+
+# Scans the environment variables for those with the suffix _FILE
+# When located, checks the file exists, and exports the contents
+# of the file as the same name, minus the suffix
+# This allows the use of Docker secrets or mounted files
+# to fill in any of the settings configurable via environment
+# variables
+
+set -eu
+
+for line in $(printenv)
+do
+    # Extract the name of the environment variable
+    env_name=${line%%=*}
+    # Check if it starts with "PAPERLESS_" and ends in "_FILE"
+    if [[ ${env_name} == PAPERLESS_*_FILE ]]; then
+        # Extract the value of the environment variable
+        env_value=${line#*=}
+
+        # Check the file exists
+        if [[ -f ${env_value} ]]; then
+
+            # Trim off the _FILE suffix
+            non_file_env_name=${env_name%"_FILE"}
+            echo "Setting ${non_file_env_name} from file"
+
+            # Read the value from the file
+            val="$(< "${!env_name}")"
+
+            # Set the normal name to the read file contents
+            export "${non_file_env_name}"="${val}"
+
+        else
+            echo "File ${env_value} referenced by ${env_name} doesn't exist"
+        fi
+    fi
+done
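In effect, any `PAPERLESS_*` setting can now be supplied via a mounted file or Docker secret. A hypothetical example of what the loop does for one variable (the secret path is illustrative):

    # with PAPERLESS_DBPASS_FILE=/run/secrets/db_password in the environment,
    # sourcing env-from-file.sh is equivalent to:
    export PAPERLESS_DBPASS="$(< /run/secrets/db_password)"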

docker/flower-conditional.sh (new file, 7 lines)
@@ -0,0 +1,7 @@
+#!/usr/bin/env bash
+
+echo "Checking if we should start flower..."
+
+if [[ -n "${PAPERLESS_ENABLE_FLOWER}" ]]; then
+    celery --app paperless flower
+fi
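Supervisord always launches this wrapper (see the [program:celery-flower] block later in this diff); flower itself only starts when the variable is non-empty, e.g.:

    # opt in to the Flower monitoring UI (any non-empty value works)
    export PAPERLESS_ENABLE_FLOWER=1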

@@ -2,7 +2,18 @@

 set -eu

-for command in document_archiver document_exporter document_importer mail_fetcher document_create_classifier document_index document_renamer document_retagger document_thumbnails document_sanity_checker manage_superuser;
+for command in decrypt_documents \
+    document_archiver \
+    document_exporter \
+    document_importer \
+    mail_fetcher \
+    document_create_classifier \
+    document_index \
+    document_renamer \
+    document_retagger \
+    document_thumbnails \
+    document_sanity_checker \
+    manage_superuser;
 do
     echo "installing $command..."
     sed "s/management_command/$command/g" management_script.sh > /usr/local/bin/$command

@@ -3,6 +3,9 @@
 set -e

 cd /usr/src/paperless/src/
+# This ensures the environment is set up
+# shellcheck disable=SC1091
+source /sbin/env-from-file.sh

 if [[ $(id -u) == 0 ]] ;
 then

docker/paperless_cmd.sh (new executable file, 15 lines)
@@ -0,0 +1,15 @@
+#!/usr/bin/env bash
+
+rootless_args=()
+if [ "$(id -u)" == "$(id -u paperless)" ]; then
+    rootless_args=(
+        --user
+        paperless
+        --logfile
+        supervisord.log
+        --pidfile
+        supervisord.pid
+    )
+fi
+
+exec /usr/local/bin/supervisord -c /etc/supervisord.conf "${rootless_args[@]}"

@@ -10,7 +10,7 @@ user=root
 [program:gunicorn]
 command=gunicorn -c /usr/src/paperless/gunicorn.conf.py paperless.asgi:application
 user=paperless
+priority = 1
 stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr

@@ -19,17 +19,41 @@ stderr_logfile_maxbytes=0
 [program:consumer]
 command=python3 manage.py document_consumer
 user=paperless
+stopsignal=INT
+priority = 20
 stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0

-[program:scheduler]
-command=python3 manage.py qcluster
+[program:celery]
+
+command = celery --app paperless worker --loglevel INFO
 user=paperless
 stopasgroup = true
+stopwaitsecs = 60
+priority = 5
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stderr
+stderr_logfile_maxbytes=0
+
+[program:celery-beat]
+
+command = celery --app paperless beat --loglevel INFO
+user=paperless
+stopasgroup = true
+priority = 10
+stdout_logfile=/dev/stdout
+stdout_logfile_maxbytes=0
+stderr_logfile=/dev/stderr
+stderr_logfile_maxbytes=0
+
+[program:celery-flower]
+command = /usr/local/bin/flower-conditional.sh
+user = paperless
+startsecs = 0
+priority = 40
 stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr

@@ -18,7 +18,7 @@ if __name__ == "__main__":

 REDIS_URL: Final[str] = os.getenv("PAPERLESS_REDIS", "redis://localhost:6379")

-print(f"Waiting for Redis: {REDIS_URL}", flush=True)
+print(f"Waiting for Redis...", flush=True)

 attempt = 0
 with Redis.from_url(url=REDIS_URL) as client:

@@ -26,17 +26,19 @@
         try:
             client.ping()
             break
-        except Exception:
+        except Exception as e:
             print(
-                f"Redis ping #{attempt} failed, waiting {RETRY_SLEEP_SECONDS}s",
+                f"Redis ping #{attempt} failed.\n"
+                f"Error: {str(e)}.\n"
+                f"Waiting {RETRY_SLEEP_SECONDS}s",
                 flush=True,
             )
             time.sleep(RETRY_SLEEP_SECONDS)
             attempt += 1

 if attempt >= MAX_RETRY_COUNT:
-    print(f"Failed to connect to: {REDIS_URL}")
+    print(f"Failed to connect to redis using environment variable PAPERLESS_REDIS.")
     sys.exit(os.EX_UNAVAILABLE)
 else:
-    print(f"Connected to Redis broker: {REDIS_URL}")
+    print(f"Connected to Redis broker.")
     sys.exit(os.EX_OK)

@@ -1,17 +0,0 @@ (file deleted)
-FROM python:3.5.1
-
-# Install Sphinx and Pygments
-RUN pip install --no-cache-dir Sphinx Pygments \
-  # Setup directories, copy data
-  && mkdir /build
-
-COPY . /build
-WORKDIR /build/docs
-
-# Build documentation
-RUN make html
-
-# Start webserver
-WORKDIR /build/docs/_build/html
-EXPOSE 8000/tcp
-CMD ["python3", "-m", "http.server"]

docs/Makefile (deleted, 177 lines)
@@ -1,177 +0,0 @@
-# Makefile for Sphinx documentation
-#
-
-# You can set these variables from the command line.
-SPHINXOPTS   =
-SPHINXBUILD  = sphinx-build
-PAPER        =
-BUILDDIR     = _build
-
-# User-friendly check for sphinx-build
-ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
-$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
-endif
-
-# Internal variables.
-PAPEROPT_a4     = -D latex_paper_size=a4
-PAPEROPT_letter = -D latex_paper_size=letter
-ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
-# the i18n builder cannot share the environment and doctrees with the others
-I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
-
-.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
-
-help:
-	@echo "Please use \`make <target>' where <target> is one of"
-	@echo "  html       to make standalone HTML files"
-	@echo "  dirhtml    to make HTML files named index.html in directories"
-	@echo "  singlehtml to make a single large HTML file"
-	@echo "  pickle     to make pickle files"
-	@echo "  json       to make JSON files"
-	@echo "  htmlhelp   to make HTML files and a HTML help project"
-	@echo "  qthelp     to make HTML files and a qthelp project"
-	@echo "  devhelp    to make HTML files and a Devhelp project"
-	@echo "  epub       to make an epub"
-	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
-	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
-	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
-	@echo "  text       to make text files"
-	@echo "  man        to make manual pages"
-	@echo "  texinfo    to make Texinfo files"
-	@echo "  info       to make Texinfo files and run them through makeinfo"
-	@echo "  gettext    to make PO message catalogs"
-	@echo "  changes    to make an overview of all changed/added/deprecated items"
-	@echo "  xml        to make Docutils-native XML files"
-	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
-	@echo "  linkcheck  to check all external links for integrity"
-	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"
-
-clean:
-	rm -rf $(BUILDDIR)/*
-
-html:
-	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
-	@echo
-	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
-
-dirhtml:
-	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
-	@echo
-	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
-
-singlehtml:
-	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
-	@echo
-	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
-
-pickle:
-	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
-	@echo
-	@echo "Build finished; now you can process the pickle files."
-
-json:
-	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
-	@echo
-	@echo "Build finished; now you can process the JSON files."
-
-htmlhelp:
-	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
-	@echo
-	@echo "Build finished; now you can run HTML Help Workshop with the" \
-	      ".hhp project file in $(BUILDDIR)/htmlhelp."
-
-qthelp:
-	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
-	@echo
-	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
-	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
-	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/RIPEAtlasToolsMagellan.qhcp"
-	@echo "To view the help file:"
-	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/RIPEAtlasToolsMagellan.qhc"
-
-devhelp:
-	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
-	@echo
-	@echo "Build finished."
-	@echo "To view the help file:"
-	@echo "# mkdir -p $$HOME/.local/share/devhelp/RIPEAtlasToolsMagellan"
-	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/RIPEAtlasToolsMagellan"
-	@echo "# devhelp"
-
-epub:
-	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
-	@echo
-	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
-
-latex:
-	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
-	@echo
-	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
-	@echo "Run \`make' in that directory to run these through (pdf)latex" \
-	      "(use \`make latexpdf' here to do that automatically)."
-
-latexpdf:
-	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
-	@echo "Running LaTeX files through pdflatex..."
-	$(MAKE) -C $(BUILDDIR)/latex all-pdf
-	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
-
-latexpdfja:
-	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
-	@echo "Running LaTeX files through platex and dvipdfmx..."
-	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
-	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
-
-text:
-	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
-	@echo
-	@echo "Build finished. The text files are in $(BUILDDIR)/text."
-
-man:
-	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
-	@echo
-	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
-
-texinfo:
-	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
-	@echo
-	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
-	@echo "Run \`make' in that directory to run these through makeinfo" \
-	      "(use \`make info' here to do that automatically)."
-
-info:
-	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
-	@echo "Running Texinfo files through makeinfo..."
-	make -C $(BUILDDIR)/texinfo info
-	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
-
-gettext:
-	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
-	@echo
-	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
-
-changes:
-	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
-	@echo
-	@echo "The overview file is in $(BUILDDIR)/changes."
-
-linkcheck:
-	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
-	@echo
-	@echo "Link check complete; look for any errors in the above output " \
-	      "or in $(BUILDDIR)/linkcheck/output.txt."
-
-doctest:
-	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
-	@echo "Testing of doctests in the sources finished, look at the " \
-	      "results in $(BUILDDIR)/doctest/output.txt."
-
-xml:
-	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
-	@echo
-	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
-
-pseudoxml:
-	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
-	@echo
-	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

docs/_static/css/custom.css (deleted vendored file, 592 lines)
@@ -1,592 +0,0 @@
/* Variables */
|
|
||||||
:root {
|
|
||||||
--color-text-body: #5c5962;
|
|
||||||
--color-text-body-light: #fcfcfc;
|
|
||||||
--color-text-anchor: #7253ed;
|
|
||||||
--color-text-alt: rgba(0, 0, 0, 0.3);
|
|
||||||
--color-text-title: #27262b;
|
|
||||||
--color-text-code-inline: #e74c3c;
|
|
||||||
--color-text-code-nt: #062873;
|
|
||||||
--color-text-selection: #b19eff;
|
|
||||||
--color-bg-body: #fcfcfc;
|
|
||||||
--color-bg-body-alt: #f3f6f6;
|
|
||||||
--color-bg-side-nav: #f5f6fa;
|
|
||||||
--color-bg-side-nav-hover: #ebedf5;
|
|
||||||
--color-bg-code-block: var(--color-bg-side-nav);
|
|
||||||
--color-border: #eeebee;
|
|
||||||
--color-btn-neutral-bg: #f3f6f6;
|
|
||||||
--color-btn-neutral-bg-hover: #e5ebeb;
|
|
||||||
--color-success-title: #1abc9c;
|
|
||||||
--color-success-body: #dbfaf4;
|
|
||||||
--color-warning-title: #f0b37e;
|
|
||||||
--color-warning-body: #ffedcc;
|
|
||||||
--color-danger-title: #f29f97;
|
|
||||||
--color-danger-body: #fdf3f2;
|
|
||||||
--color-info-title: #6ab0de;
|
|
||||||
--color-info-body: #e7f2fa;
|
|
||||||
}
|
|
||||||
|
|
||||||
.dark-mode {
|
|
||||||
--color-text-body: #abb2bf;
|
|
||||||
--color-text-body-light: #9499a2;
|
|
||||||
--color-text-alt: rgba(0255, 255, 255, 0.5);
|
|
||||||
  --color-text-title: var(--color-text-anchor);
  --color-text-code-inline: #abb2bf;
  --color-text-code-nt: #2063f3;
  --color-text-selection: #030303;
  --color-bg-body: #1d1d20 !important;
  --color-bg-body-alt: #131315;
  --color-bg-side-nav: #18181a;
  --color-bg-side-nav-hover: #101216;
  --color-bg-code-block: #101216;
  --color-border: #47494f;
  --color-btn-neutral-bg: #242529;
  --color-btn-neutral-bg-hover: #101216;
  --color-success-title: #02120f;
  --color-success-body: #041b17;
  --color-warning-title: #1b0e03;
  --color-warning-body: #371d06;
  --color-danger-title: #120902;
  --color-danger-body: #1b0503;
  --color-info-title: #020608;
  --color-info-body: #06141e;
}

* {
  transition: background-color 0.3s ease, border-color 0.3s ease;
}

/* Typography */
body {
  font-family: system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif;
  font-size: inherit;
  line-height: 1.4;
  color: var(--color-text-body);
}

h1, h2, h3, h4, h5, h6 {
  font-family: inherit;
}

.rst-content .toctree-wrapper>p.caption, .rst-content h1, .rst-content h2, .rst-content h3, .rst-content h4, .rst-content h5, .rst-content h6 {
  padding-top: .5em;
}

p, .main-content-wrap, .rst-content .section ul, .rst-content .toctree-wrapper ul, .rst-content section ul, .wy-plain-list-disc, article ul {
  line-height: 1.6;
}

pre, .code, .rst-content .linenodiv pre, .rst-content div[class^=highlight] pre, .rst-content pre.literal-block {
  font-family: "SFMono-Regular", Menlo, Consolas, monospace;
  font-size: 0.75em;
  line-height: 1.8;
}

.wy-menu-vertical li.toctree-l3, .wy-menu-vertical li.toctree-l4 {
  font-size: 1rem;
}

.rst-versions {
  font-family: inherit;
  line-height: 1;
}

footer, footer p {
  font-size: .8rem;
}

footer .rst-footer-buttons {
  font-size: 1rem;
}

@media (max-width: 400px) {
  /* break code lines on mobile */
  pre, code {
    word-break: break-word;
  }
}

/* Layout */
.wy-side-nav-search, .wy-menu-vertical {
  width: auto;
}

.wy-nav-side {
  z-index: 0;
  display: flex;
  flex-wrap: wrap;
  background-color: var(--color-bg-side-nav);
}

.wy-side-scroll {
  width: 100%;
  overflow-y: auto;
}

@media (min-width: 66.5rem) {
  .wy-side-scroll {
    width: 264px;
  }
}

@media (min-width: 50rem) {
  .wy-nav-side {
    flex-wrap: nowrap;
    position: fixed;
    width: 248px;
    height: 100%;
    flex-direction: column;
    border-right: 1px solid var(--color-border);
    align-items: flex-end;
  }
}

@media (min-width: 66.5rem) {
  .wy-nav-side {
    width: calc((100% - 1064px) / 2 + 264px);
    min-width: 264px;
  }
}

@media (min-width: 50rem) {
  .wy-nav-content-wrap {
    position: relative;
    max-width: 800px;
    margin-left: 248px;
  }
}

@media (min-width: 66.5rem) {
  .wy-nav-content-wrap {
    margin-left: calc((100% - 1064px) / 2 + 264px);
  }
}

/* Colors */
body.wy-body-for-nav,
.wy-nav-content {
  background: var(--color-bg-body);
}

.wy-nav-side {
  border-right: 1px solid var(--color-border);
}

.wy-side-nav-search, .wy-nav-top {
  background: var(--color-bg-side-nav);
  border-bottom: 1px solid var(--color-border);
}

.wy-nav-content-wrap {
  background: inherit;
}

.wy-side-nav-search > a, .wy-nav-top a, .wy-nav-top i {
  color: var(--color-text-title);
}

.wy-side-nav-search > a:hover, .wy-nav-top a:hover {
  background: transparent;
}

.wy-side-nav-search > div.version {
  color: var(--color-text-alt);
}

.wy-side-nav-search > div[role="search"] {
  border-top: 1px solid var(--color-border);
}

.wy-menu-vertical li.toctree-l2.current>a, .wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,
.wy-menu-vertical li.toctree-l3.current>a, .wy-menu-vertical li.toctree-l3.current li.toctree-l4>a {
  background: var(--color-bg-side-nav);
}

.rst-content .highlighted {
  background: #eedd85;
  box-shadow: 0 0 0 2px #eedd85;
  font-weight: 600;
}

.wy-side-nav-search input[type=text],
html.writer-html5 .rst-content table.docutils th {
  color: var(--color-text-body);
}

.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,
.wy-table-backed,
.wy-table-odd td,
.wy-table-striped tr:nth-child(2n-1) td {
  background-color: var(--color-bg-body-alt);
}

.rst-content table.docutils,
.wy-table-bordered-all,
html.writer-html5 .rst-content table.docutils th,
.rst-content table.docutils td,
.wy-table-bordered-all td,
hr {
  border-color: var(--color-border) !important;
}

::selection {
  background: var(--color-text-selection);
}

/* Ridiculous rules are taken from sphinx_rtd */
.rst-content .admonition-title,
.wy-alert-title {
  color: var(--color-text-body-light);
}

.rst-content .hint,
.rst-content .important,
.rst-content .tip,
.rst-content .wy-alert-success,
.wy-alert.wy-alert-success {
  background: var(--color-success-body);
}

.rst-content .hint .admonition-title,
.rst-content .hint .wy-alert-title,
.rst-content .important .admonition-title,
.rst-content .important .wy-alert-title,
.rst-content .tip .admonition-title,
.rst-content .tip .wy-alert-title,
.rst-content .wy-alert-success .admonition-title,
.rst-content .wy-alert-success .wy-alert-title,
.wy-alert.wy-alert-success .rst-content .admonition-title,
.wy-alert.wy-alert-success .wy-alert-title {
  background-color: var(--color-success-title);
}

.rst-content .admonition-todo,
.rst-content .attention,
.rst-content .caution,
.rst-content .warning,
.rst-content .wy-alert-warning,
.wy-alert.wy-alert-warning {
  background: var(--color-warning-body);
}

.rst-content .admonition-todo .admonition-title,
.rst-content .admonition-todo .wy-alert-title,
.rst-content .attention .admonition-title,
.rst-content .attention .wy-alert-title,
.rst-content .caution .admonition-title,
.rst-content .caution .wy-alert-title,
.rst-content .warning .admonition-title,
.rst-content .warning .wy-alert-title,
.rst-content .wy-alert-warning .admonition-title,
.rst-content .wy-alert-warning .wy-alert-title,
.rst-content .wy-alert.wy-alert-warning .admonition-title,
.wy-alert.wy-alert-warning .rst-content .admonition-title,
.wy-alert.wy-alert-warning .wy-alert-title {
  background: var(--color-warning-title);
}

.rst-content .danger,
.rst-content .error,
.rst-content .wy-alert-danger,
.wy-alert.wy-alert-danger {
  background: var(--color-danger-body);
}

.rst-content .danger .admonition-title,
.rst-content .danger .wy-alert-title,
.rst-content .error .admonition-title,
.rst-content .error .wy-alert-title,
.rst-content .wy-alert-danger .admonition-title,
.rst-content .wy-alert-danger .wy-alert-title,
.wy-alert.wy-alert-danger .rst-content .admonition-title,
.wy-alert.wy-alert-danger .wy-alert-title {
  background: var(--color-danger-title);
}

.rst-content .note,
.rst-content .seealso,
.rst-content .wy-alert-info,
.wy-alert.wy-alert-info {
  background: var(--color-info-body);
}

.rst-content .note .admonition-title,
.rst-content .note .wy-alert-title,
.rst-content .seealso .admonition-title,
.rst-content .seealso .wy-alert-title,
.rst-content .wy-alert-info .admonition-title,
.rst-content .wy-alert-info .wy-alert-title,
.wy-alert.wy-alert-info .rst-content .admonition-title,
.wy-alert.wy-alert-info .wy-alert-title {
  background: var(--color-info-title);
}

/* Links */
a, a:visited,
.wy-menu-vertical a,
a.icon.icon-home,
.wy-menu-vertical li.toctree-l1.current > a.current {
  color: var(--color-text-anchor);
  text-decoration: none;
}

a:hover, .wy-breadcrumbs-aside a {
  color: var(--color-text-anchor); /* reset */
}

.rst-versions a, .rst-versions .rst-current-version {
  color: var(--color-text-anchor);
}

.wy-nav-content a.reference, .wy-nav-content a:not([class]) {
  background-image: linear-gradient(var(--color-border) 0%, var(--color-border) 100%);
  background-repeat: repeat-x;
  background-position: 0 100%;
  background-size: 1px 1px;
}

.wy-nav-content a.reference:hover, .wy-nav-content a:not([class]):hover {
  background-image: linear-gradient(rgba(114,83,237,0.45) 0%, rgba(114,83,237,0.45) 100%);
  background-size: 1px 1px;
}

.wy-menu-vertical a:hover,
.wy-menu-vertical li.current a:hover,
.wy-menu-vertical a:active {
  background: var(--color-bg-side-nav-hover) !important;
  color: var(--color-text-body);
}

.wy-menu-vertical li.toctree-l1.current>a,
.wy-menu-vertical li.current>a,
.wy-menu-vertical li.on a {
  background-color: var(--color-bg-side-nav-hover);
  border: none;
  font-weight: normal;
}

.wy-menu-vertical li.current {
  background-color: inherit;
}

.wy-menu-vertical li.current a {
  border-right: none;
}

.wy-menu-vertical li.toctree-l2 a,
.wy-menu-vertical li.toctree-l3 a,
.wy-menu-vertical li.toctree-l4 a,
.wy-menu-vertical li.toctree-l5 a,
.wy-menu-vertical li.toctree-l6 a,
.wy-menu-vertical li.toctree-l7 a,
.wy-menu-vertical li.toctree-l8 a,
.wy-menu-vertical li.toctree-l9 a,
.wy-menu-vertical li.toctree-l10 a {
  color: var(--color-text-body);
}

a.image-reference, a.image-reference:hover {
  background: none !important;
}

a.image-reference img {
  cursor: zoom-in;
}

/* Code blocks */
.rst-content code, .rst-content tt, code {
  padding: 0.25em;
  font-weight: 400;
  background-color: var(--color-bg-code-block);
  border: 1px solid var(--color-border);
  border-radius: 4px;
}

.rst-content div[class^=highlight], .rst-content pre.literal-block {
  padding: 0.7rem;
  margin-top: 0;
  margin-bottom: 0.75rem;
  overflow-x: auto;
  background-color: var(--color-bg-side-nav);
  border-color: var(--color-border);
  border-radius: 4px;
  box-shadow: none;
}

.rst-content .admonition-title,
.rst-content div.admonition,
.wy-alert-title {
  padding: 10px 12px;
  border-top-left-radius: 4px;
  border-top-right-radius: 4px;
}

.highlight .go {
  color: inherit;
}

.highlight .nt {
  color: var(--color-text-code-nt);
}

.rst-content code.literal,
.rst-content tt.literal {
  border-color: var(--color-border);
  background-color: var(--color-border);
  color: var(--color-text-code-inline);
}

/* Search */
.wy-side-nav-search input[type=text] {
  border: none;
  border-radius: 0;
  background-color: transparent;
  font-family: inherit;
  font-size: .85rem;
  box-shadow: none;
  padding: .7rem 1rem .7rem 2.8rem;
  margin: 0;
}

#rtd-search-form {
  position: relative;
}

#rtd-search-form:before {
  font: normal normal normal 14px/1 FontAwesome;
  font-size: inherit;
  text-rendering: auto;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  content: "\f002";
  color: var(--color-text-alt);
  position: absolute;
  left: 1.5rem;
  top: .7rem;
}

/* Side nav */
.wy-side-nav-search {
  padding: 1rem 0 0 0;
}

.wy-menu-vertical li a button.toctree-expand {
  float: right;
  margin-right: -1.5em;
  padding: 0 .5em;
}

.wy-menu-vertical a,
.wy-menu-vertical li.current>a,
.wy-menu-vertical li.current li>a {
  padding-right: 1.5em !important;
}

.wy-menu-vertical li.current li>a.current {
  font-weight: 600;
}

/* Misc spacing */
.rst-content .admonition-title, .wy-alert-title {
  padding: 10px 12px;
}

/* Buttons */
.btn {
  display: inline-block;
  box-sizing: border-box;
  padding: 0.3em 1em;
  margin: 0;
  font-family: inherit;
  font-size: inherit;
  font-weight: 500;
  line-height: 1.5;
  color: var(--color-text-anchor);
  text-decoration: none;
  vertical-align: baseline;
  background-color: #f7f7f7;
  border-width: 0;
  border-radius: 4px;
  box-shadow: 0 1px 2px rgba(0,0,0,0.12), 0 3px 10px rgba(0,0,0,0.08);
  appearance: none;
}

.btn:active {
  padding: 0.3em 1em;
}

.rst-content .btn:focus {
  outline: 1px solid #ccc;
}

.rst-content .btn-neutral, .rst-content .btn span.fa {
  color: var(--color-text-body) !important;
}

.btn-neutral {
  background-color: var(--color-btn-neutral-bg) !important;
  color: var(--color-btn-neutral-text) !important;
  border: 1px solid var(--color-btn-neutral-bg);
}

.btn:hover, .btn-neutral:hover {
  background-color: var(--color-btn-neutral-bg-hover) !important;
}

/* Icon overrides */
.wy-side-nav-search a.icon-home:before {
  display: none;
}

.fa-minus-square-o:before, .wy-menu-vertical li.current>a button.toctree-expand:before, .wy-menu-vertical li.on a button.toctree-expand:before {
  content: "\f106"; /* fa-angle-up */
}

.fa-plus-square-o:before, .wy-menu-vertical li button.toctree-expand:before {
  content: "\f107"; /* fa-angle-down */
}

/* Misc */
.wy-nav-top {
  line-height: 36px;
}

.wy-nav-top > i {
  font-size: 24px;
  padding: 8px 0 0 2px;
  color: var(--color-text-anchor);
}

.rst-content table.docutils td,
.rst-content table.docutils th,
.rst-content table.field-list td,
.rst-content table.field-list th,
.wy-table td,
.wy-table th {
  padding: 8px 14px;
}

.dark-mode-toggle {
  position: absolute;
  top: 14px;
  right: 12px;
  height: 20px;
  width: 24px;
  z-index: 10;
  border: none;
  background-color: transparent;
  color: inherit;
  opacity: 0.7;
}

.wy-nav-content-wrap {
  z-index: 20;
}
docs/_static/js/darkmode.js (vendored, 47 lines removed)
@@ -1,47 +0,0 @@
let toggleButton;
let icon;

function load() {
  "use strict";

  toggleButton = document.createElement("button");
  toggleButton.setAttribute("title", "Toggle dark mode");
  toggleButton.classList.add("dark-mode-toggle");
  icon = document.createElement("i");
  icon.classList.add("fa", darkModeState ? "fa-sun-o" : "fa-moon-o");
  toggleButton.appendChild(icon);
  document.body.prepend(toggleButton);

  // Listen for changes in the OS settings
  // addListener is used because older versions of Safari don't support addEventListener
  // prefersDarkQuery set in <head>
  if (prefersDarkQuery) {
    prefersDarkQuery.addListener(function (evt) {
      toggleDarkMode(evt.matches);
    });
  }

  // Initial setting depending on the prefers-color-mode or localstorage
  // darkModeState should be set in the document <head> to prevent flash
  if (darkModeState == undefined) darkModeState = false;
  toggleDarkMode(darkModeState);

  // Toggles the "dark-mode" class on click and sets localStorage state
  toggleButton.addEventListener("click", () => {
    darkModeState = !darkModeState;

    toggleDarkMode(darkModeState);
    localStorage.setItem("dark-mode", darkModeState);
  });
}

function toggleDarkMode(state) {
  document.documentElement.classList.toggle("dark-mode", state);
  document.documentElement.classList.toggle("light-mode", !state);
  icon.classList.remove("fa-sun-o");
  icon.classList.remove("fa-moon-o");
  icon.classList.add(state ? "fa-sun-o" : "fa-moon-o");
  darkModeState = state;
}

document.addEventListener("DOMContentLoaded", load);
docs/_static/screenshots/mail-rules-edited.png (vendored, binary; before: 96 KiB)
docs/_templates/layout.html (vendored, 13 lines removed)
@@ -1,13 +0,0 @@
{% extends "!layout.html" %}
{% block extrahead %}
<script>
  // MediaQueryList object
  const prefersDarkQuery = window.matchMedia("(prefers-color-scheme: dark)");
  const lsDark = localStorage.getItem("dark-mode");
  let darkModeState = lsDark !== null ? lsDark == "true" : prefersDarkQuery.matches;

  document.documentElement.classList.toggle("dark-mode", darkModeState);
  document.documentElement.classList.toggle("light-mode", !darkModeState);
</script>
{{ super() }}
{% endblock %}
docs/administration.md (new file, 550 lines added)
@@ -0,0 +1,550 @@
# Administration

## Making backups {#backup}

Multiple options exist for making backups of your paperless instance,
depending on how you installed paperless.

Before making backups, make sure that paperless is not running.

Options available to any installation of paperless:

- Use the [document exporter](#exporter). The document exporter exports all your documents,
  thumbnails and metadata to a specific folder. You may import your
  documents into a fresh instance of paperless again or store your
  documents in another DMS with this export.
- The document exporter is also able to update an already existing
  export. Therefore, incremental backups with `rsync` are entirely
  possible; see the sketch after the caution below.

!!! caution

    You cannot import the export generated with one version of paperless in
    a different version of paperless. The export contains an exact image of
    the database, and migrations may change the database layout.
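
A minimal sketch of such an incremental backup, assuming a docker-compose
install and a hypothetical backup destination `/mnt/backup/paperless`:

```shell-session
$ cd /path/to/paperless
$ docker-compose exec -T webserver document_exporter ../export
$ rsync -a --delete export/ /mnt/backup/paperless/
```

Because the exporter only rewrites changed and added files, the subsequent
`rsync` run transfers only the delta.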

Options available to docker installations:

- Backup the docker volumes. These usually reside within
  `/var/lib/docker/volumes` on the host and you need to be root in
  order to access them. A sketch of one way to do this follows this list.

  Paperless uses 4 volumes:

  - `paperless_media`: This is where your documents are stored.
  - `paperless_data`: This is where auxiliary data is stored. This
    folder also contains the SQLite database, if you use it.
  - `paperless_pgdata`: Exists only if you use PostgreSQL and
    contains the database.
  - `paperless_dbdata`: Exists only if you use MariaDB and contains
    the database.
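
One way to back up a volume without touching `/var/lib/docker` directly is a
throwaway container that tars the volume contents. A sketch; the archive
destination `/mnt/backup` is a hypothetical choice:

```shell-session
$ docker run --rm -v paperless_media:/data:ro -v /mnt/backup:/backup \
    alpine tar czf /backup/paperless_media.tar.gz -C /data .
```

Repeat for each volume listed above while paperless is stopped.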

Options available to bare-metal and non-docker installations:

- Backup the entire paperless folder. This ensures that if your
  paperless instance crashes at some point or your disk fails, you can
  simply copy the folder back into place and it works.

  When using PostgreSQL or MariaDB, you'll also have to backup the
  database.
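
A sketch of a plain database dump, assuming the database is named `paperless`
(check your configuration) and a hypothetical output path:

```shell-session
$ pg_dump --dbname=paperless --file=/mnt/backup/paperless.sql    # PostgreSQL
$ mysqldump paperless > /mnt/backup/paperless.sql                # MariaDB
```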

### Restoring {#migrating-restoring}

## Updating Paperless {#updating}

### Docker Route {#docker-updating}

If a new release of paperless-ngx is available, upgrading depends on how
you installed paperless-ngx in the first place. The releases are
available at the [release
page](https://github.com/paperless-ngx/paperless-ngx/releases).

First of all, ensure that paperless is stopped.

```shell-session
$ cd /path/to/paperless
$ docker-compose down
```

After that, [make a backup](#backup).

1. If you pull the image from the docker hub, all you need to do is:

   ```shell-session
   $ docker-compose pull
   $ docker-compose up
   ```

   The docker-compose files refer to the `latest` version, which is
   always the latest stable release.

2. If you built the image yourself, do the following:

   ```shell-session
   $ git pull
   $ docker-compose build
   $ docker-compose up
   ```

Running `docker-compose up` will also apply any new database migrations.
If you see everything working, press CTRL+C once to gracefully stop
paperless. Then you can start paperless-ngx with `-d` to have it run in
the background.

!!! note

    In version 0.9.14, the update process was changed. In 0.9.13 and
    earlier, the docker-compose files specified exact versions and pull
    won't automatically update to newer versions. In order to enable
    updates as described above, either get the new `docker-compose.yml`
    file from
    [here](https://github.com/paperless-ngx/paperless-ngx/tree/master/docker/compose)
    or edit the `docker-compose.yml` file, find the line that says

    ```
    image: ghcr.io/paperless-ngx/paperless-ngx:0.9.x
    ```

    and replace the version with `latest`:

    ```
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
    ```

!!! note

    In version 1.7.1 and onwards, the Docker image can now be pinned to a
    release series. This is often combined with automatic updaters such as
    Watchtower to allow safer unattended upgrading to new bugfix releases
    only. It is still recommended to always review release notes before
    upgrading. To pin your install to a release series, edit the
    `docker-compose.yml`, find the line that says

    ```
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
    ```

    and replace the version with the series you want to track, for
    example:

    ```
    image: ghcr.io/paperless-ngx/paperless-ngx:1.7
    ```

### Bare Metal Route {#bare-metal-updating}

After grabbing the new release and unpacking the contents, do the
following:

1. Update dependencies. New paperless versions may require additional
   dependencies. The dependencies required are listed in the section
   about [bare metal installations](/setup#bare_metal).

2. Update python requirements. Keep in mind to activate your virtual
   environment before that, if you use one.

   ```shell-session
   $ pip install -r requirements.txt
   ```

3. Migrate the database.

   ```shell-session
   $ cd src
   $ python3 manage.py migrate # (1)
   ```

   1. Including `sudo -Hu <paperless_user>` may be required

   This might not actually do anything. Not every new paperless version
   comes with new database migrations.

## Downgrading Paperless {#downgrade-paperless}

Downgrades are possible. However, some updates also contain database
migrations (these change the layout of the database and may move data).
In order to move back from a version that applied database migrations,
you'll have to revert the database migration _before_ downgrading, and
then downgrade paperless.

This table lists the compatible versions for each database migration
number.

| Migration number | Version range   |
| ---------------- | --------------- |
| 1011             | 1.0.0           |
| 1012             | 1.1.0 - 1.2.1   |
| 1014             | 1.3.0 - 1.3.1   |
| 1016             | 1.3.2 - current |

Execute the following management command to migrate your database:

```shell-session
$ python3 manage.py migrate documents <migration number>
```
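
For example, to downgrade into the 1.2.x range, you would first revert to
migration number 1012 from the table above:

```shell-session
$ python3 manage.py migrate documents 1012
```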

!!! note

    Some migrations cannot be undone. The command will issue errors if that
    happens.

## Management utilities {#management-commands}

Paperless comes with some management commands that perform various
maintenance tasks on your paperless instance. You can invoke these
commands in the following way:

With docker-compose, while paperless is running:

```shell-session
$ cd /path/to/paperless
$ docker-compose exec webserver <command> <arguments>
```

With docker, while paperless is running:

```shell-session
$ docker exec -it <container-name> <command> <arguments>
```

Bare metal:

```shell-session
$ cd /path/to/paperless/src
$ python3 manage.py <command> <arguments> # (1)
```

1. Including `sudo -Hu <paperless_user>` may be required

All commands have built-in help, which can be accessed by executing them
with the argument `--help`.
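
For instance, to list the options of the exporter described below:

```shell-session
$ docker-compose exec webserver document_exporter --help
```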

### Document exporter {#exporter}

The document exporter exports all your data from paperless into a folder
for backup or migration to another DMS.

If you use the document exporter within a cronjob to backup your data,
you might use the `-T` flag behind exec to suppress "The input device
is not a TTY" errors. For example:
`docker-compose exec -T webserver document_exporter ../export`

```
document_exporter target [-c] [-d] [-f] [-na] [-nt] [-p] [-sm] [-z]

optional arguments:
-c, --compare-checksums
-d, --delete
-f, --use-filename-format
-na, --no-archive
-nt, --no-thumbnail
-p, --use-folder-prefix
-sm, --split-manifest
-z, --zip
```

`target` is a folder to which the data gets written. This includes
documents, thumbnails and a `manifest.json` file. The manifest contains
all metadata from the database (correspondents, tags, etc).

When you use the provided docker compose script, specify `../export` as
the target. This path inside the container is automatically mounted on
your host on the folder `export`.

If the target directory already exists and contains files, paperless
will assume that the contents of the export directory are a previous
export and will attempt to update the previous export. Paperless will
only export changed and added files. Paperless determines whether a file
has changed by inspecting the file attributes "date/time modified" and
"size". If that does not work out for you, specify `-c` or
`--compare-checksums` and paperless will attempt to compare file
checksums instead. This is slower.

Paperless will not remove any existing files in the export directory. If
you want paperless to also remove files that do not belong to the
current export, such as files from deleted documents, specify `-d` or `--delete`.
Be careful when pointing paperless to a directory that already contains
other files.

The filenames generated by this command follow the format
`[date created] [correspondent] [title].[extension]`. If you want
paperless to use `PAPERLESS_FILENAME_FORMAT` for exported filenames
instead, specify `-f` or `--use-filename-format`.

If `-na` or `--no-archive` is provided, no archive files will be exported,
only the original files.

If `-nt` or `--no-thumbnail` is provided, thumbnail files will not be exported.

!!! note

    When using the `-na`/`--no-archive` or `-nt`/`--no-thumbnail` options,
    the exporter will not output these files for backup. After importing,
    the [sanity checker](#sanity-checker) will warn about missing thumbnails and archive files
    until they are regenerated with `document_thumbnails` or [`document_archiver`](#archiver).
    It can make sense to omit these files from backup as their content and checksum
    can change (new archiver algorithm) and may then cause additional used space in
    a deduplicated backup.

If `-p` or `--use-folder-prefix` is provided, files will be exported
in dedicated folders according to their nature: `archive`, `originals`,
`thumbnails` or `json`.

If `-sm` or `--split-manifest` is provided, information about each document
will be placed in an individual JSON file, instead of a single JSON file. The
main `manifest.json` will still contain application-wide information (e.g.
tags, correspondents, document types, etc.).

If `-z` or `--zip` is provided, the export will be a zipfile
in the target directory, named according to the current date.
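
For example, to update an existing export, remove stale files, and compare
checksums in one run (the flag combination is illustrative):

```shell-session
$ docker-compose exec -T webserver document_exporter ../export -d -c
```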

!!! warning

    If exporting with the file name format, there may be errors due to
    your operating system's maximum path lengths. Try adjusting the export
    target or consider not using the filename format.

### Document importer {#importer}

The document importer takes the export produced by the [Document
exporter](#exporter) and imports it into paperless.

The importer works just like the exporter. You point it at a directory,
and the script does the rest of the work:

```
document_importer source
```

When you use the provided docker compose script, put the export inside
the `export` folder in your paperless source directory. Specify
`../export` as the `source`.

!!! note

    Importing from a previous version of Paperless may work, but for best
    results it is suggested to match the versions.

### Document retagger {#retagger}

Say you've imported a few hundred documents and now want to introduce a
tag or set up a new correspondent, and apply its matching to all of the
currently-imported docs. This problem is common enough that there are
tools for it.

```
document_retagger [-h] [-c] [-T] [-t] [-s] [-i] [--use-first] [-f]

optional arguments:
-c, --correspondent
-T, --tags
-t, --document_type
-s, --storage_path
-i, --inbox-only
--use-first
-f, --overwrite
```

Run this after changing or adding matching rules. It'll loop over all
of the documents in your database and attempt to match documents
according to the new rules.

Specify any combination of `-c`, `-T`, `-t` and `-s` to have the
retagger perform matching of the specified metadata type. If you don't
specify any of these options, the document retagger won't do anything.

Specify `-i` to have the document retagger work on documents tagged with
inbox tags only. This is useful when you don't want to mess with your
already processed documents.

When multiple document types or correspondents match a single document,
the retagger won't assign these to the document. Specify `--use-first`
to override this behavior and just use the first correspondent or type
it finds. This option does not apply to tags, since any amount of tags
can be applied to a document.

Finally, `-f` specifies that you wish to overwrite already assigned
correspondents, types and/or tags. The default behavior is to not assign
correspondents and types to documents that have this data already
assigned. `-f` works differently for tags: By default, only additional
tags get added to documents, no tags will be removed. With `-f`, tags
that don't match a document anymore get removed as well.
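
For example, to re-run tag and correspondent matching on inbox documents
only, without overwriting anything already assigned (an illustrative
combination):

```shell-session
$ docker-compose exec webserver document_retagger -T -c -i
```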

### Managing the Automatic matching algorithm

The _Auto_ matching algorithm requires a trained neural network to work.
This network needs to be updated whenever something in your data
changes. The docker image takes care of that automatically with the task
scheduler. You can manually renew the classifier by invoking the
following management command:

```
document_create_classifier
```

This command takes no arguments.

### Document thumbnails {#thumbnails}

Use this command to re-create document thumbnails. Optionally include the
`--document {id}` option to generate thumbnails for a specific document only.

```
document_thumbnails
```
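
For example, to regenerate the thumbnail of a single document (the id 42 is
hypothetical; use the id shown in the web UI):

```shell-session
$ docker-compose exec webserver document_thumbnails --document 42
```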

### Managing the document search index {#index}

The document search index is responsible for delivering search results
for the website. The document index is automatically updated whenever
documents get added to, changed, or removed from paperless. However, if
the search yields non-existing documents or won't find anything, you
may need to recreate the index manually.

```
document_index {reindex,optimize}
```

Specify `reindex` to have the index created from scratch. This may take
some time.

Specify `optimize` to optimize the index. This updates certain aspects
of the index and usually makes queries faster and also ensures that the
autocompletion works properly. This command is regularly invoked by the
task scheduler.

### Managing filenames {#renamer}

If you use paperless' feature to
[assign custom filenames to your documents](/advanced_usage#file-name-handling),
you can use this command to move all your files after changing the
naming scheme.

!!! warning

    Since this command moves your documents, it is advised to do a backup
    beforehand. The renaming logic is robust and will never overwrite or
    delete a file, but you can't ever be careful enough.

```
document_renamer
```

The command takes no arguments and processes all your documents at once.

Learn how to use
[Management Utilities](#management-commands).

### Sanity checker {#sanity-checker}

Paperless has a built-in sanity checker that inspects your document
collection for issues.

The issues detected by the sanity checker are as follows:

- Missing original files.
- Missing archive files.
- Inaccessible original files due to improper permissions.
- Inaccessible archive files due to improper permissions.
- Corrupted original documents by comparing their checksum against
  what is stored in the database.
- Corrupted archive documents by comparing their checksum against what
  is stored in the database.
- Missing thumbnails.
- Inaccessible thumbnails due to improper permissions.
- Documents without any content (warning).
- Orphaned files in the media directory (warning). These are files
  that are not referenced by any document in paperless.

```
document_sanity_checker
```

The command takes no arguments. Depending on the size of your document
archive, this may take some time.

### Fetching e-mail

Paperless automatically fetches your e-mail every 10 minutes by default.
If you want to invoke the email consumer manually, call the following
management command:

```
mail_fetcher
```

The command takes no arguments and processes all your mail accounts and
rules.

!!! note

    As of October 2022 Microsoft no longer supports IMAP authentication
    for Exchange servers, thus Exchange is no longer supported until a
    solution is implemented in the Python IMAP library used by Paperless.
    See [learn.microsoft.com](https://learn.microsoft.com/en-us/exchange/clients-and-mobile-in-exchange-online/deprecation-of-basic-authentication-exchange-online)

### Creating archived documents {#archiver}

Paperless stores archived PDF/A documents alongside your original
documents. These archived documents will also contain selectable text
for image-only originals. These documents are derived from the
originals, which are always stored unmodified. If coming from an earlier
version of paperless, your documents won't have archived versions.

This command creates PDF/A documents for your documents.

```
document_archiver --overwrite --document <id>
```

This command will only attempt to create archived documents when no
archived document exists yet, unless `--overwrite` is specified. If
`--document <id>` is specified, the archiver will only process that
document.

!!! note

    This command essentially performs OCR on all your documents again,
    according to your settings. If you run this with
    `PAPERLESS_OCR_MODE=redo`, it will potentially run for a very long time.
    You can cancel the command at any time, since this command will skip
    already archived versions the next time it is run.

!!! note

    Some documents will cause errors and cannot be converted into PDF/A
    documents, such as encrypted PDF documents. The archiver will skip over
    these documents each time it sees them.

### Managing encryption {#encryption}

Documents can be stored in Paperless using GnuPG encryption.

!!! warning

    Encryption is deprecated since [paperless-ng 0.9](/changelog#paperless-ng-090) and doesn't really
    provide any additional security, since you have to store the passphrase
    in a configuration file on the same system as the encrypted documents
    for paperless to work. Furthermore, the entire text content of the
    documents is stored plain in the database, even if your documents are
    encrypted. Filenames are not encrypted either.

    Also, the web server provides transparent access to your encrypted
    documents.

    Consider running paperless on an encrypted filesystem instead, which
    will then at least provide security against physical hardware theft.

#### Enabling encryption

Enabling encryption is no longer supported.

#### Disabling encryption

Basic usage to disable encryption of your document store:

(Note: If `PAPERLESS_PASSPHRASE` isn't set already, you need to specify
it here)

```
decrypt_documents [--passphrase SECR3TP4SSPHRA$E]
```
@@ -1,516 +0,0 @@
**************
Administration
**************

.. _administration-backup:

Making backups
##############

Multiple options exist for making backups of your paperless instance,
depending on how you installed paperless.

Before making backups, make sure that paperless is not running.

Options available to any installation of paperless:

* Use the :ref:`document exporter <utilities-exporter>`.
  The document exporter exports all your documents, thumbnails and
  metadata to a specific folder. You may import your documents into a
  fresh instance of paperless again or store your documents in another
  DMS with this export.
* The document exporter is also able to update an already existing export.
  Therefore, incremental backups with ``rsync`` are entirely possible.

.. caution::

   You cannot import the export generated with one version of paperless in a
   different version of paperless. The export contains an exact image of the
   database, and migrations may change the database layout.

Options available to docker installations:

* Backup the docker volumes. These usually reside within
  ``/var/lib/docker/volumes`` on the host and you need to be root in order
  to access them.

  Paperless uses 3 volumes:

  * ``paperless_media``: This is where your documents are stored.
  * ``paperless_data``: This is where auxiliary data is stored. This
    folder also contains the SQLite database, if you use it.
  * ``paperless_pgdata``: Exists only if you use PostgreSQL and contains
    the database.

Options available to bare-metal and non-docker installations:

* Backup the entire paperless folder. This ensures that if your paperless instance
  crashes at some point or your disk fails, you can simply copy the folder back
  into place and it works.

  When using PostgreSQL, you'll also have to backup the database.

.. _migrating-restoring:

Restoring
=========

.. _administration-updating:

Updating Paperless
##################

Docker Route
============

If a new release of paperless-ngx is available, upgrading depends on how you
installed paperless-ngx in the first place. The releases are available at the
`release page <https://github.com/paperless-ngx/paperless-ngx/releases>`_.

First of all, ensure that paperless is stopped.

.. code:: shell-session

    $ cd /path/to/paperless
    $ docker-compose down

After that, :ref:`make a backup <administration-backup>`.

A. If you pull the image from the docker hub, all you need to do is:

   .. code:: shell-session

       $ docker-compose pull
       $ docker-compose up

   The docker-compose files refer to the ``latest`` version, which is always the latest
   stable release.

B. If you built the image yourself, do the following:

   .. code:: shell-session

       $ git pull
       $ docker-compose build
       $ docker-compose up

Running ``docker-compose up`` will also apply any new database migrations.
If you see everything working, press CTRL+C once to gracefully stop paperless.
Then you can start paperless-ngx with ``-d`` to have it run in the background.

.. note::

   In version 0.9.14, the update process was changed. In 0.9.13 and earlier, the
   docker-compose files specified exact versions and pull won't automatically
   update to newer versions. In order to enable updates as described above, either
   get the new ``docker-compose.yml`` file from `here <https://github.com/paperless-ngx/paperless-ngx/tree/master/docker/compose>`_
   or edit the ``docker-compose.yml`` file, find the line that says

   .. code::

       image: ghcr.io/paperless-ngx/paperless-ngx:0.9.x

   and replace the version with ``latest``:

   .. code::

       image: ghcr.io/paperless-ngx/paperless-ngx:latest

.. note::

   In version 1.7.1 and onwards, the Docker image can now be pinned to a release series.
   This is often combined with automatic updaters such as Watchtower to allow safer
   unattended upgrading to new bugfix releases only. It is still recommended to always
   review release notes before upgrading. To pin your install to a release series, edit
   the ``docker-compose.yml``, find the line that says

   .. code::

       image: ghcr.io/paperless-ngx/paperless-ngx:latest

   and replace the version with the series you want to track, for example:

   .. code::

       image: ghcr.io/paperless-ngx/paperless-ngx:1.7

Bare Metal Route
================

After grabbing the new release and unpacking the contents, do the following:

1. Update dependencies. New paperless versions may require additional
   dependencies. The dependencies required are listed in the section about
   :ref:`bare metal installations <setup-bare_metal>`.

2. Update python requirements. Keep in mind to activate your virtual environment
   before that, if you use one.

   .. code:: shell-session

       $ pip install -r requirements.txt

3. Migrate the database.

   .. code:: shell-session

       $ cd src
       $ python3 manage.py migrate

   This might not actually do anything. Not every new paperless version comes with new
   database migrations.

Downgrading Paperless
#####################

Downgrades are possible. However, some updates also contain database migrations (these change the layout of the database and may move data).
In order to move back from a version that applied database migrations, you'll have to revert the database migration *before* downgrading,
and then downgrade paperless.

This table lists the compatible versions for each database migration number.

+------------------+-----------------+
| Migration number | Version range   |
+------------------+-----------------+
| 1011             | 1.0.0           |
+------------------+-----------------+
| 1012             | 1.1.0 - 1.2.1   |
+------------------+-----------------+
| 1014             | 1.3.0 - 1.3.1   |
+------------------+-----------------+
| 1016             | 1.3.2 - current |
+------------------+-----------------+

Execute the following management command to migrate your database:

.. code:: shell-session

    $ python3 manage.py migrate documents <migration number>

.. note::

   Some migrations cannot be undone. The command will issue errors if that happens.
.. _utilities-management-commands:
|
|
||||||
|
|
||||||
Management utilities
|
|
||||||
####################
|
|
||||||
|
|
||||||
Paperless comes with some management commands that perform various maintenance
|
|
||||||
tasks on your paperless instance. You can invoke these commands in the following way:
|
|
||||||
|
|
||||||
With docker-compose, while paperless is running:
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
$ cd /path/to/paperless
|
|
||||||
$ docker-compose exec webserver <command> <arguments>
|
|
||||||
|
|
||||||
With docker, while paperless is running:
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
$ docker exec -it <container-name> <command> <arguments>
|
|
||||||
|
|
||||||
Bare metal:
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
$ cd /path/to/paperless/src
|
|
||||||
$ python3 manage.py <command> <arguments>
|
|
||||||
|
|
||||||
All commands have built-in help, which can be accessed by executing them with
|
|
||||||
the argument ``--help``.
|
|
||||||
|
|
||||||
.. _utilities-exporter:
|
|
||||||
|
|
||||||
Document exporter
|
|
||||||
=================
|
|
||||||
|
|
||||||
The document exporter exports all your data from paperless into a folder for
|
|
||||||
backup or migration to another DMS.
|
|
||||||
|
|
||||||
If you use the document exporter within a cronjob to backup your data you might use the ``-T`` flag behind exec to suppress "The input device is not a TTY" errors. For example: ``docker-compose exec -T webserver document_exporter ../export``
|
|
||||||
|
|
||||||
.. code::
|
|
||||||
|
|
||||||
document_exporter target [-c] [-f] [-d]
|
|
||||||
|
|
||||||
optional arguments:
|
|
||||||
-c, --compare-checksums
|
|
||||||
-f, --use-filename-format
|
|
||||||
-d, --delete
|
|
||||||
|
|
||||||
``target`` is a folder to which the data gets written. This includes documents,
|
|
||||||
thumbnails and a ``manifest.json`` file. The manifest contains all metadata from
|
|
||||||
the database (correspondents, tags, etc).
|
|
||||||
|
|
||||||
When you use the provided docker compose script, specify ``../export`` as the
|
|
||||||
target. This path inside the container is automatically mounted on your host on
|
|
||||||
the folder ``export``.
|
|
||||||
|
|
||||||
If the target directory already exists and contains files, paperless will assume
|
|
||||||
that the contents of the export directory are a previous export and will attempt
|
|
||||||
to update the previous export. Paperless will only export changed and added files.
|
|
||||||
Paperless determines whether a file has changed by inspecting the file attributes
|
|
||||||
"date/time modified" and "size". If that does not work out for you, specify
|
|
||||||
``--compare-checksums`` and paperless will attempt to compare file checksums instead.
|
|
||||||
This is slower.
|
|
||||||
|
|
||||||
Paperless will not remove any existing files in the export directory. If you want
|
|
||||||
paperless to also remove files that do not belong to the current export such as files
|
|
||||||
from deleted documents, specify ``--delete``. Be careful when pointing paperless to
|
|
||||||
a directory that already contains other files.
|
|
||||||
|
|
||||||
The filenames generated by this command follow the format
|
|
||||||
``[date created] [correspondent] [title].[extension]``.
|
|
||||||
If you want paperless to use ``PAPERLESS_FILENAME_FORMAT`` for exported filenames
|
|
||||||
instead, specify ``--use-filename-format``.
|
|
||||||
|
|
||||||
|
|
||||||
.. _utilities-importer:
|
|
||||||
|
|
||||||
Document importer
|
|
||||||
=================
|
|
||||||
|
|
||||||
The document importer takes the export produced by the `Document exporter`_ and
|
|
||||||
imports it into paperless.
|
|
||||||
|
|
||||||
The importer works just like the exporter. You point it at a directory, and
|
|
||||||
the script does the rest of the work:
|
|
||||||
|
|
||||||
.. code::
|
|
||||||
|
|
||||||
document_importer source
|
|
||||||
|
|
||||||
When you use the provided docker compose script, put the export inside the
|
|
||||||
``export`` folder in your paperless source directory. Specify ``../export``
|
|
||||||
as the ``source``.
|
|
||||||
|
|
||||||
|
|
||||||
.. _utilities-retagger:

Document retagger
=================

Say you've imported a few hundred documents and now want to introduce
a tag or set up a new correspondent, and apply its matching to all of
the currently-imported docs. This problem is common enough that
there are tools for it.

.. code::

    document_retagger [-h] [-c] [-T] [-t] [-i] [--use-first] [-f]

    optional arguments:
    -c, --correspondent
    -T, --tags
    -t, --document_type
    -i, --inbox-only
    --use-first
    -f, --overwrite

Run this after changing or adding matching rules. It'll loop over all
of the documents in your database and attempt to match documents
according to the new rules.

Specify any combination of ``-c``, ``-T`` and ``-t`` to have the
retagger perform matching of the specified metadata type. If you don't
specify any of these options, the document retagger won't do anything.

Specify ``-i`` to have the document retagger work on documents tagged
with inbox tags only. This is useful when you don't want to mess with
your already processed documents.

When multiple document types or correspondents match a single document,
the retagger won't assign these to the document. Specify ``--use-first``
to override this behavior and just use the first correspondent or type
it finds. This option does not apply to tags, since any number of tags
can be applied to a document.

Finally, ``-f`` specifies that you wish to overwrite already assigned
correspondents, types and/or tags. The default behavior is to not
assign correspondents and types to documents that have this data already
assigned. ``-f`` works differently for tags: by default, only additional
tags get added to documents, and no tags will be removed. With ``-f``,
tags that no longer match a document get removed as well.

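For example, to apply tag matching to inbox documents only (a sketch; combine
the flags above as needed):

.. code::

    document_retagger -T -i
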
Managing the Automatic matching algorithm
=========================================

The *Auto* matching algorithm requires a trained neural network to work.
This network needs to be updated whenever something in your data
changes. The docker image takes care of that automatically with the task
scheduler. You can manually renew the classifier by invoking the following
management command:

.. code::

    document_create_classifier

This command takes no arguments.

.. _`administration-index`:

Managing the document search index
==================================

The document search index is responsible for delivering search results for the
website. The document index is automatically updated whenever documents get
added to, changed, or removed from paperless. However, if the search returns
non-existing documents or fails to find anything, you may need to recreate the
index manually.

.. code::

    document_index {reindex,optimize}

Specify ``reindex`` to have the index created from scratch. This may take some
time.

Specify ``optimize`` to optimize the index. This updates certain aspects of
the index and usually makes queries faster and also ensures that the
autocompletion works properly. This command is regularly invoked by the task
scheduler.

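For example, to rebuild the index from scratch in a docker compose setup
(a sketch, assuming the default ``webserver`` service name):

.. code::

    docker-compose exec webserver document_index reindex
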
.. _utilities-renamer:

Managing filenames
==================

If you use paperless' feature to
:ref:`assign custom filenames to your documents <advanced-file_name_handling>`,
you can use this command to move all your files after changing
the naming scheme.

.. warning::

    Since this command moves your documents around a lot, it is advised to do
    a backup beforehand. The renaming logic is robust and will never overwrite
    or delete a file, but you can never be too careful.

.. code::

    document_renamer

The command takes no arguments and processes all your documents at once.

Learn how to use :ref:`Management Utilities<utilities-management-commands>`.

.. _utilities-sanity-checker:

Sanity checker
==============

Paperless has a built-in sanity checker that inspects your document collection for issues.

The issues detected by the sanity checker are as follows:

* Missing original files.
* Missing archive files.
* Inaccessible original files due to improper permissions.
* Inaccessible archive files due to improper permissions.
* Corrupted original documents, detected by comparing their checksum against what is stored in the database.
* Corrupted archive documents, detected by comparing their checksum against what is stored in the database.
* Missing thumbnails.
* Inaccessible thumbnails due to improper permissions.
* Documents without any content (warning).
* Orphaned files in the media directory (warning). These are files that are not referenced by any document in paperless.

.. code::

    document_sanity_checker

The command takes no arguments. Depending on the size of your document archive, this may take some time.

Fetching e-mail
===============

Paperless automatically fetches your e-mail every 10 minutes by default. If
you want to invoke the email consumer manually, call the following management
command:

.. code::

    mail_fetcher

The command takes no arguments and processes all your mail accounts and rules.

.. _utilities-archiver:

Creating archived documents
===========================

Paperless stores archived PDF/A documents alongside your original documents.
These archived documents will also contain selectable text for image-only
originals. These documents are derived from the originals, which are always
stored unmodified. If coming from an earlier version of paperless, your
documents won't have archived versions.

This command creates PDF/A documents for your documents.

.. code::

    document_archiver --overwrite --document <id>

This command will only attempt to create archived documents when no archived
document exists yet, unless ``--overwrite`` is specified. If ``--document <id>``
is specified, the archiver will only process that document.

.. note::

    This command essentially performs OCR on all your documents again,
    according to your settings. If you run this with ``PAPERLESS_OCR_MODE=redo``,
    it will potentially run for a very long time. You can cancel the command
    at any time, since this command will skip already archived versions the next
    time it is run.

.. note::

    Some documents will cause errors and cannot be converted into PDF/A documents,
    such as encrypted PDF documents. The archiver will skip over these documents
    each time it sees them.

.. _utilities-encyption:

Managing encryption
===================

Documents can be stored in Paperless using GnuPG encryption.

.. danger::

    Encryption is deprecated since paperless-ngx 0.9 and doesn't really provide any
    additional security, since you have to store the passphrase in a configuration
    file on the same system as the encrypted documents for paperless to work.
    Furthermore, the entire text content of the documents is stored plain in the
    database, even if your documents are encrypted. Filenames are not encrypted
    either.

    Also, the web server provides transparent access to your encrypted documents.

    Consider running paperless on an encrypted filesystem instead, which will then
    at least provide security against physical hardware theft.


Enabling encryption
-------------------

Enabling encryption is no longer supported.


Disabling encryption
--------------------

Basic usage to disable encryption of your document store (note: if
``PAPERLESS_PASSPHRASE`` isn't set already, you need to specify it here):

.. code::

    decrypt_documents [--passphrase SECR3TP4SSPHRA$E]


docs/advanced_usage.md (new file, 483 lines)
@@ -0,0 +1,483 @@

# Advanced Topics

Paperless offers a couple features that automate certain tasks and make
your life easier.

## Matching tags, correspondents, document types, and storage paths {#matching}

Paperless will compare the matching algorithms defined by every tag,
correspondent, document type, and storage path in your database to see
if they apply to the text in a document. In other words, if you define a
tag called `Home Utility` that has a `match` property of `bc hydro` and
a `matching_algorithm` of `literal`, Paperless will automatically tag
your newly-consumed document with your `Home Utility` tag so long as the
text `bc hydro` appears in the body of the document somewhere.

The matching logic is quite powerful. It supports searching the text of
your document with different algorithms, and as such, some
experimentation may be necessary to get things right.

In order to have a tag, correspondent, document type, or storage path
assigned automatically to newly consumed documents, assign a match and
matching algorithm using the web interface. These settings define when
to assign tags, correspondents, document types, and storage paths to
documents.

The following algorithms are available:

- **Any:** Looks for any occurrence of any word provided in match in
  the PDF. If you define the match as `Bank1 Bank2`, it will match
  documents containing either of these terms.
- **All:** Requires that every word provided appears in the PDF,
  albeit not in the order provided.
- **Literal:** Matches only if the match appears exactly as provided
  (i.e. preserving ordering) in the PDF.
- **Regular expression:** Parses the match as a regular expression and
  tries to find a match within the document.
- **Fuzzy match:** I don't know. Look at the source.
- **Auto:** Tries to automatically match new documents. This does not
  require you to set a match. See the notes below.

When using the _any_ or _all_ matching algorithms, you can search for
terms that consist of multiple words by enclosing them in double quotes.
For example, defining a match text of `"Bank of America" BofA` using the
_any_ algorithm will match documents that contain either "Bank of
America" or "BofA", but will not match documents containing "Bank of
South America".

Then just save your tag, correspondent, document type, or storage path
and run another document through the consumer. Once complete, you should
see the newly-created document, automatically tagged with the
appropriate data.

### Automatic matching {#automatic-matching}

Paperless-ngx comes with a new matching algorithm called _Auto_. This
matching algorithm tries to assign tags, correspondents, document types,
and storage paths to your documents based on how you have already
assigned these on existing documents. It uses a neural network under the
hood.

If, for example, all your bank statements of your account 123 at the
Bank of America are tagged with the tag "bofa123" and the matching
algorithm of this tag is set to _Auto_, this neural network will examine
your documents and automatically learn when to assign this tag.

Paperless tries to hide much of the involved complexity with this
approach. However, there are a couple caveats you need to keep in mind
when using this feature:

- Changes to your documents are not immediately reflected by the
  matching algorithm. The neural network needs to be _trained_ on your
  documents after changes. Paperless periodically (default: once each
  hour) checks for changes and does this automatically for you.
- The Auto matching algorithm only takes documents into account which
  are NOT placed in your inbox (i.e. that do not have any inbox tags
  assigned to them). This ensures that the neural network only learns
  from documents which you have correctly tagged before.
- The matching algorithm can only work if there is a correlation
  between the tag, correspondent, document type, or storage path and
  the document itself. Your bank statements usually contain your bank
  account number and the name of the bank, so this works reasonably
  well. However, tags such as "TODO" cannot be automatically
  assigned.
- The matching algorithm needs a reasonable number of documents to
  identify when to assign tags, correspondents, storage paths, and
  types. If one out of a thousand documents has the correspondent
  "Very obscure web shop I bought something five years ago", it will
  probably not assign this correspondent automatically if you buy
  something from them again. The more documents, the better.
- Paperless also needs a reasonable amount of negative examples to
  decide when not to assign a certain tag, correspondent, document
  type, or storage path. This will usually be the case as you start
  filling up paperless with documents. Example: if all your documents
  are either from "Webshop" or "Bank", paperless will assign one
  of these correspondents to ANY new document, if both are set to
  automatic matching.

## Hooking into the consumption process {#consume-hooks}

Sometimes you may want to do something arbitrary whenever a document is
consumed. Rather than try to predict what you may want to do, Paperless
lets you execute scripts of your own choosing just before or after a
document is consumed using a couple simple hooks.

Just write a script, put it somewhere that Paperless can read & execute,
and then put the path to that script in `paperless.conf` or
`docker-compose.env` with the variable name of either
`PAPERLESS_PRE_CONSUME_SCRIPT` or `PAPERLESS_POST_CONSUME_SCRIPT`.

!!! info

    These scripts are executed in a **blocking** process, which means that
    if a script takes a long time to run, it can significantly slow down
    your document consumption flow. If you want things to run
    asynchronously, you'll have to fork the process in your script and
    exit.

### Pre-consumption script {#pre-consume-script}

Executed after the consumer sees a new document in the consumption
folder, but before any processing of the document is performed. This
script can access the following environment variables:

- `DOCUMENT_SOURCE_PATH`

A simple but common example for this would be creating a simple script
like this:

`/usr/local/bin/ocr-pdf`

```bash
#!/usr/bin/env bash
pdf2pdfocr.py -i ${DOCUMENT_SOURCE_PATH}
```

`/etc/paperless.conf`

```bash
...
PAPERLESS_PRE_CONSUME_SCRIPT="/usr/local/bin/ocr-pdf"
...
```

This will pass the path to the document about to be consumed to
`/usr/local/bin/ocr-pdf`, which will in turn call
[pdf2pdfocr.py](https://github.com/LeoFCardoso/pdf2pdfocr) on your
document, which will then overwrite the file with an OCR'd version of
the file and exit, at which point the consumption process will begin
with the newly modified file.

The script's stdout and stderr will be logged line by line to the
webserver log, along with the exit code of the script.

### Post-consumption script {#post-consume-script}

Executed after the consumer has successfully processed a document and
has moved it into paperless. It receives the following environment
variables:

- `DOCUMENT_ID`
- `DOCUMENT_FILE_NAME`
- `DOCUMENT_CREATED`
- `DOCUMENT_MODIFIED`
- `DOCUMENT_ADDED`
- `DOCUMENT_SOURCE_PATH`
- `DOCUMENT_ARCHIVE_PATH`
- `DOCUMENT_THUMBNAIL_PATH`
- `DOCUMENT_DOWNLOAD_URL`
- `DOCUMENT_THUMBNAIL_URL`
- `DOCUMENT_CORRESPONDENT`
- `DOCUMENT_TAGS`
- `DOCUMENT_ORIGINAL_FILENAME`

The script can be in any language, but for a simple shell script
example, you can take a look at
[post-consumption-example.sh](https://github.com/paperless-ngx/paperless-ngx/blob/main/scripts/post-consumption-example.sh)
in this project.

The post-consumption script cannot cancel the consumption process.

The script's stdout and stderr will be logged line by line to the
webserver log, along with the exit code of the script.

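As a minimal sketch of what a post-consumption script might do (the log file
path is just an example; the variables are provided by paperless as listed
above), such a script could record every document that gets consumed:

```bash
#!/usr/bin/env bash
# Append the ID and filename of each consumed document to a log file.
# The log path below is an example; pick a location that suits your setup.
echo "consumed ${DOCUMENT_ID}: ${DOCUMENT_FILE_NAME}" >> /usr/src/paperless/scripts/consumed.log
```
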
### Docker {#docker-consume-hooks}

To hook into the consumption process when using Docker, you
will need to pass the scripts into the container via a host mount
in your `docker-compose.yml`.

Assuming you have
`/home/paperless-ngx/scripts/post-consumption-example.sh` as a
script which you'd like to run, you can pass it into the consumer
container via a host mount:

```yaml
...
webserver:
  ...
  volumes:
    ...
    - /home/paperless-ngx/scripts:/path/in/container/scripts/ # (1)!
  environment: # (3)!
    ...
    PAPERLESS_POST_CONSUME_SCRIPT: /path/in/container/scripts/post-consumption-example.sh # (2)!
    ...
```

1. The external scripts directory is mounted to a location inside the container.
2. The internal location of the script is used to set the script to run.
3. This can also be set in `docker-compose.env`.

Troubleshooting:

- Monitor the docker-compose log:
  `cd ~/paperless-ngx; docker-compose logs -f`
- Check your script's permissions, e.g. in case of a permission error:
  `sudo chmod 755 post-consumption-example.sh`
- Pipe your script's output to a log file, e.g.
  `echo "${DOCUMENT_ID}" | tee --append /usr/src/paperless/scripts/post-consumption-example.log`

## File name handling {#file-name-handling}

By default, paperless stores your documents in the media directory and
renames them using the identifier which it has assigned to each
document. You will end up getting files like `0000123.pdf` in your media
directory. This isn't necessarily a bad thing, because you normally
don't have to access these files manually. However, if you wish to name
your files differently, you can do that by adjusting the
`PAPERLESS_FILENAME_FORMAT` configuration option. Paperless adds the
correct file extension, e.g. `.pdf` or `.jpg`, automatically.

This variable allows you to configure the filename (folders are allowed)
using placeholders. For example, configuring this to

```bash
PAPERLESS_FILENAME_FORMAT={created_year}/{correspondent}/{title}
```

will create a directory structure as follows:

```
2019/
  My bank/
    Statement January.pdf
    Statement February.pdf
2020/
  My bank/
    Statement January.pdf
    Letter.pdf
    Letter_01.pdf
  Shoe store/
    My new shoes.pdf
```

!!! warning

    Do not manually move your files in the media folder. Paperless remembers
    the last filename a document was stored as. If you do rename a file,
    paperless will report your files as missing and won't be able to find
    them.

Paperless provides the following placeholders within filenames:

- `{asn}`: The archive serial number of the document, or "none".
- `{correspondent}`: The name of the correspondent, or "none".
- `{document_type}`: The name of the document type, or "none".
- `{tag_list}`: A comma separated list of all tags assigned to the
  document.
- `{title}`: The title of the document.
- `{created}`: The full date (ISO format) the document was created.
- `{created_year}`: Year created only, formatted as the year with
  century.
- `{created_year_short}`: Year created only, formatted as the year
  without century, zero padded.
- `{created_month}`: Month created only (number 01-12).
- `{created_month_name}`: Month created name, as per locale.
- `{created_month_name_short}`: Month created abbreviated name, as per
  locale.
- `{created_day}`: Day created only (number 01-31).
- `{added}`: The full date (ISO format) the document was added to
  paperless.
- `{added_year}`: Year added only.
- `{added_year_short}`: Year added only, formatted as the year without
  century, zero padded.
- `{added_month}`: Month added only (number 01-12).
- `{added_month_name}`: Month added name, as per locale.
- `{added_month_name_short}`: Month added abbreviated name, as per
  locale.
- `{added_day}`: Day added only (number 01-31).

Paperless will try to preserve the information from your database as
much as possible. However, some characters that you can use in document
titles and correspondent names (such as `: \ /` and a couple more) are
not allowed in filenames and will be replaced with dashes.

If paperless detects that two documents share the same filename,
paperless will automatically append `_01`, `_02`, etc. to the filename.
This happens if all the placeholders in a filename evaluate to the same
value.

!!! tip

    You can affect how empty placeholders are treated by changing the
    following setting to `true`.

    ```
    PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=True
    ```

    Doing this results in all empty placeholders resolving to "" instead
    of "none" as stated above. Spaces before empty placeholders are
    removed as well, and empty directories are omitted.

!!! tip

    Paperless checks the filename of a document whenever it is saved.
    Therefore, you need to update the filenames of your documents and move
    them after altering this setting by invoking the
    [`document renamer`](/administration#renamer).

!!! warning

    Make absolutely sure you get the spelling of the placeholders right, or
    else paperless will use the default naming scheme instead.

!!! caution

    As of now, you could totally tell paperless to store your files anywhere
    outside the media directory by setting

    ```
    PAPERLESS_FILENAME_FORMAT=../../my/custom/location/{title}
    ```

    However, keep in mind that inside docker, if files get stored outside of
    the predefined volumes, they will be lost after a restart of paperless.

!!! warning

    When file name handling is in use, in particular when using `{tag_list}`,
    you may run into your operating system's maximum path length. In that
    case, files will retain their previous path and the issue will be logged.

## Storage paths

One of the best things in Paperless is that you can not only access the
documents via the web interface, but also via the file system.

When a single storage layout is not sufficient for your use case,
storage paths come to the rescue. Storage paths allow you to configure
more precisely where each document is stored in the file system.

- Each storage path is a `PAPERLESS_FILENAME_FORMAT` and
  follows the rules described above.
- Each document is assigned a storage path using the matching
  algorithms described above, but this can be overwritten at any time.

For example, you could define the following two storage paths:

1. Normal communications are put into a folder structure sorted by
   `year/correspondent`.
2. Communications with insurance companies are stored in a flat
   structure with longer file names, but containing the full date of
   the correspondence.

```
By Year = {created_year}/{correspondent}/{title}
Insurances = Insurances/{correspondent}/{created_year}-{created_month}-{created_day} {title}
```

If you then map these storage paths to the documents, you might get the
following result. For simplicity, `By Year` defines the same
structure as in the previous example above.

```text
2019/ # By Year
  My bank/
    Statement January.pdf
    Statement February.pdf

Insurances/ # Insurances
  Healthcare 123/
    2022-01-01 Statement January.pdf
    2022-02-02 Letter.pdf
    2022-02-03 Letter.pdf
  Dental 456/
    2021-12-01 New Conditions.pdf
```

!!! tip

    Defining a storage path is optional. If no storage path is defined for a
    document, the global `PAPERLESS_FILENAME_FORMAT` is applied.

!!! warning

    If you adjust the format of an existing storage path, old documents
    don't get relocated automatically. You need to run the
    [document renamer](/administration#renamer) to
    adjust their paths.

## Celery Monitoring {#celery-monitoring}

The monitoring tool
[Flower](https://flower.readthedocs.io/en/latest/index.html) can be used
to view more detailed information about the health of the celery workers
used for asynchronous tasks. This includes details on currently running,
queued and completed tasks, timing and more. Flower can also be used
with Prometheus, as it exports metrics. For details on its capabilities,
refer to the Flower documentation.

To configure Flower further, create a `flowerconfig.py` and
place it into the `src/paperless` directory. For a Docker
installation, you can use volumes to accomplish this:

```yaml
services:
  # ...
  webserver:
    ports:
      - 5555:5555 # (2)!
    # ...
    volumes:
      - /path/to/my/flowerconfig.py:/usr/src/paperless/src/paperless/flowerconfig.py:ro # (1)!
```

1. Note the `:ro` tag means the file will be mounted as read only.
2. `flower` runs by default on port 5555, but this can be configured.

## Custom Container Initialization

The Docker image includes the ability to run custom user scripts during
startup. This could be utilized for installing additional tools or
Python packages, for example. Scripts are expected to be shell scripts.

To utilize this, mount a folder containing your scripts to the custom
initialization directory, `/custom-cont-init.d`, and place the
scripts you wish to run inside. For security, the folder must be owned
by `root` and should have permissions of `a=rx`. Additionally, scripts
must only be writable by `root`.

Your scripts will be run directly before the webserver completes
startup. Scripts will be run by the `root` user.
If you would like to switch users, the utility `gosu` is available and
preferred over `sudo`.

This is an advanced functionality with which you could break functionality
or lose data. If you experience issues, please disable any custom scripts
and try again before reporting an issue.

For example, using Docker Compose:

```yaml
services:
  # ...
  webserver:
    # ...
    volumes:
      - /path/to/my/scripts:/custom-cont-init.d:ro # (1)!
```

1. Note the `:ro` tag means the folder will be mounted as read only. This is for extra security against changes.

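As a sketch of such an init script (the package is only an example, and a
Debian-based image is assumed), a script placed in the mounted folder might
install an extra tool:

```bash
#!/usr/bin/env bash
# Example only: install an additional tool at container startup.
# Assumes the container image is Debian-based and has network access.
apt-get update
apt-get install --yes --no-install-recommends jq
```
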
## MySQL Caveats {#mysql-caveats}

### Case Sensitivity

The database interface does not provide a method to configure a MySQL
database to be case sensitive. This prevents a user from creating both a
tag `Name` and a tag `NAME`, as they are considered the same.

Per the Django documentation, enabling this requires manual intervention.
To enable case sensitive tables, you can execute the following command
against each table:

`ALTER TABLE <table_name> CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`

You can also set the default for new tables (this does NOT affect
existing tables) with:

`ALTER DATABASE <db_name> CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`


(deleted file, 292 lines)
@@ -1,292 +0,0 @@

***************
Advanced topics
***************

Paperless offers a couple features that automate certain tasks and make your life
easier.

.. _advanced-matching:

Matching tags, correspondents and document types
################################################

Paperless will compare the matching algorithms defined by every tag and
correspondent already set in your database to see if they apply to the text in
a document. In other words, if you defined a tag called ``Home Utility``
that has a ``match`` property of ``bc hydro`` and a ``matching_algorithm`` of
``literal``, Paperless will automatically tag your newly-consumed document with
your ``Home Utility`` tag so long as the text ``bc hydro`` appears in the body
of the document somewhere.

The matching logic is quite powerful. It supports searching the text of your
document with different algorithms, and as such, some experimentation may be
necessary to get things right.

In order to have a tag, correspondent, or type assigned automatically to newly
consumed documents, assign a match and matching algorithm using the web
interface. These settings define when to assign correspondents, tags, and types
to documents.

The following algorithms are available:

* **Any:** Looks for any occurrence of any word provided in match in the PDF.
  If you define the match as ``Bank1 Bank2``, it will match documents containing
  either of these terms.
* **All:** Requires that every word provided appears in the PDF, albeit not in the
  order provided.
* **Literal:** Matches only if the match appears exactly as provided (i.e. preserving ordering) in the PDF.
* **Regular expression:** Parses the match as a regular expression and tries to
  find a match within the document.
* **Fuzzy match:** I don't know. Look at the source.
* **Auto:** Tries to automatically match new documents. This does not require you
  to set a match. See the notes below.

When using the *any* or *all* matching algorithms, you can search for terms
that consist of multiple words by enclosing them in double quotes. For example,
defining a match text of ``"Bank of America" BofA`` using the *any* algorithm
will match documents that contain either "Bank of America" or "BofA", but will
not match documents containing "Bank of South America".

Then just save your tag/correspondent and run another document through the
consumer. Once complete, you should see the newly-created document,
automatically tagged with the appropriate data.

.. _advanced-automatic_matching:

Automatic matching
==================

Paperless-ngx comes with a new matching algorithm called *Auto*. This matching
algorithm tries to assign tags, correspondents, and document types to your
documents based on how you have already assigned these on existing documents. It
uses a neural network under the hood.

If, for example, all your bank statements of your account 123 at the Bank of
America are tagged with the tag "bofa_123" and the matching algorithm of this
tag is set to *Auto*, this neural network will examine your documents and
automatically learn when to assign this tag.

Paperless tries to hide much of the involved complexity with this approach.
However, there are a couple caveats you need to keep in mind when using this
feature:

* Changes to your documents are not immediately reflected by the matching
  algorithm. The neural network needs to be *trained* on your documents after
  changes. Paperless periodically (default: once each hour) checks for changes
  and does this automatically for you.
* The Auto matching algorithm only takes documents into account which are NOT
  placed in your inbox (i.e. that do not have any inbox tags assigned to them).
  This ensures that the neural network only learns from documents which you have
  correctly tagged before.
* The matching algorithm can only work if there is a correlation between the
  tag, correspondent, or document type and the document itself. Your bank
  statements usually contain your bank account number and the name of the bank,
  so this works reasonably well. However, tags such as "TODO" cannot be
  automatically assigned.
* The matching algorithm needs a reasonable number of documents to identify when
  to assign tags, correspondents, and types. If one out of a thousand documents
  has the correspondent "Very obscure web shop I bought something five years
  ago", it will probably not assign this correspondent automatically if you buy
  something from them again. The more documents, the better.
* Paperless also needs a reasonable amount of negative examples to decide when
  not to assign a certain tag, correspondent or type. This will usually be the
  case as you start filling up paperless with documents. Example: if all your
  documents are either from "Webshop" or "Bank", paperless will assign one of
  these correspondents to ANY new document, if both are set to automatic matching.

Hooking into the consumption process
####################################

Sometimes you may want to do something arbitrary whenever a document is
consumed. Rather than try to predict what you may want to do, Paperless lets
you execute scripts of your own choosing just before or after a document is
consumed using a couple simple hooks.

Just write a script, put it somewhere that Paperless can read & execute, and
then put the path to that script in ``paperless.conf`` or ``docker-compose.env``
with the variable name of either ``PAPERLESS_PRE_CONSUME_SCRIPT`` or
``PAPERLESS_POST_CONSUME_SCRIPT``.

.. important::

    These scripts are executed in a **blocking** process, which means that if
    a script takes a long time to run, it can significantly slow down your
    document consumption flow. If you want things to run asynchronously,
    you'll have to fork the process in your script and exit.


Pre-consumption script
======================

Executed after the consumer sees a new document in the consumption folder, but
before any processing of the document is performed. This script receives exactly
one argument:

* Document file name

A simple but common example for this would be creating a simple script like
this:

``/usr/local/bin/ocr-pdf``

.. code:: bash

    #!/usr/bin/env bash
    pdf2pdfocr.py -i ${1}

``/etc/paperless.conf``

.. code:: bash

    ...
    PAPERLESS_PRE_CONSUME_SCRIPT="/usr/local/bin/ocr-pdf"
    ...

This will pass the path to the document about to be consumed to ``/usr/local/bin/ocr-pdf``,
which will in turn call `pdf2pdfocr.py`_ on your document, which will then
overwrite the file with an OCR'd version of the file and exit, at which point
the consumption process will begin with the newly modified file.

.. _pdf2pdfocr.py: https://github.com/LeoFCardoso/pdf2pdfocr

.. _advanced-post_consume_script:

Post-consumption script
=======================

Executed after the consumer has successfully processed a document and has moved it
into paperless. It receives the following arguments:

* Document id
* Generated file name
* Source path
* Thumbnail path
* Download URL
* Thumbnail URL
* Correspondent
* Tags

The script can be in any language, but for a simple shell script
example, you can take a look at `post-consumption-example.sh`_ in this project.

The post-consumption script cannot cancel the consumption process.

Docker
------

Assuming you have ``/home/foo/paperless-ngx/scripts/post-consumption-example.sh``,
you can pass that script into the consumer container via a host mount in your
``docker-compose.yml``.

.. code:: bash

    ...
    consumer:
        ...
        volumes:
            ...
            - /home/paperless-ngx/scripts:/path/in/container/scripts/
            ...

Example (docker-compose.yml): ``- /home/foo/paperless-ngx/scripts:/usr/src/paperless/scripts``

which in turn requires the variable ``PAPERLESS_POST_CONSUME_SCRIPT`` in
``docker-compose.env`` to point to ``/path/in/container/scripts/post-consumption-example.sh``.

Example (docker-compose.env): ``PAPERLESS_POST_CONSUME_SCRIPT=/usr/src/paperless/scripts/post-consumption-example.sh``

Troubleshooting:

- Monitor the docker-compose log: ``cd ~/paperless-ngx; docker-compose logs -f``
- Check your script's permissions, e.g. in case of a permission error: ``sudo chmod 755 post-consumption-example.sh``
- Pipe your script's output to a log file, e.g. ``echo "${DOCUMENT_ID}" | tee --append /usr/src/paperless/scripts/post-consumption-example.log``

.. _post-consumption-example.sh: https://github.com/paperless-ngx/paperless-ngx/blob/main/scripts/post-consumption-example.sh

.. _advanced-file_name_handling:

File name handling
##################

By default, paperless stores your documents in the media directory and renames them
using the identifier which it has assigned to each document. You will end up getting
files like ``0000123.pdf`` in your media directory. This isn't necessarily a bad
thing, because you normally don't have to access these files manually. However, if
you wish to name your files differently, you can do that by adjusting the
``PAPERLESS_FILENAME_FORMAT`` configuration option.

This variable allows you to configure the filename (folders are allowed) using
placeholders. For example, configuring this to

.. code:: bash

    PAPERLESS_FILENAME_FORMAT={created_year}/{correspondent}/{title}

will create a directory structure as follows:

.. code::

    2019/
      My bank/
        Statement January.pdf
        Statement February.pdf
    2020/
      My bank/
        Statement January.pdf
        Letter.pdf
        Letter_01.pdf
      Shoe store/
        My new shoes.pdf

.. danger::

    Do not manually move your files in the media folder. Paperless remembers the
    last filename a document was stored as. If you do rename a file, paperless will
    report your files as missing and won't be able to find them.

Paperless provides the following placeholders within filenames:

* ``{asn}``: The archive serial number of the document, or "none".
* ``{correspondent}``: The name of the correspondent, or "none".
* ``{document_type}``: The name of the document type, or "none".
* ``{tag_list}``: A comma separated list of all tags assigned to the document.
* ``{title}``: The title of the document.
* ``{created}``: The full date (ISO format) the document was created.
* ``{created_year}``: Year created only.
* ``{created_month}``: Month created only (number 01-12).
* ``{created_day}``: Day created only (number 01-31).
* ``{added}``: The full date (ISO format) the document was added to paperless.
* ``{added_year}``: Year added only.
* ``{added_month}``: Month added only (number 01-12).
* ``{added_day}``: Day added only (number 01-31).

Paperless will try to preserve the information from your database as much as possible.
However, some characters that you can use in document titles and correspondent names (such
as ``: \ /`` and a couple more) are not allowed in filenames and will be replaced with dashes.

If paperless detects that two documents share the same filename, paperless will automatically
append ``_01``, ``_02``, etc. to the filename. This happens if all the placeholders in a filename
evaluate to the same value.

.. hint::

    Paperless checks the filename of a document whenever it is saved. Therefore,
    you need to update the filenames of your documents and move them after altering
    this setting by invoking the :ref:`document renamer <utilities-renamer>`.

.. warning::

    Make absolutely sure you get the spelling of the placeholders right, or else
    paperless will use the default naming scheme instead.

.. caution::

    As of now, you could totally tell paperless to store your files anywhere outside
    the media directory by setting

    .. code::

        PAPERLESS_FILENAME_FORMAT=../../my/custom/location/{title}

    However, keep in mind that inside docker, if files get stored outside of the
    predefined volumes, they will be lost after a restart of paperless.


docs/api.md (new file, 318 lines)
@@ -0,0 +1,318 @@

# The REST API

Paperless makes use of the [Django REST
Framework](https://django-rest-framework.org/) standard API interface. It
provides a browsable API for most of its endpoints, which you can
inspect at `http://<paperless-host>:<port>/api/`. This also documents
most of the available filters and ordering fields.

The API provides 7 main endpoints:

- `/api/documents/`: Full CRUD support, except POSTing new documents.
  See below.
- `/api/correspondents/`: Full CRUD support.
- `/api/document_types/`: Full CRUD support.
- `/api/logs/`: Read-Only.
- `/api/tags/`: Full CRUD support.
- `/api/mail_accounts/`: Full CRUD support.
- `/api/mail_rules/`: Full CRUD support.

All of these endpoints except for the logging endpoint allow you to
fetch, edit and delete individual objects by appending their primary key
to the path, for example `/api/documents/454/`.

The objects served by the document endpoint contain the following
fields:

- `id`: ID of the document. Read-only.
- `title`: Title of the document.
- `content`: Plain text content of the document.
- `tags`: List of IDs of tags assigned to this document, or an empty
  list.
- `document_type`: Document type of this document, or null.
- `correspondent`: Correspondent of this document, or null.
- `created`: The date and time at which this document was created.
- `created_date`: The date (YYYY-MM-DD) at which this document was
  created. Optional. If also passed with created, this is ignored.
- `modified`: The date at which this document was last edited in
  paperless. Read-only.
- `added`: The date at which this document was added to paperless.
  Read-only.
- `archive_serial_number`: The identifier of this document in a
  physical document archive.
- `original_file_name`: Verbose filename of the original document.
  Read-only.
- `archived_file_name`: Verbose filename of the archived document.
  Read-only. Null if no archived document is available.

## Downloading documents

In addition to that, the document endpoint offers these additional
actions on individual documents:

- `/api/documents/<pk>/download/`: Download the document.
- `/api/documents/<pk>/preview/`: Display the document inline, without
  downloading it.
- `/api/documents/<pk>/thumb/`: Download the PNG thumbnail of a
  document.

Paperless generates archived PDF/A documents from consumed files and
stores both the original files as well as the archived files. By
default, the endpoints for previews and downloads serve the archived
file, if it is available. Otherwise, the original file is served. Some
documents cannot be archived.

The endpoints correctly serve the response header fields
`Content-Disposition` and `Content-Type` to indicate the filename for
download and the type of content of the document.

In order to download or preview the original document when an archived
document is available, supply the query parameter `original=true`.

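From the command line, downloading the original file of document 454 could look
like this (a sketch; host and token are placeholders, and curl's `-OJ` saves
the file under the name given in `Content-Disposition`):

```bash
curl -OJ -H "Authorization: Token <token>" \
  "http://localhost:8000/api/documents/454/download/?original=true"
```
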
!!! tip

    Paperless used to provide this functionality at `/fetch/<pk>/preview`,
    `/fetch/<pk>/thumb` and `/fetch/<pk>/doc`. Redirects to the new URLs are
    in place. However, if you use these old URLs to access documents, you
    should update your app or script to use the new URLs.

## Getting document metadata

The API also has an endpoint to retrieve read-only metadata about
specific documents. This information is not served along with the
document objects, since it requires reading files and would therefore
slow down document lists considerably.

Access the metadata of a document with an ID `id` at
`/api/documents/<id>/metadata/`.

The endpoint reports the following data:

- `original_checksum`: MD5 checksum of the original document.
- `original_size`: Size of the original document, in bytes.
- `original_mime_type`: Mime type of the original document.
- `media_filename`: Current filename of the document, under which it
  is stored inside the media directory.
- `has_archive_version`: True, if this document is archived, false
  otherwise.
- `original_metadata`: A list of metadata associated with the original
  document. See below.
- `archive_checksum`: MD5 checksum of the archived document, or null.
- `archive_size`: Size of the archived document in bytes, or null.
- `archive_metadata`: Metadata associated with the archived document,
  or null. See below.

File metadata is reported as a list of objects in the following form:

```json
[
  {
    "namespace": "http://ns.adobe.com/pdf/1.3/",
    "prefix": "pdf",
    "key": "Producer",
    "value": "SparklePDF, Fancy edition"
  }
]
```

`namespace` and `prefix` can be null. The actual metadata reported
depends on the file type and the metadata available in that specific
document. Paperless only reports PDF metadata at this point.

## Authorization

The REST API provides three different forms of authentication.

1. Basic authentication

   Authorize by providing an HTTP header in the form

   ```
   Authorization: Basic <credentials>
   ```

   where `credentials` is a base64-encoded string of
   `<username>:<password>`

2. Session authentication

   When you're logged into paperless in your browser, you're
   automatically logged into the API as well and don't need to provide
   any authorization headers.

3. Token authentication

   Paperless also offers an endpoint to acquire authentication tokens.

   POST a username and password as a form or json string to
   `/api/token/` and paperless will respond with a token, if the login
   data is correct. This token can be used to authenticate other
   requests with the following HTTP header:

   ```
   Authorization: Token <token>
   ```

   Tokens can be managed and revoked in the paperless admin.

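For example, acquiring and then using a token from the command line could look
like this (a sketch; host and credentials are placeholders):

```bash
# Request a token by POSTing the login data as a form.
curl -d "username=paperless" -d "password=secret" \
  http://localhost:8000/api/token/

# Use the returned token to authenticate further requests.
curl -H "Authorization: Token <token>" http://localhost:8000/api/documents/
```
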
## Searching for documents

Full text searching is available on the `/api/documents/` endpoint. Two
specific query parameters cause the API to return full text search
results:

- `/api/documents/?query=your%20search%20query`: Search for a document
  using a full text query. For details on the syntax, see [Basic Usage - Searching](/usage#basic-usage_searching).
- `/api/documents/?more_like=1234`: Search for documents similar to
  the document with id 1234.

Pagination works exactly the same as it does for normal requests on this
endpoint.

Certain limitations apply to full text queries:

- Results are always sorted by search score. The results matching the
  query best will show up first.
- Only a small subset of filtering parameters are supported.

Furthermore, each returned document has an additional `__search_hit__`
attribute with various information about the search results:

```
{
  "count": 31,
  "next": "http://localhost:8000/api/documents/?page=2&query=test",
  "previous": null,
  "results": [

    ...

    {
      "id": 123,
      "title": "title",
      "content": "content",

      ...

      "__search_hit__": {
        "score": 0.343,
        "highlights": "text <span class=\"match\">Test</span> text",
        "rank": 23
      }
    },

    ...

  ]
}
```

- `score` is an indication of how well this document matches the query
  relative to the other search results.
- `highlights` is an excerpt from the document content and highlights
  the search terms with `<span>` tags as shown above.
- `rank` is the index of the search results. The first result will
  have rank 0.

### `/api/search/autocomplete/`

Get auto completions for a partial search term.

Query parameters:

- `term`: The incomplete term.
- `limit`: Amount of results. Defaults to 10.

Results returned by the endpoint are ordered by importance of the term
in the document index. The first result is the term that has the highest
[Tf/Idf](https://en.wikipedia.org/wiki/Tf%E2%80%93idf) score in the index.

```json
["term1", "term3", "term6", "term4"]
```

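Queried from the command line, this could look like the following (a sketch;
host and token are placeholders):

```bash
curl -H "Authorization: Token <token>" \
  "http://localhost:8000/api/search/autocomplete/?term=stat&limit=5"
```
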

## POSTing documents {#file-uploads}

The API provides a special endpoint for file uploads:

`/api/documents/post_document/`

POST a multipart form to this endpoint, where the form field `document`
contains the document that you want to upload to paperless. The filename
is sanitized and then used to store the document in a temporary
directory, and the consumer will be instructed to consume the document
from there.

The endpoint supports the following optional form fields:

- `title`: Specify a title that the consumer should use for the
  document.
- `created`: Specify the DateTime when the document was created (e.g.
  "2016-04-19" or "2016-04-19 06:15:00+02:00").
- `correspondent`: Specify the ID of a correspondent that the consumer
  should use for the document.
- `document_type`: Similar to correspondent.
- `tags`: Similar to correspondent. Specify this multiple times to
  have multiple tags added to the document.

The endpoint will immediately return "OK" if the document consumption
process was started successfully. No additional status information about
the consumption process itself is available, since that happens in a
different process.
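
As an illustration, a multipart upload setting a few of these fields might
look like this sketch (the file path, IDs, and token are placeholder values):

```python
import requests

with open("/tmp/scan.pdf", "rb") as f:  # placeholder file
    response = requests.post(
        "http://localhost:8000/api/documents/post_document/",
        headers={"Authorization": "Token your-api-token"},  # placeholder
        files={"document": f},       # the uploaded document itself
        data={
            "title": "Electricity bill",
            "created": "2016-04-19 06:15:00+02:00",
            "correspondent": 2,      # ID of an existing correspondent
            "tags": [7, 9],          # a list becomes a repeated form field
        },
    )
response.raise_for_status()
print(response.text)  # "OK" once consumption has been queued
```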

## API Versioning

The REST API is versioned since Paperless-ngx 1.3.0.

- Versioning ensures that changes to the API don't break older
  clients.
- Clients specify the version of the API they wish to use with every
  request and Paperless will handle the request using the specified
  API version.
- Even if the underlying data model changes, older API versions will
  always serve compatible data.
- If no version is specified, Paperless will serve version 1 to ensure
  compatibility with older clients that do not request a specific API
  version.

API versions are specified by submitting an additional HTTP `Accept`
header with every request:

```
Accept: application/json; version=6
```

If an invalid version is specified, Paperless 1.3.0 will respond with
"406 Not Acceptable" and an error message in the body. Earlier
versions of Paperless will serve API version 1 regardless of whether a
version is specified via the `Accept` header.

If a client wishes to verify whether it is compatible with any given
server, the following procedure should be performed (see the sketch
after this list):

1. Perform an _authenticated_ request against any API endpoint. If the
   server is on version 1.3.0 or newer, the server will add two custom
   headers to the response:

   ```
   X-Api-Version: 2
   X-Version: 1.3.0
   ```

2. Determine whether the client is compatible with this server based on
   the presence/absence of these headers and their values if present.
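
That check could be sketched in Python as follows (URL and token are
placeholders; the header values shown are examples):

```python
import requests

response = requests.get(
    "http://localhost:8000/api/documents/",      # any authenticated endpoint
    headers={
        "Authorization": "Token your-api-token",  # placeholder token
        "Accept": "application/json; version=2",
    },
)

api_version = response.headers.get("X-Api-Version")
if api_version is None:
    print("Server is older than 1.3.0; only API version 1 is available.")
else:
    print(f"Server {response.headers['X-Version']} speaks API version {api_version}.")
```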

### API Changelog

#### Version 1

Initial API version.

#### Version 2

- Added field `Tag.color`. This read/write string field contains a hex
  color such as `#a6cee3`.
- Added read-only field `Tag.text_color`. This field contains the text
  color to use for a specific tag, which is either black or white
  depending on the brightness of `Tag.color`.
- Removed field `Tag.colour`.
300
docs/api.rst
@@ -1,300 +0,0 @@
************
The REST API
************


Paperless makes use of the `Django REST Framework`_ standard API interface.
It provides a browsable API for most of its endpoints, which you can inspect
at ``http://<paperless-host>:<port>/api/``. This also documents most of the
available filters and ordering fields.

.. _Django REST Framework: http://django-rest-framework.org/

The API provides 5 main endpoints:

* ``/api/documents/``: Full CRUD support, except POSTing new documents. See below.
* ``/api/correspondents/``: Full CRUD support.
* ``/api/document_types/``: Full CRUD support.
* ``/api/logs/``: Read-Only.
* ``/api/tags/``: Full CRUD support.

All of these endpoints except for the logging endpoint
allow you to fetch, edit and delete individual objects
by appending their primary key to the path, for example ``/api/documents/454/``.

The objects served by the document endpoint contain the following fields:

* ``id``: ID of the document. Read-only.
* ``title``: Title of the document.
* ``content``: Plain text content of the document.
* ``tags``: List of IDs of tags assigned to this document, or empty list.
* ``document_type``: Document type of this document, or null.
* ``correspondent``: Correspondent of this document or null.
* ``created``: The date at which this document was created.
* ``modified``: The date at which this document was last edited in paperless. Read-only.
* ``added``: The date at which this document was added to paperless. Read-only.
* ``archive_serial_number``: The identifier of this document in a physical document archive.
* ``original_file_name``: Verbose filename of the original document. Read-only.
* ``archived_file_name``: Verbose filename of the archived document. Read-only. Null if no archived document is available.


Downloading documents
#####################

In addition to that, the document endpoint offers these additional actions on
individual documents:

* ``/api/documents/<pk>/download/``: Download the document.
* ``/api/documents/<pk>/preview/``: Display the document inline,
  without downloading it.
* ``/api/documents/<pk>/thumb/``: Download the PNG thumbnail of a document.

Paperless generates archived PDF/A documents from consumed files and stores both
the original files as well as the archived files. By default, the endpoints
for previews and downloads serve the archived file, if it is available.
Otherwise, the original file is served.
Some documents cannot be archived.

The endpoints correctly serve the response header fields ``Content-Disposition``
and ``Content-Type`` to indicate the filename for download and the type of content of
the document.

In order to download or preview the original document when an archived document is available,
supply the query parameter ``original=true``.

.. hint::

    Paperless used to provide this functionality at ``/fetch/<pk>/preview``,
    ``/fetch/<pk>/thumb`` and ``/fetch/<pk>/doc``. Redirects to the new URLs
    are in place. However, if you use these old URLs to access documents, you
    should update your app or script to use the new URLs.


Getting document metadata
#########################

The API also has an endpoint to retrieve read-only metadata about specific documents. This
information is not served along with the document objects, since it requires reading
files and would therefore slow down document lists considerably.

Access the metadata of a document with an ID ``id`` at ``/api/documents/<id>/metadata/``.

The endpoint reports the following data:

* ``original_checksum``: MD5 checksum of the original document.
* ``original_size``: Size of the original document, in bytes.
* ``original_mime_type``: Mime type of the original document.
* ``media_filename``: Current filename of the document, under which it is stored inside the media directory.
* ``has_archive_version``: True, if this document is archived, false otherwise.
* ``original_metadata``: A list of metadata associated with the original document. See below.
* ``archive_checksum``: MD5 checksum of the archived document, or null.
* ``archive_size``: Size of the archived document in bytes, or null.
* ``archive_metadata``: Metadata associated with the archived document, or null. See below.

File metadata is reported as a list of objects in the following form:

.. code:: json

    [
        {
            "namespace": "http://ns.adobe.com/pdf/1.3/",
            "prefix": "pdf",
            "key": "Producer",
            "value": "SparklePDF, Fancy edition"
        },
    ]

``namespace`` and ``prefix`` can be null. The actual metadata reported depends on the file type and the metadata
available in that specific document. Paperless only reports PDF metadata at this point.

Authorization
#############

The REST API provides three different forms of authentication.

1. Basic authentication

   Authorize by providing an HTTP header in the form

   .. code::

      Authorization: Basic <credentials>

   where ``credentials`` is a base64-encoded string of ``<username>:<password>``

2. Session authentication

   When you're logged into paperless in your browser, you're automatically
   logged into the API as well and don't need to provide any authorization
   headers.

3. Token authentication

   Paperless also offers an endpoint to acquire authentication tokens.

   POST a username and password as a form or json string to ``/api/token/``
   and paperless will respond with a token, if the login data is correct.
   This token can be used to authenticate other requests with the
   following HTTP header:

   .. code::

      Authorization: Token <token>

   Tokens can be managed and revoked in the paperless admin.

Searching for documents
#######################

Full text searching is available on the ``/api/documents/`` endpoint. Two specific
query parameters cause the API to return full text search results:

* ``/api/documents/?query=your%20search%20query``: Search for a document using a full text query.
  For details on the syntax, see :ref:`basic-usage_searching`.

* ``/api/documents/?more_like=1234``: Search for documents similar to the document with id 1234.

Pagination works exactly the same as it does for normal requests on this endpoint.

Certain limitations apply to full text queries:

* Results are always sorted by search score. The results matching the query best will show up first.

* Only a small subset of filtering parameters are supported.

Furthermore, each returned document has an additional ``__search_hit__`` attribute with various information
about the search results:

.. code::

    {
        "count": 31,
        "next": "http://localhost:8000/api/documents/?page=2&query=test",
        "previous": null,
        "results": [

            ...

            {
                "id": 123,
                "title": "title",
                "content": "content",

                ...

                "__search_hit__": {
                    "score": 0.343,
                    "highlights": "text <span class=\"match\">Test</span> text",
                    "rank": 23
                }
            },

            ...

        ]
    }

* ``score`` is an indication how well this document matches the query relative to the other search results.
* ``highlights`` is an excerpt from the document content and highlights the search terms with ``<span>`` tags as shown above.
* ``rank`` is the index of the search results. The first result will have rank 0.

``/api/search/autocomplete/``
=============================

Get auto completions for a partial search term.

Query parameters:

* ``term``: The incomplete term.
* ``limit``: Amount of results. Defaults to 10.

Results returned by the endpoint are ordered by importance of the term in the
document index. The first result is the term that has the highest Tf/Idf score
in the index.

.. code:: json

    [
        "term1",
        "term3",
        "term6",
        "term4"
    ]


.. _api-file_uploads:

POSTing documents
#################

The API provides a special endpoint for file uploads:

``/api/documents/post_document/``

POST a multipart form to this endpoint, where the form field ``document`` contains
the document that you want to upload to paperless. The filename is sanitized and
then used to store the document in a temporary directory, and the consumer will
be instructed to consume the document from there.

The endpoint supports the following optional form fields:

* ``title``: Specify a title that the consumer should use for the document.
* ``correspondent``: Specify the ID of a correspondent that the consumer should use for the document.
* ``document_type``: Similar to correspondent.
* ``tags``: Similar to correspondent. Specify this multiple times to have multiple tags added
  to the document.

The endpoint will immediately return "OK" if the document consumption process
was started successfully. No additional status information about the consumption
process itself is available, since that happens in a different process.


.. _api-versioning:

API Versioning
##############

The REST API is versioned since Paperless-ngx 1.3.0.

* Versioning ensures that changes to the API don't break older clients.
* Clients specify the specific version of the API they wish to use with every request and Paperless will handle the request using the specified API version.
* Even if the underlying data model changes, older API versions will always serve compatible data.
* If no version is specified, Paperless will serve version 1 to ensure compatibility with older clients that do not request a specific API version.

API versions are specified by submitting an additional HTTP ``Accept`` header with every request:

.. code::

    Accept: application/json; version=6

If an invalid version is specified, Paperless 1.3.0 will respond with "406 Not Acceptable" and an error message in the body.
Earlier versions of Paperless will serve API version 1 regardless of whether a version is specified via the ``Accept`` header.

If a client wishes to verify whether it is compatible with any given server, the following procedure should be performed:

1. Perform an *authenticated* request against any API endpoint. If the server is on version 1.3.0 or newer, the server will
   add two custom headers to the response:

   .. code::

      X-Api-Version: 2
      X-Version: 1.3.0

2. Determine whether the client is compatible with this server based on the presence/absence of these headers and their values if present.


API Changelog
=============

Version 1
---------

Initial API version.

Version 2
---------

* Added field ``Tag.color``. This read/write string field contains a hex color such as ``#a6cee3``.
* Added read-only field ``Tag.text_color``. This field contains the text color to use for a specific tag, which is either black or white depending on the brightness of ``Tag.color``.
* Removed field ``Tag.colour``.
36
docs/assets/extra.css
Normal file
@@ -0,0 +1,36 @@
:root > * {
  --md-primary-fg-color: #17541f;
  --md-primary-fg-color--dark: #17541f;
  --md-primary-fg-color--light: #17541f;
  --md-accent-fg-color: #2b8a38;
  --md-typeset-a-color: #21652a;
}

[data-md-color-scheme="slate"] {
  --md-hue: 222;
}

@media (min-width: 400px) {
  .grid-left {
    width: 33%;
    float: left;
  }
  .grid-right {
    width: 62%;
    margin-left: 4%;
    float: left;
  }
}

.grid-left > p {
  margin-bottom: 2rem;
}


.grid-right p {
  margin: 0;
}

.index-callout {
  margin-right: .5rem;
}
BIN
docs/assets/favicon.png
Normal file
After: 768 B
12
docs/assets/logo.svg
Normal file
@@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
	 viewBox="0 0 1000 1000" style="enable-background:new 0 0 1000 1000;" xml:space="preserve">
<style type="text/css">
	.st0{fill:#FFFFFF;}
</style>
<path class="st0" d="M299,891.7c-4.2-19.8-12.5-59.6-13.6-59.6c-176.7-105.7-155.8-288.7-97.3-393.4
	c12.5,131.8,245.8,222.8,109.8,383.9c-1.1,2,6.2,27.2,12.5,50.2c27.2-46,68-101.4,65.8-106.7C208.9,358.2,731.9,326.9,840.6,73.7
	c49.1,244.8-25.1,623.5-445.5,719.7c-2,1.1-76.3,131.8-79.5,132.9c0-2-31.4-1.1-27.2-11.5C290.7,908.4,294.8,900.1,299,891.7
	L299,891.7z M293.8,793.4c53.3-61.8-9.4-167.4-47.1-201.9C310.5,701.3,306.3,765.1,293.8,793.4L293.8,793.4z"/>
</svg>
After: 869 B
68
docs/assets/logo_full_black.svg
Normal file
@@ -0,0 +1,68 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
	 viewBox="0 0 2962.2 860.2" style="enable-background:new 0 0 2962.2 860.2;" xml:space="preserve">
<style type="text/css">
	.st0{fill:#17541F;stroke:#000000;stroke-miterlimit:10;}
</style>
<!-- [vector path data: full Paperless-ngx wordmark and leaf logo, dark variant] -->
</svg>
After: 6.3 KiB
69
docs/assets/logo_full_white.svg
Normal file
@@ -0,0 +1,69 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
	 viewBox="0 0 2962.2 860.2" style="enable-background:new 0 0 2962.2 860.2;" xml:space="preserve">
<style type="text/css">
	.st0{fill:#FFFFFF;stroke:#000000;stroke-miterlimit:10;}
	.st1{fill:#17541F;stroke:#000000;stroke-miterlimit:10;}
</style>
<!-- [vector path data: full Paperless-ngx wordmark and leaf logo, light variant] -->
</svg>
After: 6.5 KiB
Before: 67 KiB → After: 67 KiB
Before: 661 KiB → After: 661 KiB
Before: 457 KiB → After: 457 KiB
Before: 436 KiB → After: 436 KiB
Before: 462 KiB → After: 462 KiB
Before: 608 KiB → After: 608 KiB
Before: 698 KiB → After: 698 KiB
Before: 706 KiB → After: 706 KiB
Before: 480 KiB → After: 480 KiB
Before: 680 KiB → After: 680 KiB
Before: 686 KiB → After: 686 KiB
Before: 848 KiB → After: 848 KiB
Before: 703 KiB → After: 703 KiB
BIN
docs/assets/screenshots/mail-rules-edited.png
Normal file
After: 76 KiB
Before: 388 KiB → After: 388 KiB
Before: 26 KiB → After: 26 KiB
Before: 54 KiB → After: 54 KiB
Before: 517 KiB → After: 517 KiB
2588
docs/changelog.md
Normal file
1787
docs/changelog.rst
333
docs/conf.py
@@ -1,333 +0,0 @@
import sphinx_rtd_theme


__version__ = None
__full_version_str__ = None
__major_minor_version_str__ = None
exec(open("../src/paperless/version.py").read())


extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.intersphinx",
    "sphinx.ext.todo",
    "sphinx.ext.imgmath",
    "sphinx.ext.viewcode",
    "sphinx_rtd_theme",
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

# The suffix of source filenames.
source_suffix = ".rst"

# The encoding of source files.
# source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = "index"

# General information about the project.
project = "Paperless-ngx"
copyright = "2015-2022, Daniel Quinn, Jonas Winkler, and the paperless-ngx team"

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
#
# If the build process ever explodes here, it's because you've set the version
# number in paperless.version to a tuple with 3 numbers in it.
#

# The short X.Y version.
version = __major_minor_version_str__
# The full version, including alpha/beta/rc tags.
release = __full_version_str__

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
# language = None

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
# today = ''
# Else, today_fmt is used as the format for a strftime call.
# today_fmt = '%B %d, %Y'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ["_build"]

# The reST default role (used for this markup: `text`) to use for all
# documents.
# default_role = None

# If true, '()' will be appended to :func: etc. cross-reference text.
# add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
# add_module_names = True

# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
# show_authors = False

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "sphinx"

# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []

# If true, keep warnings as "system message" paragraphs in the built documents.
# keep_warnings = False


# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = "sphinx_rtd_theme"

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
# html_theme_options = {}

# Add any paths that contain custom themes here, relative to this directory.
html_theme_path = []

# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
# html_title = None

# A shorter title for the navigation bar. Default is the same as html_title.
# html_short_title = None

# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
# html_logo = None

# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
# html_favicon = None

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]

# These paths are either relative to html_static_path
# or fully qualified paths (eg. https://...)
html_css_files = [
    "css/custom.css",
]

html_js_files = [
    "js/darkmode.js",
]

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
# html_extra_path = []

# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
# html_last_updated_fmt = '%b %d, %Y'

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
# html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
# html_sidebars = {}

# Additional templates that should be rendered to pages, maps page names to
# template names.
# html_additional_pages = {}

# If false, no module index is generated.
# html_domain_indices = True

# If false, no index is generated.
# html_use_index = True

# If true, the index is split into individual pages for each letter.
# html_split_index = False

# If true, links to the reST sources are added to the pages.
# html_show_sourcelink = True

# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
# html_show_sphinx = True

# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
# html_show_copyright = True

# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
# html_use_opensearch = ''

# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None

# Output file base name for HTML help builder.
htmlhelp_basename = "paperless"

# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
    # The paper size ('letterpaper' or 'a4paper').
    #'papersize': 'letterpaper',
    # The font size ('10pt', '11pt' or '12pt').
    #'pointsize': '10pt',
    # Additional stuff for the LaTeX preamble.
    #'preamble': '',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
    ("index", "paperless.tex", "Paperless Documentation", "Daniel Quinn", "manual"),
]

# The name of an image file (relative to this directory) to place at the top of
# the title page.
# latex_logo = None

# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
# latex_use_parts = False

# If true, show page references after internal links.
# latex_show_pagerefs = False

# If true, show URL addresses after external links.
# latex_show_urls = False

# Documents to append as an appendix to all manuals.
# latex_appendices = []

# If false, no module index is generated.
# latex_domain_indices = True


# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [("index", "paperless", "Paperless Documentation", ["Daniel Quinn"], 1)]

# If true, show URL addresses after external links.
# man_show_urls = False


# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
    (
        "index",
        "Paperless",
        "Paperless Documentation",
        "Daniel Quinn",
        "paperless",
        "Scan, index, and archive all of your paper documents.",
        "Miscellaneous",
    ),
]

# Documents to append as an appendix to all manuals.
# texinfo_appendices = []

# If false, no module index is generated.
# texinfo_domain_indices = True

# How to display URL addresses: 'footnote', 'no', or 'inline'.
# texinfo_show_urls = 'footnote'

# If true, do not generate a @detailmenu in the "Top" node's menu.
# texinfo_no_detailmenu = False


# -- Options for Epub output ----------------------------------------------

# Bibliographic Dublin Core info.
epub_title = "Paperless"
epub_author = "Daniel Quinn"
epub_publisher = "Daniel Quinn"
epub_copyright = "2015, Daniel Quinn"

# The basename for the epub file. It defaults to the project name.
# epub_basename = u'Paperless'

# The HTML theme for the epub output. Since the default themes are not optimized
# for small screen space, using the same theme for HTML and epub output is
# usually not wise. This defaults to 'epub', a theme designed to save visual
# space.
# epub_theme = 'epub'

# The language of the text. It defaults to the language option
# or en if the language is not set.
# epub_language = ''

# The scheme of the identifier. Typical schemes are ISBN or URL.
# epub_scheme = ''

# The unique identifier of the text. This can be a ISBN number
# or the project homepage.
# epub_identifier = ''

# A unique identification for the text.
# epub_uid = ''

# A tuple containing the cover image and cover page html template filenames.
# epub_cover = ()

# A sequence of (type, uri, title) tuples for the guide element of content.opf.
# epub_guide = ()

# HTML files that should be inserted before the pages created by sphinx.
# The format is a list of tuples containing the path and title.
# epub_pre_files = []

# HTML files that should be inserted after the pages created by sphinx.
# The format is a list of tuples containing the path and title.
# epub_post_files = []

# A list of files that should not be packed into the epub file.
epub_exclude_files = ["search.html"]

# The depth of the table of contents in toc.ncx.
# epub_tocdepth = 3

# Allow duplicate toc entries.
# epub_tocdup = True

# Choose between 'default' and 'includehidden'.
# epub_tocscope = 'default'

# Fix unsupported image types using the PIL.
# epub_fix_images = False

# Scale large images.
# epub_max_image_width = 0

# How to display URL addresses: 'footnote', 'no', or 'inline'.
# epub_show_urls = 'inline'

# If false, no index is generated.
# epub_use_index = True


# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {"http://docs.python.org/": None}
1119
docs/configuration.md
Normal file
851
docs/configuration.rst
@@ -1,851 +0,0 @@
|
|||||||
.. _configuration:
|
|
||||||
|
|
||||||
*************
|
|
||||||
Configuration
|
|
||||||
*************
|
|
||||||
|
|
||||||
Paperless provides a wide range of customizations.
|
|
||||||
Depending on how you run paperless, these settings have to be defined in different
|
|
||||||
places.
|
|
||||||
|
|
||||||
* If you run paperless on docker, ``paperless.conf`` is not used. Rather, configure
|
|
||||||
paperless by copying necessary options to ``docker-compose.env``.
|
|
||||||
* If you are running paperless on anything else, paperless will search for the
|
|
||||||
configuration file in these locations and use the first one it finds:
|
|
||||||
|
|
||||||
.. code::
|
|
||||||
|
|
||||||
/path/to/paperless/paperless.conf
|
|
||||||
/etc/paperless.conf
|
|
||||||
/usr/local/etc/paperless.conf
|
|
||||||
|
|
||||||
|
|
||||||
Required services
|
|
||||||
#################
|
|
||||||
|
|
||||||
PAPERLESS_REDIS=<url>
|
|
||||||
This is required for processing scheduled tasks such as email fetching, index
|
|
||||||
optimization and for training the automatic document matcher.
|
|
||||||
|
|
||||||
Defaults to redis://localhost:6379.
|
|
||||||
|
|
||||||
PAPERLESS_DBHOST=<hostname>
|
|
||||||
By default, sqlite is used as the database backend. This can be changed here.
|
|
||||||
Set PAPERLESS_DBHOST and PostgreSQL will be used instead of mysql.
|
|
||||||
|
|
||||||
PAPERLESS_DBPORT=<port>
|
|
||||||
Adjust port if necessary.
|
|
||||||
|
|
||||||
Default is 5432.
|
|
||||||
|
|
||||||
PAPERLESS_DBNAME=<name>
|
|
||||||
Database name in PostgreSQL.
|
|
||||||
|
|
||||||
Defaults to "paperless".
|
|
||||||
|
|
||||||
PAPERLESS_DBUSER=<name>
|
|
||||||
Database user in PostgreSQL.
|
|
||||||
|
|
||||||
Defaults to "paperless".
|
|
||||||
|
|
||||||
PAPERLESS_DBPASS=<password>
|
|
||||||
Database password for PostgreSQL.
|
|
||||||
|
|
||||||
Defaults to "paperless".
|
|
||||||
|
|
||||||
PAPERLESS_DBSSLMODE=<mode>
|
|
||||||
SSL mode to use when connecting to PostgreSQL.
|
|
||||||
|
|
||||||
See `the official documentation about sslmode <https://www.postgresql.org/docs/current/libpq-ssl.html>`_.
|
|
||||||
|
|
||||||
Default is ``prefer``.
|
|
||||||
|
|
||||||
Paths and folders
|
|
||||||
#################
|
|
||||||
|
|
||||||
PAPERLESS_CONSUMPTION_DIR=<path>
|
|
||||||
This where your documents should go to be consumed. Make sure that it exists
|
|
||||||
and that the user running the paperless service can read/write its contents
|
|
||||||
before you start Paperless.
|
|
||||||
|
|
||||||
Don't change this when using docker, as it only changes the path within the
|
|
||||||
container. Change the local consumption directory in the docker-compose.yml
|
|
||||||
file instead.
|
|
||||||
|
|
||||||
Defaults to "../consume/", relative to the "src" directory.
|
|
||||||
|
|
||||||
PAPERLESS_DATA_DIR=<path>
|
|
||||||
This is where paperless stores all its data (search index, SQLite database,
|
|
||||||
classification model, etc).
|
|
||||||
|
|
||||||
Defaults to "../data/", relative to the "src" directory.
|
|
||||||
|
|
||||||
PAPERLESS_TRASH_DIR=<path>
|
|
||||||
Instead of removing deleted documents, they are moved to this directory.
|
|
||||||
|
|
||||||
This must be writeable by the user running paperless. When running inside
|
|
||||||
docker, ensure that this path is within a permanent volume (such as
|
|
||||||
"../media/trash") so it won't get lost on upgrades.
|
|
||||||
|
|
||||||
Defaults to empty (i.e. really delete documents).
|
|
||||||
|
|
||||||
PAPERLESS_MEDIA_ROOT=<path>
|
|
||||||
This is where your documents and thumbnails are stored.
|
|
||||||
|
|
||||||
You can set this and PAPERLESS_DATA_DIR to the same folder to have paperless
|
|
||||||
store all its data within the same volume.
|
|
||||||
|
|
||||||
Defaults to "../media/", relative to the "src" directory.
|
|
||||||
|
|
||||||
PAPERLESS_STATICDIR=<path>
|
|
||||||
Override the default STATIC_ROOT here. This is where all static files
|
|
||||||
created using "collectstatic" manager command are stored.
|
|
||||||
|
|
||||||
Unless you're doing something fancy, there is no need to override this.
|
|
||||||
|
|
||||||
Defaults to "../static/", relative to the "src" directory.
|
|
||||||
|
|
||||||
PAPERLESS_FILENAME_FORMAT=<format>
|
|
||||||
Changes the filenames paperless uses to store documents in the media directory.
|
|
||||||
See :ref:`advanced-file_name_handling` for details.
|
|
||||||
|
|
||||||
Default is none, which disables this feature.
|
|
||||||
|
|
||||||
PAPERLESS_LOGGING_DIR=<path>
|
|
||||||
This is where paperless will store log files.
|
|
||||||
|
|
||||||
Defaults to "``PAPERLESS_DATA_DIR``/log/".
|
|
||||||
|
|
||||||
|
|
||||||
Logging
|
|
||||||
#######
|
|
||||||
|
|
||||||
PAPERLESS_LOGROTATE_MAX_SIZE=<num>
|
|
||||||
Maximum file size for log files before they are rotated, in bytes.
|
|
||||||
|
|
||||||
Defaults to 1 MiB.
|
|
||||||
|
|
||||||
PAPERLESS_LOGROTATE_MAX_BACKUPS=<num>
|
|
||||||
Number of rotated log files to keep.
|
|
||||||
|
|
||||||
Defaults to 20.
|
|
||||||
|
|
||||||
.. _hosting-and-security:
|
|
||||||
|
|
||||||
Hosting & Security
|
|
||||||
##################
|
|
||||||
|
|
||||||
PAPERLESS_SECRET_KEY=<key>
|
|
||||||
Paperless uses this to make session tokens. If you expose paperless on the
|
|
||||||
internet, you need to change this, since the default secret is well known.
|
|
||||||
|
|
||||||
Use any sequence of characters. The more, the better. You don't need to
|
|
||||||
remember this. Just face-roll your keyboard.
|
|
||||||
|
|
||||||
Default is listed in the file ``src/paperless/settings.py``.
|
|
||||||
|
|
||||||
PAPERLESS_URL=<url>
|
|
||||||
This setting can be used to set the three options below (ALLOWED_HOSTS,
|
|
||||||
CORS_ALLOWED_HOSTS and CSRF_TRUSTED_ORIGINS). If the other options are
|
|
||||||
set the values will be combined with this one. Do not include a trailing
|
|
||||||
slash. E.g. https://paperless.domain.com
|
|
||||||
|
|
||||||
Defaults to empty string, leaving the other settings unaffected.

PAPERLESS_CSRF_TRUSTED_ORIGINS=<comma-separated-list>
    A list of trusted origins for unsafe requests (e.g. POST). As of Django 4.0
    this is required to access the Django admin via the web.
    See https://docs.djangoproject.com/en/4.0/ref/settings/#csrf-trusted-origins

    Can also be set using PAPERLESS_URL (see above).

    Defaults to the empty string, which does not add any origins to the trusted list.

PAPERLESS_ALLOWED_HOSTS=<comma-separated-list>
    If you're planning on putting Paperless on the open internet, then you
    really should set this value to the domain name you're using. Failing to do
    so leaves you open to HTTP host header attacks:
    https://docs.djangoproject.com/en/3.1/topics/security/#host-header-validation

    Just remember that this is a comma-separated list, so "example.com" is fine,
    as is "example.com,www.example.com", but NOT " example.com" or "example.com,".

    Can also be set using PAPERLESS_URL (see above).

    If manually set, please remember to include "localhost". Otherwise the docker
    healthcheck will fail.

    Defaults to "*", which is all hosts.

PAPERLESS_CORS_ALLOWED_HOSTS=<comma-separated-list>
    You need to add your servers to the list of allowed hosts that can do CORS
    calls. Set this to your public domain name.

    Can also be set using PAPERLESS_URL (see above).

    Defaults to "http://localhost:8000".

PAPERLESS_FORCE_SCRIPT_NAME=<path>
    To host paperless under a subpath url like example.com/paperless, set
    this value to /paperless. No trailing slash!

    Defaults to none, which hosts paperless at "/".

PAPERLESS_STATIC_URL=<path>
    Override the STATIC_URL here. Unless you're hosting Paperless under a
    subpath like /paperless/, you probably don't need to change this.

    Defaults to "/static/".

PAPERLESS_AUTO_LOGIN_USERNAME=<username>
    Specify a username here so that paperless will automatically perform login
    with the selected user.

    .. danger::

        Do not use this when exposing paperless on the internet. There are no
        checks in place that would prevent you from doing this.

    Defaults to none, which disables this feature.

PAPERLESS_ADMIN_USER=<username>
    If this environment variable is specified, Paperless automatically creates
    a superuser with the provided username at start. This is useful in cases
    where you cannot run the `createsuperuser` command separately, such as Kubernetes
    or AWS ECS.

    Requires `PAPERLESS_ADMIN_PASSWORD` to be set.

    .. note::

        This will not change an existing [super]user's password, nor will
        it recreate a user that already exists. You can leave this throughout
        the lifecycle of the containers.

PAPERLESS_ADMIN_MAIL=<email>
    (Optional) Specify the superuser email address. Only used when
    `PAPERLESS_ADMIN_USER` is set.

    Defaults to ``root@localhost``.

PAPERLESS_ADMIN_PASSWORD=<password>
    Only used when `PAPERLESS_ADMIN_USER` is set.
    This will be the password of the automatically created superuser.
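
    For example, a first start on a fresh instance might use placeholder
    values like these (all three values are only illustrations):

    .. code:: bash

        PAPERLESS_ADMIN_USER=admin
        PAPERLESS_ADMIN_MAIL=admin@example.com
        PAPERLESS_ADMIN_PASSWORD=change-me-immediately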

PAPERLESS_COOKIE_PREFIX=<str>
    Specify a prefix that is added to the cookies used by paperless to identify
    the currently logged in user. This is useful when you're running two
    instances of paperless on the same host.

    After changing this, you will have to log in again.

    Defaults to ``""``, which does not alter the cookie names.
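
    For example, two instances on one host could be kept apart like this
    (the prefix values are arbitrary):

    .. code:: bash

        # instance 1
        PAPERLESS_COOKIE_PREFIX=paperless1_
        # instance 2
        PAPERLESS_COOKIE_PREFIX=paperless2_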

PAPERLESS_ENABLE_HTTP_REMOTE_USER=<bool>
    Allows authentication via HTTP_REMOTE_USER, which is used by some SSO
    applications.

    .. warning::

        This will allow authentication by simply adding a ``Remote-User: <username>`` header
        to a request. Use with care! You especially *must* ensure that any such header is not
        passed from your proxy server to paperless.

        If you're exposing paperless to the internet directly, do not use this.

        Also see the warning `in the official documentation <https://docs.djangoproject.com/en/3.1/howto/auth-remote-user/#configuration>`_.

    Defaults to `false`, which disables this feature.

PAPERLESS_HTTP_REMOTE_USER_HEADER_NAME=<str>
    If `PAPERLESS_ENABLE_HTTP_REMOTE_USER` is enabled, this property allows you to
    customize the name of the HTTP header from which the authenticated username
    is extracted. Values are in terms of
    `HttpRequest.META <https://docs.djangoproject.com/en/3.1/ref/request-response/#django.http.HttpRequest.META>`_.
    Thus, the configured value must start with `HTTP_` followed by the
    normalized actual header name.

    Defaults to `HTTP_REMOTE_USER`.
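
    As a sketch of the normalization (``X-Forwarded-User`` is just an example
    header that some proxies use): the header name is upper-cased, dashes
    become underscores, and ``HTTP_`` is prepended:

    .. code:: bash

        # proxy sends:  X-Forwarded-User: jane
        PAPERLESS_HTTP_REMOTE_USER_HEADER_NAME=HTTP_X_FORWARDED_USER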

PAPERLESS_LOGOUT_REDIRECT_URL=<str>
    URL to redirect the user to after a logout. This can be used together with
    `PAPERLESS_ENABLE_HTTP_REMOTE_USER` to redirect the user back to the SSO
    application's logout page.

    Defaults to None, which disables this feature.

.. _configuration-ocr:

OCR settings
############

Paperless uses `OCRmyPDF <https://ocrmypdf.readthedocs.io/en/latest/>`_ for
performing OCR on documents and images. Paperless uses sensible defaults for
most settings, but all of them can be configured to your needs.

PAPERLESS_OCR_LANGUAGE=<lang>
    Customize the language that paperless will attempt to use when
    parsing documents.

    It should be a 3-letter language code consistent with ISO
    639: https://www.loc.gov/standards/iso639-2/php/code_list.php

    Set this to the language most of your documents are written in.

    This can be a combination of multiple languages such as ``deu+eng``,
    in which case tesseract will use whatever language matches best.
    Keep in mind that tesseract uses much more CPU time with multiple
    languages enabled.

    Defaults to "eng".

    Note: If your language contains a '-' such as chi-sim, you must use chi_sim.
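
    For example (the two lines show alternative single settings, not a pair
    to use together):

    .. code:: bash

        # German documents with occasional English ones mixed in
        PAPERLESS_OCR_LANGUAGE=deu+eng
        # Simplified Chinese: note the underscore instead of the dash
        PAPERLESS_OCR_LANGUAGE=chi_sim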

PAPERLESS_OCR_MODE=<mode>
    Tell paperless when and how to perform OCR on your documents. Four modes
    are available:

    * ``skip``: Paperless skips all pages and will perform OCR only on pages
      where no text is present. This is the safest option.
    * ``skip_noarchive``: In addition to skip, paperless won't create an
      archived version of your documents when it finds any text in them.
      This is useful if you don't want to have two almost-identical versions
      of your digital documents in the media folder. This is the fastest option.
    * ``redo``: Paperless will OCR all pages of your documents and attempt to
      replace any existing text layers with new text. This will be useful for
      documents from scanners that already performed OCR with insufficient
      results. It will also perform OCR on purely digital documents.

      This option may fail on some documents that have features that cannot
      be removed, such as forms. In this case, the text from the document is
      used instead.
    * ``force``: Paperless rasterizes your documents, converting any text
      into images and puts the OCRed text on top. This works for all documents,
      however, the resulting document may be significantly larger and text
      won't appear as sharp when zoomed in.

    The default is ``skip``, which only performs OCR when necessary and always
    creates archived documents.

    Read more about this in the `OCRmyPDF documentation <https://ocrmypdf.readthedocs.io/en/latest/advanced.html#when-ocr-is-skipped>`_.

PAPERLESS_OCR_CLEAN=<mode>
    Tells paperless to use ``unpaper`` to clean any input document before
    sending it to tesseract. This uses more resources, but generally results
    in better OCR results. The following modes are available:

    * ``clean``: Apply unpaper.
    * ``clean-final``: Apply unpaper, and use the cleaned images to build the
      output file instead of the original images.
    * ``none``: Do not apply unpaper.

    Defaults to ``clean``.

    .. note::

        ``clean-final`` is incompatible with OCR mode ``redo``. When both
        ``clean-final`` and the OCR mode ``redo`` are configured, ``clean``
        is used instead.

PAPERLESS_OCR_DESKEW=<bool>
    Tells paperless to correct skewing (slight rotation of input images mainly
    due to improper scanning).

    Defaults to ``true``, which enables this feature.

    .. note::

        Deskewing is incompatible with OCR mode ``redo``. Deskewing will get
        disabled automatically if ``redo`` is used as the OCR mode.

PAPERLESS_OCR_ROTATE_PAGES=<bool>
    Tells paperless to correct page rotation (90°, 180° and 270° rotation).

    If you notice that paperless is not rotating incorrectly rotated
    pages (or vice versa), try adjusting the threshold up or down (see below).

    Defaults to ``true``, which enables this feature.

PAPERLESS_OCR_ROTATE_PAGES_THRESHOLD=<num>
    Adjust the threshold for automatic page rotation by ``PAPERLESS_OCR_ROTATE_PAGES``.
    This is an arbitrary value reported by tesseract. "15" is a very conservative value,
    whereas "2" is a very aggressive option and will often result in correctly rotated pages
    being rotated as well.

    Defaults to "12".

PAPERLESS_OCR_OUTPUT_TYPE=<type>
    Specify the type of PDF documents that paperless should produce.

    * ``pdf``: Modify the PDF document as little as possible.
    * ``pdfa``: Convert PDF documents into PDF/A-2b documents, which is a
      subset of the entire PDF specification and meant for storing
      documents long term.
    * ``pdfa-1``, ``pdfa-2``, ``pdfa-3`` to specify the exact version of
      PDF/A you wish to use.

    If not specified, ``pdfa`` is used. Remember that paperless also keeps
    the original input file as well as the archived version.

PAPERLESS_OCR_PAGES=<num>
    Tells paperless to use only the specified amount of pages for OCR. Documents
    with less than the specified amount of pages get OCR'ed completely.

    Specifying 1 here will only use the first page.

    When combined with ``PAPERLESS_OCR_MODE=redo`` or ``PAPERLESS_OCR_MODE=force``,
    paperless will not modify any text it finds on excluded pages and will copy it
    verbatim.

    Defaults to 0, which disables this feature and always uses all pages.

PAPERLESS_OCR_IMAGE_DPI=<num>
    Paperless will OCR any images you put into the system and convert them
    into PDF documents. This is useful if your scanner produces images.
    In order to do so, paperless needs to know the DPI of the image.
    Most images from scanners will have this information embedded and
    paperless will detect and use that information. In case this fails, it
    uses this value as a fallback.

    Set this to the DPI your scanner produces images at.

    Default is none, which will automatically calculate image DPI so that
    the produced PDF documents are A4 sized.

PAPERLESS_OCR_MAX_IMAGE_PIXELS=<num>
    Paperless will not OCR images that have more pixels than this limit.
    This is intended to prevent decompression bombs from overloading paperless.
    Increasing this limit is desired if you face a DecompressionBombError despite
    the concerning file not being malicious; this could e.g. be caused by invalidly
    recognized metadata.
    If you have enough resources or if you are certain that your uploaded files
    are not malicious you can increase this value to your needs.

    The default value is 256000000; an image with more pixels than that will not be parsed.

PAPERLESS_OCR_USER_ARGS=<json>
    OCRmyPDF offers many more options. Use this parameter to specify any
    additional arguments you wish to pass to OCRmyPDF. Since Paperless uses
    the API of OCRmyPDF, you have to specify these in a format that can be
    passed to the API. See `the API reference of OCRmyPDF <https://ocrmypdf.readthedocs.io/en/latest/api.html#reference>`_
    for valid parameters. All command line options are supported, but they
    use underscores instead of dashes.

    .. caution::

        Paperless has been tested to work with the OCR options provided
        above. There are many options that are incompatible with each other,
        so specifying invalid options may prevent paperless from consuming
        any documents.

    Specify arguments as a JSON dictionary. Keep note of lower case booleans
    and double quoted parameter names and strings. Example:

    .. code:: json

        {"deskew": true, "optimize": 3, "unpaper_args": "--pre-rotate 90"}

.. _configuration-tika:

Tika settings
#############

Paperless can make use of `Tika <https://tika.apache.org/>`_ and
`Gotenberg <https://gotenberg.dev/>`_ for parsing and
converting "Office" documents (such as ".doc", ".xlsx" and ".odt"). If you
wish to use this, you must provide a Tika server and a Gotenberg server,
configure their endpoints, and enable the feature.

PAPERLESS_TIKA_ENABLED=<bool>
    Enable (or disable) the Tika parser.

    Defaults to false.

PAPERLESS_TIKA_ENDPOINT=<url>
    Set the endpoint URL where Paperless can reach your Tika server.

    Defaults to "http://localhost:9998".

PAPERLESS_TIKA_GOTENBERG_ENDPOINT=<url>
    Set the endpoint URL where Paperless can reach your Gotenberg server.

    Defaults to "http://localhost:3000".

If you run paperless on docker, you can add those services to the docker-compose
file (see the provided ``docker-compose.sqlite-tika.yml`` file for reference). The
required changes are as follows:

.. code:: yaml

    services:
      # ...

      webserver:
        # ...

        environment:
          # ...

          PAPERLESS_TIKA_ENABLED: 1
          PAPERLESS_TIKA_GOTENBERG_ENDPOINT: http://gotenberg:3000
          PAPERLESS_TIKA_ENDPOINT: http://tika:9998

      # ...

      gotenberg:
        image: gotenberg/gotenberg:7.4
        restart: unless-stopped
        command:
          - "gotenberg"
          - "--chromium-disable-routes=true"

      tika:
        image: ghcr.io/paperless-ngx/tika:latest
        restart: unless-stopped

Add the configuration variables to the environment of the webserver (alternatively
put the configuration in the ``docker-compose.env`` file) and add the additional
services below the webserver service. Watch out for indentation.

Make sure to use the correct format `PAPERLESS_TIKA_ENABLED = 1` so python_dotenv can parse the statement correctly.

Software tweaks
###############

PAPERLESS_TASK_WORKERS=<num>
    Paperless does multiple things in the background: Maintain the search index,
    maintain the automatic matching algorithm, check emails, consume documents,
    etc. This variable specifies how many things it will do in parallel.

PAPERLESS_THREADS_PER_WORKER=<num>
    Furthermore, paperless uses multiple threads when consuming documents to
    speed up OCR. This variable specifies how many pages paperless will process
    in parallel on a single document.

    .. caution::

        Ensure that the product

            PAPERLESS_TASK_WORKERS * PAPERLESS_THREADS_PER_WORKER

        does not exceed your CPU core count or else paperless will be extremely slow.
        If you want paperless to process many documents in parallel, choose a high
        worker count. If you want paperless to process very large documents faster,
        use a higher thread per worker count.

    The default is a balance between the two, according to your CPU core count,
    with a slight favor towards threads per worker:

    +----------------+---------+---------+
    | CPU core count | Workers | Threads |
    +----------------+---------+---------+
    | 1              | 1       | 1       |
    +----------------+---------+---------+
    | 2              | 2       | 1       |
    +----------------+---------+---------+
    | 4              | 2       | 2       |
    +----------------+---------+---------+
    | 6              | 2       | 3       |
    +----------------+---------+---------+
    | 8              | 2       | 4       |
    +----------------+---------+---------+
    | 12             | 3       | 4       |
    +----------------+---------+---------+
    | 16             | 4       | 4       |
    +----------------+---------+---------+

    If you only specify PAPERLESS_TASK_WORKERS, paperless will adjust
    PAPERLESS_THREADS_PER_WORKER automatically.
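
    For example, on an 8-core machine you could favor parallel consumption of
    many small documents over single-document speed (the product 4 * 2 still
    stays within the 8 available cores):

    .. code:: bash

        PAPERLESS_TASK_WORKERS=4
        PAPERLESS_THREADS_PER_WORKER=2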

PAPERLESS_WORKER_TIMEOUT=<num>
    Machines with few or weak cores might not be able to finish OCR on
    large documents within the default 1800 seconds, so extending this timeout
    may prove to be useful on weak hardware setups.

PAPERLESS_WORKER_RETRY=<num>
    If PAPERLESS_WORKER_TIMEOUT has been configured, the retry time for a task can
    also be configured. By default, this value will be set to 10s more than the
    worker timeout. This value should never be set to less than the worker timeout.
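
    For example, to give slow hardware an hour per task (the retry value is
    optional here; left unset it would default to the timeout plus 10 seconds):

    .. code:: bash

        PAPERLESS_WORKER_TIMEOUT=3600
        PAPERLESS_WORKER_RETRY=3610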

PAPERLESS_TIME_ZONE=<timezone>
    Set the time zone here.
    See https://docs.djangoproject.com/en/3.1/ref/settings/#std:setting-TIME_ZONE
    for details on how to set it.

    Defaults to UTC.

.. _configuration-polling:

PAPERLESS_CONSUMER_POLLING=<num>
    If paperless won't find documents added to your consume folder, it might
    not be able to automatically detect filesystem changes. In that case,
    specify a polling interval in seconds here, which will then cause paperless
    to periodically check your consumption directory for changes. This will also
    disable listening for file system changes with ``inotify``.

    Defaults to 0, which disables polling and uses filesystem notifications.

PAPERLESS_CONSUMER_DELETE_DUPLICATES=<bool>
    When the consumer detects a duplicate document, it will not touch the
    original document. This default behavior can be changed here.

    Defaults to false.

PAPERLESS_CONSUMER_RECURSIVE=<bool>
    Enable recursive watching of the consumption directory. Paperless will
    then pick up files from subdirectories within your consumption
    directory as well.

    Defaults to false.

PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS=<bool>
    Set the names of subdirectories as tags for consumed files.
    E.g. <CONSUMPTION_DIR>/foo/bar/file.pdf will add the tags "foo" and "bar" to
    the consumed file. Paperless will create any tags that don't exist yet.

    This is useful for sorting documents with certain tags such as ``car`` or
    ``todo`` prior to consumption. These folders won't be deleted.

    PAPERLESS_CONSUMER_RECURSIVE must be enabled for this to work.

    Defaults to false.
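
    For example (the directory names are arbitrary), the pair of settings
    below tags ``<CONSUMPTION_DIR>/car/insurance/scan.pdf`` with "car" and
    "insurance":

    .. code:: bash

        PAPERLESS_CONSUMER_RECURSIVE=true
        PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS=true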

PAPERLESS_CONSUMER_ENABLE_BARCODES=<bool>
    Enables the scanning and page separation based on detected barcodes.
    This allows for scanning and adding multiple documents per uploaded
    file, which are separated by one or multiple barcode pages.

    For ease of use, it is suggested to use a standardized separation page,
    e.g. `here <https://www.alliancegroup.co.uk/patch-codes.htm>`_.

    If no barcodes are detected in the uploaded file, no page separation
    will happen.

    The original document will be removed and the separated pages will be
    saved as PDF.

    Defaults to false.

PAPERLESS_CONSUMER_BARCODE_TIFF_SUPPORT=<bool>
    Whether TIFF image files should be scanned for barcodes.
    This will automatically convert any TIFF image(s) to PDFs for later
    processing.
    This only has an effect if PAPERLESS_CONSUMER_ENABLE_BARCODES has been
    enabled.

    Defaults to false.

PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT
    Defines the string to be detected as a separator barcode.
    If paperless is used with the PATCH-T separator pages, users
    shouldn't change this.

    Defaults to "PATCHT".

PAPERLESS_CONVERT_MEMORY_LIMIT=<num>
    On smaller systems, or even in the case of Very Large Documents, the consumer
    may explode, complaining about how it's "unable to extend pixel cache". In
    such cases, try setting this to a reasonably low value, like 32. The
    default is to use whatever is necessary to do everything without writing to
    disk, and units are in megabytes.

    For more information on how to use this value, you should search
    the web for "MAGICK_MEMORY_LIMIT".

    Defaults to 0, which disables the limit.

PAPERLESS_CONVERT_TMPDIR=<path>
    Similar to the memory limit, if you've got a small system and your OS mounts
    /tmp as tmpfs, you should set this to a path that's on a physical disk, like
    /home/your_user/tmp or something. ImageMagick will use this as scratch space
    when crunching through very large documents.

    For more information on how to use this value, you should search
    the web for "MAGICK_TMPDIR".

    Default is none, which disables the temporary directory.

PAPERLESS_OPTIMIZE_THUMBNAILS=<bool>
    Use optipng to optimize thumbnails. This usually reduces the size of
    thumbnails by about 20%, but uses considerable compute time during
    consumption.

    Defaults to true.

PAPERLESS_POST_CONSUME_SCRIPT=<filename>
    After a document is consumed, Paperless can trigger an arbitrary script if
    you like. This script will be passed a number of arguments for you to work
    with. For more information, take a look at :ref:`advanced-post_consume_script`.

    The default is blank, which means nothing will be executed.

PAPERLESS_FILENAME_DATE_ORDER=<format>
    Paperless will check the document text for document date information.
    Use this setting to enable checking the document filename for date
    information. The date order can be set to any option as specified in
    https://dateparser.readthedocs.io/en/latest/settings.html#date-order.
    The filename will be checked first, and if nothing is found, the document
    text will be checked as normal.

    Defaults to none, which disables this feature.
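
    For example (the filename is just an illustration), ``YMD`` would pick up
    the date in a scan named ``2022-03-01 invoice.pdf``:

    .. code:: bash

        PAPERLESS_FILENAME_DATE_ORDER=YMD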

PAPERLESS_THUMBNAIL_FONT_NAME=<filename>
    Paperless creates thumbnails for plain text files by rendering the content
    of the file on an image and uses a predefined font for that. This
    font can be changed here.

    Note that this won't have any effect on already generated thumbnails.

    Defaults to ``/usr/share/fonts/liberation/LiberationSerif-Regular.ttf``.

PAPERLESS_IGNORE_DATES=<string>
    Paperless parses a document's creation date from filename and file content.
    You may specify a comma separated list of dates that should be ignored during
    this process. This is useful for special dates (like date of birth) that appear
    in documents regularly but are very unlikely to be the document's creation date.

    You may specify dates in a multitude of formats supported by dateparser (see
    https://dateparser.readthedocs.io/en/latest/#popular-formats) but as the dates
    need to be comma separated, the options are limited.
    Example: "2020-12-02,22.04.1999"

    Defaults to an empty string to not ignore any dates.

PAPERLESS_DATE_ORDER=<format>
    Paperless will try to determine the document creation date from its contents.
    Specify the date format Paperless should expect to see within your documents.

    This option defaults to DMY which translates to day first, month second, and year
    last order. Characters D, M, or Y can be shuffled to meet the required order.

PAPERLESS_CONSUMER_IGNORE_PATTERNS=<json>
    By default, paperless ignores certain files and folders in the consumption
    directory, such as system files created by the Mac OS.

    This can be adjusted by configuring a custom JSON array with patterns to exclude.

    Defaults to ``[".DS_STORE/*", "._*", ".stfolder/*", ".stversions/*", ".localized/*", "desktop.ini"]``.
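
    For example, to additionally ignore the index folders a Synology NAS
    creates (``@eaDir`` is specific to that setup and only an illustration):

    .. code:: bash

        PAPERLESS_CONSUMER_IGNORE_PATTERNS=[".DS_STORE/*", "._*", "@eaDir/*"]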

Binaries
########

There are a few external software packages that Paperless expects to find on
your system when it starts up. Unless you've done something creative with
their installation, you probably won't need to edit any of these. However,
if you've installed these programs somewhere where simply typing the name of
the program doesn't automatically execute it (i.e. the program isn't in your
$PATH), then you'll need to specify the literal path for that program.

PAPERLESS_CONVERT_BINARY=<path>
    Defaults to "/usr/bin/convert".

PAPERLESS_GS_BINARY=<path>
    Defaults to "/usr/bin/gs".

PAPERLESS_OPTIPNG_BINARY=<path>
    Defaults to "/usr/bin/optipng".

.. _configuration-docker:

Docker-specific options
#######################

These options don't have any effect in ``paperless.conf``. These options adjust
the behavior of the docker container. Configure these in `docker-compose.env`.

PAPERLESS_WEBSERVER_WORKERS=<num>
    The number of worker processes the webserver should spawn. More worker processes
    usually mean that the front end will load data much quicker. However, each worker process
    also loads the entire application into memory separately, so increasing this value
    will increase RAM usage.

    Consider configuring this to 1 on low power devices with a limited amount of RAM.

    Defaults to 2.

PAPERLESS_PORT=<port>
    The port number the webserver will listen on inside the container. There are
    special setups where you may need this to avoid collisions with other
    services (like using podman with multiple containers in one pod).

    Don't change this when using Docker. To change the port the webserver is
    reachable on outside of the container, instead refer to the "ports" key in
    ``docker-compose.yml``.

    Defaults to 8000.

USERMAP_UID=<uid>
    The ID of the paperless user in the container. Set this to your actual user ID on the
    host system, which you can get by executing

    .. code:: shell-session

        $ id -u

    Paperless will change ownership on its folders to this user, so you need to get this right
    in order to be able to write to the consumption directory.

    Defaults to 1000.

USERMAP_GID=<gid>
    The ID of the paperless group in the container. Set this to your actual group ID on the
    host system, which you can get by executing

    .. code:: shell-session

        $ id -g

    Paperless will change ownership on its folders to this group, so you need to get this right
    in order to be able to write to the consumption directory.

    Defaults to 1000.

PAPERLESS_OCR_LANGUAGES=<list>
    Additional OCR languages to install. By default, paperless comes with
    English, German, Italian, Spanish and French. If your language is not in this list, install
    additional languages with this configuration option:

    .. code:: bash

        PAPERLESS_OCR_LANGUAGES=tur ces

    To actually use these languages, also set the default OCR language of paperless:

    .. code:: bash

        PAPERLESS_OCR_LANGUAGE=tur

    Defaults to none, which does not install any additional languages.

.. _configuration-update-checking:

Update Checking
###############

PAPERLESS_ENABLE_UPDATE_CHECK=<bool>
    Enable (or disable) the automatic check for available updates. This feature is disabled
    by default, but if it is not explicitly set, Paperless-ngx will show a message about this.

    If enabled, the feature works by pinging the GitHub API for the latest release, e.g.
    https://api.github.com/repos/paperless-ngx/paperless-ngx/releases/latest
    to determine whether a new version is available.

    Actual updating of the app must still be performed manually.

    Note that for users of third-party containers, e.g. linuxserver.io, this notification
    may be 'ahead' of a new release from the third-party maintainers.

    In either case, no tracking data is collected by the app in any way.

    Defaults to none, which disables the feature.
docs/development.md (new file, 474 lines)
@@ -0,0 +1,474 @@

# Development

This section describes the steps you need to take to start development
on paperless-ngx.

Check out the source from github. The repository is organized in the
following way:

- `main` always represents the latest release and will only see
  changes when a new release is made.
- `dev` contains the code that will be in the next release.
- `feature-X` branches contain bigger changes that will be in some release,
  but not necessarily the next one.

When making functional changes to paperless, _always_ make your changes
on the `dev` branch.

Apart from that, the folder structure is as follows:

- `docs/` - Documentation.
- `src-ui/` - Code of the front end.
- `src/` - Code of the back end.
- `scripts/` - Various scripts that help with different parts of
  development.
- `docker/` - Files required to build the docker image.

## Contributing to Paperless

Maybe you've been using Paperless for a while and want to add a feature
or two, or maybe you've come across a bug that you have some ideas how
to solve. The beauty of open source software is that you can see what's
wrong and help to get it fixed for everyone!

Before contributing please review our [code of
conduct](https://github.com/paperless-ngx/paperless-ngx/blob/main/CODE_OF_CONDUCT.md)
and other important information in the [contributing
guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).

## Code formatting with pre-commit Hooks

To ensure a consistent style and formatting across the project source,
the project utilizes a Git [`pre-commit`](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks)
hook to perform some formatting and linting before a commit is allowed.
That way, everyone uses the same style and some common issues can be caught
early on. See below for installation instructions.

Once installed, hooks will run when you commit. If the formatting isn't
quite right or a linter catches something, the commit will be rejected.
You'll need to look at the output and fix the issue. Some hooks, such
as the Python formatting tool `black`, will format failing
files, so all you need to do is `git add` those files again
and retry your commit.

## Initial setup and first start

After you forked and cloned the code from github you need to perform a
first-time setup. To do the setup you need to perform the steps from the
following chapters in a certain order:

1. Install prerequisites + pipenv as mentioned in
   [Bare metal route](/setup#bare_metal)

2. Copy `paperless.conf.example` to `paperless.conf` and enable debug
   mode (see the sketch below).
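
   A minimal sketch (`PAPERLESS_DEBUG` is assumed here to be the setting
   that toggles debug mode):

   ```shell-session
   $ cp paperless.conf.example paperless.conf
   $ echo "PAPERLESS_DEBUG=true" >> paperless.conf
   ```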

3. Install the Angular CLI interface:

   ```shell-session
   $ npm install -g @angular/cli
   ```

4. Install pre-commit hooks

   ```shell-session
   pre-commit install
   ```

5. Create `consume` and `media` folders in the cloned root folder.

   ```shell-session
   mkdir -p consume media
   ```

6. You can now either ...

   - install redis or

   - use the included scripts/start-services.sh to use docker to fire
     up a redis instance (and some other services such as tika,
     gotenberg and a database server) or

   - spin up a bare redis container

     ```shell-session
     docker run -d -p 6379:6379 --restart unless-stopped redis:latest
     ```

7. Install the python dependencies by running the following in the src/
   directory.

   ```shell-session
   pipenv install --dev
   ```

   !!! note

       Make sure you're using python 3.10.x or lower. Otherwise you might
       get issues with building dependencies. You can use
       [pyenv](https://github.com/pyenv/pyenv) to install a specific
       python version.

8. Generate the static UI so you can perform a login to get the session
   that is required for frontend development (this needs to be done one
   time only). From the src-ui directory:

   ```shell-session
   npm install .
   ./node_modules/.bin/ng build --configuration production
   ```

9. Apply migrations and create a superuser for your dev instance:

   ```shell-session
   python3 manage.py migrate
   python3 manage.py createsuperuser
   ```

10. Now spin up the dev backend. Depending on which part of paperless
    you're developing for, you need to have some or all of them
    running.

    ```shell-session
    python3 manage.py runserver & python3 manage.py document_consumer & celery --app paperless worker
    ```

11. Login with the superuser credentials provided in step 9 at
    `http://localhost:8000` to create a session that enables you to use
    the backend.

The backend development environment is now ready. To start front end
development, go to `/src-ui` and run `ng serve`. From there you can use
`http://localhost:4200` for a preview.

## Back end development

The backend is a [Django](https://www.djangoproject.com/) application. PyCharm works well for development,
but you can use whatever you want.

Configure the IDE to use the src/ folder as the base source folder.
Configure the following launch configurations in your IDE:

- `python3 manage.py runserver`
- `celery --app paperless worker`
- `python3 manage.py document_consumer`

To start them all:

```shell-session
python3 manage.py runserver & python3 manage.py document_consumer & celery --app paperless worker
```

Testing and code style:

- Run `pytest` in the `src/` directory to execute all tests. This also
  generates an HTML coverage report. When running tests, paperless.conf
  is loaded as well. However: the tests rely on the default
  configuration. This is not ideal. But for now, make sure no settings
  except for DEBUG are overridden when testing.

- Coding style is enforced by the Git pre-commit hooks. These will
  ensure your code is formatted and do some linting when you do a `git commit`.

- You can also run `black` manually to format your code.

- The `pre-commit` hooks will modify files and interact with each other.
  It may take a couple of `git add`, `git commit` cycles to satisfy them.

!!! note

    The line length rule E501 is generally useful for getting multiple
    source files next to each other on the screen. However, in some
    cases, it's just not possible to make some lines fit, especially
    complicated IF cases. Append `# noqa: E501` to disable this check
    for certain lines.

## Front end development

The front end is built using Angular. In order to get started, you need
`npm`. Install the Angular CLI interface with

```shell-session
$ npm install -g @angular/cli
```

and make sure that it's on your path. Next, in the src-ui/ directory,
install the required dependencies of the project.

```shell-session
$ npm install
```

You can launch a development server by running

```shell-session
$ ng serve
```

This will automatically update whenever you save. However, in-place
compilation might fail on syntax errors, in which case you need to
restart it.

By default, the development server is available on
`http://localhost:4200/` and is configured to access the API at
`http://localhost:8000/api/`, which is the default of the backend. If
you enabled DEBUG on the back end, several security overrides for
allowed hosts, CORS and X-Frame-Options are in place so that the front
end behaves exactly as in production. This also relies on you being
logged into the back end. Without a valid session, the front end will
simply not work.

Testing and code style:

- The frontend code (.ts, .html, .scss) uses `prettier` for code
  formatting via the Git `pre-commit` hooks which run automatically on
  commit. See
  [above](#code-formatting-with-pre-commit-hooks) for installation. You can also run this via the cli with a
  command such as

  ```shell-session
  $ git ls-files -- '*.ts' | xargs pre-commit run prettier --files
  ```

- Frontend testing uses jest and cypress. There is currently a need
  for significantly more frontend tests. Unit tests and e2e tests,
  respectively, can be run non-interactively with:

  ```shell-session
  $ ng test
  $ npm run e2e:ci
  ```

  Cypress also includes a UI which can be run from within the `src-ui`
  directory with

  ```shell-session
  $ ./node_modules/.bin/cypress open
  ```

In order to build the front end and serve it as part of django, execute

```shell-session
$ ng build --configuration production
```

This will build the front end and put it in a location from which the
Django server will serve it as static content. This way, you can verify
that authentication is working.

## Localization

Paperless is available in many different languages. Since paperless
consists both of a django application and an Angular front end, both
these parts have to be translated separately.

### Front end localization

- The Angular front end does localization according to the [Angular
  documentation](https://angular.io/guide/i18n).
- The source language of the project is "en_US".
- The source strings end up in the file "src-ui/messages.xlf".
- The translated strings need to be placed in the
  "src-ui/src/locale/" folder.
- In order to extract added or changed strings from the source files,
  call `ng xi18n --ivy`.

Adding new languages requires adding the translated files in the
"src-ui/src/locale/" folder and adjusting a couple of files.

1. Adjust "src-ui/angular.json":

   ```json
   "i18n": {
     "sourceLocale": "en-US",
     "locales": {
       "de": "src/locale/messages.de.xlf",
       "nl-NL": "src/locale/messages.nl_NL.xlf",
       "fr": "src/locale/messages.fr.xlf",
       "en-GB": "src/locale/messages.en_GB.xlf",
       "pt-BR": "src/locale/messages.pt_BR.xlf",
       "language-code": "language-file"
     }
   }
   ```

2. Add the language to the available options in
   "src-ui/src/app/services/settings.service.ts":

   ```typescript
   getLanguageOptions(): LanguageOption[] {
     return [
       {code: "en-us", name: $localize`English (US)`, englishName: "English (US)", dateInputFormat: "mm/dd/yyyy"},
       {code: "en-gb", name: $localize`English (GB)`, englishName: "English (GB)", dateInputFormat: "dd/mm/yyyy"},
       {code: "de", name: $localize`German`, englishName: "German", dateInputFormat: "dd.mm.yyyy"},
       {code: "nl", name: $localize`Dutch`, englishName: "Dutch", dateInputFormat: "dd-mm-yyyy"},
       {code: "fr", name: $localize`French`, englishName: "French", dateInputFormat: "dd/mm/yyyy"},
       {code: "pt-br", name: $localize`Portuguese (Brazil)`, englishName: "Portuguese (Brazil)", dateInputFormat: "dd/mm/yyyy"}
       // Add your new language here
     ]
   }
   ```

   `dateInputFormat` is a special string that defines the behavior of
   the date input fields and absolutely needs to contain "dd", "mm"
   and "yyyy".

3. Import and register the Angular data for this locale in
   "src-ui/src/app/app.module.ts":

   ```typescript
   import localeDe from '@angular/common/locales/de'
   registerLocaleData(localeDe)
   ```

### Back end localization

A majority of the strings that appear in the back end appear only when
the admin is used. However, some of these are still shown on the front
end (such as error messages).

- The django application does localization according to the [django
  documentation](https://docs.djangoproject.com/en/3.1/topics/i18n/translation/).
- The source language of the project is "en_US".
- Localization files end up in the folder "src/locale/".
- In order to extract strings from the application, call
  `python3 manage.py makemessages -l en_US`. This is important after
  making changes to translatable strings.
- The message files need to be compiled for them to show up in the
  application. Call `python3 manage.py compilemessages` to do this.
  The generated files don't get committed into git, since these are
  derived artifacts. The build pipeline takes care of executing this
  command.

Adding new languages requires adding the translated files in the
"src/locale/" folder and adjusting the file
"src/paperless/settings.py" to include the new language:

```python
LANGUAGES = [
    ("en-us", _("English (US)")),
    ("en-gb", _("English (GB)")),
    ("de", _("German")),
    ("nl-nl", _("Dutch")),
    ("fr", _("French")),
    ("pt-br", _("Portuguese (Brazil)")),
    # Add language here.
]
```

## Building the documentation

The documentation is built using material-mkdocs, see their [documentation](https://squidfunk.github.io/mkdocs-material/reference/).
If you want to build the documentation locally, this is how you do it:

1. Install python dependencies.

   ```shell-session
   $ cd /path/to/paperless
   $ pipenv install --dev
   ```

2. Build the documentation:

   ```shell-session
   $ cd /path/to/paperless
   $ pipenv run mkdocs build --config-file mkdocs.yml
   ```

## Building the Docker image

The docker image is primarily built by the GitHub actions workflow, but
it can be faster when developing to build and tag an image locally.

To provide the build arguments automatically, build the image using the
helper script `build-docker-image.sh`.

Building the docker image from source:

```shell-session
./build-docker-image.sh Dockerfile -t <your-tag>
```

## Extending Paperless

Paperless does not have any fancy plugin systems and will probably never
have one. However, some parts of the application have been designed to allow
easy integration of additional features without any modification to the
base code.

### Making custom parsers

Paperless uses parsers to add documents to paperless. A parser is
responsible for:

- Retrieving the content from the original
- Creating a thumbnail
- Optional: Retrieving a created date from the original
- Optional: Creating an archived document from the original

Custom parsers can be added to paperless to support more file types. In
order to do that, you need to write the parser itself and announce its
existence to paperless.

The parser itself must extend `documents.parsers.DocumentParser` and
must implement the methods `parse` and `get_thumbnail`. You can provide
your own implementation of `get_date` if you don't want to rely on
paperless' default date guessing mechanisms.

```python
import os

from documents.parsers import DocumentParser


class MyCustomParser(DocumentParser):

    def parse(self, document_path, mime_type):
        # This method does not return anything. Rather, you should assign
        # whatever you got from the document to the following fields:

        # The content of the document.
        self.text = "content"

        # Optional: path to a PDF document that you created from the original.
        self.archive_path = os.path.join(self.tempdir, "archived.pdf")

        # Optional: "created" date of the document.
        # get_created_from_metadata() is a stand-in for your own
        # date-extraction helper.
        self.date = get_created_from_metadata(document_path)

    def get_thumbnail(self, document_path, mime_type):
        # This should return the path to a thumbnail you created for this
        # document.
        return os.path.join(self.tempdir, "thumb.webp")
```

If you encounter any issues during parsing, raise a
`documents.parsers.ParseError`.

The `self.tempdir` directory is a temporary directory that is guaranteed
to be empty and removed after consumption finished. You can use that
directory to store any intermediate files and also use it to store the
thumbnail / archived document.

After that, you need to announce your parser to paperless. You need to
connect a handler to the `document_consumer_declaration` signal. Have a
look in the file `src/paperless_tesseract/apps.py` on how that's done.
The handler is a method that returns information about your parser:

```python
def myparser_consumer_declaration(sender, **kwargs):
    return {
        "parser": MyCustomParser,
        "weight": 0,
        "mime_types": {
            "application/pdf": ".pdf",
            "image/jpeg": ".jpg",
        },
    }
```

- `parser` is a reference to a class that extends `DocumentParser`.

- `weight` is used whenever two or more parsers are able to parse a
  file: The parser with the higher weight wins. This can be used to
  override the parsers provided by paperless.

- `mime_types` is a dictionary. The keys are the mime types your
  parser supports and the value is the default file extension that
  paperless should use when storing files and serving them for
  download. We could guess that from the file extensions, but some
  mime types have many extensions associated with them and the python
  methods responsible for guessing the extension do not always return
  the same value.
@@ -1,431 +0,0 @@
.. _extending:
|
|
||||||
|
|
||||||
Paperless-ngx Development
|
|
||||||
#########################
|
|
||||||
|
|
||||||
This section describes the steps you need to take to start development on paperless-ngx.
|
|
||||||
|
|
||||||
Check out the source from github. The repository is organized in the following way:
|
|
||||||
|
|
||||||
* ``main`` always represents the latest release and will only see changes
|
|
||||||
when a new release is made.
|
|
||||||
* ``dev`` contains the code that will be in the next release.
|
|
||||||
* ``feature-X`` contain bigger changes that will be in some release, but not
|
|
||||||
necessarily the next one.
|
|
||||||
|
|
||||||
When making functional changes to paperless, *always* make your changes on the ``dev`` branch.
|
|
||||||
|
|
||||||
Apart from that, the folder structure is as follows:
|
|
||||||
|
|
||||||
* ``docs/`` - Documentation.
|
|
||||||
* ``src-ui/`` - Code of the front end.
|
|
||||||
* ``src/`` - Code of the back end.
|
|
||||||
* ``scripts/`` - Various scripts that help with different parts of development.
|
|
||||||
* ``docker/`` - Files required to build the docker image.
|
|
||||||
|
|
||||||
Contributing to Paperless
|
|
||||||
=========================
|
|
||||||
|
|
||||||
Maybe you've been using Paperless for a while and want to add a feature or two,
|
|
||||||
or maybe you've come across a bug that you have some ideas how to solve. The
|
|
||||||
beauty of open source software is that you can see what's wrong and help to get
|
|
||||||
it fixed for everyone!
|
|
||||||
|
|
||||||
Before contributing please review our `code of conduct`_ and other important
|
|
||||||
information in the `contributing guidelines`_.
|
|
||||||
|
|
||||||
.. _code-formatting-with-pre-commit-hooks:
|
|
||||||
|
|
||||||
Code formatting with pre-commit Hooks
|
|
||||||
=====================================
|
|
||||||
|
|
||||||
To ensure a consistent style and formatting across the project source, the project
|
|
||||||
utilizes a Git `pre-commit` hook to perform some formatting and linting before a
|
|
||||||
commit is allowed. That way, everyone uses the same style and some common issues
|
|
||||||
can be caught early on. See below for installation instructions.
|
|
||||||
|
|
||||||
Once installed, hooks will run when you commit. If the formatting isn't quite right
|
|
||||||
or a linter catches something, the commit will be rejected. You'll need to look at the
|
|
||||||
output and fix the issue. Some hooks, such as the Python formatting tool `black`,
|
|
||||||
will format failing files, so all you need to do is `git add` those files again and
|
|
||||||
retry your commit.
|
|
||||||
|
|
||||||
Initial setup and first start
|
|
||||||
=============================
|
|
||||||
|
|
||||||
After you forked and cloned the code from github you need to perform a first-time setup.
|
|
||||||
To do the setup you need to perform the steps from the following chapters in a certain order:
|
|
||||||
|
|
||||||
1. Install prerequisites + pipenv as mentioned in :ref:`Bare metal route <setup-bare_metal>`
|
|
||||||
2. Copy ``paperless.conf.example`` to ``paperless.conf`` and enable debug mode.
|
|
||||||
3. Install the Angular CLI interface:
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
$ npm install -g @angular/cli
|
|
||||||
|
|
||||||
4. Install pre-commit
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
pre-commit install
|
|
||||||
|
|
||||||
5. Create ``consume`` and ``media`` folders in the cloned root folder.
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
mkdir -p consume media
|
|
||||||
|
|
||||||
6. You can now either ...
|
|
||||||
|
|
||||||
* install redis or
|
|
||||||
* use the included scripts/start-services.sh to use docker to fire up a redis instance (and some other services such as tika, gotenberg and a postgresql server) or
|
|
||||||
* spin up a bare redis container
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
docker run -d -p 6379:6379 --restart unless-stopped redis:latest
|
|
||||||
|
|
||||||
7. Install the python dependencies by performing in the src/ directory.
|
|
||||||
|
|
||||||
.. code:: shell-session
|
|
||||||
|
|
||||||
pipenv install --dev
|
|
||||||
|
|
||||||
* Make sure you're using python 3.9.x or lower. Otherwise you might get issues with building dependencies. You can use `pyenv <https://github.com/pyenv/pyenv>`_ to install a specific python version.

8. Generate the static UI so you can perform a login to get a session, which is required for frontend development (this only needs to be done once). From the src-ui/ directory:

   .. code:: shell-session

      $ npm install .
      $ ./node_modules/.bin/ng build --configuration production

9. Apply migrations and create a superuser for your dev instance:

   .. code:: shell-session

      $ python3 manage.py migrate
      $ python3 manage.py createsuperuser

10. Now spin up the dev backend. Depending on which part of paperless you're developing for, you need to have some or all of the following running:

    .. code:: shell-session

       $ python3 manage.py runserver & python3 manage.py document_consumer & python3 manage.py qcluster

11. Log in with the superuser credentials created in step 9 at ``http://localhost:8000`` to create a session that enables you to use the backend.

The backend development environment is now ready. To start frontend development, go to ``/src-ui`` and run ``ng serve``. From there you can use ``http://localhost:4200`` for a preview.

Back end development
====================

The backend is a Django application. PyCharm works well for development, but you can use whatever
you want.

Configure the IDE to use the src/ folder as the base source folder. Configure the following
launch configurations in your IDE:

* python3 manage.py runserver
* python3 manage.py qcluster
* python3 manage.py document_consumer

To start them all:

.. code:: shell-session

   $ python3 manage.py runserver & python3 manage.py document_consumer & python3 manage.py qcluster

Testing and code style:

* Run ``pytest`` in the src/ directory to execute all tests. This also generates an HTML coverage
  report. When running tests, paperless.conf is loaded as well. However, the tests rely on the default
  configuration. This is not ideal, but for now, make sure no settings except for DEBUG are overridden when testing.
* Coding style is enforced by the Git pre-commit hooks. These will ensure your code is formatted and do some
  linting when you do a `git commit`.
* You can also run ``black`` manually to format your code, as shown below.
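
  For example, from the src/ directory (a minimal invocation; the pre-commit
  configuration pins the exact ``black`` version that CI expects):

  .. code:: shell-session

     $ black .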

.. note::

   The line length rule E501 is generally useful for getting multiple source files
   next to each other on the screen. However, in some cases, it's just not possible
   to make some lines fit, especially complicated ``if`` cases. Append ``# NOQA: E501``
   to disable this check for certain lines.
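
For example (a purely illustrative, self-contained snippet):

.. code:: python

   # Illustrative only: the condition is contrived to exceed the line limit.
   enabled, dry_run, verbose, threshold = True, False, False, 10
   if enabled and not dry_run and (threshold > 5 or verbose) and "padding to push this line well past the configured limit":  # NOQA: E501
       print("long line accepted")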

Front end development
=====================

The front end is built using Angular. In order to get started, you need ``npm``.
Install the Angular CLI interface with

.. code:: shell-session

   $ npm install -g @angular/cli

and make sure that it's on your path. Next, in the src-ui/ directory, install the
required dependencies of the project.

.. code:: shell-session

   $ npm install

You can launch a development server by running

.. code:: shell-session

   $ ng serve

This will automatically update whenever you save. However, in-place compilation might fail
on syntax errors, in which case you need to restart it.

By default, the development server is available on ``http://localhost:4200/`` and is configured
to access the API at ``http://localhost:8000/api/``, which is the default of the backend.
If you enabled DEBUG on the back end, several security overrides for allowed hosts, CORS and
X-Frame-Options are in place so that the front end behaves exactly as in production. This also
relies on you being logged into the back end. Without a valid session, the front end will simply
not work.

Testing and code style:

* The frontend code (.ts, .html, .scss) uses ``prettier`` for code formatting via the Git
  ``pre-commit`` hooks which run automatically on commit. See
  :ref:`above <code-formatting-with-pre-commit-hooks>` for installation. You can also run this
  via the CLI with a command such as

  .. code:: shell-session

     $ git ls-files -- '*.ts' | xargs pre-commit run prettier --files

* Frontend testing uses jest and cypress. There is currently a need for significantly more
  frontend tests. Unit tests and e2e tests, respectively, can be run non-interactively with:

  .. code:: shell-session

     $ ng test
     $ npm run e2e:ci

  Cypress also includes a UI which can be run from within the ``src-ui`` directory with

  .. code:: shell-session

     $ ./node_modules/.bin/cypress open

In order to build the front end and serve it as part of Django, execute

.. code:: shell-session

   $ ng build --prod

This will build the front end and put it in a location from which the Django server will serve
it as static content. This way, you can verify that authentication is working.

Localization
============

Paperless is available in many different languages. Since paperless consists both of a Django
application and an Angular front end, both these parts have to be translated separately.

Front end localization
----------------------

* The Angular front end does localization according to the `Angular documentation <https://angular.io/guide/i18n>`_.
* The source language of the project is "en_US".
* The source strings end up in the file "src-ui/messages.xlf".
* The translated strings need to be placed in the "src-ui/src/locale/" folder.
* In order to extract added or changed strings from the source files, call ``ng xi18n --ivy`` (see the example below).
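
For example (run from the src-ui/ directory, where the Angular project lives):

.. code:: shell-session

   $ ng xi18n --ivy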

Adding new languages requires adding the translated files in the "src-ui/src/locale/" folder and adjusting a couple of files.

1. Adjust "src-ui/angular.json":

   .. code:: json

      "i18n": {
        "sourceLocale": "en-US",
        "locales": {
          "de": "src/locale/messages.de.xlf",
          "nl-NL": "src/locale/messages.nl_NL.xlf",
          "fr": "src/locale/messages.fr.xlf",
          "en-GB": "src/locale/messages.en_GB.xlf",
          "pt-BR": "src/locale/messages.pt_BR.xlf",
          "language-code": "language-file"
        }
      }

2. Add the language to the available options in "src-ui/src/app/services/settings.service.ts":

   .. code:: typescript

      getLanguageOptions(): LanguageOption[] {
        return [
          {code: "en-us", name: $localize`English (US)`, englishName: "English (US)", dateInputFormat: "mm/dd/yyyy"},
          {code: "en-gb", name: $localize`English (GB)`, englishName: "English (GB)", dateInputFormat: "dd/mm/yyyy"},
          {code: "de", name: $localize`German`, englishName: "German", dateInputFormat: "dd.mm.yyyy"},
          {code: "nl", name: $localize`Dutch`, englishName: "Dutch", dateInputFormat: "dd-mm-yyyy"},
          {code: "fr", name: $localize`French`, englishName: "French", dateInputFormat: "dd/mm/yyyy"},
          {code: "pt-br", name: $localize`Portuguese (Brazil)`, englishName: "Portuguese (Brazil)", dateInputFormat: "dd/mm/yyyy"}
          // Add your new language here
        ]
      }

``dateInputFormat`` is a special string that defines the behavior of the date input fields and absolutely needs to contain "dd", "mm" and "yyyy".

3. Import and register the Angular data for this locale in "src-ui/src/app/app.module.ts":

   .. code:: typescript

      import localeDe from '@angular/common/locales/de';
      registerLocaleData(localeDe)

Back end localization
---------------------

A majority of the strings that appear in the back end appear only when the admin is used. However,
some of these are still shown on the front end (such as error messages).

* The Django application does localization according to the `django documentation <https://docs.djangoproject.com/en/3.1/topics/i18n/translation/>`_.
* The source language of the project is "en_US".
* Localization files end up in the folder "src/locale/".
* In order to extract strings from the application, call ``python3 manage.py makemessages -l en_US`` (see the example below). This is important after making changes to translatable strings.
* The message files need to be compiled for them to show up in the application. Call ``python3 manage.py compilemessages`` to do this. The generated files don't get
  committed into git, since these are derived artifacts. The build pipeline takes care of executing this command.
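
Both commands are run via ``manage.py``, e.g. from the src/ directory:

.. code:: shell-session

   $ python3 manage.py makemessages -l en_US
   $ python3 manage.py compilemessages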

Adding new languages requires adding the translated files in the "src/locale/" folder and adjusting the file "src/paperless/settings.py" to include the new language:

.. code:: python

   LANGUAGES = [
       ("en-us", _("English (US)")),
       ("en-gb", _("English (GB)")),
       ("de", _("German")),
       ("nl-nl", _("Dutch")),
       ("fr", _("French")),
       ("pt-br", _("Portuguese (Brazil)")),
       # Add language here.
   ]

Building the documentation
==========================

The documentation is built using Sphinx. I've configured ReadTheDocs to automatically build
the documentation when changes are pushed. If you want to build the documentation locally,
this is how you do it:

1. Install python dependencies.

   .. code:: shell-session

      $ cd /path/to/paperless
      $ pipenv install --dev

2. Build the documentation.

   .. code:: shell-session

      $ cd /path/to/paperless/docs
      $ pipenv run make clean html

This will build the HTML documentation, and put the resulting files in the ``_build/html``
directory.

Building the Docker image
=========================

The docker image is primarily built by the GitHub Actions workflow, but it can be
faster when developing to build and tag an image locally.

To provide the build arguments automatically, build the image using the helper
script ``build-docker-image.sh``.

Building the docker image from source:

.. code:: shell-session

   $ ./build-docker-image.sh Dockerfile -t <your-tag>

Extending Paperless
===================

Paperless does not have any fancy plugin systems and probably never will. However,
some parts of the application have been designed to allow easy integration of additional
features without any modification to the base code.

Making custom parsers
---------------------

Paperless uses parsers to add documents to paperless. A parser is responsible for:

* Retrieving the content from the original
* Creating a thumbnail
* Optional: retrieving a created date from the original
* Optional: creating an archived document from the original

Custom parsers can be added to paperless to support more file types. In order to do that,
you need to write the parser itself and announce its existence to paperless.

The parser itself must extend ``documents.parsers.DocumentParser`` and must implement the
methods ``parse`` and ``get_thumbnail``. You can provide your own implementation of
``get_date`` if you don't want to rely on paperless' default date guessing mechanisms.

.. code:: python

   class MyCustomParser(DocumentParser):

       def parse(self, document_path, mime_type):
           # This method does not return anything. Rather, you should assign
           # whatever you got from the document to the following fields:

           # The content of the document.
           self.text = "content"

           # Optional: path to a PDF document that you created from the original.
           self.archive_path = os.path.join(self.tempdir, "archived.pdf")

           # Optional: "created" date of the document.
           self.date = get_created_from_metadata(document_path)

       def get_thumbnail(self, document_path, mime_type):
           # This should return the path to a thumbnail you created for this
           # document.
           return os.path.join(self.tempdir, "thumb.png")

If you encounter any issues during parsing, raise a ``documents.parsers.ParseError``.

The ``self.tempdir`` directory is a temporary directory that is guaranteed to be empty
and removed after consumption has finished. You can use that directory to store any
intermediate files and also use it to store the thumbnail / archived document.

After that, you need to announce your parser to paperless. You need to connect a
handler to the ``document_consumer_declaration`` signal. Have a look in the file
``src/paperless_tesseract/apps.py`` on how that's done. The handler is a method
that returns information about your parser:

.. code:: python

   def myparser_consumer_declaration(sender, **kwargs):
       return {
           "parser": MyCustomParser,
           "weight": 0,
           "mime_types": {
               "application/pdf": ".pdf",
               "image/jpeg": ".jpg",
           }
       }

* ``parser`` is a reference to a class that extends ``DocumentParser``.

* ``weight`` is used whenever two or more parsers are able to parse a file: the parser with
  the higher weight wins. This can be used to override the parsers provided by
  paperless.

* ``mime_types`` is a dictionary. The keys are the mime types your parser supports and the values
  are the default file extensions that paperless should use when storing files and serving them for
  download. We could guess that from the file extensions, but some mime types have many extensions
  associated with them and the python methods responsible for guessing the extension do not always
  return the same value.
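
A minimal sketch of how such a handler might be wired up in your app's ``apps.py``,
modeled on ``src/paperless_tesseract/apps.py`` (the module and class names below are
hypothetical, and the exact signal import path is an assumption):

.. code:: python

   # Hypothetical apps.py for a custom parser package.
   from django.apps import AppConfig


   class MyParserConfig(AppConfig):
       name = "paperless_myparser"  # hypothetical package name

       def ready(self):
           # Assumed import location; see src/paperless_tesseract/apps.py
           # for the canonical example.
           from documents.signals import document_consumer_declaration

           document_consumer_declaration.connect(myparser_consumer_declaration)

           AppConfig.ready(self)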

.. _code of conduct: https://github.com/paperless-ngx/paperless-ngx/blob/main/CODE_OF_CONDUCT.md
.. _contributing guidelines: https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md

123
docs/faq.md
Normal file
@@ -0,0 +1,123 @@

# Frequently Asked Questions

## _What's the general plan for Paperless-ngx?_

**A:** While Paperless-ngx is already considered largely
"feature-complete" it is a community-driven project and development
will be guided in this way. New features can be submitted via GitHub
discussions and "up-voted" by the community but this is not a
guarantee the feature will be implemented. This project will always be
open to collaboration in the form of PRs, ideas etc.

## _I'm using docker. Where are my documents?_

**A:** Your documents are stored inside the docker volume
`paperless_media`. Docker manages this volume automatically for you. It
is persistent storage and will persist as long as you don't
explicitly delete it. The actual location depends on your host operating
system. On Linux, chances are high that this location is

```
/var/lib/docker/volumes/paperless_media/_data
```

!!! warning

    Do not mess with this folder. Don't change permissions and don't move
    files around manually. This folder is meant to be entirely managed by
    docker and paperless.

## _Let's say I want to switch tools in a year. Can I easily move to other systems?_

**A:** Your documents are stored as plain files inside the media folder.
You can always drag those files out of that folder to use them
elsewhere. Here are a couple of notes about that.

- Paperless-ngx never modifies your original documents. It keeps
  checksums of all documents and uses a scheduled sanity checker to
  check that they remain the same.
- By default, paperless uses the internal ID of each document as its
  filename. This might not be very convenient for export. However, you
  can adjust the way files are stored in paperless by
  [configuring the filename format](/advanced_usage#file-name-handling).
- [The exporter](/administration#exporter) is
  another easy way to get your files out of paperless with reasonable
  file names.

## _What file types does paperless-ngx support?_

**A:** Currently, the following files are supported:

- PDF documents, PNG images, JPEG images, TIFF images, GIF images and
  WebP images are processed with OCR and converted into PDF documents.
- Plain text documents are supported as well and are added verbatim to
  paperless.
- With the optional Tika integration enabled (see [Tika configuration](/configuration#tika)),
  Paperless also supports various Office documents (.docx, .doc, .odt,
  .ppt, .pptx, .odp, .xls, .xlsx, .ods).

Paperless-ngx determines the type of a file by inspecting its content.
The file extensions do not matter.

## _Will paperless-ngx run on Raspberry Pi?_

**A:** The short answer is yes. I've tested it on a Raspberry Pi 3 B.
The long answer is that certain parts of Paperless will run very slowly,
such as the OCR. On Raspberry Pi, try to OCR documents before feeding
them into paperless so that paperless can reuse the text. The web
interface is a lot snappier, since it runs in your browser and paperless
has to do much less work to serve the data.

!!! note

    You can adjust some of the settings so that paperless uses less
    processing power. See [setup](/setup#less-powerful-devices) for details.

## _How do I install paperless-ngx on Raspberry Pi?_

**A:** Docker images are available for armv7 and arm64 hardware, so just
follow the docker-compose instructions. Apart from more required disk
space compared to a bare metal installation, docker comes with close to
zero overhead, even on Raspberry Pi.

If you decide to go with the bare metal route, be aware that some of
the python requirements do not have precompiled packages for ARM /
ARM64. Installation of these will require additional development
libraries, and compilation will take a long time.

## _How do I run this on Unraid?_

**A:** Paperless-ngx is available as [community
app](https://unraid.net/community/apps?q=paperless-ngx) in Unraid. [Uli
Fahrer](https://github.com/Tooa) created a container template for that.

## _How do I run this on my toaster?_

**A:** I honestly don't know! As for all other devices that might be
able to run paperless, you're a bit on your own. If you can't run the
docker image, the documentation has instructions for bare metal
installs. I'm running paperless on an i3 processor from 2015 or so.
This is also what I use to test new releases with. Apart from that, I
also have a Raspberry Pi, which I occasionally build the image on and
see if it works.

## _How do I proxy this with NGINX?_

**A:** See [here](/setup#nginx).

## _How do I get WebSocket support with Apache mod_wsgi?_

**A:** `mod_wsgi` by itself does not support ASGI. Paperless will
continue to work with WSGI, but certain features such as status
notifications about document consumption won't be available.

If you want to continue using `mod_wsgi`, you will have to run an
ASGI-enabled web server as well that processes WebSocket connections,
and configure Apache to redirect WebSocket connections to this server.
Multiple options for ASGI servers exist:

- `gunicorn` with `uvicorn` as the worker implementation (the default
  of paperless)
- `daphne` as a standalone server, which is the reference
  implementation for ASGI
- `uvicorn` as a standalone server
117
docs/faq.rst
@@ -1,117 +0,0 @@

**************************
Frequently asked questions
**************************

**Q:** *What's the general plan for Paperless-ngx?*

**A:** While Paperless-ngx is already considered largely "feature-complete" it is a community-driven
project and development will be guided in this way. New features can be submitted via
GitHub discussions and "up-voted" by the community but this is not a guarantee the feature
will be implemented. This project will always be open to collaboration in the form of PRs,
ideas etc.

**Q:** *I'm using docker. Where are my documents?*

**A:** Your documents are stored inside the docker volume ``paperless_media``.
Docker manages this volume automatically for you. It is persistent storage
and will persist as long as you don't explicitly delete it. The actual location
depends on your host operating system. On Linux, chances are high that this location
is

.. code::

   /var/lib/docker/volumes/paperless_media/_data

.. caution::

   Do not mess with this folder. Don't change permissions and don't move
   files around manually. This folder is meant to be entirely managed by docker
   and paperless.

**Q:** *Let's say I want to switch tools in a year. Can I easily move to other systems?*

**A:** Your documents are stored as plain files inside the media folder. You can always drag those files
out of that folder to use them elsewhere. Here are a couple of notes about that.

* Paperless-ngx never modifies your original documents. It keeps checksums of all documents and uses a
  scheduled sanity checker to check that they remain the same.
* By default, paperless uses the internal ID of each document as its filename. This might not be very
  convenient for export. However, you can adjust the way files are stored in paperless by
  :ref:`configuring the filename format <advanced-file_name_handling>`.
* :ref:`The exporter <utilities-exporter>` is another easy way to get your files out of paperless with reasonable file names.

**Q:** *What file types does paperless-ngx support?*

**A:** Currently, the following files are supported:

* PDF documents, PNG images, JPEG images, TIFF images and GIF images are processed with OCR and converted into PDF documents.
* Plain text documents are supported as well and are added verbatim
  to paperless.
* With the optional Tika integration enabled (see :ref:`Configuration <configuration-tika>`), Paperless also supports various
  Office documents (.docx, .doc, .odt, .ppt, .pptx, .odp, .xls, .xlsx, .ods).

Paperless-ngx determines the type of a file by inspecting its content. The
file extensions do not matter.

**Q:** *Will paperless-ngx run on Raspberry Pi?*

**A:** The short answer is yes. I've tested it on a Raspberry Pi 3 B.
The long answer is that certain parts of
Paperless will run very slowly, such as the OCR. On Raspberry Pi,
try to OCR documents before feeding them into paperless so that paperless can
reuse the text. The web interface is a lot snappier, since it runs
in your browser and paperless has to do much less work to serve the data.

.. note::

   You can adjust some of the settings so that paperless uses less processing
   power. See :ref:`setup-less_powerful_devices` for details.

**Q:** *How do I install paperless-ngx on Raspberry Pi?*

**A:** Docker images are available for arm and arm64 hardware, so just follow
the docker-compose instructions. Apart from more required disk space compared to
a bare metal installation, docker comes with close to zero overhead, even on
Raspberry Pi.

If you decide to go with the bare metal route, be aware that some of the
python requirements do not have precompiled packages for ARM / ARM64. Installation
of these will require additional development libraries, and compilation will take
a long time.

**Q:** *How do I run this on Unraid?*

**A:** Paperless-ngx is available as `community app <https://unraid.net/community/apps?q=paperless-ngx>`_
in Unraid. `Uli Fahrer <https://github.com/Tooa>`_ created a container template for that.

**Q:** *How do I run this on my toaster?*

**A:** I honestly don't know! As for all other devices that might be able
to run paperless, you're a bit on your own. If you can't run the docker image,
the documentation has instructions for bare metal installs. I'm running
paperless on an i3 processor from 2015 or so. This is also what I use to test
new releases with. Apart from that, I also have a Raspberry Pi, which I
occasionally build the image on and see if it works.

**Q:** *How do I proxy this with NGINX?*

**A:** See :ref:`here <setup-nginx>`.

.. _faq-mod_wsgi:

**Q:** *How do I get WebSocket support with Apache mod_wsgi?*

**A:** ``mod_wsgi`` by itself does not support ASGI. Paperless will continue
to work with WSGI, but certain features such as status notifications about
document consumption won't be available.

If you want to continue using ``mod_wsgi``, you will have to run an ASGI-enabled
web server as well that processes WebSocket connections, and configure Apache to
redirect WebSocket connections to this server. Multiple options for ASGI servers
exist:

* ``gunicorn`` with ``uvicorn`` as the worker implementation (the default of paperless)
* ``daphne`` as a standalone server, which is the reference implementation for ASGI.
* ``uvicorn`` as a standalone server

138
docs/index.md
Normal file
@@ -0,0 +1,138 @@

<div class="grid-left" markdown>

(Paperless-ngx logo){.index-logo}

**Paperless-ngx** is a _community-supported_ open-source document management system that transforms your
physical documents into a searchable online archive so you can keep, well, _less paper_.

[Get started](/setup){ .md-button .md-button--primary .index-callout }
[Demo](https://demo.paperless-ngx.com){ .md-button .md-button--secondary target=\_blank }

</div>
<div class="grid-right" markdown>

(Paperless-ngx screenshot){.index-screenshot}

</div>
<div class="clear"></div>

## Why This Exists

Paper is a nightmare. Environmental issues aside, there's no excuse for
it in the 21st century. It takes up space, collects dust, doesn't
support any form of a search feature, indexing is tedious, it's heavy
and prone to damage & loss.

This software is designed to make "going paperless" easier. No more worrying
about finding stuff again: feed documents right from the post box into
the scanner, then shred them. Perhaps you might find it useful too.

## Paperless, a history

Paperless is a simple Django application running in two parts: a
_Consumer_ (the thing that does the indexing) and the _Web server_ (the
part that lets you search & download already-indexed documents). If you
want to learn more about its functions keep on reading after the
installation section.

Paperless-ngx is a document management system that transforms your
physical documents into a searchable online archive so you can keep,
well, _less paper_.

Paperless-ngx forked from paperless-ng to continue the great work and
distribute responsibility of supporting and advancing the project among
a team of people.

NG stands for both Angular (the framework used for the Frontend) and
next-gen. Publishing this project under a different name also avoids
confusion between paperless and paperless-ngx.

If you want to learn about what's different in paperless-ngx from
Paperless, check out these resources in the documentation:

- [Some screenshots](#screenshots) of the new UI are available.
- Read [this section](/advanced_usage#automatic-matching) if you want to learn about how paperless automates all
  tagging using machine learning.
- Paperless now comes with a [proper email consumer](/usage#usage-email) that's fully tested and production ready.
- Paperless creates searchable PDF/A documents from whatever you put into the consumption directory. This means
  that you can select text in image-only documents coming from your scanner.
- See [this note](/administration#encryption) about GnuPG encryption in paperless-ngx.
- Paperless is now integrated with a
  [task processing queue](/setup#task_processor) that tells you at a glance when and why something is not working.
- The [changelog](/changelog) contains a detailed list of all changes in paperless-ngx.

## Screenshots

This is what Paperless-ngx looks like.

The dashboard shows customizable views of your documents and allows
document uploads:

[](assets/screenshots/dashboard.png)

The document list provides three different styles to scroll through your
documents:

[](assets/screenshots/documents-table.png)

[](assets/screenshots/documents-smallcards.png)

[](assets/screenshots/documents-largecards.png)

Paperless-ngx also supports dark mode:

[](assets/screenshots/documents-smallcards-dark.png)

Extensive filtering mechanisms:

[](assets/screenshots/documents-filter.png)

Bulk editing of document tags, correspondents, etc.:

[](assets/screenshots/bulk-edit.png)

Side-by-side editing of documents:

[](assets/screenshots/editing.png)

Tag editing. This looks about the same for correspondents and document
types.

[](assets/screenshots/new-tag.png)

Searching provides auto complete and highlights the results.

[](assets/screenshots/search-preview.png)

[](assets/screenshots/search-results.png)

Fancy mail filters!

[](assets/screenshots/mail-rules-edited.png)

Mobile devices are supported.

[](assets/screenshots/mobile.png)

## Support

Community support is available via [GitHub Discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/) and [the Matrix chat room](https://matrix.to/#/#paperless:matrix.org).

### Feature Requests

Feature requests can be submitted via [GitHub Discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/categories/feature-requests) where you can search for existing ideas, add your own and vote for the ones you care about.

### Bugs

For bugs please [open an issue](https://github.com/paperless-ngx/paperless-ngx/issues) or [start a discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/categories/support) if you have questions.

## Contributing

People interested in continuing the work on paperless-ngx are encouraged to reach out on [GitHub](https://github.com/paperless-ngx/paperless-ngx) or [the Matrix chat room](https://matrix.to/#/#paperless:matrix.org). If you would like to contribute to the project on an ongoing basis there are multiple teams (frontend, ci/cd, etc) that could use your help so please reach out!

### Translation

Paperless-ngx is available in many languages; translation is coordinated on [Crowdin](https://crwd.in/paperless-ngx). If you want to help out by translating paperless-ngx into your language, please head over to https://crwd.in/paperless-ngx, and thank you!

## Scanners & Software

Paperless-ngx is compatible with many different scanners and scanning tools. A user-maintained list of scanners and other software is available on [the wiki](https://github.com/paperless-ngx/paperless-ngx/wiki/Scanner-&-Software-Recommendations).