Compare commits

1233 commits

Author SHA1 Message Date
7edbda70f5 apply build template
2025-04-20 15:59:40 -07:00
canewsin
290025958f v0.9.0(4630) 2023-07-12 18:28:32 +05:30
canewsin
25c5658b72 Upgrade GH runner to 20.04 2023-07-12 18:22:16 +05:30
canewsin
2970e3a205 Fetch plugins changes 2023-07-12 01:25:48 +05:30
PramUkesh
866179f6a3
v0.8.6(4626) 2023-07-01 04:27:48 +05:30
PramUkesh
e8cf14bcf5
Add trackers to Config.py as a failsafe in case the trackers.txt file is missing 2023-07-01 04:25:41 +05:30
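Such a failsafe is easy to sketch (hypothetical code and tracker URLs shown only as examples; the actual change hardcodes the list in Config.py):
```
# Minimal sketch, assuming a trackers.txt with one tracker URL per line:
# prefer the file, fall back to a built-in list when it is missing.
import os

DEFAULT_TRACKERS = [
    "udp://tracker.opentrackr.org:1337/announce",  # example public tracker
    "zero://boot3rdez4rzn36x.onion:15441",         # example zero:// tracker
]

def load_trackers(path="trackers.txt"):
    if not os.path.isfile(path):
        return list(DEFAULT_TRACKERS)  # failsafe: file missing
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]
```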
PramUkesh
fedcf9c1c6
Added Proxy links 2023-07-01 03:21:32 +05:30
PramUkesh
117bcf25d9
Fix pysha3 dep installation issue 2023-07-01 02:56:49 +05:30
canewsin
a429349cd4 FileRequest -> Fix error wording 2023-03-24 02:24:14 +05:30
canewsin
d8e52eaabd FileRequest -> Remove Unnecessary check 2023-03-24 02:23:16 +05:30
canewsin
f2ef6e5d9c Fix Response when site is missing for actionAs 2023-02-24 16:56:10 +05:30
canewsin
dd2bb07cfb v0.8.5(4625) 2023-02-12 00:41:38 +05:30
Seto
06a9d1e0ff Fix openssl error in windows. 2023-02-10 18:51:36 +05:30
canewsin
c354f9e24d Use default theme-class for corrupt users.json file
where the settings key is missing, etc.
Fixes Ui.UiServer Error 500: UiWSGIHandler error
2022-12-25 01:28:16 +05:30
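The shape of that fix is simple to sketch (hypothetical helper; the real change lives in the UiServer request handling):
```
# Minimal sketch, assuming a users.json entry shaped like
# {"settings": {"theme": "dark"}}: fall back to the default theme
# instead of raising (and returning Error 500) when it is corrupt.
def get_theme_class(user_entry):
    try:
        theme = user_entry["settings"]["theme"]
    except (KeyError, TypeError):  # settings key missing or entry not a dict
        theme = "light"
    return "theme-%s" % theme
```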
canewsin
77b4297224 Update Stats Plugin 2022-12-25 01:26:53 +05:30
canewsin
edc5310cd2 v0.8.4(4620) 2022-12-11 05:01:55 +05:30
canewsin
99a8409513 Increase Def Min Site Size to 25MB 2022-12-11 04:30:31 +05:30
canewsin
3550a64837 v0.8.3(4611) 2022-12-11 03:21:22 +05:30
canewsin
85ef28e6fb ContentManager.py Improve Logging of Valid Signers 2022-12-11 03:21:22 +05:30
canewsin
1500d9356b SiteStorage.py -> Fix accessing unassigned variable 2022-12-11 03:21:22 +05:30
canewsin
f1a71770fa ContentManager -> Support for multiSig 2022-12-11 03:21:22 +05:30
canewsin
f79a73cef4 main.py -> Fix accessing unassigned variable 2022-12-11 00:51:23 +05:30
canewsin
0731787518 v0.8.2(4610) 2022-11-01 18:10:15 +05:30
canewsin
ad95eede10 Config:: Skip loading missing tracker files 2022-11-01 18:06:32 +05:30
canewsin
459b0a73ca Move trackers to separate file & Add more trackers 2022-11-01 18:01:03 +05:30
canewsin
b7870edd2e Fix Startup Error when plugins dir missing 2022-11-01 18:00:58 +05:30
Ganesh Chowdary Nune
d5703541be Added documentation for getRandomPort fn 2022-10-09 02:36:18 +05:30
canewsin
ba96654e1d
v 0.8.1-patch(4601) 2022-10-05 03:36:15 +05:30
canewsin
ac72d623f0
remove duplicate xescape(s) 2022-10-05 03:33:50 +05:30
canewsin
fd857985f6
v0.8.0(4600) 2022-10-01 02:22:50 +05:30
canewsin
966f671efe
Update CHANGELOG.md 2022-10-01 02:21:54 +05:30
caryoscelus
86109ae4b2 fix readdress loop
use better escaping in render

fixes #19
2022-09-15 19:18:23 +05:30
canewsin
611fc774c8
Remove Patreon badge 2022-06-13 23:07:57 +05:30
BratishkaErik
0ed0b746a4
Update README-ru.md (#177)
@BratishkaErik Thanks for your contribution
2022-06-13 23:06:04 +05:30
canewsin
49e68c3a78 Include inner_path of failed request for signing in error msg and response 2022-06-11 01:36:01 +05:30
canewsin
3ac677c9a7 Don't Fail Silently When Cert is Not Selected
When a site doesn't have a cert selected but has user data, signing the user data fails silently without a proper error message
2022-06-11 01:18:02 +05:30
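In sketch form, the change turns a silent early return into an explicit error (hypothetical names, not the actual UiWebsocket code):
```
# Minimal sketch: signing user data must fail loudly when no cert is
# selected for the site, rather than silently doing nothing.
def sign_user_data(site_address, user):
    cert = user.certs.get(site_address)  # hypothetical cert lookup
    if not cert:
        raise ValueError(
            "No certificate selected for %s; cannot sign user data" % site_address
        )
    return user.sign(site_address, cert)  # hypothetical signing call
```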
canewsin
016cfe9e16 Console Log Updates, Specify min supported ZeroNet version for Rust version Protocol Compatibility
Reduce noise (error => warning) on missing files in sites.
2022-06-09 22:38:57 +05:30
canewsin
712ee18634
Update FUNDING.yml 2022-06-02 19:15:22 +05:30
canewsin
5579c6b3cc rev4591 2022-05-27 08:58:42 +05:30
canewsin
c3815c56ea Revert File Open to catch File Access Errors.
https://github.com/ZeroNetX/ZeroNet/issues/174
2022-05-27 08:38:20 +05:30
canewsin
b257338b0a
v 0.8.0(4590)
- Major Version Upgrade to reflect RCE reported by geekless.
2022-05-26 17:30:59 +05:30
canewsin
ac70f83879 v 0.7.9-patch(4586) 2022-05-26 15:41:47 +05:30
canewsin
2ad80afa10 actionUpdate response Optimisation 2022-05-26 11:48:15 +05:30
canewsin
fe048cd08c Update Plugins Repo 2022-05-26 11:47:25 +05:30
canewsin
f9d7ccd83c Fix Unhandled File Access Errors 2022-05-26 11:46:58 +05:30
canewsin
b29884db78
Create codeql-analysis.yml 2022-05-19 00:10:38 +05:30
canewsin
a5190234ab v 0.7.9(4585)
- Tracker Supply Improvements.
- First Party Tracker Update Service using Dashboard Site.
2022-04-08 23:31:12 +05:30
canewsin
00db9c9f87 Rust Version Compatibility for update Protocol msg
and diff patch
2022-04-08 23:12:10 +05:30
canewsin
02ceb70a4f Tracker Supply improvements
- Removed Non-Working Trackers.
- Dynamically Load Trackers from Dashboard Site.
2022-03-26 18:39:01 +05:30
canewsin
7ce118d645 Fix Repo Url for Bug Report 2022-03-12 17:38:23 +05:30
canewsin
eb397cf4c7 Update Plugins Repo 2022-03-12 11:14:10 +05:30
Marek Küthe
f8c9f2da4f
remove old v2 onion service (#158) 2022-03-12 10:40:33 +05:30
canewsin
69d7eacfa4 v 0.7.9-beta (4581) 2022-03-06 18:23:17 +05:30
canewsin
f498aedb96 v0.7.8 (4580)
- Update Plugins with some bug fixes and Improvements
2022-03-02 20:17:14 +05:30
canewsin
5ee928852b v 0.7.6 (4565)
- Sync Plugin Updates
- Clean up tor v3 patch (#115)
    https://github.com/ZeroNetX/ZeroNet/pull/115
- Add More Default Plugins to Repo
- Doubled Site Publish Limits
- Update ZeroNet Repo Urls (#103)
- UI/UX: Increases Size of Notifications Close Button (#106)
- Moved Plugins to Separate Repo
- Added `access_key` variable in Config; this is used to access restricted plugins when the Multiuser plugin is enabled. When MultiUserPlugin is enabled we cannot access some pages like /Stats; this key removes that restriction (see the sketch after this entry).
- Added `last_connection_id_current_version` to ConnectionServer, helpful to estimate the number of connections from the current client version.
- Added current version: connections to the /Stats page (see the previous point).
2022-01-12 05:13:17 +05:30
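As an illustration of how such a key can gate restricted pages (hypothetical sketch; the names and paths are assumptions, not the actual Multiuser plugin code):
```
# Minimal sketch: when the Multiuser plugin restricts admin pages such
# as /Stats, a matching access_key from Config lifts the restriction.
RESTRICTED_PATHS = ("/Stats", "/Benchmark")  # assumed example paths

def is_request_allowed(config, path, supplied_key):
    if not path.startswith(RESTRICTED_PATHS):
        return True
    return bool(config.access_key) and supplied_key == config.access_key
```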
canewsin
7078badefa Update Docker Image 2021-12-03 01:18:53 +05:30
canewsin
edd2760fed v 0.7.5 (4560) 2021-11-28 23:45:42 +05:30
canewsin
7d1ec41d09 v 0.7.3 (4556) 2021-11-28 22:35:33 +05:30
canewsin
7acd8df906 Fix xrange is undefined error
- xrange is undefined error in Tor-v3 Patch Files
2021-11-28 22:06:50 +05:30
canewsin
a1eb6eede5 Fix Incorrect viewport on mobile while loading
https://github.com/HelloZeroNet/ZeroNet/issues/2474
2021-11-28 01:23:52 +05:30
canewsin
eab7fc2be4 Tor V3 Patch 2021-11-28 01:15:19 +05:30
ZeroNet
454c0b2e7e
Merge pull request #2716 from imachug/uifile-404-fix
Fix 404 error handler in UiFilePlugin
2021-01-25 03:24:19 +01:00
ZeroNet
03da34c5d6
Merge pull request #2714 from timgates42/bugfix_typo_positive
docs: fix simple typo, positibe -> positive
2021-01-25 03:23:46 +01:00
Ivanq
c3f4591f91 Fix 404 error handler in UiFilePlugin 2020-12-27 13:28:35 +03:00
Tim Gates
3ad7bc87e5
docs: fix simple typo, positibe -> positive
There is a small typo in src/util/UpnpPunch.py.

Should read `positive` rather than `positibe`.
2020-12-22 07:44:57 +11:00
ZeroNet
b4f4c12521
Merge pull request #2695 from kabitofu/test
Japanese Translation
2020-12-14 01:40:45 +01:00
kabitofu
39d86fec9c Japanese Translation 2020-12-04 13:07:32 +09:00
Tamas Kocsis
02c27b841f Rev4555 2020-12-03 20:05:41 +01:00
Tamas Kocsis
8dafbef6ad Fix sidebar menu display 2020-12-03 20:05:13 +01:00
Tamas Kocsis
c831d175ad Merge UiFileManager js 2020-12-03 20:04:58 +01:00
Tamas Kocsis
3cbfbae42d Move file listing binary extension list to separate file, add missing comma 2020-12-03 20:04:37 +01:00
Tamas Kocsis
a1105562cd Fix site listing show on big site visit 2020-12-03 20:04:09 +01:00
Tamas Kocsis
b3c9de5e47 Don't show tracker tor bridge warning if more than 3 trackers finished 2020-12-03 20:02:48 +01:00
Tamas Kocsis
3ffa3c2f79 Use svg for top-right zero button 2020-12-03 20:02:20 +01:00
ZeroNet
03273527da
Merge pull request #2694 from styromaniac/patch-3
Replaced logo-white.png with an SVG data URI. (logo-white.png deleted)
2020-12-03 19:59:40 +01:00
Styromaniac
2795e20b0c
Replaced logo-white.png with an SVG data URI. 2020-12-03 08:33:54 -05:00
ZeroNet
eb86df5fb6
Add Python 3.9 to github tests 2020-11-30 14:51:23 +01:00
Tamas Kocsis
ecfb6b6b3a Rev4553, Debug messages formatting Windows fix 2020-11-30 14:38:25 +01:00
Tamas Kocsis
1b4f93f14b Make Debug message test cases independent from line numbers, Windows support 2020-11-30 14:37:45 +01:00
ZeroNet
40db30a260
Merge pull request #2642 from imachug/better-logs
Use more unique yet short paths for logging
2020-11-30 14:08:50 +01:00
ZeroNet
3a4a5404c0
Merge pull request #2686 from kabitofu/test
Corrections and additions to the Japanese translation
2020-11-28 01:36:19 +01:00
kabitofu
2bb12a247b Japanese Translation 2020-11-27 17:49:21 +09:00
Tamas Kocsis
38a3ea6373 Rev4551, Keep uiserver running if fileserver bind failed 2020-11-26 02:01:41 +01:00
ZeroNet
2798ad6fb2
Readme formatting 2020-11-24 15:39:57 +01:00
ZeroNet
2e7b0071a3
Remove unnecessary html tag from readme 2020-11-24 15:38:12 +01:00
ZeroNet
12e82bc9c4
Merge pull request #2682 from canewsin/patch-4
Added New Google Play Links
2020-11-24 15:33:34 +01:00
canewsin
774691fa39 Added New Google Play Links 2020-11-23 09:36:37 +05:30
Tamas Kocsis
52d6c9fedf Rev4550, Show all modified files after click 2020-11-19 03:05:53 +01:00
ZeroNet
d68c635e9a
Merge pull request #2679 from hashy0917/patch-1
bug fix.
2020-11-19 02:40:47 +01:00
hashy0917
21557b5517
bug fix.
Fixed the part where an error occurs when setting to Japanese.
2020-11-17 19:25:23 +09:00
ZeroNet
11fe0ece67
Add XDA labs link for Android client 2020-11-14 03:15:13 +01:00
ZeroNet
92363d5227
Add .onion address of zeronet.io website 2020-11-12 02:37:44 +01:00
ZeroNet
cf0c5db5b9
Replace Play Store link as Google removed it 2020-11-12 01:36:49 +01:00
Tamas Kocsis
57dda4e6d6 Rev4549, Fix startup OpenSSL lib find recursion error 2020-11-09 01:21:52 +01:00
Tamas Kocsis
0a3bf43e1c Rev4548 2020-11-03 02:53:15 +01:00
Tamas Kocsis
48455e3e45 Better cli test output 2020-11-03 02:52:50 +01:00
Tamas Kocsis
c515e26cd6 Fix OpenSSL dll/so location find patcher 2020-11-03 02:51:42 +01:00
Tamas Kocsis
5cac059ef4 Display warning if SSLCurve native load failed 2020-11-03 02:50:21 +01:00
Tamas Kocsis
8f6e27904c Display verification lib speedups on benchmark 2020-11-03 02:49:01 +01:00
Tamas Kocsis
e757f2a2d4 Display library versions at /Env url endpoint 2020-11-03 02:48:01 +01:00
Tamas Kocsis
b104d5dd41 Refactor /Stats page rendering to separate functions 2020-11-03 02:44:31 +01:00
ZeroNet
f6106be733
Merge pull request #2670 from TTTaevas/py3
Fix typo in French translation
2020-11-03 02:34:34 +01:00
Taevas
9305a2e7ac Fix typo in French translation 2020-11-02 15:29:29 +01:00
Jabba
4f6833c488
Complete list of binary extensions (#2661) 2020-10-29 03:24:35 +01:00
Tamas Kocsis
e2b1cf3938 Revert "Travis CI: Run a current version of pytest (#2648)"
This reverts commit 147dd4bc35.
2020-10-29 02:49:04 +01:00
Tamas Kocsis
6fcfe5b394 Rev4540, More proper bigfile filtering for OptionalFileList 2020-10-29 02:44:07 +01:00
Tamas Kocsis
12013d64c8 Rev4539, Add more extensions to UiFileManager binary files 2020-10-29 01:38:49 +01:00
Jabba
54fb2fde7c
Adding OGG MIME (#2657)
* Adding OGG MIME

Fixing: https://github.com/HelloZeroNet/ZeroNet/issues/2656

* Adding ova and ogv extensions
2020-10-26 15:59:42 +01:00
Christian Clauss
147dd4bc35
Travis CI: Run a current version of pytest (#2648)
Pytest 4 --> 6
2020-10-26 15:47:15 +01:00
Christian Clauss
32c9d5fa70
GitHub Actions upgrade to actions/checkout@v2 (#2650)
* GitHub Actions upgrade to actions/checkout@v2

* Update tests.yml
2020-10-26 15:41:46 +01:00
Ivanq
dd08b89c81 Make tests pass on Python 3.5 2020-10-13 20:36:39 +03:00
Ivanq
6770b450b3 Handle src/gevent/... paths 2020-10-13 19:50:08 +03:00
Ivanq
9b2772b171 Use more unique yet short paths for logging 2020-10-13 19:12:46 +03:00
shortcutme
29dac8a188
Rev4538 2020-10-12 13:12:44 +02:00
Ivanq
352da6bf62
Allow sites to request several CORS permissions at once (#2631)
* Allow sites to request several CORS permissions at once

* Fix typo

* Don't wait for CORS site download
2020-10-12 13:00:40 +02:00
Tamas Kocsis
288050e5b4 Rev4537, Add warning on invalid file save in UiFileManager 2020-10-06 17:10:58 +02:00
Tamas Kocsis
785d2351eb Rev4536, Fix Cors permission request for connecting site 2020-10-05 19:02:22 +02:00
Tamas Kocsis
9d1bed11af Rev4533 2020-10-01 17:18:43 +02:00
Tamas Kocsis
b2342e64bd Fix file manager listing height calculation for many files 2020-10-01 17:18:33 +02:00
Tamas Kocsis
dcbfb8afe0 Fix browse files link for not installed domain plugins 2020-10-01 17:17:51 +02:00
Jabba
4b8dfc5114
Update requirements.txt (#2617)
* Update requirements.txt

See: https://github.com/HelloZeroNet/ZeroNet/pull/2616#issuecomment-697341526
2020-09-24 18:24:24 +02:00
Tamas Kocsis
c0baf8b68d Version 0.7.2, Rev4528 2020-09-21 18:28:12 +02:00
Tamas Kocsis
73dc69605b Merge sidebar js, css 2020-09-21 18:26:55 +02:00
Tamas Kocsis
52ed8c18ca Add browse files link to sidebar 2020-09-21 18:26:44 +02:00
Tamas Kocsis
19bc0358b5 Merge UiFileManager js, css 2020-09-21 18:25:53 +02:00
Tamas Kocsis
392350ff79 Codemirror file editor for UiFileManager plugin 2020-09-21 18:25:38 +02:00
Tamas Kocsis
f0b0f57643 UiFileManager plugin 2020-09-21 18:23:28 +02:00
Tamas Kocsis
85790f8866 Check sites on ip change 2020-09-21 18:20:49 +02:00
Tamas Kocsis
ce5b4c3eda Don't mark non-existent files as bad 2020-09-18 18:45:42 +02:00
Tamas Kocsis
fde3b51129 Formatting 2020-09-18 18:44:42 +02:00
Tamas Kocsis
550d02d473 Retry site update if it failed last time 2020-09-18 18:44:28 +02:00
Tamas Kocsis
4da89580c1 Don't mark content.json as bad file if update failed 2020-09-18 18:43:54 +02:00
Tamas Kocsis
bf092b83ab Workaround for stuck iframe url in Firefox when using back button 2020-09-18 18:43:25 +02:00
Tamas Kocsis
0309b81695 SiteListModifiedFiles: Give error instead of exception if content file does not exist 2020-09-18 18:42:03 +02:00
Tamas Kocsis
e74fdc4036 Redirect homepage with / at the end 2020-09-09 18:29:53 +02:00
Tamas Kocsis
b9c65d75ef Move error log handler to config object to be able to catch plugin load errors 2020-09-09 18:29:24 +02:00
Tamas Kocsis
49f8e0bc3a Allow link to console tabs 2020-09-09 18:21:09 +02:00
Tamas Kocsis
c4f8c0177e Add mode to tracker announce logging 2020-09-08 19:36:54 +02:00
Tamas Kocsis
8c20927f68 Allow test port checker functions from CLI 2020-09-08 19:35:58 +02:00
Tamas Kocsis
5b09f7af41 New port checker: ipfingerprints.com, PortChecker minor rearranging 2020-09-08 19:35:23 +02:00
Tamas Kocsis
1695571afa Add browser-like header for port checker requests 2020-09-08 19:32:45 +02:00
Tamas Kocsis
8dc5aee8aa Js based redirecting template formatting 2020-09-08 19:32:10 +02:00
Tamas Kocsis
94765af0f3 Fix not downloaded site delete on startup 2020-09-08 19:28:41 +02:00
Tamas Kocsis
5a226baaa5 Reduce announce number for not recently added sites 2020-09-08 19:28:04 +02:00
Tamas Kocsis
b7bc197012 Only try to get more peers for timeout task if site is recently added 2020-09-08 19:26:18 +02:00
Tamas Kocsis
a0dfbe31f6 Add timeout for private key recover message 2020-09-06 17:06:26 +02:00
Tamas Kocsis
964545dd1f Remove unnecessary logging 2020-09-06 17:01:59 +02:00
Tamas Kocsis
817ab04941 Fix private key recover typo 2020-09-04 18:29:02 +02:00
Tamas Kocsis
91d0ce3a50 Save users.json on private key change 2020-09-04 18:21:52 +02:00
Tamas Kocsis
e97236201c Try to recover site privatekey from master seed when site owned switch enabled 2020-09-04 18:21:02 +02:00
Tamas Kocsis
e14f5bf847 Allow modified files query from non-admin sites 2020-09-04 18:15:56 +02:00
Tamas Kocsis
79f10ffe0c Return error when fileGet binary file 2020-09-04 18:15:16 +02:00
Tamas Kocsis
0bc9374a7d Optional stats to dirList websocket API command 2020-09-04 18:14:22 +02:00
Tamas Kocsis
8a71bf65cd Don't leak local path on delete error 2020-09-04 18:08:43 +02:00
Tamas Kocsis
9d198ff7f2 Display full path in 404 error instead of inner_path 2020-09-04 18:07:29 +02:00
Tamas Kocsis
cafeebf120 Fix wrapper_nonce adding to url 2020-09-04 18:07:03 +02:00
Tamas Kocsis
46fba195da Merge js, css 2020-09-04 17:57:56 +02:00
Tamas Kocsis
501bd51bd1 Only set title from content.json if wrapperSetTitle has not been called 2020-09-04 17:57:34 +02:00
Tamas Kocsis
f7874e1ca3 Fix loading bar hide bug 2020-09-04 17:56:16 +02:00
Tamas Kocsis
8d964d1b8e Fix infopanel overflow on mobile devices 2020-09-04 17:55:41 +02:00
Tamas Kocsis
051e404a80 Fix typo in Benchmark cli info success number 2020-09-04 17:52:26 +02:00
Tamas Kocsis
6c1abf4004 Don't switch to libev for newer versions of gevent 2020-09-04 17:49:23 +02:00
Tamas Kocsis
0907edb6b1 Remove obsolate auth_key generation 2020-09-04 17:35:48 +02:00
Tamas Kocsis
6ff14d1bbd Fix plugin config error when running update.py 2020-09-04 17:17:15 +02:00
Tamas Kocsis
4ad5c065f1 Don't display gui error when running from cli on Windows 2020-09-04 17:16:47 +02:00
ZeroNet
c17b8d53d3
Update changelog with 0.6.5, 0.7.0, 0.7.1 2020-09-03 16:56:41 +02:00
ZeroNet
9022a1098a
Merge pull request #2485 from geekless/sidebar-no-content-json
Allow opening the sidebar while content.json is not loaded
2020-07-20 18:00:09 +02:00
ZeroNet
6e758ff363
Merge pull request #2596 from shyam-sam/Dockerfile-arm64v8
arm64 arch docker image request #2568
2020-07-20 17:42:12 +02:00
SuperMan
29c3523353 arm64 arch docker image request #2568 2020-07-18 17:45:32 +05:30
shortcutme
47ff6c6801
Rev4496 2020-06-30 17:04:55 +02:00
shortcutme
6bd49e8aff
Fix killing greenlets gevent exception 2020-06-30 17:04:47 +02:00
shortcutme
ddbd5c7b19
Fix reset file server port with config web interface 2020-06-30 17:04:09 +02:00
shortcutme
635c3b27cd
Fix loading invalid site block list 2020-06-30 17:03:06 +02:00
shortcutme
6776dabdb3
Fix piecemap download error when invalid piecemap got downloaded 2020-06-30 17:02:39 +02:00
shortcutme
14cbaf47c8
Rev4493 2020-06-18 17:28:56 +02:00
shortcutme
4eb50377c3
Warning about deleting private key for owned sites 2020-06-18 17:23:15 +02:00
shortcutme
ea6016d004
Fix latest gevent compatibility 2020-06-18 17:22:45 +02:00
shortcutme
79d26060b3
Add site address hash to site info websocket response 2020-06-18 17:22:33 +02:00
shortcutme
97ad084c21
Ignore ipv6 tests if not supported by os 2020-06-18 17:22:08 +02:00
shortcutme
179e5cb651
Fix portchecker.co 2020-06-18 17:21:43 +02:00
ZeroNet
367745b5ea
Move Android Play store link next to download options 2020-06-16 18:25:58 +02:00
ZeroNet
5c38a78b79
Merge pull request #2573 from canewsin/patch-2
Added Android Play Store Link to Read Me
2020-06-16 18:24:48 +02:00
canewsin
a02ed56c69
Added Android Play Store Link to Read Me 2020-06-16 10:33:21 +05:30
ZeroNet
f868fed51d
Merge pull request #2558 from gqgs/ws-iterable
Avoid iterating in uninitialized result
2020-05-19 02:02:16 +02:00
Guilherme
e4f42b8ce3 Avoid iterating in uninitialized result 2020-05-11 11:51:10 -03:00
ZeroNet
eeb48fc72e
Merge pull request #2556 from gqgs/debug
Remove unnecessary debugger
2020-05-10 12:07:39 +02:00
Guilherme
85733abade Remove unnecessary debugger 2020-05-10 02:04:30 -03:00
shortcutme
8db4344171
Rev4486, Fix UiPassword cleanup error 2020-05-04 13:38:30 +02:00
ZeroNet
0a9391d28b
Merge pull request #2547 from anoadragon453/anoa/pluralize
Fix pluralize translation function
2020-05-03 17:51:26 +02:00
Andrew Morgan
cfef7ab071 Fix pluralize translation function 2020-05-03 14:31:20 +01:00
shortcutme
38c1727b94
Rev4485 2020-05-03 03:59:33 +02:00
shortcutme
36d96d484e
Workaround for UiPassword cookie issues with sandboxed iframes 2020-05-03 03:59:09 +02:00
shortcutme
439f8fc476
Fix UiPassword logout and session list url encoding 2020-05-03 03:57:17 +02:00
shortcutme
07faa3d6d3
Move wrapper necessary check to separate function 2020-05-03 03:56:06 +02:00
ZeroNet
3c7022ea9d
Merge pull request #2546 from anoadragon453/anoa/bigfile_seekable
Add missing seekable() class method to BigFile plugin
2020-05-02 11:23:30 +02:00
Andrew Morgan
a657afcd47 Add missing seekable() class method to BigFile plugin 2020-05-01 18:24:47 +01:00
ZeroNet
f3a839f422
Require final gevent 1.5.0 for Python 3.8 2020-04-12 12:15:10 +02:00
shortcutme
ad3920b26a
Rev4478, Skip slow updated files checking with large content.json 2020-04-11 13:34:18 +02:00
ZeroNet
8ffd8d7a3e
Merge pull request #2510 from filips123/patch-1
Use Gevent prerelease for Python 3.8
2020-04-08 21:38:43 +02:00
Filip Š
71001491df
Use Gevent prerelease for Python 3.8 2020-04-08 17:05:39 +02:00
ZeroNet
701765b53b
Merge pull request #2496 from canewsin/patch-2
Update LICENSE
2020-04-01 17:54:47 +02:00
ZeroNet
fa880d99f1
Merge pull request #2503 from imachug/compressed-keys
Support compressed keys
2020-03-31 00:48:28 +02:00
Ivanq
0a9a9b5a57 Support compressed keys 2020-03-30 09:40:06 +03:00
shortcutme
56acac8cd3
Rev4473, Fix Merger site skipping content load to db for some seconds after new site added 2020-03-25 04:13:16 +01:00
ZeroNet
995d3bf717
Merge pull request #2495 from pataquets/patch-1
Readme: Add Docker image info and docker pulls badge
2020-03-25 00:37:16 +01:00
canewsin
1de7485858
Update LICENSE 2020-03-25 03:30:41 +05:30
Alfonso Montero
e1c0fd6984
Readme: Add Docker image info and docker pulls badge 2020-03-24 21:16:33 +01:00
ZeroNet
108a3de433
Update Dockerfile 2020-03-24 02:26:54 +01:00
ZeroNet
740fe65355
Update Dockerfile 2020-03-24 02:09:57 +01:00
ZeroNet
abde3d4cf7
Update Dockerfile 2020-03-24 01:58:33 +01:00
ZeroNet
c90c887f8f
Merge pull request #2491 from imachug/import-sslcrypto
Import sslcrypto from lib
2020-03-21 21:50:35 +01:00
Ivanq
a4d91f7081 Import sslcrypto from lib 2020-03-21 22:52:56 +03:00
shortcutme
31d4304915
Rev4471, Allow files start with dot 2020-03-21 19:51:44 +01:00
shortcutme
1eec388252
Rev4469 2020-03-20 18:53:25 +01:00
shortcutme
70de3213d6
Fix peer save dictionary changed error 2020-03-20 18:52:58 +01:00
shortcutme
f41d022038
Log BrokenPipeError as warning 2020-03-20 18:52:18 +01:00
shortcutme
723d1f4370
Rev4467 2020-03-18 03:21:14 +01:00
shortcutme
ca94703fc3
Fix tray icon destroy overflow exception 2020-03-18 03:21:00 +01:00
shortcutme
a5971adbe6
Add data_dir to example UiConfig tracker list 2020-03-18 03:19:01 +01:00
ZeroNet
dfeebbabe8
Merge pull request #2487 from imachug/gevent-ws-fix2
Update gevent-ws to v2.0.7 to fix werkzeug
2020-03-17 21:49:54 +01:00
Ivanq
66194ce435 Update gevent-ws to v2.0.7 to fix werkzeug 2020-03-17 23:48:36 +03:00
Vadim Ushakov
2de3c9a544 Allow opening the sidebar while content.json is not loaded
If one opens the sidebar of a site not being downloaded yet, the following error occurs:

```
  Internal error: KeyError('content.json',): 'content.json'
  UiWebsocket.py line 79 > 235 > Sidebar/SidebarPlugin.py line 527 > 120 > ContentDbDict.py line 59
```

Also, the sidebar is not visible.

This fixes both issues.

For sites without peers, the only way to delete the site was to navigate to ZeroHello, scroll the left panel to "Connecting sites", and delete the site from the list. Now those sites can be deleted from the sidebar.
2020-03-17 23:09:40 +07:00
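The guard amounts to tolerating a missing content.json (hypothetical sketch of the pattern; the actual fix touches Sidebar/SidebarPlugin.py and ContentDbDict.py):
```
# Minimal sketch: a not-yet-downloaded site has no content.json, so the
# sidebar should render placeholders instead of raising KeyError.
def get_root_content(site):
    try:
        return site.content_manager.contents["content.json"]
    except KeyError:
        return None  # sidebar still opens; site delete stays available
```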
shortcutme
5fb342a825
Change to GPLv3 license
Based on https://github.com/HelloZeroNet/ZeroNet/issues/2273
2020-03-17 14:48:24 +01:00
ZeroNet
3156d2f94b
Merge pull request #2483 from imachug/gevent-ws-fix
Potential fix of BrokenPipeError
2020-03-17 14:25:11 +01:00
Ivanq
ba156bbdec Potential fix of BrokenPipeError 2020-03-17 07:54:56 +03:00
ZeroNet
6beb76eac8
Merge pull request #2482 from imachug/gevent-ws-insensitive
Upgrade gevent-ws to v2.0.5
2020-03-16 21:36:43 +01:00
Ivanq
d3d18234df Upgrade gevent-ws to v2.0.5 2020-03-16 20:50:10 +03:00
ZeroNet
faa24a8b41
Merge pull request #2478 from imachug/sslcrypto-weird
Switch to sslcrypto v4.0 to support OpenSSL without builtin curves
2020-03-16 17:43:30 +01:00
ZeroNet
f749228a2c
Merge pull request #2475 from imachug/gevent-ws-fix2
Disable process_result on websocket requests
2020-03-16 16:16:47 +01:00
Ivanq
7e17a4e967 Switch to sslcrypto v4.0 to support OpenSSL without builtin curves 2020-03-15 20:18:04 +03:00
Ivanq
19f003141b Disable process_result on websocket requests 2020-03-14 07:27:19 +03:00
ZeroNet
53a6063576
Merge pull request #2471 from imachug/patch-1
Search for any OpenSSL version in LD_LIBRARY_PATH
2020-03-10 21:34:39 +01:00
Ivanq
33af83b2cd
Search for any OpenSSL version in LD_LIBRARY_PATH 2020-03-10 22:31:26 +03:00
ZeroNet
3426d5fe63
Merge pull request #2466 from imachug/websocket
Fix websocket_client compatibility
2020-03-09 15:54:54 +01:00
ZeroNet
f2934c10b4
Merge pull request #2463 from canewsin/patch-1
Added Github Action Test Badge to ReadMe
2020-03-09 15:51:13 +01:00
Ivanq
a2457b2488 Forgot that Upgrade is case-insensitive 2020-03-09 11:06:35 +03:00
canewsin
193632c3f9
Added Github Action Test Badge to ReadMe 2020-03-07 17:30:27 +05:30
ZeroNet
a1c176bb3f
Merge pull request #2459 from imachug/github-actions
Add GitHub Actions workflow
2020-03-05 21:16:13 +01:00
Ivanq
02fd1dc4d0 Add GitHub Actions workflow 2020-03-05 23:03:23 +03:00
ZeroNet
296e4aab57
Fix sslcrypto thread safety (#2454)
* Use sslcrypto instead of pyelliptic and pybitcointools

* Fix CryptMessage

* Support Python 3.4

* Fix user creation

* Get rid of pyelliptic and pybitcointools

* Fix typo

* Delete test file

* Add sslcrypto to tree

* Update sslcrypto

* Add pyaes to src/lib

* Fix typo in tests

* Update sslcrypto version

* Use privatekey_bin instead of privatekey for bytes objects

* Fix sslcrypto

* Fix Benchmark plugin

* Don't calculate the same thing twice

* Only import sslcrypto once

* Handle fallback sslcrypto implementation during tests

* Fix sslcrypto fallback implementation selection

* Fix thread safety

* Add derivation

* Bring split back

* Fix typo

* v3.3

* Fix custom OpenSSL discovery
2020-03-05 17:54:46 +01:00
ZeroNet
7ba2c9344d
Merge pull request #2457 from imachug/segfault
Make ThreadPool a context manager to prevent memory leaks
2020-03-05 10:45:14 +01:00
Ivanq
09e65e1d95 Make ThreadPool a context manager to prevent memory leaks 2020-03-05 08:06:57 +03:00
shortcutme
c4f65a5d7b
Rev4462, Experimental fix for segfault on shutdown 2020-03-04 21:50:28 +01:00
ZeroNet
37a401fdef
Merge pull request #2449 from krzotr/polish-translation
Polish translation
2020-03-04 18:14:06 +01:00
ZeroNet
e7d1e1f097
Merge pull request #2455 from zyw271828/py3
Improve README-zh-cn.md according to latest README.md
2020-03-03 10:36:51 +01:00
zyw271828
6df3036f11
Improve README-zh-cn.md according to latest README.md 2020-03-03 13:12:54 +08:00
zyw271828
e2a582d892
Update "How can I create a ZeroNet site" section of README-zh-cn.md 2020-03-03 12:54:41 +08:00
zyw271828
aaabcb6b1a
Update "How to join" section of README-zh-cn.md 2020-03-03 12:48:18 +08:00
ZeroNet
7bf790003e
Merge pull request #2453 from krzotr/patch-7
Fixed `Cache-Control` for .js and .css files - 10 minutes cache
2020-03-02 18:25:16 +01:00
shortcutme
f46b945cdc
Rev4461 2020-03-02 17:09:21 +01:00
shortcutme
27761c5045
Fix merger site updating 2020-03-02 17:09:13 +01:00
shortcutme
e0bf4dc9ec
Skip announcing to trackers with unsupported address 2020-03-02 17:08:43 +01:00
shortcutme
1fc67a3d71
Rev4460, Fix mergersite update on slow storage 2020-03-02 16:44:34 +01:00
krzotr
5baacf963d
Fixed Cache-Control for .js and .css files 2020-02-29 00:51:41 +01:00
Krzysztof Otręba
b790bcac9b Polish translation 2020-02-28 01:24:44 +01:00
Ivanq
219b90668f
Switch from gevent-websocket to gevent-ws (#2439)
* Switch from gevent-websocket to gevent-ws

* Return error handling, add gevent_ws source to lib
2020-02-28 01:20:04 +01:00
krzotr
2862587c15
Fixed "LookupError: 'hex' is not a text encoding" on /StatsBootstrapper page (#2442)
* Fixed "LookupError: 'hex' is not a text encoding"

* Fixed  KeyError: 'ip4'
2020-02-27 00:48:26 +01:00
shortcutme
6218a92895
Rev4458 2020-02-25 16:47:28 +01:00
shortcutme
58f03e21ef
Change unreliable trackers 2020-02-25 16:47:04 +01:00
shortcutme
b85477787d
Workaround for Tor utf8 cookie file path encoding bug on Windows 2020-02-25 16:46:21 +01:00
shortcutme
6a1235bd45
Remove old Gevent RLock support 2020-02-25 16:45:55 +01:00
ZeroNet
33d6a9c402
Merge pull request #2438 from imachug/websocket
Avoid code duplication in bigfileUploadInit
2020-02-24 17:56:14 +01:00
Ivanq
17f65a5179 Avoid code duplication 2020-02-24 19:19:35 +03:00
Ivanq
f8e2cbe429
Allow uploading files via websocket (#2437)
* Allow uploading files via websocket

* Fix
2020-02-24 13:46:01 +01:00
shortcutme
f0a706f6ab
Rev4455, Fix new sites file downloading 2020-02-21 13:58:11 +01:00
shortcutme
8b994e42c2
Rev4452 2020-02-20 17:27:50 +01:00
shortcutme
ae9a76a6c9
Fix double sites.json loading on startup when adding missing sites 2020-02-20 17:27:31 +01:00
shortcutme
9b85d8638d
Don't allow site API calls to run while the site is being deleted 2020-02-20 17:25:56 +01:00
shortcutme
a9c75a3146
Fix start dir parsing for command line and better description 2020-02-20 17:25:06 +01:00
shortcutme
1cc0ec3f31
Independently configurable OpenSSL lib/bin file 2020-02-20 17:23:00 +01:00
shortcutme
b1819ff71d
Fix trayicon autostart script duplicated arguments 2020-02-20 17:19:16 +01:00
shortcutme
fca1033f83
Fix trayicon auto start script write/read with utf8 path 2020-02-20 17:18:59 +01:00
ZeroNet
32855d0479
Merge pull request #2426 from canewsin/patch-2
Added Custom Openssl Path for Native Clients and start_dir config
2020-02-19 17:20:24 +01:00
shortcutme
2c826eba2d
Rev4447, Fix Msgpack 1.0.0 compatibility 2020-02-19 16:48:14 +01:00
canewsin
8facd9ff84 Added Custom Openssl Path for Native Clients and start_dir config
This parameter is helpful where the openssl path is not always fixed; it can also reduce code verbosity by letting defaults like these be provided as a parameter:

```
if sys.platform.startswith("win"):
    self.openssl_bin = "tools\\openssl\\openssl.exe"
elif config.dist_type.startswith("bundle_linux"):
    self.openssl_bin = "../runtime/bin/openssl"
else:
    self.openssl_bin = "openssl"
```
Also added a custom start_dir config option, because the "./" path is not valid on Android: files loaded via the provided path fail to load on some systems, such as the Android client.

For a more detailed conversation, see pull request [#2422](https://github.com/HelloZeroNet/ZeroNet/pull/2422)
2020-02-18 23:09:16 +05:30
shortcutme
64e5e0c80e
Rev445, Fix and test random fail in CryptMessage decrypt 2020-02-18 15:28:14 +01:00
shortcutme
8aa4e27938
Rev4411 2020-02-13 17:26:29 +01:00
shortcutme
bc76bf291a
Fix site blocklist with address hash based blocking and move checking to server-side 2020-02-13 17:26:15 +01:00
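Hash-based blocking can be sketched in a few lines (hypothetical sketch; the hash function the real ContentFilter plugin uses may differ):
```
# Minimal sketch: store only hashes of blocked addresses so the
# blocklist itself does not disclose which sites are blocked.
import hashlib

def is_blocked(address, blocked_hashes):
    address_hash = hashlib.sha256(address.encode()).hexdigest()
    return address_hash in blocked_hashes
```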
shortcutme
70cc982e2e
Log actual disabled function for multiuser plugin 2020-02-13 17:24:59 +01:00
shortcutme
61ac6a30d3
Fix loading blocked raw sites 2020-02-13 17:24:22 +01:00
shortcutme
d2627f36d5
Pass all arguments on site need 2020-02-13 17:23:37 +01:00
shortcutme
d36324e0d3
More detailed info on http host error 2020-02-13 17:23:00 +01:00
shortcutme
113b57415f
More detailed info on origin error 2020-02-13 17:22:37 +01:00
shortcutme
fefd2474b1
Don't reload sites on listing 2020-02-13 17:22:09 +01:00
tangdou1
28ce08de8e
Update zh.json (#2413)
* Update zh.json

* Update zh.json

* Update zh.json
2020-02-11 16:12:06 +01:00
shortcutme
037f0a3ff4
Rev4404 2020-02-07 16:43:23 +01:00
shortcutme
a3546d56b0
Merge js 2020-02-07 16:42:26 +01:00
shortcutme
95bf4ecb42
Read 5MB of logs for non-default console tabs 2020-02-07 16:40:27 +01:00
shortcutme
c91f2f0a09
Move all optional file download to separate button on sidebar 2020-02-07 16:40:04 +01:00
shortcutme
6d425f30fe
Stop checkconnections with connectionserver 2020-02-07 16:38:42 +01:00
shortcutme
8e79a7da63
Fix incomplete loading of dbschema.json 2020-02-07 16:37:37 +01:00
shortcutme
10c02c31c2
Rev4401 2020-01-28 16:59:03 +01:00
shortcutme
a0f5e1bde8
Fix translations 2020-01-28 16:58:46 +01:00
shortcutme
2e9cff928c
Skip commit if already committing 2020-01-28 16:58:14 +01:00
shortcutme
46210b2f04
Use peer ip in peer exchange if no active connection 2020-01-28 16:57:20 +01:00
shortcutme
6dae187e22
More detailed logging on write error 2020-01-28 16:56:35 +01:00
ZeroNet
a7e783a26b
Merge pull request #2403 from eduaddad/patch-5
CONFIGURATION ITEM VALUE CHANGED - should work now
2020-01-28 16:50:17 +01:00
ZeroNet
60af3ceda9
Merge pull request #2404 from eduaddad/patch-6
added the Save as .Zip translation to Brazilian Portuguese
2020-01-28 16:49:08 +01:00
shortcutme
11415fe082
Log mock ws caller to get more detail on random test fail 2020-01-24 16:05:19 +01:00
ZeroNet
df93fa0ffe
Add PGP public key link 2020-01-24 14:58:37 +01:00
Eduaddad
849d514f28
added to translation Save as .Zip
2020-01-22 14:00:19 -03:00
Eduaddad
4d8ee4bafb
CONFIGURATION ITEM VALUE CHANGED - should work now
2020-01-22 13:54:19 -03:00
shortcutme
ac8aaaff75
Rev4399 2020-01-22 16:37:48 +01:00
shortcutme
238ede9419
Only correct time if we have at least 9 connected peers 2020-01-22 16:37:07 +01:00
shortcutme
835174270e
Less wait for closing cursors 2020-01-22 16:36:52 +01:00
shortcutme
62a2ec7254
Make sure to commit before vacuum 2020-01-22 16:36:33 +01:00
shortcutme
a9368bb3c8
Don't allow parallel sites.json loading 2020-01-22 16:35:40 +01:00
shortcutme
e75e199334
Fix multi-line log events display in web console 2020-01-22 16:33:54 +01:00
shortcutme
2b7aebd89d
Fix optional file loading when sites.json load takes more than 1 sec 2020-01-22 16:33:30 +01:00
shortcutme
3e08eabc86
Proper error when piecemap download fails 2020-01-22 16:31:35 +01:00
shortcutme
a16d55c863
Fix compatibility with Snap package 2020-01-22 16:31:09 +01:00
ZeroNet
e51ae580b9
Merge pull request #2396 from eduaddad/patch-4
Update pt-br.json
2020-01-22 16:27:43 +01:00
ZeroNet
914576b9db
Change to full PGP fingerprint 2020-01-19 13:30:22 +01:00
Eduaddad
3edb34ec56
Update pt-br.json
more translations
2020-01-15 00:53:19 -03:00
shortcutme
224093b3dd
Rev4397, Fix big file invalid path errors 2020-01-09 16:35:05 +01:00
Ivanq
77c3e43978 Detect content encoding based on query string (#2385) 2020-01-07 10:34:14 +01:00
shortcutme
03350d7454
Rev4394 2020-01-04 16:56:42 +01:00
shortcutme
2b5e57e840
Fix updating deleted site in contentdb 2020-01-04 16:55:56 +01:00
shortcutme
39442977db
Thread safe access and request log updating in optionalmanager 2020-01-04 16:55:40 +01:00
shortcutme
0af90aad37
Maxmind db as download source no longer works 2020-01-04 16:55:08 +01:00
shortcutme
c5d51c9cab
Verify cert in separate function 2020-01-04 16:54:34 +01:00
shortcutme
0dbcec8092
Merge wrapper 2020-01-04 16:54:20 +01:00
shortcutme
76e4b75c2d
Fix removing loading screen without loaded content 2020-01-04 16:54:13 +01:00
shortcutme
c1ad7914f1
Always update loading screen site too large message with site info received 2020-01-04 16:53:49 +01:00
shortcutme
9085a4b0cc
Less frequent update of progress bar 2020-01-04 16:53:11 +01:00
shortcutme
820346c98d
More logging to wrapper 2020-01-04 16:52:51 +01:00
shortcutme
995d87c167
Don't add escaping iframe message for link without target=_top 2020-01-04 16:52:18 +01:00
shortcutme
fe739fa848
Log tasks with larger priority 2020-01-04 16:48:56 +01:00
shortcutme
b6d0bf8f6b
Use msvcrt 110 and 120 when 110 is not available 2020-01-04 16:48:37 +01:00
ZeroNet
aec1ab4ed2
Merge pull request #2378 from rllola/zeroname_doc_update
Update Zeroname updater documentation
2020-01-02 17:54:12 +01:00
rllola
7d5f3354b6 Update README; 'valueencoding' configuration required for rpc call; 2020-01-02 11:32:47 +01:00
shortcutme
feb58e4b0e
Rev4382, Fix is_prev_builtin startup error 2019-12-31 18:15:17 +01:00
shortcutme
163825c03e
Rev4381 2019-12-31 12:56:10 +01:00
shortcutme
3fc80f834d
New tests for worker task manager 2019-12-31 12:55:09 +01:00
shortcutme
20b0db7ddb
Thread safe task remove in failTask 2019-12-31 12:54:45 +01:00
shortcutme
b2e7cbb927
Refactor task adding with less locking 2019-12-31 12:51:52 +01:00
shortcutme
5987274edf
Name task adding lock 2019-12-31 12:50:39 +01:00
shortcutme
ba218974c4
Task remove optimization 2019-12-31 12:50:21 +01:00
shortcutme
721d4a22f1
Remove unnecessary log from worker task manager 2019-12-31 12:49:59 +01:00
shortcutme
32b0153d34
Log site address with getfile error 2019-12-31 12:46:01 +01:00
shortcutme
71d32d7414
Less slow query logging 2019-12-31 12:45:36 +01:00
shortcutme
796ee572ce
Fix verify invalid json 2019-12-31 12:44:47 +01:00
shortcutme
60146a083c
Fix ui_websocket test result with None 2019-12-21 03:30:27 +01:00
shortcutme
df87bd41b4
Log WsMock sent data itself to figure out random Crypt test fail 2019-12-21 03:22:37 +01:00
shortcutme
3d73599deb
Don't retry bad files also in big file tests 2019-12-21 03:21:38 +01:00
shortcutme
48124e12d9
Rev4372 2019-12-21 03:05:49 +01:00
shortcutme
17fb740c51
Don't try to download bad files again in tests to avoid random test fails 2019-12-21 03:05:19 +01:00
shortcutme
c6b07f1294
Wait until checkmodification spawned pools finish 2019-12-21 03:04:36 +01:00
shortcutme
3ccce46314
Wait until downloadContent pool finishes 2019-12-21 03:03:49 +01:00
shortcutme
7c1da5da52
Ability to disable bad file retry at end of download 2019-12-21 03:03:32 +01:00
shortcutme
c5de1447c8
onComplete will be triggered by WorkerManager 2019-12-21 03:02:53 +01:00
shortcutme
e16ace433c
Better logging in site download content 2019-12-21 03:02:36 +01:00
shortcutme
975f53b95b
New logging format for tests 2019-12-21 03:01:45 +01:00
shortcutme
8a994b5559
Ask before UiWebsocket server shutdown action 2019-12-21 02:59:50 +01:00
shortcutme
2acf24c336
Fix ipv4 checking regexp 2019-12-21 02:59:18 +01:00
shortcutme
2c3f1ba7ad
Check if all task are complete on fail task 2019-12-21 02:59:04 +01:00
shortcutme
c01245a4e0
Log task fail 2019-12-21 02:58:48 +01:00
shortcutme
f119f7d0d2
Use faster and thread safe way to re-sort tasks 2019-12-21 02:58:35 +01:00
shortcutme
62d4edadf6
Fail task if no peer left to try 2019-12-21 02:57:53 +01:00
shortcutme
8bf17d3a69
Add reason for Worker actions 2019-12-21 02:57:25 +01:00
shortcutme
0881e274a9
Log lock waits for task adding in WorkerManager 2019-12-21 02:56:42 +01:00
shortcutme
7ca09ba75b
Fix updating key 0 in WorkerTaskManager 2019-12-21 02:55:22 +01:00
ZeroNet
bde8b30d5c
Upload logs after failure, remove sometimes failing coverage check 2019-12-20 17:32:49 +01:00
shortcutme
87d1c736e2
Fix log printing typo 2019-12-19 02:45:00 +01:00
shortcutme
eba81cc7d2
Print logs from subdirs 2019-12-19 02:36:36 +01:00
shortcutme
69eb831c7e
Rev4361 2019-12-19 02:17:21 +01:00
shortcutme
99e6326974
More compact stack logging 2019-12-19 02:17:13 +01:00
shortcutme
50bbe47bf2
Better logging on file update 2019-12-19 02:17:00 +01:00
shortcutme
8bfef12ad4
Don't try to pack unknown peer addresses 2019-12-19 02:16:41 +01:00
ZeroNet
6085cfd1a7
Merge pull request #2364 from Zaefarani/patch-1
Add Farsi (Persian) Translation to ZeroNet
2019-12-18 19:29:26 +01:00
shortcutme
d660a268e8
Rev4360 2019-12-18 16:43:58 +01:00
shortcutme
c161140a90
Add locking for db cursor 2019-12-18 16:43:46 +01:00
shortcutme
7af8d1cd93
Save last lock time 2019-12-18 16:42:47 +01:00
shortcutme
845b50915d
Rev4358 2019-12-18 15:32:50 +01:00
shortcutme
dbbad3097c
Add segfault catcher, log plugins to separate directory 2019-12-18 15:32:42 +01:00
shortcutme
1abaa6fddc
Benchmark only exits when running as test from cli 2019-12-18 15:24:40 +01:00
shortcutme
7ecf09a496
Allow changing test log dir with environment variable 2019-12-18 15:24:05 +01:00
shortcutme
c0639fef75
Lock task adding to avoid race condition when getFileInfo switches 2019-12-18 15:23:16 +01:00
ZeroNet
c08d266822
Change to more simple way to create new site 2019-12-18 14:55:37 +01:00
Hamid reza Zaefarani
6bc3c168c6
Merge pull request #1 from decentralizedauthority/patch-1
Rename fa,json to fa.json
2019-12-18 14:45:00 +03:30
Hamid reza Zaefarani
1fe7127082
Rename fa,json to fa.json 2019-12-18 14:35:54 +03:30
Decentralized Authority
cfaaaf57ec
Rename fa,json to fa.json 2019-12-18 10:59:48 +00:00
shortcutme
93d2ee65fe
Validate json files in src and plugins dir 2019-12-17 21:30:01 +01:00
shortcutme
9c08e41b9e
Rev4355 2019-12-17 21:03:01 +01:00
shortcutme
abee87bbec
Wait for threadpool kill with 1s timeout to fix memory leak test 2019-12-17 21:02:48 +01:00
shortcutme
d4b6f79746
Display logs after failure 2019-12-17 21:01:04 +01:00
shortcutme
a7c26f893f
Rev4354 2019-12-17 20:46:29 +01:00
Hamid reza Zaefarani
24b8cdf87a
Add Farsi (Persian) Translation to ZeroNet
Persian Translation of ZeroNet Site
2019-12-17 23:15:51 +03:30
shortcutme
fd43aa61ef
Current gevent in PyPI is not fully compatible with Python 3.8 2019-12-17 20:37:32 +01:00
shortcutme
77869830c5
Fix shutdown hang 2019-12-17 20:36:52 +01:00
shortcutme
87fc8ced5e
Accept only my exception when testing Noparallel 2019-12-17 16:06:13 +01:00
ZeroNet
909967629b
Remove incompatible tests 2019-12-17 15:34:43 +01:00
ZeroNet
afe0d82f18
Run only selected benchmark tests 2019-12-17 15:25:46 +01:00
shortcutme
1ad97a6696
Run internal test on CI 2019-12-17 15:16:23 +01:00
shortcutme
e7e8e59c1e
Rev4353 2019-12-17 15:08:42 +01:00
shortcutme
f3665b172f
Avoid unnecessary pool call 2019-12-17 15:07:32 +01:00
shortcutme
23b3cd3986
Better rebuild log message 2019-12-17 15:07:00 +01:00
shortcutme
f7ee6744af
Wait for Db busy event in getDb 2019-12-17 15:06:36 +01:00
shortcutme
ac45217816
Add reason for db close and rebuilds 2019-12-17 15:05:59 +01:00
shortcutme
8c51e81a0b
Fix double opening of dbs 2019-12-17 15:05:21 +01:00
shortcutme
9d777951dd
Fix console tabs display glitch on Edge 2019-12-17 15:03:04 +01:00
shortcutme
2778b17f8d
Ignore trayicon destroy errors 2019-12-17 15:02:39 +01:00
shortcutme
98c98fbac7
Thread safe method to create directory for db 2019-12-17 15:02:18 +01:00
shortcutme
9b1f6337c3
Wait for cursor finish on db close 2019-12-17 15:02:04 +01:00
shortcutme
2019093431
Fix testing on slower storage 2019-12-17 15:01:15 +01:00
shortcutme
eac25caf28
Log packing peer errors as debug 2019-12-17 15:00:23 +01:00
shortcutme
b421893434
Return timer greenlet 2019-12-17 15:00:09 +01:00
shortcutme
f1b19f5fc7
Fix DbQuery logging 2019-12-17 14:59:54 +01:00
shortcutme
61f1a741fc
Test main loop caller 2019-12-17 14:52:58 +01:00
shortcutme
f01d335835
Test noparallel multi thread compatibility 2019-12-17 14:52:13 +01:00
shortcutme
5c1b34387c
Noparallel multi thread compatibility 2019-12-17 14:51:57 +01:00
shortcutme
dfd55c3957
Fix memory leak when using sleep in threads 2019-12-17 14:50:38 +01:00
shortcutme
b21895fa78
Kill threadpool properly 2019-12-17 14:50:10 +01:00
shortcutme
495d695c5a
Fix threadpool apply and spawn when threadpool is full 2019-12-17 14:49:50 +01:00
shortcutme
3309489c24
Only call the function in separate thread when in the main loop 2019-12-17 14:48:11 +01:00
shortcutme
8a5a75e68f
Allow pass calls to the main loop 2019-12-17 14:47:27 +01:00
shortcutme
c1df78b97f
Name threadpools 2019-12-17 14:43:33 +01:00
shortcutme
4c31aae97b
Refactor worker, fix concurrent write errors 2019-12-17 14:42:33 +01:00
shortcutme
0839fdfc5e
Add reason for db close 2019-12-17 14:35:49 +01:00
shortcutme
d062f01127
Log temp site events under different name 2019-12-17 14:34:53 +01:00
shortcutme
e91fb90a45
Fix tests when running for long time 2019-12-17 14:34:29 +01:00
shortcutme
6539ca5eb0
Log spy actions to file when running tests 2019-12-17 14:33:06 +01:00
shortcutme
b138ebc519
Capture fd for pytest 2019-12-17 14:32:43 +01:00
shortcutme
79c1cd15ab
Use libev when running test 2019-12-17 14:32:17 +01:00
shortcutme
10c1986c54
Fix site list changing during listing 2019-12-17 14:31:55 +01:00
shortcutme
d7cabb47ca
Log task numbers on content.json start 2019-12-17 14:31:41 +01:00
shortcutme
8de1714f08
Fix onComplete call when download ends 2019-12-17 14:31:12 +01:00
shortcutme
20ba9cd589
Log site download time 2019-12-17 14:30:29 +01:00
shortcutme
af1ac9bce8
Try to find already running task for file before start a new one 2019-12-17 14:30:14 +01:00
shortcutme
31a6e3ee9a
Don't allow clone to run in parallel 2019-12-17 14:29:48 +01:00
shortcutme
dca1dcdd2d
Always use active connection in DbCursor 2019-12-17 14:28:52 +01:00
shortcutme
a54f5f3e9f
Change trackers to more stable ones 2019-12-17 14:26:14 +01:00
shortcutme
51f49cd45a
Always use libev if possible 2019-12-17 14:25:04 +01:00
shortcutme
eb63eb7b1d
Log startup errors in log file 2019-12-17 14:24:44 +01:00
shortcutme
b4f7e51e96
Limit stack size on formatting 2019-12-17 14:24:08 +01:00
shortcutme
c2d2189039
Log content init failed as info 2019-12-17 14:23:47 +01:00
shortcutme
1eda3258de
Always raise error on verify error 2019-12-17 14:23:31 +01:00
shortcutme
0171cb0844
Avoid getting db_inner_path for every file on signing 2019-12-17 14:23:18 +01:00
shortcutme
08a0a63631
Create ssl contexts only once 2019-12-17 14:22:29 +01:00
shortcutme
8ed7d0385d
If possible use loaded db to get db file inner_path 2019-12-17 14:21:47 +01:00
shortcutme
02d45e9c39
Use separate threadpool for batch site storage operations 2019-12-17 14:20:49 +01:00
shortcutme
2a402a0674
Use thread-safe mode to create directories 2019-12-17 14:18:54 +01:00
shortcutme
1be56b5a39
Return exit code 1 if any test failed 2019-12-17 14:10:42 +01:00
shortcutme
1e175bc41f
Remove used cursors from benchmark db test 2019-12-17 14:10:05 +01:00
ZeroNet
c16569a6ab
Merge pull request #2363 from GiganticBlackBear/py3
Update hu.json
2019-12-16 16:45:48 +01:00
Gigantic Black Bear
d19cc64611
Update hu.json 2019-12-16 15:19:42 +00:00
shortcutme
958882c1c5
Revert "Switch to sslcrypto for cryptography tasks (#2338)"
This reverts commit fbc7b6fc4f.
2019-12-15 18:30:42 +01:00
ZeroNet
2f7323043f
Merge pull request #2358 from imachug/bencode
Switch to bencode_open
2019-12-15 12:49:04 +01:00
Ivanq
fbc7b6fc4f Switch to sslcrypto for cryptography tasks (#2338)
* Use sslcrypto instead of pyelliptic and pybitcointools

* Fix CryptMessage

* Support Python 3.4

* Fix user creation

* Get rid of pyelliptic and pybitcointools

* Fix typo

* Delete test file

* Add sslcrypto to tree

* Update sslcrypto

* Add pyaes to src/lib

* Fix typo in tests

* Update sslcrypto version

* Use privatekey_bin instead of privatekey for bytes objects

* Fix sslcrypto

* Fix Benchmark plugin

* Don't calculate the same thing twice

* Only import sslcrypto once

* Handle fallback sslcrypto implementation during tests

* Fix sslcrypto fallback implementation selection
2019-12-15 12:46:06 +01:00
Ivanq
3178b69172 Switch to bencode_open 2019-12-12 17:46:16 +03:00
shortcutme
28fcf3c1ea
Rev4327 2019-12-11 20:04:50 +01:00
shortcutme
71939097b0
Make execution order test more predictable 2019-12-11 20:04:39 +01:00
shortcutme
2fd337bb55
Add wasm content type 2019-12-11 20:03:28 +01:00
shortcutme
5e26161e84
Rev4325 2019-12-04 17:16:08 +01:00
shortcutme
04ecb89e9a
Avoid sending too many publish requests to an outdated client 2019-12-04 17:15:42 +01:00
shortcutme
23f851343f
Fix exception when params is an iterator 2019-12-04 17:15:08 +01:00
shortcutme
5ce1782d05
Change journal and foreign keys mode on db connect 2019-12-04 17:14:50 +01:00
shortcutme
daee14533c
Fix site number changes when data collected for stats 2019-12-04 17:14:04 +01:00
ZeroNet
31f505b309
Merge pull request #2339 from ethernetcat/py3
Update jp.json
2019-12-04 15:40:51 +01:00
shortcutme
c8214bf3ea
Fix threadpool test premature end on some platforms 2019-12-04 12:47:47 +01:00
shortcutme
1935a69c04
Add session based log disable at test 2019-12-04 12:46:44 +01:00
shortcutme
ea5f64bfea
Only log at start of the test cases 2019-12-04 12:46:13 +01:00
shortcutme
3dd04b27de
Correct invalid UiConfig pt-br json file 2019-12-04 11:03:45 +01:00
ZeroNet
9940b7bff3
Merge pull request #2334 from eduaddad/patch-3
Translation update for latest changes
2019-12-04 11:00:11 +01:00
ethernetcat
901ccf2d14 Update jp.json 2019-12-04 17:52:33 +09:00
Eduaddad
6a1a821ed4
Translation update for latest changes
2019-11-30 12:04:25 -03:00
shortcutme
aa9fe09337
Remove unnecessary line from config 2019-11-30 02:19:18 +01:00
shortcutme
bdb655243f
Rev4322 2019-11-30 02:16:29 +01:00
shortcutme
566c29363f
Slower progress bar animation 2019-11-30 02:15:17 +01:00
shortcutme
37b8c0241f
Make db thread count modifiable in config interface 2019-11-30 02:14:54 +01:00
shortcutme
1a17645e93
Remove unnecessary import 2019-11-30 02:14:08 +01:00
shortcutme
5fba850d74
Don't close connection if it's already closed 2019-11-30 02:13:58 +01:00
shortcutme
bd90e0ce52
Add Db id to logging identifier 2019-11-30 02:13:39 +01:00
shortcutme
c24cfa721b
Lock db while connecting 2019-11-30 02:13:17 +01:00
shortcutme
1670d96908
Execute db commit in separate thread 2019-11-30 02:12:33 +01:00
shortcutme
ec3c44c5b3
Use ThreadPool lock in Db 2019-11-30 02:11:34 +01:00
shortcutme
12bfad8fe6
Don't execute query while committing 2019-11-30 02:11:11 +01:00
shortcutme
594edc6e9a
Commit after executemany 2019-11-30 02:10:40 +01:00
shortcutme
99304a09ca
Log long db queries 2019-11-30 02:10:11 +01:00
shortcutme
5c93aadce3
Log gevent block time with ms resolution 2019-11-30 02:09:14 +01:00
shortcutme
f0c10efca6
Progress meter for site delete 2019-11-30 02:08:29 +01:00
shortcutme
c10dd5239e
Log test case start/end and debug message 2019-11-30 02:08:11 +01:00
shortcutme
fa0d1a50b5
Better test of threadpool 2019-11-30 02:07:40 +01:00
shortcutme
66a1c4d242
Multi-process and gevent loop friendly lock 2019-11-30 02:07:30 +01:00
shortcutme
b7c6b84826
Don't log killed worker write as error 2019-11-30 02:05:20 +01:00
shortcutme
1c587bde25
Avoid write race on same file 2019-11-30 02:04:59 +01:00
shortcutme
e1dc29c374
Rev4308 2019-11-27 03:08:20 +01:00
shortcutme
59e0ffd8e0
Remove unnecessary imports from CryptMessage 2019-11-27 03:08:01 +01:00
shortcutme
f7c767c1c8
Make Chart plugin compatible with db changes 2019-11-27 03:07:44 +01:00
shortcutme
fca9db7972
Try to fix "Recursive use of cursors" ProgrammingError by creating a new cursor for every execute and moving the Lock to Db 2019-11-27 03:07:08 +01:00
shortcutme
afd23849a6
Log site delete as info 2019-11-27 03:04:49 +01:00
shortcutme
1b2eee058c
Log test case start and end 2019-11-27 03:03:31 +01:00
shortcutme
777486a5be
Try new way to avoid pytest io errors 2019-11-27 03:03:22 +01:00
shortcutme
8b6f221e22
Formatting 2019-11-27 03:02:18 +01:00
shortcutme
97ecb7e3aa
Rev4303 2019-11-25 14:50:16 +01:00
shortcutme
5df5e25d68
Better logging of recent peers 2019-11-25 14:49:40 +01:00
shortcutme
66a950a481
New, much faster worker task sorting 2019-11-25 14:43:28 +01:00
shortcutme
29346cdef5
Faster, async local ip discovery 2019-11-25 14:40:52 +01:00
shortcutme
4f8e941e39
Fix err type logging 2019-11-25 14:39:24 +01:00
shortcutme
756f5a1608
Fix display peer found time on /Stats page 2019-11-25 14:38:53 +01:00
shortcutme
416e7d6fe0
Fix too fast benchmark results statistics 2019-11-25 14:38:27 +01:00
shortcutme
7b210429b5
Multi threaded eciesDecrypt 2019-11-25 14:37:55 +01:00
shortcutme
c52d47b15f
Don't show notifications when testing 2019-11-25 14:35:31 +01:00
shortcutme
9a43626aa6
When testing don't register shutdown functions 2019-11-25 14:35:16 +01:00
shortcutme
c14e722303
Fix bug that sometimes blocked plugins accessing connectionserver sitelist 2019-11-25 14:34:46 +01:00
shortcutme
07633ba79d
Fix local peers dropping out from recent peers 2019-11-25 14:33:18 +01:00
shortcutme
6ff7fe55fc
Make sure we use local peers if possible 2019-11-25 14:32:06 +01:00
shortcutme
a14c36cd3e
Add peer's site to str representation 2019-11-25 14:31:12 +01:00
shortcutme
c21fe3d23a
Prefer connecting to non-onion peers 2019-11-25 14:30:51 +01:00
d9xr92
89e8fd3d3a potential fix for #2323 (#2324)
* potential fix for #2323

* Update DbCursor.py

* replaced RLock with Lock
2019-11-23 13:22:36 +01:00
shortcutme
966f393e20
Rev4290 2019-11-20 14:08:49 +01:00
shortcutme
d85c27e67b
Merge config js 2019-11-20 14:08:19 +01:00
shortcutme
a5f8a53196
Fix change detection for integers on config interface 2019-11-20 14:08:02 +01:00
shortcutme
9299e5b614
Kill greenlets with notify 2019-11-20 14:07:33 +01:00
shortcutme
6c31a3b77e
Change fs thread number on config interface 2019-11-20 14:07:04 +01:00
shortcutme
6262c80886
Fix benchmark on Firefox 2019-11-20 14:06:27 +01:00
shortcutme
5aa115c88a
Heavier task in thread pool test to make sure it will pass 2019-11-19 02:25:28 +01:00
shortcutme
511587dd8b
Allow images from data uris 2019-11-19 02:19:14 +01:00
shortcutme
5d34bb9062
Rev4287 2019-11-19 02:17:32 +01:00
shortcutme
4025d753e3
Don't print errors that happened in thread 2019-11-19 02:16:44 +01:00
shortcutme
58214c0ac3
Move file writes and reads to separate thread 2019-11-19 02:16:20 +01:00
shortcutme
5d113757df
Stop greenlets when deleting a site in test 2019-11-19 02:15:47 +01:00
shortcutme
b41a03674f
New configuration options for fs write and read thread count 2019-11-19 02:15:00 +01:00
shortcutme
8c1f64243f
Test CLI action parser 2019-11-19 02:14:29 +01:00
shortcutme
cdd9dd4f6f
Fix duplicate content_db connecting 2019-11-19 02:12:24 +01:00
shortcutme
57f2a43864
Formatting 2019-11-19 02:11:19 +01:00
shortcutme
74d7fb7835
Less verbose logging in site storage 2019-11-19 02:10:42 +01:00
shortcutme
dd61429e2f
Handle announcer thread killing properly 2019-11-19 02:09:55 +01:00
shortcutme
8f27f50b34
Log SQL statements in progress as warning 2019-11-19 02:09:36 +01:00
shortcutme
96e7fbdca1
Don't try to commit if no db connection 2019-11-19 02:08:30 +01:00
shortcutme
39352eb97e
Fix test function listing name 2019-11-19 02:08:03 +01:00
shortcutme
1c607645c7
Track and stop site connected greenlets on delete 2019-11-19 02:07:51 +01:00
shortcutme
2ad3493fb0
Test and benchmark of crypto function in CryptMessage plugin 2019-11-19 02:05:02 +01:00
shortcutme
331dc99086
Fix benchmark plugin test listing if not loaded before other plugins 2019-11-19 02:03:27 +01:00
shortcutme
4424c8272d
New refactored Benchmark plugin to test compatibility and measure system performance 2019-11-19 01:48:31 +01:00
shortcutme
16162955af
New cli test action 2019-11-19 01:47:06 +01:00
shortcutme
23006e495f
FilePack plugin: pass possible other parameters to site storage read function 2019-11-19 01:45:50 +01:00
shortcutme
4351af35f4
Don't load geoip db in parallel 2019-11-19 01:44:26 +01:00
shortcutme
e8af5db2e8
Keep track of gevent block number, remove Benchmark from stats plugin 2019-11-19 01:43:39 +01:00
shortcutme
9d048371b7
Better way to patch gevent error handling 2019-11-19 01:42:00 +01:00
shortcutme
a187726ba8
Formatting 2019-11-19 01:40:39 +01:00
shortcutme
0ff1bcfd19
Remove unused variable and module import 2019-11-19 01:40:00 +01:00
shortcutme
08fee35bcf
Fix pytest output capturing error 2019-11-19 01:39:16 +01:00
shortcutme
08574bf676
Handle unknown variables when rendering template 2019-11-19 01:38:22 +01:00
shortcutme
5c27a0efcc
Rev4260, Fix UiConfig pt-br language json 2019-11-15 19:43:56 +01:00
ZeroNet
7576f96604
Merge pull request #2307 from nfenclova/py3
Replace usage of deprecated API 'cgi.parse_qsl'
2019-11-14 17:59:16 +01:00
Natalia Fenclová
456e330854 Replace usage of deprecated API 'cgi.parse_qsl'
Python 3.8.0 removed this deprecated API. We already use the replacement
`urllib.parse.parse_qsl` in `UIRequest.py`
2019-11-14 17:22:39 +01:00
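The replacement is a drop-in: `cgi.parse_qsl` had been deprecated since Python 2.6 and was removed in 3.8, and `urllib.parse.parse_qsl` returns the same key/value pairs.
```
# urllib.parse.parse_qsl is the drop-in replacement for the removed
# cgi.parse_qsl.
from urllib.parse import parse_qsl

params = dict(parse_qsl("wrapper_nonce=abc123&address=1HeLLo"))
# -> {'wrapper_nonce': 'abc123', 'address': '1HeLLo'}
```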
ZeroNet
8d95eb937f
Longer timeout to close connections in ssl connection test 2019-11-11 17:14:36 +01:00
shortcutme
2f50fef787
Rev4259 2019-11-11 16:18:33 +01:00
shortcutme
ac69007292
Fix file rendering if content.json download failed 2019-11-11 16:18:25 +01:00
shortcutme
e8b0a3d1c4
Fix loading screen scrolling on smaller screens 2019-11-11 16:18:01 +01:00
Eduaddad
dfbbbd9381 translation update (#2275)
* translation update

translation update

* Update pt-br.json

* missing comma correction
2019-11-11 15:37:50 +01:00
ZeroNet
4ab339b375
Merge pull request #2278 from eduaddad/patch-2
text was duplicated and in Chinese
2019-11-11 15:37:24 +01:00
shortcutme
fce24cedbd
Rev4257 2019-11-07 02:48:17 +01:00
shortcutme
f9b62564ca
Fix logging in non-debug mode 2019-11-07 02:47:45 +01:00
shortcutme
d569d9488a
Deny invalid files on Windows 2019-11-07 02:47:19 +01:00
shortcutme
f172751df3
Test utf8 filename download 2019-11-07 02:44:54 +01:00
shortcutme
13233d47bd
Fix long running test 2019-11-07 02:44:33 +01:00
Josh
74d7d92a4d Allow all valid filenames to be added to content.json (#2141)
* Allow all valid filenames to be added to content.json

* Replace hex version of regex with non-hex version

* Add basic test for path validation with ASCII and UTF-8 filenames

* Amend path validation test to meet standards
2019-11-07 02:18:27 +01:00
ZeroNet
d3a0f5c268
Add macOS download option, change localhost to 127.0.0.1 2019-11-06 23:26:52 +01:00
ZeroNet
63f213a5d5
Merge pull request #2286 from cclauss/patch-1
Upgrade to the production version of Python 3.8
2019-11-04 19:52:37 +01:00
Christian Clauss
6d4c4d9f27
Upgrade to the production version of Python 3.8 2019-11-04 19:44:15 +01:00
Eduaddad
b2be4672ec
text was duplicated and in Chinese 2019-11-04 09:18:32 -03:00
Lola Dam
1bfe328a1b No restart after update (#2242) 2019-11-01 19:36:43 +01:00
shortcutme
1f453b6c13
Rev4253, Hotfix siteCreate CLI 2019-10-30 03:09:28 +01:00
shortcutme
ee8e3c3c9c
Use master_seed for siteCreate CLI by default 2019-10-30 03:07:06 +01:00
shortcutme
37f315dfc2
Rev4252 2019-10-30 02:31:04 +01:00
shortcutme
1e1e560795
Use master seed to create new site from cli 2019-10-30 02:30:01 +01:00
shortcutme
8d88cfcd68
Fix notification width calculation 2019-10-30 02:28:37 +01:00
shortcutme
74badf9c9c
Rev4250 2019-10-28 16:44:40 +01:00
shortcutme
86087550f1
Log ConnectionResetError as warning 2019-10-28 16:44:14 +01:00
shortcutme
8dfc200f24
Update cachable type list 2019-10-28 16:43:37 +01:00
shortcutme
cb4a4bd707
Add utf-8 charset header to more types 2019-10-28 16:43:19 +01:00
shortcutme
24ba2a150b
Remove limitations for img, font, media, style src in raw mode 2019-10-28 16:42:28 +01:00
shortcutme
e1d92bf0ec
Changing allow-origin to js files looks no longer necessary 2019-10-28 16:41:55 +01:00
shortcutme
270f3e9ffd
Use host to check same origin if referrer looks trimmed to host 2019-10-28 16:41:08 +01:00
shortcutme
e1f73697ff
Extend built-in content types list 2019-10-28 16:11:45 +01:00
shortcutme
e82155aac4
Rev4245, Fix target=_blank links 2019-10-26 20:36:53 +02:00
shortcutme
d7669413af
Log ConnectionAbortedError as warning 2019-10-26 20:17:09 +02:00
shortcutme
28d4fc5d12
Update location of bundled OpenSSL on macOS 2019-10-24 12:22:54 +02:00
Filip Š
dac4fcd52b Allow NOSANDBOX in local mode (#2238) 2019-10-24 12:01:31 +02:00
6543
8dff33b38a git ignore log folder (#2239) 2019-10-24 10:53:41 +02:00
shortcutme
6bae1f8a4b
Rev4243 2019-10-24 03:09:38 +02:00
shortcutme
10ceeb7f02
Remove no longer necessary files after cert generation 2019-10-24 03:09:28 +02:00
shortcutme
448bb3ce98
Fix OpenSSL cert generation using LibreSSL 2019-10-24 03:09:16 +02:00
shortcutme
0531d47721
Fix shutdown errors on macOS 2019-10-24 03:08:45 +02:00
shortcutme
b21719e2f2
Fix OpenSSL lib loading on macOS 2019-10-24 03:08:27 +02:00
Ornataweaver
2960db2352 simple English improvement (#2232) 2019-10-22 17:33:10 +02:00
ZeroNet
cb3629343b
Update requirements.txt 2019-10-22 17:22:30 +02:00
shortcutme
fa7013fdf7
Rev4241 2019-10-16 15:44:56 +02:00
shortcutme
db868dba81
Merge sidebar css 2019-10-16 15:44:38 +02:00
shortcutme
608a411d97
Db table rebuild as debug message 2019-10-16 15:43:28 +02:00
shortcutme
20c63c73b3
Support silent loading of verify lib 2019-10-16 15:43:07 +02:00
shortcutme
5ca3401eb9
Remove UiRequestPlugin from Zeroname plugin 2019-10-16 15:42:49 +02:00
ZeroNet
435a3c285e
Merge pull request #2218 from caryoscelus/py3
minor code improvement: super/init
2019-10-07 11:23:32 +02:00
caryoscelus
6405cae706 minor code improvement: super/init
`__init__` should only ever return None
2019-10-06 22:30:18 +00:00
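A minimal sketch of the pattern this refers to (illustrative, not the actual diff): `__init__` must always return None, so returning the result of `super().__init__()` is misleading even though it happens to work.

```python
class TrackedDict(dict):  # hypothetical subclass, for illustration only
    def __init__(self, *args, **kwargs):
        # Before: "return super().__init__(*args, **kwargs)" implied a useful
        # return value; __init__ only ever returns None, so just call it.
        super().__init__(*args, **kwargs)
```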
shortcutme
6451e7f9f1
Console display fixes 2019-10-06 03:32:04 +02:00
shortcutme
127fa5fa82
Rev4238 2019-10-06 03:27:47 +02:00
shortcutme
63fd0a9fa1
Put the infopanel lower to avoid console interruption 2019-10-06 03:21:35 +02:00
shortcutme
344ad44854
Fix reload if there is hash in the url 2019-10-06 03:20:58 +02:00
shortcutme
43a5742258
Resolve domain in parsePath function 2019-10-06 03:20:16 +02:00
shortcutme
924a61309a
Cached isDomain / resolveDomain functions 2019-10-06 03:18:14 +02:00
shortcutme
9dd5c88da4
Monospace font when displaying errors 2019-10-06 03:15:57 +02:00
shortcutme
0598bcf332
Fix utf8 post data parsing 2019-10-06 03:15:20 +02:00
shortcutme
ead1b3e5f5
Log 403 as warning 2019-10-06 03:14:45 +02:00
shortcutme
dd493c87fa
Display WSGI errors to the browser 2019-10-06 03:13:32 +02:00
shortcutme
29640e614c
Admin API call to list server errors 2019-10-06 03:12:47 +02:00
shortcutme
73e0aa17c4
Don't encode byte responses 2019-10-06 03:10:43 +02:00
shortcutme
917a2e59ce
Fix compacting large json files 2019-10-06 03:10:20 +02:00
shortcutme
119e1a9bf0
Simple cache decorator 2019-10-06 03:09:48 +02:00
shortcutme
6eb79ba75e
Don't announce site if not serving 2019-10-06 03:08:54 +02:00
shortcutme
1f9eafa619
Merge sidebar js, css 2019-10-06 03:08:32 +02:00
shortcutme
d5da404ed4
Log zeroname db load error 2019-10-06 03:08:09 +02:00
shortcutme
1b41aa70cc
Don't mess with console visibility on Windows 2019-10-06 03:07:52 +02:00
shortcutme
284b1a4f8a
Console filters to Warning, Error 2019-10-06 03:07:34 +02:00
shortcutme
fe432ad843
Open console with #ZeroNet:Console hash in url 2019-10-06 03:07:14 +02:00
shortcutme
15fca6bd12
User selection from a list in multiuser local mode 2019-10-06 03:05:39 +02:00
ZeroNet
57c0daa294
Merge pull request #2213 from filips123/sponsor-button
Display a sponsor button in repository (py3)
2019-10-02 18:20:19 +02:00
Filip Š
fee95654fa Create FUNDING.yml
(cherry picked from commit f08bea7f90)
2019-09-30 22:04:40 +02:00
shortcutme
bb436f9931
Rev4223 2019-09-28 17:17:47 +02:00
shortcutme
3682f0aed4
Wait for db close on tests 2019-09-28 17:03:43 +02:00
shortcutme
43c366d2fb
Restrict blocked site addition when using mergerSiteAdd 2019-09-28 17:02:27 +02:00
shortcutme
b21b885aa9
Move site add to separate function 2019-09-28 17:01:37 +02:00
shortcutme
6bb929a896
Merge branch 'py3' of https://github.com/HelloZeroNet/ZeroNet into py3 2019-09-19 16:38:25 +02:00
shortcutme
f5829f6012
Rev4221 2019-09-19 16:38:20 +02:00
shortcutme
d06b4abecf
Add multiuser admin status to server info 2019-09-19 16:38:05 +02:00
shortcutme
d7db631b95
Shut down UiServer if FileServer startup failed 2019-09-19 16:33:45 +02:00
shortcutme
93e6ec4933
Fix display site add prompt 2019-09-19 16:32:30 +02:00
Christian Seibold
2fbf2c7771 English Grammar Fix: Change "Forgot" to "Forget" in Sidebar (#2202)
* Change forgot to forget

English grammar fix

* Change forgot to forget

Fix English grammar
2019-09-18 19:49:53 +02:00
shortcutme
b474677db1
Remove pyelliptic from requirements because an OpenSSL 1.1 compatible version is bundled in the lib dir 2019-09-15 22:14:20 +02:00
shortcutme
dbcd8602c5
Rev4214 2019-09-15 22:12:09 +02:00
shortcutme
1793407748
Png and svg version of the logo 2019-09-15 22:12:03 +02:00
shortcutme
6f0d4a50d1
Add apple touch icon support for Safari 2019-09-15 22:11:51 +02:00
shortcutme
10817aefae
Fix pyelliptic OpenSSL 1.1 compatibility if it's also present in site-packages 2019-09-15 22:08:48 +02:00
shortcutme
4293a44c93
Don't try to find OpenSSL 1.0.x 2019-09-15 22:08:20 +02:00
shortcutme
96759e9303
Rev4210, Fix format exception if no args 2019-09-12 00:24:16 +02:00
shortcutme
448483371c
Formatting 2019-09-12 00:23:36 +02:00
Lola Dam
0738964e64 Save content.json of site even if limit size is reached (#2114)
* fix #2107; Still save the content.json received even if the site size limit is reached, but don't download files; Allow better distribution of the latest version of content.json (see the sketch after this entry)

* Added test

* Fix test for huge content file (now it fails)

* Don't download huge content.json file and update test

* Remove comments
2019-09-10 18:18:21 +02:00
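A hypothetical sketch of the behavior described above; the names (`site.storage.write`, `site.size_limit`, `queueDownload`) are illustrative, not ZeroNet's actual API:

```python
def handleContentJson(site, inner_path, content_json_bytes):
    # Always store the (already verified) content.json so the newest
    # version keeps propagating between peers...
    site.storage.write(inner_path, content_json_bytes)
    # ...but once the site is over its size limit, skip the listed files.
    if site.size > site.size_limit:
        return
    for file_path in site.listFiles(inner_path):
        site.queueDownload(file_path)
```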
ZeroNet
deec2e62ce
Add Linux bundle install method 2019-09-10 18:16:02 +02:00
ZeroNet
c52da69367
Check py3 branch build status 2019-09-10 18:08:45 +02:00
shortcutme
2de35266c4
Rev4208, Add details on Tor connection error 2019-09-10 15:43:42 +02:00
ZeroNet
e6b8097b43
Merge pull request #2187 from krzotr/patch-5
Set custom priority in FileNeed and FileGet command
2019-09-08 17:24:19 +02:00
krzotr
55c7585334
Set custom priority in FileNeed and FileGet command
When you use the `FileNeed` or `FileGet` command, the default priority is set to `6`.
You cannot change that value because it is hardcoded.

Now you can set priority of downloading files manually:

```
this.cmd("fileNeed", {
    "inner_path": inner_path + "|all",
    "priority": 10
})
```
2019-09-08 11:51:46 +02:00
shortcutme
62d278a367
Version 0.7.1 2019-09-06 04:03:01 +02:00
shortcutme
38e20b7c31
Rev4206 2019-09-04 20:16:57 +02:00
shortcutme
d3fce8ca36
Support Linux bundle OpenSSL 2019-09-04 20:16:32 +02:00
shortcutme
2a7d7acce0
Support updating linux bundle 2019-09-04 20:15:49 +02:00
shortcutme
eab63c6af8
Keep file permissions on update rename workaround 2019-09-04 20:15:37 +02:00
shortcutme
4f0613689a
Formatting 2019-09-04 20:13:32 +02:00
shortcutme
743463dce9
Execute shutdown function before running update to avoid segfault on linux 2019-09-04 20:13:16 +02:00
shortcutme
0b04176f18
Rev4203, Change console encoding to utf8 on Windows 2019-09-03 12:00:25 +02:00
ZeroNet
166a65e1b1
Merge pull request #2183 from imachug/patch-2
Fix gevent.Timeout being not caught
2019-09-02 21:59:44 +02:00
Ivanq
5da4537d7c
Fix gevent.Timeout being not caught 2019-09-02 19:34:29 +00:00
ZeroNet
b9e71c9f6f
Merge pull request #2181 from imachug/patch-1
Fix UnicodeDecodeError when OpenSSL is not found
2019-09-02 21:10:15 +02:00
Ivanq
500c96abe2
Fix UnicodeDecodeError when OpenSSL is not found
Fixes #2180
2019-09-02 14:35:28 +00:00
shortcutme
149c4f5c7b
Rev4200 2019-09-02 14:17:46 +02:00
shortcutme
76bc9fcddf
Open sidebar with location hash 2019-09-02 14:17:35 +02:00
shortcutme
f999f167b1
Offer access with ip address on invalid host error 2019-09-02 02:10:52 +02:00
shortcutme
9ac96cdd50
Don't leak allowed origins in error message 2019-09-02 02:09:53 +02:00
shortcutme
3c4bc6ae35
Always update merger sites db on content.json update 2019-09-02 02:08:07 +02:00
ZeroNet
879b504b0f
Merge pull request #2179 from krzotr/patch-4
KeyError: 'piece_size' in `fileNeed` command in BigfilePlugin when trying to download non-optional files with `|all`
2019-08-30 21:05:46 +02:00
Krzysztof Otręba
baa5df1d01 fixed KeyError: 'piece_size' when trying to download a non-optional file using '|all' 2019-08-30 18:59:19 +02:00
shortcutme
912c958ac0
Rev4197 2019-08-26 03:21:04 +02:00
shortcutme
d166a16a24
Use function flagging in plugins 2019-08-26 03:20:07 +02:00
shortcutme
1bd1ddf410
Test function flagging 2019-08-26 03:15:29 +02:00
shortcutme
7890771faa
Test permissions of websocket 2019-08-26 03:11:24 +02:00
shortcutme
376fd0d439
Use flags instead of permission list 2019-08-26 03:09:48 +02:00
shortcutme
c414e6caa2
Support action async call flag 2019-08-26 03:08:57 +02:00
shortcutme
ed7a3b2356
Get action permissions from flag db 2019-08-26 03:02:30 +02:00
shortcutme
adffbd1973
New function flagging decorator class to keep track of permissions 2019-08-26 02:55:01 +02:00
shortcutme
6750682e4f
Rev4191 2019-08-23 03:42:31 +02:00
shortcutme
d1fb4067e7
Hide trackers proxy settings if tor always set on /Config page 2019-08-23 03:40:44 +02:00
shortcutme
ab9fe173a8
Don't use trackers proxy in tor always mode 2019-08-23 03:40:29 +02:00
shortcutme
8a7ae368d8
No opened services if we are in tor always mode 2019-08-23 03:40:20 +02:00
shortcutme
248fc5f015
Use re.sub to replace template variables 2019-08-23 03:39:50 +02:00
shortcutme
e16611f15a
Allow websocket connections originating from earlier accepted hostnames 2019-08-23 03:39:16 +02:00
Ivanq
24b3651d2e Allow blob: protocol (#2166)
* Allow blob: protocol

* Fix quotes
2019-08-20 12:42:01 +02:00
ZeroNet
0e236e53fd
Merge pull request #2167 from imachug/merge-media
Add --merge_media config option
2019-08-20 12:10:39 +02:00
Ivanq
61ba9848e5 Add --merge_media config option 2019-08-20 08:16:35 +00:00
Ivanq
01ff89315b Add GitLab CI/CD support (#2163)
* Use GitLab CI/CD

* Force colored tests

* Get rid of an error

* Mark tests as slow

* Disable codecov & coveralls

* Python 3.5-3.8

* Add Python 3.4

* Support both OpenSSL 1.1.0 and 1.1.1+

* Test both OpenSSL 1.1.0 and 1.1.1+

* Fix OpenSSL 1.1.1

* Fix Python 3.4 build
2019-08-19 17:30:31 +02:00
shortcutme
155d8d4dfd
Rev4188, Allow only whitelisted values for open_browser 2019-08-19 13:42:49 +02:00
ZeroNet
c7822ed6e6
Merge pull request #2160 from imachug/patch-1
Allow files with `..` as a name substring
2019-08-19 13:35:29 +02:00
Ivanq
1ed40b3b82
Allow files with .. as a name substring 2019-08-19 07:09:32 +00:00
shortcutme
18dc359cfc
Rev4187 2019-08-18 03:03:22 +02:00
shortcutme
b871849df4
Add origin validation to websocket connections 2019-08-18 03:03:02 +02:00
shortcutme
7d1ca3862d
Make missing IPv6 a warning not an error 2019-08-18 03:02:30 +02:00
shortcutme
2a887870ff
Rev4185 2019-08-17 20:35:00 +02:00
shortcutme
1d5bde01cc
Deny plugin add request in multiuser mode 2019-08-17 20:34:21 +02:00
shortcutme
8537939d26
Disable UDP in proxy mode 2019-08-17 20:34:04 +02:00
shortcutme
fcb3ac3917
Only change default proxy to tor in tor always mode 2019-08-17 20:33:43 +02:00
shortcutme
d63a4b3912
Rev4181 2019-08-15 03:19:23 +02:00
shortcutme
6a245a202c
Fix server connections encryption 2019-08-15 03:19:05 +02:00
shortcutme
429043f60c
CLI peerPing command display connection encryption info only once 2019-08-15 03:09:53 +02:00
shortcutme
8f491fe6e1
Use SSLContext for connection encryption, add fake SNI, ALPN 2019-08-15 03:08:40 +02:00
shortcutme
92358bafc0
Wider max notification width to allow blacklist button in same line 2019-08-15 03:06:13 +02:00
shortcutme
d93e89899b
Fix tracker proxy PySocks import 2019-08-15 03:05:46 +02:00
shortcutme
2bdd073608
Move resolveDomain to SiteManager for easier resolver plugins 2019-08-15 03:05:29 +02:00
shortcutme
7801937f74
Rev4176, Fix update of plugins 2019-08-13 21:07:26 +02:00
shortcutme
1d7e0c47dd
Merge branch 'py3' of https://github.com/HelloZeroNet/ZeroNet into py3 2019-08-12 17:58:28 +02:00
shortcutme
7b9b48e62d
Rev4175, Console and file logging disable support 2019-08-12 17:58:23 +02:00
shortcutme
d610f94e7d
Display TLS 1.3 support on /Stats page 2019-08-12 17:56:06 +02:00
ZeroNet
7742f2f5fb
Merge pull request #2137 from imachug/patch-1
Fix preferring CLI argument over zeronet.conf
2019-08-11 14:51:23 +02:00
Ivanq
3f7e22497d
Fix preferring CLI argument over zeronet.conf
Fix using open_browser from CLI arguments in case there are several `--open_browser` arguments, which often happens after restarts.
2019-08-11 12:18:55 +03:00
shortcutme
e745760520
Rev4172 2019-08-09 13:18:57 +02:00
shortcutme
bd5c2b1daa
Also try to load OpenSSL dll from Python/DDLs directory 2019-08-09 13:18:40 +02:00
shortcutme
0bbeede975
Don't try to display bigfile limit settings if no bigfile plugin enabled 2019-08-09 13:17:48 +02:00
shortcutme
30865c9d1c
Rev4169 2019-08-08 23:37:49 +02:00
shortcutme
1cfe874893
Use find_library first to locate libeay32 2019-08-08 23:37:43 +02:00
shortcutme
5da46ca29c
Cleanup whitespace in pyelliptic 2019-08-08 23:37:21 +02:00
shortcutme
cc21cbd1bd
Use relative import in pyelliptic 2019-08-08 23:36:58 +02:00
shortcutme
79ba4a9d23
Rev4167 2019-08-08 14:39:02 +02:00
shortcutme
44ef0cbe59
Always load plugins abc sorted 2019-08-08 14:37:42 +02:00
shortcutme
88f2b39576
Don't try to connect to onion addresses if not supported by the client 2019-08-08 14:37:19 +02:00
shortcutme
bf10cdef63
Add some delay on pex error before try the next peer 2019-08-08 14:36:33 +02:00
shortcutme
3696db89ab
Don't increment tracker error number if no internet connection 2019-08-08 14:35:58 +02:00
shortcutme
eeaa5d21d8
Start unreliable trackers on force reannounce 2019-08-08 14:35:35 +02:00
shortcutme
f4bec3bb4d
Avoid starting new workers on a possibly unavailable file 2019-08-08 14:35:04 +02:00
shortcutme
dc6f3cf0b2
Merge branch 'py3' of https://github.com/HelloZeroNet/ZeroNet into py3 2019-08-07 14:13:00 +02:00
shortcutme
b5a1310add
Rev4165 2019-08-07 14:12:53 +02:00
shortcutme
b22343f65c
Support multiple trackers_file argument 2019-08-07 14:12:45 +02:00
shortcutme
b9b317e213
Remove accidentally left print on plugin load 2019-08-07 14:11:58 +02:00
shortcutme
6cd18bbf04
Display a cleaner error on users.json/sites.json load error 2019-08-07 14:11:30 +02:00
ZeroNet
8c6400e4d6
Correct venv install 2019-08-06 14:56:45 +02:00
shortcutme
b6e1559a80
Rev4163 2019-08-03 01:35:37 +02:00
shortcutme
605ae75dda
Re-compile UiConfig js, css, backdrop to bottom popup 2019-08-03 01:35:00 +02:00
shortcutme
39f318fbd5
Add plugin version information to server_info 2019-08-03 01:34:21 +02:00
shortcutme
21def81439
Fix js, css merging with absolute merged_path 2019-08-03 01:34:00 +02:00
shortcutme
7e9ab8321a
Add plugin description data 2019-08-03 01:32:55 +02:00
shortcutme
4094d3a9bf
Plugin to install, update and delete third-party plugins using the web interface 2019-08-03 01:31:11 +02:00
shortcutme
0877fec638
Restrict plugin commands in multi-user mode 2019-08-03 01:29:27 +02:00
shortcutme
f40c3e6b81
Add notification messages max-width 2019-08-02 20:11:00 +02:00
shortcutme
bb705ae863
Fix source code reloader crash on directory modifications/file deletions 2019-08-02 16:19:35 +02:00
shortcutme
c4a3a53be0
Also reload source code on file changes in installed plugins 2019-08-02 16:19:05 +02:00
shortcutme
713ff17e91
Allow loading installed third-party plugins and enabling/disabling plugins in config file data/plugins.json 2019-08-02 16:18:37 +02:00
shortcutme
0c659a477d
Remove hard-coded translate files directory 2019-08-02 16:16:19 +02:00
shortcutme
26678a65f8
Limit notifications max width 2019-08-02 16:15:45 +02:00
shortcutme
c5116fb318
Modify testAction command to use handleRequest instead of directly calling the function 2019-08-02 16:14:44 +02:00
shortcutme
fa970fa102
Test CryptMessage plugin using testAction function 2019-08-02 16:14:17 +02:00
shortcutme
fbafd23177
Add OpenSSL 1.1 support to CryptMessage plugin by using radfish's pyelliptic version 2019-08-02 16:13:54 +02:00
shortcutme
be742c78e7
Formatting for better readability 2019-08-02 16:06:22 +02:00
shortcutme
3e97c154a0
Remove hard-coded directory path from plugins 2019-08-02 16:05:19 +02:00
shortcutme
1eb97ea381
Delayed save of sites.json 2019-08-02 14:06:25 +02:00
shortcutme
f6e06456b0
Use advanced json dumper to save sites.json and users.json 2019-08-02 14:06:05 +02:00
shortcutme
5e90cd9714
Move advanced json formatter to helper.py 2019-08-02 14:05:14 +02:00
shortcutme
06406fa46c
Avoid bare exceptions 2019-08-02 14:04:18 +02:00
ZeroNet
08b7034d6f
Merge pull request #2116 from filips123/patch-2
Add response to some commands
2019-08-02 10:52:37 +02:00
Filip Š
5b91aef4ec
Add response to some commands 2019-08-01 19:16:10 +02:00
shortcutme
d8a121cd06
Rev4129 2019-07-18 03:34:09 +02:00
shortcutme
902a1b1c88
Fix OpenSSL dll loading on Windows 2019-07-18 03:33:56 +02:00
shortcutme
c9a2b86c16
Log possible OpenSSL cert generation error message at the same line 2019-07-18 03:33:35 +02:00
shortcutme
27fcb70774
Log loaded verify lib path and load time 2019-07-18 03:32:45 +02:00
shortcutme
e488841031
Display loaded verify lib path in benchmark 2019-07-18 03:32:22 +02:00
shortcutme
6cffa1c0ca
Change maxstdio using ctypes as win32file module is not included with Python3 by default 2019-07-18 03:31:57 +02:00
shortcutme
d3e8fcea47
Rev4126 2019-07-17 16:31:38 +02:00
shortcutme
9526424a47
Display error message dialog on Windows for startup errors 2019-07-17 16:31:32 +02:00
shortcutme
149278abd0
Skip reload on attribute changes 2019-07-17 16:30:56 +02:00
shortcutme
314c8b22db
Fix parsing config file with % in value 2019-07-17 16:30:32 +02:00
shortcutme
c502688ce3
Internals renamed to Console 2019-07-17 16:30:17 +02:00
shortcutme
866346b059
Fix and test bootstrapper hash cache reload from db 2019-07-17 16:29:54 +02:00
shortcutme
de8286829a
Remove outdated cn changelog 2019-07-16 13:41:44 +02:00
ZeroNet
18c407bfc2
Merge pull request #2096 from geekless/missing-encodeResponse
Add missing @helper.encodeResponse
2019-07-15 10:59:03 +02:00
Vadim Ushakov
076684176b Add missing @helper.encodeResponse in StatsPlugin.py and BootstrapperPlugin.py 2019-07-15 14:50:24 +07:00
Vadim Ushakov
a2cb1615b3 Move the BitTorrent related code from SiteAnnouncer.py (#2078)
* Move the BitTorrent related code from SiteAnnouncer.py to a separate plugin

* AnnounceBitTorrentPlugin.py: add missing `from Debug import Debug`
2019-07-10 16:12:25 +02:00
ZeroNet
6b5fa140b9
Try use pypi gevent for py3.8 tests 2019-07-10 12:48:08 +02:00
shortcutme
356d0521e6
Rev4122 2019-07-10 03:15:56 +02:00
shortcutme
5a08ab93d3
Ignore file attribute changes when reloading source code 2019-07-10 03:15:46 +02:00
shortcutme
8185f4dfda
Test getFile inner_path security 2019-07-10 03:14:30 +02:00
shortcutme
f4f0e2afa8
Allow LIKE parameters in database queries 2019-07-10 03:14:09 +02:00
shortcutme
67d6b1e724
Fix double logging when running tests 2019-07-10 03:12:56 +02:00
shortcutme
e34a9d452a
Allow filter optional files by inner path 2019-07-10 03:11:20 +02:00
ZeroNet
2819a36469
Merge pull request #2084 from tangdou1/patch-5
Update zh.json
2019-07-09 03:01:13 +02:00
tangdou1
8815b4e0c3
Update zh.json 2019-07-08 10:05:34 +08:00
ZeroNet
960635b993
Fix win download link 2019-07-08 02:51:16 +02:00
shortcutme
f9dcb29e92
Remove development test version warning, add Windows instructions, remove outdated instructions 2019-07-08 02:41:30 +02:00
ZeroNet
5a746769d0
Merge pull request #2073 from filips123/fix-infinite-reloading
Fix infinite reloading when system theme changes
2019-07-06 23:25:35 +02:00
Filip Š
87b4500467 Fix infinite reloading when system theme changes 2019-07-05 18:43:54 +02:00
ZeroNet
951e47469a
Merge pull request #2060 from imachug/default-subdir-clone
Fix siteCloning subdirectory with -default files
2019-07-05 17:13:29 +02:00
shortcutme
c1db963c76
Rev4112, Fix loading screen glitch, Change unstable trackers 2019-07-04 14:39:41 +02:00
ZeroNet
a252ec36f0
Merge pull request #2069 from imachug/patch-1
Guess content type correctly
2019-07-04 11:34:01 +02:00
Ivanq
33b478199a
Guess content type correctly
Fix e.g. vue.min.js being reported as text/plain instead of text/javascript.
2019-07-04 12:09:07 +03:00
shortcutme
21f285e099
Rev4111 2019-07-03 18:37:35 +02:00
shortcutme
fec312ed09
Better pytest atexit logging error workaround 2019-07-03 18:37:13 +02:00
shortcutme
eb2627721e
Fix pytest 5.x compatibility 2019-07-03 18:36:41 +02:00
shortcutme
ff32f822ba
Raise exception instead of using assert 2019-07-03 18:35:55 +02:00
ZeroNet
80bfccd9d3
Merge pull request #2066 from geekless/Dockerfile-fix-caching
Fix the order of commands in Dockerfile to make use of the caching
2019-07-03 16:22:03 +02:00
ZeroNet
eb5a24064a
Merge pull request #2067 from rllola/fix-travis
fix pytest version to 4.6.3 to avoid it breaking for python 3.8 dev
2019-07-03 16:21:23 +02:00
rllola
b971ccc673 fix pytest version to 4.6.3 to avoid it breaking for python 3.8 dev 2019-07-03 15:48:18 +02:00
Vadim Ushakov
945687bdad Fix the order of commands in Dockerfile to make use of the caching of intermediate Docker images.
In the py2 version, `COPY . /root` was placed after `RUN apk ...`, so the result of `RUN apk ...` could be cached by Docker.

In the py3 version, the commands were reordered to make the file `/root/requirements.txt` available for `pip install`. That prevents caching, and the Docker image is rebuilt from scratch every time.

To enable caching again, we can `COPY` just the single file `requirements.txt` before running the other commands. Since that file is unmodified most of the time, the resulting image can be effectively cached. The other ZeroNet files are copied after `RUN apk ...`, as in the previous version (see the sketch below).
2019-07-03 19:18:54 +07:00
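A sketch of the layer-caching pattern just described; the exact paths and package list are illustrative, not the repository's actual Dockerfile:

```dockerfile
# Copy only the dependency list first: this layer and the install layer stay
# cached as long as requirements.txt itself is unchanged.
COPY requirements.txt /root/requirements.txt
RUN apk add --no-cache python3 py3-pip \
 && pip3 install -r /root/requirements.txt

# Copy the rest of the sources last; editing them invalidates only this layer.
COPY . /root
```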
ZeroNet
6f56d0a944
Update .travis.yml 2019-07-03 11:44:12 +02:00
ZeroNet
66c48ba4ec
Update to latest packages before tests 2019-07-02 16:03:41 +02:00
ZeroNet
f83ade8d33
Merge pull request #2061 from imachug/dotdot
Allow some paths to contain .. but not ../
2019-07-01 17:56:27 +02:00
ZeroNet
eae0d1b2a6
Merge pull request #1959 from tangdou1/patch-1
Change default value to 10MB
2019-07-01 17:45:37 +02:00
Ivanq
743f92d15e Allow some paths to contain .. but not ../ 2019-07-01 18:17:42 +03:00
shortcutme
822dec5c03
Rev4110 2019-07-01 16:28:56 +02:00
shortcutme
40b84755de
Add some fixed content_type 2019-07-01 16:28:37 +02:00
shortcutme
1b307166ee
Formatting 2019-07-01 16:27:40 +02:00
shortcutme
7a483e7912
Add short address to site_info 2019-07-01 16:27:34 +02:00
shortcutme
841230fe80
Call onClosed function if websocket is disconneced 2019-07-01 16:27:20 +02:00
shortcutme
900ae4e1ea
Remove notification about port open status on startup 2019-07-01 16:26:57 +02:00
shortcutme
72b6d6c676
Make wrapper compatible with sidebar console function 2019-07-01 16:26:37 +02:00
shortcutme
f979ed133f
Workaround for pytest 0.4.1+ atexit logging errors 2019-07-01 16:25:45 +02:00
shortcutme
fb2cf5f04d
More detailed logging of file change event 2019-07-01 16:24:48 +02:00
shortcutme
62401b24ec
Add r string literal for regexps 2019-07-01 16:24:23 +02:00
shortcutme
43f833e604
Allow multiple values of same key in the config file 2019-07-01 16:20:13 +02:00
shortcutme
612a3f4401
Fix parsing config file for lines that have no values 2019-07-01 16:19:32 +02:00
shortcutme
4c2cf99fd2
Add console function to sidebar 2019-07-01 16:19:12 +02:00
shortcutme
aebd9b410d
Fix feedlistfollow request before siteinfo 2019-07-01 16:08:21 +02:00
Ivanq
8eee9caa01 Fix siteCloning subdirectory with -default files 2019-07-01 09:49:03 +03:00
Ivanq
d278a30d19 Allow sites to lock pointer (#2059)
Add `allow-pointer-lock` to iframe sandbox
2019-06-30 16:39:17 +02:00
Ivanq
1117569148 Fix starting ZeroNet via start.py (#2052) 2019-06-28 00:58:58 +02:00
Lola Dam
753396ac0c Try and catch block for dbRebuild (#2047)
* Try and catch block for dbRebuild

* Use self.log.error and not logging

* Use self.log.error and not logging in SiteStorage also

* Check if the rebuild is working
2019-06-23 14:21:50 +02:00
shortcutme
9a267ffcaf
Rev4016, Fix updater 2019-06-12 02:57:18 +02:00
ZeroNet
6254143fc6
Fix noparallel ignoreclass test 2019-06-11 18:24:44 +02:00
shortcutme
862e19a263
Rev4104, Don't start blocking Noparallel calls in separate greenlet to be able to catch exceptions. 2019-06-11 17:04:37 +02:00
shortcutme
eeef6fe65f
Rev4102, Fix fileGet test 2019-06-06 03:17:42 +02:00
shortcutme
c05916477c
Merge branch 'py3' of https://github.com/HelloZeroNet/ZeroNet into py3 2019-06-06 02:49:14 +02:00
shortcutme
63d7e73cff
Rev4101 2019-06-06 02:49:11 +02:00
shortcutme
8cb629fb55
Return True on rebuildDb success 2019-06-06 02:27:59 +02:00
shortcutme
d596f28f46
Log non-file read errors on fileGet 2019-06-06 02:27:35 +02:00
shortcutme
8f26c0aeae
Test null byte file download bug fix 2019-06-06 02:27:09 +02:00
ZeroNet
e4978d8a05
Use dev version of gevent for Python 3.8 2019-06-04 22:27:16 +02:00
ZeroNet
4f43d977ed
Install cffi files to compile gevent 2019-06-04 20:05:33 +02:00
ZeroNet
d0c39e6bf4
Test using Python 3.8-dev 2019-06-04 19:48:39 +02:00
ZeroNet
0965d98dbd
Also test with Python 3.8 2019-06-04 18:54:47 +02:00
shortcutme
45fea827af
Rev4100 2019-06-04 16:18:59 +02:00
shortcutme
350ee13d66
Fix serving binary files with null bytes in them. 2019-06-04 16:18:52 +02:00
shortcutme
bb7af2e8ed
Fix closing progress notification at 100% 2019-06-04 16:18:02 +02:00
Lola Dam
9cda561091 Show error message when db failed to rebuild (#2043)
* Show error message when db failed to rebuild; fix #1908;

* Forgot file
2019-06-04 13:54:35 +02:00
shortcutme
d38846f126
Rev4099, Fix ZipStream for older Python versions 2019-05-31 15:08:30 +02:00
shortcutme
3b764439af
Rev4098 2019-05-30 04:29:57 +02:00
shortcutme
d1e404f093
New updater that supports updating bundle files 2019-05-30 04:29:47 +02:00
shortcutme
6c4440c2d1
Less verbose Tor controller logging 2019-05-30 04:28:57 +02:00
shortcutme
8e2a7c2b2d
Respect optionalHelp when checking files 2019-05-30 04:28:41 +02:00
shortcutme
fce30baa12
Check files in offline mode on update 2019-05-30 04:28:09 +02:00
shortcutme
c63215c992
Don't download geolite db in offline mode 2019-05-30 04:27:40 +02:00
shortcutme
815fe02c83
Make sidebarGetPeers an async command 2019-05-30 04:27:20 +02:00
shortcutme
e2e1a5b38c
Fix streamZip variable name 2019-05-30 04:27:01 +02:00
shortcutme
4222c31b3e
Don't push body of content.json with updates if larger than 10kb 2019-05-30 04:26:41 +02:00
shortcutme
e5d3b0e7b8
Use openssl from tools directory on Windows 2019-05-30 04:24:58 +02:00
shortcutme
422064e092
Only show console after app close in debug mode 2019-05-30 04:24:42 +02:00
shortcutme
d548c6bdfa
Add new line after benchmark errors 2019-05-30 04:24:17 +02:00
shortcutme
efb7b147af
Fix broken zip file generation 2019-05-30 04:24:01 +02:00
shortcutme
0f8b220f59
Merge sidebar js 2019-05-29 16:03:48 +02:00
shortcutme
1bba253156
Rev4094 2019-05-29 16:03:32 +02:00
shortcutme
a2d29a4531
Only display sign error if there is no stored privatekey 2019-05-29 16:03:10 +02:00
shortcutme
589869c5ed
Move progress display to separate function 2019-05-29 16:02:34 +02:00
shortcutme
7b41922c2d
Use user certificate if possible for signing using sidebar, move sign and publish to separate functions 2019-05-29 16:02:10 +02:00
Lola Dam
7262fbfb4e Added optional redirect value for siteClone (#2032)
* Added optional redirect value; see #1891

* Return address of the newly cloned site
2019-05-28 18:05:52 +02:00
Lola Dam
9c0e8ee833 Better error message if private key not stored when assigning (#2033)
* Return a more instructive message in case the privatekey is not found when attempting to sign

* Fix typo
2019-05-28 18:04:49 +02:00
Ivanq
9119d72b9b Fix calling start.py to reopen browser (#2029)
* Fix calling start.py to reopen browser

* Move below
2019-05-24 16:36:29 +02:00
shortcutme
891aac4713
Rev4093 2019-05-21 15:57:49 +02:00
shortcutme
2fa006d74e
Fix loading json files with utf8 content 2019-05-21 15:54:36 +02:00
shortcutme
a6c97a304f
Remove empty exception from config.py 2019-05-21 15:53:53 +02:00
shortcutme
cfa4f8fa63
Fix log_dir exception on startup fail 2019-05-21 15:53:32 +02:00
shortcutme
ce0cf09b10
Fix sidebar zip generation 2019-05-21 15:52:58 +02:00
shortcutme
1567fb745d
Fix sidebar site download with utf8 title 2019-05-21 15:52:44 +02:00
ZeroNet
416f563261
Exclude third-party pybitcointools from flake8 test 2019-05-20 17:17:49 +02:00
ZeroNet
25d6eea906
Merge pull request #2026 from imachug/js-modules
ECMAScript modules
2019-05-20 17:06:17 +02:00
ZeroNet
4e819ac035
Merge pull request #2025 from imachug/fileneed-timeout
Handle fileNeed timeout
2019-05-20 17:05:23 +02:00
ZeroNet
dbcaa6bf85
Merge pull request #2006 from imachug/cryptmessage-bitcoin
Add privToPub and pubToAddr commands
2019-05-20 17:04:38 +02:00
Ivanq
3205187090 Rename commands to have ecc... prefix 2019-05-19 15:52:36 +03:00
Ivanq
ed85981409 Fix JS modules 2019-05-19 15:45:34 +03:00
Ivanq
5d920ff7df Handle gevent.Timeout error 2019-05-19 15:42:11 +03:00
ZeroNet
5456f0e106
Merge pull request #2020 from cclauss/patch-1
Travis CI: Use flake8 to find Python syntax errors
2019-05-17 17:30:13 +02:00
cclauss
8962c16670
Declare 'err' because Python 3 has stricter scoping rules
In Python 3, __err__ will go out of scope after the __try / except__ block. This change preserves the value after the end of the __try / except__ block.
2019-05-17 12:33:10 +02:00
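A minimal illustration of the scoping rule involved (a standalone example, not the patch itself): Python 3 deletes the name bound by `except ... as` when the block ends, so the value must be copied to an outer variable to survive.

```python
err = None  # declared up front so the value outlives the except block
try:
    int("not a number")
except ValueError as exc:
    err = exc  # 'exc' itself is deleted at the end of this block

print(err)  # fine; referencing 'exc' here would raise NameError
```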
cclauss
2ed1572c3c
Travis CI: Use flake8 to find Python syntax errors
Also, put the execution steps in order:
1. before_install -->
2. install -->
3. before_script -->
4. script -->
5. after success -->
6. notifications
2019-05-17 09:00:25 +02:00
ZeroNet
89cb673502
Merge pull request #2016 from cclauss/patch-1
.travis.yml: The 'sudo' tag is now deprecated in Travis CI
2019-05-16 23:16:14 +02:00
ZeroNet
98c9c8dd43
Merge pull request #2018 from cclauss/patch-2
Fix Python 3 syntax errors in zonename_updater.py
2019-05-16 23:13:10 +02:00
cclauss
fd46f141ea
Fix Python 3 syntax errors in zonename_updater.py 2019-05-16 19:39:16 +02:00
cclauss
41aec089bc
.travis.yml: The 'sudo' tag is now deprecated in Travis CI 2019-05-16 00:00:52 +02:00
cclauss
31697022fd
Add Python 3.4 2019-05-15 23:59:36 +02:00
cclauss
948a1c3d03
.travis.yml: The 'sudo' tag is now deprecated in Travis CI
[Travis are now recommending removing the __sudo__ tag](https://blog.travis-ci.com/2018-11-19-required-linux-infrastructure-migration).

"_If you currently specify __sudo: false__ in your __.travis.yml__, we recommend removing that configuration_"

Also, removed Python 3.4 because it is EOL  https://devguide.python.org/devcycle/#end-of-life-branches
2019-05-15 22:36:48 +02:00
shortcutme
20371895c9
Rev4090, Remove fs paths from error responses 2019-05-02 18:02:56 +02:00
shortcutme
617027eb52
Rev4089, Support compressed addresses in libsecp256k1 sign verification 2019-05-02 17:38:36 +02:00
shortcutme
6b9106b178
Test verify of compressed and uncompressed address signature 2019-05-02 17:18:31 +02:00
shortcutme
6207ccd559
Fix pex result parsing when there is no connection 2019-05-02 17:17:57 +02:00
shortcutme
043ac5a510
Log renaming 2019-05-02 17:17:00 +02:00
Ivanq
4eaeade618 Add privToPub and pubToAddr commands 2019-05-01 08:04:39 +03:00
shortcutme
f318f76994
Add missing function 2019-04-29 17:18:02 +02:00
shortcutme
fd085d2d37
Rev4086, Fix verify content.json files without files_optional 2019-04-29 16:54:07 +02:00
shortcutme
327f580218
Rev4085 2019-04-29 16:44:19 +02:00
shortcutme
7bef78e10f
Fix newsfeed sql query with many parameters 2019-04-29 16:44:13 +02:00
shortcutme
b54916b1dc
Show console after got hidden in non-debug mode 2019-04-29 16:43:34 +02:00
ZeroNet
c2ab102c0e
Merge pull request #1999 from filips123/py3
Support for detection of system's theme
2019-04-29 16:35:55 +02:00
ZeroNet
3f3e73455b
Merge pull request #2001 from imachug/build-all
Build wrapper all.js to support web notifications
2019-04-29 16:32:35 +02:00
Ivanq
4f09a5111b Build wrapper all.js to support web notifications 2019-04-27 18:19:16 +03:00
Filip Š
baf820bcdb Support for detection of system's theme 2019-04-26 18:23:25 +02:00
ZeroNet
538f69235f
Merge pull request #1985 from rllola/fix-zeroname-local
New ZeronameLocal plugin with connection to namecoin node
2019-04-26 12:58:50 +02:00
ZeroNet
2b9f1257be
Merge pull request #1993 from imachug/notifications
Support web notifications
2019-04-26 12:58:18 +02:00
Ivanq
6e58e8d50f Don't require WebNotifications permission 2019-04-26 12:55:33 +03:00
shortcutme
90420f1a89
Merge branch 'py3' of https://github.com/HelloZeroNet/ZeroNet into py3 2019-04-23 02:01:45 +02:00
shortcutme
021b822c4f
Rev4080 2019-04-23 02:01:40 +02:00
shortcutme
4ac54845fc
Fix double logging when testing 2019-04-23 02:00:59 +02:00
shortcutme
4c9d3ee3a6
Test big file renames 2019-04-23 02:00:11 +02:00
shortcutme
e688671972
Fix rename error variable problem in site storage 2019-04-23 01:59:59 +02:00
shortcutme
6bd63ff42a
Test file renames 2019-04-23 01:59:12 +02:00
shortcutme
efc5211451
Test optional file renames in OptionalManager plugin 2019-04-23 01:58:37 +02:00
shortcutme
f2bf5b12bd
Support optional file rename in OptionalManager plugin 2019-04-23 01:58:21 +02:00
shortcutme
dccda1af92
Pass optionalRemoved return value 2019-04-23 01:57:57 +02:00
shortcutme
4ca0e6b781
Support file renames in content.json if the sha512 hash is the same 2019-04-23 01:56:11 +02:00
shortcutme
4016e7c217
Test Bigfile plugin using tostring where possible 2019-04-23 01:55:05 +02:00
rllola
907a26a8b9 Take care of the exceptions so it won't crash ZeroNet if something goes wrong. 2019-04-20 20:23:59 +02:00
ZeroNet
dc23bfeb87
Merge pull request #1994 from imachug/ecdsa
A third small ECDSA fix
2019-04-20 13:20:03 +02:00
shortcutme
4bfd4bd714
Rev4074, Fix Ecdsa functions in cryptmessage plugin 2019-04-20 09:48:56 +02:00
Ivanq
9ddb984004 Rename Push notifications to Web notifications 2019-04-19 22:19:25 +03:00
Ivanq
e618c0e9ef Add closePushNotification 2019-04-19 22:19:25 +03:00
Ivanq
b55d2b53df Support Notification API 2019-04-19 22:19:25 +03:00
Ivanq
5733ec8363 Fix 2019-04-19 22:16:59 +03:00
Ivanq
d7d75a1fe8 Fix ECDSA on CryptMessage (#1987)
* Add ecdsaSign and ecdsaVerify

* Fix return

* Fix unicode

* Update CryptMessagePlugin.py

* Remove .encode("utf8")

* Fix keys during ECDSA signing
2019-04-19 19:34:07 +02:00
rllola
ef6ccb330b Don't raise an error if a domain has more than one subdomain; just return None 2019-04-18 15:27:49 +02:00
shortcutme
afbacdfc96
Rev4073 2019-04-18 12:23:55 +02:00
shortcutme
5842441062
Remove unused function 2019-04-18 12:22:57 +02:00
shortcutme
f083301b4c
Allow larger packets 2019-04-18 12:22:38 +02:00
shortcutme
8edbecce3c
Fix diffing 2019-04-18 12:21:59 +02:00
shortcutme
b114c52c0d
Allow pluginned classes in memory on reload 2019-04-18 12:21:50 +02:00
shortcutme
4671f47222
Fix bigfile piecefield standalone run test 2019-04-18 12:21:33 +02:00
rllola
4be0e1ee7f Forgot to cache in one 'if' 2019-04-17 18:34:53 +02:00
ZeroNet
dd4c213805
Merge pull request #1989 from radfish/PR-py3--translate-bytearray-fix
Ui, Translate: fix bytearray format string for Py 3.4
2019-04-17 11:29:16 +02:00
redfish
7e57a8f71e Ui,Translate: remove bytearray format string
Py 3.4 does not support bytearray format strings
for % operator: b"%s" % s
2019-04-16 20:54:55 -04:00
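For context (not part of the commit): `%`-formatting for `bytes` was only added in Python 3.5 by PEP 461, so on 3.4 the expression raises `TypeError` and needs concatenation or `join` instead.

```python
s = b"value"

# out = b"%s" % s        # works on Python 3.5+ (PEP 461); TypeError on 3.4

# 3.4-compatible alternatives:
out = b"prefix: " + s
out = b"".join((b"prefix: ", s))
```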
rllola
86d3d35619 Disable Zeroname plugin 2019-04-16 20:07:30 +02:00
rllola
f195111354 Using http.client instead of requests module 2019-04-16 16:23:09 +02:00
radfish
ec6fd48b86 Bigfile: fix piece field bitmask to be used as bytearray consistently (#1982)
* Bigfile: make Piecefield array a bytearray

We want an array of characters. Py2 strings made sense to
use as an array of characters, but Py3 strings are different
and no longer a good choice.

* Bigfile: store bits as binary instead of char

* BigFile: rename to/from string -> to/from bytes

Since the type was changed to bytearray.
2019-04-16 15:14:19 +02:00
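A small illustration of the data-structure choice (simplified, not the plugin's actual code): a `bytearray` is a mutable array of small integers, which maps naturally onto a per-piece downloaded bitmask, unlike a Py3 `str`.

```python
# One entry per bigfile piece: 0 = missing, 1 = downloaded.
piecefield = bytearray(10)

piecefield[3] = 1            # mark piece 3 as downloaded, mutated in place
have = piecefield[3] == 1    # cheap per-piece lookup
packed = bytes(piecefield)   # immutable copy for sending to peers
```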
radfish
1516d55a88 Sidebar: rename media-globe/ to media_globe/ (#1973)
So that it can be an importable package.
This is in preparation for setuptools packaging.
2019-04-16 11:34:55 +02:00
Ivanq
bdb0dc32a7 Add ECDSA actions to CryptMessage (#1984)
* Add ecdsaSign and ecdsaVerify

* Fix return

* Fix unicode

* Update CryptMessagePlugin.py
2019-04-15 22:55:01 +02:00
shortcutme
5ff2f792e6
Rev4070, Fix Multiuser plugin import order, Run coverage before optional plugins 2019-04-15 22:54:17 +02:00
shortcutme
8246505289
Rev4069 2019-04-15 22:48:55 +02:00
shortcutme
526a5d3fb1
Fix compatibility with Python <=3.5 2019-04-15 22:48:43 +02:00
shortcutme
f970815645
Run tests in debug mode 2019-04-15 22:48:16 +02:00
shortcutme
f83c77e7ea
Fix plugin error message 2019-04-15 22:48:09 +02:00
shortcutme
654cce92cd
Rev4068 2019-04-15 22:24:00 +02:00
shortcutme
8f0bfbc553
Test Multiuser and Bootstrapper plugins 2019-04-15 22:23:30 +02:00
shortcutme
bc39e52f56
Rev4066 2019-04-15 22:20:16 +02:00
shortcutme
a822238735
Use 1544 port for tor in tests 2019-04-15 22:19:38 +02:00
shortcutme
b168772d7f
Create user for tests if necessary 2019-04-15 22:19:16 +02:00
shortcutme
034e104c06
Log fileserver startup error for tests 2019-04-15 22:19:00 +02:00
shortcutme
0c0f117bc3
Don't parse config file for tests 2019-04-15 22:18:40 +02:00
shortcutme
1d4ab8833b
Test and enforce proper import plugin order in debug mode 2019-04-15 22:18:18 +02:00
shortcutme
90fee9788d
Always translate html files to avoid compatibility problems with brackets in url 2019-04-15 22:16:47 +02:00
shortcutme
bf7597e1b2
Add simple test for Multiuser plugin 2019-04-15 22:16:05 +02:00
shortcutme
54ff940c2b
Fix Bootstrapper plugin py3 compatibility 2019-04-15 22:15:42 +02:00
shortcutme
446641c31c
Always commit before Db VACUUM 2019-04-15 22:11:44 +02:00
shortcutme
572d55752c
Avoid random websocket test fails 2019-04-15 16:54:49 +02:00
shortcutme
04394d8ced
Rev4064 2019-04-15 16:29:01 +02:00
shortcutme
c7ea66bfef
Fix shutdown before file_server started 2019-04-15 16:07:18 +02:00
shortcutme
bfc5e2dce6
Support live changing offline mode 2019-04-15 15:49:53 +02:00
shortcutme
a7e8293d1a
Add offline mode info to server info 2019-04-15 15:49:34 +02:00
shortcutme
698f0cc230
Fix isServing check in site info formatting 2019-04-15 15:49:04 +02:00
shortcutme
f414f0746c
Don't update site in offline mode 2019-04-15 15:48:16 +02:00
shortcutme
235b8f359c
Don't create new connections in offline mode 2019-04-15 15:47:17 +02:00
shortcutme
2326cf3de8
Ignore incoming connections in offline mode 2019-04-15 15:47:05 +02:00
shortcutme
b8879853d5
Support closing all current connections in ConnectionServer 2019-04-15 15:46:53 +02:00
shortcutme
498fd4bf01
Don't listen ConnectionServer if not started 2019-04-15 15:46:37 +02:00
shortcutme
996f326c74
Store if UiServer is running 2019-04-15 15:45:07 +02:00
shortcutme
43b68faf73
Merge Ui all.js 2019-04-15 15:44:04 +02:00
shortcutme
8429ad7db7
Use global ZeroNet-Internal url to access Websocket 2019-04-15 15:42:24 +02:00
shortcutme
879b522914
Disable portcheck in offline mode 2019-04-15 15:16:38 +02:00
shortcutme
be584aa3d1
Change offline mode in /Config page 2019-04-15 15:12:08 +02:00
shortcutme
b82f57e7a2
Fix small file upload using bigfile plugin 2019-04-15 15:07:31 +02:00
shortcutme
998ec3eb4f
Disable all site serving using a global offline argument 2019-04-15 15:06:25 +02:00
radfish
f6e3a74567 [setuptools packaging] access modules via imports (#1969) 2019-04-15 12:31:33 +02:00
ZeroNet
6d8f55cf75
Merge pull request #1983 from krzotr/patch-3
OptionalManager file info - set bytes_downloaded to total file size if a file is fully downloaded
2019-04-15 12:21:28 +02:00
rllola
c4d8466195 Delete old plugin 2019-04-14 22:58:05 +02:00
rllola
36ff506dfe Added 'requests' dependency to requirements.txt 2019-04-14 22:44:04 +02:00
rllola
af1fb7aaa6 Also recognise the 'map' Namecoin standard way of registering a domain 2019-04-14 22:41:26 +02:00
rllola
1a944735df New ZeronameLocal plugin with connection to namecoin node 2019-04-14 16:58:58 +02:00
krzotr
1c8fba4286
OptionalManager file info - set bytes_downloaded to file size if a file is fully downloaded 2019-04-13 23:07:13 +02:00
shortcutme
0260b30335
Rev4059 2019-04-12 15:03:58 +02:00
shortcutme
0f72085c2a
Also return ws error to websocket connection 2019-04-12 15:03:43 +02:00
shortcutme
6ad8a10f37
Fix invalid ws request response value 2019-04-12 15:03:19 +02:00
shortcutme
8a38983dfc
Better logging of ws connection for siteCmd CLI action 2019-04-12 15:02:04 +02:00
shortcutme
9f5600b7f7
Rev4057, Log invalid result from websocket 2019-04-11 16:33:36 +02:00
shortcutme
a7632889a2
Rev4056 2019-04-11 01:30:04 +02:00
shortcutme
58a4bf479c
Only send env details if in debug mode 2019-04-11 01:29:56 +02:00
shortcutme
1ce4f99b80
Send noscript header to error messages and OPTIONS request 2019-04-11 01:29:32 +02:00
shortcutme
f94ecb3ec5
Fix error 404 on uimedia route 2019-04-11 01:28:00 +02:00
shortcutme
ce7c22fd57
Ignore items with no date_added in newsfeed 2019-04-11 01:18:52 +02:00
shortcutme
a5c7e59601
Rev4054, Escape error detail to avoid XSS (reported by krzotr) 2019-04-11 00:37:55 +02:00
ZeroNet
efbf70726f
Merge pull request #1975 from krzotr/patch-2
Updated to python 3.6 in Dockerfile
2019-04-10 23:43:50 +02:00
krzotr
490b1dc01b
Updated python version in Dockerfile 2019-04-10 23:08:09 +02:00
shortcutme
ec81965393
Rev4053 2019-04-10 19:59:56 +02:00
shortcutme
21536b8948
More clear logging when pinging unencrypted connections 2019-04-10 19:59:37 +02:00
shortcutme
100c2c8741
Set serial by command line to avoid .srl file creation 2019-04-10 19:59:02 +02:00
shortcutme
d47e4a3e0e
More detailed error logging on ssl handshake fail 2019-04-10 19:58:37 +02:00
shortcutme
0c9ea8f580
Merge branch 'py3' of https://github.com/HelloZeroNet/ZeroNet into py3 2019-04-10 14:57:32 +02:00
shortcutme
2320eb8723
Rev4052 2019-04-10 14:57:24 +02:00
shortcutme
17bbeefeca
Fix getWebsocket 2019-04-10 14:57:06 +02:00
shortcutme
31372e269d
Give notification to all connected clients about ZeroNet update 2019-04-10 14:56:47 +02:00
shortcutme
ac799a60da
Stop fs watcher with UiServer 2019-04-10 14:56:10 +02:00
radfish
6a1d716ba1 test: refer to test data path via variable (#1964) 2019-04-10 11:30:35 +02:00
ZeroNet
ed12cc1186
Merge pull request #1952 from tangdou1/patch-6
Update TrayiconPlugin.py for IPV6 compatibility
2019-04-09 17:53:04 +02:00
ZeroNet
8370ac8426
Merge pull request #1970 from radfish/PR-py3--import-plugins
[setuptools packaging] PluginManager: get plugins path via import
2019-04-09 17:51:05 +02:00
shortcutme
a20ff59572
Rev4050 2019-04-09 16:21:46 +02:00
shortcutme
8587f01caa
Fix update script target directory 2019-04-09 16:21:39 +02:00
shortcutme
c7078be407
Always verify client update 2019-04-09 15:07:56 +02:00
shortcutme
718a00974b
Merge js 2019-04-09 15:06:33 +02:00
shortcutme
d612676a80
Log closing websocket when updating event 2019-04-09 15:06:27 +02:00
shortcutme
d7bcfb415b
Fix js merging white space stripping 2019-04-09 15:06:09 +02:00
shortcutme
6928a17e61
Rev4048 2019-04-08 18:15:23 +02:00
shortcutme
d097092e8e
Merge js 2019-04-08 18:15:02 +02:00
shortcutme
79eb6573be
Support listing bad files with API 2019-04-08 18:14:45 +02:00
shortcutme
ffed8c9181
Add updatesite, dist_type, verify lib to serverinfo for admin sites 2019-04-08 18:14:31 +02:00
shortcutme
ff8573635d
Limit max width of notification 2019-04-08 18:13:58 +02:00
shortcutme
643244ffd1
Less visible changed files notification number 2019-04-08 18:13:18 +02:00
shortcutme
9fd059aef8
Give admin permission to updater site 2019-04-08 18:12:58 +02:00
shortcutme
6764a7ad2f
Fix js merging 2019-04-08 18:12:43 +02:00
shortcutme
5642d0aae6
Also ignore db -wal and -shm temp db files when signing 2019-04-08 18:12:29 +02:00
shortcutme
763e5f4ac0
Fix too short sleep 2019-04-08 18:12:00 +02:00
shortcutme
87abdb92e9
Fix bigfile upload 2019-04-08 18:11:46 +02:00
ZeroNet
447ab47d59
Merge pull request #1971 from radfish/PR-py3--bump-interpreter
[setuptools packaging] zeronet: bump script interpreter to python3
2019-04-08 01:57:58 +02:00
redfish
9ed88f25f0 zeronet: bump script interpreter to python3 2019-04-07 19:41:19 -04:00
redfish
73814550e5 PluginManager: get plugins path via import
* skip __pycache__ when loading
2019-04-07 18:50:23 -04:00
ZeroNet
226f7dea65
Merge pull request #1962 from radfish/PR-py3--sys-geoip
Sidebar: use geoip db from system if exists
2019-04-07 12:04:44 +02:00
ZeroNet
ed3de771e8
Merge pull request #1963 from radfish/PR-py3--dead-ssl-code
[setuptools packaging] remove some  dead code related to openssl, pyelliptic
2019-04-07 12:03:17 +02:00
ZeroNet
edf3cf3b65
Merge pull request #1966 from radfish/PR-py3--ui-pathlib
[setuptools packaging] Ui: extend actionFile to accept pathlib.Path
2019-04-07 12:01:07 +02:00
ZeroNet
7a54615156
Merge pull request #1965 from radfish/PR-py3--cfg-paths-no-dec
[setuptools packaging] config: path.expanduser returns py3 strings
2019-04-07 11:56:18 +02:00
redfish
1e1f967292 Ui: extend actionFile to accept pathlib.Path 2019-04-06 22:22:34 -04:00
redfish
6f5d4fdc51 config: path.expanduser returns py3 strings
And strings have no decode method.
2019-04-06 22:16:37 -04:00
redfish
e7a6be035e zeronet: no openssl.closeLibrary in pyelliptic
Not in pyelliptic 2.0.1 (PyBitmessage fork of pyelliptic)
2019-04-06 20:13:56 -04:00
redfish
ee762f349c zeronet: remove ref to opensslVerify
This module is no longer used. Not in lib/
2019-04-06 20:13:56 -04:00
redfish
4d98b05e6c Sidebar: use geoip db from system if exists 2019-04-06 17:24:25 -04:00
ZeroNet
4f4591658d
Merge pull request #1960 from imachug/start-browser
Fix double --open_browser
2019-04-06 23:13:12 +02:00
Ivanq
0c70e95232
Use spaces instead of tabs 2019-04-06 15:02:18 +03:00
Ivanq
594e8b8c20 Fix double --open_browser 2019-04-06 08:30:45 +03:00
tangdou1
4c358b9f08
Big File is bigger than 10MB
Big File is an optional file which is bigger than 10MB, so the default value should be at least 10MB.
2019-04-06 11:57:58 +08:00
shortcutme
7b1594c69c
Rev4044 2019-04-04 13:29:36 +02:00
shortcutme
6d27feba97
New updater site for Python3 version 2019-04-04 13:29:26 +02:00
shortcutme
4363dcbbc1
Distribution type config value for future support in the updater script 2019-04-04 13:29:11 +02:00
shortcutme
a208f47b6a
Fix sidebar opening for fast mouse movements 2019-04-04 13:28:38 +02:00
shortcutme
84268cd43c
Updater script 2019-04-04 13:28:02 +02:00
shortcutme
380c32dee2
Worker stats on stop 2019-04-04 13:27:46 +02:00
shortcutme
bfc7e7c33f
Only start worker if there is valid task for it 2019-04-04 13:27:21 +02:00
shortcutme
8594e4ce4a
Add reason for startWorkers 2019-04-04 13:27:06 +02:00
shortcutme
752dabe554
Openssl dll find patch to libeay32 2019-04-04 13:25:10 +02:00
shortcutme
042db64a00
Fix multiuser plugin py3 compatibility 2019-04-04 13:24:42 +02:00
shortcutme
f55fd8d861
Avoid re-defining variable name 2019-04-04 13:24:26 +02:00
ZeroNet
cc41572d48
Merge pull request #1953 from tangdou1/patch-4
ipv6 compatibility
2019-04-03 15:54:54 +02:00
tangdou1
42de962cbf
ipv6 compatibility 2019-04-03 19:56:14 +08:00
tangdou1
f527b8225f
IPV6 compatibility 2019-04-03 19:54:19 +08:00
ZeroNet
dd9ccfa3d2
Merge pull request #1947 from radfish/PR-py3--cryptmsg-base64-type
CryptMessage: base64 arg type byte-array; File: set error message before use
2019-04-01 01:15:59 +02:00
ZeroNet
24b6780c1f
Merge pull request #1946 from tangdou1/patch-5
Update README.md
2019-04-01 01:15:20 +02:00
redfish
941571f71f file: set error message before using it
Fixes this exception:

Unhandled exception: [(<class 'UnboundLocalError'>,
UnboundLocalError("local variable 'err' referenced before assignm>
 Traceback (most recent call last):
   File "src/gevent/greenlet.py", line 766, in gevent._greenlet.Greenlet.run
   File "/opt/zeronet/src/util/RateLimit.py", line 57, in <lambda>
     thread = gevent.spawn_later(time_left, lambda: callQueue(event))  # Call this function later
   File "/opt/zeronet/src/util/RateLimit.py", line 42, in callQueue
     return func(*args, **kwargs)
   File "/opt/zeronet/src/File/FileRequest.py", line 185, in actionUpdate
     self.response({"error": "File invalid: %s" % err})
UnboundLocalError: local variable 'err' referenced before assignment
2019-03-31 16:25:26 -04:00
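A minimal sketch of the fix implied by the commit title (`err` comes from the traceback; the function and conditions are hypothetical): bind `err` on every path before it is interpolated into the response.

```python
def check_file(payload):
    err = "Unknown error"          # bound up front, so later use can't raise
    if not payload:
        err = "Empty payload"
    elif len(payload) > 1024:
        err = "Too large"
    return {"error": "File invalid: %s" % err}

print(check_file(b""))  # {'error': 'File invalid: Empty payload'}
```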
redfish
65be9f438b CryptMessage: pass byte-array type to base64
Fixes this error upon sending a message in ZeroMail:
WebSocket handleRequest error: TypeError: a bytes-like object is
required, not 'str' in UiWebsocket.py line 83 > UiWebsocket.py line 269
> CryptMessage/CryptMessagePlugin.py line 80 >
CryptMessage/CryptMessagePlugin.py line 80 > base64.py line 58
2019-03-31 14:05:15 -04:00
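The underlying rule, illustrated as a standalone example (not the plugin code): in Python 3, `base64.b64encode` accepts only bytes-like input, so text must be encoded first.

```python
import base64

# base64.b64encode("hello")  # TypeError: a bytes-like object is required, not 'str'
encoded = base64.b64encode("hello".encode("utf8"))
print(encoded)  # b'aGVsbG8='
```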
tangdou1
15d13ac9f6
Update README.md 2019-03-31 13:47:47 +08:00
ZeroNet
6df3acaf1e
Merge pull request #1943 from radfish/PR-py3--req
requirements: pyelliptic any version, websocket_client
2019-03-31 00:24:15 +01:00
redfish
b6ee24dcd5 readme: mention distro packages; also formatting 2019-03-30 09:50:10 -04:00
redfish
1a3e5b7893 requirements: main websocket package
'websocket' doesn't look like the right one.
2019-03-30 09:50:10 -04:00
shortcutme
5df8e10b95
Rev4033 2019-03-29 02:31:46 +01:00
shortcutme
bddf2d6537
Fix fileGet command 2019-03-29 02:31:14 +01:00
shortcutme
3d05bdcb63
Log file command errors 2019-03-29 02:31:05 +01:00
shortcutme
52e28eefce
Use lowercase filename for content type guess 2019-03-29 02:30:43 +01:00
shortcutme
b5b0626251
Add name for websocket event on site deletion 2019-03-29 02:30:26 +01:00
shortcutme
4a4f311cf8
Better logging of cert generation 2019-03-29 02:29:55 +01:00
shortcutme
bad4d14cf6
Save OpenSSL rand file in data directory to avoid error message on Windows 2019-03-29 02:29:32 +01:00
shortcutme
b814a633c6
Don't reset broken ssl client list on cleanup 2019-03-29 02:28:46 +01:00
shortcutme
73524d70dc
Switch back to default log file naming because of broken rotate file deleting 2019-03-29 02:28:03 +01:00
shortcutme
9fad83e46c
More detailed logging on archive open error 2019-03-29 02:27:26 +01:00
shortcutme
9fbf4771f2
Fix directory commands on packed files 2019-03-29 02:26:50 +01:00
shortcutme
41cd7da5bd
Rev4026 2019-03-27 03:13:14 +01:00
shortcutme
3d975fd767
Fix libeay32.dll location 2019-03-27 03:12:21 +01:00
shortcutme
f8f857c820
Fix atomicWrite non existing tmpold removal 2019-03-27 03:12:02 +01:00
shortcutme
9546ed0bb6
Try to connect to Tor before starting our own under Windows 2019-03-27 03:11:38 +01:00
shortcutme
a3f957427f
Make sure the test went without unnecessary reconnects 2019-03-27 03:10:58 +01:00
shortcutme
9b36c55422
Fix pytest warning 2019-03-27 03:10:29 +01:00
shortcutme
b6286372fb
Better cleanup after tests 2019-03-27 03:10:21 +01:00
shortcutme
74e71a1971
Fix http tracker announce compatibility 2019-03-27 03:09:47 +01:00
shortcutme
9050f1a039
Show crypto cipher on peerPing command 2019-03-27 03:09:28 +01:00
shortcutme
de303bf453
Modern crypto ciphers 2019-03-27 03:09:09 +01:00
shortcutme
3d8d3a9237
Randomize SSL subject (merged ValdikSS's commit) 2019-03-27 03:08:37 +01:00
shortcutme
cf354d59fb
Fix incoming connection implicit crypt 2019-03-27 03:06:46 +01:00
shortcutme
706852d9a7
Fix Stats page rendering in non-debug mode, reduce source code size 2019-03-27 03:06:22 +01:00
shortcutme
63e405c27e
Rev4022 2019-03-23 03:42:26 +01:00
shortcutme
33e8c6fb73
Fix ipv6 port checker 2019-03-23 03:41:52 +01:00
shortcutme
faba28dd94
Proper handle of sigterm signal, log reason of shutdown 2019-03-23 03:41:42 +01:00
shortcutme
16f36824e6
Fix benchmark on Python 3.5 2019-03-23 03:40:42 +01:00
shortcutme
5c1ec0b141
Ecies encrypted string length can be different in rare cases 2019-03-23 03:38:30 +01:00
shortcutme
e24d1016a5
Fix bigfile upload post request return value 2019-03-23 03:38:04 +01:00
shortcutme
a82ee338ef
Rev4017, Fix Bigfile test, Python 3.4 compatibility 2019-03-21 02:48:21 +01:00
shortcutme
60405bf222
Rev4016, Add and test Python 3.4 compatibility 2019-03-21 02:22:22 +01:00
shortcutme
1da6c8c84e
Fix Python 3.7 test 2019-03-20 01:07:18 +01:00
shortcutme
32329c1817
Rev4015 2019-03-20 01:06:56 +01:00
shortcutme
4aee7a6c61
Make openLocked always return BlockingIOError on fail 2019-03-20 01:05:52 +01:00
shortcutme
e6c2937c1b
Rev4014 2019-03-20 00:50:44 +01:00
shortcutme
1bbf9b62ad
Test on multiple python versions 2019-03-20 00:50:32 +01:00
shortcutme
fa9e024b42
Base58 package is required for libsecp256k1 verify 2019-03-20 00:50:18 +01:00
shortcutme
8c52038671
Switch to WAL mode as it's faster on older sqlite 2019-03-20 00:49:51 +01:00
shortcutme
7aff97b6ff
Fix loading json files to db on Python 3.5 2019-03-20 00:49:27 +01:00
shortcutme
77530f13ee
Fix content.json update and verify on Python 3.5 2019-03-20 00:48:51 +01:00
shortcutme
0a1c22530a
Clearer parameter name for verify 2019-03-20 00:48:09 +01:00
shortcutme
e6c0fe0370
OpenSSL config file to lib dir 2019-03-20 00:47:43 +01:00
shortcutme
63ba0a5551
Fix tests on Python 3.5 2019-03-20 00:46:57 +01:00
shortcutme
c7bfe0d537
Fix Upnp test 2019-03-20 00:46:16 +01:00
shortcutme
05887c976a
Test on Python 3.5 using travis, temporary disable docker build until stable release 2019-03-19 17:02:39 +01:00
ZeroNet
ccc8fda24f
Merge pull request #1926 from 0polar/patch-4
Fix unable to open context menu on Windows
2019-03-19 16:42:21 +01:00
ZeroNet
abb458bdd3
Merge pull request #1925 from 0polar/patch-1
Fix "no module" error
2019-03-19 16:41:48 +01:00
ZeroNet
41429dd254
Merge pull request #1922 from 0polar/patch-2
Fix wrong module name
2019-03-19 16:41:05 +01:00
ZeroNet
268a39e93b
Merge pull request #1923 from 0polar/patch-3
pip install the right way
2019-03-19 16:40:39 +01:00
0polar
8411c60d4a
Fix unable to open context menu on Windows
Python 3 strings need no decoding or encoding
2019-03-19 20:56:30 +08:00
0polar
de91f7ec15
Fix "no module" error
Since it was added to pip requirements.txt, there is no need to `from...`
2019-03-19 20:36:26 +08:00
0polar
6094af819b
Update requirements.txt 2019-03-19 20:15:55 +08:00
0polar
5f21563d7d
Update README.md 2019-03-19 20:11:01 +08:00
0polar
5b9afe70b2
pip install the right way 2019-03-19 20:00:30 +08:00
0polar
27f47460e3
Fix wrong module name
Exception: You probably meant to install and run gevent-websocket
2019-03-19 19:59:12 +08:00
shortcutme
ad1bd045f7
Rev4011 2019-03-18 03:38:11 +01:00
shortcutme
9a9a8bfdc7
Fix peer loading 2019-03-18 03:37:05 +01:00
shortcutme
c88152cac2
Use shared cursor where possible 2019-03-18 03:36:44 +01:00
shortcutme
61c72ac3ea
Fix SQLite concurrency errors 2019-03-18 03:36:12 +01:00
shortcutme
84c39f3baa
Less sensitive db progress handler 2019-03-18 03:33:51 +01:00
shortcutme
7d6ef195fd
Don't allow to run db on different thread 2019-03-18 03:33:28 +01:00
shortcutme
52ac972332
Keep need commit status if commit failed 2019-03-18 03:33:06 +01:00
shortcutme
9aa599f9d2
Close and commit all db at exit 2019-03-18 03:32:42 +01:00
shortcutme
a5ce7e5a1f
Rev4006 2019-03-18 01:08:35 +01:00
shortcutme
f8511bf199
Display error if try to start with Python2 2019-03-18 01:08:20 +01:00
shortcutme
cfdc6bac7b
Remove test lock files after test 2019-03-18 01:08:02 +01:00
shortcutme
33e4c088b9
Upnp opening function return success value 2019-03-18 01:07:46 +01:00
shortcutme
a620bf2174
Fix lang html variable on config page 2019-03-18 01:06:45 +01:00
shortcutme
e77d63294e
Fix config page 2019-03-18 01:06:04 +01:00
shortcutme
82c55ba038
Rev4003, Fix peer sorting if no ping delay yet 2019-03-16 04:57:59 +01:00
shortcutme
e1394d7a7d
Fix socks package name 2019-03-16 04:48:56 +01:00
shortcutme
9f99fa8edc
Remove not working tracker 2019-03-16 04:26:59 +01:00
shortcutme
02e67a901f
Import global maxminddb module 2019-03-16 04:26:27 +01:00
shortcutme
f331f5e92c
Add maxminddb as requirement 2019-03-16 04:26:13 +01:00
shortcutme
6e5bf5fef6
Decode msgpack hash key values as bytes 2019-03-16 04:22:49 +01:00
shortcutme
c7b4e28f82
Version 0.7.0, Rev4001 2019-03-16 03:53:37 +01:00
shortcutme
9235ecfe7b
Add warning, installation method to readme 2019-03-16 03:53:22 +01:00
shortcutme
7f234721ec
Add required modules 2019-03-16 03:48:58 +01:00
shortcutme
242b3edbc4
Fix BigFilePiecefiled typo 2019-03-16 03:44:13 +01:00
shortcutme
b7894faa96
Fix AnnounceShare backward compatibility 2019-03-16 03:44:01 +01:00
shortcutme
f3a4b9c709
Fix announce py3 compatibility 2019-03-16 03:43:11 +01:00
shortcutme
ea638dd0e0
Fix Noparallel test on slower machines 2019-03-16 03:02:59 +01:00
shortcutme
f0b53c4cbb
Add bundled pybitcointools 2019-03-16 03:01:50 +01:00
shortcutme
3eae349a0a
Remove included win_inet_pton, websocket, rsa, socks, pyelliptic, pybitcointools, pyasn1, opensslVerify, merkletools, geventwebsocket, BitcoinECC, bencode module 2019-03-16 02:58:49 +01:00
shortcutme
ff5004cb8d
Remove included maxminddb 2019-03-16 02:52:12 +01:00
shortcutme
567855e2d2
TestHelper formatting 2019-03-16 02:49:41 +01:00
shortcutme
d20da5d803
1ms is the minimum sleep with new gevent 2019-03-16 02:46:33 +01:00
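In practice that means clamping requested sleeps; a small sketch of the idea (helper name is illustrative):

```python
import gevent

def sleep_at_least(seconds):
    # Newer gevent timers have roughly 1 ms resolution, so shorter sleeps
    # get rounded; clamping makes the minimum explicit.
    gevent.sleep(max(seconds, 0.001))
```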
shortcutme
b98a9d2e80
Commit before vacuum 2019-03-16 02:45:37 +01:00
shortcutme
955164aa3c
New configuration option to use libsecp256k1 for speedup 2019-03-16 02:45:06 +01:00
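The speedup comes from doing ECDSA verification in the C library instead of pure Python. A rough illustration using the third-party coincurve bindings (an assumption for this example; ZeroNet ships its own wrapper):

```python
from coincurve import PrivateKey

priv = PrivateKey()
message = b"signed content.json payload"
signature = priv.sign(message)  # DER-encoded ECDSA signature, sha256 hasher by default
# verify() calls into libsecp256k1, typically several times faster
# than a pure-Python ECDSA implementation
assert priv.public_key.verify(signature, message)
```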
shortcutme
db8f9988eb
Use log extension for rolled log file names 2019-03-16 02:44:22 +01:00
shortcutme
9b2cae8e33
Don't log geventwebsocket module debug messages 2019-03-16 02:43:45 +01:00
shortcutme
ac325c5c5e
Py3 compatibility of FileRequest module 2019-03-16 02:43:07 +01:00
shortcutme
e92f3ea100
New watchdog module based file change watching 2019-03-16 02:42:43 +01:00
shortcutme
75d8338f2d
Debug stack formatting include module names 2019-03-16 02:41:09 +01:00
shortcutme
a1b5dad1c8
New Db connection type to avoid corruption 2019-03-16 02:40:32 +01:00
shortcutme
0e2f7fb122
Use global rsa module 2019-03-16 02:39:11 +01:00
shortcutme
ee631730c7
Remove sha1 sum function 2019-03-16 02:38:47 +01:00
shortcutme
f7fd445c73
Test sha sum parameter type properly 2019-03-16 02:38:38 +01:00
shortcutme
5c0fc38272
Remove not used ECC cert generation 2019-03-16 02:37:48 +01:00
shortcutme
6df0321962
Py3 compatibility of CryptConnection module 2019-03-16 02:37:38 +01:00
shortcutme
65d19d350c
We don't support old style sign verification anymore 2019-03-16 02:36:45 +01:00
shortcutme
bc93796727
Add faster libsecp256k1 support for signature verification, Remove old style signing support 2019-03-16 02:36:11 +01:00
shortcutme
6f0531c663
Test CryptMessage ui_websocket result in a more reliable way 2019-03-16 02:33:38 +01:00
shortcutme
545acebbaf
New CryptMessage test functions for ecies crypto 2019-03-16 02:33:09 +01:00
shortcutme
af49404320
Remove support of old request type 2019-03-16 02:32:10 +01:00
shortcutme
717802860d
Create new unpacker object if the client is sending new-style, bin-type compatible msgpack streams 2019-03-16 02:31:49 +01:00
shortcutme
edd3f35790
Use new Msgpack library for unpacking 2019-03-16 02:30:54 +01:00
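The streaming API of the msgpack library feeds raw socket bytes into an unpacker and yields complete messages as they arrive; a small self-contained sketch (the wire bytes here are a hand-built example message):

```python
import msgpack

unpacker = msgpack.Unpacker(raw=True)  # keep keys/values as bytes, like the old unpacker
unpacker.feed(b"\x82\xa3cmd\xa4ping\xa6req_id\x01")  # example wire bytes from a peer
for message in unpacker:
    print(message)  # {b'cmd': b'ping', b'req_id': 1}
```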
shortcutme
20806a8c97
ZeroName plugin Py3 compatibility 2019-03-16 02:27:26 +01:00
shortcutme
f071cc5c04
Make Stats page Py3 compatible, Add libsecp256k1 testing, Xy packing format, Reduce code duplication 2019-03-16 02:27:04 +01:00
shortcutme
90c9078bf5
Remove unnecessary logging of Sidebar loadGlobe 2019-03-16 02:25:20 +01:00
shortcutme
203e70afbc
Py3 compatibility of PeerDb plugin 2019-03-16 02:24:36 +01:00
shortcutme
2599e54fd0
Py3 compatibility of FilePack plugin 2019-03-16 02:24:17 +01:00
shortcutme
2737425242
Py3 compatibility of UiPassword plugin 2019-03-16 02:23:46 +01:00
shortcutme
40569eee2e
Py3 compatibility of CryptMessage plugin, Rename ecies crypto function names to make it more clear 2019-03-16 02:23:00 +01:00
shortcutme
883c2851ff
Py3 compatibility of ContentFilter plugin 2019-03-16 02:20:32 +01:00
shortcutme
4fe4d0a7e7
BEGIN / END no longer necessary as there is no autocommit in new db module 2019-03-16 02:18:53 +01:00
shortcutme
a46d8fe7f3
Hash id of the hashmap changed because of use_bin_type msgpack packing 2019-03-16 02:15:37 +01:00
shortcutme
dd70d27a0e
Use new Msgpack module for testing Bigfile plugin 2019-03-16 02:14:43 +01:00
shortcutme
b46ee0c495
Use Msgpack module in Bigfile plugin 2019-03-16 02:14:08 +01:00
shortcutme
28ffb3fd18
Ignore sha3 warning of merkletools module 2019-03-16 02:13:17 +01:00
shortcutme
d1456850d1
Py3 compatibility in Bigfile piecefield 2019-03-16 02:12:45 +01:00
shortcutme
050e2febab
Log add types with zero announce request 2019-03-16 02:12:03 +01:00
shortcutme
f56c8ef755
Save shared trackers files as utf8 2019-03-16 02:11:38 +01:00
shortcutme
1a9529157f
Backward compatibility with tracker sharing response 2019-03-16 02:11:22 +01:00
shortcutme
8c5c3cb7a6
Use Msgpack module in BroadcastServer plugin 2019-03-16 02:09:27 +01:00
shortcutme
8ab9b06185
Subtl module py3 compatibility 2019-03-16 02:06:28 +01:00
shortcutme
91c5556f21
Remove old gevent compatibility patches 2019-03-16 02:05:27 +01:00
shortcutme
95cf47d9a4
Test site download with all available crypto libs 2019-03-16 01:01:30 +01:00
shortcutme
dfad2370aa
Test file locking 2019-03-16 01:01:06 +01:00
shortcutme
331e25cc41
Test content rules with all available crypto libs 2019-03-16 01:00:49 +01:00
shortcutme
6dcf7e8088
Remove testing of old signature 2019-03-16 01:00:21 +01:00
shortcutme
99690a6145
Test longer string signing 2019-03-16 00:59:27 +01:00
shortcutme
27bcc3c685
Test sign and verify with all crypto libs available 2019-03-16 00:59:11 +01:00
shortcutme
bb60558968
Test hashing functions 2019-03-16 00:57:50 +01:00
shortcutme
af38a3927a
Test utf8 diffing 2019-03-16 00:57:03 +01:00
shortcutme
bf6771152e
Test backward compatibility to py2 byte-less msgpack unpacker 2019-03-16 00:56:50 +01:00
shortcutme
48b6c81b36
Test msgpack streaming with binary data 2019-03-16 00:56:25 +01:00
shortcutme
d95da7372a
Feed Msgpack unpacker as byte 2019-03-16 00:54:27 +01:00
shortcutme
231037b0fe
Test Msgpack result 2019-03-16 00:54:12 +01:00
shortcutme
c481d20ce8
Use new libs in Msgpack tests 2019-03-16 00:54:00 +01:00
shortcutme
dc32556983
Add utf8 and binary data to msgpack test vector 2019-03-16 00:53:18 +01:00
shortcutme
d7b43f4722
Same priority file download order does not matter 2019-03-16 00:51:32 +01:00
shortcutme
ca29fcec7d
findHashId order does not matter 2019-03-16 00:50:25 +01:00
shortcutme
bd637d661a
Test translate of utf8 strings 2019-03-16 00:49:09 +01:00
shortcutme
ef8174af70
All problematic characters will be escaped 2019-03-16 00:48:56 +01:00
shortcutme
dee562437b
Rename hashfield pack functions to bytes 2019-03-16 00:42:21 +01:00
shortcutme
a1a4a73260
Save sites.json as utf8 2019-03-16 00:41:26 +01:00
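Saving with an explicit encoding and `ensure_ascii=False` keeps non-ASCII site titles human-readable on disk; a minimal sketch (the sample data is illustrative):

```python
import json

sites = {"1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d": {"title": "ZeroHello 站点"}}
with open("sites.json", "w", encoding="utf8") as f:
    # ensure_ascii=False writes utf8 characters directly instead of
    # escaping them to \uXXXX sequences
    json.dump(sites, f, indent=2, ensure_ascii=False)
```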
shortcutme
627edeb0f2
Py3 support in announce requests 2019-03-16 00:33:12 +01:00
shortcutme
16f29b65f2
Use if in protocol port detection 2019-03-16 00:32:50 +01:00
shortcutme
6d2a863af5
Sleep a bit before creating new file_server to allow connection closing 2019-03-16 00:15:43 +01:00
shortcutme
35e61a0c69
More reliable UiWebsocket testing 2019-03-16 00:15:19 +01:00
shortcutme
c474699695
Different crypto lib fixture 2019-03-16 00:14:58 +01:00
shortcutme
1e2dadf75e
Log test to log/cmd.log 2019-03-16 00:14:11 +01:00
shortcutme
002303a765
Db rebuilding error display, reconnect bug fix 2019-03-16 00:11:52 +01:00
shortcutme
59426c31f7
SiteStorage Py3 compatibility 2019-03-16 00:10:49 +01:00
shortcutme
fd895d0ef5
TorManager only disconnects if connected 2019-03-16 00:08:04 +01:00
shortcutme
8220272953
Py3 encoding support in TorManager 2019-03-16 00:05:23 +01:00
shortcutme
c8fc1ebefa
Remove tor downloading for windows 2019-03-16 00:04:09 +01:00
shortcutme
ac9531eb98
Use global socks module 2019-03-16 00:03:05 +01:00
shortcutme
56d68ce161
Open translate language file as utf8 2019-03-16 00:02:13 +01:00
shortcutme
a3ef3b34e1
Support multi-line notification 2019-03-16 00:01:52 +01:00
shortcutme
30e348f965
Remove empty line from wrapper template 2019-03-16 00:01:15 +01:00
shortcutme
b981ddadca
Encode error response to bytes 2019-03-16 00:00:35 +01:00
shortcutme
bcd721e2ef
Always display title if there is content.json file 2019-03-16 00:00:04 +01:00
shortcutme
a96ff8399f
Open template as utf8 file 2019-03-15 23:59:30 +01:00
shortcutme
2f4dec45a6
Decode path as utf8 2019-03-15 23:58:18 +01:00
shortcutme
b216e42397
Fix modified files checking 2019-03-15 23:57:30 +01:00
shortcutme
9b6c624554
Allow cloning if content.json update fails 2019-03-15 23:57:06 +01:00
shortcutme
12154613c2
Write user file as binary 2019-03-15 23:56:16 +01:00
shortcutme
a42dee5a44
unpackOnionAddress Py3 support 2019-03-15 23:55:40 +01:00
shortcutme
d4d86172f0
Cmp function backport and Utf8 to Byte response decorator helper functions 2019-03-15 23:55:23 +01:00
shortcutme
a49f454826
Lock files in binary mode, with one byte 2019-03-15 23:53:48 +01:00
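A sketch of the one-byte lock approach on Windows (POSIX would use fcntl instead; names here are illustrative):

```python
import msvcrt  # Windows-only module

def lock_data_dir(path):
    # Binary mode plus a single locked byte is enough to mark the
    # data directory as "in use" across processes
    f = open(path, "wb")
    f.write(b" ")
    f.flush()
    f.seek(0)
    msvcrt.locking(f.fileno(), msvcrt.LK_NBLCK, 1)  # non-blocking, 1 byte
    return f
```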
shortcutme
cd9a965057
atomicWrite Py3 support, full stack logging 2019-03-15 23:50:33 +01:00
shortcutme
f5bc26e9fe
Use binary format for atomicWrite 2019-03-15 23:49:55 +01:00
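The binary atomic-write pattern: write to a side file, flush it to disk, then rename over the target so readers never see a half-written file. A simplified sketch (the `-tmpnew` suffix is illustrative):

```python
import os

def atomicWrite(path, content):
    tmp_path = path + "-tmpnew"
    with open(tmp_path, "wb") as f:  # content is bytes on Py3
        f.write(content)
        f.flush()
        os.fsync(f.fileno())  # make sure the bytes hit the disk before the rename
    os.replace(tmp_path, path)  # atomic rename; overwrites on both POSIX and Windows
```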
shortcutme
c55d69d587
Python3 has builtin inet_pton support 2019-03-15 23:49:13 +01:00
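This is why the bundled win_inet_pton shim could be dropped; the stdlib call behaves the same on every platform ZeroNet targets:

```python
import socket

packed = socket.inet_pton(socket.AF_INET6, "2001:db8::1")  # 16 raw bytes
assert socket.inet_ntop(socket.AF_INET6, packed) == "2001:db8::1"
```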
shortcutme
e508357cfb
RateLimit py3 support 2019-03-15 23:48:46 +01:00
shortcutme
1c578b2b3f
Remove SSL patches for old Python/Gevent support, patch ctypes find_library to find openssl libs 2019-03-15 23:48:28 +01:00
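The find_library patch can be imagined like this (a hypothetical sketch; the real code resolves the actual bundled library paths):

```python
import ctypes.util

_find_library_original = ctypes.util.find_library

def _find_library_patched(name):
    # Try well-known OpenSSL 1.1 sonames first, then fall back to the
    # stock resolver for every other library
    if name == "ssl":
        return "libssl.so.1.1"
    if name == "crypto":
        return "libcrypto.so.1.1"
    return _find_library_original(name)

ctypes.util.find_library = _find_library_patched
```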
shortcutme
4ce2ef732d
Rename StreamingMsgpack to Msgpack, add helpers 2019-03-15 23:33:04 +01:00
shortcutme
65705aba10
Fix UpnpPunch py3 compatibility 2019-03-15 23:32:05 +01:00
shortcutme
e97873fb7e
UpnpPunch gets custom logger 2019-03-15 23:31:13 +01:00
shortcutme
13d1df3cef
Only log worker download status if a small amount of tasks is present 2019-03-15 23:30:34 +01:00
shortcutme
7ffb7db888
Add task statistics logging every 15sec 2019-03-15 23:29:04 +01:00
shortcutme
74366379ba
Only log added task in verbose mode 2019-03-15 23:26:33 +01:00
shortcutme
6b89d05a3c
Disable update 2019-03-15 23:10:29 +01:00
shortcutme
b0b9a4d33c
Change to Python3 coding style 2019-03-15 21:06:59 +01:00
shortcutme
fc0fe0557b
Ignore *.bak files 2019-03-15 18:47:31 +01:00
504 changed files with 11941 additions and 55911 deletions


@ -0,0 +1,40 @@
name: Build Docker Image on Commit
on:
push:
branches:
- main
tags:
- '!' # Exclude tags
jobs:
build-and-publish:
runs-on: docker-builder
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set REPO_VARS
id: repo-url
run: |
echo "REPO_HOST=$(echo "${{ github.server_url }}" | sed 's~http[s]*://~~g')" >> $GITHUB_ENV
echo "REPO_PATH=${{ github.repository }}" >> $GITHUB_ENV
- name: Login to OCI registry
run: |
echo "${{ secrets.OCI_TOKEN }}" | docker login $REPO_HOST -u "${{ secrets.OCI_USER }}" --password-stdin
- name: Build and push Docker images
run: |
# Build Docker image with commit SHA
docker build -t $REPO_HOST/$REPO_PATH:${{ github.sha }} .
docker push $REPO_HOST/$REPO_PATH:${{ github.sha }}
# Build Docker image with nightly tag
docker tag $REPO_HOST/$REPO_PATH:${{ github.sha }} $REPO_HOST/$REPO_PATH:nightly
docker push $REPO_HOST/$REPO_PATH:nightly
# Remove local images to save storage
docker rmi $REPO_HOST/$REPO_PATH:${{ github.sha }}
docker rmi $REPO_HOST/$REPO_PATH:nightly


@ -0,0 +1,37 @@
name: Build and Publish Docker Image on Tag
on:
push:
tags:
- '*'
jobs:
build-and-publish:
runs-on: docker-builder
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set REPO_VARS
id: repo-url
run: |
echo "REPO_HOST=$(echo "${{ github.server_url }}" | sed 's~http[s]*://~~g')" >> $GITHUB_ENV
echo "REPO_PATH=${{ github.repository }}" >> $GITHUB_ENV
- name: Login to OCI registry
run: |
echo "${{ secrets.OCI_TOKEN }}" | docker login $REPO_HOST -u "${{ secrets.OCI_USER }}" --password-stdin
- name: Build and push Docker image
run: |
TAG=${{ github.ref_name }} # Get the tag name from the context
# Build and push multi-platform Docker images
docker build -t $REPO_HOST/$REPO_PATH:$TAG --push .
# Tag and push latest
docker tag $REPO_HOST/$REPO_PATH:$TAG $REPO_HOST/$REPO_PATH:latest
docker push $REPO_HOST/$REPO_PATH:latest
# Remove the local image to save storage
docker rmi $REPO_HOST/$REPO_PATH:$TAG
docker rmi $REPO_HOST/$REPO_PATH:latest

11 .github/FUNDING.yml vendored

@ -1 +1,10 @@
-custom: https://zeronet.io/docs/help_zeronet/donate/
+github: canewsin
+patreon: # Replace with a single Patreon username e.g., user1
+open_collective: # Replace with a single Open Collective username e.g., user1
+ko_fi: canewsin
+tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
+community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
+liberapay: canewsin
+issuehunt: # Replace with a single IssueHunt username e.g., user1
+otechie: # Replace with a single Otechie username e.g., user1
+custom: ['https://paypal.me/PramUkesh', 'https://zerolink.ml/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/']

72 .github/workflows/codeql-analysis.yml vendored Normal file

@ -0,0 +1,72 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"
on:
push:
branches: [ py3-latest ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ py3-latest ]
schedule:
- cron: '32 19 * * 2'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'javascript', 'python' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v3
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2

51 .github/workflows/tests.yml vendored Normal file

@ -0,0 +1,51 @@
name: tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-20.04
strategy:
max-parallel: 16
matrix:
python-version: ["3.7", "3.8", "3.9"]
steps:
- name: Checkout ZeroNet
uses: actions/checkout@v2
with:
submodules: "true"
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Prepare for installation
run: |
python3 -m pip install setuptools
python3 -m pip install --upgrade pip wheel
python3 -m pip install --upgrade codecov coveralls flake8 mock pytest==4.6.3 pytest-cov selenium
- name: Install
run: |
python3 -m pip install --upgrade -r requirements.txt
python3 -m pip list
- name: Prepare for tests
run: |
openssl version -a
echo 0 | sudo tee /proc/sys/net/ipv6/conf/all/disable_ipv6
- name: Test
run: |
catchsegv python3 -m pytest src/Test --cov=src --cov-config src/Test/coverage.ini
export ZERONET_LOG_DIR="log/CryptMessage"; catchsegv python3 -m pytest -x plugins/CryptMessage/Test
export ZERONET_LOG_DIR="log/Bigfile"; catchsegv python3 -m pytest -x plugins/Bigfile/Test
export ZERONET_LOG_DIR="log/AnnounceLocal"; catchsegv python3 -m pytest -x plugins/AnnounceLocal/Test
export ZERONET_LOG_DIR="log/OptionalManager"; catchsegv python3 -m pytest -x plugins/OptionalManager/Test
export ZERONET_LOG_DIR="log/Multiuser"; mv plugins/disabled-Multiuser plugins/Multiuser && catchsegv python -m pytest -x plugins/Multiuser/Test
export ZERONET_LOG_DIR="log/Bootstrapper"; mv plugins/disabled-Bootstrapper plugins/Bootstrapper && catchsegv python -m pytest -x plugins/Bootstrapper/Test
find src -name "*.json" | xargs -n 1 python3 -c "import json, sys; print(sys.argv[1], end=' '); json.load(open(sys.argv[1])); print('[OK]')"
find plugins -name "*.json" | xargs -n 1 python3 -c "import json, sys; print(sys.argv[1], end=' '); json.load(open(sys.argv[1])); print('[OK]')"
flake8 . --count --select=E9,F63,F72,F82 --show-source --statistics --exclude=src/lib/pyaes/

8 .gitignore vendored

@ -7,9 +7,14 @@ __pycache__/
 # Hidden files
 .*
+!/.forgejo
+!/.github
 !/.gitignore
 !/.travis.yml
+!/.gitlab-ci.yml
+
+# Temporary files
+*.bak

 # Data dir
 data/*
@ -26,3 +31,6 @@ tools/phantomjs
 # ZeroNet config file
 zeronet.conf
+
+# ZeroNet log files
+log/*

48 .gitlab-ci.yml Normal file

@ -0,0 +1,48 @@
stages:
- test
.test_template: &test_template
stage: test
before_script:
- pip install --upgrade pip wheel
# Selenium and requests can't be installed without a requests hint on Python 3.4
- pip install --upgrade requests>=2.22.0
- pip install --upgrade codecov coveralls flake8 mock pytest==4.6.3 pytest-cov selenium
- pip install --upgrade -r requirements.txt
script:
- pip list
- openssl version -a
- python -m pytest -x plugins/CryptMessage/Test --color=yes
- python -m pytest -x plugins/Bigfile/Test --color=yes
- python -m pytest -x plugins/AnnounceLocal/Test --color=yes
- python -m pytest -x plugins/OptionalManager/Test --color=yes
- python -m pytest src/Test --cov=src --cov-config src/Test/coverage.ini --color=yes
- mv plugins/disabled-Multiuser plugins/Multiuser
- python -m pytest -x plugins/Multiuser/Test --color=yes
- mv plugins/disabled-Bootstrapper plugins/Bootstrapper
- python -m pytest -x plugins/Bootstrapper/Test --color=yes
- flake8 . --count --select=E9,F63,F72,F82 --show-source --statistics --exclude=src/lib/pyaes/
test:py3.4:
image: python:3.4.3
<<: *test_template
test:py3.5:
image: python:3.5.7
<<: *test_template
test:py3.6:
image: python:3.6.9
<<: *test_template
test:py3.7-openssl1.1.0:
image: python:3.7.0b5
<<: *test_template
test:py3.7-openssl1.1.1:
image: python:3.7.4
<<: *test_template
test:py3.8:
image: python:3.8.0b3
<<: *test_template

3 .gitmodules vendored Normal file

@ -0,0 +1,3 @@
[submodule "plugins"]
path = plugins
url = https://github.com/ZeroNetX/ZeroNet-Plugins.git


@ -1,11 +1,20 @@
 language: python
 python:
-  - 2.7
+  - 3.4
+  - 3.5
+  - 3.6
+  - 3.7
+  - 3.8
 services:
   - docker
+cache: pip
+before_install:
+  - pip install --upgrade pip wheel
+  - pip install --upgrade codecov coveralls flake8 mock pytest==4.6.3 pytest-cov selenium
+#  - docker build -t zeronet .
+#  - docker run -d -v $PWD:/root/data -p 15441:15441 -p 127.0.0.1:43110:43110 zeronet
 install:
-  - pip install -U pip wheel
-  - pip install -r requirements.txt
+  - pip install --upgrade -r requirements.txt
   - pip list
 before_script:
   - openssl version -a
@ -15,23 +24,22 @@ before_script:
       sudo sh -c 'echo 0 > /proc/sys/net/ipv6/conf/all/disable_ipv6';
     fi
 script:
-  - python -m pytest -x plugins/CryptMessage/Test
-  - python -m pytest -x plugins/Bigfile/Test
-  - python -m pytest -x plugins/AnnounceLocal/Test
-  - python -m pytest -x plugins/OptionalManager/Test
-  - python -m pytest src/Test --cov=src --cov-config src/Test/coverage.ini
-before_install:
-  - pip install -U pytest mock pytest-cov selenium
-  - pip install codecov
-  - pip install coveralls
-  - docker build -t zeronet .
-  - docker run -d -v $PWD:/root/data -p 15441:15441 -p 127.0.0.1:43110:43110 zeronet
+  - catchsegv python -m pytest src/Test --cov=src --cov-config src/Test/coverage.ini
+  - export ZERONET_LOG_DIR="log/CryptMessage"; catchsegv python -m pytest -x plugins/CryptMessage/Test
+  - export ZERONET_LOG_DIR="log/Bigfile"; catchsegv python -m pytest -x plugins/Bigfile/Test
+  - export ZERONET_LOG_DIR="log/AnnounceLocal"; catchsegv python -m pytest -x plugins/AnnounceLocal/Test
+  - export ZERONET_LOG_DIR="log/OptionalManager"; catchsegv python -m pytest -x plugins/OptionalManager/Test
+  - export ZERONET_LOG_DIR="log/Multiuser"; mv plugins/disabled-Multiuser plugins/Multiuser && catchsegv python -m pytest -x plugins/Multiuser/Test
+  - export ZERONET_LOG_DIR="log/Bootstrapper"; mv plugins/disabled-Bootstrapper plugins/Bootstrapper && catchsegv python -m pytest -x plugins/Bootstrapper/Test
+  - find src -name "*.json" | xargs -n 1 python3 -c "import json, sys; print(sys.argv[1], end=' '); json.load(open(sys.argv[1])); print('[OK]')"
+  - find plugins -name "*.json" | xargs -n 1 python3 -c "import json, sys; print(sys.argv[1], end=' '); json.load(open(sys.argv[1])); print('[OK]')"
+  - flake8 . --count --select=E9,F63,F72,F82 --show-source --statistics --exclude=src/lib/pyaes/
+after_failure:
+  - zip -r log.zip log/
+  - curl --upload-file ./log.zip https://transfer.sh/log.zip
 after_success:
   - codecov
   - coveralls --rcfile=src/Test/coverage.ini
-cache:
-  directories:
-    - $HOME/.cache/pip
 notifications:
   email:
     recipients:


@ -1,134 +0,0 @@
## ZeroNet 0.5.1 (2016-11-18)
### Added
- Multi-language interface
- New plugin: translation helper for site HTML and JS files
- Per-site favicon
### Fixed
- Parallel optional file downloading

## ZeroNet 0.5.0 (2016-11-08)
### Added
- New plugin: list/delete/pin/manage files on ZeroHello
- New API commands to follow users' optional files, plus request statistics for optional files
- New total size limit for optional files
- New plugin: save peers to the database and keep them between restarts, enabling faster optional file search and working without trackers
- Rewritten UPnP port opener + close ports on exit (thanks sirMackk!)
- Reduced memory usage via lazy PeerHashfield creation
- Load JSON file statistics and database info on the /Stats page
### Changed
- Separate lock file for better Windows compatibility
- When start.py is executed, open the browser even if ZeroNet is already running
- Keep plugin order on reload so plugins can extend one another
- Only save sites.json when it is fully loaded, to avoid data loss
- Change more trackers to more reliable ones
- Lower findhashid CPU usage
- Combined download of large numbers of optional files
- More other optimizations for optional files
- Clean up more aggressively if a site has 1000 peers
- Use warnings instead of errors for verification failures
- Push updates to newer clients first
- Corrupt file reset improvements
### Fixed
- Fix site deletion error on startup
- Delay WebSocket messages until connected
- Fix database import when a file contains extra data
- Fix downloading of large sites
- Fix diff sending bug (had been chasing it for a long time)
- Fix random publish errors when a JSON file contains [] characters
- Fix siteDelete and siteCreate bugs
- Fix file write confirmation dialog

## ZeroNet 0.4.1 (2016-09-05)
### Added
- Core changes for faster startup and lower memory usage
- Try to reconnect to Tor if the connection is lost
- Sidebar slide-in
- Try to avoid overwriting incomplete data files
- Faster database opening
- Display user file sizes in the sidebar
- Concurrent worker count depends on --connection_limit
### Changed
- Close the database after 5 minutes of idle time
- Better site size calculation
- Allow the "-" character in domain names
- Always try to keep connections for sites
- Remove merger permission from merged sites
- Only scan the last 3 days of newsfeeds to speed up database queries
- Update ZeroBundle-win to Python 2.7.12
### Fixed
- Fix an important security issue: anyone could publish new content without a valid certificate from an ID provider; thanks to Kaffie for pointing it out
- Fix sidebar error when no certificate provider is selected
- Skip invalid files during database rebuild
- Fix randomly appearing WebSocket connection errors
- Fix the new siteCreate command
- Fix site size calculation
- Fix port-open check after the computer wakes up
- Fix command-line parsing of --size_limit

## ZeroNet 0.4.0 (2016-08-11)
### Added
- Merger site plugin
- Live source code reloading: Faster core development by allowing me to make changes in ZeroNet source code without restarting it.
- New JSON tables designed for merger sites
- Database rebuild from the sidebar
- Allow storing custom data directly in JSON tables: simpler and faster SQL queries
- User file archiving: allows site owners to archive inactive user content into a single file (reducing initial sync time/CPU/memory usage)
- Also trigger database onUpdated/update when a file is deleted
- Request permissions via the ZeroFrame API
- Allow storing extra data in content.json using the fileWrite API command
- Faster optional file downloads
- Use alternative sources (Gogs, Gitlab) to download updates
- Track provided sites/connection and prefer to keep the ones with more sites to reduce connection number
### Changed
- Keep at least 5 connections per site
- Change target site connections from 10 to 15
- ZeroHello search function stability/speed improvements
- Improved client performance on mechanical hard drives
### Fixed
- Fix IE11 wrapper nonce errors
- Fix sidebar on mobile devices
- Fix site size calculation
- Fix IE10 compatibility
- Windows XP ZeroBundle compatibility (thanks to the Chinese people)

## ZeroNet 0.3.7 (2016-05-27)
### Changed
- Reduce bandwidth usage by transferring only patches
- Other CPU/memory optimizations

## ZeroNet 0.3.6 (2016-05-27)
### Added
- New ZeroHello
- Newsfeed function
### Fixed
- Security fixes

## ZeroNet 0.3.5 (2016-02-02)
### Added
- Full Tor support with .onion hidden services
- Bootstrapping using the ZeroNet protocol
### Fixed
- Fix Gevent 1.0.2 compatibility

## ZeroNet 0.3.4 (2015-12-28)
### Added
- AES, ECIES API function support
- PushState and ReplaceState URL manipulation support via the API
- Multi-user localstorage


@ -1,3 +1,201 @@
### ZeroNet 0.9.0 (2023-07-12) Rev4630
- Fix RDos Issue in Plugins https://github.com/ZeroNetX/ZeroNet-Plugins/pull/9
- Add trackers to Config.py as a failsafe in case trackers.txt is missing
- Added Proxy links
- Fix pysha3 dep installation issue
- FileRequest -> Remove Unnecessary check, Fix error wording
- Fix Response when site is missing for `actionAs`
### ZeroNet 0.8.5 (2023-02-12) Rev4625
- Fix (https://github.com/ZeroNetX/ZeroNet/pull/202) for SSL cert generation failure on Windows.
- Default theme-class for missing value in `users.json`.
- Fetch Stats Plugin changes.
### ZeroNet 0.8.4 (2022-12-12) Rev4620
- Increase Minimum Site size to 25MB.
### ZeroNet 0.8.3 (2022-12-11) Rev4611
- main.py -> Fix accessing unassigned variable
- ContentManager -> Support for multiSig
- SiteStorage.py -> Fix accessing unassigned variable
- ContentManager.py Improve Logging of Valid Signers
### ZeroNet 0.8.2 (2022-11-01) Rev4610
- Fix Startup Error when plugins dir missing
- Move trackers to separate file & Add more trackers
- Config:: Skip loading missing tracker files
- Added documentation for getRandomPort fn
### ZeroNet 0.8.1 (2022-10-01) Rev4600
- fix readdress loop (cherry-pick previously added commit from conservancy)
- Remove Patreon badge
- Update README-ru.md (#177)
- Include inner_path of failed request for signing in error msg and response
- Don't Fail Silently When Cert is Not Selected
- Console Log Updates, Specify min supported ZeroNet version for Rust version Protocol Compatibility
- Update FUNDING.yml
### ZeroNet 0.8.0 (2022-05-27) Rev4591
- Revert File Open to catch File Access Errors.
### ZeroNet 0.7.9-patch (2022-05-26) Rev4586
- Use xescape(s) from zeronet-conservancy
- actionUpdate response Optimisation
- Fetch Plugins Repo Updates
- Fix Unhandled File Access Errors
- Create codeql-analysis.yml
### ZeroNet 0.7.9 (2022-05-26) Rev4585
- Rust Version Compatibility for update Protocol msg
- Removed Non Working Trackers.
- Dynamically Load Trackers from Dashboard Site.
- Tracker Supply Improvements.
- Fix Repo Url for Bug Report
- First Party Tracker Update Service using Dashboard Site.
- remove old v2 onion service [#158](https://github.com/ZeroNetX/ZeroNet/pull/158)
### ZeroNet 0.7.8 (2022-03-02) Rev4580
- Update Plugins with some bug fixes and Improvements
### ZeroNet 0.7.6 (2022-01-12) Rev4565
- Sync Plugin Updates
- Clean up tor v3 patch [#115](https://github.com/ZeroNetX/ZeroNet/pull/115)
- Add More Default Plugins to Repo
- Doubled Site Publish Limits
- Update ZeroNet Repo Urls [#103](https://github.com/ZeroNetX/ZeroNet/pull/103)
- UI/UX: Increases Size of Notifications Close Button [#106](https://github.com/ZeroNetX/ZeroNet/pull/106)
- Moved Plugins to Separate Repo
- Added `access_key` variable in Config; this is used to access restricted plugins when the multiuser plugin is enabled. When MultiUserPlugin is enabled, some pages like /Stats cannot be accessed; this key removes that restriction.
- Added `last_connection_id_current_version` to ConnectionServer, helpful for estimating the number of connections from the current client version.
- Added current version connections to the /Stats page (see the previous point).
### ZeroNet 0.7.5 (2021-11-28) Rev4560
- Add more default trackers
- Change default homepage address to `1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d`
- Change default update site address to `1Update8crprmciJHwp2WXqkx2c4iYp18`
### ZeroNet 0.7.3 (2021-11-28) Rev4555
- Fix xrange is undefined error
- Fix Incorrect viewport on mobile while loading
- Tor-V3 Patch by anonymoose
### ZeroNet 0.7.1 (2019-07-01) Rev4206
### Added
- Built-in logging console in the web UI to see what's happening in the background. (pull down top-right 0 button to see it)
- Display database rebuild errors [Thanks to Lola]
- New plugin system that allows installing and managing built-in/third-party extensions to the ZeroNet client using the web interface.
- Support multiple trackers_file
- Add OpenSSL 1.1 support to CryptMessage plugin based on Bitmessage modifications [Thanks to radfish]
- Display visual error message on startup errors
- Fix max opened files changing on Windows platform
- Display TLS1.3 compatibility on /Stats page
- Add fake SNI and ALPN to peer connections to make it more like standard https connections
- Hide and ignore tracker_proxy setting in Tor: Always mode as it's going to use Tor anyway.
- Deny websocket connections from unknown origins
- Restrict open_browser values to avoid RCE on sandbox escape
- Offer access web interface by IP address in case of unknown host
- Link to site's sidebar with "#ZeroNet:OpenSidebar" hash
### Changed
- Allow .. in file names [Thanks to imachug]
- Change unstable trackers
- More clean errors on sites.json/users.json load error
- Various tweaks for tracker rating on unstable connections
- Use OpenSSL 1.1 dlls from default Python Windows distribution if possible
- Re-factor domain resolving for easier domain plugins
- Disable UDP connections if --proxy is used
- New, decorator-based Websocket API permission system to avoid future typo mistakes
### Fixed
- Fix parsing config lines that have no value
- Fix start.py [Thanks to imachug]
- Allow multiple values of the same key in the config file [Thanks ssdifnskdjfnsdjk for reporting]
- Fix parsing config file lines that have % in the value [Thanks slrslr for reporting]
- Fix bootstrapper plugin hash reloads [Thanks geekless for reporting]
- Fix CryptMessage plugin OpenSSL dll loading on Windows (ZeroMail errors) [Thanks cxgreat2014 for reporting]
- Fix startup error when using OpenSSL 1.1 [Thanks to imachug]
- Fix a bug where merged site data was not loaded for 5 sec after the merged site was added
- Fix typo that allowed adding new plugins in public proxy mode. [Thanks styromaniac for reporting]
- Fix loading non-big files with "|all" postfix [Thanks to krzotr]
- Fix OpenSSL cert generation error crash by changing Windows console encoding to utf8
#### Wrapper html injection vulnerability [Reported by ivanq]
In ZeroNet before rev4188 the wrapper template variables were rendered incorrectly.
Result: The opened site was able to gain a WebSocket connection with unrestricted ADMIN/NOSANDBOX access, change configuration values, and potentially achieve RCE on the client's machine.
Fix: Fixed the template rendering code, disallowed WebSocket connections from unknown locations, and restricted open_browser configuration values to avoid possible RCE in case of sandbox escape.
Note: The fix is also back ported to ZeroNet Py 2.x version (Rev3870)
### ZeroNet 0.7.0 (2019-06-12) Rev4106 (First release targeting Python 3.4+)
### Added
- 5-10x faster signature verification by using libsecp256k1 (Thanks to ZeroMux)
- Generated SSL certificate randomization to avoid protocol filters (Thanks to ValdikSS)
- Offline mode
- P2P source code update using ZeroNet protocol
- ecdsaSign/Verify commands to CryptMessage plugin (Thanks to imachug)
- Efficient file rename: change file names instead of re-downloading the file.
- Make redirect optional on site cloning (Thanks to Lola)
- EccPrivToPub / EccPubToPriv functions (Thanks to imachug)
- Detect and change dark/light theme based on OS setting (Thanks to filips123)
### Changed
- Re-factored code to Python3 runtime (compatible with Python 3.4-3.8)
- More safe database sync mode
- Removed bundled third-party libraries where it's possible
- Use lang=en instead of lang={lang} in urls to avoid url encode problems
- Remove environment details from error page
- Don't push content.json updates larger than 10kb to significantly reduce bw usage for sites with many files
### Fixed
- Fix sending files with \0 characters
- Security fix: Escape error detail to avoid XSS (reported by krzotr)
- Fix signature verification using libsecp256k1 for compressed addresses (mostly certificates generated in the browser)
- Fix newsfeed if you have more than 1000 followed topic/post on one site.
- Fix site download as zip file
- Fix displaying sites with utf8 title
- Error message if dbRebuild fails (Thanks to Lola)
- Fix browser reopen if executing start.py again. (Thanks to imachug)
### ZeroNet 0.6.5 (2019-02-16) Rev3851 (Last release targeting Python 2.7.x)
### Added
- IPv6 support in peer exchange, bigfiles, optional file finding, tracker sharing, socket listening and connecting (based on tangdou1 modifications)
- New tracker database format with IPv6 support
- Display notification if there is an unpublished modification for your site
- Listen and shut down normally for SIGTERM (Thanks to blurHY)
- Support tilde `~` in filenames (by d14na)
- Support map for Namecoin subdomain names (Thanks to lola)
- Add log level to config page
- Support `{data}` for data dir variable in trackers_file value
- Quick check content.db on startup and rebuild if necessary
- Don't show meek proxy option if the tor client does not support it
### Changed
- Refactored port open checking with IPv6 support
- Consider non-local IPs as external even if the open port check fails (for CJDNS and Yggdrasil support)
- Add IPv6 tracker and change unstable tracker
- Don't correct sent local time with the calculated time correction
- Disable CSP for Edge
- Only support CREATE commands in dbschema indexes node and SELECT from storage.query
### Fixed
- Check the length of master seed when executing cryptGetPrivatekey CLI command
- Only reload source code on file modification / creation
- Detect and issue a warning for the latest no-script plugin
- Fix atomic write of a non-existent file
- Fix sql queries with lots of variables and sites with lots of content.json
- Fix multi-line parsing of zeronet.conf
- Fix site deletion from users.json
- Fix site cloning before site downloaded (Reported by unsystemizer)
- Fix queryJson for non-list nodes (Reported by MingchenZhang)
## ZeroNet 0.6.4 (2018-10-20) Rev3660
### Added
- New plugin: UiConfig. A web interface that allows changing ZeroNet settings.


@ -1,7 +1,7 @@
 GNU GENERAL PUBLIC LICENSE
 Version 3, 29 June 2007

-Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
+Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.
@ -645,7 +645,7 @@ the "copyright" line and a pointer to where the full notice is found.
 GNU General Public License for more details.

 You should have received a copy of the GNU General Public License
-along with this program. If not, see <http://www.gnu.org/licenses/>.
+along with this program. If not, see <https://www.gnu.org/licenses/>.

 Also add information on how to contact you by electronic and paper mail.
@ -664,11 +664,11 @@ might be different; for a GUI interface, you would use an "about box".
 You should also get your employer (if you work as a programmer) or school,
 if any, to sign a "copyright disclaimer" for the program, if necessary.
 For more information on this, and how to apply and follow the GNU GPL, see
-<http://www.gnu.org/licenses/>.
+<https://www.gnu.org/licenses/>.

 The GNU General Public License does not permit incorporating your program
 into proprietary programs. If your program is a subroutine library, you
 may consider it more useful to permit linking proprietary applications with
 the library. If this is what you want to do, use the GNU Lesser General
 Public License instead of this License. But first, please read
-<http://www.gnu.org/philosophy/why-not-lgpl.html>.
+<https://www.gnu.org/licenses/why-not-lgpl.html>.


@ -1,26 +1,33 @@
-FROM alpine:3.8
+FROM alpine:3.15

 #Base settings
 ENV HOME /root

+COPY requirements.txt /root/requirements.txt
+
 #Install ZeroNet
-RUN apk --no-cache --no-progress add musl-dev gcc python python-dev py2-pip tor openssl \
- && pip install --no-cache-dir gevent msgpack \
- && apk del musl-dev gcc python-dev py2-pip \
+RUN apk --update --no-cache --no-progress add python3 python3-dev py3-pip gcc g++ autoconf automake libtool libffi-dev musl-dev make tor openssl \
+ && pip3 install -r /root/requirements.txt \
+ && apk del python3-dev gcc g++ autoconf automake libtool libffi-dev musl-dev make \
  && echo "ControlPort 9051" >> /etc/tor/torrc \
  && echo "CookieAuthentication 1" >> /etc/tor/torrc

+RUN python3 -V \
+ && python3 -m pip list \
+ && tor --version \
+ && openssl version
+
 #Add Zeronet source
 COPY . /root
 VOLUME /root/data

 #Control if Tor proxy is started
-ENV ENABLE_TOR false
+ENV ENABLE_TOR true

 WORKDIR /root

 #Set upstart command
-CMD (! ${ENABLE_TOR} || tor&) && python zeronet.py --ui_ip 0.0.0.0 --fileserver_port 26552
+CMD (! ${ENABLE_TOR} || tor&) && python3 zeronet.py --ui_ip 0.0.0.0 --fileserver_port 26117

 #Expose ports
-EXPOSE 43110 26552
+EXPOSE 43110 26117

34 Dockerfile.arm64v8 Normal file

@ -0,0 +1,34 @@
FROM alpine:3.12
#Base settings
ENV HOME /root
COPY requirements.txt /root/requirements.txt
#Install ZeroNet
RUN apk --update --no-cache --no-progress add python3 python3-dev gcc libffi-dev musl-dev make tor openssl \
&& pip3 install -r /root/requirements.txt \
&& apk del python3-dev gcc libffi-dev musl-dev make \
&& echo "ControlPort 9051" >> /etc/tor/torrc \
&& echo "CookieAuthentication 1" >> /etc/tor/torrc
RUN python3 -V \
&& python3 -m pip list \
&& tor --version \
&& openssl version
#Add Zeronet source
COPY . /root
VOLUME /root/data
#Control if Tor proxy is started
ENV ENABLE_TOR false
WORKDIR /root
#Set upstart command
CMD (! ${ENABLE_TOR} || tor&) && python3 zeronet.py --ui_ip 0.0.0.0 --fileserver_port 26552
#Expose ports
EXPOSE 43110 26552

367 LICENSE

@ -1,340 +1,27 @@
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, version 3.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.


Additional Conditions :

Contributing to this repo
This repo is governed by GPLv3, same is located at the root of the ZeroNet git repo,
unless specified separately all code is governed by that license, contributions to this repo
are divided into two key types, key contributions and non-key contributions, key contributions
are which, directly affects the code performance, quality and features of software,
non key contributions include things like translation datasets, image, graphic or video
contributions that does not affect the main usability of software but improves the existing
usability of certain thing or feature, these also include tests written with code, since their
purpose is to check, whether something is working or not as intended. All the non-key contributions
are governed by [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/), unless specified
above, a contribution is ruled by the type of contribution if there is a conflict between two
contributing parties of repo in any case.


GNU GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1989, 1991 Free Software Foundation, Inc., <http://fsf.org/>
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
{description}
Copyright (C) {year} {fullname}
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
{signature of Ty Coon}, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.

README-ru.md
@@ -1,211 +1,133 @@
-# ZeroNet [![Build Status](https://travis-ci.org/HelloZeroNet/ZeroNet.svg?branch=master)](https://travis-ci.org/HelloZeroNet/ZeroNet) [![Documentation](https://img.shields.io/badge/docs-faq-brightgreen.svg)](https://zeronet.io/docs/faq/) [![Help](https://img.shields.io/badge/keep_this_project_alive-donate-yellow.svg)](https://zeronet.io/docs/help_zeronet/donate/)
+# ZeroNet [![tests](https://github.com/ZeroNetX/ZeroNet/actions/workflows/tests.yml/badge.svg)](https://github.com/ZeroNetX/ZeroNet/actions/workflows/tests.yml) [![Documentation](https://img.shields.io/badge/docs-faq-brightgreen.svg)](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/faq/) [![Help](https://img.shields.io/badge/keep_this_project_alive-donate-yellow.svg)](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/) [![Docker Pulls](https://img.shields.io/docker/pulls/canewsin/zeronet)](https://hub.docker.com/r/canewsin/zeronet)
 [简体中文](./README-zh-cn.md)
 [English](./README.md)
-Децентрализованные вебсайты использующие Bitcoin криптографию и BitTorrent сеть - https://zeronet.io
+Децентрализованные вебсайты, использующие криптографию Bitcoin и протокол BitTorrent — https://zeronet.dev ([Зеркало в ZeroNet](http://127.0.0.1:43110/1ZeroNetyV5mKY9JF1gsm82TuBXHpfdLX/)). В отличии от Bitcoin, ZeroNet'у не требуется блокчейн для работы, однако он использует ту же криптографию, чтобы обеспечить сохранность и проверку данных.
 ## Зачем?
-* Мы верим в открытую, свободную, и не отцензуренную сеть и коммуникацию.
-* Нет единой точки отказа: Сайт онлайн пока по крайней мере 1 пир обслуживает его.
-* Никаких затрат на хостинг: Сайты обслуживаются посетителями.
-* Невозможно отключить: Он нигде, потому что он везде.
-* Быстр и работает оффлайн: Вы можете получить доступ к сайту, даже если Интернет недоступен.
+- Мы верим в открытую, свободную, и неподдающуюся цензуре сеть и связь.
+- Нет единой точки отказа: Сайт остаётся онлайн, пока его обслуживает хотя бы 1 пир.
+- Нет затрат на хостинг: Сайты обслуживаются посетителями.
+- Невозможно отключить: Он нигде, потому что он везде.
+- Скорость и возможность работать без Интернета: Вы сможете получить доступ к сайту, потому что его копия хранится на вашем компьютере и у ваших пиров.
 ## Особенности
-* Обновляемые в реальном времени сайты
-* Поддержка Namecoin .bit доменов
-* Лёгок в установке: распаковал & запустил
-* Клонирование вебсайтов в один клик
-* Password-less [BIP32](https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki)
-based authorization: Ваша учетная запись защищена той же криптографией, что и ваш Bitcoin-кошелек
-* Встроенный SQL-сервер с синхронизацией данных P2P: Позволяет упростить разработку сайта и ускорить загрузку страницы
-* Анонимность: Полная поддержка сети Tor с помощью скрытых служб .onion вместо адресов IPv4
-* TLS зашифрованные связи
-* Автоматическое открытие uPnP порта
-* Плагин для поддержки многопользовательской (openproxy)
-* Работает с любыми браузерами и операционными системами
+- Обновление сайтов в реальном времени
+- Поддержка доменов `.bit` ([Namecoin](https://www.namecoin.org))
+- Легкая установка: просто распакуйте и запустите
+- Клонирование сайтов "в один клик"
+- Беспарольная [BIP32](https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki)
+авторизация: Ваша учетная запись защищена той же криптографией, что и ваш Bitcoin-кошелек
+- Встроенный SQL-сервер с синхронизацией данных P2P: Позволяет упростить разработку сайта и ускорить загрузку страницы
+- Анонимность: Полная поддержка сети Tor, используя скрытые службы `.onion` вместо адресов IPv4
+- Зашифрованное TLS подключение
+- Автоматическое открытие UPnP порта
+- Плагин для поддержки нескольких пользователей (openproxy)
+- Работа с любыми браузерами и операционными системами
+## Текущие ограничения
+- Файловые транзакции не сжаты
+- Нет приватных сайтов
 ## Как это работает?
-* После запуска `zeronet.py` вы сможете посетить зайты (zeronet сайты) используя адрес
-`http://127.0.0.1:43110/{zeronet_address}`
-(например. `http://127.0.0.1:43110/1HeLLo4uzjaLetFx6NH3PMwFP3qbRbTf3D`).
-* Когда вы посещаете новый сайт zeronet, он пытается найти пиров с помощью BitTorrent
-чтобы загрузить файлы сайтов (html, css, js ...) из них.
-* Каждый посещенный зайт также обслуживается вами.
-* Каждый сайт содержит файл `content.json`, который содержит все остальные файлы в хэше sha512
-и подпись, созданную с использованием частного ключа сайта.
-* Если владелец сайта (у которого есть закрытый ключ для адреса сайта) изменяет сайт, то он/она
+- После запуска `zeronet.py` вы сможете посещать сайты в ZeroNet, используя адрес
+`http://127.0.0.1:43110/{zeronet_адрес}`
+(Например: `http://127.0.0.1:43110/1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d`).
+- Когда вы посещаете новый сайт в ZeroNet, он пытается найти пиров с помощью протокола BitTorrent,
+чтобы скачать у них файлы сайта (HTML, CSS, JS и т.д.).
+- После посещения сайта вы тоже становитесь его пиром.
+- Каждый сайт содержит файл `content.json`, который содержит SHA512 хеши всех остальных файлов
+и подпись, созданную с помощью закрытого ключа сайта.
+- Если владелец сайта (тот, кто владеет закрытым ключом для адреса сайта) изменяет сайт, он
 подписывает новый `content.json` и публикует его для пиров. После этого пиры проверяют целостность `content.json`
-(используя подпись), они загружают измененные файлы и публикуют новый контент для других пиров.
+(используя подпись), скачивают изменённые файлы и распространяют новый контент для других пиров.
-#### [Слайд-шоу о криптографии ZeroNet, обновлениях сайтов, многопользовательских сайтах »](https://docs.google.com/presentation/d/1_2qK1IuOKJ51pgBvllZ9Yu7Au2l551t3XBgyTSvilew/pub?start=false&loop=false&delayms=3000)
-#### [Часто задаваемые вопросы »](https://zeronet.io/docs/faq/)
-#### [Документация разработчика ZeroNet »](https://zeronet.io/docs/site_development/getting_started/)
+[Презентация о криптографии ZeroNet, обновлениях сайтов, многопользовательских сайтах »](https://docs.google.com/presentation/d/1_2qK1IuOKJ51pgBvllZ9Yu7Au2l551t3XBgyTSvilew/pub?start=false&loop=false&delayms=3000)
+[Часто задаваемые вопросы »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/faq/)
+[Документация разработчика ZeroNet »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/site_development/getting_started/)
 ## Скриншоты
 ![Screenshot](https://i.imgur.com/H60OAHY.png)
 ![ZeroTalk](https://zeronet.io/docs/img/zerotalk.png)
-#### [Больше скриншотов в ZeroNet документации »](https://zeronet.io/docs/using_zeronet/sample_sites/)
+[Больше скриншотов в документации ZeroNet »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/using_zeronet/sample_sites/)
-## Как вступить
-* Скачайте ZeroBundle пакет:
-  * [Microsoft Windows](https://github.com/HelloZeroNet/ZeroNet-win/archive/dist/ZeroNet-win.zip)
-  * [Apple macOS](https://github.com/HelloZeroNet/ZeroNet-mac/archive/dist/ZeroNet-mac.zip)
-  * [Linux 64-bit](https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux64.tar.gz)
-  * [Linux 32-bit](https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux32.tar.gz)
-* Распакуйте где угодно
-* Запустите `ZeroNet.exe` (win), `ZeroNet(.app)` (osx), `ZeroNet.sh` (linux)
+## Как присоединиться?
+### Windows
+- Скачайте и распакуйте архив [ZeroNet-win.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-win.zip) (26МБ)
+- Запустите `ZeroNet.exe`
+### macOS
+- Скачайте и распакуйте архив [ZeroNet-mac.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-mac.zip) (14МБ)
+- Запустите `ZeroNet.app`
-### Linux терминал
-* `wget https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux64.tar.gz`
-* `tar xvpfz ZeroBundle-linux64.tar.gz`
-* `cd ZeroBundle`
-* Запустите с помощью `./ZeroNet.sh`
-Он загружает последнюю версию ZeroNet, затем запускает её автоматически.
+### Linux (64 бит)
+- Скачайте и распакуйте архив [ZeroNet-linux.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-linux.zip) (14МБ)
+- Запустите `./ZeroNet.sh`
+> **Note**
+> Запустите таким образом: `./ZeroNet.sh --ui_ip '*' --ui_restrict ваш_ip_адрес`, чтобы разрешить удалённое подключение к веб–интерфейсу.
-#### Ручная установка для Debian Linux
-* `sudo apt-get update`
-* `sudo apt-get install msgpack-python python-gevent`
-* `wget https://github.com/HelloZeroNet/ZeroNet/archive/master.tar.gz`
-* `tar xvpfz master.tar.gz`
-* `cd ZeroNet-master`
-* Запустите с помощью `python2 zeronet.py`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
+### Docker
+Официальный образ находится здесь: https://hub.docker.com/r/canewsin/zeronet/
-### [Arch Linux](https://www.archlinux.org)
-* `git clone https://aur.archlinux.org/zeronet.git`
-* `cd zeronet`
-* `makepkg -srci`
-* `systemctl start zeronet`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
-Смотрите [ArchWiki](https://wiki.archlinux.org)'s [ZeroNet
-article](https://wiki.archlinux.org/index.php/ZeroNet) для дальнейшей помощи.
+### Android (arm, arm64, x86)
+- Для работы требуется Android как минимум версии 5.0 Lollipop
+- [<img src="https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png"
+alt="Download from Google Play"
+height="80">](https://play.google.com/store/apps/details?id=in.canews.zeronetmobile)
+- Скачать APK: https://github.com/canewsin/zeronet_mobile/releases
-### [Gentoo Linux](https://www.gentoo.org)
-* [`layman -a raiagent`](https://github.com/leycec/raiagent)
-* `echo '>=net-vpn/zeronet-0.5.4' >> /etc/portage/package.accept_keywords`
-* *(Опционально)* Включить поддержку Tor: `echo 'net-vpn/zeronet tor' >>
-/etc/portage/package.use`
-* `emerge zeronet`
-* `rc-service zeronet start`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
-Смотрите `/usr/share/doc/zeronet-*/README.gentoo.bz2` для дальнейшей помощи.
+### Android (arm, arm64, x86) Облегчённый клиент только для просмотра (1МБ)
+- Для работы требуется Android как минимум версии 4.1 Jelly Bean
+- [<img src="https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png"
+alt="Download from Google Play"
+height="80">](https://play.google.com/store/apps/details?id=dev.zeronetx.app.lite)
-### [FreeBSD](https://www.freebsd.org/)
-* `pkg install zeronet` or `cd /usr/ports/security/zeronet/ && make install clean`
-* `sysrc zeronet_enable="YES"`
-* `service zeronet start`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
+### Установка из исходного кода
+```sh
+wget https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-src.zip
+unzip ZeroNet-src.zip
+cd ZeroNet
+sudo apt-get update
+sudo apt-get install python3-pip
+sudo python3 -m pip install -r requirements.txt
+```
+- Запустите `python3 zeronet.py`
+Откройте приветственную страницу ZeroHello в вашем браузере по ссылке http://127.0.0.1:43110/
-### [Vagrant](https://www.vagrantup.com/)
-* `vagrant up`
-* Подключитесь к VM с помощью `vagrant ssh`
-* `cd /vagrant`
-* Запустите `python2 zeronet.py --ui_ip 0.0.0.0`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
-### [Docker](https://www.docker.com/)
-* `docker run -d -v <local_data_folder>:/root/data -p 15441:15441 -p 127.0.0.1:43110:43110 nofish/zeronet`
-* Это изображение Docker включает в себя прокси-сервер Tor, который по умолчанию отключён.
-Остерегайтесь что некоторые хостинг-провайдеры могут не позволить вам запускать Tor на своих серверах.
-Если вы хотите включить его,установите переменную среды `ENABLE_TOR` в` true` (по умолчанию: `false`) Например:
-`docker run -d -e "ENABLE_TOR=true" -v <local_data_folder>:/root/data -p 15441:15441 -p 127.0.0.1:43110:43110 nofish/zeronet`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
-### [Virtualenv](https://virtualenv.readthedocs.org/en/latest/)
-* `virtualenv env`
-* `source env/bin/activate`
-* `pip install msgpack gevent`
-* `python2 zeronet.py`
-* Откройте http://127.0.0.1:43110/ в вашем браузере.
-## Текущие ограничения
-* ~~Нет torrent-похожего файла разделения для поддержки больших файлов~~ (поддержка больших файлов добавлена)
-* ~~Не анонимнее чем Bittorrent~~ (добавлена встроенная поддержка Tor)
-* Файловые транзакции не сжаты ~~ или незашифрованы еще ~~ (добавлено шифрование TLS)
-* Нет приватных сайтов
-## Как я могу создать сайт в Zeronet?
-Завершите работу zeronet, если он запущен
-```bash
-$ zeronet.py siteCreate
-...
-- Site private key (Приватный ключ сайта): 23DKQpzxhbVBrAtvLEc2uvk7DZweh4qL3fn3jpM3LgHDczMK2TtYUq
-- Site address (Адрес сайта): 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-...
-- Site created! (Сайт создан)
-$ zeronet.py
-...
-```
-Поздравляем, вы закончили! Теперь каждый может получить доступ к вашему зайту используя
-`http://localhost:43110/13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2`
-Следующие шаги: [ZeroNet Developer Documentation](https://zeronet.io/docs/site_development/getting_started/)
+## Как мне создать сайт в ZeroNet?
+- Кликните на **⋮** > **"Create new, empty site"** в меню на сайте [ZeroHello](http://127.0.0.1:43110/1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d).
+- Вы будете **перенаправлены** на совершенно новый сайт, который может быть изменён только вами!
+- Вы можете найти и изменить контент вашего сайта в каталоге **data/[адрес_вашего_сайта]**
+- После изменений откройте ваш сайт, переключите влево кнопку "0" в правом верхнем углу, затем нажмите кнопки **sign** и **publish** внизу
-## Как я могу модифицировать Zeronet сайт?
-* Измените файлы расположенные в data/13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2 директории.
-Когда закончите с изменением:
-```bash
-$ zeronet.py siteSign 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-- Signing site (Подпись сайта): 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2...
-Private key (Приватный ключ) (input hidden):
-```
-* Введите секретный ключ, который вы получили при создании сайта, потом:
-```bash
-$ zeronet.py sitePublish 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-...
-Site:13DNDk..bhC2 Publishing to 3/10 peers...
-Site:13DNDk..bhC2 Successfuly published to 3 peers
-- Serving files....
-```
-* Вот и всё! Вы успешно подписали и опубликовали свои изменения.
+Следующие шаги: [Документация разработчика ZeroNet](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/site_development/getting_started/)
 ## Поддержите проект
-- Bitcoin: 1QDhxQ6PraUZa21ET5fYUCPgdrwBomnFgX
-- Paypal: https://zeronet.io/docs/help_zeronet/donate/
-### Спонсоры
-* Улучшенная совместимость с MacOS / Safari стала возможной благодаря [BrowserStack.com](https://www.browserstack.com)
+- Bitcoin: 1ZeroNetyV5mKY9JF1gsm82TuBXHpfdLX (Рекомендуем)
+- LiberaPay: https://liberapay.com/PramUkesh
+- Paypal: https://paypal.me/PramUkesh
+- Другие способы: [Donate](!https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/#help-to-keep-zeronet-development-alive)
 #### Спасибо!
-* Больше информации, помощь, журнал изменений, zeronet сайты: https://www.reddit.com/r/zeronet/
-* Приходите, пообщайтесь с нами: [#zeronet @ FreeNode](https://kiwiirc.com/client/irc.freenode.net/zeronet) или на [gitter](https://gitter.im/HelloZeroNet/ZeroNet)
-* Email: hello@zeronet.io (PGP: CB9613AE)
+- Здесь вы можете получить больше информации, помощь, прочитать список изменений и исследовать ZeroNet сайты: https://www.reddit.com/r/zeronetx/
+- Общение происходит на канале [#zeronet @ FreeNode](https://kiwiirc.com/client/irc.freenode.net/zeronet) или в [Gitter](https://gitter.im/canewsin/ZeroNet)
+- Электронная почта: canews.in@gmail.com

README-zh-cn.md
@@ -1,51 +1,49 @@
-# ZeroNet [![Build Status](https://travis-ci.org/HelloZeroNet/ZeroNet.svg?branch=master)](https://travis-ci.org/HelloZeroNet/ZeroNet) [![Documentation](https://img.shields.io/badge/docs-faq-brightgreen.svg)](https://zeronet.io/docs/faq/) [![Help](https://img.shields.io/badge/keep_this_project_alive-donate-yellow.svg)](https://zeronet.io/docs/help_zeronet/donate/)
+# ZeroNet [![tests](https://github.com/ZeroNetX/ZeroNet/actions/workflows/tests.yml/badge.svg)](https://github.com/ZeroNetX/ZeroNet/actions/workflows/tests.yml) [![Documentation](https://img.shields.io/badge/docs-faq-brightgreen.svg)](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/faq/) [![Help](https://img.shields.io/badge/keep_this_project_alive-donate-yellow.svg)](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/) [![Docker Pulls](https://img.shields.io/docker/pulls/canewsin/zeronet)](https://hub.docker.com/r/canewsin/zeronet)
 [English](./README.md)
-使用 Bitcoin 加密和 BitTorrent 网络的去中心化网络 - https://zeronet.io
+使用 Bitcoin 加密和 BitTorrent 网络的去中心化网络 - https://zeronet.dev
-## 为什么?
+## 为什么
-* 我们相信开放,自由,无审查的网络
+* 我们相信开放,自由,无审查的网络和通讯
 * 不会受单点故障影响:只要有在线的节点,站点就会保持在线
-* 无托管费用: 站点由访问者托管
+* 无托管费用:站点由访问者托管
-* 无法关闭: 因为节点无处不在
+* 无法关闭:因为节点无处不在
-* 快速并可离线运行: 即使没有互联网连接也可以使用
+* 快速并可离线运行:即使没有互联网连接也可以使用
 ## 功能
 * 实时站点更新
 * 支持 Namecoin 的 .bit 域名
-* 安装方便: 只需解压并运行
+* 安装方便:只需解压并运行
 * 一键克隆存在的站点
-* 无需密码、基于 [BIP32](https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki) 的认证:用与比特币钱包相同的加密方法用来保护你的账户
-你的账户被使用和比特币钱包相同的加密方法
+* 无需密码、基于 [BIP32](https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki)
+的认证:您的账户被与比特币钱包相同的加密方法保护
-* 内建 SQL 服务器和 P2P 数据同步: 让开发更简单并提升加载速度
+* 内建 SQL 服务器和 P2P 数据同步:让开发更简单并提升加载速度
-* 匿名性: 完整的 Tor 网络支持,支持通过 .onion 隐藏服务相互连接而不是通过IPv4地址连接
+* 匿名性:完整的 Tor 网络支持,支持通过 .onion 隐藏服务相互连接而不是通过 IPv4 地址连接
 * TLS 加密连接
 * 自动打开 uPnP 端口
-* 插件和多用户 (开放式代理) 支持
+* 多用户(openproxy)支持的插件
-* 全平台兼容
+* 适用于任何浏览器 / 操作系统
 ## 原理
-* 在你运行`zeronet.py`后你将可以通过`http://127.0.0.1:43110/{zeronet_address}` (比如.
-`http://127.0.0.1:43110/1HeLLo4uzjaLetFx6NH3PMwFP3qbRbTf3D`)。访问 zeronet 中的站点。
-* 在你浏览 zeronet 站点时,客户端会尝试通过 BitTorrent 网络来寻找可用的节点,从而下载需要的文件 (html, css, js...)
-* 你将会储存每一个浏览过的站点
-* 每个站点都包含一个名为 `content.json` ,它储存了其他所有文件的 sha512 hash 值
-和一个通过站点私钥建立的签名
-* 如果站点的所有者 (拥有私钥的那个人) 修改了站点, 并且他/她签名了新的 `content.json` 然后推送至其他节点,
-那么所有节点将会在验证 `content.json` 的真实性 (使用签名)后, 下载修改后的文件并推送至其他节点。
-#### [有关于 ZeroNet 加密, 站点更新, 多用户站点的幻灯片 »](https://docs.google.com/presentation/d/1qBxkroB_iiX2zHEn0dt-N-qRZgyEzui46XS2hEa3AA4/pub?start=false&loop=false&delayms=3000)
-#### [常见问题 »](https://zeronet.io/docs/faq/)
-#### [ZeroNet开发者文档 »](https://zeronet.io/docs/site_development/getting_started/)
+* 在运行 `zeronet.py` 后,您将可以通过
+`http://127.0.0.1:43110/{zeronet_address}`(例如:
+`http://127.0.0.1:43110/1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d`)访问 zeronet 中的站点
+* 在您浏览 zeronet 站点时,客户端会尝试通过 BitTorrent 网络来寻找可用的节点,从而下载需要的文件(html,css,js...)
+* 您将会储存每一个浏览过的站点
+* 每个站点都包含一个名为 `content.json` 的文件,它储存了其他所有文件的 sha512 散列值以及一个通过站点私钥生成的签名
+* 如果站点的所有者(拥有站点地址的私钥)修改了站点,并且他 / 她签名了新的 `content.json` 然后推送至其他节点,
+那么这些节点将会在使用签名验证 `content.json` 的真实性后,下载修改后的文件并将新内容推送至另外的节点
+#### [关于 ZeroNet 加密,站点更新,多用户站点的幻灯片 »](https://docs.google.com/presentation/d/1_2qK1IuOKJ51pgBvllZ9Yu7Au2l551t3XBgyTSvilew/pub?start=false&loop=false&delayms=3000)
+#### [常见问题 »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/faq/)
+#### [ZeroNet 开发者文档 »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/site_development/getting_started/)
 ## 屏幕截图
@@ -53,136 +51,82 @@
 ![Screenshot](https://i.imgur.com/H60OAHY.png)
 ![ZeroTalk](https://zeronet.io/docs/img/zerotalk.png)
-#### [在 ZeroNet 文档里查看更多的屏幕截图 »](https://zeronet.io/docs/using_zeronet/sample_sites/)
+#### [ZeroNet 文档中的更多屏幕截图 »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/using_zeronet/sample_sites/)
 ## 如何加入
-* 下载 ZeroBundle 文件包:
-  * [Microsoft Windows](https://github.com/HelloZeroNet/ZeroNet-win/archive/dist/ZeroNet-win.zip)
-  * [Apple macOS](https://github.com/HelloZeroNet/ZeroNet-mac/archive/dist/ZeroNet-mac.zip)
-  * [Linux 64bit](https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux64.tar.gz)
-  * [Linux 32bit](https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux32.tar.gz)
-* 解压缩
-* 运行 `ZeroNet.exe` (win), `ZeroNet(.app)` (osx), `ZeroNet.sh` (linux)
+### Windows
+- 下载 [ZeroNet-win.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-win.zip) (26MB)
+- 在任意位置解压缩
+- 运行 `ZeroNet.exe`
-### Linux 命令行
-* `wget https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux64.tar.gz`
-* `tar xvpfz ZeroBundle-linux64.tar.gz`
-* `cd ZeroBundle`
-* 执行 `./ZeroNet.sh` 来启动
-在你打开时他将会自动下载最新版本的 ZeroNet 。
+### macOS
+- 下载 [ZeroNet-mac.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-mac.zip) (14MB)
+- 在任意位置解压缩
+- 运行 `ZeroNet.app`
+### Linux (x86-64bit)
+- `wget https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-linux.zip`
+- `unzip ZeroNet-linux.zip`
+- `cd ZeroNet-linux`
+- 使用以下命令启动 `./ZeroNet.sh`
+- 在浏览器打开 http://127.0.0.1:43110/ 即可访问 ZeroHello 页面
+__提示__ 若要允许在 Web 界面上的远程连接,使用以下命令启动 `./ZeroNet.sh --ui_ip '*' --ui_restrict your.ip.address`
-#### 在 Debian Linux 中手动安装
-* `sudo apt-get update`
-* `sudo apt-get install msgpack-python python-gevent`
-* `wget https://github.com/HelloZeroNet/ZeroNet/archive/master.tar.gz`
-* `tar xvpfz master.tar.gz`
-* `cd ZeroNet-master`
-* 执行 `python2 zeronet.py` 来启动
-* 在你的浏览器中打开 http://127.0.0.1:43110/
+### 从源代码安装
+- `wget https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-src.zip`
+- `unzip ZeroNet-src.zip`
+- `cd ZeroNet`
+- `sudo apt-get update`
+- `sudo apt-get install python3-pip`
+- `sudo python3 -m pip install -r requirements.txt`
+- 使用以下命令启动 `python3 zeronet.py`
+- 在浏览器打开 http://127.0.0.1:43110/ 即可访问 ZeroHello 页面
-### [FreeBSD](https://www.freebsd.org/)
-* `pkg install zeronet` 或者 `cd /usr/ports/security/zeronet/ && make install clean`
-* `sysrc zeronet_enable="YES"`
-* `service zeronet start`
-* 在你的浏览器中打开 http://127.0.0.1:43110/
+### Android (arm, arm64, x86)
+- minimum Android version supported 21 (Android 5.0 Lollipop)
+- [<img src="https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png"
+alt="Download from Google Play"
+height="80">](https://play.google.com/store/apps/details?id=in.canews.zeronetmobile)
+- APK download: https://github.com/canewsin/zeronet_mobile/releases
+### Android (arm, arm64, x86) Thin Client for Preview Only (Size 1MB)
+- minimum Android version supported 16 (JellyBean)
+- [<img src="https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png"
+alt="Download from Google Play"
+height="80">](https://play.google.com/store/apps/details?id=dev.zeronetx.app.lite)
-### [Vagrant](https://www.vagrantup.com/)
-* `vagrant up`
-* 通过 `vagrant ssh` 连接到 VM
-* `cd /vagrant`
-* 运行 `python2 zeronet.py --ui_ip 0.0.0.0`
-* 在你的浏览器中打开 http://127.0.0.1:43110/
-### [Docker](https://www.docker.com/)
-* `docker run -d -v <local_data_folder>:/root/data -p 26552:26552 -p 43110:43110 nofish/zeronet`
-* 这个 Docker 镜像包含了 Tor ,但默认是禁用的,因为一些托管商不允许你在他们的服务器上运行 Tor。如果你希望启用它,
-设置 `ENABLE_TOR` 环境变量为 `true` (默认: `false`). E.g.:
-`docker run -d -e "ENABLE_TOR=true" -v <local_data_folder>:/root/data -p 26552:26552 -p 43110:43110 nofish/zeronet`
-* 在你的浏览器中打开 http://127.0.0.1:43110/
-### [Virtualenv](https://virtualenv.readthedocs.org/en/latest/)
-* `virtualenv env`
-* `source env/bin/activate`
-* `pip install msgpack gevent`
-* `python2 zeronet.py`
-* 在你的浏览器中打开 http://127.0.0.1:43110/
 ## 现有限制
-* ~~没有类似于 BitTorrent 的文件拆分来支持大文件~~ (已添加大文件支持)
-* ~~没有比 BitTorrent 更好的匿名性~~ (已添加内置的完整 Tor 支持)
-* 传输文件时没有压缩~~和加密~~ (已添加 TLS 支持)
+* 传输文件时没有压缩
 * 不支持私有站点
 ## 如何创建一个 ZeroNet 站点?
+* 点击 [ZeroHello](http://127.0.0.1:43110/1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d) 站点的 **⋮** > **「新建空站点」** 菜单项
+* 您将被**重定向**到一个全新的站点,该站点只能由您修改
+* 您可以在 **data/[您的站点地址]** 目录中找到并修改网站的内容
+* 修改后打开您的网站,将右上角的「0」按钮拖到左侧,然后点击底部的**签名**并**发布**按钮
+接下来的步骤:[ZeroNet 开发者文档](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/site_development/getting_started/)
-如果 zeronet 在运行,把它关掉
-执行:
-```bash
-$ zeronet.py siteCreate
-...
-- Site private key: 23DKQpzxhbVBrAtvLEc2uvk7DZweh4qL3fn3jpM3LgHDczMK2TtYUq
-- Site address: 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-...
-- Site created!
-$ zeronet.py
-...
-```
-你已经完成了! 现在任何人都可以通过
-`http://localhost:43110/13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2`
-来访问你的站点
-下一步: [ZeroNet 开发者文档](https://zeronet.io/docs/site_development/getting_started/)
-## 我要如何修改 ZeroNet 站点?
-* 修改位于 data/13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2 的目录.
-在你改好之后:
-```bash
-$ zeronet.py siteSign 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-- Signing site: 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2...
-Private key (input hidden):
-```
-* 输入你在创建站点时获得的私钥
-```bash
-$ zeronet.py sitePublish 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-...
-Site:13DNDk..bhC2 Publishing to 3/10 peers...
-Site:13DNDk..bhC2 Successfuly published to 3 peers
-- Serving files....
-```
-* 就是这样! 你现在已经成功的签名并推送了你的更改。
 ## 帮助这个项目
+- Bitcoin: 1ZeroNetyV5mKY9JF1gsm82TuBXHpfdLX (Preferred)
+- LiberaPay: https://liberapay.com/PramUkesh
+- Paypal: https://paypal.me/PramUkesh
+- Others: [Donate](!https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/#help-to-keep-zeronet-development-alive)
-- Bitcoin: 1QDhxQ6PraUZa21ET5fYUCPgdrwBomnFgX
-- Paypal: https://zeronet.io/docs/help_zeronet/donate/
-### 赞助商
-* 在 OSX/Safari 下 [BrowserStack.com](https://www.browserstack.com) 带来更好的兼容性
-#### 感谢!
+#### 感谢您!
-* 更多信息, 帮助, 变更记录和 zeronet 站点: https://www.reddit.com/r/zeronet/
+* 更多信息,帮助,变更记录和 zeronet 站点:https://www.reddit.com/r/zeronetx/
-* 在: [#zeronet @ FreeNode](https://kiwiirc.com/client/irc.freenode.net/zeronet) 和我们聊天,或者使用 [gitter](https://gitter.im/HelloZeroNet/ZeroNet)
+* 前往 [#zeronet @ FreeNode](https://kiwiirc.com/client/irc.freenode.net/zeronet) 或 [gitter](https://gitter.im/canewsin/ZeroNet) 和我们聊天
-* [这里](https://gitter.im/ZeroNet-zh/Lobby)是一个 gitter 上的中文聊天室
+* [这里](https://gitter.im/canewsin/ZeroNet)是一个 gitter 上的中文聊天室
-* Email: hello@noloop.me
+* Email: canews.in@gmail.com

README.md
@@ -1,9 +1,6 @@
-# ZeroNet [![Build Status](https://travis-ci.org/HelloZeroNet/ZeroNet.svg?branch=master)](https://travis-ci.org/HelloZeroNet/ZeroNet) [![Documentation](https://img.shields.io/badge/docs-faq-brightgreen.svg)](https://zeronet.io/docs/faq/) [![Help](https://img.shields.io/badge/keep_this_project_alive-donate-yellow.svg)](https://zeronet.io/docs/help_zeronet/donate/)
+# ZeroNet [![tests](https://github.com/ZeroNetX/ZeroNet/actions/workflows/tests.yml/badge.svg)](https://github.com/ZeroNetX/ZeroNet/actions/workflows/tests.yml) [![Documentation](https://img.shields.io/badge/docs-faq-brightgreen.svg)](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/faq/) [![Help](https://img.shields.io/badge/keep_this_project_alive-donate-yellow.svg)](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/) [![Docker Pulls](https://img.shields.io/docker/pulls/canewsin/zeronet)](https://hub.docker.com/r/canewsin/zeronet)
+<!--TODO: Update Onion Site -->
-[简体中文](./README-zh-cn.md)
-[Русский](./README-ru.md)
-Decentralized websites using Bitcoin crypto and the BitTorrent network - https://zeronet.io
+Decentralized websites using Bitcoin crypto and the BitTorrent network - https://zeronet.dev / [ZeroNet Site](http://127.0.0.1:43110/1ZeroNetyV5mKY9JF1gsm82TuBXHpfdLX/), Unlike Bitcoin, ZeroNet Doesn't need a blockchain to run, But uses cryptography used by BTC, to ensure data integrity and validation.
 ## Why?
@@ -36,22 +33,22 @@ Decentralized websites using Bitcoin crypto and the BitTorrent network - https:/
 * After starting `zeronet.py` you will be able to visit zeronet sites using
 `http://127.0.0.1:43110/{zeronet_address}` (eg.
-`http://127.0.0.1:43110/1HeLLo4uzjaLetFx6NH3PMwFP3qbRbTf3D`).
+`http://127.0.0.1:43110/1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d`).
 * When you visit a new zeronet site, it tries to find peers using the BitTorrent
 network so it can download the site files (html, css, js...) from them.
 * Each visited site is also served by you.
 * Every site contains a `content.json` file which holds all other files in a sha512 hash
 and a signature generated using the site's private key.
 * If the site owner (who has the private key for the site address) modifies the
-site, then he/she signs the new `content.json` and publishes it to the peers.
+site and signs the new `content.json` and publishes it to the peers.
 Afterwards, the peers verify the `content.json` integrity (using the
 signature), they download the modified files and publish the new content to
 other peers.
 #### [Slideshow about ZeroNet cryptography, site updates, multi-user sites »](https://docs.google.com/presentation/d/1_2qK1IuOKJ51pgBvllZ9Yu7Au2l551t3XBgyTSvilew/pub?start=false&loop=false&delayms=3000)
-#### [Frequently asked questions »](https://zeronet.io/docs/faq/)
+#### [Frequently asked questions »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/faq/)
-#### [ZeroNet Developer Documentation »](https://zeronet.io/docs/site_development/getting_started/)
+#### [ZeroNet Developer Documentation »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/site_development/getting_started/)
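Editor's note: the integrity scheme in the "How does it work?" bullets above (per-file sha512 digests listed in `content.json`, plus a signature made with the site's private key) can be sketched in a few lines of Python. This is an illustration only, not code from this repository: the `files`/`sha512` field names follow ZeroNet's documented content.json format, the signature check is left as a comment because it goes through ZeroNet's own CryptBitcoin helpers, and the stored digest may be truncated by the client.

```python
import hashlib
import json
import os

def check_site_files(site_dir):
    # Load the manifest that the site owner signed
    with open(os.path.join(site_dir, "content.json")) as f:
        content = json.load(f)
    for inner_path, info in content.get("files", {}).items():
        with open(os.path.join(site_dir, inner_path), "rb") as f:
            digest = hashlib.sha512(f.read()).hexdigest()
        # Assumes a hex sha512 digest; ZeroNet's CryptHash helper may store
        # a truncated form, so compare by prefix in this sketch.
        if not digest.startswith(info["sha512"]):
            return False
    # A real peer also verifies content.json["signs"] against the site
    # address (Bitcoin-style ECDSA) before trusting the file list.
    return True
```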
 ## Screenshots
@@ -59,163 +56,101 @@ Decentralized websites using Bitcoin crypto and the BitTorrent network - https:/
 ![Screenshot](https://i.imgur.com/H60OAHY.png)
 ![ZeroTalk](https://zeronet.io/docs/img/zerotalk.png)
-#### [More screenshots in ZeroNet docs »](https://zeronet.io/docs/using_zeronet/sample_sites/)
+#### [More screenshots in ZeroNet docs »](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/using_zeronet/sample_sites/)
 ## How to join
-* Download ZeroBundle package:
-  * [Microsoft Windows](https://github.com/HelloZeroNet/ZeroNet-win/archive/dist/ZeroNet-win.zip)
-  * [Apple macOS](https://github.com/HelloZeroNet/ZeroNet-mac/archive/dist/ZeroNet-mac.zip)
-  * [Linux x86/64-bit](https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux64.tar.gz)
-  * [Linux x86/32-bit](https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux32.tar.gz)
-* Unpack anywhere
-* Run `ZeroNet.exe` (win), `ZeroNet(.app)` (osx), `ZeroNet.sh` (linux)
+### Windows
+- Download [ZeroNet-win.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-win.zip) (26MB)
+- Unpack anywhere
+- Run `ZeroNet.exe`
-### Linux terminal on x86-64
-* `wget https://github.com/HelloZeroNet/ZeroBundle/raw/master/dist/ZeroBundle-linux64.tar.gz`
-* `tar xvpfz ZeroBundle-linux64.tar.gz`
-* `cd ZeroBundle`
-* Start with `./ZeroNet.sh`
-It downloads the latest version of ZeroNet then starts it automatically.
+### macOS
+- Download [ZeroNet-mac.zip](https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-mac.zip) (14MB)
+- Unpack anywhere
+- Run `ZeroNet.app`
+### Linux (x86-64bit)
+- `wget https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-linux.zip`
+- `unzip ZeroNet-linux.zip`
+- `cd ZeroNet-linux`
+- Start with: `./ZeroNet.sh`
+- Open the ZeroHello landing page in your browser by navigating to: http://127.0.0.1:43110/
+__Tip:__ Start with `./ZeroNet.sh --ui_ip '*' --ui_restrict your.ip.address` to allow remote connections on the web interface.
+### Android (arm, arm64, x86)
+- minimum Android version supported 21 (Android 5.0 Lollipop)
+- [<img src="https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png"
+alt="Download from Google Play"
+height="80">](https://play.google.com/store/apps/details?id=in.canews.zeronetmobile)
+- APK download: https://github.com/canewsin/zeronet_mobile/releases
+### Android (arm, arm64, x86) Thin Client for Preview Only (Size 1MB)
+- minimum Android version supported 16 (JellyBean)
+- [<img src="https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png"
+alt="Download from Google Play"
+height="80">](https://play.google.com/store/apps/details?id=dev.zeronetx.app.lite)
-#### Manual install for Debian Linux
-* `sudo apt-get update`
-* `sudo apt-get install msgpack-python python-gevent`
-* `wget https://github.com/HelloZeroNet/ZeroNet/archive/master.tar.gz`
-* `tar xvpfz master.tar.gz`
-* `cd ZeroNet-master`
-* Start with `python2 zeronet.py`
-* Open http://127.0.0.1:43110/ in your browser
+#### Docker
+There is an official image, built from source at: https://hub.docker.com/r/canewsin/zeronet/
-### [Whonix](https://www.whonix.org)
-* [Instructions](https://www.whonix.org/wiki/ZeroNet)
+### Online Proxies
+Proxies are like seed boxes for sites(i.e ZNX runs on a cloud vps), you can try zeronet experience from proxies. Add your proxy below if you have one.
+#### Official ZNX Proxy :
+https://proxy.zeronet.dev/
+https://zeronet.dev/
-### [Arch Linux](https://www.archlinux.org)
-* `git clone https://aur.archlinux.org/zeronet.git`
-* `cd zeronet`
-* `makepkg -srci`
-* `systemctl start zeronet`
-* Open http://127.0.0.1:43110/ in your browser
-See [ArchWiki](https://wiki.archlinux.org)'s [ZeroNet
-article](https://wiki.archlinux.org/index.php/ZeroNet) for further assistance.
+#### From Community
+https://0net-preview.com/
+https://portal.ngnoid.tv/
+https://zeronet.ipfsscan.io/
-### [Gentoo Linux](https://www.gentoo.org)
-* [`eselect repository enable raiagent`](https://github.com/leycec/raiagent)
-* `emerge --sync`
-* `echo 'net-vpn/zeronet' >> /etc/portage/package.accept_keywords`
-* *(Optional)* Enable Tor support: `echo 'net-vpn/zeronet tor' >>
-/etc/portage/package.use`
-* `emerge zeronet`
-* `rc-service zeronet start`
-* *(Optional)* Enable zeronet at runlevel "default": `rc-update add zeronet`
-* Open http://127.0.0.1:43110/ in your browser
-See `/usr/share/doc/zeronet-*/README.gentoo.bz2` for further assistance.
-### [FreeBSD](https://www.freebsd.org/)
-* `pkg install zeronet` or `cd /usr/ports/security/zeronet/ && make install clean`
-* `sysrc zeronet_enable="YES"`
-* `service zeronet start`
-* Open http://127.0.0.1:43110/ in your browser
+### Install from source
+- `wget https://github.com/ZeroNetX/ZeroNet/releases/latest/download/ZeroNet-src.zip`
+- `unzip ZeroNet-src.zip`
+- `cd ZeroNet`
+- `sudo apt-get update`
+- `sudo apt-get install python3-pip`
+- `sudo python3 -m pip install -r requirements.txt`
+- Start with: `python3 zeronet.py`
+- Open the ZeroHello landing page in your browser by navigating to: http://127.0.0.1:43110/
-### [Vagrant](https://www.vagrantup.com/)
-* `vagrant up`
-* Access VM with `vagrant ssh`
-* `cd /vagrant`
-* Run `python2 zeronet.py --ui_ip 0.0.0.0`
-* Open http://127.0.0.1:43110/ in your browser
-### [Docker](https://www.docker.com/)
-* `docker run -d -v <local_data_folder>:/root/data -p 26552:26552 -p 127.0.0.1:43110:43110 nofish/zeronet`
-* This Docker image includes the Tor proxy, which is disabled by default. Beware that some
-hosting providers may not allow you running Tor in their servers. If you want to enable it,
-set `ENABLE_TOR` environment variable to `true` (Default: `false`). E.g.:
-`docker run -d -e "ENABLE_TOR=true" -v <local_data_folder>:/root/data -p 26552:26552 -p 127.0.0.1:43110:43110 nofish/zeronet`
-* Open http://127.0.0.1:43110/ in your browser
-### [Virtualenv](https://virtualenv.readthedocs.org/en/latest/)
-* `virtualenv env`
-* `source env/bin/activate`
-* `pip install msgpack gevent`
-* `python2 zeronet.py`
-* Open http://127.0.0.1:43110/ in your browser
 ## Current limitations
-* ~~No torrent-like file splitting for big file support~~ (big file support added)
-* ~~No more anonymous than Bittorrent~~ (built-in full Tor support added)
-* File transactions are not compressed ~~or encrypted yet~~ (TLS encryption added)
+* File transactions are not compressed
 * No private sites
 ## How can I create a ZeroNet site?
+* Click on **⋮** > **"Create new, empty site"** menu item on the site [ZeroHello](http://127.0.0.1:43110/1HELLoE3sFD9569CLCbHEAVqvqV7U2Ri9d).
+* You will be **redirected** to a completely new site that is only modifiable by you!
+* You can find and modify your site's content in **data/[yoursiteaddress]** directory
+* After the modifications open your site, drag the topright "0" button to left, then press **sign** and **publish** buttons on the bottom
+Next steps: [ZeroNet Developer Documentation](https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/site_development/getting_started/)
-Shut down zeronet if you are running it already
-```bash
-$ zeronet.py siteCreate
-...
-- Site private key: 23DKQpzxhbVBrAtvLEc2uvk7DZweh4qL3fn3jpM3LgHDczMK2TtYUq
-- Site address: 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-...
-- Site created!
-$ zeronet.py
-...
-```
-Congratulations, you're finished! Now anyone can access your site using
-`http://localhost:43110/13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2`
-Next steps: [ZeroNet Developer Documentation](https://zeronet.io/docs/site_development/getting_started/)
-## How can I modify a ZeroNet site?
-* Modify files located in data/13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2 directory.
-After you're finished:
-```bash
-$ zeronet.py siteSign 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-- Signing site: 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2...
-Private key (input hidden):
-```
-* Enter the private key you got when you created the site, then:
-```bash
-$ zeronet.py sitePublish 13DNDkMUExRf9Xa9ogwPKqp7zyHFEqbhC2
-...
-Site:13DNDk..bhC2 Publishing to 3/10 peers...
-Site:13DNDk..bhC2 Successfuly published to 3 peers
-- Serving files....
-```
-* That's it! You've successfully signed and published your modifications.
 ## Help keep this project alive
+- Bitcoin: 1ZeroNetyV5mKY9JF1gsm82TuBXHpfdLX (Preferred)
+- LiberaPay: https://liberapay.com/PramUkesh
+- Paypal: https://paypal.me/PramUkesh
+- Others: [Donate](!https://docs.zeronet.dev/1DeveLopDZL1cHfKi8UXHh2UBEhzH6HhMp/help_zeronet/donate/#help-to-keep-zeronet-development-alive)
-- Bitcoin: 1QDhxQ6PraUZa21ET5fYUCPgdrwBomnFgX
-- Paypal: https://zeronet.io/docs/help_zeronet/donate/
-### Sponsors
-* Better macOS/Safari compatibility made possible by [BrowserStack.com](https://www.browserstack.com)
 #### Thank you!
-* More info, help, changelog, zeronet sites: https://www.reddit.com/r/zeronet/
+* More info, help, changelog, zeronet sites: https://www.reddit.com/r/zeronetx/
-* Come, chat with us: [#zeronet @ FreeNode](https://kiwiirc.com/client/irc.freenode.net/zeronet) or on [gitter](https://gitter.im/HelloZeroNet/ZeroNet)
+* Come, chat with us: [#zeronet @ FreeNode](https://kiwiirc.com/client/irc.freenode.net/zeronet) or on [gitter](https://gitter.im/canewsin/ZeroNet)
-* Email: hello@zeronet.io (PGP: CB9613AE)
+* Email: canews.in@gmail.com

plugins (submodule)
@@ -0,0 +1 @@
+Subproject commit 689d9309f73371f4681191b125ec3f2e14075eeb

plugins/AnnounceLocal/AnnounceLocalPlugin.py (deleted)
@@ -1,148 +0,0 @@
import time

import gevent

from Plugin import PluginManager
from Config import config
import BroadcastServer


@PluginManager.registerTo("SiteAnnouncer")
class SiteAnnouncerPlugin(object):
    def announce(self, force=False, *args, **kwargs):
        local_announcer = self.site.connection_server.local_announcer

        thread = None
        if local_announcer and (force or time.time() - local_announcer.last_discover > 5 * 60):
            thread = gevent.spawn(local_announcer.discover, force=force)
        back = super(SiteAnnouncerPlugin, self).announce(force=force, *args, **kwargs)

        if thread:
            thread.join()

        return back


class LocalAnnouncer(BroadcastServer.BroadcastServer):
    def __init__(self, server, listen_port):
        super(LocalAnnouncer, self).__init__("zeronet", listen_port=listen_port)
        self.server = server

        self.sender_info["peer_id"] = self.server.peer_id
        self.sender_info["port"] = self.server.port
        self.sender_info["broadcast_port"] = listen_port
        self.sender_info["rev"] = config.rev

        self.known_peers = {}
        self.last_discover = 0

    def discover(self, force=False):
        self.log.debug("Sending discover request (force: %s)" % force)
        self.last_discover = time.time()
        if force:  # Probably new site added, clean cache
            self.known_peers = {}

        # Iterate over a copy so timed-out entries can be dropped while looping
        for peer_id, known_peer in list(self.known_peers.items()):
            if time.time() - known_peer["found"] > 20 * 60:
                del(self.known_peers[peer_id])
                self.log.debug("Timeout, removing from known_peers: %s" % peer_id)

        self.broadcast({"cmd": "discoverRequest", "params": {}}, port=self.listen_port)

    def actionDiscoverRequest(self, sender, params):
        back = {
            "cmd": "discoverResponse",
            "params": {
                "sites_changed": self.server.site_manager.sites_changed
            }
        }

        if sender["peer_id"] not in self.known_peers:
            self.known_peers[sender["peer_id"]] = {"added": time.time(), "sites_changed": 0, "updated": 0, "found": time.time()}
            self.log.debug("Got discover request from unknown peer %s (%s), time to refresh known peers" % (sender["ip"], sender["peer_id"]))
            gevent.spawn_later(1.0, self.discover)  # Let the response arrive first to the requester

        return back

    def actionDiscoverResponse(self, sender, params):
        if sender["peer_id"] in self.known_peers:
            self.known_peers[sender["peer_id"]]["found"] = time.time()
        if params["sites_changed"] != self.known_peers.get(sender["peer_id"], {}).get("sites_changed"):
            # Peer's site list changed, request the list of new sites
            return {"cmd": "siteListRequest"}
        else:
            # Peer's site list is the same
            for site in self.server.sites.values():
                peer = site.peers.get("%s:%s" % (sender["ip"], sender["port"]))
                if peer:
                    peer.found("local")

    def actionSiteListRequest(self, sender, params):
        back = []
        sites = list(self.server.sites.values())

        # Split addresses into groups of 100 to avoid UDP size limit
        site_groups = [sites[i:i + 100] for i in range(0, len(sites), 100)]
        for site_group in site_groups:
            res = {}
            res["sites_changed"] = self.server.site_manager.sites_changed
            res["sites"] = [site.address_hash for site in site_group]
            back.append({"cmd": "siteListResponse", "params": res})
        return back

    def actionSiteListResponse(self, sender, params):
        s = time.time()
        peer_sites = set(params["sites"])
        num_found = 0
        added_sites = []
        for site in self.server.sites.values():
            if site.address_hash in peer_sites:
                added = site.addPeer(sender["ip"], sender["port"], source="local")
                num_found += 1
                if added:
                    site.worker_manager.onPeers()
                    site.updateWebsocket(peers_added=1)
                    added_sites.append(site)

        # Save sites changed value to avoid unnecessary site list download
        if sender["peer_id"] not in self.known_peers:
            self.known_peers[sender["peer_id"]] = {"added": time.time()}

        self.known_peers[sender["peer_id"]]["sites_changed"] = params["sites_changed"]
        self.known_peers[sender["peer_id"]]["updated"] = time.time()
        self.known_peers[sender["peer_id"]]["found"] = time.time()

        self.log.debug(
            "Tracker result: Discover from %s response parsed in %.3fs, found: %s added: %s of %s" %
            (sender["ip"], time.time() - s, num_found, added_sites, len(peer_sites))
        )


@PluginManager.registerTo("FileServer")
class FileServerPlugin(object):
    def __init__(self, *args, **kwargs):
        res = super(FileServerPlugin, self).__init__(*args, **kwargs)
        if config.broadcast_port and config.tor != "always" and not config.disable_udp:
            self.local_announcer = LocalAnnouncer(self, config.broadcast_port)
        else:
            self.local_announcer = None
        return res

    def start(self, *args, **kwargs):
        if self.local_announcer:
            gevent.spawn(self.local_announcer.start)
        return super(FileServerPlugin, self).start(*args, **kwargs)

    def stop(self):
        if self.local_announcer:
            self.local_announcer.stop()
        res = super(FileServerPlugin, self).stop()
        return res


@PluginManager.registerTo("ConfigPlugin")
class ConfigPlugin(object):
    def createArguments(self):
        group = self.parser.add_argument_group("AnnounceLocal plugin")
        group.add_argument('--broadcast_port', help='UDP broadcasting port for local peer discovery', default=1544, type=int, metavar='port')
        return super(ConfigPlugin, self).createArguments()
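Editor's note, as a reading aid for the plugin above: every message it exchanges is a plain dict serialized with msgpack, and `handleMessage()` in BroadcastServer dispatches on `cmd` by looking up an `action`-prefixed method. A sketch of what `discover()` puts on the wire; the peer id, ports, and revision below are invented example values, not real protocol constants.

```python
import msgpack

# Shape of a discoverRequest; the "sender" block is filled in from
# sender_info by broadcast(). Concrete values are illustrative only.
message = {
    "cmd": "discoverRequest",
    "params": {},
    "sender": {
        "service": "zeronet",
        "peer_id": "-ZN0000-abcdefghijkl",  # hypothetical 20-char peer id
        "port": 15441,                      # hypothetical fileserver port
        "broadcast_port": 1544,             # --broadcast_port default
        "rev": 4555,                        # hypothetical client revision
    },
}

wire = msgpack.packb(message)
decoded = msgpack.unpackb(wire)
# On the receiving side, handleMessage() maps "discoverRequest" to
# actionDiscoverRequest() and sends the returned dict back to
# (sender["ip"], sender["broadcast_port"]).
```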

plugins/AnnounceLocal/BroadcastServer.py (deleted)
@@ -1,140 +0,0 @@
import socket
import logging
import time
from contextlib import closing

import msgpack

from Debug import Debug
from util import UpnpPunch


class BroadcastServer(object):
    def __init__(self, service_name, listen_port=1544, listen_ip=''):
        self.log = logging.getLogger("BroadcastServer")
        self.listen_port = listen_port
        self.listen_ip = listen_ip

        self.running = False
        self.sock = None
        self.sender_info = {"service": service_name}

    def createBroadcastSocket(self):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        if hasattr(socket, 'SO_REUSEPORT'):
            try:
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
            except Exception as err:
                self.log.warning("Error setting SO_REUSEPORT: %s" % err)

        binded = False
        for retry in range(3):
            try:
                sock.bind((self.listen_ip, self.listen_port))
                binded = True
                break
            except Exception as err:
                self.log.error(
                    "Socket bind to %s:%s error: %s, retry #%s" %
                    (self.listen_ip, self.listen_port, Debug.formatException(err), retry)
                )
                time.sleep(retry)

        if binded:
            return sock
        else:
            return False

    def start(self):  # Listens for discover requests
        self.sock = self.createBroadcastSocket()
        if not self.sock:
            self.log.error("Unable to listen on port %s" % self.listen_port)
            return

        self.log.debug("Started on port %s" % self.listen_port)

        self.running = True

        while self.running:
            try:
                data, addr = self.sock.recvfrom(8192)
            except Exception as err:
                if self.running:
                    self.log.error("Listener receive error: %s" % err)
                continue

            if not self.running:
                break

            try:
                message = msgpack.unpackb(data)
                response_addr, message = self.handleMessage(addr, message)
                if message:
                    self.send(response_addr, message)
            except Exception as err:
                self.log.error("Handlemessage error: %s" % Debug.formatException(err))
        self.log.debug("Stopped listening on port %s" % self.listen_port)

    def stop(self):
        self.log.debug("Stopping, socket: %s" % self.sock)
        self.running = False
        if self.sock:
            self.sock.close()

    def send(self, addr, message):
        if type(message) is not list:
            message = [message]

        for message_part in message:
            message_part["sender"] = self.sender_info
            self.log.debug("Send to %s: %s" % (addr, message_part["cmd"]))
            # Use the socket created by closing() directly; re-creating it
            # inside the with-block (as the original code did) would leak
            # an unclosed socket.
            with closing(socket.socket(socket.AF_INET, socket.SOCK_DGRAM)) as sock:
                sock.sendto(msgpack.packb(message_part), addr)

    def getMyIps(self):
        return UpnpPunch._get_local_ips()

    def broadcast(self, message, port=None):
        if not port:
            port = self.listen_port

        my_ips = self.getMyIps()
        addr = ("255.255.255.255", port)

        message["sender"] = self.sender_info
        self.log.debug("Broadcast using ips %s on port %s: %s" % (my_ips, port, message["cmd"]))

        for my_ip in my_ips:
            try:
                # Same fix as in send(): keep the closing()-managed socket
                with closing(socket.socket(socket.AF_INET, socket.SOCK_DGRAM)) as sock:
                    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
                    sock.bind((my_ip, 0))
                    sock.sendto(msgpack.packb(message), addr)
            except Exception as err:
                self.log.warning("Error sending broadcast using ip %s: %s" % (my_ip, err))

    def handleMessage(self, addr, message):
        self.log.debug("Got from %s: %s" % (addr, message["cmd"]))
        cmd = message["cmd"]
        params = message.get("params", {})
        sender = message["sender"]
        sender["ip"] = addr[0]

        func_name = "action" + cmd[0].upper() + cmd[1:]
        func = getattr(self, func_name, None)

        if sender["service"] != "zeronet" or sender["peer_id"] == self.sender_info["peer_id"]:
            # Skip messages not for us or sent by us
            message = None
        elif func:
            message = func(sender, params)
        else:
            self.log.debug("Unknown cmd: %s" % cmd)
            message = None

        return (sender["ip"], sender["broadcast_port"]), message
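Editor's note: BroadcastServer itself is service-agnostic; subclasses register handlers simply by defining `action<Cmd>` methods, as LocalAnnouncer does above. A minimal hypothetical subclass follows; the `ping`/`pong` command pair is invented for illustration and is not part of the ZeroNet protocol.

```python
# Hypothetical example, not repository code: shows the action-method
# dispatch contract of BroadcastServer.handleMessage().
class PingServer(BroadcastServer):
    def __init__(self, listen_port=1546):
        super(PingServer, self).__init__("zeronet", listen_port=listen_port)
        # handleMessage() compares peer ids to drop our own broadcasts,
        # so sender_info must carry one.
        self.sender_info["peer_id"] = "-ZN0000-pingexample1"  # hypothetical
        self.sender_info["broadcast_port"] = listen_port

    def actionPing(self, sender, params):
        # The returned dict is msgpack-encoded and sent back to
        # (sender["ip"], sender["broadcast_port"]) by start()'s loop.
        return {"cmd": "pong", "params": {}}

# Usage sketch (start() blocks; ZeroNet runs it in a gevent greenlet):
#   server = PingServer()
#   server.start()
```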

plugins/AnnounceLocal/Test/TestAnnounce.py (deleted)
@@ -1,113 +0,0 @@
import time
import copy

import gevent
import pytest
import mock

from AnnounceLocal import AnnounceLocalPlugin
from File import FileServer
from Test import Spy


@pytest.fixture
def announcer(file_server, site):
    file_server.sites[site.address] = site
    announcer = AnnounceLocalPlugin.LocalAnnouncer(file_server, listen_port=1100)
    file_server.local_announcer = announcer
    announcer.listen_port = 1100
    announcer.sender_info["broadcast_port"] = 1100
    announcer.getMyIps = mock.MagicMock(return_value=["127.0.0.1"])
    announcer.discover = mock.MagicMock(return_value=False)  # Don't send discover requests automatically
    gevent.spawn(announcer.start)
    time.sleep(0.5)

    assert file_server.local_announcer.running
    return file_server.local_announcer


@pytest.fixture
def announcer_remote(request, site_temp):
    file_server_remote = FileServer("127.0.0.1", 1545)
    file_server_remote.sites[site_temp.address] = site_temp
    announcer = AnnounceLocalPlugin.LocalAnnouncer(file_server_remote, listen_port=1101)
    file_server_remote.local_announcer = announcer
    announcer.listen_port = 1101
    announcer.sender_info["broadcast_port"] = 1101
    announcer.getMyIps = mock.MagicMock(return_value=["127.0.0.1"])
    announcer.discover = mock.MagicMock(return_value=False)  # Don't send discover requests automatically
    gevent.spawn(announcer.start)
    time.sleep(0.5)

    assert file_server_remote.local_announcer.running

    def cleanup():
        file_server_remote.stop()
    request.addfinalizer(cleanup)

    return file_server_remote.local_announcer


@pytest.mark.usefixtures("resetSettings")
@pytest.mark.usefixtures("resetTempSettings")
class TestAnnounce:
    def testSenderInfo(self, announcer):
        sender_info = announcer.sender_info
        assert sender_info["port"] > 0
        assert len(sender_info["peer_id"]) == 20
        assert sender_info["rev"] > 0

    def testIgnoreSelfMessages(self, announcer):
        # No response to messages that have the same peer_id as the server
        assert not announcer.handleMessage(("0.0.0.0", 123), {"cmd": "discoverRequest", "sender": announcer.sender_info, "params": {}})[1]

        # Response to messages with a different peer id
        sender_info = copy.copy(announcer.sender_info)
        sender_info["peer_id"] += "-"
        addr, res = announcer.handleMessage(("0.0.0.0", 123), {"cmd": "discoverRequest", "sender": sender_info, "params": {}})
        assert res["params"]["sites_changed"] > 0

    def testDiscoverRequest(self, announcer, announcer_remote):
        assert len(announcer_remote.known_peers) == 0
        with Spy.Spy(announcer_remote, "handleMessage") as responses:
            announcer_remote.broadcast({"cmd": "discoverRequest", "params": {}}, port=announcer.listen_port)
            time.sleep(0.1)

        response_cmds = [response[1]["cmd"] for response in responses]
        assert response_cmds == ["discoverResponse", "siteListResponse"]
        assert len(responses[-1][1]["params"]["sites"]) == 1

        # It should only request siteList if sites_changed value is different from last response
        with Spy.Spy(announcer_remote, "handleMessage") as responses:
            announcer_remote.broadcast({"cmd": "discoverRequest", "params": {}}, port=announcer.listen_port)
            time.sleep(0.1)

        response_cmds = [response[1]["cmd"] for response in responses]
        assert response_cmds == ["discoverResponse"]

    def testPeerDiscover(self, announcer, announcer_remote, site):
        assert announcer.server.peer_id != announcer_remote.server.peer_id
        # list() needed on Python 3, where dict views are not indexable
        assert len(list(announcer.server.sites.values())[0].peers) == 0
        announcer.broadcast({"cmd": "discoverRequest"}, port=announcer_remote.listen_port)
        time.sleep(0.1)
        assert len(list(announcer.server.sites.values())[0].peers) == 1

    def testRecentPeerList(self, announcer, announcer_remote, site):
        assert len(site.peers_recent) == 0
        assert len(site.peers) == 0
        with Spy.Spy(announcer, "handleMessage") as responses:
            announcer.broadcast({"cmd": "discoverRequest", "params": {}}, port=announcer_remote.listen_port)
            time.sleep(0.1)
        assert [response[1]["cmd"] for response in responses] == ["discoverResponse", "siteListResponse"]
        assert len(site.peers_recent) == 1
        assert len(site.peers) == 1

        # It should update the peer without a siteListResponse
        last_time_found = list(site.peers.values())[0].time_found
        site.peers_recent.clear()
        with Spy.Spy(announcer, "handleMessage") as responses:
            announcer.broadcast({"cmd": "discoverRequest", "params": {}}, port=announcer_remote.listen_port)
            time.sleep(0.1)
        assert [response[1]["cmd"] for response in responses] == ["discoverResponse"]
        assert len(site.peers_recent) == 1
        assert list(site.peers.values())[0].time_found > last_time_found

plugins/AnnounceLocal/Test/conftest.py (deleted)
@@ -1,4 +0,0 @@
from src.Test.conftest import *
from Config import config
config.broadcast_port = 0

plugins/AnnounceLocal/Test/pytest.ini (deleted)
@@ -1,5 +0,0 @@
[pytest]
python_files = Test*.py
addopts = -rsxX -v --durations=6
markers =
    webtest: mark a test as a webtest.

plugins/AnnounceLocal/__init__.py (deleted)
@@ -1 +0,0 @@
import AnnounceLocalPlugin

plugins/AnnounceShare/AnnounceSharePlugin.py (deleted)
@@ -1,188 +0,0 @@
import time
import os
import logging
import json
import atexit
import gevent
from Config import config
from Plugin import PluginManager
from util import helper
class TrackerStorage(object):
def __init__(self):
self.log = logging.getLogger("TrackerStorage")
self.file_path = "%s/trackers.json" % config.data_dir
self.load()
self.time_discover = 0.0
atexit.register(self.save)
def getDefaultFile(self):
return {"shared": {}}
def onTrackerFound(self, tracker_address, type="shared", my=False):
if not tracker_address.startswith("zero://"):
return False
trackers = self.getTrackers()
added = False
if tracker_address not in trackers:
trackers[tracker_address] = {
"time_added": time.time(),
"time_success": 0,
"latency": 99.0,
"num_error": 0,
"my": False
}
self.log.debug("New tracker found: %s" % tracker_address)
added = True
trackers[tracker_address]["time_found"] = time.time()
trackers[tracker_address]["my"] = my
return added
def onTrackerSuccess(self, tracker_address, latency):
trackers = self.getTrackers()
if tracker_address not in trackers:
return False
trackers[tracker_address]["latency"] = latency
trackers[tracker_address]["time_success"] = time.time()
trackers[tracker_address]["num_error"] = 0
def onTrackerError(self, tracker_address):
trackers = self.getTrackers()
if tracker_address not in trackers:
return False
trackers[tracker_address]["time_error"] = time.time()
trackers[tracker_address]["num_error"] += 1
if len(self.getWorkingTrackers()) >= config.working_shared_trackers_limit:
error_limit = 5
else:
error_limit = 30
error_limit
if trackers[tracker_address]["num_error"] > error_limit and trackers[tracker_address]["time_success"] < time.time() - 60 * 60:
self.log.debug("Tracker %s looks down, removing." % tracker_address)
del trackers[tracker_address]
def getTrackers(self, type="shared"):
return self.file_content.setdefault(type, {})
def getWorkingTrackers(self, type="shared"):
trackers = {
key: tracker for key, tracker in self.getTrackers(type).iteritems()
if tracker["time_success"] > time.time() - 60 * 60
}
return trackers
def getFileContent(self):
if not os.path.isfile(self.file_path):
open(self.file_path, "w").write("{}")
return self.getDefaultFile()
try:
return json.load(open(self.file_path))
except Exception as err:
self.log.error("Error loading trackers list: %s" % err)
return self.getDefaultFile()
def load(self):
self.file_content = self.getFileContent()
trackers = self.getTrackers()
self.log.debug("Loaded %s shared trackers" % len(trackers))
for address, tracker in trackers.items():
tracker["num_error"] = 0
if not address.startswith("zero://"):
del trackers[address]
def save(self):
s = time.time()
helper.atomicWrite(self.file_path, json.dumps(self.file_content, indent=2, sort_keys=True))
self.log.debug("Saved in %.3fs" % (time.time() - s))
def discoverTrackers(self, peers):
if len(self.getWorkingTrackers()) > config.working_shared_trackers_limit:
return False
s = time.time()
num_success = 0
for peer in peers:
if peer.connection and peer.connection.handshake.get("rev", 0) < 3560:
continue # Not supported
res = peer.request("getTrackers")
if not res or "error" in res:
continue
num_success += 1
for tracker_address in res["trackers"]:
added = self.onTrackerFound(tracker_address)
if added: # Only add one tracker from one source
break
if not num_success and len(peers) < 20:
self.time_discover = 0.0
if num_success:
self.save()
self.log.debug("Trackers discovered from %s/%s peers in %.3fs" % (num_success, len(peers), time.time() - s))
if "tracker_storage" not in locals():
tracker_storage = TrackerStorage()
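# Plugin hooks below: SiteAnnouncer mixes the shared trackers into every
# announce (and triggers tracker discovery from peers at most every 5
# minutes), FileRequest answers "getTrackers" so peers can exchange working
# trackers, and FileServer registers our own zero:// address after a
# successful port check when the Bootstrapper plugin can act as a tracker.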
@PluginManager.registerTo("SiteAnnouncer")
class SiteAnnouncerPlugin(object):
def getTrackers(self):
if tracker_storage.time_discover < time.time() - 5 * 60:
tracker_storage.time_discover = time.time()
gevent.spawn(tracker_storage.discoverTrackers, self.site.getConnectedPeers())
trackers = super(SiteAnnouncerPlugin, self).getTrackers()
shared_trackers = tracker_storage.getTrackers("shared").keys()
if shared_trackers:
return trackers + shared_trackers
else:
return trackers
def announceTracker(self, tracker, *args, **kwargs):
res = super(SiteAnnouncerPlugin, self).announceTracker(tracker, *args, **kwargs)
if res:
latency = res
tracker_storage.onTrackerSuccess(tracker, latency)
elif res is False:
tracker_storage.onTrackerError(tracker)
return res
@PluginManager.registerTo("FileRequest")
class FileRequestPlugin(object):
def actionGetTrackers(self, params):
shared_trackers = tracker_storage.getWorkingTrackers("shared").keys()
self.response({"trackers": shared_trackers})
@PluginManager.registerTo("FileServer")
class FileServerPlugin(object):
def portCheck(self, *args, **kwargs):
res = super(FileServerPlugin, self).portCheck(*args, **kwargs)
if res and not config.tor == "always" and "Bootstrapper" in PluginManager.plugin_manager.plugin_names:
for ip in self.ip_external_list:
my_tracker_address = "zero://%s:%s" % (ip, config.fileserver_port)
tracker_storage.onTrackerFound(my_tracker_address, my=True)
return res
@PluginManager.registerTo("ConfigPlugin")
class ConfigPlugin(object):
def createArguments(self):
group = self.parser.add_argument_group("AnnounceShare plugin")
group.add_argument('--working_shared_trackers_limit', help='Stop discovering new shared trackers after this number of shared trackers reached', default=5, type=int, metavar='limit')
return super(ConfigPlugin, self).createArguments()

View file

@@ -1,25 +0,0 @@
import pytest
from AnnounceShare import AnnounceSharePlugin
from Peer import Peer
from Config import config
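# Note: a discovered tracker is only handed out via "getTrackers" after at
# least one successful announce (see getWorkingTrackers), which is what this
# test exercises.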
@pytest.mark.usefixtures("resetSettings")
@pytest.mark.usefixtures("resetTempSettings")
class TestAnnounceShare:
def testAnnounceList(self, file_server):
open("%s/trackers.json" % config.data_dir, "w").write("{}")
tracker_storage = AnnounceSharePlugin.tracker_storage
tracker_storage.load()
print tracker_storage.file_path, config.data_dir
peer = Peer(file_server.ip, 1544, connection_server=file_server)
assert peer.request("getTrackers")["trackers"] == []
tracker_storage.onTrackerFound("zero://%s:15441" % file_server.ip)
assert peer.request("getTrackers")["trackers"] == []
# It needs at least one successful announce before it is shared with other peers
tracker_storage.onTrackerSuccess("zero://%s:15441" % file_server.ip, 1.0)
assert peer.request("getTrackers")["trackers"] == ["zero://%s:15441" % file_server.ip]

View file

@@ -1,3 +0,0 @@
from src.Test.conftest import *
from Config import config

View file

@@ -1,5 +0,0 @@
[pytest]
python_files = Test*.py
addopts = -rsxX -v --durations=6
markers =
webtest: mark a test as a webtest.

View file

@@ -1 +0,0 @@
import AnnounceSharePlugin

View file

@@ -1,138 +0,0 @@
import time
import itertools
from Plugin import PluginManager
from util import helper
from Crypt import CryptRsa
allow_reload = False # No source reload supported in this plugin
time_full_announced = {} # Tracker address: Last announced all site to tracker
connection_pool = {} # Tracker address: Peer object
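# A zero:// announce is a regular peer-to-peer "announce" request, roughly:
#   {"hashes": [site_address_hash, ...], "onions": [...], "port": fileserver_port,
#    "need_types": ["ip4", ...], "need_num": 20, "add": ["ip4", ...]}
# and the tracker replies with {"peers": [{"ip4": [...], "onion": [...]}, ...]},
# one entry per announced site (unpacked in processPeerRes below).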
# We can only import plugin host classes after the plugins are loaded
@PluginManager.afterLoad
def importHostClasses():
global Peer, AnnounceError
from Peer import Peer
from Site.SiteAnnouncer import AnnounceError
# Process the result received from the tracker
def processPeerRes(tracker_address, site, peers):
added = 0
# Ip4
found_ipv4 = 0
peers_normal = itertools.chain(peers.get("ip4", []), peers.get("ipv4", []), peers.get("ipv6", []))
for packed_address in peers_normal:
found_ipv4 += 1
peer_ip, peer_port = helper.unpackAddress(packed_address)
if site.addPeer(peer_ip, peer_port, source="tracker"):
added += 1
# Onion
found_onion = 0
for packed_address in peers["onion"]:
found_onion += 1
peer_onion, peer_port = helper.unpackOnionAddress(packed_address)
if site.addPeer(peer_onion, peer_port, source="tracker"):
added += 1
if added:
site.worker_manager.onPeers()
site.updateWebsocket(peers_added=added)
return added
@PluginManager.registerTo("SiteAnnouncer")
class SiteAnnouncerPlugin(object):
def getTrackerHandler(self, protocol):
if protocol == "zero":
return self.announceTrackerZero
else:
return super(SiteAnnouncerPlugin, self).getTrackerHandler(protocol)
def announceTrackerZero(self, tracker_address, mode="start", num_want=10):
global time_full_announced
s = time.time()
need_types = ["ip4"] # ip4 for backward compatibility reasons
need_types += self.site.connection_server.supported_ip_types
if self.site.connection_server.tor_manager.enabled:
need_types.append("onion")
if mode == "start" or mode == "more": # Single: Announce only this site
sites = [self.site]
full_announce = False
else:  # Multi: Announce all currently served sites
full_announce = True
if time.time() - time_full_announced.get(tracker_address, 0) < 60 * 15:  # Don't reannounce all sites within a short time
return None
time_full_announced[tracker_address] = time.time()
from Site import SiteManager
sites = [site for site in SiteManager.site_manager.sites.values() if site.settings["serving"]]
# Create request
add_types = self.getOpenedServiceTypes()
request = {
"hashes": [], "onions": [], "port": self.fileserver_port, "need_types": need_types, "need_num": 20, "add": add_types
}
for site in sites:
if "onion" in add_types:
onion = self.site.connection_server.tor_manager.getOnion(site.address)
request["onions"].append(onion)
request["hashes"].append(site.address_hash)
# Tracker can remove sites that we don't announce
if full_announce:
request["delete"] = True
# Send the request to the tracker
tracker_peer = connection_pool.get(tracker_address) # Re-use tracker connection if possible
if not tracker_peer:
tracker_ip, tracker_port = tracker_address.rsplit(":", 1)
tracker_peer = Peer(str(tracker_ip), int(tracker_port), connection_server=self.site.connection_server)
tracker_peer.is_tracker_connection = True
connection_pool[tracker_address] = tracker_peer
res = tracker_peer.request("announce", request)
if not res or "peers" not in res:
if full_announce:
time_full_announced[tracker_address] = 0
raise AnnounceError("Invalid response: %s" % res)
# Add peers from response to site
site_index = 0
peers_added = 0
for site_res in res["peers"]:
site = sites[site_index]
peers_added += processPeerRes(tracker_address, site, site_res)
site_index += 1
# Check if we need to sign to prove ownership of the onion addresses
if "onion_sign_this" in res:
self.site.log.debug("Signing %s for %s to add %s onions" % (res["onion_sign_this"], tracker_address, len(sites)))
request["onion_signs"] = {}
request["onion_sign_this"] = res["onion_sign_this"]
request["need_num"] = 0
for site in sites:
onion = self.site.connection_server.tor_manager.getOnion(site.address)
publickey = self.site.connection_server.tor_manager.getPublickey(onion)
if publickey not in request["onion_signs"]:
sign = CryptRsa.sign(res["onion_sign_this"], self.site.connection_server.tor_manager.getPrivatekey(onion))
request["onion_signs"][publickey] = sign
res = tracker_peer.request("announce", request)
if not res or "onion_sign_this" in res:
if full_announce:
time_full_announced[tracker_address] = 0
raise AnnounceError("Announce onion address to failed: %s" % res)
if full_announce:
tracker_peer.remove()  # Close the connection, we won't need it in the next 5 minutes
self.site.log.debug(
"Tracker announce result: zero://%s (sites: %s, new peers: %s) in %.3fs" %
(tracker_address, site_index, peers_added, time.time() - s)
)
return True

View file

@@ -1 +0,0 @@
import AnnounceZeroPlugin

View file

@@ -1,158 +0,0 @@
import array
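# A piecefield is a "0"/"1" string with one character per piece.
# packPiecefield run-length encodes it into an array of unsigned shorts:
# e.g. "111001" packs to [3, 2, 1] (3 ones, 2 zeros, 1 one), and a leading
# zero run is flagged by a 0 first entry ("0011" -> [0, 2, 2]).
# unpackPiecefield reverses this, always starting with a "1" run.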
def packPiecefield(data):
res = []
if not data:
return array.array("H", "")
if data[0] == "0":
res.append(0)
find = "1"
else:
find = "0"
last_pos = 0
pos = 0
while 1:
pos = data.find(find, pos)
if find == "0":
find = "1"
else:
find = "0"
if pos == -1:
res.append(len(data) - last_pos)
break
res.append(pos - last_pos)
last_pos = pos
return array.array("H", res)
def unpackPiecefield(data):
if not data:
return ""
res = []
char = "1"
for times in data:
if times > 10000:
return ""
res.append(char * times)
if char == "1":
char = "0"
else:
char = "1"
return "".join(res)
class BigfilePiecefield(object):
__slots__ = ["data"]
def __init__(self):
self.data = ""
def fromstring(self, s):
self.data = s
def tostring(self):
return self.data
def pack(self):
return packPiecefield(self.data).tostring()
def unpack(self, s):
self.data = unpackPiecefield(array.array("H", s))
def __getitem__(self, key):
try:
return int(self.data[key])
except IndexError:
return False
def __setitem__(self, key, value):
data = self.data
if len(data) < key:
data = data.ljust(key+1, "0")
data = data[:key] + str(int(value)) + data[key + 1:]
self.data = data
class BigfilePiecefieldPacked(object):
__slots__ = ["data"]
def __init__(self):
self.data = ""
def fromstring(self, data):
self.data = packPiecefield(data).tostring()
def tostring(self):
return unpackPiecefield(array.array("H", self.data))
def pack(self):
return array.array("H", self.data).tostring()
def unpack(self, data):
self.data = data
def __getitem__(self, key):
try:
return int(self.tostring()[key])
except IndexError:
return False
def __setitem__(self, key, value):
data = self.tostring()
if len(data) < key:
data = data.ljust(key+1, "0")
data = data[:key] + str(int(value)) + data[key + 1:]
self.fromstring(data)
if __name__ == "__main__":
import os
import psutil
import time
testdata = "1" * 100 + "0" * 900 + "1" * 4000 + "0" * 4999 + "1"
meminfo = psutil.Process(os.getpid()).memory_info
for storage in [BigfilePiecefieldPacked, BigfilePiecefield]:
print "-- Testing storage: %s --" % storage
m = meminfo()[0]
s = time.time()
piecefields = {}
for i in range(10000):
piecefield = storage()
piecefield.fromstring(testdata[:i] + "0" + testdata[i + 1:])
piecefields[i] = piecefield
print "Create x10000: +%sKB in %.3fs (len: %s)" % ((meminfo()[0] - m) / 1024, time.time() - s, len(piecefields[0].data))
m = meminfo()[0]
s = time.time()
for piecefield in piecefields.values():
val = piecefield[1000]
print "Query one x10000: +%sKB in %.3fs" % ((meminfo()[0] - m) / 1024, time.time() - s)
m = meminfo()[0]
s = time.time()
for piecefield in piecefields.values():
piecefield[1000] = True
print "Change one x10000: +%sKB in %.3fs" % ((meminfo()[0] - m) / 1024, time.time() - s)
m = meminfo()[0]
s = time.time()
for piecefield in piecefields.values():
packed = piecefield.pack()
print "Pack x10000: +%sKB in %.3fs (len: %s)" % ((meminfo()[0] - m) / 1024, time.time() - s, len(packed))
m = meminfo()[0]
s = time.time()
for piecefield in piecefields.values():
piecefield.unpack(packed)
print "Unpack x10000: +%sKB in %.3fs (len: %s)" % ((meminfo()[0] - m) / 1024, time.time() - s, len(piecefields[0].data))
piecefields = {}

View file

@@ -1,769 +0,0 @@
import time
import os
import subprocess
import shutil
import collections
import math
import json
import msgpack
import gevent
import gevent.lock
from Plugin import PluginManager
from Debug import Debug
from Crypt import CryptHash
from lib import merkletools
from util import helper
import util
from BigfilePiecefield import BigfilePiecefield, BigfilePiecefieldPacked
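# The Bigfile plugin splits large optional files into 1MB pieces, hashes each
# piece with truncated sha512 into a merkle tree, and stores the piece hashes
# in a <filename>.piecemap.msgpack file next to the bigfile. Individual
# pieces are addressed everywhere below as "inner_path|{pos_from}-{pos_to}".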
# We can only import plugin host classes after the plugins are loaded
@PluginManager.afterLoad
def importPluginnedClasses():
global VerifyError, config
from Content.ContentManager import VerifyError
from Config import config
if "upload_nonces" not in locals():
upload_nonces = {}
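# Upload flow: the client first calls actionBigfileUploadInit over the
# websocket to get a one-time nonce, then POSTs the file to
# /ZeroNet-Internal/BigfileUpload?upload_nonce=..., which hashes the pieces
# while streaming to disk and registers the result in content.json.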
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
def isCorsAllowed(self, path):
if path == "/ZeroNet-Internal/BigfileUpload":
return True
else:
return super(UiRequestPlugin, self).isCorsAllowed(path)
def actionBigfileUpload(self):
nonce = self.get.get("upload_nonce")
if nonce not in upload_nonces:
return self.error403("Upload nonce error.")
upload_info = upload_nonces[nonce]
del upload_nonces[nonce]
self.sendHeader(200, "text/html", noscript=True, extra_headers={
"Access-Control-Allow-Origin": "null",
"Access-Control-Allow-Credentials": "true"
})
self.readMultipartHeaders(self.env['wsgi.input']) # Skip http headers
site = upload_info["site"]
inner_path = upload_info["inner_path"]
with site.storage.open(inner_path, "wb", create_dirs=True) as out_file:
merkle_root, piece_size, piecemap_info = site.content_manager.hashBigfile(
self.env['wsgi.input'], upload_info["size"], upload_info["piece_size"], out_file
)
if len(piecemap_info["sha512_pieces"]) == 1: # Small file, don't split
hash = piecemap_info["sha512_pieces"][0].encode("hex")
hash_id = site.content_manager.hashfield.getHashId(hash)
site.content_manager.optionalDownloaded(inner_path, hash_id, upload_info["size"], own=True)
else: # Big file
file_name = helper.getFilename(inner_path)
msgpack.pack({file_name: piecemap_info}, site.storage.open(upload_info["piecemap"], "wb"))
# Find piecemap and file relative path to content.json
file_info = site.content_manager.getFileInfo(inner_path, new_file=True)
content_inner_path_dir = helper.getDirname(file_info["content_inner_path"])
piecemap_relative_path = upload_info["piecemap"][len(content_inner_path_dir):]
file_relative_path = inner_path[len(content_inner_path_dir):]
# Add file to content.json
if site.storage.isFile(file_info["content_inner_path"]):
content = site.storage.loadJson(file_info["content_inner_path"])
else:
content = {}
if "files_optional" not in content:
content["files_optional"] = {}
content["files_optional"][file_relative_path] = {
"sha512": merkle_root,
"size": upload_info["size"],
"piecemap": piecemap_relative_path,
"piece_size": piece_size
}
merkle_root_hash_id = site.content_manager.hashfield.getHashId(merkle_root)
site.content_manager.optionalDownloaded(inner_path, merkle_root_hash_id, upload_info["size"], own=True)
site.storage.writeJson(file_info["content_inner_path"], content)
site.content_manager.contents.loadItem(file_info["content_inner_path"]) # reload cache
return json.dumps({
"merkle_root": merkle_root,
"piece_num": len(piecemap_info["sha512_pieces"]),
"piece_size": piece_size,
"inner_path": inner_path
})
def readMultipartHeaders(self, wsgi_input):
for i in range(100):
line = wsgi_input.readline()
if line == "\r\n":
break
return i
def actionFile(self, file_path, *args, **kwargs):
if kwargs.get("file_size", 0) > 1024 * 1024 and kwargs.get("path_parts"): # Only check files larger than 1MB
path_parts = kwargs["path_parts"]
site = self.server.site_manager.get(path_parts["address"])
big_file = site.storage.openBigfile(path_parts["inner_path"], prebuffer=2 * 1024 * 1024)
if big_file:
kwargs["file_obj"] = big_file
kwargs["file_size"] = big_file.size
return super(UiRequestPlugin, self).actionFile(file_path, *args, **kwargs)
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def actionBigfileUploadInit(self, to, inner_path, size):
valid_signers = self.site.content_manager.getValidSigners(inner_path)
auth_address = self.user.getAuthAddress(self.site.address)
if not self.site.settings["own"] and auth_address not in valid_signers:
self.log.error("FileWrite forbidden %s not in valid_signers %s" % (auth_address, valid_signers))
return self.response(to, {"error": "Forbidden, you can only modify your own files"})
nonce = CryptHash.random()
piece_size = 1024 * 1024
inner_path = self.site.content_manager.sanitizePath(inner_path)
file_info = self.site.content_manager.getFileInfo(inner_path, new_file=True)
content_inner_path_dir = helper.getDirname(file_info["content_inner_path"])
file_relative_path = inner_path[len(content_inner_path_dir):]
upload_nonces[nonce] = {
"added": time.time(),
"site": self.site,
"inner_path": inner_path,
"websocket_client": self,
"size": size,
"piece_size": piece_size,
"piecemap": inner_path + ".piecemap.msgpack"
}
return {
"url": "/ZeroNet-Internal/BigfileUpload?upload_nonce=" + nonce,
"piece_size": piece_size,
"inner_path": inner_path,
"file_relative_path": file_relative_path
}
def actionSiteSetAutodownloadBigfileLimit(self, to, limit):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
self.site.settings["autodownload_bigfile_size_limit"] = int(limit)
self.response(to, "ok")
def actionFileDelete(self, to, inner_path):
piecemap_inner_path = inner_path + ".piecemap.msgpack"
if self.hasFilePermission(inner_path) and self.site.storage.isFile(piecemap_inner_path):
# Also delete the .piecemap.msgpack file if it exists
self.log.debug("Deleting piecemap: %s" % piecemap_inner_path)
file_info = self.site.content_manager.getFileInfo(piecemap_inner_path)
if file_info:
content_json = self.site.storage.loadJson(file_info["content_inner_path"])
relative_path = file_info["relative_path"]
if relative_path in content_json.get("files_optional", {}):
del content_json["files_optional"][relative_path]
self.site.storage.writeJson(file_info["content_inner_path"], content_json)
self.site.content_manager.loadContent(file_info["content_inner_path"], add_bad_files=False, force=True)
try:
self.site.storage.delete(piecemap_inner_path)
except Exception as err:
self.log.error("File %s delete error: %s" % (piecemap_inner_path, err))
return super(UiWebsocketPlugin, self).actionFileDelete(to, inner_path)
@PluginManager.registerTo("ContentManager")
class ContentManagerPlugin(object):
def getFileInfo(self, inner_path, *args, **kwargs):
if "|" not in inner_path:
return super(ContentManagerPlugin, self).getFileInfo(inner_path, *args, **kwargs)
inner_path, file_range = inner_path.split("|")
pos_from, pos_to = map(int, file_range.split("-"))
file_info = super(ContentManagerPlugin, self).getFileInfo(inner_path, *args, **kwargs)
return file_info
def readFile(self, file_in, size, buff_size=1024 * 64):
part_num = 0
recv_left = size
while 1:
part_num += 1
read_size = min(buff_size, recv_left)
part = file_in.read(read_size)
if not part:
break
yield part
if part_num % 100 == 0: # Avoid blocking ZeroNet execution during upload
time.sleep(0.001)
recv_left -= read_size
if recv_left <= 0:
break
def hashBigfile(self, file_in, size, piece_size=1024 * 1024, file_out=None):
self.site.settings["has_bigfile"] = True
recv = 0
try:
piece_hash = CryptHash.sha512t()
piece_hashes = []
piece_recv = 0
mt = merkletools.MerkleTools()
mt.hash_function = CryptHash.sha512t
part = ""
for part in self.readFile(file_in, size):
if file_out:
file_out.write(part)
recv += len(part)
piece_recv += len(part)
piece_hash.update(part)
if piece_recv >= piece_size:
piece_digest = piece_hash.digest()
piece_hashes.append(piece_digest)
mt.leaves.append(piece_digest)
piece_hash = CryptHash.sha512t()
piece_recv = 0
if len(piece_hashes) % 100 == 0 or recv == size:
self.log.info("- [HASHING:%.0f%%] Pieces: %s, %.1fMB/%.1fMB" % (
float(recv) / size * 100, len(piece_hashes), recv / 1024 / 1024, size / 1024 / 1024
))
part = ""
if len(part) > 0:
piece_digest = piece_hash.digest()
piece_hashes.append(piece_digest)
mt.leaves.append(piece_digest)
except Exception as err:
raise  # Re-raise with the original traceback preserved
finally:
if file_out:
file_out.close()
mt.make_tree()
return mt.get_merkle_root(), piece_size, {
"sha512_pieces": piece_hashes
}
def hashFile(self, dir_inner_path, file_relative_path, optional=False):
inner_path = dir_inner_path + file_relative_path
file_size = self.site.storage.getSize(inner_path)
# Only care about optional files >1MB
if not optional or file_size < 1 * 1024 * 1024:
return super(ContentManagerPlugin, self).hashFile(dir_inner_path, file_relative_path, optional)
back = {}
content = self.contents.get(dir_inner_path + "content.json")
hash = None
piecemap_relative_path = None
piece_size = None
# Don't re-hash if it's already in content.json
if content and file_relative_path in content.get("files_optional", {}):
file_node = content["files_optional"][file_relative_path]
if file_node["size"] == file_size:
self.log.info("- [SAME SIZE] %s" % file_relative_path)
hash = file_node.get("sha512")
piecemap_relative_path = file_node.get("piecemap")
piece_size = file_node.get("piece_size")
if not hash or not piecemap_relative_path: # Not in content.json yet
if file_size < 5 * 1024 * 1024: # Don't create piecemap automatically for files smaller than 5MB
return super(ContentManagerPlugin, self).hashFile(dir_inner_path, file_relative_path, optional)
self.log.info("- [HASHING] %s" % file_relative_path)
merkle_root, piece_size, piecemap_info = self.hashBigfile(self.site.storage.open(inner_path, "rb"), file_size)
if not hash:
hash = merkle_root
if not piecemap_relative_path:
file_name = helper.getFilename(file_relative_path)
piecemap_relative_path = file_relative_path + ".piecemap.msgpack"
piecemap_inner_path = inner_path + ".piecemap.msgpack"
msgpack.pack({file_name: piecemap_info}, self.site.storage.open(piecemap_inner_path, "wb"))
back.update(super(ContentManagerPlugin, self).hashFile(dir_inner_path, piecemap_relative_path, optional=True))
piece_num = int(math.ceil(float(file_size) / piece_size))
# Add the merkle root to hashfield
hash_id = self.site.content_manager.hashfield.getHashId(hash)
self.optionalDownloaded(inner_path, hash_id, file_size, own=True)
self.site.storage.piecefields[hash].fromstring("1" * piece_num)
back[file_relative_path] = {"sha512": hash, "size": file_size, "piecemap": piecemap_relative_path, "piece_size": piece_size}
return back
def getPiecemap(self, inner_path):
file_info = self.site.content_manager.getFileInfo(inner_path)
piecemap_inner_path = helper.getDirname(file_info["content_inner_path"]) + file_info["piecemap"]
self.site.needFile(piecemap_inner_path, priority=20)
piecemap = msgpack.unpack(self.site.storage.open(piecemap_inner_path))[helper.getFilename(inner_path)]
piecemap["piece_size"] = file_info["piece_size"]
return piecemap
def verifyPiece(self, inner_path, pos, piece):
piecemap = self.getPiecemap(inner_path)
piece_i = pos / piecemap["piece_size"]
if CryptHash.sha512sum(piece, format="digest") != piecemap["sha512_pieces"][piece_i]:
raise VerifyError("Invalid hash")
return True
def verifyFile(self, inner_path, file, ignore_same=True):
if "|" not in inner_path:
return super(ContentManagerPlugin, self).verifyFile(inner_path, file, ignore_same)
inner_path, file_range = inner_path.split("|")
pos_from, pos_to = map(int, file_range.split("-"))
return self.verifyPiece(inner_path, pos_from, file)
def optionalDownloaded(self, inner_path, hash_id, size=None, own=False):
if "|" in inner_path:
inner_path, file_range = inner_path.split("|")
pos_from, pos_to = map(int, file_range.split("-"))
file_info = self.getFileInfo(inner_path)
# Mark piece downloaded
piece_i = pos_from / file_info["piece_size"]
self.site.storage.piecefields[file_info["sha512"]][piece_i] = True
# Only add to site size on first request
if hash_id in self.hashfield:
size = 0
elif size > 1024 * 1024:
file_info = self.getFileInfo(inner_path)
if file_info and "sha512" in file_info: # We already have the file, but not in piecefield
sha512 = file_info["sha512"]
if sha512 not in self.site.storage.piecefields:
self.site.storage.checkBigfile(inner_path)
return super(ContentManagerPlugin, self).optionalDownloaded(inner_path, hash_id, size, own)
def optionalRemoved(self, inner_path, hash_id, size=None):
if size and size > 1024 * 1024:
file_info = self.getFileInfo(inner_path)
sha512 = file_info["sha512"]
if sha512 in self.site.storage.piecefields:
del self.site.storage.piecefields[sha512]
# Also remove other pieces of the file from download queue
for key in self.site.bad_files.keys():
if key.startswith(inner_path + "|"):
del self.site.bad_files[key]
self.site.worker_manager.removeSolvedFileTasks()
return super(ContentManagerPlugin, self).optionalRemoved(inner_path, hash_id, size)
@PluginManager.registerTo("SiteStorage")
class SiteStoragePlugin(object):
def __init__(self, *args, **kwargs):
super(SiteStoragePlugin, self).__init__(*args, **kwargs)
self.piecefields = collections.defaultdict(BigfilePiecefield)
if "piecefields" in self.site.settings.get("cache", {}):
for sha512, piecefield_packed in self.site.settings["cache"].get("piecefields").iteritems():
if piecefield_packed:
self.piecefields[sha512].unpack(piecefield_packed.decode("base64"))
self.site.settings["cache"]["piecefields"] = {}
def createSparseFile(self, inner_path, size, sha512=None):
file_path = self.getPath(inner_path)
file_dir = os.path.dirname(file_path)
if not os.path.isdir(file_dir):
os.makedirs(file_dir)
f = open(file_path, 'wb')
f.truncate(min(1024 * 1024 * 5, size)) # Only pre-allocate up to 5MB
f.close()
if os.name == "nt":
startupinfo = subprocess.STARTUPINFO()
startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
subprocess.call(["fsutil", "sparse", "setflag", file_path], close_fds=True, startupinfo=startupinfo)
if sha512 and sha512 in self.piecefields:
self.log.debug("%s: File not exists, but has piecefield. Deleting piecefield." % inner_path)
del self.piecefields[sha512]
def write(self, inner_path, content):
if "|" not in inner_path:
return super(SiteStoragePlugin, self).write(inner_path, content)
# Write to a specific position by passing |{pos_from}-{pos_to} after the filename
inner_path, file_range = inner_path.split("|")
pos_from, pos_to = map(int, file_range.split("-"))
file_path = self.getPath(inner_path)
# Create the directory if it doesn't exist
file_dir = os.path.dirname(file_path)
if not os.path.isdir(file_dir):
os.makedirs(file_dir)
if not os.path.isfile(file_path):
file_info = self.site.content_manager.getFileInfo(inner_path)
self.createSparseFile(inner_path, file_info["size"])
# Write file
with open(file_path, "rb+") as file:
file.seek(pos_from)
if hasattr(content, 'read'): # File-like object
shutil.copyfileobj(content, file) # Write buff to disk
else: # Simple string
file.write(content)
del content
self.onUpdated(inner_path)
def checkBigfile(self, inner_path):
file_info = self.site.content_manager.getFileInfo(inner_path)
if not file_info or "piecemap" not in file_info:  # It's not a big file
return False
self.site.settings["has_bigfile"] = True
file_path = self.getPath(inner_path)
sha512 = file_info["sha512"]
piece_num = int(math.ceil(float(file_info["size"]) / file_info["piece_size"]))
if os.path.isfile(file_path):
if sha512 not in self.piecefields:
if open(file_path).read(128) == "\0" * 128:
piece_data = "0"
else:
piece_data = "1"
self.log.debug("%s: File exists, but not in piecefield. Filling piecefiled with %s * %s." % (inner_path, piece_num, piece_data))
self.piecefields[sha512].fromstring(piece_data * piece_num)
else:
self.log.debug("Creating bigfile: %s" % inner_path)
self.createSparseFile(inner_path, file_info["size"], sha512)
self.piecefields[sha512].fromstring("0" * piece_num)
return True
def openBigfile(self, inner_path, prebuffer=0):
if not self.checkBigfile(inner_path):
return False
self.site.needFile(inner_path, blocking=False) # Download piecemap
return BigFile(self.site, inner_path, prebuffer=prebuffer)
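# BigFile is a file-like wrapper over the sparse file on disk: read() first
# queues needFile() tasks for every missing piece in the requested range
# (plus an optional prebuffer window ahead of it), waits for them, then reads
# from disk. A hypothetical usage sketch:
#   with site.storage.openBigfile("data/video.mp4") as f:  # path is illustrative
#       f.seek(5 * 1024 * 1024)
#       data = f.read(1024)  # Downloads only the piece covering this range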
class BigFile(object):
def __init__(self, site, inner_path, prebuffer=0):
self.site = site
self.inner_path = inner_path
file_path = site.storage.getPath(inner_path)
file_info = self.site.content_manager.getFileInfo(inner_path)
self.piece_size = file_info["piece_size"]
self.sha512 = file_info["sha512"]
self.size = file_info["size"]
self.prebuffer = prebuffer
self.read_bytes = 0
self.piecefield = self.site.storage.piecefields[self.sha512]
self.f = open(file_path, "rb+")
self.read_lock = gevent.lock.Semaphore()
def read(self, buff=64 * 1024):
with self.read_lock:
pos = self.f.tell()
read_until = min(self.size, pos + buff)
requests = []
# Request all required blocks
while 1:
piece_i = pos / self.piece_size
if piece_i * self.piece_size >= read_until:
break
pos_from = piece_i * self.piece_size
pos_to = pos_from + self.piece_size
if not self.piecefield[piece_i]:
requests.append(self.site.needFile("%s|%s-%s" % (self.inner_path, pos_from, pos_to), blocking=False, update=True, priority=10))
pos += self.piece_size
if not all(requests):
return None
# Request prebuffer
if self.prebuffer:
prebuffer_until = min(self.size, read_until + self.prebuffer)
priority = 3
while 1:
piece_i = pos / self.piece_size
if piece_i * self.piece_size >= prebuffer_until:
break
pos_from = piece_i * self.piece_size
pos_to = pos_from + self.piece_size
if not self.piecefield[piece_i]:
self.site.needFile("%s|%s-%s" % (self.inner_path, pos_from, pos_to), blocking=False, update=True, priority=max(0, priority))
priority -= 1
pos += self.piece_size
gevent.joinall(requests)
self.read_bytes += buff
# Increase buffer for long reads
if self.read_bytes > 7 * 1024 * 1024 and self.prebuffer < 5 * 1024 * 1024:
self.site.log.debug("%s: Increasing bigfile buffer size to 5MB..." % self.inner_path)
self.prebuffer = 5 * 1024 * 1024
return self.f.read(buff)
def seek(self, pos, whence=0):
with self.read_lock:
if whence == 2: # Relative from file end
pos = self.size + pos # Use the real size instead of size on the disk
whence = 0
return self.f.seek(pos, whence)
def tell(self):
return self.f.tell()
def close(self):
self.f.close()
def __enter__(self):
return self
def __exit__(self, exc_type, exc_val, exc_tb):
self.close()
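# Task routing: for a bigfile, the piecemap is fetched first (priority 30);
# a plain inner_path is auto-expanded to "inner_path|all" only when the file
# is under autodownload_bigfile_size_limit, and "|all" fans out into one
# ranged task per missing piece (see SitePlugin.needFile below).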
@PluginManager.registerTo("WorkerManager")
class WorkerManagerPlugin(object):
def addTask(self, inner_path, *args, **kwargs):
file_info = kwargs.get("file_info")
if file_info and "piecemap" in file_info: # Bigfile
self.site.settings["has_bigfile"] = True
piecemap_inner_path = helper.getDirname(file_info["content_inner_path"]) + file_info["piecemap"]
piecemap_task = None
if not self.site.storage.isFile(piecemap_inner_path):
# Start download piecemap
piecemap_task = super(WorkerManagerPlugin, self).addTask(piecemap_inner_path, priority=30)
autodownload_bigfile_size_limit = self.site.settings.get("autodownload_bigfile_size_limit", config.autodownload_bigfile_size_limit)
if "|" not in inner_path and self.site.isDownloadable(inner_path) and file_info["size"] / 1024 / 1024 <= autodownload_bigfile_size_limit:
gevent.spawn_later(0.1, self.site.needFile, inner_path + "|all") # Download all pieces
if "|" in inner_path:
# Start download piece
task = super(WorkerManagerPlugin, self).addTask(inner_path, *args, **kwargs)
inner_path, file_range = inner_path.split("|")
pos_from, pos_to = map(int, file_range.split("-"))
task["piece_i"] = pos_from / file_info["piece_size"]
task["sha512"] = file_info["sha512"]
else:
if inner_path in self.site.bad_files:
del self.site.bad_files[inner_path]
if piecemap_task:
task = piecemap_task
else:
fake_evt = gevent.event.AsyncResult() # Don't download anything if no range specified
fake_evt.set(True)
task = {"evt": fake_evt}
if not self.site.storage.isFile(inner_path):
self.site.storage.createSparseFile(inner_path, file_info["size"], file_info["sha512"])
piece_num = int(math.ceil(float(file_info["size"]) / file_info["piece_size"]))
self.site.storage.piecefields[file_info["sha512"]].fromstring("0" * piece_num)
else:
task = super(WorkerManagerPlugin, self).addTask(inner_path, *args, **kwargs)
return task
def taskAddPeer(self, task, peer):
    if "piece_i" in task:
        # Test membership first: indexing the defaultdict below would create an empty entry
        peer_has_piecefield = task["sha512"] in peer.piecefields
        if not peer.piecefields[task["sha512"]][task["piece_i"]]:
            if not peer_has_piecefield:
                gevent.spawn(peer.updatePiecefields, force=True)
            elif not task["peers"]:
                gevent.spawn(peer.updatePiecefields)
            return False  # Deny adding the peer to the task if the piece is not in its piecefield
    return super(WorkerManagerPlugin, self).taskAddPeer(task, peer)
@PluginManager.registerTo("FileRequest")
class FileRequestPlugin(object):
def isReadable(self, site, inner_path, file, pos):
# Peek into file
if file.read(10) == "\0" * 10:
# Looks empty, but make sure we don't actually have that piece
file_info = site.content_manager.getFileInfo(inner_path)
if "piece_size" in file_info:
piece_i = pos / file_info["piece_size"]
if not site.storage.piecefields[file_info["sha512"]][piece_i]:
return False
# Seek back to position we want to read
file.seek(pos)
return super(FileRequestPlugin, self).isReadable(site, inner_path, file, pos)
def actionGetPiecefields(self, params):
site = self.sites.get(params["site"])
if not site or not site.settings["serving"]: # Site unknown or not serving
self.response({"error": "Unknown site"})
return False
# Add peer to site if not added before
peer = site.addPeer(self.connection.ip, self.connection.port, return_peer=True)
if not peer.connection: # Just added
peer.connect(self.connection) # Assign current connection to peer
piecefields_packed = {sha512: piecefield.pack() for sha512, piecefield in site.storage.piecefields.iteritems()}
self.response({"piecefields_packed": piecefields_packed})
def actionSetPiecefields(self, params):
site = self.sites.get(params["site"])
if not site or not site.settings["serving"]: # Site unknown or not serving
self.response({"error": "Unknown site"})
self.connection.badAction(5)
return False
# Add or get peer
peer = site.addPeer(self.connection.ip, self.connection.port, return_peer=True, connection=self.connection)
if not peer.connection:
peer.connect(self.connection)
peer.piecefields = collections.defaultdict(BigfilePiecefieldPacked)
for sha512, piecefield_packed in params["piecefields_packed"].iteritems():
peer.piecefields[sha512].unpack(piecefield_packed)
site.settings["has_bigfile"] = True
self.response({"ok": "Updated"})
@PluginManager.registerTo("Peer")
class PeerPlugin(object):
def __getattr__(self, key):
if key == "piecefields":
self.piecefields = collections.defaultdict(BigfilePiecefieldPacked)
return self.piecefields
elif key == "time_piecefields_updated":
self.time_piecefields_updated = None
return self.time_piecefields_updated
else:
return super(PeerPlugin, self).__getattr__(key)
@util.Noparallel(ignore_args=True)
def updatePiecefields(self, force=False):
if self.connection and self.connection.handshake.get("rev", 0) < 2190:
return False # Not supported
# Don't update the piecefield again within 1 minute
if self.time_piecefields_updated and time.time() - self.time_piecefields_updated < 60 and not force:
return False
self.time_piecefields_updated = time.time()
res = self.request("getPiecefields", {"site": self.site.address})
if not res or "error" in res:
return False
self.piecefields = collections.defaultdict(BigfilePiecefieldPacked)
try:
for sha512, piecefield_packed in res["piecefields_packed"].iteritems():
self.piecefields[sha512].unpack(piecefield_packed)
except Exception as err:
self.log("Invalid updatePiecefields response: %s" % Debug.formatException(err))
return self.piecefields
def sendMyHashfield(self, *args, **kwargs):
return super(PeerPlugin, self).sendMyHashfield(*args, **kwargs)
def updateHashfield(self, *args, **kwargs):
if self.site.settings.get("has_bigfile"):
thread = gevent.spawn(self.updatePiecefields, *args, **kwargs)
back = super(PeerPlugin, self).updateHashfield(*args, **kwargs)
thread.join()
return back
else:
return super(PeerPlugin, self).updateHashfield(*args, **kwargs)
def getFile(self, site, inner_path, *args, **kwargs):
if "|" in inner_path:
inner_path, file_range = inner_path.split("|")
pos_from, pos_to = map(int, file_range.split("-"))
kwargs["pos_from"] = pos_from
kwargs["pos_to"] = pos_to
return super(PeerPlugin, self).getFile(site, inner_path, *args, **kwargs)
@PluginManager.registerTo("Site")
class SitePlugin(object):
def isFileDownloadAllowed(self, inner_path, file_info):
if "piecemap" in file_info:
file_size_mb = file_info["size"] / 1024 / 1024
if config.bigfile_size_limit and file_size_mb > config.bigfile_size_limit:
self.log.debug(
"Bigfile size %s too large: %sMB > %sMB, skipping..." %
(inner_path, file_size_mb, config.bigfile_size_limit)
)
return False
file_info = file_info.copy()
file_info["size"] = file_info["piece_size"]
return super(SitePlugin, self).isFileDownloadAllowed(inner_path, file_info)
def getSettingsCache(self):
back = super(SitePlugin, self).getSettingsCache()
if self.storage.piecefields:
back["piecefields"] = {sha512: piecefield.pack().encode("base64") for sha512, piecefield in self.storage.piecefields.iteritems()}
return back
def needFile(self, inner_path, *args, **kwargs):
if inner_path.endswith("|all"):
@util.Pooled(20)
def pooledNeedBigfile(inner_path, *args, **kwargs):
if inner_path not in self.bad_files:
self.log.debug("Cancelled piece, skipping %s" % inner_path)
return False
return self.needFile(inner_path, *args, **kwargs)
inner_path = inner_path.replace("|all", "")
file_info = self.needFileInfo(inner_path)
file_size = file_info["size"]
piece_size = file_info["piece_size"]
piece_num = int(math.ceil(float(file_size) / piece_size))
file_threads = []
piecefield = self.storage.piecefields.get(file_info["sha512"])
for piece_i in range(piece_num):
piece_from = piece_i * piece_size
piece_to = min(file_size, piece_from + piece_size)
if not piecefield or not piecefield[piece_i]:
inner_path_piece = "%s|%s-%s" % (inner_path, piece_from, piece_to)
self.bad_files[inner_path_piece] = self.bad_files.get(inner_path_piece, 1)
res = pooledNeedBigfile(inner_path_piece, blocking=False)
if res is not True and res is not False:
file_threads.append(res)
gevent.joinall(file_threads)
else:
return super(SitePlugin, self).needFile(inner_path, *args, **kwargs)
@PluginManager.registerTo("ConfigPlugin")
class ConfigPlugin(object):
def createArguments(self):
group = self.parser.add_argument_group("Bigfile plugin")
group.add_argument('--autodownload_bigfile_size_limit', help='Also download bigfiles smaller than this limit if help distribute option is checked', default=1, metavar="MB", type=int)
group.add_argument('--bigfile_size_limit', help='Maximum size of downloaded big files', default=False, metavar="MB", type=int)
return super(ConfigPlugin, self).createArguments()

View file

@@ -1,522 +0,0 @@
import time
from cStringIO import StringIO
import pytest
import msgpack
import mock
from Connection import ConnectionServer
from Content.ContentManager import VerifyError
from File import FileServer
from File import FileRequest
from Worker import WorkerManager
from Peer import Peer
from Bigfile import BigfilePiecefield, BigfilePiecefieldPacked
from Test import Spy
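# createBigfile() writes 1000 records of a 10,000-byte "TestN---..." pattern
# (~10MB total, 10 x 1MB pieces), so block contents are predictable: e.g. the
# piece starting at the 5MB offset begins with "Test524".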
@pytest.mark.usefixtures("resetSettings")
@pytest.mark.usefixtures("resetTempSettings")
class TestBigfile:
privatekey = "5KUh3PvNm5HUWoCfSUfcYvfQ2g3PrRNJWr6Q9eqdBGu23mtMntv"
def createBigfile(self, site, inner_path="data/optional.any.iso", pieces=10):
f = site.storage.open(inner_path, "w")
for i in range(pieces * 100):
f.write(("Test%s" % i).ljust(10, "-") * 1000)
f.close()
assert site.content_manager.sign("content.json", self.privatekey)
return inner_path
def testPiecemapCreate(self, site):
inner_path = self.createBigfile(site)
content = site.storage.loadJson("content.json")
assert "data/optional.any.iso" in content["files_optional"]
file_node = content["files_optional"][inner_path]
assert file_node["size"] == 10 * 1000 * 1000
assert file_node["sha512"] == "47a72cde3be80b4a829e7674f72b7c6878cf6a70b0c58c6aa6c17d7e9948daf6"
assert file_node["piecemap"] == inner_path + ".piecemap.msgpack"
piecemap = msgpack.unpack(site.storage.open(file_node["piecemap"], "rb"))["optional.any.iso"]
assert len(piecemap["sha512_pieces"]) == 10
assert piecemap["sha512_pieces"][0] != piecemap["sha512_pieces"][1]
assert piecemap["sha512_pieces"][0].encode("hex") == "a73abad9992b3d0b672d0c2a292046695d31bebdcb1e150c8410bbe7c972eff3"
def testVerifyPiece(self, site):
inner_path = self.createBigfile(site)
# Verify all 10 pieces
f = site.storage.open(inner_path, "rb")
for i in range(10):
piece = StringIO(f.read(1024 * 1024))
piece.seek(0)
site.content_manager.verifyPiece(inner_path, i * 1024 * 1024, piece)
f.close()
# Try to verify piece 0 with piece 1 hash
with pytest.raises(VerifyError) as err:
i = 1
f = site.storage.open(inner_path, "rb")
piece = StringIO(f.read(1024 * 1024))
f.close()
site.content_manager.verifyPiece(inner_path, i * 1024 * 1024, piece)
assert "Invalid hash" in str(err)
def testSparseFile(self, site):
inner_path = "sparsefile"
# Create a 100MB sparse file
site.storage.createSparseFile(inner_path, 100 * 1024 * 1024)
# Write to file beginning
s = time.time()
f = site.storage.write("%s|%s-%s" % (inner_path, 0, 1024 * 1024), "hellostart" * 1024)
time_write_start = time.time() - s
# Write to file end
s = time.time()
f = site.storage.write("%s|%s-%s" % (inner_path, 99 * 1024 * 1024, 99 * 1024 * 1024 + 1024 * 1024), "helloend" * 1024)
time_write_end = time.time() - s
# Verify writes
f = site.storage.open(inner_path)
assert f.read(10) == "hellostart"
f.seek(99 * 1024 * 1024)
assert f.read(8) == "helloend"
f.close()
site.storage.delete(inner_path)
# Writing to the end should not take much longer than writing to the start
assert time_write_end <= max(0.1, time_write_start * 1.1)
def testRangedFileRequest(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
file_server.sites[site.address] = site
client = FileServer(file_server.ip, 1545)
client.sites[site_temp.address] = site_temp
site_temp.connection_server = client
connection = client.getConnection(file_server.ip, 1544)
# Add file_server as peer to client
peer_file_server = site_temp.addPeer(file_server.ip, 1544)
buff = peer_file_server.getFile(site_temp.address, "%s|%s-%s" % (inner_path, 5 * 1024 * 1024, 6 * 1024 * 1024))
assert len(buff.getvalue()) == 1 * 1024 * 1024 # Correct block size
assert buff.getvalue().startswith("Test524") # Correct data
buff.seek(0)
assert site.content_manager.verifyPiece(inner_path, 5 * 1024 * 1024, buff) # Correct hash
connection.close()
client.stop()
def testRangedFileDownload(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Make sure the file and the piecemap are in the optional hashfield
file_info = site.content_manager.getFileInfo(inner_path)
assert site.content_manager.hashfield.hasHash(file_info["sha512"])
piecemap_hash = site.content_manager.getFileInfo(file_info["piecemap"])["sha512"]
assert site.content_manager.hashfield.hasHash(piecemap_hash)
# Init client server
client = ConnectionServer(file_server.ip, 1545)
site_temp.connection_server = client
peer_client = site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
bad_files = site_temp.storage.verifyFiles(quick_check=True)["bad_files"]
assert not bad_files
# client_piecefield = peer_client.piecefields[file_info["sha512"]].tostring()
# assert client_piecefield == "1" * 10
# Download the blocks at the 5MB and 9MB offsets
site_temp.needFile("%s|%s-%s" % (inner_path, 5 * 1024 * 1024, 6 * 1024 * 1024))
site_temp.needFile("%s|%s-%s" % (inner_path, 9 * 1024 * 1024, 10 * 1024 * 1024))
# Verify the first block was not downloaded
f = site_temp.storage.open(inner_path)
assert f.read(10) == "\0" * 10
# Verify the two requested blocks were downloaded
f.seek(5 * 1024 * 1024)
assert f.read(7) == "Test524"
f.seek(9 * 1024 * 1024)
assert f.read(7) == "943---T"
# Verify hashfield
assert set(site_temp.content_manager.hashfield) == set([18343, 30970]) # 18343: data/optional.any.iso, 30970: data/optional.any.iso.hashmap.msgpack
def testOpenBigfile(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
client = ConnectionServer(file_server.ip, 1545)
site_temp.connection_server = client
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
# Open virtual file
assert not site_temp.storage.isFile(inner_path)
with site_temp.storage.openBigfile(inner_path) as f:
with Spy.Spy(FileRequest, "route") as requests:
f.seek(5 * 1024 * 1024)
assert f.read(7) == "Test524"
f.seek(9 * 1024 * 1024)
assert f.read(7) == "943---T"
assert len(requests) == 4  # 1x piecemap + 1x getpiecefield + 2x for pieces
assert set(site_temp.content_manager.hashfield) == set([18343, 30970])
assert site_temp.storage.piecefields[f.sha512].tostring() == "0000010001"
assert f.sha512 in site_temp.getSettingsCache()["piecefields"]
# Test requesting already downloaded
with Spy.Spy(FileRequest, "route") as requests:
f.seek(5 * 1024 * 1024)
assert f.read(7) == "Test524"
assert len(requests) == 0
# Test requesting multi-block overflow reads
with Spy.Spy(FileRequest, "route") as requests:
f.seek(5 * 1024 * 1024) # We already have this block
data = f.read(1024 * 1024 * 3)  # Our read overflows into the next two blocks
assert data.startswith("Test524")
assert data.endswith("Test838-")
assert "\0" not in data # No null bytes allowed
assert len(requests) == 2 # Two block download
# Test out of range request
f.seek(5 * 1024 * 1024)
data = f.read(1024 * 1024 * 30)
assert len(data) == 10 * 1000 * 1000 - (5 * 1024 * 1024)
f.seek(30 * 1024 * 1024)
data = f.read(1024 * 1024 * 30)
assert len(data) == 0
@pytest.mark.parametrize("piecefield_obj", [BigfilePiecefield, BigfilePiecefieldPacked])
def testPiecefield(self, piecefield_obj, site):
testdatas = [
"1" * 100 + "0" * 900 + "1" * 4000 + "0" * 4999 + "1",
"010101" * 10 + "01" * 90 + "10" * 400 + "0" * 4999,
"1" * 10000,
"0" * 10000
]
for testdata in testdatas:
piecefield = piecefield_obj()
piecefield.fromstring(testdata)
assert piecefield.tostring() == testdata
assert piecefield[0] == int(testdata[0])
assert piecefield[100] == int(testdata[100])
assert piecefield[1000] == int(testdata[1000])
assert piecefield[len(testdata) - 1] == int(testdata[len(testdata) - 1])
packed = piecefield.pack()
piecefield_new = piecefield_obj()
piecefield_new.unpack(packed)
assert piecefield.tostring() == piecefield_new.tostring()
assert piecefield_new.tostring() == testdata
def testFileGet(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
site_temp.connection_server = FileServer(file_server.ip, 1545)
site_temp.connection_server.sites[site_temp.address] = site_temp
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
# Download second block
with site_temp.storage.openBigfile(inner_path) as f:
f.seek(1024 * 1024)
assert f.read(1024)[0] != "\0"
# Make sure the first block was not downloaded
with site_temp.storage.open(inner_path) as f:
assert f.read(1024)[0] == "\0"
peer2 = site.addPeer(file_server.ip, 1545, return_peer=True)
# Should fail for the first block request (we don't have it)
assert not peer2.getFile(site.address, "%s|0-%s" % (inner_path, 1024 * 1024 * 1))
# Should succeed for the second block request (already downloaded)
assert peer2.getFile(site.address, "%s|%s-%s" % (inner_path, 1024 * 1024 * 1, 1024 * 1024 * 2))
def benchmarkPeerMemory(self, site, file_server):
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
import psutil, os
meminfo = psutil.Process(os.getpid()).memory_info
mem_s = meminfo()[0]
s = time.time()
for i in range(25000):
site.addPeer(file_server.ip, i)
print "%.3fs MEM: + %sKB" % (time.time() - s, (meminfo()[0] - mem_s) / 1024) # 0.082s MEM: + 6800KB
print site.peers.values()[0].piecefields
def testUpdatePiecefield(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
server1 = file_server
server1.sites[site.address] = site
server2 = FileServer(file_server.ip, 1545)
server2.sites[site_temp.address] = site_temp
site_temp.connection_server = server2
# Add file_server as peer to client
server2_peer1 = site_temp.addPeer(file_server.ip, 1544)
# Testing piecefield sync
assert len(server2_peer1.piecefields) == 0
assert server2_peer1.updatePiecefields() # Query piecefields from peer
assert len(server2_peer1.piecefields) > 0
def testWorkerManagerPiecefieldDeny(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
server1 = file_server
server1.sites[site.address] = site
server2 = FileServer(file_server.ip, 1545)
server2.sites[site_temp.address] = site_temp
site_temp.connection_server = server2
# Add file_server as peer to client
server2_peer1 = site_temp.addPeer(file_server.ip, 1544) # Working
site_temp.downloadContent("content.json", download_files=False)
site_temp.needFile("data/optional.any.iso.piecemap.msgpack")
# Add fake peers with optional files downloaded
for i in range(5):
fake_peer = site_temp.addPeer("127.0.1.%s" % i, 1544)
fake_peer.hashfield = site.content_manager.hashfield
fake_peer.has_hashfield = True
with Spy.Spy(WorkerManager, "addWorker") as requests:
site_temp.needFile("%s|%s-%s" % (inner_path, 5 * 1024 * 1024, 6 * 1024 * 1024))
site_temp.needFile("%s|%s-%s" % (inner_path, 6 * 1024 * 1024, 7 * 1024 * 1024))
# It should only request parts from peer1, as the other peers do not have the requested parts in their piecefields
assert len([request[1] for request in requests if request[1] != server2_peer1]) == 0
def testWorkerManagerPiecefieldDownload(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
server1 = file_server
server1.sites[site.address] = site
server2 = FileServer(file_server.ip, 1545)
server2.sites[site_temp.address] = site_temp
site_temp.connection_server = server2
sha512 = site.content_manager.getFileInfo(inner_path)["sha512"]
# Create 10 fake peers, one for each piece
for i in range(10):
peer = Peer(file_server.ip, 1544, site_temp, server2)
peer.piecefields[sha512][i] = "1"
peer.updateHashfield = mock.MagicMock(return_value=False)
peer.updatePiecefields = mock.MagicMock(return_value=False)
peer.findHashIds = mock.MagicMock(return_value={"nope": []})
peer.hashfield = site.content_manager.hashfield
peer.has_hashfield = True
peer.key = "Peer:%s" % i
site_temp.peers["Peer:%s" % i] = peer
site_temp.downloadContent("content.json", download_files=False)
site_temp.needFile("data/optional.any.iso.piecemap.msgpack")
with Spy.Spy(Peer, "getFile") as requests:
for i in range(10):
site_temp.needFile("%s|%s-%s" % (inner_path, i * 1024 * 1024, (i + 1) * 1024 * 1024))
assert len(requests) == 10
for i in range(10):
assert requests[i][0] == site_temp.peers["Peer:%s" % i]  # Every piece should be requested from the peer that owns it
def testDownloadStats(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
client = ConnectionServer(file_server.ip, 1545)
site_temp.connection_server = client
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
# Open virtual file
assert not site_temp.storage.isFile(inner_path)
# Check size before downloads
assert site_temp.settings["size"] < 10 * 1024 * 1024
assert site_temp.settings["optional_downloaded"] == 0
size_piecemap = site_temp.content_manager.getFileInfo(inner_path + ".piecemap.msgpack")["size"]
size_bigfile = site_temp.content_manager.getFileInfo(inner_path)["size"]
with site_temp.storage.openBigfile(inner_path) as f:
assert "\0" not in f.read(1024)
assert site_temp.settings["optional_downloaded"] == size_piecemap + size_bigfile
with site_temp.storage.openBigfile(inner_path) as f:
# Don't count twice
assert "\0" not in f.read(1024)
assert site_temp.settings["optional_downloaded"] == size_piecemap + size_bigfile
# Add second block
assert "\0" not in f.read(1024 * 1024)
assert site_temp.settings["optional_downloaded"] == size_piecemap + size_bigfile
def testPrebuffer(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
client = ConnectionServer(file_server.ip, 1545)
site_temp.connection_server = client
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
# Open virtual file
assert not site_temp.storage.isFile(inner_path)
with site_temp.storage.openBigfile(inner_path, prebuffer=1024 * 1024 * 2) as f:
with Spy.Spy(FileRequest, "route") as requests:
f.seek(5 * 1024 * 1024)
assert f.read(7) == "Test524"
# assert len(requests) == 3 # 1x piecemap + 1x getpiecefield + 1x for pieces
assert len([task for task in site_temp.worker_manager.tasks if task["inner_path"].startswith(inner_path)]) == 2
time.sleep(0.5)  # Wait for the prebuffer download
sha512 = site.content_manager.getFileInfo(inner_path)["sha512"]
assert site_temp.storage.piecefields[sha512].tostring() == "0000011100"
# No prebuffering beyond the end of the file
f.seek(9 * 1024 * 1024)
assert "\0" not in f.read(7)
assert len([task for task in site_temp.worker_manager.tasks if task["inner_path"].startswith(inner_path)]) == 0
def testDownloadAllPieces(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
client = ConnectionServer(file_server.ip, 1545)
site_temp.connection_server = client
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
# Open virtual file
assert not site_temp.storage.isFile(inner_path)
with Spy.Spy(FileRequest, "route") as requests:
site_temp.needFile("%s|all" % inner_path)
assert len(requests) == 12 # piecemap.msgpack, getPiecefields, 10 x piece
# Don't re-download pieces we already have
with Spy.Spy(FileRequest, "route") as requests:
site_temp.needFile("%s|all" % inner_path)
assert len(requests) == 0
def testFileSize(self, file_server, site, site_temp):
inner_path = self.createBigfile(site)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
client = ConnectionServer(file_server.ip, 1545)
site_temp.connection_server = client
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
# Open virtual file
assert not site_temp.storage.isFile(inner_path)
# Download first block
site_temp.needFile("%s|%s-%s" % (inner_path, 0 * 1024 * 1024, 1 * 1024 * 1024))
assert site_temp.storage.getSize(inner_path) < 1000 * 1000 * 10 # Size on the disk should be smaller than the real size
site_temp.needFile("%s|%s-%s" % (inner_path, 9 * 1024 * 1024, 10 * 1024 * 1024))
assert site_temp.storage.getSize(inner_path) == site.storage.getSize(inner_path)
@pytest.mark.parametrize("size", [1024 * 3, 1024 * 1024 * 3, 1024 * 1024 * 30])
def testNullFileRead(self, file_server, site, site_temp, size):
inner_path = "data/optional.iso"
f = site.storage.open(inner_path, "w")
f.write("\0" * size)
f.close()
assert site.content_manager.sign("content.json", self.privatekey)
# Init source server
site.connection_server = file_server
file_server.sites[site.address] = site
# Init client server
site_temp.connection_server = FileServer(file_server.ip, 1545)
site_temp.connection_server.sites[site_temp.address] = site_temp
site_temp.addPeer(file_server.ip, 1544)
# Download site
site_temp.download(blind_includes=True).join(timeout=5)
if "piecemap" in site.content_manager.getFileInfo(inner_path): # Bigfile
site_temp.needFile(inner_path + "|all")
else:
site_temp.needFile(inner_path)
assert site_temp.storage.getSize(inner_path) == size

View file

@@ -1 +0,0 @@
from src.Test.conftest import *

View file

@@ -1,5 +0,0 @@
[pytest]
python_files = Test*.py
addopts = -rsxX -v --durations=6
markers =
webtest: mark a test as a webtest.

View file

@@ -1,2 +0,0 @@
import BigfilePlugin
from BigfilePiecefield import BigfilePiecefield, BigfilePiecefieldPacked

View file

@@ -1,182 +0,0 @@
import time
import sys
import collections
import itertools
import logging
import gevent
from util import helper
from Config import config
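# ChartCollector periodically samples named collector lambdas: zero-argument
# ones for global stats and one-argument ones for per-site stats. Keys ending
# in "|change" mark cumulative counters; judging by last_values, only the
# change since the previous sample is stored.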
class ChartCollector(object):
def __init__(self, db):
self.db = db
if config.action == "main":
gevent.spawn_later(60 * 3, self.collector)
self.log = logging.getLogger("ChartCollector")
self.last_values = collections.defaultdict(dict)
def setInitialLastValues(self, sites):
# Recover the last values of per-site bytes_recv/bytes_sent
for site in sites:
self.last_values["site:" + site.address]["site_bytes_recv"] = site.settings.get("bytes_recv", 0)
self.last_values["site:" + site.address]["site_bytes_sent"] = site.settings.get("bytes_sent", 0)
def getCollectors(self):
collectors = {}
file_server = sys.modules["main"].file_server
sites = file_server.sites
if not sites:
return collectors
content_db = sites.values()[0].content_manager.contents.db
# Connection stats
collectors["connection"] = lambda: len(file_server.connections)
collectors["connection_in"] = (
lambda: len([1 for connection in file_server.connections if connection.type == "in"])
)
collectors["connection_onion"] = (
lambda: len([1 for connection in file_server.connections if connection.ip.endswith(".onion")])
)
collectors["connection_ping_avg"] = (
lambda: round(1000 * helper.avg(
[connection.last_ping_delay for connection in file_server.connections if connection.last_ping_delay]
))
)
collectors["connection_ping_min"] = (
lambda: round(1000 * min(
[connection.last_ping_delay for connection in file_server.connections if connection.last_ping_delay]
))
)
collectors["connection_rev_avg"] = (
lambda: helper.avg(
[connection.handshake["rev"] for connection in file_server.connections if connection.handshake]
)
)
# Request stats
collectors["file_bytes_recv|change"] = lambda: file_server.bytes_recv
collectors["file_bytes_sent|change"] = lambda: file_server.bytes_sent
collectors["request_num_recv|change"] = lambda: file_server.num_recv
collectors["request_num_sent|change"] = lambda: file_server.num_sent
# Limit
collectors["optional_limit"] = lambda: content_db.getOptionalLimitBytes()
collectors["optional_used"] = lambda: content_db.getOptionalUsedBytes()
collectors["optional_downloaded"] = lambda: sum([site.settings.get("optional_downloaded", 0) for site in sites.values()])
# Peers
collectors["peer"] = lambda (peers): len(peers)
collectors["peer_onion"] = lambda (peers): len([True for peer in peers if ".onion" in peer])
# Size
collectors["size"] = lambda: sum([site.settings.get("size", 0) for site in sites.values()])
collectors["size_optional"] = lambda: sum([site.settings.get("size_optional", 0) for site in sites.values()])
collectors["content"] = lambda: sum([len(site.content_manager.contents) for site in sites.values()])
return collectors
def getSiteCollectors(self):
site_collectors = {}
# Size
site_collectors["site_size"] = lambda(site): site.settings.get("size", 0)
site_collectors["site_size_optional"] = lambda(site): site.settings.get("size_optional", 0)
site_collectors["site_optional_downloaded"] = lambda(site): site.settings.get("optional_downloaded", 0)
site_collectors["site_content"] = lambda(site): len(site.content_manager.contents)
# Data transfer
site_collectors["site_bytes_recv|change"] = lambda(site): site.settings.get("bytes_recv", 0)
site_collectors["site_bytes_sent|change"] = lambda(site): site.settings.get("bytes_sent", 0)
# Peers
site_collectors["site_peer"] = lambda(site): len(site.peers)
site_collectors["site_peer_onion"] = lambda(site): len(
[True for peer in site.peers.itervalues() if peer.ip.endswith(".onion")]
)
site_collectors["site_peer_connected"] = lambda(site): len([True for peer in site.peers.itervalues() if peer.connection])
return site_collectors
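# Return the set of unique peer keys across all sites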
def getUniquePeers(self):
sites = sys.modules["main"].file_server.sites
return set(itertools.chain.from_iterable(
[site.peers.keys() for site in sites.values()]
))
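# Run the collectors and return {key: value}; keys marked "|change" store the delta since the previous run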
def collectDatas(self, collectors, last_values, site=None):
if site is None:
peers = self.getUniquePeers()
datas = {}
for key, collector in collectors.iteritems():
try:
if site:
value = collector(site)
elif key.startswith("peer"):
value = collector(peers)
else:
value = collector()
except Exception as err:
self.log.info("Collector %s error: %s" % (key, err))
value = None
if "|change" in key: # Store changes relative to last value
key = key.replace("|change", "")
last_value = last_values.get(key, 0)
last_values[key] = value
value = value - last_value
if value is None:
datas[key] = None
else:
datas[key] = round(value, 3)
return datas
def collectGlobal(self, collectors, last_values):
now = int(time.time())
s = time.time()
datas = self.collectDatas(collectors, last_values["global"])
values = []
for key, value in datas.iteritems():
values.append((self.db.getTypeId(key), value, now))
self.log.debug("Global collectors done in %.3fs" % (time.time() - s))
s = time.time()
cur = self.db.getCursor()
cur.execute("BEGIN")
cur.cursor.executemany("INSERT INTO data (type_id, value, date_added) VALUES (?, ?, ?)", values)
cur.execute("END")
cur.close()
self.log.debug("Global collectors inserted in %.3fs" % (time.time() - s))
def collectSites(self, sites, collectors, last_values):
now = int(time.time())
s = time.time()
values = []
for address, site in sites.iteritems():
site_datas = self.collectDatas(collectors, last_values["site:%s" % address], site)
for key, value in site_datas.iteritems():
values.append((self.db.getTypeId(key), self.db.getSiteId(address), value, now))
time.sleep(0.000001)
self.log.debug("Site collections done in %.3fs" % (time.time() - s))
s = time.time()
cur = self.db.getCursor()
cur.execute("BEGIN")
cur.cursor.executemany("INSERT INTO data (type_id, site_id, value, date_added) VALUES (?, ?, ?, ?)", values)
cur.execute("END")
cur.close()
self.log.debug("Site collectors inserted in %.3fs" % (time.time() - s))
def collector(self):
collectors = self.getCollectors()
site_collectors = self.getSiteCollectors()
sites = sys.modules["main"].file_server.sites
i = 0
while 1:
self.collectGlobal(collectors, self.last_values)
if i % 12 == 0: # Only collect sites data every hour
self.collectSites(sites, site_collectors, self.last_values)
time.sleep(60 * 5)
i += 1

View file

@ -1,133 +0,0 @@
from Config import config
from Db import Db
import time
class ChartDb(Db):
def __init__(self):
self.version = 2
super(ChartDb, self).__init__(self.getSchema(), "%s/chart.db" % config.data_dir)
self.foreign_keys = True
self.checkTables()
self.sites = self.loadSites()
self.types = self.loadTypes()
def getSchema(self):
schema = {}
schema["db_name"] = "Chart"
schema["tables"] = {}
schema["tables"]["data"] = {
"cols": [
["data_id", "INTEGER PRIMARY KEY ASC AUTOINCREMENT NOT NULL UNIQUE"],
["type_id", "INTEGER NOT NULL"],
["site_id", "INTEGER"],
["value", "INTEGER"],
["date_added", "DATETIME DEFAULT (CURRENT_TIMESTAMP)"]
],
"indexes": [
"CREATE INDEX site_id ON data (site_id)",
"CREATE INDEX date_added ON data (date_added)"
],
"schema_changed": 2
}
schema["tables"]["type"] = {
"cols": [
["type_id", "INTEGER PRIMARY KEY NOT NULL UNIQUE"],
["name", "TEXT"]
],
"schema_changed": 1
}
schema["tables"]["site"] = {
"cols": [
["site_id", "INTEGER PRIMARY KEY NOT NULL UNIQUE"],
["address", "TEXT"]
],
"schema_changed": 1
}
return schema
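# Return the type_id for a stat name, inserting a new row on first use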
def getTypeId(self, name):
if name not in self.types:
self.execute("INSERT INTO type ?", {"name": name})
self.types[name] = self.cur.cursor.lastrowid
return self.types[name]
def getSiteId(self, address):
if address not in self.sites:
self.execute("INSERT INTO site ?", {"address": address})
self.sites[address] = self.cur.cursor.lastrowid
return self.sites[address]
def loadSites(self):
sites = {}
for row in self.execute("SELECT * FROM site"):
sites[row["address"]] = row["site_id"]
return sites
def loadTypes(self):
types = {}
for row in self.execute("SELECT * FROM type"):
types[row["name"]] = row["type_id"]
return types
def deleteSite(self, address):
if address in self.sites:
site_id = self.sites[address]
del self.sites[address]
self.execute("DELETE FROM site WHERE ?", {"site_id": site_id})
self.execute("DELETE FROM data WHERE ?", {"site_id": site_id})
def archive(self):
week_back = 1
while 1:
s = time.time()
date_added_from = time.time() - 60 * 60 * 24 * 7 * (week_back + 1)
date_added_to = date_added_from + 60 * 60 * 24 * 7
res = self.execute("""
SELECT
MAX(date_added) AS date_added,
SUM(value) AS value,
GROUP_CONCAT(data_id) AS data_ids,
type_id,
site_id,
COUNT(*) AS num
FROM data
WHERE
site_id IS NULL AND
date_added > :date_added_from AND
date_added < :date_added_to
GROUP BY strftime('%Y-%m-%d %H', date_added, 'unixepoch', 'localtime'), type_id
""", {"date_added_from": date_added_from, "date_added_to": date_added_to})
num_archived = 0
cur = self.getCursor()
for row in res:
if row["num"] == 1:
continue
cur.execute("INSERT INTO data ?", {
"type_id": row["type_id"],
"site_id": row["site_id"],
"value": row["value"],
"date_added": row["date_added"]
})
cur.execute("DELETE FROM data WHERE data_id IN (%s)" % row["data_ids"])
num_archived += row["num"]
self.log.debug("Archived %s data from %s weeks ago in %.3fs" % (num_archived, week_back, time.time() - s))
week_back += 1
time.sleep(0.1)
if num_archived == 0:
break
# Only keep 6 month of global stats
self.execute(
"DELETE FROM data WHERE site_id IS NULL AND date_added < :date_added_limit",
{"date_added_limit": time.time() - 60 * 60 * 24 * 30 * 6 }
)
# Only keep 1 month of site stats
self.execute(
"DELETE FROM data WHERE site_id IS NOT NULL AND date_added < :date_added_limit",
{"date_added_limit": time.time() - 60 * 60 * 24 * 30 }
)
if week_back > 1:
self.execute("VACUUM")

View file

@ -1,60 +0,0 @@
import time
import itertools
import gevent
from Config import config
from util import helper
from Plugin import PluginManager
from ChartDb import ChartDb
from ChartCollector import ChartCollector
if "db" not in locals().keys(): # Share on reloads
db = ChartDb()
gevent.spawn_later(10 * 60, db.archive)
helper.timer(60 * 60 * 6, db.archive)
collector = ChartCollector(db)
@PluginManager.registerTo("SiteManager")
class SiteManagerPlugin(object):
def load(self, *args, **kwargs):
back = super(SiteManagerPlugin, self).load(*args, **kwargs)
collector.setInitialLastValues(self.sites.values())
return back
def delete(self, address, *args, **kwargs):
db.deleteSite(address)
return super(SiteManagerPlugin, self).delete(address, *args, **kwargs)
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def actionChartDbQuery(self, to, query, params=None):
if not "ADMIN" in self.permissions:
return {"error": "No permission"}
if config.debug or config.verbose:
s = time.time()
rows = []
try:
if not query.strip().upper().startswith("SELECT"):
raise Exception("Only SELECT query supported")
res = db.execute(query, params)
except Exception as err: # Return the error to the client
self.log.error("ChartDbQuery error: %s" % err)
return {"error": str(err)}
# Convert result to dict
for row in res:
rows.append(dict(row))
if config.verbose and time.time() - s > 0.1: # Log slow query
self.log.debug("Slow query: %s (%.3fs)" % (query, time.time() - s))
return rows
def actionChartGetPeerLocations(self, to):
if not "ADMIN" in self.permissions:
return {"error": "No permission"}
peers = {}
for site in self.server.sites.values():
peers.update(site.peers)
peer_locations = self.getPeerLocations(peers)
return peer_locations

View file

@ -1 +0,0 @@
import ChartPlugin

View file

@ -1,223 +0,0 @@
import time
import re
import cgi
import hashlib
from Plugin import PluginManager
from Translate import Translate
from Config import config
from ContentFilterStorage import ContentFilterStorage
if "_" not in locals():
_ = Translate("plugins/ContentFilter/languages/")
@PluginManager.registerTo("SiteManager")
class SiteManagerPlugin(object):
def load(self, *args, **kwargs):
global filter_storage
super(SiteManagerPlugin, self).load(*args, **kwargs)
filter_storage = ContentFilterStorage(site_manager=self)
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
# Mute
def cbMuteAdd(self, to, auth_address, cert_user_id, reason):
filter_storage.file_content["mutes"][auth_address] = {
"cert_user_id": cert_user_id, "reason": reason, "source": self.site.address, "date_added": time.time()
}
filter_storage.save()
filter_storage.changeDbs(auth_address, "remove")
self.response(to, "ok")
def actionMuteAdd(self, to, auth_address, cert_user_id, reason):
if "ADMIN" in self.getPermissions(to):
self.cbMuteAdd(to, auth_address, cert_user_id, reason)
else:
self.cmd(
"confirm",
[_["Hide all content from <b>%s</b>?"] % cgi.escape(cert_user_id), _["Mute"]],
lambda (res): self.cbMuteAdd(to, auth_address, cert_user_id, reason)
)
def cbMuteRemove(self, to, auth_address):
del filter_storage.file_content["mutes"][auth_address]
filter_storage.save()
filter_storage.changeDbs(auth_address, "load")
self.response(to, "ok")
def actionMuteRemove(self, to, auth_address):
if "ADMIN" in self.getPermissions(to):
self.cbMuteRemove(to, auth_address)
else:
self.cmd(
"confirm",
[_["Unmute <b>%s</b>?"] % cgi.escape(filter_storage.file_content["mutes"][auth_address]["cert_user_id"]), _["Unmute"]],
lambda (res): self.cbMuteRemove(to, auth_address)
)
def actionMuteList(self, to):
if "ADMIN" in self.getPermissions(to):
self.response(to, filter_storage.file_content["mutes"])
else:
return self.response(to, {"error": "Forbidden: Only ADMIN sites can list mutes"})
# Siteblock
def actionSiteblockAdd(self, to, site_address, reason=None):
if "ADMIN" not in self.getPermissions(to):
return self.response(to, {"error": "Forbidden: Only ADMIN sites can add to blocklist"})
filter_storage.file_content["siteblocks"][site_address] = {"date_added": time.time(), "reason": reason}
filter_storage.save()
self.response(to, "ok")
def actionSiteblockRemove(self, to, site_address):
if "ADMIN" not in self.getPermissions(to):
return self.response(to, {"error": "Forbidden: Only ADMIN sites can remove from blocklist"})
del filter_storage.file_content["siteblocks"][site_address]
filter_storage.save()
self.response(to, "ok")
def actionSiteblockList(self, to):
if "ADMIN" in self.getPermissions(to):
self.response(to, filter_storage.file_content["siteblocks"])
else:
return self.response(to, {"error": "Forbidden: Only ADMIN sites can list blocklists"})
# Include
def actionFilterIncludeAdd(self, to, inner_path, description=None, address=None):
if address:
if "ADMIN" not in self.getPermissions(to):
return self.response(to, {"error": "Forbidden: Only ADMIN sites can manage different site include"})
site = self.server.sites[address]
else:
address = self.site.address
site = self.site
if "ADMIN" in self.getPermissions(to):
self.cbFilterIncludeAdd(to, True, address, inner_path, description)
else:
content = site.storage.loadJson(inner_path)
title = _["New shared global content filter: <b>%s</b> (%s sites, %s users)"] % (
cgi.escape(inner_path), len(content.get("siteblocks", {})), len(content.get("mutes", {}))
)
self.cmd(
"confirm",
[title, "Add"],
lambda (res): self.cbFilterIncludeAdd(to, res, address, inner_path, description)
)
def cbFilterIncludeAdd(self, to, res, address, inner_path, description):
if not res:
self.response(to, res)
return False
filter_storage.includeAdd(address, inner_path, description)
self.response(to, "ok")
def actionFilterIncludeRemove(self, to, inner_path, address=None):
if address:
if "ADMIN" not in self.getPermissions(to):
return self.response(to, {"error": "Forbidden: Only ADMIN sites can manage different site include"})
else:
address = self.site.address
key = "%s/%s" % (address, inner_path)
if key not in filter_storage.file_content["includes"]:
self.response(to, {"error": "Include not found"})
filter_storage.includeRemove(address, inner_path)
self.response(to, "ok")
def actionFilterIncludeList(self, to, all_sites=False, filters=False):
if all_sites and "ADMIN" not in self.getPermissions(to):
return self.response(to, {"error": "Forbidden: Only ADMIN sites can list all sites includes"})
back = []
includes = filter_storage.file_content.get("includes", {}).values()
for include in includes:
if not all_sites and include["address"] != self.site.address:
continue
if filters:
include = dict(include) # Don't modify original file_content
include_site = filter_storage.site_manager.get(include["address"])
if not include_site:
continue
try:
content = include_site.storage.loadJson(include["inner_path"])
include["error"] = None
except Exception as err:
if include_site.settings["own"]:
include_site.log.warning("Error loading filter %s: %s" % (include["inner_path"], err))
content = {}
include["error"] = str(err)
include["mutes"] = content.get("mutes", {})
include["siteblocks"] = content.get("siteblocks", {})
back.append(include)
self.response(to, back)
@PluginManager.registerTo("SiteStorage")
class SiteStoragePlugin(object):
def updateDbFile(self, inner_path, file=None, cur=None):
if file is not False: # File deletion always allowed
# Look for bitcoin addresses in the file path
matches = re.findall("/(1[A-Za-z0-9]{26,35})/", inner_path)
# Check if any of the addresses are in the mute list
for auth_address in matches:
if filter_storage.isMuted(auth_address):
self.log.debug("Mute match: %s, ignoring %s" % (auth_address, inner_path))
return False
return super(SiteStoragePlugin, self).updateDbFile(inner_path, file=file, cur=cur)
def onUpdated(self, inner_path, file=None):
file_path = "%s/%s" % (self.site.address, inner_path)
if file_path in filter_storage.file_content["includes"]:
self.log.debug("Filter file updated: %s" % inner_path)
filter_storage.includeUpdateAll()
return super(SiteStoragePlugin, self).onUpdated(inner_path, file=file)
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
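# Serve the blocklisted page instead of the site wrapper for blocked addresses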
def actionWrapper(self, path, extra_headers=None):
match = re.match("/(?P<address>[A-Za-z0-9\._-]+)(?P<inner_path>/.*|$)", path)
if not match:
return False
address = match.group("address")
if self.server.site_manager.get(address): # Site already exists
return super(UiRequestPlugin, self).actionWrapper(path, extra_headers)
if self.server.site_manager.isDomain(address):
address = self.server.site_manager.resolveDomain(address)
if address:
address_sha256 = "0x" + hashlib.sha256(address).hexdigest()
else:
address_sha256 = None
if filter_storage.isSiteblocked(address) or filter_storage.isSiteblocked(address_sha256):
site = self.server.site_manager.get(config.homepage)
if not extra_headers:
extra_headers = {}
script_nonce = self.getScriptNonce()
self.sendHeader(extra_headers=extra_headers, script_nonce=script_nonce)
return iter([super(UiRequestPlugin, self).renderWrapper(
site, path, "uimedia/plugins/contentfilter/blocklisted.html?address=" + address,
"Blacklisted site", extra_headers, show_loadingscreen=False, script_nonce=script_nonce
)])
else:
return super(UiRequestPlugin, self).actionWrapper(path, extra_headers)
def actionUiMedia(self, path, *args, **kwargs):
if path.startswith("/uimedia/plugins/contentfilter/"):
file_path = path.replace("/uimedia/plugins/contentfilter/", "plugins/ContentFilter/media/")
return self.actionFile(file_path)
else:
return super(UiRequestPlugin, self).actionUiMedia(path)

View file

@ -1,140 +0,0 @@
import os
import json
import logging
import collections
import time
from Debug import Debug
from Plugin import PluginManager
from Config import config
from util import helper
class ContentFilterStorage(object):
def __init__(self, site_manager):
self.log = logging.getLogger("ContentFilterStorage")
self.file_path = "%s/filters.json" % config.data_dir
self.site_manager = site_manager
self.file_content = self.load()
# Set default values for filters.json
if not self.file_content:
self.file_content = {}
# Site blacklist renamed to site blocks
if "site_blacklist" in self.file_content:
self.file_content["siteblocks"] = self.file_content["site_blacklist"]
del self.file_content["site_blacklist"]
for key in ["mutes", "siteblocks", "includes"]:
if key not in self.file_content:
self.file_content[key] = {}
self.include_filters = collections.defaultdict(set) # Merged list of mutes and blacklists from all include
self.includeUpdateAll(update_site_dbs=False)
def load(self):
# Rename previously used mutes.json -> filters.json
if os.path.isfile("%s/mutes.json" % config.data_dir):
self.log.info("Renaming mutes.json to filters.json...")
os.rename("%s/mutes.json" % config.data_dir, self.file_path)
if os.path.isfile(self.file_path):
try:
return json.load(open(self.file_path))
except Exception as err:
self.log.error("Error loading filters.json: %s" % err)
return None
else:
return None
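# Rebuild the merged mute/siteblock sets from all include files and apply mute changes to the site DBs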
def includeUpdateAll(self, update_site_dbs=True):
s = time.time()
new_include_filters = collections.defaultdict(set)
# Load all include files data into a merged set
for include_path in self.file_content["includes"]:
address, inner_path = include_path.split("/", 1)
try:
content = self.site_manager.get(address).storage.loadJson(inner_path)
except Exception as err:
self.log.warning(
"Error loading include %s: %s" %
(include_path, Debug.formatException(err))
)
continue
for key, val in content.iteritems():
if type(val) is not dict:
continue
new_include_filters[key].update(val.keys())
mutes_added = new_include_filters["mutes"].difference(self.include_filters["mutes"])
mutes_removed = self.include_filters["mutes"].difference(new_include_filters["mutes"])
self.include_filters = new_include_filters
if update_site_dbs:
for auth_address in mutes_added:
self.changeDbs(auth_address, "remove")
for auth_address in mutes_removed:
if not self.isMuted(auth_address):
self.changeDbs(auth_address, "load")
num_mutes = len(self.include_filters["mutes"])
num_siteblocks = len(self.include_filters["siteblocks"])
self.log.debug(
"Loaded %s mutes, %s blocked sites from %s includes in %.3fs" %
(num_mutes, num_siteblocks, len(self.file_content["includes"]), time.time() - s)
)
def includeAdd(self, address, inner_path, description=None):
self.file_content["includes"]["%s/%s" % (address, inner_path)] = {
"date_added": time.time(),
"address": address,
"description": description,
"inner_path": inner_path
}
self.includeUpdateAll()
self.save()
def includeRemove(self, address, inner_path):
del self.file_content["includes"]["%s/%s" % (address, inner_path)]
self.includeUpdateAll()
self.save()
def save(self):
s = time.time()
helper.atomicWrite(self.file_path, json.dumps(self.file_content, indent=2, sort_keys=True))
self.log.debug("Saved in %.3fs" % (time.time() - s))
def isMuted(self, auth_address):
if auth_address in self.file_content["mutes"] or auth_address in self.include_filters["mutes"]:
return True
else:
return False
def isSiteblocked(self, address):
if address in self.file_content["siteblocks"] or address in self.include_filters["siteblocks"]:
return True
else:
return False
# Search and remove or re-add the files of a user
def changeDbs(self, auth_address, action):
self.log.debug("Mute action %s on user %s" % (action, auth_address))
res = self.site_manager.list().values()[0].content_manager.contents.db.execute(
"SELECT * FROM content LEFT JOIN site USING (site_id) WHERE inner_path LIKE :inner_path",
{"inner_path": "%%/%s/%%" % auth_address}
)
for row in res:
site = self.site_manager.sites.get(row["address"])
if not site:
continue
dir_inner_path = helper.getDirname(row["inner_path"])
for file_name in site.storage.walk(dir_inner_path):
if action == "remove":
site.storage.onUpdated(dir_inner_path + file_name, False)
else:
site.storage.onUpdated(dir_inner_path + file_name)
site.onFileDone(dir_inner_path + file_name)

View file

@ -1,82 +0,0 @@
import pytest
from ContentFilter import ContentFilterPlugin
from Site import SiteManager
@pytest.fixture
def filter_storage():
ContentFilterPlugin.filter_storage = ContentFilterPlugin.ContentFilterStorage(SiteManager.site_manager)
return ContentFilterPlugin.filter_storage
@pytest.mark.usefixtures("resetSettings")
@pytest.mark.usefixtures("resetTempSettings")
class TestContentFilter:
def createInclude(self, site):
site.storage.writeJson("filters.json", {
"mutes": {"1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C": {}},
"siteblocks": {site.address: {}}
})
def testIncludeLoad(self, site, filter_storage):
self.createInclude(site)
filter_storage.file_content["includes"]["%s/%s" % (site.address, "filters.json")] = {
"date_added": 1528295893,
}
assert not filter_storage.include_filters["mutes"]
assert not filter_storage.isMuted("1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C")
assert not filter_storage.isSiteblocked(site.address)
filter_storage.includeUpdateAll(update_site_dbs=False)
assert len(filter_storage.include_filters["mutes"]) == 1
assert filter_storage.isMuted("1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C")
assert filter_storage.isSiteblocked(site.address)
def testIncludeAdd(self, site, filter_storage):
self.createInclude(site)
query_num_json = "SELECT COUNT(*) AS num FROM json WHERE directory = 'users/1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C'"
assert not filter_storage.isSiteblocked(site.address)
assert not filter_storage.isMuted("1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C")
assert site.storage.query(query_num_json).fetchone()["num"] == 2
# Add include
filter_storage.includeAdd(site.address, "filters.json")
assert filter_storage.isSiteblocked(site.address)
assert filter_storage.isMuted("1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C")
assert site.storage.query(query_num_json).fetchone()["num"] == 0
# Remove include
filter_storage.includeRemove(site.address, "filters.json")
assert not filter_storage.isSiteblocked(site.address)
assert not filter_storage.isMuted("1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C")
assert site.storage.query(query_num_json).fetchone()["num"] == 2
def testIncludeChange(self, site, filter_storage):
self.createInclude(site)
filter_storage.includeAdd(site.address, "filters.json")
assert filter_storage.isSiteblocked(site.address)
assert filter_storage.isMuted("1J6UrZMkarjVg5ax9W4qThir3BFUikbW6C")
# Add new blocked site
assert not filter_storage.isSiteblocked("1Hello")
filter_content = site.storage.loadJson("filters.json")
filter_content["siteblocks"]["1Hello"] = {}
site.storage.writeJson("filters.json", filter_content)
assert filter_storage.isSiteblocked("1Hello")
# Add new muted user
query_num_json = "SELECT COUNT(*) AS num FROM json WHERE directory = 'users/1C5sgvWaSgfaTpV5kjBCnCiKtENNMYo69q'"
assert not filter_storage.isMuted("1C5sgvWaSgfaTpV5kjBCnCiKtENNMYo69q")
assert site.storage.query(query_num_json).fetchone()["num"] == 2
filter_content["mutes"]["1C5sgvWaSgfaTpV5kjBCnCiKtENNMYo69q"] = {}
site.storage.writeJson("filters.json", filter_content)
assert filter_storage.isMuted("1C5sgvWaSgfaTpV5kjBCnCiKtENNMYo69q")
assert site.storage.query(query_num_json).fetchone()["num"] == 0

View file

@ -1 +0,0 @@
from src.Test.conftest import *

View file

@ -1,5 +0,0 @@
[pytest]
python_files = Test*.py
addopts = -rsxX -v --durations=6
markers =
webtest: mark a test as a webtest.

View file

@ -1 +0,0 @@
import ContentFilterPlugin

View file

@ -1,6 +0,0 @@
{
"Hide all content from <b>%s</b>?": "<b>%s</b> tartalmaniak elrejtése?",
"Mute": "Elnémítás",
"Unmute <b>%s</b>?": "<b>%s</b> tartalmaniak megjelenítése?",
"Unmute": "Némítás visszavonása"
}

View file

@ -1,6 +0,0 @@
{
"Hide all content from <b>%s</b>?": "<b>%s</b> Vuoi nascondere i contenuti di questo utente ?",
"Mute": "Attiva Silenzia",
"Unmute <b>%s</b>?": "<b>%s</b> Vuoi mostrare i contenuti di questo utente ?",
"Unmute": "Disattiva Silenzia"
}

View file

@ -1,6 +0,0 @@
{
"Hide all content from <b>%s</b>?": "<b>%s</b> Ocultar todo o conteúdo de ?",
"Mute": "Ativar o Silêncio",
"Unmute <b>%s</b>?": "<b>%s</b> Você quer mostrar o conteúdo deste usuário ?",
"Unmute": "Desligar o silêncio"
}

View file

@ -1,6 +0,0 @@
{
"Hide all content from <b>%s</b>?": "屏蔽 <b>%s</b> 的所有內容?",
"Mute": "屏蔽",
"Unmute <b>%s</b>?": "對 <b>%s</b> 解除屏蔽?",
"Unmute": "解除屏蔽"
}

View file

@ -1,6 +0,0 @@
{
"Hide all content from <b>%s</b>?": "屏蔽 <b>%s</b> 的所有内容?",
"Mute": "屏蔽",
"Unmute <b>%s</b>?": "对 <b>%s</b> 解除屏蔽?",
"Unmute": "解除屏蔽"
}

View file

@ -1,107 +0,0 @@
<html>
<body>
<style>
.content { line-height: 24px; font-family: monospace; font-size: 14px; color: #636363; text-transform: uppercase; top: 38%; position: relative; text-align: center; perspective: 1000px }
.content h1, .content h2 { font-weight: normal; letter-spacing: 1px; }
.content h2 { font-size: 15px; }
.content #details {
text-align: left; display: inline-block; width: 350px; background-color: white; padding: 17px 27px; border-radius: 0px;
box-shadow: 0px 2px 7px -1px #d8d8d8; text-transform: none; margin: 15px; transform: scale(0) rotateX(90deg); transition: all 0.6s cubic-bezier(0.785, 0.135, 0.15, 0.86);
}
.content #details #added { font-size: 12px; text-align: right; color: #a9a9a9; }
#button { transition: all 1s cubic-bezier(0.075, 0.82, 0.165, 1); opacity: 0; transform: translateY(50px); transition-delay: 0.5s }
.button {
padding: 8px 20px; background-color: #FFF85F; border-bottom: 2px solid #CDBD1E; border-radius: 2px;
text-decoration: none; transition: all 0.5s; background-position: left center; display: inline-block; margin-top: 10px; color: black;
}
.button:hover { background-color: #FFF400; border-bottom: 2px solid #4D4D4C; transition: none; }
.button:active { position: relative; top: 1px; }
.button:focus { outline: none; }
.textbutton { color: #999; margin-top: 25px; display: inline-block; text-transform: none; font-family: Arial, Helvetica; text-decoration: none; padding: 5px 15px; }
.textbutton-main { background-color: #FFF; color: #333; border-radius: 5px; }
.textbutton:hover { text-decoration: underline; color: #333; transition: none !important; }
.textbutton:active { background-color: #fafbfc; }
</style>
<div class="content">
<h1>Site blocked</h1>
<h2>This site is on your blocklist:</h2>
<div id="details">
<div id="reason">Too much image</div>
<div id="added">on 2015-01-25 12:32:11</div>
</div>
<div id="buttons">
<a href="/" class="textbutton textbutton-main" id="back">Back to homepage</a>
<a href="#Visit+Site" class="textbutton" id="visit">Remove from blocklist and visit the site</a>
</div>
</div>
<script type="text/javascript" src="js/ZeroFrame.js"></script>
<script>
function buf2hex(buffer) {
return Array.prototype.map.call(new Uint8Array(buffer), x => ('00' + x.toString(16)).slice(-2)).join('');
}
async function sha256hex(s) {
var buff = new TextEncoder("utf-8").encode(s)
return "0x" + buf2hex(await crypto.subtle.digest("SHA-256", buff))
}
class Page extends ZeroFrame {
onOpenWebsocket () {
this.cmd("wrapperSetTitle", "Visiting a blocked site - ZeroNet")
this.cmd("siteInfo", {}, (site_info) => {
this.site_info = site_info
})
var address = document.location.search.match(/address=(.*?)[&\?]/)[1]
this.updateSiteblockDetails(address)
}
async updateSiteblockDetails(address) {
var address_sha256 = await sha256hex(address)
var blocks = await this.cmdp("siteblockList")
if (blocks[address] || blocks[address_sha256]) {
block = blocks[address]
} else {
var includes = await this.cmdp("filterIncludeList", {all_sites: true, filters: true})
for (let include of includes) {
if (include["siteblocks"][address]) {
var block = include["siteblocks"][address]
block["include"] = include
}
if (include["siteblocks"][address_sha256]) {
var block = include["siteblocks"][address_sha256]
block["include"] = include
}
}
}
this.blocks = blocks
var reason = block["reason"]
if (!reason) reason = "Unknown reason"
var date = new Date(block["date_added"] * 1000)
document.getElementById("reason").innerText = reason
document.getElementById("added").innerText = "at " + date.toLocaleDateString() + " " + date.toLocaleTimeString()
if (block["include"]) {
document.getElementById("added").innerText += " from a shared blocklist"
document.getElementById("visit").innerText = "Ignore blocking and visit the site"
}
document.getElementById("details").style.transform = "scale(1) rotateX(0deg)"
document.getElementById("visit").style.transform = "translateY(0)"
document.getElementById("visit").style.opacity = "1"
document.getElementById("visit").onclick = () => {
if (block["include"])
this.cmd("siteAdd", address, () => { this.cmd("wrapperReload") })
else
this.cmd("siteblockRemove", address, () => { this.cmd("wrapperReload") })
}
}
}
page = new Page()
</script>
</body>
</html>

View file

@ -1,119 +0,0 @@
// Version 1.0.0 - Initial release
// Version 1.1.0 (2017-08-02) - Added cmdp function that returns promise instead of using callback
// Version 1.2.0 (2017-08-02) - Added Ajax monkey patch to emulate XMLHttpRequest over ZeroFrame API
const CMD_INNER_READY = 'innerReady'
const CMD_RESPONSE = 'response'
const CMD_WRAPPER_READY = 'wrapperReady'
const CMD_PING = 'ping'
const CMD_PONG = 'pong'
const CMD_WRAPPER_OPENED_WEBSOCKET = 'wrapperOpenedWebsocket'
const CMD_WRAPPER_CLOSE_WEBSOCKET = 'wrapperClosedWebsocket'
class ZeroFrame {
constructor(url) {
this.url = url
this.waiting_cb = {}
this.wrapper_nonce = document.location.href.replace(/.*wrapper_nonce=([A-Za-z0-9]+).*/, "$1")
this.connect()
this.next_message_id = 1
this.init()
}
init() {
return this
}
connect() {
this.target = window.parent
window.addEventListener('message', e => this.onMessage(e), false)
this.cmd(CMD_INNER_READY)
}
onMessage(e) {
let message = e.data
let cmd = message.cmd
if (cmd === CMD_RESPONSE) {
if (this.waiting_cb[message.to] !== undefined) {
this.waiting_cb[message.to](message.result)
}
else {
this.log("Websocket callback not found:", message)
}
} else if (cmd === CMD_WRAPPER_READY) {
this.cmd(CMD_INNER_READY)
} else if (cmd === CMD_PING) {
this.response(message.id, CMD_PONG)
} else if (cmd === CMD_WRAPPER_OPENED_WEBSOCKET) {
this.onOpenWebsocket()
} else if (cmd === CMD_WRAPPER_CLOSE_WEBSOCKET) {
this.onCloseWebsocket()
} else {
this.onRequest(cmd, message)
}
}
onRequest(cmd, message) {
this.log("Unknown request", message)
}
response(to, result) {
this.send({
cmd: CMD_RESPONSE,
to: to,
result: result
})
}
cmd(cmd, params={}, cb=null) {
this.send({
cmd: cmd,
params: params
}, cb)
}
cmdp(cmd, params={}) {
return new Promise((resolve, reject) => {
this.cmd(cmd, params, (res) => {
if (res && res.error) {
reject(res.error)
} else {
resolve(res)
}
})
})
}
send(message, cb=null) {
message.wrapper_nonce = this.wrapper_nonce
message.id = this.next_message_id
this.next_message_id++
this.target.postMessage(message, '*')
if (cb) {
this.waiting_cb[message.id] = cb
}
}
log(...args) {
console.log.apply(console, ['[ZeroFrame]'].concat(args))
}
onOpenWebsocket() {
this.log('Websocket open')
}
onCloseWebsocket() {
this.log('Websocket close')
}
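// Patch XMLHttpRequest so Ajax requests include the wrapper's ajax_key query parameter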
monkeyPatchAjax() {
var page = this
XMLHttpRequest.prototype.realOpen = XMLHttpRequest.prototype.open
this.cmd("wrapperGetAjaxKey", [], (res) => { this.ajax_key = res })
var newOpen = function (method, url, async) {
url += "?ajax_key=" + page.ajax_key
return this.realOpen(method, url, async)
}
XMLHttpRequest.prototype.open = newOpen
}
}

View file

@ -1,104 +0,0 @@
import re
import cgi
import copy
from Plugin import PluginManager
from Translate import Translate
if "_" not in locals():
_ = Translate("plugins/Cors/languages/")
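# Parse a "cors-<address>/<inner_path>" virtual path and verify the Cors permission for it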
def getCorsPath(site, inner_path):
match = re.match("^cors-([A-Za-z0-9]{26,35})/(.*)", inner_path)
if not match:
raise Exception("Invalid cors path: %s" % inner_path)
cors_address = match.group(1)
cors_inner_path = match.group(2)
if not "Cors:%s" % cors_address in site.settings["permissions"]:
raise Exception("This site has no permission to access site %s" % cors_address)
return cors_address, cors_inner_path
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def hasSitePermission(self, address, cmd=None):
if super(UiWebsocketPlugin, self).hasSitePermission(address, cmd=cmd):
return True
if not "Cors:%s" % address in self.site.settings["permissions"] or cmd not in ["fileGet", "fileList", "dirList", "fileRules", "optionalFileInfo", "fileQuery", "dbQuery", "userGetSettings", "siteInfo"]:
return False
else:
return True
# Add cors support for file commands
def corsFuncWrapper(self, func_name, to, inner_path, *args, **kwargs):
if inner_path.startswith("cors-"):
cors_address, cors_inner_path = getCorsPath(self.site, inner_path)
req_self = copy.copy(self)
req_self.site = self.server.sites.get(cors_address) # Change the site to the merged one
if not req_self.site:
return {"error": "No site found"}
func = getattr(super(UiWebsocketPlugin, req_self), func_name)
back = func(to, cors_inner_path, *args, **kwargs)
return back
else:
func = getattr(super(UiWebsocketPlugin, self), func_name)
return func(to, inner_path, *args, **kwargs)
def actionFileGet(self, to, inner_path, *args, **kwargs):
return self.corsFuncWrapper("actionFileGet", to, inner_path, *args, **kwargs)
def actionFileList(self, to, inner_path, *args, **kwargs):
return self.corsFuncWrapper("actionFileList", to, inner_path, *args, **kwargs)
def actionDirList(self, to, inner_path, *args, **kwargs):
return self.corsFuncWrapper("actionDirList", to, inner_path, *args, **kwargs)
def actionFileRules(self, to, inner_path, *args, **kwargs):
return self.corsFuncWrapper("actionFileRules", to, inner_path, *args, **kwargs)
def actionOptionalFileInfo(self, to, inner_path, *args, **kwargs):
return self.corsFuncWrapper("actionOptionalFileInfo", to, inner_path, *args, **kwargs)
def actionCorsPermission(self, to, address):
site = self.server.sites.get(address)
if site:
site_name = site.content_manager.contents.get("content.json", {}).get("title")
button_title = _["Grant"]
else:
site_name = address
button_title = _["Grant & Add"]
if site and "Cors:" + address in self.permissions:
return "ignored"
self.cmd(
"confirm",
[_["This site requests <b>read</b> permission to: <b>%s</b>"] % cgi.escape(site_name), button_title],
lambda (res): self.cbCorsPermission(to, address)
)
def cbCorsPermission(self, to, address):
self.actionPermissionAdd(to, "Cors:" + address)
site = self.server.sites.get(address)
if not site:
self.server.site_manager.need(address)
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
# Allow loading cross-origin files using /cors-address/file.jpg
def parsePath(self, path):
path_parts = super(UiRequestPlugin, self).parsePath(path)
if "cors-" not in path: # Optimization
return path_parts
site = self.server.sites[path_parts["address"]]
try:
path_parts["address"], path_parts["inner_path"] = getCorsPath(site, path_parts["inner_path"])
except:
return None
return path_parts

View file

@ -1 +0,0 @@
import CorsPlugin

View file

@ -1,53 +0,0 @@
from lib.pybitcointools import bitcoin as btctools
import hashlib
ecc_cache = {}
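# ECIES encrypt: ECDH with an ephemeral key derives the AES and HMAC keys; output is iv + ephemeral pubkey + ciphertext + hmac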
def encrypt(data, pubkey, ephemcurve=None, ciphername='aes-256-cbc'):
from lib import pyelliptic
curve, pubkey_x, pubkey_y, i = pyelliptic.ECC._decode_pubkey(pubkey)
if ephemcurve is None:
ephemcurve = curve
ephem = pyelliptic.ECC(curve=ephemcurve)
key = hashlib.sha512(ephem.raw_get_ecdh_key(pubkey_x, pubkey_y)).digest()
key_e, key_m = key[:32], key[32:]
pubkey = ephem.get_pubkey()
iv = pyelliptic.OpenSSL.rand(pyelliptic.OpenSSL.get_cipher(ciphername).get_blocksize())
ctx = pyelliptic.Cipher(key_e, iv, 1, ciphername)
ciphertext = iv + pubkey + ctx.ciphering(data)
mac = pyelliptic.hmac_sha256(key_m, ciphertext)
return key_e, ciphertext + mac
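# Split an ECIES blob into its AES parts: the 16-byte iv, then the ciphertext between the 70-byte ephemeral pubkey and the 32-byte hmac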
def split(encrypted):
iv = encrypted[0:16]
ciphertext = encrypted[16+70:-32]
return iv, ciphertext
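# Cache pyelliptic ECC objects per privatekey so keys are only set up once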
def getEcc(privatekey=None):
from lib import pyelliptic
global ecc_cache
if privatekey not in ecc_cache:
if privatekey:
publickey_bin = btctools.encode_pubkey(btctools.privtopub(privatekey), "bin")
publickey_openssl = toOpensslPublickey(publickey_bin)
privatekey_openssl = toOpensslPrivatekey(privatekey)
ecc_cache[privatekey] = pyelliptic.ECC(curve='secp256k1', privkey=privatekey_openssl, pubkey=publickey_openssl)
else:
ecc_cache[None] = pyelliptic.ECC()
return ecc_cache[privatekey]
def toOpensslPrivatekey(privatekey):
privatekey_bin = btctools.encode_privkey(privatekey, "bin")
return '\x02\xca\x00\x20' + privatekey_bin
def toOpensslPublickey(publickey):
publickey_bin = btctools.encode_pubkey(publickey, "bin")
publickey_bin = publickey_bin[1:]
publickey_openssl = '\x02\xca\x00 ' + publickey_bin[:32] + '\x00 ' + publickey_bin[32:]
return publickey_openssl

View file

@ -1,149 +0,0 @@
import base64
import os
from Plugin import PluginManager
from Crypt import CryptBitcoin
from lib.pybitcointools import bitcoin as btctools
import CryptMessage
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def encrypt(self, text, publickey):
encrypted = CryptMessage.encrypt(text, CryptMessage.toOpensslPublickey(publickey))
return encrypted
def decrypt(self, encrypted, privatekey):
back = CryptMessage.getEcc(privatekey).decrypt(encrypted)
return back.decode("utf8")
# - Actions -
# Returns user's public key unique to site
# Return: Public key
def actionUserPublickey(self, to, index=0):
publickey = self.user.getEncryptPublickey(self.site.address, index)
self.response(to, publickey)
# Encrypt a text using the given publickey or the user's site-unique publickey
# Return: Encrypted text using base64 encoding
def actionEciesEncrypt(self, to, text, publickey=0, return_aes_key=False):
if type(publickey) is int: # Encrypt using user's publickey
publickey = self.user.getEncryptPublickey(self.site.address, publickey)
aes_key, encrypted = self.encrypt(text.encode("utf8"), publickey.decode("base64"))
if return_aes_key:
self.response(to, [base64.b64encode(encrypted), base64.b64encode(aes_key)])
else:
self.response(to, base64.b64encode(encrypted))
# Decrypt a text using the given privatekey or the user's site-unique privatekey
# Return: Decrypted text or list of decrypted texts
def actionEciesDecrypt(self, to, param, privatekey=0):
if type(privatekey) is int: # Decrypt using user's privatekey
privatekey = self.user.getEncryptPrivatekey(self.site.address, privatekey)
if type(param) == list:
encrypted_texts = param
else:
encrypted_texts = [param]
texts = [] # Decoded texts
for encrypted_text in encrypted_texts:
try:
text = self.decrypt(encrypted_text.decode("base64"), privatekey)
texts.append(text)
except Exception as err:
texts.append(None)
if type(param) == list:
self.response(to, texts)
else:
self.response(to, texts[0])
# Encrypt a text using AES
# Return: Iv, AES key, Encrypted text
def actionAesEncrypt(self, to, text, key=None, iv=None):
from lib import pyelliptic
if key:
key = key.decode("base64")
else:
key = os.urandom(32)
if iv: # Use the provided IV
iv = iv.decode("base64")
else: # Generate a new IV if not defined
iv = pyelliptic.Cipher.gen_IV('aes-256-cbc')
if text:
encrypted = pyelliptic.Cipher(key, iv, 1, ciphername='aes-256-cbc').ciphering(text.encode("utf8"))
else:
encrypted = ""
self.response(to, [base64.b64encode(key), base64.b64encode(iv), base64.b64encode(encrypted)])
# Decrypt a text using AES
# Return: Decrypted text
def actionAesDecrypt(self, to, *args):
from lib import pyelliptic
if len(args) == 3: # Single decrypt
encrypted_texts = [(args[0], args[1])]
keys = [args[2]]
else: # Batch decrypt
encrypted_texts, keys = args
texts = [] # Decoded texts
for iv, encrypted_text in encrypted_texts:
encrypted_text = encrypted_text.decode("base64")
iv = iv.decode("base64")
text = None
for key in keys:
ctx = pyelliptic.Cipher(key.decode("base64"), iv, 0, ciphername='aes-256-cbc')
try:
decrypted = ctx.ciphering(encrypted_text)
if decrypted and decrypted.decode("utf8"): # Valid text decoded
text = decrypted
except Exception: # Wrong key, try the next one
pass
texts.append(text)
if len(args) == 3:
self.response(to, texts[0])
else:
self.response(to, texts)
@PluginManager.registerTo("User")
class UserPlugin(object):
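# Derive a deterministic per-site encryption privatekey from the master seed; cert providers get their own index offset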
def getEncryptPrivatekey(self, address, param_index=0):
assert param_index >= 0 and param_index <= 1000
site_data = self.getSiteData(address)
if site_data.get("cert"): # Different privatekey for different cert provider
index = param_index + self.getAddressAuthIndex(site_data["cert"])
else:
index = param_index
if "encrypt_privatekey_%s" % index not in site_data:
address_index = self.getAddressAuthIndex(address)
crypt_index = address_index + 1000 + index
site_data["encrypt_privatekey_%s" % index] = CryptBitcoin.hdPrivatekey(self.master_seed, crypt_index)
self.log.debug("New encrypt privatekey generated for %s:%s" % (address, index))
return site_data["encrypt_privatekey_%s" % index]
def getEncryptPublickey(self, address, param_index=0):
assert param_index >= 0 and param_index <= 1000
site_data = self.getSiteData(address)
if site_data.get("cert"): # Different privatekey for different cert provider
index = param_index + self.getAddressAuthIndex(site_data["cert"])
else:
index = param_index
if "encrypt_publickey_%s" % index not in site_data:
privatekey = self.getEncryptPrivatekey(address, param_index)
publickey = btctools.encode_pubkey(btctools.privtopub(privatekey), "bin_compressed")
site_data["encrypt_publickey_%s" % index] = base64.b64encode(publickey)
return site_data["encrypt_publickey_%s" % index]

View file

@ -1,109 +0,0 @@
import pytest
from CryptMessage import CryptMessage
@pytest.mark.usefixtures("resetSettings")
class TestCrypt:
def testPublickey(self, ui_websocket):
pub = ui_websocket.testAction("UserPublickey", 0)
assert len(pub) == 44 # Compressed, b64 encoded publickey
# Different pubkey for a specified index
assert ui_websocket.testAction("UserPublickey", 1) != ui_websocket.testAction("UserPublickey", 0)
# Same publickey for same index
assert ui_websocket.testAction("UserPublickey", 2) == ui_websocket.testAction("UserPublickey", 2)
# Different publickey for different cert
site_data = ui_websocket.user.getSiteData(ui_websocket.site.address)
site_data["cert"] = None
pub1 = ui_websocket.testAction("UserPublickey", 0)
site_data = ui_websocket.user.getSiteData(ui_websocket.site.address)
site_data["cert"] = "zeroid.bit"
pub2 = ui_websocket.testAction("UserPublickey", 0)
assert pub1 != pub2
def testEcies(self, ui_websocket):
ui_websocket.actionUserPublickey(0, 0)
pub = ui_websocket.ws.result
ui_websocket.actionEciesEncrypt(0, "hello", pub)
encrypted = ui_websocket.ws.result
assert len(encrypted) == 180
# Don't allow decrypt using other privatekey index
ui_websocket.actionEciesDecrypt(0, encrypted, 123)
decrypted = ui_websocket.ws.result
assert decrypted != "hello"
# Decrypt using correct privatekey
ui_websocket.actionEciesDecrypt(0, encrypted)
decrypted = ui_websocket.ws.result
assert decrypted == "hello"
# Decrypt batch
ui_websocket.actionEciesDecrypt(0, [encrypted, "baad", encrypted])
decrypted = ui_websocket.ws.result
assert decrypted == ["hello", None, "hello"]
def testEciesUtf8(self, ui_websocket):
# Utf8 test
utf8_text = u'\xc1rv\xedzt\xfbr\xf5t\xfck\xf6rf\xfar\xf3g\xe9p'
ui_websocket.actionEciesEncrypt(0, utf8_text)
encrypted = ui_websocket.ws.result
ui_websocket.actionEciesDecrypt(0, encrypted)
assert ui_websocket.ws.result == utf8_text
def testEciesAes(self, ui_websocket):
ui_websocket.actionEciesEncrypt(0, "hello", return_aes_key=True)
ecies_encrypted, aes_key = ui_websocket.ws.result
# Decrypt using Ecies
ui_websocket.actionEciesDecrypt(0, ecies_encrypted)
assert ui_websocket.ws.result == "hello"
# Decrypt using AES
aes_iv, aes_encrypted = CryptMessage.split(ecies_encrypted.decode("base64"))
ui_websocket.actionAesDecrypt(0, aes_iv.encode("base64"), aes_encrypted.encode("base64"), aes_key)
assert ui_websocket.ws.result == "hello"
def testAes(self, ui_websocket):
ui_websocket.actionAesEncrypt(0, "hello")
key, iv, encrypted = ui_websocket.ws.result
assert len(key) == 44
assert len(iv) == 24
assert len(encrypted) == 24
# Single decrypt
ui_websocket.actionAesDecrypt(0, iv, encrypted, key)
assert ui_websocket.ws.result == "hello"
# Batch decrypt
ui_websocket.actionAesEncrypt(0, "hello")
key2, iv2, encrypted2 = ui_websocket.ws.result
assert [key, iv, encrypted] != [key2, iv2, encrypted2]
# 2 correct key
ui_websocket.actionAesDecrypt(0, [[iv, encrypted], [iv, encrypted], [iv, "baad"], [iv2, encrypted2]], [key])
assert ui_websocket.ws.result == ["hello", "hello", None, None]
# 3 key
ui_websocket.actionAesDecrypt(0, [[iv, encrypted], [iv, encrypted], [iv, "baad"], [iv2, encrypted2]], [key, key2])
assert ui_websocket.ws.result == ["hello", "hello", None, "hello"]
def testAesUtf8(self, ui_websocket):
utf8_text = u'\xc1rv\xedzt\xfbr\xf5t\xfck\xf6rf\xfar\xf3g\xe9'
ui_websocket.actionAesEncrypt(0, utf8_text)
key, iv, encrypted = ui_websocket.ws.result
ui_websocket.actionAesDecrypt(0, iv, encrypted, key)
assert ui_websocket.ws.result == utf8_text

View file

@ -1 +0,0 @@
from src.Test.conftest import *

View file

@ -1,5 +0,0 @@
[pytest]
python_files = Test*.py
addopts = -rsxX -v --durations=6
markers =
webtest: mark a test as a webtest.

View file

@ -1 +0,0 @@
import CryptMessagePlugin

View file

@ -1,194 +0,0 @@
import os
import re
import gevent
from Plugin import PluginManager
from Config import config
from Debug import Debug
# Keep archives open for faster response times on large sites
archive_cache = {}
def closeArchive(archive_path):
if archive_path in archive_cache:
del archive_cache[archive_path]
def openArchive(archive_path, file_obj=None):
if archive_path not in archive_cache:
if archive_path.endswith("tar.gz"):
import tarfile
archive_cache[archive_path] = tarfile.open(file_obj or archive_path, "r:gz")
elif archive_path.endswith("tar.bz2"):
import tarfile
archive_cache[archive_path] = tarfile.open(file_obj or archive_path, "r:bz2")
else:
import zipfile
archive_cache[archive_path] = zipfile.ZipFile(file_obj or archive_path)
gevent.spawn_later(5, lambda: closeArchive(archive_path)) # Close after 5 sec
archive = archive_cache[archive_path]
return archive
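# Return a file-like object for a single member inside the archive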
def openArchiveFile(archive_path, path_within, file_obj=None):
archive = openArchive(archive_path, file_obj=file_obj)
if archive_path.endswith(".zip"):
return archive.open(path_within)
else:
return archive.extractfile(path_within.encode("utf8"))
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
def actionSiteMedia(self, path, **kwargs):
if ".zip/" in path or ".tar.gz/" in path:
file_obj = None
path_parts = self.parsePath(path)
file_path = u"%s/%s/%s" % (config.data_dir, path_parts["address"], path_parts["inner_path"].decode("utf8"))
match = re.match("^(.*\.(?:tar.gz|tar.bz2|zip))/(.*)", file_path)
archive_path, path_within = match.groups()
if archive_path not in archive_cache:
site = self.server.site_manager.get(path_parts["address"])
if not site:
return self.actionSiteAddPrompt(path)
archive_inner_path = site.storage.getInnerPath(archive_path)
if not os.path.isfile(archive_path):
# Wait until file downloads
result = site.needFile(archive_inner_path, priority=10)
# Send virtual file path download finished event to remove the loading screen
site.updateWebsocket(file_done=archive_inner_path)
if not result:
return self.error404(archive_inner_path)
file_obj = site.storage.openBigfile(archive_inner_path)
header_allow_ajax = False
if self.get.get("ajax_key"):
requester_site = self.server.site_manager.get(path_parts["request_address"])
if self.get["ajax_key"] == requester_site.settings["ajax_key"]:
header_allow_ajax = True
else:
return self.error403("Invalid ajax_key")
try:
file = openArchiveFile(archive_path, path_within, file_obj=file_obj)
content_type = self.getContentType(file_path)
self.sendHeader(200, content_type=content_type, noscript=kwargs.get("header_noscript", False), allow_ajax=header_allow_ajax)
return self.streamFile(file)
except Exception as err:
self.log.debug("Error opening archive file: %s" % Debug.formatException(err))
return self.error404(path)
return super(UiRequestPlugin, self).actionSiteMedia(path, **kwargs)
def streamFile(self, file):
for i in range(100): # Stream at most ~6MB (100 x 60KB blocks)
block = file.read(60 * 1024)
if not block: # End of file reached
file.close()
break
yield block
@PluginManager.registerTo("SiteStorage")
class SiteStoragePlugin(object):
def isFile(self, inner_path):
if ".zip/" in inner_path or ".tar.gz/" in inner_path:
match = re.match("^(.*\.(?:tar.gz|tar.bz2|zip))/(.*)", inner_path)
archive_inner_path, path_within = match.groups()
return super(SiteStoragePlugin, self).isFile(archive_inner_path)
else:
return super(SiteStoragePlugin, self).isFile(inner_path)
def openArchive(self, inner_path):
archive_path = self.getPath(inner_path)
file_obj = None
if archive_path not in archive_cache:
if not os.path.isfile(archive_path):
result = self.site.needFile(inner_path, priority=10)
self.site.updateWebsocket(file_done=inner_path)
if not result:
raise Exception("Unable to download file")
file_obj = self.site.storage.openBigfile(inner_path)
try:
archive = openArchive(archive_path, file_obj=file_obj)
except Exception as err:
raise Exception("Unable to download file: %s" % err)
return archive
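# Walk the archive members under the given path and return their names relative to it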
def walk(self, inner_path, *args, **kwags):
if ".zip" in inner_path or ".tar.gz" in inner_path:
match = re.match("^(.*\.(?:tar.gz|tar.bz2|zip))(.*)", inner_path)
archive_inner_path, path_within = match.groups()
archive = self.openArchive(archive_inner_path)
path_within = path_within.lstrip("/")
if archive_inner_path.endswith(".zip"):
namelist = [name for name in archive.namelist() if not name.endswith("/")]
else:
namelist = [item.name for item in archive.getmembers() if not item.isdir()]
namelist_relative = []
for name in namelist:
if not name.startswith(path_within):
continue
name_relative = name.replace(path_within, "", 1).rstrip("/")
namelist_relative.append(name_relative)
return namelist_relative
else:
return super(SiteStoragePlugin, self).walk(inner_path, *args, **kwags)
def list(self, inner_path, *args, **kwags):
if ".zip" in inner_path or ".tar.gz" in inner_path:
match = re.match("^(.*\.(?:tar.gz|tar.bz2|zip))(.*)", inner_path)
archive_inner_path, path_within = match.groups()
archive = self.openArchive(archive_inner_path)
path_within = path_within.lstrip("/")
if archive_inner_path.endswith(".zip"):
namelist = [name for name in archive.namelist()]
else:
namelist = [item.name for item in archive.getmembers()]
namelist_relative = []
for name in namelist:
if not name.startswith(path_within):
continue
name_relative = name.replace(path_within, "", 1).rstrip("/")
if "/" in name_relative: # File is in sub-directory
continue
namelist_relative.append(name_relative)
return namelist_relative
else:
return super(SiteStoragePlugin, self).list(inner_path, *args, **kwags)
def read(self, inner_path, mode="r"):
if ".zip/" in inner_path or ".tar.gz/" in inner_path:
match = re.match("^(.*\.(?:tar.gz|tar.bz2|zip))(.*)", inner_path)
archive_inner_path, path_within = match.groups()
archive = self.openArchive(archive_inner_path)
path_within = path_within.lstrip("/")
if archive_inner_path.endswith(".zip"):
return archive.open(path_within).read()
else:
return archive.extractfile(path_within.encode("utf8")).read()
else:
return super(SiteStoragePlugin, self).read(inner_path, mode)

View file

@ -1 +0,0 @@
import FilePackPlugin

View file

@ -1,384 +0,0 @@
import re
import time
import copy
from Plugin import PluginManager
from Translate import Translate
from util import RateLimit
from util import helper
from Debug import Debug
try:
import OptionalManager.UiWebsocketPlugin # To make optionalFileInfo merger-site compatible
except Exception:
pass
if "merger_db" not in locals().keys(): # To keep merger_sites between module reloads
merger_db = {} # Sites that allowed to list other sites {address: [type1, type2...]}
merged_db = {} # Sites that allowed to be merged to other sites {address: type, ...}
merged_to_merger = {} # {address: [site1, site2, ...]} cache
site_manager = None # Site manager for merger sites
if "_" not in locals():
_ = Translate("plugins/MergerSite/languages/")
# Check if the merger site has permission to access the merged path
def checkMergerPath(address, inner_path):
merged_match = re.match("^merged-(.*?)/([A-Za-z0-9]{26,35})/", inner_path)
if merged_match:
merger_type = merged_match.group(1)
# Check if merged site is allowed to include other sites
if merger_type in merger_db.get(address, []):
# Check if the included site allows being included
merged_address = merged_match.group(2)
if merged_db.get(merged_address) == merger_type:
inner_path = re.sub("^merged-(.*?)/([A-Za-z0-9]{26,35})/", "", inner_path)
return merged_address, inner_path
else:
raise Exception(
"Merger site (%s) does not have permission for merged site: %s (%s)" %
(merger_type, merged_address, merged_db.get(merged_address))
)
else:
raise Exception("No merger (%s) permission to load: <br>%s (%s not in %s)" % (
address, inner_path, merger_type, merger_db.get(address, []))
)
else:
raise Exception("Invalid merger path: %s" % inner_path)
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
# Download new site
def actionMergerSiteAdd(self, to, addresses):
if type(addresses) != list:
# Single site add
addresses = [addresses]
# Check if the site has merger permission
merger_types = merger_db.get(self.site.address)
if not merger_types:
return self.response(to, {"error": "Not a merger site"})
if RateLimit.isAllowed(self.site.address + "-MergerSiteAdd", 10) and len(addresses) == 1:
# Skip the confirmation if adding a single site and not called in the last 10 sec
self.cbMergerSiteAdd(to, addresses)
else:
self.cmd(
"confirm",
[_["Add <b>%s</b> new site?"] % len(addresses), "Add"],
lambda (res): self.cbMergerSiteAdd(to, addresses)
)
self.response(to, "ok")
# Callback for the new site confirmation dialog
def cbMergerSiteAdd(self, to, addresses):
added = 0
for address in addresses:
added += 1
site_manager.need(address)
if added:
self.cmd("notification", ["done", _["Added <b>%s</b> new site"] % added, 5000])
RateLimit.called(self.site.address + "-MergerSiteAdd")
site_manager.updateMergerSites()
# Delete a merged site
def actionMergerSiteDelete(self, to, address):
site = self.server.sites.get(address)
if not site:
return self.response(to, {"error": "No site found: %s" % address})
merger_types = merger_db.get(self.site.address)
if not merger_types:
return self.response(to, {"error": "Not a merger site"})
if merged_db.get(address) not in merger_types:
return self.response(to, {"error": "Merged type (%s) not in %s" % (merged_db.get(address), merger_types)})
self.cmd("notification", ["done", _["Site deleted: <b>%s</b>"] % address, 5000])
self.response(to, "ok")
# Lists merged sites
def actionMergerSiteList(self, to, query_site_info=False):
merger_types = merger_db.get(self.site.address)
ret = {}
if not merger_types:
return self.response(to, {"error": "Not a merger site"})
for address, merged_type in merged_db.iteritems():
if merged_type not in merger_types:
continue # Site not for us
if query_site_info:
site = self.server.sites.get(address)
ret[address] = self.formatSiteInfo(site, create_user=False)
else:
ret[address] = merged_type
self.response(to, ret)
def hasSitePermission(self, address, *args, **kwargs):
if super(UiWebsocketPlugin, self).hasSitePermission(address, *args, **kwargs):
return True
else:
if self.site.address in [merger_site.address for merger_site in merged_to_merger.get(address, [])]:
return True
else:
return False
# Add merger site support to file commands
def mergerFuncWrapper(self, func_name, to, inner_path, *args, **kwargs):
if inner_path.startswith("merged-"):
merged_address, merged_inner_path = checkMergerPath(self.site.address, inner_path)
# Set the same cert for merged site
merger_cert = self.user.getSiteData(self.site.address).get("cert")
if merger_cert and self.user.getSiteData(merged_address).get("cert") != merger_cert:
self.user.setCert(merged_address, merger_cert)
req_self = copy.copy(self)
req_self.site = self.server.sites.get(merged_address) # Change the site to the merged one
func = getattr(super(UiWebsocketPlugin, req_self), func_name)
return func(to, merged_inner_path, *args, **kwargs)
else:
func = getattr(super(UiWebsocketPlugin, self), func_name)
return func(to, inner_path, *args, **kwargs)
def actionFileList(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionFileList", to, inner_path, *args, **kwargs)
def actionDirList(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionDirList", to, inner_path, *args, **kwargs)
def actionFileGet(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionFileGet", to, inner_path, *args, **kwargs)
def actionFileWrite(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionFileWrite", to, inner_path, *args, **kwargs)
def actionFileDelete(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionFileDelete", to, inner_path, *args, **kwargs)
def actionFileRules(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionFileRules", to, inner_path, *args, **kwargs)
def actionFileNeed(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionFileNeed", to, inner_path, *args, **kwargs)
def actionOptionalFileInfo(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionOptionalFileInfo", to, inner_path, *args, **kwargs)
def actionOptionalFileDelete(self, to, inner_path, *args, **kwargs):
return self.mergerFuncWrapper("actionOptionalFileDelete", to, inner_path, *args, **kwargs)
def actionBigfileUploadInit(self, to, inner_path, *args, **kwargs):
back = self.mergerFuncWrapper("actionBigfileUploadInit", to, inner_path, *args, **kwargs)
if inner_path.startswith("merged-"):
merged_address, merged_inner_path = checkMergerPath(self.site.address, inner_path)
back["inner_path"] = "merged-%s/%s/%s" % (merged_db[merged_address], merged_address, back["inner_path"])
return back
# Add merger site support to file commands with a privatekey parameter
def mergerFuncWrapperWithPrivatekey(self, func_name, to, privatekey, inner_path, *args, **kwargs):
func = getattr(super(UiWebsocketPlugin, self), func_name)
if inner_path.startswith("merged-"):
merged_address, merged_inner_path = checkMergerPath(self.site.address, inner_path)
merged_site = self.server.sites.get(merged_address)
# Set the same cert for merged site
merger_cert = self.user.getSiteData(self.site.address).get("cert")
if merger_cert:
self.user.setCert(merged_address, merger_cert)
site_before = self.site # Save it so we can change it back after the command runs
self.site = merged_site # Change the site to the merged one
try:
back = func(to, privatekey, merged_inner_path, *args, **kwargs)
finally:
self.site = site_before # Change back to original site
return back
else:
return func(to, privatekey, inner_path, *args, **kwargs)
def actionSiteSign(self, to, privatekey=None, inner_path="content.json", *args, **kwargs):
return self.mergerFuncWrapperWithPrivatekey("actionSiteSign", to, privatekey, inner_path, *args, **kwargs)
def actionSitePublish(self, to, privatekey=None, inner_path="content.json", *args, **kwargs):
return self.mergerFuncWrapperWithPrivatekey("actionSitePublish", to, privatekey, inner_path, *args, **kwargs)
def actionPermissionAdd(self, to, permission):
super(UiWebsocketPlugin, self).actionPermissionAdd(to, permission)
if permission.startswith("Merger"):
self.site.storage.rebuildDb()
def actionPermissionDetails(self, to, permission):
if not permission.startswith("Merger"):
return super(UiWebsocketPlugin, self).actionPermissionDetails(to, permission)
merger_type = permission.replace("Merger:", "")
if not re.match("^[A-Za-z0-9-]+$", merger_type):
raise Exception("Invalid merger_type: %s" % merger_type)
merged_sites = []
for address, merged_type in merged_db.iteritems():
if merged_type != merger_type:
continue
site = self.server.sites.get(address)
try:
merged_sites.append(site.content_manager.contents.get("content.json").get("title", address))
except Exception as err:
merged_sites.append(address)
details = _["Read and write permissions to sites with merged type of <b>%s</b> "] % merger_type
details += _["(%s sites)"] % len(merged_sites)
details += "<div style='white-space: normal; max-width: 400px'>%s</div>" % ", ".join(merged_sites)
self.response(to, details)
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
# Allow loading merged site files via /merged-ZeroMe/address/file.jpg
def parsePath(self, path):
path_parts = super(UiRequestPlugin, self).parsePath(path)
if "merged-" not in path: # Optimization
return path_parts
path_parts["address"], path_parts["inner_path"] = checkMergerPath(path_parts["address"], path_parts["inner_path"])
return path_parts
@PluginManager.registerTo("SiteStorage")
class SiteStoragePlugin(object):
# Also rebuild from merged sites
def getDbFiles(self):
merger_types = merger_db.get(self.site.address)
# First return the site's own db files
for item in super(SiteStoragePlugin, self).getDbFiles():
yield item
# Not a merger site, that's all
if not merger_types:
raise StopIteration
merged_sites = [
site_manager.sites[address]
for address, merged_type in merged_db.iteritems()
if merged_type in merger_types
]
found = 0
for merged_site in merged_sites:
self.log.debug("Loading merged site: %s" % merged_site)
merged_type = merged_db[merged_site.address]
for content_inner_path, content in merged_site.content_manager.contents.iteritems():
# content.json file itself
if merged_site.storage.isFile(content_inner_path): # Only if the content.json file exists on disk
merged_inner_path = "merged-%s/%s/%s" % (merged_type, merged_site.address, content_inner_path)
yield merged_inner_path, merged_site.storage.getPath(content_inner_path)
else:
merged_site.log.error("[MISSING] %s" % content_inner_path)
# Data files in content.json
content_inner_path_dir = helper.getDirname(content_inner_path) # Content.json dir relative to site
for file_relative_path in content.get("files", {}).keys() + content.get("files_optional", {}).keys():
if not file_relative_path.endswith(".json"):
continue # We are only interested in json files
file_inner_path = content_inner_path_dir + file_relative_path # File path relative to site dir
file_inner_path = file_inner_path.strip("/") # Strip leading/trailing /
if merged_site.storage.isFile(file_inner_path):
merged_inner_path = "merged-%s/%s/%s" % (merged_type, merged_site.address, file_inner_path)
yield merged_inner_path, merged_site.storage.getPath(file_inner_path)
else:
merged_site.log.error("[MISSING] %s" % file_inner_path)
found += 1
if found % 100 == 0:
time.sleep(0.000001) # Context switch to avoid UI block
# Also notify merger sites on a merged site file change
def onUpdated(self, inner_path, file=None):
super(SiteStoragePlugin, self).onUpdated(inner_path, file)
merged_type = merged_db.get(self.site.address)
for merger_site in merged_to_merger.get(self.site.address, []):
if merger_site.address == self.site.address: # Avoid infinite loop
continue
virtual_path = "merged-%s/%s/%s" % (merged_type, self.site.address, inner_path)
if inner_path.endswith(".json"):
if file is not None:
merger_site.storage.onUpdated(virtual_path, file=file)
else:
merger_site.storage.onUpdated(virtual_path, file=self.open(inner_path))
else:
merger_site.storage.onUpdated(virtual_path)
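# For illustration, a change in a merged site reaches its merger sites under a
# virtual path built as above (the address is a made-up placeholder):
print("merged-%s/%s/%s" % ("ZeroMe", "1MergedSiteAddressXXXXXXXXXXXXXX", "data/users/content.json"))
# -> merged-ZeroMe/1MergedSiteAddressXXXXXXXXXXXXXX/data/users/content.json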
@PluginManager.registerTo("Site")
class SitePlugin(object):
def fileDone(self, inner_path):
super(SitePlugin, self).fileDone(inner_path)
for merger_site in merged_to_merger.get(self.address, []):
if merger_site.address == self.address:
continue
for ws in merger_site.websockets:
ws.event("siteChanged", self, {"event": ["file_done", inner_path]})
def fileFailed(self, inner_path):
super(SitePlugin, self).fileFailed(inner_path)
for merger_site in merged_to_merger.get(self.address, []):
if merger_site.address == self.address:
continue
for ws in merger_site.websockets:
ws.event("siteChanged", self, {"event": ["file_failed", inner_path]})
@PluginManager.registerTo("SiteManager")
class SiteManagerPlugin(object):
# Update the merger/merged site caches for all site types
def updateMergerSites(self):
global merger_db, merged_db, merged_to_merger, site_manager
s = time.time()
merger_db = {}
merged_db = {}
merged_to_merger = {}
site_manager = self
if not self.sites:
return
for site in self.sites.itervalues():
# Update merged sites
try:
merged_type = site.content_manager.contents.get("content.json", {}).get("merged_type")
except Exception, err:
self.log.error("Error loading site %s: %s" % (site.address, Debug.formatException(err)))
continue
if merged_type:
merged_db[site.address] = merged_type
# Update merger sites
for permission in site.settings["permissions"]:
if not permission.startswith("Merger:"):
continue
if merged_type:
self.log.error(
"Removing permission %s from %s: Merger and merged at the same time." %
(permission, site.address)
)
site.settings["permissions"].remove(permission)
continue
merger_type = permission.replace("Merger:", "")
if site.address not in merger_db:
merger_db[site.address] = []
merger_db[site.address].append(merger_type)
site_manager.sites[site.address] = site
# Update merged to merger
if merged_type:
for merger_site in self.sites.itervalues():
if "Merger:" + merged_type in merger_site.settings["permissions"]:
if site.address not in merged_to_merger:
merged_to_merger[site.address] = []
merged_to_merger[site.address].append(merger_site)
self.log.debug("Updated merger sites in %.3fs" % (time.time() - s))
def load(self, *args, **kwargs):
super(SiteManagerPlugin, self).load(*args, **kwargs)
self.updateMergerSites()
def save(self, *args, **kwargs):
super(SiteManagerPlugin, self).save(*args, **kwargs)
self.updateMergerSites()

View file

@ -1 +0,0 @@
import MergerSitePlugin

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "¿Agregar <b>%s</b> nuevo sitio?",
"Added <b>%s</b> new site": "Sitio <b>%s</b> agregado",
"Site deleted: <b>%s</b>": "Sitio removido: <b>%s</b>"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "Ajouter le site <b>%s</b> ?",
"Added <b>%s</b> new site": "Site <b>%s</b> ajouté",
"Site deleted: <b>%s</b>": "Site <b>%s</b> supprimé"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "Új oldal hozzáadása: <b>%s</b>?",
"Added <b>%s</b> new site": "Új oldal hozzáadva: <b>%s</b>",
"Site deleted: <b>%s</b>": "Oldal törölve: <b>%s</b>"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "Aggiungere <b>%s</b> nuovo sito ?",
"Added <b>%s</b> new site": "Sito <b>%s</b> aggiunto",
"Site deleted: <b>%s</b>": "Sito <b>%s</b> eliminato"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "Adicionar <b>%s</b> novo site?",
"Added <b>%s</b> new site": "Site <b>%s</b> adicionado",
"Site deleted: <b>%s</b>": "Site removido: <b>%s</b>"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "<b>%s</b> sitesi eklensin mi?",
"Added <b>%s</b> new site": "<b>%s</b> sitesi eklendi",
"Site deleted: <b>%s</b>": "<b>%s</b> sitesi silindi"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "添加新網站: <b>%s</b>",
"Added <b>%s</b> new site": "已添加到新網站:<b>%s</b>",
"Site deleted: <b>%s</b>": "網站已刪除:<b>%s</b>"
}

View file

@ -1,5 +0,0 @@
{
"Add <b>%s</b> new site?": "添加新站点: <b>%s</b>",
"Added <b>%s</b> new site": "已添加到新站点:<b>%s</b>",
"Site deleted: <b>%s</b>": "站点已删除:<b>%s</b>"
}

View file

@ -1,188 +0,0 @@
import time
import re
from Plugin import PluginManager
from Db import DbQuery
from Debug import Debug
from util import helper
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def formatSiteInfo(self, site, create_user=True):
site_info = super(UiWebsocketPlugin, self).formatSiteInfo(site, create_user=create_user)
feed_following = self.user.sites.get(site.address, {}).get("follow", None)
if feed_following is None:
site_info["feed_follow_num"] = None
else:
site_info["feed_follow_num"] = len(feed_following)
return site_info
def actionFeedFollow(self, to, feeds):
self.user.setFeedFollow(self.site.address, feeds)
self.user.save()
self.response(to, "ok")
def actionFeedListFollow(self, to):
feeds = self.user.sites[self.site.address].get("follow", {})
self.response(to, feeds)
def actionFeedQuery(self, to, limit=10, day_limit=3):
if "ADMIN" not in self.site.settings["permissions"]:
return self.response(to, "FeedQuery not allowed")
from Site import SiteManager
rows = []
stats = []
total_s = time.time()
num_sites = 0
for address, site_data in self.user.sites.items():
feeds = site_data.get("follow")
if not feeds:
continue
if type(feeds) is not dict:
self.log.debug("Invalid feed for site %s" % address)
continue
num_sites += 1
for name, query_set in feeds.iteritems():
site = SiteManager.site_manager.get(address)
if not site or not site.storage.has_db:
continue
s = time.time()
try:
query_raw, params = query_set
query_parts = re.split(r"UNION(?:\s+ALL|)", query_raw)
for i, query_part in enumerate(query_parts):
db_query = DbQuery(query_part)
if day_limit:
where = " WHERE %s > strftime('%%s', 'now', '-%s day')" % (db_query.fields.get("date_added", "date_added"), day_limit)
if "WHERE" in query_part:
query_part = re.sub("WHERE (.*?)(?=$| GROUP BY)", where+" AND (\\1)", query_part)
else:
query_part += where
query_parts[i] = query_part
query = " UNION ".join(query_parts)
if ":params" in query:
query_params = map(helper.sqlquote, params)
query = query.replace(":params", ",".join(query_params))
res = site.storage.query(query + " ORDER BY date_added DESC LIMIT %s" % limit)
except Exception as err: # Log error
self.log.error("%s feed query %s error: %s" % (address, name, Debug.formatException(err)))
stats.append({"site": site.address, "feed_name": name, "error": str(err)})
continue
for row in res:
row = dict(row)
if "date_added" not in row or not isinstance(row["date_added"], (int, long, float, complex)):
self.log.debug("Invalid date_added from site %s: %r" % (address, row.get("date_added")))
continue
if row["date_added"] > 1000000000000: # Formatted as milliseconds
row["date_added"] = row["date_added"] / 1000
if row["date_added"] > time.time() + 120:
self.log.debug("Newsfeed item from the future from site %s" % address)
continue # Feed item is in the future, skip it
row["site"] = address
row["feed_name"] = name
rows.append(row)
stats.append({"site": site.address, "feed_name": name, "taken": round(time.time() - s, 3)})
time.sleep(0.0001)
return self.response(to, {"rows": rows, "stats": stats, "num": len(rows), "sites": num_sites, "taken": round(time.time() - total_s, 3)})
def parseSearch(self, search):
parts = re.split("(site|type):", search)
if len(parts) > 1: # Found filter
search_text = parts[0]
parts = [part.strip() for part in parts]
filters = dict(zip(parts[1::2], parts[2::2]))
else:
search_text = search
filters = {}
return [search_text, filters]
def actionFeedSearch(self, to, search, limit=30, day_limit=30):
if "ADMIN" not in self.site.settings["permissions"]:
return self.response(to, "FeedSearch not allowed")
from Site import SiteManager
rows = []
stats = []
num_sites = 0
total_s = time.time()
search_text, filters = self.parseSearch(search)
for address, site in SiteManager.site_manager.list().iteritems():
if not site.storage.has_db:
continue
if "site" in filters:
if filters["site"].lower() not in [site.address, site.content_manager.contents["content.json"].get("title").lower()]:
continue
if site.storage.db: # Database loaded
feeds = site.storage.db.schema.get("feeds")
else:
try:
feeds = site.storage.loadJson("dbschema.json").get("feeds")
except Exception:
continue
if not feeds:
continue
num_sites += 1
for name, query in feeds.iteritems():
s = time.time()
try:
db_query = DbQuery(query)
params = []
# Filters
if search_text:
db_query.wheres.append("(%s LIKE ? OR %s LIKE ?)" % (db_query.fields["body"], db_query.fields["title"]))
search_like = "%" + search_text.replace(" ", "%") + "%"
params.append(search_like)
params.append(search_like)
if filters.get("type") and filters["type"] not in query:
continue
if day_limit:
db_query.wheres.append(
"%s > strftime('%%s', 'now', '-%s day')" % (db_query.fields.get("date_added", "date_added"), day_limit)
)
# Order
db_query.parts["ORDER BY"] = "date_added DESC"
db_query.parts["LIMIT"] = str(limit)
res = site.storage.query(str(db_query), params)
except Exception, err:
self.log.error("%s feed query %s error: %s" % (address, name, Debug.formatException(err)))
stats.append({"site": site.address, "feed_name": name, "error": str(err), "query": query})
continue
for row in res:
row = dict(row)
if row["date_added"] > time.time() + 120:
continue # Feed item is in the future, skip it
row["site"] = address
row["feed_name"] = name
rows.append(row)
stats.append({"site": site.address, "feed_name": name, "taken": round(time.time() - s, 3)})
return self.response(to, {"rows": rows, "num": len(rows), "sites": num_sites, "taken": round(time.time() - total_s, 3), "stats": stats})
@PluginManager.registerTo("User")
class UserPlugin(object):
# Set queries that user follows
def setFeedFollow(self, address, feeds):
site_data = self.getSiteData(address)
site_data["follow"] = feeds
self.save()
return site_data

View file

@ -1 +0,0 @@
import NewsfeedPlugin

View file

@ -1,422 +0,0 @@
import time
import collections
import itertools
import re
import gevent
from util import helper
from Plugin import PluginManager
from Config import config
from Debug import Debug
if "content_db" not in locals().keys(): # To keep between module reloads
content_db = None
@PluginManager.registerTo("ContentDb")
class ContentDbPlugin(object):
def __init__(self, *args, **kwargs):
global content_db
content_db = self
self.filled = {} # Site addresses that already filled from content.json
self.need_filling = False # file_optional table just created, fill data from content.json files
self.time_peer_numbers_updated = 0
self.my_optional_files = {} # Last 50 site_address/inner_path called by fileWrite (auto-pinning these files)
self.optional_files = collections.defaultdict(dict)
self.optional_files_loading = False
helper.timer(60 * 5, self.checkOptionalLimit)
super(ContentDbPlugin, self).__init__(*args, **kwargs)
def getSchema(self):
schema = super(ContentDbPlugin, self).getSchema()
# Need file_optional table
schema["tables"]["file_optional"] = {
"cols": [
["file_id", "INTEGER PRIMARY KEY UNIQUE NOT NULL"],
["site_id", "INTEGER REFERENCES site (site_id) ON DELETE CASCADE"],
["inner_path", "TEXT"],
["hash_id", "INTEGER"],
["size", "INTEGER"],
["peer", "INTEGER DEFAULT 0"],
["uploaded", "INTEGER DEFAULT 0"],
["is_downloaded", "INTEGER DEFAULT 0"],
["is_pinned", "INTEGER DEFAULT 0"],
["time_added", "INTEGER DEFAULT 0"],
["time_downloaded", "INTEGER DEFAULT 0"],
["time_accessed", "INTEGER DEFAULT 0"]
],
"indexes": [
"CREATE UNIQUE INDEX file_optional_key ON file_optional (site_id, inner_path)",
"CREATE INDEX is_downloaded ON file_optional (is_downloaded)"
],
"schema_changed": 11
}
return schema
def initSite(self, site):
super(ContentDbPlugin, self).initSite(site)
if self.need_filling:
self.fillTableFileOptional(site)
if not self.optional_files_loading:
gevent.spawn_later(1, self.loadFilesOptional)
self.optional_files_loading = True
def checkTables(self):
changed_tables = super(ContentDbPlugin, self).checkTables()
if "file_optional" in changed_tables:
self.need_filling = True
return changed_tables
# Load the stored optional file path endings
def loadFilesOptional(self):
s = time.time()
num = 0
total = 0
total_downloaded = 0
res = content_db.execute("SELECT site_id, inner_path, size, is_downloaded FROM file_optional")
site_sizes = collections.defaultdict(lambda: collections.defaultdict(int))
for row in res:
self.optional_files[row["site_id"]][row["inner_path"][-8:]] = 1
num += 1
# Update site size stats
site_sizes[row["site_id"]]["size_optional"] += row["size"]
if row["is_downloaded"]:
site_sizes[row["site_id"]]["optional_downloaded"] += row["size"]
# Set site size stats in sites.json settings
site_ids_reverse = {val: key for key, val in self.site_ids.iteritems()}
for site_id, stats in site_sizes.iteritems():
site_address = site_ids_reverse.get(site_id)
if not site_address:
self.log.error("Not found site_id: %s" % site_id)
continue
site = self.sites[site_address]
site.settings["size_optional"] = stats["size_optional"]
site.settings["optional_downloaded"] = stats["optional_downloaded"]
total += stats["size_optional"]
total_downloaded += stats["optional_downloaded"]
self.log.debug(
"Loaded %s optional files: %.2fMB, downloaded: %.2fMB in %.3fs" %
(num, float(total) / 1024 / 1024, float(total_downloaded) / 1024 / 1024, time.time() - s)
)
if self.need_filling and self.getOptionalLimitBytes() >= 0 and self.getOptionalLimitBytes() < total_downloaded:
limit_bytes = self.getOptionalLimitBytes()
limit_new = round((float(total_downloaded) / 1024 / 1024 / 1024) * 1.1, 2) # Downloaded size + 10%
self.log.debug(
"First startup after update and limit is smaller than downloaded files size (%.2fGB), increasing it from %.2fGB to %.2fGB" %
(float(total_downloaded) / 1024 / 1024 / 1024, float(limit_bytes) / 1024 / 1024 / 1024, limit_new)
)
config.saveValue("optional_limit", limit_new)
config.optional_limit = str(limit_new)
# Predicts if the file is optional
def isOptionalFile(self, site_id, inner_path):
return self.optional_files[site_id].get(inner_path[-8:])
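# The index above keys optional files on the last 8 characters of inner_path: a
# memory-cheap membership test that tolerates false positives by design. A
# standalone sketch (site id and file name are made-up):
import collections

optional_files = collections.defaultdict(dict)
optional_files[1]["data/img/photo.jpg"[-8:]] = 1
print(bool(optional_files[1].get("data/img/photo.jpg"[-8:])))  # True, matched on the "hoto.jpg" suffix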
# Fill file_optional table with optional files found in sites
def fillTableFileOptional(self, site):
s = time.time()
site_id = self.site_ids.get(site.address)
if not site_id:
return False
cur = self.getCursor()
cur.execute("BEGIN")
res = cur.execute("SELECT * FROM content WHERE size_files_optional > 0 AND site_id = %s" % site_id)
num = 0
for row in res.fetchall():
content = site.content_manager.contents[row["inner_path"]]
try:
num += self.setContentFilesOptional(site, row["inner_path"], content, cur=cur)
except Exception as err:
self.log.error("Error loading %s into file_optional: %s" % (row["inner_path"], err))
cur.execute("COMMIT")
cur.close()
# Set my files to pinned
from User import UserManager
user = UserManager.user_manager.get()
if not user:
user = UserManager.user_manager.create()
auth_address = user.getAuthAddress(site.address)
self.execute(
"UPDATE file_optional SET is_pinned = 1 WHERE site_id = :site_id AND inner_path LIKE :inner_path",
{"site_id": site_id, "inner_path": "%%/%s/%%" % auth_address}
)
self.log.debug(
"Filled file_optional table for %s in %.3fs (loaded: %s, is_pinned: %s)" %
(site.address, time.time() - s, num, self.cur.cursor.rowcount)
)
self.filled[site.address] = True
def setContentFilesOptional(self, site, content_inner_path, content, cur=None):
if not cur:
cur = self
try:
cur.execute("BEGIN")
except Exception as err:
self.log.warning("Transaction begin error %s %s: %s" % (site, content_inner_path, Debug.formatException(err)))
num = 0
site_id = self.site_ids[site.address]
content_inner_dir = helper.getDirname(content_inner_path)
for relative_inner_path, file in content.get("files_optional", {}).iteritems():
file_inner_path = content_inner_dir + relative_inner_path
hash_id = int(file["sha512"][0:4], 16)
if hash_id in site.content_manager.hashfield:
is_downloaded = 1
else:
is_downloaded = 0
if site.address + "/" + content_inner_dir in self.my_optional_files:
is_pinned = 1
else:
is_pinned = 0
cur.insertOrUpdate("file_optional", {
"hash_id": hash_id,
"size": int(file["size"])
}, {
"site_id": site_id,
"inner_path": file_inner_path
}, oninsert={
"time_added": int(time.time()),
"time_downloaded": int(time.time()) if is_downloaded else 0,
"is_downloaded": is_downloaded,
"peer": is_downloaded,
"is_pinned": is_pinned
})
self.optional_files[site_id][file_inner_path[-8:]] = 1
num += 1
if cur == self:
try:
cur.execute("END")
except Exception as err:
self.log.warning("Transaction end error %s %s: %s" % (site, content_inner_path, Debug.formatException(err)))
return num
def setContent(self, site, inner_path, content, size=0):
super(ContentDbPlugin, self).setContent(site, inner_path, content, size=size)
old_content = site.content_manager.contents.get(inner_path, {})
if (not self.need_filling or self.filled.get(site.address)) and ("files_optional" in content or "files_optional" in old_content):
self.setContentFilesOptional(site, inner_path, content)
# Check deleted files
if old_content:
old_files = old_content.get("files_optional", {}).keys()
new_files = content.get("files_optional", {}).keys()
content_inner_dir = helper.getDirname(inner_path)
deleted = [content_inner_dir + key for key in old_files if key not in new_files]
if deleted:
site_id = self.site_ids[site.address]
self.execute("DELETE FROM file_optional WHERE ?", {"site_id": site_id, "inner_path": deleted})
def deleteContent(self, site, inner_path):
content = site.content_manager.contents.get(inner_path)
if content and "files_optional" in content:
site_id = self.site_ids[site.address]
content_inner_dir = helper.getDirname(inner_path)
optional_inner_paths = [
content_inner_dir + relative_inner_path
for relative_inner_path in content.get("files_optional", {}).keys()
]
self.execute("DELETE FROM file_optional WHERE ?", {"site_id": site_id, "inner_path": optional_inner_paths})
super(ContentDbPlugin, self).deleteContent(site, inner_path)
def updatePeerNumbers(self):
s = time.time()
num_file = 0
num_updated = 0
num_site = 0
for site in self.sites.values():
if not site.content_manager.has_optional_files:
continue
if not site.settings["serving"]:
continue
has_updated_hashfield = next((
peer
for peer in site.peers.itervalues()
if peer.has_hashfield and peer.hashfield.time_changed > self.time_peer_numbers_updated
), None)
if not has_updated_hashfield and site.content_manager.hashfield.time_changed < self.time_peer_numbers_updated:
continue
hashfield_peers = itertools.chain.from_iterable(
peer.hashfield.storage
for peer in site.peers.itervalues()
if peer.has_hashfield
)
peer_nums = collections.Counter(
itertools.chain(
hashfield_peers,
site.content_manager.hashfield
)
)
site_id = self.site_ids[site.address]
if not site_id:
continue
res = self.execute("SELECT file_id, hash_id, peer FROM file_optional WHERE ?", {"site_id": site_id})
updates = {}
for row in res:
peer_num = peer_nums.get(row["hash_id"], 0)
if peer_num != row["peer"]:
updates[row["file_id"]] = peer_num
self.execute("BEGIN")
for file_id, peer_num in updates.iteritems():
self.execute("UPDATE file_optional SET peer = ? WHERE file_id = ?", (peer_num, file_id))
self.execute("END")
num_updated += len(updates)
num_file += len(peer_nums)
num_site += 1
self.time_peer_numbers_updated = time.time()
self.log.debug("%s/%s peer number for %s site updated in %.3fs" % (num_updated, num_file, num_site, time.time() - s))
def queryDeletableFiles(self):
# First return files with at least 10 seeders that have not been accessed recently
query = """
SELECT * FROM file_optional
WHERE peer > 10 AND %s
ORDER BY time_accessed < %s DESC, uploaded / size
""" % (self.getOptionalUsedWhere(), int(time.time() - 60 * 60 * 7))
limit_start = 0
while 1:
num = 0
res = self.execute("%s LIMIT %s, 50" % (query, limit_start))
for row in res:
yield row
num += 1
if num < 50:
break
limit_start += 50
self.log.debug("queryDeletableFiles returning less-seeded files")
# Then return files with fewer seeders that still have not been accessed recently
query = """
SELECT * FROM file_optional
WHERE peer <= 10 AND %s
ORDER BY peer DESC, time_accessed < %s DESC, uploaded / size
""" % (self.getOptionalUsedWhere(), int(time.time() - 60 * 60 * 7))
limit_start = 0
while 1:
num = 0
res = self.execute("%s LIMIT %s, 50" % (query, limit_start))
for row in res:
yield row
num += 1
if num < 50:
break
limit_start += 50
self.log.debug("queryDeletableFiles returning everyting")
# At the end return all files
query = """
SELECT * FROM file_optional
WHERE peer <= 10 AND %s
ORDER BY peer DESC, time_accessed, uploaded / size
""" % self.getOptionalUsedWhere()
limit_start = 0
while 1:
num = 0
res = self.execute("%s LIMIT %s, 50" % (query, limit_start))
for row in res:
yield row
num += 1
if num < 50:
break
limit_start += 50
def getOptionalLimitBytes(self):
if config.optional_limit.endswith("%"):
limit_percent = float(re.sub("[^0-9.]", "", config.optional_limit))
limit_bytes = helper.getFreeSpace() * (limit_percent / 100)
else:
limit_bytes = float(re.sub("[^0-9.]", "", config.optional_limit)) * 1024 * 1024 * 1024
return limit_bytes
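# For concreteness: the limit is either a percentage of free disk space or an
# absolute size in GB. A standalone sketch (the free-space figure stands in for
# helper.getFreeSpace()):
import re

def optionalLimitBytes(optional_limit, free_space_bytes):
    if optional_limit.endswith("%"):
        limit_percent = float(re.sub(r"[^0-9.]", "", optional_limit))
        return free_space_bytes * (limit_percent / 100)
    return float(re.sub(r"[^0-9.]", "", optional_limit)) * 1024 * 1024 * 1024

print(optionalLimitBytes("10%", 100 * 1024 ** 3))  # 10% of 100GB free -> 10737418240.0 bytes
print(optionalLimitBytes("5GB", 100 * 1024 ** 3))  # absolute -> 5368709120.0 bytes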
def getOptionalUsedWhere(self):
maxsize = config.optional_limit_exclude_minsize * 1024 * 1024
query = "is_downloaded = 1 AND is_pinned = 0 AND size < %s" % maxsize
# Don't delete optional files from owned sites
my_site_ids = []
for address, site in self.sites.items():
if site.settings["own"]:
my_site_ids.append(str(self.site_ids[address]))
if my_site_ids:
query += " AND site_id NOT IN (%s)" % ", ".join(my_site_ids)
return query
def getOptionalUsedBytes(self):
size = self.execute("SELECT SUM(size) FROM file_optional WHERE %s" % self.getOptionalUsedWhere()).fetchone()[0]
if not size:
size = 0
return size
def getOptionalNeedDelete(self, size):
if config.optional_limit.endswith("%"):
limit_percent = float(re.sub("[^0-9.]", "", config.optional_limit))
need_delete = size - ((helper.getFreeSpace() + size) * (limit_percent / 100))
else:
need_delete = size - self.getOptionalLimitBytes()
return need_delete
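# Worked example of the percentage branch above with made-up numbers: 30GB of
# optional files, 70GB still free, limit "10%". If everything were deleted there
# would be 100GB free, so the target is 10GB and 20GB must go.
size = 30.0  # GB
need_delete = size - ((70.0 + size) * (10.0 / 100))
print(need_delete)  # -> 20.0 (GB)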
def checkOptionalLimit(self, limit=None):
if not limit:
limit = self.getOptionalLimitBytes()
if limit < 0:
self.log.debug("Invalid limit for optional files: %s" % limit)
return False
size = self.getOptionalUsedBytes()
need_delete = self.getOptionalNeedDelete(size)
self.log.debug(
"Optional size: %.1fMB/%.1fMB, Need delete: %.1fMB" %
(float(size) / 1024 / 1024, float(limit) / 1024 / 1024, float(need_delete) / 1024 / 1024)
)
if need_delete <= 0:
return False
self.updatePeerNumbers()
site_ids_reverse = {val: key for key, val in self.site_ids.iteritems()}
deleted_file_ids = []
for row in self.queryDeletableFiles():
site_address = site_ids_reverse.get(row["site_id"])
site = self.sites.get(site_address)
if not site:
self.log.error("No site found for id: %s" % row["site_id"])
continue
site.log.debug("Deleting %s %.3f MB left" % (row["inner_path"], float(need_delete) / 1024 / 1024))
deleted_file_ids.append(row["file_id"])
try:
site.content_manager.optionalRemoved(row["inner_path"], row["hash_id"], row["size"])
site.storage.delete(row["inner_path"])
need_delete -= row["size"]
except Exception as err:
site.log.error("Error deleting %s: %s" % (row["inner_path"], err))
if need_delete <= 0:
break
cur = self.getCursor()
cur.execute("BEGIN")
for file_id in deleted_file_ids:
cur.execute("UPDATE file_optional SET is_downloaded = 0, is_pinned = 0, peer = peer - 1 WHERE ?", {"file_id": file_id})
cur.execute("COMMIT")
cur.close()

View file

@ -1,229 +0,0 @@
import time
import re
import collections
import gevent
from util import helper
from Plugin import PluginManager
import ContentDbPlugin
# We can only import plugin host classes after the plugins are loaded
@PluginManager.afterLoad
def importPluginnedClasses():
global config
from Config import config
def processAccessLog():
if access_log:
content_db = ContentDbPlugin.content_db
now = int(time.time())
num = 0
for site_id in access_log:
content_db.execute(
"UPDATE file_optional SET time_accessed = %s WHERE ?" % now,
{"site_id": site_id, "inner_path": access_log[site_id].keys()}
)
num += len(access_log[site_id])
access_log.clear()
def processRequestLog():
if request_log:
content_db = ContentDbPlugin.content_db
cur = content_db.getCursor()
num = 0
cur.execute("BEGIN")
for site_id in request_log:
for inner_path, uploaded in request_log[site_id].iteritems():
content_db.execute(
"UPDATE file_optional SET uploaded = uploaded + %s WHERE ?" % uploaded,
{"site_id": site_id, "inner_path": inner_path}
)
num += 1
cur.execute("END")
request_log.clear()
if "access_log" not in locals().keys(): # To keep between module reloads
access_log = collections.defaultdict(dict) # {site_id: {inner_path1: 1, inner_path2: 1...}}
request_log = collections.defaultdict(lambda: collections.defaultdict(int)) # {site_id: {inner_path1: bytes_sent, inner_path2: bytes_sent...}}
helper.timer(61, processAccessLog)
helper.timer(60, processRequestLog)
@PluginManager.registerTo("ContentManager")
class ContentManagerPlugin(object):
def __init__(self, *args, **kwargs):
self.cache_is_pinned = {}
super(ContentManagerPlugin, self).__init__(*args, **kwargs)
def optionalDownloaded(self, inner_path, hash_id, size=None, own=False):
if "|" in inner_path: # Big file piece
file_inner_path, file_range = inner_path.split("|")
else:
file_inner_path = inner_path
self.contents.db.executeDelayed(
"UPDATE file_optional SET time_downloaded = :now, is_downloaded = 1, peer = peer + 1 WHERE site_id = :site_id AND inner_path = :inner_path AND is_downloaded = 0",
{"now": int(time.time()), "site_id": self.contents.db.site_ids[self.site.address], "inner_path": file_inner_path}
)
return super(ContentManagerPlugin, self).optionalDownloaded(inner_path, hash_id, size, own)
def optionalRemoved(self, inner_path, hash_id, size=None):
self.contents.db.execute(
"UPDATE file_optional SET is_downloaded = 0, is_pinned = 0, peer = peer - 1 WHERE site_id = :site_id AND inner_path = :inner_path AND is_downloaded = 1",
{"site_id": self.contents.db.site_ids[self.site.address], "inner_path": inner_path}
)
if self.contents.db.cur.cursor.rowcount > 0:
back = super(ContentManagerPlugin, self).optionalRemoved(inner_path, hash_id, size)
# Re-add to hashfield if we have another file with the same hash_id
if self.isDownloaded(hash_id=hash_id, force_check_db=True):
self.hashfield.appendHashId(hash_id)
return back
def isDownloaded(self, inner_path=None, hash_id=None, force_check_db=False):
if hash_id and not force_check_db and hash_id not in self.hashfield:
return False
if inner_path:
res = self.contents.db.execute(
"SELECT is_downloaded FROM file_optional WHERE site_id = :site_id AND inner_path = :inner_path LIMIT 1",
{"site_id": self.contents.db.site_ids[self.site.address], "inner_path": inner_path}
)
else:
res = self.contents.db.execute(
"SELECT is_downloaded FROM file_optional WHERE site_id = :site_id AND hash_id = :hash_id AND is_downloaded = 1 LIMIT 1",
{"site_id": self.contents.db.site_ids[self.site.address], "hash_id": hash_id}
)
row = res.fetchone()
if row and row[0]:
return True
else:
return False
def isPinned(self, inner_path):
if inner_path in self.cache_is_pinned:
self.site.log.debug("Cached is pinned: %s" % inner_path)
return self.cache_is_pinned[inner_path]
res = self.contents.db.execute(
"SELECT is_pinned FROM file_optional WHERE site_id = :site_id AND inner_path = :inner_path LIMIT 1",
{"site_id": self.contents.db.site_ids[self.site.address], "inner_path": inner_path}
)
row = res.fetchone()
if row and row[0]:
is_pinned = True
else:
is_pinned = False
self.cache_is_pinned[inner_path] = is_pinned
self.site.log.debug("Cache set is pinned: %s %s" % (inner_path, is_pinned))
return is_pinned
def setPin(self, inner_path, is_pinned):
content_db = self.contents.db
site_id = content_db.site_ids[self.site.address]
content_db.execute("UPDATE file_optional SET is_pinned = %d WHERE ?" % is_pinned, {"site_id": site_id, "inner_path": inner_path})
self.cache_is_pinned = {}
def optionalDelete(self, inner_path):
if self.isPinned(inner_path):
self.site.log.debug("Skip deleting pinned optional file: %s" % inner_path)
return False
else:
return super(ContentManagerPlugin, self).optionalDelete(inner_path)
@PluginManager.registerTo("WorkerManager")
class WorkerManagerPlugin(object):
def doneTask(self, task):
super(WorkerManagerPlugin, self).doneTask(task)
if task["optional_hash_id"] and not self.tasks: # Execute delayed queries immedietly after tasks finished
ContentDbPlugin.content_db.processDelayed()
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
def parsePath(self, path):
global access_log
path_parts = super(UiRequestPlugin, self).parsePath(path)
if path_parts:
site_id = ContentDbPlugin.content_db.site_ids.get(path_parts["request_address"])
if site_id:
if ContentDbPlugin.content_db.isOptionalFile(site_id, path_parts["inner_path"]):
access_log[site_id][path_parts["inner_path"]] = 1
return path_parts
@PluginManager.registerTo("FileRequest")
class FileRequestPlugin(object):
def actionGetFile(self, params):
stats = super(FileRequestPlugin, self).actionGetFile(params)
self.recordFileRequest(params["site"], params["inner_path"], stats)
return stats
def actionStreamFile(self, params):
stats = super(FileRequestPlugin, self).actionStreamFile(params)
self.recordFileRequest(params["site"], params["inner_path"], stats)
return stats
def recordFileRequest(self, site_address, inner_path, stats):
if not stats:
# Nothing was sent, nothing to record
return False
site_id = ContentDbPlugin.content_db.site_ids[site_address]
if site_id and ContentDbPlugin.content_db.isOptionalFile(site_id, inner_path):
request_log[site_id][inner_path] += stats["bytes_sent"]
@PluginManager.registerTo("Site")
class SitePlugin(object):
def isDownloadable(self, inner_path):
is_downloadable = super(SitePlugin, self).isDownloadable(inner_path)
if is_downloadable:
return is_downloadable
for path in self.settings.get("optional_help", {}).iterkeys():
if inner_path.startswith(path):
return True
return False
def fileForgot(self, inner_path):
if "|" in inner_path and self.content_manager.isPinned(re.sub(r"\|.*", "", inner_path)):
self.log.debug("File %s is pinned, no fileForgot" % inner_path)
return False
else:
return super(SitePlugin, self).fileForgot(inner_path)
def fileDone(self, inner_path):
if "|" in inner_path and self.bad_files.get(inner_path, 0) > 5: # Idle optional file done
inner_path_file = re.sub(r"\|.*", "", inner_path)
num_changed = 0
for key, val in self.bad_files.items():
if key.startswith(inner_path_file) and val > 1:
self.bad_files[key] = 1
num_changed += 1
self.log.debug("Idle optional file piece done, changed retry number of %s pieces." % num_changed)
if num_changed:
gevent.spawn(self.retryBadFiles)
return super(SitePlugin, self).fileDone(inner_path)
@PluginManager.registerTo("ConfigPlugin")
class ConfigPlugin(object):
def createArguments(self):
group = self.parser.add_argument_group("OptionalManager plugin")
group.add_argument('--optional_limit', help='Limit total size of optional files', default="10%", metavar="GB or free space %")
group.add_argument('--optional_limit_exclude_minsize', help='Exclude files larger than this limit from optional size limit calculation', default=20, metavar="MB", type=int)
return super(ConfigPlugin, self).createArguments()

View file

@ -1,148 +0,0 @@
import hashlib
import os
import copy
import json
from cStringIO import StringIO
import pytest
from OptionalManager import OptionalManagerPlugin
from util import helper
from Crypt import CryptBitcoin
@pytest.mark.usefixtures("resetSettings")
class TestOptionalManager:
def testDbFill(self, site):
contents = site.content_manager.contents
assert len(site.content_manager.hashfield) > 0
assert contents.db.execute("SELECT COUNT(*) FROM file_optional WHERE is_downloaded = 1").fetchone()[0] == len(site.content_manager.hashfield)
def testSetContent(self, site):
contents = site.content_manager.contents
# Add new file
new_content = copy.deepcopy(contents["content.json"])
new_content["files_optional"]["testfile"] = {
"size": 1234,
"sha512": "aaaabbbbcccc"
}
num_optional_files_before = contents.db.execute("SELECT COUNT(*) FROM file_optional").fetchone()[0]
contents["content.json"] = new_content
assert contents.db.execute("SELECT COUNT(*) FROM file_optional").fetchone()[0] > num_optional_files_before
# Remove file
new_content = copy.deepcopy(contents["content.json"])
del new_content["files_optional"]["testfile"]
num_optional_files_before = contents.db.execute("SELECT COUNT(*) FROM file_optional").fetchone()[0]
contents["content.json"] = new_content
assert contents.db.execute("SELECT COUNT(*) FROM file_optional").fetchone()[0] < num_optional_files_before
def testDeleteContent(self, site):
contents = site.content_manager.contents
num_optional_files_before = contents.db.execute("SELECT COUNT(*) FROM file_optional").fetchone()[0]
del contents["content.json"]
assert contents.db.execute("SELECT COUNT(*) FROM file_optional").fetchone()[0] < num_optional_files_before
def testVerifyFiles(self, site):
contents = site.content_manager.contents
# Add new file
new_content = copy.deepcopy(contents["content.json"])
new_content["files_optional"]["testfile"] = {
"size": 1234,
"sha512": "aaaabbbbcccc"
}
contents["content.json"] = new_content
file_row = contents.db.execute("SELECT * FROM file_optional WHERE inner_path = 'testfile'").fetchone()
assert not file_row["is_downloaded"]
# Write file from outside of ZeroNet
site.storage.open("testfile", "wb").write("A" * 1234) # For quick check hash does not matter only file size
hashfield_len_before = len(site.content_manager.hashfield)
site.storage.verifyFiles(quick_check=True)
assert len(site.content_manager.hashfield) == hashfield_len_before + 1
file_row = contents.db.execute("SELECT * FROM file_optional WHERE inner_path = 'testfile'").fetchone()
assert file_row["is_downloaded"]
# Delete file outside of ZeroNet
site.storage.delete("testfile")
site.storage.verifyFiles(quick_check=True)
file_row = contents.db.execute("SELECT * FROM file_optional WHERE inner_path = 'testfile'").fetchone()
assert not file_row["is_downloaded"]
def testVerifyFilesSameHashId(self, site):
contents = site.content_manager.contents
new_content = copy.deepcopy(contents["content.json"])
# Add two files with the same hash_id (first 4 characters of sha512)
new_content["files_optional"]["testfile1"] = {
"size": 1234,
"sha512": "aaaabbbbcccc"
}
new_content["files_optional"]["testfile2"] = {
"size": 2345,
"sha512": "aaaabbbbdddd"
}
contents["content.json"] = new_content
assert site.content_manager.hashfield.getHashId("aaaabbbbcccc") == site.content_manager.hashfield.getHashId("aaaabbbbdddd")
# Write files from outside of ZeroNet (for the quick check only the file size matters, not the hash)
site.storage.open("testfile1", "wb").write("A" * 1234)
site.storage.open("testfile2", "wb").write("B" * 2345)
site.storage.verifyFiles(quick_check=True)
# Make sure that both are downloaded
assert site.content_manager.isDownloaded("testfile1")
assert site.content_manager.isDownloaded("testfile2")
assert site.content_manager.hashfield.getHashId("aaaabbbbcccc") in site.content_manager.hashfield
# Delete one of the files
site.storage.delete("testfile1")
site.storage.verifyFiles(quick_check=True)
assert not site.content_manager.isDownloaded("testfile1")
assert site.content_manager.isDownloaded("testfile2")
assert site.content_manager.hashfield.getHashId("aaaabbbbdddd") in site.content_manager.hashfield
def testIsPinned(self, site):
assert not site.content_manager.isPinned("data/img/zerotalk-upvote.png")
site.content_manager.setPin("data/img/zerotalk-upvote.png", True)
assert site.content_manager.isPinned("data/img/zerotalk-upvote.png")
assert len(site.content_manager.cache_is_pinned) == 1
site.content_manager.cache_is_pinned = {}
assert site.content_manager.isPinned("data/img/zerotalk-upvote.png")
def testBigfilePieceReset(self, site):
site.bad_files = {
"data/fake_bigfile.mp4|0-1024": 10,
"data/fake_bigfile.mp4|1024-2048": 10,
"data/fake_bigfile.mp4|2048-3064": 10
}
site.onFileDone("data/fake_bigfile.mp4|0-1024")
assert site.bad_files["data/fake_bigfile.mp4|1024-2048"] == 1
assert site.bad_files["data/fake_bigfile.mp4|2048-3064"] == 1
def testOptionalDelete(self, site):
privatekey = "5KUh3PvNm5HUWoCfSUfcYvfQ2g3PrRNJWr6Q9eqdBGu23mtMntv"
contents = site.content_manager.contents
site.content_manager.setPin("data/img/zerotalk-upvote.png", True)
site.content_manager.setPin("data/img/zeroid.png", False)
new_content = copy.deepcopy(contents["content.json"])
del new_content["files_optional"]["data/img/zerotalk-upvote.png"]
del new_content["files_optional"]["data/img/zeroid.png"]
assert site.storage.isFile("data/img/zerotalk-upvote.png")
assert site.storage.isFile("data/img/zeroid.png")
site.storage.writeJson("content.json", new_content)
site.content_manager.loadContent("content.json", force=True)
assert not site.storage.isFile("data/img/zeroid.png")
assert site.storage.isFile("data/img/zerotalk-upvote.png")

View file

@ -1 +0,0 @@
from src.Test.conftest import *

View file

@ -1,5 +0,0 @@
[pytest]
python_files = Test*.py
addopts = -rsxX -v --durations=6
markers =
webtest: mark a test as a webtest.

View file

@ -1,383 +0,0 @@
import re
import time
import cgi
import gevent
from Plugin import PluginManager
from Config import config
from util import helper
from Translate import Translate
if "_" not in locals():
_ = Translate("plugins/OptionalManager/languages/")
bigfile_sha512_cache = {}
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def __init__(self, *args, **kwargs):
self.time_peer_numbers_updated = 0
super(UiWebsocketPlugin, self).__init__(*args, **kwargs)
def actionSiteSign(self, to, privatekey=None, inner_path="content.json", *args, **kwargs):
# Add file to content.db and set it as pinned
content_db = self.site.content_manager.contents.db
content_inner_dir = helper.getDirname(inner_path)
content_db.my_optional_files[self.site.address + "/" + content_inner_dir] = time.time()
if len(content_db.my_optional_files) > 50: # Keep only last 50
oldest_key = min(
content_db.my_optional_files.iterkeys(),
key=(lambda key: content_db.my_optional_files[key])
)
del content_db.my_optional_files[oldest_key]
return super(UiWebsocketPlugin, self).actionSiteSign(to, privatekey, inner_path, *args, **kwargs)
def updatePeerNumbers(self):
self.site.updateHashfield()
content_db = self.site.content_manager.contents.db
content_db.updatePeerNumbers()
self.site.updateWebsocket(peernumber_updated=True)
def addBigfileInfo(self, row):
global bigfile_sha512_cache
content_db = self.site.content_manager.contents.db
site = content_db.sites[row["address"]]
if not site.settings.get("has_bigfile"):
return False
file_key = row["address"] + "/" + row["inner_path"]
sha512 = bigfile_sha512_cache.get(file_key)
file_info = None
if not sha512:
file_info = site.content_manager.getFileInfo(row["inner_path"])
if not file_info or not file_info.get("piece_size"):
return False
sha512 = file_info["sha512"]
bigfile_sha512_cache[file_key] = sha512
if sha512 in site.storage.piecefields:
piecefield = site.storage.piecefields[sha512].tostring()
else:
piecefield = None
if piecefield:
row["pieces"] = len(piecefield)
row["pieces_downloaded"] = piecefield.count("1")
row["downloaded_percent"] = 100 * row["pieces_downloaded"] / row["pieces"]
if row["pieces_downloaded"]:
if not file_info:
file_info = site.content_manager.getFileInfo(row["inner_path"])
row["bytes_downloaded"] = row["pieces_downloaded"] * file_info.get("piece_size", 0)
else:
row["bytes_downloaded"] = 0
row["is_downloading"] = bool(next((inner_path for inner_path in site.bad_files if inner_path.startswith(row["inner_path"])), False))
# Add leech / seed stats
row["peer_seed"] = 0
row["peer_leech"] = 0
for peer in site.peers.itervalues():
if not peer.time_piecefields_updated or sha512 not in peer.piecefields:
continue
peer_piecefield = peer.piecefields[sha512].tostring()
if not peer_piecefield:
continue
if peer_piecefield == "1" * len(peer_piecefield):
row["peer_seed"] += 1
else:
row["peer_leech"] += 1
# Add myself
if piecefield:
if row["pieces_downloaded"] == row["pieces"]:
row["peer_seed"] += 1
else:
row["peer_leech"] += 1
return True
# Optional file functions
def actionOptionalFileList(self, to, address=None, orderby="time_downloaded DESC", limit=10, filter="downloaded"):
if not address:
address = self.site.address
# Update peer numbers if necessary
content_db = self.site.content_manager.contents.db
if time.time() - content_db.time_peer_numbers_updated > 60 * 1 and time.time() - self.time_peer_numbers_updated > 60 * 5:
# Start in new thread to avoid blocking
self.time_peer_numbers_updated = time.time()
gevent.spawn(self.updatePeerNumbers)
if address == "all" and "ADMIN" not in self.permissions:
return self.response(to, {"error": "Forbidden"})
if not self.hasSitePermission(address):
return self.response(to, {"error": "Forbidden"})
if not all([re.match("^[a-z_*/+-]+( DESC| ASC|)$", part.strip()) for part in orderby.split(",")]):
return self.response(to, "Invalid order_by")
if type(limit) != int:
return self.response(to, "Invalid limit")
back = []
content_db = self.site.content_manager.contents.db
wheres = {}
wheres_raw = []
if "bigfile" in filter:
wheres["size >"] = 1024 * 1024 * 10
if "not_downloaded" in filter:
wheres["is_downloaded"] = 0
elif "downloaded" in filter:
wheres_raw.append("(is_downloaded = 1 OR is_pinned = 1)")
if "pinned" in filter:
wheres["is_pinned"] = 1
if address == "all":
join = "LEFT JOIN site USING (site_id)"
else:
wheres["site_id"] = content_db.site_ids[address]
join = ""
if wheres_raw:
query_wheres_raw = "AND" + " AND ".join(wheres_raw)
else:
query_wheres_raw = ""
query = "SELECT * FROM file_optional %s WHERE ? %s ORDER BY %s LIMIT %s" % (join, query_wheres_raw, orderby, limit)
for row in content_db.execute(query, wheres):
row = dict(row)
if address != "all":
row["address"] = address
if row["size"] > 1024 * 1024:
has_info = self.addBigfileInfo(row)
else:
has_info = False
if not has_info:
if row["is_downloaded"]:
row["bytes_downloaded"] = row["size"]
row["downloaded_percent"] = 100
else:
row["bytes_downloaded"] = 0
row["downloaded_percent"] = 0
back.append(row)
self.response(to, back)
def actionOptionalFileInfo(self, to, inner_path):
content_db = self.site.content_manager.contents.db
site_id = content_db.site_ids[self.site.address]
# Update peer numbers if necessary
if time.time() - content_db.time_peer_numbers_updated > 60 * 1 and time.time() - self.time_peer_numbers_updated > 60 * 5:
# Start in new thread to avoid blocking
self.time_peer_numbers_updated = time.time()
gevent.spawn(self.updatePeerNumbers)
query = "SELECT * FROM file_optional WHERE site_id = :site_id AND inner_path = :inner_path LIMIT 1"
res = content_db.execute(query, {"site_id": site_id, "inner_path": inner_path})
row = next(res, None)
if row:
row = dict(row)
if row["size"] > 1024 * 1024:
row["address"] = self.site.address
self.addBigfileInfo(row)
self.response(to, row)
else:
self.response(to, None)
def setPin(self, inner_path, is_pinned, address=None):
if not address:
address = self.site.address
if not self.hasSitePermission(address):
return {"error": "Forbidden"}
site = self.server.sites[address]
site.content_manager.setPin(inner_path, is_pinned)
return "ok"
def actionOptionalFilePin(self, to, inner_path, address=None):
if type(inner_path) is not list:
inner_path = [inner_path]
back = self.setPin(inner_path, 1, address)
num_file = len(inner_path)
if back == "ok":
if num_file == 1:
self.cmd("notification", ["done", _["Pinned %s"] % cgi.escape(helper.getFilename(inner_path[0])), 5000])
else:
self.cmd("notification", ["done", _["Pinned %s files"] % num_file, 5000])
self.response(to, back)
def actionOptionalFileUnpin(self, to, inner_path, address=None):
if type(inner_path) is not list:
inner_path = [inner_path]
back = self.setPin(inner_path, 0, address)
num_file = len(inner_path)
if back == "ok":
if num_file == 1:
self.cmd("notification", ["done", _["Removed pin from %s"] % cgi.escape(helper.getFilename(inner_path[0])), 5000])
else:
self.cmd("notification", ["done", _["Removed pin from %s files"] % num_file, 5000])
self.response(to, back)
def actionOptionalFileDelete(self, to, inner_path, address=None):
if not address:
address = self.site.address
if not self.hasSitePermission(address):
return self.response(to, {"error": "Forbidden"})
site = self.server.sites[address]
content_db = site.content_manager.contents.db
site_id = content_db.site_ids[site.address]
res = content_db.execute("SELECT * FROM file_optional WHERE ? LIMIT 1", {"site_id": site_id, "inner_path": inner_path, "is_downloaded": 1})
row = next(res, None)
if not row:
return self.response(to, {"error": "Not found in content.db"})
removed = site.content_manager.optionalRemoved(inner_path, row["hash_id"], row["size"])
# if not removed:
# return self.response(to, {"error": "Not found in hash_id: %s" % row["hash_id"]})
content_db.execute("UPDATE file_optional SET is_downloaded = 0, is_pinned = 0, peer = peer - 1 WHERE ?", {"site_id": site_id, "inner_path": inner_path})
try:
site.storage.delete(inner_path)
except Exception as err:
return self.response(to, {"error": "File delete error: %s" % err})
site.updateWebsocket(file_delete=inner_path)
if inner_path in site.content_manager.cache_is_pinned:
site.content_manager.cache_is_pinned = {}
self.response(to, "ok")
# Limit functions
def actionOptionalLimitStats(self, to):
if "ADMIN" not in self.site.settings["permissions"]:
return self.response(to, "Forbidden")
back = {}
back["limit"] = config.optional_limit
back["used"] = self.site.content_manager.contents.db.getOptionalUsedBytes()
back["free"] = helper.getFreeSpace()
self.response(to, back)
def actionOptionalLimitSet(self, to, limit):
if "ADMIN" not in self.site.settings["permissions"]:
return self.response(to, {"error": "Forbidden"})
config.optional_limit = re.sub("\.0+$", "", limit) # Remove unnecessary digits from end
config.saveValue("optional_limit", limit)
self.response(to, "ok")
# Distribute help functions
def actionOptionalHelpList(self, to, address=None):
if not address:
address = self.site.address
if not self.hasSitePermission(address):
return self.response(to, {"error": "Forbidden"})
site = self.server.sites[address]
self.response(to, site.settings.get("optional_help", {}))
def actionOptionalHelp(self, to, directory, title, address=None):
if not address:
address = self.site.address
if not self.hasSitePermission(address):
return self.response(to, {"error": "Forbidden"})
site = self.server.sites[address]
content_db = site.content_manager.contents.db
site_id = content_db.site_ids[address]
if "optional_help" not in site.settings:
site.settings["optional_help"] = {}
stats = content_db.execute(
"SELECT COUNT(*) AS num, SUM(size) AS size FROM file_optional WHERE site_id = :site_id AND inner_path LIKE :inner_path",
{"site_id": site_id, "inner_path": directory + "%"}
).fetchone()
stats = dict(stats)
if not stats["size"]:
stats["size"] = 0
if not stats["num"]:
stats["num"] = 0
self.cmd("notification", [
"done",
_["You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>"] %
(cgi.escape(title), cgi.escape(directory)),
10000
])
site.settings["optional_help"][directory] = title
self.response(to, dict(stats))
def actionOptionalHelpRemove(self, to, directory, address=None):
if not address:
address = self.site.address
if not self.hasSitePermission(address):
return self.response(to, {"error": "Forbidden"})
site = self.server.sites[address]
try:
del site.settings["optional_help"][directory]
self.response(to, "ok")
except Exception:
self.response(to, {"error": "Not found"})
def cbOptionalHelpAll(self, to, site, value):
site.settings["autodownloadoptional"] = value
self.response(to, value)
def actionOptionalHelpAll(self, to, value, address=None):
if not address:
address = self.site.address
if not self.hasSitePermission(address):
return self.response(to, {"error": "Forbidden"})
site = self.server.sites[address]
if value:
if "ADMIN" in self.site.settings["permissions"]:
self.cbOptionalHelpAll(to, site, True)
else:
site_title = site.content_manager.contents["content.json"].get("title", address)
self.cmd(
"confirm",
[
_["Help distribute all new optional files on site <b>%s</b>"] % cgi.escape(site_title),
_["Yes, I want to help!"]
],
lambda (res): self.cbOptionalHelpAll(to, site, True)
)
else:
site.settings["autodownloadoptional"] = False
self.response(to, False)
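The delete flow above reduces to one SELECT plus one UPDATE on content.db: look the file up in file_optional, clear its downloaded and pinned flags, and decrement the seeder count before removing the file from storage. A minimal standalone sketch of that pattern (Python 3 stdlib sqlite3; the table layout is assumed from the queries shown here, not taken from the plugin's own Db wrapper):

import sqlite3

# In-memory stand-in for content.db; columns assumed from the queries above
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute(
    "CREATE TABLE file_optional (site_id INTEGER, inner_path TEXT, hash_id INTEGER,"
    " size INTEGER, peer INTEGER, is_downloaded INTEGER, is_pinned INTEGER)"
)
conn.execute("INSERT INTO file_optional VALUES (1, 'data/video.mp4', 42, 1048576, 3, 1, 1)")

def optional_file_delete(site_id, inner_path):
    # Only files we actually downloaded can be deleted locally
    row = conn.execute(
        "SELECT * FROM file_optional"
        " WHERE site_id = ? AND inner_path = ? AND is_downloaded = 1 LIMIT 1",
        (site_id, inner_path),
    ).fetchone()
    if not row:
        return {"error": "Not found in content.db"}
    # Mark as not downloaded / not pinned and drop ourselves from the seeder count
    conn.execute(
        "UPDATE file_optional SET is_downloaded = 0, is_pinned = 0, peer = peer - 1"
        " WHERE site_id = ? AND inner_path = ?",
        (site_id, inner_path),
    )
    return "ok"

print(optional_file_delete(1, "data/video.mp4"))  # -> ok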

@@ -1 +0,0 @@
import OptionalManagerPlugin

@@ -1,7 +0,0 @@
{
"Pinned %s files": "Archivos %s fijados",
"Removed pin from %s files": "Archivos %s que no estan fijados",
"You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>": "Tu empezaste a ayudar a distribuir <b>%s</b>.<br><small>Directorio: %s</small>",
"Help distribute all new optional files on site <b>%s</b>": "Ayude a distribuir todos los archivos opcionales en el sitio <b>%s</b>",
"Yes, I want to help!": "¡Si, yo quiero ayudar!"
}

@@ -1,7 +0,0 @@
{
"Pinned %s files": "Fichiers %s épinglés",
"Removed pin from %s files": "Fichiers %s ne sont plus épinglés",
"You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>": "Vous avez commencé à aider à distribuer <b>%s</b>.<br><small>Dossier : %s</small>",
"Help distribute all new optional files on site <b>%s</b>": "Aider à distribuer tous les fichiers optionnels du site <b>%s</b>",
"Yes, I want to help!": "Oui, je veux aider !"
}

@@ -1,7 +0,0 @@
{
"Pinned %s files": "%s fájl rögzítve",
"Removed pin from %s files": "%s fájl rögzítés eltávolítva",
"You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>": "Új segítség a terjesztésben: <b>%s</b>.<br><small>Könyvtár: %s</small>",
"Help distribute all new optional files on site <b>%s</b>": "Segítség az összes új opcionális fájl terjesztésében az <b>%s</b> oldalon",
"Yes, I want to help!": "Igen, segíteni akarok!"
}

@@ -1,7 +0,0 @@
{
"Pinned %s files": "Arquivos %s fixados",
"Removed pin from %s files": "Arquivos %s não estão fixados",
"You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>": "Você começou a ajudar a distribuir <b>%s</b>.<br><small>Pasta: %s</small>",
"Help distribute all new optional files on site <b>%s</b>": "Ajude a distribuir todos os novos arquivos opcionais no site <b>%s</b>",
"Yes, I want to help!": "Sim, eu quero ajudar!"
}

@@ -1,7 +0,0 @@
{
"Pinned %s files": "已固定 %s 個檔",
"Removed pin from %s files": "已解除固定 %s 個檔",
"You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>": "你已經開始幫助分發 <b>%s</b> 。<br><small>目錄:%s</small>",
"Help distribute all new optional files on site <b>%s</b>": "你想要幫助分發 <b>%s</b> 網站的所有檔嗎?",
"Yes, I want to help!": "是,我想要幫助!"
}

@@ -1,7 +0,0 @@
{
"Pinned %s files": "已固定 %s 个文件",
"Removed pin from %s files": "已解除固定 %s 个文件",
"You started to help distribute <b>%s</b>.<br><small>Directory: %s</small>": "您已经开始帮助分发 <b>%s</b> 。<br><small>目录:%s</small>",
"Help distribute all new optional files on site <b>%s</b>": "您想要帮助分发 <b>%s</b> 站点的所有文件吗?",
"Yes, I want to help!": "是,我想要帮助!"
}

@@ -1,103 +0,0 @@
import time
import sqlite3
import random
import atexit
import gevent
from Plugin import PluginManager
@PluginManager.registerTo("ContentDb")
class ContentDbPlugin(object):
def __init__(self, *args, **kwargs):
atexit.register(self.saveAllPeers)
super(ContentDbPlugin, self).__init__(*args, **kwargs)
def getSchema(self):
schema = super(ContentDbPlugin, self).getSchema()
schema["tables"]["peer"] = {
"cols": [
["site_id", "INTEGER REFERENCES site (site_id) ON DELETE CASCADE"],
["address", "TEXT NOT NULL"],
["port", "INTEGER NOT NULL"],
["hashfield", "BLOB"],
["reputation", "INTEGER NOT NULL"],
["time_added", "INTEGER NOT NULL"],
["time_found", "INTEGER NOT NULL"]
],
"indexes": [
"CREATE UNIQUE INDEX peer_key ON peer (site_id, address, port)"
],
"schema_changed": 2
}
return schema
def loadPeers(self, site):
s = time.time()
site_id = self.site_ids.get(site.address)
res = self.execute("SELECT * FROM peer WHERE site_id = :site_id", {"site_id": site_id})
num = 0
num_hashfield = 0
for row in res:
peer = site.addPeer(str(row["address"]), row["port"])
if not peer: # Already exists
continue
if row["hashfield"]:
peer.hashfield.replaceFromString(row["hashfield"])
num_hashfield += 1
peer.time_added = row["time_added"]
peer.time_found = row["time_found"]
peer.reputation = row["reputation"]
if row["address"].endswith(".onion"):
peer.reputation = peer.reputation / 2 - 1 # Onion peers are less likely to be reachable
num += 1
if num_hashfield:
site.content_manager.has_optional_files = True
site.log.debug("%s peers (%s with hashfield) loaded in %.3fs" % (num, num_hashfield, time.time() - s))
def iteratePeers(self, site):
site_id = self.site_ids.get(site.address)
for key, peer in site.peers.iteritems():
address, port = key.rsplit(":", 1)
if peer.has_hashfield:
hashfield = sqlite3.Binary(peer.hashfield.tostring())
else:
hashfield = ""
yield (site_id, address, port, hashfield, peer.reputation, int(peer.time_added), int(peer.time_found))
def savePeers(self, site, spawn=False):
if spawn:
# Save peers every hour (+ some random seconds so that every site is not updated at the same time)
gevent.spawn_later(60 * 60 + random.randint(0, 60), self.savePeers, site, spawn=True)
if not site.peers:
site.log.debug("Peers not saved: No peers found")
return
s = time.time()
site_id = self.site_ids.get(site.address)
cur = self.getCursor()
cur.execute("BEGIN")
try:
cur.execute("DELETE FROM peer WHERE site_id = :site_id", {"site_id": site_id})
cur.cursor.executemany(
"INSERT INTO peer (site_id, address, port, hashfield, reputation, time_added, time_found) VALUES (?, ?, ?, ?, ?, ?, ?)",
self.iteratePeers(site)
)
except Exception as err:
site.log.error("Save peer error: %s" % err)
finally:
cur.execute("END")
site.log.debug("Peers saved in %.3fs" % (time.time() - s))
def initSite(self, site):
super(ContentDbPlugin, self).initSite(site)
gevent.spawn_later(0.5, self.loadPeers, site)
gevent.spawn_later(60*60, self.savePeers, site, spawn=True)
def saveAllPeers(self):
for site in self.sites.values():
try:
self.savePeers(site)
except Exception, err:
site.log.error("Save peer error: %s" % err)
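savePeers() above is a wipe-and-reinsert inside an explicit BEGIN/END transaction, feeding executemany() from a generator so the peer list never has to be materialized twice. A standalone sketch of the same pattern (Python 3; Peer objects replaced by a plain dict, schema taken from getSchema() above):

import sqlite3
import time

# isolation_level=None means autocommit, so the explicit BEGIN/END below
# delimit the transaction exactly like the plugin's cursor usage
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute(
    "CREATE TABLE peer (site_id INTEGER, address TEXT, port INTEGER, hashfield BLOB,"
    " reputation INTEGER, time_added INTEGER, time_found INTEGER)"
)

peers = {"1.2.3.4:15441": 5, "exampleonionaddr.onion:15441": 1}  # "address:port" -> reputation

def iterate_peers(site_id):
    # Yield one row per peer; executemany() pulls rows lazily from this generator
    now = int(time.time())
    for key, reputation in peers.items():
        address, port = key.rsplit(":", 1)
        yield (site_id, address, int(port), b"", reputation, now, now)

def save_peers(site_id):
    cur = conn.cursor()
    cur.execute("BEGIN")
    try:
        cur.execute("DELETE FROM peer WHERE site_id = ?", (site_id,))
        cur.executemany(
            "INSERT INTO peer VALUES (?, ?, ?, ?, ?, ?, ?)", iterate_peers(site_id)
        )
    except Exception as err:
        print("Save peer error: %s" % err)
    finally:
        cur.execute("END")  # commit whatever succeeded, as the plugin does

save_peers(1)
print(conn.execute("SELECT COUNT(*) FROM peer").fetchone()[0])  # -> 2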

@@ -1,2 +0,0 @@
import PeerDbPlugin

@@ -1,760 +0,0 @@
import re
import os
import cgi
import sys
import math
import time
import json
try:
import cStringIO as StringIO
except:
import StringIO
import gevent
from Config import config
from Plugin import PluginManager
from Debug import Debug
from Translate import Translate
from util import helper
from ZipStream import ZipStream
plugin_dir = "plugins/Sidebar"
media_dir = plugin_dir + "/media"
sys.path.append(plugin_dir) # To be able to load the geoip lib
loc_cache = {}
if "_" not in locals():
_ = Translate(plugin_dir + "/languages/")
@PluginManager.registerTo("UiRequest")
class UiRequestPlugin(object):
# Inject our resources to end of original file streams
def actionUiMedia(self, path):
if path == "/uimedia/all.js" or path == "/uimedia/all.css":
# First yield the original file and header
body_generator = super(UiRequestPlugin, self).actionUiMedia(path)
for part in body_generator:
yield part
# Append our media file to the end
ext = re.match(".*(js|css)$", path).group(1)
plugin_media_file = "%s/all.%s" % (media_dir, ext)
if config.debug:
# If debugging, merge *.css to all.css and *.js to all.js
from Debug import DebugMedia
DebugMedia.merge(plugin_media_file)
if ext == "js":
yield _.translateData(open(plugin_media_file).read())
else:
for part in self.actionFile(plugin_media_file, send_header=False):
yield part
elif path.startswith("/uimedia/globe/"): # Serve WebGL globe files
file_name = re.match(".*/(.*)", path).group(1)
plugin_media_file = "%s-globe/%s" % (media_dir, file_name)
if config.debug and path.endswith("all.js"):
# If debugging, merge *.css to all.css and *.js to all.js
from Debug import DebugMedia
DebugMedia.merge(plugin_media_file)
for part in self.actionFile(plugin_media_file):
yield part
else:
for part in super(UiRequestPlugin, self).actionUiMedia(path):
yield part
def actionZip(self):
address = self.get["address"]
site = self.server.site_manager.get(address)
if not site:
return self.error404("Site not found")
title = site.content_manager.contents.get("content.json", {}).get("title", "").encode('ascii', 'ignore')
filename = "%s-backup-%s.zip" % (title, time.strftime("%Y-%m-%d_%H_%M"))
self.sendHeader(content_type="application/zip", extra_headers={'Content-Disposition': 'attachment; filename="%s"' % filename})
return self.streamZip(site.storage.getPath("."))
def streamZip(self, file_path):
zs = ZipStream(file_path)
while 1:
data = zs.read()
if not data:
break
yield data
@PluginManager.registerTo("UiWebsocket")
class UiWebsocketPlugin(object):
def sidebarRenderPeerStats(self, body, site):
connected = len([peer for peer in site.peers.values() if peer.connection and peer.connection.connected])
connectable = len([peer_id for peer_id in site.peers.keys() if not peer_id.endswith(":0")])
onion = len([peer_id for peer_id in site.peers.keys() if ".onion" in peer_id])
local = len([peer for peer in site.peers.values() if helper.isPrivateIp(peer.ip)])
peers_total = len(site.peers)
# Add myself
if site.settings["serving"]:
peers_total += 1
if any(site.connection_server.port_opened.values()):
connectable += 1
if site.connection_server.tor_manager.start_onions:
onion += 1
if peers_total:
percent_connected = float(connected) / peers_total
percent_connectable = float(connectable) / peers_total
percent_onion = float(onion) / peers_total
else:
percent_connectable = percent_connected = percent_onion = 0
if local:
local_html = _(u"<li class='color-yellow'><span>{_[Local]}:</span><b>{local}</b></li>")
else:
local_html = ""
peer_ips = [peer.key for peer in site.getConnectablePeers(20, allow_private=False)]
peer_ips.sort(key=lambda peer_ip: ".onion:" in peer_ip)
copy_link = "http://127.0.0.1:43110/%s/?zeronet_peers=%s" % (
site.content_manager.contents["content.json"].get("domain", site.address),
",".join(peer_ips)
)
body.append(_(u"""
<li>
<label>
{_[Peers]}
<small class="label-right"><a href='{copy_link}' id='link-copypeers' class='link-right'>{_[Copy to clipboard]}</a></small>
</label>
<ul class='graph'>
<li style='width: 100%' class='total back-black' title="{_[Total peers]}"></li>
<li style='width: {percent_connectable:.0%}' class='connectable back-blue' title='{_[Connectable peers]}'></li>
<li style='width: {percent_onion:.0%}' class='connected back-purple' title='{_[Onion]}'></li>
<li style='width: {percent_connected:.0%}' class='connected back-green' title='{_[Connected peers]}'></li>
</ul>
<ul class='graph-legend'>
<li class='color-green'><span>{_[Connected]}:</span><b>{connected}</b></li>
<li class='color-blue'><span>{_[Connectable]}:</span><b>{connectable}</b></li>
<li class='color-purple'><span>{_[Onion]}:</span><b>{onion}</b></li>
{local_html}
<li class='color-black'><span>{_[Total]}:</span><b>{peers_total}</b></li>
</ul>
</li>
""".replace("{local_html}", local_html)))
def sidebarRenderTransferStats(self, body, site):
recv = float(site.settings.get("bytes_recv", 0)) / 1024 / 1024
sent = float(site.settings.get("bytes_sent", 0)) / 1024 / 1024
transfer_total = recv + sent
if transfer_total:
percent_recv = recv / transfer_total
percent_sent = sent / transfer_total
else:
percent_recv = 0.5
percent_sent = 0.5
body.append(_(u"""
<li>
<label>{_[Data transfer]}</label>
<ul class='graph graph-stacked'>
<li style='width: {percent_recv:.0%}' class='received back-yellow' title="{_[Received bytes]}"></li>
<li style='width: {percent_sent:.0%}' class='sent back-green' title="{_[Sent bytes]}"></li>
</ul>
<ul class='graph-legend'>
<li class='color-yellow'><span>{_[Received]}:</span><b>{recv:.2f}MB</b></li>
<li class='color-green'><span>{_[Sent]}:</span><b>{sent:.2f}MB</b></li>
</ul>
</li>
"""))
def sidebarRenderFileStats(self, body, site):
body.append(_(u"""
<li>
<label>
{_[Files]}
<small class="label-right"><a href='#Site+directory' id='link-directory' class='link-right'>{_[Open site directory]}</a>
<a href='/ZeroNet-Internal/Zip?address={site.address}' id='link-zip' class='link-right' download='site.zip'>{_[Save as .zip]}</a></small>
</label>
<ul class='graph graph-stacked'>
"""))
extensions = (
("html", "yellow"),
("css", "orange"),
("js", "purple"),
("Image", "green"),
("json", "darkblue"),
("User data", "blue"),
("Other", "white"),
("Total", "black")
)
# Collect stats
size_filetypes = {}
size_total = 0
contents = site.content_manager.listContents() # Without user files
for inner_path in contents:
content = site.content_manager.contents[inner_path]
if "files" not in content or content["files"] is None:
continue
for file_name, file_details in content["files"].items():
size_total += file_details["size"]
ext = file_name.split(".")[-1]
size_filetypes[ext] = size_filetypes.get(ext, 0) + file_details["size"]
# Get user file sizes
size_user_content = site.content_manager.contents.execute(
"SELECT SUM(size) + SUM(size_files) AS size FROM content WHERE ?",
{"not__inner_path": contents}
).fetchone()["size"]
if not size_user_content:
size_user_content = 0
size_filetypes["User data"] = size_user_content
size_total += size_user_content
# The missing difference is content.json sizes
if "json" in size_filetypes:
size_filetypes["json"] += max(0, site.settings["size"] - size_total)
size_total = size_other = site.settings["size"]
# Bar
for extension, color in extensions:
if extension == "Total":
continue
if extension == "Other":
size = max(0, size_other)
elif extension == "Image":
size = size_filetypes.get("jpg", 0) + size_filetypes.get("png", 0) + size_filetypes.get("gif", 0)
size_other -= size
else:
size = size_filetypes.get(extension, 0)
size_other -= size
if size_total == 0:
percent = 0
else:
percent = 100 * (float(size) / size_total)
percent = math.floor(percent * 100) / 100 # Floor to 2 digits
body.append(
u"""<li style='width: %.2f%%' class='%s back-%s' title="%s"></li>""" %
(percent, _[extension], color, _[extension])
)
# Legend
body.append("</ul><ul class='graph-legend'>")
for extension, color in extensions:
if extension == "Other":
size = max(0, size_other)
elif extension == "Image":
size = size_filetypes.get("jpg", 0) + size_filetypes.get("png", 0) + size_filetypes.get("gif", 0)
elif extension == "Total":
size = size_total
else:
size = size_filetypes.get(extension, 0)
if extension == "js":
title = "javascript"
else:
title = extension
if size > 1024 * 1024 * 10: # Format as MB if more than 10MB
size_formatted = "%.0fMB" % (size / 1024 / 1024)
else:
size_formatted = "%.0fkB" % (size / 1024)
body.append(u"<li class='color-%s'><span>%s:</span><b>%s</b></li>" % (color, _[title], size_formatted))
body.append("</ul></li>")
def sidebarRenderSizeLimit(self, body, site):
free_space = helper.getFreeSpace() / 1024 / 1024
size = float(site.settings["size"]) / 1024 / 1024
size_limit = site.getSizeLimit()
percent_used = size / size_limit
body.append(_(u"""
<li>
<label>{_[Size limit]} <small>({_[limit used]}: {percent_used:.0%}, {_[free space]}: {free_space:,d}MB)</small></label>
<input type='text' class='text text-num' value="{size_limit}" id='input-sitelimit'/><span class='text-post'>MB</span>
<a href='#Set' class='button' id='button-sitelimit'>{_[Set]}</a>
</li>
"""))
def sidebarRenderOptionalFileStats(self, body, site):
size_total = float(site.settings["size_optional"])
size_downloaded = float(site.settings["optional_downloaded"])
if not size_total:
return False
percent_downloaded = size_downloaded / size_total
size_formatted_total = size_total / 1024 / 1024
size_formatted_downloaded = size_downloaded / 1024 / 1024
body.append(_(u"""
<li>
<label>{_[Optional files]}</label>
<ul class='graph'>
<li style='width: 100%' class='total back-black' title="{_[Total size]}"></li>
<li style='width: {percent_downloaded:.0%}' class='connected back-green' title='{_[Downloaded files]}'></li>
</ul>
<ul class='graph-legend'>
<li class='color-green'><span>{_[Downloaded]}:</span><b>{size_formatted_downloaded:.2f}MB</b></li>
<li class='color-black'><span>{_[Total]}:</span><b>{size_formatted_total:.2f}MB</b></li>
</ul>
</li>
"""))
return True
def sidebarRenderOptionalFileSettings(self, body, site):
if self.site.settings.get("autodownloadoptional"):
checked = "checked='checked'"
else:
checked = ""
body.append(_(u"""
<li>
<label>{_[Download and help distribute all files]}</label>
<input type="checkbox" class="checkbox" id="checkbox-autodownloadoptional" {checked}/><div class="checkbox-skin"></div>
"""))
autodownload_bigfile_size_limit = int(site.settings.get("autodownload_bigfile_size_limit", config.autodownload_bigfile_size_limit))
body.append(_(u"""
<div class='settings-autodownloadoptional'>
<label>{_[Auto download big file size limit]}</label>
<input type='text' class='text text-num' value="{autodownload_bigfile_size_limit}" id='input-autodownload_bigfile_size_limit'/><span class='text-post'>MB</span>
<a href='#Set' class='button' id='button-autodownload_bigfile_size_limit'>{_[Set]}</a>
</div>
"""))
body.append("</li>")
def sidebarRenderBadFiles(self, body, site):
body.append(_(u"""
<li>
<label>{_[Needs to be updated]}:</label>
<ul class='filelist'>
"""))
i = 0
for bad_file, tries in site.bad_files.iteritems():
i += 1
body.append(_(u"""<li class='color-red' title="{bad_file_path} ({tries})">{bad_filename}</li>""", {
"bad_file_path": bad_file,
"bad_filename": helper.getFilename(bad_file),
"tries": _.pluralize(tries, "{} try", "{} tries")
}))
if i > 30:
break
if len(site.bad_files) > 30:
num_bad_files = len(site.bad_files) - 30
body.append(_(u"""<li class='color-red'>{_[+ {num_bad_files} more]}</li>""", nested=True))
body.append("""
</ul>
</li>
""")
def sidebarRenderDbOptions(self, body, site):
if site.storage.db:
inner_path = site.storage.getInnerPath(site.storage.db.db_path)
size = float(site.storage.getSize(inner_path)) / 1024
feeds = len(site.storage.db.schema.get("feeds", {}))
else:
inner_path = _[u"No database found"]
size = 0.0
feeds = 0
body.append(_(u"""
<li>
<label>{_[Database]} <small>({size:.2f}kB, {_[search feeds]}: {_[{feeds} query]})</small></label>
<div class='flex'>
<input type='text' class='text disabled' value="{inner_path}" disabled='disabled'/>
<a href='#Reload' id="button-dbreload" class='button'>{_[Reload]}</a>
<a href='#Rebuild' id="button-dbrebuild" class='button'>{_[Rebuild]}</a>
</div>
</li>
""", nested=True))
def sidebarRenderIdentity(self, body, site):
auth_address = self.user.getAuthAddress(self.site.address, create=False)
rules = self.site.content_manager.getRules("data/users/%s/content.json" % auth_address)
if rules and rules.get("max_size"):
quota = rules["max_size"] / 1024
try:
content = site.content_manager.contents["data/users/%s/content.json" % auth_address]
used = len(json.dumps(content)) + sum([file["size"] for file in content["files"].values()])
except:
used = 0
used = used / 1024
else:
quota = used = 0
body.append(_(u"""
<li>
<label>{_[Identity address]} <small>({_[limit used]}: {used:.2f}kB / {quota:.2f}kB)</small></label>
<div class='flex'>
<span class='input text disabled'>{auth_address}</span>
<a href='#Change' class='button' id='button-identity'>{_[Change]}</a>
</div>
</li>
"""))
def sidebarRenderControls(self, body, site):
auth_address = self.user.getAuthAddress(self.site.address, create=False)
if self.site.settings["serving"]:
class_pause = ""
class_resume = "hidden"
else:
class_pause = "hidden"
class_resume = ""
body.append(_(u"""
<li>
<label>{_[Site control]}</label>
<a href='#Update' class='button noupdate' id='button-update'>{_[Update]}</a>
<a href='#Pause' class='button {class_pause}' id='button-pause'>{_[Pause]}</a>
<a href='#Resume' class='button {class_resume}' id='button-resume'>{_[Resume]}</a>
<a href='#Delete' class='button noupdate' id='button-delete'>{_[Delete]}</a>
</li>
"""))
donate_key = site.content_manager.contents.get("content.json", {}).get("donate", True)
site_address = self.site.address
body.append(_(u"""
<li>
<label>{_[Site address]}</label><br>
<div class='flex'>
<span class='input text disabled'>{site_address}</span>
"""))
if donate_key == False or donate_key == "":
pass
elif (type(donate_key) == str or type(donate_key) == unicode) and len(donate_key) > 0:
body.append(_(u"""
</div>
</li>
<li>
<label>{_[Donate]}</label><br>
<div class='flex'>
{donate_key}
"""))
else:
body.append(_(u"""
<a href='bitcoin:{site_address}' class='button' id='button-donate'>{_[Donate]}</a>
"""))
body.append(_(u"""
</div>
</li>
"""))
def sidebarRenderOwnedCheckbox(self, body, site):
if self.site.settings["own"]:
checked = "checked='checked'"
else:
checked = ""
body.append(_(u"""
<h2 class='owned-title'>{_[This is my site]}</h2>
<input type="checkbox" class="checkbox" id="checkbox-owned" {checked}/><div class="checkbox-skin"></div>
"""))
def sidebarRenderOwnSettings(self, body, site):
title = site.content_manager.contents.get("content.json", {}).get("title", "")
description = site.content_manager.contents.get("content.json", {}).get("description", "")
body.append(_(u"""
<li>
<label for='settings-title'>{_[Site title]}</label>
<input type='text' class='text' value="{title}" id='settings-title'/>
</li>
<li>
<label for='settings-description'>{_[Site description]}</label>
<input type='text' class='text' value="{description}" id='settings-description'/>
</li>
<li>
<a href='#Save' class='button' id='button-settings'>{_[Save site settings]}</a>
</li>
"""))
def sidebarRenderContents(self, body, site):
has_privatekey = bool(self.user.getSiteData(site.address, create=False).get("privatekey"))
if has_privatekey:
tag_privatekey = _(u"{_[Private key saved.]} <a href='#Forgot+private+key' id='privatekey-forgot' class='link-right'>{_[Forgot]}</a>")
else:
tag_privatekey = _(u"<a href='#Add+private+key' id='privatekey-add' class='link-right'>{_[Add saved private key]}</a>")
body.append(_(u"""
<li>
<label>{_[Content publishing]} <small class='label-right'>{tag_privatekey}</small></label>
""".replace("{tag_privatekey}", tag_privatekey)))
# Choose content you want to sign
body.append(_(u"""
<div class='flex'>
<input type='text' class='text' value="content.json" id='input-contents'/>
<a href='#Sign-and-Publish' id='button-sign-publish' class='button'>{_[Sign and publish]}</a>
<a href='#Sign-or-Publish' id='menu-sign-publish'>\u22EE</a>
</div>
"""))
contents = ["content.json"]
contents += site.content_manager.contents.get("content.json", {}).get("includes", {}).keys()
body.append(_(u"<div class='contents'>{_[Choose]}: "))
for content in contents:
body.append(_("<a href='{content}' class='contents-content'>{content}</a> "))
body.append("</div>")
body.append("</li>")
def actionSidebarGetHtmlTag(self, to):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
site = self.site
body = []
body.append("<div>")
body.append("<a href='#Close' class='close'>&times;</a>")
body.append("<h1>%s</h1>" % cgi.escape(site.content_manager.contents.get("content.json", {}).get("title", ""), True))
body.append("<div class='globe loading'></div>")
body.append("<ul class='fields'>")
self.sidebarRenderPeerStats(body, site)
self.sidebarRenderTransferStats(body, site)
self.sidebarRenderFileStats(body, site)
self.sidebarRenderSizeLimit(body, site)
has_optional = self.sidebarRenderOptionalFileStats(body, site)
if has_optional:
self.sidebarRenderOptionalFileSettings(body, site)
self.sidebarRenderDbOptions(body, site)
self.sidebarRenderIdentity(body, site)
self.sidebarRenderControls(body, site)
if site.bad_files:
self.sidebarRenderBadFiles(body, site)
self.sidebarRenderOwnedCheckbox(body, site)
body.append("<div class='settings-owned'>")
self.sidebarRenderOwnSettings(body, site)
self.sidebarRenderContents(body, site)
body.append("</div>")
body.append("</ul>")
body.append("</div>")
body.append("<div class='menu template'>")
body.append("<a href='#'' class='menu-item template'>Template</a>")
body.append("</div>")
self.response(to, "".join(body))
def downloadGeoLiteDb(self, db_path):
import urllib
import gzip
import shutil
from util import helper
self.log.info("Downloading GeoLite2 City database...")
self.cmd("progress", ["geolite-info", _["Downloading GeoLite2 City database (one time only, ~20MB)..."], 0])
db_urls = [
"https://geolite.maxmind.com/download/geoip/database/GeoLite2-City.mmdb.gz",
"https://raw.githubusercontent.com/texnikru/GeoLite2-Database/master/GeoLite2-City.mmdb.gz"
]
for db_url in db_urls:
try:
# Download
response = helper.httpRequest(db_url)
data_size = response.getheader('content-length')
data_recv = 0
data = StringIO.StringIO()
while True:
buff = response.read(1024 * 512)
if not buff:
break
data.write(buff)
data_recv += 1024 * 512
if data_size:
progress = int(float(data_recv) / int(data_size) * 100)
self.cmd("progress", ["geolite-info", _["Downloading GeoLite2 City database (one time only, ~20MB)..."], progress])
self.log.info("GeoLite2 City database downloaded (%s bytes), unpacking..." % data.tell())
data.seek(0)
# Unpack
with gzip.GzipFile(fileobj=data) as gzip_file:
shutil.copyfileobj(gzip_file, open(db_path, "wb"))
self.cmd("progress", ["geolite-info", _["GeoLite2 City database downloaded!"], 100])
time.sleep(2) # Wait for notify animation
return True
except Exception as err:
self.log.error("Error downloading %s: %s" % (db_url, err))
pass
self.cmd("progress", [
"geolite-info",
_["GeoLite2 City database download error: {}!<br>Please download manually and unpack to data dir:<br>{}"].format(err, db_urls[0]),
-100
])
def getLoc(self, geodb, ip):
global loc_cache
if ip in loc_cache:
return loc_cache[ip]
else:
try:
loc_data = geodb.get(ip)
except:
loc_data = None
if not loc_data or "location" not in loc_data:
loc_cache[ip] = None
return None
loc = {
"lat": loc_data["location"]["latitude"],
"lon": loc_data["location"]["longitude"],
}
if "city" in loc_data:
loc["city"] = loc_data["city"]["names"]["en"]
if "country" in loc_data:
loc["country"] = loc_data["country"]["names"]["en"]
loc_cache[ip] = loc
return loc
def getPeerLocations(self, peers):
import maxminddb
db_path = config.data_dir + '/GeoLite2-City.mmdb'
if not os.path.isfile(db_path) or os.path.getsize(db_path) == 0:
if not self.downloadGeoLiteDb(db_path):
return False
geodb = maxminddb.open_database(db_path)
peers = peers.values()
# Place bars
peer_locations = []
placed = {} # Already placed bars here
for peer in peers:
# Height of bar
if peer.connection and peer.connection.last_ping_delay:
ping = round(peer.connection.last_ping_delay * 1000)
else:
ping = None
loc = self.getLoc(geodb, peer.ip)
if not loc:
continue
# Create position array
lat, lon = loc["lat"], loc["lon"]
latlon = "%s,%s" % (lat, lon)
if latlon in placed and helper.getIpType(peer.ip) == "ipv4": # Don't place more than one bar at the same spot; fake a nearby position using the last two parts of the IP address
lat += float(128 - int(peer.ip.split(".")[-2])) / 50
lon += float(128 - int(peer.ip.split(".")[-1])) / 50
latlon = "%s,%s" % (lat, lon)
placed[latlon] = True
peer_location = {}
peer_location.update(loc)
peer_location["lat"] = lat
peer_location["lon"] = lon
peer_location["ping"] = ping
peer_locations.append(peer_location)
# Append myself
for ip in self.site.connection_server.ip_external_list:
my_loc = self.getLoc(geodb, ip)
if my_loc:
my_loc["ping"] = 0
peer_locations.append(my_loc)
return peer_locations
def actionSidebarGetPeers(self, to):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
try:
peer_locations = self.getPeerLocations(self.site.peers)
globe_data = []
ping_times = [
peer_location["ping"]
for peer_location in peer_locations
if peer_location["ping"]
]
if ping_times:
ping_avg = sum(ping_times) / float(len(ping_times))
else:
ping_avg = 0
for peer_location in peer_locations:
if peer_location["ping"] == 0: # Me
height = -0.135
elif peer_location["ping"]:
height = min(0.20, math.log(1 + peer_location["ping"] / ping_avg, 300))
else:
height = -0.03
globe_data += [peer_location["lat"], peer_location["lon"], height]
self.response(to, globe_data)
except Exception, err:
self.log.debug("sidebarGetPeers error: %s" % Debug.formatException(err))
self.response(to, {"error": err})
def actionSiteSetOwned(self, to, owned):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
if self.site.address == config.updatesite:
return self.response(to, "You can't change the ownership of the updater site")
self.site.settings["own"] = bool(owned)
self.site.updateWebsocket(owned=owned)
def actionUserSetSitePrivatekey(self, to, privatekey):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
site_data = self.user.sites[self.site.address]
site_data["privatekey"] = privatekey
self.site.updateWebsocket(set_privatekey=bool(privatekey))
return "ok"
def actionSiteSetAutodownloadoptional(self, to, owned):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
self.site.settings["autodownloadoptional"] = bool(owned)
self.site.bad_files = {}
gevent.spawn(self.site.update, check_files=True)
self.site.worker_manager.removeSolvedFileTasks()
def actionDbReload(self, to):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
self.site.storage.closeDb()
self.site.storage.getDb()
return self.response(to, "ok")
def actionDbRebuild(self, to):
permissions = self.getPermissions(to)
if "ADMIN" not in permissions:
return self.response(to, "You don't have permission to run this command")
self.site.storage.rebuildDb()
return self.response(to, "ok")
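downloadGeoLiteDb() above streams the database in 512kB chunks and derives a progress percentage from the Content-Length header, trying the next mirror on failure. A rough standalone sketch of that chunked-download pattern (Python 3 urllib.request standing in for the project's helper.httpRequest; unlike the original it counts the final short chunk exactly and closes the unpacked output file):

import gzip
import shutil
import urllib.request

def download_gz_db(url, db_path, chunk_size=512 * 1024):
    gz_path = db_path + ".gz"
    with urllib.request.urlopen(url) as response:
        total = response.getheader("Content-Length")
        received = 0
        with open(gz_path, "wb") as out:
            while True:
                buff = response.read(chunk_size)
                if not buff:
                    break
                out.write(buff)
                received += len(buff)  # count actual bytes, not the chunk size
                if total:
                    print("progress: %d%%" % (received * 100 // int(total)))
    # Unpack the .mmdb.gz next to the target path, as the plugin does
    with gzip.open(gz_path, "rb") as gz, open(db_path, "wb") as out:
        shutil.copyfileobj(gz, out)
    return True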

@@ -1,59 +0,0 @@
import cStringIO as StringIO
import os
import zipfile
class ZipStream(file):
def __init__(self, dir_path):
self.dir_path = dir_path
self.pos = 0
self.buff_pos = 0
self.zf = zipfile.ZipFile(self, 'w', zipfile.ZIP_DEFLATED, allowZip64=True)
self.buff = StringIO.StringIO()
self.file_list = self.getFileList()
def getFileList(self):
for root, dirs, files in os.walk(self.dir_path):
for file in files:
file_path = root + "/" + file
relative_path = os.path.join(os.path.relpath(root, self.dir_path), file)
yield file_path, relative_path
self.zf.close()
def read(self, size=60 * 1024):
for file_path, relative_path in self.file_list:
self.zf.write(file_path, relative_path)
if self.buff.tell() >= size:
break
self.buff.seek(0)
back = self.buff.read()
self.buff.truncate(0)
self.buff.seek(0)
self.buff_pos += len(back)
return back
def write(self, data):
self.pos += len(data)
self.buff.write(data)
def tell(self):
return self.pos
def seek(self, pos, whence=0):
if pos >= self.buff_pos:
self.buff.seek(pos - self.buff_pos, whence)
self.pos = pos
def flush(self):
pass
if __name__ == "__main__":
zs = ZipStream(".")
out = open("out.zip", "wb")
while 1:
data = zs.read()
print("Write %s" % len(data))
if not data:
break
out.write(data)
out.close()
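ZipStream subclasses the Python 2 file builtin only so ZipFile accepts it as an output stream: ZipFile's writes land in the in-memory buffer, and each read() feeds more files into the archive until enough compressed bytes have accumulated. A rough Python 3 equivalent, offered as a sketch rather than a tested drop-in port: implement write()/tell()/flush() on a plain object and omit seek(), so ZipFile falls back to its non-seekable streaming mode and uses data descriptors.

import io
import os
import zipfile

class ZipStream3(object):
    def __init__(self, dir_path):
        self.dir_path = dir_path
        self.pos = 0
        self.buff = io.BytesIO()
        self.zf = zipfile.ZipFile(self, "w", zipfile.ZIP_DEFLATED, allowZip64=True)
        self.file_list = self.iterFiles()

    def iterFiles(self):
        for root, dirs, files in os.walk(self.dir_path):
            for file_name in files:
                file_path = os.path.join(root, file_name)
                yield file_path, os.path.relpath(file_path, self.dir_path)
        self.zf.close()  # exhausting the generator writes the central directory

    def read(self, size=60 * 1024):
        # Feed files into the archive until ~size compressed bytes are buffered
        for file_path, relative_path in self.file_list:
            self.zf.write(file_path, relative_path)
            if self.buff.tell() >= size:
                break
        data = self.buff.getvalue()
        self.buff.seek(0)
        self.buff.truncate(0)
        return data

    # The minimal output-stream interface ZipFile needs; with no seek() method
    # ZipFile switches to streaming mode on its own
    def write(self, data):
        self.pos += len(data)
        self.buff.write(data)

    def tell(self):
        return self.pos

    def flush(self):
        pass

Usage is the same pull loop as the __main__ block above: call read() until it returns an empty bytes object.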

@@ -1 +0,0 @@
import SidebarPlugin

@@ -1,81 +0,0 @@
{
"Peers": "Klienter",
"Connected": "Forbundet",
"Connectable": "Mulige",
"Connectable peers": "Mulige klienter",
"Data transfer": "Data overførsel",
"Received": "Modtaget",
"Received bytes": "Bytes modtaget",
"Sent": "Sendt",
"Sent bytes": "Bytes sendt",
"Files": "Filer",
"Total": "I alt",
"Image": "Image",
"Other": "Andet",
"User data": "Bruger data",
"Size limit": "Side max størrelse",
"limit used": "brugt",
"free space": "fri",
"Set": "Opdater",
"Optional files": "Valgfri filer",
"Downloaded": "Downloadet",
"Download and help distribute all files": "Download og hjælp med at dele filer",
"Total size": "Størrelse i alt",
"Downloaded files": "Filer downloadet",
"Database": "Database",
"search feeds": "søgninger",
"{feeds} query": "{feeds} søgninger",
"Reload": "Genindlæs",
"Rebuild": "Genopbyg",
"No database found": "Ingen database fundet",
"Identity address": "Autorisations ID",
"Change": "Skift",
"Update": "Opdater",
"Pause": "Pause",
"Resume": "Aktiv",
"Delete": "Slet",
"Are you sure?": "Er du sikker?",
"Site address": "Side addresse",
"Donate": "Doner penge",
"Missing files": "Manglende filer",
"{} try": "{} forsøg",
"{} tries": "{} forsøg",
"+ {num_bad_files} more": "+ {num_bad_files} mere",
"This is my site": "Dette er min side",
"Site title": "Side navn",
"Site description": "Side beskrivelse",
"Save site settings": "Gem side opsætning",
"Content publishing": "Indhold offentliggøres",
"Choose": "Vælg",
"Sign": "Signer",
"Publish": "Offentliggør",
"This function is disabled on this proxy": "Denne funktion er slået fra på denne ZeroNet proxyEz a funkció ki van kapcsolva ezen a proxy-n",
"GeoLite2 City database download error: {}!<br>Please download manually and unpack to data dir:<br>{}": "GeoLite2 City database kunne ikke downloades: {}!<br>Download venligst databasen manuelt og udpak i data folder:<br>{}",
"Downloading GeoLite2 City database (one time only, ~20MB)...": "GeoLite2 város adatbázis letöltése (csak egyszer kell, kb 20MB)...",
"GeoLite2 City database downloaded!": "GeoLite2 City database downloadet!",
"Are you sure?": "Er du sikker?",
"Site storage limit modified!": "Side max størrelse ændret!",
"Database schema reloaded!": "Database definition genindlæst!",
"Database rebuilding....": "Genopbygger database...",
"Database rebuilt!": "Database genopbygget!",
"Site updated!": "Side opdateret!",
"Delete this site": "Slet denne side",
"File write error: ": "Fejl ved skrivning af fil: ",
"Site settings saved!": "Side opsætning gemt!",
"Enter your private key:": "Indtast din private nøgle:",
" Signed!": " Signeret!",
"WebGL not supported": "WebGL er ikke supporteret"
}

@@ -1,81 +0,0 @@
{
"Peers": "Peers",
"Connected": "Verbunden",
"Connectable": "Verbindbar",
"Connectable peers": "Verbindbare Peers",
"Data transfer": "Datei Transfer",
"Received": "Empfangen",
"Received bytes": "Empfangene Bytes",
"Sent": "Gesendet",
"Sent bytes": "Gesendete Bytes",
"Files": "Dateien",
"Total": "Gesamt",
"Image": "Bilder",
"Other": "Sonstiges",
"User data": "Nutzer Daten",
"Size limit": "Speicher Limit",
"limit used": "Limit benutzt",
"free space": "freier Speicher",
"Set": "Setzten",
"Optional files": "Optionale Dateien",
"Downloaded": "Heruntergeladen",
"Download and help distribute all files": "Herunterladen und helfen alle Dateien zu verteilen",
"Total size": "Gesamte Größe",
"Downloaded files": "Heruntergeladene Dateien",
"Database": "Datenbank",
"search feeds": "Feeds durchsuchen",
"{feeds} query": "{feeds} Abfrage",
"Reload": "Neu laden",
"Rebuild": "Neu bauen",
"No database found": "Keine Datenbank gefunden",
"Identity address": "Identitäts Adresse",
"Change": "Ändern",
"Update": "Aktualisieren",
"Pause": "Pausieren",
"Resume": "Fortsetzen",
"Delete": "Löschen",
"Are you sure?": "Bist du sicher?",
"Site address": "Seiten Adresse",
"Donate": "Spenden",
"Missing files": "Fehlende Dateien",
"{} try": "{} versuch",
"{} tries": "{} versuche",
"+ {num_bad_files} more": "+ {num_bad_files} mehr",
"This is my site": "Das ist meine Seite",
"Site title": "Seiten Titel",
"Site description": "Seiten Beschreibung",
"Save site settings": "Einstellungen der Seite speichern",
"Content publishing": "Inhaltsveröffentlichung",
"Choose": "Wähle",
"Sign": "Signieren",
"Publish": "Veröffentlichen",
"This function is disabled on this proxy": "Diese Funktion ist auf dieser Proxy deaktiviert",
"GeoLite2 City database download error: {}!<br>Please download manually and unpack to data dir:<br>{}": "GeoLite2 City Datenbank Download Fehler: {}!<br>Bitte manuell herunterladen und die Datei in das Datei Verzeichnis extrahieren:<br>{}",
"Downloading GeoLite2 City database (one time only, ~20MB)...": "Herunterladen der GeoLite2 City Datenbank (einmalig, ~20MB)...",
"GeoLite2 City database downloaded!": "GeoLite2 City Datenbank heruntergeladen!",
"Are you sure?": "Bist du sicher?",
"Site storage limit modified!": "Speicher Limit der Seite modifiziert!",
"Database schema reloaded!": "Datebank Schema neu geladen!",
"Database rebuilding....": "Datenbank neu bauen...",
"Database rebuilt!": "Datenbank neu gebaut!",
"Site updated!": "Seite aktualisiert!",
"Delete this site": "Diese Seite löschen",
"File write error: ": "Datei schreib fehler:",
"Site settings saved!": "Seiten Einstellungen gespeichert!",
"Enter your private key:": "Gib deinen privaten Schlüssel ein:",
" Signed!": " Signiert!",
"WebGL not supported": "WebGL nicht unterstützt"
}

@@ -1,79 +0,0 @@
{
"Peers": "Pares",
"Connected": "Conectados",
"Connectable": "Conectables",
"Connectable peers": "Pares conectables",
"Data transfer": "Transferencia de datos",
"Received": "Recibidos",
"Received bytes": "Bytes recibidos",
"Sent": "Enviados",
"Sent bytes": "Bytes envidados",
"Files": "Ficheros",
"Total": "Total",
"Image": "Imagen",
"Other": "Otro",
"User data": "Datos del usuario",
"Size limit": "Límite de tamaño",
"limit used": "Límite utilizado",
"free space": "Espacio libre",
"Set": "Establecer",
"Optional files": "Ficheros opcionales",
"Downloaded": "Descargado",
"Download and help distribute all files": "Descargar y ayudar a distribuir todos los ficheros",
"Total size": "Tamaño total",
"Downloaded files": "Ficheros descargados",
"Database": "Base de datos",
"search feeds": "Fuentes de búsqueda",
"{feeds} query": "{feeds} consulta",
"Reload": "Recargar",
"Rebuild": "Reconstruir",
"No database found": "No se ha encontrado la base de datos",
"Identity address": "Dirección de la identidad",
"Change": "Cambiar",
"Update": "Actualizar",
"Pause": "Pausar",
"Resume": "Reanudar",
"Delete": "Borrar",
"Site address": "Dirección del sitio",
"Donate": "Donar",
"Missing files": "Ficheros perdidos",
"{} try": "{} intento",
"{} tries": "{} intentos",
"+ {num_bad_files} more": "+ {num_bad_files} más",
"This is my site": "Este es mi sitio",
"Site title": "Título del sitio",
"Site description": "Descripción del sitio",
"Save site settings": "Guardar la configuración del sitio",
"Content publishing": "Publicación del contenido",
"Choose": "Elegir",
"Sign": "Firmar",
"Publish": "Publicar",
"This function is disabled on this proxy": "Esta función está deshabilitada en este proxy",
"GeoLite2 City database download error: {}!<br>Please download manually and unpack to data dir:<br>{}": "¡Error de la base de datos GeoLite2 {}!<br>Por favor, descárgalo manualmente y descomprime al directorio de datos<br>{}",
"Downloading GeoLite2 City database (one time only, ~20MB)...": "Descargando la base de datos de GeoLite2 (una única vez, ~20MB...",
"GeoLite2 City database downloaded!": "¡Base de datos de GeoLite2 descargada!",
"Are you sure?": "¿Estás seguro?",
"Site storage limit modified!": "¡Límite de almacenamiento del sitio modificado!",
"Database schema reloaded!": "¡Esquema de la base de datos recargado!",
"Database rebuilding....": "Reconstruyendo la base de datos...",
"Database rebuilt!": "¡Base de datos reconstruida!",
"Site updated!": "¡Sitio actualizado!",
"Delete this site": "Borrar este sitio",
"File write error: ": "Error de escritura de fichero",
"Site settings saved!": "¡Configuración del sitio guardada!",
"Enter your private key:": "Introduce tu clave privada",
" Signed!": " ¡firmado!",
"WebGL not supported": "WebGL no está soportado"
}

@@ -1,82 +0,0 @@
{
"Peers": "Pairs",
"Connected": "Connectés",
"Connectable": "Accessibles",
"Connectable peers": "Pairs accessibles",
"Data transfer": "Données transférées",
"Received": "Reçues",
"Received bytes": "Bytes reçus",
"Sent": "Envoyées",
"Sent bytes": "Bytes envoyés",
"Files": "Fichiers",
"Total": "Total",
"Image": "Image",
"Other": "Autre",
"User data": "Utilisateurs",
"Size limit": "Taille maximale",
"limit used": "utlisé",
"free space": "libre",
"Set": "Modifier",
"Optional files": "Fichiers optionnels",
"Downloaded": "Téléchargé",
"Download and help distribute all files": "Télécharger et distribuer tous les fichiers",
"Total size": "Taille totale",
"Downloaded files": "Fichiers téléchargés",
"Database": "Base de données",
"search feeds": "recherche",
"{feeds} query": "{feeds} requête",
"Reload": "Recharger",
"Rebuild": "Reconstruire",
"No database found": "Aucune base de données trouvée",
"Identity address": "Adresse d'identité",
"Change": "Modifier",
"Site control": "Opérations",
"Update": "Mettre à jour",
"Pause": "Suspendre",
"Resume": "Reprendre",
"Delete": "Supprimer",
"Are you sure?": "Êtes-vous certain?",
"Site address": "Adresse du site",
"Donate": "Faire un don",
"Missing files": "Fichiers manquants",
"{} try": "{} essai",
"{} tries": "{} essais",
"+ {num_bad_files} more": "+ {num_bad_files} manquants",
"This is my site": "Ce site m'appartient",
"Site title": "Nom du site",
"Site description": "Description du site",
"Save site settings": "Enregistrer les paramètres",
"Content publishing": "Publication du contenu",
"Choose": "Sélectionner",
"Sign": "Signer",
"Publish": "Publier",
"This function is disabled on this proxy": "Cette fonction est désactivé sur ce proxy",
"GeoLite2 City database download error: {}!<br>Please download manually and unpack to data dir:<br>{}": "Erreur au téléchargement de la base de données GeoLite2: {}!<br>Téléchargez et décompressez dans le dossier data:<br>{}",
"Downloading GeoLite2 City database (one time only, ~20MB)...": "Téléchargement de la base de données GeoLite2 (une seule fois, ~20MB)...",
"GeoLite2 City database downloaded!": "Base de données GeoLite2 téléchargée!",
"Are you sure?": "Êtes-vous certain?",
"Site storage limit modified!": "Taille maximale modifiée!",
"Database schema reloaded!": "Base de données rechargée!",
"Database rebuilding....": "Reconstruction de la base de données...",
"Database rebuilt!": "Base de données reconstruite!",
"Site updated!": "Site mis à jour!",
"Delete this site": "Supprimer ce site",
"File write error: ": "Erreur à l'écriture du fichier: ",
"Site settings saved!": "Paramètres du site enregistrés!",
"Enter your private key:": "Entrez votre clé privée:",
" Signed!": " Signé!",
"WebGL not supported": "WebGL n'est pas supporté"
}

@@ -1,82 +0,0 @@
{
"Peers": "Csatlakozási pontok",
"Connected": "Csaltakozva",
"Connectable": "Csatlakozható",
"Connectable peers": "Csatlakozható peer-ek",
"Data transfer": "Adatátvitel",
"Received": "Fogadott",
"Received bytes": "Fogadott byte-ok",
"Sent": "Küldött",
"Sent bytes": "Küldött byte-ok",
"Files": "Fájlok",
"Total": "Összesen",
"Image": "Kép",
"Other": "Egyéb",
"User data": "Felh. adat",
"Size limit": "Méret korlát",
"limit used": "felhasznált",
"free space": "szabad hely",
"Set": "Beállít",
"Optional files": "Opcionális fájlok",
"Downloaded": "Letöltött",
"Download and help distribute all files": "Minden opcionális fájl letöltése",
"Total size": "Teljes méret",
"Downloaded files": "Letöltve",
"Database": "Adatbázis",
"search feeds": "Keresés források",
"{feeds} query": "{feeds} lekérdezés",
"Reload": "Újratöltés",
"Rebuild": "Újraépítés",
"No database found": "Adatbázis nem található",
"Identity address": "Azonosító cím",
"Change": "Módosít",
"Site control": "Oldal műveletek",
"Update": "Frissít",
"Pause": "Szünteltet",
"Resume": "Folytat",
"Delete": "Töröl",
"Are you sure?": "Biztos vagy benne?",
"Site address": "Oldal címe",
"Donate": "Támogatás",
"Missing files": "Hiányzó fájlok",
"{} try": "{} próbálkozás",
"{} tries": "{} próbálkozás",
"+ {num_bad_files} more": "+ még {num_bad_files} darab",
"This is my site": "Ez az én oldalam",
"Site title": "Oldal neve",
"Site description": "Oldal leírása",
"Save site settings": "Oldal beállítások mentése",
"Content publishing": "Tartalom publikálás",
"Choose": "Válassz",
"Sign": "Aláírás",
"Publish": "Publikálás",
"This function is disabled on this proxy": "Ez a funkció ki van kapcsolva ezen a proxy-n",
"GeoLite2 City database download error: {}!<br>Please download manually and unpack to data dir:<br>{}": "GeoLite2 város adatbázis letöltési hiba: {}!<br>A térképhez töltsd le és csomagold ki a data könyvtárba:<br>{}",
"Downloading GeoLite2 City database (one time only, ~20MB)...": "GeoLite2 város adatbázis letöltése (csak egyszer kell, kb 20MB)...",
"GeoLite2 City database downloaded!": "GeoLite2 város adatbázis letöltve!",
"Are you sure?": "Biztos vagy benne?",
"Site storage limit modified!": "Az oldalt méret korlát módosítva!",
"Database schema reloaded!": "Adatbázis séma újratöltve!",
"Database rebuilding....": "Adatbázis újraépítés...",
"Database rebuilt!": "Adatbázis újraépítve!",
"Site updated!": "Az oldal frissítve!",
"Delete this site": "Az oldal törlése",
"File write error: ": "Fájl írási hiba: ",
"Site settings saved!": "Az oldal beállításai elmentve!",
"Enter your private key:": "Add meg a prviát kulcsod:",
" Signed!": " Aláírva!",
"WebGL not supported": "WebGL nem támogatott"
}

Some files were not shown because too many files have changed in this diff.