Merge branch 'master' into feature/reverse_proxy_config

fyears 2024-04-05 10:36:46 +08:00 committed by GitHub
commit bac23f3356
60 changed files with 4653 additions and 2997 deletions

View File

@ -23,17 +23,16 @@ This is yet another unofficial sync plugin for Obsidian. If you like it or find
- Webdav
- [Here](./docs/services_connectable_or_not.md) shows more connectable (or not-connectable) services in details.
- **Obsidian Mobile supported.** Vaults can be synced across mobile and desktop devices with the cloud service as the "broker".
- **[End-to-end encryption](./docs/encryption.md) supported.** Files would be encrypted using openssl format before being sent to the cloud **if** the user specifies a password.
- **[End-to-end encryption](./docs/encryption/README.md) supported.** Files would be encrypted using openssl format before being sent to the cloud **if** the user specifies a password.
- **Scheduled auto sync supported.** You can also manually trigger the sync using the sidebar ribbon, or using the command from the command palette (or even bind a hotkey combination to the command and press it).
- **[Minimal Intrusive](./docs/minimal_intrusive_design.md).**
- **Skip Large files** and **skip paths** by custom regex conditions!
- **Fully open source under [Apache-2.0 License](./LICENSE).**
- **[Sync Algorithm open](./docs/sync_algorithm_v2.md) for discussion.**
- **[Sync Algorithm open](./docs/sync_algorithm/v3/intro.md) for discussion.**
- **[Basic Conflict Detection And Handling](./docs/sync_algorithm/v3/intro.md)** now, more to come!
## Limitations
- **To support deletion sync, extra metadata will also be uploaded.** See [Minimal Intrusive](./docs/minimal_intrusive_design.md).
- **No conflict resolution. No content-diff-and-patch algorithm.** All files and folders are compared using their local and remote "last modified time", and those with the later "last modified time" win.
- **Cloud services cost you money.** Always be aware of the costs and pricing. Specifically, all operations, including but not limited to downloading, uploading, listing files, and calling any API, as well as the storage size, may or may not cost you money.
- **Some limitations from the browser environment.** More technical details are [in the doc](./docs/browser_env.md).
- **You should protect your `data.json` file.** The file contains sensitive information.
@ -75,6 +74,7 @@ Additionally, the plugin author may occasionally visit Obsidian official forum a
- If you want to enable end-to-end encryption, also set a password in settings. If you do not specify a password, the files and folders are synced in plain, original content to the cloud.
- Click the new "circle arrow" icon on the ribbon (the left sidebar), **every time** you want to sync your vault between local and remote. (Or, you could configure auto sync in the settings panel (See next chapter).) While syncing, the icon becomes "two half-circle arrows". Besides clicking the icon on the sidebar ribbon, you can also activate the corresponding command in the command palette.
- **Be patient while syncing.** Especially during the first-time sync.
- If you want to sync the files across multiple devices, **your vault name should be the same** while using default settings.
### Dropbox
@ -82,6 +82,7 @@ Additionally, the plugin author may occasionally visit Obsidian official forum a
- After the authorization, the plugin can read your name and email (which cannot be unselected in the Dropbox API), and read and write files in your Dropbox's `/Apps/remotely-save` folder.
- If you decide to authorize this plugin to connect to Dropbox, please go to plugin's settings, and choose Dropbox then follow the instructions. [More with screenshot is here](./docs/dropbox_review_material/README.md).
- Password-based end-to-end encryption is also supported. But please be aware that **the vault name itself is not encrypted**.
- If you want to sync the files across multiple devices, **your vault name should be the same** while using default settings.
### OneDrive for personal
@ -90,6 +91,8 @@ Additionally, the plugin author may occasionally visit Obsidian official forum a
- After the authorization, the plugin can read your name and email, and read and write files in your OneDrive's `/Apps/remotely-save` folder.
- If you decide to authorize this plugin to connect to OneDrive, please go to plugin's settings, and choose OneDrive then follow the instructions.
- Password-based end-to-end encryption is also supported. But please be aware that **the vault name itself is not encrypted**.
- If you want to sync the files across multiple devices, **your vault name should be the same** while using default settings.
- You might also want to check out the [FAQ for OneDrive](./docs/remote_services/onedrive/README.md).
### webdav
@ -102,6 +105,7 @@ Additionally, the plugin author may occasionally visit Obsidian official forum a
- Very old versions of Obsidian need [configuring CORS](./docs/remote_services/webdav_general/webav_cors.md).
- Your data would be synced to a `${vaultName}` sub folder on your webdav server.
- Password-based end-to-end encryption is also supported. But please be aware that **the vault name itself is not encrypted**.
- If you want to sync the files across multiple devices, **your vault name should be the same** while using default settings.
## Scheduled Auto Sync

View File

@ -0,0 +1,8 @@
# Encryption
Currently (March 2024), Remotely Save supports two end-to-end encryption formats:
1. [RClone Crypt](./rclone.md) format, which is the recommended way now.
2. [OpenSSL enc](./openssl.md) format
There is also a [comparison](./comparation.md) between them.

View File

@ -0,0 +1,23 @@
# Comparison Between Encryption Formats
## Warning
**ALWAYS BACKUP YOUR VAULT MANUALLY!!!**
If you switch between RClone Crypt format and OpenSSL enc format, you have to delete the cloud vault files **manually** and **fully**, so that the plugin can re-sync (i.e. re-upload) the newly encrypted versions to the cloud.
## The feature table
| | RClone Crypt | OpenSSL enc | comments |
| ------------------------ | ------------------------------------------------------------------------------------------ | -------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| key generation | scrypt with fixed salt | PBKDF2 with dynamic salt | scrypt is better than PBKDF2 from an algorithmic standpoint, but RClone uses a fixed salt by default. The chosen parameters also affect the result. |
| content encryption | XSalsa20Poly1305 on chunks | AES-256-CBC | XSalsa20Poly1305 is way better than AES-256-CBC. And encryption by chunks should require fewer resources. |
| file name encryption | EME on each segment of the path | AES-256-CBC on the whole path | RClone has the benefit, as well as the pitfall, that the path structure is preserved. Maybe it's more of a design decision difference? No comment on EME vs AES-256-CBC. |
| viewing decrypted result | RClone has a command that can mount the encrypted vault as if the encryption were transparent. | No convenient way we are aware of, except writing some scripts. | RClone is way more convenient. |
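To make the key-generation row concrete, here is a minimal TypeScript sketch of the two approaches using Node's `crypto` module. The salt, iteration count, and key length below are illustrative assumptions, not necessarily the exact parameters used by RClone or this plugin.
```typescript
import { scryptSync, pbkdf2Sync, randomBytes } from "crypto";

const password = "somepassword";

// RClone-style: scrypt with a fixed (default) salt, so the same password
// always derives the same key.
const FIXED_SALT = Buffer.from("illustrative-fixed-salt"); // assumption
const rcloneStyleKey = scryptSync(password, FIXED_SALT, 32);

// OpenSSL-enc-style: PBKDF2 with a dynamic (random) salt, so the salt must
// be stored alongside the ciphertext (openssl's "Salted__" header).
const dynamicSalt = randomBytes(8);
const opensslStyleKey = pbkdf2Sync(password, dynamicSalt, 20000, 32, "sha256");

console.log(rcloneStyleKey.length, opensslStyleKey.length); // 32 32
```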
## Some notes
1. Anyway, security is a hard problem. The author of Remotely Save doesn't have sufficient knowledge to "judge" which one is the better format. **Use them at your own risk.**
2. Currently the RClone Crypt format is recommended by default in Remotely Save, purely because of the personal taste of the Remotely Save author, who likes RClone.
3. **Always use a long password.**
4. Both algorithms are selected deliberately to **be compatible with some well-known third-party tools** (instead of some home-made methods) and **have many tests to ensure the correctness**.

View File

@ -1,10 +1,22 @@
# Encryption
# OpenSSL enc format
If a password is set, the files are encrypted before being sent to the cloud.
The encryption algorithm is deliberately designed to be aligned with the openssl format.
## Warning
1. The encryption algorithm is implemented using web-crypto.
**ALWAYS BACKUP YOUR VAULT MANUALLY!!!**
If you switch between RClone Crypt format and OpenSSL enc format, you have to delete the cloud vault files **manually** and **fully**, so that the plugin can re-sync (i.e. re-upload) the newly encrypted versions to the cloud.
## Comparison between encryption formats
See the doc [Comparison](./comparation.md).
## Interoperability with official OpenSSL
This encryption algorithm is deliberately designed to be aligned with the openssl format.
1. The encryption algorithm is implemented using web-crypto, using AES-256-CBC.
2. The file content is encrypted using the openssl format. Assuming a file named `sometext.txt` and a password `somepassword`, the encryption is equivalent to the following command:
```bash
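# NOTE: the command below is a reconstruction (an assumption), since the
# diff truncates the original; the exact flags may differ from the canonical doc
openssl enc -aes-256-cbc -pbkdf2 -iter 20000 -pass pass:somepassword \
  -in ./sometext.txt -out ./sometext.txt.enc
```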

46
docs/encryption/rclone.md Normal file
View File

@ -0,0 +1,46 @@
# RClone Crypt format
The encryption is compatible with RClone Crypt using the **base64** file name encryption format.
It's developed based on another js project by the same author of Remotely Save: [`@fyears/rclone-crypt`](https://github.com/fyears/rclone-crypt), which is NOT an official library from RClone, and is NOT affiliated with RClone.
Reasonable tests are also ported from official RClone code, to ensure the compatibility and correctness of the encryption.
## Warning
**ALWAYS BACKUP YOUR VAULT MANUALLY!!!**
If you switch between RClone Crypt format and OpenSSL enc format, you have to delete the cloud vault files **manually** and **fully**, so that the plugin can re-sync (i.e. re-upload) the newly encrypted versions to the cloud.
## Comparison between encryption formats
See the doc [Comparison](./comparation.md).
## Interoperability with official RClone
Please note that the plugin uses **base64** for encrypted file names, while official RClone uses **base32** file names by default. The intention is purely to potentially support longer file names.
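To see the size difference concretely, here is a minimal sketch using the `rfc4648` package (which the plugin already depends on elsewhere); the 32-byte input is an arbitrary stand-in for an encrypted file name:
```typescript
import { base32, base64url } from "rfc4648";

// an arbitrary stand-in for the encrypted bytes of a file name
const encryptedName = new Uint8Array(32).fill(0xab);

const b32 = base32.stringify(encryptedName, { pad: false });
const b64 = base64url.stringify(encryptedName, { pad: false });

// base32 expands data by 8/5, base64 only by 4/3, so base64 names stay
// shorter under the same file name length limits
console.log(b32.length, b64.length); // 52 vs 43 characters
```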
You could set up the RClone profile by calling `rclone config`. You need to create two profiles, one for your original connection and the other for RClone Crypt.
Finally, a working config file should look like this:
```ini
[webdav1]
type = webdav
url = https://example.com/sharefolder1/subfolder1 # the same as the web address in Remotely Save settings.
vendor = other
user = <some webdav username>
pass = <some webdav password, obfuscated>
[webdav1crypt]
type = crypt
remote = webdav1:vaultname # the same as your "Remote Base Directory" (usually the vault name) in Remotely Save settings
password = <some encryption password, obfuscated>
filename_encoding = base64 # don't forget this!!!
```
You can use the `mount` command to view the files in your file explorer! On Windows, the command should look like this (the remote vault is mounted as drive `X:`):
```bash
rclone mount webdav1crypt: X: --network-mode
```

View File

@ -12,8 +12,8 @@ See [here](./export_sync_plans.md).
See [here](./check_console_output.md).
## Advanced: Save Console Output Then Read Them Later
## Advanced: Use `Logstravaganza` to export logs
This method works for desktop and mobile devices (iOS, Android).
This method works for desktop and mobile devices (iOS, Android), especially useful for iOS.
See [here](./save_console_output_and_export.md).
See [here](./use_logstravaganza.md).

View File

@ -1,25 +0,0 @@
# Save Console Output And Read Them Later
## Disable Auto Sync First
You should disable auto sync to avoid any unexpected running.
## Set The Output Level To Debug
Go to the plugin settings, scroll down to the section "Debug" -> "alter console log level", and change it from "info" to "debug".
## Enable Saving The Output To DB
Go to the plugin settings, scroll down to the section "Debug" -> "Save Console Logs Into DB", and change it from "disable" to "enable". **This setting has some performance cost, so do NOT always turn this on when not necessary!**
## Run The Sync
Trigger the sync manually (by clicking the icon on the ribbon sidebar). Something (hopefully) helpful should show up in the console. The console logs are also saved into the DB now.
## Export The Output And Read The Logs
Go to the plugin settings, scroll down to the section "Debug" -> "Export Console Logs From DB", and click the button. A new file `log_hist_exported_on_....md` should be created inside the special folder `_debug_remotely_save/`. You could read it and hopefully find something useful.
## Disable Saving The Output To DB
After debugging, go to the plugin settings, scroll down to the section "Debug" -> "Save Console Logs Into DB", and change it from "enable" to "disable".

View File

@ -0,0 +1,14 @@
# Use `Logstravaganza`
On iOS, it's quite hard to directly check the console logs.
Luckily, there is a third-party plugin: [`Logstravaganza`](https://obsidian.md/plugins?search=Logstravaganza#), by Carlo Zottmann, that can redirect the output to a note.
You can just:
1. Install it.
2. Enable it.
3. Do something, to trigger some console logs.
4. Check out `LOGGING-NOTE (device name).md` in the root of your vault.
See more on its site: <https://github.com/czottmann/obsidian-logstravaganza>.

56
docs/linux.md Normal file
View File

@ -0,0 +1,56 @@
# How to receive `obsidian://` in Linux
## Background
For example, when we are authorizing OneDrive, we have to jump back to Obsidian automatically using `obsidian://`.
## Short Desc From Official Obsidian Doc
The official doc has some explanation:
<https://help.obsidian.md/Extending+Obsidian/Obsidian+URI#Register+Obsidian+URI>
## Long Desc
Assuming the username is `somebody`, and the `.AppImage` file is downloaded to `~/Desktop`.
1. Download and **extract** the AppImage file in the terminal
```bash
cd /home/somebody/Desktop
chmod +x Obsidian-x.y.z.AppImage
./Obsidian-x.y.z.AppImage --appimage-extract
# you should have the folder squashfs-root
# we want to rename it
mv squashfs-root Obsidian
```
2. Create a `.desktop` file
```bash
# copy and paste the following MULTI-LINE command
# remember to adjust the path
cat > ~/Desktop/obsidian.desktop <<EOF
[Desktop Entry]
Name=Obsidian
Comment=obsidian
Exec=/home/somebody/Desktop/Obsidian/obsidian %u
Keywords=obsidian
StartupNotify=true
Terminal=false
Type=Application
Icon=/home/somebody/Desktop/Obsidian/obsidian.png
MimeType=x-scheme-handler/obsidian;
EOF
# yeah we can check out the output
cat ~/Desktop/obsidian.desktop
## [Desktop Entry]
## ...
```
3. Right click the `obsidian.desktop` file on the Desktop, and click "Allow launching"
4. Double click the `obsidian.desktop` file.

View File

@ -1,8 +1,10 @@
# Minimal Intrusive Design
Before version 0.3.0, the plugin did not upload additional meta data to the remote.
~~Before version 0.3.0, the plugin did not upload additional meta data to the remote.~~
From and after version 0.3.0, the plugin just uploads the minimal extra necessary meta data to the remote.
~~From version 0.3.0 ~ 0.3.40, the plugin just uploads the minimal extra necessary meta data to the remote.~~
From version 0.4.1 onwards, the plugin doesn't need to upload meta data, due to the sync algorithm upgrade.
## Benefits
@ -12,10 +14,14 @@ For example, it's possible for a user to manually upload a file to s3, and next
And it's also possible to combine another "sync-to-s3" solution (like, another software) on desktops, and this plugin on mobile devices, together.
## Necessity Of Uploading Extra Metadata
## ~~Necessity Of Uploading Extra Metadata from 0.3.0 ~ 0.3.40~~
The main issue comes from deletions (and renamings, which are actually interpreted as "deletion-then-creation").
~~The main issue comes from deletions (and renamings, which are actually interpreted as "deletion-then-creation").~~
If we don't upload any extra info to the remote, there's usually no way for the second device to know what files / folders have been deleted on the first device.
~~If we don't upload any extra info to the remote, there's usually no way for the second device to know what files / folders have been deleted on the first device.~~
To overcome this issue, from and after version 0.3.0, the plugin uploads extra metadata files `_remotely-save-metadata-on-remote.{json,bin}` to users' configured cloud services. Those files contain some info about what has been deleted on the first device, so that the second device can read the list to apply the deletions to itself. Some other necessary meta info would also be written into the extra files.
~~To overcome this issue, from and after version 0.3.0, the plugin uploads extra metadata files `_remotely-save-metadata-on-remote.{json,bin}` to users' configured cloud services. Those files contain some info about what has been deleted on the first device, so that the second device can read the list to apply the deletions to itself. Some other necessary meta info would also be written into the extra files.~~
## No Uploading Extra Metadata From 0.4.1
Some information, including the previous successful sync status of each file, is kept locally instead.
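As a rough illustration (an assumption, not the actual persisted schema), the locally kept record per file could be sketched like this, modeled on the `Entity` shape in `src/baseTypes.ts`:
```typescript
// illustrative sketch only; field selection is an assumption modeled on
// the Entity interface in src/baseTypes.ts
interface PrevSyncRecord {
  key: string;           // file path inside the vault
  mtimeSvr?: number;     // remote modification time at the last successful sync
  prevSyncTime?: number; // when the last successful sync happened
  sizeRaw: number;       // unencrypted size at the last successful sync
  hash?: string;         // optional content hash
}

// with such records, a deletion can be inferred without any remote metadata:
// a file present in prev-sync history but missing locally was deleted locally
function wasDeletedLocally(
  prev: PrevSyncRecord | undefined,
  localExists: boolean
): boolean {
  return prev !== undefined && !localExists;
}
```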

View File

@ -0,0 +1,23 @@
# OneDrive
- **This plugin is NOT an official Microsoft / OneDrive product.** The plugin just uses Microsoft's [OneDrive's public API](https://docs.microsoft.com/en-us/onedrive/developer/rest-api).
- After the authorization, the plugin can read your name and email, and read and write files in your OneDrive's `/Apps/remotely-save` folder.
- If you decide to authorize this plugin to connect to OneDrive, please go to plugin's settings, and choose OneDrive then follow the instructions.
- Password-based end-to-end encryption is also supported. But please be aware that **the vault name itself is not encrypted**.
- If you want to sync the files across multiple devices, **your vault name should be the same** while using default settings.
## FAQ
### How about OneDrive for Business?
This plugin only works for "OneDrive for personal", and does not work for "OneDrive for Business" (yet). See [#11](https://github.com/fyears/remotely-save/issues/11) for further details.
### I cannot find `/Apps/remotely-save` folder
Mysteriously, some users report that their OneDrive generates `/Application/Graph` instead of `/Apps/remotely-save`. See [#517](https://github.com/remotely-save/remotely-save/issues/517).
The solution is simple:
1. Backup your vault manually.
2. Go to the OneDrive website (<https://onedrive.live.com/>), and rename `/Application/Graph` to `/Application/remotely-save` (right click on the folder and you will see the rename option).
3. Come back to Obsidian and try to sync!

View File

@ -28,19 +28,12 @@ Using the principle of least privilege is crucial for security when allowing a t
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "ObsidianBucket",
"Effect": "Allow",
"Action": [
"s3:HeadBucket"
],
"Resource": "arn:aws:s3:::my-bucket"
},
{
"Sid": "ObsidianObjects",
"Effect": "Allow",
"Action": [
"s3:HeadObject",
"s3:ListBucket",
"s3:PutObject",
"s3:CopyObject",
"s3:UploadPart",
@ -56,7 +49,10 @@ Using the principle of least privilege is crucial for security when allowing a t
"s3:DeleteObject",
"s3:DeleteObjects"
],
"Resource": "arn:aws:s3:::my-bucket/*"
"Resource": [
"arn:aws:s3:::my-bucket",
"arn:aws:s3:::my-bucket/*"
]
}
]
}

View File

@ -8,10 +8,10 @@
1. Register an account. Login.
2. Create a bucket.
3. Create S3 Credentials in Access Management. Allow all permissions for the bucket. Remember the access key and secret key and the end point. The end point is likely to be [`https://gateway.storjshare.io`](https://docs.storj.io/dcs/api/s3/s3-compatible-gateway).
![](./storj_create_s3_cred_1.png)
![](./storj_create_s3_cred_2.png)
4. Input your credentials into remotely-save settings. Region [should be `global`](https://docs.storj.io/dcs/api/s3/s3-compatibility).
![](storj_remotely_save_settings.png)
5. Check connectivity.
6. Sync!

View File

@ -0,0 +1,7 @@
# Sync Algorithm
- [v1](./v1/README.md)
- [v2](./v2/README.md)
- v3
- [intro doc for end users](./v3/intro.md)
- [design doc](./v3/design.md)

View File

@ -0,0 +1,4 @@
# Sync Algorithm V3
- [intro doc for end users](./intro.md)
- [design doc](./design.md)

View File

@ -0,0 +1,71 @@
# Sync Algorithm V3
Drafted on 20240117.
An absolutely better sync algorithm: better for tracking deletions and better for subbranching.
## Huge Thanks
Basically a combination of algorithm v2 + [synclone](https://github.com/Jwink3101/syncrclone/blob/master/docs/algorithm.md) + [rsinc](https://github.com/ConorWilliams/rsinc) + (some of rclone [bisync](https://rclone.org/bisync/)). All of the latter three are released under the MIT License, so no worries about the licenses.
## Features
Must have
1. true deletion detection
2. deletion protection (blocking) with a setting
3. transaction from the old algorithm
4. user warnings show up, **the new algorithm needs all clients to be updated!** (deliberately corrupt the metadata file??)
5. filters
6. conflict warning
7. partial sync
Nice to have
1. true time and hash
2. conflict rename
## Description
We have _five_ input sources:
1. local all files
2. remote all files
3. _local previous succeeded sync history_
4. local deletions
5. remote deletions.
Initial run, consuming remote deletions:
Change the history data into the _local previous succeeded sync history_.
Later runs use the first, second, and third sources **only**.
The bidirectional table is modified based on synclone and rsinc. The incremental push-only / pull-only tables are further modified based on the bidirectional table. The number inside each table cell is the decision branch in the code.
Bidirectional:
| local\remote | remote unchanged | remote modified | remote deleted | remote created |
| --------------- | ------------------ | ------------------------- | ------------------ | ------------------------- |
| local unchanged | (02/21) do nothing | (09) pull | (07) delete local | (??) conflict |
| local modified | (10) push | (16/17/18/19/20) conflict | (08) push | (??) conflict |
| local deleted | (04) delete remote | (05) pull | (01) clean history | (03) pull |
| local created | (??) conflict | (??) conflict | (06) push | (11/12/13/14/15) conflict |
Incremental push only:
| local\remote | remote unchanged | remote modified | remote deleted | remote created |
| --------------- | ---------------------------- | ---------------------------- | ---------------------- | ---------------------------- |
| local unchanged | (02/21) do nothing | **(26) conflict push** | **(32) conflict push** | (??) conflict |
| local modified | (10) push | **(25) conflict push** | (08) push | (??) conflict |
| local deleted | **(29) conflict do nothing** | **(30) conflict do nothing** | (01) clean history | **(28) conflict do nothing** |
| local created | (??) conflict | (??) conflict | (06) push | **(23) conflict push** |
Incremental pull only:
| local\remote | remote unchanged | remote modified | remote deleted | remote created |
| --------------- | ---------------------- | ---------------------- | ---------------------------- | ---------------------- |
| local unchanged | (02/21) do nothing | (09) pull | **(33) conflict do nothing** | (??) conflict |
| local modified | **(27) conflict pull** | **(24) conflict pull** | **(34) conflict do nothing** | (??) conflict |
| local deleted | **(35) conflict pull** | (05) pull | (01) clean history | (03) pull |
| local created | (??) conflict | (??) conflict | **(31) conflict do nothing** | **(22) conflict pull** |
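As a minimal sketch of how such a table can be driven in code (the names and the mtime-equality check below are illustrative assumptions, not the plugin's actual implementation), each side is first classified against the previous successful sync, and the pair is then looked up:
```typescript
type SideState = "unchanged" | "modified" | "deleted" | "created";

// classify one side (local or remote) against the previous successful sync;
// plain mtime equality is a stand-in for the real comparison
function classify(
  current: { mtime: number } | undefined,
  prevSync: { mtime: number } | undefined
): SideState {
  if (current !== undefined && prevSync === undefined) return "created";
  if (current === undefined && prevSync !== undefined) return "deleted";
  if (current === undefined && prevSync === undefined) {
    throw new Error("file never existed; not a table row");
  }
  return current!.mtime === prevSync!.mtime ? "unchanged" : "modified";
}

// the bidirectional table above then becomes a two-key lookup
function decideBidirectional(local: SideState, remote: SideState): string {
  const table: Record<SideState, Record<SideState, string>> = {
    unchanged: { unchanged: "do nothing", modified: "pull", deleted: "delete local", created: "conflict" },
    modified: { unchanged: "push", modified: "conflict", deleted: "push", created: "conflict" },
    deleted: { unchanged: "delete remote", modified: "pull", deleted: "clean history", created: "pull" },
    created: { unchanged: "conflict", modified: "conflict", deleted: "push", created: "conflict" },
  };
  return table[local][remote];
}
```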

View File

@ -0,0 +1,14 @@
# Introduction To Sync Algorithm V3
- [x] sync conflict: keep newer
- [x] sync conflict: keep larger
- [ ] sync conflict: keep both and rename
- [ ] sync conflict: show warning
- [x] deletion: true deletion status computation
- [x] meta data: no remote meta data any more
- [x] migration: old data auto transfer to new db (hopefully)
- [x] sync direction: incremental push only
- [x] sync direction: incremental pull only
- [x] sync protection: warning based on the threshold
- [ ] partial sync: better sync on save
- [x] encryption: new encryption method, see [this](../../encryption/)

View File

@ -1,38 +0,0 @@
# Sync Algorithm V3
Drafted on 20240117.
An absolutely better sync algorithm: better for tracking deletions and better for subbranching.
## Huge Thanks
Basically a combination of algorithm v2 + [synclone](https://github.com/Jwink3101/syncrclone) + [rsinc](https://github.com/ConorWilliams/rsinc) + (some of rclone [bisync](https://rclone.org/bisync/)). All of the latter three are released under the MIT License, so no worries about the licenses.
## Features
Must have
1. true deletion detection
2. deletion protection (blocking) with a setting
3. transaction from the old algorithm
4. user warnings show up, **the new algorithm needs all clients to be updated!** (deliberately corrupt the metadata file??)
5. filters
6. conflict warning
7. partial sync
Nice to have
1. true time and hash
2. conflict rename
## Description
We have _five_ input sources: local all files, remote all files, _local previous succeeded sync history_, local deletions, remote deletions.
Initial run, consuming local deletions and remote deletions:
TBD
Later runs, use the first, second, third sources **only**.
TBD

View File

@ -1,6 +1,7 @@
import dotenv from "dotenv/config";
import esbuild from "esbuild";
import process from "process";
import inlineWorkerPlugin from "esbuild-plugin-inline-worker";
// import builtins from 'builtin-modules'
const banner = `/*
@ -54,6 +55,7 @@ esbuild
"process.env.NODE_DEBUG": `undefined`, // ugly fix
"process.env.DEBUG": `undefined`, // ugly fix
},
plugins: [inlineWorkerPlugin()],
})
.then((context) => {
if (process.argv.includes("--watch")) {

View File

@ -1,7 +1,7 @@
{
"id": "remotely-save",
"name": "Remotely Save",
"version": "0.3.38",
"version": "0.4.16",
"minAppVersion": "0.13.21",
"description": "Yet another unofficial plugin allowing users to synchronize notes between local device and the cloud service.",
"author": "fyears",

View File

@ -1,7 +1,7 @@
{
"id": "remotely-save",
"name": "Remotely Save",
"version": "0.3.38",
"version": "0.4.16",
"minAppVersion": "0.13.21",
"description": "Yet another unofficial plugin allowing users to synchronize notes between local device and the cloud service.",
"author": "fyears",

View File

@ -1,6 +1,6 @@
{
"name": "remotely-save",
"version": "0.3.38",
"version": "0.4.16",
"description": "This is yet another sync plugin for Obsidian app.",
"scripts": {
"dev2": "node esbuild.config.mjs --watch",
@ -24,7 +24,7 @@
"license": "Apache-2.0",
"devDependencies": {
"@microsoft/microsoft-graph-types": "^2.40.0",
"@types/chai": "^4.3.11",
"@types/chai": "^4.3.14",
"@types/chai-as-promised": "^7.1.8",
"@types/jsdom": "^21.1.6",
"@types/lodash": "^4.14.202",
@ -34,13 +34,14 @@
"@types/node": "^20.10.4",
"@types/qrcode": "^1.5.5",
"builtin-modules": "^3.3.0",
"chai": "^4.3.10",
"chai": "^4.4.1",
"chai-as-promised": "^7.1.1",
"cross-env": "^7.0.3",
"dotenv": "^16.3.1",
"esbuild": "^0.19.9",
"esbuild-plugin-inline-worker": "^0.1.1",
"jsdom": "^23.0.1",
"mocha": "^10.2.0",
"mocha": "^10.4.0",
"npm-check-updates": "^16.14.12",
"obsidian": "^1.4.11",
"prettier": "^3.1.1",
@ -50,7 +51,8 @@
"typescript": "^5.3.3",
"webdav-server": "^2.6.2",
"webpack": "^5.89.0",
"webpack-cli": "^5.1.4"
"webpack-cli": "^5.1.4",
"worker-loader": "^3.0.8"
},
"dependencies": {
"@aws-sdk/client-s3": "^3.474.0",
@ -58,6 +60,7 @@
"@aws-sdk/signature-v4-crt": "^3.474.0",
"@aws-sdk/types": "^3.468.0",
"@azure/msal-node": "^2.6.0",
"@fyears/rclone-crypt": "^0.0.7",
"@fyears/tsqueue": "^1.0.1",
"@microsoft/microsoft-graph-client": "^3.0.7",
"@smithy/fetch-http-handler": "^2.3.1",
@ -69,13 +72,12 @@
"aws-crt": "^1.20.0",
"buffer": "^6.0.3",
"crypto-browserify": "^3.12.0",
"delay": "^6.0.0",
"dropbox": "^10.34.0",
"emoji-regex": "^10.3.0",
"http-status-codes": "^2.3.0",
"localforage": "^1.10.0",
"localforage-getitems": "^1.4.2",
"lodash": "^4.17.21",
"loglevel": "^1.8.1",
"lucide": "^0.298.0",
"mime-types": "^2.1.35",
"mustache": "^4.2.0",
@ -90,7 +92,6 @@
"url": "^0.11.3",
"util": "^0.12.5",
"webdav": "^5.3.1",
"webdav-fs": "^4.0.1",
"xregexp": "^5.1.1"
}
}

View File

@ -84,6 +84,15 @@ export interface OnedriveConfig {
remoteBaseDir?: string;
}
export type SyncDirectionType =
| "bidirectional"
| "incremental_pull_only"
| "incremental_push_only";
export type CipherMethodType = "rclone-base64" | "openssl-base64" | "unknown";
export type QRExportType = "all_but_oauth2" | "dropbox" | "onedrive";
export interface RemotelySavePluginSettings {
s3: S3Config;
webdav: WebdavConfig;
@ -95,16 +104,32 @@ export interface RemotelySavePluginSettings {
autoRunEveryMilliseconds?: number;
initRunAfterMilliseconds?: number;
syncOnSaveAfterMilliseconds?: number;
agreeToUploadExtraMetadata?: boolean;
concurrency?: number;
syncConfigDir?: boolean;
syncUnderscoreItems?: boolean;
lang?: LangTypeAndAuto;
agreeToUseSyncV3?: boolean;
skipSizeLargerThan?: number;
ignorePaths?: string[];
enableStatusBarInfo?: boolean;
deleteToWhere?: "system" | "obsidian";
conflictAction?: ConflictActionType;
howToCleanEmptyFolder?: EmptyFolderCleanType;
protectModifyPercentage?: number;
syncDirection?: SyncDirectionType;
obfuscateSettingFile?: boolean;
enableMobileStatusBar?: boolean;
encryptionMethod?: CipherMethodType;
/**
* @deprecated
*/
agreeToUploadExtraMetadata?: boolean;
/**
* @deprecated
@ -117,14 +142,6 @@ export interface RemotelySavePluginSettings {
logToDB?: boolean;
}
export interface RemoteItem {
key: string;
lastModified?: number;
size: number;
remoteType: SUPPORTED_SERVICES_TYPE;
etag?: string;
}
export const COMMAND_URI = "remotely-save";
export const COMMAND_CALLBACK = "remotely-save-cb";
export const COMMAND_CALLBACK_ONEDRIVE = "remotely-save-cb-onedrive";
@ -140,32 +157,83 @@ export interface UriParams {
// 80 days
export const OAUTH2_FORCE_EXPIRE_MILLISECONDS = 1000 * 60 * 60 * 24 * 80;
type DecisionTypeForFile =
| "skipUploading" // special, mtimeLocal === mtimeRemote
| "uploadLocalDelHistToRemote" // "delLocalIfExists && delRemoteIfExists && cleanLocalDelHist && uploadLocalDelHistToRemote"
| "keepRemoteDelHist" // "delLocalIfExists && delRemoteIfExists && cleanLocalDelHist && keepRemoteDelHist"
| "uploadLocalToRemote" // "skipLocal && uploadLocalToRemote && cleanLocalDelHist && cleanRemoteDelHist"
| "downloadRemoteToLocal"; // "downloadRemoteToLocal && skipRemote && cleanLocalDelHist && cleanRemoteDelHist"
export type EmptyFolderCleanType = "skip" | "clean_both";
type DecisionTypeForFileSize =
| "skipUploadingTooLarge"
| "skipDownloadingTooLarge"
| "skipUsingLocalDelTooLarge"
| "skipUsingRemoteDelTooLarge"
| "errorLocalTooLargeConflictRemote"
| "errorRemoteTooLargeConflictLocal";
export type ConflictActionType = "keep_newer" | "keep_larger" | "rename_both";
type DecisionTypeForFolder =
| "createFolder"
| "uploadLocalDelHistToRemoteFolder"
| "keepRemoteDelHistFolder"
| "skipFolder";
export type DecisionTypeForMixedEntity =
| "only_history"
| "equal"
| "local_is_modified_then_push"
| "remote_is_modified_then_pull"
| "local_is_created_then_push"
| "remote_is_created_then_pull"
| "local_is_created_too_large_then_do_nothing"
| "remote_is_created_too_large_then_do_nothing"
| "local_is_deleted_thus_also_delete_remote"
| "remote_is_deleted_thus_also_delete_local"
| "conflict_created_then_keep_local"
| "conflict_created_then_keep_remote"
| "conflict_created_then_keep_both"
| "conflict_created_then_do_nothing"
| "conflict_modified_then_keep_local"
| "conflict_modified_then_keep_remote"
| "conflict_modified_then_keep_both"
| "folder_existed_both_then_do_nothing"
| "folder_existed_local_then_also_create_remote"
| "folder_existed_remote_then_also_create_local"
| "folder_to_be_created"
| "folder_to_skip"
| "folder_to_be_deleted_on_both"
| "folder_to_be_deleted_on_remote"
| "folder_to_be_deleted_on_local";
export type DecisionType =
| DecisionTypeForFile
| DecisionTypeForFileSize
| DecisionTypeForFolder;
/**
* uniform representation
* everything should be flat and primitive, so that we can copy.
*/
export interface Entity {
key?: string;
keyEnc?: string;
keyRaw: string;
mtimeCli?: number;
mtimeCliFmt?: string;
mtimeSvr?: number;
mtimeSvrFmt?: string;
prevSyncTime?: number;
prevSyncTimeFmt?: string;
size?: number; // might be unknown or to be filled
sizeEnc?: number;
sizeRaw: number;
hash?: string;
etag?: string;
synthesizedFolder?: boolean;
}
export interface UploadedType {
entity: Entity;
mtimeCli?: number;
}
/**
* A replacement of FileOrFolderMixedState
*/
export interface MixedEntity {
key: string;
local?: Entity;
prevSync?: Entity;
remote?: Entity;
decisionBranch?: number;
decision?: DecisionTypeForMixedEntity;
conflictAction?: ConflictActionType;
sideNotes?: any;
}
/**
* @deprecated
*/
export interface FileOrFolderMixedState {
key: string;
existLocal?: boolean;
@ -180,7 +248,7 @@ export interface FileOrFolderMixedState {
sizeRemoteEnc?: number;
changeRemoteMtimeUsingMapping?: boolean;
changeLocalMtimeUsingMapping?: boolean;
decision?: DecisionType;
decision?: string; // old DecisionType is deleted, fallback to string
decisionBranch?: number;
syncDone?: "done";
remoteEncryptedKey?: string;
@ -204,12 +272,14 @@ export const DEFAULT_DEBUG_FOLDER = "_debug_remotely_save/";
export const DEFAULT_SYNC_PLANS_HISTORY_FILE_PREFIX =
"sync_plans_hist_exported_on_";
export const DEFAULT_LOG_HISTORY_FILE_PREFIX = "log_hist_exported_on_";
export const DEFAULT_PROFILER_RESULT_FILE_PREFIX =
"profiler_results_exported_on_";
export type SyncTriggerSourceType =
| "manual"
| "auto"
| "dry"
| "autoOnceInit"
| "auto"
| "auto_once_init"
| "auto_sync_on_save";
export const REMOTELY_SAVE_VERSION_2022 = "0.3.25";

View File

@ -3,8 +3,6 @@ import { reverseString } from "./misc";
import type { RemotelySavePluginSettings } from "./baseTypes";
import { log } from "./moreOnLog";
const DEFAULT_README: string =
"The file contains sensitive info, so DO NOT take screenshot of, copy, or share it to anyone! It's also generated automatically, so do not edit it manually.";
@ -19,10 +17,10 @@ interface MessyConfigType {
export const messyConfigToNormal = (
x: MessyConfigType | RemotelySavePluginSettings | null | undefined
): RemotelySavePluginSettings | null | undefined => {
// log.debug("loading, original config on disk:");
// log.debug(x);
// console.debug("loading, original config on disk:");
// console.debug(x);
if (x === null || x === undefined) {
log.debug("the messy config is null or undefined, skip");
console.debug("the messy config is null or undefined, skip");
return x as any;
}
if ("readme" in x && "d" in x) {
@ -35,12 +33,12 @@ export const messyConfigToNormal = (
}) as Buffer
).toString("utf-8")
);
// log.debug("loading, parsed config is:");
// log.debug(y);
// console.debug("loading, parsed config is:");
// console.debug(y);
return y;
} else {
// return as is
// log.debug("loading, parsed config is the same");
// console.debug("loading, parsed config is the same");
return x;
}
};
@ -52,7 +50,7 @@ export const normalConfigToMessy = (
x: RemotelySavePluginSettings | null | undefined
) => {
if (x === null || x === undefined) {
log.debug("the normal config is null or undefined, skip");
console.debug("the normal config is null or undefined, skip");
return x;
}
const y = {
@ -63,7 +61,7 @@ export const normalConfigToMessy = (
})
),
};
// log.debug("encoding, encoded config is:");
// log.debug(y);
// console.debug("encoding, encoded config is:");
// console.debug(y);
return y;
};

View File

@ -1,97 +1,71 @@
import { TAbstractFile, TFolder, TFile, Vault } from "obsidian";
import type { SyncPlanType } from "./sync";
import { readAllSyncPlanRecordTextsByVault } from "./localdb";
import {
readAllProfilerResultsByVault,
readAllSyncPlanRecordTextsByVault,
} from "./localdb";
import type { InternalDBs } from "./localdb";
import { mkdirpInVault } from "./misc";
import { mkdirpInVault, unixTimeToStr } from "./misc";
import {
DEFAULT_DEBUG_FOLDER,
DEFAULT_LOG_HISTORY_FILE_PREFIX,
DEFAULT_PROFILER_RESULT_FILE_PREFIX,
DEFAULT_SYNC_PLANS_HISTORY_FILE_PREFIX,
FileOrFolderMixedState,
} from "./baseTypes";
import { log } from "./moreOnLog";
const turnSyncPlanToTable = (record: string) => {
const syncPlan: SyncPlanType = JSON.parse(record);
const { ts, tsFmt, remoteType, mixedStates } = syncPlan;
type allowedHeadersType = keyof FileOrFolderMixedState;
const headers: allowedHeadersType[] = [
"key",
"remoteEncryptedKey",
"existLocal",
"sizeLocal",
"sizeLocalEnc",
"mtimeLocal",
"deltimeLocal",
"changeLocalMtimeUsingMapping",
"existRemote",
"sizeRemote",
"sizeRemoteEnc",
"mtimeRemote",
"deltimeRemote",
"changeRemoteMtimeUsingMapping",
"decision",
"decisionBranch",
];
const lines = [
`ts: ${ts}${tsFmt !== undefined ? " / " + tsFmt : ""}`,
`remoteType: ${remoteType}`,
`| ${headers.join(" | ")} |`,
`| ${headers.map((x) => "---").join(" | ")} |`,
];
for (const [k1, v1] of Object.entries(syncPlan.mixedStates)) {
const k = k1 as string;
const v = v1 as FileOrFolderMixedState;
const singleLine = [];
for (const h of headers) {
const field = v[h];
if (field === undefined) {
singleLine.push("");
continue;
}
if (
h === "mtimeLocal" ||
h === "deltimeLocal" ||
h === "mtimeRemote" ||
h === "deltimeRemote"
) {
const fmt = v[(h + "Fmt") as allowedHeadersType] as string;
const s = `${field}${fmt !== undefined ? " / " + fmt : ""}`;
singleLine.push(s);
} else {
singleLine.push(field);
}
}
lines.push(`| ${singleLine.join(" | ")} |`);
}
return lines.join("\n");
};
export const exportVaultSyncPlansToFiles = async (
db: InternalDBs,
vault: Vault,
vaultRandomID: string
vaultRandomID: string,
howMany: number
) => {
log.info("exporting");
console.info("exporting sync plans");
await mkdirpInVault(DEFAULT_DEBUG_FOLDER, vault);
const records = await readAllSyncPlanRecordTextsByVault(db, vaultRandomID);
let md = "";
if (records.length === 0) {
md = "No sync plans history found";
} else {
md =
"Sync plans found:\n\n" +
records.map((x) => "```json\n" + x + "\n```\n").join("\n");
if (howMany <= 0) {
md =
"Sync plans found:\n\n" +
records.map((x) => "```json\n" + x + "\n```\n").join("\n");
} else {
md =
"Sync plans found:\n\n" +
records
.map((x) => "```json\n" + x + "\n```\n")
.slice(0, howMany)
.join("\n");
}
}
const ts = Date.now();
const filePath = `${DEFAULT_DEBUG_FOLDER}${DEFAULT_SYNC_PLANS_HISTORY_FILE_PREFIX}${ts}.md`;
await vault.create(filePath, md, {
mtime: ts,
});
log.info("finish exporting");
console.info("finish exporting sync plans");
};
export const exportVaultProfilerResultsToFiles = async (
db: InternalDBs,
vault: Vault,
vaultRandomID: string
) => {
console.info("exporting profiler results");
await mkdirpInVault(DEFAULT_DEBUG_FOLDER, vault);
const records = await readAllProfilerResultsByVault(db, vaultRandomID);
let md = "";
if (records.length === 0) {
md = "No profiler results found";
} else {
md =
"Profiler results found:\n\n" +
records.map((x) => "```\n" + x + "\n```\n").join("\n");
}
const ts = Date.now();
const filePath = `${DEFAULT_DEBUG_FOLDER}${DEFAULT_PROFILER_RESULT_FILE_PREFIX}${ts}.md`;
await vault.create(filePath, md, {
mtime: ts,
});
console.info("finish exporting profiler results");
};

View File

@ -1,8 +1,6 @@
import { base32, base64url } from "rfc4648";
import { bufferToArrayBuffer, hexStringToTypedArray } from "./misc";
import { log } from "./moreOnLog";
const DEFAULT_ITER = 20000;
// base32.stringify(Buffer.from('Salted__'))

251
src/encryptRClone.ts Normal file
View File

@ -0,0 +1,251 @@
import {
Cipher as CipherRCloneCryptPack,
encryptedSize,
} from "@fyears/rclone-crypt";
// @ts-ignore
import EncryptWorker from "./encryptRClone.worker";
interface RecvMsg {
status: "ok" | "error";
outputName?: string;
outputContent?: ArrayBuffer;
error?: any;
}
export const getSizeFromOrigToEnc = encryptedSize;
export class CipherRclone {
readonly password: string;
readonly cipher: CipherRCloneCryptPack;
readonly workers: Worker[];
init: boolean;
workerIdx: number;
constructor(password: string, workerNum: number) {
this.password = password;
this.init = false;
this.workerIdx = 0;
// console.debug("begin creating CipherRCloneCryptPack");
this.cipher = new CipherRCloneCryptPack("base64");
// console.debug("finish creating CipherRCloneCryptPack");
// console.debug("begin creating EncryptWorker");
this.workers = [];
for (let i = 0; i < workerNum; ++i) {
this.workers.push(new (EncryptWorker as any)() as Worker);
}
// console.debug("finish creating EncryptWorker");
}
closeResources() {
for (let i = 0; i < this.workers.length; ++i) {
this.workers[i].terminate();
}
}
async prepareByCallingWorker(): Promise<void> {
if (this.init) {
return;
}
// console.debug("begin prepareByCallingWorker");
await this.cipher.key(this.password, "");
// console.debug("finish getting key");
const res: Promise<void>[] = [];
for (let i = 0; i < this.workers.length; ++i) {
res.push(
new Promise((resolve, reject) => {
const channel = new MessageChannel();
channel.port2.onmessage = (event) => {
// console.debug("main: receiving msg in prepare");
const { status } = event.data as RecvMsg;
if (status === "ok") {
// console.debug("main: receiving init ok in prepare");
this.init = true;
resolve(); // initialization of this worker succeeded
} else {
reject("error after prepareByCallingWorker");
}
};
channel.port2.onmessageerror = (event) => {
// console.debug("main: receiving error in prepare");
reject(event);
};
// console.debug("main: before postMessage in prepare");
this.workers[i].postMessage(
{
action: "prepare",
dataKeyBuf: this.cipher.dataKey.buffer,
nameKeyBuf: this.cipher.nameKey.buffer,
nameTweakBuf: this.cipher.nameTweak.buffer,
},
[channel.port1 /* buffers not transferred because we need to copy them */]
);
})
);
}
await Promise.all(res);
}
async encryptNameByCallingWorker(inputName: string): Promise<string> {
// console.debug("main: start encryptNameByCallingWorker");
await this.prepareByCallingWorker();
// console.debug(
// "main: really start generate promise in encryptNameByCallingWorker"
// );
++this.workerIdx;
const whichWorker = this.workerIdx % this.workers.length;
return await new Promise((resolve, reject) => {
const channel = new MessageChannel();
channel.port2.onmessage = (event) => {
// console.debug("main: receiving msg in encryptNameByCallingWorker");
const { outputName } = event.data as RecvMsg;
if (outputName === undefined) {
reject("unknown outputName after encryptNameByCallingWorker");
} else {
resolve(outputName);
}
};
channel.port2.onmessageerror = (event) => {
// console.debug("main: receiving error in encryptNameByCallingWorker");
reject(event);
};
// console.debug("main: before postMessage in encryptNameByCallingWorker");
this.workers[whichWorker].postMessage(
{
action: "encryptName",
inputName: inputName,
},
[channel.port1]
);
});
}
async decryptNameByCallingWorker(inputName: string): Promise<string> {
await this.prepareByCallingWorker();
++this.workerIdx;
const whichWorker = this.workerIdx % this.workers.length;
return await new Promise((resolve, reject) => {
const channel = new MessageChannel();
channel.port2.onmessage = (event) => {
// console.debug("main: receiving msg in decryptNameByCallingWorker");
const { outputName, status } = event.data as RecvMsg;
if (status === "error") {
reject("error");
} else {
if (outputName === undefined) {
reject("unknown outputName after decryptNameByCallingWorker");
} else {
resolve(outputName);
}
}
};
channel.port2.onmessageerror = (event) => {
// console.debug("main: receiving error in decryptNameByCallingWorker");
reject(event);
};
// console.debug("main: before postMessage in decryptNameByCallingWorker");
this.workers[whichWorker].postMessage(
{
action: "decryptName",
inputName: inputName,
},
[channel.port1]
);
});
}
async encryptContentByCallingWorker(
input: ArrayBuffer
): Promise<ArrayBuffer> {
await this.prepareByCallingWorker();
++this.workerIdx;
const whichWorker = this.workerIdx % this.workers.length;
return await new Promise((resolve, reject) => {
const channel = new MessageChannel();
channel.port2.onmessage = (event) => {
// console.debug("main: receiving msg in encryptContentByCallingWorker");
const { outputContent } = event.data as RecvMsg;
if (outputContent === undefined) {
reject("unknown outputContent after encryptContentByCallingWorker");
} else {
resolve(outputContent);
}
};
channel.port2.onmessageerror = (event) => {
// console.debug("main: receiving error in encryptContentByCallingWorker");
reject(event);
};
// console.debug(
// "main: before postMessage in encryptContentByCallingWorker"
// );
this.workers[whichWorker].postMessage(
{
action: "encryptContent",
inputContent: input,
},
[channel.port1, input]
);
});
}
async decryptContentByCallingWorker(
input: ArrayBuffer
): Promise<ArrayBuffer> {
await this.prepareByCallingWorker();
++this.workerIdx;
const whichWorker = this.workerIdx % this.workers.length;
return await new Promise((resolve, reject) => {
const channel = new MessageChannel();
channel.port2.onmessage = (event) => {
// console.debug("main: receiving msg in decryptContentByCallingWorker");
const { outputContent, status } = event.data as RecvMsg;
if (status === "error") {
reject("error");
} else {
if (outputContent === undefined) {
reject("unknown outputContent after decryptContentByCallingWorker");
} else {
resolve(outputContent);
}
}
};
channel.port2.onmessageerror = (event) => {
// console.debug(
// "main: receiving onmessageerror in decryptContentByCallingWorker"
// );
reject(event);
};
// console.debug(
// "main: before postMessage in decryptContentByCallingWorker"
// );
this.workers[whichWorker].postMessage(
{
action: "decryptContent",
inputContent: input,
},
[channel.port1, input]
);
});
}
}

184
src/encryptRClone.worker.ts Normal file
View File

@ -0,0 +1,184 @@
import { nanoid } from "nanoid";
import { Cipher as CipherRCloneCryptPack } from "@fyears/rclone-crypt";
const ctx: WorkerGlobalScope = self as any;
const workerNanoID = nanoid();
const cipher = new CipherRCloneCryptPack("base64");
// console.debug(`worker [${workerNanoID}]: cipher created`);
async function encryptNameStr(input: string) {
const res = await cipher.encryptFileName(input);
return res;
}
async function decryptNameStr(input: string) {
return await cipher.decryptFileName(input);
}
async function encryptContentBuf(input: ArrayBuffer) {
return (await cipher.encryptData(new Uint8Array(input), undefined)).buffer;
}
async function decryptContentBuf(input: ArrayBuffer) {
return (await cipher.decryptData(new Uint8Array(input))).buffer;
}
ctx.addEventListener("message", async (event: any) => {
const port: MessagePort = event.ports[0];
const {
action,
dataKeyBuf,
nameKeyBuf,
nameTweakBuf,
inputName,
inputContent,
} = event.data as {
action:
| "prepare"
| "encryptContent"
| "decryptContent"
| "encryptName"
| "decryptName";
dataKeyBuf?: ArrayBuffer;
nameKeyBuf?: ArrayBuffer;
nameTweakBuf?: ArrayBuffer;
inputName?: string;
inputContent?: ArrayBuffer;
};
// console.debug(`worker [${workerNanoID}]: receiving action=${action}`);
if (action === "prepare") {
// console.debug(`worker [${workerNanoID}]: prepare: start`);
try {
if (
dataKeyBuf === undefined ||
nameKeyBuf === undefined ||
nameTweakBuf === undefined
) {
// console.debug(`worker [${workerNanoID}]: prepare: no buffer??`);
throw Error(
`worker [${workerNanoID}]: prepare: internal keys not transferred to worker properly`
);
}
// console.debug(`worker [${workerNanoID}]: prepare: so we update`);
cipher.updateInternalKey(
new Uint8Array(dataKeyBuf),
new Uint8Array(nameKeyBuf),
new Uint8Array(nameTweakBuf)
);
port.postMessage({
status: "ok",
});
} catch (error) {
console.error(error);
port.postMessage({
status: "error",
error: error,
});
}
} else if (action === "encryptName") {
try {
if (inputName === undefined) {
throw Error(
`worker [${workerNanoID}]: encryptName: internal inputName not transferred to worker properly`
);
}
const outputName = await encryptNameStr(inputName);
// console.debug(
// `worker [${workerNanoID}]: after encryptNameStr, before postMessage`
// );
port.postMessage({
status: "ok",
outputName: outputName,
});
} catch (error) {
console.error(`worker [${workerNanoID}]: encryptName=${inputName}`);
console.error(error);
port.postMessage({
status: "error",
error: error,
});
}
} else if (action === "decryptName") {
try {
if (inputName === undefined) {
throw Error(
`worker [${workerNanoID}]: decryptName: internal inputName not transferred to worker properly`
);
}
const outputName = await decryptNameStr(inputName);
// console.debug(
// `worker [${workerNanoID}]: after decryptNameStr, before postMessage`
// );
port.postMessage({
status: "ok",
outputName: outputName,
});
} catch (error) {
console.error(`worker [${workerNanoID}]: decryptName=${inputName}`);
console.error(error);
port.postMessage({
status: "error",
error: error,
});
}
} else if (action === "encryptContent") {
try {
if (inputContent === undefined) {
throw Error(
`worker [${workerNanoID}]: encryptContent: internal inputContent not transferred to worker properly`
);
}
const outputContent = await encryptContentBuf(inputContent);
// console.debug(
// `worker [${workerNanoID}]: after encryptContentBuf, before postMessage`
// );
port.postMessage(
{
status: "ok",
outputContent: outputContent,
},
[outputContent]
);
} catch (error) {
console.error(error);
port.postMessage({
status: "error",
error: error,
});
}
} else if (action === "decryptContent") {
try {
if (inputContent === undefined) {
throw Error(
`worker [${workerNanoID}]: decryptContent: internal inputContent not transferred to worker properly`
);
}
const outputContent = await decryptContentBuf(inputContent);
// console.debug(
// `worker [${workerNanoID}]: after decryptContentBuf, before postMessage`
// );
port.postMessage(
{
status: "ok",
outputContent: outputContent,
},
[outputContent]
);
} catch (error) {
console.error(error);
port.postMessage({
status: "error",
error: error,
});
}
} else {
port.postMessage({
status: "error",
error: `worker [${workerNanoID}]: unknown action=${action}`,
});
}
});

215
src/encryptUnified.ts Normal file
View File

@ -0,0 +1,215 @@
import { CipherMethodType } from "./baseTypes";
import * as openssl from "./encryptOpenSSL";
import * as rclone from "./encryptRClone";
import { isVaildText } from "./misc";
export class Cipher {
readonly password: string;
readonly method: CipherMethodType;
cipherRClone?: rclone.CipherRclone;
constructor(password: string, method: CipherMethodType) {
this.password = password ?? "";
this.method = method;
if (method === "rclone-base64") {
this.cipherRClone = new rclone.CipherRclone(password, 5);
}
}
closeResources() {
if (this.method === "rclone-base64" && this.cipherRClone !== undefined) {
this.cipherRClone.closeResources();
}
}
isPasswordEmpty() {
return this.password === "";
}
isFolderAware() {
if (this.method === "openssl-base64") {
return false;
}
if (this.method === "rclone-base64") {
return true;
}
throw Error(`no idea about isFolderAware for method=${this.method}`);
}
async encryptContent(content: ArrayBuffer) {
// console.debug("start encryptContent");
if (this.password === "") {
return content;
}
if (this.method === "openssl-base64") {
const res = await openssl.encryptArrayBuffer(content, this.password);
if (res === undefined) {
throw Error(`cannot encrypt content`);
}
return res;
} else if (this.method === "rclone-base64") {
const res =
await this.cipherRClone!.encryptContentByCallingWorker(content);
if (res === undefined) {
throw Error(`cannot encrypt content`);
}
return res;
} else {
throw Error(`not supported encrypt method=${this.method}`);
}
}
async decryptContent(content: ArrayBuffer) {
// console.debug("start decryptContent");
if (this.password === "") {
return content;
}
if (this.method === "openssl-base64") {
const res = await openssl.decryptArrayBuffer(content, this.password);
if (res === undefined) {
throw Error(`cannot decrypt content`);
}
return res;
} else if (this.method === "rclone-base64") {
const res =
await this.cipherRClone!.decryptContentByCallingWorker(content);
if (res === undefined) {
throw Error(`cannot decrypt content`);
}
return res;
} else {
throw Error(`not supported decrypt method=${this.method}`);
}
}
async encryptName(name: string) {
// console.debug("start encryptName");
if (this.password === "") {
return name;
}
if (this.method === "openssl-base64") {
const res = await openssl.encryptStringToBase64url(name, this.password);
if (res === undefined) {
throw Error(`cannot encrypt name=${name}`);
}
return res;
} else if (this.method === "rclone-base64") {
const res = await this.cipherRClone!.encryptNameByCallingWorker(name);
if (res === undefined) {
throw Error(`cannot encrypt name=${name}`);
}
return res;
} else {
throw Error(`not supported encrypt method=${this.method}`);
}
}
async decryptName(name: string): Promise<string> {
// console.debug("start decryptName");
if (this.password === "") {
return name;
}
if (this.method === "openssl-base64") {
if (name.startsWith(openssl.MAGIC_ENCRYPTED_PREFIX_BASE32)) {
// backward compatible with the openssl-base32
try {
const res = await openssl.decryptBase32ToString(name, this.password);
if (res !== undefined && isVaildText(res)) {
return res;
} else {
throw Error(`cannot decrypt name=${name}`);
}
} catch (error) {
throw Error(`cannot decrypt name=${name}`);
}
} else if (name.startsWith(openssl.MAGIC_ENCRYPTED_PREFIX_BASE64URL)) {
try {
const res = await openssl.decryptBase64urlToString(
name,
this.password
);
if (res !== undefined && isVaildText(res)) {
return res;
} else {
throw Error(`cannot decrypt name=${name}`);
}
} catch (error) {
throw Error(`cannot decrypt name=${name}`);
}
} else {
throw Error(
`method=${this.method} but the name=${name}, likely mismatch`
);
}
} else if (this.method === "rclone-base64") {
const res = await this.cipherRClone!.decryptNameByCallingWorker(name);
if (res === undefined) {
throw Error(`cannot decrypt name=${name}`);
}
return res;
} else {
throw Error(`not supported decrypt method=${this.method}`);
}
}
getSizeFromOrigToEnc(x: number) {
if (this.password === "") {
return x;
}
if (this.method === "openssl-base64") {
return openssl.getSizeFromOrigToEnc(x);
} else if (this.method === "rclone-base64") {
return rclone.getSizeFromOrigToEnc(x);
} else {
throw Error(`not supported encrypt method=${this.method}`);
}
}
/**
* quick guess, no actual decryption here
* @param name
* @returns
*/
static isLikelyOpenSSLEncryptedName(name: string): boolean {
if (
name.startsWith(openssl.MAGIC_ENCRYPTED_PREFIX_BASE32) ||
name.startsWith(openssl.MAGIC_ENCRYPTED_PREFIX_BASE64URL)
) {
return true;
}
return false;
}
/**
* quick guess, no actual decryption here
* @param name
* @returns
*/
static isLikelyEncryptedName(name: string): boolean {
return Cipher.isLikelyOpenSSLEncryptedName(name);
}
/**
* quick guess, no actual decryption here, only openssl can be guessed here
* @param name
* @returns
*/
static isLikelyEncryptedNameNotMatchMethod(
name: string,
method: CipherMethodType
): boolean {
if (
Cipher.isLikelyOpenSSLEncryptedName(name) &&
method !== "openssl-base64"
) {
return true;
}
if (
!Cipher.isLikelyOpenSSLEncryptedName(name) &&
method === "openssl-base64"
) {
return true;
}
return false;
}
}
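// hypothetical usage sketch (not part of the original file), based only on
// the methods defined above:
//
//   const cipher = new Cipher("somepassword", "rclone-base64");
//   const encName = await cipher.encryptName("notes/today.md");
//   const encBody = await cipher.encryptContent(new ArrayBuffer(16));
//   cipher.closeResources(); // terminate the background workers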

View File

@ -5,24 +5,34 @@ import {
COMMAND_URI,
UriParams,
RemotelySavePluginSettings,
QRExportType,
} from "./baseTypes";
import { log } from "./moreOnLog";
import { getShrinkedSettings } from "./remoteForOnedrive";
export const exportQrCodeUri = async (
settings: RemotelySavePluginSettings,
currentVaultName: string,
pluginVersion: string
pluginVersion: string,
exportFields: QRExportType
) => {
const settings2: Partial<RemotelySavePluginSettings> = cloneDeep(settings);
delete settings2.dropbox;
delete settings2.onedrive;
let settings2: Partial<RemotelySavePluginSettings> = {};
if (exportFields === "all_but_oauth2") {
settings2 = cloneDeep(settings);
delete settings2.dropbox;
delete settings2.onedrive;
} else if (exportFields === "dropbox") {
settings2 = { dropbox: cloneDeep(settings.dropbox) };
} else if (exportFields === "onedrive") {
settings2 = { onedrive: getShrinkedSettings(settings.onedrive) };
}
delete settings2.vaultRandomID;
const data = encodeURIComponent(JSON.stringify(settings2));
const vault = encodeURIComponent(currentVaultName);
const version = encodeURIComponent(pluginVersion);
const rawUri = `obsidian://${COMMAND_URI}?func=settings&version=${version}&vault=${vault}&data=${data}`;
// log.info(uri)
// console.info(uri)
const imgUri = await QRCode.toDataURL(rawUri);
return {
rawUri,
@ -36,6 +46,20 @@ export interface ProcessQrCodeResultType {
result?: RemotelySavePluginSettings;
}
/**
 * we also support directly parsing the uri, instead of relying on the web browser
* @param input
*/
export const parseUriByHand = (input: string) => {
if (!input.startsWith("obsidian://remotely-save?func=settings&")) {
throw Error(`not valid string`);
}
const k = new URL(input);
const output = Object.fromEntries(k.searchParams);
return output;
};
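
A hedged usage sketch for parseUriByHand; the query-parameter values are invented for illustration:

const params = parseUriByHand(
  "obsidian://remotely-save?func=settings&version=0.4.0&vault=MyVault&data=%7B%7D"
);
// params is roughly { func: "settings", version: "0.4.0", vault: "MyVault", data: "{}" },
// i.e. the same query parameters that exportQrCodeUri encodes into the raw uri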
export const importQrCodeUri = (
inputParams: any,
currentVaultName: string

View File

@ -5,26 +5,27 @@
"goback": "Go Back",
"submit": "Submit",
"sometext": "Here are some texts.",
"syncrun_alreadyrunning": "{{pluginName}} already running in stage {{syncStatus}}!",
"syncrun_alreadyrunning": "New command {{newTriggerSource}} stops because {{pluginName}} is already running in stage {{syncStatus}}!",
"syncrun_syncingribbon": "{{pluginName}}: syncing from {{triggerSource}}",
"syncrun_step0": "0/8 Remotely Save running in dry mode, not actual file changes would happen.",
"syncrun_step1": "1/8 Remotely Save Sync Preparing ({{serviceType}})",
"syncrun_step0": "0/8 Remotely Save is running in dry mode, thus not actual file changes would happen.",
"syncrun_step1": "1/8 Remotely Save is preparing ({{serviceType}})",
"syncrun_step2": "2/8 Starting to fetch remote meta data.",
"syncrun_step3": "3/8 Checking password correct or not.",
"syncrun_passworderr": "Something goes wrong while checking password.",
"syncrun_step4": "4/8 Trying to fetch extra meta data from remote.",
"syncrun_step5": "5/8 Starting to fetch local meta data.",
"syncrun_step4": "4/8 Starting to fetch local meta data.",
"syncrun_step5": "5/8 Starting to fetch local prev sync data.",
"syncrun_step6": "6/8 Starting to generate sync plan.",
"syncrun_step7": "7/8 Remotely Save Sync data exchanging!",
"syncrun_step7": "7/8 Remotely Save Sync data is exchanging!",
"syncrun_step7skip": "7/8 Remotely Save real sync is skipped in dry run mode.",
"syncrun_step8": "8/8 Remotely Save finish!",
"syncrun_shortstep0": "0/2 Remotely Save running in dry mode, not actual file changes would happen.",
"syncrun_shortstep1": "1/2 Remotely Save Sync Start Running ({{serviceType}})",
"syncrun_step8": "8/8 Remotely Save finished!",
"syncrun_shortstep0": "0/2 Remotely Save is running in dry mode, not actual file changes would happen.",
"syncrun_shortstep1": "1/2 Remotely Save starts running ({{serviceType}})",
"syncrun_shortstep2skip": "2/2 Remotely Save real sync is skipped in dry run mode.",
"syncrun_shortstep2": "2/2 Remotely Save finish!",
"syncrun_shortstep2": "2/2 Remotely Save finished!",
"syncrun_abort": "{{manifestID}}-{{theDate}}: abort sync, triggerSource={{triggerSource}}, error while {{syncStatus}}",
"protocol_saveqr": "New not-oauth2 settings for {{manifestName}} saved. Reopen the plugin Settings to the effect.",
"protocol_callbacknotsupported": "Your uri call a callback that's not supported yet: {{params}}",
"syncrun_abort_protectmodifypercentage": "Abort! you set changing files >= {{protectModifyPercentage}}% is not allowed but {{realModifyDeleteCount}}/{{allFilesCount}}={{percent}}% is going to be modified or deleted! If you are sure you want this sync, please adjust the allowed ratio in the settings.",
"protocol_saveqr": "New settings for {{manifestName}} is imported and saved. Reopen the plugin settings to make it effective.",
"protocol_callbacknotsupported": "Your uri calls a callback that's not supported yet: {{params}}",
"protocol_dropbox_connecting": "Connecting to Dropbox...\nPlease DO NOT close this modal.",
"protocol_dropbox_connect_succ": "Good! We've connected to Dropbox as user {{username}}!",
"protocol_dropbox_connect_succ_revoke": "You've connected as user {{username}}. If you want to disconnect, click this button.",
@ -37,16 +38,22 @@
"protocol_onedrive_connect_unknown": "Do not know how to deal with the callback: {{params}}",
"command_startsync": "start sync",
"command_drynrun": "start sync (dry run only)",
"command_exportsyncplans_json": "export sync plans in json format",
"command_exportsyncplans_1": "export sync plans (latest 1)",
"command_exportsyncplans_5": "export sync plans (latest 5)",
"command_exportsyncplans_all": "export sync plans (all)",
"command_exportlogsindb": "export logs saved in db",
"statusbar_time_years": "{{time}} years",
"statusbar_time_months": "{{time}} months",
"statusbar_time_weeks": "{{time}} weeks",
"statusbar_time_days": "{{time}} days",
"statusbar_time_hours": "{{time}} hours",
"statusbar_time_minutes": "{{time}} minutes",
"statusbar_time_lessminute": "less than a minute",
"statusbar_time_years": "Synced {{time}} years ago",
"statusbar_time_months": "Synced {{time}} months ago",
"statusbar_time_weeks": "Synced {{time}} weeks ago",
"statusbar_time_days": "Synced {{time}} days ago",
"statusbar_time_hours": "Synced {{time}} hours ago",
"statusbar_time_minutes": "Synced {{time}} minutes ago",
"statusbar_time_lessminute": "Synced less than a minute ago",
"statusbar_lastsync": "Synced {{time}} ago",
"statusbar_syncing": "Syncing...",
"statusbar_now": "Synced just now",
"statusbar_lastsync_label": "Last successful Sync on {{date}}",
"statusbar_lastsync_never": "Never Synced",
"statusbar_lastsync_never_label": "Never Synced before",
@ -59,6 +66,8 @@
"modal_password_attn5": "Attention 5/5: The longer the password, the better.",
"modal_password_secondconfirm": "The Second Confirm to change password.",
"modal_password_notice": "New password saved!",
"modal_encryptionmethod_title": "Hold on and PLEASE READ ON...",
"modal_encryptionmethod_shortdesc": "You are changing the encrpytion method but you have set the password before.\nAfter switching the method, you need to <b>manually</b> and <b>fully</b> delete every encrypted vault files in the remote and re-sync (so that re-upload) the newly encrypted files again.",
"modal_remotebasedir_title": "You are changing the remote base directory config",
"modal_remotebasedir_shortdesc": "1. The plugin would NOT automatically move the content from the old directory to the new one directly on the remote. Everything syncs from the beginning again.\n2. If you set the string to the empty, the config would be reset to use the vault folder name (the default config).\n3. The remote directory name itself would not be encrypted even you've set an E2E password.\n4. Some special char like '?', '/', '\\' are not allowed. Spaces in the beginning or in the end are also trimmed.",
"modal_remotebasedir_invaliddirhint": "Your input contains special characters like '?', '/', '\\' which are not allowed.",
@ -83,6 +92,7 @@
"modal_dropboxauth_maualinput_conn_succ_revoke": "You've connected as user {{username}}. If you want to disconnect, click this button.",
"modal_dropboxauth_maualinput_conn_fail": "Something goes wrong while connecting to Dropbox.",
"modal_onedriveauth_shortdesc": "Currently only OneDrive for personal is supported. OneDrive for Business is NOT supported (yet).\nVisit the address in a browser, and follow the steps.\nFinally you should be redirected to Obsidian.",
"modal_onedriveauth_shortdesc_linux": "It seems that you are using Obsidian on Linux, and you might not be able to jump back here properly. Please consider <a href=\"https://github.com/remotely-save/remotely-save/issues/415\">using</a> the flatpack version of Obsidian, or creating an <a href=\"https://github.com/remotely-save/remotely-save/blob/master/docs/linux.md\"><code>obsidian.desktop</code> file</a>.",
"modal_onedriveauth_copybutton": "Click to copy the auth url",
"modal_onedriveauth_copynotice": "The auth url is copied to the clipboard!",
"modal_onedriverevokeauth_step1": "Step 1: Go to the following address, click the \"Edit\" button for the plugin, then click \"Remove these permissions\" button on the page.",
@ -95,20 +105,21 @@
"modal_syncconfig_attn": "Attention 1/2: This only syncs (copies) the whole Obsidian config dir, not other startting-with-dot folders or files. Except for ignoring folders .git and node_modules, it also doesn't understand the meaning of sub-files and sub-folders inside the config dir.\nAttention 2/2: After the config dir is synced, plugins settings might be corrupted, and Obsidian might need to be restarted to load the new settings.\nIf you are agreed to take your own risk, please click the following second confirm button.",
"modal_syncconfig_secondconfirm": "The Second Confirm To Enable.",
"modal_syncconfig_notice": "You've enabled syncing config folder!",
"modal_qr_shortdesc": "This exports not-oauth2 settings. (It means that Dropbox, OneDrive info are NOT exported.)\nYou can use another device to scan this qrcode.\nOr, you can click the button to copy the special url.",
"modal_qr_shortdesc": "This exports (partial) settings.\nYou can use another device to scan this qrcode.\nOr, you can click the button to copy the special uri and paste it into another device's web browser or Remotely Save Import Setting.",
"modal_qr_button": "Click to copy the special URI",
"modal_qr_button_notice": "The special uri is copied to the clipboard!",
"modal_sizesconflict_title": "Remotely Save: Some conflict were found while skipping large files",
"modal_sizesconflict_desc": "You've set skipping files larger than {{thresholdMB}} MB ({{thresholdBytes}} bytes).\nBut the following files have sizes larger than the threshold on one side, and sizes smaller than the threshold on the other side.\nTo avoid unexpected overwriting or deleting, the plugin stops, and you have to manually deal with at least one side of the files.",
"modal_sizesconflict_copybutton": "Click to copy all the below sizes conflicts info",
"modal_sizesconflict_copynotice": "All the sizes conflicts info have been copied to the clipboard!",
"modal_logtohttpserver_title": "Log To HTTP(S) Server Is DANGEROUS!",
"modal_logtohttpserver_desc": "All your sensitive logging information will be posted to the HTTP(S) server without any authentications!!!!!\nPlease make sure you trust the HTTP(S) server, and it's better to setup a HTTPS one instead of HTTP one.\nIt's for debugging purposes only, especially on mobile.",
"modal_logtohttpserver_secondconfirm": "I know it's dangerous, and insist, and am willing to bear all possible losses.",
"modal_logtohttpserver_notice": "OK.",
"settings_basic": "Basic Settings",
"settings_password": "Encryption Password",
"settings_password_desc": "Password for E2E encryption. Empty for no password. You need to click \"Confirm\". Attention: the password and other info are saved locally.",
"settings_password_desc": "Password for E2E encryption. Empty for no password. You need to click \"Confirm\". Attention: The password and other info are saved locally. After changing the password, you need to manually delete every original files in the remote, and re-sync (so that upload) the encrypted files again.",
"settings_encryptionmethod": "Encryption Method",
"settings_encryptionmethod_desc": "Encryption method for E2E encryption. RClone Crypt format is recommended but it doesn't encrypt path structure. OpenSSL enc is the legacy format of this plugin. <b>Both are not affliated with official RClone and OpenSSL product or community.</b> Attention: After switching the method, you need to manually delete every original files in the remote and re-sync (so that upload) the encrypted files again. More info in the <a href='https://github.com/remotely-save/remotely-save/tree/master/docs/encryption'>online doc</a>.",
"settings_encryptionmethod_rclone": "RClone Crypt (recommended)",
"settings_encryptionmethod_openssl": "OpenSSL enc (legacy)",
"settings_autorun": "Schedule For Auto Run",
"settings_autorun_desc": "The plugin tries to schedule the running after every interval. Battery may be impacted.",
"settings_autorun_notset": "(not set)",
@ -245,34 +256,65 @@
"settings_deletetowhere_desc": "Which trash should the plugin put the files into while deleting?",
"settings_deletetowhere_system_trash": "system trash (default)",
"settings_deletetowhere_obsidian_trash": "Obsidian .trash folder",
"settings_conflictaction": "Action For Conflict",
"settings_conflictaction_desc": "If a file is created or modified on both side since last update, it's a conflict event. How to deal with it? This only works for bidirectional sync.",
"settings_conflictaction_keep_newer": "newer version survives (default)",
"settings_conflictaction_keep_larger": "larger size version survives",
"settings_cleanemptyfolder": "Action For Empty Folders",
"settings_cleanemptyfolder_desc": "The sync algorithm majorly deals with files, so you need to specify how to deal with empty folders.",
"settings_cleanemptyfolder_skip": "leave them as is (default)",
"settings_cleanemptyfolder_clean_both": "delete local and remote",
"settings_protectmodifypercentage": "Abort Sync If Modification Above Percentage",
"settings_protectmodifypercentage_desc": "Abort the sync if more than n% of the files are going to be deleted / modified. Useful to protect users' files from unexpected modifications. You can set to 100 to disable the protection, or set to 0 to always block the sync.",
"settings_protectmodifypercentage_000_desc": "0 (always block)",
"settings_protectmodifypercentage_050_desc": "50 (default)",
"settings_protectmodifypercentage_100_desc": "100 (disable the protection)",
"setting_syncdirection": "Sync Direction",
"setting_syncdirection_desc": "Which direction should the plugin sync to? Please be aware that only CHANGED files (based on time and size) are synced regardless any option.",
"setting_syncdirection_bidirectional_desc": "Bidirectional (default)",
"setting_syncdirection_incremental_push_only_desc": "Incremental Push Only (aka backup mode)",
"setting_syncdirection_incremental_pull_only_desc": "Incremental Pull Only",
"settings_enablemobilestatusbar": "Mobile Status Bar (experimental)",
"settings_enablemobilestatusbar_desc": "By default Obsidian mobile hides status bar. But some users want to show it up. So here is a hack.",
"settings_importexport": "Import and Export Partial Settings",
"settings_export": "Export",
"settings_export_desc": "Export not-oauth2 settings by generating a qrcode.",
"settings_export_desc_button": "Get QR Code",
"settings_export_desc": "Export settings by generating a QR code or URI.",
"settings_export_all_but_oauth2_button": "Export Non-Oauth2 Part",
"settings_export_dropbox_button": "Export Dropbox Part",
"settings_export_onedrive_button": "Export OneDrive Part",
"settings_import": "Import",
"settings_import_desc": "You should open a camera or scan-qrcode app, to manually scan the QR code.",
"settings_import_desc": "Paste the exported URI into here and click \"Import\". Or, you can open a camera or scan-qrcode app to scan the QR code.",
"settings_import_button": "Import",
"settings_import_error_notice": "Your URI string is empty or not correct!",
"settings_debug": "Debug",
"settings_debuglevel": "Alter Console Log Level",
"settings_debuglevel_desc": "By default the log level is \"info\". You can change to \"debug\" to get verbose information in console.",
"settings_debuglevel": "Alter Notice Level",
"settings_debuglevel_desc": "By default the notice level is \"info\". You can change to \"debug\" to get verbose information while syncing.",
"settings_outputsettingsconsole": "Output Current Settings From Disk To Console",
"settings_outputsettingsconsole_desc": "The settings save on disk in encoded. Click this to see the decoded settings in console.",
"settings_outputsettingsconsole_button": "Output",
"settings_outputsettingsconsole_notice": "Finished outputing in console.",
"settings_obfuscatesettingfile": "Obfuscate The Setting File Or Not",
"settings_obfuscatesettingfile_desc": "The setting file (data.json) has some sensitive information. It's strongly recommended to obfuscate it to avoid unexpected read and modification. If you are sure to view and edit it manually, you can disable the obfuscation.",
"settings_viewconsolelog": "View Console Log",
"settings_viewconsolelog_desc": "On desktop, please press \"ctrl+shift+i\" or \"cmd+shift+i\" to view the log. On mobile, please install the third-party plugin <a href='https://obsidian.md/plugins?search=Logstravaganza'>Logstravaganza</a> to export the console log to a note.",
"settings_syncplans": "Export Sync Plans",
"settings_syncplans_desc": "Sync plans are created every time after you trigger sync and before the actual sync. Useful to know what would actually happen in those sync. Click the button to export sync plans.",
"settings_syncplans_button_json": "Export",
"settings_syncplans_button_1": "Export latest 1",
"settings_syncplans_button_5": "Export latest 5",
"settings_syncplans_button_all": "Export All",
"settings_syncplans_notice": "Sync plans history exported.",
"settings_delsyncplans": "Delete Sync Plans History In DB",
"settings_delsyncplans_desc": "Delete sync plans history in DB.",
"settings_delsyncplans_button": "Delete Sync Plans History",
"settings_delsyncplans_notice": "Sync plans history (in DB) deleted.",
"settings_logtohttpserver": "Log To HTTP(S) Server Temporarily",
"settings_logtohttpserver_desc": "It's very dangerous and please use the function with greate cautions!!!!! It will temporarily allow sending console loggings to HTTP(S) server.",
"settings_logtohttpserver_reset_notice": "Your input doesn't starts with \"http(s)\". Already removed the setting of logging to HTTP(S) server.",
"settings_delsyncmap": "Delete Sync Mappings History In DB",
"settings_delsyncmap_desc": "Sync mappings history stores the actual LOCAL last modified time of the REMOTE objects. Clearing it may cause unnecessary data exchanges in next-time sync. Click the button to delete sync mappings history in DB.",
"settings_delsyncmap_button": "Delete Sync Mappings",
"settings_delsyncmap_notice": "Sync mappings history (in local DB) deleted",
"settings_delprevsync": "Delete Prev Sync Details In DB",
"settings_delprevsync_desc": "The sync algorithm keeps the previous successful sync information in DB to determine the file changes. If you want to ignore them so that all files are treated newly created, you can delete the prev sync info here.",
"settings_delprevsync_button": "Delete Prev Sync Details",
"settings_delprevsync_notice": "Previous sync history (in local DB) deleted",
"settings_profiler_results": "Export Profiler Results",
"settings_profiler_results_desc": "The plugin records the time cost of each steps. Here you can export them to know which step is slow.",
"settings_profiler_results_notice": "Profiler results exported.",
"settings_profiler_results_button_all": "Export All",
"settings_outputbasepathvaultid": "Output Vault Base Path And Randomly Assigned ID",
"settings_outputbasepathvaultid_desc": "For debugging purposes.",
"settings_outputbasepathvaultid_button": "Output",
@ -280,9 +322,12 @@
"settings_resetcache_desc": "Reset local internal caches/databases (for debugging purposes). You would want to reload the plugin after resetting this. This option will not empty the {s3, password...} settings.",
"settings_resetcache_button": "Reset",
"settings_resetcache_notice": "Local internal cache/databases deleted. Please manually reload the plugin.",
"syncalgov2_title": "Remotely Save has a better sync algorithm",
"syncalgov2_texts": "Welcome to use Remotely Save!\nFrom version 0.3.0, a new algorithm has been developed, but it needs uploading extra meta data files _remotely-save-metadata-on-remote.{json,bin} to YOUR configured cloud destinations, besides your notes.\nSo that, for example, the second device can know that what files/folders have been deleted on the first device by reading those files.\nIf you agree, plase click the button \"Agree\", and enjoy the plugin! AND PLEASE REMEMBER TO BACKUP YOUR VAULT FIRSTLY!\nIf you do not agree, you should stop using the current and later versions of Remotely Save. You could consider manually install the old version 0.2.14 which uses old algorithm and does not upload any extra meta data files. By clicking the \"Do Not Agree\" button, the plugin will unload itself, and you need to manually disable it in Obsidian settings.",
"syncalgov2_button_agree": "Agree",
"syncalgov2_button_disagree": "Do Not Agree",
"official_notice_2024_first_party": "Plugin Remotely-Save is back to the party and get a HUGE update!🎉🎉🎉 Try it yourself or see the release note on https://github.com/remotely-save/remotely-save/releases."
}
"syncalgov3_title": "Remotely Save has HUGE updates on the sync algorithm",
"syncalgov3_texts": "Welcome to use Remotely Save!\nFrom this version, a new algorithm has been developed:\n<ul><li>More robust deletion sync,</li><li>minimal conflict handling,</li><li>no meta data uploaded any more,</li><li>deletion / modification protection,</li><li>backup mode</li><li>new encryption method</li><li>...</li></ul>\nStay tune for more! A full introduction is in the <a href='https://github.com/remotely-save/remotely-save/tree/master/docs/sync_algorithm/v3/intro.md'>doc website</a>.\nIf you agree to use this, please read and check two checkboxes then click the \"Agree\" button, and enjoy the plugin!\nIf you do not agree, please click the \"Do Not Agree\" button, the plugin will unload itself.\nAlso, please consider <a href='https://github.com/remotely-save/remotely-save'>visit the GitHub repo and star ⭐ it</a>! Or even <a href='https://github.com/remotely-save/donation'>buy me a coffee</a>. Your support is very important to me! Thanks!",
"syncalgov3_checkbox_manual_backup": "I will backup my vault manually firstly.",
"syncalgov3_checkbox_requiremultidevupdate": "I understand I need to update the plugin ACROSS ALL DEVICES to make them work properly.",
"syncalgov3_button_agree": "Agree",
"syncalgov3_button_disagree": "Do Not Agree"
}

View File

@ -5,15 +5,15 @@
"goback": "返回",
"submit": "提交",
"sometext": "这里有一段文字。",
"syncrun_alreadyrunning": "{{pluginName}} 正处于此阶段:{{syncStatus}}!",
"syncrun_alreadyrunning": "{{pluginName}} 正处于此阶段:{{syncStatus}}!中断触发 {{newTriggerSource}}。",
"syncrun_syncingribbon": "{{pluginName}}:正在由 {{triggerSource}} 触发运行",
"syncrun_step0": "0/8 Remotely Save 在空跑dry run模式不会发生实际的文件交换。",
"syncrun_step1": "1/8 Remotely Save 准备同步({{serviceType}}",
"syncrun_step2": "2/8 正在获取远端的元数据。",
"syncrun_step3": "3/8 正在检查密码正确与否。",
"syncrun_passworderr": "检查密码时候出错。",
"syncrun_step4": "4/8 正在获取远端的额外的元数据。",
"syncrun_step5": "5/8 正在获取本地的元数据。",
"syncrun_step4": "4/8 正在获取本地的元数据。",
"syncrun_step5": "5/8 正在获取本地上一次同步的元数据。",
"syncrun_step6": "6/8 正在生成同步计划。",
"syncrun_step7": "7/8 Remotely Save 开始发生数据交换!",
"syncrun_step7skip": "7/8 Remotely Save 在空跑模式,跳过实际数据交换步骤。",
@ -23,7 +23,8 @@
"syncrun_shortstep2skip": "2/2 Remotely Save 在空跑模式,跳过实际数据交换步骤。",
"syncrun_shortstep2": "2/2 Remotely Save 已完成同步!",
"syncrun_abort": "{{manifestID}}-{{theDate}}:中断同步,同步来源={{triggerSource}},出错阶段={{syncStatus}}",
"protocol_saveqr": " {{manifestName}} 新的非 oauth2 设置保存完成。请重启插件设置页使之生效。",
"syncrun_abort_protectmodifypercentage": "中断同步!您设置了不允许 >= {{protectModifyPercentage}}% 的变更,但是现在 {{realModifyDeleteCount}}/{{allFilesCount}}={{percent}}% 的文件会被修改或删除!如果您确认这次同步是您想要的,那么请在设置里修改允许比例。",
"protocol_saveqr": " {{manifestName}} 的新设置导入完成。请重启插件设置页使之生效。",
"protocol_callbacknotsupported": "您的 uri callback 暂不支持: {{params}}",
"protocol_dropbox_connecting": "正在连接 Dropbox……\n请不要关闭此弹窗。",
"protocol_dropbox_connect_succ": "好!我们作为用户 {{username}} 连接上了 Dropbox",
@ -38,15 +39,21 @@
"command_startsync": "开始同步start sync",
"command_drynrun": "开始同步空跑模式start sync (dry run only)",
"command_exportsyncplans_json": "导出同步计划为 json 格式export sync plans in json format",
"command_exportsyncplans_1": "导出同步计划(最近 1 次export sync plans (latest 1)",
"command_exportsyncplans_5": "导出同步计划(最近 5 次export sync plans (latest 5)",
"command_exportsyncplans_all": "导出同步计划所有export sync plans (all)",
"command_exportlogsindb": "从数据库导出终端日志export logs saved in db",
"statusbar_time_years": "{{time}} 年前",
"statusbar_time_months": "{{time}} 月前",
"statusbar_time_weeks": "{{time}} 周前",
"statusbar_time_days": "{{time}} 天前",
"statusbar_time_hours": "{{time}} 小时前",
"statusbar_time_minutes": "{{time}} 分钟前",
"statusbar_time_lessminute": "一分钟之内",
"statusbar_time_years": "{{time}} 年前同步",
"statusbar_time_months": "{{time}} 月前同步",
"statusbar_time_weeks": "{{time}} 周前同步",
"statusbar_time_days": "{{time}} 天前同步",
"statusbar_time_hours": "{{time}} 小时前同步",
"statusbar_time_minutes": "{{time}} 分钟前同步",
"statusbar_time_lessminute": "一分钟之内同步",
"statusbar_lastsync": "上一次同步于:{{time}}",
"statusbar_syncing": "正在同步",
"statusbar_now": "刚同步完",
"statusbar_lastsync_label": "上一次同步于:{{date}}",
"statusbar_lastsync_never": "没触发过同步",
"statusbar_lastsync_never_label": "没触发过同步",
@ -59,6 +66,8 @@
"modal_password_attn5": "注意 5/5密码越长越好。",
"modal_password_secondconfirm": "再次确认保存新密码",
"modal_password_notice": "新密码已保存!",
"modal_encryptionmethod_title": "稍等一下,请阅读下文:",
"modal_encryptionmethod_shortdesc": "您正在修改加密方式,但是您已经设置了密码。\n修改加密方式之后您需要<b>手动</b>和<b>完全</b>删除在远端的之前加密过的库文件,然后重新同步(从而重新上传)新的加密文件。",
"modal_remotebasedir_title": "您正在修改远端基文件夹设置",
"modal_remotebasedir_shortdesc": "1. 本插件并不会自动在远端把内容从旧文件夹移动到新文件夹。所有内容都会重新同步。\n2. 如果你使得文本输入框为空,那么本设置会被重设回库的文件夹名(默认设置)。\n3. 即使您设置了端对端加密的密码,远端文件夹名称本身也不会被加密。\n4. 某些特殊字符,如“?”、“/”、“\\”是不允许的。文本前后的空格也会被自动删去。",
"modal_remotebasedir_invaliddirhint": "您所输入的内容含有某些特殊字符,如“?”、“/”、“\\”,它们是不允许的。",
@ -83,6 +92,7 @@
"modal_dropboxauth_maualinput_conn_succ_revoke": "您已作为用户 {{username}} 连接到 Dropbox。如果您想断开连接点击此按钮。",
"modal_dropboxauth_maualinput_conn_fail": "连接 Dropbox 途中出错了。",
"modal_onedriveauth_shortdesc": "现在只支持个人版 OneDrive不支持企业版。\n在浏览器中访问以下地址然后按照网页提示操作。\n到了最后您应该会被自动重定向回来 Obsidian。",
"modal_onedriveauth_shortdesc_linux": "您正在用 Linux有可能无法跳转回来。请考虑<a href=\"https://github.com/remotely-save/remotely-save/issues/415\">使用</a> flatpack 版本的 Obsidian或创建 <a href=\"https://github.com/remotely-save/remotely-save/blob/master/docs/linux.md\"><code>obsidian.desktop</code> 文件</a>。",
"modal_onedriveauth_copybutton": "点击此按钮从而复制鉴权 url",
"modal_onedriveauth_copynotice": "鉴权 url 已复制到剪贴板!",
"modal_onedriverevokeauth_step1": "第 1 步用浏览器打开以下地址点击本插件对应的“Edit”按钮点击“Remove these permissions”按钮。",
@ -95,20 +105,20 @@
"modal_syncconfig_attn": "注意 1/2此设置只同步复制整个 Obsidian 的配置文件夹,但是不会同步其它 . 开头的文件夹或文件。除了会忽略 .git 和 node_modules 文件夹之外,它也并不理解配置文件夹的里各个子文件或子文件夹的含义。\n注意 2/2配置文件夹被同步之后各插件的设置或许会出错且 Obsidian 或许需要重启来重载各插件的新配置。\n如果您同意自行承受以上风险您可以点击以下再次确认按钮。",
"modal_syncconfig_secondconfirm": "再次确认开启",
"modal_syncconfig_notice": "您已开启配置文件夹的同步!",
"modal_qr_shortdesc": "这里可导出非 oauth2 设置。意味着Dropbox 和 OneDrive 信息不会被导出。)\n您可以使用另一个设备来扫描此 QR 码。\n又或者您可以点击以下按钮复制此特殊 URI。",
"modal_qr_shortdesc": "这里可导出(部分)设置。\n您可以使用另一个设备来扫描此 QR 码。\n又或者您可以点击以下按钮复制此特殊 URI,然后粘贴到另一台设备的网络浏览器或 Remotely Save 设置里的导入部分。",
"modal_qr_button": "点击此按钮复制特殊 URI",
"modal_qr_button_notice": "特殊 URI 已被复制到剪贴板!",
"modal_sizesconflict_title": "Remotely Save跳过大文件的时候出现了一些冲突",
"modal_sizesconflict_desc": "您设置了跳过同步大于 {{thresholdMB}} MB{{thresholdBytes}} bytes的文件。\n但是以下文件的大小在一端大于阈值在另一端则小于阈值。\n为了避免意外的覆盖或删除插件停止了运作您需要手动处理至少一端的文件。",
"modal_sizesconflict_copybutton": "点击以复制以下所有文件大小冲突信息",
"modal_sizesconflict_copynotice": "所有的文件大小冲突信息,已被复制到剪贴板!",
"modal_logtohttpserver_title": "转发终端日志到 HTTP 服务器,此操作很危险!",
"modal_logtohttpserver_desc": "所有您的带敏感信息的终端日志,都会被转发到 HTTP(S) 服务器,没有任何鉴权!!!!!\n请确保您信任对应的服务器最好设置为 HTTPS 而不是 HTTP。\n仅仅用于 debug 用途,例如手机上的 debug。",
"modal_logtohttpserver_secondconfirm": "我知道很危险,坚持要设置,愿意承担所有可能损失。",
"modal_logtohttpserver_notice": "已设置。",
"settings_basic": "基本设置",
"settings_password": "密码",
"settings_password_desc": "端到端加密的密码。不填写则代表没密码。您需要点击“确认”来修改。注意:密码和其它信息都会在本地保存。",
"settings_password_desc": "端到端加密的密码。不填写则代表没密码。您需要点击“确认”来修改。注意:密码和其它信息都会在本地保存。如果您修改了密码,您需要手动删除远端的所有文件,重新同步(从而上传)加密文件。",
"settings_encryptionmethod": "加密方法",
"settings_encryptionmethod_desc": "端到端加密的方法。推荐选用 RClone Crypt 方法但是它没有加密文件路径结构。OpenSSL enc 是本插件一开始就支持的方式。<b>两种方法都和 RClone、OpenSSL 官方产品和社区无利益相关。</b>如果您修改了加密方法,您需要手动删除远端的所有文件,重新同步(从而上传)加密文件。更多详细说明见<a href='https://github.com/remotely-save/remotely-save/tree/master/docs/encryption'>在线文档</a>。",
"settings_encryptionmethod_rclone": "RClone Crypt推荐",
"settings_encryptionmethod_openssl": "OpenSSL enc旧方法",
"settings_autorun": "自动运行",
"settings_autorun_desc": "每隔一段时间,此插件尝试自动同步。会影响到电池用量。",
"settings_autorun_notset": "(不设置)",
@ -245,34 +255,65 @@
"settings_deletetowhere_desc": "插件触发删除操作时候,删除到哪里?",
"settings_deletetowhere_system_trash": "系统回收站(默认)",
"settings_deletetowhere_obsidian_trash": "Obsidian .trash 文件夹",
"settings_conflictaction": "处理冲突",
"settings_conflictaction_desc": "如果一个文件,在本地和服务器都被创建或者修改了,那么这就是一个“冲突”情况。如何处理?这个设置只在双向同步时候生效。",
"settings_conflictaction_keep_newer": "保留最后修改的版本(默认)",
"settings_conflictaction_keep_larger": "保留文件体积较大的版本",
"settings_cleanemptyfolder": "处理空文件夹",
"settings_cleanemptyfolder_desc": "同步算法主要是针对文件处理的,您要要手动指定空文件夹如何处理。",
"settings_cleanemptyfolder_skip": "跳过处理空文件夹(默认)",
"settings_cleanemptyfolder_clean_both": "删除本地和服务器的空文件夹",
"settings_protectmodifypercentage": "如果修改超过百分比则中止同步",
"settings_protectmodifypercentage_desc": "如果算法检测到超过 n% 的文件会被修改或删除,则中止同步。从而可以保护用户的文件免受预料之外的修改。您可以设置为 100 而去除此保护,也可以设置为 0 总是强制中止所有同步。",
"settings_protectmodifypercentage_000_desc": "0总是强制中止",
"settings_protectmodifypercentage_050_desc": "50默认值",
"settings_protectmodifypercentage_100_desc": "100去除此保护",
"setting_syncdirection": "同步方向",
"setting_syncdirection_desc": "插件应该向哪里同步?注意每个选项都是只有修改了的文件(基于修改时间和大小判断)才会触发同步动作。",
"setting_syncdirection_bidirectional_desc": "双向同步(默认)",
"setting_syncdirection_incremental_push_only_desc": "只增量推送(也即:备份模式)",
"setting_syncdirection_incremental_pull_only_desc": "只增量拉取",
"settings_enablemobilestatusbar": "手机的状态栏(实验性质)",
"settings_enablemobilestatusbar_desc": "Obsidian 手机版默认隐藏了状态栏。有些用户希望展示它。这里提供了设置选项。",
"settings_importexport": "导入导出部分设置",
"settings_export": "导出",
"settings_export_desc": "用 QR 码导出非 oauth2 的设置信息。",
"settings_export_desc_button": "生成 QR 码",
"settings_export_desc": "用 QR 码或 URI 导出设置信息。",
"settings_export_all_but_oauth2_button": "导出非 Oauth2 部分",
"settings_export_dropbox_button": "导出 Dropbox 部分",
"settings_export_onedrive_button": "导出 OneDrive 部分",
"settings_import": "导入",
"settings_import_desc": "您需要使用系统拍摄 app 或者扫描 QR 码的app来扫描对应的 QR 码。",
"settings_import_desc": "粘贴之前导出的 URI 到这里然后点击“导入”。或,使用拍摄 app 或者扫描 QR 码的 app来扫描对应的 QR 码。",
"settings_import_button": "导入",
"settings_import_error_notice": "您输入的 URI 是空的或者不准确的!",
"settings_debug": "调试",
"settings_debuglevel": "修改终端输出的 level",
"settings_debuglevel_desc": "默认值为 \"info\"。您可以改为 \"debug\" 从而在终端里获取更多信息。",
"settings_debuglevel": "修改同步提示信息",
"settings_debuglevel_desc": "默认值为 \"info\"。您可以改为 \"debug\" 从而在同步时候里获取更多信息。",
"settings_outputsettingsconsole": "读取硬盘上的设置文件输出到终端",
"settings_outputsettingsconsole_desc": "硬盘上的设置文件是编码过的,点击这里从而解码并输出到终端。",
"settings_outputsettingsconsole_button": "输出",
"settings_outputsettingsconsole_notice": "已输出到终端",
"settings_obfuscatesettingfile": "是否混淆保存设置文件",
"settings_obfuscatesettingfile_desc": "设置文件data.json含有敏感信息。强烈建议混淆后保存它从而避免出乎意料的读取和修改。如果您确认要手动查看和修改它可以关闭混淆保存。",
"settings_viewconsolelog": "查看终端输出",
"settings_viewconsolelog_desc": "电脑上输入“ctrl+shift+i”或“cmd+shift+i”来查看终端输出。手机上安装第三方插件 <a href='https://obsidian.md/plugins?search=Logstravaganza'>Logstravaganza</a> 来导出终端输出到一篇笔记上。",
"settings_syncplans": "导出同步计划",
"settings_syncplans_desc": "每次您启动同步,并在实际上传下载前,插件会生成同步计划。它可以使您知道每次同步发生了什么。点击按钮可以导出同步计划。",
"settings_syncplans_button_json": "导出",
"settings_syncplans_button_1": "导出最近 1 次",
"settings_syncplans_button_5": "导出最近 5 次",
"settings_syncplans_button_all": "导出所有",
"settings_syncplans_notice": "同步计划已导出",
"settings_delsyncplans": "删除数据库里的同步计划历史",
"settings_delsyncplans_desc": "删除数据库里的同步计划历史。",
"settings_delsyncplans_button": "删除同步计划历史",
"settings_delsyncplans_notice": "(数据库里的)同步计划已被删除。",
"settings_logtohttpserver": "临时设定终端日志实时转发到 HTTP(S) 服务器。",
"settings_logtohttpserver_desc": "非常危险,谨慎行动!!!!!临时设定终端日志实时转发到 HTTP(S) 服务器。",
"settings_logtohttpserver_reset_notice": "您的输入不是“http(s)”开头的。已移除了终端日志转发到 HTTP(S) 服务器的设定。",
"settings_delsyncmap": "删除数据库里的同步映射历史",
"settings_delsyncmap_desc": "同步映射历史存储了本地真正的最后修改时间和远程文件时间的映射。删除之可能会导致下一次同步时发生不必要的数据交换。点击按钮删除数据库里的同步映射历史。",
"settings_delsyncmap_button": "删除同步映射历史",
"settings_delsyncmap_notice": "(本地数据库里的)同步映射历史已被删除。",
"settings_delprevsync": "删除数据库里的上次同步明细",
"settings_delprevsync_desc": "同步算法需要上次成功同步的信息来决定文件变更,这个信息保存在本地的数据库里。如果您想忽略这些信息从而所有文件都被视为新创建的话,可以在此删除之前的信息。",
"settings_delprevsync_button": "删除上次同步明细",
"settings_delprevsync_notice": "(本地数据库里的)上次同步明细已被删除。",
"settings_profiler_results": "导出性能数据记录",
"settings_profiler_results_desc": "插件记录了每次同步每一步的耗时。这里可以导出记录得知哪一步最慢。",
"settings_profiler_results_notice": "性能数据已导出",
"settings_profiler_results_button_all": "导出所有",
"settings_outputbasepathvaultid": "输出资料库对应的位置和随机分配的 ID",
"settings_outputbasepathvaultid_desc": "用于调试。",
"settings_outputbasepathvaultid_button": "输出",
@ -280,9 +321,12 @@
"settings_resetcache_desc": "(出于调试原因)重设本地缓存和数据库。您需要在重设之后重新载入此插件。本重设不会删除 s3密码……等设定。",
"settings_resetcache_button": "重设",
"settings_resetcache_notice": "本地同步缓存和数据库已被删除。请手动重新载入此插件。",
"syncalgov2_title": "Remotely Save 的同步算法得到优化",
"syncalgov2_texts": "欢迎使用 Remotely Save!\n从版本 0.3.0 开始,它带来了新的同步算法,但是,除了您的笔记之外,它还需要上传额外的带有元信息的文件 _remotely-save-metadata-on-remote.{json,bin} 到您的云服务目的地上。\n从而比如说通过读取这些信息另一台设备可以知道什么文件或文件夹在第一台设备上被删除了。\n如果您同意此策略请点击按钮 \"同意\"然后开始享用此插件且特别要注意使用插件之前请首先备份好您的库Vault\n如果您不同意此策略您应该停止使用此版本和之后版本的 Remotely Save。您可以考虑手动安装旧版 0.2.14,它使用旧的同步算法,并不上传额外元信息文件。点击 \"不同意\" 之后插件会自动停止运行unload然后您需要 Obsidian 设置里手动停用disable此插件。",
"syncalgov2_button_agree": "同意",
"syncalgov2_button_disagree": "不同意",
"official_notice_2024_first_party": "插件 Remotely-Save 回来了,更新了一大堆功能!🎉🎉🎉请自行使用,或参阅更新文档: https://github.com/remotely-save/remotely-save/releases 。"
}
"syncalgov3_title": "Remotely Save 的同步算法有重大更新",
"syncalgov3_texts": "欢迎使用 Remotely Save\n从这个版本开始插件更新了同步算法\n<ul><li>更稳健的删除同步</li><li>引入冲突处理</li><li>避免上传元数据</li><li>修改删除保护</li><li>备份模式</li><li>新的加密方式</li><li>……</li></ul>\n敬请期待更多更新详细介绍请参阅<a href='https://github.com/remotely-save/remotely-save/tree/master/docs/sync_algorithm/v3/intro.md'>文档网站</a>。\n如果您同意使用新版本请阅读和勾选两个勾选框然后点击“同意”按钮开始使用插件吧\n如果您不同意请点击“不同意”按钮插件将自动停止运行unload。\n此外请考虑<a href='https://github.com/remotely-save/remotely-save'>访问 GitHub 页面然后点赞 ⭐</a>!您的支持对我十分重要!谢谢!",
"syncalgov3_checkbox_manual_backup": "我将会首先手动备份我的库Vault。",
"syncalgov3_checkbox_requiremultidevupdate": "我理解,我需要在所有设备上都更新此插件使之正常运行。",
"syncalgov3_button_agree": "同意",
"syncalgov3_button_disagree": "不同意"
}

View File

@ -5,15 +5,15 @@
"goback": "返回",
"submit": "提交",
"sometext": "這裡有一段文字。",
"syncrun_alreadyrunning": "{{pluginName}} 正處於此階段:{{syncStatus}}!",
"syncrun_alreadyrunning": "{{pluginName}} 正處於此階段:{{syncStatus}}! 中斷觸發 {{newTriggerSource}}。",
"syncrun_syncingribbon": "{{pluginName}}:正在由 {{triggerSource}} 觸發執行",
"syncrun_step0": "0/8 Remotely Save 在空跑dry run模式不會發生實際的檔案交換。",
"syncrun_step1": "1/8 Remotely Save 準備同步({{serviceType}}",
"syncrun_step2": "2/8 正在獲取遠端的元資料。",
"syncrun_step3": "3/8 正在檢查密碼正確與否。",
"syncrun_passworderr": "檢查密碼時候出錯。",
"syncrun_step4": "4/8 正在獲取遠端的額外的元資料。",
"syncrun_step5": "5/8 正在獲取本地的元資料。",
"syncrun_step4": "4/8 正在獲取本地的元資料。",
"syncrun_step5": "5/8 正在獲取本地上一次同步的元資料。",
"syncrun_step6": "6/8 正在生成同步計劃。",
"syncrun_step7": "7/8 Remotely Save 開始發生資料交換!",
"syncrun_step7skip": "7/8 Remotely Save 在空跑模式,跳過實際資料交換步驟。",
@ -23,7 +23,8 @@
"syncrun_shortstep2skip": "2/2 Remotely Save 在空跑模式,跳過實際資料交換步驟。",
"syncrun_shortstep2": "2/2 Remotely Save 已完成同步!",
"syncrun_abort": "{{manifestID}}-{{theDate}}:中斷同步,同步來源={{triggerSource}},出錯階段={{syncStatus}}",
"protocol_saveqr": " {{manifestName}} 新的非 oauth2 設定儲存完成。請重啟外掛設定頁使之生效。",
"syncrun_abort_protectmodifypercentage": "中斷同步!您設定了不允許 >= {{protectModifyPercentage}}% 的變更,但是現在 {{realModifyDeleteCount}}/{{allFilesCount}}={{percent}}% 的檔案會被修改或刪除!如果您確認這次同步是您想要的,那麼請在設定裡修改允許比例。",
"protocol_saveqr": " {{manifestName}} 的新設定匯入完成。請重啟外掛設定頁使之生效。",
"protocol_callbacknotsupported": "您的 uri callback 暫不支援: {{params}}",
"protocol_dropbox_connecting": "正在連線 Dropbox……\n請不要關閉此彈窗。",
"protocol_dropbox_connect_succ": "好!我們作為使用者 {{username}} 連線上了 Dropbox",
@ -37,16 +38,21 @@
"protocol_onedrive_connect_unknown": "不知道如何處理此 callback{{params}}",
"command_startsync": "開始同步start sync",
"command_drynrun": "開始同步空跑模式start sync (dry run only)",
"command_exportsyncplans_json": "匯出同步計劃為 json 格式export sync plans in json format",
"command_exportsyncplans_1": "匯出同步計劃(最近 1 次export sync plans (latest 1)",
"command_exportsyncplans_5": "匯出同步計劃(最近 5 次export sync plans (latest 5)",
"command_exportsyncplans_all": "匯出同步計劃所有export sync plans (all)",
"command_exportlogsindb": "從資料庫匯出終端日誌export logs saved in db",
"statusbar_time_years": "{{time}} 年前",
"statusbar_time_months": "{{time}} 月前",
"statusbar_time_weeks": "{{time}} 周前",
"statusbar_time_days": "{{time}} 天前",
"statusbar_time_hours": "{{time}} 小時前",
"statusbar_time_minutes": "{{time}} 分鐘前",
"statusbar_time_lessminute": "一分鐘之內",
"statusbar_time_years": "{{time}} 年前同步",
"statusbar_time_months": "{{time}} 月前同步",
"statusbar_time_weeks": "{{time}} 周前同步",
"statusbar_time_days": "{{time}} 天前同步",
"statusbar_time_hours": "{{time}} 小時前同步",
"statusbar_time_minutes": "{{time}} 分鐘前同步",
"statusbar_time_lessminute": "一分鐘之內同步",
"statusbar_lastsync": "上一次同步於:{{time}}",
"statusbar_syncing": "正在同步",
"statusbar_now": "剛同步完",
"statusbar_lastsync_label": "上一次同步於:{{date}}",
"statusbar_lastsync_never": "沒觸發過同步",
"statusbar_lastsync_never_label": "沒觸發過同步",
@ -59,6 +65,8 @@
"modal_password_attn5": "注意 5/5密碼越長越好。",
"modal_password_secondconfirm": "再次確認儲存新密碼",
"modal_password_notice": "新密碼已儲存!",
"modal_encryptionmethod_title": "稍等一下,請閱讀下文:",
"modal_encryptionmethod_shortdesc": "您正在修改加密方式,但是您已經設定了密碼。\n修改加密方式之後您需要<b>手動</b>和<b>完全</b>刪除在遠端的之前加密過的庫檔案,然後重新同步(從而重新上傳)新的加密檔案。",
"modal_remotebasedir_title": "您正在修改遠端基資料夾設定",
"modal_remotebasedir_shortdesc": "1. 本外掛並不會自動在遠端把內容從舊資料夾移動到新資料夾。所有內容都會重新同步。\n2. 如果你使得文字輸入框為空,那麼本設定會被重設回庫的資料夾名(預設設定)。\n3. 即使您設定了端對端加密的密碼,遠端資料夾名稱本身也不會被加密。\n4. 某些特殊字元,如“?”、“/”、“\\”是不允許的。文字前後的空格也會被自動刪去。",
"modal_remotebasedir_invaliddirhint": "您所輸入的內容含有某些特殊字元,如“?”、“/”、“\\”,它們是不允許的。",
@ -83,6 +91,7 @@
"modal_dropboxauth_maualinput_conn_succ_revoke": "您已作為使用者 {{username}} 連線到 Dropbox。如果您想斷開連線點選此按鈕。",
"modal_dropboxauth_maualinput_conn_fail": "連線 Dropbox 途中出錯了。",
"modal_onedriveauth_shortdesc": "現在只支援個人版 OneDrive不支援企業版。\n在瀏覽器中訪問以下地址然後按照網頁提示操作。\n到了最後您應該會被自動重定向回來 Obsidian。",
"modal_onedriveauth_shortdesc_linux": "您正在用 Linux有可能無法跳轉回來。請考慮<a href=\"https://github.com/remotely-save/remotely-save/issues/415\">使用</a> flatpack 版本的 Obsidian或建立 <a href=\"https://github.com/remotely-save/remotely-save/blob/master/docs/linux.md\"><code>obsidian.desktop</code> 檔案</a>。",
"modal_onedriveauth_copybutton": "點選此按鈕從而複製鑑權 url",
"modal_onedriveauth_copynotice": "鑑權 url 已複製到剪貼簿!",
"modal_onedriverevokeauth_step1": "第 1 步用瀏覽器開啟以下地址點選本外掛對應的“Edit”按鈕點選“Remove these permissions”按鈕。",
@ -95,20 +104,20 @@
"modal_syncconfig_attn": "注意 1/2此設定只同步複製整個 Obsidian 的配置資料夾,但是不會同步其它 . 開頭的資料夾或檔案。除了會忽略 .git 和 node_modules 資料夾之外,它也並不理解配置資料夾的裡各個子檔案或子資料夾的含義。\n注意 2/2配置資料夾被同步之後各外掛的設定或許會出錯且 Obsidian 或許需要重啟來過載各外掛的新配置。\n如果您同意自行承受以上風險您可以點選以下再次確認按鈕。",
"modal_syncconfig_secondconfirm": "再次確認開啟",
"modal_syncconfig_notice": "您已開啟配置資料夾的同步!",
"modal_qr_shortdesc": "這裡可匯出非 oauth2 設定。意味著Dropbox 和 OneDrive 資訊不會被匯出。)\n您可以使用另一個裝置來掃描此 QR 碼。\n又或者您可以點選以下按鈕複製此特殊 URI。",
"modal_qr_shortdesc": "這裡可匯出(部分)設定。\n您可以使用另一個裝置來掃描此 QR 碼。\n又或者您可以點選以下按鈕複製此特殊 URI,然後貼上到另一臺裝置的網路瀏覽器或 Remotely Save 設定裡的匯入部分。",
"modal_qr_button": "點選此按鈕複製特殊 URI",
"modal_qr_button_notice": "特殊 URI 已被複制到剪貼簿!",
"modal_sizesconflict_title": "Remotely Save跳過大檔案的時候出現了一些衝突",
"modal_sizesconflict_desc": "您設定了跳過同步大於 {{thresholdMB}} MB{{thresholdBytes}} bytes的檔案。\n但是以下檔案的大小在一端大於閾值在另一端則小於閾值。\n為了避免意外的覆蓋或刪除外掛停止了運作您需要手動處理至少一端的檔案。",
"modal_sizesconflict_copybutton": "點選以複製以下所有檔案大小衝突資訊",
"modal_sizesconflict_copynotice": "所有的檔案大小衝突資訊,已被複制到剪貼簿!",
"modal_logtohttpserver_title": "轉發終端日誌到 HTTP 伺服器,此操作很危險!",
"modal_logtohttpserver_desc": "所有您的帶敏感資訊的終端日誌,都會被轉發到 HTTP(S) 伺服器,沒有任何鑑權!!!!!\n請確保您信任對應的伺服器最好設定為 HTTPS 而不是 HTTP。\n僅僅用於 debug 用途,例如手機上的 debug。",
"modal_logtohttpserver_secondconfirm": "我知道很危險,堅持要設定,願意承擔所有可能損失。",
"modal_logtohttpserver_notice": "已設定。",
"settings_basic": "基本設定",
"settings_password": "密碼",
"settings_password_desc": "端到端加密的密碼。不填寫則代表沒密碼。您需要點選“確認”來修改。注意:密碼和其它資訊都會在本地儲存。",
"settings_password_desc": "端到端加密的密碼。不填寫則代表沒密碼。您需要點選“確認”來修改。注意:密碼和其它資訊都會在本地儲存。如果您修改了密碼,您需要手動刪除遠端的所有檔案,重新同步(從而上傳)加密檔案。",
"settings_encryptionmethod": "加密方法",
"settings_encryptionmethod_desc": "端到端加密的方法。推薦選用 RClone Crypt 方法但是它沒有加密檔案路徑結構。OpenSSL enc 是本外掛一開始就支援的方式。<b>兩種方法都和 RClone、OpenSSL 官方產品和社群無利益相關。</b>如果您修改了加密方法,您需要手動刪除遠端的所有檔案,重新同步(從而上傳)加密檔案。更多詳細說明見<a href='https://github.com/remotely-save/remotely-save/tree/master/docs/encryption'>線上文件</a>。",
"settings_encryptionmethod_rclone": "RClone Crypt推薦",
"settings_encryptionmethod_openssl": "OpenSSL enc舊方法",
"settings_autorun": "自動執行",
"settings_autorun_desc": "每隔一段時間,此外掛嘗試自動同步。會影響到電池用量。",
"settings_autorun_notset": "(不設定)",
@ -245,34 +254,65 @@
"settings_deletetowhere_desc": "外掛觸發刪除操作時候,刪除到哪裡?",
"settings_deletetowhere_system_trash": "系統回收站(預設)",
"settings_deletetowhere_obsidian_trash": "Obsidian .trash 資料夾",
"settings_conflictaction": "處理衝突",
"settings_conflictaction_desc": "如果一個檔案,在本地和伺服器都被建立或者修改了,那麼這就是一個“衝突”情況。如何處理?這個設定只在雙向同步時候生效。",
"settings_conflictaction_keep_newer": "保留最後修改的版本(預設)",
"settings_conflictaction_keep_larger": "保留檔案體積較大的版本",
"settings_cleanemptyfolder": "處理空資料夾",
"settings_cleanemptyfolder_desc": "同步演算法主要是針對檔案處理的,您需要手動指定空資料夾如何處理。",
"settings_cleanemptyfolder_skip": "跳過處理空資料夾(預設)",
"settings_cleanemptyfolder_clean_both": "刪除本地和伺服器的空資料夾",
"settings_protectmodifypercentage": "如果修改超過百分比則中止同步",
"settings_protectmodifypercentage_desc": "如果演算法檢測到超過 n% 的檔案會被修改或刪除,則中止同步。從而可以保護使用者的檔案免受預料之外的修改。您可以設定為 100 而去除此保護,也可以設定為 0 總是強制中止所有同步。",
"settings_protectmodifypercentage_000_desc": "0總是強制中止",
"settings_protectmodifypercentage_050_desc": "50預設值",
"settings_protectmodifypercentage_100_desc": "100去除此保護",
"setting_syncdirection": "同步方向",
"setting_syncdirection_desc": "外掛應該向哪裡同步?注意每個選項都是隻有修改了的檔案(基於修改時間和大小判斷)才會觸發同步動作。",
"setting_syncdirection_bidirectional_desc": "雙向同步(預設)",
"setting_syncdirection_incremental_push_only_desc": "只增量推送(也即:備份模式)",
"setting_syncdirection_incremental_pull_only_desc": "只增量拉取",
"settings_enablemobilestatusbar": "手機的狀態列(實驗性質)",
"settings_enablemobilestatusbar_desc": "Obsidian 手機版預設隱藏了狀態列。有些使用者希望展示它。這裡提供了設定選項。",
"settings_importexport": "匯入匯出部分設定",
"settings_export": "匯出",
"settings_export_desc": "用 QR 碼匯出非 oauth2 的設定資訊。",
"settings_export_desc_button": "生成 QR 碼",
"settings_export_desc": "用 QR 碼或 URI 匯出設定資訊。",
"settings_export_all_but_oauth2_button": "匯出非 Oauth2 部分",
"settings_export_dropbox_button": "匯出 Dropbox 部分",
"settings_export_onedrive_button": "匯出 OneDrive 部分",
"settings_import": "匯入",
"settings_import_desc": "您需要使用系統拍攝 app 或者掃描 QR 碼的app來掃描對應的 QR 碼。",
"settings_import_desc": "貼上之前匯出的 URI 到這裡然後點選“匯入”。或,使用拍攝 app 或者掃描 QR 碼的 app來掃描對應的 QR 碼。",
"settings_import_button": "匯入",
"settings_import_error_notice": "您輸入的 URI 是空的或者不準確的!",
"settings_debug": "除錯",
"settings_debuglevel": "修改終端輸出的 level",
"settings_debuglevel_desc": "預設值為 \"info\"。您可以改為 \"debug\" 從而在終端裡獲取更多資訊。",
"settings_debuglevel": "修改同步提示資訊",
"settings_debuglevel_desc": "預設值為 \"info\"。您可以改為 \"debug\" 從而在同步時候裡獲取更多資訊。",
"settings_outputsettingsconsole": "讀取硬碟上的設定檔案輸出到終端",
"settings_outputsettingsconsole_desc": "硬碟上的設定檔案是編碼過的,點選這裡從而解碼並輸出到終端。",
"settings_outputsettingsconsole_button": "輸出",
"settings_outputsettingsconsole_notice": "已輸出到終端",
"settings_obfuscatesettingfile": "是否混淆儲存設定檔案",
"settings_obfuscatesettingfile_desc": "設定檔案data.json含有敏感資訊。強烈建議混淆後儲存它從而避免出乎意料的讀取和修改。如果您確認要手動檢視和修改它可以關閉混淆儲存。",
"settings_viewconsolelog": "檢視終端輸出",
"settings_viewconsolelog_desc": "電腦上輸入“ctrl+shift+i”或“cmd+shift+i”來檢視終端輸出。手機上安裝第三方外掛 <a href='https://obsidian.md/plugins?search=Logstravaganza'>Logstravaganza</a> 來匯出終端輸出到一篇筆記上。",
"settings_syncplans": "匯出同步計劃",
"settings_syncplans_desc": "每次您啟動同步,並在實際上傳下載前,外掛會生成同步計劃。它可以使您知道每次同步發生了什麼。點選按鈕可以匯出同步計劃。",
"settings_syncplans_button_json": "匯出",
"settings_syncplans_button_1": "匯出最近 1 次",
"settings_syncplans_button_5": "匯出最近 5 次",
"settings_syncplans_button_all": "匯出所有",
"settings_syncplans_notice": "同步計劃已匯出",
"settings_delsyncplans": "刪除資料庫裡的同步計劃歷史",
"settings_delsyncplans_desc": "刪除資料庫裡的同步計劃歷史。",
"settings_delsyncplans_button": "刪除同步計劃歷史",
"settings_delsyncplans_notice": "(資料庫裡的)同步計劃已被刪除。",
"settings_logtohttpserver": "臨時設定終端日誌實時轉發到 HTTP(S) 伺服器。",
"settings_logtohttpserver_desc": "非常危險,謹慎行動!!!!!臨時設定終端日誌實時轉發到 HTTP(S) 伺服器。",
"settings_logtohttpserver_reset_notice": "您的輸入不是“http(s)”開頭的。已移除了終端日誌轉發到 HTTP(S) 伺服器的設定。",
"settings_delsyncmap": "刪除資料庫裡的同步對映歷史",
"settings_delsyncmap_desc": "同步對映歷史儲存了本地真正的最後修改時間和遠端檔案時間的對映。刪除之可能會導致下一次同步時發生不必要的資料交換。點選按鈕刪除資料庫裡的同步對映歷史。",
"settings_delsyncmap_button": "刪除同步對映歷史",
"settings_delsyncmap_notice": "(本地資料庫裡的)同步對映歷史已被刪除。",
"settings_delprevsync": "刪除資料庫裡的上次同步明細",
"settings_delprevsync_desc": "同步演算法需要上次成功同步的資訊來決定檔案變更,這個資訊儲存在本地的資料庫裡。如果您想忽略這些資訊從而所有檔案都被視為新建立的話,可以在此刪除之前的資訊。",
"settings_delprevsync_button": "刪除上次同步明細",
"settings_delprevsync_notice": "(本地資料庫裡的)上次同步明細已被刪除。",
"settings_profiler_results": "匯出效能資料記錄",
"settings_profiler_results_desc": "外掛記錄了每次同步每一步的耗時。這裡可以匯出記錄得知哪一步最慢。",
"settings_profiler_results_notice": "效能資料已匯出",
"settings_profiler_results_button_all": "匯出所有",
"settings_outputbasepathvaultid": "輸出資料庫對應的位置和隨機分配的 ID",
"settings_outputbasepathvaultid_desc": "用於除錯。",
"settings_outputbasepathvaultid_button": "輸出",
@ -280,9 +320,12 @@
"settings_resetcache_desc": "(出於除錯原因)重設本地快取和資料庫。您需要在重設之後重新載入此外掛。本重設不會刪除 s3密碼……等設定。",
"settings_resetcache_button": "重設",
"settings_resetcache_notice": "本地同步快取和資料庫已被刪除。請手動重新載入此外掛。",
"syncalgov2_title": "Remotely Save 的同步演算法得到最佳化",
"syncalgov2_texts": "歡迎使用 Remotely Save!\n從版本 0.3.0 開始,它帶來了新的同步演算法,但是,除了您的筆記之外,它還需要上傳額外的帶有元資訊的檔案 _remotely-save-metadata-on-remote.{json,bin} 到您的雲服務目的地上。\n從而比如說透過讀取這些資訊另一臺裝置可以知道什麼檔案或資料夾在第一臺裝置上被刪除了。\n如果您同意此策略請點選按鈕 \"同意\"然後開始享用此外掛且特別要注意使用外掛之前請首先備份好您的儲存庫Vault\n如果您不同意此策略您應該停止使用此版本和之後版本的 Remotely Save。您可以考慮手動安裝舊版 0.2.14,它使用舊的同步演算法,並不上傳額外元資訊檔案。點選 \"不同意\" 之後外掛會自動停止執行unload然後您需要 Obsidian 設定裡手動停用disable此外掛。",
"syncalgov2_button_agree": "同意",
"syncalgov2_button_disagree": "不同意",
"official_notice_2024_first_party": "外掛 Remotely-Save 回來了,更新了一大堆功能!🎉🎉🎉請自行使用,或參閱更新文件: https://github.com/remotely-save/remotely-save/releases 。"
}
"syncalgov3_title": "Remotely Save 的同步演算法有重大更新",
"syncalgov3_texts": "歡迎使用 Remotely Save\n從這個版本開始外掛更新了同步演算法\n<ul><li>更穩健的刪除同步</li><li>引入衝突處理</li><li>避免上傳元資料</li><li>修改刪除保護</li><li>備份模式</li><li>新的加密方式</li><li>……</li></ul>\n敬請期待更多更新詳細介紹請參閱<a href='https://github.com/remotely-save/remotely-save/tree/master/docs/sync_algorithm/v3/intro.md'>文件網站</a>。\n如果您同意使用新版本請閱讀和勾選兩個勾選框然後點選“同意”按鈕開始使用外掛吧\n如果您不同意請點選“不同意”按鈕外掛將自動停止執行unload。\n此外請考慮<a href='https://github.com/remotely-save/remotely-save'>訪問 GitHub 頁面然後點贊 ⭐</a>!您的支援對我十分重要!謝謝!",
"syncalgov3_checkbox_manual_backup": "我將會首先手動備份我的庫Vault。",
"syncalgov3_checkbox_requiremultidevupdate": "我理解,我需要在所有裝置上都更新此外掛使之正常執行。",
"syncalgov3_button_agree": "同意",
"syncalgov3_button_disagree": "不同意"
}

src/local.ts (new file, 76 lines)
View File

@ -0,0 +1,76 @@
import { TFile, TFolder, type Vault } from "obsidian";
import type { Entity, MixedEntity } from "./baseTypes";
import { listFilesInObsFolder } from "./obsFolderLister";
import { Profiler } from "./profiler";
export const getLocalEntityList = async (
vault: Vault,
syncConfigDir: boolean,
configDir: string,
pluginID: string,
profiler: Profiler
) => {
profiler.addIndent();
profiler.insert("enter getLocalEntityList");
const local: Entity[] = [];
const localTAbstractFiles = vault.getAllLoadedFiles();
profiler.insert("finish getting getAllLoadedFiles");
for (const entry of localTAbstractFiles) {
let r = {} as Entity;
let key = entry.path;
if (entry.path === "/") {
// ignore
continue;
} else if (entry instanceof TFile) {
let mtimeLocal: number | undefined = entry.stat.mtime;
if (mtimeLocal <= 0) {
mtimeLocal = entry.stat.ctime;
}
if (mtimeLocal === 0) {
mtimeLocal = undefined;
}
if (mtimeLocal === undefined) {
throw Error(
`Your file has last modified time 0: ${key}; don't know how to deal with it`
);
}
r = {
key: entry.path, // local always unencrypted
keyRaw: entry.path,
mtimeCli: mtimeLocal,
mtimeSvr: mtimeLocal,
size: entry.stat.size, // local always unencrypted
sizeRaw: entry.stat.size,
};
} else if (entry instanceof TFolder) {
key = `${entry.path}/`;
r = {
key: key,
keyRaw: key,
size: 0,
sizeRaw: 0,
};
} else {
throw Error(`unexpected ${entry}`);
}
local.push(r);
}
profiler.insert("finish transforming getAllLoadedFiles");
if (syncConfigDir) {
profiler.insert("into syncConfigDir");
const syncFiles = await listFilesInObsFolder(configDir, vault, pluginID);
for (const f of syncFiles) {
local.push(f);
}
profiler.insert("finish syncConfigDir");
}
profiler.insert("finish getLocalEntityList");
profiler.removeIndent();
return local;
};
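
A sketch (not from the commit) of how a caller might invoke getLocalEntityList; the Profiler construction and the "syncConfigDir" setting field are assumptions for illustration:

async function collectLocal(plugin: any): Promise<Entity[]> {
  const profiler = new Profiler(); // constructor signature assumed
  return await getLocalEntityList(
    plugin.app.vault,
    plugin.settings.syncConfigDir ?? false, // hypothetical setting field
    plugin.app.vault.configDir, // Obsidian exposes the config dir on the vault
    plugin.manifest.id,
    profiler
  );
}
// Files carry mtimeCli/mtimeSvr and sizes; folders get a trailing "/" in their key.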

View File

@ -1,37 +1,39 @@
import localforage from "localforage";
import { extendPrototype } from "localforage-getitems";
extendPrototype(localforage);
export type LocalForage = typeof localforage;
import { nanoid } from "nanoid";
import { requireApiVersion, TAbstractFile, TFile, TFolder } from "obsidian";
import { API_VER_STAT_FOLDER, SUPPORTED_SERVICES_TYPE } from "./baseTypes";
import { API_VER_STAT_FOLDER } from "./baseTypes";
import type { Entity, MixedEntity, SUPPORTED_SERVICES_TYPE } from "./baseTypes";
import type { SyncPlanType } from "./sync";
import { statFix, toText, unixTimeToStr } from "./misc";
import { log } from "./moreOnLog";
const DB_VERSION_NUMBER_IN_HISTORY = [20211114, 20220108, 20220326];
export const DEFAULT_DB_VERSION_NUMBER: number = 20220326;
const DB_VERSION_NUMBER_IN_HISTORY = [20211114, 20220108, 20220326, 20240220];
export const DEFAULT_DB_VERSION_NUMBER: number = 20240220;
export const DEFAULT_DB_NAME = "remotelysavedb";
export const DEFAULT_TBL_VERSION = "schemaversion";
export const DEFAULT_TBL_FILE_HISTORY = "filefolderoperationhistory";
export const DEFAULT_TBL_SYNC_MAPPING = "syncmetadatahistory";
export const DEFAULT_SYNC_PLANS_HISTORY = "syncplanshistory";
export const DEFAULT_TBL_VAULT_RANDOM_ID_MAPPING = "vaultrandomidmapping";
export const DEFAULT_TBL_LOGGER_OUTPUT = "loggeroutput";
export const DEFAULT_TBL_SIMPLE_KV_FOR_MISC = "simplekvformisc";
export const DEFAULT_TBL_PREV_SYNC_RECORDS = "prevsyncrecords";
export const DEFAULT_TBL_PROFILER_RESULTS = "profilerresults";
export interface FileFolderHistoryRecord {
key: string;
ctime: number;
mtime: number;
size: number;
actionWhen: number;
actionType: "delete" | "rename" | "renameDestination";
keyType: "folder" | "file";
renameTo: string;
vaultRandomID: string;
}
/**
* @deprecated
*/
export const DEFAULT_TBL_FILE_HISTORY = "filefolderoperationhistory";
/**
* @deprecated
*/
export const DEFAULT_TBL_SYNC_MAPPING = "syncmetadatahistory";
/**
* @deprecated
 * But we cannot remove it, because we want to migrate the old data.
*/
interface SyncMetaMappingRecord {
localKey: string;
remoteKey: string;
@ -54,132 +56,119 @@ interface SyncPlanRecord {
export interface InternalDBs {
versionTbl: LocalForage;
fileHistoryTbl: LocalForage;
syncMappingTbl: LocalForage;
syncPlansTbl: LocalForage;
vaultRandomIDMappingTbl: LocalForage;
loggerOutputTbl: LocalForage;
simpleKVForMiscTbl: LocalForage;
prevSyncRecordsTbl: LocalForage;
profilerResultsTbl: LocalForage;
/**
* @deprecated
 * But we cannot remove it, because we want to migrate the old data.
*/
fileHistoryTbl: LocalForage;
/**
* @deprecated
 * But we cannot remove it, because we want to migrate the old data.
*/
syncMappingTbl: LocalForage;
}
/**
* This migration mainly aims to assign vault name or vault id into all tables.
* @param db
* @param vaultRandomID
* TODO
* @param syncMappings
* @returns
*/
const migrateDBsFrom20211114To20220108 = async (
db: InternalDBs,
vaultRandomID: string
) => {
const oldVer = 20211114;
const newVer = 20220108;
log.debug(`start upgrading internal db from ${oldVer} to ${newVer}`);
const fromSyncMappingsToPrevSyncRecords = (
oldSyncMappings: SyncMetaMappingRecord[]
): Entity[] => {
const res: Entity[] = [];
for (const oldMapping of oldSyncMappings) {
const newEntity: Entity = {
key: oldMapping.localKey,
keyEnc: oldMapping.remoteKey,
keyRaw:
oldMapping.remoteKey !== undefined && oldMapping.remoteKey !== ""
? oldMapping.remoteKey
: oldMapping.localKey,
mtimeCli: oldMapping.localMtime,
mtimeSvr: oldMapping.remoteMtime,
size: oldMapping.localSize,
sizeEnc: oldMapping.remoteSize,
sizeRaw:
oldMapping.remoteKey !== undefined && oldMapping.remoteKey !== ""
? oldMapping.remoteSize
: oldMapping.localSize,
etag: oldMapping.remoteExtraKey,
};
const allPromisesToWait: Promise<any>[] = [];
log.debug("assign vault id to any delete history");
const keysInDeleteHistoryTbl = await db.fileHistoryTbl.keys();
for (const key of keysInDeleteHistoryTbl) {
if (key.startsWith(vaultRandomID)) {
continue;
}
const value = (await db.fileHistoryTbl.getItem(
key
)) as FileFolderHistoryRecord;
if (value === null || value === undefined) {
continue;
}
if (value.vaultRandomID === undefined || value.vaultRandomID === "") {
value.vaultRandomID = vaultRandomID;
}
const newKey = `${vaultRandomID}\t${key}`;
allPromisesToWait.push(db.fileHistoryTbl.setItem(newKey, value));
allPromisesToWait.push(db.fileHistoryTbl.removeItem(key));
res.push(newEntity);
}
log.debug("assign vault id to any sync mapping");
const keysInSyncMappingTbl = await db.syncMappingTbl.keys();
for (const key of keysInSyncMappingTbl) {
if (key.startsWith(vaultRandomID)) {
continue;
}
const value = (await db.syncMappingTbl.getItem(
key
)) as SyncMetaMappingRecord;
if (value === null || value === undefined) {
continue;
}
if (value.vaultRandomID === undefined || value.vaultRandomID === "") {
value.vaultRandomID = vaultRandomID;
}
const newKey = `${vaultRandomID}\t${key}`;
allPromisesToWait.push(db.syncMappingTbl.setItem(newKey, value));
allPromisesToWait.push(db.syncMappingTbl.removeItem(key));
}
log.debug("assign vault id to any sync plan records");
const keysInSyncPlansTbl = await db.syncPlansTbl.keys();
for (const key of keysInSyncPlansTbl) {
if (key.startsWith(vaultRandomID)) {
continue;
}
const value = (await db.syncPlansTbl.getItem(key)) as SyncPlanRecord;
if (value === null || value === undefined) {
continue;
}
if (value.vaultRandomID === undefined || value.vaultRandomID === "") {
value.vaultRandomID = vaultRandomID;
}
const newKey = `${vaultRandomID}\t${key}`;
allPromisesToWait.push(db.syncPlansTbl.setItem(newKey, value));
allPromisesToWait.push(db.syncPlansTbl.removeItem(key));
}
log.debug("finally update version if everything is ok");
await Promise.all(allPromisesToWait);
await db.versionTbl.setItem("version", newVer);
log.debug(`finish upgrading internal db from ${oldVer} to ${newVer}`);
return res;
};
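
An illustrative input/output pair for the mapping above (all values invented):

// old SyncMetaMappingRecord:
//   { localKey: "a.md", remoteKey: "xxx.enc", localMtime: 1700000000000,
//     remoteMtime: 1700000001000, localSize: 10, remoteSize: 42, remoteExtraKey: "etag-1" }
// resulting prev-sync Entity:
//   { key: "a.md", keyEnc: "xxx.enc", keyRaw: "xxx.enc", mtimeCli: 1700000000000,
//     mtimeSvr: 1700000001000, size: 10, sizeEnc: 42, sizeRaw: 42, etag: "etag-1" }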
/**
* no need to do anything except changing version
* we just add more file operations in db, and no schema is changed.
*
* @param db
* @param vaultRandomID
* Migrate the sync mapping record to sync Entity.
*/
const migrateDBsFrom20220108To20220326 = async (
const migrateDBsFrom20220326To20240220 = async (
db: InternalDBs,
vaultRandomID: string
vaultRandomID: string,
profileID: string
) => {
const oldVer = 20220108;
const newVer = 20220326;
log.debug(`start upgrading internal db from ${oldVer} to ${newVer}`);
await db.versionTbl.setItem("version", newVer);
log.debug(`finish upgrading internal db from ${oldVer} to ${newVer}`);
const oldVer = 20220326;
const newVer = 20240220;
console.debug(`start upgrading internal db from ${oldVer} to ${newVer}`);
// from sync mapping to prev sync
const syncMappings = await getAllSyncMetaMappingByVault(db, vaultRandomID);
const prevSyncRecords = fromSyncMappingsToPrevSyncRecords(syncMappings);
for (const prevSyncRecord of prevSyncRecords) {
await upsertPrevSyncRecordByVaultAndProfile(
db,
vaultRandomID,
profileID,
prevSyncRecord
);
}
// // clear unused data
// // as of 20240220, we don't call these,
// // to keep the opportunity for users to downgrade
// await clearFileHistoryOfEverythingByVault(db, vaultRandomID);
// await clearAllSyncMetaMappingByVault(db, vaultRandomID);
await db.versionTbl.setItem(`${vaultRandomID}\tversion`, newVer);
console.debug(`finish upgrading internal db from ${oldVer} to ${newVer}`);
};
const migrateDBs = async (
db: InternalDBs,
oldVer: number,
newVer: number,
vaultRandomID: string
vaultRandomID: string,
profileID: string
) => {
if (oldVer === newVer) {
return;
}
if (oldVer === 20211114 && newVer === 20220108) {
return await migrateDBsFrom20211114To20220108(db, vaultRandomID);
// as of 20240220, we assume everyone is using 20220326 already
// drop any old code to reduce the verbosity
if (oldVer < 20220326) {
throw Error(
"You are using a very old version of Remotely Save. No way to auto update internal DB. Please install and enable 0.3.40 firstly, then install a later version."
);
}
if (oldVer === 20220108 && newVer === 20220326) {
return await migrateDBsFrom20220108To20220326(db, vaultRandomID);
}
if (oldVer === 20211114 && newVer === 20220326) {
// TODO: more steps with more versions in the future
await migrateDBsFrom20211114To20220108(db, vaultRandomID);
await migrateDBsFrom20220108To20220326(db, vaultRandomID);
return;
if (oldVer === 20220326 && newVer === 20240220) {
return await migrateDBsFrom20220326To20240220(db, vaultRandomID, profileID);
}
if (newVer < oldVer) {
throw Error(
"You've installed a new version, but then downgrade to an old version. Stop working!"
@ -191,21 +180,14 @@ const migrateDBs = async (
export const prepareDBs = async (
vaultBasePath: string,
vaultRandomIDFromOldConfigFile: string,
profileID: string
) => {
const db = {
versionTbl: localforage.createInstance({
name: DEFAULT_DB_NAME,
storeName: DEFAULT_TBL_VERSION,
}),
syncPlansTbl: localforage.createInstance({
name: DEFAULT_DB_NAME,
storeName: DEFAULT_SYNC_PLANS_HISTORY,
@ -222,6 +204,23 @@ export const prepareDBs = async (
name: DEFAULT_DB_NAME,
storeName: DEFAULT_TBL_SIMPLE_KV_FOR_MISC,
}),
prevSyncRecordsTbl: localforage.createInstance({
name: DEFAULT_DB_NAME,
storeName: DEFAULT_TBL_PREV_SYNC_RECORDS,
}),
profilerResultsTbl: localforage.createInstance({
name: DEFAULT_DB_NAME,
storeName: DEFAULT_TBL_PROFILER_RESULTS,
}),
fileHistoryTbl: localforage.createInstance({
name: DEFAULT_DB_NAME,
storeName: DEFAULT_TBL_FILE_HISTORY,
}),
syncMappingTbl: localforage.createInstance({
name: DEFAULT_DB_NAME,
storeName: DEFAULT_TBL_SYNC_MAPPING,
}),
} as InternalDBs;
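// Each createInstance({ name, storeName }) above is one object store inside
// the same IndexedDB database; localforage keys are plain strings, which is
// presumably why the code below namespaces everything manually with
// `${vaultRandomID}\t...` prefixes instead of relying on separate stores.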
// try to get vaultRandomID firstly
@ -253,27 +252,35 @@ export const prepareDBs = async (
throw Error("no vaultRandomID found or generated");
}
// as of 20240220, we set the version per vault, instead of global "version"
const originalVersion: number | null =
(await db.versionTbl.getItem(`${vaultRandomID}\tversion`)) ??
(await db.versionTbl.getItem("version"));
if (originalVersion === null) {
console.debug(
`no internal db version, setting it to ${DEFAULT_DB_VERSION_NUMBER}`
);
await db.versionTbl.setItem("version", DEFAULT_DB_VERSION_NUMBER);
// as of 20240220, we set the version per vault, instead of global "version"
await db.versionTbl.setItem(
`${vaultRandomID}\tversion`,
DEFAULT_DB_VERSION_NUMBER
);
} else if (originalVersion === DEFAULT_DB_VERSION_NUMBER) {
// do nothing
} else {
console.debug(
`trying to upgrade db version from ${originalVersion} to ${DEFAULT_DB_VERSION_NUMBER}`
);
await migrateDBs(
db,
originalVersion,
DEFAULT_DB_VERSION_NUMBER,
vaultRandomID,
profileID
);
}
console.info("db connected");
return {
db: db,
vaultRandomID: vaultRandomID,
@ -284,306 +291,79 @@ export const destroyDBs = async () => {
// await localforage.dropInstance({
// name: DEFAULT_DB_NAME,
// });
// console.info("db deleted");
const req = indexedDB.deleteDatabase(DEFAULT_DB_NAME);
req.onsuccess = (event) => {
console.info("db deleted");
};
req.onblocked = (event) => {
console.warn("trying to delete db but it was blocked");
};
req.onerror = (event) => {
console.error("tried to delete db but something goes wrong!");
console.error(event);
};
};
export const clearFileHistoryOfEverythingByVault = async (
db: InternalDBs,
vaultRandomID: string
) => {
const keys = await db.fileHistoryTbl.keys();
for (const key of keys) {
if (key.startsWith(`${vaultRandomID}\t`)) {
await db.fileHistoryTbl.removeItem(key);
}
}
};
/**
 * @deprecated But we cannot remove it, because we want to migrate the old data.
 * @param db
 * @param vaultRandomID
 * @returns
 */
export const getAllSyncMetaMappingByVault = async (
db: InternalDBs,
vaultRandomID: string
) => {
return await Promise.all(
((await db.syncMappingTbl.keys()) ?? [])
.filter((key) => key.startsWith(`${vaultRandomID}\t`))
.map(
async (key) =>
(await db.syncMappingTbl.getItem(key)) as SyncMetaMappingRecord
)
);
};
export const clearAllSyncMetaMappingByVault = async (
db: InternalDBs,
vaultRandomID: string
) => {
const keys = await db.syncMappingTbl.keys();
for (const key of keys) {
if (key.startsWith(`${vaultRandomID}\t`)) {
await db.syncMappingTbl.removeItem(key);
}
}
};
export const clearAllSyncMetaMapping = async (db: InternalDBs) => {
await db.syncMappingTbl.clear();
};
export const insertSyncPlanRecordByVault = async (
db: InternalDBs,
syncPlan: SyncPlanType,
vaultRandomID: string,
remoteType: SUPPORTED_SERVICES_TYPE
) => {
const now = Date.now();
const record = {
ts: now,
tsFmt: unixTimeToStr(now),
vaultRandomID: vaultRandomID,
remoteType: remoteType,
syncPlan: JSON.stringify(syncPlan /* directly stringify */, null, 2),
} as SyncPlanRecord;
await db.syncPlansTbl.setItem(`${vaultRandomID}\t${now}`, record);
};
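// Note the key and the ts field both come from the Date.now() captured above,
// so sync plan records are keyed by insertion time and can be scanned
// (and expired) purely by parsing the numeric part after the `\t`.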
export const clearAllSyncPlanRecords = async (db: InternalDBs) => {
@ -610,13 +390,13 @@ export const readAllSyncPlanRecordTextsByVault = async (
};
/**
We remove records that are older than 1 day or beyond the latest 20 records.
It's a heavy operation, so we shall not run it at startup.
* @param db
*/
export const clearExpiredSyncPlanRecords = async (db: InternalDBs) => {
const MILLISECONDS_OLD = 1000 * 60 * 60 * 24 * 1; // 1 day
const COUNT_TO_MANY = 20;
const currTs = Date.now();
const expiredTs = currTs - MILLISECONDS_OLD;
@ -651,12 +431,66 @@ export const clearExpiredSyncPlanRecords = async (db: InternalDBs) => {
await Promise.all(ps);
};
export const getAllPrevSyncRecordsByVaultAndProfile = async (
db: InternalDBs,
vaultRandomID: string,
profileID: string
) => {
const res: Entity[] = [];
const kv: Record<string, Entity | null> =
await db.prevSyncRecordsTbl.getItems();
for (const key of Object.getOwnPropertyNames(kv)) {
if (key.startsWith(`${vaultRandomID}\t${profileID}\t`)) {
const val = kv[key];
if (val !== null) {
res.push(val);
}
}
}
return res;
};
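// getItems() is not part of core localforage; it presumably comes from the
// localforage-getitems addon, which fetches every key-value pair in one call
// instead of iterating the store record by record.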
export const upsertPrevSyncRecordByVaultAndProfile = async (
db: InternalDBs,
vaultRandomID: string,
profileID: string,
prevSync: Entity
) => {
await db.prevSyncRecordsTbl.setItem(
`${vaultRandomID}\t${profileID}\t${prevSync.key}`,
prevSync
);
};
export const clearPrevSyncRecordByVaultAndProfile = async (
db: InternalDBs,
vaultRandomID: string,
profileID: string,
key: string
) => {
await db.prevSyncRecordsTbl.removeItem(
`${vaultRandomID}\t${profileID}\t${key}`
);
};
export const clearAllPrevSyncRecordByVault = async (
db: InternalDBs,
vaultRandomID: string
) => {
const keys = await db.prevSyncRecordsTbl.keys();
for (const key of keys) {
if (key.startsWith(`${vaultRandomID}\t`)) {
await db.prevSyncRecordsTbl.removeItem(key);
}
}
};
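// A sketch of the composite key layout (with \t as the separator):
//   `${vaultRandomID}\t${profileID}\t${key}` -> one previously-synced Entity
// so clearPrevSyncRecordByVaultAndProfile removes a single entity, while
// clearAllPrevSyncRecordByVault drops everything under the vault prefix,
// across all profiles.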
export const clearAllLoggerOutputRecords = async (db: InternalDBs) => {
await db.loggerOutputTbl.clear();
console.debug(`successfully clearAllLoggerOutputRecords`);
};
export const upsertLastSuccessSyncTimeByVault = async (
db: InternalDBs,
vaultRandomID: string,
millis: number
@ -667,7 +501,7 @@ export const upsertLastSuccessSyncByVault = async (
);
};
export const getLastSuccessSyncTimeByVault = async (
db: InternalDBs,
vaultRandomID: string
) => {
@ -697,3 +531,45 @@ export const upsertPluginVersionByVault = async (
newVersion: newVersion,
};
};
export const insertProfilerResultByVault = async (
db: InternalDBs,
profilerStr: string,
vaultRandomID: string,
remoteType: SUPPORTED_SERVICES_TYPE
) => {
const now = Date.now();
await db.profilerResultsTbl.setItem(`${vaultRandomID}\t${now}`, profilerStr);
// clear older one while writing
const records = (await db.profilerResultsTbl.keys())
.filter((x) => x.startsWith(`${vaultRandomID}\t`))
.map((x) => parseInt(x.split("\t")[1]));
records.sort((a, b) => -(a - b)); // descending
while (records.length > 5) {
const ts = records.pop()!;
await db.profilerResultsTbl.removeItem(`${vaultRandomID}\t${ts}`);
}
};
export const readAllProfilerResultsByVault = async (
db: InternalDBs,
vaultRandomID: string
) => {
const records = [] as { val: string; ts: number }[];
await db.profilerResultsTbl.iterate((value, key, iterationNumber) => {
if (key.startsWith(`${vaultRandomID}\t`)) {
records.push({
val: value as string,
ts: parseInt(key.split("\t")[1]),
});
}
});
records.sort((a, b) => -(a.ts - b.ts)); // descending
// records is always an array here, so no undefined check is needed
return records.map((x) => x.val);
};

View File

@ -7,15 +7,13 @@ import {
setIcon,
FileSystemAdapter,
Platform,
requireApiVersion,
Events,
} from "obsidian";
import cloneDeep from "lodash/cloneDeep";
import { createElement, RotateCcw, RefreshCcw, FileText } from "lucide";
import type {
RemotelySavePluginSettings,
SyncTriggerSourceType,
} from "./baseTypes";
@ -24,22 +22,20 @@ import {
COMMAND_CALLBACK_ONEDRIVE,
COMMAND_CALLBACK_DROPBOX,
COMMAND_URI,
REMOTELY_SAVE_VERSION_2024PREPARE,
API_VER_ENSURE_REQURL_OK,
} from "./baseTypes";
import { importQrCodeUri } from "./importExport";
import {
insertSyncPlanRecordByVault,
prepareDBs,
InternalDBs,
clearExpiredSyncPlanRecords,
upsertPluginVersionByVault,
clearAllLoggerOutputRecords,
upsertLastSuccessSyncTimeByVault,
getLastSuccessSyncTimeByVault,
getAllPrevSyncRecordsByVaultAndProfile,
insertProfilerResultByVault,
} from "./localdb";
import { RemoteClient } from "./remote";
import {
@ -57,21 +53,24 @@ import {
import { DEFAULT_S3_CONFIG } from "./remoteForS3";
import { DEFAULT_WEBDAV_CONFIG } from "./remoteForWebdav";
import { RemotelySaveSettingTab } from "./settings";
import {
doActualSync,
ensembleMixedEnties,
getSyncPlanInplace,
isPasswordOk,
SyncStatusType,
} from "./sync";
import { messyConfigToNormal, normalConfigToMessy } from "./configPersist";
import { getLocalEntityList } from "./local";
import { I18n } from "./i18n";
import type { LangType, LangTypeAndAuto, TransItemType } from "./i18n";
import { SyncAlgoV3Modal } from "./syncAlgoV3Notice";
import AggregateError from "aggregate-error";
import { exportVaultSyncPlansToFiles } from "./debugMode";
import { changeMobileStatusBar, compareVersion } from "./misc";
import { Cipher } from "./encryptUnified";
import { Profiler } from "./profiler";
const DEFAULT_SETTINGS: RemotelySavePluginSettings = {
s3: DEFAULT_S3_CONFIG,
@ -95,6 +94,14 @@ const DEFAULT_SETTINGS: RemotelySavePluginSettings = {
ignorePaths: [],
enableStatusBarInfo: true,
deleteToWhere: "system",
agreeToUseSyncV3: false,
conflictAction: "keep_newer",
howToCleanEmptyFolder: "skip",
protectModifyPercentage: 50,
syncDirection: "bidirectional",
obfuscateSettingFile: true,
enableMobileStatusBar: false,
encryptionMethod: "unknown",
};
interface OAuth2Info {
@ -145,12 +152,18 @@ export default class RemotelySavePlugin extends Plugin {
i18n!: I18n;
vaultRandomID!: string;
debugServerTemp?: string;
syncEvent?: Events;
appContainerObserver?: MutationObserver;
async syncRun(triggerSource: SyncTriggerSourceType = "manual") {
const profiler = new Profiler("start of syncRun");
const t = (x: TransItemType, vars?: any) => {
return this.i18n.t(x, vars);
};
const profileID = this.getCurrProfileID();
const getNotice = (x: string, timeout?: number) => {
// only show notices in manual mode
// no notice in auto mode
@ -159,15 +172,17 @@ export default class RemotelySavePlugin extends Plugin {
}
};
if (this.syncStatus !== "idle") {
// really, users don't want to see this in auto mode
// so we use getNotice to avoid it showing up unnecessarily
getNotice(
t("syncrun_alreadyrunning", {
pluginName: this.manifest.name,
syncStatus: this.syncStatus,
newTriggerSource: triggerSource,
})
);
if (this.currSyncMsg !== undefined && this.currSyncMsg !== "") {
getNotice(this.currSyncMsg);
}
return;
}
@ -178,7 +193,7 @@ export default class RemotelySavePlugin extends Plugin {
}
try {
console.info(
`${
this.manifest.id
}-${Date.now()}: start sync, triggerSource=${triggerSource}`
@ -203,7 +218,11 @@ export default class RemotelySavePlugin extends Plugin {
}
}
// change status to "syncing..." on statusbar
if (this.statusBarElement !== undefined) {
this.updateLastSuccessSyncMsg(-1);
}
//console.info(`huh ${this.settings.password}`)
if (this.settings.currLogLevel === "info") {
getNotice(
t("syncrun_shortstep1", {
@ -219,6 +238,7 @@ export default class RemotelySavePlugin extends Plugin {
}
this.syncStatus = "preparing";
profiler.insert("finish step1");
if (this.settings.currLogLevel === "info") {
// pass
@ -234,10 +254,14 @@ export default class RemotelySavePlugin extends Plugin {
this.settings.dropbox,
this.settings.onedrive,
this.app.vault.getName(),
() => self.saveSettings()
() => self.saveSettings(),
profiler
);
const remoteEntityList = await client.listAllFromRemote();
console.debug("remoteEntityList:");
console.debug(remoteEntityList);
profiler.insert("finish step2 (listing remote)");
if (this.settings.currLogLevel === "info") {
// pass
@ -245,57 +269,52 @@ export default class RemotelySavePlugin extends Plugin {
getNotice(t("syncrun_step3"));
}
this.syncStatus = "checking_password";
const cipher = new Cipher(
this.settings.password,
this.settings.encryptionMethod ?? "unknown"
);
const passwordCheckResult = await isPasswordOk(remoteEntityList, cipher);
if (!passwordCheckResult.ok) {
getNotice(t("syncrun_passworderr"));
throw Error(passwordCheckResult.reason);
}
profiler.insert("finish step3 (checking password)");
if (this.settings.currLogLevel === "info") {
// pass
} else {
getNotice(t("syncrun_step4"));
}
this.syncStatus = "getting_remote_extra_meta";
const { remoteStates, metadataFile } = await parseRemoteItems(
remoteRsp.Contents,
this.db,
this.vaultRandomID,
client.serviceType,
this.settings.password
);
const origMetadataOnRemote = await fetchMetadataFile(
metadataFile,
client,
this.syncStatus = "getting_local_meta";
const localEntityList = await getLocalEntityList(
this.app.vault,
this.settings.syncConfigDir ?? false,
this.app.vault.configDir,
this.manifest.id,
profiler
);
console.debug("localEntityList:");
console.debug(localEntityList);
profiler.insert("finish step4 (local meta)");
if (this.settings.currLogLevel === "info") {
// pass
} else {
getNotice(t("syncrun_step5"));
}
this.syncStatus = "getting_local_meta";
const local = this.app.vault.getAllLoadedFiles();
const localHistory = await loadFileHistoryTableByVault(
this.syncStatus = "getting_local_prev_sync";
const prevSyncEntityList = await getAllPrevSyncRecordsByVaultAndProfile(
this.db,
this.vaultRandomID,
profileID
);
console.debug("prevSyncEntityList:");
console.debug(prevSyncEntityList);
profiler.insert("finish step5 (prev sync)");
if (this.settings.currLogLevel === "info") {
// pass
@ -303,24 +322,39 @@ export default class RemotelySavePlugin extends Plugin {
getNotice(t("syncrun_step6"));
}
this.syncStatus = "generating_plan";
let mixedEntityMappings = await ensembleMixedEnties(
localEntityList,
prevSyncEntityList,
remoteEntityList,
this.settings.syncConfigDir ?? false,
this.app.vault.configDir,
this.settings.syncUnderscoreItems ?? false,
this.settings.skipSizeLargerThan ?? -1,
this.settings.ignorePaths ?? [],
cipher,
this.settings.serviceType,
profiler
);
profiler.insert("finish building partial mixedEntity");
mixedEntityMappings = await getSyncPlanInplace(
mixedEntityMappings,
this.settings.howToCleanEmptyFolder ?? "skip",
this.settings.skipSizeLargerThan ?? -1,
this.settings.conflictAction ?? "keep_newer",
this.settings.syncDirection ?? "bidirectional",
profiler
);
console.info(`mixedEntityMappings:`);
console.info(mixedEntityMappings); // for debugging
profiler.insert("finish building full sync plan");
await insertSyncPlanRecordByVault(
this.db,
mixedEntityMappings,
this.vaultRandomID,
client.serviceType
);
profiler.insert("finish writing sync plan");
profiler.insert("finish step6 (plan)");
// The operations above are almost read only and kind of safe.
// The operations below begin to write or delete (!!!) something.
@ -333,30 +367,47 @@ export default class RemotelySavePlugin extends Plugin {
}
this.syncStatus = "syncing";
await doActualSync(
mixedEntityMappings,
client,
this.vaultRandomID,
profileID,
this.app.vault,
cipher,
this.settings.concurrency ?? 5,
(key: string) => self.trash(key),
this.settings.protectModifyPercentage ?? 50,
(
protectModifyPercentage: number,
realModifyDeleteCount: number,
allFilesCount: number
) => {
const percent = (
(100 * realModifyDeleteCount) /
allFilesCount
).toFixed(1);
const res = t("syncrun_abort_protectmodifypercentage", {
protectModifyPercentage,
realModifyDeleteCount,
allFilesCount,
percent,
});
return res;
},
(
realCounter: number,
realTotalCount: number,
pathName: string,
decision: string
) =>
self.setCurrSyncMsg(
realCounter,
realTotalCount,
pathName,
decision,
triggerSource
),
this.db,
profiler
);
} else {
this.syncStatus = "syncing";
@ -367,6 +418,10 @@ export default class RemotelySavePlugin extends Plugin {
}
}
cipher.closeResources();
profiler.insert("finish step7 (actual sync)");
if (this.settings.currLogLevel === "info") {
getNotice(t("syncrun_shortstep2"));
} else {
@ -376,8 +431,10 @@ export default class RemotelySavePlugin extends Plugin {
this.syncStatus = "finish";
this.syncStatus = "idle";
profiler.insert("finish step8");
const lastSuccessSyncMillis = Date.now();
await upsertLastSuccessSyncTimeByVault(
this.db,
this.vaultRandomID,
lastSuccessSyncMillis
@ -392,20 +449,22 @@ export default class RemotelySavePlugin extends Plugin {
this.updateLastSuccessSyncMsg(lastSuccessSyncMillis);
}
this.syncEvent?.trigger("SYNC_DONE");
console.info(
`${
this.manifest.id
}-${Date.now()}: finish sync, triggerSource=${triggerSource}`
);
} catch (error: any) {
profiler.insert("start error branch");
const msg = t("syncrun_abort", {
manifestID: this.manifest.id,
theDate: `${Date.now()}`,
triggerSource: triggerSource,
syncStatus: this.syncStatus,
});
console.error(msg);
console.error(error);
getNotice(msg, 10 * 1000);
if (error instanceof AggregateError) {
for (const e of error.errors) {
@ -419,11 +478,23 @@ export default class RemotelySavePlugin extends Plugin {
setIcon(this.syncRibbon, iconNameSyncWait);
this.syncRibbon.setAttribute("aria-label", originLabel);
}
profiler.insert("finish error branch");
}
profiler.insert("finish syncRun");
console.debug(profiler.toString());
insertProfilerResultByVault(
this.db,
profiler.toString(),
this.vaultRandomID,
this.settings.serviceType
);
profiler.clear();
}
async onload() {
console.info(`loading plugin ${this.manifest.id}`);
const { iconSvgSyncWait, iconSvgSyncRunning, iconSvgLogs } = getIconSvg();
@ -441,8 +512,13 @@ export default class RemotelySavePlugin extends Plugin {
this.currSyncMsg = "";
this.syncEvent = new Events();
await this.loadSettings();
// MUST after loadSettings and before prepareDB
const profileID: string = this.getCurrProfileID();
// lang should be load early, but after settings
this.i18n = new I18n(this.settings.lang!, async (lang: LangTypeAndAuto) => {
this.settings.lang = lang;
@ -452,10 +528,6 @@ export default class RemotelySavePlugin extends Plugin {
return this.i18n.t(x, vars);
};
await this.checkIfOauthExpires();
// MUST before prepareDB()
@ -472,7 +544,8 @@ export default class RemotelySavePlugin extends Plugin {
try {
await this.prepareDBAndVaultRandomID(
vaultBasePath,
vaultRandomIDFromOldConfigFile,
profileID
);
} catch (err: any) {
new Notice(
@ -483,7 +556,6 @@ export default class RemotelySavePlugin extends Plugin {
}
this.enableAutoClearOutputToDBHistIfSet();
// must AFTER preparing DB
@ -491,53 +563,8 @@ export default class RemotelySavePlugin extends Plugin {
this.syncStatus = "idle";
this.registerObsidianProtocolHandler(COMMAND_URI, async (inputParams) => {
// console.debug(inputParams);
const parsed = importQrCodeUri(inputParams, this.app.vault.getName());
if (parsed.status === "error") {
new Notice(parsed.message);
@ -752,20 +779,26 @@ export default class RemotelySavePlugin extends Plugin {
async () => this.syncRun("manual")
);
this.enableMobileStatusBarIfSet();
// Create Status Bar Item
if (
(!Platform.isMobile ||
(Platform.isMobile && this.settings.enableMobileStatusBar)) &&
this.settings.enableStatusBarInfo === true
) {
const statusBarItem = this.addStatusBarItem();
this.statusBarElement = statusBarItem.createEl("span");
this.statusBarElement.setAttribute("data-tooltip-position", "top");
this.updateLastSuccessSyncMsg(
await getLastSuccessSyncTimeByVault(this.db, this.vaultRandomID)
);
// update statusbar text every 30 seconds
this.registerInterval(
window.setInterval(async () => {
this.updateLastSuccessSyncMsg(
await getLastSuccessSyncTimeByVault(this.db, this.vaultRandomID)
);
}, 1000 * 30)
);
@ -790,14 +823,45 @@ export default class RemotelySavePlugin extends Plugin {
});
this.addCommand({
id: "export-sync-plans-json",
name: t("command_exportsyncplans_json"),
id: "export-sync-plans-1",
name: t("command_exportsyncplans_1"),
icon: iconNameLogs,
callback: async () => {
await exportVaultSyncPlansToFiles(
this.db,
this.app.vault,
this.vaultRandomID,
1
);
new Notice(t("settings_syncplans_notice"));
},
});
this.addCommand({
id: "export-sync-plans-5",
name: t("command_exportsyncplans_5"),
icon: iconNameLogs,
callback: async () => {
await exportVaultSyncPlansToFiles(
this.db,
this.app.vault,
this.vaultRandomID,
5
);
new Notice(t("settings_syncplans_notice"));
},
});
this.addCommand({
id: "export-sync-plans-all",
name: t("command_exportsyncplans_all"),
icon: iconNameLogs,
callback: async () => {
await exportVaultSyncPlansToFiles(
this.db,
this.app.vault,
this.vaultRandomID,
-1
);
new Notice(t("settings_syncplans_notice"));
},
@ -806,12 +870,12 @@ export default class RemotelySavePlugin extends Plugin {
this.addSettingTab(new RemotelySaveSettingTab(this.app, this));
// this.registerDomEvent(document, "click", (evt: MouseEvent) => {
// log.info("click", evt);
// console.info("click", evt);
// });
if (!this.settings.agreeToUseSyncV3) {
const syncAlgoV3Modal = new SyncAlgoV3Modal(this.app, this);
syncAlgoV3Modal.open();
} else {
this.enableAutoSyncIfSet();
this.enableInitSyncIfSet();
@ -824,14 +888,15 @@ export default class RemotelySavePlugin extends Plugin {
this.vaultRandomID,
this.manifest.version
);
if (compareVersion(REMOTELY_SAVE_VERSION_2024PREPARE, oldVersion) >= 0) {
new Notice(t("official_notice_2024_first_party"), 10 * 1000);
}
}
async onunload() {
console.info(`unloading plugin ${this.manifest.id}`);
this.syncRibbon = undefined;
if (this.appContainerObserver !== undefined) {
this.appContainerObserver.disconnect();
this.appContainerObserver = undefined;
}
if (this.oauth2Info !== undefined) {
this.oauth2Info.helperModal = undefined;
this.oauth2Info = {
@ -913,11 +978,66 @@ export default class RemotelySavePlugin extends Plugin {
this.settings.s3.bypassCorsLocally = true; // deprecated as of 20240113
}
if (this.settings.agreeToUseSyncV3 === undefined) {
this.settings.agreeToUseSyncV3 = false;
}
if (this.settings.conflictAction === undefined) {
this.settings.conflictAction = "keep_newer";
}
if (this.settings.howToCleanEmptyFolder === undefined) {
this.settings.howToCleanEmptyFolder = "skip";
}
if (this.settings.protectModifyPercentage === undefined) {
this.settings.protectModifyPercentage = 50;
}
if (this.settings.syncDirection === undefined) {
this.settings.syncDirection = "bidirectional";
}
if (this.settings.obfuscateSettingFile === undefined) {
this.settings.obfuscateSettingFile = true;
}
if (this.settings.enableMobileStatusBar === undefined) {
this.settings.enableMobileStatusBar = false;
}
if (
this.settings.encryptionMethod === undefined ||
this.settings.encryptionMethod === "unknown"
) {
if (
this.settings.password === undefined ||
this.settings.password === ""
) {
// we have a preferred way
this.settings.encryptionMethod = "rclone-base64";
} else {
// likely to be inherited from the old version
this.settings.encryptionMethod = "openssl-base64";
}
}
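// In short: fresh setups (no password yet) default to the preferred
// rclone-base64 scheme, while an existing non-empty password is assumed to
// predate this setting and keeps decrypting as openssl-base64 for backward
// compatibility.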
await this.saveSettings();
}
async saveSettings() {
if (this.settings.obfuscateSettingFile) {
await this.saveData(normalConfigToMessy(this.settings));
} else {
await this.saveData(this.settings);
}
}
/**
* After 202403 the data should be profile based.
*/
getCurrProfileID() {
if (this.settings.serviceType !== undefined) {
return `${this.settings.serviceType}-default-1`;
} else {
throw Error("unknown serviceType in the setting!");
}
}
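// e.g. with serviceType === "s3" this yields "s3-default-1"; the fixed
// "-default-1" suffix presumably leaves room for multiple profiles per
// service type later.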
async checkIfOauthExpires() {
@ -999,7 +1119,7 @@ export default class RemotelySavePlugin extends Plugin {
// a real string was assigned before
vaultRandomID = this.settings.vaultRandomID;
}
console.debug("vaultRandomID is no longer saved in data.json");
delete this.settings.vaultRandomID;
await this.saveSettings();
}
@ -1029,11 +1149,13 @@ export default class RemotelySavePlugin extends Plugin {
async prepareDBAndVaultRandomID(
vaultBasePath: string,
vaultRandomIDFromOldConfigFile: string,
profileID: string
) {
const { db, vaultRandomID } = await prepareDBs(
vaultBasePath,
vaultRandomIDFromOldConfigFile,
profileID
);
this.db = db;
this.vaultRandomID = vaultRandomID;
@ -1063,7 +1185,7 @@ export default class RemotelySavePlugin extends Plugin {
) {
this.app.workspace.onLayoutReady(() => {
window.setTimeout(() => {
this.syncRun("autoOnceInit");
this.syncRun("auto_once_init");
}, this.settings.initRunAfterMilliseconds);
});
}
@ -1076,51 +1198,81 @@ export default class RemotelySavePlugin extends Plugin {
this.settings.syncOnSaveAfterMilliseconds > 0
) {
let runScheduled = false;
let needToRunAgain = false;
const scheduleSyncOnSave = (scheduleTimeFromNow: number) => {
console.info(
`schedule a run for ${scheduleTimeFromNow} milliseconds later`
);
runScheduled = true;
setTimeout(() => {
this.syncRun("auto_sync_on_save");
runScheduled = false;
}, scheduleTimeFromNow);
};
const checkCurrFileModified = async (caller: "SYNC" | "FILE_CHANGES") => {
const currentFile = this.app.workspace.getActiveFile();
if (currentFile) {
// get the last modified time of the current file
// if it has been modified after lastSuccessSync
// then schedule a run for syncOnSaveAfterMilliseconds after it was modified
const lastModified = currentFile.stat.mtime;
const lastSuccessSyncMillis = await getLastSuccessSyncTimeByVault(
this.db,
this.vaultRandomID
);
if (
this.syncStatus === "idle" &&
lastModified > lastSuccessSyncMillis &&
!runScheduled
) {
scheduleSyncOnSave(this.settings!.syncOnSaveAfterMilliseconds!);
} else if (
this.syncStatus === "idle" &&
needToRunAgain &&
!runScheduled
) {
scheduleSyncOnSave(this.settings!.syncOnSaveAfterMilliseconds!);
needToRunAgain = false;
} else {
if (caller === "FILE_CHANGES") {
needToRunAgain = true;
}
}
}
};
this.app.workspace.onLayoutReady(() => {
// listen to sync done
this.registerEvent(
this.syncEvent?.on("SYNC_DONE", () => {
checkCurrFileModified("SYNC");
})!
);
// listen to current file save changes
this.registerEvent(
this.app.vault.on("modify", (x) => {
// console.debug(`event=modify! file=${x}`);
checkCurrFileModified("FILE_CHANGES");
})
);
});
}
}
enableMobileStatusBarIfSet() {
this.app.workspace.onLayoutReady(() => {
if (Platform.isMobile && this.settings.enableMobileStatusBar) {
this.appContainerObserver = changeMobileStatusBar("enable");
}
});
}
async saveAgreeToUseNewSyncAlgorithm() {
this.settings.agreeToUseSyncV3 = true;
await this.saveSettings();
}
@ -1128,9 +1280,10 @@ export default class RemotelySavePlugin extends Plugin {
i: number,
totalCount: number,
pathName: string,
decision: string,
triggerSource: SyncTriggerSourceType
) {
const msg = `syncing progress=${i}/${totalCount},decision=${decision},path=${pathName},source=${triggerSource}`;
this.currSyncMsg = msg;
}
@ -1144,6 +1297,10 @@ export default class RemotelySavePlugin extends Plugin {
let lastSyncMsg = t("statusbar_lastsync_never");
let lastSyncLabelMsg = t("statusbar_lastsync_never_label");
if (lastSuccessSyncMillis !== undefined && lastSuccessSyncMillis === -1) {
lastSyncMsg = t("statusbar_syncing");
}
if (lastSuccessSyncMillis !== undefined && lastSuccessSyncMillis > 0) {
const deltaTime = Date.now() - lastSuccessSyncMillis;
@ -1154,6 +1311,8 @@ export default class RemotelySavePlugin extends Plugin {
const days = Math.floor(deltaTime / 86400000);
const hours = Math.floor(deltaTime / 3600000);
const minutes = Math.floor(deltaTime / 60000);
const seconds = Math.floor(deltaTime / 1000);
let timeText = "";
if (years > 0) {
@ -1168,8 +1327,10 @@ export default class RemotelySavePlugin extends Plugin {
timeText = t("statusbar_time_hours", { time: hours });
} else if (minutes > 0) {
timeText = t("statusbar_time_minutes", { time: minutes });
} else if (seconds > 30) {
timeText = t("statusbar_time_lessminute");
} else {
timeText = t("statusbar_now");
}
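// For example (a sketch): a sync 90 seconds ago gives minutes === 1 and uses
// the "minutes" text; 40 seconds ago gives seconds === 40 > 30 ("less than a
// minute"); anything fresher collapses to the "now" label.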
let dateText = new Date(lastSuccessSyncMillis).toLocaleTimeString(
@ -1182,7 +1343,7 @@ export default class RemotelySavePlugin extends Plugin {
}
);
lastSyncMsg = t("statusbar_lastsync", { time: timeText });
lastSyncMsg = timeText;
lastSyncLabelMsg = t("statusbar_lastsync_label", { date: dateText });
}
@ -1223,31 +1384,6 @@ export default class RemotelySavePlugin extends Plugin {
}
}
enableAutoClearOutputToDBHistIfSet() {
const initClearOutputToDBHistAfterMilliseconds = 1000 * 30;

View File

@ -1,7 +1,6 @@
import isEqual from "lodash/isEqual";
import { base64url } from "rfc4648";
import { reverseString } from "./misc";
import { log } from "./moreOnLog";
const DEFAULT_README_FOR_METADATAONREMOTE =
"Do NOT edit or delete the file manually. This file is for the plugin remotely-save to store some necessary meta data on the remote services. Its content is slightly obfuscated.";

View File

@ -1,12 +1,10 @@
import { Vault } from "obsidian";
import { Platform, Vault } from "obsidian";
import * as path from "path";
import { base32, base64url } from "rfc4648";
import XRegExp from "xregexp";
import emojiRegex from "emoji-regex";
import { log } from "./moreOnLog";
declare global {
interface Window {
moment: (...data: any) => any;
@ -30,7 +28,7 @@ export const isHiddenPath = (
}
const k = path.posix.normalize(item); // TODO: only unix path now
const k2 = k.split("/"); // TODO: only unix path now
// console.info(k2)
for (const singlePart of k2) {
if (singlePart === "." || singlePart === ".." || singlePart === "") {
continue;
@ -75,14 +73,14 @@ export const getFolderLevels = (x: string, addEndingSlash: boolean = false) => {
};
export const mkdirpInVault = async (thePath: string, vault: Vault) => {
// console.info(thePath);
const foldersToBuild = getFolderLevels(thePath);
// console.info(foldersToBuild);
for (const folder of foldersToBuild) {
const r = await vault.adapter.exists(folder);
// console.info(r);
if (!r) {
console.info(`mkdir ${folder}`);
await vault.adapter.mkdir(folder);
}
}
@ -120,6 +118,12 @@ export const base64ToArrayBuffer = (b64text: string) => {
return bufferToArrayBuffer(Buffer.from(b64text, "base64"));
};
export const copyArrayBuffer = (src: ArrayBuffer) => {
var dst = new ArrayBuffer(src.byteLength);
new Uint8Array(dst).set(new Uint8Array(src));
return dst;
};
/**
* https://stackoverflow.com/questions/43131242
* @param hex
@ -161,6 +165,9 @@ export const base64ToBase64url = (a: string, pad: boolean = false) => {
* @param a
*/
export const isVaildText = (a: string) => {
if (a === undefined) {
return false;
}
// If the regex matches, the string is invalid.
return !XRegExp("\\p{Cc}|\\p{Cf}|\\p{Co}|\\p{Cn}|\\p{Zl}|\\p{Zp}", "A").test(
a
@ -435,7 +442,10 @@ export const statFix = async (vault: Vault, path: string) => {
return s;
};
export const isSpecialFolderNameToSkip = (
x: string,
more: string[] | undefined
) => {
let specialFolders = [
".git",
".github",
@ -490,3 +500,154 @@ export const compareVersion = (x: string | null, y: string | null) => {
}
return -1;
};
/**
* https://stackoverflow.com/questions/19929641/how-to-append-an-html-string-to-a-documentfragment
* To introduce some advanced html fragments.
* @param string
* @returns
*/
export const stringToFragment = (string: string) => {
const wrapper = document.createElement("template");
wrapper.innerHTML = string;
return wrapper.content;
};
/**
* https://stackoverflow.com/questions/39538473/using-settimeout-on-promise-chain
* @param ms
* @returns
*/
export const delay = (ms: number) =>
new Promise((resolve) => setTimeout(resolve, ms));
/**
* https://forum.obsidian.md/t/css-to-show-status-bar-on-mobile-devices/77185
* @param op
*/
export const changeMobileStatusBar = (
op: "enable" | "disable",
oldAppContainerObserver?: MutationObserver
) => {
const appContainer = document.getElementsByClassName("app-container")[0] as
| HTMLElement
| undefined;
const statusbar = document.querySelector(
".is-mobile .app-container .status-bar"
) as HTMLElement | undefined;
if (appContainer === undefined || statusbar === undefined) {
// give up, exit
console.warn(`give up watching appContainer for statusbar`);
console.warn(`appContainer=${appContainer}, statusbar=${statusbar}`);
return undefined;
}
if (op === "enable") {
const callback = async (
mutationList: MutationRecord[],
observer: MutationObserver
) => {
for (const mutation of mutationList) {
// console.debug(mutation);
if (mutation.type === "childList" && mutation.addedNodes.length > 0) {
const k = mutation.addedNodes[0] as Element;
if (
k.className.contains("mobile-navbar") ||
k.className.contains("mobile-toolbar")
) {
// have to wait, otherwise the height is not correct??
await delay(300);
const height = window
.getComputedStyle(k as Element)
.getPropertyValue("height");
statusbar.style.setProperty("display", "flex");
statusbar.style.setProperty("margin-bottom", height);
}
}
}
};
const observer = new MutationObserver(callback);
observer.observe(appContainer, {
attributes: false,
childList: true,
characterData: false,
subtree: false,
});
try {
// init, manual call
const navBar = document.getElementsByClassName(
"mobile-navbar"
)[0] as HTMLElement;
// thanks to community's solution
const height = window.getComputedStyle(navBar).getPropertyValue("height");
statusbar.style.setProperty("display", "flex");
statusbar.style.setProperty("margin-bottom", height);
} catch (e) {
// skip
}
return observer;
} else {
if (oldAppContainerObserver !== undefined) {
console.debug(`disconnect oldAppContainerObserver`);
oldAppContainerObserver.disconnect();
oldAppContainerObserver = undefined;
}
statusbar.style.removeProperty("display");
statusbar.style.removeProperty("margin-bottom");
return undefined;
}
};
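// Usage sketch: changeMobileStatusBar("enable") returns the MutationObserver
// so the caller can keep it (see appContainerObserver in main.ts) and later
// hand it back via changeMobileStatusBar("disable", observer) to undo the
// styling and stop watching.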
/**
* https://github.com/remotely-save/remotely-save/issues/567
* https://www.dropboxforum.com/t5/Dropbox-API-Support-Feedback/Case-Sensitivity-in-API-2/td-p/191279
* @param entities
*/
export const fixEntityListCasesInplace = (entities: { keyRaw: string }[]) => {
entities.sort((a, b) => a.keyRaw.length - b.keyRaw.length);
// console.log(JSON.stringify(entities,null,2));
const caseMapping: Record<string, string> = { "": "" };
for (const e of entities) {
// console.log(`looking for: ${JSON.stringify(e, null, 2)}`);
let parentFolder = getParentFolder(e.keyRaw);
if (parentFolder === "/") {
parentFolder = "";
}
const parentFolderLower = parentFolder.toLocaleLowerCase();
const segs = e.keyRaw.split("/");
if (e.keyRaw.endsWith("/")) {
// folder
if (caseMapping.hasOwnProperty(parentFolderLower)) {
const newKeyRaw = `${caseMapping[parentFolderLower]}${segs
.slice(-2)
.join("/")}`;
caseMapping[newKeyRaw.toLocaleLowerCase()] = newKeyRaw;
e.keyRaw = newKeyRaw;
// console.log(JSON.stringify(caseMapping,null,2));
continue;
} else {
throw Error(`${parentFolder} doesn't have cases record??`);
}
} else {
// file
if (caseMapping.hasOwnProperty(parentFolderLower)) {
const newKeyRaw = `${caseMapping[parentFolderLower]}${segs
.slice(-1)
.join("/")}`;
e.keyRaw = newKeyRaw;
continue;
} else {
throw Error(`${parentFolder} doesn't have cases record??`);
}
}
}
return entities;
};
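// Worked example (hypothetical keys): given entities ["Abc/", "abc/Def.md"],
// the length sort visits "Abc/" first, recording caseMapping["abc/"] = "Abc/";
// the file then looks up its lowercased parent "abc/" and is rewritten to
// "Abc/Def.md", so every child adopts its folder's canonical casing.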

View File

@ -1,40 +0,0 @@
// It's very dangerous for this file to depend on other files in the same project.
// We should avoid this situation as much as possible.
import { TAbstractFile, TFolder, TFile, Vault } from "obsidian";
import * as origLog from "loglevel";
import type {
LogLevelNumbers,
Logger,
LogLevel,
LogLevelDesc,
LogLevelNames,
} from "loglevel";
const log2 = origLog.getLogger("rs-default");
const originalFactory = log2.methodFactory;
export const applyLogWriterInplace = function (writer: (...msg: any[]) => any) {
log2.methodFactory = function (
methodName: LogLevelNames,
logLevel: LogLevelNumbers,
loggerName: string | symbol
) {
const rawMethod = originalFactory(methodName, logLevel, loggerName);
return function (...msg: any[]) {
rawMethod.apply(undefined, msg);
writer(...msg);
};
};
log2.setLevel(log2.getLevel());
};
export const restoreLogWritterInplace = () => {
log2.methodFactory = originalFactory;
log2.setLevel(log2.getLevel());
};
export const log = log2;

View File

@ -1,16 +1,10 @@
import { Vault, Stat, ListedFiles } from "obsidian";
import type { Vault, Stat, ListedFiles } from "obsidian";
import type { Entity, MixedEntity } from "./baseTypes";
import { Queue } from "@fyears/tsqueue";
import chunk from "lodash/chunk";
import flatten from "lodash/flatten";
import { statFix, isFolderToSkip } from "./misc";
export interface ObsConfigDirFileType {
key: string;
ctime: number;
mtime: number;
size: number;
type: "folder" | "file";
}
import { statFix, isSpecialFolderNameToSkip } from "./misc";
const isPluginDirItself = (x: string, pluginId: string) => {
return (
@ -48,10 +42,10 @@ export const listFilesInObsFolder = async (
configDir: string,
vault: Vault,
pluginId: string
): Promise<Entity[]> => {
const q = new Queue([configDir]);
const CHUNK_SIZE = 10;
const contents: Entity[] = [];
while (q.length > 0) {
const itemsToFetch: string[] = [];
while (q.length > 0) {
@ -72,11 +66,26 @@ export const listFilesInObsFolder = async (
children = await vault.adapter.list(x);
}
if (
!isFolder &&
(statRes.mtime === undefined ||
statRes.mtime === null ||
statRes.mtime === 0)
) {
throw Error(
`File in Obsidian ${configDir} has last modified time 0: ${x}, don't know how to deal with it.`
);
}
return {
itself: {
key: isFolder ? `${x}/` : x, // local always unencrypted
keyRaw: isFolder ? `${x}/` : x,
mtimeCli: statRes.mtime,
mtimeSvr: statRes.mtime,
size: statRes.size, // local always unencrypted
sizeRaw: statRes.size,
},
children: children,
};
});
@ -87,7 +96,9 @@ export const listFilesInObsFolder = async (
const isInsideSelfPlugin = isPluginDirItself(iter.itself.key, pluginId);
if (iter.children !== undefined) {
for (const iter2 of iter.children.folders) {
if (
isSpecialFolderNameToSkip(iter2, ["workspace", "workspace.json"])
) {
continue;
}
if (isInsideSelfPlugin && !isLikelyPluginSubFiles(iter2)) {
@ -97,7 +108,9 @@ export const listFilesInObsFolder = async (
q.push(iter2);
}
for (const iter2 of iter.children.files) {
if (
isSpecialFolderNameToSkip(iter2, ["workspace", "workspace.json"])
) {
continue;
}
if (isInsideSelfPlugin && !isLikelyPluginSubFiles(iter2)) {

src/profiler.ts Normal file
View File

@ -0,0 +1,82 @@
import { unixTimeToStr } from "./misc";
interface BreakPoint {
label: string;
fakeTimeMilli: number; // it's NOT a unix timestamp
indent: number;
}
export class Profiler {
startTime: number;
breakPoints: BreakPoint[];
indent: number;
constructor(label?: string) {
this.breakPoints = [];
this.indent = 0;
this.startTime = 0;
if (label !== undefined) {
this.startTime = Date.now();
this.breakPoints.push({
label: label,
fakeTimeMilli: performance.now(),
indent: this.indent,
});
}
}
insert(label: string) {
if (this.breakPoints.length === 0) {
this.startTime = Date.now();
}
this.breakPoints.push({
label: label,
fakeTimeMilli: performance.now(),
indent: this.indent,
});
return this;
}
addIndent() {
this.indent += 2;
}
removeIndent() {
this.indent -= 2;
if (this.indent < 0) {
this.indent = 0;
}
}
clear() {
this.breakPoints = [];
this.indent = 0;
this.startTime = 0;
return this;
}
toString() {
if (this.breakPoints.length === 0) {
return "nothing in profiler";
}
let res = `[startTime]: ${unixTimeToStr(this.startTime)}`;
for (let i = 0; i < this.breakPoints.length; ++i) {
if (i === 0) {
res += `\n[${this.breakPoints[i]["label"]}]: start`;
} else {
const label = this.breakPoints[i]["label"];
const indent = this.breakPoints[i]["indent"];
const millsec =
Math.round(
(this.breakPoints[i]["fakeTimeMilli"] -
this.breakPoints[i - 1]["fakeTimeMilli"]) *
10
) / 10.0;
res += `\n${" ".repeat(indent)}[${label}]: ${millsec}ms`;
}
}
return res;
}
}
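// A usage sketch (labels are arbitrary):
//   const profiler = new Profiler("start of syncRun");
//   profiler.insert("finish step1");
//   profiler.addIndent();
//   profiler.insert("sub task"); // printed indented under the previous step
//   profiler.removeIndent();
//   console.debug(profiler.toString()); // per-step elapsed milliseconds
//   profiler.clear();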

View File

@ -1,17 +1,19 @@
import { Vault } from "obsidian";
import type {
Entity,
DropboxConfig,
OnedriveConfig,
S3Config,
SUPPORTED_SERVICES_TYPE,
WebdavConfig,
UploadedType,
} from "./baseTypes";
import * as dropbox from "./remoteForDropbox";
import * as onedrive from "./remoteForOnedrive";
import * as s3 from "./remoteForS3";
import * as webdav from "./remoteForWebdav";
import { log } from "./moreOnLog";
import { Cipher } from "./encryptUnified";
import { Profiler } from "./profiler";
export class RemoteClient {
readonly serviceType: SUPPORTED_SERVICES_TYPE;
@ -30,7 +32,8 @@ export class RemoteClient {
dropboxConfig?: DropboxConfig,
onedriveConfig?: OnedriveConfig,
vaultName?: string,
saveUpdatedConfigFunc?: () => Promise<any>,
profiler?: Profiler
) {
this.serviceType = serviceType;
// the client may modify the config inplace,
@ -105,13 +108,13 @@ export class RemoteClient {
uploadToRemote = async (
fileOrFolderPath: string,
vault: Vault | undefined,
isRecursively: boolean,
cipher: Cipher,
remoteEncryptedKey: string = "",
foldersCreatedBefore: Set<string> | undefined = undefined,
uploadRaw: boolean = false,
rawContent: string | ArrayBuffer = ""
): Promise<UploadedType> => {
if (this.serviceType === "s3") {
return await s3.uploadToRemote(
s3.getS3Client(this.s3Config!),
@ -119,7 +122,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
isRecursively,
cipher,
remoteEncryptedKey,
uploadRaw,
rawContent
@ -130,7 +133,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
isRecursively,
cipher,
remoteEncryptedKey,
uploadRaw,
rawContent
@ -141,7 +144,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
isRecursively,
cipher,
remoteEncryptedKey,
foldersCreatedBefore,
uploadRaw,
@ -153,7 +156,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
isRecursively,
cipher,
remoteEncryptedKey,
foldersCreatedBefore,
uploadRaw,
@ -164,7 +167,7 @@ export class RemoteClient {
}
};
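// The same serviceType dispatch repeats for every operation below; a Cipher
// instance now replaces the raw password string, so each backend delegates
// any encryption or decryption work to it.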
listAllFromRemote = async (): Promise<Entity[]> => {
if (this.serviceType === "s3") {
return await s3.listAllFromRemote(
s3.getS3Client(this.s3Config!),
@ -185,7 +188,7 @@ export class RemoteClient {
fileOrFolderPath: string,
vault: Vault,
mtime: number,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = "",
skipSaving: boolean = false
) => {
@ -196,7 +199,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
mtime,
cipher,
remoteEncryptedKey,
skipSaving
);
@ -206,7 +209,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
mtime,
cipher,
remoteEncryptedKey,
skipSaving
);
@ -216,7 +219,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
mtime,
cipher,
remoteEncryptedKey,
skipSaving
);
@ -226,7 +229,7 @@ export class RemoteClient {
fileOrFolderPath,
vault,
mtime,
cipher,
remoteEncryptedKey,
skipSaving
);
@ -237,36 +240,38 @@ export class RemoteClient {
deleteFromRemote = async (
fileOrFolderPath: string,
password: string = "",
remoteEncryptedKey: string = ""
cipher: Cipher,
remoteEncryptedKey: string = "",
synthesizedFolder: boolean = false
) => {
if (this.serviceType === "s3") {
return await s3.deleteFromRemote(
s3.getS3Client(this.s3Config!),
this.s3Config!,
fileOrFolderPath,
cipher,
remoteEncryptedKey,
synthesizedFolder
);
} else if (this.serviceType === "webdav") {
return await webdav.deleteFromRemote(
this.webdavClient!,
fileOrFolderPath,
cipher,
remoteEncryptedKey
);
} else if (this.serviceType === "dropbox") {
return await dropbox.deleteFromRemote(
this.dropboxClient!,
fileOrFolderPath,
cipher,
remoteEncryptedKey
);
} else if (this.serviceType === "onedrive") {
return await onedrive.deleteFromRemote(
this.onedriveClient!,
fileOrFolderPath,
cipher,
remoteEncryptedKey
);
} else {

View File

@ -1,27 +1,28 @@
import { rangeDelay } from "delay";
import { Dropbox, DropboxAuth } from "dropbox";
import type { files, DropboxResponseError, DropboxResponse } from "dropbox";
import { Vault } from "obsidian";
import * as path from "path";
import {
DropboxConfig,
RemoteItem,
Entity,
COMMAND_CALLBACK_DROPBOX,
OAUTH2_FORCE_EXPIRE_MILLISECONDS,
UploadedType,
} from "./baseTypes";
import { decryptArrayBuffer, encryptArrayBuffer } from "./encrypt";
import {
bufferToArrayBuffer,
delay,
fixEntityListCasesInplace,
getFolderLevels,
hasEmojiInText,
headersToRecord,
mkdirpInVault,
} from "./misc";
import { Cipher } from "./encryptUnified";
import { random } from "lodash";
export { Dropbox } from "dropbox";
import { log } from "./moreOnLog";
export const DEFAULT_DROPBOX_CONFIG: DropboxConfig = {
accessToken: "",
clientID: process.env.DEFAULT_DROPBOX_APP_KEY ?? "",
@ -42,7 +43,7 @@ export const getDropboxPath = (
// special
key = `/${remoteBaseDir}`;
} else if (fileOrFolderPath.startsWith("/")) {
log.warn(
console.warn(
`why does the path ${fileOrFolderPath} start with '/'? we just go on.`
);
key = `/${remoteBaseDir}${fileOrFolderPath}`;
@ -69,13 +70,13 @@ const getNormPath = (fileOrFolderPath: string, remoteBaseDir: string) => {
return fileOrFolderPath.slice(`/${remoteBaseDir}/`.length);
};
const fromDropboxItemToRemoteItem = (
const fromDropboxItemToEntity = (
x:
| files.FileMetadataReference
| files.FolderMetadataReference
| files.DeletedMetadataReference,
remoteBaseDir: string
): RemoteItem => {
): Entity => {
let key = getNormPath(x.path_display!, remoteBaseDir);
if (x[".tag"] === "folder" && !key.endsWith("/")) {
key = `${key}/`;
@ -83,94 +84,27 @@ const fromDropboxItemToRemoteItem = (
if (x[".tag"] === "folder") {
return {
key: key,
lastModified: undefined,
size: 0,
remoteType: "dropbox",
keyRaw: key,
sizeRaw: 0,
etag: `${x.id}\t`,
} as RemoteItem;
} as Entity;
} else if (x[".tag"] === "file") {
let mtime = Date.parse(x.client_modified).valueOf();
if (mtime === 0) {
mtime = Date.parse(x.server_modified).valueOf();
}
const mtimeCli = Date.parse(x.client_modified).valueOf();
const mtimeSvr = Date.parse(x.server_modified).valueOf();
return {
key: key,
lastModified: mtime,
size: x.size,
remoteType: "dropbox",
keyRaw: key,
mtimeCli: mtimeCli,
mtimeSvr: mtimeSvr,
sizeRaw: x.size,
hash: x.content_hash,
etag: `${x.id}\t${x.content_hash}`,
} as RemoteItem;
} as Entity;
} else {
// x[".tag"] === "deleted"
throw Error("do not support deleted tag");
}
};
/**
* The Dropbox api doesn't return mtime for folders.
* This tries to assign each folder an mtime derived from the files inside it.
* @param allFilesFolders
* @returns
*/
const fixLastModifiedTimeInplace = (allFilesFolders: RemoteItem[]) => {
if (allFilesFolders.length === 0) {
return;
}
// sort by longer to shorter
allFilesFolders.sort((a, b) => b.key.length - a.key.length);
// a "map" from dir to mtime
let potentialMTime = {} as Record<string, number>;
// first pass, from bottom to top
for (const item of allFilesFolders) {
if (item.key.endsWith("/")) {
// itself is a folder, and initially doesn't have mtime
if (item.lastModified === undefined && item.key in potentialMTime) {
// previously we gathered all sub info of this folder
item.lastModified = potentialMTime[item.key];
}
}
const parent = `${path.posix.dirname(item.key)}/`;
if (item.lastModified !== undefined) {
if (parent in potentialMTime) {
potentialMTime[parent] = Math.max(
potentialMTime[parent],
item.lastModified
);
} else {
potentialMTime[parent] = item.lastModified;
}
}
}
// second pass, from top to bottom.
// fill mtime from the parent folder, or Date.now() if still not available.
// the latter only happens if a folder has no files anywhere beneath it.
// we do not sort the array again, just iterate over it in reverse
// using a good old for loop.
for (let i = allFilesFolders.length - 1; i >= 0; --i) {
const item = allFilesFolders[i];
if (!item.key.endsWith("/")) {
continue; // skip files
}
if (item.lastModified !== undefined) {
continue; // don't need to deal with it
}
const parent = `${path.posix.dirname(item.key)}/`;
if (parent in potentialMTime) {
item.lastModified = potentialMTime[parent];
} else {
item.lastModified = Date.now().valueOf();
potentialMTime[item.key] = item.lastModified;
}
}
return allFilesFolders;
};
////////////////////////////////////////////////////////////////////////////////
// Dropbox authorization using PKCE
// see https://dropbox.tech/developers/pkce--what-and-why-
@ -235,7 +169,7 @@ export const sendAuthReq = async (
const resp2 = (await resp1.json()) as DropboxSuccessAuthRes;
return resp2;
} catch (e) {
log.error(e);
console.error(e);
if (errorCallBack !== undefined) {
await errorCallBack(e);
}
@ -247,7 +181,7 @@ export const sendRefreshTokenReq = async (
refreshToken: string
) => {
try {
log.info("start auto getting refreshed Dropbox access token.");
console.info("start auto getting refreshed Dropbox access token.");
const resp1 = await fetch("https://api.dropboxapi.com/oauth2/token", {
method: "POST",
body: new URLSearchParams({
@ -257,10 +191,10 @@ export const sendRefreshTokenReq = async (
}),
});
const resp2 = (await resp1.json()) as DropboxSuccessAuthRes;
log.info("finish auto getting refreshed Dropbox access token.");
console.info("finish auto getting refreshed Dropbox access token.");
return resp2;
} catch (e) {
log.error(e);
console.error(e);
throw e;
}
};
@ -270,7 +204,7 @@ export const setConfigBySuccessfullAuthInplace = async (
authRes: DropboxSuccessAuthRes,
saveUpdatedConfigFunc: () => Promise<any> | undefined
) => {
log.info("start updating local info of Dropbox token");
console.info("start updating local info of Dropbox token");
config.accessToken = authRes.access_token;
config.accessTokenExpiresInSeconds = parseInt(authRes.expires_in);
@ -290,7 +224,7 @@ export const setConfigBySuccessfullAuthInplace = async (
await saveUpdatedConfigFunc();
}
log.info("finish updating local info of Dropbox token");
console.info("finish updating local info of Dropbox token");
};
////////////////////////////////////////////////////////////////////////////////
@ -311,7 +245,7 @@ async function retryReq<T>(
for (let idx = 0; idx < waitSeconds.length; ++idx) {
try {
if (idx !== 0) {
log.warn(
console.warn(
`${extraHint === "" ? "" : extraHint + ": "}The ${
idx + 1
}-th try starts at time ${Date.now()}`
@ -348,7 +282,7 @@ async function retryReq<T>(
const fallbackSec = waitSeconds[idx];
const secMin = Math.max(svrSec, fallbackSec);
const secMax = Math.max(secMin * 1.8, 2);
log.warn(
console.warn(
`${
extraHint === "" ? "" : extraHint + ": "
}We have "429 too many requests" error of ${
@ -359,7 +293,7 @@ async function retryReq<T>(
2
)}`
);
await rangeDelay(secMin * 1000, secMax * 1000);
await delay(random(secMin * 1000, secMax * 1000));
}
}
}
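The 429 branch above swaps `rangeDelay` from the `delay` package for lodash's `random` plus the local `delay` helper from `./misc`. A self-contained sketch of the resulting jittered backoff, mirroring the window computation in this hunk (the function name is made up for illustration):

```ts
import { random } from "lodash";

// stand-in for the promise-based sleep imported from "./misc"
const delay = (ms: number) => new Promise<void>((res) => setTimeout(res, ms));

// honor the server's Retry-After hint (svrSec) but never wait less than the
// scheduled fallback, then sleep a uniformly random time inside that window
async function backoffOn429(svrSec: number, fallbackSec: number): Promise<void> {
  const secMin = Math.max(svrSec, fallbackSec);
  const secMax = Math.max(secMin * 1.8, 2);
  await delay(random(secMin * 1000, secMax * 1000));
}
```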
@ -421,9 +355,9 @@ export class WrappedDropboxClient {
}
// check vault folder
// log.info(`checking remote has folder /${this.remoteBaseDir}`);
// console.info(`checking remote has folder /${this.remoteBaseDir}`);
if (this.vaultFolderExists) {
// log.info(`already checked, /${this.remoteBaseDir} exist before`)
// console.info(`already checked, /${this.remoteBaseDir} exist before`)
} else {
const res = await this.dropbox.filesListFolder({
path: "",
@ -436,7 +370,7 @@ export class WrappedDropboxClient {
}
}
if (!this.vaultFolderExists) {
log.info(`remote does not have folder /${this.remoteBaseDir}`);
console.info(`remote does not have folder /${this.remoteBaseDir}`);
if (hasEmojiInText(`/${this.remoteBaseDir}`)) {
throw new Error(
@ -447,10 +381,10 @@ export class WrappedDropboxClient {
await this.dropbox.filesCreateFolderV2({
path: `/${this.remoteBaseDir}`,
});
log.info(`remote folder /${this.remoteBaseDir} created`);
console.info(`remote folder /${this.remoteBaseDir} created`);
this.vaultFolderExists = true;
} else {
// log.info(`remote folder /${this.remoteBaseDir} exists`);
// console.info(`remote folder /${this.remoteBaseDir} exists`);
}
}
@ -498,7 +432,7 @@ export const getRemoteMeta = async (
// size: 0,
// remoteType: "dropbox",
// etag: undefined,
// } as RemoteItem;
// } as Entity;
// }
const rsp = await retryReq(() =>
@ -512,26 +446,31 @@ export const getRemoteMeta = async (
if (rsp.status !== 200) {
throw Error(JSON.stringify(rsp));
}
return fromDropboxItemToRemoteItem(rsp.result, client.remoteBaseDir);
return fromDropboxItemToEntity(rsp.result, client.remoteBaseDir);
};
export const uploadToRemote = async (
client: WrappedDropboxClient,
fileOrFolderPath: string,
vault: Vault | undefined,
isRecursively: boolean = false,
password: string = "",
isRecursively: boolean,
cipher: Cipher,
remoteEncryptedKey: string = "",
foldersCreatedBefore: Set<string> | undefined = undefined,
uploadRaw: boolean = false,
rawContent: string | ArrayBuffer = "",
rawContentMTime: number = 0,
rawContentCTime: number = 0
) => {
): Promise<UploadedType> => {
await client.init();
let uploadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
if (remoteEncryptedKey === undefined || remoteEncryptedKey === "") {
throw Error(
`uploadToRemote(dropbox): you have a password but remoteEncryptedKey is empty!`
);
}
uploadFile = remoteEncryptedKey;
}
uploadFile = getDropboxPath(uploadFile, client.remoteBaseDir);
@ -546,8 +485,8 @@ export const uploadToRemote = async (
let ctime = 0;
const s = await vault?.adapter?.stat(fileOrFolderPath);
if (s !== undefined && s !== null) {
mtime = Math.round(s.mtime / 1000.0) * 1000;
ctime = Math.round(s.ctime / 1000.0) * 1000;
mtime = Math.floor(s.mtime / 1000.0) * 1000;
ctime = Math.floor(s.ctime / 1000.0) * 1000;
}
const mtimeStr = new Date(mtime).toISOString().replace(/\.\d{3}Z$/, "Z");
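A worked example of the change just above: `Math.floor` snaps the stat times to whole seconds, and the regex strips the `.000` milliseconds field, since Dropbox's `client_modified` expects second precision without a milliseconds component. The timestamp is illustrative only:

```ts
const raw = 1711300000123; // an adapter stat mtime with millisecond precision
const mtime = Math.floor(raw / 1000.0) * 1000; // 1711300000000
new Date(mtime).toISOString(); // "2024-03-24T17:06:40.000Z"
new Date(mtime).toISOString().replace(/\.\d{3}Z$/, "Z"); // "2024-03-24T17:06:40Z"
```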
@ -560,8 +499,8 @@ export const uploadToRemote = async (
throw Error(`you specify uploadRaw, but you also provide a folder key!`);
}
// folder
if (password === "") {
// if not encrypted, mkdir a remote folder
if (cipher.isPasswordEmpty() || cipher.isFolderAware()) {
// if not encrypted, or encrypted but folder-aware, mkdir a remote folder
if (foldersCreatedBefore?.has(uploadFile)) {
// created, pass
} else {
@ -588,9 +527,13 @@ export const uploadToRemote = async (
}
}
const res = await getRemoteMeta(client, uploadFile);
return res;
return {
entity: res,
mtimeCli: mtime,
};
} else {
// if encrypted, upload a fake file with the encrypted file name
// if encrypted && !isFolderAware(),
// upload a fake file with the encrypted file name
await retryReq(
() =>
client.dropbox.filesUpload({
@ -600,7 +543,10 @@ export const uploadToRemote = async (
}),
fileOrFolderPath
);
return await getRemoteMeta(client, uploadFile);
return {
entity: await getRemoteMeta(client, uploadFile),
mtimeCli: mtime,
};
}
} else {
// file
@ -621,8 +567,8 @@ export const uploadToRemote = async (
localContent = await vault.adapter.readBinary(fileOrFolderPath);
}
let remoteContent = localContent;
if (password !== "") {
remoteContent = await encryptArrayBuffer(localContent, password);
if (!cipher.isPasswordEmpty()) {
remoteContent = await cipher.encryptContent(localContent);
}
// in dropbox, we don't need to create folders before uploading! cool!
// TODO: filesUploadSession for larger files (>=150 MB)
@ -649,7 +595,10 @@ export const uploadToRemote = async (
foldersCreatedBefore?.add(dir);
}
}
return await getRemoteMeta(client, uploadFile);
return {
entity: await getRemoteMeta(client, uploadFile),
mtimeCli: mtime,
};
}
};
@ -664,13 +613,13 @@ export const listAllFromRemote = async (client: WrappedDropboxClient) => {
if (res.status !== 200) {
throw Error(JSON.stringify(res));
}
// log.info(res);
// console.info(res);
const contents = res.result.entries;
const unifiedContents = contents
.filter((x) => x[".tag"] !== "deleted")
.filter((x) => x.path_display !== `/${client.remoteBaseDir}`)
.map((x) => fromDropboxItemToRemoteItem(x, client.remoteBaseDir));
.map((x) => fromDropboxItemToEntity(x, client.remoteBaseDir));
while (res.result.has_more) {
res = await client.dropbox.filesListFolderContinue({
@ -684,15 +633,13 @@ export const listAllFromRemote = async (client: WrappedDropboxClient) => {
const unifiedContents2 = contents2
.filter((x) => x[".tag"] !== "deleted")
.filter((x) => x.path_display !== `/${client.remoteBaseDir}`)
.map((x) => fromDropboxItemToRemoteItem(x, client.remoteBaseDir));
.map((x) => fromDropboxItemToEntity(x, client.remoteBaseDir));
unifiedContents.push(...unifiedContents2);
}
fixLastModifiedTimeInplace(unifiedContents);
fixEntityListCasesInplace(unifiedContents);
return {
Contents: unifiedContents,
};
return unifiedContents;
};
const downloadFromRemoteRaw = async (
@ -728,7 +675,7 @@ export const downloadFromRemote = async (
fileOrFolderPath: string,
vault: Vault,
mtime: number,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = "",
skipSaving: boolean = false
) => {
@ -749,14 +696,14 @@ export const downloadFromRemote = async (
return new ArrayBuffer(0);
} else {
let downloadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
downloadFile = remoteEncryptedKey;
}
downloadFile = getDropboxPath(downloadFile, client.remoteBaseDir);
const remoteContent = await downloadFromRemoteRaw(client, downloadFile);
let localContent = remoteContent;
if (password !== "") {
localContent = await decryptArrayBuffer(remoteContent, password);
if (!cipher.isPasswordEmpty()) {
localContent = await cipher.decryptContent(remoteContent);
}
if (!skipSaving) {
await vault.adapter.writeBinary(fileOrFolderPath, localContent, {
@ -770,14 +717,14 @@ export const downloadFromRemote = async (
export const deleteFromRemote = async (
client: WrappedDropboxClient,
fileOrFolderPath: string,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = ""
) => {
if (fileOrFolderPath === "/") {
return;
}
let remoteFileName = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
remoteFileName = remoteEncryptedKey;
}
remoteFileName = getDropboxPath(remoteFileName, client.remoteBaseDir);
@ -792,8 +739,8 @@ export const deleteFromRemote = async (
fileOrFolderPath
);
} catch (err) {
log.error("some error while deleting");
log.error(err);
console.error("some error while deleting");
console.error(err);
}
};
@ -809,7 +756,7 @@ export const checkConnectivity = async (
}
return true;
} catch (err) {
log.debug(err);
console.debug(err);
if (callbackFunc !== undefined) {
callbackFunc(err);
}

View File

@ -14,17 +14,16 @@ import {
DEFAULT_CONTENT_TYPE,
OAUTH2_FORCE_EXPIRE_MILLISECONDS,
OnedriveConfig,
RemoteItem,
Entity,
UploadedType,
} from "./baseTypes";
import { decryptArrayBuffer, encryptArrayBuffer } from "./encrypt";
import {
bufferToArrayBuffer,
getRandomArrayBuffer,
getRandomIntInclusive,
mkdirpInVault,
} from "./misc";
import { log } from "./moreOnLog";
import { Cipher } from "./encryptUnified";
const SCOPES = ["User.Read", "Files.ReadWrite.AppFolder", "offline_access"];
const REDIRECT_URI = `obsidian://${COMMAND_CALLBACK_ONEDRIVE}`;
@ -116,8 +115,8 @@ export const sendAuthReq = async (
// code: authCode,
// codeVerifier: verifier, // PKCE Code Verifier
// });
// log.info('authResponse')
// log.info(authResponse)
// console.info('authResponse')
// console.info(authResponse)
// return authResponse;
// Because of the CORS problem,
@ -142,7 +141,7 @@ export const sendAuthReq = async (
});
const rsp2 = JSON.parse(rsp1);
// log.info(rsp2);
// console.info(rsp2);
if (rsp2.error !== undefined) {
return rsp2 as AccessCodeResponseFailedType;
@ -150,7 +149,7 @@ export const sendAuthReq = async (
return rsp2 as AccessCodeResponseSuccessfulType;
}
} catch (e) {
log.error(e);
console.error(e);
await errorCallBack(e);
}
};
@ -176,7 +175,7 @@ export const sendRefreshTokenReq = async (
});
const rsp2 = JSON.parse(rsp1);
// log.info(rsp2);
// console.info(rsp2);
if (rsp2.error !== undefined) {
return rsp2 as AccessCodeResponseFailedType;
@ -184,7 +183,7 @@ export const sendRefreshTokenReq = async (
return rsp2 as AccessCodeResponseSuccessfulType;
}
} catch (e) {
log.error(e);
console.error(e);
throw e;
}
};
@ -194,7 +193,7 @@ export const setConfigBySuccessfullAuthInplace = async (
authRes: AccessCodeResponseSuccessfulType,
saveUpdatedConfigFunc: () => Promise<any> | undefined
) => {
log.info("start updating local info of OneDrive token");
console.info("start updating local info of OneDrive token");
config.accessToken = authRes.access_token;
config.accessTokenExpiresAtTime =
Date.now() + authRes.expires_in - 5 * 60 * 1000;
@ -209,7 +208,7 @@ export const setConfigBySuccessfullAuthInplace = async (
await saveUpdatedConfigFunc();
}
log.info("finish updating local info of Onedrive token");
console.info("finish updating local info of Onedrive token");
};
////////////////////////////////////////////////////////////////////////////////
@ -230,7 +229,7 @@ const getOnedrivePath = (fileOrFolderPath: string, remoteBaseDir: string) => {
}
if (key.startsWith("/")) {
log.warn(`why the path ${key} starts with '/'? but we just go on.`);
console.warn(`why does the path ${key} start with '/'? we just go on.`);
key = `${prefix}${key}`;
} else {
key = `${prefix}/${key}`;
@ -255,22 +254,23 @@ const getNormPath = (fileOrFolderPath: string, remoteBaseDir: string) => {
return fileOrFolderPath.slice(`${prefix}/`.length);
};
const constructFromDriveItemToRemoteItemError = (x: DriveItem) => {
const constructFromDriveItemToEntityError = (x: DriveItem) => {
return `parentPath="${
x.parentReference?.path ?? "(no parentReference or path)"
}", selfName="${x.name}"`;
};
const fromDriveItemToRemoteItem = (
x: DriveItem,
remoteBaseDir: string
): RemoteItem => {
const fromDriveItemToEntity = (x: DriveItem, remoteBaseDir: string): Entity => {
let key = "";
// possible prefix:
// pure english: /drive/root:/Apps/remotely-save/${remoteBaseDir}
// or localized, e.g.: /drive/root:/应用/remotely-save/${remoteBaseDir}
const FIRST_COMMON_PREFIX_REGEX = /^\/drive\/root:\/[^\/]+\/remotely-save\//g;
// why?? /drive/root:/Apps/Graph
const FIFTH_COMMON_PREFIX_REGEX = /^\/drive\/root:\/[^\/]+\/Graph\//g;
// or the root is absolute path /Livefolders,
// e.g.: /Livefolders/应用/remotely-save/${remoteBaseDir}
const SECOND_COMMON_PREFIX_REGEX = /^\/Livefolders\/[^\/]+\/remotely-save\//g;
@ -293,6 +293,7 @@ const fromDriveItemToRemoteItem = (
}
const fullPathOriginal = `${x.parentReference.path}/${x.name}`;
const matchFirstPrefixRes = fullPathOriginal.match(FIRST_COMMON_PREFIX_REGEX);
const matchFifthPrefixRes = fullPathOriginal.match(FIFTH_COMMON_PREFIX_REGEX);
const matchSecondPrefixRes = fullPathOriginal.match(
SECOND_COMMON_PREFIX_REGEX
);
@ -303,6 +304,12 @@ const fromDriveItemToRemoteItem = (
) {
const foundPrefix = `${matchFirstPrefixRes[0]}${remoteBaseDir}`;
key = fullPathOriginal.substring(foundPrefix.length + 1);
} else if (
matchFifthPrefixRes !== null &&
fullPathOriginal.startsWith(`${matchFifthPrefixRes[0]}${remoteBaseDir}`)
) {
const foundPrefix = `${matchFifthPrefixRes[0]}${remoteBaseDir}`;
key = fullPathOriginal.substring(foundPrefix.length + 1);
} else if (
matchSecondPrefixRes !== null &&
fullPathOriginal.startsWith(`${matchSecondPrefixRes[0]}${remoteBaseDir}`)
@ -333,14 +340,14 @@ const fromDriveItemToRemoteItem = (
key = x.name;
} else {
throw Error(
`we meet file/folder and do not know how to deal with it:\n${constructFromDriveItemToRemoteItemError(
`we meet file/folder and do not know how to deal with it:\n${constructFromDriveItemToEntityError(
x
)}`
);
}
} else {
throw Error(
`we meet file/folder and do not know how to deal with it:\n${constructFromDriveItemToRemoteItemError(
`we meet file/folder and do not know how to deal with it:\n${constructFromDriveItemToEntityError(
x
)}`
);
@ -350,11 +357,15 @@ const fromDriveItemToRemoteItem = (
if (isFolder) {
key = `${key}/`;
}
const mtimeSvr = Date.parse(x?.fileSystemInfo!.lastModifiedDateTime!);
const mtimeCli = Date.parse(x?.fileSystemInfo!.lastModifiedDateTime!);
return {
key: key,
lastModified: Date.parse(x!.fileSystemInfo!.lastModifiedDateTime!),
size: isFolder ? 0 : x.size!,
remoteType: "onedrive",
keyRaw: key,
mtimeSvr: mtimeSvr,
mtimeCli: mtimeCli,
sizeRaw: isFolder ? 0 : x.size!,
// hash: ?? // TODO
etag: x.cTag || "", // do NOT use x.eTag because it changes if meta changes
};
};
@ -401,12 +412,25 @@ class MyAuthProvider implements AuthenticationProvider {
this.onedriveConfig.accessTokenExpiresAtTime =
currentTs + r2.expires_in * 1000 - 60 * 2 * 1000;
await this.saveUpdatedConfigFunc();
log.info("Onedrive accessToken updated");
console.info("Onedrive accessToken updated");
return this.onedriveConfig.accessToken;
}
};
}
/**
* to export the settings as a qrcode,
* we want to "trim" or "shrink" the settings
* @param onedriveConfig
*/
export const getShrinkedSettings = (onedriveConfig: OnedriveConfig) => {
const config = cloneDeep(onedriveConfig);
config.accessToken = "x";
config.accessTokenExpiresInSeconds = 1;
config.accessTokenExpiresAtTime = 1;
return config;
};
export class WrappedOnedriveClient {
onedriveConfig: OnedriveConfig;
remoteBaseDir: string;
@ -435,26 +459,26 @@ export class WrappedOnedriveClient {
}
// check vault folder
// log.info(`checking remote has folder /${this.remoteBaseDir}`);
// console.info(`checking remote has folder /${this.remoteBaseDir}`);
if (this.vaultFolderExists) {
// log.info(`already checked, /${this.remoteBaseDir} exist before`)
// console.info(`already checked, /${this.remoteBaseDir} exist before`)
} else {
const k = await this.getJson("/drive/special/approot/children");
// log.debug(k);
// console.debug(k);
this.vaultFolderExists =
(k.value as DriveItem[]).filter((x) => x.name === this.remoteBaseDir)
.length > 0;
if (!this.vaultFolderExists) {
log.info(`remote does not have folder /${this.remoteBaseDir}`);
console.info(`remote does not have folder /${this.remoteBaseDir}`);
await this.postJson("/drive/special/approot/children", {
name: `${this.remoteBaseDir}`,
folder: {},
"@microsoft.graph.conflictBehavior": "replace",
});
log.info(`remote folder /${this.remoteBaseDir} created`);
console.info(`remote folder /${this.remoteBaseDir} created`);
this.vaultFolderExists = true;
} else {
// log.info(`remote folder /${this.remoteBaseDir} exists`);
// console.info(`remote folder /${this.remoteBaseDir} exists`);
}
}
};
@ -471,12 +495,17 @@ export class WrappedOnedriveClient {
const pathFrag = encodeURI(pathFragOrig);
theUrl = `${API_PREFIX}${pathFrag}`;
}
// we want to support file names containing the hash # character.
// none of the urls we construct here legitimately contain the # symbol,
// thus it should be safe to directly replace the character
theUrl = theUrl.replace(/#/g, "%23");
// console.debug(`building url: [${pathFragOrig}] => [${theUrl}]`)
return theUrl;
};
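To see why the replacement is needed: `encodeURI` leaves `#` untouched because it is a legal fragment delimiter, so a note named with a hash would otherwise truncate the Graph path. The values are illustrative:

```ts
encodeURI("/drive/special/approot:/vault/note #1.md:/content");
// => "/drive/special/approot:/vault/note%20#1.md:/content"   ('#' survives)
"/drive/special/approot:/vault/note%20#1.md:/content".replace(/#/g, "%23");
// => "/drive/special/approot:/vault/note%20%231.md:/content"
```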
getJson = async (pathFragOrig: string) => {
const theUrl = this.buildUrl(pathFragOrig);
log.debug(`getJson, theUrl=${theUrl}`);
console.debug(`getJson, theUrl=${theUrl}`);
return JSON.parse(
await request({
url: theUrl,
@ -492,7 +521,7 @@ export class WrappedOnedriveClient {
postJson = async (pathFragOrig: string, payload: any) => {
const theUrl = this.buildUrl(pathFragOrig);
log.debug(`postJson, theUrl=${theUrl}`);
console.debug(`postJson, theUrl=${theUrl}`);
return JSON.parse(
await request({
url: theUrl,
@ -508,7 +537,7 @@ export class WrappedOnedriveClient {
patchJson = async (pathFragOrig: string, payload: any) => {
const theUrl = this.buildUrl(pathFragOrig);
log.debug(`patchJson, theUrl=${theUrl}`);
console.debug(`patchJson, theUrl=${theUrl}`);
return JSON.parse(
await request({
url: theUrl,
@ -524,7 +553,7 @@ export class WrappedOnedriveClient {
deleteJson = async (pathFragOrig: string) => {
const theUrl = this.buildUrl(pathFragOrig);
log.debug(`deleteJson, theUrl=${theUrl}`);
console.debug(`deleteJson, theUrl=${theUrl}`);
if (VALID_REQURL) {
await requestUrl({
url: theUrl,
@ -545,12 +574,12 @@ export class WrappedOnedriveClient {
putArrayBuffer = async (pathFragOrig: string, payload: ArrayBuffer) => {
const theUrl = this.buildUrl(pathFragOrig);
log.debug(`putArrayBuffer, theUrl=${theUrl}`);
console.debug(`putArrayBuffer, theUrl=${theUrl}`);
// TODO:
// 20220401: On Android, requestUrl has an issue where text becomes base64.
// Use fetch everywhere instead!
if (false /*VALID_REQURL*/) {
await requestUrl({
const res = await requestUrl({
url: theUrl,
method: "PUT",
body: payload,
@ -560,8 +589,9 @@ export class WrappedOnedriveClient {
Authorization: `Bearer ${await this.authGetter.getAccessToken()}`,
},
});
return res.json as DriveItem | UploadSession;
} else {
await fetch(theUrl, {
const res = await fetch(theUrl, {
method: "PUT",
body: payload,
headers: {
@ -569,6 +599,7 @@ export class WrappedOnedriveClient {
Authorization: `Bearer ${await this.authGetter.getAccessToken()}`,
},
});
return (await res.json()) as DriveItem | UploadSession;
}
};
@ -588,7 +619,7 @@ export class WrappedOnedriveClient {
size: number
) => {
const theUrl = this.buildUrl(pathFragOrig);
log.debug(
console.debug(
`putUint8ArrayByRange, theUrl=${theUrl}, range=${rangeStart}-${
rangeEnd - 1
}, len=${rangeEnd - rangeStart}, size=${size}`
@ -653,7 +684,7 @@ export const listAllFromRemote = async (client: WrappedOnedriveClient) => {
`/drive/special/approot:/${client.remoteBaseDir}:/delta`
);
let driveItems = res.value as DriveItem[];
// log.debug(driveItems);
// console.debug(driveItems);
while (NEXT_LINK_KEY in res) {
res = await client.getJson(res[NEXT_LINK_KEY]);
@ -666,14 +697,12 @@ export const listAllFromRemote = async (client: WrappedOnedriveClient) => {
await client.saveUpdatedConfigFunc();
}
// unify everything to RemoteItem
// unify everything to Entity
const unifiedContents = driveItems
.map((x) => fromDriveItemToRemoteItem(x, client.remoteBaseDir))
.filter((x) => x.key !== "/");
.map((x) => fromDriveItemToEntity(x, client.remoteBaseDir))
.filter((x) => x.keyRaw !== "/");
return {
Contents: unifiedContents,
};
return unifiedContents;
};
export const getRemoteMeta = async (
@ -681,14 +710,14 @@ export const getRemoteMeta = async (
remotePath: string
) => {
await client.init();
// log.info(`remotePath=${remotePath}`);
// console.info(`remotePath=${remotePath}`);
const rsp = await client.getJson(
`${remotePath}?$select=cTag,eTag,fileSystemInfo,folder,file,name,parentReference,size`
);
// log.info(rsp);
// console.info(rsp);
const driveItem = rsp as DriveItem;
const res = fromDriveItemToRemoteItem(driveItem, client.remoteBaseDir);
// log.info(res);
const res = fromDriveItemToEntity(driveItem, client.remoteBaseDir);
// console.info(res);
return res;
};
@ -696,21 +725,26 @@ export const uploadToRemote = async (
client: WrappedOnedriveClient,
fileOrFolderPath: string,
vault: Vault | undefined,
isRecursively: boolean = false,
password: string = "",
isRecursively: boolean,
cipher: Cipher,
remoteEncryptedKey: string = "",
foldersCreatedBefore: Set<string> | undefined = undefined,
uploadRaw: boolean = false,
rawContent: string | ArrayBuffer = ""
) => {
): Promise<UploadedType> => {
await client.init();
let uploadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
if (remoteEncryptedKey === undefined || remoteEncryptedKey === "") {
throw Error(
`uploadToRemote(onedrive): you have a password but remoteEncryptedKey is empty!`
);
}
uploadFile = remoteEncryptedKey;
}
uploadFile = getOnedrivePath(uploadFile, client.remoteBaseDir);
log.debug(`uploadFile=${uploadFile}`);
console.debug(`uploadFile=${uploadFile}`);
let mtime = 0;
let ctime = 0;
@ -731,8 +765,8 @@ export const uploadToRemote = async (
throw Error(`you specify uploadRaw, but you also provide a folder key!`);
}
// folder
if (password === "") {
// if not encrypted, mkdir a remote folder
if (cipher.isPasswordEmpty() || cipher.isFolderAware()) {
// if not encrypted, or encrypted but folder-aware, mkdir a remote folder
if (foldersCreatedBefore?.has(uploadFile)) {
// created, pass
} else {
@ -755,18 +789,20 @@ export const uploadToRemote = async (
await client.patchJson(uploadFile, k);
}
const res = await getRemoteMeta(client, uploadFile);
return res;
return {
entity: res,
mtimeCli: mtime,
};
} else {
// if encrypted,
// if encrypted && !isFolderAware(),
// upload a fake, random-size file
// with the encrypted file name
const byteLengthRandom = getRandomIntInclusive(
1,
65536 /* max allowed */
);
const arrBufRandom = await encryptArrayBuffer(
getRandomArrayBuffer(byteLengthRandom),
password
const arrBufRandom = await cipher.encryptContent(
getRandomArrayBuffer(byteLengthRandom)
);
// an encrypted folder is always small, so we just use put here
@ -784,9 +820,12 @@ export const uploadToRemote = async (
} as FileSystemInfo,
});
}
// log.info(uploadResult)
// console.info(uploadResult)
const res = await getRemoteMeta(client, uploadFile);
return res;
return {
entity: res,
mtimeCli: mtime,
};
}
} else {
// file
@ -807,8 +846,8 @@ export const uploadToRemote = async (
localContent = await vault.adapter.readBinary(fileOrFolderPath);
}
let remoteContent = localContent;
if (password !== "") {
remoteContent = await encryptArrayBuffer(localContent, password);
if (!cipher.isPasswordEmpty()) {
remoteContent = await cipher.encryptContent(localContent);
}
// no need to create parent folders firstly, cool!
@ -863,8 +902,8 @@ export const uploadToRemote = async (
k
);
const uploadUrl = s.uploadUrl!;
log.debug("uploadSession = ");
log.debug(s);
console.debug("uploadSession = ");
console.debug(s);
// 2. upload by ranges
// convert to uint8
@ -885,7 +924,10 @@ export const uploadToRemote = async (
}
const res = await getRemoteMeta(client, uploadFile);
return res;
return {
entity: res,
mtimeCli: mtime,
};
}
};
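For the "upload by ranges" step above, a minimal sketch of the chunking loop, assuming the `putUint8ArrayByRange(pathFragOrig, payload, rangeStart, rangeEnd, size)` method shown earlier (passed in as a callback to keep the sketch self-contained). Graph upload sessions require each non-final chunk to be a multiple of 320 KiB, and 5 MiB is 16 such units:

```ts
type PutRange = (
  url: string,
  part: Uint8Array,
  rangeStart: number,
  rangeEnd: number,
  size: number
) => Promise<unknown>;

const CHUNK = 327680 * 16; // 5 MiB = 16 × 320 KiB, Graph's chunk granularity

// walk aligned ranges over the (possibly encrypted) payload
async function uploadByRanges(uploadUrl: string, data: Uint8Array, put: PutRange) {
  for (let start = 0; start < data.byteLength; start += CHUNK) {
    const end = Math.min(start + CHUNK, data.byteLength);
    await put(uploadUrl, data.subarray(start, end), start, end, data.byteLength);
  }
}
```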
@ -918,7 +960,7 @@ export const downloadFromRemote = async (
fileOrFolderPath: string,
vault: Vault,
mtime: number,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = "",
skipSaving: boolean = false
) => {
@ -936,14 +978,14 @@ export const downloadFromRemote = async (
return new ArrayBuffer(0);
} else {
let downloadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
downloadFile = remoteEncryptedKey;
}
downloadFile = getOnedrivePath(downloadFile, client.remoteBaseDir);
const remoteContent = await downloadFromRemoteRaw(client, downloadFile);
let localContent = remoteContent;
if (password !== "") {
localContent = await decryptArrayBuffer(remoteContent, password);
if (!cipher.isPasswordEmpty()) {
localContent = await cipher.decryptContent(remoteContent);
}
if (!skipSaving) {
await vault.adapter.writeBinary(fileOrFolderPath, localContent, {
@ -957,14 +999,14 @@ export const downloadFromRemote = async (
export const deleteFromRemote = async (
client: WrappedOnedriveClient,
fileOrFolderPath: string,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = ""
) => {
if (fileOrFolderPath === "/") {
return;
}
let remoteFileName = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
remoteFileName = remoteEncryptedKey;
}
remoteFileName = getOnedrivePath(remoteFileName, client.remoteBaseDir);
@ -981,7 +1023,7 @@ export const checkConnectivity = async (
const k = await getUserDisplayName(client);
return k !== "<unknown display name>";
} catch (err) {
log.debug(err);
console.debug(err);
if (callbackFunc !== undefined) {
callbackFunc(err);
}

View File

@ -22,17 +22,17 @@ import { buildQueryString } from "@smithy/querystring-builder";
import { HeaderBag, HttpHandlerOptions, Provider } from "@aws-sdk/types";
import { Buffer } from "buffer";
import * as mime from "mime-types";
import { Vault, requestUrl, RequestUrlParam } from "obsidian";
import { Vault, requestUrl, RequestUrlParam, Platform } from "obsidian";
import { Readable } from "stream";
import * as path from "path";
import AggregateError from "aggregate-error";
import {
DEFAULT_CONTENT_TYPE,
RemoteItem,
Entity,
S3Config,
UploadedType,
VALID_REQURL,
} from "./baseTypes";
import { decryptArrayBuffer, encryptArrayBuffer } from "./encrypt";
import {
arrayBufferToBuffer,
bufferToArrayBuffer,
@ -41,8 +41,8 @@ import {
export { S3Client } from "@aws-sdk/client-s3";
import { log } from "./moreOnLog";
import PQueue from "p-queue";
import { Cipher } from "./encryptUnified";
////////////////////////////////////////////////////////////////////////////////
// special handler using Obsidian requestUrl
@ -225,51 +225,82 @@ const getLocalNoPrefixPath = (
return fileOrFolderPathWithRemotePrefix.slice(`${remotePrefix}`.length);
};
const fromS3ObjectToRemoteItem = (
const fromS3ObjectToEntity = (
x: S3ObjectType,
remotePrefix: string,
mtimeRecords: Record<string, number>,
ctimeRecords: Record<string, number>
) => {
let mtime = x.LastModified!.valueOf();
// console.debug(`fromS3ObjectToEntity: ${x.Key!}, ${JSON.stringify(x,null,2)}`);
// S3 officially only supports seconds precision!!!!!
const mtimeSvr = Math.floor(x.LastModified!.valueOf() / 1000.0) * 1000;
let mtimeCli = mtimeSvr;
if (x.Key! in mtimeRecords) {
const m2 = mtimeRecords[x.Key!];
if (m2 !== 0) {
mtime = m2;
// to stay compatible with RClone, newer versions read and store the time in seconds!
if (m2 >= 1000000000000) {
// it's in milliseconds, uploaded by old versions of the plugin
mtimeCli = m2;
} else {
// it's in seconds, uploaded by plugin versions since March 24, 2024
mtimeCli = m2 * 1000;
}
}
}
const r: RemoteItem = {
key: getLocalNoPrefixPath(x.Key!, remotePrefix),
lastModified: mtime,
size: x.Size!,
remoteType: "s3",
const key = getLocalNoPrefixPath(x.Key!, remotePrefix);
const r: Entity = {
keyRaw: key,
mtimeSvr: mtimeSvr,
mtimeCli: mtimeCli,
sizeRaw: x.Size!,
etag: x.ETag,
synthesizedFolder: false,
};
return r;
};
const fromS3HeadObjectToRemoteItem = (
const fromS3HeadObjectToEntity = (
fileOrFolderPathWithRemotePrefix: string,
x: HeadObjectCommandOutput,
remotePrefix: string,
useAccurateMTime: boolean
remotePrefix: string
) => {
let mtime = x.LastModified!.valueOf();
if (useAccurateMTime && x.Metadata !== undefined) {
const m2 = Math.round(
// console.debug(`fromS3HeadObjectToEntity: ${fileOrFolderPathWithRemotePrefix}: ${JSON.stringify(x,null,2)}`);
// S3 officially only supports seconds precision!!!!!
const mtimeSvr = Math.floor(x.LastModified!.valueOf() / 1000.0) * 1000;
let mtimeCli = mtimeSvr;
if (x.Metadata !== undefined) {
const m2 = Math.floor(
parseFloat(x.Metadata.mtime || x.Metadata.MTime || "0")
);
if (m2 !== 0) {
mtime = m2;
// to stay compatible with RClone, newer versions read and store the time in seconds!
if (m2 >= 1000000000000) {
// it's in milliseconds, uploaded by old versions of the plugin
mtimeCli = m2;
} else {
// it's in seconds, uploaded by plugin versions since March 24, 2024
mtimeCli = m2 * 1000;
}
}
}
// console.debug(
// `fromS3HeadObjectToEntity, fileOrFolderPathWithRemotePrefix=${fileOrFolderPathWithRemotePrefix}, remotePrefix=${remotePrefix}, x=${JSON.stringify(
// x
// )} `
// );
const key = getLocalNoPrefixPath(
fileOrFolderPathWithRemotePrefix,
remotePrefix
);
// console.debug(`fromS3HeadObjectToEntity, key=${key} after removing prefix`);
return {
key: getLocalNoPrefixPath(fileOrFolderPathWithRemotePrefix, remotePrefix),
lastModified: mtime,
size: x.ContentLength,
remoteType: "s3",
keyRaw: key,
mtimeSvr: mtimeSvr,
mtimeCli: mtimeCli,
sizeRaw: x.ContentLength,
etag: x.ETag,
} as RemoteItem;
} as Entity;
};
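Both converters above disambiguate the recorded mtime purely by magnitude: any epoch-milliseconds value after September 2001 is at least 1e12, while realistic epoch-seconds values stay far below it. Factored out as a sketch (the helper name is hypothetical):

```ts
// hypothetical helper mirroring the heuristic used in both converters above
function normalizeRecordedMTime(recorded: number, fallbackMs: number): number {
  if (recorded === 0) {
    return fallbackMs; // nothing recorded, fall back to the server-side mtime
  }
  // >= 1e12: epoch milliseconds written by old plugin versions;
  // otherwise: epoch seconds written by new, RClone-compatible versions
  return recorded >= 1000000000000 ? recorded : recorded * 1000;
}
```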
export const getS3Client = (s3Config: S3Config) => {
@ -336,11 +367,10 @@ export const getRemoteMeta = async (
})
);
return fromS3HeadObjectToRemoteItem(
return fromS3HeadObjectToEntity(
fileOrFolderPathWithRemotePrefix,
res,
s3Config.remotePrefix ?? "",
s3Config.useAccurateMTime ?? false
s3Config.remotePrefix ?? ""
);
};
@ -349,19 +379,26 @@ export const uploadToRemote = async (
s3Config: S3Config,
fileOrFolderPath: string,
vault: Vault | undefined,
isRecursively: boolean = false,
password: string = "",
isRecursively: boolean,
cipher: Cipher,
remoteEncryptedKey: string = "",
uploadRaw: boolean = false,
rawContent: string | ArrayBuffer = "",
rawContentMTime: number = 0,
rawContentCTime: number = 0
) => {
): Promise<UploadedType> => {
console.debug(`uploading ${fileOrFolderPath}`);
let uploadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
if (remoteEncryptedKey === undefined || remoteEncryptedKey === "") {
throw Error(
`uploadToRemote(s3): you have a password but remoteEncryptedKey is empty!`
);
}
uploadFile = remoteEncryptedKey;
}
uploadFile = getRemoteWithPrefixPath(uploadFile, s3Config.remotePrefix ?? "");
// console.debug(`actual uploadFile=${uploadFile}`);
const isFolder = fileOrFolderPath.endsWith("/");
if (isFolder && isRecursively) {
@ -386,17 +423,21 @@ export const uploadToRemote = async (
Body: "",
ContentType: contentType,
Metadata: {
MTime: `${mtime}`,
CTime: `${ctime}`,
MTime: `${mtime / 1000.0}`,
CTime: `${ctime / 1000.0}`,
},
})
);
return await getRemoteMeta(s3Client, s3Config, uploadFile);
const res = await getRemoteMeta(s3Client, s3Config, uploadFile);
return {
entity: res,
mtimeCli: mtime,
};
} else {
// file
// we ignore isRecursively parameter here
let contentType = DEFAULT_CONTENT_TYPE;
if (password === "") {
if (cipher.isPasswordEmpty()) {
contentType =
mime.contentType(
mime.lookup(fileOrFolderPath) || DEFAULT_CONTENT_TYPE
@ -427,8 +468,8 @@ export const uploadToRemote = async (
}
}
let remoteContent = localContent;
if (password !== "") {
remoteContent = await encryptArrayBuffer(localContent, password);
if (!cipher.isPasswordEmpty()) {
remoteContent = await cipher.encryptContent(localContent);
}
const bytesIn5MB = 5242880;
@ -445,17 +486,24 @@ export const uploadToRemote = async (
Body: body,
ContentType: contentType,
Metadata: {
MTime: `${mtime}`,
CTime: `${ctime}`,
MTime: `${mtime / 1000.0}`,
CTime: `${ctime / 1000.0}`,
},
},
});
upload.on("httpUploadProgress", (progress) => {
// log.info(progress);
// console.info(progress);
});
await upload.done();
return await getRemoteMeta(s3Client, s3Config, uploadFile);
const res = await getRemoteMeta(s3Client, s3Config, uploadFile);
// console.debug(
// `uploaded ${uploadFile} with res=${JSON.stringify(res, null, 2)}`
// );
return {
entity: res,
mtimeCli: mtime,
};
}
};
@ -512,12 +560,12 @@ const listFromRemoteRaw = async (
if (rspHead.Metadata === undefined) {
// pass
} else {
mtimeRecords[content.Key!] = Math.round(
mtimeRecords[content.Key!] = Math.floor(
parseFloat(
rspHead.Metadata.mtime || rspHead.Metadata.MTime || "0"
)
);
ctimeRecords[content.Key!] = Math.round(
ctimeRecords[content.Key!] = Math.floor(
parseFloat(
rspHead.Metadata.ctime || rspHead.Metadata.CTime || "0"
)
@ -544,23 +592,24 @@ const listFromRemoteRaw = async (
// ensemble fake rsp
// in the end, we need to transform the response list
// back to the local contents-alike list
return {
Contents: contents.map((x) =>
fromS3ObjectToRemoteItem(
x,
s3Config.remotePrefix ?? "",
mtimeRecords,
ctimeRecords
)
),
};
return contents.map((x) =>
fromS3ObjectToEntity(
x,
s3Config.remotePrefix ?? "",
mtimeRecords,
ctimeRecords
)
);
};
export const listAllFromRemote = async (
s3Client: S3Client,
s3Config: S3Config
) => {
return await listFromRemoteRaw(s3Client, s3Config, s3Config.remotePrefix);
const res = (
await listFromRemoteRaw(s3Client, s3Config, s3Config.remotePrefix)
).filter((x) => x.keyRaw !== "" && x.keyRaw !== "/");
return res;
};
/**
@ -620,8 +669,8 @@ export const downloadFromRemote = async (
fileOrFolderPath: string,
vault: Vault,
mtime: number,
password: string = "",
remoteEncryptedKey: string = "",
cipher: Cipher,
remoteEncryptedKey: string,
skipSaving: boolean = false
) => {
const isFolder = fileOrFolderPath.endsWith("/");
@ -639,7 +688,7 @@ export const downloadFromRemote = async (
return new ArrayBuffer(0);
} else {
let downloadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
downloadFile = remoteEncryptedKey;
}
downloadFile = getRemoteWithPrefixPath(
@ -652,8 +701,8 @@ export const downloadFromRemote = async (
downloadFile
);
let localContent = remoteContent;
if (password !== "") {
localContent = await decryptArrayBuffer(remoteContent, password);
if (!cipher.isPasswordEmpty()) {
localContent = await cipher.decryptContent(remoteContent);
}
if (!skipSaving) {
await vault.adapter.writeBinary(fileOrFolderPath, localContent, {
@ -675,14 +724,18 @@ export const deleteFromRemote = async (
s3Client: S3Client,
s3Config: S3Config,
fileOrFolderPath: string,
password: string = "",
remoteEncryptedKey: string = ""
cipher: Cipher,
remoteEncryptedKey: string = "",
synthesizedFolder: boolean = false
) => {
if (fileOrFolderPath === "/") {
return;
}
if (synthesizedFolder) {
return;
}
let remoteFileName = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
remoteFileName = remoteEncryptedKey;
}
remoteFileName = getRemoteWithPrefixPath(
@ -696,9 +749,9 @@ export const deleteFromRemote = async (
})
);
if (fileOrFolderPath.endsWith("/") && password === "") {
if (fileOrFolderPath.endsWith("/") && cipher.isPasswordEmpty()) {
const x = await listFromRemoteRaw(s3Client, s3Config, remoteFileName);
x.Contents.forEach(async (element) => {
x.forEach(async (element) => {
await s3Client.send(
new DeleteObjectCommand({
Bucket: s3Config.s3BucketName,
@ -706,7 +759,7 @@ export const deleteFromRemote = async (
})
);
});
} else if (fileOrFolderPath.endsWith("/") && password !== "") {
} else if (fileOrFolderPath.endsWith("/") && !cipher.isPasswordEmpty()) {
// TODO
} else {
// pass
@ -731,6 +784,13 @@ export const checkConnectivity = async (
callbackFunc?: any
) => {
try {
// TODO: no universal way to detect this for now; just check it during connectivity
if (Platform.isIosApp && s3Config.s3Endpoint.startsWith("http://")) {
throw Error(
`Your s3 endpoint must be https, not http, because of an iOS restriction.`
);
}
// const results = await s3Client.send(
// new HeadBucketCommand({ Bucket: s3Config.s3BucketName })
// );
@ -746,7 +806,7 @@ export const checkConnectivity = async (
results.$metadata.httpStatusCode === undefined
) {
const err = "results or $metadata or httpStatusCode is undefined";
log.debug(err);
console.debug(err);
if (callbackFunc !== undefined) {
callbackFunc(err);
}
@ -754,7 +814,7 @@ export const checkConnectivity = async (
}
return results.$metadata.httpStatusCode === 200;
} catch (err: any) {
log.debug(err);
console.debug(err);
if (callbackFunc !== undefined) {
if (s3Config.s3Endpoint.contains(s3Config.s3BucketName)) {
const err2 = new AggregateError([

View File

@ -1,15 +1,14 @@
import { Buffer } from "buffer";
import { Vault, requestUrl } from "obsidian";
import { Platform, Vault, requestUrl } from "obsidian";
import { Queue } from "@fyears/tsqueue";
import chunk from "lodash/chunk";
import flatten from "lodash/flatten";
import cloneDeep from "lodash/cloneDeep";
import { getReasonPhrase } from "http-status-codes";
import { RemoteItem, VALID_REQURL, WebdavConfig } from "./baseTypes";
import { decryptArrayBuffer, encryptArrayBuffer } from "./encrypt";
import { Entity, UploadedType, VALID_REQURL, WebdavConfig } from "./baseTypes";
import { bufferToArrayBuffer, getPathFolder, mkdirpInVault } from "./misc";
import { log } from "./moreOnLog";
import { Cipher } from "./encryptUnified";
import type {
FileStat,
@ -28,43 +27,75 @@ function onlyAscii(str: string) {
return !/[^\u0000-\u00ff]/g.test(str);
}
/**
* lower-case all the keys of a record
* https://stackoverflow.com/questions/12539574/
* @param obj
* @returns
*/
function objKeyToLower(obj: Record<string, string>) {
return Object.fromEntries(
Object.entries(obj).map(([k, v]) => [k.toLowerCase(), v])
);
}
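For reference, the normalization behaves like this:

```ts
objKeyToLower({ "Content-Type": "text/xml", Authorization: "Basic abc" });
// => { "content-type": "text/xml", authorization: "Basic abc" }
```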
// @ts-ignore
import { getPatcher } from "webdav/dist/web/index.js";
if (VALID_REQURL) {
getPatcher().patch(
"request",
async (options: RequestOptionsWithState): Promise<Response> => {
const transformedHeaders = { ...options.headers };
const transformedHeaders = objKeyToLower({ ...options.headers });
delete transformedHeaders["host"];
delete transformedHeaders["Host"];
delete transformedHeaders["content-length"];
delete transformedHeaders["Content-Length"];
const r = await requestUrl({
const reqContentType =
transformedHeaders["accept"] ?? transformedHeaders["content-type"];
const retractedHeaders = { ...transformedHeaders };
if (retractedHeaders.hasOwnProperty("authorization")) {
retractedHeaders["authorization"] = "<retracted>";
}
console.debug(`before request:`);
console.debug(`url: ${options.url}`);
console.debug(`method: ${options.method}`);
console.debug(`headers: ${JSON.stringify(retractedHeaders, null, 2)}`);
console.debug(`reqContentType: ${reqContentType}`);
let r = await requestUrl({
url: options.url,
method: options.method,
body: options.data as string | ArrayBuffer,
headers: transformedHeaders,
contentType: reqContentType,
throw: false,
});
let contentType: string | undefined =
r.headers["Content-Type"] || r.headers["content-type"];
if (options.headers !== undefined) {
contentType =
contentType ||
options.headers["Content-Type"] ||
options.headers["content-type"] ||
options.headers["Accept"] ||
options.headers["accept"];
if (
r.status === 401 &&
Platform.isIosApp &&
!options.url.endsWith("/") &&
!options.url.endsWith(".md") &&
options.method.toUpperCase() === "PROPFIND"
) {
// don't ask me why:
// some webdav servers have mysterious behaviours, and if a folder url
// lacks its trailing slash and the folder doesn't exist, the servers
// return 401 instead of 404. here is a dirty hack that works.
console.debug(`we got 401, try appending a slash to the request url`);
r = await requestUrl({
url: `${options.url}/`,
method: options.method,
body: options.data as string | ArrayBuffer,
headers: transformedHeaders,
contentType: reqContentType,
throw: false,
});
}
if (contentType !== undefined) {
contentType = contentType.toLowerCase();
}
const rspHeaders = { ...r.headers };
console.log("rspHeaders");
console.log(rspHeaders);
console.debug(`after request:`);
const rspHeaders = objKeyToLower({ ...r.headers });
console.debug(`rspHeaders: ${JSON.stringify(rspHeaders, null, 2)}`);
for (let key in rspHeaders) {
if (rspHeaders.hasOwnProperty(key)) {
// avoid the error:
@ -81,74 +112,28 @@ if (VALID_REQURL) {
// }
// }
if (!onlyAscii(rspHeaders[key])) {
console.debug(`rspHeaders[key] needs encode: ${key}`);
rspHeaders[key] = encodeURIComponent(rspHeaders[key]);
}
}
}
// log.info(`requesting url=${options.url}`);
// log.info(`contentType=${contentType}`);
// log.info(`rspHeaders=${JSON.stringify(rspHeaders)}`)
// let r2: Response = undefined;
// if (contentType.includes("xml")) {
// r2 = new Response(r.text, {
// status: r.status,
// statusText: getReasonPhrase(r.status),
// headers: rspHeaders,
// });
// } else if (
// contentType.includes("json") ||
// contentType.includes("javascript")
// ) {
// log.info('inside json branch');
// // const j = r.json;
// // log.info(j);
// r2 = new Response(
// r.text, // yea, here is the text because Response constructor expects a text
// {
// status: r.status,
// statusText: getReasonPhrase(r.status),
// headers: rspHeaders,
// });
// } else if (contentType.includes("text")) {
// // avoid text/json,
// // so we split this out from the above xml or json branch
// r2 = new Response(r.text, {
// status: r.status,
// statusText: getReasonPhrase(r.status),
// headers: rspHeaders,
// });
// } else if (
// contentType.includes("octet-stream") ||
// contentType.includes("binary") ||
// contentType.includes("buffer")
// ) {
// // application/octet-stream
// r2 = new Response(r.arrayBuffer, {
// status: r.status,
// statusText: getReasonPhrase(r.status),
// headers: rspHeaders,
// });
// } else {
// throw Error(
// `do not know how to deal with requested content type = ${contentType}`
// );
// }
let r2: Response | undefined = undefined;
const statusText = getReasonPhrase(r.status);
console.debug(`statusText: ${statusText}`);
if ([101, 103, 204, 205, 304].includes(r.status)) {
// A null body status is a status that is 101, 103, 204, 205, or 304.
// https://fetch.spec.whatwg.org/#statuses
// fix this: Failed to construct 'Response': Response with null body status cannot have body
r2 = new Response(null, {
status: r.status,
statusText: getReasonPhrase(r.status),
statusText: statusText,
headers: rspHeaders,
});
} else {
r2 = new Response(r.arrayBuffer, {
status: r.status,
statusText: getReasonPhrase(r.status),
statusText: statusText,
headers: rspHeaders,
});
}
@ -178,7 +163,7 @@ const getWebdavPath = (fileOrFolderPath: string, remoteBaseDir: string) => {
// special
key = `/${remoteBaseDir}/`;
} else if (fileOrFolderPath.startsWith("/")) {
log.warn(
console.warn(
`why does the path ${fileOrFolderPath} start with '/'? we just go on.`
);
key = `/${remoteBaseDir}${fileOrFolderPath}`;
@ -205,18 +190,19 @@ const getNormPath = (fileOrFolderPath: string, remoteBaseDir: string) => {
return fileOrFolderPath.slice(`/${remoteBaseDir}/`.length);
};
const fromWebdavItemToRemoteItem = (x: FileStat, remoteBaseDir: string) => {
const fromWebdavItemToEntity = (x: FileStat, remoteBaseDir: string) => {
let key = getNormPath(x.filename, remoteBaseDir);
if (x.type === "directory" && !key.endsWith("/")) {
key = `${key}/`;
}
const mtimeSvr = Date.parse(x.lastmod).valueOf();
return {
key: key,
lastModified: Date.parse(x.lastmod).valueOf(),
size: x.size,
remoteType: "webdav",
etag: x.etag || undefined,
} as RemoteItem;
keyRaw: key,
mtimeSvr: mtimeSvr,
mtimeCli: mtimeSvr, // no universal way to set mtime in webdav
sizeRaw: x.size,
etag: x.etag,
} as Entity;
};
export class WrappedWebdavClient {
@ -230,7 +216,8 @@ export class WrappedWebdavClient {
remoteBaseDir: string,
saveUpdatedConfigFunc: () => Promise<any>
) {
this.webdavConfig = webdavConfig;
this.webdavConfig = cloneDeep(webdavConfig);
this.webdavConfig.address = encodeURI(this.webdavConfig.address);
this.remoteBaseDir = remoteBaseDir;
this.vaultFolderExists = false;
this.saveUpdatedConfigFunc = saveUpdatedConfigFunc;
@ -241,6 +228,13 @@ export class WrappedWebdavClient {
if (this.client !== undefined) {
return;
}
if (Platform.isIosApp && !this.webdavConfig.address.startsWith("https")) {
throw Error(
`Your webdav address must be https, not http, because of an iOS restriction.`
);
}
const headers = {
"Cache-Control": "no-cache",
};
@ -258,7 +252,7 @@ export class WrappedWebdavClient {
: AuthType.Password,
});
} else {
log.info("no password");
console.info("no password");
this.client = createClient(this.webdavConfig.address, {
headers: headers,
});
@ -270,12 +264,12 @@ export class WrappedWebdavClient {
} else {
const res = await this.client.exists(`/${this.remoteBaseDir}/`);
if (res) {
// log.info("remote vault folder exists!");
// console.info("remote vault folder exists!");
this.vaultFolderExists = true;
} else {
log.info("remote vault folder not exists, creating");
console.info("remote vault folder not exists, creating");
await this.client.createDirectory(`/${this.remoteBaseDir}/`);
log.info("remote vault folder created!");
console.info("remote vault folder created!");
this.vaultFolderExists = true;
}
}
@ -291,7 +285,7 @@ export class WrappedWebdavClient {
this.webdavConfig.manualRecursive = true;
if (this.saveUpdatedConfigFunc !== undefined) {
await this.saveUpdatedConfigFunc();
log.info(
console.info(
`webdav depth="auto_???" is changed to ${this.webdavConfig.depth}`
);
}
@ -322,27 +316,32 @@ export const getRemoteMeta = async (
remotePath: string
) => {
await client.init();
log.debug(`getRemoteMeta remotePath = ${remotePath}`);
console.debug(`getRemoteMeta remotePath = ${remotePath}`);
const res = (await client.client.stat(remotePath, {
details: false,
})) as FileStat;
log.debug(`getRemoteMeta res=${JSON.stringify(res)}`);
return fromWebdavItemToRemoteItem(res, client.remoteBaseDir);
console.debug(`getRemoteMeta res=${JSON.stringify(res)}`);
return fromWebdavItemToEntity(res, client.remoteBaseDir);
};
export const uploadToRemote = async (
client: WrappedWebdavClient,
fileOrFolderPath: string,
vault: Vault | undefined,
isRecursively: boolean = false,
password: string = "",
isRecursively: boolean,
cipher: Cipher,
remoteEncryptedKey: string = "",
uploadRaw: boolean = false,
rawContent: string | ArrayBuffer = ""
) => {
): Promise<UploadedType> => {
await client.init();
let uploadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
if (remoteEncryptedKey === undefined || remoteEncryptedKey === "") {
throw Error(
`uploadToRemote(webdav): you have a password but remoteEncryptedKey is empty!`
);
}
uploadFile = remoteEncryptedKey;
}
uploadFile = getWebdavPath(uploadFile, client.remoteBaseDir);
@ -356,28 +355,34 @@ export const uploadToRemote = async (
throw Error(`you specify uploadRaw, but you also provide a folder key!`);
}
// folder
if (password === "") {
// if not encrypted, mkdir a remote folder
if (cipher.isPasswordEmpty() || cipher.isFolderAware()) {
// if not encrypted, or encrypted but folder-aware, mkdir a remote folder
await client.client.createDirectory(uploadFile, {
recursive: false, // the sync algo should guarantee there is no need for recursion
recursive: true,
});
const res = await getRemoteMeta(client, uploadFile);
return res;
return {
entity: res,
};
} else {
// if encrypted, upload a fake file with the encrypted file name
// if encrypted && !isFolderAware(),
// upload a fake file with the encrypted file name
await client.client.putFileContents(uploadFile, "", {
overwrite: true,
onUploadProgress: (progress: any) => {
// log.info(`Uploaded ${progress.loaded} bytes of ${progress.total}`);
// console.info(`Uploaded ${progress.loaded} bytes of ${progress.total}`);
},
});
return await getRemoteMeta(client, uploadFile);
return {
entity: await getRemoteMeta(client, uploadFile),
};
}
} else {
// file
// we ignore isRecursively parameter here
let localContent = undefined;
let localContent: ArrayBuffer | undefined = undefined;
let mtimeCli: number | undefined = undefined;
if (uploadRaw) {
if (typeof rawContent === "string") {
localContent = new TextEncoder().encode(rawContent).buffer;
@ -391,25 +396,29 @@ export const uploadToRemote = async (
);
}
localContent = await vault.adapter.readBinary(fileOrFolderPath);
mtimeCli = (await vault.adapter.stat(fileOrFolderPath))?.mtime;
}
let remoteContent = localContent;
if (password !== "") {
remoteContent = await encryptArrayBuffer(localContent, password);
if (!cipher.isPasswordEmpty()) {
remoteContent = await cipher.encryptContent(localContent);
}
// updated 20220326: the algorithm guarantees this
// // we need to create folders before uploading
// const dir = getPathFolder(uploadFile);
// if (dir !== "/" && dir !== "") {
// await client.client.createDirectory(dir, { recursive: false });
// await client.client.createDirectory(dir, { recursive: true });
// }
await client.client.putFileContents(uploadFile, remoteContent, {
overwrite: true,
onUploadProgress: (progress: any) => {
log.info(`Uploaded ${progress.loaded} bytes of ${progress.total}`);
console.info(`Uploaded ${progress.loaded} bytes of ${progress.total}`);
},
});
return await getRemoteMeta(client, uploadFile);
return {
entity: await getRemoteMeta(client, uploadFile),
mtimeCli: mtimeCli,
};
}
};
@ -434,7 +443,7 @@ export const listAllFromRemote = async (client: WrappedWebdavClient) => {
itemsToFetch.push(q.pop()!);
}
const itemsToFetchChunks = chunk(itemsToFetch, CHUNK_SIZE);
// log.debug(itemsToFetchChunks);
// console.debug(itemsToFetchChunks);
const subContents = [] as FileStat[];
for (const singleChunk of itemsToFetchChunks) {
const r = singleChunk.map((x) => {
@ -472,11 +481,7 @@ export const listAllFromRemote = async (client: WrappedWebdavClient) => {
}
)) as FileStat[];
}
return {
Contents: contents.map((x) =>
fromWebdavItemToRemoteItem(x, client.remoteBaseDir)
),
};
return contents.map((x) => fromWebdavItemToEntity(x, client.remoteBaseDir));
};
const downloadFromRemoteRaw = async (
@ -484,7 +489,7 @@ const downloadFromRemoteRaw = async (
remotePath: string
) => {
await client.init();
// log.info(`getWebdavPath=${remotePath}`);
// console.info(`getWebdavPath=${remotePath}`);
const buff = (await client.client.getFileContents(remotePath)) as BufferLike;
if (buff instanceof ArrayBuffer) {
return buff;
@ -499,7 +504,7 @@ export const downloadFromRemote = async (
fileOrFolderPath: string,
vault: Vault,
mtime: number,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = "",
skipSaving: boolean = false
) => {
@ -520,15 +525,15 @@ export const downloadFromRemote = async (
return new ArrayBuffer(0);
} else {
let downloadFile = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
downloadFile = remoteEncryptedKey;
}
downloadFile = getWebdavPath(downloadFile, client.remoteBaseDir);
// log.info(`downloadFile=${downloadFile}`);
// console.info(`downloadFile=${downloadFile}`);
const remoteContent = await downloadFromRemoteRaw(client, downloadFile);
let localContent = remoteContent;
if (password !== "") {
localContent = await decryptArrayBuffer(remoteContent, password);
if (!cipher.isPasswordEmpty()) {
localContent = await cipher.decryptContent(remoteContent);
}
if (!skipSaving) {
await vault.adapter.writeBinary(fileOrFolderPath, localContent, {
@ -542,14 +547,14 @@ export const downloadFromRemote = async (
export const deleteFromRemote = async (
client: WrappedWebdavClient,
fileOrFolderPath: string,
password: string = "",
cipher: Cipher,
remoteEncryptedKey: string = ""
) => {
if (fileOrFolderPath === "/") {
return;
}
let remoteFileName = fileOrFolderPath;
if (password !== "") {
if (!cipher.isPasswordEmpty()) {
remoteFileName = remoteEncryptedKey;
}
remoteFileName = getWebdavPath(remoteFileName, client.remoteBaseDir);
@ -557,10 +562,10 @@ export const deleteFromRemote = async (
await client.init();
try {
await client.client.deleteFile(remoteFileName);
// log.info(`delete ${remoteFileName} succeeded`);
// console.info(`delete ${remoteFileName} succeeded`);
} catch (err) {
log.error("some error while deleting");
log.error(err);
console.error("some error while deleting");
console.error(err);
}
};
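The download and delete paths above both swap the plain vault path for remoteEncryptedKey when a password is set, since encrypted files live on the server under their encrypted names. Factored out for clarity (an illustration, not code from this diff):

function resolveRemoteName(
  fileOrFolderPath: string,
  remoteEncryptedKey: string,
  cipher: { isPasswordEmpty(): boolean }
): string {
  // With encryption on, the original path never appears on the server.
  return cipher.isPasswordEmpty() ? fileOrFolderPath : remoteEncryptedKey;
}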
@ -575,7 +580,7 @@ export const checkConnectivity = async (
)
) {
const err = "Error: the url should start with http(s):// but it does not!";
log.error(err);
console.error(err);
if (callbackFunc !== undefined) {
callbackFunc(err);
}
@ -586,7 +591,7 @@ export const checkConnectivity = async (
const results = await getRemoteMeta(client, `/${client.remoteBaseDir}/`);
if (results === undefined) {
const err = "results is undefined";
log.error(err);
console.error(err);
if (callbackFunc !== undefined) {
callbackFunc(err);
}
@ -594,7 +599,7 @@ export const checkConnectivity = async (
}
return true;
} catch (err) {
log.error(err);
console.error(err);
if (callbackFunc !== undefined) {
callbackFunc(err);
}

View File

@ -13,20 +13,32 @@ import { createElement, Eye, EyeOff } from "lucide";
import {
API_VER_ENSURE_REQURL_OK,
API_VER_REQURL,
ConflictActionType,
DEFAULT_DEBUG_FOLDER,
EmptyFolderCleanType,
SUPPORTED_SERVICES_TYPE,
SUPPORTED_SERVICES_TYPE_WITH_REMOTE_BASE_DIR,
SyncDirectionType,
VALID_REQURL,
WebdavAuthType,
WebdavDepthType,
CipherMethodType,
QRExportType,
} from "./baseTypes";
import { exportVaultSyncPlansToFiles } from "./debugMode";
import { exportQrCodeUri } from "./importExport";
import {
clearAllSyncMetaMapping,
exportVaultProfilerResultsToFiles,
exportVaultSyncPlansToFiles,
} from "./debugMode";
import {
exportQrCodeUri,
importQrCodeUri,
parseUriByHand,
} from "./importExport";
import {
clearAllPrevSyncRecordByVault,
clearAllSyncPlanRecords,
destroyDBs,
upsertLastSuccessSyncByVault,
upsertLastSuccessSyncTimeByVault,
} from "./localdb";
import type RemotelySavePlugin from "./main"; // unavoidable
import { RemoteClient } from "./remote";
@ -42,14 +54,13 @@ import {
} from "./remoteForOnedrive";
import { messyConfigToNormal } from "./configPersist";
import type { TransItemType } from "./i18n";
import { checkHasSpecialCharForDir } from "./misc";
import {
applyLogWriterInplace,
log,
restoreLogWritterInplace,
} from "./moreOnLog";
changeMobileStatusBar,
checkHasSpecialCharForDir,
stringToFragment,
} from "./misc";
import { simpleTransRemotePrefix } from "./remoteForS3";
import cloneDeep from "lodash/cloneDeep";
class PasswordModal extends Modal {
plugin: RemotelySavePlugin;
@ -121,6 +132,45 @@ class PasswordModal extends Modal {
}
}
class EncryptionMethodModal extends Modal {
plugin: RemotelySavePlugin;
constructor(app: App, plugin: RemotelySavePlugin) {
super(app);
this.plugin = plugin;
}
onOpen() {
let { contentEl } = this;
const t = (x: TransItemType, vars?: any) => {
return this.plugin.i18n.t(x, vars);
};
// contentEl.setText("Add Or change password.");
contentEl.createEl("h2", { text: t("modal_encryptionmethod_title") });
t("modal_encryptionmethod_shortdesc")
.split("\n")
.forEach((val, idx) => {
contentEl.createEl("p", {
text: stringToFragment(val),
});
});
new Setting(contentEl).addButton((button) => {
button.setButtonText(t("confirm"));
button.onClick(async () => {
this.close();
});
button.setClass("encryptionmethod-second-confirm");
});
}
onClose() {
let { contentEl } = this;
contentEl.empty();
}
}
class ChangeRemoteBaseDirModal extends Modal {
readonly plugin: RemotelySavePlugin;
readonly newRemoteBaseDir: string;
@ -450,7 +500,7 @@ class DropboxAuthModal extends Modal {
);
this.close();
} catch (err) {
log.error(err);
console.error(err);
new Notice(t("modal_dropboxauth_maualinput_conn_fail"));
}
});
@ -503,6 +553,15 @@ export class OnedriveAuthModal extends Modal {
text: val,
});
});
if (Platform.isLinux) {
t("modal_onedriveauth_shortdesc_linux")
.split("\n")
.forEach((val) => {
contentEl.createEl("p", {
text: stringToFragment(val),
});
});
}
const div2 = contentEl.createDiv();
div2.createEl(
"button",
@ -586,7 +645,7 @@ export class OnedriveRevokeAuthModal extends Modal {
new Notice(t("modal_onedriverevokeauth_clean_notice"));
this.close();
} catch (err) {
log.error(err);
console.error(err);
new Notice(t("modal_onedriverevokeauth_clean_fail"));
}
});
@ -654,9 +713,11 @@ class SyncConfigDirModal extends Modal {
class ExportSettingsQrCodeModal extends Modal {
plugin: RemotelySavePlugin;
constructor(app: App, plugin: RemotelySavePlugin) {
exportType: QRExportType;
constructor(app: App, plugin: RemotelySavePlugin, exportType: QRExportType) {
super(app);
this.plugin = plugin;
this.exportType = exportType;
}
async onOpen() {
@ -669,7 +730,8 @@ class ExportSettingsQrCodeModal extends Modal {
const { rawUri, imgUri } = await exportQrCodeUri(
this.plugin.settings,
this.app.vault.getName(),
this.plugin.manifest.version
this.plugin.manifest.version,
this.exportType
);
const div1 = contentEl.createDiv();
@ -713,65 +775,6 @@ class ExportSettingsQrCodeModal extends Modal {
}
}
class SetLogToHttpServerModal extends Modal {
plugin: RemotelySavePlugin;
serverAddr: string;
callBack: any;
constructor(
app: App,
plugin: RemotelySavePlugin,
serverAddr: string,
callBack: any
) {
super(app);
this.plugin = plugin;
this.serverAddr = serverAddr;
this.callBack = callBack;
}
onOpen() {
let { contentEl } = this;
const t = (x: TransItemType, vars?: any) => {
return this.plugin.i18n.t(x, vars);
};
contentEl.createEl("h2", { text: t("modal_logtohttpserver_title") });
const div1 = contentEl.createDiv();
div1.addClass("logtohttpserver-warning");
t("modal_logtohttpserver_desc")
.split("\n")
.forEach((val) => {
div1.createEl("p", {
text: val,
});
});
new Setting(contentEl)
.addButton((button) => {
button.setButtonText(t("modal_logtohttpserver_secondconfirm"));
button.setClass("logtohttpserver-warning");
button.onClick(async () => {
this.callBack();
new Notice(t("modal_logtohttpserver_notice"));
this.close();
});
})
.addButton((button) => {
button.setButtonText(t("goback"));
button.onClick(() => {
this.close();
});
});
}
onClose() {
let { contentEl } = this;
contentEl.empty();
}
}
const getEyesElements = () => {
const eyeEl = createElement(Eye);
const eyeOffEl = createElement(EyeOff);
@ -1162,7 +1165,7 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
);
new Notice(t("settings_dropbox_revoke_notice"));
} catch (err) {
log.error(err);
console.error(err);
new Notice(t("settings_dropbox_revoke_noticeerr"));
}
});
@ -1703,6 +1706,23 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
});
});
new Setting(basicDiv)
.setName(t("settings_encryptionmethod"))
.setDesc(stringToFragment(t("settings_encryptionmethod_desc")))
.addDropdown((dropdown) => {
dropdown
.addOption("rclone-base64", t("settings_encryptionmethod_rclone"))
.addOption("openssl-base64", t("settings_encryptionmethod_openssl"))
.setValue(this.plugin.settings.encryptionMethod ?? "rclone-base64")
.onChange(async (val: string) => {
this.plugin.settings.encryptionMethod = val as CipherMethodType;
await this.plugin.saveSettings();
if (this.plugin.settings.password !== "") {
new EncryptionMethodModal(this.app, this.plugin).open();
}
});
});
new Setting(basicDiv)
.setName(t("settings_autorun"))
.setDesc(t("settings_autorun_desc"))
@ -1732,7 +1752,7 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
realVal > 0
) {
const intervalID = window.setInterval(() => {
log.info("auto run from settings.ts");
console.info("auto run from settings.ts");
this.plugin.syncRun("auto");
}, realVal);
this.plugin.autoRunIntervalID = intervalID;
@ -1804,7 +1824,7 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
// then schedule a run for syncOnSaveAfterMilliseconds after it was modified
const lastModified = currentFile.stat.mtime;
const currentTime = Date.now();
// log.debug(
// console.debug(
// `Checking if file was modified within last ${
// this.plugin.settings.syncOnSaveAfterMilliseconds / 1000
// } seconds, last modified: ${
@ -1819,7 +1839,7 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
const scheduleTimeFromNow =
this.plugin.settings.syncOnSaveAfterMilliseconds! -
(currentTime - lastModified);
log.info(
console.info(
`schedule a run for ${scheduleTimeFromNow} milliseconds later`
);
runScheduled = true;
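The arithmetic above schedules a sync-on-save run for the time remaining until the file has been idle for syncOnSaveAfterMilliseconds. Restated as a helper with illustrative names:

function msUntilScheduledRun(
  lastModified: number,
  now: number,
  syncOnSaveAfterMilliseconds: number
): number | undefined {
  const sinceEdit = now - lastModified;
  if (sinceEdit >= syncOnSaveAfterMilliseconds) {
    return undefined; // the edit is old enough; nothing to schedule here
  }
  // Mirrors scheduleTimeFromNow above: run that long after the last edit.
  return syncOnSaveAfterMilliseconds - sinceEdit;
}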
@ -1877,7 +1897,7 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
button.setButtonText(t("settings_resetstatusbar_button"));
button.onClick(async () => {
// reset last sync time
await upsertLastSuccessSyncByVault(
await upsertLastSuccessSyncTimeByVault(
this.plugin.db,
this.plugin.vaultRandomID,
-1
@ -2004,6 +2024,128 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
});
});
new Setting(advDiv)
.setName(t("settings_conflictaction"))
.setDesc(t("settings_conflictaction_desc"))
.addDropdown((dropdown) => {
dropdown.addOption(
"keep_newer",
t("settings_conflictaction_keep_newer")
);
dropdown.addOption(
"keep_larger",
t("settings_conflictaction_keep_larger")
);
dropdown
.setValue(this.plugin.settings.conflictAction ?? "keep_newer")
.onChange(async (val) => {
this.plugin.settings.conflictAction = val as ConflictActionType;
await this.plugin.saveSettings();
});
});
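keep_newer and keep_larger are the two conflict policies offered here; the actual resolution happens in the sync algorithm. As a sketch of the decision itself (tie-breaking toward local is an assumption):

type ConflictActionType = "keep_newer" | "keep_larger";

function pickConflictWinner(
  action: ConflictActionType,
  local: { mtime: number; size: number },
  remote: { mtime: number; size: number }
): "local" | "remote" {
  if (action === "keep_newer") {
    return local.mtime >= remote.mtime ? "local" : "remote"; // tie -> local (assumed)
  }
  return local.size >= remote.size ? "local" : "remote"; // tie -> local (assumed)
}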
new Setting(advDiv)
.setName(t("settings_cleanemptyfolder"))
.setDesc(t("settings_cleanemptyfolder_desc"))
.addDropdown((dropdown) => {
dropdown.addOption("skip", t("settings_cleanemptyfolder_skip"));
dropdown.addOption(
"clean_both",
t("settings_cleanemptyfolder_clean_both")
);
dropdown
.setValue(this.plugin.settings.howToCleanEmptyFolder ?? "skip")
.onChange(async (val) => {
this.plugin.settings.howToCleanEmptyFolder =
val as EmptyFolderCleanType;
await this.plugin.saveSettings();
});
});
new Setting(advDiv)
.setName(t("settings_protectmodifypercentage"))
.setDesc(t("settings_protectmodifypercentage_desc"))
.addDropdown((dropdown) => {
for (const i of Array.from({ length: 11 }, (x, i) => i * 10)) {
let desc = `${i}`;
if (i === 0) {
desc = t("settings_protectmodifypercentage_000_desc");
} else if (i === 50) {
desc = t("settings_protectmodifypercentage_050_desc");
} else if (i === 100) {
desc = t("settings_protectmodifypercentage_100_desc");
}
dropdown.addOption(`${i}`, desc);
}
dropdown
.setValue(`${this.plugin.settings.protectModifyPercentage ?? 50}`)
.onChange(async (val) => {
this.plugin.settings.protectModifyPercentage = parseInt(val);
await this.plugin.saveSettings();
});
});
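The 0-to-100 dropdown above feeds a guard against runaway syncs. How the percentage is applied is not shown in this hunk, so the following is only a plausible reading: abort when a sync plan would modify or delete at least that share of existing entities.

// Illustration only; the real guard lives in the sync logic, and the
// boundary semantics here are assumptions.
function wouldTripProtection(
  plannedModifyOrDelete: number,
  totalEntities: number,
  protectModifyPercentage: number
): boolean {
  if (protectModifyPercentage >= 100 || totalEntities === 0) {
    return false; // assumption: 100 means the guard is disabled
  }
  const ratio = (plannedModifyOrDelete / totalEntities) * 100;
  return ratio >= protectModifyPercentage;
}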
new Setting(advDiv)
.setName(t("setting_syncdirection"))
.setDesc(t("setting_syncdirection_desc"))
.addDropdown((dropdown) => {
dropdown.addOption(
"bidirectional",
t("setting_syncdirection_bidirectional_desc")
);
dropdown.addOption(
"incremental_push_only",
t("setting_syncdirection_incremental_push_only_desc")
);
dropdown.addOption(
"incremental_pull_only",
t("setting_syncdirection_incremental_pull_only_desc")
);
dropdown
.setValue(this.plugin.settings.syncDirection ?? "bidirectional")
.onChange(async (val) => {
this.plugin.settings.syncDirection = val as SyncDirectionType;
await this.plugin.saveSettings();
});
});
if (Platform.isMobile) {
new Setting(advDiv)
.setName(t("settings_enablemobilestatusbar"))
.setDesc(t("settings_enablemobilestatusbar_desc"))
.addDropdown(async (dropdown) => {
dropdown
.addOption("enable", t("enable"))
.addOption("disable", t("disable"));
dropdown
.setValue(
`${
this.plugin.settings.enableMobileStatusBar
? "enable"
: "disable"
}`
)
.onChange(async (val) => {
if (val === "enable") {
this.plugin.settings.enableMobileStatusBar = true;
this.plugin.appContainerObserver =
changeMobileStatusBar("enable");
} else {
this.plugin.settings.enableMobileStatusBar = false;
changeMobileStatusBar(
"disable",
this.plugin.appContainerObserver
);
this.plugin.appContainerObserver?.disconnect();
this.plugin.appContainerObserver = undefined;
}
await this.plugin.saveSettings();
});
});
}
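Enabling the toggle stores a MutationObserver returned by changeMobileStatusBar; disabling disconnects it. The observer's body is not shown in this diff, so the sketch below is only one plausible shape: watch the app container and re-show the status bar whenever Obsidian re-renders it.

// Purely illustrative; only the return type (a MutationObserver) is taken
// from the diff. The selector and styling are assumptions.
function keepStatusBarVisible(container: HTMLElement): MutationObserver {
  const observer = new MutationObserver(() => {
    const bar = container.querySelector<HTMLElement>(".status-bar");
    if (bar !== null) {
      bar.style.display = "flex"; // assumed way to force the bar visible
    }
  });
  observer.observe(container, { childList: true, subtree: true });
  return observer;
}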
//////////////////////////////////////////////////
// below for import and export functions
//////////////////////////////////////////////////
@ -2018,15 +2160,87 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
.setName(t("settings_export"))
.setDesc(t("settings_export_desc"))
.addButton(async (button) => {
button.setButtonText(t("settings_export_desc_button"));
button.setButtonText(t("settings_export_all_but_oauth2_button"));
button.onClick(async () => {
new ExportSettingsQrCodeModal(this.app, this.plugin).open();
new ExportSettingsQrCodeModal(
this.app,
this.plugin,
"all_but_oauth2"
).open();
});
})
.addButton(async (button) => {
button.setButtonText(t("settings_export_dropbox_button"));
button.onClick(async () => {
new ExportSettingsQrCodeModal(
this.app,
this.plugin,
"dropbox"
).open();
});
})
.addButton(async (button) => {
button.setButtonText(t("settings_export_onedrive_button"));
button.onClick(async () => {
new ExportSettingsQrCodeModal(
this.app,
this.plugin,
"onedrive"
).open();
});
});
let importSettingVal = "";
new Setting(importExportDiv)
.setName(t("settings_import"))
.setDesc(t("settings_import_desc"));
.setDesc(t("settings_import_desc"))
.addText((text) =>
text
.setPlaceholder("obsidian://remotely-save?func=settings&...")
.setValue("")
.onChange((val) => {
importSettingVal = val;
})
)
.addButton(async (button) => {
button.setButtonText(t("confirm"));
button.onClick(async () => {
if (importSettingVal !== "") {
// console.debug(importSettingVal);
try {
const inputParams = parseUriByHand(importSettingVal);
const parsed = importQrCodeUri(
inputParams,
this.app.vault.getName()
);
if (parsed.status === "error") {
new Notice(parsed.message);
} else {
const copied = cloneDeep(parsed.result);
// new Notice(JSON.stringify(copied))
this.plugin.settings = Object.assign(
{},
this.plugin.settings,
copied
);
this.plugin.saveSettings();
new Notice(
t("protocol_saveqr", {
manifestName: this.plugin.manifest.name,
})
);
}
} catch (e) {
new Notice(`${e}`);
}
importSettingVal = "";
} else {
new Notice(t("settings_import_error_notice"));
importSettingVal = "";
}
});
});
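Note the shallow Object.assign in the import handler: keys present in the decoded URI payload overwrite the current settings, while keys absent from it keep their local values. In miniature:

// Shallow merge in miniature: imported keys win, the rest survive.
const current = { serviceType: "webdav", autoRunEveryMilliseconds: -1 };
const imported = { serviceType: "dropbox" };
const next = Object.assign({}, current, imported);
// next.serviceType === "dropbox"; next.autoRunEveryMilliseconds === -1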
//////////////////////////////////////////////////
// below for debug
@ -2045,9 +2259,8 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
.setValue(this.plugin.settings.currLogLevel ?? "info")
.onChange(async (val: string) => {
this.plugin.settings.currLogLevel = val;
log.setLevel(val as any);
await this.plugin.saveSettings();
log.info(`the log level is changed to ${val}`);
console.info(`the log level is changed to ${val}`);
});
});
@ -2058,21 +2271,74 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
button.setButtonText(t("settings_outputsettingsconsole_button"));
button.onClick(async () => {
const c = messyConfigToNormal(await this.plugin.loadData());
log.info(c);
console.info(c);
new Notice(t("settings_outputsettingsconsole_notice"));
});
});
new Setting(debugDiv)
.setName(t("settings_obfuscatesettingfile"))
.setDesc(t("settings_obfuscatesettingfile_desc"))
.addDropdown(async (dropdown) => {
dropdown
.addOption("enable", t("enable"))
.addOption("disable", t("disable"));
dropdown
.setValue(
`${
this.plugin.settings.obfuscateSettingFile ? "enable" : "disable"
}`
)
.onChange(async (val) => {
if (val === "enable") {
this.plugin.settings.obfuscateSettingFile = true;
} else {
this.plugin.settings.obfuscateSettingFile = false;
}
await this.plugin.saveSettings();
});
});
new Setting(debugDiv)
.setName(t("settings_viewconsolelog"))
.setDesc(stringToFragment(t("settings_viewconsolelog_desc")));
new Setting(debugDiv)
.setName(t("settings_syncplans"))
.setDesc(t("settings_syncplans_desc"))
.addButton(async (button) => {
button.setButtonText(t("settings_syncplans_button_json"));
button.setButtonText(t("settings_syncplans_button_1"));
button.onClick(async () => {
await exportVaultSyncPlansToFiles(
this.plugin.db,
this.app.vault,
this.plugin.vaultRandomID
this.plugin.vaultRandomID,
1
);
new Notice(t("settings_syncplans_notice"));
});
})
.addButton(async (button) => {
button.setButtonText(t("settings_syncplans_button_5"));
button.onClick(async () => {
await exportVaultSyncPlansToFiles(
this.plugin.db,
this.app.vault,
this.plugin.vaultRandomID,
5
);
new Notice(t("settings_syncplans_notice"));
});
})
.addButton(async (button) => {
button.setButtonText(t("settings_syncplans_button_all"));
button.onClick(async () => {
await exportVaultSyncPlansToFiles(
this.plugin.db,
this.app.vault,
this.plugin.vaultRandomID,
-1
);
new Notice(t("settings_syncplans_notice"));
});
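The three buttons request 1, 5, or -1 sync plans, where -1 reads as a sentinel for "all records". A sketch of that convention (an assumption about exportVaultSyncPlansToFiles, which is not shown here):

function takeSyncPlans<T>(records: T[], howMany: number): T[] {
  if (howMany < 0) {
    return records; // assumed sentinel: -1 exports every stored plan
  }
  if (howMany === 0) {
    return [];
  }
  return records.slice(-howMany); // most recent N, given insertion order
}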
@ -2089,61 +2355,32 @@ export class RemotelySaveSettingTab extends PluginSettingTab {
});
});
let logToHttpServer = this.plugin.debugServerTemp || "";
new Setting(debugDiv)
.setName(t("settings_logtohttpserver"))
.setDesc(t("settings_logtohttpserver_desc"))
.addText(async (text) => {
text.setValue(logToHttpServer).onChange(async (value) => {
logToHttpServer = value.trim();
});
})
.setName(t("settings_delprevsync"))
.setDesc(t("settings_delprevsync_desc"))
.addButton(async (button) => {
button.setButtonText(t("confirm"));
button.setButtonText(t("settings_delprevsync_button"));
button.onClick(async () => {
if (logToHttpServer === "" || !logToHttpServer.startsWith("http")) {
this.plugin.debugServerTemp = "";
logToHttpServer = "";
// restoreLogWritterInplace();
new Notice(t("settings_logtohttpserver_reset_notice"));
} else {
new SetLogToHttpServerModal(
this.app,
this.plugin,
logToHttpServer,
() => {
this.plugin.debugServerTemp = logToHttpServer;
// applyLogWriterInplace((...msg: any[]) => {
// try {
// requestUrl({
// url: logToHttpServer,
// method: "POST",
// headers: {
// "Content-Type": "application/json",
// },
// body: JSON.stringify({
// send_time: Date.now(),
// log_text: msg,
// }),
// });
// } catch (e) {
// // pass
// }
// });
}
).open();
}
await clearAllPrevSyncRecordByVault(
this.plugin.db,
this.plugin.vaultRandomID
);
new Notice(t("settings_delprevsync_notice"));
});
});
new Setting(debugDiv)
.setName(t("settings_delsyncmap"))
.setDesc(t("settings_delsyncmap_desc"))
.setName(t("settings_profiler_results"))
.setDesc(t("settings_profiler_results_desc"))
.addButton(async (button) => {
button.setButtonText(t("settings_delsyncmap_button"));
button.setButtonText(t("settings_profiler_results_button_all"));
button.onClick(async () => {
await clearAllSyncMetaMapping(this.plugin.db);
new Notice(t("settings_delsyncmap_notice"));
await exportVaultProfilerResultsToFiles(
this.plugin.db,
this.app.vault,
this.plugin.vaultRandomID
);
new Notice(t("settings_profiler_results_notice"));
});
});

File diff suppressed because it is too large.

View File

@ -1,64 +0,0 @@
import { App, Modal, Notice, PluginSettingTab, Setting } from "obsidian";
import type RemotelySavePlugin from "./main"; // unavoidable
import type { TransItemType } from "./i18n";
import { log } from "./moreOnLog";
export class SyncAlgoV2Modal extends Modal {
agree: boolean;
readonly plugin: RemotelySavePlugin;
constructor(app: App, plugin: RemotelySavePlugin) {
super(app);
this.plugin = plugin;
this.agree = false;
}
onOpen() {
let { contentEl } = this;
const t = (x: TransItemType, vars?: any) => {
return this.plugin.i18n.t(x, vars);
};
contentEl.createEl("h2", {
text: t("syncalgov2_title"),
});
const ul = contentEl.createEl("ul");
t("syncalgov2_texts")
.split("\n")
.forEach((val) => {
ul.createEl("li", {
text: val,
});
});
new Setting(contentEl)
.addButton((button) => {
button.setButtonText(t("syncalgov2_button_agree"));
button.onClick(async () => {
this.agree = true;
this.close();
});
})
.addButton((button) => {
button.setButtonText(t("syncalgov2_button_disagree"));
button.onClick(() => {
this.close();
});
});
}
onClose() {
let { contentEl } = this;
contentEl.empty();
if (this.agree) {
log.info("agree to use the new algorithm");
this.plugin.saveAgreeToUseNewSyncAlgorithm();
this.plugin.enableAutoSyncIfSet();
this.plugin.enableInitSyncIfSet();
this.plugin.enableSyncOnSaveIfSet();
} else {
log.info("do not agree to use the new algorithm");
this.plugin.unload();
}
}
}

128
src/syncAlgoV3Notice.ts Normal file
View File

@ -0,0 +1,128 @@
import { App, Modal, Notice, PluginSettingTab, Setting } from "obsidian";
import type RemotelySavePlugin from "./main"; // unavoidable
import type { TransItemType } from "./i18n";
import { stringToFragment } from "./misc";
export class SyncAlgoV3Modal extends Modal {
agree: boolean;
manualBackup: boolean;
requireUpdateAllDev: boolean;
readonly plugin: RemotelySavePlugin;
constructor(app: App, plugin: RemotelySavePlugin) {
super(app);
this.plugin = plugin;
this.agree = false;
this.manualBackup = false;
this.requireUpdateAllDev = false;
}
onOpen() {
let { contentEl } = this;
const t = (x: TransItemType, vars?: any) => {
return this.plugin.i18n.t(x, vars);
};
contentEl.createEl("h2", {
text: t("syncalgov3_title"),
});
const ul = contentEl.createEl("ul");
t("syncalgov3_texts")
.split("\n")
.forEach((val) => {
ul.createEl("li", {
text: stringToFragment(val),
});
});
// code partially modified from BART, released under the MIT License
contentEl.createDiv("modal-button-container", (buttonContainerEl) => {
let agreeBtn: HTMLButtonElement | undefined = undefined;
buttonContainerEl.createEl(
"label",
{
cls: "mod-checkbox",
},
(labelEl) => {
const checkboxEl = labelEl.createEl("input", {
attr: { tabindex: -1 },
type: "checkbox",
});
checkboxEl.checked = this.manualBackup;
checkboxEl.addEventListener("click", () => {
this.manualBackup = checkboxEl.checked;
if (agreeBtn !== undefined) {
if (this.manualBackup && this.requireUpdateAllDev) {
agreeBtn.removeAttribute("disabled");
} else {
agreeBtn.setAttr("disabled", true);
}
}
});
labelEl.appendText(t("syncalgov3_checkbox_manual_backup"));
}
);
buttonContainerEl.createEl(
"label",
{
cls: "mod-checkbox",
},
(labelEl) => {
const checkboxEl = labelEl.createEl("input", {
attr: { tabindex: -1 },
type: "checkbox",
});
checkboxEl.checked = this.requireUpdateAllDev;
checkboxEl.addEventListener("click", () => {
this.requireUpdateAllDev = checkboxEl.checked;
if (agreeBtn !== undefined) {
if (this.manualBackup && this.requireUpdateAllDev) {
agreeBtn.removeAttribute("disabled");
} else {
agreeBtn.setAttr("disabled", true);
}
}
});
labelEl.appendText(t("syncalgov3_checkbox_requiremultidevupdate"));
}
);
agreeBtn = buttonContainerEl.createEl("button", {
attr: { type: "button" },
cls: "mod-cta",
text: t("syncalgov3_button_agree"),
});
agreeBtn.setAttr("disabled", true);
agreeBtn.addEventListener("click", () => {
this.agree = true;
this.close();
});
buttonContainerEl
.createEl("button", {
attr: { type: "submit" },
text: t("syncalgov3_button_disagree"),
})
.addEventListener("click", () => {
this.close();
});
});
}
onClose() {
let { contentEl } = this;
contentEl.empty();
if (this.agree) {
console.info("agree to use the new algorithm");
this.plugin.saveAgreeToUseNewSyncAlgorithm();
this.plugin.enableAutoSyncIfSet();
this.plugin.enableInitSyncIfSet();
this.plugin.enableSyncOnSaveIfSet();
} else {
console.info("do not agree to use the new algorithm");
this.plugin.unload();
}
}
}
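Both checkboxes gate the agree button through the shared agreeBtn closure; the rule itself is a plain conjunction, an explicit double opt-in before the v3 algorithm may run. Restated as an illustrative helper (not part of the diff):

function agreeButtonEnabled(
  manualBackup: boolean,
  requireUpdateAllDev: boolean
): boolean {
  // The user must both back up the vault and confirm that every device
  // will be updated before the new algorithm can be accepted.
  return manualBackup && requireUpdateAllDev;
}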

View File

@ -1,90 +0,0 @@
import { App, Modal, Notice, PluginSettingTab, Setting } from "obsidian";
import type RemotelySavePlugin from "./main"; // unavoidable
import type { TransItemType } from "./i18n";
import type { FileOrFolderMixedState } from "./baseTypes";
import { log } from "./moreOnLog";
export class SizesConflictModal extends Modal {
readonly plugin: RemotelySavePlugin;
readonly skipSizeLargerThan: number;
readonly sizesGoWrong: FileOrFolderMixedState[];
readonly hasPassword: boolean;
constructor(
app: App,
plugin: RemotelySavePlugin,
skipSizeLargerThan: number,
sizesGoWrong: FileOrFolderMixedState[],
hasPassword: boolean
) {
super(app);
this.plugin = plugin;
this.skipSizeLargerThan = skipSizeLargerThan;
this.sizesGoWrong = sizesGoWrong;
this.hasPassword = hasPassword;
}
onOpen() {
let { contentEl } = this;
const t = (x: TransItemType, vars?: any) => {
return this.plugin.i18n.t(x, vars);
};
contentEl.createEl("h2", {
text: t("modal_sizesconflict_title"),
});
t("modal_sizesconflict_desc", {
thresholdMB: `${this.skipSizeLargerThan / 1000 / 1000}`,
thresholdBytes: `${this.skipSizeLargerThan}`,
})
.split("\n")
.forEach((val) => {
contentEl.createEl("p", { text: val });
});
const info = this.serialize();
contentEl.createDiv().createEl(
"button",
{
text: t("modal_sizesconflict_copybutton"),
},
(el) => {
el.onclick = async () => {
await navigator.clipboard.writeText(info);
new Notice(t("modal_sizesconflict_copynotice"));
};
}
);
contentEl.createEl("pre", {
text: info,
});
}
serialize() {
return this.sizesGoWrong
.map((x) => {
return [
x.key,
this.hasPassword
? `encrypted name: ${x.remoteEncryptedKey}`
: undefined,
`local ${this.hasPassword ? "encrypted " : ""}bytes: ${
this.hasPassword ? x.sizeLocalEnc : x.sizeLocal
}`,
`remote ${this.hasPassword ? "encrypted " : ""}bytes: ${
this.hasPassword ? x.sizeRemoteEnc : x.sizeRemote
}`,
]
.filter((tmp) => tmp !== undefined)
.join("\n");
})
.join("\n\n");
}
onClose() {
let { contentEl } = this;
contentEl.empty();
}
}

6
src/worker.d.ts vendored Normal file
View File

@ -0,0 +1,6 @@
declare module "*.worker.ts" {
class WebpackWorker extends Worker {
constructor();
}
export default WebpackWorker;
}
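This ambient declaration lets imports handled by webpack's worker-loader type-check as Worker subclasses. Usage might look like the following, with an illustrative file name matching the "*.worker.ts" pattern and the loader rule added to webpack.config.js later in this diff:

// The specifier matches the "*.worker.ts" ambient declaration above.
import SyncWorker from "./example.worker.ts";

const worker = new SyncWorker();
worker.onmessage = (ev: MessageEvent) => {
  console.info("worker replied:", ev.data);
};
worker.postMessage({ task: "ping" });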

View File

@ -8,6 +8,10 @@
font-weight: bold;
}
.encryptionmethod-second-confirm {
font-weight: bold;
}
.settings-auth-related {
border-top: 1px solid var(--background-modifier-border);
padding-top: 18px;

View File

@ -10,13 +10,13 @@ import {
encryptStringToBase64url,
getSizeFromEncToOrig,
getSizeFromOrigToEnc,
} from "../src/encrypt";
} from "../src/encryptOpenSSL";
import { base64ToBase64url, bufferToArrayBuffer } from "../src/misc";
chai.use(chaiAsPromised);
const expect = chai.expect;
describe("Encryption tests", () => {
describe("Encryption OpenSSL tests", () => {
beforeEach(function () {
global.window = {
crypto: require("crypto").webcrypto,

View File

@ -285,3 +285,56 @@ describe("Misc: special char for dir", () => {
expect(misc.checkHasSpecialCharForDir("xxx?yyy")).to.be.true;
});
});
describe("Misc: Dropbox: should fix the folder name cases", () => {
it("should do nothing on empty folders", () => {
const input: any[] = [];
expect(misc.fixEntityListCasesInplace(input)).to.be.empty;
});
it("should sort folders by length by side effect", () => {
const input = [
{ keyRaw: "aaaa/" },
{ keyRaw: "bbb/" },
{ keyRaw: "c/" },
{ keyRaw: "dd/" },
];
const output = [
{ keyRaw: "c/" },
{ keyRaw: "dd/" },
{ keyRaw: "bbb/" },
{ keyRaw: "aaaa/" },
];
expect(misc.fixEntityListCasesInplace(input)).to.deep.equal(output);
});
it("should fix folder names", () => {
const input = [
{ keyRaw: "AAA/" },
{ keyRaw: "aaa/bbb/CCC.md" },
{ keyRaw: "aaa/BBB/" },
{ keyRaw: "ddd/" },
{ keyRaw: "DDD/EEE/fff.md" },
{ keyRaw: "DDD/eee/" },
{ keyRaw: "Ggg/" },
{ keyRaw: "ggG/hHH你好/Fff世界.md" },
{ keyRaw: "ggG/Hhh你好/" },
];
const output = [
{ keyRaw: "AAA/" },
{ keyRaw: "ddd/" },
{ keyRaw: "Ggg/" },
{ keyRaw: "AAA/BBB/" },
{ keyRaw: "ddd/eee/" },
{ keyRaw: "Ggg/Hhh你好/" },
{ keyRaw: "AAA/BBB/CCC.md" },
{ keyRaw: "ddd/eee/fff.md" },
{ keyRaw: "Ggg/Hhh你好/Fff世界.md" },
];
expect(misc.fixEntityListCasesInplace(input)).to.deep.equal(output);
});
});
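These expectations suggest the algorithm: Dropbox reports paths with inconsistent casing, so entries are sorted shortest-first (parents before children, as the second test requires, relying on the stable sort) and each path's parent prefix is rewritten to the first casing seen for that folder. A sketch that satisfies the tests above, as an illustration rather than the plugin's implementation:

function fixCasesSketch(entities: { keyRaw: string }[]): { keyRaw: string }[] {
  // Shorter paths first, so every parent folder is fixed before its children.
  entities.sort((a, b) => a.keyRaw.length - b.keyRaw.length);
  const canonical = new Map<string, string>(); // lowercased folder -> canonical
  for (const e of entities) {
    const isFolder = e.keyRaw.endsWith("/");
    const parts = e.keyRaw.split("/");
    // For a folder "AAA/BBB/", parts is ["AAA", "BBB", ""].
    const parentParts = isFolder ? parts.slice(0, -2) : parts.slice(0, -1);
    let prefix = "";
    for (const p of parentParts) {
      const lower = (prefix + p + "/").toLowerCase();
      prefix = canonical.get(lower) ?? prefix + p + "/";
    }
    const leaf = isFolder ? parts[parts.length - 2] + "/" : parts[parts.length - 1];
    e.keyRaw = prefix + leaf;
    if (isFolder) {
      canonical.set(e.keyRaw.toLowerCase(), e.keyRaw);
    }
  }
  return entities;
}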

View File

@ -14,7 +14,7 @@
"esModuleInterop": true,
"importHelpers": true,
"isolatedModules": true,
"lib": ["dom", "es5", "scripthost", "es2015"]
"lib": ["dom", "es5", "scripthost", "es2015", "webworker"]
},
"include": ["**/*.ts"]
}
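The added "webworker" lib makes worker-global types available to the compiler alongside "dom". A minimal worker body that now type-checks, illustrative and assuming workers are used to move heavy work off the UI thread:

// Illustrative worker body (e.g. in a file named example.worker.ts).
const ctx = self as unknown as DedicatedWorkerGlobalScope;
ctx.onmessage = (ev: MessageEvent) => {
  // Echo back whatever the main thread sent; a real worker would do the
  // heavy lifting here instead.
  ctx.postMessage(`pong: ${JSON.stringify(ev.data)}`);
};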

View File

@ -32,6 +32,13 @@ module.exports = {
],
module: {
rules: [
{
test: /\.worker\.ts$/,
loader: "worker-loader",
options: {
inline: "no-fallback",
},
},
{
test: /\.tsx?$/,
use: "ts-loader",