* s3thorp
Synchronisation of files with S3 using the hash of the file contents.
Based on Alex Kudlick's JavaScript implementation [[https://github.com/akud/aws-s3-sync-by-hash][aws-s3-sync-by-hash]].
The standard ~aws s3 sync ...~ command uses only the time stamp of files
to decide which files need to be copied. This utility looks at the MD5
hash of the file contents.
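The idea can be sketched with standard tools: compare the MD5 of the
local file's contents with the ETag S3 stores for the object. (For
simple, non-multipart uploads the ETag is the content MD5; the file,
bucket, and key names below are placeholders.)

#+begin_src bash
# MD5 of the local file's contents (placeholder filename)
md5sum my-photo.jpg | cut -d ' ' -f 1

# ETag S3 holds for the matching key; equals the content MD5 only for
# simple, non-multipart uploads (placeholder bucket and key)
aws s3api head-object --bucket my-bucket --key photos/my-photo.jpg \
    --query ETag --output text | tr -d '"'
#+end_src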
* Usage
#+begin_example
s3thorp
Usage: S3Thorp [options]
-s, --source <value> Source directory to sync to S3
-b, --bucket <value> S3 bucket name
-p, --prefix <value> Prefix within the S3 Bucket
#+end_example
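A hypothetical invocation putting the options above together (the
directory and bucket names are made up):

#+begin_src bash
s3thorp -s "$HOME/Pictures" -b my-backup-bucket -p photos
#+end_src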
* Creating Native Images
- Download and install GraalVM
- https://github.com/oracle/graal/releases
- Install ~native-image~ using the graal updater
#+begin_src bash
gu install native-image
#+end_src
- Create native image
#+begin_src bash
native-image -cp `sbt 'export runtime:fullClasspath'|tail -n 1` \
-H:Name=s3thorp \
-H:Class=net.kemitix.s3thorp.Main \
--allow-incomplete-classpath \
--force-fallback
#+end_src
- The resulting binary is a fallback image (due to ~--force-fallback~)
  and still requires a JDK for execution
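The steps above can be collected into one script (a sketch; it assumes
~sbt~ and a GraalVM installation with ~gu~ are already on the ~PATH~):

#+begin_src bash
#!/bin/sh
set -e

# install native-image if missing (no-op when already installed)
gu install native-image || true

# sbt prints the runtime classpath as the last line of its output
CLASSPATH=$(sbt 'export runtime:fullClasspath' | tail -n 1)

native-image -cp "$CLASSPATH" \
    -H:Name=s3thorp \
    -H:Class=net.kemitix.s3thorp.Main \
    --allow-incomplete-classpath \
    --force-fallback
#+end_src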
* TO DO
- [X] Improve test coverage
- [X] Create os-native binaries
- [X] Replace println with real logging
- [ ] Add support for logging options
- [ ] Add support for exclusion filters
- [ ] Bulk fetching of Hash values from S3
- [ ] When ~lastModified~ matches the local file, skip calculating the
  local MD5?
- [ ] Add support for multi-part uploads for large files
- [ ] Add support for upload progress - may only be available with
multi-part uploads
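The "bulk fetching" item above could, for example, build on
~list-objects-v2~, which returns keys and ETags for many objects in a
single call (the bucket and prefix are placeholders); the local side can
hash files in bulk the same way:

#+begin_src bash
# remote side: keys and ETags for a whole prefix in one request
aws s3api list-objects-v2 --bucket my-bucket --prefix photos/ \
    --query 'Contents[].[Key,ETag]' --output text

# local side: hash every file under a directory in one pass
find ~/Pictures -type f -exec md5sum {} +
#+end_src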