Data de-duplication, file diffing and S3-style object storage using Digital Ocean Spaces

I have finally added Amazon S3 storage support to my program, hashtree.

After messing around with broken Perl libraries, I finally broke down and wrote two Python scripts that upload files to my specified bucket, with the access keys hard-coded into the source code.
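For anyone curious what such a script boils down to, here is a minimal sketch using boto3. The bucket name and credential placeholders are just examples; hashtree's actual upload scripts may be structured differently.

```python
import sys
import boto3

# Placeholder credentials and bucket name -- hard-coded, as in the real scripts.
ACCESS_KEY = "REPLACE_ME"
SECRET_KEY = "REPLACE_ME"
BUCKET = "my-backup-bucket"

def upload_file(path, key):
    """Upload a local file to the bucket under the given object key."""
    s3 = boto3.client(
        "s3",
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY,
    )
    s3.upload_file(path, BUCKET, key)

if __name__ == "__main__":
    # Usage: python upload.py <local-path> <object-key>
    upload_file(sys.argv[1], sys.argv[2])
```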

The result is that I now have data de-duplication, remote backups, replication across all my workstations, and potentially versioning, if I can be bothered to add it.
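One common way to get this kind of de-duplication is to address objects by a hash of their contents: identical files on any workstation map to the same key, so each unique file only needs to be uploaded and stored once. A rough illustration of the idea in Python, not necessarily how hashtree implements it:

```python
import hashlib

def content_key(path, chunk_size=1 << 20):
    """Return a SHA-256 hex digest of the file's contents to use as its object key."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Two workstations hashing the same file get the same key,
# so the second upload can be skipped entirely.
```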

I was using Amazon S3, but it is too expensive. I have a FreeBSD droplet with Digital Ocean, so I chose to use their “Spaces” product, which has an S3-compatible API.
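Because Spaces speaks the S3 protocol, pointing an existing S3 client at it is mostly a matter of changing the endpoint. A sketch with boto3, where the region, keys and bucket name are example values rather than mine:

```python
import boto3

# Point boto3 at a DigitalOcean Spaces endpoint instead of AWS S3.
session = boto3.session.Session()
client = session.client(
    "s3",
    region_name="sgp1",
    endpoint_url="https://sgp1.digitaloceanspaces.com",
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

# The rest of the S3 API works unchanged against Spaces.
client.upload_file("backup.tar.gz", "my-space", "backup.tar.gz")
```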

I get 250GB of storage a month for $5. That’s a good deal.

Feel free to use my program. It’s BSD licensed and available here:

https://github.com/wilyarti/hashtree

Also, if someone could get the S3 support working in Perl, that would be great!

[Screenshots from 2017-12-08: 17-08-00, 17-08-26, 17-45-09, 17-46-08]
