Afternoon,
As many have probably already heard, CrashPlan is no longer going to cater to the non-business market. They are recommending Carbonite. That is a shame. CrashPlan had several advantages:

1 - They were Linux friendly.
2 - I could back up a NAS with little issue.
3 - Although I did not use this feature, it allowed friends to back up to each other.

My current setup at home is as follows: a NAS drive (Thecus) and a Linux VM with the Thecus mounted via NFS. The CrashPlan client was loaded on the Linux VM, and anytime changes were made on the NAS, the client would send them up to the cloud. My pictures were safe, with both local backup and offsite backup.

So, I have a couple of months before I have to worry about this, but best to start the research. I would like some recommendations.

I have about 200 GB of data. These are my photos, primarily JPEG with some RAW thrown in. I add about 50 GB per year. They are stored unencrypted. I also have about 100 MB of personal files; anything with important information, I encrypt. Nothing exciting here.

Commercial Options: I was paying about 60 bucks a year. I think this was a reasonable price to pay for backup. I could consider up to around 100 a year.
Open Source Solutions: Here is the deal, I could just go old school and rsync the data to my datacenter. However, we all know that unless one is paying attention, the backups will not get monitored or run often enough. I do have access to my own data center, so I could fire up a VM, and I have the space for this. The important thing is to have some type of reporting to tell me when backups have stopped, as well as versioning. I have looked at ownCloud and Duplicati. I think Duplicati is where I should aim if I go open source and do this myself.

Any thoughts?

John
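P.S. If I do end up going the plain rsync route, here is roughly what I have in mind. This is just a sketch with made-up host names, paths, and addresses: a nightly cron job on the Linux VM that pushes to a box in the datacenter, uses hard-link snapshots for cheap versioning, and only mails me when something fails.

    #!/bin/sh
    # Nightly snapshot backup. --link-dest hard-links unchanged files against
    # the previous run, so each dated directory is a full, browsable version.
    SRC=/mnt/thecus/photos
    HOST=backup@dc.example.com
    BASE=/backups/photos
    TODAY=$(date +%F)

    rsync -a --delete --link-dest=../latest "$SRC/" "$HOST:$BASE/$TODAY/" \
      && ssh "$HOST" "ln -sfn $TODAY $BASE/latest" \
      || echo "photo backup failed on $(hostname)" | mail -s "backup FAILED" me@example.com

Touching a timestamp file on success and having a separate check complain when it goes stale would cover the "tell me when backups have stopped" part.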
I currently pay for the family plan, which is around $15 a month for 10 computers with unlimited data in their cloud. Moving to their business offering will make that $10 per machine. I don't even have 10 machines on my plan currently, so I will probably end up staying with CrashPlan and just move my account to a business account and pay the $10 a month per machine. That being said, it's $10 a month for 12 months, then it goes up to whatever the business tier costs, but I would imagine it would be around the $100 you are willing to pay for the one machine you back up, still with unlimited data.
--------------
Nick Smith nick at nicksmith dot us
I like Duplicati, but it has a very hard time with large datasets. And with Amazon Cloud Drive no longer offering "unlimited" storage, your "cloud" options are a bit limited for Duplicati. I would recommend SpiderOak if you want something to keep your important files in. It's not a full-fledged copy-entire-disk type of backup, but it will do versioning and file sync. The client works on just about every platform out there, and their pricing isn't too bad. It's the default backup client that Red Hat uses on all of our work-provided laptops. I like it enough that I am considering buying it for my own personal use.
One suggestion that may be worth looking into is Tarsnap (http://www.tarsnap.com/). They support a wide range of UNIX-like OSes including Linux, BSD, macOS, and even Windows with Cygwin (they may support the Windows 10 Linux subsystem, or whatever they call it; I haven't looked into this). For your list of requirements, the only thing Tarsnap does not do is report when backups have stopped, since the client is basically an extension of the tar command. What you do get is data encryption (all encryption and decryption happens on your local machine), and each backup is a differential tar archive, so it only stores the changes since the last backup to save cost. That also makes it possible to roll back one or more files, or recover up to a certain point. Cost-wise, between two machines I have 4.3 TB of data stored (1 TB after compression); due to how they do billing (they operate as a prepaid service) I'm only charged for the 71 GB of actual unique data daily, and since both machines only do daily backups to Tarsnap, I only pay $20 every couple of months.
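If it helps, day-to-day use looks roughly like this. The archive names and paths below are made up, and it assumes you have already registered the machine and generated a key with tarsnap-keygen:

    # Create today's archive; tarsnap only uploads blocks it has not seen before.
    tarsnap -c -f "photos-$(date +%F)" /mnt/thecus/photos

    # See what is stored, and delete an old archive once you no longer need it.
    tarsnap --list-archives
    tarsnap -d -f photos-2017-07-01

A couple of lines like that in cron is the whole setup, which is also why there is no built-in reporting: you bring your own.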
There is a similar thread going on over at Slashdot... Here is one of the comments:

"I'd recommend switching everything over to /dev/null as soon as possible. Any of your Linux friends should be able to set this up for you in under an hour. I hear your backups to /dev/null have great throughput and won't impact your other network activity at all."
There is even a nice service for all your /dev/null needs, and their pricing isn't bad (https://devnull-as-a-service.com/pricing/) until you hit their Enterprise level :)
So, since I'm using AWS more and more for work, I have been trying to figure out the right strategy to use the AWS CLI to copy files up to S3, and then set up S3 rules to move those files to Glacier. I think this is mostly just a matter of figuring out the right strategies for where to put files...
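Roughly what I have in mind, as a sketch only, with a made-up bucket name and prefix; the lifecycle rule is what ages objects out to Glacier:

    # Push the photo tree to S3; only new or changed files get uploaded.
    aws s3 sync /mnt/thecus/photos s3://my-backup-bucket/photos/

    # lifecycle.json moves anything under photos/ to Glacier after 30 days:
    # {"Rules": [{"ID": "photos-to-glacier", "Status": "Enabled",
    #             "Filter": {"Prefix": "photos/"},
    #             "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}]}]}
    aws s3api put-bucket-lifecycle-configuration \
        --bucket my-backup-bucket --lifecycle-configuration file://lifecycle.json

Thirty days is arbitrary; files you rarely touch could age out sooner for cheaper storage, at the price of slow retrieval.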
I use Backblaze at work. I know it will back up a Synology NAS. Other integrations are here:
-Russell
Shoot, I use CrashPlan's peer-to-peer backups to me for various friends that can't really afford cloud backup solutions. Now they're kind of SoL....
It kind of sucks. Guess it was not as high of a profit margin as they thought. I also suspect a lot of people abused the service by backing up TBs of pirated content. I guess I just need to do this myself. I am looking at several solutions such as ownCloud/Duplicati that may fulfill my needs. Hell, guess I could use good old rsync.

John
Hmm, I could probably set people up with OwnCloud… no way to encrypt their data from me though is there?