Forum: The Break Room


Subject: Storage for Project/Asset backup?

TwoCatsYelling opened this thread on Mar 05, 2020 · 6 posts


TwoCatsYelling posted Fri, 06 March 2020 at 8:07 PM

Wow. That's quite the setup. Sounds like it was fun to set up.

I'm thinking of getting a personal NAS. Just not the WD "MyCloud". Tried that before, and it was awful to work with. Looks like there are some better options out there.

thoennes posted at 8:06PM Fri, 06 March 2020 - #4382705

My storage setup is:

2 development systems. Same OS and software stack on both: macOS, using brew to keep things up to date. One is a laptop and the other a desktop. Both are SSDs, so there's less chance of failure than with a spinning HD. I don't RAID locally.

  1. I use rsync to keep my content and app libs in sync. The content sync also de-dupes within runtimes to hardlinks (using jdupes); there's a sketch of this after the list. brew maintains a Brewfile that can be used to restore my complete software stack on a reinstall (or on a new computer). The rsync gets me a kind of lazy "raid", as far as I need one. I can move back and forth between the machines with the same software setup.

  2. I only keep the runtimes I'm using on either computer. All runtimes are also sync'd to the local and remote servers, so I can clone a runtime back if I need it. I split them off by character generation and/or category. It's unlikely that I'd use PE and toon at the same time :)

  3. I use git to maintain all my projects. Each project is its own local git repo. This works well for Poser, since its files are text. I have a remote repo on my local server and a remote repo on a cloud VM (both servers run Ubuntu). So not only are all my projects in source control, there are also two remote repos, one of them off-site (sketch after the list). I can delete any local repo I'm not using and clone it back from either remote. I have a non-git "working/play/experiment" directory that I keep in sync between the two machines with rsync.

  4. For binary (image) assets I have a sparse disk image that's stored in slices, which makes it much easier for rsync to deal with (sketch after the list). It's also synced between machines and servers, local and remote.
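
Roughly, the sync/de-dupe from item 1 boils down to something like this. The paths and hostname are placeholders, but rsync, jdupes, and brew bundle are the actual tools:

    #!/bin/bash
    # content-sync sketch -- paths and hostname are illustrative placeholders
    set -euo pipefail

    RUNTIMES="$HOME/PoserContent/Runtimes"
    OTHER="desktop.local"

    # De-dupe first: replace duplicate files inside the runtimes with
    # hardlinks (-L), recursing into subdirectories (-r).
    jdupes -rL "$RUNTIMES"

    # Then mirror to the other machine. -a preserves modes/times, -H
    # carries the hardlinks across, --delete keeps both sides identical.
    rsync -aH --delete "$RUNTIMES/" "$OTHER:PoserContent/Runtimes/"

    # Snapshot the Homebrew stack; "brew bundle install" replays it on a
    # fresh machine.
    brew bundle dump --force --file="$HOME/Brewfile"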
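
The two-remote git setup in item 3 is just an ordinary repo with two remotes added; something like this, with placeholder hostnames, a "main" branch assumed, and bare repos already created on both servers:

    # one-time setup per project (hostnames are placeholders)
    cd ~/projects/my-scene
    git init
    git remote add homeserver user@homeserver.local:/srv/git/my-scene.git
    git remote add cloud user@cloudvm.example.com:/srv/git/my-scene.git

    # push to both so there's always an on-site and an off-site copy
    git push homeserver main
    git push cloud main

    # a deleted local repo can be restored from either remote later
    git clone user@homeserver.local:/srv/git/my-scene.git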
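
The sliced image in item 4 is the kind of thing a macOS sparse bundle gives you; hdiutil can create one (the name and size here are invented). Because the bundle is stored on disk as many small band files, rsync only has to copy the bands that changed:

    # create a sliced sparse image for binary assets
    hdiutil create -type SPARSEBUNDLE -size 500g -fs APFS \
        -volname Assets "$HOME/Assets.sparsebundle"

    # mount it, use it like any volume, then eject
    hdiutil attach "$HOME/Assets.sparsebundle"
    hdiutil detach /Volumes/Assets

    # sync only the changed bands to a server
    rsync -a --delete "$HOME/Assets.sparsebundle/" \
        "server:backups/Assets.sparsebundle/"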

There are some scripts that handle the mounting, de-duping, unmounting, syncing, etc., so I just run "work-sync" or "content-sync -a". I also have some utility scripts, like one that creates a custom folder icon embedded with a cover image for the project: usually the last image in the project's "renders" directory, or the "images" directory, or whatever is the last image it can find. Nice to look at a directory of folder icons and have a visual as to wtf is in there :D
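
A minimal sketch of that folder-icon trick, assuming the third-party "fileicon" utility (brew install fileicon) and a made-up directory layout:

    #!/bin/bash
    # stamp a project folder with its most recent render as the icon
    project="$1"

    # newest PNG in renders/, falling back to images/ (ls -t sorts
    # newest first; errors from missing directories are discarded)
    cover=$(ls -t "$project"/renders/*.png "$project"/images/*.png \
            2>/dev/null | head -n 1)

    # set the folder's custom icon if a cover image was found
    [ -n "$cover" ] && fileicon set "$project" "$cover"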

The local server is Ubuntu, AMD in a tower case, with 12x 4TB drives: 3 as a 3-way mirror (double redundant) for home directories, and the rest as RAID6 plus a hot spare. Essentially, I have to lose all 3 drives in the mirror, or 3 drives in the RAID6 set at once, before I lose data. And the critical (git) data is still on the remote. Just not the runtimes.
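
For the curious, here's one way a layout like that could be built with Linux mdadm. This is a sketch only; the device names are invented, and the real arrays might just as well be ZFS or something else:

    # 3-way mirror for /home: survives losing any 2 of its 3 drives
    mdadm --create /dev/md0 --level=1 --raid-devices=3 \
        /dev/sdb /dev/sdc /dev/sdd

    # remaining 9 drives: 8-drive RAID6 (2-drive redundancy, ~24TB usable
    # from 4TB drives) plus 1 hot spare that rebuilds automatically
    mdadm --create /dev/md1 --level=6 --raid-devices=8 --spare-devices=1 \
        /dev/sde /dev/sdf /dev/sdg /dev/sdh /dev/sdi /dev/sdj \
        /dev/sdk /dev/sdl /dev/sdm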

When I got a new computer (replaced laptop which crashed and burned) it was pretty trivial to bring the new one back up with everything.

In my case, I do a lot of other stuff on that Ubuntu server. If I didn't, I'd use an external 3-bay JBOD enclosure with 3x 12TB drives as a 3-way mirror. That would be 12TB, double redundant. The cost of the redundant drives is nothing compared to the cost of reinstalling and rebuilding everything.

As my dad used to say... there are two types of people: Those who have lost data and those who will lose data.

I'm both. I've blown days (probably weeks) of my life recovering from system failures over the years. No more. Let the record show, I eventually learn.