Had a monster coding weekend to finish my game ... and then lost it all this morning.

Discussion in 'General Discussion' started by Rajmahal, Jul 27, 2015.

  1. Dustin-Horne

    Dustin-Horne

    Joined:
    Apr 4, 2013
    Posts:
    4,568
    If you don't like git, use Visual Studio Online (TFS). It's free for up to 5 contributors on your account (more with MSDN), has unlimited space, and it's not distributed. It's super easy to use, and Visual Studio Community edition is awesome and free as well.
     
  2. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    If you're basically just backing stuff up for yourself, git -> Dropbox is awesome.

    Stupidly easy to set up, fast, easy to work with.

    The nice thing about Dropbox is that it's insanely useful for other things as well. You're kind of just getting cloud version control as a cherry on top.
     
  3. Dustin-Horne

    Dustin-Horne

    Joined:
    Apr 4, 2013
    Posts:
    4,568
    But Dropbox itself isn't really version control. Sure, you can create zip files with dates as names, but you miss out on the goodness: you don't get history, rollback capabilities, or merge capabilities. One of the most useful features IMHO is branching. If you want to try something and you're not sure it's going to work, create a branch. Play with an entirely different mechanic or subsystem. If it works, merge it in; if not, blow it away.
     
  4. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    To clarify, I mean set up a git repo in a Dropbox folder.

    git commit (project local)
    git push -> Dropbox folder
    (Dropbox syncs the repo in the background)

    There are a bunch of other configurations you can set up, but this one is clean and easy to work with.
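
    Roughly, the one-time setup is just this (a minimal sketch; the repo and folder names are examples, nothing special):

    # bare repo inside Dropbox acts as the remote (names/paths are examples)
    cd ~/Dropbox
    git init --bare MyGame.git

    # point the working project at it
    cd ~/projects/MyGame
    git init
    git add -A
    git commit -m "initial commit"
    git remote add origin ~/Dropbox/MyGame.git
    git push -u origin master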

    Zip files and stuff are really pointless in this day and age. If absolutely nothing else, having revision history on a given file is super critical for those derp moments.
     
  5. Rajmahal

    Rajmahal

    Joined:
    Apr 20, 2011
    Posts:
    2,101
    Hey guys, thanks for the info. I'll look into source control. I didn't know about it, and I'm still not entirely sure what it is from this thread alone, but I'll check Google and educate myself a bit.

    A quick update, I've been able to recover everything from the backup and have redone about 60% of the work I lost. Should be caught up to where I was in a couple days. Lesson learned the hard way.

    As I live in Canada, I have strict monthly bandwidth limits on my internet usage. I used to use Microsoft OneDrive to back up my project files, but the constant updates were pushing the limits of my monthly internet usage. I have around a 25GB project folder that I'd like to have multiple, iterative, automatic backups of. Is a cloud solution the best bet given limited internet bandwidth? Are there programs that just scan for changes to a local folder and then update the cloud version with only the changes? Do those keep iterative versions so I could go back a few days if needed and undo a bad change?
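
    (On that last question: rsync is one classic answer. It scans a folder for changes and transfers only the deltas, and with --link-dest each run becomes a dated snapshot that hard-links unchanged files against the previous one, so you can browse back a few days at almost no extra cost in space or bandwidth. A minimal sketch; the host and paths are purely illustrative:)

    TODAY=$(date +%F)                        # e.g. 2015-07-29
    rsync -a --delete \
        --link-dest=/backups/MyGame/latest \
        ~/UnityProjects/MyGame/ \
        backuphost:/backups/MyGame/$TODAY/   # host/paths are illustrative
    # re-point 'latest' so the next run links against this snapshot
    ssh backuphost "ln -sfn /backups/MyGame/$TODAY /backups/MyGame/latest"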
     
    Ryiah and Dustin-Horne like this.
  6. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    When you have to work with other people, or you want to work somewhere that's not just a hobby shop, you *will* have to use a version control system.

    Even if you do freelance work and don't care about ever having to work with other people, you'll come across clients who won't touch you if you don't use version control and your attitude is willfully opposed to the idea.

    Version control isn't new and it's not a trend. It's a working practice, and it endures not because someone's trying to brainwash the world, but because it improves the quality of life for people who develop complex software, whether professionally or not.

    Insisting on anything contrary to common wisdom (that VCS is good, zip files are bad) is, in an ironic twist, Borg-like. :)

    Nice one! :)
     
    Dustin-Horne and tiggus like this.
  7. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    @anselmo.fresquez

    Because there are people who don't know any better, read forum threads, and see the loudest voices as the most influential. These tend to be the ones who don't know any better, are easily swayed, or just have a like-minded attitude of going "against the system" (wherever conspiracy theorists see systems).

    The irony is that by fighting such systems, you start one of your own, and assimilate a subsection of readers to your own cause.
     
    Master-Frog likes this.
  8. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    Version control only uploads what has changed. It detects the changes automatically, so each push transfers just the differences. That's much better than compressing the whole project and uploading it regularly, and it's also much faster and needs far less bandwidth.

    The biggest upload is the initial commit, which is your whole project.
    One thing to note is that (using Git) your project folder will grow, by almost double in some cases.
    You can also export the project without any of the Git metadata if you really want to (I usually do that at the end of the week and save it to my external HDD. I probably shouldn't have to, but I like to have a clean project saved outside of my PC).
    Online services usually don't support big projects (Bitbucket is 2GB max, GitHub is also something like that, etc.), so I used Visual Studio Online, which is free and lets you have unlimited-size private repositories, which is fantastic.
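
    (A side note on the size and bandwidth points: most of a Unity project's bulk sits in folders Unity regenerates from Assets/ and ProjectSettings/, so a .gitignore along these lines keeps both the repo and the uploads much smaller. A typical sketch only; exact entries vary by Unity version:)

    # typical Unity .gitignore - regenerated folders stay out of the repo
    Library/
    Temp/
    Obj/
    Build/
    # IDE files that get recreated anyway
    .vs/
    *.csproj
    *.sln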

    I was also skeptical about using version control during 3 years of development (I even lost 2 months of progress 2-3 years ago).
    I've been using it for less than a year now and I'm very happy that I decided to adopt it. :)
     
    Ryiah, Dustin-Horne and Master-Frog like this.
  9. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    OK, you're right, it's all too weird. I have better things to do, as this is becoming borderline trolling.
     
  10. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    Dude. It's because version control is actually that good.

    It's really rare in computers that something is so clearly and entirely superior. Git vs Zip files is one of those rare times when something is unequivocally better.
     
  11. tiggus

    tiggus

    Joined:
    Sep 2, 2010
    Posts:
    1,240
     
  12. Dustin-Horne

    Dustin-Horne

    Joined:
    Apr 4, 2013
    Posts:
    4,568
    Aaaaand... you can add up to 5 additional users free of charge as well. And if any of them have MSDN subscriptions, they don't count toward your limit of 5. :) So 5 non-MSDN users, and unlimited if you add MSDN users.
     
    Ryiah and Master-Frog like this.
  13. Taz-dragon

    Taz-dragon

    Joined:
    May 21, 2015
    Posts:
    38
     
    Master-Frog likes this.
  14. aer0ace

    aer0ace

    Joined:
    May 11, 2012
    Posts:
    1,511
    Yeah, this VCS discussion is kind of going like this. Americans like to eat some pretty tasty but unhealthy food. People say, "Oh, in order to be healthy you have to eat plenty of salads," and then when you start to eat a salad and put ranch dressing on it, someone comes along and says, "Oh, putting ranch dressing on your salad isn't healthy, it's full of fat." Well, eating a salad with ranch dressing is a lot better than eating no salad at all. And others say an apple a day will keep the doctor away; that's like making zip files of your project and storing them somewhere. So yeah, it's highly recommended that you introduce some form of VCS into your development practice, whether it's Perforce, SVN, Git, Mercurial, in the cloud or not... etc.

    Anyway, I'm glad @Rajmahal had a chance to respond and see the discussion he caused, because it looks like it's starting to get out of hand.
    I'm wondering when this will start to dry:
     
  15. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    What are you going to do if an asteroid hits the planet and wipes out all life? Huh? HUH? Not as prepared as you thought, are you? If you don't have off-planet backup, you're just asking for it.

    (But let's not be mean, folks. What I did up there, don't do that. Yeah.)

    --Eric
     
    Ryiah, Kiwasi, schmosef and 1 other person like this.
  16. Master-Frog

    Master-Frog

    Joined:
    Jun 22, 2015
    Posts:
    2,302
    Well, obviously we won't be around to worry about it.

    Bazinga.jpg
     
  17. Master-Frog

    Master-Frog

    Joined:
    Jun 22, 2015
    Posts:
    2,302
    Also, once that meteor strikes, the thread locker will wear off, since according to this website it melts at a localized temperature of only 550° F.

    the more you know.jpg
     
  18. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    If you have sufficient funds you can use version control and have the data stored off-planet on a satellite or a Voyager-type probe.

    Unfortunately you are still not protected against the sun going supernova or the solar system falling into a black hole.
     
    Ryiah and Master-Frog like this.
  19. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    Did I say the data should be stored in this solar system? :rolleyes: That's no better than backing up to a different partition on the same hard drive. Man, I have to spell out every little thing. (And yes, I'm aware that extrasolar latency is a tad high. That's something you just need to deal with.)

    --Eric
     
  20. GarBenjamin

    GarBenjamin

    Joined:
    Dec 26, 2013
    Posts:
    7,441
    LOL Wow! You folks have lost your minds a wee bit. While I have nothing against VCS (we use git and TortoiseSVN at work for different teams, and TSVN at home), at least the non-VCS users are backing up their work. That's the most important thing. Heck, I haven't even set up TortoiseSVN on my new laptop yet. That doesn't mean I am not doing backups. I have an external hard drive for that purpose. And (gasp) I am zipping up the folders too.

    I do agree svn is cool, being able to store only the changes with each commit along with your check-in notes. But still, if my new laptop went belly up tonight, I'd be in the same recovery scenario restoring from my zipped-up folders on my external hard drive as I would be if I were back to using svn. Remember... I am not putting VCS down, just saying give the folks some credit for actually making backups of any kind. There are still people out there who never back up anything.
     
    Kiwasi likes this.
  21. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
    Extrasolar latency won't be a problem so long as you treat the satellite storage as a last resort. If anything, it would finally give you a reason to use extreme compression and encryption algorithms, as retrieving the data would take longer than running them.
     
    Kiwasi likes this.
  22. Master-Frog

    Master-Frog

    Joined:
    Jun 22, 2015
    Posts:
    2,302
    Run away, while you still have the strength...
     
  23. kanga

    kanga

    Joined:
    Mar 22, 2010
    Posts:
    225
    Why don't you work straight to an external drive? Every time I haven't followed my own advice, I've lost important stuff. Never again!
     
    Master-Frog likes this.
  24. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,516
    How would that help at all?
     
  25. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    Working straight to an external drive still leaves you open to single-point failure. What if the external drive fails?

    The point of any backup system is redundancy. No single hardware failure should destroy your data.
     
    angrypenguin likes this.
  26. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    Thinking about it, I'd actually be curious to see a large corporation's data management risk plan.

    We build failure analysis stuff a lot for safety cases in engineering. It's nice to know exactly how many failures must occur, and where, before a chemical plant blows up and kills everything within a hundred miles.
     
    Master-Frog likes this.
  27. GarBenjamin

    GarBenjamin

    Joined:
    Dec 26, 2013
    Posts:
    7,441
    This is how my company handles it. We check into the svn and git servers. Those are replicated to backup servers daily, I think. Once per week a full backup is performed, and this is stored off-site in the trust of a data security company. Of course, it is not limited to just our development work; basically all data. I'd guess most companies in IT (of a certain size, at least) use a similar strategy.
     
    Kiwasi likes this.
  28. kanga

    kanga

    Joined:
    Mar 22, 2010
    Posts:
    225
    The only things you should have on your computer are the OS and applications. Disks will crash. I have had total computer failure and been back in business, new disk in and everything reinstalled, in about an hour. This is actually the most basic rule of formatting. The only time you store work files on your hard drive is if the applications you are using refuse to work properly with your particular configuration, for example in my case Flash and Quixel. Then you save to root and back up to external afterwards if the files are important. I wouldn't just say this way of working helps, I would say it's vital.
    Touch wood, I have never had an external disk failure. I have about one HD failure every three years; now I have an SSD, we'll see how long that lasts :) So based on experience I would say: use your computer for work file storage and expect to lose your work. What you are saying is true, one should back up daily and place that drive in a fireproof safe in your studio! There are a myriad of commercial backup systems available, and if you are running a company, by all means go ahead. My system costs 80 euros and no time, and it has saved me more than once, because I am a freelancer, and although I am an army of one I don't want to tell the client I lost everything at the deadline.

    I haven't read other posts apart from the OP, so I have no idea what others are offering as a solution, but not keeping files on your work drive is really a no-brainer.
     
    Last edited: Jul 30, 2015
  29. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,516
    Haha, as compared to keeping it on your work drive and on your portable? Or on your work drive and in your VCS/cloud storage/office server?

    The problem isn't keeping stuff in any particular place. It's keeping it in any one place.

    What you're suggesting adds steps and potentially makes work slower. I don't know about you, but my internal HDDs are way faster than common external storage.

    Being external doesn't change that. It can in fact make it far more likely, given that external drives commonly face risks that internal ones don't. In that regard you should count yourself lucky.

    A "backup" is a "backup" because it is a redundant copy. If you've only got one copy then it doesn't matter if it's on your local HDD, a USB device or a bunker on the moon, it's still not "backed up".
     
  30. kanga

    kanga

    Joined:
    Mar 22, 2010
    Posts:
    225
    @angrypenguin
    Actually, the problem is keeping something in a place that is unreliable. The OP had his work on his machine, like everyone does. In reality, very few people make backups regularly. If you work straight to an external, it's automatically saved on a more reliable device. Your disk has to work constantly; your external only has to work a little every so often, and because there is no OS and no running apps, there are no conflicts unless you run software as a portable. Back your work up wherever you want (I would never suggest not to do that), but don't keep your work files on your workstation and expect to keep them. I have no idea how much slower running a full-blown Unity game is from an external drive, and I don't know how the cache works; maybe with a full-blown game it would run slower, but most of us are busy with elements of a game, or just scenes, and then I notice no slowdown.

    Being external doesn't change that. It can in fact make it far more likely, given that external drives commonly face risks that internal ones don't. In that regard you should count yourself lucky.

    I have noticed that drop bears really love to plummet from the ceiling onto my external disks! So I would strongly disagree. Just trying to pass on a method that has saved me a lot of time and heartache. If the OP had worked the way I suggested, he would not have had a problem. Then I wouldn't be the only lucky one now, would I?
     
  31. tiggus

    tiggus

    Joined:
    Sep 2, 2010
    Posts:
    1,240
    Kanga, obviously you are happy with your system, but it sounds very similar to what my sister, who is a teacher, did. Then I got a call on a Wednesday afternoon: "I had all my teaching material for the last 7 years on a USB drive and it is corrupted, can you get it back?"

    For me it seems pretty simple:
    Copy 1: local HD.
    Copy 2: online VCS such as GitHub.
    Copy 3: my cloud-based backup solution that backs up my PC and VMs automatically throughout the day.
     
  32. kanga

    kanga

    Joined:
    Mar 22, 2010
    Posts:
    225
    Well, my drives are a bit sturdier than a USB stick, but you are right. I already suggested keeping a backup in a fireproof safe, or did you guys think I was joking? Nonetheless, I am talking about a good quality 1TB drive. If the OP had done that, it would have saved him redoing his work, or not? Anyhow, that method works for me, and I am not particularly lucky. To each his own.
     
  33. Master-Frog

    Master-Frog

    Joined:
    Jun 22, 2015
    Posts:
    2,302
    At this point I'd like to jump in here and actually destroy my original comments...

    It's as simple as:

    1) Click a window on the taskbar.
    2) Type "git commit -am 'pm'"
    3) Press enter.
    4) Type "git push origin master" (usually you just press up on the keyboard, since you already typed this).
    5) Press enter.
    6) For me, I still have to log in (how do I set it up to be auto logged in!!!???)
    7) Your stuff is backed up...

    It's just annoying to think about setting it up.
    Once it's set up... it's so stupidly easy you wonder why you weren't doing this all along.
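
    (On the auto-login question: a sketch of the two usual fixes, assuming the remote is plain HTTPS; the hostname and repo path below are illustrative, not Master-Frog's actual setup.)

    # cache HTTPS credentials in memory for an hour after the first prompt
    git config --global credential.helper 'cache --timeout=3600'

    # or: switch the remote to SSH and use a key instead of a password
    ssh-keygen -t rsa -b 4096 -C "you@example.com"
    # (add the generated public key to your hosting account, then:)
    git remote set-url origin git@github.com:you/yourgame.git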
     
  34. Dustin-Horne

    Dustin-Horne

    Joined:
    Apr 4, 2013
    Posts:
    4,568
    But be careful: even though it's fireproof, it's not heatproof. In the event of a fire, tremendous heat will still build up inside the safe, which is very likely to destroy digital/optical media. That's why offsite backups are a good idea.

    Personally I use CrashPlan. I tested it against Carbonite and one other that I can't recall. I found CrashPlan to be the fastest in terms of uploading and supported more 'stuff'. For example, it can back up locked files, do versioning on your files, retain deleted files, manage multiple backup sets, etc. and it's all highly configurable. You can use it to backup to other machines for free or, as I do, use their cloud service. You can set how often it backs up (from realtime to every so often), how long it retains deleted files and/or versions (from something like 1 hour to forever). And it's unlimited storage.

    The Individual plan is $5/month. I actually use the Family plan, and I have 6 PCs (including laptops) that back up to it with unlimited storage. I have a few TB out there. :)
     
    angrypenguin and tiggus like this.
  35. tiggus

    tiggus

    Joined:
    Sep 2, 2010
    Posts:
    1,240
    Yep sounds identical to my setup. I use Crashplan family and private repos on github, between the two I feel pretty safe. I used to also backup to a local NAS but the hassle of replacing bad disks eventually caused me to scrap it as it really wasn't needed. Reading the stories of all the people who lost all their data when we had bad flooding a couple years ago reinforced my desire to always have offsite backups.
     
    Dustin-Horne likes this.
  36. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    I don't know if you are joking or not. I wouldn't ever suggest putting a backup in a safe, but whatever works for you...

    But really, fire, water damage, burglary are all real concerns with data too. There are many big projects that have been screwed over by not properly backing up. Look at Project Zomboid.

    Lesson to learn: Use version control or at the very least cloud based back-ups! Having your data in just one place, regardless of how secure you think that is, is never a good idea. Backing up is so easy and costs little to nothing, it's not some silly thing that only hardcore dudes do, it's an essential part of working with computers, especially in software development.

    Just in case anyone needed further convincing :)
     
    Master-Frog likes this.
  37. Master-Frog

    Master-Frog

    Joined:
    Jun 22, 2015
    Posts:
    2,302
    Admittedly, though... it did take a while to understand the whole commit, check out, repository thing. Man. It's harder to remake your lost project or to mess up and not be able to roll back your mistakes than it is to learn VCS. It's just a tough sell I guess?
     
    Last edited: Jul 30, 2015
    angrypenguin likes this.
  38. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    It's actually easy to set up, and it's also easy as hell to use (you can use TortoiseGit for some nicer shell integration, although it's way, way, way slower). Some of the more advanced features in git are a little tricky, but you don't have to use all the features.

    "It's just annoying to think about setting it up."

    This is a really big thing, and I think you hit the nail on the head. It's not actually annoying to set up; it's annoying to think about setting it up. It's a subtle, weird point, but it's really true.

    The same is true for a lot of the cloud based stuff. It's 2015, you can get free cloud storage all over the place. A USB or whatever can work adequately most of the time, sure, but cloud is better - and since it's so easy, usually free, and ultra available, there's just so little reason not to stuff your crap somewhere on the cloud.

    It's easier than opening a fireproof safe over and over, AND it's more reliable...
     
    Kiwasi likes this.
  39. Master-Frog

    Master-Frog

    Joined:
    Jun 22, 2015
    Posts:
    2,302
    Yep, that's exactly what I was saying. It's weird and different, USB is easy to understand.

    Same with the cooler Unity features.

    Same with powerful language features.

    People avoid complexity by instinct.
     
    Last edited: Jul 30, 2015
  40. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Making games is a complex task, and usually requires people who are multi-talented to do it well.

    Being willing and eager to learn new things is a prerequisite in this industry. It's a minimum bar that you have to meet, or you need to find something else to pursue. That is, if you actually want to work in this industry one day. And that is, I think, the correct context to assume in a public post like this, especially when there may be a lot of people who are new at this and taking the advice given here.

    And it's not complexity you are avoiding so much as simple fear of the unknown. My git workflow on personal projects is 3 commands. That's it. It's demonstrably not complex. I guarantee you that you have mastered things that are far more complex without so much as batting an eye.
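
    (He doesn't list the three commands, but presumably it's the standard loop, something like:)

    git add -A                       # stage everything that changed
    git commit -m "end of session"   # snapshot it locally
    git push                         # copy it off the machine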

    I'm over 50 and still taking on and learning new things that are completely outside my comfort zone and area of expertise. If I can do it someone half my age has like zero excuses.
     
    angrypenguin and Master-Frog like this.
  41. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,516
    The reliability of any single storage device that isn't broken is beside the point. I strongly suggest reading the rest of the thread, where methods that provide backup whilst simultaneously making life easier in other ways are discussed.

    You're trying to pass on a method that you've just happened to have not run into issues with yet. You haven't been unlucky yet, and you're irrationally attributing that to the hard drives being "external".

    Well my understanding is that HDDs, like many other mechanical devices, suffer the most wear and tear when they turn on and off...

    Epic thumbs up! It's great that you gave it a go, hope it works out well for you.
     
  42. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,516
    To me it's the not knowing where to start for the first time. Often once I've taken the first step it's pretty easy from there, but if I don't know the first step I just can't get started.

    Actually, that brings me to a great bit of advice a math teacher once gave me: "If you don't know what to do next, just do anything that's mathematically correct." Even if it's not the answer or the next step to the answer it gives you a little more visibility and will often hint as to what the next step might be. The same principle works in plenty of other areas.
     
    Kiwasi and Master-Frog like this.
  43. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    At my last job we used a fireproof safe at work for some of the critical process control backups. However that was fourth or fifth in a line of redundancies. The data was stored on the machine, on the local server, on the backup of the local server, on the global server and its backup, and in the fireproof safe.

    In the event of an earthquake that eliminated our process control, the local server, and our access to the internet (a major earthquake or volcanic eruption was within the realm of possibility), we could be back up and running in an incredibly short time. Considering some of the risks of losing control of a chemical plant, that was a good thing. It's probably overkill for game developers.

    The point that made this system robust is that the data was stored in different locations. It would only take one mishap with a truck or forklift to render the fireproof-safe data unrecoverable, but to render all of the systems useless would take a disaster on the scale of the extinction of the dinosaurs.
     
    angrypenguin likes this.
  44. Teo

    Teo

    Joined:
    Oct 31, 2009
    Posts:
    564
    I used CrashPlan in the past; personally I think it's the best online "backup" tool. The bad part is that CrashPlan's upload servers are truly slow; it takes forever to upload stuff. So I gave up, because I couldn't wait around any longer to upload the movies from my camera. And there are a few other disadvantages as well.

    For home, I think QNAP or Synology are the best solution, with at least 2 drives. Not a RAID setup: just put stuff on 1 drive and back it up to the other one daily. Any solution here works perfectly, like Windows File History, a git setup, or maybe even rsync.

    As you can see, the most important thing is to HAVE a backup that works and runs, instead of nothing.
     
    Kiwasi likes this.
  45. Tanel

    Tanel

    Joined:
    Aug 31, 2011
    Posts:
    508
    Another plus with version control now: you can use Cloud Build. Not really relevant to backups, just thought I'd throw it out there.

    Also, someone should maybe do an easy-to-understand tutorial/blog post for first-timers about setting it up with Unity in mind. I would've loved that a year or two ago.
     
    angrypenguin and Kiwasi like this.
  46. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    Yeah, to be fair, our last line of redundancy at work is disconnected physical hard drive copies stored in a securely locked room, so it's not that ridiculous. Just not something you'd do as the first part of the process; that'd be an awkward workflow :p
     
    angrypenguin and Kiwasi like this.
  47. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    If you are using a distributed VCS, you have redundancy baked in if your team is more than just a couple of people. Even assuming it's just you, one person, you have your local copy checked out and a copy wherever it's hosted. Your risk is if both the provider and you lose your data at the same time. The chances of that happening are extremely small, and they go down further for every person on your team.

    And what is the real value of a single copy in 'secure' storage versus half a dozen copies spread around developer machines? I don't actually know the math on that, but it seems that the value of your 'secure' copy goes down pretty fast as you add more developers. Especially in a startup where most developers have a copy checked out at home.

    This also fits with how most providers store data nowadays. Failure is considered normal; it's only very sensitive data where redundancy isn't accomplished by just copying it out over a number of cheap disks. And how is that different from multiple developers using a distributed VCS? It seems to me that the danger is if you are large enough that developers are only checking out a fraction of the overall branches in the repo. But if you are large enough for that to be the case, it mitigates the damage in itself, as each feature branch is probably a small fraction of the codebase.

    I'm guessing that any reasonably sized team, say a dozen or more, using a distributed VCS is pretty safe even without making explicit backups. I think that to a large degree old habits just die hard. I know I feel safer having backups, but with a dozen developer copies scattered across half a dozen locations, how logical is that fear?

    Could be something I'm overlooking, just thinking out loud here.
     
    Kiwasi likes this.
  48. Rajmahal

    Rajmahal

    Joined:
    Apr 20, 2011
    Posts:
    2,101
    Hey guys ... great discussion. I decided to go with Crashplan.com as my initial choice. I'll still back up to physical local drives as a secondary measure once a week or so.

    By the way, just as a bit of irony ... in my previous job, I was an IT auditor for one of the Big 4 firms. A good portion of my job involved reviewing data recovery and disaster recovery planning and capabilities, and advising clients on how to make things more secure. Somewhat embarrassing, really, that I would then let this happen to me so easily. :oops:
     
    Kiwasi, aer0ace and Ryiah like this.
  49. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    For our specific case, the 'secure' physical backup is mostly a legacy from when the company was younger. We have other, better practices in place as well (off-site digital backups). It's worth noting that the physical backups are of our entire servers, not just individual projects.
    We do, however, use SVN, so in lieu of a distributed system we also have manual backups of projects every week (which, of course, get physically backed up by the server, along with the repos, every fortnight, along with the cloud backups weekly...).
    Lots of these processes should be improved, tbh, and the physical backup situation is getting a little silly now, especially since we keep increasing our server space, but we're a very busy company and these things never take priority :p.
     
    aer0ace likes this.
  50. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    As an interesting aside, my previous company also had strong data destruction procedures and systems in place: basically, methods to ensure old data wasn't kept for too long, mostly for legal protection from lawsuits.
     
    angrypenguin likes this.