Do you use SSDs for development? / backups, version control

Discussion in 'General Discussion' started by Shushustorm, Sep 20, 2015.

  1. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
    Hello everyone!
As my current machine shows graphical glitches now and then, making me suspect it could fail completely any day, I am thinking about buying a new one.
To get to the point: I'm struggling with the decision between SSDs and HDDs. The new MacBooks only come with SSDs, and I am really concerned about their reliability.
After doing some research, this topic seems to be highly controversial. Some claim SSDs could last about 1,000 years if you only use them once in a while (which is clearly not the case here; I'd use mine hard). Others claim they will most likely fail within a couple of years (about 5-10).
Either way, even if I end up with an SSD failure after 5 years of usage, I'd still be happy with that.
Most people advise making backups, which I do anyway. So this shouldn't be the biggest problem.

    My main concern is data corruption.
While I haven't found much information on the topic, some suggest that current SSDs have about a 3% corruption rate.
I can't afford to lose 3% of my files. What if it's a script with thousands of lines of code that I worked on for weeks?
A good backup solution won't help me with corrupted files, will it? Because when backing up, the corrupted files are not going to be miraculously cured, are they?
Maybe I don't really understand what "file corruption" actually is? Or why would anyone use SSDs if there were such a high chance of data loss?
    Unfortunately, I can't find the article that I read anymore. But if I stumble upon it, I will link to it.

    Here is an article about 2015 Macbook Pros (including the model I'm interested in) having SSD problems, though:
    http://appleinsider.com/articles/15...ok-pro-flash-storage-issue-in-firmware-update

    So now the question: Do you use SSDs for development?
    If so, what are your experiences?
    Have you encountered data loss / file corruption?
    Are my concerns justified?
    Or do I lack Austin Powers's attitude?



    Greetings,
    Shu
     
    Last edited: Sep 20, 2015
  2. elmar1028

    elmar1028

    Joined:
    Nov 21, 2013
    Posts:
    2,205
I have been using a laptop with an SSD for nearly a year.

No corruption so far :)
     
    Shushustorm likes this.
  3. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
@elmar1028 But do you know for sure? Did you use some software to check your files? (I don't know if such a thing exists.) Because most of the time, you're not going to access each and every file on your system.
For example (this was about an HDD, but still): a few years ago, I found some audio files on an old machine of mine that I couldn't play back anymore. I guess that's an example of file corruption? I didn't expect files to "go bad" at some point. It was horrible.
And the main problem: I can't seem to do anything about it.
Oh well, maybe I'm just too affected by that scenario and it's not even that common. Maybe I just had very bad luck.
     
  4. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    Two years on my main system, and it's one of the potentially unreliable Samsungs. Yes, I keep checking it for problems and can't find any. No failures are more worrisome than quick failures ;)

    The laptop has been going strong for a year now with a PCIe SSD (Mid-2014 MBP).

The occasional series of drives of any type may have problems, but it's usually a warranty issue. If it's something huge, like that line of failing Seagate drives a while back, the manufacturer will usually recall them. Treat EVERYTHING as if it's about to fail.

    SSD reliability has improved vastly since the first generation. I'm not sure which one we're up to by now, but the first one needed to be powered on every now and then to maintain data (monthly or something like that). Samsung's latest 850 EVO line (the one with all that 3D magic) is supposedly able to reliably keep data for years without power, and the lifetime of the system is usually shorter than the life expectancy of these drives. Intel also have something along those lines in durability with the latest series, I think.

    tl;dr: I would still not trust SSDs for archival storage (which nobody recommends anyway), but for day to day use it's the only thing you'll ever want. Unless you're a hardcore masochist.
     
  5. tiggus

    tiggus

    Joined:
    Sep 2, 2010
    Posts:
    1,235
    Any physical media can have bad sectors/corruption. That's why we have backups. With all the cloud based incremental backup services out there available for pennies a day you would have to be crazy not to use one as a developer.
     
  6. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
    @orb
So what would you advise? Using the SSD for booting and applications only, and storing project data externally on HDDs?
Even if the data lasts, I really don't want it to get corrupted. Corrupted data lasting for ages is no use either.
But you didn't run into any corruption. What do you use for checking your files? Maybe I will have to consider some software to check my files obsessively, so that I can restore from backup ASAP if anything bad happens?

This is very good advice in any case. I was even thinking about backing up everything twice.

But how would a backup help against data corruption? Alright, if I can access individual files on incremental backups, this may work; but on the other hand, I'm not a fan of clouds (as in online services). I read about Time Machine backups being used for restoring individual files too, though.
     
  7. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,071
SSD? I have a 1TB Samsung as my main drive that I abuse relentlessly. These things don't suddenly quit, but they'll degrade. To you it won't manifest as loss of files so much as loss of free space. Plus backups are automated. Why would I miss out on performance?
     
    holliebuckets, Ony and Shushustorm like this.
  8. Lightning-Zordon

    Lightning-Zordon

    Joined:
    May 13, 2014
    Posts:
    48
I've been using an old Intel SSD in my main PC for 4+ years.
     
    Shushustorm likes this.
  9. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Yes, the loss of free space, too. I read that when it comes to SSD reliability, the bigger the better, because the chance of unusable areas is lower? Then again, Apple charges 600€ for the 1TB SSD upgrade (from 500GB), which is insane.

@Lightning Zordon
Alright, it seems people actually have good experiences with SSDs as well!
     
  10. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    @Shushustorm: I use SSDs for everything on the work computers. Backups go to external (USB and networked) drives on automatic, regular backups plus repository check-ins on a server. The only times I've had file corruption were because of faulty downloads or HDDs.

    If you want to keep tabs on the health of your drives, just look for a utility that reports SMART status. Modern drives all have SMART, which does self-checking and health/temperature monitoring. The operating systems just don't usually do much with the information, other than reporting catastrophic failure. It's usually not going to show you the warning signs, because the data reported by each manufacturer differs, and can be hard to interpret. It's a whole new skill set.

    Also run the operating system's own disk check every now and then. Non-fatal forms of disk corruption can happen to the best of drives, because it's software-controlled.
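The built-in checks differ per platform. As a tiny dry-run sketch, the script below only prints the usual check command for the current OS rather than running it; the command names are real, but treat them as starting points and run them yourself with admin rights after backing up:

```shell
#!/bin/sh
# Dry run: print (do not execute) a basic disk-health check for this OS.
case "$(uname -s 2>/dev/null || echo Windows)" in
  Darwin) cmd="diskutil verifyVolume /" ;;   # macOS built-in verification
  Linux)  cmd="sudo smartctl -H /dev/sda" ;; # requires smartmontools
  *)      cmd="chkdsk /r" ;;                 # Windows, from an admin prompt
esac
echo "Suggested check: $cmd"
```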

    NOTE: The 12" MacBook (the one with the lonely port) doesn't report SMART status. Avoid. Every other computer with SATA, mSATA or PCIe drives does, as far as I know.

    @hippocoder: Yeah, fortunately SSDs come with an extra storage pool to allocate from when sectors go bad. And unlike HDDs, there's no rolling snowball effect once a hard error appears, so it could be just one little teensy block that dies and the rest is fine for years (decades with current tech, allegedly). Or you could be screwed and have received the worst SSD in Scotland.
     
  11. tiggus

    tiggus

    Joined:
    Sep 2, 2010
    Posts:
    1,235
You can access individual files. If I want to restore a file to the state it was in 10 days ago, I open up my backup software's file browser, select a time when I know the file was good, and click restore. Done.

You don't have to use a cloud service, but you need backups. I went through a long phase (let's say 20 years) where I backed up everything to a local NAS. Then local NAS + cloud. Now I just do cloud, while still using things like GitHub for source code, so it is backed up in 3 locations.

A local NAS is a bit of a pain, as you will get much more frequent disk failures on it, so it requires ongoing maintenance to preserve its integrity; and if you get robbed or have a flood, you risk losing everything.
     
    Shushustorm likes this.
  12. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    6,458
The stress tests that have been done basically agree that normal everyday use (like gamedev) is not enough to merit concern over data corruption. Eventually you will begin to lose some free space, but again, under normal circumstances it's not a problem.

A faulty drive causing corruption? Yeah, that could happen, but it's just as possible on an HDD, so there's no point in going 20x slower for the same risk. Just use good backup practices and get on with your life. I've never had an SSD fail or lose a significant amount of space.
     
    Shushustorm likes this.
  13. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
There's a bit of spare space which it'll take from if sectors go bad. A 500GB drive may actually be 512GB, with 12GB worth of sectors reserved in case of hard errors.

They're not SATA drives anymore, though; the current ones are 1200-megabyte-per-second PCIe beasts. Still pricey, but at least a bit above the average!
     
  14. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Yes, I do see your point here. I just don't really trust clouds. (Not in the sense that they wouldn't back up safely, but in the sense that my data may not be stored securely from any other access.)

    I will have to take another look at it. This sure sounds reasonable.

Maybe I understand the whole technology incorrectly, but don't SSDs always erase and rewrite whole blocks on each write? Which would mean that if your RAM has some failures, all the data that you rewrite runs the risk of getting corrupted, right?

Also, I don't know if I understand the concept of "write processes" correctly, but: it is bad if you save, for example, a script all the time, isn't it? Because no matter what I'm working on, I'm saving files very regularly, probably about every 20-30 seconds. Maybe that's a bad habit for using SSDs?

So I guess I would be fine with the 500GB version? I don't really need 1TB in terms of raw storage that I actually have data on.
     
  15. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    6,458
There's nothing wrong with that. I've used SSDs in every computer I've had for the last 6 years, at work, at home, and as backup drives, and extensively abused them. I'm actually more concerned about the huge external HDDs I have for backups than the SSDs.
     
    angrypenguin and Shushustorm like this.
  16. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
Quite the opposite, thanks to TRIM. It optimises writes in such a way that if a file spanning 50 sectors is deleted, the drive simply marks them as available in the index, rather than writing zeroes to them.

Nah, you'll be fine. You'd also be shocked if you saw just how many gigabytes of logs any modern OS writes in a month! The lifetime of SSDs is measured in petabytes written, and the big endurance test somebody ran a while back didn't even manage to break the EVO drives they had by the end of a multi-month constant read-write cycle.

    HDDs are the ones which fail big after long-time operation.

    If you know you can live with 500GB, get that. Where I live it's the best price per gigabyte (and within reason as a personal expense, not just a work expense).
     
  17. tiggus

    tiggus

    Joined:
    Sep 2, 2010
    Posts:
    1,235
I understand; just realize that maintaining a backup setup equivalent to what you get in the cloud for $5/mo (or even free) costs quite a bit over time, and still doesn't cover offsite backups. My latest NAS doing local backups was a Synology, and I probably had a bad drive in it every 6-12 months or so, which ended up being a lot of money.
     
    Shushustorm likes this.
  18. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Interesting. That sounds pretty reliable! But why are you concerned about the external HDDs? Do HDDs have some risks to them that I'm not aware of? I know they are sensitive to shock, but I guess you wouldn't throw your HDDs around?

I see. That's great! So is TRIM some default setting, or would I have to watch out for something and activate a certain setting?

Yeah, I'm pretty sure I will not fill those 500GB any time soon. Just in case, I could always upgrade, couldn't I?

That seems pretty bad, though. I am using Western Digital drives for backups, and in about 4 years I've had to replace one of four.
     
    Last edited: Sep 20, 2015
  19. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    6,458
Maybe I'm just overly concerned about it, but it seems like the HDDs I have are getting progressively slower. I haven't done any formal tests, but my SSDs of about the same age and usage don't seem to have as much speed degradation. For me it's more about real-time speed and transfer rates, since I move huge files around constantly (project backups, HDR photography, timelapse RAW photos... gigs upon gigs upon gigs of stuff), so I notice speed changes immediately. I couldn't imagine trying to work with large files on an HDD anymore; it would just be abysmal.
     
    Ony likes this.
  20. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Alright, but isn't that because HDDs get slower the fuller they are, and SSDs don't have that problem because there is no physical seeking to find the files?
     
  21. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
Full drives need to scatter data across the available sectors, and since HDDs just spin a disc and move the read head around to find data, it takes longer. SSDs have uniform seek time (every sector takes exactly the same time to start reading from), and SSDs also sustain a higher number of in/out operations per second (the IOPS measurement seen in tests and advertising).

Scary fact: HDDs have read/write heads flying around 3 nm above the platters. If a mechanical failure happens, it's likely to destroy more than a few sectors.
     
    Shushustorm and LaneFox like this.
  22. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Just some additional information: my current machine actually makes a clicking noise, too. I think it's the HDD.
Considering what you said, this is indeed scary.
     
  23. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
Much uh-oh and very, very frightening. You should definitely back up all you can and run a full disk check. On OS X, reboot into single-user mode (Cmd-S right after the chime) and run "fsck -fy". On Windows, it's "chkdsk /r" from an administrator command prompt, then reboot.
     
    Ony likes this.
  24. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Actually, I've had this clicking noise for months now. I just hoped it was that strange MacBook behaviour I read about. Apparently, MacBook HDDs will sometimes make a clicking noise even when there's nothing wrong with them; it's a mechanism that parks the HDD head to keep it from scratching the platter when the MacBook is subjected to shocks or vibrations.
I don't know if my noise is really related to that, though, because I'm not moving my MacBook much. Just typing. Hard.
Also, what does "fsck -fy" do? Is that the full disk check? I use Disk Utility from time to time, and it never said there was anything wrong.
     
  25. Lightning-Zordon

    Lightning-Zordon

    Joined:
    May 13, 2014
    Posts:
    48
  26. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,575
    I use SSD drives in everything, including development boxes. I have seen much higher reliability from SSD than from hard disks. All drives will eventually fail, especially if they get subjected to extreme stress testing. Always keep good backups of your important data regardless of which type of drive you use.
     
    Ony and Shushustorm like this.
  27. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    Shushustorm likes this.
  28. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
I see. I should probably do this sometime soon. Do you know if it takes a long time?
For now, I will back up my stuff once more and hope the HDD will be fine until I buy a new HDD/SSD or a MacBook (including an SSD).
     
  29. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    Minutes, usually. If there are problems it could be even faster. If there are unrecoverable errors it'll run three attempts before giving up, at which point you know for certain there's a problem.
     
    Shushustorm likes this.
  30. spryx

    spryx

    Joined:
    Jul 23, 2013
    Posts:
    365
  31. antislash

    antislash

    Joined:
    Apr 23, 2015
    Posts:
    646
No, not apps; system only is fine, but keep other installs off it or you'll end up full.
And yes, SSDs are prone to data loss, just like any device.
     
  32. CodeMonke234

    CodeMonke234

    Joined:
    Oct 13, 2010
    Posts:
    181
    IMHO SSD is the way to go. Haven't had any issues and have heavily used them for years on both Windows and Mac.

    Of course, backup your drive either way.

    And for game development, by far the most important advice:
    Use Source Control for your projects.

Store at least one repo remotely, e.g. GitHub or Bitbucket.
If your projects are too big for GitHub, use git-lfs.

If your machine is stolen, or ruined by coffee, or whatever, in the end it doesn't matter whether it was an SSD or HDD.
     
    Kiwasi likes this.
  33. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    14,990
    They wouldn't still be in commercial use if they were unreliable. Speaking of unreliable, I've had at least one hard drive fail on me and that particular one stands out the most because it failed within a month. At least I didn't lose anything.
     
  34. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    Somebody replaced a screen on a laptop Friday, and the hard drive stopped working right about then. Sometimes hardware just conspires to die at the same time.
     
  35. antislash

    antislash

    Joined:
    Apr 23, 2015
    Posts:
    646
The fact is that recovering data from a classical HDD is common, but recovering from an SSD is very, very hard, not to say impossible. They don't show signs of failure; they either work or they don't.
     
    Ony likes this.
  36. JamesLeeNZ

    JamesLeeNZ

    Joined:
    Nov 15, 2011
    Posts:
    5,618
I have four SSDs, which are mainly used for holding game libraries, it seems. Zero problems so far.

If your backup is on the same machine, it's not a backup. It should at least be on a separate drive, but preferably on a different machine.
     
    zombiegorilla, Kiwasi and Shushustorm like this.
  37. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
After doing that, it only said the drive "appears" to be OK. I guess that's alright then.

What exactly do you mean by "you'll end full"? If I get a 500GB SSD, which I probably will, I will have enough space for storing all my stuff. Of course, that doesn't take into account that it's not on a potentially more secure HDD. I do have a somewhat different view on this topic now, though. Maybe SSDs aren't so bad after all. I really like being able to work quickly.

Actually, I wouldn't even bother trying to repair an HDD as long as I have a backup that I can restore onto a new one.
Of course, this may be somewhat different with SSDs being more expensive.

I have heard this so many times. And I agree: I should be using source control. But I don't. When I took a look at the topic, Unity only supported two solutions, Perforce and PlasticSCM (which is still the case: http://docs.unity3d.com/Manual/ExternalVersionControlSystemSupport.html )
I don't really know how I would set up other solutions. Also, I'm not really familiar with the topic in general. For example, I read about some solutions keeping versions of the entire project, which would be insane, at least for me, because I don't want all the high-res textures to be backed up as versions. Besides, I'm not a big fan of storing my stuff online, which is, from what I've read, the "normal" way to get source control working.
Oh, and I work alone. So for me, there isn't any benefit from uploading except for scenarios like theft, a flood, a war, or a meteorite crash, which may be possible, but I guess I just don't want to take those horrible situations into account (that much).

Well, the graphics chip of my MacBook died a few months ago. I was told this was very common in the 2011 models. That's why I got a free replacement. Sometimes you only know a few years later whether the hardware actually was reliable, I guess.

Yes, that makes sense. I always back up to external HDDs.
     
  38. CodeMonke234

    CodeMonke234

    Joined:
    Oct 13, 2010
    Posts:
    181
Shushustorm - You have totally valid concerns and hesitations about source control. But the benefits far outweigh the costs. (I have sad memories of a day last January when an evil cup of coffee took out my MacBook Pro. Thankfully, Time Machine and GitHub minimized the damage.)

Not only is it an offsite backup, it also enables you to roll back to working states (e.g. when Unity freaks out and the massive change to 20 files you just finished broke things).
It really becomes critical when you work on a team.

But as you said, storing the binary assets is... difficult. Especially for large projects.
I am working on an article about using git-lfs for gamedev; it lets you store the binary art assets outside of the git repo.

Will post you the link when it is online; might be a good time to try it.

Happy Coding :)
     
    Shushustorm likes this.
  39. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,606
For game dev stuff you should be using source control. Just roll back if a version becomes corrupt. It's pretty painless to set up Git with Unity, and you can get a free private remote repo on Bitbucket.
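A minimal local setup along these lines might look like the following sketch (assumes Git is installed; the identity values and ignore list are illustrative, and the scratch directory stands in for your Unity project folder):

```shell
# Sketch: minimal local Git setup for a Unity project.
cd "$(mktemp -d)"                # stand-in for your Unity project folder
git init -q
git config user.name "Dev"       # placeholder identity for the example
git config user.email "dev@example.com"
# Ignore Unity's regenerable folders; version Assets/ and ProjectSettings/.
printf '%s\n' Library/ Temp/ Obj/ Build/ > .gitignore
git add .gitignore
git commit -qm "Initial commit with Unity ignore rules"
```

From there, every meaningful change gets its own commit, and any commit can be rolled back to later.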

    Binary assets like art and sound don't work as well with version control. But then large binaries don't work well with traditional backup methods either.

    Software can all be replaced pretty easily. That's normally no big deal.

    For everything else do a regular system backup from time to time and you should be pretty fine. No system is infallible. But you can get the risk of failure pretty low.
     
  40. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    12,307
Of course a good backup will help with corrupted files: if something gets corrupted, an uncorrupted version can be restored from the backup. If you're using a version control system for your backup, then the backup is updated based on changes rather than time (you won't push a corrupted version except by accident), and you'll have the whole history (so even if corruption does somehow get in there, you can roll back to before it happened, no sweat).
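That roll-back idea can be sketched in a few Git commands (scratch repo, identity, and file name are all made up for the demo):

```shell
# Sketch: recover a known-good version of a file from Git history.
cd "$(mktemp -d)"
git init -q
git config user.name "Dev" && git config user.email "dev@example.com"
echo "good code" > Player.cs
git add Player.cs && git commit -qm "working version"
echo "garbage" > Player.cs                 # simulate a corrupted save
git add Player.cs && git commit -qm "broken version"
git checkout HEAD~1 -- Player.cs           # restore the last good copy
cat Player.cs                              # prints: good code
```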

    Also, as I think others have touched on, the "corruption" doesn't manifest as files containing corrupt data so much as free space which is determined too unreliable to use. So data you've written should remain safe to read.

    I've been using an SSD for ~4 years as my OS drive. I have had issues (I think compatibility issues between the mobo controller and the drive controller?) but random loss of data certainly hasn't been one of them.
     
    Shushustorm and Ryiah like this.
  41. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    12,307
They only provide instructions for those, but Unity works perfectly well with any system worth its salt. I've been using Unity with Git for 5+ years and it rocks. Unity doesn't need to provide special instructions, because once you set up the project for version control (i.e. use text serialisation and meta files), the project goes in the repository just like any other set of files.

    It'll take you a few days to get used to the process of using version control, and then you'll love it.
     
  42. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    @Shushustorm: I use Git and SourceTree as the client, so I don't need special support inside Unity for it. When you get more advanced, also look into git-flow to boost your efficiency.

    If you get a 500GB SSD, you can safely use it for the whole system. Installing and uninstalling on HDD is a pain after you've seen how fast it is there. Plus there's a whole category of failures you won't get on SSDs ;)
     
    Shushustorm likes this.
  43. zombiegorilla

    zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    7,959
Version control isn't a Unity thing, it's an "if you make stuff on a computer" thing. Everything I make is in a git/svn repo. The bulk is on a local server; critical stuff and deep archives are on a remote server. In addition, I make local file backups (dual) to external drives weekly. Work stuff is version controlled in a remote data warehouse with redundancy. If my place burns to the ground with all my laptops in it (unlikely, as I have at least one on me and one backup HDD), I lose at most a week of work, but realistically only a day or two.
     
  44. Ironmax

    Ironmax

    Joined:
    May 12, 2015
    Posts:
    892
The bigger the SSD, the longer the lifespan. You can never go wrong with an SSD (unless it's a very old one).
     
    Shushustorm likes this.
  45. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
    That's because you have a huge place, right? Living in a cave means I lose an hour at most, if I survive the fire.
     
    Kiwasi and Shushustorm like this.
  46. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    12,307
    I don't think you commit often enough. ;)
     
  47. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
    @CodeMonke234
    I was about to ask whether or not Time Machine wouldn't meet the needs for "version control", but then I read this:
    http://pragmatos.net/2011/09/01/time-machine-is-not-version-control/
    , which makes perfect sense.

    Sounds cool! I will surely take a look at it when you post the link!

But is it possible to set up version control so that it only tracks my code and only does so on my local drives?
That would be a solution I would gladly implement.

That, especially combined with version control, seems to be a pretty safe way to go.
I guess you all got me now! Probably going to buy a new MacBook once they come with El Capitan preinstalled!

Is it possible to use SourceTree for versioning locally? I've seen it on the Mac App Store and was definitely interested, though not very informed about this whole topic.

It seems so, yes. I was even thinking about 1TB just for that reason, but I will not pay another 600€ just for that. (The upgrade from 500GB to 1TB costs this much, which I think is insane.)

Very good advice as well! Thinking about renting a cave now.
     
  48. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    2,960
Git is distributed version control, which means every repository is authoritative. You can use it exclusively on your own system, because it's not server-based; it's just a set of files with versions and indexing (more or less). If you add a remote repository to push to, you get an offsite backup effortlessly. SourceTree just manages that local repository, and has buttons for pushing, pulling, and so on for remote repos. It also shows the status and contents of local and remote repos.
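Sketched concretely, with no server or cloud involved (both paths are placeholders; the "external drive" here is just a second folder holding a bare repository):

```shell
# Sketch: local-only version control plus a mirror on a second drive.
work="$(mktemp -d)"; drive="$(mktemp -d)"  # stand-ins: project dir, external HDD
cd "$work"
git init -q
git config user.name "Dev" && git config user.email "dev@example.com"
echo "v1" > notes.txt
git add notes.txt && git commit -qm "first version"
# A bare repo on the backup drive acts as the push target (no server needed).
git init -q --bare "$drive/project.git"
git remote add backupdrive "$drive/project.git"
git push -q backupdrive HEAD
```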

    I also recommend downloading SourceTree from the site: https://www.sourcetreeapp.com
     
    Shushustorm likes this.
  49. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    934
Sooooo I also need to download Git if I want to use SourceTree?
I still don't really know how all of that works, but I guess I will once I have all the necessary stuff installed.
When I started developing, I was advised to use Git and (?) Bitbucket, I think. But I would have had to register, and all of this had "cloud-based" written all over it, which made me step back.
     
  50. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    12,307
    No, because it's not a "back up" if it doesn't put a copy of the data elsewhere. But yes, you can set up version control to run entirely on your system if that's really what you want, and it's entirely up to you what files are or aren't included.
     
    Shushustorm likes this.