
Ideal PC Spec for Development

Discussion in 'General Discussion' started by NectherLouie, Jan 30, 2021.

  1. NectherLouie

    NectherLouie

    Joined:
    Jan 22, 2013
    Posts:
    37
    Hi guys,

    I don't know where to ask this, so I posted here in General Discussion. As the title states, my question is: what would be the ideal PC specifications for developing a 3D open world game like BotW, Super Mario Odyssey, Monster Hunter or the Lego games?

    As a hobby developer I'm currently just using my laptop and not going overboard: I'm not developing a full scale 3D game and I'm sticking with as low poly as possible. But I want to start a long project which would involve creating a small scale open world map/level, and just testing out the Lego and FPS Microgames series has shown me that my laptop would not be able to handle it.

    Any suggestions for a PC build for development?

    Thanks!
     
    cryptoventure likes this.
  2. andoo001

    andoo001

    Joined:
    Jul 19, 2016
    Posts:
    21
    It's really what you can afford, but I might also add: I have a 32 core gen 2 Threadripper, and nothing about the Unity API makes use of so many threads. A 6 core processor, 8GB RAM and 4GB VRAM will be enough to get you going, but you are talking about MASSIVE open world games, so the more RAM/VRAM the better. Personally, I wouldn't be attempting a project like you're suggesting on ANY laptop (not even a premium Alienware laptop).

    The other thing is, testing in the Unity editor is always a lot slower than building the game and testing the build (e.g. an APK on Android or the exe for a Windows build). It won't be instant, but it will make full use of whichever specs you decide on.

    Though ultimately, with open world games you need to learn to build your world in blocks, so you can load and unload areas as you enter and leave them; they will be loaded into RAM but only loaded into VRAM when they are active. Maybe watch some YouTube tutorials specifically about making open world terrains/populations; it might influence what you buy.
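    The block-loading idea above can be sketched engine-agnostically. This is a minimal hypothetical example (not Unity API; all names are made up for illustration) that decides which chunks of a gridded world should be loaded around the player and which can be unloaded:

    ```python
    # Hypothetical sketch of block-based world streaming: keep only the
    # chunks near the player loaded. Chunk size and radius are assumptions.
    CHUNK_SIZE = 100.0   # world units per square chunk
    LOAD_RADIUS = 1      # keep a 3x3 block of chunks around the player

    def chunk_of(x, z):
        """Map a world position to integer chunk coordinates."""
        return (int(x // CHUNK_SIZE), int(z // CHUNK_SIZE))

    def wanted_chunks(player_x, player_z):
        """All chunk coordinates that should be active for this position."""
        cx, cz = chunk_of(player_x, player_z)
        return {(cx + dx, cz + dz)
                for dx in range(-LOAD_RADIUS, LOAD_RADIUS + 1)
                for dz in range(-LOAD_RADIUS, LOAD_RADIUS + 1)}

    def stream_step(loaded, player_x, player_z):
        """Return (to_load, to_unload) given the currently loaded chunk set."""
        wanted = wanted_chunks(player_x, player_z)
        return wanted - loaded, loaded - wanted
    ```

    In Unity, each chunk would typically be its own scene loaded additively (SceneManager.LoadSceneAsync / UnloadSceneAsync), with `to_load` and `to_unload` recomputed as the player crosses chunk boundaries.
    
    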

    If you have the skills to build your own PC, AMD makes very affordable, good quality parts (I truly believe they have even surpassed Intel on processors; for GPUs I still go for Nvidia). Build your own, find the best price for each part, save heaps.
     
    Last edited: Jan 30, 2021
    NectherLouie likes this.
  3. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    It depends what kind of games you are making. We (3 devs, same setup) use an Nvidia 3090 for GPU lightmapping, and a 16 core 5950X is nice for multi core stuff like Project Acoustics test baking (a fine bake needs thousands of cores) etc. We just upgraded to Gen4 NVMe; some minor improvement when rebuilding the Library, for example. I wouldn't buy Gen3 today, but if you already have Gen3 it's not really worth upgrading.

    3800 MHz memory to fully take advantage of Zen. 32 gigs or more.

    Again, it all depends on what you are doing, but this setup works well for our studio.
     
    NectherLouie likes this.
  4. NectherLouie

    NectherLouie

    Joined:
    Jan 22, 2013
    Posts:
    37
    Thanks for the reply guys!

    I've decided to save up for a PC with good specs for development. Thanks for the suggestions. For now I'll start developing the game with what I have, which is my laptop, keeping in mind that I will be getting a PC, so I don't need to limit what I'm developing to what my laptop can handle. It's better to start on it.
     
  5. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Questions about what is ideal or best are always impossible for other people to answer, because they require information that only you have. What is ideal for one person can be too expensive or otherwise not a good fit for another.

    As far as a good low-mid range desktop computer for Unity that should be under $1000 USD, I'd start from here though:

    AMD Ryzen 3600 6 core CPU
    16GB RAM 3200MHz
    1TB SSD
    NVidia 1650 Super GPU

    GPU prices are outrageous right now though (that GPU should be under $200). Not a great time to build a PC. Might have better luck with a new laptop. My Ryzen 4800H is freaking awesome for a laptop, and the 5000 series is supposed to be 15%+ faster and should be shipping in the next few months.
     
    NectherLouie likes this.
  6. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Joe-Censored likes this.
  7. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
  8. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    You missed my point (though to be fair I wasn't exactly putting effort into it). I'm pointing out that laptops are being used for mining and may suffer in both availability and affordability if the value of cryptocurrency continues to rise.

    Speaking of bills, though, you would have to be spending a fortune on electricity to make it unprofitable. Hitting an RTX 3080's break-even point would require you to pay almost two dollars per kilowatt-hour. Denmark and Germany have the highest cost of power according to Wikipedia (2018 statistics) and miners there would still be making almost $9/day.
     
    Last edited: Feb 12, 2021
  9. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Doesn't sound that efficient as far as an initial investment. Laptop GPUs are significantly lower performing than desktop GPUs, and when you factor in buying the entire laptop you're paying more for the same amount of performance compared to an equivalent desktop GPU.

    Though since laptops are generally designed for power efficiency, maybe it ends up more efficient from a power consumption standpoint?
     
  10. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Last time cryptocurrency caused a shortage of hardware Bitcoin had hit a record high of just under $20,000. Today Bitcoin is sitting around $45,000. Mobile RTX 2070 is roughly equivalent to a desktop RTX 2060 in performance and assuming no power cost an RTX 2060 makes $3.57 per day.

    Average power consumption of a desktop RTX 2060 is 158 W (3.79 kWh per day).

    https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2060-ray-tracing-turing,5960-8.html

    Average power consumption of a laptop RTX 2070 is 115 W (2.76 kWh per day).

    https://www.techpowerup.com/gpu-specs/geforce-rtx-2070-mobile.c3349
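    The arithmetic behind those figures can be checked in a few lines. This sketch assumes (as above) that the mobile RTX 2070 earns the same $3.57/day as the desktop RTX 2060 it roughly matches, and computes the electricity price at which mining income equals power cost:

    ```python
    # Break-even electricity price from the figures in this post.
    REVENUE_PER_DAY = 3.57  # USD/day, the RTX 2060 estimate quoted above

    def breakeven_price(avg_watts):
        """Electricity price (USD/kWh) at which daily income equals daily cost."""
        kwh_per_day = avg_watts * 24 / 1000
        return REVENUE_PER_DAY / kwh_per_day

    for name, watts in [("desktop RTX 2060", 158), ("laptop RTX 2070", 115)]:
        print(f"{name}: {watts} W -> {watts * 24 / 1000:.2f} kWh/day, "
              f"break-even at ${breakeven_price(watts):.2f}/kWh")
    ```

    So the desktop card breaks even at roughly $0.94/kWh and the laptop card at about $1.29/kWh, well above typical residential rates.
    
    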
     
    Last edited: Feb 12, 2021
    Joe-Censored likes this.
  11. adamgolden

    adamgolden

    Joined:
    Jun 17, 2019
    Posts:
    1,494
    Yeah, I concur. I'm using a 4800H right now, 16GB RAM, SSD, and I don't feel limited at all. Granted, as mentioned, the more you can afford the better (whether in cost or in patience looking for a good deal), because while you're developing, waiting even a touch less thousands of times adds up, and each fraction of a second ticking by while you wait distances you further from your train of thought. System performance greatly influences productivity. If I could grab a powerful desktop right now in addition to my laptop I would, but this will be perfectly fine for developing my current game and whatever else for the next few years. That said, I'm not creating content-heavy scenes with high-poly models and terrain that require high-end systems to play. The suggestion for a desktop is sound advice for anyone who doesn't need portability: you'll get way more bang for your buck. It's a pretty safe bet that a $2000 desktop will have vastly superior performance to a $2000 laptop.
     
    Ryiah and Joe-Censored like this.
  12. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    @Ryiah anyone wanting to use average consumer GPUs for mining would need to join a large mining pool. I am not sure an individual can still do it alone these days. There is continuous difficulty rising and halving involved, which increases the power needed for the same job. Having arrays of 100+ gaming laptops may make sense; they will be cheaper than dedicated mining cards. The price will drop soon, it always does after such a peak. So it is only worth it for a long run of a few years ahead.

    But I agree that this can affect pricing on gaming hardware, if that's what is happening. The question is how widespread the issue is.
     
  13. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    Btw, I had a popup news item on the GPU hardware matter recently.
    I see that the target mining currencies may be other than the top one.

    GeForce Is Made for Gaming, CMP Is Made to Mine
    We’re limiting the hash rate of GeForce RTX 3060 GPUs so they’re less desirable to miners and launching NVIDIA CMP for professional mining.

    I looked for the official source:
    https://blogs.nvidia.com/blog/2021/02/18/geforce-cmp/

    Basically they want to cut the mining performance of the gaming GPU by half, since it's being abused for mining rather than gaming.

    I wonder if that will affect other future GPUs too.
     
  14. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Correct, but it's trivial to join a mining pool with a service like NiceHash. NiceHash benchmarks the various cryptocurrency algorithms on your system multiple times per day, choosing the best one for you, and anything you make is converted directly into Bitcoin minus their cut, which is 2%. From there you simply transfer it into a wallet like Coinbase and send it to your bank.

    I wouldn't be recommending this if it weren't a viable and legit method to help pay for your hardware. If you're a professional game developer you don't even have to mine the full value of your card; you simply mine what's left after deducting the value from your taxes. For me deductions cover at least half the cost.

    https://www.nicehash.com/

    For my area the cost of solar energy is $0.09 per kilowatt-hour (50 kWh blocks sold at $4.50 each). My new RTX 3070 can make nearly $8 per day. Two months of full time mining could cover the difference for me, but I'm only mining during periods when I'm actively using my machine, so realistically it'll likely take several months.

    https://www.nicehash.com/profitability-calculator
     
    Last edited: Feb 21, 2021
    MadeFromPolygons and Antypodish like this.
  15. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    I saw that yesterday; apparently it only affects Ethereum and not the rest of the algorithms.

    The situation makes me think about return of PPUs and ASICs.

    One thing about the whole mining situation is that I'm certain that, with my epic luck, the moment I try to invest in crypto or mining hardware it is going to crash and take Tesla with it.
     
  16. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    That's why I'm recommending this to people who want to or have already bought a new GPU. I didn't buy my RTX 3070 because I wanted to mine. I bought it because my GTX 1080 was no longer sufficient and I wanted to play around with the new technologies (especially now that UE4 has DLSS).
     
    neginfinity likes this.
  17. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    Smart/Reasonable.

    I'm somewhat unhappy with my 1060 3GB GPU, but my inner hamster is strong and strongly objects to buying anything at current prices.
    Also, the last time I fired up NiceHash it killed my PSU. The same thing probably won't happen with the new one, but apparently computer repair services have gotten greedy, so it was cheaper to replace it than to fix it.
     
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Check prices in your area. You might be able to sell your 1060 for a good amount.
     
    MadeFromPolygons likes this.
  19. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    This is why I have a Seasonic Titanium PSU :)
     
  20. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    Well, here we go.

    Booted up XMR-gui; Bitcoin lost 20% of its value. All hail the power of anti-Midas.

    /joke
     
  21. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    So they are diverting GPUs to this disposable CMP product, which will become e-waste as soon as crypto prices drop. Each one is one less gaming GPU, and one less GPU on the second hand market down the road, which helps create more supply/demand issues with GPU availability for the next launch. Then they include a driver restriction so 3060s can't mine Ethereum well, which the big players with millions of dollars on the line are more than likely just going to get around; and if they can't, it just means 3070+ GPUs remain unavailable. Meanwhile the little guy who buys just one 3060 can't make a few dollars with the expensive new GPU they overpaid for while they aren't using the computer.

    Thanks Nvidia, you've really got gamers' backs here.
     
    Ryiah likes this.
  22. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Yes, though if they're capable of being used for other computing tasks they may have a purpose for some people.
     
    Joe-Censored likes this.
  23. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    I have mixed feelings about it.

    For a start, it is reasonable to treat miners and gamers as separate categories. A gamer needs a single GPU, while a miner has infinite demand for hardware. Which means that if miners leave gamers without hardware, that's a problem that needs fixing.

    However, preventing a GPU from mining is something that probably won't even work well, because it is not far removed from the halting problem or DRM.

    I wonder if it should've been approached differently. For example, by limiting sales to one GPU per person with ID verification.
     
    Joe-Censored likes this.
  24. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Lots of gamers use SLI.
     
  25. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Are we talking about gamers as a whole or are we talking about the people we know? Because while one might be "lots" the other is most definitely "very few".
     
  26. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Personally I have used SLI in at least 5 builds: 2x 8800 GTS, GTX 295, GTX 590, GTX 690 and 2x 980 Ti.
    I have lots of friends doing the same. Lots doesn't mean majority, you know.

    Maybe we should ban sports cars too? Most people drive SUVs.
     
  27. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Just clarifying for the people who may otherwise have been misled by your statement.
     
  28. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Doesn't change the fact that we can't remove SLI from gamer GPUs.
     
  29. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Correct, we can't remove it but NVIDIA can and has removed it from all but the top tier card. I wouldn't be surprised at all if it were to disappear in one to two generations seeing as they have yet to make it work for more than a very small number of titles.
     
    Joe-Censored and angrypenguin like this.
  30. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    That would be stupid. Many people don't use the top tier card. I myself used the 8800 GTS, for example, while the 8800 Ultra was the top tier. Others I know used the 1080 instead of the 1080 Ti, etc.
     
  31. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    Yet that is the current state of the market. Only the 3090 supports SLI and a few months back NVIDIA announced that all future games that use it will have to explicitly enable support for it. There will be no more "profiles".

    https://www.pcworld.com/article/357...n-slis-coffin-no-new-profiles-after-2020.html
     
  32. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Crazy, lucky for me I have a 3090 :D
    A stupid decision on their part. Not many 3090 owners will run SLI, but more 3080 owners would want to. They're missing a lot of potential customers.

    I might buy a second 3090 if and when prices get back to their initial price point. Problem is, Unity PLM doesn't support SLI.
     
  33. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,875
    I think you greatly overestimate how many people actually care about, let alone use, SLI. I don't know a single person inside my circles who uses it, except going back more than 5 years.

    Just because you use it or know people that do does not reflect the actual number of people out there :) In most places you would find it much harder to find someone using SLI than not :)
     
    angrypenguin likes this.
  34. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,509
    And the SLI thing in this context is absurd anyway. The suggestion was about increasing general availability. A handful of people not getting a second card is a non-issue compared to the majority of people not being able to get a card at all.

    (And even that isn't really that big a deal. We can all "suffer" with whatever we already have, I'm sure.)
     
  35. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    When others tell me what I want, we are dangerously close to communism. Removing features like SLI from gamer cards is not a good solution at all.
     
  36. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    Quite a few years back I had a Dell XPS M1730 gaming laptop, for about £1.5k at that time. And I am not even a top tier gamer.

    It had optional SLI, but I never took that option; I could have extended it later. But I never hit that need during roughly the next 4 years of using it (until I dropped it hard on the floor :/ ). Sure, there were some games which maybe could have benefited from SLI, but I had too little interest in those to even bother upgrading. It ran most things I wanted at that time. I was very happy with it, besides dodgy Dell practices with the power supply cord / performance drop, which I had a bit of a fight over. It was quite heavy and could get hot though: +4kg laptop plus a 1kg power adapter alone, if I recall correctly.

     
    Last edited: Feb 25, 2021
  37. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,875
    And when people say things like this are "dangerously close to communism", we get dangerously close to "sensationalism" :) The removal of SLI, because barely anyone uses it and barely anything supports it, is not communism. People pointing out that nobody uses it is not communism.

    In fact, I will go out on a limb and say nothing in this thread is even related to communism :) But nice try ;)
     
  38. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,124
    It's capitalism not communism that has led to the downfall of SLI. Supply and demand. There has always been very low demand for SLI for the simple fact that upgrading to a higher tier of card is almost always more economical than buying a second card which has led to limited investment of time and money into the technology.
     
    MadeFromPolygons and Antypodish like this.
  39. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,875
    Also, a lot of the time SLI was just buggy as **** even when it did work. The only people I knew who had SLI were a really small tight knit group of people on my CS course at university, and they spent more time looking into solutions for issues caused by SLI than actually using it for gains.

    Meanwhile I sat there happily gaming without issues on my single slightly beefier but cheaper overall card and laughing to myself :D

    SLI always seemed like a smart marketing-psychology move in order to try and sell lower tier cards to pc-master-race people who would only ever usually buy top tier.
     
  40. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    Yeah, I can see that. I myself spent tons of time years back comparing whether I wanted SLI or a better single card, reading through pages and pages of forums discussing the pros and cons of SLI. The conclusion was almost always: get a single but better card.
     
    MadeFromPolygons likes this.
  41. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    Nope.

    We can, as NVidia apparently abandoned it. Same goes for CrossFire, as far as I know.
    https://nvidia.custhelp.com/app/answers/detail/a_id/5082

    This tech is unpopular; I recall seeing it mentioned on game forums something like once per year. And besides, even with SLI (which nobody appears to actually use) a gaming machine will use two GPUs, tops.

    While miners will be hungry for infinite number of GPUs.

    Well, you can always build your own graphics card factory.

    It is a market economy. The manufacturer decides what to sell, and you decide whether to buy or not. If you don't like what they're selling, you stop buying, and maybe they'll take your wishes into account and rethink their ways. Or maybe they won't, because there are two major GPU manufacturers on the market, and the rest of the buyers don't appear to care deeply about the tech you want...

    Also, miners being able to slurp up all the GPUs on the market is equivalent to others telling you what you're allowed to have anyway.
     
    Ryiah likes this.
  42. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    These are the specs of my 3090 from their site:

    [attached screenshot: RTX 3090 specifications]

    Looks like you are wrong.

    edit: They will stop trying to support SLI at the driver level; instead it's done at the API level. That's not the same thing as abandoning it.
     
  43. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    A single card supporting the feature doesn't mean a thing.

    The tech is "so popular" that there are no statistics for its adoption, and nobody talks about it.

    And the number of games supporting "native SLI" is tiny.
     
  44. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Yeah, at the time of that announcement I was sure it effectively killed SLI going forward. Only a small handful of games will bother building in support for dual 3090s.
     
  45. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    We could have a betting pool on how long it will take for Unity to support it. Knowing their track record with Vulkan + gfx jobs, I will not hold my breath :p
     
  46. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    For the dual 3090 scenario, you can guarantee it won't be :)

    Good luck even finding two right now at below scalper pricing!
     
  47. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Really glad I got one back in October for list price.