
Suggest me a good PC configuration for game development

Discussion in 'General Discussion' started by kaiyum, Oct 9, 2016.

  1. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
Hi fellas, it looks like I will be getting a new PC soon. https://unity3d.com/unity/system-requirements
This link is not helping me much. :( Right now I am using a low-to-mid-range machine with this specification:
• Core i5-4570
• 16GB RAM
• GTX 650 Ti
• Windows 10 64-bit
• No SSD
The lightmap baking takes too long to bear, compared to the former Autodesk Beast lightmapper. Compiling UE4 from source takes around 30 to 40 minutes. A WebGL or IL2CPP build takes a ridiculously long time to finish, 10 to 30 minutes :(. Now I wish to stop this madness, so I have decided to upgrade.
The new machine should keep me happy for at least the next 3-4 years. The estimated budget is around 2k USD. So suggest me some configurations. :)
     
  2. zugsoft

    zugsoft

    Joined:
    Apr 23, 2014
    Posts:
    453
I have a $4000 computer with dual Xeons at 4GHz (24 logical cores), and the lightmap bake with Unity 5 is still very, very long.
My previous computer was a simple 4-core i7 (an $800 machine), and honestly the difference is not impressive.

Don't waste your money; just buy an SSD and at least an i7 for your socket 1150.
     
    Last edited: Oct 9, 2016
    schmosef, kaiyum and Martin_H like this.
  3. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,985
    I strongly suggest using SSD drives, since that will significantly boost performance. If you want to upgrade your existing PC, switching to an SSD drive, 32GB of RAM, and a GTX 1070 would yield a huge performance boost.

Switching from your current i5 to an i7 would basically only get you hyperthreading support. Hyperthreading improves performance, but not as much as additional physical cores do. With hyperthreading, your PC will show 8 logical processors in the OS, but it won't run much better than your i5 does in most tasks. An i7 will still be a 4-core CPU. Hyperthreading offers a relatively mild performance boost.
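The "mild boost" intuition can be made concrete with Amdahl's law. This is a back-of-envelope model, not a Unity benchmark: the serial fraction of a workload caps how much extra (logical or physical) cores can help.

```python
def amdahl_speedup(parallel_fraction, n):
    """Amdahl's law: overall speedup when the parallel fraction of a
    workload is spread across n cores; the serial part is unchanged."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n)

# A workload that is only 50% parallel barely benefits from doubling cores:
print(amdahl_speedup(0.5, 4))   # 1.6x on 4 cores
print(amdahl_speedup(0.5, 8))   # ~1.78x on 8 logical cores
# Whereas a mostly parallel job (like a light bake) scales much better:
print(amdahl_speedup(0.95, 8))  # ~5.93x
```

Hyperthreading is weaker still than this model suggests, since two logical cores share one physical core's execution units.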
     
    Last edited: Oct 9, 2016
    JamesArndt and kaiyum like this.
  4. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,985
    If you want to build a new system instead of upgrading your existing system, here are specs I would recommend:

    Intel i7-6700K (4.0GHz)
    32GB RAM
    GTX 1080
    M.2 style Samsung SSD drive (950 Pro for example; 960 Pro when available)
     
    Last edited: Oct 9, 2016
    kaiyum likes this.
  5. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Something to consider is that AMD has its new Zen CPU coming out in the next few months*.

It will be an 8-core part with per-core performance expected to be similar to current Intel CPUs.

Expect it to undercut Intel CPUs on price and therefore drive down the price of Intel CPUs.

    So if you can hold off upgrading for just a few months, you could get a higher spec system for a lower price.

    *Expected to be unveiled/released at CES 2017 (Jan 5-8).

    But an SSD can really boost project and system performance.

I'm not sure how much difference going from 16 GB to 32 GB of RAM will make in Unity, though.
     
    kaiyum likes this.
  6. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,985
As for 16GB vs 32GB: I have seen cases a few times, in both Unity and UE4, where baking lighting for certain scenes needed more than 16GB of RAM. Plus the added RAM can be used for file caching by the OS, which can be helpful when compiling applications and games.

    I would not hold my breath waiting for AMD to ship a competitive CPU. The new Zen based AMD CPUs sound impressive, but AMD's track record in recent years has been consistently poor. AMD hypes high and then under delivers. If I had to guess, I would say AMD's Zen CPU will probably ship late and perform measurably slower than Intel's current CPUs. AMD's Zen CPU will probably not even match the performance of previous generation Intel i7 CPUs.
     
    MV10 and kaiyum like this.
  7. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
My point is that if your upgrade is not urgent, it could be worth your while to wait for AMD's Zen. If it does challenge Intel on price/performance, then expect Intel to drop their prices. If not, then you can probably save up more for a better upgrade.
     
    kaiyum likes this.
  8. zugsoft

    zugsoft

    Joined:
    Apr 23, 2014
    Posts:
    453
A GTX 1080 versus a GTX 650 will change nothing for baking.
And I prefer to develop a game on a GTX 650, so I am forced to optimize my game to stay above 30fps.
Start by buying a Samsung 850 Pro, and check whether your 16GB is enough or not.
     
    RaoulWB and kaiyum like this.
  9. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
hmm, an SSD will be added for sure. It's sad that Enlighten can't utilize GPU power for baking. The baking stage is terrible and essentially worsens the whole game development pipeline :( It's the weakest link in the chain :( Several times I have thought of building a plugin or something like that for Blender, so that I can go back and forth between DCC packages and game engines. The idea is to leverage the GPU-accelerated lighting in those DCC packages: I would bake the light out as a texture and put it into Unity with a single button click. It's so much pain to generate them within Unity :(
It is one of the prime reasons for upgrading. As pointed out by menfou, it's really sad that upgrading won't help in this area. Is there any server-side solution any of you can think of? For example, I would put the whole Unity project into the cloud, wait a few minutes, and then be done with lighting?

I can wait a few more months for AMD, but I honestly do not want :( to. I have also thought of getting a Mac Pro. It would be nice to run both Windows and macOS, so I could cover both platforms, but it is overpriced for me :(

About RAM: how much is going from DDR3 to DDR4 worth? And is ECC RAM really necessary for me?
     
  10. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,985
    ECC helps with reliability for servers. ECC does not help with performance, though. In fact, ECC can cause a small performance hit.
     
    JamesArndt, kaiyum and schmosef like this.
  11. schmosef

    schmosef

    Joined:
    Mar 6, 2012
    Posts:
    851
    Another thing that will improve performance is a RAM cache. About a year ago I maxed out the RAM on my motherboard (32GB) and installed SuperCache. It's a big improvement.

On my next computer I'm going to get a motherboard that can handle 128GB of RAM and dedicate most of it to a RAM cache.

    It's important to have a good UPS for this too.
     
    kaiyum likes this.
  12. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
If you look at the Enlighten blog, they mention GPU acceleration in their 3.04 release, but it's an Unreal-only feature!

On the roadmap, Unity has:
• Networked GI — they hacked together a working networked GI solution during their Hack Week.
• Progressive GI, being worked on for 5.5.
• PowerVR ray tracing to speed up GI calculations.
Mind you, you could always opt for an "Intel Knights Landing 72-core supercomputer on a chip!" After all, time is money!
     
    kaiyum likes this.
  13. RichCodes

    RichCodes

    Joined:
    Apr 3, 2013
    Posts:
    142
    Has anyone actually tried bench-marking this stuff?

    I am suddenly interested in knowing the performance differences between building/baking/etc on different configurations.
    Even if the difference isn't a lot, I would rather optimize my hardware for Unity than for playing games...
     
    Last edited: Oct 10, 2016
    kaiyum likes this.
  14. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,985
    That CPU has 64-72 cores, but runs at a relatively low clock speed of 1.3GHz-1.5GHz. It would only be useful for situations where you knew all of the cores could be used effectively. Unfortunately, many tasks we do day to day are going to benefit more from high clock speed than from high core count. The Intel Xeon Phi 7290 might be great for running a bunch of virtual machines in a datacenter, but I am guessing Unity will likely run better on an Intel i7-6700K.
     
    kaiyum and Martin_H like this.
  15. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Did you notice the cores come in tiles, 2 cores and 2 VPUs per tile? The VPUs are 512-bit SIMD processors for doing batched maths very fast, e.g. 8 double-precision or 16 single-precision float ops per cycle. I'd bet they could tear through GI calculations in a fraction of the time a regular CPU can, assuming your GI solver was written and compiled to run on Knights Landing hardware.
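The per-cycle numbers above follow directly from the register width. The peak-throughput figure below is a rough sketch with assumed core count, clock, and FMA accounting, not a measured spec:

```python
# Lanes in a 512-bit SIMD register:
print(512 // 64)  # 8 doubles per instruction
print(512 // 32)  # 16 single-precision floats per instruction

def peak_gflops(cores, vpus_per_core, lanes, clock_ghz, fma=True):
    """Back-of-envelope peak GFLOP/s: one vector op per VPU per cycle,
    doubled if a fused multiply-add counts as two FLOPs."""
    return cores * vpus_per_core * lanes * clock_ghz * (2 if fma else 1)

# Hypothetical 72-core part, 2 VPUs/core, 16 float lanes, 1.5 GHz:
print(peak_gflops(72, 2, 16, 1.5))  # 6912.0 GFLOP/s, i.e. roughly 7 TFLOPS
```

Of course, a GI solver only sees anything close to that if its inner loops are actually vectorized for that instruction set.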
     
    kaiyum likes this.
  16. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,985
    Yeah, the Knights Landing hardware looks impressive, and I don't doubt some software will eventually get tuned for it. But I highly doubt most software we use day to day will benefit from it. For example, I don't expect to see the Unity editor or Enlighten get optimized for Knights Landing.
     
    kaiyum, angrypenguin and Martin_H like this.
  17. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
What if they built an ARM version of the GI solver for ARM SoC platforms, so we could build networked GI server-farm solvers!
     
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
    Are you purposefully listing slower processors with each suggestion? :p
     
    MV10 and angrypenguin like this.
  19. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
I would imagine that, up to a point, calculating scene GI is not as restricted by processor speed as it is by number of cores and bandwidth. And who wouldn't love to be able to send GI off to a cheap ARM server farm and get it back when it's done, while continuing work on your desktop machine?

After all, Geomerics (makers of Enlighten) is an ARM company!
     
  20. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
Individual core speed is just as important as the number of cores in your processor. Just as one example, the Intel i3-6320 is only a dual-core processor, yet it's able to compete with the eight-core AMD FX-8350 because each of its cores is that much faster.

    That doesn't necessarily mean anything though.
     
  21. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Depends on how parallel the job is. I bet GI can be massively parallelized, and it would probably work better on a high-end GPU than on a CPU.
     
  22. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,516
There's more to it than just "running" on there; the software also has to work with data in a way that suits that particular processor. And looking at how things went for the PlayStation, it seems that developers would much rather optimise for common cases than for specialist CPUs.

    In all honesty, I wonder if something like cloud computing wouldn't be a better fit for light baking? I know it relies on having a decent network connection, but then you can rent as many big CPUs as you need for the time you need, and give 'em right back afterwards. I've seen offices that have their own internal server farms for this kind of thing and it's no small task. It chews up power, it takes up space, it needs cooling, it needs maintenance, the hardware isn't exactly cheap, and it goes out of date. If I didn't need to be using it all of the time (and maybe even if I did) I'd strongly consider getting that off-site.
     
    Kiwasi likes this.
  23. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
If an eight-core processor is unable to surpass a dual-core processor under ideal conditions (e.g. benchmarks), then it won't be able to surpass it under less-than-ideal conditions. Bringing more cores into the equation doesn't automatically mean more speed.

    Are you working on the assumption they haven't already tried? Because a quick search shows they have some aspect(s) of their system running on graphics hardware already. Check the sub-heading "Cubemap Solver" at the link below.

    http://www.geomerics.com/blogs/enlighten-3-02/
     
    Last edited: Oct 11, 2016
  24. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,327
Unless something has changed in the most recent versions of Unity, Unity GI does not utilize the GPU. It is CPU only.
     
  25. zugsoft

    zugsoft

    Joined:
    Apr 23, 2014
    Posts:
    453
I created a simple scene with Unity 5.3, only some cubes.

http://www.zugsoft.com/unity/Bake.zip (32Kb)

Just open the scene "Bake", open the Lighting window, and click Build.
If you want to bake again, you have to clean out the directory \AppData\LocalLow\Unity\Caches\GiCache to be sure Unity will not reuse the previous GI.
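Clearing the cache can be scripted so every benchmark run starts cold. A minimal sketch in Python, assuming the default Windows cache location mentioned above (the path and %USERPROFILE% expansion are assumptions; adjust for your machine or OS):

```python
import os
import shutil

def clear_gi_cache(cache_dir=None):
    """Delete Unity's GI cache directory so the next bake starts from
    scratch instead of reusing cached intermediate results.
    Returns True if a cache directory was found and removed."""
    if cache_dir is None:
        # Assumed default location on Windows, per the post above.
        cache_dir = os.path.expandvars(
            r"%USERPROFILE%\AppData\LocalLow\Unity\Caches\GiCache")
    if os.path.isdir(cache_dir):
        shutil.rmtree(cache_dir)
        return True
    return False
```

Run it between bakes; Unity recreates the cache directory on the next build.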


    Unity Version :
    CPU :
    RAM :
    HD :
    Time :

    My Result :
    Unity Version : 5.3.6
CPU : i5-4570S @3.2GHz
RAM : 8GB
HD : SSD Samsung 850 Pro
GPU : Intel HD 4600
Time : 1 min 19 s


     
    Last edited: Oct 11, 2016
    kaiyum and RichCodes like this.
  26. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,433
    Unity Version : 5.4.0f3
    CPU : i7 3820 @3.7GHz
    RAM : 64GB
    HD : SSD for OS drive and HDD for data storage (including unity projects)
    Time : 1 minute 09 seconds


     
    RichCodes likes this.
  27. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,984
Just turned up to say I think the render-farm idea for light baking is interesting. A number of tools are heading in the direction of being more resource-hungry than a single PC can reasonably handle; for example, Artomatix's machine-learning-based texture work runs on a farm and would be impossible to run on a PC. I can imagine, say, a future terrain-building application running on a farm through a WebGL interface, where you just download the results when you're happy. Doing it this way can open up a lot of possibilities.
     
    Last edited: Oct 11, 2016
    kaiyum likes this.
  28. RichCodes

    RichCodes

    Joined:
    Apr 3, 2013
    Posts:
    142
    Unity Version : 5.4.0f3
    CPU : AMD FX-8350 4.0GHz
    RAM : 16GB GSkill Ripjaws X DDR3 2133
    GPU: R9 285 2GB
    HD : Muskin Chronos 480GB SSD
    Time : 1 minute 26 seconds
     
  29. jonnytracker

    jonnytracker

    Joined:
    Sep 13, 2016
    Posts:
    31
Use a less powerful PC, so that any PC can run your game. This is the mistake big commercial studios make: they build a PC from the future, develop games on it, travel back to the past, and publish the game. Now everyone plays a slideshow..

Imagine those ping-pong, 8-bit Mario, and Ninja Gaiden games with slide cutscenes. Amazing games.....
     
  30. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Just looked up PassMark scores for the i3-6320 and the FX chips:

i3-6320 gets a 6058 PassMark

FX-8320 gets an 8008 PassMark

FX-8350 gets an 8939 PassMark

So I think you need to back up your claim with better data? :p
     
  31. RichCodes

    RichCodes

    Joined:
    Apr 3, 2013
    Posts:
    142
    The mistake would be developing for the specs of the machine they are using, instead of the specs they pick as their minimum/recommended.
    I have only worked on one PC title in any sort of major role, and we actually built a machine with the specs that we wanted for the minimum to test with. No way I would have wanted to actually develop on that machine though, things like a light bake are way more demanding than running the game!
     
  32. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Has anyone got, or know someone with, a 10-core 20-thread Intel i7-6950X? It would be interesting to see what it does in the GI benchmark!

Or a Xeon or Opteron multi-core server chip?
     
  33. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Out of interest, what breakdown of time are you seeing in the editor log file (three bars at the top right of the Console window -> Open Editor Log)?

I'm seeing a lot of time spent in Final Gather.

This is not the Bake benchmark scene; I'm playing around with the Sponza 3D model and GI settings.

It definitely seems Final Gather is the big performance hog on my CPU.
     
  34. Deleted User

    Deleted User

    Guest

It's entirely dependent on the application. I know Lightmass / shader cache compilation in UE will take full advantage of a hexacore with multi-threading, hence why I got one. 6 cores @ 3.6GHz will of course run faster than a quad @ 4.0GHz in that scenario.

But you'd have to ask Unity how many threads/cores Enlighten will use. I have to admit it's gotten a lot better, but a large scene in 5.4 will still take around six hours @ 1 texel.
     
    kaiyum, angrypenguin and Ryiah like this.
  35. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
PassMark is not the only benchmark available. At the bottom of the linked site are more benchmarks, and some of them report very different results than PassMark does. Like @ShadowK said, it's very application-dependent.

    http://cpuboss.com/cpus/Intel-Core-i3-6320-vs-AMD-FX-8350
     
    angrypenguin likes this.
  36. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
On one of my antique PCs, this is the result:
Unity Version : 5.3.4f1
CPU : Core 2 Duo @2.67GHz
RAM : 8GB
HD : SSD 128GB
GPU : Nvidia GT 730
Time : 4 min 46 s
     
    Martin_H likes this.
  37. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
At the studio office, we generally save all our work and hit bake before we go home. In the morning, our bake has finished :( This creates a very lengthy iteration time, so we assume a lot of things and only bake at the final stage.
     
  38. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
    Only one day? :eek:
     
    kaiyum likes this.
  39. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,516
Are you also considering your peripherals? They won't affect a light bake, but they can have an impact on overall productivity.
     
    kaiyum likes this.
  40. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
:confused: Not actually one day; it's roughly 12-14 hours (starting at 8-10 PM), and we tend to go home late on baking day, because we have to make sure that everything is perfectly laid out in the scene. After the baking stage, we generally avoid any static-object change, literally at any cost. :(
We have a dedicated PC for baking, and we don't do anything on that PC while it's baking.
If only I could find a config where a next-gen light bake would take minutes! :) Let's say I am thinking on a grand scale and can go as high as 10k. Could this problem be solved then? Please share your experience on this matter, guys :)
     
  41. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
Is there a way to trick a program into thinking it's running on a massive multi-core chip when in fact it is running in an emulator on your GPU?
     
  42. Deleted User

    Deleted User

    Guest

Nope. Still, as everyone has been saying, it's application-based. In Unreal with preview quality it only takes about 5 minutes to bake a scene; full bakes take about 20 minutes to an hour (on some pretty large scenes) using a render farm (well, "render farm" as in a PC and two servers) via a distributed client.

When Enlighten has the option to distribute to other hardware via network rendering, that should cut down the bake time dramatically. One of the main issues with baking on development machines is that the lightmapping solution will always try to eat as many resources as it can if you let it; if it's restricted so that you can still work, then of course it's going to take far longer to bake.

I've tested various rendering solutions for concept art, and it turns out having a powerful CPU is still the way to go.

If you are only going to have one machine, then I'd recommend a hexacore i7, tons of RAM (32GB), and an M.2 SSD. The GPU only needs to be as powerful as the top end of the games you're trying to develop, so we use R9 390Xs or 980 Tis.
     
    kaiyum likes this.
  43. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
    No.
     
    kaiyum likes this.
  44. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
Wow, this thing exists?! :eek: I did not know that! Thanks for the info. Actually, from the announcement of UE4 I tried to get a taste of it. Then there were pressures from other projects (for clients), other personal/educational things came up, and I lost touch with UE4 in the meantime.
But then there was a time when I understood the importance of C++ if I was to have any luck with UE4. Then the low-level things came up naturally one by one; I literally went to the dogs at an x86 bootloader :oops: Then of course I reverted back to normal life. During that learning process I lost my actual connection with UE4, so a lot of things around UE4 are unknown to me. But I am determined that I will see it top to bottom one day and call it a day :p Mind you, all of this has to be done in my free time every day.

Anyway, that is another reason for considering UE4 for my 3D projects. But I hope the Unity and Enlighten guys will roll out a robust networked solution soon.

The point is, when I code a game I hate handling pointers, and I also dislike macros. I tend to dislike the hardware/OS-related things too (const/volatile/asm); I want to play with pure logic. Unity's clean abstraction lets me worry only about the game logic, so I can quickly make something. That is one of many reasons I personally like Unity. Anyway, thanks again for the info :)
For an existing OS? No. However, the question could be put like this: "is there an OS which is powered by the GPU, so that applications created for that OS utilize hundreds of cores?"

In that case, the answer is still "no". This is because, as far as I know, GPUs do not support interrupts, which are a key requirement for running an OS. Current GPU tech is also not suitable for this kind of general computation. Take a look at this: http://stackoverflow.com/questions/...nse-to-run-os-kerne-level-computations-on-gpu

I do not understand this "Larrabee" thing very well, but take a look at it as well: https://en.wikipedia.org/wiki/Larrabee_(microarchitecture)
     
  45. Deleted User

    Deleted User

    Guest

It was really just an example. Unity is creating a "PowerVR"-based lightmapping solution, which will hopefully be better than Enlighten, and hopefully Enlighten will get a distributed client for render farms at some point. So Unity will catch up eventually.
     
  46. leegod

    leegod

    Joined:
    May 5, 2010
    Posts:
    2,345
For the graphics card, a GTX 970 is enough. If you also play some games a little, go for a GTX 1060. If you are a heavy gamer, go for a GTX 1080.

The CPU is always an i7-6700 or 6700K.

An SSD is a must in all cases.
     
    kaiyum likes this.
  47. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,141
Unless you live in the USA. The GTX 1060 6GB is cheaper than the GTX 970 here and almost ties with a 980. That said, you might be able to pick up a used GTX 970 for a reasonable price from eBay or Reddit.

    https://www.reddit.com/r/hardwareswap
     
    kaiyum and Martin_H like this.
  48. leegod

    leegod

    Joined:
    May 5, 2010
    Posts:
    2,345
Then I will revise that to a GTX 960 or 950. If you are not playing games, those are (maybe) enough to run Unity, 3ds Max, and Unreal 3D projects.

But as of now, the GTX 1060 is preferable.
     
    kaiyum likes this.
  49. GoesTo11

    GoesTo11

    Joined:
    Jul 22, 2014
    Posts:
    604
  50. kaiyum

    kaiyum

    Joined:
    Nov 25, 2012
    Posts:
    686
Meanwhile, today I conducted an exciting test. :) My idea was: "let's bake the shadows, emission, etc. in Blender with the GPU-accelerated Cycles renderer, then bring them back into Unity."
I told my artist buddy to make two UV channels for each of the road and the side props. The road's diffuse channel was tiled and used uv0, and the road consisted of many road blocks merged together as one big mesh. The side pillar props were handled the same way. The lightmap generated from Blender was then fed into uv1 of a custom shader without any tiling, due to the naturally non-tiled nature of light. The result was spectacular, at least a lot better than what Enlighten could achieve within a reasonable time frame :)
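The two-UV lookup described here can be sketched in plain Python (nearest-neighbour sampling on tiny grayscale "textures"; the function names are illustrative, not Unity or shader API): tiled albedo is read through uv0, the untiled baked lightmap through uv1, and the two are multiplied.

```python
def frac(x):
    """Wrap a tiled texture coordinate into [0, 1)."""
    return x - int(x)

def sample(tex, u, v):
    """Nearest-neighbour lookup in a texture given as rows of grayscale values."""
    h, w = len(tex), len(tex[0])
    return tex[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

def shade(albedo, lightmap, uv0, uv1, tiling=4.0):
    """Tiled albedo via uv0, unique baked lighting via uv1, modulated together."""
    a = sample(albedo, frac(uv0[0] * tiling), frac(uv0[1] * tiling))  # repeats
    l = sample(lightmap, uv1[0], uv1[1])  # unique per-surface baked light
    return a * l
```

In a real shader the same idea is just two texture samples and a multiply; the key requirement is that uv1 is a unique, non-overlapping unwrap, exactly as with engine-generated lightmap UVs.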

But I got stuck with dynamic light. The requirement is: "Static objects will not cast or receive shadows from other static objects. Static objects will cast shadows on dynamic objects only, and will receive shadows from dynamic objects only."

I do not know how I will achieve this. For simpler geometry I can get a reasonable result with the mesh renderer's shadow-casting properties, but for complex geometry it looks like I'd have to hack into the shadow-map generation process :(
Actually, for the GPU I think I will go with a 1080. :)
     
    JamesArndt likes this.