Best bang for buck laptop?

Discussion in 'General Discussion' started by Radu392, Feb 10, 2020.

  1. Radu392

    Joined:
    Jan 6, 2016
    Posts:
    210
    I'm looking into getting a laptop to do some Unity game dev in my free time. I've been working with Unity on my main rig, which has an i5-7500 @ 3.4 GHz, 16 GB RAM, a 1 TB SSD, and a GTX 1070, for 3 years now, and it's been more than enough for my work and for any kind of 'high end' game. But now I'm looking to just code on a laptop at night on smaller personal projects, mostly with ECS. I'm looking for a laptop that can match or get close to my main rig's CPU power, with 8 GB+ RAM and NO dedicated graphics card. Storage can be HDD or SSD, I don't really care, as long as it's over 500 GB. I never do anything involving a GPU on my old laptop, since if I want to play graphically demanding games I just hop on my PC, so I'm looking to cut costs, which is why I don't want to buy a laptop with a GPU.

    I'm not looking for laptops that have more power than my main rig, because I want my environment to be as close as possible to an average gamer's rig. It makes it easier to manage expectations when pushing your hardware. And to save money, of course.

    What I've found so far is:
    https://www.amazon.ca/Dell-Inspiron...r_1_18?keywords=laptop&qid=1581295694&sr=8-18
    and
    https://www.amazon.ca/Asus-FX505DD-...3?keywords=laptop&qid=1581295694&sr=8-13&th=1

    I would buy the second one in a heartbeat if it didn't include a GPU. I hesitate on the first one because it doesn't have any reviews, and it's a Dell...

    I've been using this website to compare the CPU speeds of those 2 choices with my main rig's CPU. The scores are all quite close to each other. But those results are kinda sketchy. How can an i5-8265U @ 1.6 GHz have a better single-thread rating than my main rig's CPU, which runs at 3.4 GHz, given that the rest of the specs are roughly the same?
    https://www.cpubenchmark.net/compar...en-7-3750H-vs-Intel-i5-8265U/2910vs3441vs3323

    Anyway, any suggestions?
     
  2. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,151
    The 1050 is barely a GPU, but on top of that, you don't want to be developing on what you assume the average user's computer will be, because making a game has its own overhead.
     
  3. Radu392

    Joined:
    Jan 6, 2016
    Posts:
    210
    Oh, I know all about overhead, trust me. That's why I'd keep it to smaller-scale, simpler projects when I work on my laptop.

    Even if it's not a powerful GPU, if it wasn't there at all, I'd be saving anywhere between 200 and 250 bucks. That's cash money for an indie dev.

    An additional question that came to mind: the Unity editor itself doesn't use the GPU for anything, right?
     
  4. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,151
    It uses the GPU to render both the Scene view and the Game view.
     
    Ryiah and Billy4184 like this.
  5. Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,023
    Why would you do game dev on a computer without a dedicated GPU? You're boxing yourself into a corner there, and the editor will probably run like junk.

    Sooner or later you're going to want that extra capability and it'll cost you a lot more than $250.
     
    xVergilx and Ryiah like this.
  6. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,183
    Unity's editor makes extensive use of the GPU. For starters, the user interface of the editor is powered by the GPU, not the CPU. Beyond that, everything in the Scene and Game views is rendered via the GPU, Unity's upcoming lightmappers are powered by the GPU, Unity's VFX particle system is powered by the GPU, etc.

    Choosing an iGPU might make sense if you're developing simple games for mobile devices, or if you want long battery life (gaming laptops rarely see more than a couple of hours when using the GPU). For just about every other situation it will be an awful experience.
     
  7. Radu392

    Joined:
    Jan 6, 2016
    Posts:
    210
    Obviously, Unity is going to use the GPU to render the Scene and Game views... What I didn't know is that it's also used to power the user interface. Do you have any source confirming that? You mean it actually helps with rendering windows such as the Inspector?

    Like I said before, I'd keep it simple. I have my main rig, which I use to create massive 64 km^2 open-world areas for my main project, which contains just about every mechanic Unity can throw at you. But I'd also want a recreational rig where all I do is play with ECS and sprites. That's it. Please stop assuming every other post in this forum is from a teenager wanting to make an FPS game and trying to figure out how to put a gun in the character's hands :)

    I guess tomorrow I'll disable the GPU on my main rig and observe how Unity behaves for a small 2D project with a 2 GB Assets folder. Might as well observe how it does with my main project's 48 GB Assets folder too, just for science.
     
  8. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,183
    https://www.slideshare.net/unity3d/...the-uielements-renderer-unite-copenhagen-2019

    We're not assuming anything of the sort. What we are assuming, based on your statement that you want the "best bang for buck", is that you want a good laptop for your investment. Keep in mind there are very limited options for upgrading a laptop. Memory and storage are about it.
     
    Last edited: Feb 10, 2020
    Radu392 likes this.
  9. Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,023
    That's what you want to do right now, but your computer will be around for much longer.

    The last thing you want to be doing is premature optimization just to get the editor to a usable frame rate.
     
  10. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Mostly it means disabling PBR anyway (source: my own potato).
     
    Radu392 likes this.
  11. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,183
    Clock speed is only one part of the equation when it comes to performance, and while it's very common for marketing to focus attention on it, the reality is that it's completely meaningless unless you are comparing processors from the exact same family.

    We need to look at the full statistics to have an actual idea of how the processors will perform. Both processors are on the same lithography (14 nm). There are different levels of refinement within the same lithography, but the two generations are close together, so the difference will be minimal.

    Improvements within the same lithography tend to be less than 10% (typically around 5%). Jumping to a completely different node size can bring large improvements, depending on the size of the jump. AMD's jump from 12 nm to 7 nm saw a performance improvement of around 15%, in addition to massive power savings.

    Looking beyond that, the maximum turbo frequency on the newer processor is slightly higher at 3.9 GHz versus the previous generation's 3.8 GHz, roughly a 2.6% improvement (0.1 / 3.8 ≈ 0.026).

    https://ark.intel.com/content/www/u...5-7500-processor-6m-cache-up-to-3-80-ghz.html
    https://ark.intel.com/content/www/u...-8265u-processor-6m-cache-up-to-3-90-ghz.html

    That said, at the end of the day the only true way to know performance is to run benchmarks. There are two types: real-world and synthetic. PassMark is a synthetic benchmark, so it may not reflect the real-world performance of the CPU; synthetic scores are typically used for very rough estimations and bragging between overclockers.

    Real-world benchmarks are typically Blender (to compare rendering speeds), Premiere Pro (to compare encoding and scrubbing performance), games, etc. The results vary wildly between apps, but some are close to the behaviour of Unity. Blender's rendering, for example, is a good way to estimate CPU lightmapping performance.

    Benchmarks for the exact model can be difficult to find. For that you take the closest processor of the same tier (e.g. i5 for i5) in the same family (e.g. the 8250U linked below for the 8265U), and then scale the numbers by the difference in clock speed between them, as in the sketch after the link.

    https://techreport.com/review/32863/intels-core-i5-8250u-cpu-reviewed/
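
    As a very rough illustration, here's that scaling in a few lines of Python. The sibling score is a made-up placeholder, not a real review or PassMark number, and linear scaling by max turbo clock is only a first approximation since it assumes identical IPC and ignores boost and thermal behaviour:

        # Sketch of the "closest sibling, scaled by clock" estimate described above.
        # The sibling score below is a placeholder, not a real benchmark result.

        def scale_score(sibling_score, sibling_turbo_ghz, target_turbo_ghz):
            """Scale a sibling CPU's benchmark score by the clock-speed ratio.

            Assumes identical IPC (same family and tier), so treat the result
            as a ballpark figure only.
            """
            return sibling_score * (target_turbo_ghz / sibling_turbo_ghz)

        # Hypothetical single-thread score for the i5-8250U, for illustration only.
        sibling_score = 100.0
        estimate = scale_score(sibling_score,
                               sibling_turbo_ghz=3.4,  # i5-8250U max turbo
                               target_turbo_ghz=3.9)   # i5-8265U max turbo
        print(f"Estimated i5-8265U score: {estimate:.1f}")  # -> 114.7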
     
    Last edited: Feb 10, 2020
    Billy4184 and Radu392 like this.
  12. Radu392

    Joined:
    Jan 6, 2016
    Posts:
    210
    You're probably right. Might as well fork out a lil more; who knows what will happen in a few years.
     
  13. Radu392

    Joined:
    Jan 6, 2016
    Posts:
    210
    So PassMark's rating tests are based on running the CPUs overclocked?

    Thanks for all the info btw!
     
  14. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,183
    No. I was saying that one of the primary uses of synthetic benchmarks is showing off your overclock. Cinebench is the app typically used for this. I don't know if PassMark records their own numbers or averages the results from people running their benchmark. Either way, the important thing to know is that the number may not reflect actual performance.