
AMD and ATI

Discussion in 'General Discussion' started by taumel, Jul 24, 2006.

  1. taumel

    taumel

    Joined:
    Jun 9, 2005
    Posts:
    5,292
  2. hsparra

    hsparra

    Joined:
    Jul 12, 2005
    Posts:
    750
    Pretty interesting. There is a big article in the Austin paper today about this. It will be interesting to see how Intel responds. Will they finally get decent integrated graphics?
     
  3. NicholasFrancis

    NicholasFrancis

    Joined:
    Apr 8, 2005
    Posts:
    1,587
    Yeah - have just been reading up on it as well... I can see the point from AMD's perspective (completeness -> they don't have an integrated chipset).

    From what I've heard, the latest batch of Intel GPUs are actually not that bad. They have vertex programs (which is what I really care about). Still low end, but no worse than others' low-end.
     
  4. bigkahuna

    bigkahuna

    Joined:
    Apr 30, 2006
    Posts:
    5,434
    For game developers, integrated graphics are a bad thing. I know that ATI has been used on Macs with success, but in the PC world, ATI graphics cards spell headaches for game developers. This merger is bad news IMHO. I've always been very happy with AMD CPUs, and they've always worked well with nVidia graphics cards in the past. The combination has been awesome for high-end game graphics, but I suspect in the future that combination will either not be possible or will have problems.

    I spoke with a rep from AMD a couple of weeks ago and learned that they will be changing all their CPU footprints in the near future, so the motherboards that are on the market right now will become obsolete within the next 12-18 months... argh... :(
     
  5. Morgan

    Morgan

    Joined:
    May 21, 2006
    Posts:
    1,223
    I don't know if it's the concept of integrated graphics per se that's a bad thing, but rather this:

    * Low-end graphics are cheaper than higher-end.

    * People who aren't shopping for higher-end graphics end up buying what's cheap.

    * Therefore people who don't shop for higher-end graphics can't play games that demand higher-end graphics.

    I don't think anything will change that--BUT "low-end" graphics will continue to improve just as high-end graphics do. One day, low-end cheap-o graphics will be super-detailed, and then we'll be bothered by all the cheapskates buying that instead of something EVEN better :) That day can't come too soon for me.
     
  6. taumel

    taumel

    Joined:
    Jun 9, 2005
    Posts:
    5,292
    Well, I guess not only in my opinion this is very interesting news. In the short term nothing will change, but if you look a bit further into the future, there is a great chance that this will have a large impact on what we will see, i.e. whether we will still have the freedom to choose between different graphics cards, no matter if it's Intel or AMD and so on. You can think of scenarios where this helps push the integrated graphics market, and scenarios where we end up with another monoculture of AMD/ATI vs. Intel (if they decide to enlarge their dev team) or Intel/nVidia.
    For the chipset market it's very likely that there will also be two roads, with Intel and AMD/ATI, as I don't think ATI will make chipsets for Intel in the future. This could easily become an Intel and nVidia only market. Well, let's wait and see...

    @bigkahuna
    I experienced it the other way around. ATI is very stable compared to nVIDIA on the PC. You get many more headaches with nVIDIA, which was almost famous for causing bluescreens with their drivers.
     
  7. podperson

    podperson

    Joined:
    Jun 6, 2006
    Posts:
    1,371
    Repeat after me: Integrated graphics aren't the enemy. Bad graphics are the enemy.

    If AMD is able to produce a single chip solution which provides high quality 3D graphics and a solid processor on a box that costs $400 I don't see this as a problem.

    We're not looking at the bar being lowered for high end PCs, so much as the bar being raised for low end PCs.

    Also note that CPUs tend to be socketed and upgradeable, even on cheap PCs. Integrated graphics aren't.

    We're in the midst of a shift in thinking that considers GPU functionality (or think of it as a powerful general-purpose DSP) to be as intrinsic to a CPU as floating point.

    Here's a platform with no dedicated graphics chip: the PS2. Must suck for games.
     
  8. socksy

    socksy

    Joined:
    May 21, 2005
    Posts:
    244
    Can't disagree with your logic there.

    ;)
     
  9. taumel

    taumel

    Joined:
    Jun 9, 2005
    Posts:
    5,292
  10. Marble

    Marble

    Joined:
    Aug 29, 2005
    Posts:
    1,268
    Note that the zdnet article appears to have been written by Jason O'Grady, author of Powerpage.org, a somewhat unreliable Apple rumor site. Even though it's just speculation, it's good to know what sort of speculation you're reading ;).
     
  11. taumel

    taumel

    Joined:
    Jun 9, 2005
    Posts:
    5,292
    Yep, I'm pretty open-minded, don't you think?! I even read your posting! ;O)

    Well, I guess you can find a little bit of truth nearly everywhere. And looking at my mini, the new ones already use an Intel one. But after all, it's your responsibility to weigh the information given to you correctly...
     
  12. deram_scholzara

    deram_scholzara

    Joined:
    Aug 26, 2005
    Posts:
    1,043
    The zdnet article refers to the G5 as a G4... clearly the author is a bit confused.
     
  13. taumel

    taumel

    Joined:
    Jun 9, 2005
    Posts:
    5,292
    Who is not... :O)