
Good or bad for indies? EA says gaming's move away from the ownership model is inevitable.

Discussion in 'General Discussion' started by VIC20, Sep 16, 2017.


In EA’s dreamworld indies are

  1. doomed

  2. still as broke as usual

  3. better off (because?)

Results are only viewable after voting.
  1. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    Actually, if you just stream the video output of the game, large game size becomes a pro-streaming argument, since you can play right away without downloading 100 GB first. The 100 GB just has to be available to the machine in the cloud that's providing your stream.

    I still think it's stupid as a general PC gaming solution. It might make sense, though, for providing console games to people who just have PCs, like (IIRC) PS Now is doing or planning to do. I'd probably pay for a streaming service to play "The Last of Us 2" and "Bloodborne", since my alternative would be buying a PS4 for basically just those two games that interest me on that platform.
     
    EternalAmbiguity likes this.
  2. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    If VR takes off, that would put an end to any idea of streaming for the foreseeable future, since you really can't have any latency at all, never mind the massive bandwidth needed.

    --Eric
     
    Kiwasi, Martin_H, VIC20 and 1 other person like this.
  3. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Huh, somehow I missed the post immediately after the one I was responding to. Oops...

    I suspect that's actually backwards.

    Streaming bandwidth is based on frame rate, image resolution, and the detail of the image (ie: how much it can be compressed in real time). You have to send every frame, but you only have to send what you see.

    What's the bandwidth usage of streaming 10 hours of 4K let's-play video from YouTube or Twitch? That's what you'd be looking at for playing 10 hours of the same game via streaming. I doubt it's going to be multiple tens of gigs, as is being suggested.
     
  4. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    I tend to think the opposite. I have a beefy PC that sits at my house unused most of the time. I suspect that there are plenty of people in similar situations who also happen to have a beefy Internet connection. With that in mind it makes a lot of sense to me to share hardware - use the connections we've already got to use hardware that's already available, rather than everyone getting more of it.

    In an ideal world where that could happen with minimal downside, for instance, my PC could easily be shared among 2 or 3 people for gaming purposes, which could mean cheaper gaming for all of us or it could mean more frequent upgrades. Do that at scale and you could quite feasibly have hubs of hundreds of computers satisfying thousands of gamers. You could also do really cool stuff, like assigning people host machines based on the game they're playing. Someone playing a game of Civilisation doesn't need the same power as someone playing Battlefield, so they can be swapped onto a machine with lower power consumption.

    As for hardware, the thin client model doesn't require a lot. Something that can play appropriate resolution video and feed back an input signal, that's all. My PS3 could be a thin client. Or my Xbox 360. Or the $180 micro-PC I bought a year ago. Or my phone, for that matter.

    The downsides to this are logistics, latency and bandwidth costs. They're very real downsides, so I'm not suggesting that the above is likely to happen any time soon. But if it could, and if people were less tied to the idea of "owning" things, it'd be quite practical.

    (Note that I'm talking about playing games there. I probably wouldn't want my workstation to be remotely hosted, because that introduces a bunch of other issues.)
     
  5. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    Good points.

    However, as for the 4K size... it's huge. Like, huge. Though that's 4K; 1080p would certainly be much less. In both cases, though, this is stuff you're downloading in real time, so you can't just let it download overnight and run it locally.
     
  6. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    That's also for cinema quality footage, where the priority is (hopefully) getting the highest quality footage possible within hardware constraints. For real-time streamed stuff I'm sure it'd be a fraction of that, because there the priority is to balance quality with bandwidth usage. Compare the size of a movie streamed from Netflix to the same movie on a Bluray - I suspect that the size is quite different even though visually you'd have to look quite closely to see a difference. (I did try searching for numbers quickly, but mostly found people talking about ripping stuff.)
     
    EternalAmbiguity likes this.
  7. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    According to Netflix, the minimum speed to watch Ultra HD (3840 by 2160) is 25 Mbps, or 11.5 GB per hour. Just keep in mind the average movie typically runs at 24 Hz, while the lowest for games is typically 30 Hz on consoles and 60 Hz on PC.

    https://help.netflix.com/en/node/306

    According to YouTube, the minimum speed to watch Ultra HD at 60 Hz is up to 68 Mbps, or 30.6 GB per hour.

    https://support.google.com/youtube/answer/1722171?hl=en
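    Those figures are easy to sanity-check: bitrate in megabits per second, times 3600 seconds per hour, divided by 8 bits per byte. A minimal sketch; only the two bitrates quoted above come from the posts, the rest is plain unit arithmetic:

```python
def gb_per_hour(mbps: float) -> float:
    """Convert a stream bitrate in megabits/s to gigabytes/hour."""
    return mbps * 3600 / 8 / 1000  # s/h, bits -> bytes, MB -> GB

# Netflix Ultra HD minimum, 25 Mbps:
print(gb_per_hour(25))   # 11.25 GB/h, close to the 11.5 quoted
# YouTube Ultra HD @ 60 Hz, 68 Mbps:
print(gb_per_hour(68))   # 30.6 GB/h, matching the figure above
```

    The small gap on the Netflix number suggests their 11.5 GB figure includes some overhead beyond the raw video bitrate.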
     
  8. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Wow, ok. I was watching 1080p movies on an ADSL2 connection at ~6Mbps (on a good day). I'm surprised that it doesn't scale better with resolution (since most of those extra pixels are going to store data very similar to the pixels that were already there).

    That said... I wonder if they're talking about 25Mbps actual, or 25Mbps advertised, since many connections never reach their advertised speed? My ~6Mbps actual connection was advertised as an up to 24Mbps connection.
     
  9. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    My assumption, based on the same webpage, is that they look at actual speed. If you click the drop-down it tells you how to test your Internet connection and then to compare the result to the chart on the page.
     
  10. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    I see someone hasn't been playing the last few versions of Civ. ;) 4K bandwidth can be reduced by using better compression; e.g. h.265 produces files about half the size of h.264, but it needs more computing power. Then again, 30fps isn't great; 60fps or higher is better for games, so you're back to huge bandwidth again.
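    The codec and frame-rate effects compound multiplicatively, so they can be sketched as simple scaling factors on a base bitrate. Back-of-envelope only: the ~2x H.265 figure is the rule of thumb from the post, and real encoders don't scale perfectly linearly with frame rate:

```python
def stream_mbps(base_mbps: float, fps: int = 30, base_fps: int = 30,
                hevc: bool = False) -> float:
    """Rough bitrate estimate: scale linearly with frame rate,
    halve for H.265 (the ~2x rule of thumb quoted above)."""
    rate = base_mbps * fps / base_fps
    return rate / 2 if hevc else rate

# Using Netflix's 25 Mbps Ultra HD figure as a ~30 fps baseline:
print(stream_mbps(25, fps=60))             # 50.0 -> doubling fps doubles bitrate
print(stream_mbps(25, fps=60, hevc=True))  # 25.0 -> the better codec claws it back
```

    In other words, 60 fps with H.265 lands roughly where 30 fps with H.264 started, which is Eric's point about the gains cancelling out.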

    --Eric
     
    Ryiah and EternalAmbiguity like this.
  11. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    In this "next-gen" console age, precious few seem to care all that much about 60... they seem willing to sacrifice it for higher visual fidelity.

    I'm not sure what effect that has on bandwidth. Probably balances out honestly.
     
  12. VIC20

    VIC20

    Joined:
    Jan 19, 2008
    Posts:
    2,689
    That’s my traffic. 90% of it is based on streaming HD content from netflix and amazon prime.

    rx / tx / total / estimated
    Sep '17 381.35 GiB / 28.29 GiB / 409.64 GiB / 674.53 GiB
    yesterday 15.77 GiB / 1.41 GiB / 17.17 GiB
    today 3.03 GiB / 299.38 MiB / 3.33 GiB / 15.20 GiB

    Eric made the point with VR. Streaming will never work with VR, and there's no technical way it ever could, unless you find a way to do networking with some kind of quantum entanglement.
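    "Never" is strong, but the physical floor is easy to put numbers on. Light in optical fibre travels at roughly two-thirds of c, about 200,000 km/s, so propagation alone bounds the round trip before any rendering, encoding, or decoding is added. A sketch, where the fibre speed is the standard approximation and the distances are illustrative:

```python
FIBRE_KM_PER_MS = 200.0  # light in fibre: ~200,000 km/s = 200 km per millisecond

def rtt_floor_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from signal propagation alone."""
    return 2 * distance_km / FIBRE_KM_PER_MS

print(rtt_floor_ms(50))    # 0.5 ms  -> a metro-area data centre is physically fine
print(rtt_floor_ms(2000))  # 20.0 ms -> alone eats the ~20 ms motion-to-photon
                           #    budget commonly cited for comfortable VR
```

    So the physics doesn't forbid VR streaming outright; it just forces the server to be geographically close, which is an economics problem rather than a quantum one.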
     
    Martin_H likes this.
  13. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    Console gamers may be more willing to put up with reduced frame rates, but they'll be a hard sell for the service because they already pay far less for their hardware than the average PC gamer. PC gamers will be hard to convince that a reduced frame rate is acceptable when they're used to 60Hz and greater.

    It's a shame that Steam doesn't list commonly used refresh rates.
     
    Martin_H and EternalAmbiguity like this.
  14. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    Hmm. I can see what you're saying, but at the same time, what they're doing feels like streaming: they're sacrificing control to simply play a game. They're saying, "I don't care all that much about hardware."

    And as to them paying far less: you clearly know this as well as I do (so please don't take this as condescension), but PCs with power equivalent to those consoles can be built for little more than they cost. So it's not that they're making a conscious effort to go "cheaper" than a PC, but that they're paying for the convenience of a console... kind of like streaming. And again, as to it being less, you have things like the Pro and the XB1X (or XBOX, which is kinda clever) shaking that up...

    And to be honest, with (paid) services like Xbox Live and PSN, they're primed for more subscription-based services.


    I suspect 60 is by FAR the highest, perhaps followed by 144, then the more exotic ones like 100, 240, or 75.
     
    angrypenguin likes this.
  15. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    120/144Hz monitors seem to be on the upswing from what I can tell. They're not 4K though...getting 60fps at 4K is already a bit of a struggle.

    --Eric
     
  16. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Sure, but if things go the way they're going you won't need to pay anything for hardware.

    A thin client just has to play a video and return inputs. The last TV my dad purchased already has Netflix built in. Plenty of TVs allow you to install apps. I agree with EA in so far as we're already moving towards this being a possibility.

    And for the mainstream market who mostly care about convenience, 30hz vs 60hz isn't really a thing. That's a concern for PC enthusiasts. Almost everyone else happily plays games however they come on whatever device they happen to have.
     
    EternalAmbiguity likes this.
  17. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    But games already cost pennies. Well, not EA games, but on Steam they do.
    This part of the article is truly ridiculous:
    I love how they say it like it's nobody's business. "Did you know we currently rip you off and make out like bandits if you want to get into FIFA? Together (for just a small payment of $9.99 (per month (per game (for eternity)))) we can stop this!"

    The comment section is also golden.
     
    Last edited: Sep 19, 2017
    GarBenjamin likes this.
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    The PlayStation 4 Pro is only $399. Unless you make use of existing parts you simply cannot build or buy a gaming machine that will play the exact same games at the exact same quality settings for the same price.

    It isn't for lack of research either. Search pcpartpicker.com for a few minutes and you'll quickly realize that every single build that fits within that budget is relying on existing hardware or is simply too weak to handle the games.

    https://pcpartpicker.com/builds/#X=29599,39754

    Agreed, but it's worth pointing out that the difference in cost between quality 1080p @ 60Hz, 1080p @ 75Hz, and 1080p @ 144Hz monitors has shrunk considerably. They're $100, $150, and $200 respectively for 24-inch from Acer.

    Yes, but how will the cost of a subscription compare to the cost of a good gaming machine in the long term? My last rig was only retired because the motherboard died and it took at least six years to reach that point. Yet it was still playing the games at reasonably high quality settings with Fallout 4 hitting 45 FPS on nearly the highest possible settings.

    Streaming videos makes a great deal of sense because the cost of the storage and bandwidth are insignificant on both client and server and a single server is easily able to handle a substantial number of users. How about a gaming server though? How many users can you realistically fit onto a single server simultaneously?

    Based on the original OnLive pricing I simply can't imagine the service being that cheap.

    https://www.geek.com/games/onlive-pricing-announced-and-its-expensive-1264822/
     
    Last edited: Sep 19, 2017
    EternalAmbiguity likes this.
  19. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    I didn't say it was going to happen tomorrow... or in 10-25 years.
    Just... eventually.
     
    angrypenguin likes this.
  20. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    One, but see what I said before about my own gaming machine. It gets used a fraction of the time, and much of the time it is in use, it's running at a fraction of its power. Amortise that cost over multiple people and there are savings on multiple fronts.

    Things have advanced since OnLive was a thing, though, and they'll only continue to get better. I'm not saying this will be a thing tomorrow. I am saying I wouldn't rule it out some time in the future.

    Also, OnLive's pricing isn't really comparable to what we're talking about here because the games are purchased on top of the subscription, rather than accessed as a part of it.

    Quite well, I'd assume. Assuming $20/mo (so about twice a Netflix sub, and a heck of a lot more than Origin's subscription service) I could subscribe for over four years for the price of my PC's performance parts alone (so not including OS or peripherals). And I haven't gone 4 years without upgrading something, either.

    I don't think that PC enthusiasts are likely to be the target audience here. Still, for the price of my PS4 at those rates I could subscribe for over a year and a half...

    So do I think it could be made cost effective? Quite possibly.

    I'd love to say that I don't know why they'd do that when people still happily pay $100+ per year for flagship franchise games, but stuff like Battlefield is already in their subscription service. (Though that only gets you half of what the $100 does. You still need to buy Premium if you want that content.)
     
    EternalAmbiguity and Ryiah like this.
  21. BIGTIMEMASTER

    BIGTIMEMASTER

    Joined:
    Jun 1, 2017
    Posts:
    5,181
    There's cattle, and there's wolves.

    Maybe the dumb wolves will eat all the dumber cattle and then they'll die too.

    But the roaches will always be here.
     
  22. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    Nope.

    There is cattle.

    And then on the other continent there is wolves.
     
  23. Fera_KM

    Fera_KM

    Joined:
    Nov 7, 2013
    Posts:
    307
    Something about the Spotify model...

    Most of us have read artists' complaints about their income from Spotify.
    What those interviews leave out is the insane amount of money Spotify has paid in licensing (and other) fees to the record companies (the big three: Sony, Warner, Universal), which most of those artists get a salary from.

    The really big losers in such a model are the indie artists, or the small record companies outside the big three that have to pay a fee to be included on Spotify. And that's not including whatever deals Spotify has with the record companies for front-page exposure, suggested listening, and artist radios.

    In my opinion, the Spotify model is the worst case scenario for the future development of games.

    Large publishers (e.g. EA, Activision, Ubisoft) would gain significantly from this.
    Medium-to-large development studios, under the wing of a large publisher, would lose on it.
    Indie and small development studios, and small publishers, would lose significantly on it.
     
    VIC20 and Martin_H like this.
  24. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    Didn't know a thing about Spotify, so thanks for that perspective. Now:
    indies only lose if they no longer have a way to publish their games. Does EA's platform publish indie games? Does Sony's or Ubisoft's? If the answer to those is "no" (I don't know, I don't look in that direction much), then aren't we assuming the "ownership model" will lose customers it never had to begin with?
     
  25. Fera_KM

    Fera_KM

    Joined:
    Nov 7, 2013
    Posts:
    307
    Assuming an average "gamer's" monthly games budget is $60 (no idea what a real number might look like).

    Imagine a model where Steam, instead of selling games, had a subscription of, say, $60 a month.
    And instead of taking a cut per purchase, it paid developers/publishers 3 cents per play hour.
    To keep the big publishers interested in staying on the platform, Steam would have to pay, say, 60-70% of the subscription fees directly to them (with larger publishers getting a larger cut).

    Now we are in a position where a gamer (the consumer) can't opt out of a Steam subscription, because it's the only way to access the major games, even if you are only interested in one or two. At that point there is little to no money left in the monthly budget to purchase games outside of Steam.

    I believe this is close to the model big publishers have in mind when they talk about subscription models.
    Or in other words: how to redistribute the current total market profit in games (so they get an even larger majority).
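    That hypothetical splits into straightforward arithmetic. Every number below is an assumption from the post ($60/month fee, a 60-70% cut to big publishers, 3 cents per play hour to developers) except the play hours, which are illustrative:

```python
SUB_FEE = 60.00      # hypothetical monthly subscription (from the post)
BIG_PUB_CUT = 0.65   # 60-70% off the top to large publishers; midpoint used
PER_HOUR = 0.03      # 3 cents per play hour paid to the developers actually played

hours_played = 100   # illustrative: a fairly heavy gamer's month

to_big_pubs = SUB_FEE * BIG_PUB_CUT           # guaranteed, regardless of play
to_devs = hours_played * PER_HOUR             # what play time actually earns
remainder = SUB_FEE - to_big_pubs - to_devs   # platform's margin

print(f"to large publishers: ${to_big_pubs:.2f}")  # $39.00
print(f"to played devs:      ${to_devs:.2f}")      # $3.00
print(f"left for platform:   ${remainder:.2f}")    # $18.00
```

    Under these assumed numbers, even 100 hours of play routes only $3 through the per-hour pool that indies would live on, which is the redistribution Fera_KM is describing.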
     
  26. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    Sorry, for that part I was talking about the X1 and PS4 base models. They went back to selling at a loss for the upgraded models I believe.
     
  27. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    Good luck finding a new or used computer with equivalent power for $299. ;)

    The PlayStation 4 was never sold at a loss, but then they don't have to pay for Windows either. :p

    https://arstechnica.com/gaming/2013/11/report-399-playstation-4-costs-about-381-to-build/
     
  28. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    The article you linked to suggests it is indeed being sold at a loss. Cost of assembly alone = $381. Sold at $399, so add marketing, R&D, retail margins, etc. = loss. Much less of a loss than the PS3, however.

    --Eric
     
    Martin_H likes this.
  29. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    Yeah I suppose that's just the cost of the parts. You have to factor in development and support for the device too.
     
  30. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    Fair enough, though you can get pretty close I suspect.

    I know the PS4 wasn't at a loss (edit: debatable?); I was talking about the "upgraded" ones there, i.e. the Pro. Though I'm not sure about that honestly, and a cursory search doesn't provide answers.
     
  31. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    You may be able to build a gaming computer, but it won't be able to play the same games with anything remotely close to the same graphical quality. There isn't a whole lot of performance info related to the consoles, but the PS4's GPU is rated at 1.84 TFLOPs. The closest APU for that would be the A12-9800 for a little over $100.

    That said, the A12-9800 only achieves 1 TFLOP while consuming a third of the budget, and as far as I'm aware it's the best APU AMD offers at this point in time. You'd need to step up to a discrete GPU to have a real chance at similar performance.

    https://www.reddit.com/r/Amd/comments/6v9xmd/final_a129800_compute_performance_hitting_1_tflop/
    https://www.walmart.com/ip/AMD-A12-...-Socket-AM4-Processor-AD9800AHABBOX/721857775

    The closest GPU is a GeForce 1050 at 1.7 TFLOPs for $125. A 1030 would be $50 less, but also only 1.1 TFLOPs.

    https://en.wikipedia.org/wiki/GeForce_10_series
    https://www.walmart.com/ip/EVGA-NVIDIA-GeForce-GTX-1050-GAMING-Graphic-Card/144645190
    https://www.walmart.com/ip/GeForce-GT-1030-2G-LP-OC/312140787
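    One way to read that comparison is dollars per TFLOP. The figures below are simply the 2017 prices and throughput numbers quoted in the post, not current data, and raw TFLOPs are only a loose proxy for game performance:

```python
# (name, TFLOPs, USD) -- figures as quoted in the post above
parts = [
    ("A12-9800 APU", 1.0, 100),
    ("GeForce GTX 1050", 1.7, 125),
    ("GeForce GT 1030", 1.1, 75),
]

def usd_per_tflop(usd: float, tflops: float) -> float:
    """Price efficiency: lower is more compute per dollar."""
    return usd / tflops

for name, tflops, usd in parts:
    print(f"{name}: ${usd_per_tflop(usd, tflops):.0f} per TFLOP")
```

    By this crude metric the GTX 1050 is the best value of the three, which matches Ryiah's conclusion that you need a discrete GPU to approach the PS4's 1.84 TFLOPs.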
     
    Last edited: Sep 19, 2017
  32. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    I've heard that an AMD FLOP generally translates to lower real-world performance than an Nvidia FLOP. So that 1050 is probably closer than it appears.

    On the other hand, you've got console-level optimizations...
     
  33. VIC20

    VIC20

    Joined:
    Jan 19, 2008
    Posts:
    2,689
    This may be true for future TV boxes and smart TVs, but not for computers, and not for smartphones and tablets. Besides gaming, there are thousands of reasons why we need decent graphics cards in our devices today.
    And since CUDA & Co. arrived, the number of reasons increases every day.
     
  34. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    That was in the context of consoles and gaming PCs. You're right that we all pretty much need a phone anyway, so we'll still have to buy phones, but we do that anyway. I'm saying that it's possible for gaming to become a hardware-agnostic platform that sits on top of whatever devices we already have.

    That aside, I'm not convinced that most people need beefy GPUs. Or, stated differently, I'm pretty confident that most people who need a beefy GPU need it for games. The popularity of Intel integrated solutions is pretty good evidence of that, I'd have thought? And as I said earlier, for the purposes of this I'm only talking about gaming. I wouldn't want to outsource my workstation hardware.
     
    Kiwasi likes this.
  35. grimunk

    grimunk

    Joined:
    Oct 3, 2014
    Posts:
    278
    Streaming (aka real-time video encoding of a game running on a server) is a non-starter for the top-grossing games. The main reason is because they just can't feel as good due to speed-of-light limitations (latency), so I don't see the market completely shifting over to that model. F2P multiplayer games are quite common now and provide a great business model. There might be some appeal, if you play a lot of different games, where the streaming model can save you money IF you are already buying all the games you play. Otherwise, I can't really see any broad appeal. It's not like streaming hasn't been tried in the market already. It has, and it has failed every time.
     
  36. VIC20

    VIC20

    Joined:
    Jan 19, 2008
    Posts:
    2,689
    I don't need a stinkin' phone – all I want is a desktop computer. My iPhones and iPads lie on the table like stones all day – I only need them to test games on.
    I make mobile games, but I actually hate all mobile devices. I think it's totally idiotic to try to play a game on a smartphone. No idea what is wrong with all those people who have decided that a cinema for mouses is better than a 27" computer screen.

    My Mac has one of those terrible Intel integrated graphics chips, which is barely enough to drive my screen and develop mobile games with Unity. And I use it like I would use any other GPU – it feels just like 2005.
     
    BIGTIMEMASTER likes this.
  37. BIGTIMEMASTER

    BIGTIMEMASTER

    Joined:
    Jun 1, 2017
    Posts:
    5,181

    Phone games are for getting through boring classes/office jobs.
     
  38. VIC20

    VIC20

    Joined:
    Jan 19, 2008
    Posts:
    2,689
    Need no stinkin boring classes/office jobs :)
     
  39. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I think the technology is actually pushing toward a hybrid model:
    - Latency can be reduced with smart prediction; Microsoft had a paper that showed positive results on this.
    https://www.theverge.com/2014/8/23/...elorean-lag-free-cloud-gaming-system-revealed
    - You can mitigate the streaming bandwidth issue with "double" rendering, i.e. a fairly low-frequency render on the client and the high-frequency render on the server; through syncing, you then only send the delta needed to reconstruct the image, which saves a big chunk of bandwidth. Again, that was some Microsoft research.
    - Much less likely: foveated rendering with gaze detection, though I doubt the latency would permit it. It would be a hardware solution.
    - It's probable that deep learning will also help reconstruct the image on a per-game basis; if you train a network on a specific game, it might serve as a dedicated codec.

    Ultimately it's about finding the right model for the consumer.
    - The smartphone was unlikely to go mainstream, until the iPhone.
    - MP3 players didn't take off until the iPod.
    - The Minitel was the Internet before the Internet; you had to wait for the World Wide Web to break into everyday life (poor telnet).
    - Steam was met with outrage when HL2 was sold with a ticket to download the game.
     
  40. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    The point is that people can get the 50" experience from the same piece of hardware that currently provides the "cinema for mouses". I'm not saying that people should play the games on their phone screens. I'm saying that if game streaming becomes a thing then client-end hardware becomes (almost) a non-issue.

    With continued network advancements I don't see that latency being any worse than what's already there, such as (but not limited to) display lag. Sticking with display lag: 20ms is considered "excellent". By coincidence, a speed test just told me that my current connection (in a busy office) has... 20ms ping.
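    Put differently, the argument is a latency budget: the streaming round trip has to fit alongside delays local play already incurs. In the sketch below, the 20 ms ping and the 20 ms "excellent" display lag are the figures from the post; the other components are illustrative guesses:

```python
# End-to-end latency budget for one streamed input -> photon loop (ms).
budget = {
    "network round trip": 20,  # measured ping from the post
    "server render":      16,  # one frame at ~60 fps (illustrative)
    "encode + decode":    10,  # video codec overhead (illustrative)
    "display lag":        20,  # 'excellent' display, per the post
}

total = sum(budget.values())
print(f"total input-to-photon: {total} ms")  # 66 ms
```

    Roughly 66 ms end to end is playable for many genres, though note that the last two rows apply to local play as well; the stream's real added cost is the network and codec lines.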
     
  41. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    I think it's overly optimistic to expect that network speeds and reliability will improve enough for significant numbers of people any time soon, especially in rural areas. Would it really be acceptable to exclude gamers based on location? It's not like video, where with slower internet you might have to wait 10 minutes for a show to buffer sufficiently, but at least you can watch it normally after that. Not to mention the significant variation in speed that frequently happens based on time of day/congestion.

    --Eric
     
    Ryiah likes this.
  42. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    That assumes Steam kills its indie market, which would warrant a change of platform for indies anyway.
    I tried to do the math and it would probably pan out at just 3 cents an hour, but I don't have enough Steam stats to confirm or deny it.

    It hasn't stopped Microsoft before.
    http://www.ign.com/wikis/xbox-one/Always_Online_Connection
    I'm glad they got s*** for that.
     
  43. Deleted User

    Deleted User

    Guest

    the computer was MADE to pirate information (to break the Enigma code)... I don't know why they are all dumb enough to not understand reality...

    it is a REALITY of the medium, you can't win against it; if you don't like it, don't invest in the medium...
    a computer's #1 purpose is the widespread and infinite duplication of information...

    how can they be so stupid? They don't even know the medium they're working with, LOL.
    Make board games or something instead, EA... dumbasses...
     
    Last edited by a moderator: Sep 20, 2017
  44. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    False. While Alan Turing's machine did lay the groundwork for later machines, computers existed long before that to help with (relatively) long but simple calculations. And there was even a programmable one.
     
    Ryiah likes this.
  45. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    Enigma isn't that old. Some of the machinery historians consider to be computers, or at least as close as you could get to a computer, not only predate Enigma but predate both of the World Wars as well. Punch card processing, for example, goes as far back as the beginning of the 1800s.

    https://en.wikipedia.org/wiki/History_of_computing_hardware
     
    angrypenguin likes this.
  46. zombiegorilla

    zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    9,052
    Something that hasn't been noted, though, is that "streaming" on the back end is a very different thing for movies vs. games. Netflix provides static data; in fact, more often than not, especially with popular content, it is cached "locally" (at a nearby data farm, or even with your ISP). With a game, the content is completely unique to each viewer and actually has to be generated on the fly.
    I have a hard time seeing this becoming completely ubiquitous. Thin clients, or some sort of hybrid like WebGL, seem the smartest.
     
    Martin_H, VIC20, Ryiah and 1 other person like this.
  47. VIC20

    VIC20

    Joined:
    Jan 19, 2008
    Posts:
    2,689
    And to generate it on the fly in a decent quality requires a lot of CPU power. I doubt this will become big within the next 10 years. I guess it ends as one option of many to play games.
     
  48. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    That already happens. Try playing StarCraft 2 from rural Australia. ;) Services are provided based on whether they can reach enough people to be commercially viable, not on whether they can reach literally everyone.

    I think that the bigger challenge will be the stuff that @zombiegorilla raises - the lack of some equivalent of geographic cache optimisation. Playing Game of Thrones somewhere is assisted by having it on a HDD somewhere nearby, where "nearby" means "wherever storage and bandwidth are cheap". Playing The Witcher 3 via stream requires a whole computer somewhere nearby, where "nearby" is "physically close enough to minimise network latency". Moving data around is trivially cheap and easy. Moving hardware around isn't.

    For me, that means in the city nearby. That means that, on top of the technology challenges, they'd also need to get enough subscribers in my state alone, possibly in the metro area alone, to justify a data center stocked with gaming PCs. (To be honest, given my country's combination of population density and network infrastructure, I don't see anything of the kind taking off here easily any time soon even if the technical challenges were solved tomorrow.)
     
    Martin_H likes this.