
As monitors get faster could you simulate a CRT monitor with low bandwidth a line at a time?

Discussion in 'General Graphics' started by Arowx, Jan 4, 2021.

  1. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Old CRT displays draw the screen with a beam that quickly sweeps line by line. The beam moves so fast, and the phosphor fades slowly enough, that your eye sees a complete TV image.



    Retro computers would even chase the beam to allow their limited hardware to do more amazing graphics.

    The thing is, as modern monitors get higher and higher refresh rates, it should be possible to create a CRT-style display on the CPU/GPU that is super low bandwidth...

    If, instead of writing to the entire image every frame, it cycled through one segment of the CRT display image at a time and updated only that, it would reduce the memory and CPU/GPU bandwidth used.

    That would spread the load across multiple frames and take advantage of how old CRT displays used to work.

    Come to think of it, could an approach like this be applied to the main display, depending on the animation frame rates needed?

    We currently have variable rate shading; does it only vary shading resolution, or can it also vary update rates?
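
    Something like this would be the most literal Unity-side sketch of the band-update idea (the band count and SampleSource are just placeholders, and note that Texture2D.Apply still re-uploads the whole texture, so a real low-bandwidth version would do the band write and the phosphor fade in a shader instead):

    Code (CSharp):
    using UnityEngine;

    // Minimal sketch of the band-update idea above, assuming this is attached to an
    // object with a MeshRenderer whose material's main texture acts as the "CRT".
    // The band count and the SampleSource placeholder are made up for illustration.
    public class ScanlineBandCrt : MonoBehaviour
    {
        public int width = 480;
        public int height = 576;
        public int bandsPerFullRefresh = 60; // spread one full "CRT" refresh over this many frames

        Texture2D crtTex;
        int currentBand;

        void Start()
        {
            crtTex = new Texture2D(width, height, TextureFormat.RGB24, false);
            GetComponent<Renderer>().material.mainTexture = crtTex;
        }

        void Update()
        {
            int bandHeight = Mathf.CeilToInt((float)height / bandsPerFullRefresh);
            int startRow = currentBand * bandHeight;

            // Only this horizontal band is rewritten this frame.
            for (int y = startRow; y < Mathf.Min(startRow + bandHeight, height); y++)
                for (int x = 0; x < width; x++)
                    crtTex.SetPixel(x, y, SampleSource(x, y));

            // Note: Apply() still re-uploads the whole texture, so a real version
            // would draw the band and the fade with a shader or CommandBuffer instead.
            crtTex.Apply();

            currentBand = (currentBand + 1) % bandsPerFullRefresh;
        }

        // Placeholder for whatever the CRT is supposed to be showing.
        Color SampleSource(int x, int y)
        {
            return Color.HSVToRGB((float)y / height, 1f, 1f);
        }
    }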
     
  2. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    To draw one scanline per frame, at 60 frames per second, you would need a 3600 Hz display (60 × 60).
     
  3. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    611
    I swear your threads make less sense each time
     
  4. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
     
  5. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    See above, but you're forgetting how low-res NTSC & PAL screens were, e.g. PAL, which had a higher resolution than NTSC, was only about 480 × 576.

    So you would just divide the 576 lines by your game's frame rate or your monitor's refresh rate, whichever is slower, and render only that segment of the display each frame, whilst running some kind of CRT fade shader to update the rest of the display.

    That would reduce the texture bandwidth used by a factor of 30, 60, 90, 120, 144 or more. In theory this could let you display rooms full of CRT-style displays all showing videos playing.

    OOPS well above 30 fps
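
    Rough arithmetic version of that split, using the numbers above (just a sketch; the caveat in the comment is the catch the earlier reply points at):

    Code (CSharp):
    using UnityEngine;

    // Back-of-envelope version of the division described above (numbers from the post).
    public static class CrtBandMath
    {
        public static void Example()
        {
            int crtLines = 576;        // PAL visible lines
            int refreshRate = 120;     // game fps or monitor refresh, whichever is slower
            int linesPerFrame = Mathf.CeilToInt((float)crtLines / refreshRate); // ~5 lines written per frame
            float writeReduction = (float)crtLines / linesPerFrame;             // roughly refreshRate times fewer lines written per frame
            // Caveat: at this rate one complete CRT image takes about refreshRate frames
            // (roughly a second) to finish, which is the trade-off the earlier reply points at.
            Debug.Log($"{linesPerFrame} lines/frame, ~{writeReduction:F0}x fewer line writes");
        }
    }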
     
    Last edited: Jan 5, 2021
  6. Torbach78

    Torbach78

    Joined:
    Aug 10, 2013
    Posts:
    296
    to approximate interlacing, a camera overlay of (2px thick) lines (ODD & EVEN) 'should' be able to discard 1/2 of the screen's scanline pixels early in the rendering to save resources... (you would not refresh a single pixel at a time; that would be an insane refresh rate on current hardware)

    some technicals https://stackoverflow.com/a/6484449

    games at 60Hz oscillate between ODD fields at 30Hz and EVEN fields at 30Hz -- persistence of vision within the human eye gives the illusion of continuous motion

    as for an illusion of lower res, simply using a 1/4 or 1/8 buffer and NEAREST upscaling can create performance increases. offscreen buffers are a common practice and can be used to do all the scanline interlacing work as well
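
    something like this is the simplest built-in render pipeline sketch of the 1/4-res + NEAREST part (the odd/even discard pass isn't shown; and to actually save fill rate you'd render the camera straight into the small buffer rather than downscaling the finished frame like this does):

    Code (CSharp):
    using UnityEngine;

    // Sketch of the 1/4 (or 1/8) buffer + NEAREST upscale idea, built-in render pipeline only.
    // This version only shows the look: it downsamples the finished frame, so to really
    // save rendering cost you'd point the camera's targetTexture at the small buffer instead.
    [RequireComponent(typeof(Camera))]
    public class LowResNearestUpscale : MonoBehaviour
    {
        public int divisor = 4; // 1/4 or 1/8 of the screen resolution

        void OnRenderImage(RenderTexture src, RenderTexture dst)
        {
            var lowRes = RenderTexture.GetTemporary(src.width / divisor, src.height / divisor, 0);
            lowRes.filterMode = FilterMode.Point;  // NEAREST when scaled back up

            Graphics.Blit(src, lowRes);  // downscale into the small buffer
            Graphics.Blit(lowRes, dst);  // point-filtered upscale back to full resolution

            RenderTexture.ReleaseTemporary(lowRes);
        }
    }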
     
    Last edited: Jan 5, 2021
    Arowx likes this.
  7. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Didn't old TVs/cinema manage with 24 Hz?
     
  8. Torbach78

    Torbach78

    Joined:
    Aug 10, 2013
    Posts:
    296
    in the USA, NTSC ran at ~29.97 FPS

    CINEMA IIRC was projected at 48 FPS to reduce the shutter flicker the human eye noticed, but each frame was shown twice, so the capture frame rate was 24 FPS
     
    Last edited: Jan 5, 2021
    Arowx likes this.
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    96 Hz, 24 fps. Projectors showed the same frame 3 times using a 3-blade shutter. 1- or 2-blade shutters were used in the very early days, and 2-blade shutter projectors are still something you can buy, but they were mainly used in inexpensive home projectors rather than anything film-theater based. Modern digital theaters often project at 144 Hz or even higher refresh rates, even if only displaying 24 fps content.

    The first broadcast TV signal was 18 fps, with only 50 vertical lines of resolution. But it was an analog signal, so it had nearly unlimited horizontal resolution.

    Some early pre-WW2 TV broadcasts in the US were 24-ish fps. NTSC standardized to 30 fps post-WW2, but that was really two interlaced fields per frame, and many broadcast cameras of the time just captured at 60 fps, sending two half-vertical-resolution images rather than interlacing a single frame.

    NTSC shifted to 29.97 fps later to include color alongside the original 30 fps black & white signal in a way that didn't break old B&W TVs.


    Also it should be noted that the line-by-line style of updating that CRTs use is still how LCDs update today. The difference is LCDs stay on, and they update at the resolution of the display rather than the input signal. Also your way of calculating bandwidth is totally wrong. A 30fps 640x480 resolution digital signal uses less bandwidth than NTSC.
     
    Last edited: Jan 5, 2021
    Arowx likes this.
  10. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    NTSC's max resolution is 720x525 btw, with 720x486 being potentially visible and the rest of the vertical lines used for storing other information. As an analog signal it's difficult to directly compare a modern digital display signal to NTSC. The horizontal resolution of NTSC is variable since it's an analog signal, which is why the NES could render at a 256x240 resolution without problems. It just updated the horizontal signal more slowly than the maximum possible, but the 240 vertical lines actually had to line up with the 243 visible scan lines of each 59.94 Hz field.
     
    Torbach78 likes this.
  11. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    More thoughts on the NES. It was a hardware sprite renderer. It supported a maximum of 64 sprites on screen, but only 8 per line. More than that and it just wouldn't draw them, which caused the flickering you might see in some games. Sprites flickered because games changed the sprite order every frame so a different sprite would be the one that disappeared. It also had tile maps for background layers, which is what anything that didn't move independently used.
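
    A toy illustration of that rotation trick (not NES/PPU code, just the "over 8 on a line, rotate which ones get skipped" idea):

    Code (CSharp):
    using System.Collections.Generic;

    // Toy version of the flicker trick described above: when more than 8 sprites share a
    // scanline, rotate which ones get dropped each frame so no single sprite stays hidden.
    // Purely illustrative; this is not how the actual NES PPU is programmed.
    public static class SpriteRotationDemo
    {
        const int MaxSpritesPerLine = 8;

        // spritesOnLine = indices of sprites overlapping one scanline, frame = current frame number.
        public static List<int> VisibleThisFrame(List<int> spritesOnLine, int frame)
        {
            var visible = new List<int>();
            int count = spritesOnLine.Count;
            if (count == 0)
                return visible;

            int offset = frame % count; // start from a different sprite each frame
            for (int i = 0; i < count && visible.Count < MaxSpritesPerLine; i++)
                visible.Add(spritesOnLine[(offset + i) % count]);

            return visible; // anything left over is skipped this frame, i.e. it "flickers"
        }
    }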

    The NES didn't have any video memory like we would think of it today, because it was rendering the line the TV was displaying as it was being displayed, directly outputting the color values to the TV as it went.

    But this doesn't change the video bandwidth required. Sending a signal now vs. sending the same signal later is ultimately the same amount of video bandwidth. For modern rendering systems interlacing would add a ton of problems, because the bandwidth that actually matters more isn't the video output, it's the memory access patterns. If you're rendering in alternating lines across the screen, you're losing out on a lot of potential coherency compared to rendering a small section of the screen at a time.

    By that I mean: if you render a line across the screen, you might see a lot of different objects, with different textures, models, shaders, etc., each of which needs to be read from memory. That creates a lot of incoherency, aka randomness, in the memory access. If you render small squares of the screen, and one object at a time, you can guarantee a lot more memory coherence, since when rendering a single object it's likely using the same texture (or set of textures) everywhere it's visible. And there are likely going to be fewer objects visible in each square area on screen vs. a single line of the same number of pixels. So that's what modern GPUs do. And they save it all to a frame buffer that is then scanned out to the display later.
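
    As a loose way to picture that: for the same number of pixels, compare how many distinct objects a full-width line touches vs. a roughly square tile (the object-ID grid here is just a made-up stand-in for "which texture each pixel needs"):

    Code (CSharp):
    using System;
    using System.Collections.Generic;

    // Loose illustration of the coherency point above. A full-width scanline tends to
    // cross many more distinct objects than a square tile covering the same pixel count,
    // which is the access-pattern difference described in the post.
    public static class CoherencyDemo
    {
        public static int DistinctObjectsInScanline(int[,] objectId, int y)
        {
            var seen = new HashSet<int>();
            for (int x = 0; x < objectId.GetLength(0); x++)
                seen.Add(objectId[x, y]);
            return seen.Count;
        }

        public static int DistinctObjectsInTile(int[,] objectId, int tileX, int tileY, int tileSize)
        {
            var seen = new HashSet<int>();
            for (int y = tileY; y < Math.Min(tileY + tileSize, objectId.GetLength(1)); y++)
                for (int x = tileX; x < Math.Min(tileX + tileSize, objectId.GetLength(0)); x++)
                    seen.Add(objectId[x, y]);
            return seen.Count;
        }
    }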

    An HDMI video signal is still synced with the display it's being sent to. The only difference is that a modern GPU reads the frame from pre-filled memory instead of rendering it as it goes. That, and the display you're sending the signal to is probably storing that image in some local memory as well before actually displaying it on screen. Though gaming monitors are getting a lot better about how long they let it sit, and really do try to get the panel and input signal synced up as closely as possible.

    For retro consoles, there are a lot of modern analog-to-digital video converters that take advantage of the fact that the HDMI signal is still sent line by line, so the added latency is only however many lines of the HDMI resolution one line of the original analog signal has to span.
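
    Back-of-envelope for that, with example numbers only (240-line source into a 1080-line, 60 Hz HDMI signal, blanking intervals ignored):

    Code (CSharp):
    using UnityEngine;

    // Back-of-envelope for the "lines spanned" latency described above. Example numbers only.
    public static class LineBufferLatencyExample
    {
        public static void Example()
        {
            int sourceLines = 240;     // e.g. a retro console's visible lines
            int outputLines = 1080;    // HDMI output resolution
            float refreshHz = 60f;

            float frameTimeMs = 1000f / refreshHz;                  // ~16.7 ms per output frame
            float outputLineTimeMs = frameTimeMs / outputLines;     // ~0.015 ms to send one HDMI line
            float linesSpanned = (float)outputLines / sourceLines;  // one source line spans ~4.5 output lines
            float addedLatencyMs = linesSpanned * outputLineTimeMs; // well under a tenth of a millisecond

            Debug.Log($"~{addedLatencyMs:F3} ms of added latency");
        }
    }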
     
    Last edited: Jan 5, 2021