this post was submitted on 16 Jun 2024
632 points (95.3% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

founded 10 months ago
[–] [email protected] 56 points 2 months ago (4 children)

Because it's analog. There are no buffers or anything in between: your PC sends the image data as an analog signal through VGA, pixel by pixel, and each pixel is drawn on the screen in the requested color the instant it arrives.

[–] [email protected] 48 points 2 months ago (1 children)

And no motion blur, because the image is not persistent. An LCD holds its current image until it's replaced by the new one. A CRT draws its image line by line, and only the last few lines are actually lit on screen at any moment; it just happens so fast that, to the human eye, the image looks complete. That said, CRTs usually do have noticeable flicker, while LCDs usually do not.
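To put rough numbers on that line-by-line drawing (using a typical 640×480 @ 60 Hz VGA mode; exact timings vary by mode and include blanking intervals):

```python
# Rough scan timing for a 640x480 @ 60 Hz VGA mode. Values are
# approximate; real modes add horizontal/vertical blanking.
refresh_hz = 60
total_lines = 525          # 480 visible lines plus vertical blanking

frame_time_us = 1e6 / refresh_hz              # one full refresh
line_time_us = frame_time_us / total_lines    # one scanline

print(f"frame: {frame_time_us:.0f} us")       # ~16667 us (16.7 ms)
print(f"line:  {line_time_us:.1f} us")        # ~31.7 us
```

So each scanline exists for only tens of microseconds before the beam has moved on, which is why only a narrow band of the image is actually glowing at any instant.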

[–] [email protected] 7 points 2 months ago

Thanks for the explanation.

OP's point is a weird flex though, like pointing out that a bicycle never runs out of gas...

[–] [email protected] 23 points 2 months ago* (last edited 2 months ago) (2 children)

Of course there are buffers. Once RAM got cheap enough to hold a buffer representing the whole screen, everyone did that. That was in the late '80s/early '90s.

There are some really bad misconceptions about how latency on screens actually works.

[–] HackerJoe 8 points 2 months ago

Those are on the graphics adapter, not in the CRT.
You can update the framebuffer faster than the CRT can draw it; that's when you get tearing. VSync existed then for the same reason it does now.
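A toy model of that tearing effect (hypothetical values, just to illustrate the mechanism): the CRT scans out one line at a time from the framebuffer, so if the program swaps in a new frame mid-scan without waiting for vsync, the displayed image mixes two frames.

```python
# Toy model of screen tearing: scanout reads the framebuffer line by
# line; swapping the frame mid-scan (no vsync) splits the image.
LINES = 8
framebuffer = ["old"] * LINES

displayed = []
for line in range(LINES):
    if line == 3:                   # new frame arrives mid-scanout
        framebuffer = ["new"] * LINES
    displayed.append(framebuffer[line])

print(displayed)
# top of screen still shows the old frame, bottom shows the new one:
# ['old', 'old', 'old', 'new', 'new', 'new', 'new', 'new']
```

With vsync, the swap is simply delayed until the beam finishes the frame, trading a bit of latency for a tear-free image.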

[–] [email protected] 2 points 2 months ago (1 children)

CRTs (apart from some exceptions) did not have a display buffer. The analog signal directly controls the output of each electron gun in the CRT, with no digital processing in between. The computer on the other end, however, does have display buffers, just as it does now; eliminating the extra buffers that modern monitors add does reduce latency, though.

[–] [email protected] -1 points 2 months ago

Doesn't matter. With a buffer, either you wait for the buffer to fill before drawing, or you get screen tearing. Either way, it wasn't like racing the beam.

[–] [email protected] 8 points 2 months ago (1 children)

That means zero latency in the monitor, but how much latency is there on the PC side, in the hardware that converts the digital image to an analogue signal? Isn't the latency just moved there?

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago)

Fair warning before you dive in: this is a rabbit hole. Some key points (not exact, simplified for the layman): you don't see in digital; digital is "code". You see in analog, even on an LCD (think of sound vs. video, it's the same idea). Digital-only connections lacked contrast, brightness, color, basically all adjustments, so the signal went back and forth, adding even more latency.

Maybe think of it like a TV's game mode, where all the adjustments are turned off to speed up the digital-to-analog conversion.

Or like compressed (digital) vs. uncompressed (analog) video, where compression lets you send more data, but latency is added because the stream has to be compressed and decompressed at each end.

[–] [email protected] 5 points 2 months ago (3 children)

No such thing as instant. There is always some latency.

[–] [email protected] 17 points 2 months ago (1 children)

Ok fine, at the speed of light then.

[–] [email protected] -3 points 2 months ago (1 children)

Not quite... There is some attenuation due to the medium, in this case a signal sent over a wire. Even optical fiber has some attenuation.

[–] [email protected] 14 points 2 months ago (1 children)

Obvious troll, but I'll explain it for the rest of you: the latency in CRTs is so minuscule compared to LCDs that it might as well be called instant.

[–] [email protected] -1 points 2 months ago

Not trolling, just nitpicking.

[–] [email protected] 4 points 2 months ago

When one of your times is in milliseconds, while the other requires awareness of relativistic effects, you might as well call it instant.

The propagation speed in copper is about 2/3 c. With analogue monitors, that signal was effectively amplified and thrown straight at the screen. The phosphor coating is the slowest part; it takes 0.25-0.5 ms to respond fully.

By comparison, at the time, "gaming" LCD screens were advertising 23 ms response times.
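Putting those numbers side by side (a rough back-of-envelope using the figures above, and assuming a 2 m cable):

```python
# Back-of-envelope latency comparison using the figures from the thread.
c = 3.0e8                      # speed of light in vacuum, m/s
cable_m = 2.0                  # assumed 2 m VGA cable
v = (2 / 3) * c                # signal propagation in copper, ~2/3 c

wire_ns = cable_m / v * 1e9    # time on the wire, in nanoseconds
phosphor_ms = 0.5              # upper end of phosphor response
lcd_ms = 23.0                  # advertised "gaming" LCD response time

print(f"wire:     {wire_ns:.0f} ns")                           # 10 ns
print(f"phosphor: {phosphor_ms} ms")
print(f"LCD:      {lcd_ms} ms ({lcd_ms / phosphor_ms:.0f}x slower)")
```

The cable contributes nanoseconds, the phosphor half a millisecond at worst, and the LCD of the era was a factor of ~46 behind even that worst case, hence "might as well be instant".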

[–] [email protected] 3 points 2 months ago

Well yes, the speed of light in this case.