If you have an old CRT VGA monitor, it's always on and has analog controls.
The Energy Star 1.0 spec was released in 1994 and most (if not all) computer monitors were compliant within a few years. Within the scope of this thread, well, all the boxed computer monitors in that wrecked store would certainly be Energy Star rated, have power management features, and of course they're all going to be digital multisyncs that take a little time to figure out exactly what mode they need to switch to, just like an LCD.
Most old LCDs that are 4:3 have visible screen lag and are stuck at 60Hz.
I kind of have to say "citation needed" when it comes to "visible screen lag". In what context, and are you arguing it's bad enough to actually get you killed in a video game?
It's generally accepted that human reaction time to "random" stimuli is somewhere in the ballpark of 175-250ms. IE, if you're sitting there staring at your screen, finger poised over a fire button, waiting to blow the head off a Counter-Strike opponent, it's going to take you about a fifth of a second, at best, to mash that trigger when they pop into view. At 60Hz that's 12 frames. Sure, that's less than the 29 frames you'd get with a 144Hz gamer monitor (and a GPU fast enough to actually saturate it), but let's be serious here: both are much faster than anything you're going to actually be able to respond to. I mean, sure, I guess you could say the faster monitor gives you a *little* bit of an advantage because the *first* frame with the target in it gets to you faster, but what we're talking about here is going to be like 10ms (a 16.7ms frame at 60Hz versus a 6.9ms frame at 144Hz), or about 1/20th of the total average response time... IE, a figure well within the margin of error. While I'm sure there are plenty of butch gamers that will insist they're equipped with superhuman levels of perception to make use of that, well, color me pretty skeptical.
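If you want to sanity-check that arithmetic yourself, here's a quick sketch. The 200ms figure is just the rough midpoint of the 175-250ms range quoted above, not a measured value:

```python
# Back-of-envelope check of the frames-per-reaction argument above.
# Assumed: ~200 ms reaction time (midpoint of 175-250 ms), 60 Hz vs 144 Hz.

REACTION_MS = 200.0

def frame_time_ms(hz: float) -> float:
    """How long one refresh cycle takes, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 144):
    ft = frame_time_ms(hz)
    print(f"{hz:>3} Hz: {ft:5.2f} ms/frame, "
          f"~{REACTION_MS / ft:.0f} frames per reaction")

# Worst-case extra wait for the *first* frame on the slower monitor:
delta = frame_time_ms(60) - frame_time_ms(144)
print(f"first-frame advantage: {delta:.1f} ms "
      f"({delta / REACTION_MS:.0%} of reaction time)")
```

Which lands on ~12 frames at 60Hz, ~29 at 144Hz, and a first-frame delta of roughly 10ms, i.e. about 5% of the reaction time.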
And to be clear, that's not to say that people can't *perceive* faster events. One of the banes of modern entertainment systems is that humans are really good at noticing mismatches between video and audio sync; the threshold for someone experiencing some level of unconscious discomfort with, say, spoken audio out of sync with the video of mouths moving can be as low as the 10-20ms ballpark, IE, just one 60Hz frame. Older HDTVs could easily exceed this several times over, but... it's worth calling out here that the bulk of those delays involve processing that computer monitors don't do. IE, TVs often have video processors that are simply determined to make things "look better" through deinterlacing, upscaling, motion interpolation, etc. Computer monitors rarely lag by anywhere near as much... especially if you're using a modern computer, because with DVI and better it's often actually the video card doing the work of scaling non-native resolutions to the panel size instead of a scaler in the LCD.
(My 2006-vintage Apple 30" Cinema Display, which I still use via a USB-C converter, offers two built-in resolutions, IE, native and 2x doubled. That's it. All the other resolutions you might choose when it's hooked up rely on the computer to do the work. This isn't really a new thing.)
People seem to forget that the reason CRT monitors from the 1990s onward offered refresh rates higher than 60Hz wasn't because it "made gaming better"; it's because a fair number of people (like me) are able to subconsciously perceive flickering in color CRT display phosphors at refresh rates lower than 70-85Hz. This problem gets worse as CRTs get bigger because human eyes are more sensitive to quick movement in their peripheral vision; a 13" VGA monitor might be just fine at 60Hz, but a 19" monitor is going to be constantly telling the edges of your retina to trigger the fight-or-flight reflex because of its greater coverage of the visual field.

LCDs don't, or at least shouldn't, flicker. (Old fluorescent backlights sometimes do, but that's another matter.) The pixels stay whatever color they are until they're changed, so as long as the refresh rate is fast enough to make things "move" it hardly matters. Really old LCD panels could be somewhat sluggish in changing color with updates, IE, if a pixel went from bright white to solid black it might actually take a measurable amount of time (10ms?), which *was* kind of perceptible (I remember old TFT LCDs had this sort of "melty" quality when transitioning from one scene to a drastically different one), but... that's a 20-year-old problem right there. Monitors are cheap, treat yourself to a new one?
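To put that "melty" effect in perspective, here's the same kind of napkin math. The 10ms figure is the guess from above, not a measured panel spec:

```python
# Rough sketch: how a slow full-swing pixel transition compares with the
# frame period at 60 Hz. The 10 ms response time is an assumption taken
# from the guess above, not a measured spec for any particular panel.

FRAME_MS_60HZ = 1000.0 / 60  # ~16.7 ms per refresh at 60 Hz
RESPONSE_MS = 10.0           # hypothetical old-TFT white-to-black time

fraction = RESPONSE_MS / FRAME_MS_60HZ
print(f"a {RESPONSE_MS:.0f} ms transition spans {fraction:.0%} of a 60 Hz frame")
# The pixel spends most of the frame mid-transition, which is exactly that
# cross-fade look on big scene changes.
```

With those numbers the transition eats roughly 60% of the frame, so drastic scene changes visibly smear across refreshes instead of snapping.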