CRT Simulation in a GPU Shader, Looks Better Than Black Frame Insertion
Comments
user_7832
Just a mini-warning/FYI: running the 120Hz test on my 60Hz LCD iPad (Air 4) has caused that part of the screen with the CRT effect to flicker even after leaving the demo. I don’t know what might cause this but it’s weird and worth a warning to anyone interested in trying this out.
(The flickering is more obvious when the control centre is opened; I managed to take a video of it but it’s only partially clear in it. It’s been about 5 minutes so far and I think the effect has reduced. I’m also quite perceptive to flickers so others might not notice it.)
tverbeure
This is a well known effect. LCD cells must be driven with alternating positive and negative values (of the same magnitude) to maintain an average neutral value, otherwise you get some kind of offset buildup that will result in flicker.
If you alternate every other image with a different color value, you upset that balance.
It will slowly rectify itself for most displays.
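A toy numeric model of this (illustrative only, not real driver code or anything from the shader): polarity flips every frame, so static content averages to zero DC, content alternating every frame leaves a net offset, and an odd-length cycle cancels again -- which is also why an odd native:emulated Hz ratio is recommended further down the thread.

```python
# Toy model of LCD polarity inversion (illustration only, not real driver code).
# The panel flips drive polarity every frame so the long-term DC voltage across
# a cell averages to zero; content that alternates in lockstep with that flip
# leaves a net DC offset, which shows up as flicker / image retention.
def dc_bias(levels, n_frames=1200):
    """levels: repeating per-frame drive magnitudes; polarity flips each frame."""
    total = 0.0
    for frame in range(n_frames):
        polarity = 1 if frame % 2 == 0 else -1
        total += polarity * levels[frame % len(levels)]
    return total / n_frames

print(dc_bias([0.8, 0.8]))       # static content: ~0.0, no buildup
print(dc_bias([0.8, 0.2]))       # content alternating every frame: ~0.3 net offset
print(dc_bias([0.8, 0.2, 0.8]))  # odd-length cycle: ~0.0, offset cancels again
```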
nayuki
> LCD cells must be driven with alternating positive and negative values (of the same magnitude) to maintain an average neutral value
This is called inversion and there are interesting web pages on the topic:
* http://www.techmind.org/lcd/
immibis
So Snow Crash[0] does affect both humans[1] and computers!
Muromec
Damn it, now I have to read another Stephenson book.
bitwize
Snow Crash is the Hackers (1995 film) of Stephenson's body of work. It's so aggressively 90s and cyber techno that it seems somewhat adorably cheesy in retrospect, but there's an audience of Z-ers who are just now discovering it, and what it means to be excited about technology the way we were back then.
mjevans
Google needs to retarget on being the Central Intelligence Corporation they were always meant to be.
Etheryte
Thanks for explaining, I've run into this on my phone in other contexts and was starting to think my screen was on its last legs. Turns out it's expected? Usually I run into this when the screen brightness is at the lowest setting.
tverbeure
Screen brightness is usually modulated by changing the strength of the backlight, not the values sent to the LCD cell array. So flicker induced by inversion doesn't change when you change the brightness.
(There are exceptions: one could dial up pixel values on a dark scene and dial down the backlight settings to save power. But that depends on the image content.)
andrewf
I think some OLED displays don't have backlights?
On manipulating the backlight to display a dark scene more power-efficiently: my TV that does this, if there's a small region of constant color (e.g. a TV station's logo) its brightness wavers noticeably as the rest of the scene changes.
kevingadd
Ah, is this why VRR displays start to flicker once your framerate drops too low? I had always wondered if it was a physical property of LCDs
tverbeure
That’s partially right. There’s also the issue of decay due to the LCD cell not being refreshed. It’s similar to not refreshing a DRAM cell.
But the flicker in VRR doesn’t only happen at lower frame rates. Some panels are more susceptible than others. It’s a headache.
It’s also a serious issue when doing 3D stereo on an LCD panel with a static scene: alternating frames display the left and right views, which may have pixels of different color.
modeless
Looks way more flickery than a real CRT at 120 Hz on an OLED phone. Maybe 240 Hz would be better.
Edit: I misunderstood and was running the 240 Hz version at 120 Hz. The 120 Hz version doesn't flicker noticeably. It does seem to reduce motion blur for 60 Hz content with a brightness penalty. It doesn't immediately make me feel like I'm looking at a CRT. Maybe it would if I had a 480 Hz monitor. There is a slight rolling banding artifact on my phone, maybe an artifact introduced by the display controller as described in the article.
blensor
I assume some people will approach this as stupidly as I did.
I wanted to see something and clicked on the 120Hz version, not knowing what my laptop display actually is, and while I am not photosensitive this was quite uncomfortable. Thinking I didn't understand what it was supposed to be, I clicked on the 480Hz version to see if that was better/different, and it was even worse. As a hail mary I clicked on the 240Hz one, and that really made sense and was actually comfortable to look at.
So if you are like me and didn't really read through the text: this will only work for you if you select the Hz that matches your display (which is kinda the whole point of what they're doing). If it looks bad, you clicked the wrong link.
BlurBusters
I wish Shadertoy had an easier way to let me change framerate. If you click 480Hz on a 120Hz display, it flickers at an awful 15Hz instead of 60Hz -- you don't want to simulate a 15Hz CRT; it's not comfortable.
RetroArch now has this CRT simulator, and its default settings automatically keep it at 60Hz, so it's more foolproof in RetroArch than in Shadertoy.
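The arithmetic behind that 15Hz figure, assuming (as described elsewhere in the thread) that the demo spreads one simulated 60Hz CRT refresh across demo_Hz/60 output frames:

```python
# Back-of-envelope: why the 480Hz demo flickers at 15Hz on a 120Hz display.
# Assumes the demo emits (demo_hz / 60) sub-frames per simulated CRT refresh.
def simulated_flicker_hz(demo_hz, display_hz, crt_hz=60):
    frames_per_crt_refresh = demo_hz // crt_hz   # e.g. 480 / 60 = 8 sub-frames
    return display_hz / frames_per_crt_refresh   # sub-frames play back at display rate

print(simulated_flicker_hz(480, 120))  # 15.0 -> painful flicker
print(simulated_flicker_hz(120, 120))  # 60.0 -> the matching demo, as intended
```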
pavon
Awesome. I find it so ironic that the main thing tempting me to buy a high-resolution, high-framerate monitor is the desire to better emulate a low-resolution, low-framerate CRT.
schmidtleonard
After HD, adding pixels/framerate/depth/brightness is like a clean house: it's hard to articulate the value proposition up front in a way that does it justice and it's easy to talk yourself out of going to the trouble, but once you have it you realize just how good it is.
BlurBusters
Even 480Hz looks great for office use; very ergonomic -- browser scrolling has 87.5% less motion blur than a 60Hz OLED and about 90-92% less motion blur than a 60Hz DELL LCD.
mosquitobiten
I'm not sure but I think this could be used instead of BFI in any game.
BearOso
I adapted this to a retroarch slang shader really quick, and I'm seeing some pretty persistent banding on 120hz to 60hz. It shows up obviously when scrolling the same direction as the fake beam scanout. If you take the shadertoy version and edit the scanout direction to left-to-right and fullscreen it, you can see it there, too. The perpendicular scanout and scrolling the demo uses by default disguises it pretty well.
I guess you probably need a higher ratio for this to work really well.
pipes
Silly question, but what does slang mean? I've been using RetroArch for years and have always wondered what slang means in relation to the shaders.
BearOso
That's RetroArch's shader preset format: https://github.com/libretro/slang-shaders
Not to be confused with the "slang" shader format that Khronos is looking at to replace GLSL.
pipes
Thanks
kevingadd
Slang is a new shader language with NVIDIA's involvement that can compile to multiple other target shader languages for portability.
BlurBusters
I'm the author of this shader, here's some tips:
- Use as high a native:emulated Hz ratio as you can.
- 120Hz = up to 50% blur reduction
- 240Hz = up to 75% blur reduction
- 480Hz = up to 87.5% blur reduction
- Calibrate your black levels and white levels (e.g. via TestUFO PLUGE test and White Level tests), since you need all of the levels for the simulated phosphor fades.
- Use SDR mode, not HDR; the math in the shader is designed around the Adobe sRGB curve. I wish I had more direct access to the complex HDR curves and ABL to auto-compensate for the Talbot-Plateau Theorem.
- Use an odd-number native:emulated Hz ratio on LCD to make it immune to image retention, plus slightly better behavior with LCD 6-bit FRC.
- Adjust Gain-vs-Blur and gamma if there are problems. Using a low Gain-vs-Blur will reduce color ghosting. Use 0.5 for 120Hz, and if you're getting too many artifacts, try numbers as low as 0.25 for 240Hz to see if the color ghosting problems disappear. (A fix will be coming.)
- Artifacts reduce dramatically at 480Hz versus 240Hz versus 120Hz; more Hz really helps CRT simulation. The more Hz the merrier, for BYOA (Bring Your Own Algorithm) approaches.
There will be an improved version of my shader on Github, involving:
- Global refresh mode (like a phosphorescent BFI)
- Color balancing modes
- Black level lifter (to fix any thin dark bands caused by violations to Talbot-Plateau Theorem due to certain displays' crappy handling of below-2% greyscales, etc)
Keep an eye out for it in January 2025 -- just star the Github repo, or wait for RetroArch (etc.) to implement the improved version of my shader (after I've finished deadline work for a client at CES 2025).
redox99
This looks REALLY good on a 240hz monitor. Much better than BFI (which I don't use because it's pretty bad on my monitor)
BlurBusters
Thank you! It's in Retroarch now
ahartmetz
Ignoring the heavy flicker, it seems to reduce motion blur even with the 120 Hz demo running on a standard 60 Hz display. Especially visible on the windows. It doesn't seem like it should work, but it does?
But I find it hard to say what it's supposed to look like. Motion blur is considered fine and correct in the "film look". Our eyes do crazy processing and can't really be emulated by a display technology without going to crazy lengths with high DPI, high dynamic range, high refresh rate (to emulate certain effects, not because we can properly see 90+ or so Hz) and probably eye tracking.
I think I like the slight (static) pixel blur of CRTs more than the motion-related behavior. The crazy DPI numbers of state-of-the-art screens are seemingly not so much about showing detail as about hiding pixels. Calculating all of these pixels is, in a way, a waste of work. I'm talking about ~100 DPI, i.e. making a decent resolution look nicer, not about making low-res crap look blurred instead of pixelated.
empiricus
I appreciate the crazy high dpi very much. Because the text is super sharp, it helps with the focus. I am 48 and my eyes are not perfect. I look at screens for many hours every day and if the text is not sharp enough, I lose focus and everything becomes blurry. But super sharp and bright screens mean the eye can have a feedback loop for the correct focus distance.
empiricus
You need a 120Hz display to run the 120Hz demo. I am surprised to see that the movement is clearer with the shader. You can follow the objects and they are more stable/clearer.
vslira
I’m a complete layperson on graphics and such, so please someone help me here: does this mean we're now able to simulate how old video game visuals looked on a CRT? That would be the best Christmas gift ever
mrob
We're getting closer, but 480Hz is still too slow for a convincing simulation of phosphor decay. 1000Hz will probably be enough.
BlurBusters
It's actually good enough for most content for most people if you're just doing 320x240 retro material.
Also, there's some optimizations coming to make it look even better depending on how good or limited your display is.
yincrash
The 120Hz shadertoy works on the Pixel 8 (and hopefully other 120Hz Android devices) if you go to Developer Options and enable "Force peak refresh rate"
I wonder if there's a way to ask Android Chrome to ask for 120Hz.
yincrash
Ah, the non-developer option setting to enable 120Hz on later Pixels is under "Settings"->"Display & touch"->"Smooth display". With that enabled, Chrome will use 120Hz if power and temperature settings permit it to.
SushiHippie
Thanks for the hint. I had this setting enabled, but it didn't look good in Firefox; using Chrome made it look good!
nyanpasu64
I'm still interested in a "selective MPRT" GPU or monitor setting, that only does black frame insertion on changed parts of an image and a "safety margin" around them. This should reduce flicker on non-moving portions of an image/still screen while keeping moving portions sharper. But this probably isn't useful for office tasks, perhaps video, and high-framerate gaming (but only games running at a lower FPS than the screen can (partially?) redraw).
BlurBusters
For things like a pan -- you have to apply it globally, because your eye movements will smear the static pixels and motion-blur them across your retinas.
However, the CRT simulator actually is variable-MPRT; it compresses the light emissions as quickly as possible, as early as possible. Dim greys, for example, are brightened and pushed into an earlier refresh cycle of the series that simulates a CRT refresh.
So dimmer pixels get lower MPRT and brighter pixels get higher MPRT. Any unemitted brightness gets cascaded to subsequent refresh cycles until fully emitted, to meet the Talbot-Plateau Theorem.
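A rough sketch of that cascading idea (my own toy model, not the actual Shadertoy/RetroArch shader code): each pixel gets an energy budget equal to its sample-and-hold brightness times the number of sub-frames, and emits it as early as possible, capped at full white, carrying any remainder into later sub-frames so the time-average still matches.

```python
# Toy model of the variable-MPRT / Talbot-Plateau idea described above
# (illustration only, not the real shader; works on linear-light values).
def crt_subframes(linear_brightness, subframes_per_refresh):
    """Front-load a pixel's light so its time-average matches sample-and-hold."""
    budget = linear_brightness * subframes_per_refresh  # total energy to emit
    emitted = []
    for _ in range(subframes_per_refresh):
        out = min(budget, 1.0)   # can't exceed full white in one sub-frame
        emitted.append(out)
        budget -= out            # leftover cascades to the next sub-frame
    return emitted

print(crt_subframes(0.10, 8))  # dim pixel: [0.8, 0, ...] -> short MPRT, little blur
print(crt_subframes(0.90, 8))  # bright pixel: [1, 1, 1, 1, 1, 1, 1, ~0.2] -> long MPRT
```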
stuaxo
I've thought for a while that we need to simulate how phosphors fade in and out, at the very least.
naoru
This is better than BFI, although the 120Hz demo on my screen looks like it's just alternating two or three parts of the image. Maybe there is a way to use fake interlacing to make it look convincing.
240Hz demo in 144Hz mode looks flickery but much more realistic.
Hakkin
I have an AOC Q27G3XMN and while I do get reduced motion blur from this, I also experience very bad color banding/shifting. Messing with some of the values in the script config makes it slightly better, and changing the overdrive setting on the monitor seems to affect it as well, but there is still pretty strong banding no matter what strength it's on. I tested on my phone (Pixel 8) and it works very well there without any banding or color weirdness, so I guess it's just something about this particular monitor that doesn't work well with this method.
stelonix
Tried on my simple 60Hz PC screen and also on my phone with OLED screen and sadly, it's just a flickering image. Will try later this week on my friends' retrogaming setup. Looks promising
BlurBusters
You need a large native:simulated Hz ratio. I recommend at least 4, which means 240Hz or more to simulate a 60Hz CRT well.
phafu
I'm wondering if it would make more sense to emulate a CRT with a video projector and some shutter device (maybe a fan?) in front. Has anyone tried that yet?
UniverseHacker
Projectors tend to have brightness issues; that would make it even dimmer.
zanfr
It's cute but on my AOC (at 120hz) the discoloration is substantial. Also any inconsistency in frame time will break the illusion
BlurBusters
- Turn off HDR; use Adobe sRGB at the OS level, display ICC level, and display menu level. The math in the CRT simulator is optimized for the gamma2linear/linear2gamma math needed for the Talbot-Plateau Theorem, and it was easier on a well-known old gamma curve (see the sketch below this comment).
- Adjust your black levels and white levels so there's no clipping
- I noticed 6bit TN panels tend to have problems, try IPS or OLED
- Lower GAIN_VS_BLUR to 0.5 at 120Hz, or 0.25 at 240Hz, if discoloration is bothersome.
- There are some optimizations coming in January 2025 as band-aid workaround for display limitations (especially low-Hz TN LCDs), even 240Hz is sometimes too low.
OLED at 240Hz looks better than LCD at 360Hz with the CRT simulator, for example, so if you're buying a monitor to get 75%-90% motion blur reduction in your 60fps retro content, you will want a high-Hz OLED. See the motion blur physics at the TestUFO Variable-Persistence Black Frame Insertion demo (in TestUFO 2.1) to understand how higher Hz can reduce the motion blur of low frame rates more than lower Hz can; it's just the laws of physics of ergonomic flickerless sample-and-hold displays, plus BYOA (Bring Your Own Algorithm) approaches. I can emulate plasma subfields on a 600Hz OLED, and I can emulate DLP subfields on a 1440Hz OLED; but CRT is the gold standard, and it still needs a large native:simulated Hz ratio to look realistic. It's very adjustable.
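For reference, a sketch of the standard sRGB transfer functions that "gamma2linear/linear2gamma" presumably refers to (my summary of the sRGB curve, not code from the shader); the Talbot-Plateau averaging has to happen on the linear values, not the gamma-encoded ones:

```python
# Standard sRGB transfer functions (piecewise curve); presumably what
# "gamma2linear/linear2gamma" refers to. Light must be averaged in linear space.
def gamma2linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear2gamma(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

mid_grey = gamma2linear(0.5)             # ~0.214 in linear light, not 0.5
print(mid_grey, linear2gamma(mid_grey))  # round-trips back to ~0.5
```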
jtxt
Nice! Does this / can this handle interlacing?
https://en.m.wikipedia.org/wiki/Interlaced_video
(Half vertical resolution, offset a bit every other frame)
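A toy illustration of the field selection that parenthetical describes (my example, nothing to do with the shader itself):

```python
# Toy illustration of interlaced field selection: each displayed field carries
# only every other scanline, alternating between even and odd rows per frame.
def field(rows, frame_index):
    return rows[frame_index % 2::2]   # half vertical resolution, offset per frame

scanlines = [f"line {i}" for i in range(6)]
print(field(scanlines, 0))  # ['line 0', 'line 2', 'line 4'] -- even field
print(field(scanlines, 1))  # ['line 1', 'line 3', 'line 5'] -- odd field
```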
kristopolous
Just start making CRTs again. There's clearly consumer demand
runlevel1
I don't know if the market is big enough to offset the cost of setting up production again.
rcxdude
it's gonna be pretty difficult to compete with used prices and make a profit with the likely volumes. The tubes aren't exactly something that scales down to low volumes either.
op00to
Could this be used to simulate CRT displays in video game cabinets?
BlurBusters
Yes. Retroarch now has the CRT simulator built in.
I recommend a 240Hz OLED in arcade cabinets to emulate a 60Hz CRT; do not skimp on Hz. More Hz is better for CRT simulators, due to a very important temporal principle: you need a large native:simulated Hz ratio.
isoprophlex
This probably goes without saying but...
If you have photosensitive migraine or epilepsy, stay the hell away from those demos.
dsp_person
I tried the 120Hz demo but can't really tell there's any effect. Does it look cooler with 240Hz?
sergiotapia
FWIW I have the ASUS ROG Swift PG32UCDM 31.5" 4K UHD (3840 x 2160) 240Hz Gaming Monitor - a $1.3k monitor and also don't see anything different in the demo. Maybe I'm looking at the wrong thing. https://www.shadertoy.com/view/XfKfWd
klausa
Make sure your browser lets you refresh at those framerates.
Safari by default caps animations other than scrolling at 60fps (I think?).
zamalek
This will let you know what your browser is allowing: https://www.testufo.com/framerates (it will also demonstrate the issue that the post is attempting to solve).
a1o
I only get a hyphen on the iPhone
wrboyce
I got 60fps/60Hz on mine (iPhone 16 Pro, iOS 18.2).
KuzMenachem
Just FYI, you can go to Settings -> Safari -> Advanced -> Feature Flags -> Prefer Page Rendering Updates near 60fps and switch it off to get 120Hz
BlurBusters
240Hz is more recommended; 120Hz does not give much benefit, especially on small screens. If your screen is bigger you may notice it, but it starts to become noticeable if done on OLED instead of LCD, or if you increase the refresh rate to 4-8x the CRT simulation target.
c22
Can I use this to play Duck Hunt?
grishka
No. Light gun games rely on the fact that a CRT display will draw the picture on the screen pretty much at the same time it's generated by the console's video chip. Modern digital displays introduce all kinds of delays due to processing and buffering they do. Usually several frames worth. This shader can't do anything to fix that.
MarioMan
For the longest time, I thought this was the only limiting factor, but modern panels are low enough latency for it to work, yet it still doesn't.
The other important factor is the light filter. The NES Zapper has a filter designed to only be sensitive to high-frequency light sources like CRT screens.
taneq
That said, I bet you could fake the electron beam position with a high frame rate display, a modified version of this shader, and some kind of calibration routine…
grishka
You don't really need to emulate the position of the beam, at least not for the NES light gun. When you pull the trigger, the game first makes the entire screen black for one frame, reading the sensor in the gun and checking that it doesn't detect any light, and on the next frame, a white box is drawn where a duck would be. If the gun does detect light on this frame, it's counted as a hit. That second check is performed while the frame with the white box is still being drawn because CRT phosphors decay fairly quickly. You could, in theory, work around this with an LCD/OLED display with a high enough refresh rate that it would make up for the buffering delays.
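A toy sketch of that two-frame sequence (hypothetical emulator-side pseudocode with made-up helper names, just to make the timing concrete):

```python
# Toy sketch of the NES Zapper hit check described above (hypothetical helpers,
# not from any real emulator): one black frame, then a white box per target.
def check_hit(gun, targets, render_frame):
    render_frame(black=True)           # frame 1: all black
    if gun.sees_light():               # light here means ambient light / pointing at a lamp
        return None
    for target in targets:             # subsequent frames: one white box per duck
        render_frame(white_box=target.rect)
        if gun.sees_light():           # photodiode fires while the box is being drawn
            return target
    return None
```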
taneq
Ah dammit I actually knew this, but was thinking about light guns in general. That's a great point, it should actually be super easy to generate a video output compatible with a (S)NES light gun. As penance I'll consider digging one up and adding support to an emulator. :D
IshKebab
You wouldn't be able to get horizontal position.
brcmthrowaway
I just see a flickering image. What am I missing on iPhone?
pfg_
ios limits browser framerate by default, you can try going to settings > apps > safari > advanced > feature flags > disable "prefer page rendering near 60hz" and see if that has any effect. you can test by going to testufo and seeing if it gets the right framerate.
amelius
> Q: It looks like crap! Why?
> A: You need a bright display, try a 240Hz+ OLED. Also some local dimming LCDs have a backlight lag that sometimes interferes with quality.
Come on now. If you can simulate a CRT then surely you can make it look nice on a conventional monitor?
BlurBusters
You need a large native:simulated Hz ratio in order to simulate a CRT accurately. It's the laws of physics, sadly. I need to update a pixel multiple times per videogame frame just to accurately simulate a CRT.
120Hz = up to 50% motion blur reduction for 60fps
240Hz = up to 75% motion blur reduction for 60fps
480Hz = up to 87.5% motion blur reduction for 60fps
CRT simulation is bottlenecked by limited genuine native non-faked Hz, which is why accurate CRT simulation is so difficult.
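Where those percentages come from (my arithmetic, assuming persistence can be compressed down to a single native frame per simulated refresh):

```python
# Max motion blur reduction from CRT simulation (my arithmetic): a pixel can be
# lit for as little as one native frame instead of a whole simulated frame, so
# blur shrinks by up to 1 - simulated/native.
def max_blur_reduction(native_hz, simulated_hz=60):
    return 1 - simulated_hz / native_hz

for hz in (120, 240, 480):
    print(hz, f"{max_blur_reduction(hz):.1%}")  # 50.0%, 75.0%, 87.5%
```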
kizer
Could someone explain the point to me? I read the post and still don’t quite understand. I remember CRTs looked smoother when pixels were still noticeable in (o)led displays. Is it to effectively lower the frame rate?
BlurBusters
You need more Hz to reduce display motion blur.
- 120Hz = can reduce motion blur by up to 50%
- 240Hz = can reduce motion blur by up to 75%
- 480Hz = can reduce motion blur by up to 87.5%
There's a new article on Blur Busters showing that 120Hz-vs-480Hz OLED is more visible to humans than 60Hz-vs-120Hz, and easier to see than 720p-vs-1080p, and also why pixel response (GtG) needs to be 0 instead of 1ms, since GtG is like a camera shutter slowly opening and closing, while MPRT is equivalent to the shutter's full-open time. The science and physics is fascinating, including links to TestUFO animations that teach about display motion blur and framerate physics.
Motion blur of flicker = pulsewidth
Motion blur of flickerless = frametime
So you need tons of framerate or short pulsewidth BFI/CRT/etc.
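A worked example of those two relations (my numbers, for a hypothetical 960 pixels/sec pan tracked by the eye):

```python
# Eye-tracked motion blur is roughly speed times how long each frame's pixels
# stay lit: frametime for sample-and-hold, pulsewidth for flicker/CRT/BFI.
def blur_px(speed_px_per_sec, visibility_sec):
    return speed_px_per_sec * visibility_sec

speed = 960                     # px/s pan, tracked by the eye
print(blur_px(speed, 1 / 60))   # 16.0 px smear on flickerless 60Hz sample-and-hold
print(blur_px(speed, 1 / 480))  # 2.0 px smear if the same frame is a 1/480s pulse
```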
mrob
It's to reduce sample-and-hold blur. Modern displays typically produce a static image that stays visible for the whole frame time, which means the image formed on your retina is blurred when you move your eyes. CRTs instead produce a brief impulse of light that exponentially decays, so you get a sharp image on your retina. Blur Busters has a good explanation.
akoboldfrying
Can anyone explain why this requires a relatively high-end GPU? Looking at the slo-mo GIFs, it looks like `brightness *= SomeLUT[(y + t) % sizeOfTheLUT]` for each colour channel would do the trick.
What makes it so complicated?
BlurBusters
Author here.
You need to keep the GPU free to work on the game; doing CRT simulation of 60fps at 480Hz requires 8 brand new frames per videogame frame, and it's doing a bunch of math operations per subpixel per refresh cycle. If you run it at full resolution, 2560x1440x480x3, that's a lot of processing.
Especially since it also uses a variable-MPRT algorithm that cascades the brightest pixels to subsequent refresh cycles.
That's why it's coming to RetroArch, and why it's best to process the low-resolution framebuffers first, before scaling and sending through CRT filters/simulated curvatures/etc.
Most retro games are just 320x240.
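Rough numbers for the full-resolution case above (my back-of-envelope math, reusing the ~100 operations per subpixel per refresh cycle the author mentions elsewhere in the thread):

```python
# Back-of-envelope GPU load for full-resolution CRT simulation (rough figures).
width, height, subpixels, native_hz = 2560, 1440, 3, 480
ops_per_subpixel = 100                       # per refresh cycle, per the author

subpixel_updates = width * height * subpixels * native_hz
print(f"{subpixel_updates / 1e9:.1f}G subpixel updates per second")          # ~5.3G
print(f"{subpixel_updates * ops_per_subpixel / 1e12:.2f} TFLOPS sustained")  # ~0.53, before the game itself
```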
londons_explore
To me this highlights that none of my hardware (PC, phone, laptop) can actually render anything at native screen resolution without occasionally dropping frames.
Can we please design software to be frame-drop-free? Ie. if it drops a frame, even once, send a bug report to the developer to fix it, and if he cannot, refund me for the hardware?
My analogue TV from 1956 does not drop frames, I can assure you.
BlurBusters
Alas, software and operating systems full of background stuff sometimes do that...
This CRT simulator almost requires a dedicated GPU (AMD, NVIDIA, Radeon) and nothing running in the background, since it's a software-based simulation of a CRT tube. It is probable that an Intel integrated GPU from 2017 won't work, and a cheap $200 Android phone won't do it smoothly either.
It is doing almost 100 math computations per subpixel per refresh cycle, in a multi-gigaflop supercomputer called a GPU, so if you're running at 2560x1440x120x3, that still blows past a lot of dedicated GPU abilities, as it needs to do it at every native Hz, regardless of the low simulated Hz.
Make sure you don't have any software running in background (not even browser tabs, no system tray apps, exit your RGB animation apps), and run in Performance Mode (not Balanced Mode or Low Battery Mode).
It's frame drop-free on my Razer laptop with a clean Windows install, but it starts stuttering with an old Windows install. Not much I can do about the operating system preventing realtime stuff.
grayhatter
Set your graphics card to a hard limit at 24fps, and only use pre-rendered video. I think you'll find it stops dropping frames. Just like your analog TV, you won't even have to reduce the resolution!
If you don't understand why comparing a modern graphics pipeline, sitting on top of a general-purpose CPU running a time-sharing kernel, might be different enough to break the comparison to an old TV, I don't know how to help you...
cmiller1
Now add different phosphor decays on the black parts for each subpixel!
BlurBusters
It already sort of does that, through a clever variable-MPRT algorithm.
Stevvo
Please flag. A couple of commenters damaged their displays, and a photosensitive person may have damaged themselves. The risks outweigh the benefits of having it up for discussion.
Dylan16807
> A couple of commenters damaged their displays
No they didn't.
> a photosensitive person may have damaged themselves
What person that might be seriously affected is going to look at the image at the top of the article and decide they want to try the full speed demo without extreme care? Plus there's a warning on the links.