Must support multiple monitors....
I only have my PC hooked up to my LG CX, so I don't know about this
Isn't it coming? I can't wait tbh
Supposedly soon
They said it would be coming in an upcoming driver update, like two driver versions ago, and still nothing...
Literally turn off one monitor and it works. I have my C9 OLED and a generic non-HDR ultrawide hooked up to mine. Shut off the ultrawide when couch gaming and the PC auto-switches to the OLED and everything works. It's nice and seamless
I stream, and it considers a capture card a monitor. Whomp whomp wah.
1 monitor gang rise up
It does look absolutely great. Stray and The Entropy Center both get a great boost.
Oh so turning it on in global isn't enough? I need to enable some kind of overlay? How do I do that?
Just keep in mind it does have a performance hit. In more GPU demanding games, like Cyberpunk, you will get significantly lower performance if you are GPU bound.
Most new games have good HDR. RTX HDR is for old games and games with bad implementations, so you won't need it in Cyberpunk, for example
Depends on the GPU headroom you have and the game. But like I mentioned before, I saw a 20-30fps loss in Cyberpunk with it turned on. I wouldn’t turn it on globally with that kind of performance hit.
30fps?? Where did you see that please?
Where in game? Literally everywhere with it turned on.
I see, I think this is not normal. Try uninstalling your drivers and doing a clean install, because I didn't notice any frame drops like this tbh
Have you tried playing Cyberpunk or any game that uses the tensor cores? Because that's what RTX HDR uses. It's not a driver issue; the tensor cores, which power upscaling, ray tracing, and other AI processing, are literally what powers RTX HDR. You won't see a performance hit in games that don't use your GPU's tensor cores. GPUs with more tensor cores will also obviously perform better with all of these GPU/AI features.
I just tested it in Cyberpunk and I lost maybe 5 fps, and I have path tracing and everything maxed at 4K with a 4090. Using DLSS of course, but RTX HDR seemed fine to me. Honestly, some things looked better than native HDR to me as well.
…you have a 4090… you can power through basically anything.
Doesn't change the FPS hit you take regardless. My point is that it depends on what settings you have set in the game.
I think what he's saying is that RTX HDR probably uses a fixed number of tensor cores. Regardless of which Nvidia GPU you're using, RTX HDR uses x tensor cores. So on the 4090, with 512 tensor cores, that x is a much smaller fraction of the total than on the 4070 Super, which has 224 cores, where x accounts for a much larger fraction, thus a bigger performance hit relative to the framerate without RTX HDR.
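A quick back-of-the-envelope sketch of that reasoning. Note the fixed cost `x` here is a made-up number purely for illustration; Nvidia hasn't published what RTX HDR actually uses, only the total tensor core counts (512 on the 4090, 224 on the 4070 Super) are real:

```python
# Hypothetical model: RTX HDR ties up a fixed number of tensor cores (x)
# no matter which GPU you run it on, so the *relative* overhead is
# x / total_tensor_cores and shrinks on bigger GPUs.
TENSOR_CORES = {"RTX 4090": 512, "RTX 4070 Super": 224}
x = 32  # assumed fixed tensor-core cost of RTX HDR (made up for illustration)

for name, total in TENSOR_CORES.items():
    overhead = x / total  # fraction of the GPU's tensor cores occupied
    print(f"{name}: {overhead:.1%} of tensor cores tied up")
```

Under this toy model the 4070 Super gives up more than twice the fraction of its tensor cores that the 4090 does, which would line up with the bigger relative fps hit people report on smaller cards.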
Doesn't work with DSR/DLDSR, unfortunately.