Depends what you want to render. High-fps requirements combined with movement, where the human eye is the bottleneck, make a perfect interpolation case. In that situation the bad frames aren’t really seen.
No, it depends how you want to render it. Older games still had most of today’s effects. It’s just that everyone is switching to Unreal, whose focus isn’t games anymore, and which imo looks really bad on anything except a 4090, if that. Nobody is putting in the work for an optimized engine. There is no “one size fits all”. They do this to save money in development, not because it’s better.
ffs even the noisy image isn’t always at native resolution anymore.
A context aware interpolation with less overhead is a cool technology as compared to context unaware averaging. How that ends up implemented in various engines is a different topic.
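To make the distinction concrete, here’s a toy sketch (my own illustration, not anyone’s actual engine code): a single 1-D “scanline” with a bright blob moving right. Naive averaging of two frames produces two half-bright ghosts, while interpolation that knows the motion vector (which a game engine has, since it rendered the motion) warps both frames to the midpoint first and lands the blob where it belongs.

```python
import math

def blob(center, width=64):
    # A bright "pixel blob" as a gaussian on a 1-D scanline.
    return [math.exp(-0.5 * ((x - center) / 2.0) ** 2) for x in range(width)]

f0 = blob(20)   # frame at t = 0
f1 = blob(28)   # frame at t = 1: the blob moved 8 px to the right

# Context-unaware averaging: blend the two frames pixel by pixel.
# Result: two half-bright ghosts at the old and new positions.
avg = [0.5 * (a + b) for a, b in zip(f0, f1)]

def shift(frame, d):
    # Circular shift by d pixels (positive = right); wrap is
    # harmless here because the gaussian tails are ~0.
    return frame[-d:] + frame[:-d]

# Context-aware interpolation: use the known motion vector (8 px/frame)
# to warp each frame halfway toward the other before blending,
# so the blobs line up at the midpoint.
mv = 8
half = mv // 2
warped = [0.5 * (a + b)
          for a, b in zip(shift(f0, half), shift(f1, -half))]

truth = blob(24)  # where the blob actually is at t = 0.5
ghost_err = max(abs(a - t) for a, t in zip(avg, truth))
interp_err = max(abs(w - t) for w, t in zip(warped, truth))
print(f"naive averaging error: {ghost_err:.3f}")   # large: ghosting
print(f"motion-aware error:    {interp_err:.3f}")  # essentially zero
```

Obviously real frame generation works on 2-D images with per-pixel motion vectors, occlusion handling, etc., but the failure mode of blind averaging (ghosting) versus motion-aware warping is exactly this.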
There shouldn’t be any averaging! Just render the damn frame!
You can’t tell me we could get something like MGSV on previous-gen hardware at 60 fps, and yet hardware with nine times the processing power can only render a lower-resolution, noisy image which is then upscaled and denoised… at 30 fps.
“But raytracing!!!”
If these are the compromises that need to be made just to shoehorn that in, then current hardware isn’t really capable of realtime raytracing in the first place.