Intel researchers create a method for AI-generating frames in games without added input latency

The first GPU company to offer it was Nvidia in 2022, followed by AMD a year later, and now Intel has joined the fun. I am, of course, talking about frame generation, and while none of the systems are perfect, they all share the same issue: increased input latency. However, researchers at Intel have developed a frame generation algorithm that adds no lag whatsoever, because it uses frame extrapolation rather than interpolation.

If you’ve a mind for highly technical documents, you can read the full details of how it all works on one of the researchers’ GitHub pages. Just as with all rendering technologies, this one has a catchy name and a suitable initialism: G-buffer Free Frame Extrapolation (GFFE). To understand what it does differently from DLSS, FSR, and XeSS-FG, it helps to have a basic understanding of how current frame generation systems work.
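To make the latency distinction concrete, here is a minimal sketch (not Intel's actual GFFE algorithm, which operates on rendered frames with motion estimation and inpainting) of why interpolation adds a frame of delay while extrapolation does not. The function names and the naive per-pixel linear prediction are illustrative assumptions only:

```python
import numpy as np

def interpolate(frame_n, frame_n_plus_1, t=0.5):
    """Interpolation (the approach current frame-gen systems use):
    blends between two rendered frames. The generated frame sits
    *between* them, so frame N+1 must already be finished -- the
    pipeline holds back a completed frame, which adds input latency."""
    return (1.0 - t) * frame_n + t * frame_n_plus_1

def extrapolate(frame_n_minus_1, frame_n):
    """Extrapolation (the GFFE idea): predicts the *next* frame from
    past frames only, so no finished frame is held back and no input
    latency is added. Here: a naive linear continuation of per-pixel
    change, purely for illustration."""
    return frame_n + (frame_n - frame_n_minus_1)

# Toy "frames" whose pixel values brighten by 10 each frame.
f0 = np.full((2, 2), 100.0)
f1 = np.full((2, 2), 110.0)
f2 = np.full((2, 2), 120.0)  # the real next frame, for comparison

mid = interpolate(f1, f2)    # needs f2, so the pipeline waited for it
pred = extrapolate(f0, f1)   # needs only past frames, so no waiting
```

In this toy case the extrapolated frame happens to match the real next frame exactly; in a real game, predicting content that hasn't been rendered yet is precisely the hard problem GFFE has to solve.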


