
Intel announces XeSS 2 with new Frame Generation and Low Latency tech

LectureMaster

Gold Member
Launching together with Intel's new cards. Didn't see my man Draugoth mention it in the thread.





Intel XeSS, or Xe Super Sampling, is the company's answer to NVIDIA's DLSS and AMD's FSR. However, as it utilizes the AI hardware and XMX Engines found in all Arc Graphics products, it's often seen as a better solution than FSR for maintaining image quality. Alongside Intel announcing its first next-gen Battlemage GPU with the new Intel Arc B580 today, the company has also lifted the lid on Intel XeSS 2. And yes, it includes frame generation.

When NVIDIA launched DLSS 3 Frame Generation alongside the GeForce RTX 40 Series, it took a minute or two to understand what was happening. XeSS 2 is similar in that it combines three bits of technology - XeSS Super Resolution, XeSS-FG Frame Generation, and XeLL Low Latency. Super Resolution does the AI upscaling. Frame Generation leverages AI-powered 'Optical Flow Reprojection' and other game data to create a new frame. XeLL dramatically lowers system latency to improve responsiveness.

What does that mean? Well, it transforms F1 24's native 1440p Ultra performance on the Intel Arc B580 from 48 FPS with an overall latency of 57ms to 152 FPS with an overall latency of 28ms.
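Unpacking those numbers a bit: FPS converts to per-frame render time, while the latency figure covers the whole input-to-photon chain, so the two don't move in lockstep. A quick back-of-the-envelope sketch using only the figures quoted above:

```python
# Back-of-the-envelope math on Intel's published F1 24 numbers
# (48 FPS / 57 ms native vs. 152 FPS / 28 ms with XeSS 2 + XeLL).

def frame_time_ms(fps):
    """Average time to render one frame, in milliseconds."""
    return 1000.0 / fps

native_fps, native_latency_ms = 48, 57
xess2_fps, xess2_latency_ms = 152, 28

print(f"native frame time: {frame_time_ms(native_fps):.1f} ms")   # ~20.8 ms
print(f"XeSS 2 frame time: {frame_time_ms(xess2_fps):.1f} ms")    # ~6.6 ms
print(f"FPS uplift:        {xess2_fps / native_fps:.2f}x")        # ~3.17x
print(f"latency cut:       {1 - xess2_latency_ms / native_latency_ms:.0%}")  # ~51%
```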

Yes, it's a game-changer, but like DLSS and XeSS, it requires game integration, so it will take a while before it becomes readily available in many games. On that note, Intel has announced the first XeSS 2 games: Assassin's Creed Shadows, Harry Potter Quidditch Champions, RoboCop Rogue City, Like a Dragon: Pirate Yakuza in Hawaii, Dying Light 2, F1 24, Ascendant, Marvel Rivals, Killing Floor III, and Citadels.

With XeSS available in over 150 games and Arc Graphics and XeSS 2 covering both the Intel Arc B-Series and Intel's mobile APUs with integrated Arc chips, there's a good chance XeSS 2 support will ramp up in the coming months.


 

//DEVIL//

Member
"What does that mean? Well, it transforms F1 24's native 1440p Ultra performance on the Intel Arc B580 from 48 FPS with an overall latency of 57ms to 152 FPS with an overall latency of 28ms."

This is impressive if it's true. Much better than DLSS 3 + frame gen in terms of latency, if I'm not mistaken.
 
Last edited:

Buggy Loop

Member
"What does that mean? Well, it transforms F1 24's native 1440p Ultra performance on the Intel Arc B580 from 48 FPS with an overall latency of 57ms to 152 FPS with an overall latency of 28ms."

This is impressive if it's true. Much better than DLSS 3 + frame gen in terms of latency, if I'm not mistaken.

DLSS 3 frame gen is basically ~60% of native latency, so in this case, as a rough extrapolation, it would land at around 34ms.
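As a sanity check on that extrapolation (note the ~60% ratio is the poster's ballpark figure, not a measured one):

```python
# Rough extrapolation: if DLSS 3 frame gen typically lands around
# ~60% of native latency, apply that ratio to the same 57 ms baseline.
native_latency_ms = 57
dlss3_ratio = 0.60  # assumed ballpark, not a measured figure

estimated_dlss3_latency_ms = native_latency_ms * dlss3_ratio
print(f"{estimated_dlss3_latency_ms:.0f} ms")  # ~34 ms, vs. Intel's claimed 28 ms
```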

Without an apples-to-apples comparison (and I didn't look into what kind of framerate bump the NVIDIA side gets), it seems Intel has the advantage here, but not a "much better" one.

The jump from 48 fps to 152 fps seems huge, though. I don't recall NVIDIA doing that, but I can't search much on it while at work.

Hopefully it all holds up nicely image-quality-wise. They could have something to shake things up a bit, and hopefully bring a bit more competition.
 

Cattlyst

Member
Interesting. The price point of these new Arc cards looks very competitive too. If they perform as well as these articles claim and they enter the market at the lower end price-wise, Intel could be onto a winner. I'd certainly be willing to give the B580 a look when I upgrade from my 3070.
 

yankee666

Member
I am excited by the possibilities, but I have a feeling that even with all these improvements it will fall short of my 6700XT. Would love to be proven wrong though.
 

Zathalus

Member
Exciting stuff. I wonder if the upscaling component of XeSS is still based on 1.3 or has improved.

Pity Intel is not doing 7xx class GPUs again. A 4070+ GPU at $350 would have been remarkable.
 
Last edited:

kevboard

Member
Exciting stuff. I wonder if the upscaling component of XeSS is still based on 1.3 or has improved.

Pity Intel is not doing 7xx class GPUs again. A 4070+ GPU at $350 would have been remarkable.

Is it even confirmed that there will be no Arc 7 card this time? It could just be that they're releasing the mainstream ones first and will add a higher mid-range one later.
 

TrebleShot

Member
Nobody wants fake frames.
It wasn't impressive 10 years ago as TruMotion/Motion Flow and it's not impressive now. Give us better upscaling!
 

b0uncyfr0

Member
Nobody wants fake frames.
It wasn't impressive 10 years ago as TruMotion/Motion Flow and it's not impressive now. Give us better upscaling!
Hard disagree. I'll gladly take frame generation done properly in most of my games. I do notice the slight input-lag difference, but it's nowhere near terrible, and after an hour or two I've adjusted to it.
 

CamHostage

Member
A Steam Deck competitor manufactured by Intel themselves would be very interesting to see.

It's in the works already, but...


Lots of ways this could already have gone wrong, though at least this one is not afraid to be different from the Steam Deck and differentiate itself beyond the specs. And as others have already pointed out, other manufacturers may use Intel chipsets instead of AMD or ARM-based Snapdragons.
 

SpokkX

Member
What kind of hardware does this need? How quick is it per frame on AMD GPUs? Is this something for consoles?
 

CamHostage

Member
Nobody wants fake frames.
It wasn't impressive 10 years ago in trumotion/motion flow and it's not impressive now. Give us better upscaling!

"Motion Smoothing" solutions like TruMotion/Motion Flow are purely external post-processing solutions. They take the final video output stream and interpolate what could have been in the middle of frames, based on assumptions about the processed pictures they have access to.

Frame Gen solutions are built into the GPU and interact with the game. They have access to far more information about the image-generation stream before it is ever flattened to a final picture (although it seems they still lean heavily on optical interpretation, even with a game's 3D and input data). They theoretically have a much better shot at creating an authentic approximation of the in-between frames.
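The distinction can be shown with a toy example (pure illustration, not how TruMotion or any GPU frame-gen product is actually implemented): an external smoother can only blend the two finished pictures, while something that knows the motion vectors can reproject the object to where it would actually be.

```python
# Toy 1-D "frames": a bright pixel moving two positions to the right
# between frame A and frame B (hypothetical values, purely illustrative).
frame_a = [0, 0, 255, 0, 0, 0]
frame_b = [0, 0, 0, 0, 255, 0]

# TV-style motion smoothing has only the two finished pictures, so the
# simplest in-between frame is a crossfade: the object shows up
# half-bright in BOTH positions (ghosting).
blended = [0.5 * a + 0.5 * b for a, b in zip(frame_a, frame_b)]

# An engine-aware generator (toy version) knows the motion vector
# (+2 px per frame), so the in-between frame shifts the pixel by +1
# instead of doubling it.
shift = 2 // 2  # half of the per-frame motion
reprojected = [0] * shift + frame_a[:len(frame_a) - shift]

print(blended)      # [0.0, 0.0, 127.5, 0.0, 127.5, 0.0] -> two ghosts
print(reprojected)  # [0, 0, 0, 255, 0, 0] -> one sharp object
```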



...Also, this isn't something I love saying, but Motion Smoothing got a bit of a bad rap because of how it lays motion mayonnaise over films.

Fact is, cinephiles don't like anything outside 24FPS, as it has that "looks like a soap opera" video sensation to it. They hated The Hobbit in HFR, they hated Gemini Man 3D/4K/120, they even hated it while saddled with 3D glasses during James Cameron's Avatar: The Way of Water when it switched to 48FPS to eliminate judder in high-speed action scenes (even though the entire film was technically projected at 48; most scenes just double-printed frames so the image only updated every 24th of a second). Even in the hands of expert directors and cinematographers, high frame rate has been rejected. Sports can be higher, live news and talk shows can be higher, but everything else, be it Grey's Anatomy or Toy Story or Black Mirror, be it shot on film or digital video, those images must flow at 24FPS. We have decided that the framerate established in the 1920s for sound films is how entertainment should be enjoyed, despite all the vast improvements in picture-making over the past 100 years... and maybe we made the right choice; maybe a century ago we really did stumble across a sweet spot for human eyes capturing the world as we perceive it. Maybe?



Either way, for games, I don't remember anybody having much of a positive or negative experience with Motion Smoothing. The picture was already fake graphics anyway, so it didn't get more artificially "soapy" by being processed again, and LCDs and games already had lots of lag, so players might not have noticed. Game modes on TVs tend to disable it automatically, so gamers generally haven't played with Motion Smoothing on; I doubt most of us have enough experience with it on or off in gameplay to have a say about what effect it has on games.

(BTW, the idea that you don't want "fake frames" but then demand "better upscaling", aka fake pixels, is confusing. The dream would be games shooting for native res and native framerate, maxed out for our current 4K 60/120 world, but there'd be a lot to give up for that.)
 
Last edited:

TrebleShot

Member
"Motion Smoothing" solutions like TruMotion/Motion Flow are purely external post-processing solutions. They take the final video output stream and interpolate what could have been in the middle of frames, based on assumptions about the processed pictures they have access to.

Frame Gen solutions are built into the GPU and interact with the game. They have access to far more information about the image-generation stream before it is ever flattened to a final picture (although it seems they still lean heavily on optical interpretation, even with a game's 3D and input data). They theoretically have a much better shot at creating an authentic approximation of the in-between frames.



...Also, this isn't something I love saying, but Motion Smoothing got a bit of a bad rap because of how it lays motion mayonnaise over films.

Fact is, cinephiles don't like anything outside 24FPS, as it has that "looks like a soap opera" video sensation to it. They hated The Hobbit in HFR, they hated Gemini Man 3D/4K/120, they even hated it while saddled with 3D glasses during James Cameron's Avatar: The Way of Water when it switched to 48FPS to eliminate judder in high-speed action scenes (even though the entire film was technically projected at 48; most scenes just double-printed frames so the image only updated every 24th of a second). Even in the hands of expert directors and cinematographers, high frame rate has been rejected. Sports can be higher, live news and talk shows can be higher, but everything else, be it Grey's Anatomy or Toy Story or Black Mirror, be it shot on film or digital video, those images must flow at 24FPS. We have decided that the framerate established in the 1920s for sound films is how entertainment should be enjoyed, despite all the vast improvements in picture-making over the past 100 years... and maybe we made the right choice; maybe a century ago we really did stumble across a sweet spot for human eyes capturing the world as we perceive it. Maybe?



Either way, for games, I don't remember anybody having much of a positive or negative experience with Motion Smoothing. The picture was already fake graphics anyway, so it didn't get more artificially "soapy" by being processed again, and LCDs and games already had lots of lag, so players might not have noticed. Game modes on TVs tend to disable it automatically, so gamers generally haven't played with Motion Smoothing on; I doubt most of us have enough experience with it on or off in gameplay to have a say about what effect it has on games.

(BTW, the idea that you don't want "fake frames" but then demand "better upscaling", aka fake pixels, is confusing. The dream would be games shooting for native res and native framerate, maxed out for our current 4K 60/120 world, but there'd be a lot to give up for that.)
Thanks for your reply.
Interesting stuff. The 24fps of movies is largely attributed to the feeling of another world: we're peering through a window into it, and the ethereal quality of 24fps is something we came to enjoy because of that window effect. People don't like smoother visuals in movies because they look too realistic, too close to real life, and expose the poorer quality of the effects and makeup, which is very clear in motion.

The soap opera effect essentially.


My issue with fake frames is that no matter how good the solution is, it almost always introduces artefacts. Even worse is the disconnect between something running at 40 or 60 while what you see is running at 120; it feels off. There's a disconnect, and it almost never matches what you feel.


Upscaling is different because the pixels don't affect the feel; they're just visual crispness. Motion is something else.


Maybe things will get better, but at the moment I really don't like any frame gen unless it's absolutely needed.

Also, Motion Smoothing from TVs and frame gen ultimately have the same end effect: the image looks smoother but doesn't feel any more responsive. The underlying tech is very different, but the outcome is the same.
 
Last edited:

CamHostage

Member
Interesting stuff. The 24fps of movies is largely attributed to the feeling of another world: we're peering through a window into it, and the ethereal quality of 24fps is something we came to enjoy because of that window effect. People don't like smoother visuals in movies because they look too realistic, too close to real life, and expose the poorer quality of the effects and makeup, which is very clear in motion.

Well, they're certainly not used to it... and they may well never be. I personally enjoyed all four HFR experiences I've been able to attend (Hobbit 1, Billy Lynn, Gemini Man, and Avatar WoW) and really appreciated some of the quality-of-experience improvements it brings in cutting down judder and motion blur, but it definitely took some time to get used to each viewing, and even I was bothered by it looking "soapy" and hokey at times. Cinema has developed a visual language of its own and has learned to use its ancient limitations as tools to make art.

There's a fun layman-ish discussion about high framerate and persistence of vision in the "Billy Lynn's Long Halftime Walk" episode of the Blank Check podcast, where TV director JD Amato goes deep on the science of picture projection, motion slices over time, "beta frames", and a lot of technical detail about this apparently dead-end attempt at introducing HFR. It's really interesting to hear just how much work and research had to go into an advancement that a huge vocal majority of those who have paid to see it say ruins movies.



...But, movies aren't games.

Bump a game from 30 to 120 and you won't get many complaints, if any. There is a difference, and there are technical questions to work out when a dev is deciding what to target and whether to leave the framerate unlocked, but "higher is better" is hard to argue against when talking about games.
 

CamHostage

Member
Upscaling is different because the pixels don't affect the feel; they're just visual crispness. Motion is something else.

Motion of course is something else, but games move, and upscaling is a choice made by a machine every time it generates a frame. I don't know that there really is a "feel" difference that would create an argument against ever using it (especially these days, when we're so burdened with artifacts that we can't even get to preferences), but each solution has its own algorithm, so if there are differences in the how, there are questions to be asked in the why.

My issue with fake frames is that no matter how good the solution is, it almost always introduces artefacts...

Technically, sure, it always introduces artifacts. In games, though, I've not heard many complaints. And there's no natural truth to compare it to; these visuals aren't real either way.

The "Performance vs Fidelity Mode" wave of choices has given some interesting comparison sources, and I don't think people realize that it goes beyond just choosing between better effects or better framerate; you're also choosing a "feeling", same word you used to describe this in movies.

For example, check out this Toy Story clip that somebody AI-enhanced to 60FPS. (I would have preferred 48FPS, 2x the original, but it's not my clip.) Now, these are "fake frames", with AI-generated artifacts and all. Pixar animates at 24, and I don't know if they've ever published any HFR experiments, so it's not true source material created at the desired framerate. Still, it's funny how different this looks simply from displaying at a higher framerate, how even a CG cartoon can get that "soap opera" effect. (And it's purely framerate; switch to 480p and the same clip at 30 looks pretty much as you always remember it.)




...even worse is the disconnect between something running at 40 or 60 while what you see is running at 120; it feels off. There's a disconnect, and it almost never matches what you feel.

Do we even notice games "feeling" experientially different when we crank up the framerate? Maybe we should notice ourselves not noticing before we rush to judgment over what's better for us. It's hard to argue that a 120fps game is not smoother than a 30fps game, and so we have extrapolated that higher is always better, but we might have missed an offramp of discussion which film drives right past on the 24FPS highway.

I remember when MoH Frontline got remastered, I was psyched to play it without the barely-capable issues of the PS2 original, but when I played it, it just felt "like a videogame", with little of the intensity of battle overwhelming the experience. Partly that was nostalgia, but the clean framerate also gave a different "feeling" compared to the laggy, struggling original.



Now, I'm not insane. Sub-30 in an FPS fucking blows. But if there's an undeniable feeling difference here, what's the subtle "feeling" difference we're choosing between every day when we toggle Fidelity and Performance?

Maybe things will get better, but at the moment I really don't like any frame gen unless it's absolutely needed.

Also, Motion Smoothing from TVs and frame gen ultimately have the same end effect: the image looks smoother but doesn't feel any more responsive. The underlying tech is very different, but the outcome is the same.

You do you. I'd be happier if we both could enjoy a world where frame gen didn't need to exist.

I don't know that you're right that neither motion smoothing nor frame gen "feels any more responsive"; I'd like to hear more opinions on it (especially if there are any kooks out there who advocate for motion smoothing being on in games), but there are technical reasons why it should feel more responsive. Ultimately, the game is still polling controls at the tick rate and animation rate it settled on in development, but your eyes are getting more motion information (even if artificial) in a single second than before, and the sensation of trajectory and change over time conceivably should give players sharper sensory input for their reflexes. Especially with frame gen, where the actual game code is helping to direct the in-between frames, there's literally more motion on screen to react to.
 
Last edited: