Thread: Intel Arc GPUs & XeSS - Their AI Super Sampling Tech

IrishWhiskey

 
Intel has recently announced more information about its gaming line of discrete GPUs, now branded "Arc" (AMD brands its GPUs as Radeon, Nvidia as GeForce).

Their new gaming GPU architecture is known as Xe-HPG (Xe High Performance Gaming); for comparison, Nvidia's current architecture is called Ampere, while AMD's is called RDNA. Similar to AMD's approach, Intel has a separate compute-focused architecture for its datacenter GPUs.

Their GPU family/line this round is code-named Alchemist.

Their top die is said to launch in the first half of 2022, with raster performance estimated at roughly RTX 3070 level (possibly slightly higher).

These new GPUs have hardware-accelerated Ray Tracing, although not much is known yet about its performance or how the RT hardware is architected.

The GPUs will be manufactured on TSMC's 6nm process.

The most interesting news though is the announcement of Intel's DLSS competitor: XeSS (Xe Super Sampling)

This is an AI-based temporal solution: a trained neural network that makes use of motion vectors and previous-frame data. While Intel has hardware acceleration for this, they are also planning to open-source their solution (at a slightly later date).

This means their solution should work (via a slightly different method, with slightly more overhead) on competitors' GPUs and even integrated graphics.
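For intuition, here is a toy 1-D sketch of the general idea behind temporal upscaling: motion vectors are used to reproject the previous high-resolution frame, which is then blended with a spatially upscaled current frame. This is only an illustration under my own assumptions, not Intel's actual algorithm; in XeSS the blend weighting would come from the trained neural network, whereas here a fixed blend factor stands in for it.

```python
# Toy 1-D temporal upscaler. NOT XeSS itself -- just the rough shape
# of a motion-vector-based temporal reconstruction pass.

def upscale_nearest(low_res, factor):
    """Naive spatial upscale: repeat each low-res pixel `factor` times."""
    return [p for p in low_res for _ in range(factor)]

def temporal_upscale(low_res, history, motion, factor, blend=0.8):
    """Blend the spatially upscaled current frame with the previous
    high-res frame, reprojected along per-pixel motion vectors."""
    spatial = upscale_nearest(low_res, factor)
    out = []
    for i, cur in enumerate(spatial):
        # Follow the motion vector back to where this pixel was last frame.
        src = i - motion[i]
        if 0 <= src < len(history):
            out.append(blend * history[src] + (1 - blend) * cur)
        else:
            out.append(cur)  # history miss: fall back to the current frame
    return out

# Static scene (zero motion): the accumulated history dominates the output.
low = [0.0, 1.0]
hist = [0.0, 0.0, 1.0, 1.0]
frame = temporal_upscale(low, hist, motion=[0, 0, 0, 0], factor=2)
```

With real images this runs per pixel in 2-D, and the hard part, deciding when reprojected history is stale (disocclusions, lighting changes), is exactly where the trained network would earn its keep.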

Intel demonstrated this technology in motion during their presentation and the results looked quite impressive.





This is amazing news for renewed competition in the market, and it looks like Intel is making an honest effort in the GPU space.

What's more, their XeSS looks to be a major boon for Intel and potentially a fantastic development for upsampling/reconstruction tech across the industry. If it is adopted widely, it could prove to be tough competition for DLSS.
 
More competition is always welcome, and I'm glad to see them bringing their own AI upscaling. DLSS in recent games is pretty amazing and the way to go, so I'm curious about their offerings here.

Especially price vs performance.
 
  • This tbh
Agreed, I am definitely interested to see where they land on performance for their GPUs.

I think they may actually have something pretty compelling/competitive in the mid-to-high-end market if their performance lands around RTX 3070 +10%.

I am always somewhat sceptical of Intel's claims these days, with good reason: their execution has been mostly bungled for the last few years, with AMD giving them a serious kicking in the CPU space. However, it sounds like they are beginning a turnaround as a company, so we could see them make a solid comeback in all sectors. Only time will tell.

Their discrete GPU division is essentially breaking new ground, although they do have some experience with integrated graphics, so they have at least some foundation to start from. They have Raja Koduri leading their GPU efforts; he was previously at AMD's Radeon Technologies Group and was chiefly in charge of the Vega product line.

A lot of people place what I think is an unfounded amount of blame on Raja for Radeon's stumbles during that period, forgetting that AMD was near bankruptcy and that Raja and his team likely had very little R&D budget to work with at the time. Now that he is at Intel with a much larger budget, it will be interesting to see if he can actually deliver on his vision/goals, as by all accounts he is a solid graphics engineer.

Of course, the Raja haters could be right and he could fumble out of the gate here, but for the sake of healthy competition in the market I really hope Intel has success with their Arc line of GPUs.

Where Intel will likely struggle initially is optimization and drivers. On the driver side, remember that they have at least some experience from their iGPU solutions, so they have a basic foundation to work from, but that foundation is rocky and incomplete at best. They will face an uphill battle here, and I think they will struggle particularly with older titles, as they will need to prioritize big hits, AAA releases, and recent games to best utilize their engineering time/resources.

It takes years and years to build up a good driver set, often with ad-hoc hotfixes and per-game hacks to improve performance. So I think Intel will struggle here initially, but over time they will develop a solid driver set to build from.

The real problem, though, is not just the drivers but the actual architectural optimizations in game engines/rendering pipelines. People often mistake this for "bad drivers", but in reality it has a huge effect on the final performance of games.

Most games from at least the Xbox 360 era onwards have been console-first, with engines built to optimize for console hardware, which has been pretty much all AMD for more than a decade. This means game engines/rendering pipelines are often optimized at least in some capacity for AMD hardware, which translates to good performance on AMD discrete GPUs, as they are often technologically similar to the consoles.

Granted, this wasn't always the case. With UE4, for example, the console engine was a different branch from the one used for PC ports, and the PC branch was missing the microcode optimizations for AMD. Other times the PC version was optimized for the GPU market leader, Nvidia, either due to sheer market share or sponsorship. But even in those cases the core engine logic/pipelines have at least some optimizations for AMD.

And with Nvidia being the dominant market player by a long shot, every developer has been trying very hard to optimize their PC games for Nvidia hardware, as that is where the lion's share of gamers are. So Nvidia has almost nothing to worry about on the optimization front.

Intel, I think, will struggle here, as no games right now are optimized for their architecture, and developers may be hesitant to build or optimize an engine for Intel GPUs while they are such an unknown in the market. This will be their biggest hurdle, and combined with their driver story it may make their GPUs appear to perform worse than the hardware is actually capable of. If they can hunker down and brave the initial storm while continuing to improve and striking deals with developers, I think they can definitely get better over time and carve out their own place in the market.

There is also a possibility that their architecture could be similar enough in how it is organized (at least partially) to either AMD's or Nvidia's, in which case the optimization story could by chance end up not as bad as we all think. But I wouldn't bet on it right now.

As far as supply is concerned, if Intel had its own fabrication execution in order, it could be a real contender here, introducing a deluge of supply to a constrained market. Unfortunately that is not the case: Intel will also be using TSMC. However, they will be on TSMC 6nm, which should have much more capacity than 7nm/5nm.

On top of that, unlike AMD, who have to produce millions of console SoCs, all of their consumer and server CPUs/GPUs, and their consumer discrete GPUs on TSMC 7nm, Intel will not face this contention. So we should see much more supply for their initial Arc lineup than we have seen from previous launches, although it will still likely not be "enough" to satiate the market.

Intel's final problem is that they are very late to market compared to Nvidia/AMD this round. We don't know the release cadence for their next GPU lineup, but if they want to compete they will need plans in place to iterate and release quickly, so as not to fall so far behind AMD's and Nvidia's next-gen releases.

All in all, I'm quite happy with what I'm seeing from Intel; pricing is the only open question. Silicon prices have gone up and the GPU market is crazy, so I wouldn't be surprised to see Intel charge a premium for these. On the other hand, they are a new player in the field with a currently poor performance reputation in the gamer/DIY market, so they may take that into account and undercut their competitors by a reasonable but not massive margin. Only time will tell, but I'm really happy to see more competition in the GPU market and hope Intel succeeds with this venture, for the betterment of all of us as consumers.
 
I was thinking about the whole driver topic as well. Seeing how many years AMD struggled to deliver drivers competitive with Nvidia's, I see a big challenge here for Intel.

In general, entering a market with two strong players who have been dominating the space like... forever is no easy undertaking. More power to Intel for stepping up, but I expect at least a couple of years of learning before they become a real alternative.
 
Intel Inside Arc
