AMD ran an RX Vega GPU event in Budapest today, but reports from the show suggest it provided few answers to lingering questions about performance. The setup was two systems running Battlefield 1 (Sniper Elite 4 has also been mentioned) at 3440×1440 ultrawide: one said to be an RX Vega on a FreeSync monitor, the other an Nvidia GTX 1080 using G-Sync (both panels reportedly 100Hz).
There are no accounts of frame-rate counters being shown, or of any other significant performance gauge beyond human observation. In fact, that seems to have been the point of the exercise. AMD's intent with this show appears to have been to demonstrate that 'you can't tell the difference between Vega and a 1080 in a blind-ish test'.
It sounds like there were visible differences, though. In one report on Reddit, the poster says three out of three observers felt one system was performing noticeably worse (at an estimated mid-50s framerate, according to some present).
Not a whole lot to go on there, and much of it second-hand. Still, we can infer a few things if we assume the basic facts (GPUs and resolution used, etc.) are accurate. If one of the two cards was struggling slightly with Battlefield 1 at 3440×1440, it's unlikely to have been the 1080 (unless AMD were up to some shenanigans). That would put Vega's performance (in that game) behind Nvidia's card, but close enough that AMD were convinced the comparison was worth making.
If the apparent results are somehow flipped, and Vega is the one a little ahead of the 1080, that's still pretty consistent with the performance extrapolations people have made from benchmarking the Vega Frontier Edition. Expectations for RX Vega appear to have settled on 'somewhere near a 1080', be that slightly better or worse (or, quite likely, different depending on the game).
One possible wrinkle in these conclusions is that there are likely to be several Vega variants, including a cut-down version. I doubt AMD were demoing that scaled-back variant today, but if they were, it would raise expectations for the 'standard' RX Vega.
Unless AMD are playing an utterly bizarre marketing game here, their reluctance to share real performance figures, or to shout about how their card compares with Nvidia's upper-tier releases, speaks volumes. By most accounts now, RX Vega is going to offer comparable performance to the year-old 1080 (and fall far short of the 1080 Ti) with far greater power draw.
The only way to make that an attractive purchase is to price it significantly lower than a vanilla 1080. Implying that users will find their discount in only having to invest in a FreeSync rather than a G-Sync monitor (according to one unsourced comment out of Budapest) will not be good enough. It now looks more and more like Vega will have to come in cheap at the 30 July SIGGRAPH event, or risk a flop.