While the official launch of World of Warcraft: Dragonflight is still about two weeks away, the next phase of the Dragonflight pre-patch is launching today in North America and early tomorrow in Europe. This new patch brings the Dracthyr Evoker class to players who’ve pre-ordered Dragonflight, and for those who haven’t, there are several other non-expansion changes showing up in game.
Here’s what you’ll need to know about what’s new today and exactly when this Dragonflight pre-patch launches:
When does the Dragonflight pre-patch unlock?
Blizzard has specified that World of Warcraft maintenance began at 7:00 am PT (10 am ET) and will end at 3:00 pm PT (6 pm ET) on November 15 for players in North America. It will begin at 3:00 am CET (2 am GMT) on November 16 for the EU region.
Here’s when maintenance is scheduled to end in North America and Europe:
3:00 pm PT, November 15
6:00 pm ET, November 15
10:00 am GMT, November 16
11:00 am CET, November 16
This is Phase 2 of the Dragonflight pre-expansion patch, bringing in even more features and changes than the first patch phase from October.
Can you play a Dracthyr in the Dragonflight pre-patch?
If you’ve pre-ordered the Dragonflight expansion, you can create your first Dracthyr Evoker after the pre-patch launches. That means you’ll have access to play around with the extensive Dracthyr character creator for your human and dragon forms. You’ll start off with an Evoker at level 58 in the new race’s starting zone, the Forbidden Reach.
(Image credit: Blizzard)
What else is new in the Dragonflight pre-patch?
For players who haven’t pre-ordered the expansion, there are other changes coming to the main game in the November 15 patch. Here’s what to expect:
New dungeon Uldaman: Legacy of Tyr in the Badlands of the Eastern Kingdoms
New Primal Storms: quests and world events preceding Dragonflight
Levelling changes: lower level-up requirements, particularly for higher-level characters
If you haven’t played since the October pre-patch, here’s the rest of the cumulative changes so far in the Dragonflight pre-expansion patches:
Accessibility features for spellcasting, interactions, and gamepad support
Rated Solo Shuffle: a PvP brawl offering seasonal rewards and achievements
Class and race combos: Rogue, Priest, and Mage are now available to all races
Blizzard will also track known issues with the pre-expansion patch on its support website.
Even if you haven’t pre-ordered the expansion, make sure to take a look at our list of things to do before Dragonflight launches so you’re prepped for the Dragon Isles.
Phil Spencer really wants regulators to stop hassling him about Call of Duty, apparently. Speaking on a recent episode of The Verge’s Decoder podcast, the Xbox boss straightforwardly said that Microsoft is open to a “longer term commitment” with Sony to keep COD on PlayStation, in the event that Microsoft’s $68 billion acquisition of Activision Blizzard goes ahead.
Microsoft’s plan to gobble up Activision Blizzard has hit a few regulatory rough patches in recent weeks. At this very moment, the acquisition is undergoing in-depth, “Phase 2” scrutiny from both UK and EU regulators, each of which has voiced concerns about the potential for Microsoft to foreclose competitors’ access to Call of Duty. Even Brazilian regulators, who waved the acquisition through, acknowledged a risk to Sony’s access to COD; they just didn’t think it was their problem.
The “idea that we would write a contract that says the word forever in it” is “a little bit silly,” said Spencer, but he’d have no problem at all making a “longer term commitment that Sony would be comfortable with, regulators would be comfortable with”.
Anticipating the fine-tooth comb that lawyers and audience nitpickers (like me) would take to that statement, Spencer continued, “Sony does not have to take Game Pass on their platform to make that happen. There’s nothing hidden. We want to continue to ship Call of Duty on PlayStation”.
If Spencer sounds a bit exasperated, it’s probably because he’s said some variant of “COD will stay on PlayStation” so many times that he must be sick of hearing himself say it. Still, in fairness to Sony, well-meaning public statements don’t mean much when there are billion-dollar franchises on the line.
When Microsoft initially offered to keep COD on PlayStation for three years after the end of Sony’s current agreement with Activision, Sony declared the offer “inadequate on many levels”. A longer-term legal agreement could be much more to Sony’s liking and, Microsoft hopes, the liking of regulators as well.
The three-way back-and-forth between Sony, Microsoft, and national regulators has been going on for some time now, and has given us a lot of insight into the business arrangements that underpin titanic series like COD. A more cynical man than I might suggest that Sony’s caterwauling over access to the series is a little hypocritical, given that it apparently has a deal with Activision that’s keeping the series off Game Pass. Besides, Microsoft argues that it’s not a big deal anyway, since what if future games are as mediocre as Vanguard was?
We’ll find out more about Activision’s future as the slow wheels of regulatory scrutiny turn. Both the EU and UK investigations are likely to conclude in early-to-mid 2023, and we’re still awaiting word from the US Federal Trade Commission on the matter. Spencer might have to find some new ways to say that COD isn’t going anywhere before we’re through with this thing.
16-year-old Ed main EndingWalker was participating in only his second in-person Street Fighter tournament on record, but you wouldn’t think it with how he acquitted himself.
The young gamer tore through the winners bracket at Ultimate Fighting Arena this past weekend, overcoming highly-regarded, full-time professional players. After going 3-0 against Valmaster’s Chun Li in grand finals, EndingWalker simply walked off into the crowd, unperturbed.
Okay, with a lot of prodding from the enthusiastic audience, EndingWalker sheepishly cracked a smile and returned to the stage to enjoy some well-deserved applause for his standout tournament run, as well as accept his medal and grand prize of around $3,000. EndingWalker later tweeted that he was “overwhelmed” with the result, and that “reality hadn’t kicked in.”
A perfectly reasonable response to doing something really cool in front of an enthusiastic crowd, and it’s really just the cherry on top that it was more of a “well, now what do I do” response instead of an ice cold esports flex.
EndingWalker has been cultivating a bit of a wunderkind reputation in the Street Fighter 5 community, taking home first place at a score of online tourneys in the past year. At his first in-person appearance at VSFighting X in August, EndingWalker placed third.
“Me walking off the stage after grand finals because reality hadn’t kicked in XD. I had an amazing time at the venue. The staff were super friendly & helpful, as was everyone! I just got a bit overwhelmed in the end, which is why I left kinda quickly after winning. I had a great time!” (EndingWalker, November 13, 2022)
Aside from his grand finals sweep against Valmaster, EndingWalker’s winners finals matchup against Punk is definitely worth a watch. Punk is a prolific pro Street Fighter player, one of the highest-ranked in North America, and EndingWalker’s successful showing against him speaks to the young champ being the real deal. EndingWalker also mentioned that his set against Angrybird’s Zeku was a new favorite. You can catch it here, at the beginning of this full top 16 to top 8 run.
UK-based FGC commentator Tyrant goes so far as to call EndingWalker the “future of UK Street Fighter,” so, you know, no pressure. If you’re interested in tracking the up-and-comer’s progress, you can follow EndingWalker on Twitch and Twitter.
Call of Duty: Warzone 2 launches this week, bringing a new map, a new mode, and loads of new features to the CoD battle royale. Although Activision is still officially calling it “Warzone 2.0,” don’t be fooled. The new Warzone is in fact a new game that won’t replace the original one, which will live on under a new name. The relationship between old and new—and Activision’s 2.0 name—has made launch week feel a bit muddy, but we’ll sort out all the dates and times you need to know right here.
The quick and dirty details you need to know about Warzone 2 are that it’s still a battle royale but includes the new extraction-style DMZ mode, the new map Al Mazrah, proximity chat, underwater combat, and a new gulag.
Here’s what you need to know about Warzone 2’s launch week and what’s happening to old Warzone when it arrives.
When does Warzone 2 launch?
Call of Duty: Warzone 2 launches at 10 am PT on Wednesday, November 16. Activision is planning to take the current Warzone down for maintenance two hours prior to that, so there will be no Warzone for about two hours on November 16. Here’s when Warzone 2 comes online:
10:00 am PT
1:00 pm ET
6:00 pm GMT
7:00 pm CET
5:00 am AEDT, November 17
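If you want to double-check those conversions for your own region, here’s a minimal Python sketch using the standard-library zoneinfo module (Python 3.9+). The IANA zone names are just illustrative stand-ins for the regions listed above, not anything Activision publishes.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Warzone 2 launch: 10:00 am PT on November 16, 2022
launch = datetime(2022, 11, 16, 10, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Illustrative IANA zones for the regions listed above
zones = {
    "ET": "America/New_York",
    "GMT": "Europe/London",
    "CET": "Europe/Paris",
    "AEDT": "Australia/Sydney",
}

for label, name in zones.items():
    local = launch.astimezone(ZoneInfo(name))
    print(f"{label}: {local:%I:%M %p, %B %d}")
```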
Preloading for Warzone 2 has already begun as of Monday, November 14. You can start prepping for the launch by downloading it from Steam or Battle.net. Warzone 2 has its own page on Steam, but on Battle.net it’s packaged alongside Modern Warfare 2. Be prepared for a chunky 125GB download. If you’re a Modern Warfare 2 player on Battle.net looking to make room for Warzone 2, click the gear icon on the game’s page and select “modify install” to uninstall parts of the game you’re not playing (deleting co-op and campaign frees up around 30GB).
What happens to old Warzone?
Call of Duty: Warzone will go offline at 8 am PT (4 pm GMT) on Wednesday, November 16 and will relaunch as Call of Duty: Warzone Caldera at 10 am PT (6 pm GMT) on Monday, November 28.
The original CoD: Warzone isn’t actually going away. Which is confusing, given that Activision has been officially referring to the new game as Warzone 2.0. But no, it isn’t replacing the old one. The Warzone you’ve been playing up until now will go offline on November 16 and, around ten days later, come back under a new name, Call of Duty: Warzone Caldera. All existing guns and cosmetics will persist in Warzone Caldera, though the Rebirth Island and Fortune’s Keep maps will not be returning.
Just a couple of days before its launch on November 16, Call of Duty: Warzone 2 has arrived on Steam and launched a pre-load option for the game. But there’s a notable catch.
If you don’t own Modern Warfare 2, you should be clear to pre-load Warzone 2.0 on Steam. It’s a sensible 23GB at the moment, but I have a feeling that could expand with some further update at launch.
But players who do own Modern Warfare 2 on Steam are reporting some problems downloading the update, apparently because Warzone will arrive within Modern Warfare 2 as the “Season 01” update, which isn’t available yet. When you click the big green “Pre-Load Game” button on Steam, it launches Modern Warfare 2 in Steam, seemingly because both games will be rolled together into a single platform, accessible on the “Call of Duty HQ” menu. I don’t know when the Season 01 update will be pre-loadable, but I’ve emailed Activision Blizzard to ask, and I’ll update this page if I receive more information.
Also worth noting: the listed system requirements for Warzone on Steam are identical to those of Modern Warfare 2. That makes it less clear exactly how much hard drive space the installed game will actually take up. The system requirements page lists 125GB, but the Warzone pre-load is much smaller than that. Over on Battle.net, the “Base Game” and Warzone sum to a 17.4GB download.
If this is a bit confusing, we might actually have it better off than our console comrades, where it sounds like tracking down the Warzone 2 PS5 preload, or the Xbox one, is a bit of a byzantine process.
“Can anyone else on Steam not preload Warzone 2.0?? I clicked the checkmark for it in my DLC section for MW2 and it just says it’s 24 bytes big, which I know is bullshit since they said already it’s 100GB.” (November 14, 2022)
“Anyone else unable to preload Warzone 2 on Steam? It just boots up MW2 for some reason and won’t download Warzone 2. #mw2 #warzone2 #ModernWarfare2” (November 14, 2022)
“Wait, I don’t understand @charlieINTEL, so that means what I downloaded just now (MW2 pack 45.7GB + WZ2 pack 2.9GB) is all there is? Not even up to 50GB? Or will there be an additional download come the 16th?” (November 14, 2022)
(Image credit: Activision)
This marks Warzone’s debut on Steam. The game launched on PC via Battle.net in March 2020, but ahead of Activision Blizzard’s expected acquisition by Microsoft, the company has been returning to Steam over the past month: it also released Crash Bandicoot 4: It’s About Time on Steam in October, Activision’s first game on the platform since 2019.
A final note about Warzone on Steam: Warzone will actually block you from pre-loading the game until you have the newest drivers installed, which I had to do before it would let me start.
I’m slightly baffled by all these Batman projects that don’t have Batman in them. There’s a TV show about the early life of Batman’s butler, Alfred. 2019’s Joker film was Batman-free. October’s disappointing Gotham Knights is full of heroes, but Batman himself only shows up for the first five minutes. I know wringing every last drop from expanded superhero universes is the trendy thing right now, but I can’t bring myself to care about Batman stuff that doesn’t have Batman in it.
To quote Batman: “I’m Batman.” If you’re going to make a new Batman thing, there should be more Batman in it, not less.
So I was excited to see a new Batman game on the horizon: Batman Rogue City. And amazingly, it’s a full conversion mod for Doom 2 starring the Dark Knight himself and loads of his familiar enemies.
Hard to believe, but being a mod for a game that’s nearly 30 years old doesn’t stop Rogue City from looking like one hell of a Batman game. In the mod you can punch crooks, use a grappling hook and stun gun, battle bosses like Mr. Freeze, Mad Hatter, Harley Quinn, and the Joker, and even drive the Batmobile through Gotham City. All in the Doom engine.
The trailer takes us through Arkham Asylum, where once again the inmates are running the prison. Mad Hatter makes an appearance on monitors inside Gotham’s stock exchange where businessmen have been tied up by goons in rabbit masks. And then Batman appears, punching goons in a maze of cargo containers, pulling himself through the air with his quick-firing grapnel, and flinging his iconic batarang around.
We get a glimpse of Nora Fries in cryostasis in Mr. Freeze’s lair (followed by an angry Freeze himself) and the Batmobile speeding across a bridge firing machine guns and blowing up the Joker gang’s purple and green cars. The Gotham skyline even shows Wayne Tower and the bat signal projected on the clouds, and the trailer ends with a confrontation with the Joker and Harley. What more could you want from a Batman game?
Batman Rogue City isn’t out yet, but the creator says a public beta will be released “soon.” So keep an eye on its page at Moddb for the bat signal.
In the meantime, consider checking out Batman Doom, a very good 1999 mod from the Chilean development group that eventually became ACE Team, the folks behind Zeno Clash and Clash: Artifacts of Chaos.
It’s almost a complete sweep for AMD for the best graphics card title right now. That’s not for a lack of excellent Nvidia GPUs over the past year or more; it’s simply down to what your money will get you nowadays. And that’s a lot more when you buy an AMD GPU.
There are heaps of cheap graphics card deals out there, and almost all of them are AMD graphics cards going for much less than their original asking price. That’s why you’ll see some cards we didn’t love all that much at launch now making it into this list of the best around.
These may be budget cards on offer, but they’re also the least likely to be made redundant by a next-generation GPU anytime soon. And they’re plenty good enough for 1080p and 1440p gaming.
But we are also getting closer to that time of year when the best graphics cards might not be the best graphics cards for much longer. In fact, Nvidia’s RTX 4090 has already made mincemeat of most of the RTX 30-series in performance, albeit for a massive price tag, and soon we’ll see AMD’s RX 7900 XTX and RX 7900 XT join the fray.
It’s important to note that with new GPUs coming soon, you can expect more performance for less. That means the high-end RX 6950 XT and RTX 3090 Ti probably aren’t your best bet right now, as your money should get you much more performance in a matter of months. More budget-conscious GPUs like the RTX 3060 or RX 6600 XT aren’t likely to be replaced by shiny new cards immediately, however, so these are the cards we’re currently recommending. Well, one of them anyways.
Best graphics card
(Image credit: Future)
The most absurdly powerful gaming GPU ever
Specifications
CUDA cores: 16,384
Base clock: 2,235MHz
Boost clock: 2,520MHz
TFLOPs: 82.58
Memory: 24GB GDDR6X
Memory clock: 21GT/s
Memory bandwidth: 1,008GB/s
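That TFLOPs figure isn’t magic, by the way: single-precision throughput is just shader count × 2 FP32 operations per clock (a fused multiply-add counts as two) × clock speed. A rough sanity check, assuming the card runs at its rated boost clock:

```python
cuda_cores = 16_384          # RTX 4090 shader count
boost_clock_hz = 2_520e6     # 2,520 MHz rated boost clock
ops_per_core_per_clock = 2   # one fused multiply-add = 2 FP32 ops

tflops = cuda_cores * ops_per_core_per_clock * boost_clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")  # -> 82.58
```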
Reasons to buy
+ Excellent gen-on-gen performance
+ DLSS Frame Generation is magic
+ Super-high clock speeds
Reasons to avoid
– Massive
– Ultra-enthusiast pricing
– Non-4K performance is constrained
– High power demands
There’s nothing subtle about Nvidia’s GeForce RTX 4090 graphics card. It’s a hulking great lump of a pixel pusher, and while there are some extra curves added to what could otherwise look like a respin of the RTX 3090 shroud, it still has that novelty graphics card aesthetic.
It looks like some semi-satirical plastic model made up to skewer GPU makers for the ever-increasing size of their cards. But it’s no model, and it’s no moon, this is the vanguard for the entire RTX 40-series GPU generation and our first taste of the new Ada Lovelace architecture.
A hell of an introduction to the sort of extreme performance Ada can deliver.
On the one hand, it’s a hell of an introduction to the sort of extreme performance Ada can deliver when given a long leash; on the other, it’s a slightly tone-deaf release in light of a global economic crisis, one that makes launching a graphics card for a tiny minority of gamers feel a bit off.
But we can’t ignore it for this guide to the best GPUs around simply because, as it stands today, November 2022, there’s no alternative to the RTX 4090 that can get anywhere close to its performance. It’s unstoppable, and probably will stay ahead of the pack as AMD sees its two new RDNA 3 cards as more RTX 4080 16GB competitors.
This is a vast GPU that packs in 170% more transistors than even the impossibly chonk GA102 chip that powered the RTX 3090 Ti. And, for the most part, it makes the previous flagship card of the Ampere generation look well off the pace. That’s even before you get into the equal mix of majesty and black magic that lies behind the new DLSS 3.0 revision designed purely for Ada.
Look, it’s quick, okay. With everything turned on, with DLSS 3 and Frame Generation working its magic, the RTX 4090 is monumentally faster than the RTX 3090 that came before it. The straight 3DMark Time Spy Extreme score is twice that of the big Ampere core, and before ray tracing or DLSS come into it, the raw silicon offers twice the 4K frame rate in Cyberpunk 2077, too.
There’s no denying it is an ultra-niche ultra-enthusiast card, and that almost makes the RTX 4090 little more than a reference point for most of us PC gamers. We’re then left counting the days until Ada descends to the pricing realm of us mere mortals.
In itself, however, the RTX 4090 is an excellent graphics card and will satisfy the performance cravings of every person who could ever countenance spending $1,600 on a new GPU. That’s whether they’re inconceivably well-heeled gamers, or content creators not willing to go all-in on a Quadro card. And it will deservedly sell, because there’s no other GPU that can come near it right now.
A great card for driving a high refresh rate 1080p screen
Reasons to buy
+ Very quiet
+ Excellent build quality
Reasons to avoid
– Small gains vs the 6600 XT
– PCIe 8x limitation
AMD’s Radeon RX 6600 XT launched with its heart set on toppling Nvidia’s RTX 3060 Ti, or at least getting close to it. Sadly, it wasn’t quite there at the time, and with a price tag of $379 it became a hard sell. However, AMD later followed up that launch with the RX 6650 XT, which is a moderately (slightly) faster version of the RX 6600 XT. What’s even better is that nowadays the RX 6650 XT is dramatically cheaper than its Nvidia competition and well under its MSRP. Hallelujah!
So now AMD’s RX 6650 XT is by far the better buy than anything it’s up against. AMD’s affordable GPUs have matured like fine wine. Which is absolutely classic behaviour for AMD.
The AMD Radeon RX 6650 XT features the Navi 23 GPU; one of AMD’s second-gen Navi chips based on the seriously impressive RDNA 2 architecture. It’s another TSMC N7 (7nm) graphics processor, but the smallest of all the current-gen AMD GPUs. Despite its small scale, though, it packs in more transistors than even the most powerful of the first-generation Navi cards, the Navi 10 chip at the heart of the RX 5700 XT.
When it comes to speeds and feeds we’re talking about a chip with 32 Compute Units (CUs) at its heart, with a total of 2,048 stream processors. The same as the RX 6600 XT. However, it’s capable of hitting a max clock speed of a blistering 2,635MHz, which is wildly quick by anyone’s standards.
AMD’s RX 6650 XT is by far the better buy than anything it’s up against.
Thanks to that speed, it confidently beats Nvidia’s cheapest Ampere GPU, the RTX 3060, across the board. That’s important, as it’s actually competing with, and consistently beating, the RTX 3060 on price.
The best showing for the newest Navi GPU is at its 1080p target resolution. That’s generally been the case for AMD’s cards compared to their Nvidia competition this time around. It’s plenty capable at 1440p, too, though you might want a bit more power if you intend to operate a 4K screen.
The RX 6650 XT is definitely one to look out for in the lead up to Black Friday and during the busy shopping period. It’s already going cheap, yes, but at under $300 this would be an absolute steal. Even right now it’s going for as little as $260, so fingers crossed it gets even cheaper still.
There’s a certain level of pomp and excitement that comes with every major architectural overhaul, though perhaps we’re not giving enough love to what comes after. Those more affordable graphics cards that actually bring that new technology to the masses are just as important, if not more so to many gamers eyeing up an upgrade. The AMD Radeon RX 6700 XT was the beginning of that journey for RDNA 2: The GPU with the grunt of a next-gen console for under $500.
We’re not talking the cheapest of chips here. The Radeon RX 6700 XT is still a high-end card by most counts, but its price tag is slipping into the more affordable end of the market by the week, and that’s high up on a list of things we absolutely love to see in 2022.
There’s more to the Radeon RX 6700 XT than a simple halving of silicon from AMD’s top chip, the Radeon RX 6900 XT. In some ways, sure, it’s a straight slice down the middle. The RX 6700 XT features 40 compute units (CUs) for a total of 2,560 RDNA 2 cores and is equipped with 64 ROPs—exactly half of the maximum configuration of the Navi 21 GPU—but the card comes with more than its fair share of memory and Infinity Cache.
Often quite a reasonable amount cheaper than the RTX 3060 Ti.
A headline feature of AMD’s RDNA 2 lineup has been bigger-than-thou memory capacities compared with rival GeForce GPUs, and the RX 6700 XT doesn’t buck that trend. There’s 12GB of GDDR6 packed onto this card: an attempt at what we optimistically call ‘future-proofing’. That’s greater VRAM capacity than the RTX 3060 Ti and RTX 3070, and a match for the RTX 3060 12GB.
With a price tag closer to the GeForce RTX 3070, yet performance between it and the GeForce RTX 3060 Ti, most often closer to the latter, the Radeon was a solid alternative but hadn’t been my first port of call for this sort of cash for a good portion of its life. That’s all changed now that it’s often quite a reasonable amount cheaper than the RTX 3060 Ti.
That said, I still think you could pick up an RTX 3060 Ti right now and be happy with it. If only because that card was possibly the best value proposition of the entire RTX 30-series, and is still something close to that (it’s all out of whack). If ray tracing doesn’t bother you and you’d prefer the extra memory, however, the RX 6700 XT is the better all-round GPU to buy right now.
It’s not been a great few years to buy into PC gaming or build your own machine. However, things are finally improving. Stock can still be hit or miss, but prices have started to become reasonable again.
That’s far more true of AMD’s graphics cards than Nvidia’s right now, and nowhere more so than in the budget lineup. The RX 6600 wasn’t super impressive to us at launch, considering it asked the same amount of cash as an RTX 3060 12GB but was often beaten by the green team’s card. However, it’s now much, much cheaper. It’s even cheaper than Nvidia’s RTX 3050, which makes it a much smarter buy than both cards.
It’s even cheaper than Nvidia’s RTX 3050.
There’s always the threat of Nvidia here: if it ever did drop its prices on the RTX 3060 12GB, that’d be the better pick. The AMD card comes with 4GB less VRAM, often lower gaming performance, and one of its more valuable features, FidelityFX Super Resolution, works cross-vendor anyway. However, we’ve not seen any sign of Nvidia doing much about its high prices right now.
The RX 6600 is plenty capable of 1080p gaming in the modern age, don’t fret about that. It’s built using AMD’s RDNA 2 architecture, which is still, just about, the top dog out of the Radeon camp. One day soon it’ll be replaced by the RDNA 3 architecture, but that’ll be only the high-end cards arriving at first—the RX 6600 has a lot of life left in it yet.
As a red team alternative to Nvidia’s high-end graphics cards, there have been few finer than the RX 6800 XT. A highly competitive card that comes so close to its rival, with a nominal performance differential to the RTX 3080, is truly an enthusiast card worth consideration for any PC gamer with 4K in their sights.
The RX 6800 XT was the first of AMD’s RDNA 2 GPUs to enter the fray, and while we’ve had plenty of other cards since, this is the one that shines brighter than most and makes the most sense financially. At least it does if you consider its MSRP, which it’s now often found going well under. Meanwhile, Nvidia’s RTX 3080 is rarely found for anything close to its original launch price.
We’re big fans of what AMD has managed to accomplish with the RX 6800 XT.
A key battleground for Nvidia and AMD this generation has been on the memory front—covering both bandwidth and capacity. The RX 6800 XT comes with 16GB of GDDR6 across a 256-bit bus for a total bandwidth of 512GB/s. That means AMD has Nvidia’s 10GB RTX 3080 on the ropes in terms of capacity but falls slightly behind in raw bandwidth to the RTX 3080’s 760GB/s.
AMD has an ace up its sleeve in throughput terms in the form of its Infinity Cache, which bolsters the card’s ‘effective bandwidth’ considerably. Some 1,664GB/s, by AMD’s reckoning—a 3.25x improvement over the RX 6800 XT’s raw bandwidth. In gaming terms, it means you’re looking at similar performance, despite the very different underlying technologies.
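The numbers line up, too. A quick back-of-the-envelope check, assuming the 16 GT/s GDDR6 data rate implied by the stated 256-bit bus and 512GB/s figure, plus AMD’s quoted 3.25x multiplier:

```python
bus_width_bits = 256
data_rate_gtps = 16                    # GDDR6 at an assumed 16 GT/s

raw_bandwidth_gbps = bus_width_bits / 8 * data_rate_gtps  # bytes per transfer * transfer rate
effective_gbps = raw_bandwidth_gbps * 3.25                # AMD's quoted Infinity Cache multiplier

print(raw_bandwidth_gbps, effective_gbps)  # -> 512.0 1664.0
```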
It’s a tough call between the RX 6800 XT and the RTX 3080, but the latter pips AMD to the post with the final touches à la RTX. That’s only really a consideration worth making if you’re assuming some sort of similarity in price between the two, which right now there often isn’t. The RX 6800 XT is often the far cheaper card, so the worse ray tracing performance and lack of DLSS really don’t mean as much as they once did.
AMD’s FidelityFX Super Resolution has also gained considerable momentum among developers and offers solid upscaling that’s worth enabling in supported games. The introduction of FSR 2.0 in Deathloop offers a tantalising glimpse of what the future holds, too, as more developers roll out the new and improved version.
The RX 6800 XT leaves AMD in an incredibly strong position going forward, delivering what is required to get the entire industry to take notice, and with a strong proposition to offer gamers instantly at launch. And it’s no surprise to hear the cooperation between Zen and RDNA engineers had a part to play in all this, too.
All of which is to say that AMD has evolved on what was already a promising architecture in RDNA and delivered it in a fantastic graphics card in the RX 6800 XT. It’s no less impressive just how swiftly AMD has achieved near performance parity with Nvidia. There’s still some way to go to claw back market share from the green team, but step one on RTG’s to-do list (build a high-end GPU) can be confidently checked off with the release of the RX 6800 XT.
The somewhat sensible ultra-enthusiast graphics card right now
Specifications
RDNA cores: 5,120
Base clock: 1,825MHz
Boost clock: 2,250MHz
TFLOPs: 23.04
Memory: 16GB GDDR6
Memory clock: 16GT/s
Memory bandwidth: 512GB/s
Reasons to buy
+ Occasional RTX 3090 performance…
+ …but much cheaper!
Reasons to avoid
– Can lag behind RTX 3080 at times
– Mediocre ray tracing performance
The RTX 3090 may have sat unchallenged at the top rungs of graphics performance at launch, but it wouldn’t be long until AMD rustled together a challenger in the RX 6900 XT, or ‘Big Navi.’ The RX 6900 XT hopes to knock Ampere’s finest from its perch on high and send it spiralling back down to Earth. And it gets kind of close, too, with 4K performance a little off the pace of the RTX 3090.
The issue at launch was that the RX 6900 XT couldn’t really match the RTX 3090 all the time, and in fact often slipped back to RTX 3080 10GB levels. However, that’s really not an issue anymore, as it’s pretty much always available for less cash than an RTX 3080 10GB.
So an occasional RTX 3090 competitor with heaps of GDDR6 memory for less than the price of an RTX 3080 10GB. The graphics gods are smiling upon us.
For that reason, it’s simply the better buy for any PC gamer, and even those with any ulterior motives of the pro-creator variety. That 16GB VRAM capacity comes in handy when you’re using your PC for more intensive creation tools.
For raw gaming alone, the RX 6900 XT is a cheaper alternative to the RTX 3090.
We used to also feel the RX 6800 XT was a much better choice than this, and that’s still somewhat true considering prices for both cards have come crashing down. But nowadays there’s not as much in it between them as there once was, which means you can safely make the leap to the RX 6900 XT without breaking the bank.
AMD has since released the RX 6950 XT to compete mano a mano with Nvidia’s more recent RTX 3090 Ti. Both of these cards are extreme in performance and price, demanding well over $1,000.
The thing is, at least in my opinion, I’d rather stick with the standard RX 6900 XT if it means saving $100 or even more. The speedier RX 6950 XT isn’t often found for as little as the RX 6900 XT, and there’s not enough in it between them to sway me. The RX 6950 XT is one to look out for if it drops in price, however. It very well might end up being a similar price to the RX 6900 XT once AMD’s next-gen RX 7000-series starts to arrive.
That’s the other thing you’ll want to consider. As I mentioned at the top of this article, the next gen is coming from AMD. The RX 7900 XT is technically cheaper than the RX 6900 XT was at launch, the RX 7900 XTX is bang-on the same price, and there’s no doubting they’ll both be a lot quicker. If you’re after pure performance, rather than a deal per se, then you might want to wait a little longer for those two RDNA 3 cards to arrive.
Every new GPU generation offers new features and possibilities. But rasterized rendering is still the most important metric for general gaming performance across the PC gaming world. Sure, Nvidia GPUs might well be better at the ray tracing benchmarks they more or less instigated, but when it comes to standard gaming performance AMD’s latest lineup can certainly keep pace.
It’s also worth noting that the previous generation of graphics cards do still have something to offer, with something like the GTX 1650 Super able to outpace a more modern RTX 3050 in most benchmarks.
We’re not saying you should buy an older card in 2022—AMD’s budget RX 6000-series is a much better deal today—but it’s worth knowing where your current GPU stacks up, or just knowing the lie of the land. But there is also the fact there will be gaming rigs on sale with older graphics cards over the next few days, and if they’re cheap enough they may still be worth a punt as a cheap entry into PC gaming.
We’ve benchmarked all the latest GPUs of this generation, and have tracked their performance against the previous generation in terms of 3DMark Time Spy Extreme scores. Where we don’t have the referential numbers for an older card we have used the average index score from the UL database. These figures track alongside an aggregated 1440p frame rate score from across our suite of benchmarks.
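We don’t publish the exact aggregation formula here, so treat the following as a hedged sketch of one reasonable approach: taking a geometric mean of per-game 1440p average frame rates, which stops a single outlier game from dominating the score. The game names and numbers are made up purely for illustration.

```python
from statistics import geometric_mean

# Hypothetical per-game 1440p average frame rates for one GPU
results_fps = {
    "Game A": 112.0,
    "Game B": 87.0,
    "Game C": 143.0,
    "Game D": 96.0,
}

aggregate = geometric_mean(results_fps.values())
print(f"Aggregated 1440p score: {aggregate:.1f} fps")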
(Image credit: Future)
MSRP list
Here’s a list of the manufacturer-set retail prices (MSRP), or recommended retail prices (RRP), for most of the latest graphics cards. For the most part, these are the set prices for the stock or reference versions of these cards, where applicable, and not representative of overclocked or third-party graphics cards, which may well be priced higher.
Nvidia
RTX 4090 – $1,599 | £1,699
RTX 4080 16GB – $1,199 | £1,269
RTX 3090 Ti – $1,999 | ~£1,999
RTX 3090 – $1,499 | £1,399
RTX 3080 Ti – $1,199 | £1,049
RTX 3080 – $699 | £649
RTX 3070 Ti – $599 | £529
RTX 3070 – $499 | £469
RTX 3060 Ti – $399 | £349
RTX 3060 – $329 | £299
RTX 3050 – $249 | £239
AMD
RX 6950 XT – $1,099 | ~£1,060
RX 6900 XT – $999 | ~£770
RX 6800 XT – $649 | ~£600
RX 6800 – $579 | ~£530
RX 6750 XT – $549 | ~£530
RX 6700 XT – $479 | ~£420
RX 6650 XT – $399 | ~£389
RX 6600 XT – $379 | ~£320
RX 6600 – $329 | ~£299
RX 6500 XT – $199 | ~£180
Graphics card FAQ
Which is better GTX or RTX?
The older GTX prefix is now used to denote older Nvidia graphics cards which don’t have the extra AI and ray tracing silicon that the RTX-level cards do. The RTX prefix was introduced with the RTX 20-series, and highlights which cards have GPUs that sport both the Tensor Cores and RT Cores necessary for real-time ray tracing and Deep Learning Super Sampling (DLSS).
Nowadays you’ll only find older 16-series GPUs with the GTX prefix attached, so it’s pretty much RTX all the way.
Is ray tracing only for RTX cards?
The RTX prefix is only used to denote cards which house Nvidia GPUs with dedicated ray tracing hardware, but they still use the same DirectX Raytracing API that Microsoft created, which is also used by AMD’s RDNA 2 GPUs and the soon-to-arrive RDNA 3 GPUs.
Intel’s Alchemist graphics cards also support ray tracing, though as more budget offerings you can’t expect super-high frame rates while it’s enabled. Otherwise Intel’s ray tracing acceleration is pretty good.
Is SLI or CrossFire still a thing?
If you were looking for maximum performance, you used to be able to run two cards in SLI or CrossFire. However, it’s become increasingly common for major games to ignore multi-GPU users completely. That includes all DXR games. There’s also the fact that fewer and fewer modern cards actually support linking two cards together.
So, no. It’s not a thing.
Do I need a 4K capable graphics card?
The obvious answer is: only if you have a 4K gaming monitor. But there are other things to consider here, such as what kind of games you play. If frame rates are absolutely king for you, and you’re into ultra-competitive shooters, then you want to be aiming for super-high fps figures. And, right now, you’re better placed to do that at either 1440p or 1080p resolutions.
That said, the more games that incorporate upscaling technologies, such as DLSS, FSR, and XeSS, the more cards will be capable of making a close approximation of 4K visuals on your 4K monitor, but at higher frame rates.
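To give a feel for why those upscalers help so much at 4K: they render internally at a lower resolution and reconstruct the final image. The per-axis scale factors below are the commonly cited defaults for the Quality, Balanced, and Performance presets, used here as illustrative assumptions rather than vendor-confirmed values for any one upscaler.

```python
# Internal render resolution for a 3840x2160 output at common upscaler presets.
# Per-axis scale factors are commonly cited defaults (assumed, not vendor-specific).
output = (3840, 2160)
presets = {
    "Quality": 1 / 1.5,      # ~67% per axis
    "Balanced": 1 / 1.7,     # ~59% per axis
    "Performance": 1 / 2.0,  # 50% per axis
}

for name, scale in presets.items():
    w, h = (round(d * scale) for d in output)
    pixel_pct = 100 * scale * scale
    print(f"{name}: renders {w}x{h} (~{pixel_pct:.0f}% of the output pixels)")
```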
What’s a Founders Edition graphics card?
The Founders Edition cards are simply Nvidia’s in-house designs for its graphics cards, as opposed to those designed by its partners. These are usually reference cards, meaning they run at stock clocks.
Briefly, for the RTX 20-series, Nvidia decided to offer Founders Editions with factory overclocks. That made it a little difficult to compare cards, since Founders Edition cards usually give us a baseline for performance, but Nvidia has since returned to producing them as reference designs.
Intel also offers something similar with its Limited Edition Arc Alchemist cards featuring its own in-house cooler design, as does AMD with its reference cards.
Three things go into choosing the best gaming laptop: performance, portability, and price. So, whether you need a colossal workstation or a sleek, ultra-thin gaming notebook, your future gaming laptop should find a way to fit perfectly into your daily life. As much as someone would love the best gaming PC, the reality is that some of us don’t have the room (or the need) for a full-size PC and monitor set-up.
Thanks to the constant CPU tug-of-war between AMD and Intel, there has never been a better time to shop for a gaming laptop. AMD’s Ryzen 6000-series processors and Intel’s 12th-Gen Alder Lake chips seem evenly matched on performance. On the GPU front, the Intel Arc mobile GPUs are starting to provide some healthy competition to Nvidia and AMD, at least for entry-level gaming laptops. If you’re looking for a gaming laptop with a bit more horsepower, Nvidia’s RTX 30-series mobile GPUs have been tough to beat despite AMD’s best efforts.
Consider picking up a gaming laptop with a speedy NVMe SSD (at least 512GB) if your budget allows it. Even if you’re opting for an entry-level GPU/CPU spec, SSDs provide a fast and reliable storage solution that also cuts down on game loading times. Some laptops even have high refresh rate displays, which could pique your interest if you’re a competitive gamer looking to hone your skills from practically anywhere.
Throughout the year, we test dozens of gaming laptops. The ones that make it to the list provide the best bang for your buck, offering the best balance of performance, price, and portability for your budget. We have our favorite gaming laptops as well as the best configs for each of our entries.
The latest spin of the Razer Blade 15 once again improves on one of the best gaming laptops ever made. It has the same gorgeous CNC-milled aluminum chassis as its predecessor, only this time it can house one of Nvidia’s latest RTX 30-series GPUs and an Intel 12th Gen Core i9 CPU.
We’ve played with the Razer Blade 15 Advanced with a 10th Gen Intel chip and RTX 3080 (95W) GPU inside it. And we fell in love all over again. We’ve also since tried out the larger Razer Blade 17 with an RTX 3080 Ti humming away inside it, alongside one of those tasty 12th Gen chips, and still stand bewildered by what will fit inside such a compact and neat chassis.
The Razer Blade 15 is the overall best gaming laptop on the market right now.
That said, you will get some throttling because of that slimline design, and even on the larger Blade 17 the battery life can be a little slim, but you’re still getting outstanding performance from a beautiful machine.
The Blade 15 isn’t the lightest gaming laptop you can buy, but five pounds is still way better than plenty of traditional gaming laptops, while also offering similar performance and specs. That heft helps make it feel solid too. It also means the Blade 15 travels well in your backpack. An excellent choice for the gamer on the go… or if you don’t have the real estate for a full-blown gaming desktop and monitor.
Keyboard snobs will be happy to see a larger shift key and half-height arrow keys. The Blade 15 Advanced offers per-key RGB lighting over the Base Model’s zonal lighting. Typing feels great, and I’ve always liked the feel of the Blade’s keycaps. The trackpad can be frustrating at times, but you’re going to want to use a mouse with this gorgeous machine anyway, so it’s not the end of the world.
One of the best things about the Blade 15 is the number of configurations Razer offers. From the RTX 3060 Base Edition to the RTX 3080 Ti Advanced with a 144Hz 4K panel, there’s something for almost everyone. It’s one of the most beautiful gaming laptops around and still one of the most powerful.
Whatever config you pick, we think the Razer Blade 15 is the overall best gaming laptop on the market right now, though you will be paying a premium for the now-classic design.
The new version of the Zephyrus G14 for 2022 impresses us once again with its well-balanced spec and excellent gaming performance. Seriously, this thing shreds through frames up to its 120Hz refresh rate, and it’s great for much more than gaming, too.
We checked out the version with AMD’s RX 6800S under the hood, though there is an option for an RX 6700S, for a chunk less cash. Arguably, that cheaper option sounds a bit better to us, as the high-end one can get a little pricey and close in on the expensive but excellent Razer Blade 14. It’s not helped much by its 32GB of DDR5-4800 RAM in that regard, though we do love having all that speedy memory raring to go for whatever you can throw at it.
At its heart is the AMD Ryzen 9 6900HS. That’s one of the top chips from AMD’s Ryzen 6000-series, but not its best and brightest—though you’re really fighting over boost clocks and not much else when it comes to the tippy top of the red team’s mobile processors anyway. It delivers eight cores and 16 threads of the Zen 3+ architecture, capable of boosting to 4.9GHz (which it actually does on occasion), so that’s more than acceptable in my book.
I’m heartily impressed with the G14’s gaming performance overall.
That GPU and CPU combo makes quick work of our benchmarking suite, however, and I have to say I’m heartily impressed with the G14’s gaming performance overall. That’s even without turning to the more aggressive Turbo preset—I tested everything with the standard Performance mode. It’s able to top the framerate of RTX 3080 and RTX 3070 mobile chips pretty much across the board, and while it does slip below the RTX 3080 Ti in the Razer Blade 17, that’s a much larger laptop with a much larger price tag.
One of my favorite things about the G14 is in the name—it’s a 14-inch laptop. The blend of screen real estate and compact size is a great in-between of bulkier 15- and 17-inch designs, and not quite as compromised as a 13-inch model can feel. But the big thing with the 2022 model is that the 14-inch size has been fitted out with a larger 16:10 aspect ratio than previous models’ 16:9 panels.
When it’s running smoothly, the G14’s high refresh and high-resolution panel also looks fantastic. Being such a bright and colorful IPS display on this model, you really get to soak in every detail.
One of the downsides with this machine is the battery life, which really isn’t the best while gaming—less than an hour while actually playing. You’ll get more when playing videos or doing something boring like working, but we do expect a bit more from a modern laptop. It’s not a deal-breaker, but definitely something you’ll want to bear in mind.
The G14 has lost that quality of being surprisingly cheap for what you get, too, even if you do get stellar performance out of it.
Perhaps one reason for that is the inclusion of 32GB of DDR5 RAM—16GB of that is soldered to the board, and the other 16GB attached via removable SO-DIMM from the underside of the laptop. That’s not cheap memory. DDR5 prices have hardly settled down since the memory standard was introduced last year, and 32GB is a bounty of high-performance memory by comparison to most gaming PCs today.
Overall, though, the G14 experience is a pretty easy and straightforward one. I didn’t run into any major issues with it over the couple of weeks I’ve had it, and for the few negatives I have with the design, Asus has offset them with heaps of positives. The cheaper models may be a better bet than the one we reviewed, however. The same chassis and great design but with a slightly more amicable price tag.
I am mighty tempted to push the Razer Blade 14 further up the list, simply because the 14-inch form factor has absolutely won me over. The Asus ROG Zephyrus G14 in the No. 2 slot reintroduced the criminally under-used laptop design, but Razer has perfected it. Feeling noticeably smaller than the 15-inch Blade and closer to the ultrabook Stealth 13, the Blade 14 mixes matte-black MacBook Pro style with genuine PC gaming pedigree.
The Razer style is classic, and it feels great to hold, too. And with the outstanding AMD Ryzen 9 5900HX finally finding its way into a Blade notebook, you’re getting genuine processing power you can sling into a messenger bag. And you’re now able to get your hands on the Blade 14 with the brand new Ryzen 9 6900HX chip at its heart, though in practice that has changed very little apart from offering some decent integrated graphics.
But add in some extra Nvidia RTX 30-series graphics power—now all the way up to an RTX 3080 Ti, but wear earplugs—and you’ve got a great mix of form and function that makes it the most desirable laptop I’ve maybe ever tested.
My only issue is that the RTX 3080 Ti would be too limited by the diminutive 14-inch chassis and run a little loud. So I would instead recommend the lower-spec GPU options, though if you’re spending $1,800 on a notebook, that feels like a lot for 1080p gaming. But you’re not buying the Blade 14 for outright performance above all else; this is about having all the power you need in a form factor that works for practical mobility.
The PC is all about choice, and Razer has finally given us the choice to use an AMD CPU in its machines.
The PC is all about choice, and Razer has finally given us the choice to use an AMD CPU in its machines, although it would be great if we had the option elsewhere in its range of laptops. It’s notable that we’ve heard nothing about a potential Blade 14 using an AMD discrete Radeon GPU alongside that Ryzen CPU. Ah well.
Forgetting the politics a second, the Razer Blade 14 itself is excellent, and is one of the most desirable gaming laptops I’ve had in my hands this year. Maybe ever. The criminally underused 14-inch form factor also deserves to become one of the biggest sellers in Razer’s extensive lineup of laptops. And if this notebook becomes the success it ought to be, then the company may end up having to make some difficult choices about what CPUs it offers, and where.
The choice you have to make, though, is which graphics card to go with. Sure, the RTX 3080 Ti is quicker, but it leaves a lot more gaming performance on the workshop floor. That’s why the cheaper RTX 3060, with its full-blooded frame rates, gets my vote every day.
The Legion 5 Pro proves that AMD is absolutely a serious competitor in the gaming laptop space. Pairing the mobile Ryzen 7 5800H with the RTX 3070 results in a laptop that not only handles modern games with ease, but that can turn its hand to more serious escapades too.
The QHD 16:10 165Hz screen is a genuine highlight here and one that makes gaming and just using Windows a joy. It’s an IPS panel with a peak brightness of 500 nits too, so you’re not going to be left wanting whether you’re gaming or watching movies.
The Legion 5 Pro really is a beast when it comes to gaming too, with that high-powered RTX 3070 (with a peak delivery of 140W it’s faster than some 3080s) being a great match for that vibrant screen. You’re going to be able to run the vast majority of games at the native 2560 x 1600 resolution at the max settings and not miss a beat. The fact that you can draw on DLSS and enjoy some ray tracing extras for the money all helps to make this an incredibly attractive package.
This is a lovely laptop to actually type on too. Folks have always gushed about how good Lenovo keyboards are; I always ignored them because I don’t trust people that are too into keyboards. But, I will admit, I think I get it. The rounded-bottom keycaps have a nice feel to them. Add that to the large 4.7 x 3-inch touchpad, and you’ve got yourself a lovely work laptop that plays games well. I wish other laptop makers would take advantage of the added space of larger gaming laptops and use it to make our lives easier.
The Lenovo Legion 5 Pro made me realize that Legion laptops deserve a spot at the top.
If anything knocks the Legion 5 Pro, it would have to be its rather underwhelming speakers and microphone combo. Anything with a hint of bass tends to suffer, which is a shame. The microphone was another surprising disappointment. My voice, I was told, sounded distant and quiet during work calls, which paired with a mediocre 720p webcam doesn’t make for the best experience. I will commend the Legion for fitting a webcam on a screen with such a small top bezel though—A for effort.
The Lenovo Legion 5 Pro made me realize that Legion laptops deserve a spot at the top, being one of the more impressive AMD-powered laptops we’ve gotten our hands on recently. From the bright, colorful screen to the great-feeling full-sized keyboard, the Legion 5 Pro has everything you want in a gaming laptop for a lot less than the competition manages.
The MSI GS66 is one hell of a machine: it’s sleek, slick, and powerful. But it’s not Nvidia Ampere power without compromise. MSI has had to be a little parsimonious about its power demands to pack something as performant as an RTX 3080 into an 18mm-thin chassis.
The top GPU is the 95W version, which means it only just outperforms a fully unleashed RTX 3070, the sort you’ll find in the Gigabyte Aorus 15G XC. But it is still an astonishingly powerful slice of mobile graphics silicon.
It can get a little loud, but thankfully, you have the benefits of all the Nvidia Max-Q 3.0 features at your disposal. This includes Whisper Mode 2.0, which will bring gaming down to barely audible levels, for when you want to be stealthy.
The GS66 also comes with an outstanding 240Hz 1440p panel, which perfectly matches the powerful GPU when it comes to games. Sure, you’ll have to make some compromises compared to an RTX 3080 you might find in a hulking workstation, but the MSI GS66 Stealth is a genuinely slimline gaming laptop.
It’s seriously thin.
It’s a shame laptop manufacturers are seemingly just content to use old chassis and cooling designs for their new Nvidia-based gaming laptops. Things are getting ever more thermally constrained when it comes to performance because new hardware is being dropped into old designs. This is the compromise with the GS66, and you have to be absolutely invested in having a low-profile gaming laptop to make it worth paying the price. And remember, that price is paid both in frame rates and in dollar bills.
Nvidia’s new suite of Max Q goodies help, although it’s a shame they are likely to be overlooked as manufacturers fail to offer consistent messaging about them and users aren’t necessarily going to go digging into exactly how they all work. But they do work, and this latest tranche of gaming laptops will absolutely be the best we’ve ever seen. Unfortunately, it might just be tougher than ever to figure out exactly which machine is right for you.
There’s a lot to love about the latest MSI GS66. This implicitly means there is also a lot that will frustrate. The overall machine is rather lovely—it’s seriously thin, especially for a gaming laptop, and comes with some seriously tasty internal specs too, but there are places where it feels like it might have benefitted from a little extra design time.
We loved Acer’s Predator Helios 300 during the GTX 10-series era, and the current generation Helios still manages to punch above its weight class compared to other $1,500 laptops. It may not be the best gaming laptop, but it’s one of the best value machines around.
The newest version of the Helios packs an RTX 3060 GPU and a sleeker form factor without raising the price significantly. It also has a 144Hz screen and smaller bezels, putting it more in line with sleek thin-and-lights than its more bulky brethren of the previous generation.
The only real drawback is the diminutive SSD, although the laptop has slots for two SSDs and an HDD, which makes upgrading your storage as easy as getting a screwdriver. You often need to get handy with the upgrades at this end of the price spectrum, and the Helios 300 is no different in that respect.
If you’re desperate for just that little bit of extra gaming performance and hang the sense of it—or the sound and fury of it—the Turbo button is Acer’s one-touch GPU overclocking feature. The Predator laptops have this simple feature designed to eke out as much extra gaming performance as possible.
In theory, it’s a neat feature, but much as we saw with the Predator Triton earlier this year, you really only get about a 1-3% increase in performance. It also makes your system run very hot and distractingly loud. Honestly, I really don’t think it’s worth it unless you’re truly militant about maximizing your frames per second. Or have a really good noise-canceling gaming headset.
But for its $1,500 sticker price, the 15-inch Predator Helios 300 provides high-end 1080p gaming performance at a mid-range price. Even without the dubious utility of the Turbo button. The design changes are small but smart (like the power cord moving to the back instead of the side, and room for three storage drives), and the price/performance ratio is great. And that all makes the Acer Predator Helios 300 a strong contender for one of the best gaming laptops around.
There’s absolutely no question you can buy a much more sensible gaming laptop than this, but there is something about the excesses of the ROG Strix Scar 17 that make it incredibly appealing. It feels like everything about it has been turned up to 11, from the overclocked CPU—which is as beastly as it gets—to the gorgeously speedy 360Hz screen on the top model. Asus has pushed that little bit harder than most to top our gaming laptop benchmarks.
And top the benchmarks of the best gaming laptops it does, thanks in the main to the GeForce RTX 3070 Ti beating away at its heart. This is the 150W version of Nvidia’s new Ampere GPU, which means it’s capable of hitting the kind of figures thinner machines can only dream of. You can draw on Nvidia’s excellent DLSS, where implemented, to help hit ridiculous frame rates, too. And if that’s not enough, you can also grab this machine kitted out with an RTX 3080 Ti.
The 17-inch chassis means the components have a bit more room to breathe compared to the competition too, and coupled with the excellent cooling system, you’re looking at a cool and quiet slice of gaming perfection. This extra space has allowed Asus to squeeze an optomechanical keyboard onto the Scar 17, which is a delight for gaming and more serious pursuits.
Importantly, all this power comes at a cost not only to the temperatures but also to the battery life. Sure, you’re not as likely to play games with the thing unplugged, but if you ever have to, an hour is all you get.
I’m not enamored with the touchpad either, while we’re nit-picking. I keep trying to click the space beneath it, and my poor, calloused fingers keep forgetting where the edges are. This particular model doesn’t come with a camera either, which is a glaring omission at this price, and there’s a distinct lack of USB Type-A ports for the unnecessary arsenal of peripherals I’m packing. There are a couple of USB Type-C ports around the back to make up for it, though, and I’m happy there’s a full-sized keyboard.
Such gripes are easy to overlook when Asus has managed to pack such an immense config in here. The frankly unnecessary 32GB of DDR5-4800 RAM and that 2TB SSD are awesome, but a bit over the top for most mortals.
For a machine with a 17-inch chassis, it doesn’t weigh the world, and it doesn’t need two power adapters to work to its full potential either. As a result, it doesn’t feel as much like a hulking desktop replacement as some we’ve seen. And that’s okay. Particularly when you crown a portable machine like this off with a 1440p, 240Hz IPS panel with a 3ms response time, which also does a smashing job of reducing glare.
While you could get a Lenovo Legion 5 Pro with its RTX 3070 for half the price, spending $2,999 on this Strix Scar config will put you ahead of the competition with very little effort. And sure, it’s not as stylish or as apt with ray tracing as the Blade 17, but there’s a good $1,000 price difference there. And for something that can outpace the laptops of yesteryear in almost every respect, I’d pay that price for sure.
One thing I love about Alienware is the company’s unrelenting confidence in its new products. Whether that’s boasting about a desktop being a “Benchmark Bruiser” or releasing one of the slickest OLED gaming monitors to date. So, when I get offered the chance to check out the new Alienware m17 R5 it has dubbed “the most powerful 17-inch AMD Advantage gaming laptop,” how can I pass that up?
Powering the Alienware m17 R5 is an AMD Ryzen 9 6900HX and Radeon RX 6850M XT combo that, on paper, seems like a slam dunk for Team Red. It’s also the first gaming laptop we’ve tested with an RX 6850M XT, and so I was stoked to see how it stacked up against laptops with an RTX 3070 Ti and even RTX 3080 Ti mobile GPUs.
The surprisingly bright 500cd/m² 4K display got a lot of use for streaming video; I appreciated seeing the detailed stress lines on Guenther Steiner’s face in the last season of F1: Drive to Survive in 4K on Netflix. Games look pretty good, though this display is better suited to professionals and creatives. So if you don’t fall into that category, you may be better off sticking with the 1080p display at 240Hz to save money and add time to your much-needed battery life.
This laptop also takes advantage of AMD’s suite of game-boosting technologies, such as SmartAccess Graphics, which automatically switches your display output between the Ryzen APU’s integrated graphics and the Radeon GPU, along with SmartShift Max, which automatically shifts power around depending on what app or game you’re using. The good thing about these features is that they just work without you having to mess with them. All nice features, especially if you’re bouncing between gaming and, let’s say, video editing and trying to eke out extra horsepower.
On the GPU side, the Alienware m17 R5 AMD Advantage model excelled at nearly all our gaming benchmarks at 1080p on mostly maxed-out settings. It hit triple-digit framerates in nearly all the games I played, with a Hitman 3 average of just over 200 fps.
If you value frame rate over resolution, the m17 R5 easily delivers over 100 fps in most games at 1080p. Even Cyberpunk was hitting around 128 fps (with FSR turned on). The display’s 120Hz refresh rate means you’ll run into little to no screen tearing. Because of the drastically higher frame rates, I played more games at 1080p than at 4K. If you’re playing a shooter like Apex Legends or Fortnite, that’s the way to go, which makes the 4K display a bit redundant much of the time.
The Alienware m17 R5 might not be the most powerful gaming laptop ever, but that isn’t through lack of trying. Even with CPU performance lagging behind some of its rivals, the m17 R5 makes up the deficit with impressive gaming results. The 4K display is great for anyone watching movies or working, but you could save a few hundred dollars by scaling down some of the more expensive components and score yourself a really solid 1080p gaming laptop.
What’s the most important gaming laptop component?
When it comes to gaming, the obvious answer is the graphics card, but that’s where things have gotten a little more complicated recently. With GPU performance now so dependent on cooling, you have to pay attention to what wattage a graphics card is limited to and what chassis it’s squeezed into.
As we said at the top, an RTX 3080 confined in an 18mm chassis will perform markedly slower than one in a far chunkier case with room for higher performance cooling.
Should I worry about which CPU is in a gaming laptop?
That really depends on what you want to do with your laptop. An 8-core, 16-thread AMD Ryzen chip will allow you to do a whole load of productivity on the road, but honestly, it will have little benefit in gaming. That’s one of the reasons Intel has launched its Tiger Lake H35 chips; they’re quad-core, 8-thread CPUs, but they’re clocked high to deliver high-end gaming performance when paired with something like the RTX 3070.
What screen size is best for a gaming laptop?
This will arguably have the most immediate impact on your choice of build. Picking the size of your screen basically dictates the size of your laptop: a 13-inch machine will be a thin-and-light ultrabook, while a 17-inch panel almost guarantees a hefty, workstation-class machine. At 15 inches, you’re looking at the most common gaming laptop screen size.
Are high refresh rate panels worth it for laptops?
We love high refresh rate screens here, and while you cannot guarantee your RTX 3060 will deliver 300 fps in the latest games, you’ll still see a benefit in general look and feel running a 300Hz display.
Should I get a 4K screen in my laptop?
Nah. 4K gaming laptops are overkill; they’re fine for video editing if you’re dealing with 4K content, but it’s not the optimal choice for games. The standard 1080p resolution means that the generally slower mobile GPUs are all but guaranteed high frame rates, while companies are slowly drip-feeding 1440p panels into their laptop ranges.
A 1440p screen offers the perfect compromise between high resolution and decent gaming performance. At the same time, a 4K notebook will overstress your GPU and tax your eyeballs as you squint at your 15-inch display.
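To put rough numbers on that, here’s a minimal back-of-the-envelope sketch (in Python, purely illustrative) comparing how many pixels each common laptop resolution asks the GPU to render every frame. Real-world performance won’t scale perfectly linearly with pixel count, but it shows why 4K is such a heavy ask for a mobile GPU.

```python
# Rough pixel-count comparison: GPU load scales roughly with pixels rendered.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x the work of 1080p")

# 1080p: 2,073,600 pixels, 1.00x the work of 1080p
# 1440p: 3,686,400 pixels, 1.78x the work of 1080p
# 4K: 8,294,400 pixels, 4.00x the work of 1080p
```

In other words, 1440p asks for under twice the work of 1080p, while 4K asks for four times as much, which is exactly the sort of load that pushes a power-limited mobile GPU out of its comfort zone.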
Where are the laptops with AMD graphics cards?
Your guess is as good as ours. A few gaming laptop SKUs offered the RX 5000-series cards, but they were thin on the ground. AMD has promised that RX 6000 “Big Navi” mobile GPUs are on their way to gaming laptops in the first half of this year, but so far we haven’t seen them in the labs.
What if, instead of regular old human hands, your meathooks were two wooden baseball bats? That’s the deep philosophical question Triband is looking to answer with its bat-themed sequel to 2020’s What the Golf?
What the Bat? takes the offbeat physics fun of its predecessor and throws it into VR, tasking you with completing all manner of jobs with your wooden club claws. Aside from being able to use them for their actual purpose of hitting balls, the game is stuffed with different toys and minigames to play around with. You can gently smack a chicken for its eggs and then attempt to crack them open with your wooden stumps. They also make for great smashing tools, breaking open piggy banks or launching objects through glass windows if that’s what you fancy.
My favourite task I spotted in the trailer has to be serving up some tasty fishy treats to a row of orange cats, their blank stares boring into my very soul as the bats frantically smack a lever, green fish slopping down onto the plate. It looks to be a daft bit of fun, the exact sort of game that makes me wish I had a VR headset to mess about with and have a good giggle at for a few hours. Considering Chris Livingston gave its predecessor 88 in his What the Golf? review, I have high hopes that Triband’s new game is going to be a good’un.
“It’s amazing how much architecture is done on hotel napkins,” AMD fellow Andy Pomianowski tells a room packed with press at AMD’s RDNA 3 launch event. It’s news to me. I had always assumed a liberal amount of whiteboard marker had been the go-to way to note down any forthcoming ideas. Yet RDNA 3’s chiplet architecture was actually first jotted down on a flimsy piece of paper in a hotel during an off-site staff meeting.
“We’re grappling with challenges. How do we provide the best product for our customers? We’ve had a lot of success in server and the desktop market, and the application of that technology to GPUs wasn’t obvious,” Sam Naffziger, corporate fellow at AMD, tells us.
“Mike [Mantor] and Andy [Pomianowski] had very aggressive targets, a lot of features and goals that we knew we could not meet in combination without doing something different.”
“So we were off at our staff off-site, and doing our part, being good, pretending we were engaged, but not all of the presentations were as engaging. There was one where we were sitting there thinking, my mind is working in the background, and just thinking through all of the technology challenges and the options. And so I started scratching out on a little hotel pad there, which no one usually uses but once in a while they come in handy.”
According to Naffziger, he jotted down something that would now be pretty familiar to any PC gamer that’s hot on the latest hardware: the plan for the chiplets within RDNA 3’s recently announced GPUs, the RX 7900 XTX and RX 7900 XT.
“So the GCD/MCD thing. I scratched out something remarkably like what we showed yesterday [at RDNA 3’s launch event] and it seemed a bet. So I slipped it over to Andy, and he sat there and he did one of his, you know, furrowed his brow, and said ‘I think that can work’.”
“Start with a napkin. Then it’s PowerPoint, and then the engineering teams just do it,” Pomianowski jokes.
If only it were that simple. The RDNA 3 architecture does involve just two chiplet types—the GCD and the MCD—but there’s a whole lot more to it than that would suggest.
Think of RDNA 3 as an amicable split between the graphics pipeline and the larger part of the memory subsystem.
The GCD is where the actual shader cores live—known as stream processors in AMD’s RDNA architecture. These are grouped into Dual Compute Units, not unlike RDNA 2, except with a new and improved multi-purpose ALU for better instruction throughput, an enhanced AI operation unit with the new Matrix Accelerator, and a larger Vector Cache. These upgrades and many others allow RDNA 3’s Dual CU to offer much improved clock for clock performance over last-gen—around 17.4%.
Eight Dual Compute Units share L1 cache within a Shader Engine. Six Shader Engines share L2 cache, a Geometry Processor, and a Graphics Command Processor. All of which lives within the GCD and is joined by the card’s PCIe Gen 4 silicon, Multimedia Engine, and Display Engine.
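Multiplied out as a quick sketch, that hierarchy gives you the full Navi 31 shader count. This assumes, as with earlier RDNA parts, 64 stream processors per Compute Unit; it’s an illustrative tally rather than an official breakdown.

```python
# Illustrative tally of the Navi 31 GCD hierarchy described above.
# Assumes 64 stream processors per Compute Unit, as in earlier RDNA parts.
STREAM_PROCESSORS_PER_CU = 64
CUS_PER_DUAL_CU = 2
DUAL_CUS_PER_SHADER_ENGINE = 8
SHADER_ENGINES = 6

compute_units = SHADER_ENGINES * DUAL_CUS_PER_SHADER_ENGINE * CUS_PER_DUAL_CU
stream_processors = compute_units * STREAM_PROCESSORS_PER_CU

print(f"Compute Units: {compute_units}")          # 96
print(f"Stream processors: {stream_processors}")  # 6144
```

Those figures line up with the fully enabled chip in the RX 7900 XTX; the RX 7900 XT uses a cut-down version of the same silicon.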
And that about wraps up a very top-level division of the GCD within the Navi 31 GPU. Yet some stuff is missing: Infinity Cache, for one, a key feature introduced back with RDNA 2, but also, crucially, the means by which the GPU communicates with the memory chips installed off-package on the graphics card’s PCB. You wouldn’t get very far in the latest games without access to a large memory buffer.
That’s where AMD’s using what’s called an MCD. This takes all the stuff usually stuck surrounding the Graphics Engine—the Infinity Cache and the GDDR6 memory interfaces—and boots them off to their own chiplet. Each MCD is much, much smaller than the GCD, but therein lies one of the benefits of this chiplet system.
Whereas the Navi 21 GPU found in the RX 6950 XT is 520mm², and the AD102 GPU in Nvidia’s RTX 4090 is a whopping 608mm², AMD’s GCD for Navi 31 is just 300mm².
Each MCD is only 37mm².
A lower chip size makes for higher yields. Higher yields should make for a much better supply picture.
“The smaller the die, the better the yield, and so it is, just from an economic standpoint, those are all very small, very, very good yield,” Laura Smith, corporate vice president, Graphics MNC and Product Management, tells me.
“If you put them all into one big die, then you’ll see, and you see it in all sorts of products, you need some redundant capabilities, because you’re going to have fallout.”
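To see why die size matters so much, here’s a minimal sketch of the textbook Poisson defect-yield model applied to the die areas quoted above. The defect density is an assumed, purely illustrative figure rather than anything AMD has published, and real yields also depend on redundancy and binning, but the trend Smith describes falls straight out of the arithmetic.

```python
import math

# Simple Poisson defect model: yield = exp(-defect_density * die_area).
# The 0.1 defects/cm² figure is an assumption for illustration, not an AMD number.
DEFECT_DENSITY_PER_CM2 = 0.1

def poisson_yield(area_mm2: float, d0: float = DEFECT_DENSITY_PER_CM2) -> float:
    """Fraction of dies expected to come out defect-free."""
    return math.exp(-d0 * (area_mm2 / 100.0))

for name, area in [("Navi 21 (monolithic)", 520), ("Navi 31 GCD", 300), ("MCD", 37)]:
    print(f"{name}: {area}mm², ~{poisson_yield(area) * 100:.0f}% defect-free dies")

# Navi 21 (monolithic): 520mm², ~59% defect-free dies
# Navi 31 GCD: 300mm², ~74% defect-free dies
# MCD: 37mm², ~96% defect-free dies
```

Under those assumptions, the big monolithic die loses roughly four in ten candidates to defects, while the tiny MCDs come out almost entirely clean, which is exactly the economics Smith is pointing at.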
I’d love to think this chiplet approach would have a desirable effect on the overall supply picture and thus trickle down to impact the prices and supply us gamers will actually see over at retailers after the initial launch fervour. A single chiplet that dramatically reduces die size while also being utilised across multiple products in AMD’s lineup could be a real winner in that regard, even if AMD isn’t targeting Nvidia’s top GPU in performance. It certainly worked for Ryzen, which employed a similar approach with its cIOD—a die that brought together all the uncore functionality of the processor under one roof and on an older process node.
The same point can be made for AMD’s RDNA 3 chips in regards to process nodes. The memory interface and the Infinity Cache weren’t set to benefit a whole lot from TSMC’s 5nm process node, so splitting them off from the core and manufacturing them on the cheaper 6nm node made more sense.
“When we are looking at chiplet design, we want to maximise it, which means we want to put the things that shrink well and get the benefits from the advanced and expensive technology nodes in that technology and the things that don’t get much benefit we can leave behind on old technology nodes,” Naffziger says.
Naffziger worked on AMD’s Ryzen chiplet approach—it was his “baby” for years—so it only makes sense that he’d be the one to think up the new way this technology could be applied to a gaming GPU. That also necessitated a new interconnect—GPUs are suckers for bandwidth—and that’s where AMD’s exciting Infinity Links come in.
But to think this all started on a scrap of paper in a hotel during a boring meeting. So think about that next time you’re sitting in a meeting listening to someone drone on about why your company has to turn off all the heating in the office this winter—you could dream up your next big breakthrough right there and then.