Today in “making videogames is hard” news: Respawn’s journey to track down a bug that caused months of audio issues in Apex Legends. Grenades that don’t explode, guns that don’t shoot, damage that has no source, and months of agonizing investigation—all apparently caused by a single line of code added in Apex Legends’ Season 16 update.

As outlined in a thorough Reddit post by Respawn community manager Amy Thiessen, the trouble began at the start of Season 16 in February. The studio had started getting reports of “disappearing nades” in Apex. Respawn soon determined that grenades weren’t “disappearing” exactly, but they would sometimes fail to explode despite damaging players.

“This had not occurred during our Season 16 playtesting, could not be reproduced internally after initial reports, and was very difficult to pin down using live gameplay videos as the root cause was not always shown in the player’s POV,” the post reads.

Respawn got a better handle on the problem after receiving similar reports about missing gun sound FX and particle effects. “After a preliminary investigation, the primary suspect was found to be the system our servers use to dispatch ‘start’/’stop’ commands for various effects (e.g. certain sounds, particle systems, physics impacts, bullet tracers, explosions).”

Essentially, something was happening during a match that could overload the server’s limit for sound FX or particles, causing some sounds and FX to get dropped.

“From there, the theory was that something may be flooding this engine limitation, requesting thousands of effects every second!” the post says. “But was this a systemic issue or could it be a single entity acting up? Every season update comprises thousands of changes to assets, code, script, and levels. Which meant finding a needle in a haystack.”
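
To make that failure mode concrete, here’s a minimal sketch of how a per-frame effects budget like the one described might behave on a server. Everything in it is an assumption for illustration’s sake: the class, the names, and the 256-request cap are invented, not Respawn’s actual code or engine limit.

```cpp
// Illustrative only: a server-side FX dispatcher with a hard per-frame
// budget. Once the cap is hit, further start/stop requests are silently
// dropped, which players experience as missing sounds and effects.
#include <cstdint>
#include <cstdio>

constexpr uint32_t kMaxEffectsPerFrame = 256; // invented cap; real limit unknown

struct EffectRequest {
    uint32_t entityId; // which entity (player, grenade, gun) owns the effect
    uint32_t effectId; // which sound/particle/tracer asset to play
    bool     start;    // true = "start" command, false = "stop" command
};

class EffectDispatcher {
public:
    void BeginFrame() { dispatchedThisFrame = 0; }

    // Returns false when the request falls past the budget and is dropped.
    bool Dispatch(const EffectRequest& req) {
        if (dispatchedThisFrame >= kMaxEffectsPerFrame) {
            ++droppedTotal;
            return false;
        }
        ++dispatchedThisFrame;
        (void)req; // network send to clients elided
        return true;
    }

    uint64_t droppedTotal = 0;

private:
    uint32_t dispatchedThisFrame = 0;
};

int main() {
    EffectDispatcher dispatcher;
    dispatcher.BeginFrame();
    // Flood a single frame with 400 requests; everything past 256 is lost.
    for (uint32_t i = 0; i < 400; ++i) {
        dispatcher.Dispatch({i, 1, true});
    }
    printf("dropped this frame: %llu\n",
           (unsigned long long)dispatcher.droppedTotal); // 144
}
```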

Respawn turned to metrics to help suss out the problem, but nothing in the telemetry indicated a clear issue, suggesting the bug was a unique situation its systems had not previously seen.

“This left us with a complex issue that we knew was impacting our community, but was hard to reproduce despite detailed reports, had minimal leads internally, and there were no metrics to prove definitively that this limit was being hit at all.”

Where do you go from there? Respawn decided to test its theory of overloaded effects by intentionally breaking Apex Legends servers. The team spun up a test build and spawned 50 characters that all fired guns at the same time and infinitely used abilities to push the server effects load over the edge. It worked—the team could finally reproduce audio drops similar to the bug reports, but how it was happening to actual players was still a mystery.

“This gave us proof that FX would get dropped, but only with completely unrealistic test cases. Various aspects of our server performance were investigated, but nothing definite was found.”

Respawn kept a close eye on the issue as Season 16 raged on. The team eventually noticed that dropped audio reports tended to come from high-level play. This gave them the idea to deploy a server update that let Respawn track new metrics in a smaller subset of matches, which instantly led to a breakthrough.

“As the server update was finalizing, we found it. A single line of code was identified to be the root cause of the issue. Season 16’s new weapon.”

That weapon is the Nemesis, Apex’s newest burst-fire energy assault rifle. The Nemesis has a unique mechanic where dealing damage will “charge” the gun and make it shoot faster (as demonstrated by YouTuber Dazs). This charging effect is represented visually by arcing electricity within the barrel. Respawn says a line of code meant to tell this effect to “stop” while the gun wasn’t charged was actually repeating indefinitely for every player holding a Nemesis in their inventory, even when the weapon was holstered.

“This means that every single player with an uncharged Nemesis would create a ‘stop particle’ effect on the server every frame, and this line of code was being called even when the weapon was holstered.”

Funnily enough, this also explains why the audio drops were happening more often in high-level play. “14 clients with a Nemesis running at 180 fps would be enough to cause FX to begin being dropped.” For once, it was the top-spec PC players who had a disadvantage.
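
To picture the pattern Respawn describes, here’s a hypothetical reconstruction: a per-frame update that re-issues “stop” whenever the gun is uncharged, next to a fixed version that only fires on the charged-to-uncharged transition. The class and names are my invention, not Respawn’s code, but the arithmetic uses the article’s own numbers: 14 clients at 180 fps works out to 14 × 180 = 2,520 redundant requests per second.

```cpp
// Hypothetical reconstruction of the bug pattern, not Respawn's code.
#include <cstdio>

struct NemesisWeapon {
    float chargeLevel  = 0.0f; // dealing damage "charges" the gun
    bool  wasCharged   = false;
    int   stopRequests = 0;    // counts "stop particle" dispatches

    // BUG (as described): runs every frame for every player carrying a
    // Nemesis, even holstered, and re-sends "stop" whenever uncharged.
    void UpdateBuggy() {
        if (chargeLevel <= 0.0f) DispatchStop();
    }

    // FIX: send "stop" once, on the charged -> uncharged transition.
    void UpdateFixed() {
        const bool charged = chargeLevel > 0.0f;
        if (wasCharged && !charged) DispatchStop();
        wasCharged = charged;
    }

    void DispatchStop() { ++stopRequests; /* server dispatch elided */ }
};

int main() {
    NemesisWeapon buggy, fixed;
    for (int frame = 0; frame < 180; ++frame) { // one second at 180 fps
        buggy.UpdateBuggy();
        fixed.UpdateFixed();
    }
    // One idle player: buggy = 180 requests/s, fixed = 0. Fourteen such
    // clients at 180 fps is 14 * 180 = 2,520 requests per second.
    printf("buggy: %d, fixed: %d\n", buggy.stopRequests, fixed.stopRequests);
}
```

The fix is the classic edge-triggered versus level-triggered distinction: dispatch on the state change, not continuously while the state holds.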

Respawn says this also explains why its internal testing didn’t encounter the bug.

“The builds used for testing might not have had enough holstered Nemesis in play, had a rarer correlation with missing FX, or didn’t have enough clients at that fps—something for us to keep in mind and improve on for future testing.”

A patch deployed last week finally squashed the bug for good. And there you have it—a meddlesome audio bug with a complicated root cause, and an investigation that will, in the long run, help Respawn catch similar bugs before they reach players. Respawn concluded the post with an aside about testing, reminding players that “a minute of players playing Apex is the equivalent of 10 testers playing the game for a year!”



Intel’s Core i3, i5, and i7 branding has been around for well over a decade now, starting with the first Nehalem chips back in 2008. It’s been through a few iterations since then, namely the introduction of a higher Core i9 tier back in 2017, but its biggest shake-up might be on the way with next-gen Meteor Lake chips.

Intel might be ditching the ‘i’ from Core i3, Core i5, etc., and in some cases replacing it with the word “Ultra”.

The rumours first started when an Ashes of the Singularity benchmark showed up with a Meteor Lake chip in testing called the Core Ultra 5 1003H (via Videocardz). That’s a bit of a weird name even without the Ultra stuffed in there, but this doesn’t look like a desktop chip: the specs listed in a SiSoft benchmark database entry spotted by BenchLeaks suggest a mobile processor.

The chip is listed at 45W with 18 cores and 18 threads, which would be an oddity for a chip built around Intel’s new hybrid architecture. Now AotS could be throwing out a red herring here, but it would be pretty weird to have a CPU with six Performance-cores (P-Cores) and six Efficient-cores (E-Cores), as Intel currently groups E-Cores into four-core clusters. One supposition is that it’s actually four E-Cores, with the extra two threads coming from two SoC-tile cores. That’s possible because Meteor Lake will be Intel’s first disaggregated design, built from chiplets (or tiles) rather than a single monolithic die, but it’s also possible that the way Intel divvies up E-Cores has changed with the coming generation.
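
For what it’s worth, the thread count works out the same either way, assuming P-Cores keep Hyper-Threading (two threads each) while E-Cores and any SoC-tile cores run a single thread apiece; that’s my assumption, not confirmed spec. A quick check:

```cpp
// Back-of-the-envelope thread math for the two rumoured layouts, assuming
// two threads per P-Core (Hyper-Threading) and one per E-Core/SoC core.
#include <cstdio>

int threads(int pCores, int eCores, int socCores) {
    return pCores * 2 + eCores + socCores;
}

int main() {
    printf("6P + 6E        = %d threads\n", threads(6, 6, 0)); // 18
    printf("6P + 4E + 2SoC = %d threads\n", threads(6, 4, 2)); // 18
}
```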

As for the new naming, I had first thought it to be just internal Intel parlance. But that doesn’t seem to be the case.

Bernard Fernandes, Intel’s director of global communications, has confirmed that there will be some changes to CPU naming conventions with the coming Meteor Lake generation. Intel just hasn’t said what they are yet.


It’s very likely then that what we’ve seen in those early benchmarks will end up being the new naming convention for Intel’s coming Meteor Lake CPUs. That should mean we’ll see the ‘i’ dropped when Meteor Lake lands.

Notably, the ‘Core’ bit and the numbering system appear to be staying the same.

With AMD also using the same numbering system for its Ryzen CPUs, it’d be odd for Intel to entirely rip up the rulebook and start again. With this new nomenclature, it’s looking more like Intel’s naming will directly align with that of AMD’s Ryzen CPUs, with a Core 7 an easy point of comparison for a Ryzen 7.

Meteor Lake is set for release in the second half of this year, and Intel’s CEO Pat Gelsinger has confirmed that these chips are already in production. But don’t get your hopes up for these chips powering your gaming PC this year, or ever. Meteor Lake is expected to launch mobile-first, with possibly a small number of low-end desktop chips to round off the generation. For gamers, it’s the rumoured Raptor Lake Refresh that’s of most interest.




I did not enjoy sinking my teeth into Redfall prior to launch day. That’s partially just because I didn’t gel with its co-op shooter/immersive sim mashup (see our review in progress for more on that), but also: It was running like butt. I’m happy to report that now, at least on mid to high-end hardware, that picture has changed.

Things were looking bleak just a few days ago. On my pretty respectable i5 12600K, RTX 3070 rig, I was averaging in the mid-50s fps as a baseline on 1440p medium, with DLSS performance mode enabled. While grinding through some early missions with fellow PC Gamer editor Tyler Colp, this generally low performance was compounded by everyone’s favorite new PC gaming quirk, shader compilation stutters. Both my CPU and GPU utilization were around 40%, and as a cherry on top, all the leaves in this foliage-dense New England town were constantly flickering and shimmering.

Tyler saw a generally more performant 4K 60 fps on the still-mighty RTX 3080 Ti, but still experienced general visual weirdness and stuttering. Hot on the heels of two rough PC ports in The Last of Us and Jedi: Survivor, I thought it was curtains for Redfall. “Here we go again,” I said aloud, to my cat. “Reddit’s gonna crucify this game, and Digital Foundry’s gonna deliver the eulogy.”

When I sat down today to reconfirm my findings, however, Steam started downloading a massive 70 GB update for Redfall before I could do anything. After a long wait, the game’s now in a much more acceptable state. At 1440p with a mix of high and medium settings on the DLSS quality setting, I was comfortably in the 80s-90s fps tooling around the open world and getting into firefights. I found the shimmering foliage to be pretty well addressed by upping the level of detail setting, though the shader compilations are still a stutterin’. Tyler’s had much the same experience: stutters still present but not as debilitating, and foliage shimmering fixed via higher LoD.

If you saw leaks of Redfall gameplay earlier on launch day, that’s not necessarily the full story: on my mid-range, relatively up-to-date setup, the game is currently getting perfectly adequate performance with this day one patch. Whether the game’s worth playing once you close the hood and get to it is something I’ll leave to our full review, but if you were hoping for a new Arkane classic… you may want to keep hoping. Tyler puts it bluntly in his review-in-progress: “40 hours with Redfall has me wishing the vampires would win.”

If you’re playing Redfall, let us know how it’s performing in the comments. We’ve only been able to test it on Nvidia GPUs so far.



As you probably know by now, the PC version of Star Wars Jedi: Survivor has not launched in the best of conditions, with many players reporting choppy performance in Respawn’s latest Star Wars adventure. Publisher EA has already issued a sort-of apology for the problems, and has promised more patches coming in the next few weeks.

One of those patches landed today, which apparently provides “Performance improvements for non-raytraced rendering.” That’s literally all the patch notes say, though. There’s no specific information about what’s been fixed, or what’s been causing all the problems in the first place. But this is the second update EA has released in the space of a few days, and Morgan reported that the first patch alleviated some of the issues he experienced while reviewing it, so let’s hope this second patch further smooths things out.

 


The announcement also lists a bunch of bugfixes coming to consoles tomorrow, but as EA points out, the PC has already had those. It doesn’t make any mention of other issues raised with the PC version, such as the terrible implementation of AMD FSR 2.0 upscaling, or whether there’s any chance of getting some DLSS action up in here.

Chances are Respawn and EA have further improvements coming in the next few weeks. But it would have been better if all this could have been avoided, especially since the game underneath it all is genuinely very good. This isn’t a Cyberpunk situation, where the flaws stretch beyond the technical and into the core game design. For what it’s worth, the Steam reviews have improved from “Mostly Negative” to “Mixed”, which suggests things are a little better than they were before the weekend.

Perhaps the problems would smart less if Survivor wasn’t the latest in a spate of rough PC ports, including the technical mess that was Forspoken, and the outright disaster that was the PC version of The Last of Us. It’s difficult to pin down what exactly the industry’s problem is with the PC at the moment, but there certainly seems to be something troublesome in the water.




PC gamers have really been getting the rough end of the stick lately with lacklustre ports and buggy releases. One of the latest disappointments is the release of Star Wars Jedi: Survivor, which has been plagued by performance issues. It’s not just a matter of older hardware struggling with a new game either, as even powerful PCs have had trouble. All this has earned the game a “mostly negative” rating on Steam.

Over on YouTube, PureDark has uploaded a video showcasing a modded version of the game running at much better framerates. They’ve implemented a DLSS Frame Generation mod, which according to the video evidence brought their game up from 45 to 90 actual fps. That’s a marked improvement over what many are seeing, judging by Steam’s ticked-off review section.
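
That 45-to-90 jump tracks with how frame generation works in principle: one generated frame is slotted between each pair of rendered frames, so the displayed framerate roughly doubles while responsiveness stays tied to the underlying render rate. Here’s a rough sketch of that arithmetic, using PureDark’s reported figures as inputs:

```cpp
// Rough frame generation arithmetic: displayed fps doubles, but input
// latency is still bound to the real rendered framerate.
#include <cstdio>

int main() {
    const double renderedFps  = 45.0;              // PureDark's reported baseline
    const double displayedFps = renderedFps * 2.0; // one generated frame per real frame
    const double frameTimeMs  = 1000.0 / renderedFps;
    printf("displayed: %.0f fps, rendered frame time: ~%.1f ms\n",
           displayedFps, frameTimeMs); // 90 fps, ~22.2 ms
}
```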

PureDark’s modded solution is still far from perfect. It has visual artefacts, isn’t fully featured when it comes to camera info, and isn’t currently available to the public. This is more an experiment than anything else, though PureDark says in a comment under the video, “I had a breakthrough and [am] now trying to replace FSR2 with DLSS, that would make the image look much better.”

While the mod seems to do a decent job mimicking Nvidia’s DLSS 3, it’s not the real deal and it was never intended to be. However, for folks packing an RTX 40-series card and still getting poor performance in Star Wars Jedi: Survivor, it paints a frustrating picture.

That this mod makes such a big improvement to the game’s performance heavily implies official DLSS 3 support would make a real difference. Of course, that would only benefit the owners of compatible cards and, as the frame generation tech only works with the 40-series, that’s a small pool of owners. But it’s a pool that paid a premium for some of the best GPUs on the planet, and tends to expect better performance out of its games.

It’s likely Respawn hasn’t included DLSS 3 support due to the game’s AMD sponsorship. While Jedi Survivor doesn’t support Nvidia’s upscaling, it does work with AMD’s FSR 2, but that doesn’t appear to be helping much either.

Star Wars Jedi: Survivor has been patched since launch, but it still has some egregious issues to fix. EA has released a pseudo-apology citing high-end hardware running Windows 10 as one of the bigger issues, but that’s never really posed a problem before, and given the wide range of people on all sorts of hardware who are reporting problems, it doesn’t sound quite right. Especially when a modded DLSS 3 implementation appears to make so much difference.

