this post was submitted on 22 Nov 2023
[–] [email protected] 1 points 11 months ago

To sum it up: AMD can totally add support for Anti-Lag+ if they do it properly for all GPUs.

[–] [email protected] 1 points 11 months ago (1 children)

Aside from the anti-lag stuff:

This is an excellent visual representation of CPU & GPU bottlenecks: what they are and how they occur.

Most people don't understand it well enough.

[–] [email protected] 1 points 11 months ago (1 children)

The problem is that it's not technically correct. The GPU can't bottleneck the CPU, because the pipeline goes one way: nothing prevents the CPU from drawing as many frames as it wants. What happens in most game engines is that the engine checks whether the GPU is ready to accept another frame and, if it's not, tells the CPU to chill a bit; but not all games do that properly. In NFS Undercover, for example, the rendering thread sticks to 100% at all times, at least it did last time I checked. Saying "GPU bound" is more technically correct, as "bottleneck" implies there is some other PC part further down the pipeline, which isn't the case for the GPU: the GPU is the last part. But I'm just being pedantic here; objectively, Battle(non)sense indeed does a great job at explaining things to people in simple terms.

[–] [email protected] 1 points 11 months ago (1 children)

It's more complicated than that. Yes, the physical pipeline ends at the GPU, since the frame just sits in the GPU until the OS is ready to present it, but the logical pipeline loops back to the CPU, since the CPU then moves on to the next frame in the render queue, which may or may not be available. Ideally it would simply be available, because the GPU has finished rendering that frame and the OS has finished presenting it, which gives the CPU free rein over it; but it may be in a present-pending state, where it's waiting for the OS to present it, or in a currently-rendering state, where the GPU is actively rendering it.

If the frame is in a currently-rendering state then the CPU cannot use that frame since that frame's resources are being actively used by the GPU and trying to access those resources leads to a very bad time, so the CPU has to try another frame. If the frame is in a present-pending state then the CPU can use it so long as vsync is disabled and screen tearing is acceptable, as that frame's resources aren't being actively used anymore and the OS generally allows reusing a present-pending frame (after all, that's why vsync is typically an option and not mandatory).

If the CPU is sufficiently far ahead of the GPU then it will always eventually hit a wall where it tries to use a currently-rendering frame, has no other frames it can use, and is forced to sit idle. If you're on newer APIs such as Vulkan or DirectX 12 then you can bypass this somewhat by using the mailbox presentation mode (not sure what the name is under DirectX 12, but that's the name under Vulkan) to at least tell the OS that you intend to ping-pong between two different frames in a triple-buffer setup, which lets the CPU ping-pong between those two frames while the GPU is busy rendering its currently-rendering frame. Things get exponentially more complicated under DirectX 12 and Vulkan, however, as the engine itself is now responsible for building and managing the render queue; the API/driver/OS just handles the presentation side of things.
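
For anyone curious, here's roughly what the mailbox bit looks like on the Vulkan side; a minimal sketch, assuming `physicalDevice` and `surface` already exist from the usual setup code:

```cpp
// Minimal sketch: prefer VK_PRESENT_MODE_MAILBOX_KHR if the surface supports it,
// otherwise fall back to FIFO (which every Vulkan implementation must support).
#include <vulkan/vulkan.h>
#include <vector>

VkPresentModeKHR pickPresentMode(VkPhysicalDevice physicalDevice, VkSurfaceKHR surface)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfacePresentModesKHR(physicalDevice, surface, &count, nullptr);
    std::vector<VkPresentModeKHR> modes(count);
    vkGetPhysicalDeviceSurfacePresentModesKHR(physicalDevice, surface, &count, modes.data());

    for (VkPresentModeKHR mode : modes) {
        if (mode == VK_PRESENT_MODE_MAILBOX_KHR)
            return mode; // newest pending frame replaces the older one, no tearing
    }
    return VK_PRESENT_MODE_FIFO_KHR; // classic vsync queue, always available
}

// Later, when filling in VkSwapchainCreateInfoKHR:
//   createInfo.presentMode   = pickPresentMode(physicalDevice, surface);
//   createInfo.minImageCount = 3; // mailbox only pays off with at least 3 images
```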

[–] [email protected] 1 points 11 months ago (1 children)

This raises some questions.

  1. What do you mean by "frame may not be available" for the CPU? I assumed the CPU creates frames. And then "CPU cannot use that frame". Did you mean to say "frame buffer"?
  2. What do you mean by "frame's resources"?
  3. Isn't "the wall" typically the render queue limit?
  4. I guess mailbox presentation mode is LIFO-queued triple buffering. What you described sounds like the CPU is filling frame buffers with some data that might or might not later be used by the GPU, but I assumed it's the GPU that creates and fills frame buffers with data. Are you sure it has anything to do with the CPU's job?
  5. In an unlocked-framerate, no-VSync scenario, when the GPU is at 99% usage, CPU usage drops in most games because the render queue is full. That is, however, not the case for some games, like NFS Undercover. How specifically does this process happen in such a scenario, i.e. what tells the CPU to wait instead of drawing more frames?
[–] [email protected] 1 points 11 months ago (1 children)

What do you mean by "frame may not be available" for the CPU? I assumed the CPU creates frames. And then "CPU cannot use that frame". Did you mean to say "frame buffer"?

I meant the render queue, of which the framebuffer/swapchain is part.

What do you mean by "frame's resources"?

In this case I mean GPU resources that the CPU may need to access. Think uniform buffers that pipe game state information to the shaders, textures that hold animations that update each frame, vertex/index buffers that hold mesh data that updates each frame, etc. Each frame typically has to be given its own set of these resources, so that the CPU updating the resources for frame N doesn't change or potentially corrupt the resources that the GPU is actively using for frame N-1.
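
To make the "own set of resources" part concrete, here's a stripped-down sketch in Vulkan terms; the member names and counts are illustrative, not any particular engine's layout:

```cpp
// Illustrative sketch: each in-flight frame owns its own copies of the resources
// the CPU rewrites every frame, so updating frame N never touches what the GPU
// is still reading for frame N-1.
#include <vulkan/vulkan.h>
#include <cstdint>

constexpr uint32_t kFramesInFlight = 2;

struct FrameResources {
    VkCommandBuffer commandBuffer;       // re-recorded by the CPU each time this slot comes around
    VkBuffer        uniformBuffer;       // per-frame game state piped to the shaders
    VkBuffer        dynamicVertexBuffer; // per-frame mesh data that changes every frame
    VkFence         inFlightFence;       // signaled by the GPU once it is done with this slot
};

FrameResources frames[kFramesInFlight];
uint32_t currentFrame = 0;

// Each new frame the CPU works on frames[currentFrame], then advances:
//   currentFrame = (currentFrame + 1) % kFramesInFlight;
```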

Isn't "the wall" typically the render queue limit?

Yes and no, depends on how well the CPU and GPU stay in sync with each other.

I guess mailbox presentation mode is LIFO-queued triple buffering. What you described sounds like the CPU is filling frame buffers with some data that might or might not later be used by the GPU, but I assumed it's the GPU that creates and fills frame buffers with data. Are you sure it has anything to do with the CPU's job?

Yes, since it basically lets the CPU bounce between two available/present-pending frames while it waits for a currently-rendering frame to clear. This way the CPU never sits idle, it's just constantly overwriting previously recorded command lists and previously updated resources that haven't been picked up by the GPU yet.

In an unlocked-framerate, no-VSync scenario, when the GPU is at 99% usage, CPU usage drops in most games because the render queue is full. That is, however, not the case for some games, like NFS Undercover. How specifically does this process happen in such a scenario, i.e. what tells the CPU to wait instead of drawing more frames?

Normally what tells the CPU to wait is the API/system call that presents the current frame and swaps to the next one in the render queue. In older APIs it's a lot more nebulous, so I can't tell you exactly why NFS Undercover does that, but my guess would be that the CPU and GPU are close enough that the render queue isn't exhausted quickly, or the API is detecting that some usage pattern lets the CPU access resources the GPU has in use at some places in the pipeline.
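
In Vulkan terms, that "tell the CPU to wait" step usually looks like a fence wait at the top of the frame; a minimal sketch, assuming the per-frame fence setup from the earlier snippet:

```cpp
// Minimal sketch of the "CPU chills until the GPU frees up a frame slot" step.
// Assumes each in-flight frame has a fence that the GPU signals when it finishes.
#include <vulkan/vulkan.h>
#include <cstdint>

void waitForFrameSlot(VkDevice device, VkFence inFlightFence)
{
    // Blocks the render thread here while this slot is still "currently rendering".
    // This is the point where CPU usage drops once you are fully GPU bound.
    vkWaitForFences(device, 1, &inFlightFence, VK_TRUE, UINT64_MAX);
    vkResetFences(device, 1, &inFlightFence);
    // From here on it is safe to overwrite this slot's command buffer and resources.
}
```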

[–] [email protected] 1 points 11 months ago (1 children)

Thanks for taking your time to explain all this!

[–] [email protected] 1 points 11 months ago

No problem. I left out some of the more complicated details and simplified others, so if you want to learn more I'd recommend looking into how Vulkan's command buffers, device queues, and fence/semaphore resources work, which are all part of the logical side of the render queue, as well as how Vulkan's swapchain works for the frame presentation side of the render queue. Vulkan and DirectX 12 both expose quite a lot of how the render queue works, so they can shed some light on what the driver has to do behind the scenes for DirectX 11 and OpenGL.
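
For a rough map of how those pieces fit together, here's a very compressed single-frame sketch; all handles are assumed to come from the usual Vulkan setup, and error handling is omitted:

```cpp
// Compressed sketch of one frame: acquire -> submit -> present.
// The semaphores order GPU work against the swapchain; the fence is what the
// CPU waits on before reusing this frame slot (see the earlier snippet).
#include <vulkan/vulkan.h>
#include <cstdint>

void drawOneFrame(VkDevice device, VkSwapchainKHR swapchain, VkQueue graphicsQueue,
                  VkCommandBuffer commandBuffer, VkSemaphore imageAvailable,
                  VkSemaphore renderFinished, VkFence inFlightFence)
{
    // 1. The swapchain hands the CPU an image index to render into.
    uint32_t imageIndex = 0;
    vkAcquireNextImageKHR(device, swapchain, UINT64_MAX,
                          imageAvailable, VK_NULL_HANDLE, &imageIndex);

    // 2. Submit the recorded command buffer to a device queue. The GPU waits on
    //    imageAvailable before writing color output, and signals renderFinished
    //    plus inFlightFence when it is done with this frame's work.
    VkPipelineStageFlags waitStage = VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT;
    VkSubmitInfo submit{};
    submit.sType                = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    submit.waitSemaphoreCount   = 1;
    submit.pWaitSemaphores      = &imageAvailable;
    submit.pWaitDstStageMask    = &waitStage;
    submit.commandBufferCount   = 1;
    submit.pCommandBuffers      = &commandBuffer;
    submit.signalSemaphoreCount = 1;
    submit.pSignalSemaphores    = &renderFinished;
    vkQueueSubmit(graphicsQueue, 1, &submit, inFlightFence);

    // 3. Presentation waits on renderFinished, then hands the image to the OS/compositor.
    VkPresentInfoKHR present{};
    present.sType              = VK_STRUCTURE_TYPE_PRESENT_INFO_KHR;
    present.waitSemaphoreCount = 1;
    present.pWaitSemaphores    = &renderFinished;
    present.swapchainCount     = 1;
    present.pSwapchains        = &swapchain;
    present.pImageIndices      = &imageIndex;
    vkQueuePresentKHR(graphicsQueue, &present);
}
```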

[–] [email protected] 1 points 11 months ago

Am I the only one who doesn’t really give a shit about these technologies? I’ve never noticed an input lag difference this small.

[–] [email protected] 1 points 11 months ago (1 children)

Latency added to the list of things Radeon users don't care about. List also includes upscaling, graphics, efficiency, crack software backing, being competitive

List of things they do care about: COD fps numbers, pretending like they get more value per dollar so they can sit on their high horse over the uninformed pleb, Su-bae

[–] [email protected] 1 points 11 months ago (2 children)

Things I don't care about: ray tracing (mostly; maybe in a few years it will be vital), frame generation (a misnomer, it's actually fancy motion blur). Things I do care about: upscaling (I use it in every game that features it, and generally think FSR2 looks pretty good), latency (Anti-Lag+ needs to be re-enabled ASAP), power usage (I used to care more, but I moved my PC into a cooler room).

Obviously we get more value per dollar... ah nm you're just a troll.

[–] [email protected] 1 points 11 months ago

This is some pretty cringe cope

[–] [email protected] 1 points 11 months ago

Obviously we get more value per dollar

Now consider EU electricity prices: the 7900 XTX can draw up to 100 W more than the 4080 (its same-tier competitor).

Let's say you use the card for gaming 20 hours a week (insane for some, extremely low for most gamers here), with 80 W of extra power draw and 0.40 € per kWh in Europe (Denmark, I think, easily hits 0.50 € per kWh, for example).

That's about 33 € per year more to run the XTX. Bump it to 40 hours a week (again, it depends on the user, but that's still far from what some play) and that's 66 € per year extra.
The XTX is like 80 € cheaper than the 4080 in my country at the moment, so it rather quickly becomes the more expensive card. And there's the idle draw bug, which to this day happens on my SO's system, where her desktop draws 100 W idle on a 7900 XT. I've never been so disappointed with a card (we didn't plan or want to buy the XT, but it was a spontaneous decision before Christmas to buy her a new GPU to enjoy some new games, and there wasn't much choice in the physical store).
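
For anyone double-checking the math: 80 W × 20 h/week × 52 weeks ≈ 83 kWh per year, and 83 kWh × 0.40 €/kWh ≈ 33 € per year; doubling the hours to 40 per week doubles that to roughly 66 €, which matches the figures above.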

[–] [email protected] 1 points 11 months ago (1 children)

I remember AntiLag+, it went away much faster than it came.

[–] [email protected] 1 points 11 months ago

Went away without lag

[–] [email protected] 1 points 11 months ago

AAH, wtf is this intro sound?

[–] [email protected] 1 points 11 months ago

First they force it when no one wants it, then they make it open so they can claim to care about users.
That's AMD's modus operandi lately.

[–] [email protected] 1 points 11 months ago

BattleNonSense is back?

[–] [email protected] 1 points 11 months ago

Can you still get a driver with Anti-lag+ enabled? If so, which one? I would like to use it in Cyberpunk.

[–] [email protected] 1 points 11 months ago

Wow, the big news actually is that Battle(non)sense is back!

This dude simply rocks for anything related to tech analysis, latency and such.

[–] [email protected] 1 points 11 months ago (3 children)

I don't really understand the tone. The guy seems to understand roughly how the rendering pipeline works and the pros and cons of each solution. By doing so, he sort of answers his own question of why AMD decided to go with this, so I don't understand why he sounds so dumbfounded about it.

I am not here to defend AMD, but working in the industry I can say that there are plenty of reasons why AMD did what they did: competitive, managerial, technical, budget and, of course, developer relations.

For one, it takes a lot less time to address this issue by developing a library that hooks into every game possible and makes it faster, without ever needing to talk to the developer. Is it a hack? Of course it is, and they know it. But it may have been necessary for AMD to give a quick and cheap answer to players in a competitive market. Their answer does come with lots of caveats, but it probably achieves 80% of the quality with 20% of the effort compared to Nvidia. Enabling it by default is bad, I guess, especially because it can break games. However, not all games are played competitively online or have anti-cheat software, so I don't understand either why the focus is just on CS and whatnot. Again, it should not have been the default, and it should come with a disclaimer in CAPS that the feature can result in a ban. Since they know which games those are, they could even have made it impossible to turn on the DLL for them. With that, I agree.

Back to the question. Gamers in general, and people doing these reviews, often downplay the magnitude of the work involved in creating a stable foundation. Let's say all of a sudden a company has to engage with 100 game developer companies about a "potentially new SDK prototype". It's not simply "hey, we developed this, use it". It takes time to first build something that's barely usable, understand and get feedback on how it can integrate into the developers' workflow, and ask developers to add another dependency and another level of testing, which incurs costs both for AMD/Nvidia and the developers. While all of that is happening you have your boss knocking on your door asking "why are we losing to the competitor?". That applies as much to mid-level management as to game developers.

AMD using the described method (injecting DLLs into the games' processes) is as terribly hacky as it is a good short/mid-term plan. Of course, that doesn't rule out the long-term plan of talking with developers about a proper solution.

At the end of the day I find it quite positive that we have at least 2 companies with 2 different strategies for the same problem. If anything, that means we learn with it and we have more variety to choose from.

[–] [email protected] 1 points 11 months ago

I agree, I didn't see any problem with the software at all; it just worked flawlessly in every game. The drivers have all sorts of things that just work on every title to improve the game. For me it sucks that they removed it. I don't play e-sports, but I can appreciate it would suck to get banned through no fault of your own. I think the game companies bear at least a little responsibility for such a draconian response, which ultimately was caused by people trying to cheat. Those people are the root cause of all these issues; we can't have nice things because someone will wreck it all. Yes, it could have been handled better in hindsight. Maybe games that can ban people for using cheats could implement a simple API to disable modifications. Cheaters do things at other people's expense, and now they have caused me to lose out when I'm not even playing the same game as them. I'm not blaming AMD for this.

[–] [email protected] 1 points 11 months ago

I work in software dev, not the games industry, but the approach really shocked me; it seems a bit too cowboy for a big company. What you said may make sense from a project manager's point of view, but I would have expected the developers to push back with concerns like:

  • It being easier for them to produce an SDK that can be offered to partners than to produce a library and find hook points in games to inject it into (though it might end up being a bit easier than this work being multiplied per game, as these hooks probably exist at the engine level)
  • Hooking into DLL function calls being an extremely fragile way to build software. I'd be surprised if any tech lead signed off on this approach.
  • Anti-cheats in the games they are trying to inject code into are designed to attest that the game's memory has not been modified and its integrity is maintained. How is our solution compatible with this?
    • Research task is created and developers / project managers reach out to partners to discuss.
  • Legal concerns: hooking into DLLs in most proprietary games, like any other modification, is often banned by the EULA.
    • Task created to reach out to legal.

On the communication side, I can see that being something that got missed; it often breaks down when there is far too much pressure to deliver. But most devs really care about the quality of their software, so it would surprise me if the technical counterpoints weren't raised. I imagine they probably were, but somewhere high up a decision was made to go ahead regardless.

[–] [email protected] 1 points 11 months ago

Customers don't care how it's made.

They care about the end result, and they're paying for it. If the result is worse than some other product, they sometimes don't feel like the money they spent got them the best value.

Options and competition are good. However, AMD doesn't seem keen on "competing" and would rather offer an alternative that costs less and is more niche, or worse, or whatever, for the part of the market that is just anti-number-one.

[–] [email protected] 1 points 11 months ago

It's quite sad that even after all his videos, he still has to remind people that running the GPU at 99% usage increases input latency, and the majority of PC gamers still say it's not true. What about not creating the problem in the first place, so you don't need all those Anti-Lag and Reflex features?

[–] [email protected] 1 points 11 months ago (2 children)

Maybe AMD should just sell off the Radeon brand. Their CPUs are the ones making the most money; the GPU division seems unneeded. Just leave GPUs to Nvidia.

[–] [email protected] 1 points 11 months ago

I'm not saying Radeon is much of a competitor right now, but the last thing I want is an actual monopoly.

[–] [email protected] 1 points 11 months ago (1 children)

Yeah just throw away the tens of millions of GPUs they sell to Xbox and Playstation every generation, why not?

[–] [email protected] 1 points 11 months ago (1 children)

Next-gen consoles can switch to Nvidia.

[–] [email protected] 1 points 11 months ago (1 children)

You know AMD makes the APUs, so they make both the CPU and GPU on a single chip, right? The only way for Nvidia to make console GPUs would be for them to partner with either AMD or Intel and co-design an APU, which would make consoles impossible cost-wise.

[–] [email protected] 1 points 11 months ago

Nvidia can make APUs as well; the Switch uses an Nvidia APU, and Nvidia is coming out with their ARM-based CPUs too.

Nvidia is bigger than AMD; if it weren't for the x86 monopoly Intel and AMD share, Nvidia could have made x86 CPUs that curb-stomp the Ryzen line.

[–] [email protected] 1 points 11 months ago (1 children)

Anti-Lag+ is cool tech and it works well. The problem was obviously the lack of communication with developers, which could have prevented it from triggering anti-cheat.

[–] [email protected] 0 points 11 months ago (1 children)

It's not an issue of communication. It's a fundamental flaw with their implementation which needs to be addressed.

[–] [email protected] 1 points 11 months ago (1 children)

It works as intended; it allows a very broad range of games to easily support it. But they absolutely need to make sure it doesn't trigger anti-cheat when it's turned on.

[–] [email protected] 1 points 11 months ago

No, AMD didn't intend to get users banned and for the feature to be useless for multiplayer games. That's why AMD was caught by surprise and had to remove it.

[–] [email protected] 1 points 11 months ago

Lets be honest. Buying AMD is like taking an L.

[–] [email protected] 1 points 11 months ago

still waiting for RDNA2 support...

[–] [email protected] 1 points 11 months ago

Okay, so I have a few questions. First, why didn't AMD go with the same approach as Nvidia and add an SDK so game devs could add it to their games? Second, why did AMD lock it to RDNA3 GPUs, while Reflex is usable from the 900 series and up? And third, why is Reflex proprietary to Nvidia, since it's a software solution (a dynamic frame limiter)? Correct me if I'm wrong.

[–] [email protected] 1 points 11 months ago (2 children)

I don't get what is even going on with AMD anymore. Should I just return my 7900 XT? Is it just a total piece of shit? Everything I see on the internet says I should be regretting my purchase and that I made a mistake not going Nvidia.

[–] [email protected] 1 points 11 months ago

If you are happy with your purchase who cares what anyone else thinks. It's all just redditors engaging in console war behavior. It will happen now and will still happen for decades to come.

If you are not happy with your purchase then buy an NVIDIA card. It really does not matter at the end of the day who gets your money or which redditor is right or wrong what matters is if you are happy with your purchase.

[–] [email protected] 1 points 11 months ago

Why did you buy it over Nvidia in the first place? Are those reasons and needs being fulfilled and does it meet your expectation for what you paid for it? If so, then why are you regretting your purchase? If you like it, if it does what YOU want it to do, then you shouldn't regret it.