M9z wrote: As for the first statement you quoted, you're right, but also wrong in a way. I know demand on a CPU lessens when you push higher resolutions depending on your GPU, but I was talking about an APU alone, so when I said it's not going to run 4k ultimate, I was correct; no APU that I currently know of could run a game in ultimate detail in 4k.
"the CPU he recommended is the Ryzen 3 1200 which retails for about $60 as it is a bit of an older model, don't let that fool you though as it's still perfectly capable for some decent gaming, not gonna be 4k ultimate but it's capable of running smoothly."
The CPU he recommends is an R3 1200, and you said that CPU is not capable of "4k ultimate". So no, I'm not "right, but also wrong".
M9z wrote: I've always known AMD CPU's and GPU's to run hotter in similar scenarios against Intel and Nvidia
That depends entirely on which two CPUs or GPUs you're comparing. Intel, NVidia, and AMD have all used lots of different architectures. Look at Bulldozer vs Ryzen. Ryzen is more efficient than Intel's current architecture, so it definitely doesn't run hotter than Intel.
As for the GPU side, the RX 580 was a GTX 1060 competitor, and assuming the coolers were the same or similar, temps between the two GPUs shouldn't be vastly different. Like I said, yes, NVidia are still more efficient on the GPU side, but AMD aren't anywhere near as bad as they have been in the past, nor as bad as NVidia have been pre-Maxwell.
It's obviously not your fault that you think that; it's a common statement when people are talking about PC hardware, "Oh, AMD runs hot and loud", but it just isn't accurate. It used to be in some cases, but nowadays, not so much, other than the likes of Vega, which as I said was overvolted from the factory to maximise usable silicon.
M9z wrote: and hit a higher top heat as compared to Nvidia before they thermal throttle
I'm actually not 100% sure about this. AMD and NVidia now report temps from different parts of the GPU, IIRC, so 80c on an NVidia GPU isn't the same as 80c on an AMD GPU. AMD displays the junction temp, which is technically more useful for making sure you don't throttle.
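If you want to see what your own card actually reports, here's a minimal sketch. Linux only, and it assumes the amdgpu driver, which exposes its sensors through hwmon sysfs; the "edge"/"junction" labels are what amdgpu typically exports on Vega and newer, so older cards may only show edge:

```python
# Minimal sketch: print the edge vs junction temps the amdgpu driver
# exposes via hwmon. Pre-Vega GPUs usually only expose the edge sensor.
from pathlib import Path

for hwmon in Path("/sys/class/hwmon").iterdir():
    name_file = hwmon / "name"
    if not name_file.exists() or name_file.read_text().strip() != "amdgpu":
        continue
    for label_file in sorted(hwmon.glob("temp*_label")):
        label = label_file.read_text().strip()          # e.g. "edge", "junction", "mem"
        input_file = hwmon / label_file.name.replace("_label", "_input")
        millideg = int(input_file.read_text().strip())  # hwmon values are millidegrees C
        print(f"{label}: {millideg / 1000:.1f}c")
```

On a Navi card you'll usually see the junction temp sitting a fair bit above the edge temp under load, which is exactly why the two vendors' numbers aren't directly comparable.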
For thermal throttling, I think Turing is still in the 85c range? I'm pretty sure NVidia's thermal throttle point has been around 85c for a few generations now.
For Navi, I'm actually not sure when it'll start to thermal throttle, but I think somewhere between 95c and 105c?
NVidia probably publish what their thermal throttle range is, or at least they used to, I believe. For Navi GPUs specifically, I'm almost certain GamersNexus has a video on when they'll start to thermal throttle and what temps are acceptable. I've not watched it though, hence why I'm not certain.
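Actually, on the NVidia side you don't even have to guess: the driver exposes its own slowdown and shutdown thresholds through NVML. A minimal sketch, assuming the pynvml bindings (pip install nvidia-ml-py) and an NVidia card at index 0:

```python
# Minimal sketch: ask the NVidia driver directly for its thermal
# throttle (slowdown) and shutdown temperatures via NVML.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
slowdown = pynvml.nvmlDeviceGetTemperatureThreshold(
    handle, pynvml.NVML_TEMPERATURE_THRESHOLD_SLOWDOWN)
shutdown = pynvml.nvmlDeviceGetTemperatureThreshold(
    handle, pynvml.NVML_TEMPERATURE_THRESHOLD_SHUTDOWN)

print(f"current: {temp}c, throttles at {slowdown}c, shuts down at {shutdown}c")
pynvml.nvmlShutdown()
```

So whatever your specific card's throttle point is, you can read it straight off the driver rather than relying on the rough 85c rule of thumb.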
You're getting me all backwards with all this quoting back and forth lol. What I'm saying is the 1200, mixed with an RX 580, will not run most games in ultimate detail at 4k. It was a sly joke about the hardware, as if saying "This Honda ain't gonna come first place in any races, but it runs good," or something similar lol. Truth be told, I'd been out of the PC game for a long, long time until a few years ago when I did my 580 build, so hey OP, I can back the card up. Mine was a 4gb version too, and it did what I wanted it to. Back then it was paired with a 1300x, and performance in the games I wanted was solid, so I'm sure a step down isn't gonna mess up much; it'll still be capable. But if your CPU does end up needing more juice, I'm fairly certain you can find an overclocking guide. Just get an aftermarket cooler prior to that, to be on the safe side.