So I found a Reddit thread which discusses a 68°C temperature limit for the 2990WX - I had no idea this existed before building this machine. As my CPU was hitting this AMD-specified limit, the chip was failing and the motherboard had stopped communicating with it completely. I'm guessing some kind of fail-safe in the 2990WX causes this, but it is not publicised anywhere and has driven me crazy for weeks! Just be aware that there are a ton of built-in functions which try to push this chip to higher speeds, and those same functions then fight against the temperature limit of this massive chip - that's where the failures begin to happen. I have now disabled every possible option for running this chip above the 3GHz base clock; it has an auto feature which takes it up to the 250W limit, but the temperature limit counteracts this and causes system failure. Max wattage for the CPU is now capped at around 215W (against the 250W limit), and the benefit is that my temperatures have yet to exceed 60°C - and guess what, my animation is chugging along nicely. Don't let any of these issues put you off building a machine with this chip - when it works it really is unbeatable value for money.

From a related discussion on Threadripper GPU pairings: if anything, Ryzen's increased PCIe lanes are a reason to go for Nvidia. You can get a Threadripper and motherboard at a much smaller cost than a Xeon rig, with more lanes, and just load it with Pascal GPUs. This would do quite well for a deep learning machine, for example. And they'll have better performance and power efficiency, and be easier to buy, than any Vega card - and they're definitely cheaper, for sure. Where Vega has advantages is in more niche areas, like open-source Radeon drivers, fully unlocked FP16 support (garbage market segmentation from Nvidia), and the Instinct/SSG GPU lines they'll be offering, which will be unique. (But unless AMD can back it up with software, it won't mean much, especially for markets like deep learning, which Nvidia is going after full-force.)
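For anyone wanting to watch how close their chip is running to that 68°C fail-safe point, here is a minimal monitoring sketch. It is my own illustration, not something from the original post: it assumes a Linux box exposing temperatures through the hwmon sysfs interface (the k10temp driver on Threadripper), and sensor paths vary by platform, so the hardware readout is kept separate from the pure headroom arithmetic.

```python
# Hedged sketch (assumption: Linux hwmon sysfs, e.g. the k10temp driver
# on Threadripper). Computes how much thermal margin remains before the
# 68 C throttle/fail-safe point discussed above.
import glob

TJ_LIMIT_C = 68.0  # the 2990WX limit discussed in the Reddit thread


def headroom_c(current_c, limit_c=TJ_LIMIT_C):
    """Degrees of margin left before the limit (negative = over it)."""
    return limit_c - current_c


def hottest_sensor_c():
    """Highest temperature any hwmon sensor reports, or None if unavailable."""
    readings = []
    for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
        try:
            with open(path) as f:
                # hwmon reports millidegrees Celsius
                readings.append(int(f.read().strip()) / 1000.0)
        except (OSError, ValueError):
            pass
    return max(readings) if readings else None


t = hottest_sensor_c()
if t is not None:
    print(f"CPU {t:.1f} C, {headroom_c(t):+.1f} C to the limit")
```

Run in a loop (or under `watch`), this makes it obvious when a boost feature is pushing the chip toward the limit, rather than finding out via a hard lockup.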
> Well, for multi-core workloads AMD is a clear winner.

Actually, not really. It's slower in x265 encoding than a 7900X, and it doesn't really pull away much in x264 or Blender rendering either (10-25% over a 7900X). At most you can say that you really need to look at the specific task. It also appears to perform pretty badly at some compilation workloads: a 1950X barely beats a 6900K at compiling Chromium with MSVS. Of course there may still be some tuning to come, but right now it's certainly not the slam-dunk everyone assumed it would be. Not a particularly great showing overall for a processor with 60% more cores. Despite AMD's attempted pushback, it appears Intel's smack-talk was correct: Infinity Fabric is not a magic panacea for NUMA performance problems.

It also pulls an absolutely absurd amount of power to do it - literally more than a 7900X. The onboard package-power measurement appears to drastically undershoot the power measured at the wall: even after factoring out PSU efficiency losses, something inside the case is eating at least 130W that isn't showing up as package power.

The 1080 Ti has absolutely no competition from AMD in terms of performance, power, and availability for most workloads, so calling it "old fashioned" is a stretch. And Nvidia is already on the move with Volta, which AMD will seemingly have zero answer for. They're running circles around AMD at this point.
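The wall-versus-package discrepancy above is simple arithmetic to reproduce. This is a back-of-the-envelope sketch with illustrative numbers of my own choosing (not measurements from the review): DC power delivered into the system is wall draw times PSU efficiency, and whatever exceeds the reported package power is draw that the on-die telemetry never sees (VRM losses, uncore, the rest of the board).

```python
# Illustrative sketch: the figures below (390 W wall, 90% PSU efficiency,
# 220 W reported package power) are assumptions for the sake of example,
# not numbers from the original benchmarks.
def unaccounted_watts(wall_w, psu_efficiency, package_w):
    """Watts drawn inside the case that package-power telemetry misses."""
    dc_input_w = wall_w * psu_efficiency  # power actually delivered by the PSU
    return dc_input_w - package_w


print(unaccounted_watts(390, 0.90, 220))  # roughly 130 W unaccounted for
```

With numbers in that ballpark, the "at least 130W" figure falls straight out, which is why package-power readouts alone are a poor basis for comparing platform efficiency.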