PC Build Thread - 2020... and beyond!!!

Questions on how we spend our money and our time - consumer goods and services, home and vehicle, leisure and recreational activities
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

madbrain wrote: Sat Oct 23, 2021 7:37 am So, it turns out that on my current X99 motherboard, 2 of the 10 SATA ports are disabled when M2 is in use, so I really only have 8 usable SATA ports, not 10. That's documented in the manual for the MSI X99A Raider. Still 2 more SATA than on the Prime X570 Pro.

What's much more concerning is that my LSI SAS2 x8 card, in an x16 slot, is now running at just x1 according to HWINFO. The peak throughput on the striped 8 x 1TB Samsung 860 that are attached to the LSI card has dropped to "just" 900MB/s, barely 1.5x SATA-3 bandwidth. It used to be about 4GB/s. I don't know when this regression happened, sigh.

Sadly, the motherboard BIOS has no settings for PCIe slot bandwidth at all.

Maybe adding the NVMe in the M2 slot a year ago also stole lanes from one of the PCIe x16 slots. But that part is not documented in the motherboard manual. There is no way to disable the M2 slot in the BIOS. I fear physically removing the NVMe from the M2 slot is the only sure way to test that theory, but my failing vision won't allow me to do that myself anymore. Those M2 screws are just too tiny. Also, the NVMe SSD is the current boot drive, and it would be a PITA to test - involving a full OS install to a small SSD I have laying around. Or maybe cloning, but I don't think I have a big enough SATA SSD for that.

Perhaps swapping PCIe slots between the Aquantia x4 and LSI x8 cards would partially solve the problem. Aquantia card running at PCIe 3.0 x1 would not quite hit 10 Gbps anymore, though. LSI card running at PCIe 3.0 x4 would not quite hit 4GB/s either, but it should be close, at least in theory. It's certainly an easier test to switch PCIe slots than to remove the NVMe drive.
You're going to run into this problem a lot because there simply aren't enough PCIe lanes to go around. A Zen 3 has PCIe Gen 4 x24 lanes and the X570 chipset has an additional PCIe Gen 4 x16 lanes. These are allocated by the motherboard manufacturer to the GPU, PCIe slots, M.2 slots, SATA ports, LAN, Wi-Fi, sometimes additional USB 3.2 Gen 2 10 Gbps or Gen 2x2 ports, etc. Furthermore, as you have found, you may or may not be able to reallocate the lanes in the BIOS, or some slots or ports get disabled when others are in use.

So you may have to get creative. For example, it looks like you need a lot of SATA ports rather than NVMe, so there are adapters to convert an M.2 slot to 5 SATA ports like this:

https://www.amazon.com/Internal-Non-Rai ... psc=1&th=1

Or, you could get a motherboard with 10 GbE LAN built-in which will free up a x4 slot.
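As an aside, per-lane PCIe bandwidth makes the quoted numbers easy to sanity-check: ~900 MB/s is right around what a single Gen 3 lane delivers, which fits the x1 reading in HWINFO. A quick back-of-envelope sketch (theoretical per-lane rates; real-world throughput runs a bit lower):

```python
# Theoretical one-direction PCIe bandwidth per lane, in MB/s.
# Gen 1/2 use 8b/10b encoding; Gen 3/4 use the denser 128b/130b.
PER_LANE_MBPS = {1: 250, 2: 500, 3: 985, 4: 1969}

def link_bandwidth(gen, lanes):
    """Theoretical bandwidth of a PCIe link in MB/s."""
    return PER_LANE_MBPS[gen] * lanes

print(link_bandwidth(3, 1))  # 985 -- an x1 link caps near the observed ~900 MB/s
print(link_bandwidth(3, 8))  # 7880 -- x8 easily covers a ~4 GB/s SSD array
```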
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

Independent George wrote: Sat Oct 23, 2021 9:05 am The closest motherboard I could find to your specs is this ASUS Threadripper Pro model, which has eight SATA ports and seven PCIE 4.0 slots, but has dual 10G ethernet ports so that the Aquantia card may not be needed anymore. Of course, that requires Threadripper (with Gen 3 due out soon, but possibly delayed), but that actually sounds right up your alley. I haven't read the manual to determine if any PCIE lanes are shared, but I doubt it on a Threadripper.

Of course, none of this solves your GPU problem, which will persist until crypto crashes.
Thanks, that is a very nice find. I think I was using the wrong socket when searching for Threadripper motherboards earlier. I'm just not that familiar with the HEDT socket types. The Threadripper mobos and CPUs just aren't in the same price category as the Ryzen.
I think my i7-5820k was something like $400 for the CPU, $300 for the motherboard, $100 for the Noctua CPU cooler, and $200 for the RAM, so about $1000 for the core components.
Here we are talking about a $999 motherboard, probably a >$1000 chip (3955WX is $1092 on newegg, if using a current Zen2 CPU), and who knows how much for cooler and RAM - I guess about $350 for the RAM if using 64 GB ECC 8x8 DDR4-3200, and $100 for a compatible Noctua. So, a bit over $2500 in core parts. Definitely not the same budget. And that's not counting the GPU, of course. That crypto crash just can't happen soon enough.
I don't think I will actually spend that much to upgrade this system, as tempting as that is.

A more reasonably priced upgrade would be a Prime x570 Pro at $260, 5900x CPU for $570, keeping the same NH-D14 cooler with the addition of the NM-AM4 mounting kit for about $10. Not sure about keeping the existing 4x8GB sticks of Corsair LPX DDR4-2666. That might be a bit slow for a Ryzen 5000 series. DDR4-3200 is the recommended speed. Not sure how much the RAM would limit things. The RAM also supports DDR4-2800 in the second XMP profile at 1.35V, but I have been running at 1.2V and 2666 (first XMP profile). My first set of Corsair DDR4-2666 LPX RAM went bad in less than a year after running it at 2800 MHz/1.35V, and was replaced under warranty, and I haven't wanted to run the new set at the higher voltage and speed. No problems for years at 1.2V/2666 with the second set. If I kept the same RAM, that's "just" $840 for the CPU & mobo upgrade. But still can't stomach $500 for a GTX 1660 series GPU. I guess if crypto doesn't crash, maybe a used GPU would be in order :(
Topic Author
Independent George
Posts: 1590
Joined: Wed Feb 17, 2016 11:13 am
Location: Chicago, IL, USA

Re: PC Build Thread - 2020... and beyond!!!

Post by Independent George »

madbrain wrote: Sat Oct 23, 2021 9:53 am Thanks, that is a very nice find. I think I was using the wrong socket when searching for Threadripper motherboards earlier. I'm just not that familiar with the HEDT socket types. The Threadripper mobos and CPUs just aren't in the same price category as the Ryzen.
I think my i7-5820k was something like $400 for the CPU, $300 for the motherboard, $100 for the Noctua CPU cooler, and $200 for the RAM, so about $1000 for the core components.
Here we are talking about a $999 motherboard, probably a >$1000 chip (3955WX is $1092 on newegg, if using a current Zen2 CPU), and who knows how much for cooler and RAM - I guess about $350 for the RAM if using 64 GB ECC 8x8 DDR4-3200, and $100 for a compatible Noctua. So, a bit over $2500 in core parts. Definitely not the same budget. And that's not counting the GPU, of course. That crypto crash just can't happen soon enough.
I don't think I will actually spend that much to upgrade this system, as tempting as that is.

A more reasonably priced upgrade would be a Prime x570 Pro at $260, 5900x CPU for $570, keeping the same NH-D14 cooler with the addition of the NM-AM4 mounting kit for about $10. Not sure about keeping the existing 4x8GB sticks of Corsair LPX DDR4-2666. That might be a bit slow for a Ryzen 5000 series. DDR4-3200 is the recommended speed. Not sure how much the RAM would limit things. The RAM also supports DDR4-2800 in the second XMP profile at 1.35V, but I have been running at 1.2V and 2666 (first XMP profile). My first set of Corsair DDR4-2666 LPX RAM went bad in less than a year after running it at 2800 MHz/1.35V, and was replaced under warranty, and I haven't wanted to run the new set at the higher voltage and speed. No problems for years at 1.2V/2666 with the second set. If I kept the same RAM, that's "just" $840 for the CPU & mobo upgrade. But still can't stomach $500 for a GTX 1660 series GPU. I guess if crypto doesn't crash, maybe a used GPU would be in order :(
From what you've described, you might be one of the few people for whom Threadripper actually makes sense; you're both operating a server and editing 8k video on the same machine. Even if you're not doing this as a business, that's still a professional workload for which a professional workstation seems quite suitable. It's possible that the 3000 series Threadripper will get cheaper once the 5000 series is out (and DDR4 will eventually get cheaper once everyone's on DDR5), but this might be the worst time to build a PC in decades.

Used GPUs aren't any cheaper. My RX580 currently sells for $400+ on eBay. For anyone skeptical that crypto is the driver, note that the eBay prices for GPUs currently line up exactly with their crypto mining efficiency (hash rate/watt), and not their gaming FPS levels. This includes the LHR Nvidia cards (which have already been hacked).
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

Hi,
tortoise84 wrote: Sat Oct 23, 2021 9:45 am You're going to run into this problem a lot because there simply aren't enough PCIe lanes to go around. A Zen 3 has PCIe Gen 4 x24 lanes and the X570 chipset has an additional PCIe Gen 4 x16 lanes. These are allocated by the motherboard manufacturer to the GPU, PCIe slots, M.2 slots, SATA ports, LAN, Wi-Fi, sometimes additional USB 3.2 Gen 2 10 Gbps or Gen 2x2 ports, etc. Furthermore, as you have found, you may or may not be able to reallocate the lanes in the BIOS, or some slots or ports get disabled when others are in use.
Yes, I know there are some hard limits. I wish manufacturers better documented how all these lanes are allocated/reallocated.

My Intel 5820k has 28 lanes of PCIe 3.0, so going to 24 lanes with a Ryzen 5000 series would be losing four. Even though they are faster 4.0 lanes, it doesn't matter when none of the existing cards are 4.0.

I could have sworn I had both the NVMe at x4 and the SAS2 at x8 in the past on my 5820k when I installed the NVMe, because I benchmarked them both and compared them after installing the NVMe. My recollection is that the SATA array still had peak throughput faster than the NVMe.
I'm hoping it's some sort of OS software/driver regression. But really not sure at this point. Maybe I need to flash older Win10 revisions to USB stick and run the portable version of HWINFO from there to check ...

Doing the PCIe math, 8 lanes are allocated to the GTX960 (PCIe 3.0). 4 to the Aquantia (PCIe 3.0). 1 for the TI Firewire card (PCIe 1.1). 1 for the Hauppauge capture card (PCIe 1.1). That's just 14 lanes out of the 28 that the 5820k has. It should be more than enough for the LSI card to run at x8 and not the x1 it's currently running at. Even if the M2 slot is consuming four of the CPU lanes, that should still be enough.
I looked at https://en.wikipedia.org/wiki/Intel_X99, and it seems all the SATA/USB ports come from the chipset and don't consume CPU PCIe lanes. But the M2 ports do. But it still should fit within 28 lanes.
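That tally can be written out explicitly; a quick sketch using the lane widths stated above (the M.2 line is the assumption being tested):

```python
# CPU PCIe lane budget for the i7-5820K (28 usable Gen 3 lanes),
# using the card widths stated in the post.
CPU_LANES = 28
cards = {
    "GTX 960 (Gen 3 x8)": 8,
    "Aquantia 10GbE (Gen 3 x4)": 4,
    "LSI SAS2 (should be x8)": 8,
    "TI FireWire (Gen 1.1 x1)": 1,
    "Hauppauge capture (Gen 1.1 x1)": 1,
    "M.2 NVMe (Gen 3 x4)": 4,  # assumed to consume CPU lanes
}
used = sum(cards.values())
print(used, CPU_LANES - used)  # 26 used, 2 to spare -- everything should fit
```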
So you may have to get creative. For example, it looks like you need a lot of SATA ports rather than NVMe, so there are adapters to convert an M.2 slot to 5 SATA ports like this:

https://www.amazon.com/Internal-Non-Rai ... psc=1&th=1
Thanks. This looks interesting indeed. The JMicron name on the chip gives me pause, though. In my past experience with old SATA-2 JMB chips on motherboards, they were not the most reliable.
Or, you could get a motherboard with 10 GbE LAN built-in which will free up a x4 slot.
Wouldn't that still consume some PCIe lanes even if the 10 GbE LAN is integrated? The other objection to integrated 10 GbE is the modifications that some manufacturers (Asus) make to the 10 GbE NICs, removing the WOL feature in particular.

A search on Newegg for AM4 motherboards with 10 GbE comes up with this:
https://www.newegg.com/p/pl?N=100007625 ... -1&Order=1

These motherboards all have 10 GbE, but they have just 3 PCIe slots vs the 6 slots that the Asus Prime X570 Pro has. So, having the integrated 10 GbE seems to cause more problems than it solves. I currently have 5 PCIe cards, one of which is 10 GbE, so I would still need 4 slots even with integrated 10 GbE. The one exception on slot count is this board:
https://www.newegg.com/msi-prestige-x57 ... klink=true

This has 7 PCIe slots! Now that seems to be a motherboard with my name on it. I have to wonder how much stuff gets disabled when I populate all the PCIe slots, M.2 ports and SATA ports. Time to read the manual, but if history is any indication with my MSI X99A Raider, it won't actually get into those details.
Unfortunately, the board is shipped by a seller in China on Newegg, and I'm not buying direct from China. Last year, I had a webcam ship directly from there and take 4 months to arrive. It's not available on Amazon either, except used, so it looks discontinued. It also lacks BIOS Flashback, so it wouldn't be easy to install a Ryzen 5000 series CPU in it if I could get ahold of one. I'll be reading reviews on that board for fun, though. Not likely that any reviewer actually maxed out all the slots to see what happens.
Last edited by madbrain on Sat Oct 23, 2021 5:39 pm, edited 1 time in total.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

Independent George wrote: Sat Oct 23, 2021 12:26 pm From what you've described, you might be one of the few people for whom Threadripper actually makes sense; you're both operating a server and editing 8k video on the same machine. Even if you're not doing this as a business, that's still a professional workload for which a professional workstation seems quite suitable. It's possible that the 3000 series Threadripper will get cheaper once the 5000 series is out (and DDR4 will eventually get cheaper once everyone's on DDR5), but this might be the worst time to build a PC in decades.
Agree it's suitable, but I probably won't be throwing that kind of money at it. I was unemployed for 7 months this year and I'm just on contract now. And I have some pricey house maintenance issues that will take priority. I don't know how much of my career is still left, TBH. I may already be retired and just not know it yet. I can probably afford to drop $1K, maybe $1.5K on an upgrade if I really stretch it, but I don't feel like dropping $3K+. I guess the HEDT CPU offerings like Threadripper will always be one gen behind the consumer ones, also. In truth, the CPU and GPU matter much more for 8K than fast disk I/O. Encoding 8K is very slow and could probably be done on an old IDE HDD, so the 900MB/s limit on my SATA array isn't going to be the bottleneck. It's just frustrating to see the perf regression.

If I do upgrade to a Ryzen now, I will try to keep my current sticks of RAM. I hope a Ryzen 5900/5950 would at least boot with DDR4-2666. I would expect DDR4 RAM prices to drop in the next year, indeed. But with this chip shortage, who really knows. It's possible the DDR5 parts will be available in such small supplies that they'll be priced out of reach of mere mortals, and DDR4 prices still won't budge. I have 3 more machines in the house that take DDR4, though, and the Corsair sticks have a lifetime warranty for the original owner, so I won't sell those RAM sticks no matter what, even if I'm forced to buy new compatible RAM for a Ryzen 5000.
Edit: looks like my CMK32GX4M4A2666C16 are on the QVL for the Asus Prime X570 Pro, so I should be good. Though I have no way to check if it's the specific 3.20 revision.

One reason I'm considering upgrading this time of year is that Black Friday is coming up, and there are usually some deals around that time. With a credit card that has 60-day price protection (like my Wells Fargo cards do), buying now makes some sense; I can make a claim if/when my chosen parts drop in price in the next 60 days. There's a good possibility the Ryzen 5900X and 5950X will drop when Alder Lake is introduced.
Used GPUs aren't any cheaper. My RX580 currently sells for $400+ on eBay. For anyone skeptical that crypto is the driver, note that the eBay prices for GPUs currently line up exactly with their crypto mining efficiency (hash rate/watt), and not their gaming FPS levels. This includes the LHR Nvidia cards (which have already been hacked).
Sadly, my GTX 960 4GB looks like it's only worth $150-$200 on ebay. But that's a Jan 2015 chip release, whereas yours is April 2017.
Interestingly, the motherboard still seems to be worth something, though not clear how much, as few of them are sold. Maybe $150-$200 also.
The 5820k CPU is nearly worthless on eBay. There is one listed for under $50, and the most recent sale was $35 just 3 days ago. Not all that surprising for a 2014 CPU; this is why I have upgraditis.
Marseille07
Posts: 16054
Joined: Fri Nov 06, 2020 12:41 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by Marseille07 »

deleted
Last edited by Marseille07 on Sun Oct 24, 2021 12:32 am, edited 1 time in total.
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

madbrain wrote: Sat Oct 23, 2021 5:07 pm My Intel 5820k has 28 lanes of PCIe 3.0, so going to 24 lanes with a Ryzen 5000 series would be losing four. Even though they are faster 4.0 lanes, it doesn't matter when none of the existing cards are 4.0.

I could have sworn I had both the NVMe at x4 and the SAS2 at x8 in the past on my 5820k when I installed the NVMe, because I benchmarked them both and compared them after installing the NVMe. My recollection is that the SATA array still had peak throughput faster than the NVMe.
I'm hoping it's some sort of OS software/driver regression. But really not sure at this point. Maybe I need to flash older Win10 revisions to USB stick and run the portable version of HWINFO from there to check ...

Doing the PCIe math, 8 lanes are allocated to the GTX960 (PCIe 3.0). 4 to the Aquantia (PCIe 3.0). 1 for the TI Firewire card (PCIe 1.1). 1 for the Hauppauge capture card (PCIe 1.1). That's just 14 lanes out of the 28 that the 5820k has. It should be more than enough for the LSI card to run at x8 and not the x1 it's currently running at. Even if the M2 slot is consuming four of the CPU lanes, that should still be enough.
I looked at https://en.wikipedia.org/wiki/Intel_X99, and it seems all the SATA/USB ports come from the chipset and don't consume CPU PCIe lanes. But the M2 ports do. But it still should fit within 28 lanes.
Yes, you will lose *CPU* lanes going from the HEDT platform 5820K to a consumer level Ryzen 5000 series, but remember that the X570 *chipset* gives you an additional Gen 4 x16 lanes. So for the Asus Prime X570-PRO,

connected directly to the CPU are the:
- top 2 PCIe x16 slots which when both occupied operate in Gen 4 x8 + x8
- top M.2 slot in Gen 4 x4
- as well as some of the USB ports

Connected to the chipset are the:
- bottom PCIe x16 slot operating in Gen 4 x4
- 3 Gen 4 x1 slots
- bottom M.2 slot in Gen 4 x4
- all 6 x SATA ports
- the rest of the USB ports, LAN, and anything else

The devices connected to the chipset share bandwidth, so the lanes can be allocated more flexibly by the motherboard manufacturer. But I don't think you get the same flexibility with CPU lanes. For example, on your X99 mobo, the 3 PCIe x16 slots are allocated CPU lanes as Gen 3 x8 + x8 + x8, and the M.2 slot is Gen 3 x4 (I believe all the x1 slots are connected to the chipset). Now if you put in a 10GbE card that uses only x4 lanes, you can't allocate the remaining x4 lanes to something else. But that still doesn't explain why your LSI SAS card is running on an x1 link, because you should still have enough lanes, so something fishy is going on there.
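The fixed-width behavior can be put in one line: a card in a CPU-attached slot negotiates the smaller of its own width and the slot's width, and whatever is left over is stranded (a toy sketch, not specific to any board):

```python
def negotiated_width(card_lanes, slot_lanes):
    """A PCIe link trains to the narrower of the card and the slot."""
    return min(card_lanes, slot_lanes)

# x4 10GbE card in a CPU-attached x8 slot: the link is x4, and the
# remaining 4 slot lanes cannot be reassigned to another device.
print(negotiated_width(4, 8))       # 4
print(8 - negotiated_width(4, 8))   # 4 lanes stranded
```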
madbrain wrote: Sat Oct 23, 2021 5:07 pm Wouldn't that still consume some PCIe lanes even if the 10 GbE LAN is integrated?
It will but they'll be chipset lanes, which could be more flexible as I explained above, or it could make no difference depending on the motherboard configuration.
madbrain wrote: Sat Oct 23, 2021 5:07 pm The one exception to the number of slots is this board :
https://www.newegg.com/msi-prestige-x57 ... klink=true

This has 7 PCIe slots !
I saw that MSI Prestige X570 Creation but also noted that all of the x1 slots are only PCIe Gen 2, so it might be okay for your current cards, but I see most of the current 4K capture cards are Gen 3 x4 so it could limit your future expansion.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

tortoise84 wrote: Sat Oct 23, 2021 9:40 pm Yes, you will lose *CPU* lanes going from the HEDT platform 5820K to a consumer level Ryzen 5000 series, but remember that the X570 *chipset* gives you an additional Gen 4 x16 lanes. So for the Asus Prime X570-PRO,

connected directly to the CPU are the:
- top 2 PCIe x16 slots which when both occupied operate in Gen 4 x8 + x8
- top M.2 slot in Gen 4 x4
- as well as some of the USB ports

Connected to the chipset are the:
- bottom PCIe x16 slot operating in Gen 4 x4
- 3 Gen 4 x1 slots
- bottom M.2 slot in Gen 4 x4
- all 6 x SATA ports
- the rest of the USB ports, LAN, and anything else
Thanks, that's good to know. My X99 motherboard gives 28 lanes of PCIe Gen3 from the 5820k CPU. And 8 lanes of PCIe Gen2 from the chipset.
Going to 24 lanes of Gen4 from the CPU and 16 lanes of Gen4 from the chipset certainly sounds like a big improvement.

Indeed the specs for the Prime X570 Pro show that the x4 slot and the 3 x1 slots come from the chipset.
How did you learn that one M2 slot had 4 lanes from the CPU, and the other 4 lanes from the chipset ?
I certainly can't figure that out from reading specs at
https://www.asus.com/us/Motherboards-Co ... /techspec/ .
The devices connected to the chipset share bandwidth so the lanes can be allocated more flexibly by the motherboard manufacturer.
I'm really curious about this bandwidth sharing. Does this mean some devices must be disabled - that one can't enable everything at once? Or can one enable everything, but not use everything at once (i.e., there is some serialization)? So far, I don't see any indication in the Prime X570 Pro manual of any limitations on chipset devices. The only mention of PCIe bandwidth sharing is in section 3.6.5 for PCIEX16_2 Bandwidth, which I quoted earlier.
But I don't think you get the same flexibility with CPU lanes. For example, on your X99 mobo, the 3 PCIe x16 slots are allocated CPU lanes as Gen 3 x8 + x8 + x8, and the M.2 slot is Gen 3 x4 (I believe all the x1 slots are connected to the chipset). Now if you put in a 10GbE card that uses only x4 lanes, you can't allocate the remaining x4 lanes to something else. But that still doesn't explain why your LSI SAS card is running on an x1 link, because you should still have enough lanes, so something fishy is going on there.
Yes, something very fishy is going on with my X99 board. I just removed the Aquantia x4 card and switched temporarily to the onboard 1 GbE LAN.
The LSI controller is still running at x1. Very puzzling. It's really starting to sound like some software regression. That, or the PG&E power surge I had last month damaged something, but it would be wicked for a surge to affect things this way, so that's much less likely. Nothing in the computer has misbehaved so far. Several audio devices in the same room did (FireWire audio interface, and A/V receiver), and one surge protector went crazy. The PC may have been on that failed surge protector too, though. To be continued. Going to reboot and switch slots now.
I saw that MSI Prestige X570 Creation but also noted that all of the x1 slots are only PCIe Gen 2, so it might be okay for your current cards, but I see most of the current 4K capture cards are Gen 3 x4 so it could limit your future expansion.
Yeah, I see your point. If I'm keeping the mobo for 6 years, I would want the x1 slots to be at least Gen3 - the industry will be on Gen6 or maybe even Gen7 by then. The Prime X570 Pro at least has Gen3 x1 slots, and it costs 1/4 the price of the MSI Prestige, too.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

madbrain wrote: Sat Oct 23, 2021 11:34 pm Yes, something very fishy is going on with my X99 board. I just removed the Aquantia x4 card and switched temporarily to the onboard 1 GbE LAN.
The LSI controller is still running at x1. Very puzzling. It's really starting to sound like some software regression. That, or the PG&E power surge I had last month damaged something, but it would be wicked for a surge to affect things this way, so that's much less likely. Nothing in the computer has misbehaved so far. Several audio devices in the same room did (FireWire audio interface, and A/V receiver), and one surge protector went crazy. The PC may have been on that failed surge protector too, though. To be continued. Going to reboot and switch slots now.
Well, good news. The LSI card started running at x8 when moved from PCI_E3 to PCI_E5. The Aquantia card runs at x4 in either PCI_E3 or PCI_E5. And they both still work at their top speed with the slots swapped. Very weird bug.
Somehow, the LSI card only runs at x1 in PCI_E3. This has to be a regression, but no idea whose - it could be LSI, MS, Aquantia, or Intel. The MSI X99A Raider manual states that all 3 x16 slots should be able to run at up to x8 simultaneously with a CPU that has 28 lanes, like my 5820k. But clearly they don't, depending on which slot each card is in.

With the LSI now running at x8 again, the 8x1TB Samsung 860 SSD array on the SAS2 card is purring along at over 4.3GB/s sequential reads in CrystalDiskMark, and over 4GB/s writes, which beats the NVMe Gen3 ADATA SX8200 drive. It doesn't beat it in all the random tests, though.
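The arithmetic backs this up: eight SATA-3 SSDs striped can push roughly 8 × ~550 MB/s, which a Gen 3 x8 link comfortably carries, while the earlier x1 link capped the array near 1 GB/s. A rough check (the ~550 MB/s per-drive figure is a typical SATA SSD sequential rate, assumed rather than measured):

```python
# Rough throughput check for the 8 x 1TB Samsung 860 striped array.
SSD_SEQ_MBPS = 550       # typical SATA-3 SSD sequential read (assumed)
N_DRIVES = 8
PCIE3_LANE_MBPS = 985    # theoretical Gen 3 per-lane bandwidth

array_peak = SSD_SEQ_MBPS * N_DRIVES   # 4400 MB/s -- near the measured 4.3 GB/s
link_x8 = PCIE3_LANE_MBPS * 8          # 7880 MB/s -- not a bottleneck at x8
link_x1 = PCIE3_LANE_MBPS * 1          # 985 MB/s -- the earlier ~900 MB/s cap
print(array_peak, link_x8, link_x1)
```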
brad.clarkston
Posts: 1726
Joined: Fri Jan 03, 2014 7:31 pm
Location: Kansas City, MO

Re: PC Build Thread - 2020... and beyond!!!

Post by brad.clarkston »

Brain wrote: Mon Oct 18, 2021 12:04 pm I got a video card!!!

I signed up for the EVGA queue quite a while ago and my turn came up! I got the 3060 Ti XC (https://www.evga.com/products/product.a ... P5-3663-KL) for $469, plus shipping!

So, putting it all together with the rest of the components I bought a year ago (!) gives me this system:

PCPartPicker Part List: https://pcpartpicker.com/list/bMMqTJ
CPU: AMD Ryzen 5 5600X 3.7 GHz 6-Core Processor (Purchased For $316.94)
CPU Cooler: CRYORIG M9a 48.4 CFM CPU Cooler (Purchased For $31.99)
Motherboard: MSI B550-A PRO ATX AM4 Motherboard (Purchased For $124.99)
Memory: G.Skill Ripjaws V 32 GB (4 x 8 GB) DDR4-3600 CL16 Memory (Purchased For $149.98)
Storage: ADATA XPG SX8200 Pro 2 TB M.2-2280 NVME Solid State Drive (Purchased For $229.99)
Video Card: EVGA GeForce RTX 3060 Ti 8 GB XC GAMING Video Card (Purchased For $469.99)
Case: Cougar MX331 MESH-X ATX Mid Tower Case (Purchased For $40.98)
Power Supply: EVGA SuperNOVA GA 650 W 80+ Gold Certified Fully Modular ATX Power Supply (Purchased For $97.99)
Case Fan: Rosewill RFA-120-K 74.48 CFM 120 mm Fan (Purchased For $14.99)
Total: $1477.84

I originally intended to upgrade to this system for Cyberpunk 2077. I got everything but the video card in time, so I just popped my old R9 290 in and it was actually able to play the game at ultra settings at 1080p at a pretty even ~30fps. I never encountered any really noticeable lag.

After popping the 3060 Ti in, I fired up Cyberpunk 2077 again and ran around pissing off the cops to see if there was an improvement. With no RT, I was hitting 60fps evenly (my monitor's max). With RT maxed out, it went down into the 20s.

Sweet, Congratz!
70% AVGE | 20% FXNAX | 10% T-Bill/Muni
brad.clarkston
Posts: 1726
Joined: Fri Jan 03, 2014 7:31 pm
Location: Kansas City, MO

Re: PC Build Thread - 2020... and beyond!!!

Post by brad.clarkston »

LadyGeek wrote: Thu Oct 21, 2021 6:38 pm ^^^ Good to hear.

I wish I had taken a picture of that G.skill Ripjaws 3200 MHz (4 x 8 GB) memory before I returned it. Something did not seem right. I received 2 retail packages each containing 2 sticks.

The RAM stickers were marked as (4 x 8GB), but there's no way they should be packaged as independent sets of 2. I'm wondering if they were repacked or counterfeit. Has anyone received a set of 4 RAM sticks like this?

My Trident Z RAM arrived today. All in one box with a properly marked box and packaged as expected. I don't see a reason to modify the heat spreaders, as they're made of dark gray (or black) brushed aluminum. Only the top part of the RAM is white. They look like this: Trident Z DDR4-3600MHz CL16-16-16-36 1.35V 32GB (4x8GB)

I now have all my parts and just need to dedicate some time to do the build. It might be a few days.
Yep, repackaged and probably refurbed if I'm being nice, a scam if I'm not. I would have sent them back as well.
70% AVGE | 20% FXNAX | 10% T-Bill/Muni
brad.clarkston
Posts: 1726
Joined: Fri Jan 03, 2014 7:31 pm
Location: Kansas City, MO

Re: PC Build Thread - 2020... and beyond!!!

Post by brad.clarkston »

harland wrote: Fri Oct 22, 2021 10:30 am
madbrain wrote: Thu Oct 21, 2021 9:52 pm Is there a trick to getting a decent GPU price ? Waiting list ? Anything ?
You'll probably have to go the prebuilt route if you want a GPU at a somewhat reasonable price. Might as well get a computer along with the video card! :D
The Newegg shuffle is the best if you can stomach the bundle deals.

My local MicroCenter has pretty much one of everything in stock during the week, and then just 6600s on the weekends, but the RTXs are priced above MSRP.
70% AVGE | 20% FXNAX | 10% T-Bill/Muni
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

brad.clarkston wrote: Sun Oct 24, 2021 1:12 am The Newegg shuffle is the best if you can stomach the bundle deals.

My local MicroCenter has pretty much one of everything in stock during the week, and then just 6600s on the weekends, but the RTXs are priced above MSRP.
Newegg shuffle was mentioned before - none of the bundled parts interest me, and the hours of the shuffle don't work for me.

MicroCenter in Santa Clara shut its doors 9 years ago, unfortunately. The closest one to me now is 400 miles away. Then Fry's shut down earlier this year. Now the only legitimate retail store for computer parts left in Silicon Valley is Central Computers. Even if I had a local MicroCenter, I don't see any RTX cards listed on their web site. There is one Zotac GTX 1660, for double what it sold for pre-pandemic, and in store only.

I dug up the receipts for my current build. It was in November 2015, so almost 6 years ago.
i7-5820k was $276 at Fry's
X99A was $200, also at Fry's
Corsair 32GB DDR4-2666 RAM was $207 at Central Computers
Noctua cooler was $96, also at Central
GTX 960 was $230 on Amazon
Total $1009.

Nowadays, tentative build would be
5950x $570 at Central
Prime X570 Pro $260 at Central
Keep the RAM
Keep the cooler. $10 for the NM-AM4 cooler adapter.
GPU ???
$840 + GPU cost.

The GPU is just too much of a wildcard and can nearly double the cost of the build if trying to get a recent GPU like an RTX 3060.
But even a suitable GTX 1660 runs $500, and that's plain silly.

Last year, I also considered upgrading, as it had been 5 years and that used to be my upgrade timeline. But both the CPU I wanted (a Ryzen 5000 series) and the current nVidia GPUs were out of stock or being scalped. Now the CPU is in stock, but GPUs are still being scalped. Sigh.
brad.clarkston
Posts: 1726
Joined: Fri Jan 03, 2014 7:31 pm
Location: Kansas City, MO

Re: PC Build Thread - 2020... and beyond!!!

Post by brad.clarkston »

madbrain wrote: Fri Oct 22, 2021 9:05 pm
lazydavid wrote: Fri Oct 22, 2021 4:53 pm
LadyGeek wrote: Fri Oct 22, 2021 4:33 pm The motherboard user's manual suggests the wi-fi / Bluetooth module is running off the M.2 interface. The layout says "M.2 (wi-fi)".
This seems strange at first (and did to me also), but thinking about it for a moment, it's actually pretty logical. Nearly all Wifi/bluetooth modules these days are M.2, to the point that even many add-in PCI-E Wifi cards just have an M.2 socket populated with a 2230 wifi card like you might find in a laptop. Asus probably just took one of those chipsets and dropped it directly on the board.
If you think that's weird, what do you make of the following ?

https://www.centralcomputer.com/m-2-to- ... ipset.html
That's for industrial use. I'd think about it on a small factory floor, but I'd rather pay more for hardened commercial devices with RS232 connectors than a cheap consumer M.2 device. There are a lot of cheap engineers out there, though.
70% AVGE | 20% FXNAX | 10% T-Bill/Muni
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

I should point out that Newegg also offers single GPU card deals. Not every day, but you don't have to give up because of it. It just takes time and patience to wait. That's how I got my card. (Understood that madbrain has a time constraint.)
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

madbrain wrote: Sat Oct 23, 2021 11:34 pm How did you learn that one M2 slot had 4 lanes from the CPU, and the other 4 lanes from the chipset ?
I certainly can't figure that out from reading specs at
https://www.asus.com/us/Motherboards-Co ... /techspec/ .
Actually the subtle cue is that the M.2_1 socket is listed under AMD Ryzen 5000 Series Processor, while the M.2_2 socket is listed under the AMD X570 chipset.
madbrain wrote: Sat Oct 23, 2021 11:34 pm I'm really curious about this bandwidth sharing. Does this mean some devices must be disabled - one can't enable everything at once ? Or one can enable everything, but not use everything at once (ie. there is some serialization) ? So far, I don't see any indication in the Prime X570 Pro manual about any limitations of chipset devices. The only mention of PCIe bandwidth sharing is on page 3.6.5 for PCIEX16_2 Bandwidth, which I quoted earlier.
The X570 chipset actually only has a PCIe 4.0 x4 uplink to the CPU, as shown in this diagram: https://www.guru3d.com/articles-pages/a ... iew,4.html

That's still a lot of bandwidth at 7.877 GB/s, but yes, it has to be shared by switching the bandwidth between devices dynamically, or outright disabling devices when others are in use. So there could be a problem if you manage to saturate the Gen 4 x4 uplink, but I don't really see that happening unless you're using multiple Gen 4 x4 NVMe drives.
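To put a number on it: usable PCIe bandwidth is just the transfer rate times the line-encoding efficiency. A quick back-of-the-envelope sketch (decimal GB/s; it ignores packet/protocol overhead, so real throughput is a bit lower):

```python
# Approximate usable PCIe bandwidth per generation (decimal GB/s),
# accounting only for the line encoding, not packet/protocol overhead.

GENS = {
    # gen: (transfer rate in GT/s per lane, encoding efficiency)
    2: (5.0, 8 / 10),     # 8b/10b encoding
    3: (8.0, 128 / 130),  # 128b/130b encoding
    4: (16.0, 128 / 130),
}

def bandwidth_gbps(gen: int, lanes: int) -> float:
    """Usable bandwidth in GB/s for a given PCIe generation and lane count."""
    rate, eff = GENS[gen]
    return rate * 1e9 * eff / 8 * lanes / 1e9

print(f"Gen4 x4 uplink: {bandwidth_gbps(4, 4):.3f} GB/s")  # the X570 uplink
print(f"Gen3 x1:        {bandwidth_gbps(3, 1):.3f} GB/s")  # a link stuck at x1
```

Gen 4 x4 works out to the 7.877 GB/s quoted above, and Gen 3 x1 to about 0.985 GB/s, which lines up with the roughly 900 MB/s madbrain's LSI card managed while stuck at x1.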
madbrain wrote: Sun Oct 24, 2021 12:29 am Well, good news. LSI card started running at x8 when moved from PCI_E3 to PCI_E5. Aquantia card runs at x4 in either PCI_E3 or PCI_E5. And they still both work at their top speed once the slots are inverted. Very weird bug.
Glad to hear that it was an easy fix. But yeah, this is exactly the sort of weirdness you can expect when playing around with the PCIe lane allocation.
danaht
Posts: 816
Joined: Sun Oct 18, 2015 11:28 am

Re: PC Build Thread - 2020... and beyond!!!

Post by danaht »

It's been 7 years since I last built a PC - my PC is probably considered a relic in PC "years". I have waited this long, so I am going to try to wait another two years. I am planning to build a PC when Intel's Meteor Lake processor is released in 2023. Meteor Lake will probably have reasonable graphics capabilities in its APU (integrated graphics), and I may even skip the dedicated video card this time around. My hope is that it will at least run most games at 1080p with "high effects" at 60 FPS. DDR5 RAM will potentially make the next gen APUs from AMD and Intel a lot more capable too. Just hoping my current PC can last that long.
Topic Author
Independent George
Posts: 1590
Joined: Wed Feb 17, 2016 11:13 am
Location: Chicago, IL, USA

Re: PC Build Thread - 2020... and beyond!!!

Post by Independent George »

Unless you're lucky enough to live near a Micro Center, Newegg Shuffle is the only consistent vendor to have any stock. Unfortunately, all of their stock is still at the inflated covid/crypto boom prices ($500+ for a 60 series card, $800+ for a 70 series, $1,200 for an 80 series, not including a possible "bundle" cost).

I am guardedly optimistic for Intel Arc GPUs in Q2 2022 - unlike AMD, Intel actually has the cash and capacity to make inroads into Nvidia's market share. I have a sneaking suspicion that slightly before it releases, there will suddenly be a glut of 70 series cards on the market. I never in a million years thought that I'd be jumping on the Intel train to try to break another company's shady monopolistic market dealings.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

Mentioned somewhere earlier in the thread, I will be moving my Win 10 M.2 and SATA drives to my new build. I'll keep my Linux / Win 10 dual boot SATA SSD drives on my current build (Ryzen 7 3700X).

I had purchased the Samsung 980 Pro M.2 2280 1 TB PCIe Gen 4.0 for my dual-boot Linux / Win 10 PC. The new Samsung M.2 drive has been installed and the Win 10 drives removed.

Booting into Win 10, I cloned the 1 TB SATA drive to the 980 M.2 drive using Macrium Software | Reflect Free Edition. Works as advertised, easy.

====================
For the Linux crowd:

Booting into Linux on my M.2 cloned drive resulted in a hard stop black screen. :shock: I then booted into my Linux SATA drive and was greeted with a not-so-friendly message that you can't clone a drive this way, because Linux doesn't play well with identical LVM (Logical Volume Manager) volume groups on 2 different disks.

I then removed the SSD and booted into Linux just fine. The boot time appears to be slightly faster, but not all that much.

My dual-boot Linux / Win 10 desktop PC is now up and running on the Ryzen 7 3700X build. I'm using the NVidia proprietary graphics drivers at 3840 x 2160 @ 60Hz.

=====================

For the Win 10 crowd:

In preparation for moving Win 10 to new hardware, I handled the licensing differently than last time. Starting with the recent Win 10 updates, Microsoft no longer uses product keys for authentication. Running PowerShell or terminal commands resulted in a blank output. No product key was listed.

Instead, I added an email account and ensured that my license was linked to my Microsoft account. See: Activate Windows, from Microsoft.

OK, what about the password associated with the email I used for login? It's not human friendly. There's an easy solution - enable Windows Hello with a PIN that you can remember. See: Learn about Windows Hello and set it up. Check the "Use letters and symbols" box if needed. Be sure you test the login before removing hardware.

Remember this is only for PC build and test. Once everything is working, I'll reconfigure my PC back to the way it was. At least that's the plan.

===================
Looking at the new case, it wasn't apparent from the user manual that the front case fans are on slotted rails. My GPU was longer than the case allowed, so I purchased a narrow case fan (15 mm) to accommodate the length. The stock fan is 25 mm deep.

I may not need this slim 15 mm case fan after all, as I can slide the stock fan higher on the rail and possibly avoid an interference concern. I'll wait until I'm done before making any strong conclusions.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

tortoise84 wrote: Sun Oct 24, 2021 9:14 am Actually the subtle cue is that the M.2_1 socket is listed under AMD Ryzen 5000 Series Processor, while the M.2_2 socket is listed under the AMD X570 chipset.
Ah yes, indeed.
tortoise84 wrote: Sun Oct 24, 2021 9:14 am The X570 chipset actually only has a PCIe 4.0 x4 uplink to the CPU, as shown in this diagram: https://www.guru3d.com/articles-pages/a ... iew,4.html
Thanks for that link.
That's still a lot of bandwidth at 7.877 GB/s, but yes, it has to be shared by switching the bandwidth between devices dynamically, or outright disabling devices when others are in use. So there could be a problem if you manage to saturate the Gen 4 x4 uplink, but I don't really see that happening unless you're using multiple Gen 4 x4 NVMe drives.
It's a lot of bandwidth indeed, but given that the chipset hosts 16 lanes' worth of devices while only having a 4-lane uplink to the CPU, it would be possible. As you say, just populating the M2_2 slot with a PCIe 4.0 x4 drive would fill it up.
At that point, the SSDs hooked up to the 6 SATA ports would start suffering.
The same goes for SSDs hooked up through USB->SATA adapters to the chipset's USB ports. I would not want to use USB-connected drives as part of a striped volume, though.
Glad to hear that it was an easy fix. But yeah, this is exactly the sort of weirdness you can expect when playing around with the PCIe lane allocation.
Thanks. It's definitely a software regression somewhere, probably in some Windows X99 chipset drivers. I had run the LSI SAS2 card just fine at x8 in the middle PCIe x16 slot before. With it moved to the bottom slot, there is very little space between it and the Corsair AX1200i PSU at the bottom of the case. All the case wires are in that tight spot between the LSI card and the PSU. I accidentally pulled the power LED wire and had to remove the LSI card, reconnect the case wires, and put the LSI card back in.

I also wanted to check the revision of my Corsair LPX 2666 RAM while I was at it, to see if it was the one on the Prime X570 Pro QVL, but the DIMMs are all underneath the huge NH-D14 cooler. Of course, the revision is printed on the side of the DIMM that's not visible. I tried to remove one of the cooler fans, but it was in the way of the USB 3.0 case connector on one side and the GPU on the other. I would have to remove the GPU to take out the Noctua fan, along with the PSU power connector going to it and 3 monitor cables. I gave up, at least for the time being. I guess if I do upgrade to Ryzen and the RAM isn't compatible, there are plenty of affordable faster DDR4 DIMMs on the market - no shortage there, and probably lower prices around Black Friday.
Last edited by madbrain on Sun Oct 24, 2021 10:54 pm, edited 1 time in total.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

Independent George wrote: Sun Oct 24, 2021 11:40 am Unless you're lucky enough to live near a Micro Center, Newegg Shuffle is the only consistent vendor to have any stock. Unfortunately, all of their stock is still at the inflated covid/crypto boom prices ($500+ for a 60 series card, $800+ for a 70 series, $1,200 for an 80 series, not including a possible "bundle" cost).
Sadly, Micro Center in Santa Clara closed 9 years ago. The closest one is now near LA, about 400 miles away. Definitely not practical. By 60 series, I'm assuming you mean 2060/3060. I wanted a 1660 series. But those are $500 at MicroCenter also. I don't need the ray-tracing features of the RTX.
I am guardedly optimistic for Intel Arc GPUs in Q2 2022 - unlike AMD, Intel actually has the cash and capacity to make inroads into Nvidia's market share. I have a sneaking suspicion that slightly before it releases, there will suddenly be a glut of 70 series cards on the market. I never in a million years thought that I'd be jumping on the Intel train to try break another company's shady monopolistic market dealings.
I simply wouldn't consider an Intel GPU, built-in or standalone, like the Arc they recently announced. The HTPC in the master bedroom upstairs runs a Skylake 6400. Two months ago, the Asus GT1030 card in it just died. I switched the machine to use the built-in Intel 530 GPU. It couldn't keep up with playing 4K movies. Part of the problem is that that generation lacks HDR decoding in hardware. But there were also tons of bugs, and regressions when updating. Intel added HDR decoding in their 8th Gen CPUs. Unfortunately, the Z170 motherboard can only handle 6th and 7th Gen CPUs. I bought a GTX 1050Ti on eBay for a whopping $165, and things went back to normal. Not sure if I made the right call. The machine was built in July 2016, and was also getting to the age where an upgrade might be in order, but we have been using it less and less, and using the streaming sticks more. I considered removing that HTPC altogether, but decided against it.

In June, I updated my mother's >10 year old computer in France to an AMD 5650G APU. I was very surprised that the APU also couldn't keep up with 4K movies in Plex. It should have handled 4K HDR in hardware, but it just couldn't keep up. She doesn't have a 4K monitor or 4K TV, so it's not a huge loss. Perhaps the problem could have been solved, if I spent more time trying to fix it. And perhaps 4K HDR movies (transcoded to HD) would have worked in other applications. I didn't bother. I still think it's pretty damning. Previously, the machine had an nVidia GT210 discrete GPU. I bought the AMD APU because of the GPU shortage, of course. I would have put another more recent nVidia if there was one to be had that didn't double the price of the build. But of course, there wasn't.

I just don't think Intel has figured out how to write proper GPU drivers yet, unfortunately. Not certain AMD has either, at least for their APUs. The experience is not nearly as seamless as with nVidia cards and their drivers, sadly. The hardware can be great, but without the proper software to go with it, it's not useful.
brad.clarkston
Posts: 1726
Joined: Fri Jan 03, 2014 7:31 pm
Location: Kansas City, MO

Re: PC Build Thread - 2020... and beyond!!!

Post by brad.clarkston »

madbrain wrote: Sun Oct 24, 2021 10:28 pm
Independent George wrote: Sun Oct 24, 2021 11:40 am Unless you're lucky enough to live near a Micro Center, Newegg Shuffle is the only consistent vendor to have any stock. Unfortunately, all of their stock is still at the inflated covid/crypto boom prices ($500+ for a 60 series card, $800+ for a 70 series, $1,200 for an 80 series, not including a possible "bundle" cost).
Sadly, Micro Center in Santa Clara closed 9 years ago. The closest one is now near LA, about 400 miles away. Definitely not practical. By 60 series, I'm assuming you mean 2060/3060. I wanted a 1660 series. But those are $500 at MicroCenter also. I don't need the ray-tracing features of the RTX.
I am guardedly optimistic for Intel Arc GPUs in Q2 2022 - unlike AMD, Intel actually has the cash and capacity to make inroads into Nvidia's market share. I have a sneaking suspicion that slightly before it releases, there will suddenly be a glut of 70 series cards on the market. I never in a million years thought that I'd be jumping on the Intel train to try break another company's shady monopolistic market dealings.
I simply wouldn't consider an Intel GPU, built-in or standalone, like the Arc they recently announced. The HTPC in the master bedroom upstairs runs a Skylake 6400. Two months ago, the Asus GT1030 card in it just died. I switched the machine to use the built-in Intel 530 GPU. It couldn't keep up with playing 4K movies. Part of the problem is that that generation lacks HDR decoding in hardware. But there were also tons of bugs, and regressions when updating. Intel added HDR decoding in their 8th Gen CPUs. Unfortunately, the Z170 motherboard can only handle 6th and 7th Gen CPUs. I bought a GTX 1050Ti on eBay for a whopping $165, and things went back to normal. Not sure if I made the right call. The machine was built in July 2016, and was also getting to the age where an upgrade might be in order, but we have been using it less and less, and using the streaming sticks more. I considered removing that HTPC altogether, but decided against it.

In June, I updated my mother's >10 year old computer in France to an AMD 5650G APU. I was very surprised that the APU also couldn't keep up with 4K movies in Plex. It should have handled 4K HDR in hardware, but it just couldn't keep up. She doesn't have a 4K monitor or 4K TV, so it's not a huge loss. Perhaps the problem could have been solved, if I spent more time trying to fix it. And perhaps 4K HDR movies (transcoded to HD) would have worked in other applications. I didn't bother. I still think it's pretty damning. Previously, the machine had an nVidia GT210 discrete GPU. I bought the AMD APU because of the GPU shortage, of course. I would have put another more recent nVidia if there was one to be had that didn't double the price of the build. But of course, there wasn't.

I just don't think Intel has figured out how to write proper GPU drivers yet, unfortunately. Not certain AMD has either, at least for their APUs. The experience is not nearly as seamless as with nVidia cards and their drivers, sadly. The hardware can be great, but without the proper software to go with it, it's not useful.
That AMD 5650G result is a bit less surprising when you understand that only half the chip is used for CPU things and the other half for GPU things.
It is nowhere near as powerful as a 5650X that's just grinding away as a CPU, even if you add a dedicated GPU to the 5650G later.

You had the right idea: transcode them to HD and it will work just fine; the CPU can keep up with that.
70% AVGE | 20% FXNAX | 10% T-Bill/Muni
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

madbrain wrote: Sun Oct 24, 2021 9:41 pm I also wanted to check the revision of my Corsair LPX 2666 RAM
There's a program called Thaiphoon Burner which can tell you lots of info about your RAM modules:
[link removed by admin LadyGeek, see below]
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

brad.clarkston wrote: Sun Oct 24, 2021 10:50 pm That AMD 5650G result is a bit less surprising when you understand that only half the chip is used for CPU things and the other half for GPU things.
It is nowhere near as powerful as a 5650X that's just grinding away as a CPU, even if you add a dedicated GPU to the 5650G later.

You had the right idea: transcode them to HD and it will work just fine; the CPU can keep up with that.
There is no 5650X. Presumably you meant a 5950X ?

Anyway, with my mother's computer, when I played the movie with Plex, the movie was stored on a remote server, also in France. The local Internet connection was fiber. The movie was stored as 4K HDR on the server, and Plex was playing it locally on the Windows box. The Plex client for Windows always transcodes HDR to SDR; it doesn't support HDR at all, even if you have an HDR display. So, it would have had to decode the 4K HDR and display it as HD SDR in real-time. I would have expected an APU released in 2021, with hardware support for decoding 4K HDR, to be able to handle that. It just couldn't. The GPU usage was maxed out in task manager as I recall, and the CPU was close to maxed out, but it was clear that the GPU was the bottleneck. That just shouldn't have been the case.

For reference, as I typed this, I just used my 5820k CPU, released in 2014, with my GTX 960 GPU, released in 2015. I launched Plex on my 3rd monitor, which is HD and SDR, just like my mother's PC monitor, and played an MKV rip of the 4K Blu-ray of my favorite movie, 2001: A Space Odyssey, from the Plex server on my LAN. The file was stored in 4K HDR - not transcoded. Exact same use case as my mother's, except the Plex server was on the LAN, not the WAN. My CPU was at 11%. GPU was at 28%. This is with the browser still open, also.

I don't think AMD has any excuse for releasing an APU in 2021 that cannot play 4K HDR Blu-ray content. Hopefully, it's just a software issue that will be resolved some day, and not a hardware limitation. If it's a hardware limitation, then AMD really screwed up. Either way, one cannot just use hardware without software. So, I stand by what I said. No AMD APU for me. Only nVidia. This was my first experience with an AMD APU.

Note that the i5-6400 in my HTPC upstairs also chokes in this case when using the iGPU. But it was released in 2015, and 4K HDR Blu-rays were first released in 2016. The 2015 Intel iGPU doesn't support HDR decoding in hardware at all, so the CPU is left doing all the work, and an i5-6400 is just not powerful enough. However, combined with the nVidia GTX 1050Ti GPU, it's just fine. I just went upstairs to check this, and with Plex, the CPU is at 11% and the GPU at 55%, playing the same movie, but onto a 4K display, still in SDR. With JRiver, I can play the ISO of the same Blu-ray from my NAS. JRiver switched the TV automatically to HDR mode using the nVidia card. CPU is at 22% and GPU at 24%, in the "Red October HQ, best performance" mode. The GTX 1050Ti is a 2016 GPU. The same machine handled 4K HDR just fine with a cheaper GT 1030, but the 1030 died, unfortunately.

The person who sold me the AMD 5650G in France told me the GPU performance was comparable to the nVidia GT 1030. That might be so in games. Sadly, it just isn't so for playing movies. And note that my rant isn't so much an endorsement of nVidia as it is an indictment of AMD for their APU.
I still want nVidia to add bitstreaming of DSD audio over HDMI, something they have promised to never, ever do. They have recently removed 3D MVC decoding from their drivers, so a Raspberry Pi 3 is now used for playing 3D Blu-rays in the guest room that still has an old 3D TV. No PC or nVidia hardware there.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

tortoise84 wrote: Sun Oct 24, 2021 11:05 pm
madbrain wrote: Sun Oct 24, 2021 9:41 pm I also wanted to check the revision of my Corsair LPX 2666 RAM
There's a program called Thaiphoon Burner which can tell you lots of info about your RAM modules:
[link removed by admin LadyGeek, see below]
Thank you ! I had never heard of it.

The information I was looking for was the version number printed on the sticker, and unfortunately, it's not stored anywhere in the DIMM/SPD information. However, the program did tell me that the DRAM components are made by Hynix.
And the DRAM part number is H5AN4G8NMFR-TFC.
According to https://forum.corsair.com/forums/topic/ ... 1-532-539/, version 5.39 is Hynix 8Gbit MFR.
So, it sounds like my DRAM chips match what's in version 5.39. Version 5.39 is the one on the Asus QVL.
I can't be certain there aren't other versions that are Hynix MFR, of course. I will only know that for sure if I take the modules out.
And of course, versions not on the QVL may still work, they just haven't been tested. So, it's looking good for my RAM compatibility with the Asus Prime X570 Pro. Thank you for pointing me to that program.
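For the curious, what Thaiphoon Burner is decoding there is standard JEDEC data: in DDR4 SPD, the DRAM manufacturer is stored as a JEP-106 bank/ID pair (bytes 350-351 per the DDR4 SPD annex). A toy decoder with only a handful of IDs filled in, just to show the mechanics - the real JEP-106 table has hundreds of entries:

```python
# Toy JEP-106 manufacturer decoder for DDR4 SPD data.
# The bank byte carries the number of continuation codes in its low 7 bits
# (bit 7 is odd parity); the second byte is the ID within that bank.
# Only a few example IDs are listed here.

JEP106 = {
    (1, 0xAD): "SK Hynix",
    (1, 0xCE): "Samsung",
    (1, 0x2C): "Micron",
}

def decode_manufacturer(bank_byte: int, id_byte: int) -> str:
    bank = (bank_byte & 0x7F) + 1  # continuation count + 1 = bank number
    return JEP106.get((bank, id_byte), f"unknown (bank {bank}, id {id_byte:#04x})")

# An SPD dump from a Hynix-based DIMM carries 0x80, 0xAD in those bytes:
print(decode_manufacturer(0x80, 0xAD))  # SK Hynix
```

The sticker revision, as noted above, isn't in the SPD at all, so a tool like this can only confirm the DRAM vendor, not the Corsair version number.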
lazydavid
Posts: 5126
Joined: Wed Apr 06, 2016 1:37 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by lazydavid »

madbrain wrote: Sun Oct 24, 2021 9:41 pm I also wanted to check the revision of my Corsair LPX 2666 RAM while I was it, to see if it was the one on the Prime X570 Pro QVL, but the DIMMs are all underneath the huge NH-D14 cooler. Of course, the revision is printed on the side of the DIMM that's not visible. I tried to remove one of the cooler fans, but it was in the way of the USB 3.0 case connector on one side, and GPU on the other side. Would have to remove the GPU to take out the Noctua fan. And the PSU power connector going to it, and 3 monitor cables. I gave up, at least for the time being.
Can't Thaiphoon Burner give you the information you need, no disassembly required?

Edit: should have read further. :mrgreen:
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

My new PC build is up and running (Ryzen 7 5800X CPU). It booted on the first try, no errors. That's a first for me. :)

The immediate first check is to visually confirm that all the fans are running (yes). Booting into BIOS showed all hardware present, with fans and temperatures as expected. Somewhat expected was the RAM running in fallback mode (2133 MHz). OK, I'll work on that later.

Booting into Windows looked good except for my one continual annoyance - the license would not activate. :annoyed I tried a bunch of different things going through the activation troubleshooter (I changed my hardware...), but Microsoft did not find my PC. It found my laptop, but that's about it. I might have to try phone activation, which I've done before.

In hindsight, I should have gone with the full-size Meshify 2 case, given the larger GPU (the only one I could get via the Newegg Shuffle). The GPU I have exceeds the maximum length for this case when using the front case fan.

As noted earlier, the front case fan was on a slotted rail which allowed me to slide it out of the way of the GPU. I was able to do that, but it was still a bit tight due to cable routing. It was also a tight fit routing the PCIe power cable to the GPU. There's very little room and I had to route the cable differently than I wanted. Hence, my suggestion to go with a larger case.

Additionally, the stock front case fan has a 3-pin connector. The Noctua 15 mm slim replacement fan has a 4-pin connector, which I preferred. I was impressed with the quality of the Noctua fan and decided to go with it.

OK, so how do you remove the front panel to access the fan? There's nothing in the manual that shows how to do it. YouTube to the rescue - Fractal Design Meshify C Proper Front Panel Removal (2 minutes in). You'll need to pull down slightly on the cover after the bottom tab detaches in order to release the tabs at the top.

I have a lot more to do, but the hardware looks good. The rest is cable cleanup, BIOS, and software.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
Topic Author
Independent George
Posts: 1590
Joined: Wed Feb 17, 2016 11:13 am
Location: Chicago, IL, USA

Re: PC Build Thread - 2020... and beyond!!!

Post by Independent George »

Can you mount the front intake fans outside of the case (but inside the mesh panel)? That's a pretty common setup with removable mesh front panels.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

LadyGeek wrote: Mon Oct 25, 2021 8:31 pm Additionally, the stock front case fan is a 3-pin connector. The Noctua 15 mm slim replacement fan has a 4-pin connector - which I preferred. I was impressed with the quality of Noctua fan and decided to go with it.
Yeah, it's really difficult to go wrong with anything Noctua. I wish they produced 230mm case fans. The larger the fan, the lower the RPM you need, and the quieter your PC. My large boxes (PC and NAS) with nearly maxed out PCIe slots and drive bays are actually extremely quiet for that reason.
But I have had several Cooler Master 230mm case fans fail over the years in my two Cooler Master HAF cases, HAF 932 Advanced and HAF XM. The fans didn't actually stop working, they just started getting louder as time passed. Fortunately, the 200mm Noctua fans have worked as replacements, and are very quiet.
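The "bigger fan, lower RPM" point follows from the fan affinity laws: for geometrically similar fans, airflow scales roughly with RPM times diameter cubed. Real fans differ in blade design, so treat this sketch as first-order intuition only, with made-up example numbers:

```python
# Rough fan affinity-law sketch: airflow ~ rpm * d^3 for geometrically
# similar fans. Real fans differ in blade geometry, so this is a
# first-order intuition, not a datasheet figure.

def rpm_for_same_airflow(rpm_small: float, d_small: float, d_big: float) -> float:
    """RPM a larger fan needs to move the same air as a smaller one."""
    return rpm_small * (d_small / d_big) ** 3

# A 120 mm fan at 1500 RPM vs a 230 mm fan moving the same air:
big = rpm_for_same_airflow(1500, 120, 230)
print(f"The 230 mm fan needs only ~{big:.0f} RPM")
```

Since fan noise rises steeply with RPM, that cubic scaling is why the big 200-230 mm fans can spin so slowly and quietly while still moving plenty of air.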

It's a real shame that case manufacturers are not producing this kind of case anymore. New cases, many of which have zero front drive bays, 3.5in or 5.25in, are more about looks. Some fill them with LEDs. Yours fills them with fans. I understand most people don't use optical drives anymore, but things like SATA or NVMe hotswap bays, or space to install hubs for the next generation of connector (USB 4, Thunderbolt 5, etc), or fan controllers, or ... I really don't get why you want to build a custom PC and then never be able to add any of those things later on. Even my HTPC cases have multiple bays. They were produced before USB 3.0 existed. I added 3.5 in hubs/card readers that expose USB 3.0 ports.

To be fair, my 932 Advanced has 77% more volume than your Meshify C case. The 932 Advanced could in theory fit 36 SATA SSDs in the six front 5.25in drive bays, and at least another 10 internally in the five 3.5in SATA hotswap bays. I bet I could find adapters to fit more. Or it could fit nine 3.5in HDDs in the six front 5.25in spaces, and another five HDDs inside: a respectable 14 HDDs, all hotswap. Of course, I would also need a Threadripper Pro motherboard with 4 LSI 16-drive SAS cards.

Cooling that many drives would be a nightmare, though. To get back to where I started - fans - small case fans behind SATA hotswap bays tend to be very noisy, because they are very small (some as small as 30 mm; think laptop CPU fans) and run at very high RPM as a result. You would need openings and large case fans on each side of the case to do better with less noise. I don't think any case has ever been designed that way. Maybe I'll be able to 3D print my ideal case design some day to improve on the HAF series. Just having custom side panels to install a larger number of intake and exhaust fans might be enough.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

Independent George wrote: Mon Oct 25, 2021 8:53 pm Can you mount the front intake fans outside of the case (but inside the mesh panel)? That's a pretty common setup with removal mesh front panels.
I can mount the intake fans on the outside of the case (other side of the mounting rails), but there's no room to reinstall the mesh panel. If you say it's possible, then I'm guessing this Meshify C "Compact" case is exactly that. The smaller size limits what you can do.
madbrain wrote: Mon Oct 25, 2021 9:53 pm Yeah, it's really difficult to go wrong with anything Noctua.
I'm also impressed with the quality of the cabling and connectors. The supplied documentation is excellent. The CPU cooling fan came with instruction booklets for each socket type.

BTW, my G.Skill Trident Z RAM fits just fine with the Noctua NH-U12A cooler. Due to the cooler's offset design, the RAM sits in front of, and slightly higher than, the fan. It doesn't interfere with the fan blades. If the RAM were higher, I could mount the fan a few fins higher on the radiator.

Oh, and it's not possible to mount this fan in any other orientation than front-to-back air flow. I won't be repeating the mistake I made with my first build. :)

I never thought to check if the motherboard can accommodate two CPU fans. Yes, my Asus TUF Gaming X570 motherboard has CPU and CPU_opt fan connectors. If the motherboard didn't have that 2nd CPU fan connector, Noctua supplies a Y-adapter cable that will combine the two fans into a single connector.

Both fans are recognized and running at slightly different speeds in my BIOS. The difference is insignificant, a few RPM. I'm wondering if this is due to measurement error or if they have an algorithm to calculate each fan speed separately.
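On that few-RPM difference: PC fans report speed through a tach wire that conventionally pulses twice per revolution, and the monitoring chip counts pulses (or times pulse widths) over a short window, so the readings are inherently quantized. A sketch of the arithmetic, assuming the usual 2 pulses per revolution:

```python
# Fan RPM from tach pulses, assuming the conventional 2 pulses per revolution.
# Counting whole pulses over a fixed window quantizes the reading, which is
# one plausible reason two identical fans report slightly different speeds.

PULSES_PER_REV = 2

def rpm_from_pulse_count(pulses: int, window_s: float) -> float:
    """RPM implied by counting `pulses` tach pulses over `window_s` seconds."""
    return pulses / PULSES_PER_REV / window_s * 60

# Over a 1-second window, one miscounted pulse shifts the reading by 30 RPM:
print(rpm_from_pulse_count(50, 1.0))  # 1500.0
print(rpm_from_pulse_count(51, 1.0))  # 1530.0
```

Monitoring chips that time the pulse period instead of counting pulses get much finer resolution, which would explain readings that differ by only a few RPM; either way, a small spread between two physically identical fans is normal.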
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
User avatar
Peculiar_Investor
Site Admin
Posts: 2432
Joined: Thu Oct 20, 2011 12:23 am
Location: Calgary, AB 🇨🇦
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by Peculiar_Investor »

LadyGeek wrote: Mon Oct 25, 2021 8:31 pm Somewhat expected was the RAM running in fallback mode (2133 MHz). OK, I'll work on that later.
[Motherboard]How to optimize the Memory performance by setting XMP or DOCP in BIOS? | Official Support | ASUS Global
Normal people… believe that if it ain’t broke, don’t fix it. Engineers believe that if it ain’t broke, it doesn’t have enough features yet. – Scott Adams
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

Thanks. Since I have a similar motherboard to my previous build, I'll also go by my notes from last year.

First, I'll activate Win 10 by phone - what I did last year. The post is here. I'll follow the same path and start running stress tests (Prime95, GPU frame rates, etc.) once I have everything stable.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

madbrain wrote: Mon Oct 25, 2021 9:53 pm It's a real shame that case manufacturers are not producing this kind of case anymore. New cases, many of which have zero front drive bays, 3.5in or 5.25in, are more about looks. Some fill them with LEDs. Yours fills them with fans. I understand most people don't use optical drives anymore, but things like SATA or NVMe hotswap bays, or space to install hubs for the next generation of connector (USB 4, Thunderbolt 5, etc), or fan controllers, or ... I really don't get why you want to build a custom PC and then never be able to add any of those things later on. Even my HTPC cases have multiple bays. They were produced before USB 3.0 existed. I added 3.5 in hubs/card readers that expose USB 3.0 ports.

To be fair, my 932 Advanced has 77% more volume than your Meshify C case. The 932 Advanced could in theory fit 36 SATA SSDs in the six front 5.25" drive bays, and at least another 10 internally in the five 3.5" SATA hotswap bays. I bet I could find adapters to fit more. Or it could fit nine 3.5" HDDs in the six front 5.25" bays, and another five HDDs inside - a respectable 14 HDDs, all hotswap. Of course, I would also need a Threadripper Pro motherboard with 4 LSI 16-drive SAS cards.

Cooling that many drives would be a nightmare, though. To get back to where I started - fans - small case fans behind SATA hotswap bays tend to be very noisy, because they are very small (some as small as 30 mm, think laptop CPU fans) and therefore run at very high RPM. You would need openings and large case fans on each side of the case to do better with less noise. I don't think any case has ever been designed that way. Maybe I'll be able to 3D print my ideal case design some day to improve on the HAF series. Just having custom side panels to fit a larger number of intake and exhaust fans might be enough.
Modern cases are designed for CPU and GPU water cooling since the power and heat of these components keep going up, so they have to fit 240/280/360/420mm radiators. (I'm actually working on building a water cooled 3080 Ti right now :D). But anyway, if you want storage, there are some modern cases like the Fractal Meshify 2 XL:
https://www.fractal-design.com/products ... ass/black/

It can fit 18 x 3.5" and 5 x 2.5" drives, and they can be cooled easily by up to 4 x 120mm fans in the front.
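The earlier point about tiny hotswap-bay fans needing very high RPM can be put in rough numbers with the fan affinity laws, where airflow scales roughly with RPM times diameter cubed. This is a toy model; the figures are illustrative, not measured fan specs:

```python
# Toy model of the fan affinity laws: airflow scales roughly with
# rpm * diameter^3. All figures here are illustrative, not measured specs.

def relative_flow(rpm: float, diameter_mm: float) -> float:
    """Airflow in arbitrary units, proportional to rpm * d^3."""
    return rpm * (diameter_mm / 100.0) ** 3

# Airflow of a quiet 140 mm case fan at 1000 rpm:
target = relative_flow(1000, 140)

# rpm a 30 mm hotswap-bay fan would need to move the same air:
rpm_needed = 1000 * (140 / 30) ** 3
print(f"a 30 mm fan would need ~{rpm_needed:,.0f} rpm")  # ~101,630 rpm
```

In practice small bay fans top out at a few thousand RPM, so they are both loud and still move far less air than a large fan would.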
othermike27
Posts: 182
Joined: Mon Jun 13, 2016 7:14 am
Location: Chicago Metro

Re: PC Build Thread - 2020... and beyond!!!

Post by othermike27 »

madbrain wrote: Mon Oct 25, 2021 9:53 pm
Yeah, it's really difficult to go wrong with anything Noctua. I wish they produced 230mm case fans. The larger the fan, the lower the RPM you need, and the quieter your PC. My large boxes (PC and NAS) with nearly maxed out PCIe slots and drive bays are actually extremely quiet for that reason.
But I have had several Cooler Master 230mm case fans fail over the years in my two Cooler Master HAF cases, HAF 932 Advanced and HAF XM. The fans didn't actually stop working, they just started getting louder as time passed. Fortunately, the 200mm Noctua fans have worked as replacements, and are very quiet.

It's a real shame that case manufacturers are not producing this kind of case anymore. New cases, many of which have zero front drive bays, 3.5in or 5.25in, are more about looks. Some fill them with LEDs. Yours fills them with fans. I understand most people don't use optical drives anymore, but things like SATA or NVMe hotswap bays, or space to install hubs for the next generation of connector (USB 4, Thunderbolt 5, etc), or fan controllers, or ... I really don't get why you want to build a custom PC and then never be able to add any of those things later on.
Agreed. But you can still find the CoolerMaster HAF XB case (https://www.coolermaster.com/us/en-us/c ... af-xb-evo/) in stock: https://www.newegg.com/black-cooler-mas ... 6811119265 This is the case I use for my primary desktop PC. The form factor is a bit unusual - not your typical tower, more like a squared-off microwave oven. But there are some features that I like. The inside is divided into two compartments by a removable motherboard tray. "Upstairs" houses the motherboard, all peripheral cards and your air or liquid cooling setup. "Downstairs" is for drives: four 5.25" slots and a cage with four slots for 2.5" HDD/SSD. Since the motherboard sits on a solid steel tray, you can do your entire build outside the case on a firm surface and run tests or burn-ins before installing everything. The HAF XB also fits nicely on a shelf in a 19" enclosure if you are into the server room aesthetic (not a case to show off pretty blinkenlights though).
Topic Author
Independent George
Posts: 1590
Joined: Wed Feb 17, 2016 11:13 am
Location: Chicago, IL, USA

Re: PC Build Thread - 2020... and beyond!!!

Post by Independent George »

LadyGeek wrote: Tue Oct 26, 2021 6:47 am
madbrain wrote: Mon Oct 25, 2021 9:53 pm Yeah, it's really difficult to go wrong with anything Noctua.
I'm also impressed with the quality of the cabling and connectors. The supplied documentation is excellent. The CPU cooling fan came with instruction booklets for each socket type.
Another truly great thing about Noctua is that they will supply free adapter brackets for different CPU mounts long after they're obsolete. Back when CPUs were in short supply in late 2020, they confirmed that they had LGA-1366 adapter brackets available which would let me mount their cooler on my old CPU. Fortunately, it became moot when I snagged my Ryzen, but it's great to know you have the option of keeping the same CPU cooler for your next [X] builds.
User avatar
Tyler9000
Posts: 740
Joined: Fri Aug 21, 2015 11:57 am

Re: PC Build Thread - 2020... and beyond!!!

Post by Tyler9000 »

tortoise84 wrote: Tue Oct 26, 2021 7:20 am But anyway, if you want storage, there are some modern cases like the Fractal Meshify 2 XL:
https://www.fractal-design.com/products ... ass/black/

It can fit 18 x 3.5" and 5 x 2.5" drives, and they can be cooled easily by up to 4 x 120mm fans in the front.
I also recommend the Define 7 (or last-gen Define R6) if you're into storage and other devices. On top of the many interior drive brackets, it includes an exterior 5.25" drive bay hidden behind a front door.
lazydavid
Posts: 5126
Joined: Wed Apr 06, 2016 1:37 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by lazydavid »

Independent George wrote: Tue Oct 26, 2021 10:51 am Another truly great thing about Noctua is that they will supply free adapter brackets for different CPU mounts long after they're obsolete. Back when CPUs were in short supply in late 2020, they confirmed that they had LGA-1366 adapter brackets available which would let me mount their cooler on my old CPU. Fortunately, it became moot when I snagged my Ryzen, but it's great to know you have the option of keeping the same CPU cooler for your next [X] builds.
It wasn't free, but I was able to buy an AM4 bracket for the Corsair AiO water cooler (H150 I think) that I used with my heavily overclocked i7-950 (LGA 1366) back in 2009. I didn't ultimately use it (the included Wraith Spire cooler turned out to be good enough for my 3600X), but it was nice to have the option.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

tortoise84 wrote: Tue Oct 26, 2021 7:20 am Modern cases are designed for CPU and GPU water cooling since the power and heat of these components keep going up, so they have to fit 240/280/360/420mm radiators. (I'm actually working on building a water cooled 3080 Ti right now :D). But anyway, if you want storage, there are some modern cases like the Fractal Meshify 2 XL:
https://www.fractal-design.com/products ... ass/black/

It can fit 18 x 3.5" and 5 x 2.5" drives, and they can be cooled easily by up to 4 x 120mm fans in the front.
Thanks. As far as I can tell, none of these drive bays are hotswap, and all require screws. And none of them are in the front. In my NAS, with the HAF-XM case, I have 6 drives internally in the removable trays. The case came with two 3.5" SATA hotswap docks in the 5.25" bays. These docks are not removable. I use two of them for permanent drives. That's enough for the 8 drives of my 8x14TB RAID-Z2 ZFS array.
One more 5.25" bay is occupied by a 2-drive SATA dock that can accept a 2.5" and a 3.5" drive simultaneously. The OS is on an SSD in this dock.
Then, two more 5.25" bays are occupied by a 3-in-2 dock that accepts three 3.5" hard drives. That one has an extremely loud fan, but the fan only turns on when you insert drives. I use this dock only to back up the NAS itself. Actually, I need more storage to back up the NAS.
RAIDZ2 ZFS isn't a backup. I sold 6x10TB drives earlier this year that I was using for that purpose, expecting drive prices to go down, which was not a good bet to make in 2021.

I think front hotswap trays are pretty useful. Case design shouldn't preclude front bays for those who would rather have drive bays than fans there. The fans are removable in many modern cases, I think, but you still can't use the space for drive bays. Inflexible design choices.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

othermike27 wrote: Tue Oct 26, 2021 8:13 am Agreed. But you can still find the CoolerMaster HAF XB case (https://www.coolermaster.com/us/en-us/c ... af-xb-evo/) in stock: https://www.newegg.com/black-cooler-mas ... 6811119265 This is the case I use for my primary desktop PC. The form factor is a bit unusual - not your typical tower, more like a squared-off microwave oven. But there are some features that I like. The inside is divided into two compartments by a removable motherboard tray. "Upstairs" houses the motherboard, all peripheral cards and your air or liquid cooling setup. "Downstairs" is for drives: four 5.25" slots and a cage with four slots for 2.5" HDD/SSD. Since the motherboard sits on a solid steel tray, you can do your entire build outside the case on a firm surface and run tests or burn-ins before installing everything. The HAF XB also fits nicely on a shelf in a 19" enclosure if you are into the server room aesthetic (not a case to show off pretty blinkenlights though).
Yeah, I had heard of this case. Didn't realize it was still available. I'm more interested in the large towers in the HAF series.
The HAF XB might be interesting as an HTPC case, but a bit too boxy (height).
My HTPCs use a DH101 case (2007 vintage) and a Silverstone LC20B-M (I think 2008). Both have built-in IR receivers that hook up to the PSU and allow the PC to be turned on from cold with an IR remote, even after a power loss. You can also use the remote in audio/video programs such as PowerDVD and JRiver Media Center. I have programmed the IR codes into a universal ARRX18G remote. I have 6 of those in the house (2 are just backups), which I program with RemoteMaster to control all my components in 4 different rooms. Adding IR receivers to TV sticks like the Amazon Fire Stick 4K and Chromecast with Google TV has been challenging, but it could be done, and was, doubling the price of the sticks in the process, all so that just one remote per room controls everything.
I also have 3 of these:
https://www.amazon.com/gp/product/B076H ... UTF8&psc=1
Not the most reliable units, unfortunately, especially in the master bedroom when operating from the bed, 27 ft from the TV on the opposite wall. That distance makes an 82" screen look small, too. Couldn't afford the 98" LCDs at $60k apiece.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

tortoise84 wrote: Sun Oct 24, 2021 11:05 pm
madbrain wrote: Sun Oct 24, 2021 9:41 pm I also wanted to check the revision of my Corsair LPX 2666 RAM
There's a program called Thaiphoon Burner which can tell you lots of info about your RAM modules:
[link removed by admin LadyGeek, see below]
Sorry, but I had to remove the link. I used Thaiphoon Burner last year, but it wasn't mentioned in this thread. The website is flagged by Malwarebytes for a Trojan (Type: Outbound connection). Google suggests this may be a false positive, but I'm not taking any chances when many of our readers are not tech savvy.

Google also shows many complaints about Malwarebytes (and Firefox) blocking the site. That doesn't change things. I still need to protect our readers.

Here's what I did:
- Use Linux, not Windows.
- Google for the website. It's easy to find. Download the program, which is a .zip file.
- Clear the browser cache (Linux).
- Transfer the file to Windows.
- Scan the .zip file with Malwarebytes (or your favorite anti-malware program)
- Extract the .zip file. Scan it again, just to be sure.
- Run the .exe

I'll report on my new build shortly, but malware concerns made this a priority.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

My Ryzen 5800X build seems stable, but it's only been a day.

- I activated Windows by phone. Details are in this post. Slow, but it works.
- The change also broke the activation for Office 2016 (what I have). Fortunately, I was able to activate within Office. Easy.

- Update to the latest BIOS
To those with Asus motherboards - The BIOS EZ Flash utility does not work with NTFS filesystems. If you download a BIOS update to an NTFS formatted drive, you'll get an error reading the file (not a UEFI BIOS). Copy the file to a USB drive that's formatted as FAT32.

- Modify BIOS to recognize the new RAM profile. [Motherboard]How to optimize the Memory performance by setting XMP or DOCP in BIOS? AMD calls it D.O.C.P., Intel calls it XMP. Works as advertised.

- Updated AMD and motherboard / chipset drivers for Win 10. This includes a reinstallation of the AMD Win 10/11 patch mentioned in Peculiar_Investor's post here.

- Ran Prime95 and saw my CPU temperature go to 84 deg C. I got concerned and stopped after a few minutes. Nothing crashed, I just didn't like the temperature. Temperature is monitored with Core Temp.

GPU tools -
- MSI Afterburner - temperature monitoring and overclocking. I'm not touching anything in my GPU - I'll never find a replacement if anything goes south. :) I'm using the tool as a temperature monitor.
- MSI Kombustor - stress test and benchmark. Awesome at 3840 x 2160 8-). The GPU was running at 69 deg C.

Folding@home - One of my main applications that fully loads both the CPU and GPU. The CPU and GPU each had a work unit to chew on. The test ran for a few hours until the work units were complete.

Both GPU & CPU running: 85 deg C (CPU) 70 deg C (GPU)
GPU only: 70 deg C

Nothing running (just Win 10) "idle" temperatures: 29 deg C (CPU) 41 deg C (GPU)

I'm not at all happy with the noise of those 3 GPU fans (RTX 3060 Ti). When the GPU is going full bore, it's much louder than my Noctua fans. It's also louder than my RTX 2060 on my 3700x build. I may run folding@home on my Linux PC due to the fan noise.

As for the CPU temperature, running at 80 - 90 deg C is apparently "normal" for the 5800X. I see lots of complaints about this. Here's one example: Re: Ryzen 9 5950x running very hot 74C, the image showing AMD guidance is here: CPU Thermal Recommendations.

There's not much I can do at this point. I may play with BIOS or Windows power management, but not sure how much that will help.

Update 10/28/2021: Fixed typo, prior build is Ryzen 7 3700X.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
Topic Author
Independent George
Posts: 1590
Joined: Wed Feb 17, 2016 11:13 am
Location: Chicago, IL, USA

Re: PC Build Thread - 2020... and beyond!!!

Post by Independent George »

madbrain wrote: Tue Oct 26, 2021 2:34 pm
tortoise84 wrote: Tue Oct 26, 2021 7:20 am Modern cases are designed for CPU and GPU water cooling since the power and heat of these components keep going up, so they have to fit 240/280/360/420mm radiators. (I'm actually working on building a water cooled 3080 Ti right now :D). But anyway, if you want storage, there are some modern cases like the Fractal Meshify 2 XL:
https://www.fractal-design.com/products ... ass/black/

It can fit 18 x 3.5" and 5 x 2.5" drives, and they can be cooled easily by up to 4 x 120mm fans in the front.
Thanks. As far as I can tell, none of these drive bays are hotswap, and all require screws. And none of them are in the front. In my NAS, with the HAF-XM case, I have 6 drives internally in the removable trays. The case came with two 3.5" SATA hotswap docks in the 5.25" bays. These docks are not removable. I use two of them for permanent drives. That's enough for the 8 drives of my 8x14TB RAID-Z2 ZFS array.
One more 5.25" bay is occupied by a 2-drive SATA dock that can accept a 2.5" and a 3.5" drive simultaneously. The OS is on an SSD in this dock.
Then, two more 5.25" bays are occupied by a 3-in-2 dock that accepts three 3.5" hard drives. That one has an extremely loud fan, but the fan only turns on when you insert drives. I use this dock only to back up the NAS itself. Actually, I need more storage to back up the NAS.
RAIDZ2 ZFS isn't a backup. I sold 6x10TB drives earlier this year that I was using for that purpose, expecting drive prices to go down, which was not a good bet to make in 2021.

I think front hotswap trays are pretty useful. Case design shouldn't preclude front bays for those who would rather have drive bays than fans there. The fans are removable in many modern cases, I think, but you still can't use the space for drive bays. Inflexible design choices.
I think the problem is you're looking for professional workstation features in consumer PC cases; there just isn't much demand for 5.25 bays anymore, let alone hot-swappable drive trays in a desktop rather than a dedicated NAS.
madbrain
Posts: 6512
Joined: Thu Jun 09, 2011 5:06 pm
Location: San Jose, California

Re: PC Build Thread - 2020... and beyond!!!

Post by madbrain »

Independent George wrote: Tue Oct 26, 2021 4:24 pm I think the problem is you're looking for professional workstation features in consumer PC cases; there just isn't much demand for 5.25 bays anymore, let alone hot-swappable drive trays in a desktop rather than a dedicated NAS.
I don't know that drive bays are professional features. To me, they are a basic feature that determines how expandable the computer case is. It doesn't matter whether you use them for future USB/Thunderbolt ports, new USB connectors (we went from A to C), a fan controller, hotswap drive bays, an optical drive, a floppy drive (still a few of those in the house!), a flash card reader, or a fast-charging port for your phone. It's all about future expandability. Cases are sold to people who custom-build their computers; you would think they would have some interest in that. But maybe it's a form of planned obsolescence: you are forced to buy a new case to get new FireWire connectors, USB 3.0, eSATA, eSATAp, USB 3.1, USB 3.2 Gen2, USB 3.2 Gen2x2, USB4, Thunderbolt USB-C connectors, and so on. All those ports have appeared on new desktops and laptops, and some have since been obsoleted, of course. If you have any drive bay at all, keeping your case is not a problem. And if you have PCIe slots, you can usually add the new ports with a cheap add-on card, without changing the motherboard.
I understand not having drive bays on tiny cases for, say, nano-ITX or pico-ITX motherboards, or on single-board computers. But if you are doing a custom PC build, it seems very counter-intuitive to have none. I don't think bays raise the price, either, the way adding PCIe slots does on motherboards.
Last edited by madbrain on Tue Oct 26, 2021 6:21 pm, edited 2 times in total.
lazydavid
Posts: 5126
Joined: Wed Apr 06, 2016 1:37 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by lazydavid »

LadyGeek wrote: Tue Oct 26, 2021 4:23 pm - Update to the latest BIOS
To those with Asus motherboards - The BIOS EZ Flash utility does not work with NTFS filesystems. If you download a BIOS update to an NTFS formatted drive, you'll get an error reading the file (not a UEFI BIOS). Copy the file to a USB drive that's formatted as FAT32.
This is not my experience. I updated my X570 TUF gaming to BIOS 4021 last week using an old SSD in an external enclosure, which is GPT partitioned and NTFS formatted. In fact, this is the exact drive I've used for every BIOS update since getting this board just over a year ago.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

You might be right. I do remember updating from my NTFS hard drive on Asus motherboards in earlier builds, possibly my prior build from last year.

I was getting a "Selected file is not a UEFI BIOS" error and Google suggested the FAT32 drive, so I went with it. I'm now on BIOS 4021.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

LadyGeek wrote: Tue Oct 26, 2021 4:23 pm I'm not at all happy with the noise of those 3 GPU fans (RTX 3060 Ti). When the GPU is going full bore, it's much louder than my Noctua fans. It's also louder than my RTX 2060 on my 3700x build. I may run folding@home on my Linux PC due to the fan noise.

As for the CPU temperature, running at 80 - 90 deg C is apparently "normal" for the 5800X. I see lots of complaints about this. Here's one example: Re: Ryzen 9 5950x running very hot 74C, the image showing AMD guidance is here: CPU Thermal Recommendations.

There's not much I can do at this point. I may play with BIOS or Windows power management, but not sure how much that will help.
The 5800X is known to run hot. You can try to undervolt it using the Curve Optimizer feature with a negative value e.g. -25 which will make the CPU use less voltage at a given clock speed, or reach a higher clock speed for a given voltage.

For single-threaded or lightly-threaded workloads, you will most likely hit the max boost frequency of the CPU (4.7GHz) so the result of using Curve Optimizer will be a lower voltage and hence less power and heat.

However, for multi-threaded workloads, you will most likely be running at the Precision Boost Overdrive package power (PPT) limit, with the CPU boost frequency reduced to, say, 4.4GHz to stay under it. With Curve Optimizer, the CPU then follows the second scenario: it boosts the clock back up within the same power budget, so power and heat stay at the limit. That's why you also have to reduce the PPT limit; for a 5800X the default is 142W, so you could try reducing it a bit, to say 128W. Then run a multi-threaded benchmark again and compare the temps and average clock frequencies. You should aim to reduce the PPT and temps while maintaining the same clock frequency and performance.
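The two scenarios above can be sketched with the usual dynamic-power relation, P ∝ V² × f. The voltages below are hypothetical examples chosen to show the shape of the effect, not measured 5800X values:

```python
# Toy model of undervolting via P ∝ V^2 * f. Voltages are hypothetical
# examples, not measured 5800X values.

MAX_BOOST_GHZ = 4.7  # 5800X max boost clock

def power_ratio(v_new: float, v_old: float) -> float:
    """Relative power at an unchanged clock after an undervolt."""
    return (v_new / v_old) ** 2

# Scenario 1: lightly threaded, already pinned at max boost. The clock
# can't rise further, so the undervolt shows up purely as lower power/heat.
print(f"light load: power falls to {power_ratio(1.25, 1.35):.0%} of stock")

# Scenario 2: all-core load pinned at the PPT limit. The same power budget
# now supports a higher clock, f_new = f_old * (V_old / V_new)^2, capped
# at max boost -- which is why you also lower PPT to actually reduce heat.
f_old = 4.4
f_new = min(f_old * (1.35 / 1.30) ** 2, MAX_BOOST_GHZ)
print(f"all-core: {f_old} GHz -> {f_new:.2f} GHz at the same PPT")
```

The model is crude (it ignores static leakage and per-core voltage/frequency curves), but it captures why the same Curve Optimizer offset cools a light load yet merely speeds up a power-limited one.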

Here's a video that explains this and gives some temperature tests: https://www.youtube.com/watch?v=dfkrp25 ... ptimumTech

The same principle applies to your GPU, but the procedure is a little different. You can just use the OC Scanner in MSI Afterburner and it will automatically find the highest stable frequencies at 4 voltage points. Run it at the 100% power limit first; if the frequencies and temps actually increase under your GPU workload, you may also have to reduce the GPU power limit to get your temps down (because the GPU follows similar boost principles to Curve Optimizer on a Zen 3 CPU, as explained above). Also, GPU workloads all differ: they may be limited by something else, like V-Sync or a CPU bottleneck in a game, so you have to tune the power limit, temps, and performance for your specific GPU workload.

Here's the official MSI blog on the Afterburner OC Scanner: https://www.msi.com/blog/get-a-free-per ... oc-scanner

You can also tune your GPU fan curves in Afterburner, but I would leave them as-is for now: once you reduce the voltages and power, the temps will go down, and the fans will slow down with them.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

Good info, thanks. I'll take a deep-dive look shortly.

I have folding@home running now with Core Temp monitoring. This utility not only displays temperature, but individual core frequencies, loading percentages, and power consumption.

I can see each of the 5800X's 8 core frequencies changing in real-time. 8-) The range is roughly 4272 MHz to 4547 MHz.

Max CPU temp is 83 deg C @ 128 W power.

For completeness, the GPU is at 69 deg C (MSI Afterburner).

I forgot to mention earlier that my UPS is reporting a total power draw of 420 W. My Seasonic PSU is rated at 750 W. The rating is not quite double the steady-state draw, so I'm hoping there's enough headroom for transient high-current loads.
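One wrinkle in that headroom estimate: the UPS number is wall draw, which includes the PSU's conversion losses, so the DC load on the PSU is lower still. A quick sketch, where the 90% efficiency figure is an assumption rather than a number from the Seasonic spec sheet:

```python
# Rough PSU headroom check for the figures above. Wall draw includes PSU
# conversion losses, so the DC load is lower than the UPS reading.
wall_watts = 420       # total draw reported by the UPS
psu_rating = 750       # Seasonic PSU rating (DC output)
efficiency = 0.90      # assumed efficiency, not from the spec sheet

dc_load = wall_watts * efficiency
headroom = psu_rating - dc_load
print(f"DC load ~{dc_load:.0f} W, headroom ~{headroom:.0f} W "
      f"({headroom / psu_rating:.0%} of the rating)")
```

Even allowing for transients, roughly half the rating in reserve should be comfortable.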

MSI Afterburner is helpful, as it shows the GPU hitting voltage and power limits. FYI - I no longer feel guilty about using MSI's utilities, as my GPU is MSI.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

I finally got an EVGA 3080 Ti FTW3 Hybrid with a 240 mm AIO water cooler and installed it last night. Ran the Unigine Heaven benchmark with HWiNFO to monitor data. I was amazed to see the GPU temp peaked at just 60C! That's with the stock power limit of 400 W. I also ran the automatic OC scanner and it gave me +112 MHz, so the clock speed was sitting around 2010 MHz in Heaven. My air cooled 3080 Ti would regularly hit 80C with the power limit reduced to 360 W, and the clock speeds were around 1835 MHz.

My motherboard temps also dropped from 54C to 44C and I have my poor NVMe drive under the GPU which also dropped from 62C to 47C. My case is a Cooler Master NR400 with the GPU 240 mm radiator exhausting out the top, an Arctic Liquid Freezer II 120 mm for the CPU exhausting out the back, and 3 x 120 mm Arctic P12 PWM fans intaking at the front.

One thing is the stock GPU fans are a little loud, so I might swap them out with some Arctic P12 PWM fans.
User avatar
LadyGeek
Site Admin
Posts: 95484
Joined: Sat Dec 20, 2008 4:34 pm
Location: Philadelphia
Contact:

Re: PC Build Thread - 2020... and beyond!!!

Post by LadyGeek »

First, the video you referenced Ryzen 5000 Undervolting with PBO2 – Absolutely Worth Doing was very helpful, thanks. Now, I understand what's going on.

I forgot about HWiNFO and downloaded the portable version (just unzip and run the .exe). Very cool. 8-)

Your new water cooled EVGA 3080 Ti FTW3 Hybrid is emphasizing how much heat is dumped out by the GPU. Looking further at my air-cooled MSI 3060 Ti, the rear case fan (exhaust) can't be mounted as low as I want.

The GPU heat that should be vented out of the case is instead flowing into the Noctua CPU cooler, reducing its cooling capacity. I see that effect when running folding@home with only the GPU running a work unit: the CPU temperature is higher compared to "idle".

My GPU is also covering an NVMe slot. Fortunately, I only have 1 NVMe drive and it's in the other slot (not under anything). Otherwise, the drive would not be happy getting warmed by the GPU.

The fan noise from my new Ryzen 7 5800X build is louder than my prior Ryzen 7 3700X build. Complicating things is that the new build is next to my desk and out in the open. My prior build is inside a cabinet opening in an adjacent desk, so it's somewhat sheltered from noise.

I'll run folding@home on my prior build due to less perceived noise and an 83 W power savings. My folding@home comparison (GPU + CPU loaded):

Ryzen 7 3700X / GeForce RTX 2060 super:* 337 W (total power reported by the UPS), 78 deg C (GPU), 81 deg C (CPU)
Ryzen 7 5800X / GeForce RTX 3060 Ti: 420 W (total power reported by the UPS), 69 deg C (GPU), 82 deg C (CPU)

* Running in Linux. Temperatures are reported using psensor.
Wiki To some, the glass is half full. To others, the glass is half empty. To an engineer, it's twice the size it needs to be.
tortoise84
Posts: 463
Joined: Thu Nov 19, 2020 10:03 pm

Re: PC Build Thread - 2020... and beyond!!!

Post by tortoise84 »

I posted my build with pictures here: https://pcpartpicker.com/b/cGykcf

I did replace the GPU radiator fans with Arctic P12 PWM fans and they are much quieter (a gentle whooshing sound) and the GPU still stays under 60C. I can really feel lots of hot air exhausting out the top, and I measured the surface temp of the top grills at 45C so the air temp is probably even higher. It would be quite detrimental if I had the GPU radiator mounted at the front of my case blowing this hot air at all my components.

I've thought about air-cooled GPUs a lot and how to get rid of the heat, but haven't really come up with any good solutions. With a standard case layout, the best you can do is probably to have 3 intake fans in the front set to increase to a fairly high speed based on a motherboard or case temp sensor, which will better represent the GPU temp, rather than using the CPU temp sensor. The bottom front fans will blow cool air into the GPU. In my case, I also placed a piece of cardboard to direct air from the very bottom fan upwards at the GPU, instead of under the PSU shroud. The top front fans will blow cool air at the CPU as well as push the hot GPU exhaust air towards the rear exhaust fan. (But yes, some of the hot GPU exhaust will go into the CPU cooler).

This is why I don't like glass side panels, and prefer vented panels so some of the hot GPU exhaust can escape out the side. My old SAMA IM01 case had a vented side panel, as does the Cooler Master NR200 and a lot of the new ITX cases, especially ones that have a 'sandwich' layout with the GPU in a different compartment connected with a riser cable.

Of course, the best solution is to go water cooling, which allows you to place the radiator much further away from the heat source, so you can just exhaust all the heat directly out of the case, instead of blowing it at another component.
Post Reply