
What do all pc builders agree on?

xaviltt
1 hour ago, 8tg said:

-HD620 performed like an 8800GTX, and for old games an i5-7600 on its own in an office machine was surprisingly capable. The new Iris Xe iGPU stuff is nuts, especially in laptops.

-Office prebuilds are insanely economical choices because of the office lease cycle, and it's a reasonable compromise to deal with their proprietary nature to get Coffee Lake six-cores in complete, ready-to-use systems for under $150.

-The idea that low-end parts are poor value over time depends on a progressive use case. A GT 1030 is inherently more useful than it was when new, because there is more it can do now than there was content for it at launch; it's not as if the passage of time means it can't play GTA V at 720p anymore.

-The value attributed to high-end parts depends on immediate need. To someone who simply wants or needs the best available hardware, the concept of economical value is entirely moot. It doesn't matter if the 4090 is overpriced; if you need a 4090, that is simply the price of a 4090, and value isn't a question that gets asked.


 

USB Type-C and 3.0 headers are worse.

 

Two different hobbies entirely, with the same end goal. Consoles are convenient and simple: if you just want to play games, they are dedicated machines that exist to play games.

PCs are a whole other thing, and playing games on them is entirely different even if the end goal is the same. One is not inherently better than the other. It's like comparing two kinds of transit: both get you from one place to another, and both have their benefits and drawbacks.

Bring back non-proprietary power supply connectors. Ultra had this perfected in 2004 with male-to-male cables.


You could use any cables you wanted since the connector was identical on both ends; you could even snip Molex connectors from other power supplies and plug them straight into this thing.

Question: how is it better or worse than the standard in use now? Is it just driving up pricing on PSUs?
I remember one company tried to do a new take on PSUs, I forget which one, but I found it very cool.


6 minutes ago, BrandonLatzig said:

Question: how is it better or worse than the standard in use now? Is it just driving up pricing on PSUs?
I remember one company tried to do a new take on PSUs, I forget which one, but I found it very cool.

It's objectively better in every way, mostly in terms of consumer benefit. Proprietary PSU cables exist to lock people into buying manufacturer cables as replacements or extras.
There's no reason a PCIe 8-pin can't be the same on both ends; it's a keyed connector you can't plug in wrong. Same for EPS or the 24-pin.

SATA is probably not the best connector on the PSU end, so Ultra just did this weird 5-pin thing, but that's probably the only proprietary thing they did.

 

In the alternate timeline where EVGA and Corsair didn't take off and Ultra stuck around a bit longer, getting new PSU cables would have been as simple as browsing Amazon for any cables you wanted.

It wouldn't involve $100 kits, $30 individual cables, or spending a fortune at CableMod. The same companies making any old PSU cable extensions would be making entire kits universal to any power supply.


  • 3 weeks later...
On 2/16/2024 at 10:52 AM, 8tg said:

-The idea that low-end parts are poor value over time depends on a progressive use case. A GT 1030 is inherently more useful than it was when new, because there is more it can do now than there was content for it at launch; it's not as if the passage of time means it can't play GTA V at 720p anymore.

The judgement of poor value for lower-end parts comes from the fact that, as prices decrease, more of the final retail price goes to fixed costs. Take a GPU, for example.

 

Let's say a 4080 is $1200 retail.

 

Maybe $5 of its price is packaging.

 

Add $10 for logistics, warehousing, QC, and lost items.

 

It's unrealistic, but let's say the HDMI license is $1; PCIe and other regulating bodies also have licensing, so pretend there's $5 in licensing total.

 

Pretend the fans are $5.

 

The PCB manufacturing base price is, say, $20.

 

There are more tiers to this, but in this post I'm going to suggest that there are fixed costs of $45 for "any GPU." Obviously a 1030 and a 4080 have vastly different PCBs and components, but for the sake of this post these fixed costs apply to both.

 

$45 in fixed costs leaves $1155 to potentially go toward the GPU core, VRAM, and heatsink: the elements that make a GPU good at rendering games performantly.

 

Now if we look at an RX 6400, that card is in the range of $150, and it shares the vast majority of the fixed costs with a 4080: the PCIe slot, the licenses for PCIe and HDMI, having a PCB, heatsink and fans, packaging and infrastructure.

 

With $45 in fixed costs, a third of the retail price goes to fixed costs in this example for a 6400, versus about 4% for our example 4080.

 

A card that is only $100 more than a 6400 will have double the amount of money going toward non-fixed costs in this example.

 

 

So with that in mind.

 

An RX 6400 in real life is $150.

An RX 6650 XT in real life is $250.

 

In games, a 6650 XT delivers 300% of the performance for only 66% more money. It's a better value.
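The fixed-vs-variable cost argument above can be sketched in a few lines of Python. All dollar figures are the post's illustrative assumptions, not real bill-of-materials data, and the card prices are the examples used above:

```python
# Illustrative fixed costs assumed in the post ($45 total for "any GPU").
FIXED_COSTS = {
    "packaging": 5,
    "logistics/warehouse/QC": 10,
    "licensing (HDMI, PCIe, ...)": 5,
    "fans": 5,
    "base PCB manufacturing": 20,
}
FIXED = sum(FIXED_COSTS.values())  # $45

def fixed_share(retail: float) -> float:
    """Fraction of the retail price eaten by the fixed costs."""
    return FIXED / retail

def variable_budget(retail: float) -> float:
    """Dollars left over for core, VRAM, heatsink -- the parts that scale."""
    return retail - FIXED

print(f"RTX 4080   ($1200): {fixed_share(1200):.1%} fixed")
print(f"RX 6400    ($150):  {fixed_share(150):.1%} fixed")   # 30.0% fixed
ratio = variable_budget(250) / variable_budget(150)
print(f"RX 6650 XT ($250):  {ratio:.2f}x the 6400's non-fixed budget")  # 1.95x
```

Under these assumed numbers, the $250 card has almost twice the budget going toward performance-relevant parts, which is why the perf-per-dollar comparison favors it so heavily.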

Intel i5-3570K / Gigabyte GTX 1080 / Asus PA248Q / Sony MDR-7506 / MSI Z77A-G45 / NH-D14 / Samsung 840 EVO 256GB + Seagate Barracuda 3TB / 16GB HyperX Blue 1600MHz / 750W PSU / Corsair Carbide 500R

 


On 2/16/2024 at 1:34 AM, Kisai said:

 

- iGPU's suck

You seem to assume every PC is for gaming. For non-gamers, iGPUs are perfect.

 

Just because of all the different use cases of a PC, there will never be a consensus on what is good or bad.

 

But here is one we may all agree on: saving money by buying a bad PSU and case is never a good idea long-term for any somewhat powerful system.

AMD Ryzen 9 7900 + Thermalright Peerless Assassin SE

Gigabyte B650m DS3H

2x16GB G.Skill 6000 CL30

Samsung 980 Pro 2TB

Fractal Torrent Compact

Seasonic Focus Plus 550W Platinum

W11 Pro


19 minutes ago, Lurking said:

You seem to assume every PC is for gaming. For non-gamers, iGPU are perfect.

 

No, iGPUs categorically suck in every use case they've been marketed for: business use, gaming, HEDT, media players, workstations, servers.

 

There are only four potentially useful purposes an iGPU slots into:

1. Laptop "low power GPU" in a high-end/low-end configuration

2. Media Player (single 1080p screen)/Kiosk

3. Rescue GPU, when the dGPU dies, or in a server that would otherwise have to waste PCIe lanes on one.

4. Additional GPU playback engines in a correctly set up desktop.

 

Any other use case, and you're crippling productivity or crippling maintenance. I've seen enough iGPUs over 25 years, and there have only been a handful of cases where the iGPU made sense. You never buy a computer based on it having an iGPU; you are given no choice. At least in a desktop with a dGPU you can still utilize the iGPU for additional media decode engines, but it's pretty worthless, even for connecting additional monitors.

 

Business use: good luck getting 2x 4K or 3x 4K monitors on any iGPU. The same computers that have iGPUs often use USB-C docking stations with software GPUs, which impose further load on the USB controller and the CPU. If you only need a single 1080p screen, it will pass, but iGPU performance was never viable at any point in history, with Windows Vista and later requiring a GPU for basic compositing. Good luck ever getting away with baseline iGPUs for anything that needs to be productive.

 

Gaming use: no iGPU is good for playing anything unless the game is 20 years old or very, very light (e.g. games built on GameMaker).

 

"Web browsers": what rock have you been under? See https://web.basemark.com/ or https://www.shadertoy.com/view/XsBXWt (a 10-year-old demo). Just because you can view the Google home page doesn't mean the experience is equal across the web. I swear there are more poorly performing websites out there than there were 20 years ago. Back then, people designed pages to work a certain way; today everything wants to be reactive, ad-laden BS, and even pages without ads take upwards of 30 seconds to load because the browser lacks the information to progressively render the page. It's only worse on low-end CPU/GPU parts. WebASM and WebGL have ultimately ruined "the web" for the purpose it was designed for, so now you get sites carrying potential WebASM malware or poorly developed cross-compiled C++ code. Thanks, Google. I hate it.

 

Workstations and servers: the iGPU is essentially no good for anything but a backup device in case the dGPU dies, or for when a physical KVM needs to be plugged into a server that uses a GUI (e.g. Windows).

 

I've seen enough "iGPU is good enough for business/school" claims that I can show you a sucker every time. Why would you want to give your employees or students a rubbish experience that wastes their time?


30 minutes ago, Kisai said:

No, iGPU's categorically suck in every use case they've been marketed for. Business use, Gaming use, HEDT, Media players, Workstations, Servers.

 


I've been using iGPUs at home all my life, and even on 2x 4K for 7 years. No problems at all, nothing lagging. Even the 6000-series quad cores are fine software-decoding 4K video (their old iGPUs don't natively handle most of the modern YT codecs); newer iGPUs often play those natively.

 

I use DP on my recent build, but even the employer-provided tablet with a docking station that I use for work has no problems with 2x 4K via USB-C.

 

Many people don't game or use productivity software that requires a dGPU. 

 

At work, for 3D design and gaming... sure, a good dGPU is needed. But that isn't every PC.

 

I don't go to malware websites, and I use ad blockers... But website speed depends more on network capacity.



19 minutes ago, Lurking said:

I've been using iGPU at home all my life and even on 2x4K for 7 years. No problem at all. Nothing lagging. 

 

I don't believe you. I've used iGPUs going all the way back to the i810. There is always a small window where the iGPU is "sufficient" for a current task, but feature creep in web browsers and websites has largely discarded the "web/cloud services" answer.

 

Too much presently fights for GPU memory bandwidth, which is shared with main memory. I've seen far too many 12"-14" laptops that are iGPU-only and fall far short of being able to run MS Office 365, let alone engineering tools. Too many people without technical prowess are making purchasing decisions that result in lost productivity and downtime, due to the laptop form factor or the iGPU being too weak, lacking the thermal headroom to survive long periods of use.

 

It's rare to see laptops just straight up die, but a common factor is using the iGPU in a thermally constrained environment. An SFF with an oversized thermal solution might be just fine where a laptop of otherwise the same spec, with a laptop cooling solution, would destroy the iGPU after a few months.

 

19 minutes ago, Lurking said:

 

I don't go to malware websites, and I use ad blockers... But website speed depends more on network capacity.

Nope. When there is fiber at both ends, the problem is the site being designed without any regard for the devices it will be used on. The web is heading toward requiring higher-performance devices to do very basic things that were possible 10 years ago on weaker hardware. Consider how wide the gulf is between an Android burner phone and a high-end desktop PC.

 


7 hours ago, Kisai said:

I don't believe you. I've used iGPUs going all the way back to the i810. There is always a small window where the iGPU is "sufficient" for a current task, but feature creep in web browsers and websites has largely discarded the "web/cloud services" answer.

 

Too much presently fights for GPU memory bandwidth, which is shared with main memory. I've seen far too many 12"-14" laptops that are iGPU-only and fall far short of being able to run MS Office 365, let alone engineering tools. Too many people without technical prowess are making purchasing decisions that result in lost productivity and downtime, due to the laptop form factor or the iGPU being too weak, lacking the thermal headroom to survive long periods of use.

 

It's rare to see laptops just straight up die, but a common factor is using the iGPU in a thermally constrained environment. An SFF with an oversized thermal solution might be just fine where a laptop of otherwise the same spec, with a laptop cooling solution, would destroy the iGPU after a few months.

 

Nope. When there is fiber at both ends, the problem is the site being designed without any regard for the devices it will be used on. The web is heading toward requiring higher-performance devices to do very basic things that were possible 10 years ago on weaker hardware. Consider how wide the gulf is between an Android burner phone and a high-end desktop PC.

 

I'm talking desktops, mainly. On the other hand, my cheap work tablet also runs 2x 4K fine for remoting in; network speed (on my employer's side) is more of a problem.

 

In general it seems Intel iGPUs do worse than AMD's, but all my PCs in the last decade were Intel (current AMD system in my signature), and they were never really a problem.

 

I don't have fiber, but I think for a site like this forum, with an ad blocker, my internet speed is fine.

 

I agree that web designers are going way overboard with features instead of content, but most sites I choose to visit are OK.


