
Damn, Intel!

16 hours ago, RejZoR said:

Everyone is screaming about how amazing Alder Lake is, and I'm one of the few who just sit here and stare into the abyss because I'm not really that impressed. So it beats a year-old part from AMD. Congrats, I guess? At the expense of ludicrous power consumption and temperatures, and all they're doing is yet again just pushing the clocks higher. Maybe to someone with 2x 360 custom water cooling that's irrelevant, but every time power consumption came up before, people couldn't shut up about how important it is. Suddenly everyone just brushes past it for a brief moment and that's it. Hm.

 

Intel makes some really good stuff, but mostly on the mobile side, and even those parts seem super niche. Like the Atom CPUs for mobile devices. I was always fascinated by them and actually owned several models, starting with the N270, then the Z2000 series (not sure which model I had exactly); now I have a Z8350, and I've just ordered another ultrabook with a Pentium N6000. For whatever reason only Acer is really into that chip, and it's one of the very rare devices that are passively cooled. I couldn't find a single AMD part except ancient Bulldozer-based A9s, and I could hardly find the N6000 in any other vendor's products anyway. Weird. Knowing Atoms from the past, I know it'll serve me great for multimedia. But on desktops Intel has been meh for several years now, and it's still meh.

 

If Alder Lake had the power consumption of a Ryzen 5950X with this performance, I'd be properly impressed. But I'm just not, and it's weird that everyone is so enthusiastic about it when it's pretty meh. Or are DDR5 and PCIe 5.0 really that awesome to be so hyped about? Even the efficiency cores seem to do little besides confusing the scheduler, given that AMD's idle consumption isn't that problematic, and when all cores are firing on all "pistons", any extra work that falls on them is an insignificant blip on the performance radar. This isn't a mobile device with a finite battery, where every mW counts while the phone sits with the display off, slowly working through background tasks...

idk about other people, but I've always kind of brushed off power consumption

 

To give an idea: I live in Virginia, and electricity here is 12.40¢/kWh. If your CPU pulled 300 W and you ran it 24/7 for a full year, it would cost about $325.87. For comparison, last time I checked, our air conditioning bill is ~$500/MONTH; by that standard, this CPU pulling 300 W 24/7 is about $27.16/month.
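(For anyone who wants to check that math, here's a quick sketch. The 12.40¢/kWh rate and flat 300 W draw come from the post above; running at full load 24/7 is obviously a worst case, since real CPUs idle far lower.)

```python
# Electricity cost of a CPU at a constant draw, using the figures above.
RATE_USD_PER_KWH = 0.1240   # quoted Virginia rate
CPU_DRAW_KW = 0.300         # 300 W, assumed constant (worst case)

HOURS_PER_YEAR = 24 * 365   # 8760 h

yearly = CPU_DRAW_KW * HOURS_PER_YEAR * RATE_USD_PER_KWH
monthly = yearly / 12

print(f"yearly:  ${yearly:.2f}")    # ~$325.87
print(f"monthly: ${monthly:.2f}")   # ~$27.16
```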

Maybe now you understand: about 4 minutes of work every day to get ~10% better performance, along with access to DDR5, and an actual use for Windows 11.

 

You may say the air conditioning is only on during summer, but in winter the bill essentially just switches over to the heater. It's technically a gas heater, but it also costs ~$500/month.

 

 

tl;dr: if you're concerned about spending an extra $1-200/month by using Intel over AMD, you probably shouldn't be looking at buying a PC at all; just get a laptop with a power-saving CPU (like an i3-7020U).


10 hours ago, Arika S said:

People sure seem weirdly salty about the fact that Intel is competitive again

oMg INtEL usiNG An exTra 100w, THaT mEanS I Will SpEnd An EXTra $100/YEaR oN ElEcTRicitY, THis IS stupId WhY IS tHIs coNsidEREd a VIctoRY At aLL

 

literally every AMD fan rn ^


The extra heat doesn't vanish; you're also spending extra on air conditioning to cool the room down while you're gaming or doing heavy work on your PC.

But yeah... it's not really about the amount; it's about dishonesty, or blindly focusing on one metric in reviews (fps, MIPS, whatever) while ignoring or barely mentioning the others (power use, noise levels, etc.).

There was a huge argument and debate when Intel came up with that system with a 28-core CPU running at 5.2 GHz, which used a 1 hp chiller consuming 1200 watts to cool the loop: https://9gag.com/gag/aBxXmjD

There was a lot of discussion when EPYC processors came out with a 280 W TDP...

Now you have an 8-core (+8 E-core) CPU consuming more power than even EPYC processors, and nobody complains.

For an idea of scale, the 64-core / 128-thread Threadripper Pro processors peak at 282 watts, something the 12900K nearly matches at close to 260 watts in some scenarios with only 8 cores + 8 efficiency cores.

See for yourself if you don't believe me: https://www.anandtech.com/show/16805/amd-threadripper-pro-review-an-upgrade-over-regular-threadripper/2

It's fairly obvious that these are simply CPUs built on a process optimized for laptops (very efficient at low frequencies and low voltages), but they have to pump loads of power into them at high frequencies, hence the huge power consumption.

I see it as kind of like what AMD did back in the day with those FX-9xxx chips that shipped with water coolers because they ran too hot.
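(To put a rough number on that first point: a room air conditioner typically moves around 3 W of heat per 1 W of electricity, a coefficient of performance of ~3. That COP is my assumption, not a figure from the thread; a minimal sketch:)

```python
# Extra wall-socket power needed for the AC to pump the PC's waste heat
# back out of the room. Assumes a typical COP of ~3 (not from the thread).
PC_HEAT_W = 300      # essentially all power a PC draws ends up as room heat
AC_COP = 3.0         # watts of heat moved per watt of electricity used

ac_extra_w = PC_HEAT_W / AC_COP
print(f"extra AC draw: ~{ac_extra_w:.0f} W")   # ~100 W on top of the PC's 300 W
```

Under that assumption, a 300 W load effectively costs about 400 W at the meter while the AC is running.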

 

 

 

 


3 minutes ago, mariushm said:

The extra heat doesn't vanish; you're also spending extra on air conditioning to cool the room down while you're gaming or doing heavy work on your PC.

But yeah... it's not really about the amount; it's about dishonesty, or blindly focusing on one metric in reviews (fps, MIPS, whatever) while ignoring or barely mentioning the others (power use, noise levels, etc.).

There was a huge argument and debate when Intel came up with that system with a 28-core CPU running at 5.2 GHz, which used a 1 hp chiller consuming 1200 watts to cool the loop: https://9gag.com/gag/aBxXmjD

There was a lot of discussion when EPYC processors came out with a 280 W TDP...

Now you have an 8-core (+8 E-core) CPU consuming more power than even EPYC processors, and nobody complains.

For an idea of scale, the 64-core / 128-thread Threadripper Pro processors peak at 282 watts, something the 12900K nearly matches at close to 260 watts in some scenarios with only 8 cores + 8 efficiency cores.

See for yourself if you don't believe me: https://www.anandtech.com/show/16805/amd-threadripper-pro-review-an-upgrade-over-regular-threadripper/2

It's fairly obvious that these are simply CPUs built on a process optimized for laptops (very efficient at low frequencies and low voltages), but they have to pump loads of power into them at high frequencies, hence the huge power consumption.

I see it as kind of like what AMD did back in the day with those FX-9xxx chips that shipped with water coolers because they ran too hot.

 

 

 

 

the power/heat is negligible

 

also server processors are known to be EXTREMELY power efficient, so that's kind of just flexing your ignorance right there


3 minutes ago, yesyes said:

the power/heat is negligible

 

also server processors are known to be EXTREMELY power efficient, so that's kind of just flexing your ignorance right there

It's not a server processor; it's an HEDT / workstation processor.

 

EPYC processors are for servers, and their TDPs range from 155 W to 280 W; see https://en.wikichip.org/wiki/amd/cores/milan

 


53 minutes ago, mariushm said:

It's not a server processor; it's an HEDT / workstation processor.

EPYC processors are for servers, and their TDPs range from 155 W to 280 W; see https://en.wikichip.org/wiki/amd/cores/milan

 

it's a bit of a stretch to call it a workstation processor

 

the 5950X is what I'd consider a workstation processor; a Threadripper is at the edge of server territory


1 hour ago, yesyes said:

Maybe now you understand: about 4 minutes of work every day to get ~10% better performance, along with access to DDR5, and an actual use for Windows 11.

It's at least 30% more expensive to get 10% more performance, which isn't worth it to everyone. Maybe it makes sense for those on an Intel 7th or 8th gen system who didn't want to go with AMD, but it's not enough of a gain for everyone, and not a huge win considering the new node and core architecture Intel is using. Also, if you need to do actual work, I wouldn't recommend Windows 11.

13 minutes ago, yesyes said:

it's a bit of a stretch to call it a workstation processor

 

the 5950X is what I'd consider a workstation processor; a Threadripper is at the edge of server territory

The 5950X is HEDT; Threadripper is HEDT or workstation.


2 minutes ago, Blademaster91 said:

It's at least 30% more expensive to get 10% more performance, which isn't worth it to everyone. Maybe it makes sense for those on an Intel 7th or 8th gen system who didn't want to go with AMD, but it's not enough of a gain for everyone, and not a huge win considering the new node and core architecture Intel is using. Also, if you need to do actual work, I wouldn't recommend Windows 11.

The 5950X is HEDT; Threadripper is HEDT or workstation.

hey, that's me!

 

this is getting off-topic


1 hour ago, yesyes said:

oMg INtEL usiNG An exTra 100w, THaT mEanS I Will SpEnd An EXTra $100/YEaR oN ElEcTRicitY, THis IS stupId WhY IS tHIs coNsidEREd a VIctoRY At aLL

 

literally every AMD fan rn ^

It literally was never about power consumption as cost per kWh. It was always an argument about heat and overclocking, how it heats up the case, how you then need a better PSU, how it heats up the room more so you need AC, and all that. I don't really remember anyone arguing the cost side of things.

Also, it's funny when people call everyone "AMD fans" for not jumping off the rooftops over Alder Lake, despite them still using Intel CPUs or having done so for the past two decades. Reality is, the 5950X is a one-year-old CPU. If Alder Lake weren't beating it by this much, it would be downright garbage. As it is, it's more like "meh". It's not garbage, but it doesn't really excite me like crazy either. DDR5 is more of an annoyance with the extra cost attached, and we aren't even scratching PCIe 4.0 yet except with storage.

Maybe it'll be better in laptops, where hybrid cores can actually mean something, or if they pull the Gracemont cores out and ship them as standalone CPUs in low-power laptops. Atoms have always been great in my experience, despite everyone shitting on them solely because the only experience most people had with Intel Atom was the initial N series (N260 and N270), an in-order design with a single low-clocked thread and HT bolted on top. And even that one shipped with Windows XP, I was using it with Windows 7 later on, and it was perfectly usable, even more so once I stuck Intel's X25-M 80 GB SSD in it (it was an Acer Aspire One netbook).

To everyone arguing over where the 5950X belongs: AMD doesn't use the HEDT designation the way Intel did. The 5950X is a normal top-of-the-line consumer desktop CPU. Threadripper is workstation and EPYC is server. If anything, Threadripper would be HEDT; the 5950X just isn't. It's more like Intel's old "Extreme" editions: still consumer, just for those who want the best home PC for gaming and casual work like encoding streams or whatever.


3 minutes ago, RejZoR said:

It literally was never about power consumption as cost per kWh. It was always an argument about heat and overclocking, how it heats up the case, how you then need a better PSU, how it heats up the room more so you need AC, and all that. I don't really remember anyone arguing the cost side of things.

Also, it's funny when people call everyone "AMD fans" for not jumping off the rooftops over Alder Lake, despite them still using Intel CPUs or having done so for the past two decades. Reality is, the 5950X is a one-year-old CPU. If Alder Lake weren't beating it by this much, it would be downright garbage. As it is, it's more like "meh". It's not garbage, but it doesn't really excite me like crazy either. DDR5 is more of an annoyance with the extra cost attached, and we aren't even scratching PCIe 4.0 yet except with storage.

Maybe it'll be better in laptops, where hybrid cores can actually mean something, or if they pull the Gracemont cores out and ship them as standalone CPUs in low-power laptops. Atoms have always been great in my experience, despite everyone shitting on them solely because the only experience most people had with Intel Atom was the initial N series (N260 and N270), an in-order design with a single low-clocked thread and HT bolted on top. And even that one shipped with Windows XP, I was using it with Windows 7 later on, and it was perfectly usable, even more so once I stuck Intel's X25-M 80 GB SSD in it (it was an Acer Aspire One netbook).

To everyone arguing over where the 5950X belongs: AMD doesn't use the HEDT designation the way Intel did. The 5950X is a normal top-of-the-line consumer desktop CPU. Threadripper is workstation and EPYC is server. If anything, Threadripper would be HEDT; the 5950X just isn't. It's more like Intel's old "Extreme" editions: still consumer, just for those who want the best home PC for gaming and casual work like encoding streams or whatever.

Even if you aren't amazed by the outright performance of the 12900K (I personally am, idk about you), you also need to note that this was an absolutely ENORMOUS leap over last gen. At this rate, Intel may be ahead by 2023, two years before they expect to be.


16 hours ago, A51UK said:

I was not happy with the review, as I found it totally unfair. I want to see how good Intel is vs AMD, not how good DDR5 is. They should use the same hardware where possible, e.g. use DDR4 for both, and maybe show one run with DDR5 as well.

I want to see Intel vs AMD (both with DDR4) and Intel vs AMD (with Intel on DDR5).

The Intel CPUs look good so far; I can't wait to see the new AMD CPUs with DDR5 so we can have some fair benchmarks.

Also, it is sad that Windows 11 could cause problems as well. Maybe we'll get better and fairer benchmarks later on.

The thing is, it's a new platform.

If you get 12th gen, you'll probably get DDR5.

And DDR5 really isn't that much better than DDR4 right now; the latency increases keep the performance about the same, even though DDR4 runs at a lower MHz.
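(Rough numbers behind that latency point, using assumed example kits rather than whatever was in the video: first-word latency is the CAS cycle count divided by the actual memory clock, and the clock is half the transfer rate.)

```python
# First-word latency of a DDR kit in nanoseconds.
def cas_latency_ns(transfer_rate_mts: float, cl: int) -> float:
    clock_mhz = transfer_rate_mts / 2    # double data rate: two transfers per clock
    return cl / clock_mhz * 1000         # cycles / MHz -> ns

# Assumed example kits (not the review's exact configurations):
print(cas_latency_ns(3200, 16))   # DDR4-3200 CL16 -> 10.0 ns
print(cas_latency_ns(4800, 40))   # DDR5-4800 CL40 -> ~16.7 ns
```

So early DDR5's looser timings roughly eat its clock-speed advantage in latency-bound workloads, which is consistent with the mixed results people are seeing.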


Just now, HelpfulTechWizard said:

The thing is, it's a new platform.

If you get 12th gen, you'll probably get DDR5.

And DDR5 really isn't that much better than DDR4 right now; the latency increases keep the performance about the same, even though DDR4 runs at a lower MHz.

in some tasks, DDR4 is actually performing BETTER than DDR5


9 minutes ago, yesyes said:

hey, that's me!

 

this is getting off-topic

I think Alder Lake is nice because Intel is competitive again, but the gain over Zen 3 might not be worth it for everyone, and it doesn't destroy Zen 3 like some are insisting; Zen 3 has been out for a year, uses DDR4, and is slower in Windows 11. The motherboard, RAM, and cooler prices kind of ruin any value in going with Intel, and you have to spend at least $1000 to get a decent GPU in the current market, so IMO building a PC doesn't make much sense anyway unless you already have a GPU or can make the new build a business write-off.


4 minutes ago, Blademaster91 said:

I think Alder Lake is nice because Intel is competitive again, but the gain over Zen 3 might not be worth it for everyone, and it doesn't destroy Zen 3 like some are insisting; Zen 3 has been out for a year, uses DDR4, and is slower in Windows 11. The motherboard, RAM, and cooler prices kind of ruin any value in going with Intel, and you have to spend at least $1000 to get a decent GPU in the current market, so IMO building a PC doesn't make much sense anyway unless you already have a GPU or can make the new build a business write-off.

If you already have Zen 3, I wouldn't suggest upgrading at all: not for Zen 3D, not for Alder Lake, not for AMD at 3 nm.

But if you're looking to upgrade right now, I would suggest Alder Lake over Zen 3.


It's such a damn shame this performance uplift has come at a time of insane price increases.

 

GPUs - obviously still in "if u buy one, ur an idiot" territory.

RAM - DDR5 being new means you pay the 'early adopter tax' on it, which isn't great.

Motherboards - at £200-£400 for the bulk of Z690 boards, they're a good £100 more expensive than previous generations.

 

Add all that together and even a 'mid range' gaming build will cost you substantially more than it did two or more generations ago.

 

With GPUs, the unfortunate reality is that the next generation is likely to be affected too, not only by world events including crypto, but because the manufacturers themselves will likely try to hold onto the tier shift they introduced in 2018. Ever since, GPUs from both AMD and Nvidia have been sold with MSRPs set one tier above all previous generations going back over a decade (e.g. xx70 cards going from $300-$400 to $400-$500, xx80 Ti cards going from $600-$700 to $1000-$1200).

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Looks like Intel might be my next upgrade. I built an AMD B550 system for the living room and I'm extremely unhappy that after all the "fixes" I still have problems with USB dropouts. If Intel can keep the performance crown, that i5-12600K is looking really nice.
 

Looks like JayzTwoCents did all of his benchmarks on Windows 10 and the Intel chips still beat AMD easily in a lot of tasks. Optimum Tech also included numbers with both DDR4 and DDR5, and it looks like DDR5 is at best marginally better in some tasks.


On 11/4/2021 at 3:23 PM, RejZoR said:

At the expense of ludicrous power consumption and temperatures, and all they're doing is yet again just pushing the clocks higher.

This seems like a pretty disingenuous take. The higher temps and power usage are very dependent on what you're doing, and it isn't really the case with gaming. Further, you blow past the biggest difference - the use of efficiency cores.


18 hours ago, yesyes said:

 

To give an idea: I live in Virginia, and electricity here is 12.40¢/kWh. If your CPU pulled 300 W and you ran it 24/7 for a full year, it would cost about $325.87. For comparison, last time I checked, our air conditioning bill is ~$500/MONTH; by that standard, this CPU pulling 300 W 24/7 is about $27.16/month.

Maybe now you understand: about 4 minutes of work every day to get ~10% better performance, along with access to DDR5, and an actual use for Windows 11.

If you're spending $500 a month on air con, you are doing something extremely wrong.


  • 3 weeks later...

I watched a video where you proclaimed your commitment to in-depth reviews, building new facilities, and expert hardware analysis, and then I watch this video (which I at first missed because of the non-descriptive title), and instead of providing useful information, you've compared the new CPUs on a still-unrefined OS with a low adoption rate and wildly inconsistent performance, and used different RAM with wildly different clock rates.

 

Sure, it produces interesting numbers and makes for an entertaining narrative, but if you are an entertainment channel, just say so; don't make statements about your journalistic commitment and taking up the flag from traditional tech media. If you were after informative content, why couldn't you also use the same DDR4 in the 12th gen chips for at least a few tests? Why couldn't you run at least a few tests on W10, 2k19, Debian, or RHEL, all of them much more mature platforms? That would have been genuinely informative content. Why should anyone trust that you will surpass the tech press with state-of-the-art facilities and commitment when, here, you could have made informative content with no additional hardware requirements, just additional effort and man-hours, and you chose not to?


The video was what it had to be at launch, and it was explained in detail in the subsequent WAN show. And they said they'd do more in-depth comparisons later. 

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


6 minutes ago, Kilrah said:

The video was what it had to be at launch, and it was explained in detail in the subsequent WAN show. And they said they'd do more in-depth comparisons later. 

 

Thank you for the additional information; that does make it better. I wish they'd put it in the description, an annotation, the OP of this thread, anywhere. I just don't know how to tell the entertainment content from the informative content, the high-effort videos from the improvised ones. There are some extremely informative and educational videos from LTT that I have learned a ton from, just as there are some videos so interesting you can see the days and weeks that went into them, but then there are some barely scripted ones that stretch a 2-minute idea into 12 minutes, almost reminding me of a video blog, but with high production values.

 

I'm looking forward to the full review; the big-little architecture is quite intriguing.


  • 2 months later...

Hi guys, I have a bit of a problem and I was hoping you might help.

We changed the CPU in one of our company PCs to an Intel Core i9-12900KF (16C/24T), and I use 3ds Max and Corona for rendering. The problem is that while rendering, I opened Task Manager and noticed it's only using 8 cores at 100% while the other 16 threads sit at 0%.

So is there something that has to be enabled, or something like that?

Thanks, and I hope you have some ideas.
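(Not an answer from the thread, but one hypothetical thing worth ruling out is a CPU affinity mask pinning the renderer to 8 cores. A minimal sketch using the third-party psutil package; the process name is an assumption, so check the real one in Task Manager's Details tab:)

```python
import psutil  # third-party: pip install psutil

ALL_CPUS = list(range(psutil.cpu_count()))  # all logical CPUs (24 on a 12900KF)

# Look for the render process by name (assumed name, check yours).
for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "3dsmax.exe":
        print("current affinity:", proc.cpu_affinity())  # CPUs it's allowed on
        proc.cpu_affinity(ALL_CPUS)                      # clear any 8-core mask
        print("new affinity:", proc.cpu_affinity())
```

If the affinity already lists all 24 logical CPUs, the limit is somewhere else (renderer thread settings, Windows power plan, etc.).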

 

