
Damn, Intel!

1 minute ago, HelpfulTechWizard said:

ye, quotes without source, that's great

I never said that loss was still active tho....

https://www.techspot.com/article/2350-windows-11-benchmark-ryzen/

then why mention it at all...

anyone who looked up "11" in the past month would've found out about the bug with cache latency

░█▀▀█ ▒█░░░ ▒█▀▀▄ ▒█▀▀▀ ▒█▀▀█   ▒█░░░ ░█▀▀█ ▒█░▄▀ ▒█▀▀▀ 
▒█▄▄█ ▒█░░░ ▒█░▒█ ▒█▀▀▀ ▒█▄▄▀   ▒█░░░ ▒█▄▄█ ▒█▀▄░ ▒█▀▀▀ 
▒█░▒█ ▒█▄▄█ ▒█▄▄▀ ▒█▄▄▄ ▒█░▒█   ▒█▄▄█ ▒█░▒█ ▒█░▒█ ▒█▄▄▄


2 minutes ago, mariushm said:

 

Wow... cherry picking ...

 

It's OFFICE stuff. You're comparing a $600 12900K + $300-400 mobo and a $300 kit of 32 GB DDR5 to a $550 5950X + $150 mobo + $150 DDR4 kit... you get 17% extra performance for DOUBLE the power consumption and maybe $500+ more.

 

Here's another question... in an OFFICE environment, do you think the Dell / HP / whatever machines will come with unlocked CPUs that do 240W, or do you think those systems will have CPUs capped at 125-150W, with shit motherboards with barely any heatsinks on the VRM, and a stock cooler rather than some chunky air cooler?

 

Those systems will be capped and throttled, so the fact that the 12900K can do 17% more for 2x the power consumption is meaningless... it's just something for your wanking pleasure.
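The perf-per-watt and perf-per-dollar tradeoff above is easy to put into numbers. A quick back-of-the-envelope sketch using the rough figures from this post (17% more performance at roughly double the power, with the quoted build prices) — these are forum estimates, not measurements:

```python
# Back-of-the-envelope perf/watt and perf/dollar comparison,
# using the rough numbers quoted in the thread (NOT measured data).

def value_metrics(perf, watts, dollars):
    """Return (performance per watt, performance per dollar)."""
    return perf / watts, perf / dollars

# Normalize the 5950X system to performance = 1.00.
# Power and price figures are the thread's rough estimates.
amd_pw, amd_pd = value_metrics(perf=1.00, watts=120, dollars=850)      # 5950X + mobo + DDR4
intel_pw, intel_pd = value_metrics(perf=1.17, watts=240, dollars=1300)  # 12900K + mobo + DDR5

print(f"perf/W - AMD: {amd_pw:.4f}, Intel: {intel_pw:.4f}")
print(f"perf/$ - AMD: {amd_pd:.5f}, Intel: {intel_pd:.5f}")

# 17% more performance at 2x the power => well under 1x the efficiency.
print(f"Intel efficiency relative to AMD: {intel_pw / amd_pw:.2f}x")
```

Under these assumed numbers the 12900K build lands at roughly 0.585x the AMD build's performance per watt, which is the core of the argument above.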

 

¯\_(ツ)_/¯ it's what I found on the page

 

I think they will give you an i7-8700 with a 27" 1440p monitor and the typical government office mouse/keyboard, totaling ~$2000, not a 12900K/5950X



3 hours ago, yesyes said:

I was under the impression that HEDT just referred to high end desktop PCs

no, it's the whole platform - they often come with more PCIe lanes or more RAM channels.


Just now, GodOfROG said:

no, it's the whole platform - they often come with more PCIe lanes or more RAM channels.

like motherboards?

 

confuse



Just now, yesyes said:

https://www.techspot.com/article/2350-windows-11-benchmark-ryzen/

then why mention it at all...

anyone who looked up "11" in the past month would've found out about the bug with cache latency

because I was pointing out that Windows 11 had the opposite thing happen in the past, not what you were claiming.

I want to see GN do it tho

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:

prior build:

Spoiler

 

 


Just now, HelpfulTechWizard said:

because I was pointing out that Windows 11 had the opposite thing happen in the past, not what you were claiming.

I want to see GN do it tho

¯\_(ツ)_/¯

those times are over, it's like bringing up intel being stuck on 14nm



Just now, yesyes said:

¯\_(ツ)_/¯

those times are over, it's like bringing up intel being stuck on 14nm

no it's not

Why are you so fixated on it.......



1 minute ago, HelpfulTechWizard said:

no it's not

Why are you so fixated on it.......

bc it's irrelevant



Just now, yesyes said:

bc it's irrelevant

it wa---

nope, I'm done, it doesn't matter, this is just derailing the thread.



Just now, HelpfulTechWizard said:

it wa---

nope, I'm done, it doesn't matter, this is just derailing the thread.

for the first time, couldn't agree more



Back on topic, my point is that this greatly exaggerates the power usage by intensely overclocking the CPU for an extra lead over AMD

 

Intel expects to be ahead in 2025, so this is a surprise for everyone



I think they're all biased. To what extent is debatable....

I edit my posts more often than not


38 minutes ago, yesyes said:

like motherboards?

 

confuse

HEDT, or High End Desktop, is an acronym originally coined by Intel to refer to their high end consumer desktop PC platforms designed for enthusiasts, gamers, and content creators. Today, both Intel and AMD refer to their high end consumer desktop PC platforms as HEDT.
 

Some of Intel’s HEDT platforms include X79 (Sandy Bridge-E, Ivy Bridge-E), X99 (Haswell-E, Broadwell-E), and X299 (Skylake-X, Kaby Lake-X). AMD’s HEDT platforms include X399, TRX40 (Threadripper).
 

These are usually on a socket that mirrors (or closely follows) current server CPU sockets, e.g. LGA-2011, LGA-2011v3, LGA-2066, Socket TR4, Socket sTRX4, and so on.

 

These often have the ability to use ECC, extra PCIe lanes, and more memory channels, and they have always supported overclocking, often with more flexible ways to overclock on the platform (e.g. base clock, multiplier, frequency, and others).

 


Everyone is screaming about how amazing Alder Lake is, and I'm one of the few who just sits here and stares into the abyss, because I'm not really that impressed. So it beats a year-old part from AMD. Congrats, I guess? At the expense of ludicrous power consumption and temperatures, and all they're doing is yet again just pushing the clocks high. Maybe to someone with a 2x 360 custom water loop it's irrelevant, but every time power consumption was involved, people couldn't shut up about how important it is. Suddenly everyone kinda sorta brushes past it for a brief moment and that's it. Hm.

 

Intel makes some really good stuff, but it's mostly in the mobile parts, and even those seem to be super niche. Like the Atom CPUs for mobile devices. I was always fascinated by them and actually owned several models, starting with the N270, then the Z2000 series (not sure which model I had exactly), and now I have a Z8350 and I've just ordered another ultrabook with a Pentium N6000. For whatever reason only Acer is super into this one, and it's one of the very rare devices that are passively cooled. I couldn't find a single AMD part except ancient A9 Bulldozer-based ones. And I could hardly find the N6000 in any other vendor's products other than Acer anyway. Weird. And knowing Atoms from the past, I know it'll serve me great for multimedia. But on desktops, it's been meh for several years now, and it's still meh.

 

If Alder Lake had the power consumption of a Ryzen 5950X and this performance, I'd be properly impressed. But I'm just not, and it's weird that everyone is so enthusiastic about it when it's pretty meh. Or are DDR5 and PCIe 5.0 really that awesome to be so hyped about? Even the efficiency cores seem to do little other than still leave the scheduler confused, given that AMD's idle consumption isn't that problematic, and when all cores are firing on all "pistons", any extra work that falls on them is an insignificant blip on the performance radar. This isn't a mobile device with finite power, where every mW counts when the phone is sitting around with the display off and needs to slowly deal with background tasks...


3 minutes ago, RejZoR said:

Everyone is screaming about how amazing Alder Lake is, and I'm one of the few who just sits here and stares into the abyss, because I'm not really that impressed. So it beats a year-old part from AMD. Congrats, I guess? At the expense of ludicrous power consumption and temperatures, and all they're doing is yet again just pushing the clocks high. Maybe to someone with a 2x 360 custom water loop it's irrelevant, but every time power consumption was involved, people couldn't shut up about how important it is. Suddenly everyone kinda sorta brushes past it for a brief moment and that's it. Hm.

Alder Lake is impressive on the sole merit that it's Intel being relevant again. That's it. Given the circumstances, it's a good showing from Intel and a good return to proper competition, even if there are glaring issues with its overall performance relative to AMD. In either case, the result will be, in all likelihood, that AMD will choose to cut prices on Zen 3 parts at some point.

 

That being said, this is a really really good time for Intel to go all-in on the lower-end market. AMD hasn't had a peep of anything below the 5600X. If the 12400 comes anywhere near the 12600K, Intel will likely have a stranglehold on the lower end. They also need to release the lower end chipsets to go along with these CPUs, since a 12400 with a $210 Z690 is a pretty poor value proposition.

It's entirely possible that I misinterpreted/misread your topic and/or question. This happens more often than I care to admit. Apologies in advance.

 

珠江 (Pearl River): CPU: Intel i7-12700K (8p4e/20t); Motherboard: ASUS TUF Gaming Plus Z690 WiFi; RAM: G.Skill TridentZ RGB 32GB (2x16GB) DDR4 @3200MHz CL16; Cooling Solution: NZXT Kraken Z53 240mm AIO, w/ 2x Lian Li ST120 RGB Fans; GPU: EVGA Nvidia GeForce RTX 3080 10GB FTW3 Ultra; Storage: Samsung 980 Pro, 1TB; Samsung 970 EVO, 1TB; Crucial MX500, 2TB; PSU: Corsair RM850x; Case: Lian Li Lancool II Mesh RGB, Black; Display(s): Primary: ASUS ROG Swift PG279QM (1440p 27" 240 Hz); Secondary: Acer Predator XB1 XB241H bmipr (1080p 24" 144 Hz, 165 Hz OC); Case Fans: 1x Lian Li ST120 RGB Fan, 3x stock RGB fans; Capture Card: Elgato HD60 Pro

 

翻生 (Resurrection): CPU: 2x Intel Xeon E5-2620 v2; Motherboard: ASUS Z9PR-D12 (C602 chipset) SSI-EEB; RAM: Crucial 32GB (8x4GB) DDR3 ECC RAM; Cooling Solution: 2x Cooler Master Hyper 212 EVO; GPU: ASRock Intel ARC A380 Challenger ITX; Storage: Crucial MX500, 500GB; PSU: Super Flower Leadex III 750W; Case: Phanteks Enthoo Pro; Expansion Card: TP-Link Archer T4E AC1200 PCIe Wi-Fi Adapter; Display(s): Dell P2214HB (1080p 22" 60 Hz)

 

壯麗 (Glorious): Mainboard: Framework Mainboard w/ Intel Core i5-1135G7; RAM: G.Skill Ripjaws 32GB (2x16GB) DDR4 SODIMM @3200MHz CL22; eGPU: Razer Core X eGPU Enclosure w/ (between GPUs at the moment); Storage: Samsung 970 EVO Plus, 1TB; Display(s): Internal Display: Framework Display; External Display: Acer (unknown model) (1080p, 21" 75 Hz)


While it was acknowledged that the tests were not 'like for like' with regard to platform, given that there is still a bug affecting AMD on W11 and the Intel system had DDR5, it does look like Intel has managed to claw back into relevancy, with three major caveats.

1, That power draw is killer; AMD managed single-digit differences in performance with the bug AND with less than half the power draw, so the possibilities with overclocking and/or the longevity of the new chips remain to be seen.

2, Like mentioned, the cost is painful, with DDR5 RAM and motherboards being widely unobtainium at a reasonable price.
3, We're only a couple of months out from Ryzen 6K, so it remains to be seen what improvements AMD has made in that time.

It's good to see that Intel is relevant again, but unfortunately it's with an * next to it for me.

Sloth's the name, audio gear is the game
I'll do my best to lend a hand to anyone with audio questions, studio gear and value for money are my primary focus.

Click here for my Microphone and Interface guide, tips and recommendations
 

For advice I rely on The Brains Trust :
@rice guru
- Headphones, Earphones and personal audio for any budget 
@Derkoli- High end specialist and allround knowledgeable bloke


I did some math comparing my planned AMD 5600X build with an i5, sticking with DDR4 for both, and the difference was 30% more expensive for Intel (mostly thanks to the motherboard). From the benchmarks I've seen, performance is only 12% to 15% better. So to me it's not worth it for the budget I have to work with; the value is still with AMD. Sure, the newest thing is exciting and impressive, but unless we can print our own money it isn't always the most viable option.
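That tradeoff is trivial to sanity-check. A tiny sketch using the rough numbers above (30% higher cost, 12-15% more performance) — the poster's estimates, not benchmark data:

```python
# Sanity check: does ~13% more performance justify ~30% more cost?
# Figures are the rough estimates from the post above, NOT measurements.

intel_cost_factor = 1.30   # Intel build ~30% more expensive
intel_perf_factor = 1.135  # midpoint of the quoted 12-15% uplift

# Performance per dollar, normalized so the AMD build = 1.0
amd_value = 1.0
intel_value = intel_perf_factor / intel_cost_factor

print(f"Intel perf-per-dollar relative to AMD: {intel_value:.2f}")
```

With these assumed figures the Intel build delivers roughly 87% of the AMD build's performance per dollar, which matches the "value is still with AMD" conclusion.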


I was not happy with the review, as I see it as totally unfair. I want to see how good Intel is vs AMD, not how good DDR5 is. They should use the same hardware where possible, e.g. DDR4 for both, and maybe show one with DDR5 as well.

 

I want to see Intel vs AMD (both with DDR4) and Intel vs AMD (with Intel on DDR5).

 

The Intel CPUs look good so far; I can't wait to see the new AMD CPUs with DDR5 so we can have some fair benchmarks.

 

Also, it is sad that Windows 11 could cause problems as well. Maybe we'll get better and fairer benchmarks later on.

 

 


Didn't take long for this thread to get derailed. Never change, internet. lol

Re-Posting this here:

Quote

My thoughts

This reeks of a lot of panic from Intel, at least. A lot of red flags on this: obscuring that this won't be performant in the real world by calling it an "upgrade to adjusted metrics after being minimally exceptional and failing to meet goals"... barf. In plain terms, as far as I can tell, that only means: well, we can't play off your fear of missing out on a new CPU without hiding the fact that, post-COVID, we're hitting practical limits on how many transistors and whatnot we can cram into a CPU.

Fact is, this has easily been coming for years. Probably a decade. It's been an arms race of less and less efficient coding, with zero desire from app makers to optimize and bug-fix things. Another, more concrete problem: even with a bazillion-Hz clock speed, you can only do so much one step at a time. This is called synchronous computing. The magic to truly unlock all that potential comes from fully asynchronous computing, i.e. flushing RAM at the same time a new app is loaded. Which is hard. Really hard right now (task scheduling is a PITA). App and OS makers are the other side of this arms race. Anyone who's used Windows has at one time had the "joy" of 'that one program' that takes a stupid amount of time to load, quit, or do much of anything. This is called threading. Unless the new best practice becomes multithreading, and watching that RAM and CPU use, I just don't see a way for hardware to make good, proper, genuine improvements.

To follow up on my own thinking: waaaaaay back in the day, CPUs first got over 4 GHz. That's 4 billion potential calculations a second that one core can (in theory) do. In practice? I'd be surprised if anyone feels or notices that now. Here's why again: a computer, physically speaking, can only do one task at a time. It's really fast, and we fake it with software-level multitasking. But then enter the almighty mob: shit-level optimizations, an industry problem for years. Hardly anything uses hardware-level multitasking and multithreading. Going on about 'efficiency cores' vs 'performance cores' and a direct cocaine-fueled line to RAM and back cannot possibly counter: one task, one step at. a. time. Making the hardware better at doing stuff asynchronously is probably the only way either AMD or Intel can possibly make real, actual advances. Think about this: we can cram in 32 gigs of RAM and have an SSD with essentially instant read and write, all bottlenecked because... the CPU is stuck waiting for something, whether it's the GPU number crunching, a fan spinning up, etc. etc.

This starting trend of different types of cores is, in my mind, a symptom of hitting a wall. Dies can only get so thin before physics and/or practicality hit them upside the head.
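The threading point in the quote — that one blocking task stalls a whole program unless work is moved off the main thread — can be illustrated with a minimal sketch. `slow_load` here is a made-up stand-in for any blocking operation (disk, network, heavy compute):

```python
# Minimal illustration of the "one slow program stalls everything" problem
# and the multithreaded fix the post argues for. slow_load is a
# hypothetical stand-in for blocking work.
import time
from concurrent.futures import ThreadPoolExecutor

def slow_load(name, seconds):
    """Pretend to load something slowly (stands in for blocking I/O)."""
    time.sleep(seconds)
    return f"{name} loaded"

# Serial: each task blocks the next, so total time is the sum.
start = time.perf_counter()
serial = [slow_load(n, 0.2) for n in ("config", "assets", "plugins")]
serial_time = time.perf_counter() - start

# Threaded: tasks overlap (sleep releases the GIL, like real I/O would),
# so total time is roughly just the longest single task.
start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    threaded = list(pool.map(slow_load,
                             ("config", "assets", "plugins"),
                             (0.2, 0.2, 0.2)))
threaded_time = time.perf_counter() - start

print(f"serial: {serial_time:.2f}s, threaded: {threaded_time:.2f}s")
```

Same results either way, but the threaded version finishes in roughly a third of the time — which is why scheduling and app-side multithreading matter as much as raw clock speed.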


2 hours ago, The Flying Sloth said:

While it was acknowledged that the tests were not 'like for like' with regard to platform, given that there is still a bug affecting AMD on W11 and the Intel system had DDR5, it does look like Intel has managed to claw back into relevancy, with three major caveats.

1, That power draw is killer; AMD managed single-digit differences in performance with the bug AND with less than half the power draw, so the possibilities with overclocking and/or the longevity of the new chips remain to be seen.

2, Like mentioned, the cost is painful, with DDR5 RAM and motherboards being widely unobtainium at a reasonable price.
3, We're only a couple of months out from Ryzen 6K, so it remains to be seen what improvements AMD has made in that time.

It's good to see that Intel is relevant again, but unfortunately it's with an * next to it for me.

All this. All of this. I for one don't get the FOMO tech heads have, going: 'OMG OMG OMG, I can't get the DDR5 RAM I just heard about!', or whinging about 'OMG I "need" a fucking RTX90000 because Nvidia said so. Never mind that Nvidia is having an identity crisis and can't even keep car companies (Tesla) supplied, but the GPU in my computer that's 6 months old is now bad because Nvidia said so.' WTF. I get wanting a new GPU. I also get that the world got goatse-level bitchslapped by COVID, and right now bit-fucking-coin speculators and scalpers are all tag-teaming on ANY chip production. Hello 80s, how were you?


2 hours ago, RejZoR said:

Everyone is screaming about how amazing Alder Lake is, and I'm one of the few who just sits here and stares into the abyss, because I'm not really that impressed. So it beats a year-old part from AMD. Congrats, I guess? At the expense of ludicrous power consumption and temperatures, and all they're doing is yet again just pushing the clocks high. Maybe to someone with a 2x 360 custom water loop it's irrelevant, but every time power consumption was involved, people couldn't shut up about how important it is. Suddenly everyone kinda sorta brushes past it for a brief moment and that's it. Hm.

 

Intel makes some really good stuff, but it's mostly in the mobile parts, and even those seem to be super niche. Like the Atom CPUs for mobile devices. I was always fascinated by them and actually owned several models, starting with the N270, then the Z2000 series (not sure which model I had exactly), and now I have a Z8350 and I've just ordered another ultrabook with a Pentium N6000. For whatever reason only Acer is super into this one, and it's one of the very rare devices that are passively cooled. I couldn't find a single AMD part except ancient A9 Bulldozer-based ones. And I could hardly find the N6000 in any other vendor's products other than Acer anyway. Weird. And knowing Atoms from the past, I know it'll serve me great for multimedia. But on desktops, it's been meh for several years now, and it's still meh.

 

If Alder Lake had the power consumption of a Ryzen 5950X and this performance, I'd be properly impressed. But I'm just not, and it's weird that everyone is so enthusiastic about it when it's pretty meh. Or are DDR5 and PCIe 5.0 really that awesome to be so hyped about? Even the efficiency cores seem to do little other than still leave the scheduler confused, given that AMD's idle consumption isn't that problematic, and when all cores are firing on all "pistons", any extra work that falls on them is an insignificant blip on the performance radar. This isn't a mobile device with finite power, where every mW counts when the phone is sitting around with the display off and needs to slowly deal with background tasks...

OMFG!! THANK YOU!! 

Intel is.. well they exist. 


OK Linus, ludicrous framerates for CS:GO? The 5950X is basically its equal minus the max frames... Sure it's cheaper, but I wouldn't say it's amazing for a generational leap, considering AMD had this last year...

 

5 hours ago, yesyes said:

Intel expects to be ahead in 2025, so this is a surprise for everyone

Surprise? Nah, they had to catch up or risk more market share loss, and that first part requires AMD to sit on their asses too, which they will not. It's good to finally see something "usable" from Intel, but I'm not sold that this generation is going to be the tipping point just yet; hopefully next year. Intel is still playing catch-up this year.


Intel sure is doing a lot better being out of the six-generations-of-Skylake BS, but they still don't deserve the crown just yet.

Asus ROG G531GT : i7-9750H - GTX 1650M +700mem - MSI RX6600 Armor 8G M.2 eGPU - Samsung 16+8GB PC4-2666 - Samsung 860 EVO 500G 2.5" - 1920x1080@145Hz (172Hz) IPS panel

Family PC : i5-4570 (-125mV) - cheap dual-pipe cooler - Gigabyte Z87M-HD3 Rev1.1 - Kingston HyperX Fury 4x4GB PC3-1600 - Corsair VX450W - an old Thermaltake ATX case

Test bench 1 G3260 - i5-4690K - 6-pipe cooler - Asus Z97-AR - Panram Blue Lightsaber 2x4GB PC3-2800 - Micron CT500P1SSD8 NVMe - Intel SSD320 40G SSD

iMac 21.5" (late 2011) : i5-2400S, HD 6750M 512MB - Samsung 4x4GB PC3-1333 - WT200 512G SSD (High Sierra) - 1920x1080@60 LCD

 

Test bench 2: G3260 - H81M-C - Kingston 2x4GB PC3-1600 - Winten WT200 512G

Acer Z5610 "Theatre" C2 Quad Q9550 - G45 Express - 2x2GB PC3-1333 (Samsung) - 1920x1080@60Hz Touch LCD - great internal speakers


1 hour ago, Arika S said:

People sure seem weirdly salty about the fact that Intel is competitive again

probably a ryzen for that. intel prices are insane. it costs an arm and a leg. no need to blow a motherboard of cash on this. Yet.

 

I'll see myself out.

