
I’m breaking one of my biggest rules...

James

Buy an ASUS TUF NVIDIA GeForce RTX 4070 Ti: https://geni.us/NhTlo3i
Buy a ZOTAC GeForce RTX 3070 Ti: https://geni.us/Te4W
Buy a Razer Core X GPU Enclosure: https://geni.us/Jrn6S
Buy a Valve Steam Deck: https://geni.us/Apee9DU

 

We asked our community to send us their hot tech takes for Linus to react to. Some of them are pretty hot...

 

 

 


At 17:22:

 

That person was talking about GPUs, not CPUs, and they are absolutely correct. No one who actually has the option of using Nvidia ever chooses an AMD GPU; at least I haven't seen it. Every reviewer/influencer makes a video where they put away their Nvidia GPU and force themselves to use an AMD GPU for clicks, then moves back to Nvidia almost immediately afterward.

 

I feel like you guys knew that but intentionally pretended not to understand that post so as not to offend AMD.


3 minutes ago, terroralpha said:

At 17:22:

 

That person was talking about GPUs, not CPUs, and they are absolutely correct. No one who actually has the option of using Nvidia ever chooses an AMD GPU; at least I haven't seen it. Every reviewer/influencer makes a video where they put away their Nvidia GPU and force themselves to use an AMD GPU for clicks, then moves back to Nvidia almost immediately afterward.

 

I feel like you guys knew that but intentionally pretended not to understand that post so as not to offend AMD.

They more than know it; they did the same thing, swapping Nvidia GPUs for 7900 XTXs.

Although I think Linus kept the AMD GPU afterwards, since it was working well enough that he forgot to remove it once the experiment was over.

At least that's what I remember seeing recently.


7 minutes ago, terroralpha said:

No one who actually has the option of using Nvidia ever chooses an AMD GPU

I did. Actually, I switched to an AMD APU (R5 4600G) from an Nvidia GPU card. (OK, to be fair, it was a GT 710 😛)

 

As a Linux user, I find Nvidia support rather troublesome; AMD has a much better reputation.

"You don't need eyes to see, you need vision"

 

(Faithless, 'Reverence' from the 1996 Reverence album)


Just now, Dutch_Master said:

I did. Actually, I switched to an AMD APU (R5 4600G) from an Nvidia GPU card. (OK, to be fair, it was a GT 710 😛)

 

As a Linux user, I find Nvidia support rather troublesome; AMD has a much better reputation.

The conversation was specifically about influencers who have access to high-end hardware. You switched from one potato to a different potato.


17 minutes ago, terroralpha said:

At 17:22:

 

That person was talking about GPUs, not CPUs, and they are absolutely correct. No one who actually has the option of using Nvidia ever chooses an AMD GPU; at least I haven't seen it.

Hi.

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


6 minutes ago, Dutch_Master said:

As a Linux user, I find Nvidia support rather troublesome; AMD has a much better reputation.

 

I've been an AMD GPU user (mostly firepro) in my laptops for many years now, and this upgrade cycle I took a punt on Nvidia because I think the open kernel module is going to improve the situation markedly. Honestly, I'm really not sorry to see the back of AMD. Then again, I'm still using X11 because of KDE's glacial Wayland progress so maybe I'm going to live to regret this decision.


My hot take under the 9900KS video in 2019:

Intel's foundries are their liability. Moving to a smaller-but-immature 10 nm node now means giving up their current older-but-mature 14 nm capacity, which Intel clearly would not do until they had sorted out their 10 nm problems. It's not about microarchitecture, but about how their foundries (and by extension, their manufacturing business model) work.

 

And a bunch of people immediately jumped in without even knowing what I was talking about. Someone brought up the broken Tick-Tock model and was like "130 nm FTW". Then there were people who kept mixing up architecture advantage and node advantage and came up with "1x nm Zen 1/+ competes with 22 nm Haswell or 14 nm Broadwell".

 

Anyway, I was mocked and humiliated that day merely for understanding Intel's broken business model of holding onto their fabs and letting the node supersede the architecture.

 

Years later, lo and behold, Intel's 10th and 11th gen were still utilizing 14 nm.

 

I was right.

"Mankind’s greatest mistake will be its inability to control the technology it has created."


At 13:25:

 

I don't think you need to roll in and bulldoze entire neighborhoods to restore walkable cities. But you do need to allow people to open businesses in areas that are currently legally restricted to residential use in order to move back in that direction. People will come up with all sorts of creative ways to use residential space, like what's shown in this video about bringing back front-yard businesses. Or, sure, individual property owners could knock down their own houses if they want and build something with commercial space on the ground floor instead. But they're banned from doing any of this now!


Linus at 18:36 in the video is well off base here, and it feels almost comically out of date with what nearly all competitive/low-latency settings guides actually recommend, to the point that it pushed me to make an account lmao. Allowing an uncapped framerate means there is some level of render queue where the most up-to-date frame is not the one delivered to the display as soon as possible; that inherent render queue means there HAS to be added latency. I hope LTT revises their advice on competitive gaming, seeing how well documented this is by creators such as Battle(non)sense.
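The render-queue point can be illustrated with a toy latency model (my own simplification; the function name, queue depth, and numbers are illustrative, not measurements from any tool):

```python
# Toy model: with an uncapped framerate a GPU-bound pipeline keeps a
# queue of pre-rendered frames, and each queued frame adds roughly one
# frame-time of input-to-display delay.

def latency_ms(fps: float, queued_frames: int) -> float:
    """Approximate input-to-display latency in milliseconds."""
    frame_time = 1000.0 / fps
    # One frame-time to render, plus one per frame waiting in the queue.
    return frame_time * (1 + queued_frames)

uncapped = latency_ms(300, queued_frames=3)  # ~13.3 ms despite higher fps
capped = latency_ms(240, queued_frames=0)    # ~4.2 ms with the queue drained
assert capped < uncapped
```

Keeping the queue drained is the usual rationale behind frame caps and low-latency modes: fewer frames per second can still mean a fresher frame on screen.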

 


At 23:15:

 

The hot take about Ryzen having issues and Intel being better: I'm not saying either one is better, but I have experienced a lot of XMP issues on all AM4-generation CPUs (I've built over 30 Ryzen systems myself, and no, I'm not counting bulk orders of the same system built over and over again), and I've seen people reporting similar issues in comments under memory kits in our local stores and on Reddit. I haven't had a chance to check AM5/DDR5 yet, though.

 

I've seen multiple people state that they RMA'd everything and came back to the same issue with a whole new set of hardware, and that AMD support wasn't helpful, so they're never trying AMD CPUs again.

 

So, first of all: the Reddit community around Ryzen was fully in fanboy mode whenever I tried discussing the issue and showing proof that it exists. The fact that this problem doesn't seem to exist for YouTubers is also a problem for whoever encounters it.

 

Secondly, what is the problem: when running XMP above 2933 MHz for two sticks, or above 2666 MHz for four sticks, the system becomes more and more unstable over a span of a week to a few months. Browser tabs crash, Explorer and apps crash, and there's a fancy lockup where everything already running keeps working perfectly fine but the system cannot start a new process. Similar if not exactly the same problems occurred in various configurations: different XMP profiles, old vs. fresh installs, big coolers vs. low-profile coolers, expensive memory sticks from the QVL and cheap sticks, boards from different vendors, and so on.

 

Thirdly, the state of the memory (in-system) seems to degrade over time. I had some sticks become literally unusable in Windows after running them for a month on XMP, yet they pass memtest perfectly fine.

 

After years of tinkering with various BIOS settings around these issues, I'm at a loss to figure it out.

My guess would be that some regions are getting sub-par-quality chips, either in the memory kits or in the CPUs, OR there's some tiny detail in the setup of those systems that even I am a big enough idiot to miss over and over again for years, so the everyday Joe buying a Ryzen system isn't protected from this issue by perfect idiot-proofing.
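The "passes memtest but crashes in Windows" symptom is exactly why an in-OS readback test can be useful alongside bare-metal tools. A minimal sketch of the idea in Python (my own toy, not a substitute for MemTest86 or similar; the function name and defaults are made up):

```python
import random

def stress_memory(mib: int = 64, passes: int = 2, seed: int = 0) -> int:
    """Fill a large buffer with a pseudo-random pattern, read it back,
    and count mismatched 4 KiB pages.

    Returns 0 on healthy hardware; a nonzero count would indicate bit
    errors surfacing under a normal OS workload, which is where the
    instability described above shows up even when memtest passes.
    """
    rng = random.Random(seed)
    page = 4096
    errors = 0
    for _ in range(passes):
        pattern = bytes(rng.randrange(256) for _ in range(page))
        buf = bytearray(pattern * (mib * 1024 * 1024 // page))
        # Compare every page against the pattern it was written from.
        for off in range(0, len(buf), page):
            if buf[off:off + page] != pattern:
                errors += 1
        del buf  # release the buffer before the next pass
    return errors

print("mismatched pages:", stress_memory(mib=16, passes=1))
```

On healthy hardware this reports zero mismatches; the point is that it exercises memory under the same OS conditions where the crashes occur, unlike a bare-metal tester.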


3 minutes ago, SaperPL said:

The hot take about Ryzen having issues and Intel being better: I'm not saying either one is better, but I have experienced a lot of XMP issues on all AM4-generation CPUs (I've built over 30 Ryzen systems myself, and no, I'm not counting bulk orders of the same system built over and over again), and I've seen people reporting similar issues in comments under memory kits in our local stores and on Reddit. I haven't had a chance to check AM5/DDR5 yet, though.

 

I've seen multiple people state that they RMA'd everything and came back to the same issue with a whole new set of hardware, and that AMD support wasn't helpful, so they're never trying AMD CPUs again.

 

So, first of all: the Reddit community around Ryzen was fully in fanboy mode whenever I tried discussing the issue and showing proof that it exists. The fact that this problem doesn't seem to exist for YouTubers is also a problem for whoever encounters it.

 

Secondly, what is the problem: when running XMP above 2933 MHz for two sticks, or above 2666 MHz for four sticks, the system becomes more and more unstable over a span of a week to a few months. Browser tabs crash, Explorer and apps crash, and there's a fancy lockup where everything already running keeps working perfectly fine but the system cannot start a new process. Similar if not exactly the same problems occurred in various configurations: different XMP profiles, old vs. fresh installs, big coolers vs. low-profile coolers, expensive memory sticks from the QVL and cheap sticks, boards from different vendors, and so on.

 

Thirdly, the state of the memory (in-system) seems to degrade over time. I had some sticks become literally unusable in Windows after running them for a month on XMP, yet they pass memtest perfectly fine.

 

After years of tinkering with various BIOS settings around these issues, I'm at a loss to figure it out.

My guess would be that some regions are getting sub-par-quality chips, either in the memory kits or in the CPUs, OR there's some tiny detail in the setup of those systems that even I am a big enough idiot to miss over and over again for years, so the everyday Joe buying a Ryzen system isn't protected from this issue by perfect idiot-proofing.

Sample size of 3 and not a single issue*. Have you checked the compatibility list of the motherboard manufacturer?

 

*) I did have a dead RAM stick once, which resulted in infrequent bluescreens until I could narrow it down to that one stick and get a replacement.


24 minutes ago, SkyHound0202 said:

And a bunch of people immediately jumped in without even knowing what I was talking about. Someone brought up the broken Tick-Tock model and was like "130 nm FTW".


Some context to add here: 130 nm was like... 2001.
Here's a late-2000 review. They had Tualatin in the channel in 2001 and Northwood in the channel in January 2002.

https://www.intel.com/pressroom/archive/releases/2000/cn110700.htm

https://www.anandtech.com/show/804

3900x | 32GB RAM | RTX 2080

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
QN90A | Polk R200, ELAC OW4.2, PB12-NSD, SB1000, HD800
 


31 minutes ago, HenrySalayne said:

Sample size of 3 and not a single issue*. Have you checked the compatibility list of the motherboard manufacturer?

 

*) I did have a dead RAM stick once, which resulted in infrequent bluescreens until I could narrow it down to that one stick and get a replacement.

 

For the record, it's not like 30 out of 30 systems had the issues, but I'd say at least a dozen out of 30.

For the first and second generations of Ryzen it was hard to get exactly the sticks with chips from the motherboard's memory QVL, but for the third gen (3600X/3700X) and fifth gen (Ryzen 5600X) it was possible; still, the issue occurred on some of those systems.

 

I shuffled various kits between systems that had the issues, and switching kits didn't really help: a new kit would progressively cause more issues when running XMP, and even after downclocking to 2933, which would normally keep them stable from day 0, they would still have issues, fewer than at full XMP but still there.

 

I'm currently running an F4-3600C16D-32GVKC kit from G.Skill, and it had my board (ASRock B550 Phantom Gaming ITX/ax) listed back when I purchased it:

http://web.archive.org/web/20220507193306/https://www.gskill.com/qvl/165/184/1562831784/F4-3600C16D-32GVKC-QVL (Wayback Machine screenshot; it loads slowly)

But now support for this kit on ASRock boards specifically is not there anymore:

https://www.gskill.com/qvl/165/184/1562831784/F4-3600C16D-32GVKC-QVL

 

And it's not like I only had these problems on ASRock boards; I had issues on boards from ASUS and Gigabyte as well.

 

Anyway, the problem with QVLs in my region is that a lot of the kits available in stores are pretty close by vendor ID but not exact matches; they seem to use dies from the same manufacturer, but it was still hard to get an actual QVL match for most of the AM4 generation.


I'm going to have to disagree that Discord is only popular because MSN Messenger and Skype were worse. Neither was built with live communities in mind: you had to know beforehand who you were connecting to, and adding the live-community feature plus everything else Discord has would have been a radical overhaul. Discord works because it takes the old bulletin-board concept and makes it live, letting you connect with people of similar interests by searching for, or being invited to, a persistent group instead of a one-off session, and preserving old logs so you can catch up. It's the LMG forums, but with an option for live feedback without reloading, and an ideal home for the old role-play boards. None of which MSN Messenger or Skype could do without a massive overhaul.


You guys have got to look at your sponsors and the timing. I mean, really? An XSplit sponsor right after YOU just got hacked? You do realize they got MASSIVELY hacked a while back?


As a 4090 owner, I can 100% confirm ray tracing is useless.

And I couldn't care less if it lets corporations shift their expenses onto me by forcing me to buy a better GPU while they save a buck on development time.

Do your job. And if you have to add fewer loot boxes and less false advertising to instead make better lighting manually, do it.


Brand fanboying aside, there are things one can expect from AMD and never from Intel, for example a budget CPU made for overclocking. After its release, the Athlon 3000G became an instant hit among budget builders, the same way budget versions of the FX and Phenom chips were back then.

Yes, I had an account here before. Do not ask me about something related to current political events in the part of the planet I live in - I wouldn't answer that for my own sake and safety. Feel free to address me with any other kind of questions.


My hot take on upcoming technologies:

Considering just how primitive our current battery tech is compared to the progress in semiconductors, I think the most important technological advancement coming in the near future is solid-state batteries. Apart from longer lifespan and larger capacity for smartphones and EVs, they will also help improve the efficiency of renewable-energy storage. Very important for the planet!


1 hour ago, nekollx said:

I'm going to have to disagree that Discord is only popular because MSN Messenger and Skype were worse. Neither was built with live communities in mind: you had to know beforehand who you were connecting to, and adding the live-community feature plus everything else Discord has would have been a radical overhaul. Discord works because it takes the old bulletin-board concept and makes it live, letting you connect with people of similar interests by searching for, or being invited to, a persistent group instead of a one-off session, and preserving old logs so you can catch up. It's the LMG forums, but with an option for live feedback without reloading, and an ideal home for the old role-play boards. None of which MSN Messenger or Skype could do without a massive overhaul.

Discord was also able to displace Mumble, TeamSpeak, etc. because it was very easy to set up. You didn't need dedicated hardware or money, because the business model was different.

AMD Ryzen 5 3600 | AsRock B450M-Pro4 | Zotac GTX 3070 Ti

Shure SRH840A | Sennheiser Momentum 2 AEBT | LG C9 55"


I have a question for Linus: if motion blur is only for hiding low fps, tell me how to get the sensation of speed in racing games without having a huge map, because the cars in these games don't really move at their real-life speeds; otherwise the map would feel much smaller.

Made In Brazil 🇧🇷


In regards to the future-proofing hot take: I'll never advocate buying top of the line for most people, but if I'm buying an Intel CPU, I'm going to assume that when it's time to replace it, there will also be a new socket and I'll need to update my motherboard as well, maybe the RAM too. So I tend to buy a little higher than I might need at the time. I can hold off on several parts, but it likely isn't as cheap as just grabbing a new CPU.


I've got to agree with the guy who said the EU was wrong to force USB-C on Apple. I'm not a fan of Apple, but people seem to have short memories: USB-C was not the high-speed replacement for Micro-USB; that was Micro-USB 3.0 (an enlarged Micro-USB port that was still single-sided). The only reason USB-C exists in the first place is that Apple brought out Lightning and embarrassed the USB Implementers Forum into creating a knee-jerk competing port. Forcing Apple to use USB-C is absolutely going to limit innovation, as the only people allowed to innovate a replacement will be a consortium with a track record of **** poor innovation 😞


11 minutes ago, themrsbusta said:

I have a question for Linus: if motion blur is only for hiding low fps, tell me how to get the sensation of speed in racing games without having a huge map, because the cars in these games don't really move at their real-life speeds; otherwise the map would feel much smaller.

I agree that this is a valid use, but the sensation of speed can also be achieved by changing the camera FOV while you're accelerating. Various games do exactly this, together with some simple blurred lines as "wind", which is not exactly motion blur.

 

I feel like Linus omitted this part of the problem because he focused specifically on the overuse of blur in non-vehicular games, where you walk around and get blur while turning...
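The FOV trick described above can be sketched like this (a hedged illustration; the function name, constants, and easing curve are my own, not taken from any particular game or engine):

```python
# Widen the camera's field of view as the car speeds up, instead of
# (or alongside) motion blur, to convey acceleration.

def speed_fov(speed_kmh: float,
              base_fov: float = 75.0,
              max_extra: float = 15.0,
              top_speed_kmh: float = 300.0) -> float:
    """Return a camera FOV in degrees that grows smoothly with speed."""
    t = max(0.0, min(1.0, speed_kmh / top_speed_kmh))  # clamp to [0, 1]
    # Ease-out curve: most of the widening happens early, which reads
    # as acceleration rather than a constant zoom at top speed.
    return base_fov + max_extra * (1 - (1 - t) ** 2)

assert speed_fov(0) == 75.0          # at rest: base FOV
assert abs(speed_fov(300) - 90.0) < 1e-9  # at top speed: fully widened
```

Tying the widening to an ease-out curve keeps the effect pronounced while accelerating without a permanent zoom once the car settles at top speed.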

