
A talking point that targets Linus's argument about not receiving top-of-the-line GPU cards for review, essentially giving the counter-argument to it.

The arguments are simple from both sides: give us a fair review, and I will give you a fair review.

 

Summary

CES - CNET sounds like it's giving the counter-spiel to Linus's argument about top-of-the-line GPU purchases. His argument was that he won't change his opinion in order to receive free cards for review.

 

Quotes

Quote

You were enjoying it for the time you had it.

 

My thoughts

I disagree with Linus's review of cutting-edge features on graphics cards. The need for them is strong for creators, especially creators who are making next-gen content. The same goes for businesses with upgrade cycles and budgets that come around every couple of years: even if the features won't be fully usable within that generation, they still add longevity to the business's operations. I do not consider gaming in any choice I make when purchasing computers, but my requirements are higher than your average first-party game: machine learning, computer vision, photogrammetry, Unity3D development, and rendering. I never want the bottleneck in my programming cycle to be the couple hundred dollars I saved; my time is worth the extra investment.

 

I disagree with any company that scripts its reviews; that's moving closer to Apple, which pays for its reviews. I want to hear the fanboys rave so that I can disagree with their rhetoric, since it's obvious. I want to hear someone's arguments for a product because it reveals any hidden agendas and highlights the things I agree with or don't care about. I am happy that Linus shared both his overall opinion and the explanation he was given for why he won't be receiving free hardware. It allows me to agree or disagree along the way and then formulate my own opinion. Ray tracing in these cards only leaves a taste in your mouth until the next-gen cards come out. I can decide that I do or don't need that feature, and I can also decide it's an unneeded addition to the bundle.

 

Nvidia wants to be the Ferrari: they send lawyers to your house when they see on your Instagram that you painted your car in pink and black checkers. Nvidia wants the buffer that Intel had with its CPUs, enough arguments to survive the storm. AMD made the best choice in history among them all when it purchased ATI; I only wish they had shaved off the Radeon name at the same time. The advantages are obvious when tackling two architectures that should never be separate. The CPU-GPU merge will have to happen one day, and you can see Apple giving a glimpse of these ideas today. Nvidia is in trouble when it starts to micromanage reviewers; it's the same with Nintendo and its hammer on YouTube reviewers. We don't like Nintendo for next-gen, bleeding-edge graphics; we like Nintendo for its product history, comfort, and enjoyability. I believe Nvidia and Intel will fall into the ranks of Nintendo unless either of them can make a proper transition into the other one's playing field.

 

I am not asking for Linus to agree or disagree with me. I will delete my post should he try to. 

 

Sources CNET

 https://youtu.be/P4IhmCKaMVk?t=1181

Sources Linus

https://youtu.be/iXn9O-Rzb_M?t=261


o....k.

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200MHz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


5 minutes ago, WereCatf said:

Um, now that's an incoherent ramble without any point.

I thought I was the only one who thought that, lol...

Folding Stats

 

SYSTEM SPEC

AMD Ryzen 5 5600X | Motherboard Asus Strix B550i | RAM 32GB 3200MHz Crucial Ballistix | GPU Nvidia RTX 3070 Founders Edition | Cooling Barrow CPU/Pump Block, EKWB Vector GPU Block, Corsair 280mm Radiator | Case NZXT H1 | Storage Sabrent Rocket 2TB, Samsung SM951 1TB

PSU NZXT S650 SFX Gold | Display Acer Predator XB271HU | Keyboard Corsair K70 Lux | Mouse Corsair M65 Pro  

Sound Logitech Z560 THX | Operating System Windows 10 Pro


11 minutes ago, schwellmo92 said:

still no clue wtf you are rambling about

No one else in this thread has any clue about that, either. Including the OP themselves!

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


1 hour ago, WereCatf said:

Um, now that's an incoherent ramble without any point.

Wow, so I am not the only one. I read through it and my brain just didn't comprehend it.

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aenean vel luctus dolor. Aliquam convallis hendrerit erat dignissim sodales. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas.

 

Don't be bothered by the toxic idiots of the community. Somebody will always try to get you annoyed. The best fight is not giving them any attention. Never forget this!!

Spoiler

print("Hello World")

 


7 hours ago, Looking for a job Linus said:

The arguments are simple from both sides: give us a fair review, and I will give you a fair review.

You have the wrong timestamp for the WAN Show. The current one points to the GamersNexus vs Nvidia thing, not anything about how much midrange GPUs should cost and how they should perform in two years' time, which is what the CNET video is talking about. Or maybe you wanted to reference something else in the CNET video? Either way, mixing two different topics like this is not a very good idea.

 

Quote

I disagree with Linus's review of cutting-edge features on graphics cards. The need for them is strong for creators, especially creators who are making next-gen content. The same goes for businesses with upgrade cycles and budgets that come around every couple of years: even if the features won't be fully usable within that generation, they still add longevity to the business's operations. I do not consider gaming in any choice I make when purchasing computers, but my requirements are higher than your average first-party game: machine learning, computer vision, photogrammetry, Unity3D development, and rendering. I never want the bottleneck in my programming cycle to be the couple hundred dollars I saved; my time is worth the extra investment.

So if you want to reference videos, then besides timestamps you need to explain what actual line or point you are talking about. Otherwise it comes out as senseless rambling, as it does here. You can do it without quote boxes, for example:

Quote

Linus and Luke expressed their worry about how review culture might change if manufacturers can bully reviewers like they did with Steve. I agree that reviewers' integrity should always be more important than everything else, and a company trying to dismantle that should get some bad press from the tech community as a whole.

 

 

Quote

I am not asking for Linus to agree or disagree with me. I will delete my post should he try to.

For one, report it if you really want that. For two, why bother posting if you don't want criticism or a response? Saying "I have an opinion and I will take it back if someone tries to agree or disagree" is a very odd stance to take. Have some backbone and stand by your opinions, or don't bother posting them at all.

Edited by LogicalDrm

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv


7 hours ago, Looking for a job Linus said:

I am not asking for Linus to agree or disagree with me. I will delete my post should he try to. 

I don't think you... can delete your own post?

elephants


Reporting this cuz it made my head hurt.

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma : i9 9900K @5.1Ghz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600X3D - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200MHz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU -  Cooler Master MasterBox Q300L -  Samsung 27" 1080p

 

Plex : AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400MHz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 


Just now, Dedayog said:

Reporting this cuz it made my head hurt.

2 mods have already commented 🤔

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


*** Also cleaned ***

 

If you want to chit-chat, do it in the dedicated thread(s). Even if the OP might not be very comprehensible to you, that doesn't give you the right to post non-contributing stuff.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv


7 hours ago, Looking for a job Linus said:

My thoughts

I disagree with Linus's review of cutting-edge features on graphics cards. The need for them is strong for creators, especially creators who are making next-gen content. The same goes for businesses with upgrade cycles and budgets that come around every couple of years: even if the features won't be fully usable within that generation, they still add longevity to the business's operations. I do not consider gaming in any choice I make when purchasing computers, but my requirements are higher than your average first-party game: machine learning, computer vision, photogrammetry, Unity3D development, and rendering. I never want the bottleneck in my programming cycle to be the couple hundred dollars I saved; my time is worth the extra investment.

Finally a paragraph that makes sense.

Buy a Titan if you are worried, but a lack of features won't necessarily bottleneck you. Unity3D hits the system like any other game, so it should not be a problem to run on a gaming GPU.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

Now I really have to get out of here before I start to talk about incoherent texts, moderation, and politics.

ಠ_ಠ


I will clean up my post later. I may have linked the wrong video, and I seem to have lost everyone in my jibber-jabber. My sarcasm was even missed when I said that I'd delete my post; it was a jab at not getting free stuff for bad reviews.

 


I think gaining an appropriate skill set and experience, then compiling it into a professional CV and sending it with a suitable cover letter, will yield better results than this OP.


3 minutes ago, Amias said:

I think gaining an appropriate skill set and experience, then compiling it into a professional CV and sending it with a suitable cover letter, will yield better results than this OP.

I would agree with that, but it's funnier watching Paint.net in videos, and it's funnier seeing pyramid builds with fitting issues. I wanted a job, but I probably would not get one because I am the guy who will point out to the boss that he has a booger on his face so he can correct it, rather than be one of the employees who stay silent. Too many times, watching videos I do love, I see co-hosts allowing Linus to continue without being corrected because he is King. I could continue, but it's uncalled for and I am not interested. I do love your quote here; I find it quite appropriate to the flow of this thread.


1 hour ago, Looking for a job Linus said:

I would agree with that, but it's funnier watching Paint.net in videos, and it's funnier seeing pyramid builds with fitting issues. I wanted a job, but I probably would not get one because I am the guy who will point out to the boss that he has a booger on his face so he can correct it, rather than be one of the employees who stay silent. Too many times, watching videos I do love, I see co-hosts allowing Linus to continue without being corrected because he is King.

 

To educate anybody, you have to be able to communicate effectively. So far it seems nobody reading this can figure out what point you're trying to make. 

 

People who are willing to stand up to the boss when he's wrong may be rare, but people who think they know a lot but have zero people skills or communication ability are, unfortunately, a dime a dozen. 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


21 hours ago, Looking for a job Linus said:

I will clean up my post later. I may have linked the wrong video, and I seem to have lost everyone in my jibber-jabber. My sarcasm was even missed when I said that I'd delete my post; it was a jab at not getting free stuff for bad reviews.

 

Sarcasm rarely carries over in plain text, more so when the rest of the post isn't easy to comprehend. If you want to point out sarcasm (or jokes, for that matter), we usually use /s and /jk. Using those will also help when and if moderation needs to evaluate things.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv


  • 1 month later...
On 1/13/2021 at 12:27 AM, Looking for a job Linus said:

Nvidia wants to be the Ferrari: they send lawyers to your house when they see on your Instagram that you painted your car in pink and black checkers. Nvidia wants the buffer that Intel had with its CPUs, enough arguments to survive the storm. AMD made the best choice in history among them all when it purchased ATI; I only wish they had shaved off the Radeon name at the same time. The advantages are obvious when tackling two architectures that should never be separate. The CPU-GPU merge will have to happen one day, and you can see Apple giving a glimpse of these ideas today. Nvidia is in trouble when it starts to micromanage reviewers; it's the same with Nintendo and its hammer on YouTube reviewers. We don't like Nintendo for next-gen, bleeding-edge graphics; we like Nintendo for its product history, comfort, and enjoyability. I believe Nvidia and Intel will fall into the ranks of Nintendo unless either of them can make a proper transition into the other one's playing field.

 

I am not asking for Linus to agree or disagree with me. I will delete my post should he try to. /JK

 

Well, it's been some time now; I hope you have all cooled off and collected your heads. I believe my post was tech news, as it predicted two strong moves in the industry: Intel releasing strong hardware on the GPU side, and now Nvidia making a strong move into the CPU market. As I mentioned before, Intel had a good buffer on CPUs, but its lack of advancement on GPUs left a huge advantage for a company like AMD, which is focused on both CPUs and GPUs. A CPU architecture is still very much a single-process thought pattern; even with threading and multiple cores, we are just handling more than one process in parallel. A GPU is a different beast altogether, and CPUs need to make the gains GPUs have made with respect to core counts. Nvidia seems to be doing what I predicted: move into the CPU market or wither away as Intel has been doing lately. Intel is doing the same thing and pushing forward on GPUs, and I would argue they have sacrificed resources internally to make it happen.
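Here is a rough Python sketch of the difference I mean between the CPU and GPU ways of working. It is only an analogy: NumPy vectorization stands in for the GPU's data-parallel dispatch, and a process pool stands in for a handful of CPU cores; nothing here reflects a real driver or scheduler.

# A loose illustration only: a few CPU cores each run their own task,
# while the GPU-style path applies one operation across every element at once.
# NumPy's vectorization stands in for the GPU here; it is not a real GPU dispatch.
from concurrent.futures import ProcessPoolExecutor

import numpy as np


def cpu_style(chunks):
    # Task parallelism: a handful of workers, each summing its own chunk.
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(sum, chunks))


def gpu_style(values):
    # Data parallelism: one operation broadcast over millions of elements.
    return float(np.sum(values * 2.0))


if __name__ == "__main__":
    data = np.arange(1_000_000, dtype=np.float64)
    doubled_chunks = [c * 2.0 for c in np.array_split(data, 8)]  # pretend we have 8 cores
    print(cpu_style(doubled_chunks))   # same answer...
    print(gpu_style(data))             # ...different execution model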

 

Here is my next prediction, for one to four years from now. I would like to say Asus will do this, but I am thinking Gigabyte will probably be first; who knows, maybe MSI or someone else will beat them to it. I think the real moves will be in displays, with less focus on GPUs. The machine-learning chips showing up in TVs are a first look at what can be done with groups of pixels. There is going to be a push for displays that can understand larger areas of the screen. For instance, it's great when a display can handle stair-stepping with some anti-aliasing built in; newer machine-learning chips that can take payloads will be revolutionary. A payload here is something like a GPU target that can be addressed directly, accessible by the graphics card or a software layer for rendering. Handling the whole world of possibilities for increasing resolution where information is missing is the ultimate goal, but landing on that page early can be achieved with some simple optimization.

 

Say you have Game A, GPU B, and Display C.

Game A runs its textures through machine learning on a huge GPU farm, lighting each texture every single way: angles, reflections, light passes. The result is a highly optimized, finished machine-learning model that is compact and specific to Game A.

Now GPU B delivers its best-in-class ability, i.e. your budget GPU. Then Display C takes the resulting picture and, in real time, applies Game A's payload to that area of the screen, or that window's worth of pixels.

The result is interesting in different scenarios, since you may be running in windowed mode: experiencing the Windows OS at 4K and a game in a window running a super resolution of 16K, for kicks. I say 16K because the model built into the monitor's payload is only uncovering the information missing from the picture; for instance, the soup cans that the GPU delivered as unreadable are now legible thanks to the display and Game A's payload algorithms.
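To make the Game A / GPU B / Display C flow concrete, here is a minimal Python sketch. Every name in it is made up for illustration (there is no real display or driver API like this), and a plain sharpening kernel stands in for the compact, game-specific model the display would actually run.

# A sketch of the idea above, with made-up names throughout (no real display,
# driver, or game API is being used). The "payload" stands in for a compact,
# game-specific model shipped to the display; a plain sharpening kernel fills
# in for the learned weights so the example stays self-contained.
import numpy as np


def game_a_payload(region):
    # Pretend per-game model: enhance only the pixels GPU B already rendered.
    kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=np.float32)
    padded = np.pad(region, ((1, 1), (1, 1), (0, 0)), mode="edge")
    out = np.zeros(region.shape, dtype=np.float32)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + region.shape[0],
                                           dx:dx + region.shape[1], :]
    return np.clip(out, 0, 255).astype(region.dtype)


def display_c_compose(frame, window, payload):
    # Display C applies the payload only to the game's window, not the whole desktop.
    x, y, w, h = window
    frame = frame.copy()
    frame[y:y + h, x:x + w] = payload(frame[y:y + h, x:x + w])
    return frame


if __name__ == "__main__":
    desktop = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)  # 4K frame from GPU B
    enhanced = display_c_compose(desktop, (100, 100, 1280, 720), game_a_payload)
    print(enhanced.shape)  # (2160, 3840, 3): desktop untouched, game window enhanced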

 

Linus, congrats on the SolidWorks-sponsored video. It's close to my heart; how appropriate for Valentine's Day.

Cheers,


11 hours ago, Looking for a job Linus said:

Well, it's been some time now; I hope you have all cooled off and collected your heads. I believe my post was tech news, as it predicted two strong moves in the industry: Intel releasing strong hardware on the GPU side, and now Nvidia making a strong move into the CPU market. As I mentioned before, Intel had a good buffer on CPUs, but its lack of advancement on GPUs left a huge advantage for a company like AMD, which is focused on both CPUs and GPUs. A CPU architecture is still very much a single-process thought pattern; even with threading and multiple cores, we are just handling more than one process in parallel. A GPU is a different beast altogether, and CPUs need to make the gains GPUs have made with respect to core counts. Nvidia seems to be doing what I predicted: move into the CPU market or wither away as Intel has been doing lately. Intel is doing the same thing and pushing forward on GPUs, and I would argue they have sacrificed resources internally to make it happen.

Here is my next prediction, for one to four years from now. I would like to say Asus will do this, but I am thinking Gigabyte will probably be first; who knows, maybe MSI or someone else will beat them to it. I think the real moves will be in displays, with less focus on GPUs. The machine-learning chips showing up in TVs are a first look at what can be done with groups of pixels. There is going to be a push for displays that can understand larger areas of the screen. For instance, it's great when a display can handle stair-stepping with some anti-aliasing built in; newer machine-learning chips that can take payloads will be revolutionary. A payload here is something like a GPU target that can be addressed directly, accessible by the graphics card or a software layer for rendering. Handling the whole world of possibilities for increasing resolution where information is missing is the ultimate goal, but landing on that page early can be achieved with some simple optimization.

Say you have Game A, GPU B, and Display C.

Game A runs its textures through machine learning on a huge GPU farm, lighting each texture every single way: angles, reflections, light passes. The result is a highly optimized, finished machine-learning model that is compact and specific to Game A.

Now GPU B delivers its best-in-class ability, i.e. your budget GPU. Then Display C takes the resulting picture and, in real time, applies Game A's payload to that area of the screen, or that window's worth of pixels.

The result is interesting in different scenarios, since you may be running in windowed mode: experiencing the Windows OS at 4K and a game in a window running a super resolution of 16K, for kicks. I say 16K because the model built into the monitor's payload is only uncovering the information missing from the picture; for instance, the soup cans that the GPU delivered as unreadable are now legible thanks to the display and Game A's payload algorithms.

Linus, congrats on the SolidWorks-sponsored video. It's close to my heart; how appropriate for Valentine's Day.

Cheers,

You're really good at rambling walls of text?  I still have no idea what you're trying to express with these posts.

 

Anyway, a few random comments on the points you tried to make:

 

CPUs today are still based on the x86 architecture (drastically different from the x86 of the 1980s, but still mostly the same core). This limits certain things in how they can function. There have already been heavily multi-threaded CPUs out there, but they're not x86.

 

Nvidia has no real reason to try to beat Intel or AMD at the CPU game; their products make more money than CPUs do.

Intel wants to get into graphics to expand its market share and not rely on other vendors as much (we'll see whether it pays off). AMD already has this advantage.

 


11 hours ago, Looking for a job Linus said:

Well, it's been some time now

 

Yes, a month has passed and nothing you're saying makes any more sense than it did in January.

 

Quote

I believe my post was tech news, as it predicted two strong moves in the industry: Intel releasing strong hardware on the GPU side, and now Nvidia making a strong move into the CPU market.

Apart from the fact neither of these things has happened, sure.

 

Quote

Nvidia seems to be doing what I predicted: move into the CPU market or wither away as Intel has been doing lately.

 

Nvidia is making money hand over fist selling GPUs, and demand is so high that it is even bringing back entry-level GPUs from five years ago to sell at inflated prices, so I guess your definition of withering away is pretty unique.

 

Linus mentioned earlier in today's stream they may eventually hire people for written content. I can't wait for you to apply. 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


5 hours ago, Middcore said:

Linus mentioned earlier in today's stream they may eventually hire people for written content. I can't wait for you to apply. 

I find it interesting that you were watching the live stream today. I also find the posts you make on the forums interesting, 800+ of them. Are you an LTT insider? It seems obvious to me when reading your posts, but I might have a prejudiced opinion.

 

"Can't wait for you to apply". This is said in such a way that you would get notified should I even apply.

 

ATTN Linus: paying people to fill your forums with supporting content is the same as paying for a good review of your product.

 

But I may have misspoken, or I might not have.

 

I am just a nice guy.

Cheers,


Guest
This topic is now closed to further replies.
