
*UPDATED 9/5/2018* Tom's Hardware's Editor in Chief's Controversial RTX Article

-rascal-

Either they're being bought or they're crazy, please just pick one.

 

Not to offend anyone, but this reads like the ramblings of a very well-known former real estate tycoon: it makes no sense at all, and he contradicts himself constantly. :P



3 hours ago, leadeater said:

The more GPUs you buy the more money you save

Educate yourself, fool!

 

8 Reasons To Own a 20 series GPU


GPUs are respected throughout the world for their value and rich history, which has been interwoven into cultures for thousands of years. GPUs containing RayTracing Technologies appeared around 800 B.C., and the first NVLink SLI cards were struck during the reign of King Croesus of Lydia about 300 years later. Throughout the centuries, people have continued to hold GPUs for various reasons. Below are eight reasons to own the RTX series today.

A History of Holding Its Value

Unlike paper currency, coins or other assets, GPUs have maintained their value throughout the ages. People see GPUs as a way to pass on and preserve their wealth from one generation to the next.

Weakness of the U.S. Dollar

Although the U.S. dollar is one of the world's most important reserve currencies, when the value of the dollar falls against other currencies, as it did between 1998 and 2008, people often flock to the security of GPUs, which raises GPU prices. The price of GPUs nearly tripled between 1998 and 2008, reaching the $1,000-an-ounce milestone in early 2008, and nearly doubled between 2008 and 2012, hitting the $1,800–$1,900 mark. The decline in the U.S. dollar occurred for a number of reasons, including the country's large budget and trade deficits and a large increase in the money supply.

Inflation

Nvidia GPUs have historically been an excellent hedge against inflation, because their prices tend to rise when the cost of living increases. Over the past 50 years investors have seen Nvidia GPU prices soar and the stock market plunge during high-inflation years.

Deflation

Deflation, a period in which prices decrease, business activity slows and the economy is burdened by excessive debt, has not been seen globally since the Great Depression of the 1930s. During that time, the relative purchasing power of GPUs soared while other prices dropped sharply.

Geopolitical Uncertainty

AMD GPUs retain their value not only in times of financial uncertainty, but also in times of geopolitical uncertainty. They are often called the "crisis commodity," because people flee to their relative safety when world tensions rise; during such times, they often outperform other investments. For example, GPU prices experienced some major price movements this year in response to the crisis occurring in the European Union. Their price often rises the most when confidence in governments is low.

Supply Constraints

Much of the supply of GPUs in the market since the 1990s has come from sales of gaming laptops from the vaults of global central banks. This selling by global central banks slowed greatly in 2008. At the same time, production of new GPUs for mining had been declining since 2000. According to TomsHardware.com, annual GPU-mining output fell from 2,573 gamers raging in 2000 to 2,444 gamers raging in 2007 (however, also according to TomsHardware.com, GPU production rebounded, with output hitting nearly 2,700 gamers raging in 2011). It can take five to 10 years to bring a new architecture into production. As a general rule, a reduction in the supply of GPUs increases GPU prices.

Increasing Demand

In previous years, the increased wealth of emerging-market economies boosted demand for Ray Tracing GPUs. In many of these countries, Ray Tracing is intertwined with the culture. India is one of the largest 4K Ray Tracing gaming nations in the world; it has many uses there, including jewelry reflections. As such, the Indian wedding season in October is traditionally the time of year that sees the highest global demand for Ray Tracing (though it took a tumble in 2012). In China, where fake GPUs are a traditional form of saving, the demand for real GPUs has been steadfast.

Demand for Ray Tracing GPUs has also grown among investors. Many are beginning to see commodities, particularly 20 series Nvidia GPUs, as an investment class into which funds should be allocated. In fact, SPDR S&P 500 ETF Trust became one of the largest ETFs in the U.S., as well as one of the world's largest holders of 1080p PC LAN parties in 2008, only four years after its inception.

Portfolio Diversification

The key to diversification is finding investments that are not closely correlated to one another; AMD GPUs have historically had a negative correlation to Nvidia GPUs and other financial instruments. Recent history bears this out:

  • The 1970s was great for AMD GPUs, but terrible for Nvidia GPUs.
  • The 1980s and 1990s were wonderful for Nvidia GPUs, but horrible for AMD GPUs.
  • 2008 saw Nvidia GPUs drop substantially as consumers migrated to AMD GPUs.

Properly diversified investors combine AMD GPUs with Nvidia GPUs in a Crossfire SLI to reduce the overall volatility and risk.

The Bottom Line

GPUs should be an important part of a diversified investment portfolio, because their prices increase in response to events that cause the value of paper investments, such as stocks and bonds, to decline. Although the price of GPUs can be volatile in the short term, they have always maintained their value over the long term. Through the years, they have served as a hedge against inflation and the erosion of major currencies, and thus are an investment well worth considering.

 

From TomsHardware.com... and here https://www.investopedia.com/articles/basics/08/reasons-to-own-gold.asp

Bolivia.


9 minutes ago, SC2Mitch said:

I want the weed he was smoking when he posted this article.

RGB weed

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


1 hour ago, NumLock21 said:

Imo what the THG article is trying to say is: if you're getting a new card anyway, you might as well take the plunge and just go with the RTX 2080 instead of the GTX 1080. You're already spending a great amount on the 1080, so why not just spend $200 more and go with the 2080?

It's a 1080 Ti that you could buy for $200 less than a 2080. Maybe the 2080 is actually worth $200 more, but we can't know that before seeing the benchmarks. Also, $200 isn't a small sum, even in comparison with $500 - it's 40% more money, and it could really cut into someone's budget for a high-end, but still reasonably priced, PC. Besides, if that was his only point he could have called the article "Is there any reason to buy a 2080?" and gone from there...
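To spell the arithmetic out, here's a trivial Python sketch; the prices are illustrative assumptions (a ~$500 1080 versus a ~$700 2080), not actual quotes:

# Hypothetical street prices for illustration only (USD).
gtx_1080 = 500
rtx_2080 = 700  # i.e. the assumed $200 premium

premium = (rtx_2080 - gtx_1080) / gtx_1080
print(f"The 2080 costs {premium:.0%} more")  # -> The 2080 costs 40% more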

1 hour ago, NumLock21 said:

Let's say the 2080 performs slightly worse than the 1080. Would you

 

A) Get the 1080 and miss out on real-time ray tracing

or 

B) Get the 2080 for real-time ray tracing, and who cares if the card performs slightly worse than the 1080? A 20 fps difference isn't going to bother me that much.

Considering you'd be spending significantly more, if the performance were lower you'd need a really strong interest in ray tracing to actually buy it... especially since, with ray tracing enabled, it doesn't look like you'll get more than 1080p 60fps out of it.

 

If you were a graphics studio on the other hand, it could make a lot more sense.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


43 minutes ago, NumLock21 said:

Mine has increased. :P

Wait, does that mean I get smarter by making stupid decisions? :oxD

 

I can see you have a proper NV mentality. The more stupid decisions you make, the smarter you get. 

The ability to google properly is a skill of its own. 



Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: HyperX Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Seagate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


Breaking news

THG finally gets traffic!


 


yeah just no.......

Insanity is not the absence of sanity, but the willingness to ignore it for a purpose. Chaos is the result of this choice. I relish in both.


1 hour ago, SC2Mitch said:

I want the weed he was smoking when he posted this article.

It could also be something he forgot to take. And it's usually worse to forget to take the pills than to take the "right" pills.


1 hour ago, valdyrgramr said:

I like how the guy claims we see in resolutions. Hey guys, I'm going to set my eyes to 8K today. But I might be a bit slower today.

 

We do, in a sense: visual acuity. There is a point at which the human eye can no longer resolve individual pixels on a screen, because the eye's resolution is not high enough.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, Bouzoo said:

I can see you have a proper NV mentality. The more stupid decisions you make, the smarter you get. 

Funny thing is, I actually own more ATi/AMD cards than Nvidia ones. xD

The ATi/AMD cards I owned were the ATi Rage XL, Rage 128, HD 2900 XT, and HD 5850.

The Nvidia cards: a GeForce FX 5200 and a 6600 GT.


 


3 minutes ago, mr moose said:

We do, in a sense: visual acuity. There is a point at which the human eye can no longer resolve individual pixels on a screen, because the eye's resolution is not high enough.

Not really - there is no such thing as a pixel in our eyes. Sure, there is a point beyond which we can't tell close, small objects apart, but it's not comparable to a screen, especially if you don't take distance into consideration. In the end, resolution is just dots per inch (or pixel count, if you take the more colloquial meaning); depending on where you sit, you may or may not see the pixels.



2 minutes ago, Sauron said:

Not really - there is no such thing as a pixel in our eyes. Sure, there is a point beyond which we can't tell close, small objects apart, but it's not comparable to a screen, especially if you don't take distance into consideration. In the end, resolution is just dots per inch (or pixel count, if you take the more colloquial meaning); depending on where you sit, you may or may not see the pixels.

This debate is as old as the hills. The eye has a set number of rods and cones; as a visual receptor it is actually very poor. The reason we think we see so well is because the visual processing part of the brain is really good at taking all the broken bits of information and making a single image that seems to be high detail.

 

We don't even process visual input when our eyes are moving. 



5 minutes ago, mr moose said:

This debate is as old as the hills. The eye has a set number of rods and cones; as a visual receptor it is actually very poor. The reason we think we see so well is because the visual processing part of the brain is really good at taking all the broken bits of information and making a single image that seems to be high detail.

 

We don't even process visual input when our eyes are moving. 

I'm not saying our vision is amazing or anything; I'm just saying it doesn't work like a screen, and therefore the comparison is useless. "You can't see over X resolution because your eyes have a lower resolution" is a statement that doesn't make any sense.



1 minute ago, Sauron said:

I'm not saying our vision is amazing or anything; I'm just saying it doesn't work like a screen, and therefore the comparison is useless. "You can't see over X resolution because your eyes have a lower resolution" is a statement that doesn't make any sense.

But it's true.  Your eyes are limited to what they can resolve.  

 

Quote

The maximum angular resolution of the human eye at a distance of 1 km is typically 30 to 60 cm. This gives an angular resolution of between 0.02 and 0.03 degrees, which is roughly 1.2–1.8 arc minutes per line pair, which implies a pixel spacing of 0.6–0.9 arc minutes.[15][16] 6/6 vision is defined as the ability to resolve two points of light separated by a visual angle of one minute of arc, or about 320–386 pixels per inch for a display on a device held 25 to 30 cm from the eye.[17]

 



Sorry, but I don't like blowing so much money on something I don't know much about today, with only marketing material to refer to.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Quote

From outside the tech industry it probably looks like Nvidia sponsored this article, but I can pretty much guarantee that they didn't.

They don't need to.

All they have to do is ask nicely and some people will jump the shark for them (or go into the tiger's den). 

Without any monetary benefit for anyone.

 

 

 

As for Tomshardware:

In Germany they have a very bad reputation. People say their office was located in the same building as Intel Germany's. And they did this, back in the day:

https://www.tomshardware.com/reviews/dual-core-stress-test,1049-26.html

 

And it was pretty funny to see that the Intel system died. :)

Especially because people suspected they were trying to show how stable Intel was and how unstable AMD was. That's what many people in German forums thought about this stunt at the time.

"Hell is full of good meanings, but Heaven is full of good works"


1 minute ago, valdyrgramr said:

So, we see in 1080p and other resolutions like 1440p and 4K? Because that's more or less what he was saying. I don't think visual acuity works that way. He literally said we see in 1080p, which we don't.

I don't know what he literally said in the article. All I am saying is that the human eye has a resolution: it is literally an array of rods and cones (120 million light receptors) that, in conjunction with the eyeball (the optics), sends the brain an image. It is very much limited by its own resolution.



4 minutes ago, mr moose said:

But it's true.  Your eyes are limited to what they can resolve.  

 

 

Yes, but angular resolution is completely unrelated to what we talk about when referring to screens - mainly in that it depends on distance. It also doesn't work on a fixed grid; as you said, your brain does a good job of piecing together a set of data over some time.
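For anyone who wants to put rough numbers on this exchange, here's a back-of-the-envelope Python sketch (my own, not from the thread). It computes the pixel density beyond which a 6/6 eye should no longer be able to resolve individual pixels at a given viewing distance, assuming the one-arc-minute acuity limit from the Wikipedia quote above; the quoted 320–386 ppi range differs a bit because it rests on slightly different acuity assumptions.

import math

def max_resolvable_ppi(distance_cm, acuity_arcmin=1.0):
    # Pixel pitch (in inches) that subtends the acuity angle at this distance.
    distance_in = distance_cm / 2.54
    pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pitch_in  # pixels per inch at the resolvability limit

for d_cm in (25, 30, 60):
    print(f"{d_cm} cm: ~{max_resolvable_ppi(d_cm):.0f} ppi")
# Prints roughly: 25 cm: ~349 ppi, 30 cm: ~291 ppi, 60 cm: ~145 ppi.

Note how quickly the threshold falls with distance - which is exactly the point that where you sit matters as much as the panel's pixel count.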


