You'd be a fool to buy an RTX 3070

About $50 more gets you double the VRAM, Infinity-whatever-they-call-it with Ryzen 5000, and likely better performance than a 2080Ti (which I'd say is still slightly better than a 3070).

 

Prove me wrong. 

AMD Ryzen 5 3600  |  MSI B550 Tomahawk  |  32GB G.SKILL 3600 CL16 4x8GB |  PowerColor Red Devil 5700XT  | Creative Sound Blaster Z  |  WD Black SN850 500GB NVMe  |  WD Black SN750 2TB NVMe  |  WD Blue 1TB SATA SSD  |  Corsair RM850x  |  Corsair 4000D  |  LG 27GL650F-B  |

1 minute ago, Action_Johnson said:

Prove me wrong. 

I'll make up my mind once the reviews drop

"We're all in this together, might as well be friends" Tom, Toonami.

Sorry if my post seemed rude, that is never my intention.

"Why do we suffer a lifetime for a moment of happiness?" - Anonymous

2 minutes ago, Action_Johnson said:

About $50 more gets you double the VRAM, Infinity-whatever-they-call-it with Ryzen 5000, and likely better performance than a 2080Ti (which I'd say is still slightly better than a 3070).

But I wouldn't be getting e.g. far better drivers, CUDA or NVENC.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.

1 minute ago, WereCatf said:

But I wouldn't be getting e.g. far better drivers, CUDA or NVENC.

The drivers are fine now. Though it might come down to what you do. For gaming, you might as well get the 6800, but if you need CUDA or NVENC, you are forced to get the 3070.

I am still TechWizardThatNeedsHelp, just less of a mouthful.

 

My beautiful, but not that powerful, main PC:

My new PC I'm saving for (was an H1 build, but I can't spend $400 on a case):

DREAM BUILD BIT BY BIT

  • Ryzen 5 5600X
  • ROG Strix B550-F GAMING
  • 2x8GB Corsair Vengeance LPX DDR4 3200MHz CL16
  • Old drives from Grandma
  • New Crucial P1 and Seagate Barracuda Compute 2TB
  • Whichever 3070 I can get
  • Meshify C with tempered glass
  • 6x Cooler Master MasterFan MF120 Halo
  • Full custom loop water cooling

https://pcpartpicker.com/user/HelpfulTechWizard/saved/s8XrD3


I'm a neural network enthusiast, and I'm saying nothing can compete with CUDA.

AND Nvidia will reveal a Super or Ti card soon.

If it was useful, give it a like :) BTW, if you're into Linux, pay a visit here, and I will be thankful if you send me an opinion here.

 

1 minute ago, HelpfulTechWizard said:

The drivers are fine now.

Besides, Nvidia's drivers were just awful when people first got their hands on the 3000 cards.

4 minutes ago, HelpfulTechWizard said:

The drivers are fine now.

The problem isn't now - in other words, 15 months after release - but rather at launch.

Navi's drivers were a shitshow for more than half of said 15 months, and the VII wasn't all that different either.

Desktop: Intel Core i9-9900K | be quiet! Dark Rock Pro 4 | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14-14-14-34 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe | macOS Big Sur

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | Beyerdynamic Custom One Pro Plus | Audio-Technica AT2020USB+

Displays: Alienware AW2521HF & BenQ BL2420PT

2 minutes ago, HelpfulTechWizard said:

The drivers are fine now.

How can you know for sure the drivers for a not-yet-out product won't be broken?

 

Anyway, no reviews no point. 

F@H
Desktop: i7-5960X 4.4GHz, Noctua NH-D14, ASUS Rampage V, 32GB, RTX2080S, 2TB NVMe SSD, 2x16TB HDD RAID0, Corsair HX1200, Thermaltake Overseer RX1, Samsung 4K curved 49" TV, 23" secondary

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB NVMe SSD RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Dell XPS 2 in 1 2019, 32GB, 1TB, 4K

 

GPD Win 2

Just now, GoodEnough said:

Besides, Nvidia's drivers were just awful when people first got their hands on the 3000 cards.

The drivers were part of the problem, but Nvidia not giving game-ready drivers to AIBs fast enough was another part of it. That's why AIB crashes were worse, though the FE had some of it too.

I am still TechWizardThatNeedsHelp, just less of a mouthful.

 

My beautiful, but not that powerful, main PC:

My new PC I'm saving for (was an H1 build, but I can't spend $400 on a case):

DREAM BUILD BIT BY BIT

  • Ryzen 5 5600X
  • ROG Strix B550-F GAMING
  • 2x8GB Corsair Vengeance LPX DDR4 3200MHz CL16
  • Old drives from Grandma
  • New Crucial P1 and Seagate Barracuda Compute 2TB
  • Whichever 3070 I can get
  • Meshify C with tempered glass
  • 6x Cooler Master MasterFan MF120 Halo
  • Full custom loop water cooling

https://pcpartpicker.com/user/HelpfulTechWizard/saved/s8XrD3

2 minutes ago, mahyar said:

I'm a neural network enthusiast, and I'm saying nothing can compete with CUDA.

AND Nvidia will reveal a Super or Ti card soon.

If you need CUDA, then yeah, you have to buy Nvidia.

But I am expecting Nvidia to release a 3080Ti with slightly more CUDA cores and more VRAM, and since there are 2 empty solder pads on the 3080, possibly a 12GB card. Although a Ti card from Nvidia would have to be $999 to go between the 3080 and 3090.

Just now, GoodEnough said:

Key word: enthusiast.

People choose AMD over Nvidia because they aren't made of money.

define enthusiast

If it was useful, give it a like :) BTW, if you're into Linux, pay a visit here, and I will be thankful if you send me an opinion here.

 

13 minutes ago, Action_Johnson said:

Prove me wrong. 

Prove yourself right first.

In case of USB confusion, refer to this sh*t.

  • 05Gb/s - USB 3.2 Gen 1 (USB 3.0)
  • 10Gb/s - USB 3.2 Gen 2
  • 20Gb/s - USB 3.2 Gen 2x2
  • 40Gb/s - Thunderbolt 3
  • 40Gb/s certified - Thunderbolt 4
  • 40Gb/s - USB 4

Some Wallpapers I created • The Basics of Solid State Drives and Modules (A reference guide by me)

13 minutes ago, Action_Johnson said:

Prove me wrong. 

Prove you are correct first.

 

Slayerking92

<Type something witty here>
<Link to some pcpartpicker fantasy build and claim as my own>

1 minute ago, Blademaster91 said:

If you need CUDA, then yeah, you have to buy Nvidia.

But I am expecting Nvidia to release a 3080Ti with slightly more CUDA cores and more VRAM, and since there are 2 empty solder pads on the 3080, possibly a 12GB card. Although a Ti card from Nvidia would have to be $999 to go between the 3080 and 3090.

Well, CUDA isn't necessary for neural networks, but CPUs are extremely inefficient at neural network processing.
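For illustration, here is a rough sketch of that gap (assuming PyTorch and a CUDA-capable card; the 4096x4096 size and repeat count are arbitrary placeholders), timing the dense matrix multiply that neural network training is mostly made of, on CPU versus GPU:

# Rough CPU-vs-GPU comparison for neural-network-style math.
# Assumes PyTorch is installed and a CUDA-capable card is present;
# the matrix size and repeat count are arbitrary placeholders.
import time
import torch

def time_matmul(device: str, n: int = 4096, repeats: int = 10) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)                # warm-up so setup cost isn't timed
    if device == "cuda":
        torch.cuda.synchronize()      # GPU kernel launches are asynchronous
    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / repeats

print(f"CPU: {time_matmul('cpu'):.4f} s per matmul")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s per matmul")

On typical hardware the GPU side comes out one to two orders of magnitude faster, which is the argument for GPU compute in general, whatever the API underneath is.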

If it was useful, give it a like :) BTW, if you're into Linux, pay a visit here, and I will be thankful if you send me an opinion here.

 

12 minutes ago, Action_Johnson said:

About $50 more gets you double the VRAM, Infinity-whatever-they-call-it with Ryzen 5000,

VRAM isn't everything; there's no difference between 16GB and 10GB if you don't go beyond 2560x1440.

 

13 minutes ago, Action_Johnson said:

and likely better performance than a 2080Ti (which I'd say is still slightly better than a 3070).

We'll see

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: 1TB HP EX920 PCIe x4 M.2 SSD + 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172), 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


There's always going to be a niche use-case for a 3070 over a 6800. I'd argue that if you can get a used 2080 Ti for the price of a 3070, get the 2080 Ti for the additional VRAM and overclocking headroom.

 

If you care about RT then you clearly haven't used it yet, because it's still a Beta feature that runs like a turd. I'd wait for next-next gen before getting excited about it.

 

If you're a content producer worth your salt then you're not looking at the 3070 anyway.

 

All that being said, if you make your GPU choice based on brand loyalty then none of these things actually matter. If, like myself, you have AMD and Nvidia graphics cards to mess around with, there are some strengths and drawbacks with both brands. For example: Nvidia is able to utilize available VRAM better when VRAM is tight by dynamically adjusting the texture quality of objects in-game, while AMD, with their image sharpening tech and deeper colors, makes lower resolutions look a lot better than they should. If you don't have both to compare side-by-side, you're probably going to be biased toward what you already own.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


You'd be a fool to proclaim someone a fool based on first party, non-independently verified info.

Nvidia's first party info was not taken at face value. AMD's shouldn't be either. Even if Lisa's haircut is cooler than Jensen's jacket.

5 minutes ago, HelpfulTechWizard said:

The drivers were part of the problem, but Nvidia not giving game-ready drivers to AIBs fast enough was another part of it. That's why AIB crashes were worse, though the FE had some of it too.

The Nvidia 30 series launch was a mess, and yeah, that is an important point: AIBs didn't get drivers soon enough to test for stability, which is why the capacitor layout was a problem. The reviewers got drivers before the AIBs did, yet people still give AMD crap for drivers.

2 minutes ago, mahyar said:

define enthusiast

Nvidia is more often targeted at people who have the extra money to spend on a more "reliable" GPU than at someone who doesn't. I think your wallet would agree with me that AMD still has better price-to-performance, even if it may be slightly slower or faster.


ahem

 

TENSOR CORES

 

that is all
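To put something concrete behind that: here is a minimal mixed-precision training step sketched in PyTorch (assuming a CUDA card with tensor cores, i.e. Volta or newer; the tiny model, sizes, and learning rate are placeholders), since reduced-precision matmuls are what actually get dispatched to the tensor-core kernels:

# Minimal mixed-precision training step; this is what routes the heavy
# matmuls onto tensor cores on hardware that has them.
# Assumes PyTorch and a CUDA-capable card; model and sizes are placeholders.
import torch
import torch.nn as nn

device = "cuda"
model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()  # rescales the loss so FP16 gradients don't underflow

x = torch.randn(64, 1024, device=device)
target = torch.randint(0, 10, (64,), device=device)

optimizer.zero_grad()
with torch.autocast(device_type="cuda", dtype=torch.float16):
    loss = nn.functional.cross_entropy(model(x), target)  # linear layers run in FP16 here
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()

The advantage only shows up in workloads that actually run in those reduced precisions, but for training and inference that is increasingly the norm.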

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k (won) - EVGA Z370 Classified K - G.Skill Trident Z RGB - Force MP500 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G2 650W - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

Linux Proliant ML150 G6:

Dual Xeon X5560 - 24GB ECC DDR3 - GTX 750 TI - old Seagate 1.5TB HDD - Dark moded Ubuntu (and Win7, cuz why not)

 

How many watts do I need? | Seasonic Focus thread | Userbenchmark (et al.) is trash explained | PSU misconceptions, protections explained | group reg is bad

2 minutes ago, Briggsy said:

If you care about RT then you clearly haven't used it yet, because it's still a Beta feature that runs like a turd. I'd wait for next-next gen before getting excited about it.

You don't know the definition of running like a turd until you have tried running Continuum RT on an RX 580, lmao.

1 minute ago, GoodEnough said:

Nvidia is more often targeted at people who have the extra money to spend on a more "reliable" GPU than at someone who doesn't. I think your wallet would agree with me that AMD still has better price-to-performance, even if it may be slightly slower or faster.

Tensor cores and CUDA cores: AMD simply cannot compete with these.

If it was useful, give it a like :) BTW, if you're into Linux, pay a visit here, and I will be thankful if you send me an opinion here.

 

2 minutes ago, mahyar said:

Well, CUDA isn't necessary for neural networks, but CPUs are extremely inefficient at neural network processing.

The annoying thing with CUDA is that it isn't particularly better than OpenCL in terms of actual performance; it wins because the software out there is almost all optimised for it rather than OpenCL. Which of course leads into the cycle: people who need compute buy Nvidia, so developers target CUDA, so people buy Nvidia, so developers...
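That cycle shows up right at the top of most compute code. A rough sketch (assuming PyTorch; the device-selection fallback here is purely illustrative) of why "GPU support" in practice usually means "CUDA support":

# Sketch of the ecosystem effect rather than any raw-performance gap:
# mainstream frameworks ship a first-class CUDA backend, so "use the GPU"
# in most tutorials and codebases literally means "use CUDA".
# Assumes PyTorch; the fallback chain is illustrative only.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():   # the Nvidia path is built in, one line
        return torch.device("cuda")
    # Anything else (ROCm, OpenCL via third-party stacks, etc.) typically
    # means a different build of the framework or an extra plugin, which is
    # exactly the friction that keeps compute buyers on Nvidia.
    return torch.device("cpu")

x = torch.randn(8, 8, device=pick_device())
print(x.device)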

Just now, Koeshi said:

The annoying thing with CUDA is that it isn't particularly better than OpenCL in terms of actual performance; it wins because the software out there is almost all optimised for it rather than OpenCL. Which of course leads into the cycle: people who need compute buy Nvidia, so developers target CUDA, so people buy Nvidia, so developers...

Well, no, CUDA is much better than OpenCL when it comes to my workloads.

If it was useful, give it a like :) BTW, if you're into Linux, pay a visit here, and I will be thankful if you send me an opinion here.

 
