
Nvidia GTX 2060 3DMark Score Leak

27 months since the 1000 series, higher prices, more proprietary technologies. I would also take the "leaked" benchmark with a pinch of salt, since the alleged bandwidth of the 2060 is significantly less than that of the 1080. I would also completely ignore the opinions of WCCFtech's "journalists" regarding GDDR6 overclocking to 20 Gbps (2.5 GHz) in its very first iteration on a mid-range graphics card.
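For anyone wanting to sanity-check that kind of claim: peak memory bandwidth is just the per-pin data rate times the bus width, divided by eight. A minimal sketch, using the 1080's known 256-bit / 10 Gbps configuration; the 192-bit / 12 Gbps and 20 Gbps figures below are purely illustrative assumptions for the example, not the leak's actual specs:

```cpp
#include <cstdio>

// Peak theoretical memory bandwidth in GB/s:
//   (per-pin data rate in Gbps) * (bus width in bits) / 8 bits per byte
double peak_bandwidth_gb_s(double gbps_per_pin, int bus_width_bits) {
    return gbps_per_pin * bus_width_bits / 8.0;
}

int main() {
    // GTX 1080: 10 Gbps GDDR5X on a 256-bit bus -> 320 GB/s.
    std::printf("GTX 1080 (256-bit @ 10 Gbps)  : %.0f GB/s\n", peak_bandwidth_gb_s(10.0, 256));

    // Purely illustrative mid-range configuration (an assumption for this
    // example, NOT the leaked spec): 12 Gbps GDDR6 on a 192-bit bus.
    std::printf("Illustrative 192-bit @ 12 Gbps: %.0f GB/s\n", peak_bandwidth_gb_s(12.0, 192));

    // The 20 Gbps figure WCCFtech floats corresponds to a 2.5 GHz memory clock,
    // since GDDR6 data rates are commonly quoted as 8x the memory clock.
    std::printf("Hypothetical 192-bit @ 20 Gbps: %.0f GB/s\n", peak_bandwidth_gb_s(20.0, 192));
    return 0;
}
```

By that arithmetic, whether the rumoured card lands above or below the 1080 hinges entirely on the bus width and data rate the leak actually assumes.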

 

Until we can verify the performance and, just as importantly, the price point, I would remain skeptical. The WCCFtech journalists, however, won't, and I would disregard their opinion completely.

Motherboard: Asus X570-E
CPU: 3900X @ 4.3GHz

Memory: G.Skill Trident GTZR 3200MHz CL14

GPU: AMD RX 570

SSD1: Corsair MP510 1TB

SSD2: Samsung MX500 500GB

PSU: Corsair AX860i Platinum


With the source being WCCFTech, something tells me that this is bs.

There are 10 types of people in this world. Those that understand binary and those that don't.

Current Rig (Dominator II): 8GB Corsair Vengeance LPX DDR4 3133 C15, AMD Ryzen 3 1200 at 4GHz, Cooler Master MasterLiquid Lite 120, ASRock B450M Pro4, AMD R9 280X, 120GB TCSunBow SSD, 3TB Seagate ST3000DM001-9YN166 HDD, Corsair CX750M Grey Label, Windows 10 Pro, 2x Cooler Master MasterFan Pro 120, Thermaltake Versa H18 Tempered Glass.

 

Previous Rig (Black Magic): 8GB DDR3 1600, AMD FX6300 OC'd to 4.5GHz, Zalman CNPS5X Performa, Asus M5A78L-M PLUS /USB3, GTX 950 SC (former, it blew my PCIe lane so now on mobo graphics which is Radeon HD 3000 Series), 1TB Samsung Spinpoint F3 7200RPM HDD, 3TB Seagate ST3000DM001-9YN166 HDD (secondary), Corsair CX750M, Windows 8.1 Pro, 2x 120mm Red LED fans, Deepcool SMARTER case

 

My secondary rig (The Oldie): 4GB DDR2 800, Intel Core 2 Duo E8400 @ 3GHz, Stock Dell Cooler, Foxconn 0RY007, AMD Radeon HD 5450, 250GB Samsung Spinpoint 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management. UPDATE: SPECS UPGRADED DUE TO CASEMOD, 8GB DDR2 800, AMD Phenom X4 9650, Zalman CNPS5X Performa, Biostar GF8200C M2+, AMD Radeon HD 7450 GDDR5 edition, Samsung Spinpoint 250GB 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management and support for non Dell boards.

 

Retired/Dead Rigs: The OG (retired) (first ever PC I used, at 3 years old, back in 2005). Current specs: 2GB DDR2, Pentium M 770 @ 2.13GHz, 60GB IDE laptop HDD, ZorinOS 12 Ultimate x86; originally 512MB DDR2, Pentium M 740 @ 1.73GHz, 60GB IDE laptop HDD and single-boot XP Pro. The Craptop (dead): 2GB DDR3, Celeron N2840 @ 2.1GHz, 50GB eMMC chip, Windows 10 Pro. Nightrider (dead and cannibalized for Dominator II): Ryzen 3 1200, Gigabyte A320M HD2, 8GB DDR4, XFX Ghost Core Radeon HD 7770, 1TB Samsung Spinpoint F3 (2010), 3TB Seagate Barracuda, Corsair CX750M Green, Deepcool SMARTER, Windows 10 Home.


I wonder if buying a 5GB card in 2018 might bite me in the ass. Though I don't plan to go above 1080p with the card; I'll stick to that resolution until 4k becomes something playable at 60 fps on $400 cards and below.


5 hours ago, Dylanc1500 said:

The problem is that you can't use the same setup for ray tracing and tensor functions as you would with the standard rasterization technique.

So? I haven't seen proof of their real usefulness so far.


1 hour ago, laminutederire said:

So? I haven't seen proof of their real usefulness so far.

The CGI industry (Pixar, DreamWorks, Disney, etc.) is the largest user of ray tracing. It is a multi-billion-dollar market that is currently dominated by render farms of CPUs, because they are the most effective option. This now opens the door for Nvidia to upset that, unless you are speaking strictly about gaming purposes, which it still offers a benefit to.


6 minutes ago, Dylanc1500 said:

The CGI industry (Pixar, DreamWorks, Disney, etc.) is the largest user of ray tracing. It is a multi-billion-dollar market that is currently dominated by render farms of CPUs, because they are the most effective option. This now opens the door for Nvidia to upset that, unless you are speaking strictly about gaming purposes, which it still offers a benefit to.

Because you really think Disney hasn't thought about that?

The only reason they use it in Nvidia's case is that they use it for really simple things: add-on features to replace current methods. It does not bring that much more fidelity per computation; it just adds a little complexity for its own sake. It's not motivated by a real constraint.

Disney is also using ray-tracing code that is, first of all, so well optimized that you won't see them move away from it just like that. Beyond that, they are so much further along than Nvidia on Monte Carlo denoising that they really don't need Nvidia's tech.

And finally, Nvidia's attempt is bound to stay in gaming, as it's there to get a cheap ray-traced result, while the CGI industry strives for precision and quality, and they will never sacrifice that to save a little time.
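For context on why Monte Carlo denoisers matter so much to both camps: a path tracer estimates every pixel as a Monte Carlo average, so the noise only falls off as 1/sqrt(N) in the sample count. A toy sketch of that behaviour, using a simple 1-D integral as a stand-in for a pixel (this illustrates the general principle only; it is not Disney's or Nvidia's code):

```cpp
#include <cmath>
#include <cstdio>
#include <random>

// Toy Monte Carlo estimator: integrate f(x) = x^2 over [0, 1] (exact value 1/3),
// standing in for the per-pixel integral a path tracer evaluates.
double estimate(int samples, std::mt19937& rng) {
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < samples; ++i) {
        double x = u(rng);
        sum += x * x;
    }
    return sum / samples;
}

int main() {
    std::mt19937 rng(42);
    const double exact = 1.0 / 3.0;
    const int counts[] = {16, 256, 4096, 65536};
    // The error shrinks roughly as 1/sqrt(N): about 100x more samples for 10x
    // less noise, which is why renderers lean on denoisers rather than brute
    // force, and why a better denoiser is such a big deal.
    for (int n : counts) {
        double err = std::fabs(estimate(n, rng) - exact);
        std::printf("N = %6d  |error| = %.5f\n", n, err);
    }
    return 0;
}
```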


1 hour ago, laminutederire said:

 

And finally, Nvidia's attempt is bound to stay in gaming, as it's there to get a cheap ray-traced result, while the CGI industry strives for precision and quality, and they will never sacrifice that to save a little time.

I wouldn't make any prejudgments on that; we have no idea how GPU ray tracing is going to compare to a render farm of CPUs for time and accuracy. It might flop due to massive integration problems, or it might excel and make current render farms obsolete.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


On 8/17/2018 at 8:51 PM, Watson242 said:

Yeah, Nvidia made a false statement, since the 1080 is actually at least 75% faster than the 980; even the 1070 can sometimes be 50% faster than the 980.

Yeah, no, it's not.

PC Specs: i7 6800K (4.4GHz), Asus Deluxe X99A II, Zotac GTX 1080 AMP Extreme, Si6F4Gb D???????r Platinum, EVGA G2 750W, Phanteks Enthoo Primo, 3TB WD Black, 500GB 850 Evo, H100i GTX, Windows 10, K70 RGB, G502, HyperX Cloud 2s, Asus MX34, Samsung 960 EVO



3 hours ago, laminutederire said:

Because you really think Disney hasn't thought about that?

The only reason they use it in Nvidia's case is that they use it for really simple things: add-on features to replace current methods. It does not bring that much more fidelity per computation; it just adds a little complexity for its own sake. It's not motivated by a real constraint.

Disney is also using ray-tracing code that is, first of all, so well optimized that you won't see them move away from it just like that. Beyond that, they are so much further along than Nvidia on Monte Carlo denoising that they really don't need Nvidia's tech.

And finally, Nvidia's attempt is bound to stay in gaming, as it's there to get a cheap ray-traced result, while the CGI industry strives for precision and quality, and they will never sacrifice that to save a little time.

I'm not stating whether or not it will succeed. It's an area for Nvidia to push further into, as they see a massive market with a lot of money, and they are attempting to get into that market. I wasn't saying they would be used right now, more that it opens a doorway for them to get their foot in. As @mr moose said, what the industry will have to weigh is whether or not a farm of GPUs is more efficient at the accuracies and resolutions they typically run. Right now that is unknown, to the public at least.

 

Also, I never said Nvidia had plans to leave the gaming market, and I apologize if that was how it was interpreted; my meaning was that they intend to expand their reach into other markets.

 

Again, my apologies if my wording wasn't entirely clear. You may have more insight into what Disney's intentions and use case may be.


8 hours ago, mr moose said:

I wouldn't make any prejudgments on that; we have no idea how GPU ray tracing is going to compare to a render farm of CPUs for time and accuracy. It might flop due to massive integration problems, or it might excel and make current render farms obsolete.

Well, there are fundamental problems that make GPUs unsuited for it. Memory requirements, for one: as a ray bounces off a surface, there is no way to prefetch the totally unrelated chunks of memory it will need, because the ray heads off randomly into the scene. On top of that, the BVH or kd-tree acceleration structures can become a slowdown themselves, since they can't be held fully in cache for sufficiently large scenes.

That's why GPUs can be quite good at ray tracing small scenes faster.

Movies, on the other hand, need complex scenes and path tracing to get multiple bounces.

People have been trying to port renderers to the GPU for quite a while, and it has always ended up significantly slower on production scenes. At best these GPUs might close the huge gap that currently exists.

There is a reason why RTX can only do simple real-time ray tracing of reflections, and not fully path-traced complex scenes.
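To make the memory-access argument concrete, here is a stripped-down sketch (hypothetical types and a fake scene, not any real renderer's code) of the pattern being described: every ray walks the BVH along a data-dependent path, and every bounce starts that walk over again toward an unrelated part of the scene, which is what defeats prefetching and caching on large production scenes.

```cpp
// Stripped-down illustration: hypothetical types and a fake scene only.
#include <cstdio>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 o, d; };
struct Hit  { Vec3 p; bool valid = false; };

struct BvhNode {
    Vec3 bmin, bmax;
    int  left = -1, right = -1;   // child indices; -1 marks a leaf
};

// Slab test of a ray against a node's axis-aligned bounding box.
static bool hit_aabb(const Ray& r, const BvhNode& n) {
    float t0 = 0.0f, t1 = 1e30f;
    const float o[3]  = { r.o.x, r.o.y, r.o.z };
    const float d[3]  = { r.d.x, r.d.y, r.d.z };
    const float lo[3] = { n.bmin.x, n.bmin.y, n.bmin.z };
    const float hi[3] = { n.bmax.x, n.bmax.y, n.bmax.z };
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / d[a];
        float ta = (lo[a] - o[a]) * inv;
        float tb = (hi[a] - o[a]) * inv;
        if (ta > tb) { float tmp = ta; ta = tb; tb = tmp; }
        t0 = ta > t0 ? ta : t0;
        t1 = tb < t1 ? tb : t1;
        if (t1 < t0) return false;
    }
    return true;
}

// Stack-based BVH traversal. Which nodes get read depends entirely on the ray,
// so consecutive accesses into `nodes` land at unpredictable indices: nothing
// for a prefetcher to latch onto, and a large scene's nodes won't stay in cache.
static Hit trace(const std::vector<BvhNode>& nodes, const Ray& r, long& nodes_touched) {
    Hit best;
    int stack[64], top = 0;
    stack[top++] = 0;                             // every ray starts back at the root
    while (top > 0) {
        const BvhNode& n = nodes[stack[--top]];   // data-dependent, scattered read
        ++nodes_touched;
        if (!hit_aabb(r, n)) continue;
        if (n.left < 0) {                         // leaf: stand-in "surface hit"
            best.valid = true;
            best.p = { (n.bmin.x + n.bmax.x) * 0.5f,
                       (n.bmin.y + n.bmax.y) * 0.5f,
                       (n.bmin.z + n.bmax.z) * 0.5f };   // nearest-hit bookkeeping omitted
        } else {
            stack[top++] = n.left;
            stack[top++] = n.right;
        }
    }
    return best;
}

int main() {
    // Fake flat "scene": a complete binary tree of 1023 nodes with arbitrary boxes.
    std::vector<BvhNode> nodes(1023);
    for (int i = 0; i < 511; ++i) { nodes[i].left = 2 * i + 1; nodes[i].right = 2 * i + 2; }
    std::mt19937 rng(1);
    std::uniform_real_distribution<float> corner(-2.0f, 0.0f), dir(-1.0f, 1.0f);
    for (auto& n : nodes) {
        n.bmin = { corner(rng), corner(rng), corner(rng) };
        n.bmax = { n.bmin.x + 2.0f, n.bmin.y + 2.0f, n.bmin.z + 2.0f };
    }

    // Path tracing compounds the problem: each bounce points the next ray at an
    // essentially unrelated region of the scene, so traversal starts over with a
    // cold, unpredictable memory footprint every time.
    long nodes_touched = 0;
    Ray ray{ { 0.0f, 0.0f, -5.0f }, { 0.0f, 0.0f, 1.0f } };
    for (int bounce = 0; bounce < 4; ++bounce) {
        Hit h = trace(nodes, ray, nodes_touched);
        if (!h.valid) break;
        ray.o = h.p;
        ray.d = { dir(rng), dir(rng), dir(rng) };   // crude random bounce direction
    }
    std::printf("BVH nodes touched across 4 bounces: %ld\n", nodes_touched);
    return 0;
}
```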


5 minutes ago, laminutederire said:

Well, there are fundamental problems that make GPUs unsuited for it. Memory requirements, for one: as a ray bounces off a surface, there is no way to prefetch the totally unrelated chunks of memory it will need, because the ray heads off randomly into the scene. On top of that, the BVH or kd-tree acceleration structures can become a slowdown themselves, since they can't be held fully in cache for sufficiently large scenes.

That's why GPUs can be quite good at ray tracing small scenes faster.

Movies, on the other hand, need complex scenes and path tracing to get multiple bounces.

People have been trying to port renderers to the GPU for quite a while, and it has always ended up significantly slower on production scenes. At best these GPUs might close the huge gap that currently exists.

There is a reason why RTX can only do simple real-time ray tracing of reflections, and not fully path-traced complex scenes.

The issue here is that we don't know whether that is still the case or whether Nvidia has found a way to resolve those issues.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


RIP Radeon if true, though it is WCCFtech... It was kind of expected; it's one of those where you figured it was true and you hoped it was true, but at the same time you kind of want to see the giant fall and the little guy catch up. Maybe Intel and their bucketloads of money will give Nvidia a run for their money eventually.


#PoorNavi

 

I think it might be about time for AMD to seriously consider whether or not it's worth holding on to RTG. Personally, I feel they should spin it off and double down on their CPU business, as they should have done to begin with. If Intel wants RTG's properties and patents, then so be it.

New Build (The Compromise): CPU - i7 9700K @ 5.1GHz | Mobo - ASRock Z390 Taichi | RAM - 16GB G.SKILL TridentZ RGB 3200CL14 @ 3466 14-14-14-30 1T | GPU - ASUS Strix GTX 1080 Ti | Cooler - Corsair H100i Pro | SSDs - 500GB 960 EVO + 500GB 850 EVO + 1TB MX300 | Case - Cooler Master H500 | PSU - EVGA 850 P2 | Monitor - LG 32GK850G-B 144Hz 1440p | OS - Windows 10 Pro.

Peripherals - Corsair K70 Lux RGB | Corsair Scimitar RGB | Audio-technica ATH M50X + Antlion Modmic 5 |

CPU/GPU history: Athlon 6000+/HD4850 > i7 2600k/GTX 580, R9 390, R9 Fury > i7 7700K/R9 Fury, 1080TI > Ryzen 1700/1080TI > i7 9700K/1080TI.

Other tech: Surface Pro 4 (i5/128GB), Lenovo Ideapad Y510P w/ Kali, OnePlus 6T (8G/128G), PS4 Slim.


1 hour ago, mr moose said:

The issue here is that we don't know whether that is still the case or whether Nvidia has found a way to resolve those issues.

The fact that they call it ray tracing and not path tracing tells you everything: you're looking at one bounce at best, or zero. Hence not much new here.
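Roughly the distinction being drawn, as a toy sketch (illustrative only, not RTX's actual pipeline): hybrid "ray tracing" adds at most one extra ray per pixel for a specific effect, while the path tracing that film renderers rely on keeps bouncing and averages many samples per pixel.

```cpp
#include <cstdio>
#include <random>

// Toy 1-D "scene": these two helpers stand in for shading and intersection in
// a real renderer; the numbers are meaningless, only the structure matters.
static float shade_local(float x)       { return x > 0.0f ? 0.8f : 0.2f; }
static float next_hit(float x, float d) { return x + d; }

// Hybrid "ray tracing" in the RTX-reflections sense: rasterized shading plus
// at most one secondary ray for the reflection term.
static float hybrid_reflection(float x, float d) {
    float direct    = shade_local(x);
    float reflected = shade_local(next_hit(x, d));   // the single extra bounce
    return 0.8f * direct + 0.2f * reflected;
}

// Path tracing, the film-renderer workload: keep bouncing in random directions
// and average many samples per pixel, which is vastly more work per frame.
static float path_traced(float x, std::mt19937& rng, int bounces, int samples) {
    std::uniform_real_distribution<float> dir(-1.0f, 1.0f);
    float sum = 0.0f;
    for (int s = 0; s < samples; ++s) {
        float p = x, throughput = 1.0f, radiance = 0.0f;
        for (int b = 0; b < bounces; ++b) {
            radiance   += throughput * shade_local(p);
            throughput *= 0.5f;                      // energy lost at each bounce
            p = next_hit(p, dir(rng));               // random next direction
        }
        sum += radiance;
    }
    return sum / samples;
}

int main() {
    std::mt19937 rng(7);
    std::printf("hybrid, 1 secondary ray         : %.3f\n", hybrid_reflection(0.0f, 1.0f));
    std::printf("path traced, 8 bounces x 256 spp: %.3f\n", path_traced(0.0f, rng, 8, 256));
    return 0;
}
```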


10 minutes ago, laminutederire said:

The fact that they call it ray tracing and not path tracing tells you everything: you're looking at one bounce at best, or zero. Hence not much new here.

It seems Nvidia thinks differently:

https://www.digitalartsonline.co.uk/news/motion-graphics/new-nvidia-graphics-cards-offer-industry-changing-real-time-raytracing-on-your-computer-for-first-time/

Quote

 

Nvidia has announced RTX Server, which combines eight RTX cards with the company's Infinity software to create render-farm datacentres and virtual workstations. Some customers will get early access in Q4 of this year, with it becoming generally available in Q1 2019.

Nvidia claims that the RTX Server will cost about a quarter of a CPU-based render farm, taking up around a tenth of the space and using around an eleventh of the power. Or conversely, for the same money you can get four times the performance.

 

 

With support for Pixar's RenderMan, it sounds to me like they consider it to be heading toward animation render farms. But as I said before, whether it is actually good enough or not remains to be seen.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


14 minutes ago, mr moose said:

It seems Nvidia thinks differently:

https://www.digitalartsonline.co.uk/news/motion-graphics/new-nvidia-graphics-cards-offer-industry-changing-real-time-raytracing-on-your-computer-for-first-time/

 

With support for Pixar's RenderMan, it sounds to me like they consider it to be heading toward animation render farms. But as I said before, whether it is actually good enough or not remains to be seen.

Whatever. Let's pray to our god Nvidia and be done with it.

They're just too good for everyone out there and they can never fail. That's a known fact as of now.


26 minutes ago, mr moose said:

  But as I said before, whether it is actually good enough or not remains to be seen.

 

10 minutes ago, laminutederire said:

Whatever. Let's pray to our god Nvidia and be done with it.

They're just too good for everyone out there and they can never fail. That's a known fact as of now.

??? No one is saying they can't fail or that they are gods to pray to; we're just saying you can't claim it isn't capable of certain things until we actually see evidence that it isn't capable. Surely that's not a hard concept to understand?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, mr moose said:

 

??? No one is saying they can't fail or that they are gods to pray to; we're just saying you can't claim it isn't capable of certain things until we actually see evidence that it isn't capable. Surely that's not a hard concept to understand?

Surely you can understand how frustrating it is to see people praise Nvidia left and right without having all the details as well. One of those details is price. A 2060 at 1080 level is good, but not if it's at 1080 price; it's not even good at 1070 Ti price, and not that good at 1070 price. And it would seem pricing should increase to at least the 1070's level. At some point I need to stop conceding things when my educated guesses show me otherwise.


7 hours ago, laminutederire said:

Surely you can understand how frustrating it is to see people praise Nvidia left and right without having all the details as well. One of those details is price. A 2060 at 1080 level is good, but not if it's at 1080 price; it's not even good at 1070 Ti price, and not that good at 1070 price. And it would seem pricing should increase to at least the 1070's level. At some point I need to stop conceding things when my educated guesses show me otherwise.

It's just as frustrating as the people who praise AMD left, right, and centre (some of them lie about it too). You can't stop some people from being naive, but you can correct them without going in the opposite direction.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, mr moose said:

It's just as frustrating as the people who praise AMD left, right, and centre (some of them lie about it too). You can't stop some people from being naive, but you can correct them without going in the opposite direction.

Yeah. Maybe I'm just biased because I've met arrogant people from Nvidia who didn't really understand everything about what they were presenting, so in my mind they don't really grasp their tech to its full extent.


I still have faith that RTG will make a few decent graphics cards in the future. It is very important for budget gamers that Nvidia has competition, at least in the low to mid-range segment; otherwise we'll be paying $200 for the equivalent of a GT 1030 in no time.


I'd love to see AMD come out with something competitive, but if these rumours are anything to go by, then sadly I just don't think it's going to happen with Navi. It'll be late and underperforming, as per usual. I really wanted to get another AMD GPU for my next upgrade, but it seems it'll have to be the green team this time around.

Gaming PC: i5 8600K @ 4.8GHz | 16GB T-Force Delta RGB @ 3200MHz | Asus Prime Z370-A | Sapphire Radeon VII (dead :( ), RX 480 8GB | EVGA SuperNOVA 750 G2 | 120GB SanDisk SSD Plus, 120GB Kingston A400 | 4TB Seagate 7200RPM, 1TB WD Blue 7200RPM | Phanteks P400 | Windows 10, macOS Catalina

Second PC: i5 3340S @ 2.8GHz | 8GB Corsair Vengeance @ 1333MHz | EVGA 750 Ti SC | Cheap £7 250GB WD HDD | Windows 7

Laptops: Late 2009 MacBook (Core 2 Duo, 4GB RAM, 120GB SSD, 9400M, running Mojave), Late 2013 MacBook Pro Retina (i5, 4GB RAM, 128GB SSD)

Consoles: Xbox One, PS4 Slim, PS3 slim, Xbox 360 fat still going strong after almost 10 years, Original Xbox, PS2, PS1

Phone: Realme 6, Xiaomi Mi A2, Xiaomi Redmi Note 4x, iPhone 6s (jailbroken)

Tablet: iPad Mini 2nd Gen Retina (Jailbroken)

Headphones:  Hifiman HE4xx, Phillips Fidelio X2, Status Audio CB-1,  Fiio E10k DAC, Schiit Magni 3+, Tin T2 IEM's, Astrotec S80

Keyboards:  2x Custom 60% - 1x Gateron Yellow, 1x Box Reds

