
Polaris 11 could have 8.6 billion transistors.

DocSwag

Does it have CUDA? No?

 

Then I guess I'm still enslaved by Nvidia...


1 hour ago, Humbug said:

I'm betting that regardless of who wins the Polaris vs Pascal round, they will be within 15% of each other's general performance... like they always are. Don't see anybody totally wiping the floor with the other, based on recent GPU wars history...

I don't know, I think this time it might be different, as each manufacturer's priorities and preparedness for DX12 come to light. Every new DirectX generation has brought "shocking" developments, and this one should be no different.


8 minutes ago, Imakuni said:

Does it have CUDA? No?

It does actually; they've had a CUDA license for 4 months now.

 

Nothing I can do about the double post, btw; the forum software doesn't let me copy a quote into an edit of a post.


Just now, That Norwegian Guy said:

It does actually; they've had a CUDA license for 4 months now.

 

Nothing I can do about the double post, btw; the forum software doesn't let me copy a quote into an edit of a post.

As far as I know, that's not a fully-fledged CUDA license. It was only a partial license with a ton of restrictions, and if I remember right, it was also only for server-grade GPUs, not consumer ones.


9 minutes ago, That Norwegian Guy said:

It does actually; they've had a CUDA license for 4 months now.

 

Nothing I can do about the double post, btw; the forum software doesn't let me copy a quote into an edit of a post.

I thought this as well, but it does not seem to be licensed the way you think it is.

 

I believe AMD has a license to develop a compiler for developers. They (the devs) can use the compiler to build better OpenCL support into whatever program they're working on.

 

I could still be way off, but I just woke up... gimme a break =P


1 hour ago, Prysin said:

They can, if they plan to pull another rebrand party next year or the one after. Simply pull another 300 series. Hue hue hue.

 

Or, they know Samsung's 10nm is going to arrive fast enough that they can jump down yet another node very soon.

LOL! Yeah, 14nm came fast only because it reuses 20nm BEOL (back end of line, i.e. the metal interconnect layers). 10nm will use 14nm BEOL. 10nm is not coming soon from anyone except maybe the DRAM and NAND people, where you don't have anywhere near the complexity of a GPU or CPU.


1 hour ago, That Norwegian Guy said:

It does actually; they've had a CUDA license for 4 months now.

 

Nothing I can do about the double post, btw; the forum software doesn't let me copy a quote into an edit of a post.

Source?

 

As far as I'm aware, AMD does NOT have a CUDA license, and the chances of them making Polaris capable of running CUDA code (e.g. PhysX, GameWorks effects based on CUDA, etc.) are pretty much zero.

 

The only mention of a CUDA license was from @patrickjp93, who was "inferring" that they had acquired one due to the compiler that AMD created (it takes the raw CUDA source code that a dev would have and translates it to C++, if I recall correctly, thus allowing devs who wrote a CUDA program to quickly port it to OpenCL or whatever). There was never any actual evidence of that meaning they had a CUDA license, let alone being allowed to execute actual compiled CUDA code.

 

If my information is incorrect or out of date, please correct me with a source.

1 hour ago, Imakuni said:

As far as I know, that's not a fully-fledged CUDA license. It was only a partial license with a ton of restrictions, and if I remember right, it was also only for server-grade GPUs, not consumer ones.

As far as I'm aware, there was never any license at all. All the program does is take CUDA code and translate it into C++, so that a dev (who owns the rights to their own code/program) can easily port something from CUDA to C++. The original article's author speculated that it might mean AMD had acquired a CUDA license, but that was pure speculation.

1 hour ago, stconquest said:

I thought this as well, but it does not seem to be licensed the way you think it is.

 

I believe AMD has a license to develop a compiler for developers. They (the devs) can use the compiler to build better OpenCL support into whatever program they're working on.

 

I could still be way off, but I just woke up... gimme a break =P

As far as I'm aware, no license is required for what AMD created. It's a program that takes uncompiled, raw CUDA code and translates it to C++, so that it can then be compiled for OpenCL. It doesn't translate anything in real time; the dev still needs to check the code, correct any mistakes or anything the translator missed, and then compile the result for OpenCL.
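 

For anyone curious what that source-to-source translation actually looks like, here's a rough sketch. I'm assuming the tool being discussed is AMD's HIP "hipify" converter from their Boltzmann Initiative (nobody in this thread has linked the actual tool, so treat that attribution as my inference), and I'm using today's HIP API names:

// CUDA source, as the dev originally wrote it
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();
    cudaFree(x);
    cudaFree(y);
}

// After translation: HIP C++, compilable for AMD hardware with hipcc.
// Note the kernel body is untouched; only the host-side runtime calls
// and the launch syntax get renamed.
#include <hip/hip_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    hipMalloc(&x, n * sizeof(float));       // cudaMalloc -> hipMalloc
    hipMalloc(&y, n * sizeof(float));
    hipLaunchKernelGGL(saxpy,               // <<<grid, block>>> becomes
                       dim3((n + 255) / 256), dim3(256), 0, 0,
                       n, 2.0f, x, y);      // an explicit launch macro
    hipDeviceSynchronize();                 // cudaDeviceSynchronize -> hipDeviceSynchronize
    hipFree(x);                             // cudaFree -> hipFree
    hipFree(y);
}

Point being, it's a one-time mechanical rename of the host-side API that the dev runs over their own source. It does nothing to let AMD hardware execute already-compiled CUDA binaries, which is why none of this implies a license to run CUDA itself.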


Just now, dalekphalm said:

Source?

 

As far as I'm aware, AMD does NOT have a CUDA license, and the chances of them making Polaris capable of running CUDA code (e.g. PhysX, GameWorks effects based on CUDA, etc.) are pretty much zero.

 

The only mention of a CUDA license was from @patrickjp93, who was "inferring" that they had acquired one due to the compiler that AMD created (it takes the raw CUDA source code that a dev would have and translates it to C++, if I recall correctly, thus allowing devs who wrote a CUDA program to quickly port it to OpenCL or whatever). There was never any actual evidence of that meaning they had a CUDA license, let alone being allowed to execute actual compiled CUDA code.

 

If my information is incorrect or out of date, please correct me with a source.

As far as I'm aware, there was never any license at all. All the program does is take CUDA code and translate it into C++, so that a dev (who owns the rights to their own code/program) can easily port something from CUDA to C++. The original article's author speculated that it might mean AMD had acquired a CUDA license, but that was pure speculation.

As far as I'm aware, no license is required for what AMD created. It's a program that takes uncompiled, raw CUDA code and translates it to C++, so that it can then be compiled for OpenCL. It doesn't translate anything in real time; the dev still needs to check the code, correct any mistakes or anything the translator missed, and then compile the result for OpenCL.

ty.


3 hours ago, Arokhantos said:

Hopefully they will, including in all GameWorks titles. GameWorks is so bad.

You know... shit attitudes like this are the reason we get console-grade graphics.

 

People need to get off their high horses and look long term... GameWorks moves the graphics industry forward at a pace that wouldn't be possible without it (for example, AMD wouldn't work nearly as hard on their variant if they didn't feel a need to catch up to Nvidia). It is a thing that can be turned off at any time, and one day down the line, even if you don't have the horsepower for it yet, you will be able to play those games with it and appreciate the additional graphics (just like the Crysis series today).

 

 


I'm sure the flagship will have significantly more transistors. Taking into account the new node, the architectural improvements, and the quotes about how much more efficient it can be, they'll probably still leave room for raw power rather than just chasing efficiency.


43 minutes ago, -BirdiE- said:

I'm torn.

 

I want AMD to do better than Nvidia... But I also just spent $1300 USD on a G-Sync monitor....

Well, you could sell it and buy an Adaptive-Sync monitor. Sooner or later NVIDIA will support them (hell, NVIDIA could add driver support literally any time they want; Pascal will, in all likelihood, be 100% hardware-compatible with Adaptive-Sync monitors).


Just now, dalekphalm said:

Well, you could sell it and buy an Adaptive-Sync monitor. Sooner or later NVIDIA will support them (hell, NVIDIA could add driver support literally any time they want; Pascal will, in all likelihood, be 100% hardware-compatible with Adaptive-Sync monitors).

Problem is: WILL Nvidia support it? I think the G-Sync chips are purchased from them (I could be wrong), so they wouldn't want to give that business away for free.


1 minute ago, DocSwag said:

Problem is: WILL Nvidia support it? I think the G-Sync chips are purchased from them (I could be wrong), so they wouldn't want to give that business away for free.

Yes, NVIDIA makes the chips (or, more precisely, they pay a fab to make them, then program them) and sells them directly to the monitor vendors.

 

But yes, I think sooner or later, NVIDIA will support Adaptive-Sync. Eventually, we'll get to true feature parity. Right now, while it's pretty damn close, G-Sync is still a more robust implementation. That gap will narrow, then disappear over time.

 

Once we get to a point where G-Sync and the industry-standard (i.e. non-AMD) Adaptive-Sync are at feature parity and perform the same, then (hopefully) we'll see enough pressure from consumers that NVIDIA will adopt the industry standard.

 

We all know that it's within their technical capabilities. All they need is a GPU with DP 1.2a or DP 1.3+. They can even keep the G-Sync branding on their drivers, just like AMD has the FreeSync branding for their Adaptive-Sync drivers.

 

Just think of the day when you can buy a $1500 monitor that is both FreeSync AND G-Sync certified! But NVIDIA wants to keep that day as far away as possible, because they LOVE vendor lock-in.


9 minutes ago, dalekphalm said:

Well, you could sell it and buy an Adaptive-Sync monitor. Sooner or later NVIDIA will support them (hell, NVIDIA could add driver support literally any time they want; Pascal will, in all likelihood, be 100% hardware-compatible with Adaptive-Sync monitors).

Yeah, I think DocSwag got it pretty much right. I don't think Nvidia would do that while they're selling G-Sync chips... it would kill their sales, since there would be no reason for any manufacturer to make a G-Sync monitor.

 

And just to save you potential confusion in the future, the non-proprietary technology is called "Freesync" and both Freesync and G-sync are examples of "Adaptive Sync" technology.


They need to pull a Skyline: deliver a sleeper-hit powerhouse that actually delivers more power than advertised.


8 hours ago, Starelementpoke said:

I dunno, I don't play CoD.

What game were you talking about?


5 hours ago, Arokhantos said:

How is using non-universal standards good vs universal standards?

Nvidia won't even allow AMD early access to optimise games. GameWorks is purely an Nvidia thing that works on AMD cards too, but it's aimed at bringing AMD performance down. Like, tell me why a brick would need 10,000 polygons? Why do we need tessellated water underground where it's not even visible? GameWorks is just bad and you know it; if not, you're just ignorant.

Good job completely ignoring 100% of the point.

Nvidia making GameWorks and continuing to develop it is literally the ONLY reason AMD pushes forward with their own standard.

 

Do you honestly believe that if Nvidia didn't implement GameWorks, a studio would actually build equivalent features themselves? LOLOLOL.

 

It isn't non-universal standards vs universal standards. It was (prior to the latest iteration of GPUOpen) either have Nvidia do PhysX/GameWorks for you, or don't bother implementing anything like it (the Havok engine was the closest thing to GameWorks, and it's a joke by comparison). Now (FINALLY) AMD is following in Nvidia's footsteps with GPUOpen and has a system that MIGHT be able to challenge GameWorks in a few months/years (it's still significantly worse from a graphics perspective).

 

BTW, the whole GameWorks conspiracy theory is truly laughable these days, with AMD cards posting above-par performance relative to their Nvidia counterparts (outside of Fiji, which is pretty bad) in GameWorks titles. (The GameWorks fiasco is also one of the reasons AMD pushed the tessellation improvements to Hawaii and the other non-GCN1.2 200/300-series cards, so really you should thank Nvidia for giving AMD owners free performance from here on out. Tessellation isn't going anywhere, and really it shouldn't; you have to make wireframes more complex at some point, because textures have basically already caught up.)


Both companies help each other innovate. They push each other to "one-up" the other in tech. Without it, we'd see Intel's minuscule tick-tock 5%* performance improvement from generation to generation.

 

*Exaggeration, but still.


17 hours ago, dalekphalm said:

 port something from CUDA to C++. 

I think it was C, not C++.

Not sure, tho.


10 hours ago, Curufinwe_wins said:

Good job completely ignoring 100% of the point.

Nvidia making GameWorks and continuing to develop it is literally the ONLY reason AMD pushes forward with their own standard.

 

Do you honestly believe that if Nvidia didn't implement GameWorks, a studio would actually build equivalent features themselves? LOLOLOL.

 

It isn't non-universal standards vs universal standards. It was (prior to the latest iteration of GPUOpen) either have Nvidia do PhysX/GameWorks for you, or don't bother implementing anything like it (the Havok engine was the closest thing to GameWorks, and it's a joke by comparison). Now (FINALLY) AMD is following in Nvidia's footsteps with GPUOpen and has a system that MIGHT be able to challenge GameWorks in a few months/years (it's still significantly worse from a graphics perspective).

 

BTW, the whole GameWorks conspiracy theory is truly laughable these days, with AMD cards posting above-par performance relative to their Nvidia counterparts (outside of Fiji, which is pretty bad) in GameWorks titles. (The GameWorks fiasco is also one of the reasons AMD pushed the tessellation improvements to Hawaii and the other non-GCN1.2 200/300-series cards, so really you should thank Nvidia for giving AMD owners free performance from here on out. Tessellation isn't going anywhere, and really it shouldn't; you have to make wireframes more complex at some point, because textures have basically already caught up.)

I see your point. GameWorks was bad for GPU performance, but it pushed AMD to try harder, as well as to launch the GPUOpen initiative.

However, I believe that even though it was probably good overall, Nvidia could have gone a different route.

As the consumer, I would much rather have Nvidia just focus on making better GPUs.

Instead, GameWorks reduced performance for both Nvidia and AMD GPUs; AMD GPUs just got hit harder.

Now we are finally seeing its benefits. But if, instead of doing this, Nvidia had just pushed harder with their GPUs, we might have gotten the same results without all this GameWorks crap mixed in.

That said, I do admit that GPUOpen is one of the good things that came from GameWorks and probably would not have happened otherwise.


19 hours ago, -BirdiE- said:

Yeah, I think DocSwag got it pretty much right. I don't think Nvidia would do that while they're selling G-Sync chips... it would kill their sales, since there would be no reason for any manufacturer to make a G-Sync monitor.

 

And just to save you potential confusion in the future, the non-proprietary technology is called "Freesync" and both Freesync and G-sync are examples of "Adaptive Sync" technology.

You're mistaken. FreeSync is a proprietary, entirely AMD-owned, driver-side software package. The FreeSync branding is used quite liberally by the monitor industry (and AMD), since currently they're the only party to officially support any products that work with Adaptive-Sync.

 

Adaptive-Sync is the DisplayPort industry standard.

See:

http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

 

Adaptive-Sync is the hardware-side implementation, part of the VESA DisplayPort standard (it's still an optional feature, though, so it's up to the monitor manufacturers to implement it).

 

G-Sync is NOT Adaptive-Sync, nor does it use the same technology. G-Sync is an example of adaptive VSync (note the V), which is a generic term for various technologies that adapt frame rate and refresh rate on demand. Adaptive-Sync is a very specific VESA standard.

 

G-Sync works in a very similar fashion to Adaptive-Sync, but there are fundamental differences, including the amount of two-way communication that happens between the G-Sync module and the GPU. The entire reason a G-Sync module was required is that, at the time, Adaptive-Sync didn't even exist, so there was no monitor-side hardware way for NVIDIA to adapt the refresh rate as needed. They filled this gap with a proprietary hardware module: the G-Sync Module.

6 hours ago, Stadin6 said:

I think it was C, not C++.

Not sure, tho.

Could be - I'm not 100% sure whether it's C or C++ - at this point though, the difference is trivial, as my point was still well understood :P


Wait.

That can't be right, since the Fury X has 8.9 billion. That would be a downgrade. He must be referring to Polaris 10.

