
Nvidia Disables GPU PhysX When a Second Non-Nvidia GPU Is Installed

Najuno

This is a good point, but I think you can disable the APU's onboard graphics the same way you would when you install a dedicated GPU, unless this new Nvidia fail checks for that as well...

 

EDIT: read the rest of the article and yeah... this is just ridiculous. Way to go, Nvidia, another reason I won't support you.

I've read the full article, and no, you cannot fully disable the GPU in the APU, only drop its clock speed to the minimum. Believe me, I've tried: I have an overclocking board with lots of settings and I couldn't find an off switch for it.


Nvidia may be overpriced, but its cards fail far less often than AMD's, use less electricity, and run cooler, and yes, they have features which Nvidia owns the rights to and has an obligation to protect vis-à-vis its shareholders. People need to stop complaining about sensible business practices and get AMD to produce competitive products.

PhysX is actually extremely important in running calculations for simulation. It's a great hardware engine if you know how to use it properly. AMD is fine for unrefined gaming and some low-end mining, but otherwise, how exactly has it pushed the industry forward lately, and how has it warranted this bashing of Nvidia?

 

1) That part about Nvidia being more efficient is only true of the later generations: Nvidia had some pretty shit cards in the past, and ATI (as it then was) held the performance crown as well as being more efficient overall. Not sure if this is likely to happen again, because AMD is basically putting an undue burden on the ATI side of the business to carry the company and the outdated tech on the processor side, but if they figure it out they could easily overtake Nvidia again.

The point, however, is that we will never know what the future holds.

 

2) Introducing a new protocol like PhysX, even if it's a great one, is useless if you're more concerned with protecting your intellectual property (to a fault, since this crosses into anticompetitive and even illegal territory) than with getting widespread adoption so the tech can grow into a standard. Yes, you can promote it without giving away the house per se and making everything open source (hint: AMD throws around a lot of bullshit "free" this and "open" that, when in reality their stuff is just as locked-down and proprietary where it really matters), but this is more akin to what the music industry does, which is put more effort into draconian copyright protection than into actually producing good music.

-------

Current Rig

-------


Nvidia may be overpriced, but its cards fail far less often than AMD's, use less electricity, and run cooler, and yes, they have features which Nvidia owns the rights to and has an obligation to protect vis-à-vis its shareholders. People need to stop complaining about sensible business practices and get AMD to produce competitive products.

PhysX is actually extremely important in running calculations for simulation. It's a great hardware engine if you know how to use it properly. AMD is fine for unrefined gaming and some low-end mining, but otherwise, how exactly has it pushed the industry forward lately, and how has it warranted this bashing of Nvidia?

 

First sentence is all kinds of wrong. No one is asking for PhysX to run on AMD/Intel hardware here, just that if people use either of those products they can also make use of their NV GPU. Side note: NV didn't produce PhysX; they bought it, and the company Ageia with it, then totally locked the platform down to the point where Ageia PPUs didn't even work anymore.

 

Second sentence is even more wrong. PhysX is a software engine that runs amazingly on any GPU, since it's great for parallel computing; that means AMD, Ageia (the old PPUs), and NV.

 

And finally, the last wrong thing you said. AMD does push the industry a lot. First of all, 64-bit computing: AMD64 was the first marketable solution, one Intel had to copy just to keep up.

 

And a list of AMD tech:

 

Mantle

TrueAudio

FreeSync (as well as getting variable refresh rate monitors adopted as a VESA standard)

TressFX (as well as GrassFX, which is in development)

Eyefinity (was far ahead of Surround and still is, supporting different resolutions across your setup)

Crossfire (now doesn't need connectors, which is a nice change) 

Raptr Gaming Evolved (now has all the capabilities of ShadowPlay).

 

All developed by AMD. Now, I'm not bashing NV at all here, but they aren't the sole force pushing the industry, and all they did with PhysX was buy it and lock it down. Had Ageia not sold, we'd all have PhysX, more games would have it, gamers would be happier, etc.

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


Can you not just install the PhysX software and use the CPU?

You can; however, the calculations won't be as accurate, as the engine takes a lot of shortcuts to ensure the CPU doesn't end up bottlenecking.
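For anyone curious what that CPU fallback looks like from the developer side, here's a minimal sketch based on the public PhysX 3.x SDK; the thread count and timestep are illustrative assumptions, not anything from the article:

```cpp
#include <PxPhysicsAPI.h>  // public PhysX 3.x SDK umbrella header
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    // Core SDK objects
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // A scene driven by a CPU dispatcher: this is the path the simulation
    // takes when no usable NV GPU is present. Four worker threads is arbitrary.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step at 60 Hz; on the CPU path the per-frame solver budget is what
    // forces the accuracy shortcuts mentioned above.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```

On the hardware path the same scene would get a GPU dispatcher instead, and that is the part the driver withholds when it detects a non-Nvidia render GPU.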

QUOTE ME OR I PROBABLY WON'T SEE YOUR RESPONSE 

My Setup:

 

Desktop


CPU: Ryzen 9 3900X  CPU Cooler: Noctua NH-D15  Motherboard: Asus Prime X370-PRO  RAM: 32GB Corsair Vengeance LPX DDR4 @3200MHz  GPU: EVGA RTX 2080 FTW3 ULTRA (+50 core +400 memory)  Storage: 1050GB Crucial MX300, 1TB Crucial MX500  PSU: EVGA Supernova 750 P2  Chassis: NZXT Noctis 450 White/Blue OS: Windows 10 Professional  Displays: Asus MG279Q FreeSync OC, LG 27GL850-B

 

Main Laptop:


Laptop: Sager NP 8678-S  CPU: Intel Core i7 6820HK @ 2.7GHz  RAM: 32GB DDR4 @ 2133MHz  GPU: GTX 980m 8GB  Storage: 250GB Samsung 850 EVO M.2 + 1TB Samsung 850 Pro + 1TB 7200RPM HGST HDD  OS: Windows 10 Pro  Chassis: Clevo P670RG  Audio: HyperX Cloud II Gunmetal, Audio Technica ATH-M50s, JBL Creature II

 

Thinkpad T420:


CPU: i5 2520M  RAM: 8GB DDR3  Storage: 275GB Crucial MX30

 


There is no way to completely disable the iGPU. The only thing I can do is not connect any monitor to it, but it's still not disabled the way a video card is when you unplug it from the slot; if I connect a monitor, it instantly recognizes it and shows an extended desktop on my 2nd monitor...

I don't want to connect my 2nd monitor to the main GPU because I don't have enough VRAM; I run other stuff like YouTube on the 2nd screen when I game, and that uses VRAM.

Anyway, there aren't any PhysX games anymore, but if I ever want to play one, this will hurt.

Hmm, strange. On my Asus board there's a simple setting in the BIOS. Windows doesn't detect the iGPU, and nothing happens when I plug a monitor in.

I just took it for granted that motherboard manufacturers implemented a simple on/off switch for it like on my board. I guess this could be a bigger issue than I thought, then.

You can disable it in Windows, though. You do it from Device Manager: just right-click and then "Disable".
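If you'd rather script that than click through Device Manager, the same state change can be driven via the Win32 SetupAPI. A rough sketch, assuming the iGPU's description contains "Intel" (adjust for an AMD APU) and that it's run from an elevated prompt:

```cpp
#include <windows.h>
#include <setupapi.h>
#include <devguid.h>   // GUID_DEVCLASS_DISPLAY
#include <cstdio>
#include <cstring>
#pragma comment(lib, "setupapi.lib")

int main() {
    // Enumerate present display adapters (the same list Device Manager shows)
    HDEVINFO devs = SetupDiGetClassDevs(&GUID_DEVCLASS_DISPLAY, NULL, NULL,
                                        DIGCF_PRESENT);
    if (devs == INVALID_HANDLE_VALUE) return 1;

    SP_DEVINFO_DATA info = { sizeof(SP_DEVINFO_DATA) };
    for (DWORD i = 0; SetupDiEnumDeviceInfo(devs, i, &info); ++i) {
        char desc[256] = {0};
        SetupDiGetDeviceRegistryPropertyA(devs, &info, SPDRP_DEVICEDESC, NULL,
                                          (PBYTE)desc, sizeof(desc), NULL);

        // Assumption: the iGPU identifies itself with "Intel" in the name.
        if (!strstr(desc, "Intel")) continue;

        // Programmatic equivalent of right-click -> Disable in Device Manager.
        SP_PROPCHANGE_PARAMS pc = {};
        pc.ClassInstallHeader.cbSize = sizeof(SP_CLASSINSTALL_HEADER);
        pc.ClassInstallHeader.InstallFunction = DIF_PROPERTYCHANGE;
        pc.StateChange = DICS_DISABLE;          // DICS_ENABLE re-enables it
        pc.Scope       = DICS_FLAG_CONFIGSPECIFIC;
        pc.HwProfile   = 0;                     // current hardware profile

        if (SetupDiSetClassInstallParams(devs, &info, &pc.ClassInstallHeader,
                                         sizeof(pc)) &&
            SetupDiCallClassInstaller(DIF_PROPERTYCHANGE, devs, &info))
            printf("Disabled: %s\n", desc);
        else
            printf("Failed on: %s (not elevated?)\n", desc);
    }
    SetupDiDestroyDeviceInfoList(devs);
    return 0;
}
```

Whether a software disable like this is enough to satisfy the new driver check is another question, of course.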


First sentence is all kinds of wrong. No one is asking for PhysX to run on AMD/Intel hardware here, just that if people use either of those products they can also make use of their NV GPU. Side note: NV didn't produce PhysX; they bought it, and the company Ageia with it, then totally locked the platform down to the point where Ageia PPUs didn't even work anymore.

Second sentence is even more wrong. PhysX is a software engine that runs amazingly on any GPU, since it's great for parallel computing; that means AMD, Ageia (the old PPUs), and NV.

And finally, the last wrong thing you said. AMD does push the industry a lot. First of all, 64-bit computing: AMD64 was the first marketable solution, one Intel had to copy just to keep up.

And a list of AMD tech:

Mantle

TrueAudio

FreeSync (as well as getting variable refresh rate monitors adopted as a VESA standard)

TressFX (as well as GrassFX, which is in development)

Eyefinity (was far ahead of Surround and still is, supporting different resolutions across your setup)

Crossfire (now doesn't need connectors, which is a nice change)

Raptr Gaming Evolved (now has all the capabilities of ShadowPlay).

All developed by AMD. Now, I'm not bashing NV at all here, but they aren't the sole force pushing the industry, and all they did with PhysX was buy it and lock it down. Had Ageia not sold, we'd all have PhysX, more games would have it, gamers would be happier, etc.

No, all wrong (other than the AMD technologies listed, 90% of which are useless apart from HSA and 64-bit code, which, by the way, Intel had its own version of; AMD just beat them to market). PhysX is a hardware engine. Other implementations are DirectCompute software emulations which are nowhere close to being as efficient. AMD has never had a more power-efficient card, ever, despite gaining the performance crown in ye olden days. Their GPU engineering teams have more money than ever and nothing to show for it. Nvidia can render and simulate simultaneously in real time with the Titan Z. The R9 295X2 brought nothing new to the table. Mantle is the same sort of bought-out, locked-down project as Ageia. Mantle was an open-source project which AMD bought the rights to and shut down all outside efforts on, and it's gotten nowhere. Eyefinity was a flash in the pan, TrueAudio is lousy and has better competing algorithms, and the rest is ancient history.

Crossfire lost efficiency when they got rid of the connector, which is why Nvidia stuck with it and will be improving it with the GTX 900 series and SLI 2.0. FreeSync never should have been allowed to be patented, because it's a blatant ripoff of several techniques proposed in the decade leading up to it. AMD lost everything good about them when they fired Jim Keller the first time. Maybe now that he's back they'll move the industry forward a bit.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That is interesting. I didn't know you could pair a low-end Nvidia card with a high-end Radeon as a cheap way to get PhysX. Technically, you can use your CPU to get the PhysX in-game effects, like in Borderlands 2, though it could eat up your CPU's processing power.


No, all wrong (other than the AMD technologies listed, 90% of which are useless apart from HSA and 64-bit code, which, by the way, Intel had its own version of; AMD just beat them to market). PhysX is a hardware engine. Other implementations are DirectCompute software emulations which are nowhere close to being as efficient. AMD has never had a more power-efficient card, ever, despite gaining the performance crown in ye olden days. Their GPU engineering teams have more money than ever and nothing to show for it. Nvidia can render and simulate simultaneously in real time with the Titan Z. The R9 295X2 brought nothing new to the table. Mantle is the same sort of bought-out, locked-down project as Ageia. Mantle was an open-source project which AMD bought the rights to and shut down all outside efforts on, and it's gotten nowhere. Eyefinity was a flash in the pan, TrueAudio is lousy and has better competing algorithms, and the rest is ancient history.

Crossfire lost efficiency when they got rid of the connector, which is why Nvidia stuck with it and will be improving it with the GTX 900 series and SLI 2.0. FreeSync never should have been allowed to be patented, because it's a blatant ripoff of several techniques proposed in the decade leading up to it. AMD lost everything good about them when they fired Jim Keller the first time. Maybe now that he's back they'll move the industry forward a bit.

Mantle was built in-house and was not bought out. And Ageia PhysX was a lot more open, in the sense that you could put a PPU in any system and use it regardless of other hardware.

 

Crossfire lost nothing from getting rid of the connector; there are absolutely no sources for that claim.

 

AMD has never built a power-efficient card? Both the 780 Ti and the 290X have identical power draw under load, with about 8 watts' difference at idle.

 

TrueAudio has no competitors, since EAX and other technology like it isn't supported anymore.

 

Eyefinity is a flash in the pan? Is that because it's good? You're not coming across as very unbiased.

 

If we're comparing the Titan Z to the 295X2, we can point out that the Titan Z is slower than two Titan Blacks, costs a lot more, and its only selling point is that you take up 3 slots instead of 4. And that costs you what? Nearly £600, so you can gain a slot and have slower cards?

 

And finally, FreeSync: there is no patent on it, since it's part of the VESA DP 1.2a and DP 1.3 standards as variable refresh rate. AMD's FreeSync is a slight variation on that tech, and should you have Intel HD Graphics, I wouldn't be surprised if they took this up as well.

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


@Gunjob

Mantle was begun outside AMD by a group of OpenCL enthusiasts. AMD brought them onboard and took the whole thing (except some of the early API work) out of the open-source ecosystem.

The Titan Z is a compute card, not a gaming card, and Nvidia said that from day 1. It's a pinnacle card which does what I said it does: simulate and render simultaneously in real time, something the 295X2 is incapable of doing and a first for the industry. It has unlocked double precision, on the heels of Intel's Xeon Phi and far ahead of anything AMD's cards to date can do. Also, 375 watts vs. 500 is a HUGE difference. Any card that requires water cooling to function is an abject failure of engineering.

FreeSync is patented and licensed. I'm not biased at all. It's also the most basic and least powerful of the given solutions right now. G-Sync, as far as I've been able to pry information, is a cut above, though a small one.

I like to support underdogs, but AMD has yet to deliver in a manner worth supporting. Like I said: with Jim Keller back in charge now, the tides may turn.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


@patrickjp93

 

Yes, the 295X2 is a very power-hungry bugger. One could make the argument for the water cooling: since the 7990 and 690 were dual-slot cards, the next generation from both needed a lot more to keep them cool, the 295X2 using an AIO and the Titan Z a 3-slot cooler. The Titan Z does run 20°C hotter under load than the 295X2. Yes, the Titan Z is good at compute, but it's neither the best at it nor a dedicated compute card, as shown by it being slower than the FirePro W9100 and Tesla K40 in compute tasks and raw performance.

 

The Titan Z is a gaming card as long as Nvidia markets it as such: http://www.nvidia.co.uk/gtx-700-graphics-cards/gtx-titan-z/ doesn't say compute once. And as such it should be treated as a gaming card that is twice as expensive as, and slower in most games than, a 295X2. Now, for the few people who can't afford a Tesla or Quadro, the Titan/Black still makes more sense.

 

Ninja edit: it seems you got the wattage for the Titan Z wrong; it pulls 450-500 watts under load, matching the 295X2. Quote from Guru3D's review:

 

 
Power Consumption

The GK110B Kepler GPUs are rated as having a 250 Watt TDP, we measure that to be a little better though, roughly 225~245 Watts (under full stress) per GPU. At this performance level you are looking at a card that consumes 450 to 500 Watts, that is okay. We think an 850 Watt PSU would be sufficient. So while it's not great to have two GPUs consuming 500 Watts it could have been a lot worse really. Also let me state that we measure peak power consumption, the average power consumption is a good notch lower depending on GPU utilization.
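The arithmetic in that quote holds up. A back-of-envelope version, where the ~200 W rest-of-system budget is my own assumption rather than anything Guru3D measured:

```cpp
#include <cstdio>

int main() {
    // Guru3D's measured per-GPU draw under full stress
    const double perGpuLow = 225.0, perGpuHigh = 245.0;
    const int    gpus = 2;

    const double cardLow  = gpus * perGpuLow;   // 450 W
    const double cardHigh = gpus * perGpuHigh;  // 490 W

    // Assumed budget for CPU, motherboard, drives, fans (not from the review)
    const double restOfSystem = 200.0;
    const double systemPeak   = cardHigh + restOfSystem;  // ~690 W

    printf("card: %.0f-%.0f W peak, whole system: ~%.0f W\n",
           cardLow, cardHigh, systemPeak);
    printf("headroom on an 850 W PSU: ~%.0f W\n", 850.0 - systemPeak);
    return 0;
}
```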

 

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


I understand that they have competition and that it's their technology. BUT... IF I'M A FUCKING CUSTOMER, whether it's AMD or Nvidia, I EXPECT MY THING TO WORK AS ADVERTISED. It's like saying Nissan won't let me charge my Leaf if I have a Chevy Volt and a Honda Civic in my driveway as well. WTF is wrong with them?

It's more like them not letting you use the same charger, even though it has the same amps/watts/volts and connector, just so you buy their expensive solution.

Mein Führer... I CAN WALK !!


It's more like them not letting you use the same charger, even though it has the same amps/watts/volts and connector, just so you buy their expensive solution.

That's still bad. I saw Linus's dual-GPU video, and since I do video editing I thought how amazing that would be. What I'd like to do is get an AMD GPU one day and an Nvidia one the other, and this is stupid.

Space Journal #1: So Apparently i  was dropped on the moon like i'm a mars rover, in a matter of hours i have found the transformers on the dark side of the moon. Turns out its not that dark since dem robots are filled with lights, i waved hi to the Russians on the space station, turns out all those stories about space finding humans instead of the other way around is true(soviet Russia joke). They threw me some Heineken beer and I've been sitting staring at the people of this forum and earth since. 


@Gunjob, the 295X2 has a TDP of 500 watts (vs. Nvidia's 375), but it pulls 700 (vs. Nvidia's 530) at full load (firsthand experience). Tesla and FirePro are supercomputer cards with reduced instruction sets and more redundant computation units; it's hardly a fair contest. The Titan Z is a mini supercomputer for enthusiast consumers, and it has been called that since its demo. It's a far more powerful, feature-rich card. The 295X2 is just fast at gaming and barely sells at a profit (4%, given yield problems). It's still a crime to call it good engineering.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


@Gunjob, the 295X2 has a TDP of 500 watts (vs. Nvidia's 375), but it pulls 700 (vs. Nvidia's 530) at full load (firsthand experience). Tesla and FirePro are supercomputer cards with reduced instruction sets and more redundant computation units; it's hardly a fair contest. The Titan Z is a mini supercomputer for enthusiast consumers, and it has been called that since its demo. It's a far more powerful, feature-rich card. The 295X2 is just fast at gaming and barely sells at a profit (4%, given yield problems). It's still a crime to call it good engineering.

 

No, the Titan Z has been advertised as a gaming card from the beginning; they even say so in their own press-release blogs:

http://blogs.nvidia.com/blog/2014/03/25/titan-z/

http://www.geforce.com/whats-new/articles/announcing-the-geforce-gtx-titan-z

Rock On!


I just hope they don't do something like this, and instead add support but note that it's not certified, just there.

That would be nice.

Current system - ThinkPad Yoga 460

ExSystems


Laptop - ASUS FX503VD

|| Case: NZXT H440 ❤️|| MB: Gigabyte GA-Z170XP-SLI || CPU: Skylake Chip || Graphics card : GTX 970 Strix || RAM: Crucial Ballistix 16GB || Storage:1TB WD+500GB WD + 120Gb HyperX savage|| Monitor: Dell U2412M+LG 24MP55HQ+Philips TV ||  PSU CX600M || 

 


No, the Titan Z has been advertised as a gaming card from the beginning; they even say so in their own press-release blogs:

http://blogs.nvidia.com/blog/2014/03/25/titan-z/

http://www.geforce.com/whats-new/articles/announcing-the-geforce-gtx-titan-z

Please actually go watch the demo of the card (on YouTube). Yes, the GeForce branding automatically requires it to be gaming-friendly, but you're an idiot if you think that was the primary purpose of the card. It's a preliminary design for the new dual-GPU Quadro line, with dual-blower cooler designs like the GTX 880 reference cooler.

The Titan Z is not for gaming unless you're a rich twat Nvidia diehard with no common sense.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No, the Titan Z has been advertised as a gaming card from the beginning; they even say so in their own press-release blogs:

http://blogs.nvidia.com/blog/2014/03/25/titan-z/

http://www.geforce.com/whats-new/articles/announcing-the-geforce-gtx-titan-z

Yes, a gaming card, but it is specifically intended for people who need a workstation but don't want to build a second system if they want to play games as well. Its intended market is very narrow, and not just because of the price. None of the Titans provides gamers with any benefit over a 780 Ti. They only exist to perform the dual function of creating a hybrid workstation and gaming PC. Yes, some people who don't need the double-precision floating-point calculations will still buy the cards, but that's their choice. There will always be people who will spend money to say they have the "best" hardware.

There is no AMD alternative to this solution. You would need two completely different cards to offer everything the Titan series offers.

i7 2600K @ 4.7GHz/ASUS P8Z68-V Pro/Corsair Vengeance LP 2x4GB @ 1600MHz/EVGA GTX 670 FTW SIG 2/Cooler Master HAF-X

 

http://www.speedtest.net/my-result/3591491194


Interesting

CPU: I7 3770k @4.8 ghz | GPU: GTX 1080 FE SLI | RAM: 16gb (2x8gb) gskill sniper 1866mhz | Mobo: Asus P8Z77-V LK | PSU: Rosewill Hive 1000W | Case: Corsair 750D | Cooler:Corsair H110| Boot: 2X Kingston v300 120GB RAID 0 | Storage: 1 WD 1tb green | 2 3TB seagate Barracuda|

 


@Gunjob

FreeSync is patented and licensed. I'm not biased at all. It's also the most basic and least powerful of the given solutions right now. G-Sync, as far as I've been able to pry information, is a cut above, though a small one.

FreeSync is, for sure. But FreeSync is AMD's solution built on Adaptive-Sync. Adaptive-Sync is what's in DP 1.2a and 1.3, and FreeSync is what uses that tech on AMD's hardware. Guess what: Adaptive-Sync is the standard, and any company may use it, Intel and Nvidia included. They only have to develop their own solution to work with Adaptive-Sync the way AMD uses FreeSync. Intel may do this, but chances are Nvidia won't.

AMD didn't create Adaptive-Sync, but they helped push it forward and are the only ones utilizing its technology at the moment.

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbine 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


FreeSync is, for sure. But FreeSync is AMD's solution built on Adaptive-Sync. Adaptive-Sync is what's in DP 1.2a and 1.3, and FreeSync is what uses that tech on AMD's hardware. Guess what: Adaptive-Sync is the standard, and any company may use it, Intel and Nvidia included. They only have to develop their own solution to work with Adaptive-Sync the way AMD uses FreeSync. Intel may do this, but chances are Nvidia won't.

AMD didn't create Adaptive-Sync, but they helped push it forward and are the only ones utilizing its technology at the moment.

Okay, you confused me for a bit because you never mentioned Adaptive-Sync. Also, no: G-Sync was in the works in the same time frame, and Intel will probably implement their own solution with the successor to Cannonlake. At that point their graphics will be on par with the upcoming GTX 880.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Why is this news? I'm pretty sure this happened many years ago; the only way to make it work was with hacked drivers and lots of headaches.

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


Nvidia is becoming the GPU version of Apple. Too much anti-competitive BS, so I'm glad I chose AMD over Nvidia.

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


Okay, you confused me for a bit because you never mentioned Adaptive-Sync. Also, no: G-Sync was in the works in the same time frame, and Intel will probably implement their own solution with the successor to Cannonlake. At that point their graphics will be on par with the upcoming GTX 880.

Huh? I get the sarcasm about not mentioning Adaptive-Sync, but no what? That Nvidia won't use Adaptive-Sync? That's a given... And G-Sync is a separate entity, and of course it was in the works for a while. Hard to say what Intel will do at this point, but they're already looking at Mantle, so who knows; it shows they are open to the idea of using the same technology as AMD.

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbine 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


Huh? I get the sarcasm about not mentioning Adaptive-Sync, but no what? That Nvidia won't use Adaptive-Sync? That's a given... And G-Sync is a separate entity, and of course it was in the works for a while. Hard to say what Intel will do at this point, but they're already looking at Mantle, so who knows; it shows they are open to the idea of using the same technology as AMD.

Intel is not looking at using Mantle. They asked if they could play with it. That means they'd develop their own, vastly superior product. Also, no, your original posts did not mention Adaptive-Sync; they only brought up FreeSync. All FreeSync did was beat G-Sync to market with a half-assed solution (I've seen and tested the code; it's brute force and jittery).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

