
NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

CtW

I feel like you're either an AMD fanboy, or a troll...

 

And did you even read the posts I linked?

 

The PhysX engine is NOT the same as PhysX effects.

Each time there is an argument, is that what you do: play the troll card or call someone a fanboy?

I got a 970, and I got screwed over with the 3.5 GB BS. I don't want the PC market to have exclusives based on which graphics card you own.

 

By the way, this is what I'm talking about: http://www.reddit.com/r/pcmasterrace/comments/367qav/mark_my_word_if_we_dont_stop_the_nvidia_gameworks


Don't worry, we gave up in the other thread too. 

 

PhysX-accelerated effects require CUDA, which AMD REFUSED to license, so it's not Nvidia's problem that AMD can't get good particle acceleration. AMD refused to license CUDA and to build PhysX-capable drivers back in the day. 

The PhysX physics engine is no different (though far more modern) from Havok, which is another physics engine; both are CPU-bound and don't care what GPU you use. 

 

People don't want to understand the difference; they'd rather stay stubborn fanboys. 

I mean, darn, it's almost like some of these are familiar and used in other games because they all do the same thing...


.


I get this entirely. They spent time, money and resources developing it, so why should they share it with someone that contributed in no way whatsoever?

Just look at this scenario from this point of view: a couple of people work hard making something, and as soon as they show their work, another group comes along and demands the right to copy and use it. That just doesn't make sense at all...

And this is coming from someone who uses an AMD GPU, so I'm certainly not an Nvidia fanboy.

Yeah I'm with you on this man, I think you misunderstood my quote ;)

MacBook Pro 15" 2018 (pretty much the only system I use)


AMD developed HBM with Hynix and gave it to Nvidia; they made TressFX and gave it to Nvidia; they made Mantle, which became the better Vulkan, and gave that to Nvidia too.

 

Nvidia is a bitch! It buys PhysX and doesn't give it to AMD, makes this kind of shit and doesn't give it to AMD, makes G-Sync for $150 when the same tech on AMD is fucking free!

 

so

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


I mean, darn, it's almost like some of these are familiar and used in other games because they all do the same thing...

[attachment: physics engines.jpg]

 

Whoa now wait just a minute, you can't throw around facts like that in a place like this! 

 

I know Havok is outdated as hell, but I still love the memories that broken engine gave me from Halo. 

Okay, back to the facts so people can shut the fuck up about how PhysX is terrible. 

 

PhysX the physics engine powers a lot of games and a lot of game engines. It is baked into the damn thing and is CPU-reliant; IT DOES NOT GIVE A FLYING FUCK what your GPU is. 

 

PhysX as accelerated effects requires CUDA, which AMD refused to license. So if you want to bitch about not having CUDA-accelerated PhysX effects on your AMD card, blame AMD. It's also why performance tanks when you shift PhysX effects onto your CPU: your CPU is already handling a physics engine and its own calculations, and now you want it to do real-time calculations for effects that are not scripted. 

Good luck with that shit on consumer CPUs. 
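The distinction above can be sketched in code. A physics engine tick is ordinary arithmetic over positions and velocities, which is why it runs on the CPU and doesn't care what GPU is installed. This is an illustrative semi-implicit Euler integrator, not actual PhysX code:

```python
# Semi-implicit Euler integration for a point mass: the kind of per-frame
# work a CPU-bound physics engine does, with no GPU involvement at all.
GRAVITY = -9.81  # m/s^2

def step(bodies, dt):
    """Advance each body one timestep (vertical axis only)."""
    for b in bodies:
        b["vel"] += GRAVITY * dt        # integrate acceleration into velocity
        b["pos"] += b["vel"] * dt       # then velocity into position
        if b["pos"] < 0.0:              # crude ground-plane collision
            b["pos"] = 0.0
            b["vel"] = -b["vel"] * 0.5  # inelastic bounce

ball = {"pos": 10.0, "vel": 0.0}
for _ in range(60):                     # simulate one second at 60 Hz
    step([ball], 1.0 / 60.0)
print(round(ball["pos"], 2))            # still falling, roughly 5 m up
```

GPU-accelerated PhysX effects are a separate layer on top of this kind of loop: extra particles, cloth and debris offloaded through CUDA, which is exactly the part AMD cards can't run.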

 
PhysX in video games

PhysX technology is used by game engines such as Unreal Engine (version 3 onwards), Unity, Gamebryo, Vision (version 6 onwards), Instinct Engine,[22] Panda3D, Diesel, Torque, HeroEngine and BigWorld.[23]

As one of the handful of major physics engines, it is used in many games, such as Mirror's Edge, Bulletstorm, Need for Speed: Shift, Castlevania: Lords of Shadow, Mafia II, Alice: Madness Returns, Batman: Arkham City, Borderlands 2 and Kerbal Space Program. Most of these games use the CPU to process the physics simulations.

Video games with optional support for hardware-accelerated PhysX often include additional effects such as tearable cloth, dynamic smoke or simulated particle debris.[24][25][26]


.....why doesn't AMD just come up with their own equivalent of gameworks?

 

Or was that essentially what Mantle was?

 

AMD does have its own version, but it's open source and, as such, comes with less support from AMD than if it were licensed like GameWorks (at least in theory).

 

--------------

 

I believe that Ryan Shrout believes he is being the voice of reason, although with the sheer number of years/decades he has been in the business, you can feel in his words that he is tired as s*** of seeing fanboy wars over drivel, and would rather review exciting products, and not have to deal with the drama that comes from aggressive competition inside a lopsided market. Hell, I think any enthusiast wants selection and choice in what product they buy. It pisses me off that I'm stuck with Intel for gaming.

 

This whole debate over GameWorks affects us enthusiasts greatly, while it has very little impact on someone who just wants a 'puter to play games on, regardless of who manufactures the parts. If there is anything Ryan is overlooking, it's the unsightly little problem where choice becomes a rehearsal. With each generation of graphics cards, I'm glued to my computer or cellphone, excited for what AMD and Nvidia have to offer, and sometimes I take the green pill, and sometimes the red pill. Proprietary software may be competition between corporations, but it's a punch to the gut for us enthusiasts who want some semblance of choice, where we can be excited for product launches. Right now I'm hanging on either the 390X or the 980 Ti, and I want that choice to feel like it's my own, dammit.  :angry:  :D

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


 

Whoa now wait just a minute, you can't throw around facts like that in a place like this! 

 

I know Havok is outdated as hell, but I still love the memories that broken engine gave me from Halo. 

Okay, back to the facts so people can shut the fuck up about how PhysX is terrible. 

 

PhysX the physics engine powers a lot of games and a lot of game engines. It is baked into the damn thing and is CPU-reliant; IT DOES NOT GIVE A FLYING FUCK what your GPU is. 

 

PhysX as accelerated effects requires CUDA, which AMD refused to license. So if you want to bitch about not having CUDA-accelerated PhysX effects on your AMD card, blame AMD. It's also why performance tanks when you shift PhysX effects onto your CPU: your CPU is already handling a physics engine and its own calculations, and now you want it to do real-time calculations for effects that are not scripted. 

Good luck with that shit on consumer CPUs. 

 

 

Now you just wait one gosh-darned minute here. Borderlands 2 used PhysX as a physics engine? But that must surely run terribly on AMD cards!

 

Oh shit, it runs fine. Even with PhysX enabled.

[attachment: PhysX.png]



Why is GameWorks getting bashed so hard continuously? Because of the lack of a competing AMD API pretty much. Seriously???

 

No. Because it's turning PC gaming into console gaming: You like X series of games? Oh too bad it might not be a Sony/MS exclusive but the hair effects? Might as well be. Even if you consider AMD to be inferior, there is no chance of them improving their hardware and drivers for things Nvidia has convinced devs no one else can play with.

 

It might not seem like much; it certainly isn't for me (I can deal with less impressive hair on wolves; I'll focus on trying to kill them, not pause the game to take screenshots and video). But what's good for the goose should be good for the gander. We're really not far from a game saying "you need GameWorks and you can't turn it off", making it effectively unplayable on AMD. After that, AMD will of course leave the market, and then you can expect Nvidia to charge $300 for entry-level cards and $1500 for an x80 card (you can sell your car for a Titan, of course), and nobody will be able to do shit about it, because ultimately you wanted fancy hair and didn't care what it took to get it.

 

For as much shit as AMD gets, they did at least eventually give everybody Mantle.

-------

Current Rig

-------


Each time there is an argument, is that what you do: play the troll card or call someone a fanboy?

I got a 970, and I got screwed over with the 3.5 GB BS. I don't want the PC market to have exclusives based on which graphics card you own.

 

By the way, this is what I'm talking about: http://www.reddit.com/r/pcmasterrace/comments/367qav/mark_my_word_if_we_dont_stop_the_nvidia_gameworks

It's not going to be exclusivity for games... Effects, maybe. But that's the price premium on the Nvidia cards... AMD does the SAME thing with Mantle and Gaming Evolved...

 

The 970 VRAM didn't screw you over. Games don't use more than 3.5 GB of VRAM at a resolution the gimped GM204 GPU can run...

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD

Spoiler

sex hahaha


Now you just wait one gosh-darned minute here. Borderlands 2 used PhysX as a physics engine? But that must surely run terribly on AMD cards!

 

Oh shit, it runs fine. Even with PhysX enabled.

 

 

PhysX in Borderlands 2 was shit though xD My god, it made the game chug so much I instantly disabled it. (i5-4670K + GTX 670 at the time.)


 

But that does not put the blame on Nvidia by default. Nvidia offering GameWorks services/libraries should not inherently mean the game will be unplayable on non-Nvidia hardware. And the only people that hurts financially are the developers; Nvidia couldn't care less, they're paid for the libraries anyway. So again, I'm mainly blaming the developer for not making the GameWorks libraries optional (and the game itself agnostic), by which I mean a feature you can toggle on/off. And AMD for A. not providing their own services and/or asking royalties for said services, and B. not pitching in when the developer asked (allegedly).

 
 
I feel like your cognitive bias makes you approach things very one-sidedly and not even humor the other options. It runs better on Maxwell hardware; you can explain that as malicious, or you can explain it by the fact that it's a different architecture and can thus have better performance in certain tasks, which is not by design or malicious intent. I'm willing to bet they didn't set out to gimp their own consumer base. I'd lose all faith in the company, tbh, and I have absolutely no reason to believe they're this stupid. They've shown, more than AMD, that they know how to run a business profitably.

 

 

AMD not asking royalties for TressFX is their own damn fault. They make a lot of dumb mistakes; hence they're doing so poorly. It's not hard to connect the dots and see how utterly misguided AMD is. It's like the Konami of computer hardware. GameWorks being attractive to developers says to me that the libraries are worth a damn and save them time. If they sucked, they'd be putting their own product at risk. And the only reason they're experiencing issues at the end of development is because AMD isn't willing to play ball. No developer sets out to create a game that will only run on Nvidia hardware; you'd leave money on the table.

 

You think that AMD making everything open source and not asking royalties somehow gains them karma and eventually revenue? No, that's not how it works. It makes them look weak, unsure of their product and unable to run a company. It invites consumer mistrust and devalues their products as a result, as is painfully evident from their constant decline in market share and being dubbed "the cheap alternative".

 

 

 

I have no idea what point you're trying to make here, but if you're advocating that no company can have digital rights or company secrets, or can ask royalties for their property, you're delusional.

 

Fair enough, I'll agree that regardless of what happens, the developer chose this path. I'm just on the side of the fence where, had Nvidia been a little more pro-consumer, they wouldn't offer something that limits developers so extremely. That said, in the end the developers are the ones who made that choice, so they can take the blame for this particular choice. 

 

I'm not here to argue about who's doing better from a business standpoint. That's incredibly obvious and not even arguable. I'm saying things are locked down and anti-consumer, and you seem to be arguing that it's fine because Nvidia makes more money. They make more money. Fantastic. That doesn't make locking a chunk of the market out of software any more OK.

 

I'm not saying anything like they can't have secrets. However, allegedly Nvidia purposefully gimps AMD hardware with a piece of software. This was a massive issue with Intel vs AMD, and a fight that Intel lost as it was deemed anti-competitive. Why is this not an issue? How is this different? I'm not mad that they have this thing they license out; I'm mad that there are fairly decent grounds that it's intentionally slowing competing hardware, and apparently nobody has an issue with it because Nvidia is "just competing"? 

CPU: Intel i5 4690k w/ Noctua NH-D15 GPU: Gigabyte G1 980 Ti MOBO: MSI Z97 Gaming 5 RAM: 16GB Corsair Vengeance Boot drive: 500GB Samsung Evo Storage: 2x 500GB WD Blue, 1x 2TB WD Black, 1x 4TB WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


PhysX in Borderlands 2 was shit though xD My god, it made the game chug so much I instantly disabled it. (i5-4670K + GTX 670 at the time.)

Ran perfectly fine for everyone I know.



It's not going to be exclusivity for games... Effects, maybe. But that's the price premium on the Nvidia cards... AMD does the SAME thing with Mantle and Gaming Evolved...

 

The 970 VRAM didn't screw you over. Games don't use more than 3.5 GB of VRAM at a resolution the gimped GM204 GPU can run...

 

GTA V can if you crank it high enough, especially at higher resolutions and in SLI setups, where you are more likely to use it. Now, 90% of 970 users got it as a 1080p card because they want it to be overpowered and easily handle 1080p, but you can't claim this isn't a problem, because 10% of the owners of a very popular card is quite significant.
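For a sense of scale in this VRAM back-and-forth, the raw buffer arithmetic is easy to sketch. Colour render targets alone are small even at 4K; the bulk of VRAM goes to textures, which is why cranked-up settings in a game like GTA V can push past 3.5 GB regardless of render resolution. Illustrative arithmetic only, assuming 32-bit colour and triple buffering:

```python
def render_target_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate VRAM taken by colour buffers alone (triple buffering)."""
    return width * height * bytes_per_pixel * buffers / 1024 ** 2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    # Even 4K colour buffers come in under 100 MB; textures dwarf this.
    print(f"{name}: {render_target_mb(w, h):.1f} MB")
```

The takeaway either way: whether a 970 hits its slow 0.5 GB segment depends far more on texture settings than on whether the GPU "can run" a given resolution.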



No. Because it's turning PC gaming into console gaming: You like X series of games? Oh too bad it might not be a Sony/MS exclusive but the hair effects? Might as well be. Even if you consider AMD to be inferior, there is no chance of them improving their hardware and drivers for things Nvidia has convinced devs no one else can play with.

 

It might not seem like much; it certainly isn't for me (I can deal with less impressive hair on wolves; I'll focus on trying to kill them, not pause the game to take screenshots and video). But what's good for the goose should be good for the gander. We're really not far from a game saying "you need GameWorks and you can't turn it off", making it effectively unplayable on AMD. After that, AMD will of course leave the market, and then you can expect Nvidia to charge $300 for entry-level cards and $1500 for an x80 card (you can sell your car for a Titan, of course), and nobody will be able to do shit about it, because ultimately you wanted fancy hair and didn't care what it took to get it.

 

For as much shit as AMD gets, they did at least eventually give everybody Mantle.

 

 

^^^^^^^^^^^^^^^^^^^^^^^^

So much of this. So many people are apparently completely OK with this, defending it in the name of "competition". 

 

Though going out of business because of it may be a bit extreme, we are dangerously close to a "you want to play this series? Better have an Nvidia card, or you're royally f*cked!" world. 


 


PhysX in Borderlands 2 was shit though xD My god, it made the game chug so much I instantly disabled it. (i5-4670K + GTX 670 at the time.)

 

PhysX effects would tank almost any single-GPU system for a while. With a 670, and even a 680, you needed a card dedicated to PhysX processing. With the 780, 780 Ti, and the Titan, that started to become less of a problem.


AMD developed HBM with Hynix and gave it to Nvidia; they made TressFX and gave it to Nvidia; they made Mantle, which became the better Vulkan, and gave that to Nvidia too.

 

Nvidia is a bitch! It buys PhysX and doesn't give it to AMD, makes this kind of shit and doesn't give it to AMD, makes G-Sync for $150 when the same tech on AMD is fucking free!

 

so

-snip--

They didn't make HBM; they invested in Hynix so they could get it first... They didn't make Vulkan either; that was Khronos, using some of AMD's Mantle code... TressFX is the only point you made that stands...

 

Mantle was offered to Nvidia; they turned it down.

CUDA licensing (which would have led to PhysX running on AMD cards, and possibly the whole GameWorks suite) was offered to AMD, who turned it down...

 

And G-Sync is superior to FreeSync...

Due to its custom G-Sync module, which drives up the cost but widens the range over which it's effective.

PCPer article on it: http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ
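The "range" claim being argued here is about the variable-refresh window: a monitor only syncs natively between its minimum and maximum VRR rates, and below the minimum the driver must either multiply frames (as the G-Sync module does) or fall back to v-sync/tearing. A toy model of that decision, with made-up example numbers rather than any specific monitor's spec:

```python
def sync_mode(fps, vrr_min, vrr_max):
    """Rough model of how a driver could map a frame rate onto a VRR window."""
    if fps > vrr_max:
        return "capped"                # above the window: cap or tear
    if fps >= vrr_min:
        return "native"                # refresh rate tracks the frame rate
    mult = 2                           # below the window: repeat each frame
    while fps * mult < vrr_min:        # until the repeat rate lands inside it
        mult += 1
    if fps * mult <= vrr_max:
        return f"doubled x{mult}"      # low-framerate compensation
    return "vsync-fallback"            # window too narrow to repeat into

# Hypothetical 40-144 Hz panel:
print(sync_mode(90, 40, 144))   # native
print(sync_mode(25, 40, 144))   # doubled x2
```

The sketch also shows why a narrow window hurts: a hypothetical 48-75 Hz panel can't repeat a 30 fps signal into its own range (60 fits, but 30 needs x2 which lands at 60... only if the window reaches it), which is the kind of case the PCPer article examines.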



They didn't make HBM; they invested in Hynix so they could get it first... They didn't make Vulkan either; that was Khronos, using some of AMD's Mantle code... TressFX is the only point you made that stands...

 

Mantle was offered to Nvidia; they turned it down.

CUDA licensing (which would have led to PhysX running on AMD cards, and possibly the whole GameWorks suite) was offered to AMD, who turned it down...

 

And G-Sync is superior to FreeSync...

Due to its custom G-Sync module, which drives up the cost but widens the range over which it's effective.

Has this been proven? I'm pretty sure this isn't a thing. Article pls?


 


No. Because it's turning PC gaming into console gaming: You like X series of games? Oh too bad it might not be a Sony/MS exclusive but the hair effects? Might as well be. Even if you consider AMD to be inferior, there is no chance of them improving their hardware and drivers for things Nvidia has convinced devs no one else can play with.

 

It might not seem like much; it certainly isn't for me (I can deal with less impressive hair on wolves; I'll focus on trying to kill them, not pause the game to take screenshots and video). But what's good for the goose should be good for the gander. We're really not far from a game saying "you need GameWorks and you can't turn it off", making it effectively unplayable on AMD. After that, AMD will of course leave the market, and then you can expect Nvidia to charge $300 for entry-level cards and $1500 for an x80 card (you can sell your car for a Titan, of course), and nobody will be able to do shit about it, because ultimately you wanted fancy hair and didn't care what it took to get it.

 

For as much shit as AMD gets, they did at least eventually give everybody Mantle.

Titles that use GameWorks are definitely playable on AMD cards, though... just not as well as on Nvidia's offerings.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


Has this been proven? I'm pretty sure this isnt a thing. article pls?

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Just edited it in :)



I'm not here to argue about who's doing better from a business standpoint. That's incredibly obvious and not even arguable. I'm saying things are locked down and anti-consumer, and you seem to be arguing that it's fine because Nvidia makes more money. They make more money. Fantastic. That doesn't make locking a chunk of the market out of software any more OK.

 

I'm not saying anything like they can't have secrets. However, allegedly Nvidia purposefully gimps AMD hardware with a piece of software. This was a massive issue with Intel vs AMD, and a fight that Intel lost as it was deemed anti-competitive. Why is this not an issue? How is this different? I'm not mad that they have this thing they license out; I'm mad that there are fairly decent grounds that it's intentionally slowing competing hardware, and apparently nobody has an issue with it because Nvidia is "just competing"? 

 

How is Nvidia locking out a chunk of the market by providing a service to a developer? Something the developer decides. I'm just not seeing it, nor how that is anti-consumer by Nvidia's doing. Their libraries should not be free by default; a company is allowed to charge what it wants for proprietary software, and it's up to the consumer or user base to decide whether they're willing to cough up the bucks. That is how the consumer/producer relationship works, at least as far as luxury products go. This changes when you talk about stuff like medicine or basic needs like food.

 

If you have to use the word "allegedly" when accusing Nvidia of gimping AMD hardware, maybe you shouldn't really utter the accusation in the first place. Because on the internet, many people making alleged accusations turns into a "truth" after a certain critical mass is formed. I'm also not seeing how Cinebench is related to GameWorks: the gimping of Cinebench was proven, the gimping of AMD cards wasn't. You're welcome to investigate, but don't accuse if you haven't a shred of evidence pointing to this. 


PhysX effects would tank almost any single-GPU system for a while. With a 670, and even a 680, you needed a card dedicated to PhysX processing. With the 780, 780 Ti, and the Titan, that started to become less of a problem.

 

Oh jeez, I remember playing Mirror's Edge on a 660, and during the scene where you run through the glass walkway and hundreds of bullets are streaming in... yeah, luckily we could turn hardware PhysX off completely back then.


 


 

Fair enough. I feel that with new revisions of monitors/drivers this could improve, but the point stands. 


 


PhysX effects would tank almost any single-GPU system for a while. With a 670, and even a 680, you needed a card dedicated to PhysX processing. With the 780, 780 Ti, and the Titan, that started to become less of a problem.

 

Nah, even on my GTX 970 it tanked to about 40 fps in some hectic situations. I did see the CPU load spike, so maybe the GPU wasn't doing the PhysX in the first place.


Oh jeez, I remember playing Mirror's Edge on a 660, and during the scene where you run through the glass walkway and hundreds of bullets are streaming in... yeah, luckily we could turn hardware PhysX off completely back then.

 

I remember that too. Mirror's Edge was a really good showcase for PhysX effects, assuming you could run them. I had a 9800 GTX at the time, and for Mirror's Edge and Arkham Asylum I got a second one.


Nah, even on my GTX 970 it tanked to about 40 fps in some hectic situations. I did see the CPU load spike, so maybe the GPU wasn't doing the PhysX in the first place.

 

When I had a 780 I had no problems running it with PhysX on, so that might be it.


Guest
This topic is now closed to further replies.

