BiG StroOnZ

NVIDIA Fires Shots at AMD’s 7nm Tech - Claims "Can Create Most Energy-efficient GPU in the World Anytime"

Recommended Posts

20 minutes ago, ryao said:

Comparing ray tracing on SGI workstations to what Nvidia has done is laughable. That occurred on the CPU and was only done as part of render farms for 3D movies. Doing it as part of a video game for regular people just did not happen, because regular people did not have SGI hardware, and the hardware could not do real-time ray tracing at any decent resolution.

 

The ray tracing card that you linked could not touch Nvidia's ray-tracing capability, which is something like 1,000 times higher. It therefore was not useful for real-time ray tracing and, consequently, was not marketed for it. Furthermore, it was something no end user would buy and no game developer would touch. It was even less mass-market than this:

 

https://www.red.com/red-rocket-x

 

Coincidentally, the RTX cards make the Red Rocket X obsolete too:

 

https://fstoppers.com/news/nvidia-and-red-unveil-gpu-solution-8k-real-time-editing-320356

 

I do not understand the desire to minimize Nvidia's achievement. Doing 10 billion rays per second to render a 4K scene in real time is amazing. Nobody had done that with commodity hardware before Nvidia did it.

The card he linked to could do 4 million rays per second; the RTX cards can do billions per second. It doesn't matter which way we split the hairs; trying to dismiss Nvidia's contribution is just childish ignorance at best.
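A rough back-of-envelope makes the gap concrete. The figures below are my own assumptions (a 3840x2160 target at 60 Hz) with both quoted ray rates taken at face value:

# Back-of-envelope ray budget at 4K/60 (assumed figures, not measurements)
PIXELS_4K = 3840 * 2160                  # ~8.29 million pixels per frame
FPS = 60                                 # real-time target
pixels_per_second = PIXELS_4K * FPS      # ~498 million pixels per second

for name, rays_per_second in [("~4 Mray/s accelerator", 4e6),
                              ("~10 Gray/s RTX-class GPU", 10e9)]:
    rays_per_pixel = rays_per_second / pixels_per_second
    print(f"{name}: {rays_per_pixel:.4f} rays per pixel per frame")

# ~4 Mray/s accelerator: 0.0080 rays per pixel per frame (about 1 ray per 124 pixels)
# ~10 Gray/s RTX-class GPU: 20.0939 rays per pixel per frame

One ray per 124 pixels versus 20 rays per pixel is the difference between "technically traces rays" and something usable in a game.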

 

Most of us don't even care who made the first, the best, or the biggest GPU, or even who makes the next best; so long as it's better and doesn't require me to sell a kidney, I'll buy it.


QuicK and DirtY. Read the CoC; it's like a guide on how not to be a moron. Also, I don't have an issue with the VS series.


No one denies RTX is impressive as a feature. What we're criticizing is the fact that it's a commercial flop. They promised all these games, and what did we get? Two bloody games after months of waiting, and one extra (Metro Exodus) half a year after the RTX launch. And that's it. That DLSS, or whatever the fuck it's even named, is useless tech. It gives better performance and makes everything blurry. If it were reverse DSR and worked with any game I'd even be fine with it, but this nonsense with "deep learning" bollocks and requiring games to have special support for it makes it the most useless thing in the world.

 

NVIDIA should really have thought this through better and made a rock-solid deal with developers to deliver at least 10 titles with full RTX support on day one, or within a month of the RTX cards' release. But 3 games after months and months of waiting is just pathetic and useless. Bloody Tomb Raider still hasn't gotten its RTX patch. People have finished it and forgotten about it already; no one's even going to try it with RTX if the promised support ever arrives.

 

It's gonna be cool when ray tracing becomes the norm, but that won't happen soon, no matter how hard NVIDIA is pushing it (which is not that hard, really).

10 hours ago, RejZoR said:

No one denies RTX is impressive as a feature. What we're criticizing is the fact that it's a commercial flop. They promised all these games, and what did we get? Two bloody games after months of waiting, and one extra (Metro Exodus) half a year after the RTX launch. And that's it. That DLSS, or whatever the fuck it's even named, is useless tech. It gives better performance and makes everything blurry. If it were reverse DSR and worked with any game I'd even be fine with it, but this nonsense with "deep learning" bollocks and requiring games to have special support for it makes it the most useless thing in the world.

 

NVIDIA should really have thought this through better and made a rock-solid deal with developers to deliver at least 10 titles with full RTX support on day one, or within a month of the RTX cards' release. But 3 games after months and months of waiting is just pathetic and useless. Bloody Tomb Raider still hasn't gotten its RTX patch. People have finished it and forgotten about it already; no one's even going to try it with RTX if the promised support ever arrives.

 

It's gonna be cool when ray tracing becomes the norm, but that won't happen soon, no matter how hard NVIDIA is pushing it (which is not that hard, really).

Sounds like Nvidia should hire you to run their business, because clearly they don't know what you know and haven't really been successful as a result. 🙄


QuicK and DirtY. Read the CoC; it's like a guide on how not to be a moron. Also, I don't have an issue with the VS series.


I don't know; why do you think Intel is hiring so many people who are just very experienced users/testers, and not people with PhDs in economics and shit? Because they're able to view things from a different perspective that may not align with the usual corporate mumbo jumbo. That's why.

11 hours ago, RejZoR said:

No one denies RTX is impressive as a feature. What we're criticizing is the fact that it's a commercial flop. They promised all these games, and what did we get? Two bloody games after months of waiting, and one extra (Metro Exodus) half a year after the RTX launch. And that's it. That DLSS, or whatever the fuck it's even named, is useless tech. It gives better performance and makes everything blurry. If it were reverse DSR and worked with any game I'd even be fine with it, but this nonsense with "deep learning" bollocks and requiring games to have special support for it makes it the most useless thing in the world.

 

NVIDIA should really have thought this through better and made a rock-solid deal with developers to deliver at least 10 titles with full RTX support on day one, or within a month of the RTX cards' release. But 3 games after months and months of waiting is just pathetic and useless. Bloody Tomb Raider still hasn't gotten its RTX patch. People have finished it and forgotten about it already; no one's even going to try it with RTX if the promised support ever arrives.

 

It's gonna be cool when ray tracing becomes the norm, but that won't happen soon, no matter how hard NVIDIA is pushing it (which is not that hard, really).

What kind of logic is this?

DX9, 10, 11, and 12 games didn't just pop up right away; they needed hardware support first. Plus, you don't want to alienate a huge audience by focusing on newer tech. As a developer, you dabble and offer a little support until it becomes the norm or goes mainstream.

DX12 has been out for how long? Vulkan?

How many games are using them?

OK then, shit takes time, sadly.

 

But you may have your commercial-flop opinion; then again, you just showed how your logic works.


I love it when companies are like, "we could do this whenever we want." Like, OK, then why don't you? I'm not saying Nvidia couldn't do it whenever they want, but, like, whoop-de-doo.

 

 

Also, the RTX 2070 has two power connectors, while all 10-series cards up to the 1080 had one 8-pin and sipped power. They kinda took a step back in power consumption with the 20 series, IMO.
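For what it's worth, the spec power limits make that easy to sanity-check. A quick sketch using the standard PCIe ceilings (which connectors a given 2070 actually has varies by partner board, so the configurations below are generic examples, not claims about specific cards):

# PCIe power ceilings by connector configuration (spec limits, in watts)
SLOT = 75    # PCIe x16 slot
PIN6 = 75    # 6-pin PCIe power connector
PIN8 = 150   # 8-pin PCIe power connector

configs = {
    "slot + one 8-pin": SLOT + PIN8,              # 225 W ceiling
    "slot + 8-pin + 6-pin": SLOT + PIN8 + PIN6,   # 300 W ceiling
    "slot + two 8-pin": SLOT + 2 * PIN8,          # 375 W ceiling
}
for name, watts in configs.items():
    print(f"{name}: up to {watts} W")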


The kind of logic where, if you push some shit and brag so hard about it, you should at least have a significant backlog of games featuring it. You can't compare a whole API with a single feature from one, which is what RTX is. NVIDIA should have made sure enough big titles were ready to go on the release date of the RTX cards. One thing is impressing people with tech videos; another is saying "we'll have 5 RTX-ready games within a week of the cards' release." People would have gone mad for it. Instead, a big load of nothing happened.

28 minutes ago, pas008 said:

What kind of logic is this?

DX9, 10, 11, and 12 games didn't just pop up right away; they needed hardware support first. Plus, you don't want to alienate a huge audience by focusing on newer tech. As a developer, you dabble and offer a little support until it becomes the norm or goes mainstream.

DX12 has been out for how long? Vulkan?

How many games are using them?

OK then, shit takes time, sadly.

 

But you may have your commercial-flop opinion; then again, you just showed how your logic works.

I have had this exact discussion with RejZoR before; he doesn't really understand that side of the technology. For him it's just a superficial whinge about NVIDIA being crap, for whatever reason only he understands.


QuicK and DirtY. Read the CoC; it's like a guide on how not to be a moron. Also, I don't have an issue with the VS series.

10 minutes ago, RejZoR said:

The kind of logic where, if you push some shit and brag so hard about it, you should at least have a significant backlog of games featuring it. You can't compare a whole API with a single feature from one, which is what RTX is. NVIDIA should have made sure enough big titles were ready to go on the release date of the RTX cards. One thing is impressing people with tech videos; another is saying "we'll have 5 RTX-ready games within a week of the cards' release." People would have gone mad for it. Instead, a big load of nothing happened.

Wow, you are lost. You think devs should make games on tech that doesn't exist yet?

And once it's released, it still takes time to utilize it.

 

1 hour ago, RejZoR said:

The kind of logic where, if you push some shit and brag so hard about it, you should at least have a significant backlog of games featuring it. You can't compare a whole API with a single feature from one, which is what RTX is. NVIDIA should have made sure enough big titles were ready to go on the release date of the RTX cards. One thing is impressing people with tech videos; another is saying "we'll have 5 RTX-ready games within a week of the cards' release." People would have gone mad for it. Instead, a big load of nothing happened.

Ray tracing is a change to the entire graphics pipeline, as big as, if not bigger than, some of the changes DirectX has gone through. Compound that with the fact that nobody is really going to start a 3+ year development cycle to support a hardware feature on hardware that does not exist yet, with no way of testing or optimizing. Expecting 5, 10, or 20 games at the release of new graphics hardware is totally unrealistic; in fact, when was the last time that many games were released at the same time? It doesn't even happen with new consoles, where you're lucky to get 5, and console developers get much better and earlier access to development hardware.

 

Game developers won't, and shouldn't, fit their schedules around a GPU company's release of hardware or technology like you want them to. You might as well just ask Nvidia to be a game development studio and publisher at that point.

4 hours ago, pas008 said:

Wow, you are lost. You think devs should make games on tech that doesn't exist yet?

Some people just prefer to live with their delusional fantasies of how reality works when it gives them an excuse to hate on game developers (and the hardware companies) for existing as for-profit organizations 😂.

6 hours ago, leadeater said:

Ray tracing is a change to the entire graphics pipeline, as big as, if not bigger than, some of the changes DirectX has gone through. Compound that with the fact that nobody is really going to start a 3+ year development cycle to support a hardware feature on hardware that does not exist yet, with no way of testing or optimizing. Expecting 5, 10, or 20 games at the release of new graphics hardware is totally unrealistic; in fact, when was the last time that many games were released at the same time? It doesn't even happen with new consoles, where you're lucky to get 5, and console developers get much better and earlier access to development hardware.

 

Game developers won't, and shouldn't, fit their schedules around a GPU company's release of hardware or technology like you want them to. You might as well just ask Nvidia to be a game development studio and publisher at that point.

Erm, have you thought that you're looking at this the wrong way? You're saying they couldn't release features for cards that didn't exist yet. I was thinking more along the lines of NVIDIA making sure devs are capable of doing that before releasing the cards. You know, not releasing them before their prime time?

 

Why do you all think people were buying graphics cards, proper 3D ones, like mad in the past? Because the Doom and Quake games were pushing the boundaries of graphics and were basically the only games to utilize the tech to such an extent. People WANTED to buy new hardware because of that. Today it doesn't matter whether you have a 250€ graphics card or a 1200€ one; the end experience is basically the same, it's just a question of how fluid it is. And it's pretty fluid even with cheap cards, as long as they aren't absolute bottom crap.

If RTX had been marketed properly, people would WANT the RTX cards for it. What did we get instead? Almost everyone recommending you just buy an old 1080 Ti, because it makes no difference that it's an older generation without RTX. NVIDIA released this awesome, super-realistic feature with basically zero incentive to jump on it other than a bunch of hyped videos that are next to worthless to a gamer. Essentially, they only needed one high-profile game fully ready to go with RTX on release day and it would have made all the difference, and for that they should have partnered with a developer before the launch. They were making this thing for 10 years, if the CEO of NVIDIA is to be believed, and yet in 10 years they were unable to make a deal and ensure that. Come on, who's the idiot here?

It would also have been easy to convince shareholders to postpone the launch if they were pressuring them, because rushing a product when the competition has nothing, and then having this awesome hardware no one can really use, is all the R&D time and money thrown down the toilet. No one's gonna buy a 1200€ card for the future, to maybe use this feature some day. It's struggling to do it now; it's not gonna get faster in the next 2-3 years...

12 minutes ago, RejZoR said:

NVIDIA making sure devs are capable of doing that before releasing the cards.

Why? Why should Nvidia seed cards to devs months or years in advance just so you can have a few extra launch titles? Why make everyone wait for new, faster hardware because of a subset of features? You know everything else about the cards works, right? Is the gaming industry supposed to be at the beck and call of Nvidia? No.

 

It makes zero sense to delay either games or hardware so they perfectly line up in a magical world that shouldn't even exist anyway. It's a cooperative relationship, not a controlling one; you can work together as well as work independently.

11 minutes ago, RejZoR said:

Erm, have you thought that you're looking at this the wrong way?

This thought might surprise you, but most of us consider things with an open mind. In fact, it seems you are the only one (plus maybe two others) in this discussion who has been unwilling to comprehend the size and intricacies of technological development.

Quote

You're saying they couldn't release features for cards that didn't exist yet. I was thinking more along the lines of NVIDIA making sure devs are capable of doing that before releasing the cards. You know, not releasing them before their prime time?

 

NVIDIA has been working with the devs to make sure they can use it. In fact Nvidia, like everyone else working on RT, has been working closely with developers and industry specialists for many years now to bring this tech to market.


QuicK and DirtY. Read the CoC; it's like a guide on how not to be a moron. Also, I don't have an issue with the VS series.

8 hours ago, leadeater said:

 

To be fair, Nvidia did make bold claims about it being easy to implement, if my memory serves me right. So in a way, not having games for it, while understandable, isn't entirely acceptable considering those claims.

(And to be fair, they could have trained DLSS models for way more games if they really wanted to. That would not take 3 years per game; it should be around a week to a month of training the network, and around the same time to include it in the AA pipeline, so it could have been rolled out far more widely.)
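For a sense of scale, here's a toy sketch of what "training the network" for one game could look like; the model, sizes, and random stand-in frames are placeholders of mine, not NVIDIA's actual DLSS pipeline:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy per-game "DLSS-style" training: learn a 2x upscaler.
# Random tensors stand in for frame pairs; a real pipeline would
# train on low-res renders paired with high-quality reference frames.

class ToyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 sub-pixels per pixel for 2x2
            nn.PixelShuffle(2),                  # rearrange channels into 2x resolution
        )

    def forward(self, x):
        return self.net(x)

model = ToyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):                        # a real run is days/weeks, not 100 steps
    low = torch.rand(4, 3, 64, 64)             # placeholder "low-res frames"
    high = F.interpolate(low, scale_factor=2)  # placeholder "reference frames"
    loss = F.l1_loss(model(low), high)
    opt.zero_grad()
    loss.backward()
    opt.step()

The per-game cost is frame capture plus GPU-hours of training, not years of engine work, which is why the week-to-month estimate seems plausible.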

4 hours ago, mr moose said:

This thought might surprise you, but most of us consider things with an open mind. In fact, it seems you are the only one (plus maybe two others) in this discussion who has been unwilling to comprehend the size and intricacies of technological development.

NVIDIA has been working with the devs to make sure they can use it. In fact Nvidia, like everyone else working on RT, has been working closely with developers and industry specialists for many years now to bring this tech to market.

I know enough to know that no one's gonna buy a super-duper cool product if they can't use its features. Especially when it costs a bloody 1000€ or more...

23 minutes ago, RejZoR said:

I know enough to know that no one's gonna buy a super-duper cool product if they can't use its features. Especially when it costs a bloody 1000€ or more...

Well, if you were looking at buying a 1080 Ti you might as well get a 2080; if you were looking at a 1080, you might as well get a 2070, etc. If you're in the market for a new card, would it not be best to buy the best card you can at the price point you were already looking at? The RTX cards have more improvements than just ray tracing, primarily fixing the hardware shortcomings in relation to DX12 and Vulkan.

 

Even if you don't really like the state of RTX as it is now, protesting by buying a worse product is not intelligent.


Meanwhile, Vega wipes the floor with Turing in GPU compute performance...


...is there a question here? 🤔

sudo chmod -R 000 /*

What is scaling and how does it work? Asus PB287Q unboxing! Console alternatives :D Watch Netflix with Kodi on Arch Linux Sharing folders over the internet using SSH Beginner's Guide To LTT (by iamdarkyoshi)

Sauron's™ Product Scores:

Spoiler

Just a list of my personal scores for some products, in no particular order, with brief comments. I only just got the idea to do them, so there aren't many for now :)

Don't take these as complete reviews or final truths - they are just my personal impressions of products I may or may not have used, summed up in a couple of sentences and a rough score. All scores heavily take into account the unit's price and time of release, so don't expect absolute performance to be reflected here.

 

-Lenovo Thinkpad X220 - [8/10]

Spoiler

A durable and reliable machine that is relatively lightweight, has all the hardware it needs to never feel sluggish and has a great IPS matte screen. Downsides are mostly due to its age, most notably the screen resolution of 1366x768 and usb 2.0 ports.

 

-Apple Macbook (2015) - [Garbage -/10]

Spoiler

From my perspective, this product has no redeeming factors given its price and the competition. It is underpowered, overpriced, impractical due to its single port and is made redundant even by Apple's own iPad pro line.

 

-OnePlus X - [7/10]

Spoiler

A good phone for the price. It does everything I (and most people) need without being sluggish and has no particularly bad flaws. The lack of recent software updates and relatively barebones feature kit (most notably the lack of 5GHz wifi, biometric sensors and backlight for the capacitive buttons) prevent it from being exceptional.

 

-Microsoft Surface Book 2 - [Garbage - -/10]

Spoiler

Overpriced and rushed, offers nothing notable compared to the competition, doesn't come with an adequate charger despite the premium price. Worse than the Macbook for not even offering the small plus sides of having macOS. Buy a Razer Blade if you want high performance in a (relatively) light package.

 

-Intel Core i7 2600/k - [9/10]

Spoiler

Quite possibly Intel's best product launch ever. It had all the bleeding edge features of the time, it came with a very significant performance improvement over its predecessor and it had a soldered heatspreader, allowing for efficient cooling and great overclocking. Even the "locked" version could be overclocked through the multiplier within (quite reasonable) limits.

 

-Apple iPad Pro - [5/10]

Spoiler

A pretty good product, sunk by its price (plus the extra cost of the physical keyboard and the pencil). Buy it if you don't mind the Apple tax and are looking for a very light office machine with an excellent digitizer. Particularly good for rich students. Bad for cheap tinkerers like myself.

 

 

14 minutes ago, leadeater said:

Well, if you were looking at buying a 1080 Ti you might as well get a 2080; if you were looking at a 1080, you might as well get a 2070, etc. If you're in the market for a new card, would it not be best to buy the best card you can at the price point you were already looking at? The RTX cards have more improvements than just ray tracing, primarily fixing the hardware shortcomings in relation to DX12 and Vulkan.

 

Even if you don't really like the state of RTX as it is now, protesting by buying a worse product is not intelligent.

I think the mindset is that people don't want to pay for an RTX card for two, maybe three, reasons. First, used prices on Pascal and RX/Vega compared to the current offerings from Nvidia: you can get a 1080 Ti, Vega 64, Vega 56, 1080, 1070 Ti, or 1070 for less than 400 used off eBay. At new prices the gap is probably less drastic, but that's part of it. Second, people don't want to pay the price of Turing and then have RTX and DLSS off most of the time; they feel like they're overpaying for features they don't use, is one defense I've heard. Third, they're just put off by the lack of RTX support, and where they have seen it applied, DLSS causes a blur fest and RTX cuts into the frame rate. The only reason I refuse to buy the 2080 Ti, while I can afford it, is the fuck-you price from Nvidia due to the lack of competition. I'm perfectly fine with a Radeon VII for my use case, where it actually is better. And I would like a 2080 Ti if Nvidia lowered the price a little, but not now, with that excessive price tag for no reason other than, "We have no competition, so fuck you, buy it anyway! Come on, having the funds makes you better!"


VashTheStampede 4.0:

CPU: AMD Threadripper 1950x | CPU Cooling: EKWB Liquid Cooling (EK-Supremacy sTR4 RGB - Nickel, EK-CoolStream SE 280, EK-Vardar EVO 140ER Black x 2, EK-XRES 100 SPC-60 MX PWM (incl. pump), EK-ACF Fitting 10/13mm - Red (6-pack), EK-DuraClear 9,5/12,7mm 3M, and Scarlet Red Premix) | Compound: Thermal Grizzly Kryonaut | Mobo: Asrock X399 Taichi | Ram: G.Skill Ripjaws V 32GBs (2x16) DDR4-3200 | Storage: Crucial MX500 500GB M.2-2280 SSD/PNY CS900 240GB SSD/Toshiba X300 4TB 7200RPM | GPU: Sapphire Radeon VII | Case: Fractal Define R5 Blackout Edition w/Window | PSU: EVGA SuperNOVA G2 750W 80+ Gold | Operating System: Windows 10 Pro | Keyboard: Ducky Shine 7 Blackout Edition with Cherry MX Silent Reds | Mouse: Corsair M65 Pro RGB FPS | Headphones: AKG K7XX Massdrop Editions (replacing with K712s) | Mic: Audio-Technica ATR2500 | Speakers: Mackie MR624 Studio Monitors

 

Surtr:

CPU: AMD Ryzen 3 2200G (temp; upgrading to a Zen 2 3950X) | CPU Cooling: Wraith (Dark Rock Pro 4 later) | Compound: Thermal Grizzly Kryonaut | Mobo: Asrock X470 Taichi | Ram: G.Skill Ripjaws V 16GBs (2x8) DDR4-3200 | Storage: PNY - BX500 240 GB SSD + Seagate Constellation ES.3 1TB 7200RPM | GPU: PowerColor - Radeon RX VEGA 64 8 GB RED DEVIL | Case: Corsair - SPEC-DELTA RGB | PSU: EVGA SuperNOVA G2 750W 80+ Gold | Optical Drive: Random HP DVD Drive | Operating System: Windows 10 | Keyboard: Corsair K70 with Cherry MX Reds | Mouse: Corsair M65 Pro RGB FPS | Speakers: JBL LSR 305 Studio Monitors (at some point)

 

Prince of Dark Rock:

CPU: AMD Ryzen 5 2600 | CPU Cooling: be quiet! - Dark Rock Pro 4 | Compound: Thermal Grizzly Kryonaut | Mobo: MSI B450 Tomahawk | Ram: G.Skill Ripjaws V 8GBs (2x4) DDR4-3200 | Storage: Crucial - BX200 240 GB SSD + Seagate Constellation ES.3 1TB 7200RPM | GPU: EVGA - GeForce GTX 1060 6GB SSC | Case: Cooler Master - MasterBox MB511 | PSU: Corsair - CXM 550W | Optical Drive: Random HP DVD Drive | Operating System: Windows 10 Home | Keyboard: Rosewill - NEON K85 RGB BR | Mouse: Razer DeathAdder Elite Destiny 2 Edition

On 3/30/2019 at 7:04 AM, Chett_Manly said:

I mean, he's not wrong. AMD's 7nm product delivers roughly the performance of the 12nm RTX 2080 at a higher power draw; Nvidia's Turing architecture is more power-efficient at 12nm than AMD is at 7nm.

 

Load Power Consumption - Battlefield 1

345 watts... oh wait... that's total system power consumption... OK, OK, never mind.
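Which is exactly why total-system charts mislead perf-per-watt arguments: the rest of the platform inflates both sides. A quick sketch with made-up illustrative numbers:

# Total-system power skews perf/W comparisons; all figures below are
# illustrative assumptions, not measurements of any real card.
REST_OF_SYSTEM = 150  # assumed CPU/board/drive draw under gaming load, watts

cards = {
    "Card A": {"fps": 95,  "system_watts": 345},
    "Card B": {"fps": 100, "system_watts": 375},
}
for name, c in cards.items():
    gpu_watts = c["system_watts"] - REST_OF_SYSTEM
    print(f"{name}: {c['fps'] / c['system_watts']:.3f} fps/W at the wall, "
          f"{c['fps'] / gpu_watts:.3f} fps/W GPU-only estimate")

Assuming the rest of the platform draws about the same in both setups, the GPU-only efficiency gap is always larger than the at-the-wall gap, so system-level charts understate the real difference.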

3 hours ago, leadeater said:

Well if you were looking at buying a 1080Ti you might as well get a 2080, if you were looking at a 1080 you might as well get a 2070 etc. If you're in the market for a new card would it not be best to buy the best card you can for the same price point you were looking at, the RTX cards have more improvements than just Ray Tracing and that's primarily in fixing all the hardware shortcomings in relation to DX12 and Vulkan.

 

Even if you don't really like the state of RTX as it is now protesting buying a worse product is not intelligent.

a) I bought a 1080 Ti two months after release, which will make it two years old this June, IIRC, so I'm not protesting anything

b) The 1080 Ti was way cheaper than the RTX 2080 when the RTX 2080 was released

c) Basically everyone recommended the 1080 Ti over RTX, including Linus and Jayz along with a bunch of others; they only stopped when prices stabilized and GTX 1080 Tis weren't available in stores anymore

52 minutes ago, RejZoR said:

b) The 1080 Ti was way cheaper than the RTX 2080 when the RTX 2080 was released

Barely. Most 1080 Tis were still going for $700+ when the 2080 launched.


Dell S2417DG, RTX 2080 XC, R5 3600, MSI B350M Mortar, 2x8GB 3600MHz CL16


And how much was the RTX 2080? 699? My ass it was... The Ti version was 1200€ and the regular was 1000€...

52 minutes ago, RejZoR said:

And how much was the RTX 2080? 699? My ass it was... The Ti version was 1200€ and the regular was 1000€...

800-ish, but third-party 1080 Tis were selling for 750 to 800 at the time, with dual fans at least.

 

8 hours ago, valdyrgramr said:

I think the mindset is that people don't want to pay for an RTX card for two, maybe three, reasons. First, used prices on Pascal and RX/Vega compared to the current offerings from Nvidia: you can get a 1080 Ti, Vega 64, Vega 56, 1080, 1070 Ti, or 1070 for less than 400 used off eBay. At new prices the gap is probably less drastic, but that's part of it. Second, people don't want to pay the price of Turing and then have RTX and DLSS off most of the time; they feel like they're overpaying for features they don't use, is one defense I've heard. Third, they're just put off by the lack of RTX support, and where they have seen it applied, DLSS causes a blur fest and RTX cuts into the frame rate. The only reason I refuse to buy the 2080 Ti, while I can afford it, is the fuck-you price from Nvidia due to the lack of competition. I'm perfectly fine with a Radeon VII for my use case, where it actually is better. And I would like a 2080 Ti if Nvidia lowered the price a little, but not now, with that excessive price tag for no reason other than, "We have no competition, so fuck you, buy it anyway! Come on, having the funds makes you better!"

I never liked the price increase either; that's not how it's supposed to happen, but the problem is that's what we have now. We can stand around having a bitch about it all day, but that doesn't actually make the RTX cards worse than what they are at the given price points. Realistically, most people buy new, and with the actual improvements in Turing, plus how "well" Nvidia supports GPU architectures after they've been succeeded, I could only ever recommend buying the same-priced Turing over the Pascal.

This topic is now closed to further replies.