
New Titan X Arctic Storm?

OAcesync

Same here! 2-way Titan X's. 12GB is also going to last a very long time. I'm a video game development major, and since people use Titan GPUs for both gaming and productivity, they can get by for a while. The games I'm going to develop (AAA games, mind you) will be on par with Crytek's Crysis trilogy, whose first release (Crysis) was so insanely beautiful that no hardware of its time could really handle it. My games will be a little more laid back, so that the highest-end hardware available at release can handle them at 1080p.

 

-Example-

First game VRAM usage (Max Graphics Settings, No AA): 8-10 GB

Best Card at those settings: GeForce GTX Titan X

 

What does this mean? A lot, to me: developers can really tap into that amount of VRAM to push texture quality and model detail to unprecedented levels. Crytek has done it before, and they're willing to do it again. Players got to enjoy something they never thought possible.
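
(For scale, here's a rough back-of-the-envelope sketch in Python of how much VRAM individual textures can cost. The resolutions, bytes-per-pixel figures, and mip overhead are assumptions for illustration only, not numbers from any specific engine or game.)

```python
# Approximate VRAM cost of a single square texture, in MiB.
# Assumptions (illustrative only): 4 bytes/pixel for uncompressed RGBA8,
# ~1 byte/pixel for block-compressed formats such as BC7, and a full mip
# chain adding roughly one third on top of the base level.

def texture_mib(resolution, bytes_per_pixel, mipmaps=True):
    size = resolution * resolution * bytes_per_pixel
    if mipmaps:
        size += size // 3  # mip chain overhead (~33%)
    return size / (1024 * 1024)

for res in (2048, 4096, 8192):
    print(f"{res}x{res}: "
          f"{texture_mib(res, 4):7.1f} MiB uncompressed RGBA8, "
          f"{texture_mib(res, 1):6.1f} MiB block-compressed")
```

Even at these sizes, it takes a lot of unique, uncompressed, always-resident textures to fill 8-10 GB, which is where the points about compression and asset reuse raised below come in.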

lol wat.

 

I'm an actual game dev and god is this some bull... Not only are you not making AAA games straight out of college (unless you're really, really good or know someone in a studio), but no matter how good your game looks, if it's actively pulling over 3GB of VRAM, you're doing something wrong. Which you are, if you're straight out of school: not instancing and referencing your objects, using textures at insane resolutions, over-the-top poly counts that could be reproduced with textures, stuff like that.
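
(To put rough numbers on the instancing point, here is a minimal Python sketch. The 32-byte vertex layout, 50,000-vertex mesh, and 1,000 copies are assumed values for illustration, not figures from any real project.)

```python
# Why instancing/referencing keeps VRAM down: store the mesh once plus a
# small per-instance transform, instead of duplicating the geometry.
# All sizes below are assumptions for illustration.

BYTES_PER_VERTEX = 32           # e.g. position + normal + UV as floats
BYTES_PER_INSTANCE_XFORM = 64   # one 4x4 float matrix per instance

mesh_vertices = 50_000          # one detailed prop
copies = 1_000                  # scattered around a level

# Naive: every copy gets its own vertex data in VRAM.
naive = mesh_vertices * BYTES_PER_VERTEX * copies

# Instanced: one vertex buffer, plus a per-instance transform buffer.
instanced = mesh_vertices * BYTES_PER_VERTEX + copies * BYTES_PER_INSTANCE_XFORM

mib = 1024 * 1024
print(f"naive duplication: {naive / mib:8.1f} MiB")
print(f"instanced:         {instanced / mib:8.1f} MiB")
```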

 

Also, BTW, I know some Crytek artists, and no, they aren't willing to do it again; they mostly just do stuff to keep the company afloat.

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


Please include a source link in your post.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Setup

CPU: i3 4160 | Motherboard: MSI Z97 PC MATE | RAM: Kingston HyperX Blue 8GB (2x4GB) | GPU: Sapphire Nitro R9 380 4GB | PSU: Seasonic M12II EVO 620W Modular | Storage: 1TB WD Blue | Case: NZXT S340 Black | PCIe devices: TP-Link WDN4800 | Monitor: ASUS VE247H | Others: PS3/PS4


lol wat.

 

I'm an actual game dev and god is this some bull... Not only are you not making AAA games straight out of college (unless you're really, really good or know someone in a studio), but no matter how good your game looks, if it's actively pulling over 3GB of VRAM, you're doing something wrong. Which you are, if you're straight out of school: not instancing and referencing your objects, using textures at insane resolutions, over-the-top poly counts that could be reproduced with textures, stuff like that.

 

Also, BTW, I know some Crytek artists, and no, they aren't willing to do it again; they mostly just do stuff to keep the company afloat.

How about Ubisoft? How about CD Projekt Red? How about Rocksteady Studios? Their latest games have some ridiculous system requirements (IMO: a GeForce GTX 770 or 780 recommended, and Batman's Ultra requirements call for a 980) because they want their players to experience the game the way they intended. James Cameron's Avatar took ten years in the making, and the film looks awesome, paired with writing strong enough that the story is as good as the visuals.

When you go to the trouble of making a game as realistic as it can be, it will draw more than 3GB of VRAM even with all the optimizations done. The short story, from my point of view, is that I want my future players to experience something the way I would experience it myself. It may not be in the next year or so; that will come much later. Games that only draw up to 3GB are more like mainstream titles to me, Tomb Raider for example. And when Batman: Arkham Knight goes back on sale after the patches are done, it will still use about 4GB because of the texture resolution and 3D models that are realistic in proportion and physical behavior, down to how skin caves in when struck.

Oh, and when you model a realistic vehicle from real life (name any brand and model), you will need tens of thousands of polygons to sculpt the body. Trust me, I've seen vehicles modeled at that level of detail in videos AND in person in a 3D animation class.

And you're right in some meaningful ways. Any of my games that require Titan GPUs would need them mostly for 4K gaming, and I know that won't come soon, probably not for a few years, once I have some experience in the industry; it largely depends on how fast I gain experience and how good my projects are. Remember, you're talking to someone with serious expectations, just like James Cameron has his. Did you know that the first major film he made as director, producer, and writer took him nine years before he even started the project, and it went on to garner universal acclaim? If it takes me that long before my first game, that's okay, because he went through it too.

More VRAM means higher-quality textures and models. Remember that. More and more companies are starting to take that to heart; I'm not responsible for that, I'm just relaying it from their perspective.

Try 3D scanning a very porous cinder block, would you please, and then tell me how many points the scanner generated.
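
(For what it's worth, raw geometry and scan data can be put in rough numbers too. A short Python sketch follows; the triangle, vertex, and point counts and the byte layouts are assumptions for illustration, not measurements of any real asset or scanner output.)

```python
# Back-of-the-envelope geometry sizes. Assumptions (illustrative only):
# 32-byte vertices, 32-bit triangle indices, raw scan points stored as
# three 32-bit floats (XYZ) each.

mib = 1024 * 1024

# A hypothetical game-ready car: ~100k triangles, ~60k unique vertices.
tris, verts = 100_000, 60_000
mesh_bytes = verts * 32 + tris * 3 * 4   # vertex buffer + index buffer

# A hypothetical dense scan of a porous cinder block: 5 million points.
scan_points = 5_000_000
scan_bytes = scan_points * 12            # XYZ as 3 floats per point

print(f"100k-triangle car mesh: {mesh_bytes / mib:5.1f} MiB")
print(f"5M-point raw scan:      {scan_bytes / mib:5.1f} MiB")
```

Under these assumptions, dense geometry and scans still land in the megabytes, which helps show why so much of a VRAM budget discussion comes back to texture data.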

RIGZ


Starlight (Current): AMD Ryzen 9 3900X 12-core CPU | EVGA GeForce RTX 2080 Ti Black Edition | Gigabyte X570 Aorus Ultra | Full Custom Loop | 32GB (4x8GB) Dominator Platinum SE Blackout #338/500 | 1TB + 2TB M.2 NVMe PCIe 4.0 SSDs, 480GB SATA 2.5" SSD, 8TB 7200 RPM NAS HDD | EVGA NU Audio | Corsair 900D | Corsair AX1200i | Corsair ML120 2-pack 5x + ML140 2-pack

 

The Storm (Retired): Intel Core i7-5930K | Asus ROG STRIX GeForce GTX 1080 Ti | Asus ROG RAMPAGE V EDITION 10 | EKWB EK-KIT P360 with Hardware Labs Black Ice SR2 Multiport 480 | 32GB (4x8GB) Dominator Platinum SE Blackout #338/500 | 480GB SATA 2.5" SSD + 3TB 5400 RPM NAS HDD + 8TB 7200 RPM NAS HDD | Corsair 900D | Corsair AX1200i + Black/Blue CableMod cables | Corsair ML120 2-pack 2x + NB-BlackSilentPro PL-2 x3

STRONK COOLZ 9000


EK-Quantum Momentum X570 Aorus Master monoblock | EK-FC RTX 2080 + Ti Classic RGB Waterblock and Backplate | EK-XRES 140 D5 PWM Pump/Res Combo | 2x Hardware Labs Black Ice SR2 480 MP and 1x SR2 240 MP | 10X Corsair ML120 PWM fans | A mixture of EK-KIT fittings and EK-Torque STC fittings and adapters | Mayhems 10/13mm clear tubing | Mayhems X1 Eco UV Blue coolant | Bitspower G1/4 Temperature Probe Fitting

DESK TOIS


Glorious Modular Mechanical Keyboard | Glorious Model D Featherweight Mouse | 2x BenQ PD3200Q 32" 1440p IPS displays + BenQ BL3200PT 32" 1440p VA display | Mackie ProFX10v3 USB Mixer + Marantz MPM-1000 Mic | Sennheiser HD 598 SE Headphones | 2x ADAM Audio T5V 5" Powered Studio Monitors + ADAM Audio T10S Powered Studio Subwoofer | Logitech G920 Driving Force Steering Wheel and Pedal Kit + Driving Force Shifter | Logitech C922x 720p 60FPS Webcam | Xbox One Wireless Controller

QUOTES


"So because they didn't give you the results you want, they're biased? You realize that makes you biased, right?" - @App4that

"Brand loyalty/fanboyism is stupid." - Unknown person on these forums

"Assuming kills" - @Moondrelor

"That's not to say that Nvidia is always better, or that AMD isn't worth owning. But the fact remains that this forum is AMD biased." - @App4that

"I'd imagine there's exceptions to this trend - but just going on mine and my acquaintances' purchase history, we've found that budget cards often require you to turn off certain features to get slick performance, even though those technologies are previous gen and should be having a negligible impact" - ace42

"2K" is not 2560 x 1440 


I thought Nvidia doesn't allow third-party Titan cards, which is why EVGA had to make a separate standalone cooler?


...

 

Also, BTW, I know some Crytek artists, and no, they aren't willing to do it again; they mostly just do stuff to keep the company afloat.

It was expected, but it still makes me kinda sad to read it.
