
Nvidia shows off post-Volta GPUs in next-generation Drive PX

AlTech

Audi as well. They have this partnership.

I do know that Honda's latest redesigns, like the 10th-generation Civic (the 2016, 2017 and 2018 models), the latest CR-V models, and I think the Odyssey line as well, feature a Tegra SoC of some kind for the infotainment system. Apparently, anyway. No idea whether the sensor system uses a Tegra SoC or not.


1 hour ago, GoodBytes said:

Maybe in some research lab. But FAA certification is so extensive... there is a reason why a plane sports 3DFX chips for its graphical interface, and not a GeForce GTX 1080.

 

Flying in the air is simple. Landing and take-off are where you need a pilot, and they are the most dangerous parts of flying. A highly trained pilot is needed, with quick thinking to manage all sorts of changing situations as the plane takes off and lands.

 

If AI in planes were a legit thing, trust me, airlines would ditch pilots on day 1.

Aircraft can in fact already take off and land themselves, though I'm not sure in what breadth of conditions. Of course, technological and regulatory reality don't always coincide.


49 minutes ago, Taf the Ghost said:

FAA cert takes between 3-5 years, from what I've been told.

Yes... that is if you have all the requirements for the FAA, which means that every single line of code and every single circuit trace on a chip and board is fully documented. In addition, it requires multi-year testing that can easily take 5+ years, as the company that validates the hardware is 100% responsible if a circuitry flaw or bug is later discovered in a chip. That doesn't include the software that will run on it. So now we are 10 years behind. And that is assuming Nvidia, AMD, Intel or any ARM SoC manufacturer wants to provide such information in the first place, as it also becomes public. Meaning, if Nvidia does it for the GTX 1080, you can bet Intel and AMD will copy ideas. So no manufacturer wants to go in, and if they do, it is always with completely outdated hardware, or because they have nothing to lose. Not to mention driver validation, which can only happen once everything is tested and approved, if it gets approved. Imagine waiting 4 years and 6 months into a 5-year test for validation, and then a bug is found. Now you not only have to fix it, but restart the whole process. Once you have everything, NOW you can bring it all to the FAA for certification.


1 hour ago, AluminiumTech said:

*TCAS: Traffic Collision Avoidance System.

The generic term is ACAS https://en.m.wikipedia.org/wiki/Airborne_collision_avoidance_system

1 hour ago, AluminiumTech said:

But Moore's Law isn't a real law. It's just an observation one man made over 10 years ago.

When I say "let Moore's Law take care of it", I mean that developers don't spend any time choosing better solutions or optimizing, because they know technology will progress enough that the next generation of hardware will give them the performance they really wanted.
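To put a toy number on that mindset: if you assume the commonly cited cadence of performance roughly doubling every two years (an assumption, not a law, exactly as noted above), you can estimate how long "waiting it out" takes for a given amount of sloppiness. A minimal sketch:

```python
import math

# Toy model of "let Moore's Law take care of it": assume available performance
# roughly doubles every `doubling_period_years` (the commonly cited ~2-year
# cadence -- an assumption, not a guarantee).
def years_until_fast_enough(slowdown_factor: float, doubling_period_years: float = 2.0) -> float:
    """How long to wait for hardware to absorb a given inefficiency factor."""
    return doubling_period_years * math.log2(slowdown_factor)

print(years_until_fast_enough(4.0))  # 4.0  -> a 4x-too-slow engine is "fixed" in ~4 years
print(years_until_fast_enough(1.5))  # ~1.2 -> why a 50% overhead barely worries anyone
```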


22 minutes ago, GoodBytes said:

Yes... that is if you have all the requirements for the FAA, which means that every single line of code and every single circuit trace on a chip and board is fully documented. In addition, it requires multi-year testing that can easily take 5+ years, as the company that validates the hardware is 100% responsible if a circuitry flaw or bug is later discovered in a chip. That doesn't include the software that will run on it. So now we are 10 years behind. And that is assuming Nvidia, AMD, Intel or any ARM SoC manufacturer wants to provide such information in the first place, as it also becomes public. Meaning, if Nvidia does it for the GTX 1080, you can bet Intel and AMD will copy ideas. So no manufacturer wants to go in, and if they do, it is always with completely outdated hardware, or because they have nothing to lose. Not to mention driver validation, which can only happen once everything is tested and approved, if it gets approved. Imagine waiting 4 years and 6 months into a 5-year test for validation, and then a bug is found. Now you not only have to fix it, but restart the whole process. Once you have everything, NOW you can bring it all to the FAA for certification.

I've heard horror stories, but I didn't know the exact timelines. I only know aviation tech people through secondary connections, and, oddly, the horrors aren't something they want to talk about. haha.


2 hours ago, Sniperfox47 said:

There's no reason for them to enable it for the consumer space.

The new Wolfenstein game takes advantage of VEGA's FP16 performance, so there's no reason why it shouldn't take advantage of FP16 on future GeForce cards if Nvidia allows it.

 

2 hours ago, Sniperfox47 said:

If VEGA is the architecture you're taking cues from, you're doing it wrong. Is VEGA good for prosumers? Sure. Is VEGA good for deep learning or datacenters?

It's actually really damn awesome for datacenters and deep AI learning. Arguably more suited to that than gaming.

2 hours ago, Sniperfox47 said:

Not in the slightest. The heat generation of it is insane, the power draw is insane, and all that to get not even the same performance as a Tesla P100, much less a Tesla V100. Unless RTG has some ace hidden up their sleeve for the enterprise market, they're in trouble.

Are you talking about the same VEGA architecture as I am?

 

The Radeon Pro WX 9100 annihilates the P100 and even the V100 in tests. AMD showed this off at SIGGRAPH (the only useful thing they talked about at SIGGRAPH).

2 hours ago, Sniperfox47 said:

And sure, but you could node-shrink Pascal too. And if you took GV100, cut out all the new enterprise features, and cut out all the stuff that was cut from GP100 to make GP102, then you'd basically wind up with a node-shrunk GP102...

 

Volta has zero benefits for consumers, outside of maybe dumb prosumers who are trying to run 10-20 rendering workloads on a single GPU all at once, where the QoS system would help. If that didn't get cut out.

It's not my fault that Nvidia can't design GPUs properly.

 

They should be aiming for better performance for their key market segments and they're currently only addressing business and not consumer. Consumer is a key part of their business.



What's the point of new GPUs anyway? 4K is a flop and not worth it. HDR isn't much; really, just turn contrast/saturation higher and you have "HDR". Unless you like snake oil, then go ahead and pay for it, just like premium HDMI cables; someone's going to make a buck off you.

Games are increasingly toxic/P2W: you buy the game three times and you still have microtransactions. The development of games isn't becoming any easier, that's a fact; it actually gets harder the more detail we want (I miss good old simple games that didn't cost $1B to make and asserted themselves through gameplay, coding and story).

What we actually need is more Vulkan/DX12, better AI systems, and better networked multiplayer games. What we don't need is more fancy graphics on pale P2W games with no essence and no gameplay.


56 minutes ago, AluminiumTech said:

It's not my fault that Nvidia can't design GPUs properly.

 

 

They should be aiming for better performance for their key market segments and they're currently only addressing business and not consumer. Consumer is a key part of their business.

Yes, it is your fault. You are the one not buying 16 graphics cards for your gaming rig.

Nvidia's profits from consumer products are small compared to the enterprise segment. And that focus has greatly benefited consumers, with more power-efficient chips that are better able to deliver great performance. Lower power consumption allows for higher clocks, which allows really powerful GPUs on the consumer market.

Nvidia's decision to focus on research and simulation back in ~2007 has really paid off for the company. ATI/AMD didn't, and look where they are. Their focus on consumer gaming cards prevented them from making power-efficient chips. Now they have hot chips that can't be clocked fast enough to be competitive with Nvidia. Instead of making a CUDA competitor, they focused on the "gaming friendly" OpenCL, where you use shaders instead of C to code computational models for various tasks (physics, simulation models, general video processing, AI, etc.). The only competitor to Nvidia in that field is Intel, and Nvidia's offering is very compelling over Intel's for many (not all) applications. This made AMD irrelevant in the fields Nvidia targeted.

 

You are angry because you don't have Volta now. But you do realize that the longer you wait, the greater the chance you get HBM2 memory as it drops in price, which lets Volta shine instead of being limited by GDDR5X. And you allow the software to catch up to the hardware. Already a 1080 runs a wide variety of games at max or near-max settings at super fast fps. So what is the rush?


20 minutes ago, AluminiumTech said:

The new Wolfenstein game takes advantage of VEGA's FP16 performance, so there's no reason why it shouldn't take advantage of FP16 on future GeForce cards if Nvidia allows it.

Congrats. RPM is used by a tiny handful of AMD partner games. We'll see what kind of benefits it offers at the end of the month when Wolfenstein II finally comes out, but don't hold your breath. Dropping precision in certain parts of your rendering engine while maintaining it in others is *incredibly* use-case sensitive if you don't want obvious degradation in visuals. There's a reason why games pushed for 16-bit and then 32-bit render precision in the first place. It's also not going to be widely adopted because handtuning your engine for it is a *ton* of work. I'm honestly surprised that even a relatively big company like ZeniMax would agree to go along with AMD on it.
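To make the precision point concrete, here's a minimal sketch (Python/NumPy, with made-up values, not id Tech's actual RPM code path) of the kind of error that creeps in when a render pass drops to FP16: small contributions that FP32 keeps get rounded away entirely.

```python
import numpy as np

# FP16 has ~10 bits of mantissa, so near 1.0 the representable step is ~0.001.
# A repeated small contribution (think subtle lighting/tone-mapping increments)
# survives in FP32 but is rounded away in FP16 -- visible as banding/flat shading.
step = np.float32(0.0004)

acc32 = np.float32(1.0)
acc16 = np.float16(1.0)
for _ in range(1000):
    acc32 += step               # FP32 keeps accumulating the small step
    acc16 += np.float16(step)   # each add rounds back to 1.0 in FP16

print(acc32)  # ~1.4
print(acc16)  # 1.0 -- the contribution was silently lost
```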

 

20 minutes ago, AluminiumTech said:

It's actually really damn awesome for datacenters and deep AI learning. Arguably more suited to that than gaming.

And please tell me how you get that? How is it scalable in any way shape or form, at the same density as even Pascal based Teslas? Keep in mind, it's driver certification, power efficiency and heat efficiency that matter in a Datacenter loadout, not raw compute power or consumer-level cost/performance.

 

20 minutes ago, AluminiumTech said:

The Radeon Pro WX 9100 annihilates the P100 and even the V100 in tests. AMD showed this off at SIGGRAPH (the only useful thing they talked about at SIGGRAPH).

It literally has 1/5 to 1/4 the INT8 TOPS that the V100 has... 4/5ths the FP32 performance... and 1/10th the FP64 (double-precision) performance of the GV100... at 230 Watts vs 250 Watts... What are you even talking about? Unless you're gaming on it (mostly FP32) it makes no sense for a datacenter. Either you're doing FP64 (scientific models, data analysis, etc.) or you're doing INT8 (neural network tensors).

 

It's more of a workstation card than a datacenter card.
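For rough context on where those headline figures come from, here's a back-of-the-envelope sketch of peak FP32 throughput from shader count and boost clock. The clocks are approximate published boost figures, and these are theoretical spec-sheet peaks, not measured performance:

```python
# Peak FP32 throughput = shaders x 2 ops/clock (fused multiply-add) x clock.
# Boost clocks below are approximate published figures -- treat as assumptions.
def peak_fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000.0

wx9100 = peak_fp32_tflops(4096, 1.50)  # Vega 10 based:   ~12.3 TFLOPS
v100   = peak_fp32_tflops(5120, 1.37)  # Tesla V100 PCIe: ~14.0 TFLOPS

print(f"WX 9100 ~{wx9100:.1f} TFLOPS vs V100 ~{v100:.1f} TFLOPS ({wx9100 / v100:.0%})")
# Vega 10 runs FP64 at 1/16 of its FP32 rate (~0.77 TFLOPS) while GV100 runs it at
# 1/2 rate (~7 TFLOPS), which is roughly where the "1/10th the FP64" figure comes from.
```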

 


3 minutes ago, yian88 said:

What's the point of new GPUs anyway? 4K is a flop and not worth it. HDR isn't much; really, just turn contrast/saturation higher and you have "HDR"

4K allows high-DPI displays to be made, enabling rich visuals with high detail: text that is easier to read, detailed, rich icons, and better AA. You can say it is silly all you want. Smartphones aren't 96 ppi for a reason... you don't see premium 720p phones. People really appreciate the high resolution.

 

HDR is more about wide gamut support and 1.07 billion color support. Your eyes can see a great number of colors, potentially trillions of colors... definitely way more than 16.7 million. Why limit yourself to washed-out, standard-gamut displays that can't display gradients smoothly (you see steps due to the lack of colors)? HDR aims to fix this. There are many standards of HDR, but that will clear up.

 

HDR is a fancy marketing term to define:

  • A panel that can reproduce, natively or via simulation (FRC), 10 bits per color channel to reach 1.07 billion colors (a quick check of that math is sketched below)
  • Wide gamut
  • Color space
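Just to sanity-check those color counts, a minimal sketch of the bit-depth arithmetic (bits per channel give the levels per channel; the full palette is that number cubed):

```python
# Bits per channel -> levels per channel -> total RGB palette size.
def palette_size(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel   # e.g. 256 levels at 8-bit, 1024 at 10-bit
    return levels ** 3               # combinations across R, G and B

print(f"{palette_size(8):,}")   # 16,777,216     -> the "16.7 million" of standard 8-bit
print(f"{palette_size(10):,}")  # 1,073,741,824  -> the "1.07 billion" of 10-bit HDR
```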

 

3 minutes ago, yian88 said:

Games are increasingly toxic/P2W: you buy the game three times and you still have microtransactions. The development of games isn't becoming any easier, that's a fact; it actually gets harder the more detail we want (I miss good old simple games that didn't cost $1B to make and asserted themselves through gameplay, coding and story).

So then vote with your wallet. So far, based on publishers' results, DLC and microtransactions are a big success. I see more and more games going that path.

 

 

3 minutes ago, yian88 said:

What we actually need is more Vulkan/DX12, better AI systems, and better networked multiplayer games,

And we are getting that. It just takes time.


11 minutes ago, Sniperfox47 said:

Congrats. RPM is used by a tiny handful of AMD partner games. We'll see what kind of benefits it offers at the end of the month when Wolfenstein II finally comes out, but don't hold your breath. Dropping precision in certain parts of your rendering engine while maintaining it in others is *incredibly* use-case sensitive if you don't want obvious degradation in visuals. There's a reason why games pushed for 16-bit and then 32-bit render precision in the first place. It's also not going to be widely adopted because handtuning your engine for it is a *ton* of work. I'm honestly surprised that even a relatively big company like ZeniMax would agree to go along with AMD on it.

 

And please tell me how you get that? How is it scalable in any way shape or form, at the same density as even Pascal based Teslas? Keep in mind, it's driver certification, power efficiency and heat efficiency that matter in a Datacenter loadout, not raw compute power or consumer-level cost/performance.

 

It literally has 1/5 to 1/4 the INT8 TOPS that the V100 has... 4/5ths the FP32 performance... and 1/10th the FP64 (double-precision) performance of the GV100... at 230 Watts vs 250 Watts... What are you even talking about? Unless you're gaming on it (mostly FP32) it makes no sense for a datacenter. Either you're doing FP64 (scientific models, data analysis, etc.) or you're doing INT8 (neural network tensors).

 

It's more of a workstation card than a datacenter card.

 

FLOPS, TOPS and whatever other marketing fluff are not good indicators of performance.



4 hours ago, GoodBytes said:

Maybe in some research lab. But FAA certification is so extensive... there is a reason why a plane sports 3DFX chips for its graphical interface, and not a GeForce GTX 1080.

 

Flying in the air is simple. Landing and take-off are where you need a pilot, and they are the most dangerous parts of flying. A highly trained pilot is needed, with quick thinking to manage all sorts of changing situations as the plane takes off and lands.

 

If AI in planes were a legit thing, trust me, airlines would ditch pilots on day 1.

Would you take a flight without a pilot?


4 minutes ago, AluminiumTech said:

FLOPS, TOPS and whatever other marketing fluff are not good indicators of performance.

You're absolutely right...

But until they actually come out (they were supposed to launch 4 weeks ago, in the second week of September per their release announcement), we can't go by anything other than specs, where they officially say there are no improved FP64 or INT8 capabilities... Wait, what's that? Those 4 weeks have already passed, and they're still not shipping? HBM2 delays, perhaps? And you wonder why Nvidia isn't shipping consumer cards with HBM2?

 

If you want to offer some kind of counterpoint with *any* actual supporting evidence I'm happy to listen to you, but so far all your points have basically been "Nvidia can't build a good GPU. VEGAs so much better. They just win at everything. Your evidence isn't evidence because it's theoretical performance, despite it being what both companies are marketing to their customers." If you don't want to support your arguments, I'll just drop it because this conversation is going nowhere.


24 minutes ago, GoodBytes said:

HDR is a fancy marketing term to define:

  • A panel that can reproduce, natively or via simulation (FRC), 10 bits per color channel to reach 1.07 billion colors
  • Wide gamut
  • Color space

HDR also needs to have a high static contrast ratio panel to work correctly, otherwise there's no point. HDR is more about getting static contrast ratios high than better coloring. Coloring just ensures accuracy and avoids ugly color banding that higher static contrast ratios will show.


Just now, M.Yurizaki said:

HDR also needs to have a high static contrast ratio panel to work correctly, otherwise there's no point. HDR is more about getting static contrast ratios high than better coloring. Coloring just ensures accuracy and avoids ugly color banding that higher static contrast ratios will show.

Except HDR LCDs still use dynamic contrast via local dimming rather than truly having good static contrast.


1 minute ago, M.Yurizaki said:

HDR also needs to have a high static contrast ratio panel to work correctly, otherwise there's no point. HDR is more about getting static contrast ratios high than better coloring. Coloring just ensures accuracy and avoids ugly color banding that higher static contrast ratios will show.

Well... yes and no. Yes for TVs, as most are Limited RGB and not Full RGB, and therefore can't reproduce great contrast. Side note: it is important to make sure that your game console and the graphics card in your PC are set to Full RGB if you connect them to a Full RGB TV or any computer monitor (which are all Full RGB).

And it is also partly about reproducing more colors (wide gamut and more shades).
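As a small illustration of that Limited vs Full RGB point (a sketch using the standard 8-bit video levels, not any particular TV's processing): limited range puts black at 16 and white at 235, so a source/display mismatch either crushes shadows and highlights or washes the image out.

```python
# Standard 8-bit "video" (limited) levels: black = 16, white = 235.
# Full (PC) range uses the whole 0-255 scale.
def full_to_limited(v: int) -> int:
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v: int) -> int:
    # Expanding back; anything outside 16-235 clips.
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
# Mismatch: feed limited-range video to a display expecting full range and
# "black" arrives as 16/255 -- dark grey, i.e. the washed-out contrast above.
```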


43 minutes ago, yian88 said:

What's the point of new GPUs anyway? 4K is a flop and not worth it. HDR isn't much; really, just turn contrast/saturation higher and you have "HDR". Unless you like snake oil, then go ahead and pay for it, just like premium HDMI cables; someone's going to make a buck off you.

4K is a flop... uh, what... you've never used a 4K panel, have you?

Also, HDR is a real thing and does make a difference, though on my OLED it makes little difference in quality. It's not just a contrast/saturation booster.


11 minutes ago, GoodBytes said:

Well... yes and no. Yes for TVs, as most are Limited RGB and not Full RGB, and therefore can't reproduce great contrast. Side note: it is important to make sure that your game console and the graphics card in your PC are set to Full RGB if you connect them to a Full RGB TV or any computer monitor (which are all Full RGB).

And it is also partly about reproducing more colors (wide gamut and more shades).

10-bit color only provides 1024 levels per color channel, so in theory color space alone isn't enough; it also comes down to how bright white actually is vs. how dark black actually is.

 

15 minutes ago, Sniperfox47 said:

Except HDR LCDs still use dynamic contrast via local dimming rather than truly having good static contrast.

I kind of count that as static contrast ratio, given Wikipedia's definition of the term:

Quote

Static contrast ratio is the luminosity ratio comparing the brightest and darkest color the system is capable of producing simultaneously at any instant of time

Whereas I would say dynamic contrast ratio is what the black level is at backlight setting 0 vs. white level at backlight setting 100. Or whatever the hell it's supposed to be.
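To pin down the two definitions being compared, here's a tiny sketch of the arithmetic: contrast ratio is just peak white luminance over black luminance, and the numbers below are made up purely for illustration.

```python
# Contrast ratio = white luminance / black luminance (both in cd/m^2, i.e. nits).
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# "Static": brightest and darkest the panel can show in the same frame,
# backlight state fixed (illustrative IPS-class numbers).
print(contrast_ratio(400.0, 0.40))    # 1000.0 -> 1000:1

# "Dynamic": white of a bright frame at backlight 100 vs black of a dark frame
# at backlight 0 (or a local-dimming zone nearly off) -- the marketing number.
print(contrast_ratio(600.0, 0.006))   # 100000.0 -> 100,000:1
```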


3 minutes ago, M.Yurizaki said:

10-bit color only provides 1024 levels per color channel, so in theory color space alone isn't enough; it also comes down to how bright white actually is vs. how dark black actually is.

Which is why HDR is much less noticeable on OLED panels than on LCDs. Blacks are black and peak white can get bright.


And just continuing on... in a semi-perfect world, you would have an LED for each LCD pixel. You'd feed the luminance of the pixel to the LED for lighting and the chroma to the LCD itself for color.

 

But at that point, you may as well just have RGB LEDs.
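A minimal sketch of that split, using the max channel as the backlight drive rather than true luma (an assumption for illustration, not how any shipping panel does it):

```python
# Split an RGB pixel into "how hard to drive the LED behind it" plus the
# per-channel transmittance left for the LCD cell to filter.
def split_pixel(r: float, g: float, b: float):
    # r, g, b in 0.0-1.0 (linear light)
    backlight = max(r, g, b)                 # drive level for the per-pixel LED
    if backlight == 0.0:
        return 0.0, (0.0, 0.0, 0.0)          # LED fully off: a true black, OLED-style
    lcd = (r / backlight, g / backlight, b / backlight)
    return backlight, lcd

print(split_pixel(0.8, 0.2, 0.1))  # (0.8, (1.0, 0.25, 0.125))
```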

 

Also, I am butthurt that Apple bought a company researching micro-LEDs, which sound like they would do what I described. And I'm butthurt because, knowing Apple, they'd probably hoard the technology patents for themselves.


5 minutes ago, M.Yurizaki said:

I kind of count that as static contrast ratio, given Wikipedia's definition of the term:

Whereas I would say dynamic contrast ratio is what the black level is at backlight setting 0 vs. white level at backlight setting 100.

Okay, fair enough. I've always thought of it as dynamic contrast, since it still doesn't represent the maximum contrast ratio between any given two pixels on the screen.

 

Just now, M.Yurizaki said:

And just continuing on... in a semi-perfect world, you would have an LED for each LCD pixel. You'd feed the luminance of the pixel to the LED for lighting and the chroma to the LCD itself for color.

 

But at that point, you may as well just have RGB LEDs.

isn't that basically the difference between LCD and Plasma/OLED? :P


11 minutes ago, M.Yurizaki said:

10-bit color only provides 1024 levels per color channel, so in theory color space alone isn't enough; it also comes down to how bright white actually is vs. how dark black actually is.

You can have a 1,000,000,000:1 static contrast ratio, but if your TV only supports 2 colors (basically a projector lamp pointed directly at your eyes, where you have two "colors": off and on/white), that contrast is nice and neat, but more colors also give you finer adjustment. To clarify, I am not excluding contrast; I wanted to complete what you were saying. Hence why I said "yes and no". Sorry for not being clear. Typed quickly.

 

 


7 hours ago, AluminiumTech said:

Hopefully Dr. Lisa Su gets RTG together and releases good products in 2018 which compete.

Raja has that front covered, and his brainchildren will be launching over the next few years, beginning with the next architecture.


