Next NVIDIA Line will be named "Hopper"

15 minutes ago, Bombastinator said:

Exactly. A quasi-arbitrary number only used in computers for a brief time.

A byte is still a fundamental unit of data though. I think it is the smallest data structure still used in programming, other than the bit itself. Bigger units are almost always multiples of 8.

 

Still, it makes an interesting point: what would the world be like if everyone used bits as the unit of measure, for example?

 

You should get 16GB of RAM for gaming.

You should get 128Gb of RAM for gaming.

 

A 256GB SSD is OK, but a 512GB one isn't much more expensive.

A 2Tb SSD is OK, but a 4Tb one isn't much more expensive.

 

And just to turn things around...

 

Everyone should run a 64-bit OS now.

Everyone should run an 8-byte OS now.

 

Ok, that last one feels bad. :D 

 

Edit: don't tell marketing departments about this. I think it is contained to networking for now, where connection speeds are quoted in bits rather than bytes, but bigger numbers are better, right?
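For anyone who wants to sanity-check the conversions above, here's a minimal sketch (Python, purely for illustration; the gigabytes_to_gigabits helper is made up for this post, not a real library function):

```python
# Quick sanity check of the GB <-> Gb comparisons above.
# 1 byte = 8 bits, so the same capacity simply gets a number 8x larger.

def gigabytes_to_gigabits(gigabytes: float) -> float:
    """Convert a bytes-based figure (GB) to a bits-based one (Gb)."""
    return gigabytes * 8

print(gigabytes_to_gigabits(16))    # 16 GB RAM   -> 128 Gb
print(gigabytes_to_gigabits(256))   # 256 GB SSD  -> 2048 Gb, i.e. ~2 Tb
print(gigabytes_to_gigabits(512))   # 512 GB SSD  -> 4096 Gb, i.e. ~4 Tb
print(64 / 8)                       # a 64-bit OS -> 8 bytes per word
```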

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 minute ago, porina said:

Everyone should run a 64-bit OS now.

Everyone should run an 8-byte OS now.

 

Ok, that last one feels bad. :D 

lol hurts to read


1 hour ago, leadeater said:

lol hurts to read

How about the Intel 8080 being a one-byte CPU, or the SNES being a two-byte console?


51 minutes ago, Dylanc1500 said:

How about the Intel 8080 being a one-byte CPU, or the SNES being a two-byte console?

There was a time before 8-bit computing. It was 4-bit once.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


57 minutes ago, Dylanc1500 said:

How about the Intel 8080 being a one-byte CPU, or the SNES being a two-byte console?

I always eat my biscuits in one byte


Hmm, I wonder how long until Nvidia's naming scheme catches up to people who are still alive. Looking forward to a Carmack or (if Nvidia's narcissistic enough) a Huang line.


15 hours ago, Nicnac said:

[Image: NVIDIA "Hopper" GPU trademark filing]

 

OmNiVeRsE might be my favorite :P

Wicked, six new models of the 2060 incoming. 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


8 hours ago, porina said:

A byte is still a fundamental unit of data though. I think it is the smallest data structure still used in programming, other than the bit itself. Bigger units are almost always multiples of 8.

 

Yeah, when people are talking colloquially about something IRL it's not really used, but down at the technical level it's used all the freaking time.


On 12/12/2019 at 4:11 PM, leadeater said:

The problem with that, which also affects SLI/Crossfire, is that it breaks post-processing effects, certain lighting techniques and a few other things. That's why for a long time now SLI/Crossfire has used AFR, as it is the most compatible and also the simplest to implement.

Assuming that a chiplet-based setup can access the same memory pool, this is a moot point because the chiplets could write to and reference the same render target.


5 hours ago, Mira Yurizaki said:

Assuming that a chiplet-based setup can access the same memory pool, this is a moot point because the chiplets could write to and reference the same render target.

Well it's not, because memory bandwidth in GPUs is hundreds of gigabytes per second, and currently we don't have anything that would link chiplets and memory stacks together to achieve that. You can do chiplets easily when there is no cross-memory dependency, or only a low bandwidth requirement, which is why it's going to happen with compute first rather than graphics: there you can find workloads that tick one or both of those boxes.
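As a minimal sketch of the AFR point quoted above (illustrative only; the device names and assignment function are hypothetical, not a real graphics API): under AFR each frame is rendered wholly by one GPU, so devices never have to share a render target, which is exactly where the cross-device bandwidth problem being discussed here would otherwise bite.

```python
# Minimal sketch (illustrative only): Alternate Frame Rendering hands whole
# frames to GPUs round-robin, so no frame is ever split across devices and
# no render target has to be shared over a chiplet/SLI link.

GPUS = ["gpu0", "gpu1"]  # hypothetical devices

def assign_afr(frame_index: int) -> str:
    """Pick the GPU that renders this frame under AFR (round-robin)."""
    return GPUS[frame_index % len(GPUS)]

for frame in range(6):
    print(f"frame {frame} -> {assign_afr(frame)}")

# Split-frame or chiplet approaches instead have several devices writing into
# the same render target, which is where the shared-memory bandwidth problem
# discussed above comes in.
```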

