
B.J.Blazkowicz brings the HDDpocalypse

WereCatf
2 minutes ago, Loote said:

I'm picking a bone with 'HDDpocalypse' and the implication that HDDs becoming even worse primary storage for games has a big influence on HDDs' relevance in general. IMO it's another nail in the coffin, but nothing that can make a big dent on its own. Games aren't everything.

Wow, you sure are overreacting to this. It was wordplay, and it was in the context of gaming, not storage in general. No one claimed anywhere that HDDs are now suddenly useless for everything.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


1 minute ago, WereCatf said:

Wow, you sure are overreacting to this. It was wordplay, and it was in the context of gaming, not storage in general. No one claimed anywhere that HDDs are now suddenly useless for everything.

Yeah, I admit I am fed up with xxxpocalypses in the media, because every time I open such news it turns out to have been clickbait. Just from this thread's title I assumed some hacker with the handle bjblazkowicz had found a vulnerability that can destroy data on HDDs from both manufacturers, as that would truly be an HDDpocalypse. In the end we just get confirmation that what was expected to start happening once the new consoles were out is in fact starting to happen.


4 hours ago, WereCatf said:

Summary

 

Mechanical HDDs are slow, necessitating various techniques to get loading times down in games, including e.g. duplication of assets, which causes games to be larger than they technically need to be. Another common technique is the introduction of "chokepoints" in level design, like making players wait a few moments in an elevator, or crawl very slowly through some passages, so assets can be loaded from the HDD in the background. Alas, with the latest Xbox and PlayStation consoles ditching HDDs and moving to fast NVMe storage, these techniques can be ditched and games can be designed around far faster access to assets, and that's exactly what MachineGames have announced they're doing for Wolfenstein 3 in a blog post.
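To see why asset duplication exists at all, here's a toy back-of-envelope model (all figures are assumptions, not from the article): loading a level's assets when they're scattered across the disk (one seek per asset) versus duplicated into one contiguous per-level pack (one seek, then pure sequential reads).

```python
# Toy model: estimate level-load time on an HDD when assets are
# scattered (one seek each) vs. duplicated and packed contiguously.

SEEK_MS = 9.0      # assumed average seek + rotational latency
SEQ_MBPS = 150.0   # assumed sustained sequential read speed

def load_time_ms(num_assets, asset_mb, seeks):
    """Total seek cost plus pure transfer time for the level's assets."""
    transfer_ms = (num_assets * asset_mb) / SEQ_MBPS * 1000.0
    return seeks * SEEK_MS + transfer_ms

scattered = load_time_ms(2000, 0.5, seeks=2000)  # every asset costs a seek
packed    = load_time_ms(2000, 0.5, seeks=1)     # duplicated, stored in order

print(f"scattered: {scattered / 1000:.1f} s")  # seek-dominated
print(f"packed:    {packed / 1000:.1f} s")     # transfer-dominated
```

With these assumed numbers, seeks alone add roughly 18 seconds to the scattered case, which is exactly the overhead duplication trades disk space to avoid.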

 

[Image: hddpocalypse]

 


 

My thoughts

As a fan of the Wolfenstein franchise and as a PC gamer, I can't help but be rather excited to see how the game changes now that the constraints put on game design by HDDs are removed. I haven't yet heard of any other PC game ditching HDD-based design, but with Wolfenstein 3 as the first or among the first, I predict we'll hear similar news about other games shortly. It looks like NVMe drives may become a necessity for AAA titles in modern gaming PCs, finally pushing HDDs all the way down to the bottom, just for use as plain bulk storage.

 

Sources

https://bethesda.net/en/article/6TiCksNgkXvYTWgAEuWHd0/nojs.html

Phew! I was really unsure whether to get an SSD or a HDD, now I know I need a
10,000 RPM DOUBLE STACKED 20TB Shingled Magnetic Recording WD HDD


3 minutes ago, Justaphysicsnerd said:

10,000 RPM

[Image: Over 9000]



26 minutes ago, WereCatf said:

What you are talking about requires you to already have that asset in memory, whereas duplication of assets happens at the storage layer, before anything is in memory -- two entirely different things. Duplication at the storage layer is done to reduce seek times and increase the share of sequential reads, ergo being more HDD-friendly. HDDs SUCK at seeks, but they're okay at sequential reads, where they don't have to seek constantly.

I know how HDDs work. And this still makes no sense. Game assets are mostly loaded at level-load time, meaning they have to end up in memory. If an asset is accessed during data streaming, it's accessed once and then cached in memory, unless the streaming logic evicts it from the cache because it becomes stale. Duplicating it solves nothing about seeks; you still need to physically seek it on the platters. Seeking it on a HDD from 2 different locations can't possibly be faster than seeking it once, caching it in memory, and reading it from memory 50 billion times, because memory has nanosecond latency. And parsing it directly from the HDD from multiple duplicated copies for some fucked up reason is the dumbest thing I've ever heard. And if it's not cached, then wtf?! What is this, 1992?

 

That's like me writing the code for the same function 16 times consecutively instead of just looping and referencing the same code segment 16 times.


2 minutes ago, RejZoR said:

And if it's not cached, then wtf?! What is this, 1992?

Games are designed with the lowest common denominator in mind, not the highest. You can't cache all the assets of a typical AAA title in just 4GB of RAM, for example. Besides which, yes, many games just toss out all the cached data and start from scratch whenever there's a loading screen, because they've been designed with consoles in mind. I'm not arguing with you about the stupidity of it; I'm just stating how things are.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


Games have and use multiple textures for the same objects (why use a 512x512 texture to draw the side of a building when the user is looking at it from a mile away and it covers 1 cm of the screen?) and multiple LODs (levels of detail).

Depending on the graphical settings chosen, when you load the level the game may seek the texture in a file, extract it, and then downsample it before it's uploaded to the video card's VRAM, to reduce fragmentation and other issues (why upload a 1024x1024 texture if the player runs at 1080p on medium-high quality with a 3-4 GB VRAM card, where 512x512 is enough?). That's where Nvidia RTX IO and DirectStorage would help: you don't need to decode and downsample the image first; you upload the big texture to VRAM and the hardware decoders downsample it to the wanted format.
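The "which resolution do I actually need" decision above can be sketched as picking a mip level: each mip halves the resolution, and you want the smallest one that still covers the texture's on-screen footprint. This is a hypothetical illustration, not any engine's actual heuristic.

```python
# Hypothetical mip-level selection: given a texture's full resolution and
# its approximate size on screen, pick the smallest mip (each level halves
# the resolution) that still covers the on-screen footprint.
import math

def pick_mip(full_res: int, on_screen_px: int) -> int:
    """Return mip index: 0 = full resolution, higher = smaller."""
    if on_screen_px >= full_res:
        return 0
    # number of halvings before resolution would drop below on_screen_px
    return int(math.floor(math.log2(full_res / max(on_screen_px, 1))))

# A 1024x1024 texture on a building ~100 px tall: mip 3 (128x128) suffices.
print(pick_mip(1024, 100))
```

The point of hardware-assisted decoding is that the GPU can do this downsampling after a raw upload, instead of the CPU doing it before.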

 


Is this really news? It was known that faster media would lead to better design opportunities; literally every dev is doing this, and any dev on PC already was for probably 10+ years. I don't think anyone has been duplicating assets on PC in at least that long.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


Weird title, I thought it was something like a 500GB game or something. Either way, SSDs will definitely be getting utilized more and more. Even without dedicated I/O hardware optimizations, some games really benefit from them a lot. Still, a multi-TB SSD is very expensive compared to an HDD, and games are getting huge too. SSD prices will definitely need to keep going down as capacities increase.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Sounds good, now if only they could make the protagonist for their next installment of Wolfenstein be a likeable character.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


9 hours ago, WereCatf said:

True, but the counter-argument -- and a rather strong one at that -- is that you shouldn't rely on being able to recover your data that way in the first place. Simply keep backups of everything you consider important.

Exactly. I have been trying to make this point recently to someone who is building a NAS. They have many HDDs' worth of data and insist on using redundancy in their build, which isn't bad per se, but they think that having redundancy is as good as having backups... I have been banging my head against a wall trying to explain why that is a bad idea! They say they haven't got the money for backups, so I say "well, only back up the stuff you want to keep and consider important; anything else is basically an accident waiting to happen". He might get lucky and never lose that many disks at once, but I wouldn't risk it personally.

Please quote my post, or put @paddy-stone if you want me to respond to you.



4 hours ago, RejZoR said:

Game assets are mostly loaded at level load time.

Especially in open-world environments, it isn't all loaded (and I would imagine in a lot of modern games it's actually not all loaded; that's why pop-ins exist). That means each section of the map has to be read in and loaded to memory as it's approached, and that is where duplication comes into play. Storing all the assets a section needs together means a miss on an asset that isn't loaded yet won't have huge consequences. It's not about the asset being loaded multiple times within a small area, but rather the asset being duplicated across different zones. [And given that a zone is loaded on proximity, the seek times of the HDD make it detrimental to have an asset stored in just one place.] E.g. try copying 100 1MiB files on a HDD, then try copying one 100MiB file; even a few assets strewn about in odd places will slow down the loading of a zone.

 

In short, assets are, I believe, typically loaded only once into memory, but as zones are brought in, they need assets that aren't currently in memory (and thus each zone carries a duplicate of the assets it needs, to keep the time to bring the zone into memory to a minimum).
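The per-zone packing described above can be sketched as follows. The zone and asset names are made-up example data, not from any real game: each zone's pack file gets its own copy of every asset it needs, even when another zone already contains that asset, so loading a zone becomes one contiguous read.

```python
# Hypothetical sketch of per-zone asset packing: duplicates are allowed
# across zones so each zone can be read as one sequential stream.

zone_assets = {            # assumed example data
    "forest": ["tree", "rock", "grass"],
    "cave":   ["rock", "stalactite"],
    "lake":   ["rock", "grass", "water"],
}

asset_bytes = {"tree": 40, "rock": 25, "grass": 10, "stalactite": 30, "water": 15}

def build_packs(zones):
    """Return {zone: [(asset, offset)]}, as if written contiguously on disk."""
    packs = {}
    for zone, assets in zones.items():
        offset, layout = 0, []
        for name in assets:           # a duplicate copy per zone that needs it
            layout.append((name, offset))
            offset += asset_bytes[name]
        packs[zone] = layout
    return packs

packs = build_packs(zone_assets)
print(packs["cave"])   # "rock" gets its own copy here, besides the forest one
```

Here "rock" ends up stored three times, which is exactly the space-for-seek-time trade being discussed.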

 

4 hours ago, RejZoR said:

That's like me writing the code for the same function 16 times consecutively instead of just looping and referencing the same code segment 16 times.

Not that the analogy fits what is actually going on... but inline functions exist for a reason. It can be quicker to inline a function (at the cost of executable size) than to pay for a function call.
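The call-overhead point is about compiled code, but it's easy to make it measurable even in Python. This is just a rough illustration: the same arithmetic done through a function call on every iteration versus written inline in the loop body.

```python
# Rough illustration of function-call overhead: identical work, with and
# without a call per loop iteration.
import timeit

def add(a, b):
    return a + b

def with_calls(n=100_000):
    total = 0
    for i in range(n):
        total = add(total, i)     # one function call per iteration
    return total

def inlined(n=100_000):
    total = 0
    for i in range(n):
        total = total + i         # same work, no call overhead
    return total

assert with_calls() == inlined()            # identical results
print(timeit.timeit(with_calls, number=10))
print(timeit.timeit(inlined, number=10))    # typically faster
```

An optimizing compiler makes this same trade automatically when it inlines a small function: a bigger binary in exchange for skipping the call.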

3735928559 - Beware of the dead beef


6 hours ago, WereCatf said:

With the above in mind, I don't really know what RTX IO, specifically, brings to the table over Direct Storage, or if it's just some stupid marketing-scheme again.

DirectStorage is the API. RTX IO is simply Nvidia's implementation of that API in hardware.

 

See here: https://www.nvidia.com/en-us/geforce/news/rtx-io-gpu-accelerated-storage-technology/

Quote

How NVIDIA RTX IO Works

 

NVIDIA RTX IO plugs into Microsoft’s upcoming DirectStorage API, which is a next-generation storage architecture designed specifically for gaming PCs...

 

Remember to either quote or @mention others, so they are notified of your reply


6 hours ago, Loote said:

Yeah, I admit I am fed up with xxxpocalypses in the media, because every time I open such news it turns out to have been clickbait. Just from this thread's title I assumed some hacker with the handle bjblazkowicz had found a vulnerability that can destroy data on HDDs from both manufacturers, as that would truly be an HDDpocalypse. In the end we just get confirmation that what was expected to start happening once the new consoles were out is in fact starting to happen.

Clickbait is cancer; modern media needs some serious renovation. Taking things to the extreme for clicks and views isn't healthy.

5 hours ago, RejZoR said:

I know how HDDs work. And this still makes no sense. Game assets are mostly loaded at level-load time, meaning they have to end up in memory. If an asset is accessed during data streaming, it's accessed once and then cached in memory, unless the streaming logic evicts it from the cache because it becomes stale. Duplicating it solves nothing about seeks; you still need to physically seek it on the platters. Seeking it on a HDD from 2 different locations can't possibly be faster than seeking it once, caching it in memory, and reading it from memory 50 billion times, because memory has nanosecond latency. And parsing it directly from the HDD from multiple duplicated copies for some fucked up reason is the dumbest thing I've ever heard. And if it's not cached, then wtf?! What is this, 1992?

 

That's like me writing the code for the same function 16 times consecutively instead of just looping and referencing the same code segment 16 times.

If there weren't duplication, the drive would constantly be seeking back for some really small file and performance would tank. You don't always start from the same point, so you can't assume it will already be in RAM/VRAM, since one can usually start the game directly at any level.

 

Regarding the bolded part: funny you should say that, as that (loop unrolling) is one of the things one can do to make a program faster.


5 hours ago, RejZoR said:

I know how HDDs work. And this still makes no sense. Game assets are mostly loaded at level-load time, meaning they have to end up in memory. If an asset is accessed during data streaming, it's accessed once and then cached in memory, unless the streaming logic evicts it from the cache because it becomes stale. Duplicating it solves nothing about seeks; you still need to physically seek it on the platters. Seeking it on a HDD from 2 different locations can't possibly be faster than seeking it once, caching it in memory, and reading it from memory 50 billion times, because memory has nanosecond latency. And parsing it directly from the HDD from multiple duplicated copies for some fucked up reason is the dumbest thing I've ever heard. And if it's not cached, then wtf?! What is this, 1992?

Depends on the game. Take a game like The Witcher 3 or GTA V: large open-world games where new assets are constantly streamed into memory and old ones discarded as you move around. It absolutely makes sense to reduce load times as much as possible.

 

If you need a bunch of assets right now, you don't want to wait an additional 4-10 ms every time the read head needs to seek to the next file. You want the files next to each other so you can read them as one consecutive stream. Otherwise your CPU spends additional time just waiting for the read to begin before it can even start to decompress the data that is coming in. This may be the difference between noticeable stuttering when entering a new area and a smooth transition.
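To put rough numbers on that (assumed figures, not measurements): at 60 FPS a frame budget is about 16.7 ms, so a handful of HDD seeks eats most or all of a frame's budget if the engine ends up waiting on them.

```python
# Back-of-envelope: how many frames' worth of time do N seeks cost?

FRAME_BUDGET_MS = 1000.0 / 60.0      # ~16.7 ms per frame at 60 FPS
SEEK_MS = 7.0                        # assumed average seek, middle of 4-10 ms

for seeks in (1, 2, 5, 20):
    stall = seeks * SEEK_MS
    frames = stall / FRAME_BUDGET_MS
    print(f"{seeks:>2} seeks -> {stall:5.1f} ms stall (~{frames:.1f} frames)")
```

Even 5 scattered files cost roughly two frames of pure waiting, which is why packing them into one consecutive stream matters.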

 

5 hours ago, RejZoR said:

That's like me writing a code for a program for same function 16 times consecutively instead of just looping it back and referencing it to the same code segment 16 times.

It would be silly to (still) do that by hand, true. However, an optimizing compiler may well unroll a loop if it deems it beneficial to performance.



If SSDs came down in price to match, or even come close to, how cheap HDDs are, that would be nice. Otherwise SSDs aren't going to dominate like people think they will; cost is more important than performance. No one is going to store their library of movies or music on an SSD at the moment. For one thing, there is barely any reason to: even playing 8K video doesn't need an SSD, and it's simply not cost-effective to do so.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


10 hours ago, CircleTech said:

Except that telling someone to backup their data is completely useless if it's happened after the fact. Which is the case of all data recovery.

That's why you back up important things BEFORE there is a need for data recovery. There really isn't an excuse not to have some kind of backup for important data in this day and age; this has been common sense for decades.
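For anyone without any backup at all, even something this simple beats nothing. A minimal sketch (the paths are hypothetical, and a real setup should add verification and rotation): copy an important directory to a dated folder on a second drive or NAS, so recovery never depends on the original drive surviving.

```python
# Minimal dated-backup sketch: copy a source directory into
# <dest_root>/<YYYY-MM-DD>/<source name>.
import shutil
from datetime import date
from pathlib import Path

def backup(src: str, dest_root: str) -> Path:
    """Copy src into dest_root/<today>/<src name>; return the new path."""
    target = Path(dest_root) / date.today().isoformat() / Path(src).name
    shutil.copytree(src, target)   # creates intermediate directories
    return target

# Hypothetical usage, e.g. a second drive or NAS mount:
# backup("/home/me/documents", "/mnt/backup")
```

Unlike RAID redundancy, this survives accidental deletion, since yesterday's copy is untouched by today's mistake.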


sure, let me just replace my

 

6TB HDD

4TB HDD

2TB HDD

 

with SSDs. now let's just do a price check....awesome, only $1500....

🌲🌲🌲

 

 

 

◒ ◒ 


On 10/19/2020 at 12:13 AM, CircleTech said:

Tell the HDDs not to let the door hit them on the way out. 

 

But there is one good thing about HDDs beyond just low-cost storage: data recovery. As of today, it is much easier to recover data from a dead HDD than from an SSD, because the technology is at this point very well understood and the industry is heavily consolidated into a few players. With many SSD manufacturers using a variety of controller, caching, redundancy, and storage-geometry configurations, standard data recovery becomes a lot harder.

It's pretty much impossible to recover data from an SSD, doubly so for NVMe drives. TRIM pretty much destroys any possibility of recovering data that is accidentally deleted. You still stand a chance with drives that were simply immersed in water, or where some unlucky user dropped their laptop off the back of a truck, but those are edge cases.

 

If you want to ensure data is destroyed on a SSD beyond saving, into the paper shredder it goes. 

 

Full drive encryption also pretty much means no recovery without recovering the device it was part of.

 

 

