
Chaftalie

Member
  • Posts

    116
  • Joined

  • Last visited

Reputation Activity

  1. Like
    Chaftalie reacted to mariushm in Samsung 860 Evo 2TB and Samsung 860 QVO 2TB   
    The Crucial MX500 uses a different kind of TLC memory (Samsung manufactures its own memory chips; Crucial uses TLC made by Micron), which is slightly lower quality than the memory Samsung uses in its drives.
    Also, the Crucial uses a cheaper controller (Silicon Motion SM2258), which isn't as smart as the Samsung controller, so it's not as good at writing data to optimal places in order to extend the life of the drive.
    That's why Crucial only guarantees 700 TB of writes for its 2 TB models.
     
    ALL drives can be used for daily file downloads, rendering videos, anything... I'm just saying you have to be aware of these "lifetime writes" numbers.
    When you're rendering or capturing videos from games, you're often writing large amounts of data. For example, I often capture 1 TB worth of data in a few hours of gameplay, because I like to capture in a lossless format and then compress it at the highest quality. In such a scenario, where I capture 1 TB of data every day, the drive would wear out in less than 2 years (at 700 TB of rated writes, that's about 700 days), much less than the drive's 5-year warranty.
    If you want very long endurance, a drive that will last you a long time, you should look for one that uses MLC memory. However, those are quite rare these days, and expensive.
     
    For example, the Samsung 860 PRO 2 TB uses MLC and has a 2400 TB endurance rating (twice as much as the TLC version) but costs $500: https://www.amazon.com/Samsung-Inch-SATA-Internal-MZ-76P2T0BW/dp/B07879KC15/
    Or, the ADATA SU900 uses MLC memory and has a lifetime-writes rating of 1600 TB: https://www.amazon.com/ADATA-Ultimate-SU900-SU900SS-Internal/dp/B07473GKD1/
     
    My opinion is pretty much this: even if you write 1 TB to the drive each day, that 2 TB drive will still take more than 3 years to wear out (1200 TB / 1 TB per day = 1200 days)... and 2-3 years from now you'll probably be able to buy a 4-6 TB SSD for the equivalent of today's $100-200, so you really won't care about it.
    It's like being sorry today that a 250-500 GB mechanical drive you bought 3-5 years ago is about to die on you. 500 GB is so little that you won't care, and you won't mind buying a replacement; it has served you well for 3-5 years.
     
    I'd go with the 860 EVO because it will have more consistent performance and it's a proven drive, used by loads of people... QLC is still relatively new, and I wouldn't be comfortable with its performance quirks.
     
    PS: Please stop with the private messages/emails; I don't answer those. Be patient and wait for answers on the forum.
  2. Like
    Chaftalie reacted to mariushm in Samsung 860 Evo 2TB and Samsung 860 QVO 2TB   
    Well, you can simply do the math.
    Assuming a game is 25 GB on average, if you install 40 games each week, you'll write 40 x 25 GB = 1000 GB (~1 TB) each week.
    Therefore, you'll eat through those 1200 TB in 1200 weeks, or about 23 years (assuming 52 weeks in a year).
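    If you want to replay this arithmetic with your own numbers, here's a minimal Python sketch; the 1200 TB rating and the 40 x 25 GB weekly workload are the figures from this thread, not measurements:

        def weeks_until_worn(rated_tbw_tb, gb_written_per_week):
            """Weeks until the drive's rated total-bytes-written budget is spent."""
            return rated_tbw_tb * 1000.0 / gb_written_per_week

        # 40 installs of ~25 GB each per week, against a 1200 TB rating:
        weeks = weeks_until_worn(1200, 40 * 25)
        print("%.0f weeks ~= %.0f years" % (weeks, weeks / 52))  # 1200 weeks ~= 23 years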
     
    You're only writing a game once, so you only write those 25 GB (or whatever amount of data the game uses) once. Reading data from the memory chips is free; it doesn't wear out the memory cells.
     
    Note that these calculations are only valid if there's plenty of empty space on the drive. The fuller the drive is, the less accurate that 1200 TB number becomes. For example, if you fill 1.8 TB of those 2 TB with stuff and then keep installing and uninstalling games in the remaining 200 GB, the endurance may drop to something like 1000 TB over the drive's life.
    The SSD is smart enough to constantly move data around inside the memory cells and write new data into cells that are less "abused"; it tries to keep the wear level of each memory chip at around the same percentage as all the others. For example, TLC memory cells can generally be erased 200-400 times before they become unusable, so the controller keeps track of how much each memory chip (and each portion of each chip) has been used, and picks less-used areas to put new data into.
    But under certain conditions, such as when there's little room available, it's forced to choose less optimal locations for the new data, because otherwise it would simply take too much time to work out where to put it and the speed would drop.
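    As a toy illustration of that wear-leveling idea (the 300-erase limit and the block count below are made-up numbers within the 200-400 range mentioned above; real controllers are far more sophisticated):

        # Toy wear-leveling model: track an erase counter per block and always
        # write into the least-worn block that still has life left.
        ERASE_LIMIT = 300            # illustrative, within the 200-400 range above
        erase_counts = [0] * 1024    # one counter per erase block

        def pick_block_for_write():
            usable = [i for i, n in enumerate(erase_counts) if n < ERASE_LIMIT]
            if not usable:
                raise RuntimeError("all blocks worn out: the drive turns read-only")
            block = min(usable, key=lambda i: erase_counts[i])
            erase_counts[block] += 1  # writing fresh data costs an erase cycle
            return block

        for _ in range(5000):
            pick_block_for_write()
        print(max(erase_counts) - min(erase_counts))  # prints 1: wear stays uniform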
     
     
    Your math is wrong ... 3200 GB per week / 7 = ~0.46 TB per day,
    so it's 1200 TB / 0.46 TB per day = ~2625 days / 365 days in a year = ~7 years.
     
  3. Like
    Chaftalie reacted to mariushm in Samsung 860 Evo 2TB and Samsung 860 QVO 2TB   
    So, to explain it in simpler words:
     
    There are different kinds of flash memory chips, based on different technologies: SLC, MLC, TLC, QLC.
    SLC, on the left, is the most expensive to manufacture, but has high performance and high endurance.
    The further you go to the right, the cheaper the chips are to manufacture (because they pack more bits into the same area), but the downside is that you lose a bit of performance and endurance.
    The 860 EVO uses TLC and has an endurance of 1200 TB. That means you can write data to it as much as you want, but after 1200 TB of data has been written to the drive (in the real world probably a few tens to hundreds of TB past that threshold, since these numbers are conservative), some memory cells may become unable to store new information and turn read-only... so from that point on, you will gradually lose the ability to store new data on the drive.
    The 860 QVO uses QLC memory. It's cheaper to make and packs more bits, but the trade-off is lower endurance: the 2 TB drive can only handle 720 TB instead of 1200 TB.
     
    QLC is also slower when it comes to writing data, so Samsung resorts to some tricks to improve speed. As long as there's empty space on the SSD, it configures a part of the memory chips in SLC mode, which allows it to quickly write incoming data into those portions of the chips; then, when the drive is idle, it slowly moves the data from those SLC portions to more permanent locations, emptying those areas and making them available for new incoming data.
    In the case of the 2 TB model, this SLC portion is adjusted dynamically between 6 GB (when the drive is nearly 100% full) and 78 GB (when there's plenty of free space).
     
    So, for example, let's say you install a huge game, like Fallout 4 with the Ultra HD textures, basically a 90 GB game. The SSD will start writing that data as it comes in, using those 78 GB of SLC-mode memory at very high speeds, say up to 520 MB/s... and once the 78 GB (or whatever amount was in SLC mode at that point) is used up, the drive is forced to start writing into the memory cells configured in QLC mode, so the remaining 12 GB will be written at slower speeds, around 160 MB/s.
    Once the installer finishes writing those 90 GB, the drive will work in the background and slowly transfer those 78 GB into the QLC memory at 100-160 MB/s, so they'll be moved over the next 78,000 MB / 100 MB/s = 780 seconds, or roughly 10-15 minutes.
    As soon as portions of that 78 GB of SLC memory are available, the drive can start reusing them for caching writes.
    From the operating system's point of view, everything is transparent.
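    Here's a back-of-the-envelope model of that 90 GB install, using only the figures quoted above (78 GB SLC cache, 520 MB/s SLC writes, ~160 MB/s direct QLC writes, ~100 MB/s background flush); these are the post's assumed numbers, not benchmark results:

        install_gb   = 90     # Fallout 4 + Ultra HD textures, from the example above
        slc_cache_gb = 78     # dynamic SLC cache size when the 2 TB QVO is mostly empty
        slc_mb_s, qlc_mb_s, flush_mb_s = 520, 160, 100

        burst_s = slc_cache_gb * 1000 / slc_mb_s                 # fast phase into SLC
        spill_s = (install_gb - slc_cache_gb) * 1000 / qlc_mb_s  # overflow straight to QLC
        flush_s = slc_cache_gb * 1000 / flush_mb_s               # idle-time SLC -> QLC move

        print("foreground write: %.0f s" % (burst_s + spill_s))  # ~225 s
        print("background flush: %.0f min" % (flush_s / 60))     # ~13 min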
     
    The read speeds are not affected; QLC is just as fast as TLC there. The limitation is the SATA interface, which caps transfer speeds at around 500 MB/s.
     
    So the EVO is better and has higher endurance, but if you're just going to use the SSD for storing games (so you write very little: you write the game once and then launch it multiple times), then QLC would be OK.
    You don't want to use QLC as a drive you write to a lot, for things like daily file downloads, rendering videos, or capturing game footage, because you'll burn through those 720 TB of life fast.
     
     
     
  4. Like
    Chaftalie got a reaction from DesolationGod in My games and computer crashes.   
    First download the driver, disconnect from the internet so Windows Update doesn't do anything crazy, then run the program, then install the driver (;
    I would use the program DDU. It's more work, but the result has a higher chance of success.
  5. Like
    Chaftalie reacted to Bulldogge Builds in Salvage   
    So, my dear friends, here we go again!

    My last project went up in smoke, since the company I had contacted for help with creating the vital parts of my scratch build promised more than they could keep. So here I was, with all these parts I had received for that project and nothing to show for it. I had a discussion with be quiet! and explained the situation, told them I wanted to do anything to make up for all this and make the best of this crappy situation. Luckily, they were kind enough to send me a Dark Base 700 so I could make a new project with all these parts... So I decided to name this project "Salvage", which I thought was suitable.

    My idea for this project is a clean, all-air-cooled build in all its "simplicity": open up and mod parts for better airflow. Since all the parts are close to inaudible, noise won't be an issue. Despite having "access" to RGB on almost every component in this project, I've decided to go with the classic red and black theme.

    Components I will be using for this project:
      • Case - be quiet! Dark Base 700
      • Mobo and CPU - MSI Gaming M7 AC and Ryzen 7 2700X
      • PSU - be quiet! Straight Power 11 850W
      • GPU - MSI 1050 Ti Gaming X 4GB
      • SSD and RAM - Teamgroup Delta RGB SSD 250GB and Teamgroup Delta RGB 3000MHz
      • CPU cooler - be quiet! Dark Rock Pro 4
      • Fans - be quiet! Silent Wings 3

    I've already started modifying the case quite a bit, but I still hope you'll want to follow me through the final parts of this project.

    Once again, I'd like to send a huge thanks to my sponsors for their continued support and confidence in me despite the debacle with the last project!
  6. Like
    Chaftalie reacted to Bulldogge Builds in Salvage   
    As stated, I've already made some modifications by now. I planned to cut an opening in the top and in the front of the case, and I had some parts made for this. My plan was to use a similar design for both the front and the top... but I realized that the front was designed in a way that made cutting the entire piece, like I did with the top, a no-go. So I'll be going with a solid front and come up with something else for that part... pictures of my solution will come in a future update.
  7. Like
    Chaftalie reacted to Bulldogge Builds in Salvage   
    I thought it would be a waste not to use that nice-looking piece for the front. So I took the front part of my old project and cut it so I can use it here instead. The PSU shroud is cut open, and I've made a piece of acrylic that will be lit up with the name of the project on it. And I've of course cut the top open so I can use that nice-looking top piece...
  8. Informative
    Chaftalie got a reaction from Kenny.h in Win 7 or upgrade to Win 10   
    Windows 7 officially won't support the 8700K. It will still function, but you will get a popup that's nearly as big as your screen saying that your CPU is not supported.

    Some features of the 8700K won't work under Win 7, so there are programs where you can get a really big boost on Win 10 compared to Win 7.
    (4K video encoding won't be hardware accelerated, ...)
  9. Agree
    Chaftalie reacted to OrionFOTL in Does PSU quality help other component' lifespans   
    Yes, but it's the other way around. Bad PSUs reduce the lifespans of other components, while good PSUs don't. 
     
    How did you measure the voltages? 
    Also, the deviation isn't as important as the voltage drop from low load to high load.
  10. Agree
    Chaftalie reacted to samcool55 in Will a Noctua NH-U14S fit on this motherboard with a GTX 1070   
    That's more than enough.
    Question though: why get a K CPU if you won't OC?
  11. Agree
    Chaftalie reacted to Mister Woof in Which ram is better   
    The one that matches your system's color scheme the best, obviously.
     
    No seriously, that's my answer.
  12. Agree
    Chaftalie reacted to Mikensan in NAS or Fileserver?   
    Cool beans, I think ultimately this is the best route. As for backup, maybe look at the data your company would need within minutes in order to survive while the servers/data are being restored, then cost that out with a cloud provider such as Google/Microsoft/AWS.
     
    I'm not sure where the backup software for your tape drive is installed, but if it's on the file server and that server dies, you'll be down for an extended period of time. Also, by identifying the data that's a "must have" for daily operations, you get offsite storage that can be recovered quickly.
     
    Food for thought anyhow; let us (or at least me lol) know how it all goes.
  13. Agree
    Chaftalie got a reaction from Mikensan in NAS or Fileserver?   
    Today we had a meeting, and we came to the conclusion that we will buy a second server and virtualize both of them.
    The file server will run on one of the servers.

    We came to this conclusion because, with a second server + virtualization, we can run other services on the servers, like our warehouse management.

    Thanks for your input @Mikensan

    Without the data the company can't run, so it's very critical.
    But for that we have a backup system on tape that's working great. Without the NAS we will stick to the tapes.
    And yeah, there are definitely too few users to justify an exact analysis.
  14. Agree
    Chaftalie got a reaction from LinusOnLine in NAS or Fileserver?   
    Today we had a meeting, and we came to the conclusion that we will buy a second server and virtualize both of them.
    The file server will run on one of the servers.

    We came to this conclusion because, with a second server + virtualization, we can run other services on the servers, like our warehouse management.

    Thanks for your input @Mikensan

    Without the data the company can't run, so it's very critical.
    But for that we have a backup system on tape that's working great. Without the NAS we will stick to the tapes.
    And yeah, there are definitely too few users to justify an exact analysis.
  15. Like
    Chaftalie reacted to Ben Winchester in USB-C Display Adapter   
    Thank you for your response. I don't have a Mini DP cable, but I will look into purchasing one. I was hoping there would be a way.
     
  16. Agree
    Chaftalie reacted to Mira Yurizaki in USB-C Display Adapter   
    The USB adapter shows up as its own GPU. It's not suitable for gaming, if it can even do it at all.
    Even if your laptop has a DP port connected to the dGPU, that doesn't mean every laptop with Optimus is wired up this way. None of the laptops I've used with Optimus had their dGPU outputs hooked up to anything.
  17. Like
    Chaftalie got a reaction from Ben Winchester in USB-C Display Adapter   
    The HDMI port of my Acer Black Edition VN7 laptop is directly connected to the dGPU. At least I think so, because the NVIDIA Control Panel says that, and I can OC the connected monitor via the NVIDIA Control Panel.

    And Acer support said that the DisplayPort over my laptop's Type-C port is also directly connected to the dGPU, but I cannot verify that for sure.

    A Mini DP adapter should be about 10 bucks, so nothing to lose there.
  18. Like
    Chaftalie got a reaction from Ben Winchester in USB-C Display Adapter   
    I never actually tried hooking up a monitor over USB 3.0, so I can't really give you any advice there.
    (USB 3.0 = 5 Gbit/s; HDMI 2.0, which is probably used here, has 14.4 Gbit/s, so there must be some serious compression going on to achieve two HDMI ports at 4K/60 fps.
    Maybe you wouldn't notice this compression, but I don't know, because I've never used something like that.)
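    A quick sanity check on those numbers (the 24 bits per pixel below is my assumption; real links also carry blanking and encoding overhead):

        # Raw bandwidth of one uncompressed 4K/60 stream at 24 bits per pixel:
        width, height, fps, bpp = 3840, 2160, 60, 24
        gbit_per_stream = width * height * fps * bpp / 1e9
        print("one 4K/60 stream: %.1f Gbit/s" % gbit_per_stream)        # ~11.9
        print("two streams:      %.1f Gbit/s" % (2 * gbit_per_stream))  # ~23.9, vs 5 Gbit/s USB 3.0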
     
    Did you try your Mini DisplayPort? Maybe that's directly connected to the dGPU (I don't think that's the case; I read somewhere that MSI routes all display output ports through the iGPU), but it's definitely worth a shot.
  19. Agree
    Chaftalie reacted to LinusOnLine in NAS or Fileserver?   
    It is pretty much impossible to answer. To say for sure, you would need to analyze data usage: how critical is the data, how much redundancy do you need, and so on. Are there even enough users for an analysis like that to be worthwhile?
  20. Informative
    Chaftalie reacted to Mikensan in NAS or Fileserver?   
    Text files can be compressed, so your actual disk I/O will be low while the CPU or RAID controller will be more active. The hardware and OS shouldn't have any problem saturating the gigabit link. Any delay in save times will be either because one failing disk is slowing down the others, or because their workstations have some issue.
     
    I do suggest that databases get their own RAID array in most cases, even if it's just 3 disks in a RAID that get backed up. So a dedicated server for SQL would be even better, which sounds like it might be your goal.
     
    "File server" is synonymous with NAS - but I take it you mean a NAS appliance from a vendor like Synology/QNAP vs. server hardware like a Dell R7x0. It really depends on your goals. With a file server you can add an HBA card, attach a storage shelf, and forever expand the storage. Some higher-end Synology/QNAP units also let you expand, but then you're right back at the price point of server hardware.
     
    If your only goal is faster save times for your engineers, there's 0 difference. If your goal is to be able to expand then you have some differences to consider.
     
    To me, Synology and QNAP just offer a nicer GUI for managing the NAS and are more turnkey. Great for small to medium businesses, depending on the business's needs.
  21. Agree
    Chaftalie reacted to Mikensan in NAS or Fileserver?   
    Having never actually bought and used these NASes, I can't personally vouch for any. However, the two vendors I've seen get the most praise are Synology and QNAP. You may pay a premium over a brand such as Buffalo/Netgear/whatever, but their large communities and support are worth it, imo.
     
    The honest truth is that a lot of them share hardware (SoC/embedded), with the differences maybe being the enclosure/power supply/backplane/memory. Just like with anything else, accept the risks of buying something you're unsure of.
     
    As for the "big boys" like NetApp / HPE / Dell that also sell somewhat turnkey solutions for enterprise environments - those things might not have the prettiest GUI / interface, but they are tanks. Last place I left was using a NetApp from 2008 - only ever had the occasional dead disk but the 5 shelves and controller never had an issue.
     
    However, I would not pay too big a premium - if, for the difference in cost between the cheap low-end unit and the brand name, you could buy an entire second cheap one, it's a no-brainer.
     
    My last thought is: just budget for backups, including offsite - CYA. Even if it's just a USB external drive off of Craigslist lol.
  22. Agree
    Chaftalie got a reaction from kirashi in Can you overclock this motherboard?   
    If you mean overclocking the CPU with this MB: yes, you can, because it is a Z390 MB.
     
  23. Like
    Chaftalie reacted to hooraah in NAS or Fileserver?   
    Something isn't adding up. 4 users all simultaneously saving 10 MB files should be almost no load on the server, unless you have exactly 0 bytes left or one of the hard drives is failing.
     
    I would start looking at some stats to see where the bottleneck on your current server is - is the throughput maxed? Disk I/O maxed? Is it sitting there idle? If you copy a 1 GB file to a file share, what's the transfer rate?
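    If you want a number for that last test, a quick sketch like this will do (the source file and share path below are placeholders, not real paths):

        # Time a copy of a large test file to the share and report MB/s.
        import os, shutil, time

        SRC = r"C:\temp\testfile_1gb.bin"          # placeholder: a local ~1 GB file
        DST = r"\\fileserver\share\testfile.bin"   # placeholder: your file share

        start = time.perf_counter()
        shutil.copyfile(SRC, DST)
        elapsed = time.perf_counter() - start
        size_mb = os.path.getsize(SRC) / 1e6
        print("%.0f MB in %.1f s -> %.0f MB/s" % (size_mb, elapsed, size_mb / elapsed))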
     
    You definitely don't need a new server to the tune of 5K.  New hard drives at most.
  24. Like
    Chaftalie reacted to Mikensan in NAS or Fileserver?   
    Man, that is not a quick question - it's a wall of text.
     
    Your 5-year-old server is more than capable of handling your needs. You simply need to add storage. You could connect an external drive, move the files onto it temporarily, replace the drives, and move the data back.
     
    SAS adds more "lanes" thus increases r/w speeds etc.. If you're working off a 1gb network and simply using this for storage - there's no benefit. If the server is doing some local computation and heavy i/o then SAS would benefit (or a very busy SQL Server).
  25. Like
    Chaftalie reacted to Rallen9999 in Computer deciding to break today   
    I ended up reinstalling Synapse! Thanks for your input though, I appreciate it!