
Thread for Linus Tech Tips Video Suggestions

CPotter

Channel Super Fun is almost at 1 Million subscribers. Let's get that plaque for the best channel on YouTube...

 

For it, I suggest you phone-record Linus getting triggered by LMG staff making language mistakes, like Jake did (both on purpose and by accident) in the recent livestream "I ACTUALLY Need to Buy Some Stuff - Black Friday Shopping Stream".

 

Probably everyone at LMG has at some point triggered Linus with a mistake and will remember what it was, since he's a language purist.

Guys, please do this... it'll probably take weeks for everyone to get their chance to trigger him, but it'll be worthwhile. Just don't do it all in one day, otherwise he'll stop caring for a while.

I can see one person getting called into his office for sending an e-mail full of common grammar mistakes.

And the finale: everyone in the company chat system agreeing on the same mistake, like typing: LEGOS! Fk yeah!

 

And please, show the prankster's name in a corner of the frame.

 

Cheers.

P.S.: Get Luke involved.

 

 

 

 

 

 

 


Take care, and take care of somebody else.
George Carlin: Jammin' in New York 1992



How about desk cable management, where Linus shows the optimal way to organize cables under a desk (corner desk, U-shaped desk, regular desk)?

Hardware:
CPU: AMD Ryzen 9 5900X
GPU: ASUS Strix LC RX 6800 XT
Motherboard: ASUS Crosshair VIII Formula
Memory: Corsair CMW32GX4M2Z3600C18
SSD: Samsung 980 PRO 1TB
PSU: Corsair RM850x 850W
CPU cooler: be quiet! Pure Loop 360mm
Case: Thermaltake Core X71
HDD: 2TB and 6TB HDD
Front I/O: LG Blu-ray drive & 3.5" card reader (through a 5.25" to 3.5" bay adapter)
OS: Windows 10 Pro

 


I feel like someone at LMG needs to clear up some myths about Molex to PCIe 6/8-pin adapters.

 

Might as well be an opportunity to collab with ElectroBOOM once again, who knows…
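To ground that myth-busting, here's the back-of-envelope math usually at the center of the debate. The PCIe connector budgets are from the spec; the per-pin Molex current is an assumed conservative figure (actual terminal ratings vary by wire gauge and manufacturer):

```python
# Power budgets behind the Molex -> PCIe adapter debate.
PCIE_6PIN_W = 75       # PCIe spec budget for a 6-pin power connector
PCIE_8PIN_W = 150      # PCIe spec budget for an 8-pin power connector
MOLEX_12V_PIN_A = 5.0  # assumed conservative rating for one Molex 12 V pin
V12 = 12.0

single_molex_w = MOLEX_12V_PIN_A * V12   # one 12 V pin
dual_molex_w = 2 * single_molex_w        # two plugs in parallel

print(f"single Molex: {single_molex_w:.0f} W vs 6-pin budget {PCIE_6PIN_W} W")
print(f"dual Molex:   {dual_molex_w:.0f} W vs 8-pin budget {PCIE_8PIN_W} W")
```

At this rating a single-Molex adapter can't even cover a 6-pin's 75 W budget, which is why dual-Molex versions exist and why cheap single-plug adapters have a bad reputation.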

Asus ROG G531GT : i7-9750H - GTX 1650M +700mem - MSI RX6600 Armor 8G M.2 eGPU - Samsung 16+8GB PC4-2666 - Samsung 860 EVO 500G 2.5" - 1920x1080@145Hz (172Hz) IPS panel

Family PC : i5-4570 (-125mV) - cheap dual-pipe cooler - Gigabyte Z87M-HD3 Rev1.1 - Kingston HyperX Fury 4x4GB PC3-1600 - Corsair VX450W - an old Thermaltake ATX case

Test bench 1 G3260 - i5-4690K - 6-pipe cooler - Asus Z97-AR - Panram Blue Lightsaber 2x4GB PC3-2800 - Micron CT500P1SSD8 NVMe - Intel SSD320 40G SSD

iMac 21.5" (late 2011) : i5-2400S, HD 6750M 512MB - Samsung 4x4GB PC3-1333 - WT200 512G SSD (High Sierra) - 1920x1080@60 LCD

 

Test bench 2: G3260 - H81M-C - Kingston 2x4GB PC3-1600 - Winten WT200 512G

Acer Z5610 "Theatre" C2 Quad Q9550 - G45 Express - 2x2GB PC3-1333 (Samsung) - 1920x1080@60Hz Touch LCD - great internal speakers


Cross-posting because it's also an idea:

I would love a video getting to the bottom of why the internet hasn't improved much since the 90s. It's practically a long chain of coax cables and hubs. T-Mobile is acting like it can also be an ISP; how is that going to play out? Cells of any type have a soft cap on users, right? That's why frequency hopping (spread spectrum) showed a lot of promise several years ago.

But fundamentally, it hasn't improved. Gigabit fiber is always "just around the corner." Wireless doesn't need a glorified LAN party (sort of) but has other problems: if (in gamer speak) you lose line of sight, then no more internet, or spotty, awful internet.

Any chance for a series about why the internet still feels stuck in 2010, and not in a good way? What does the future of internet technology look like? Will we get to where internet is provided as reliably as water and beer? What about finally doing something about internet trolls and misinformation traps? Will we have a utopia where speeds and quality are so good we can have holographic projectors, holodecks, and the Oasis? Or are we stuck at Blockbuster-anime-and-pizza-night pleb level?

Why do people care so much about how fast they can download cat pictures, or apps from a place that rhymes with "Brats" and "May" and that Captain Jack Sparrow would fit right in with? What about the upload side? Backing things up now is like pulling a bowling ball through your ear while juggling kittens. If I ran a speed test right now, I'd see big download numbers but a 1 Mbps upload.

 


Video suggestion: K5 PRO viscous thermal paste as a thermal pad replacement (or any similar products, if they exist). It could fix the problem of a basic user not knowing the required pad thickness. Does it really work? It's also supposedly electrically non-conductive, and they show that you must apply a ton of it on components.


Cover the best alternatives to WhatsApp.

My Laptop: A MacBook Air 

My Desktop: Don’t have one 

My Phone: An Honor 8s (although I don’t recommend it)

My Favourite OS: Linux

My Console: A Regular PS4

My Tablet: A Huawei Mediapad m5 



In a recent video Linus announced that LTT would expand to do more in-depth testing and written product reviews. I think this is an awesome direction for LTT to take and I would like to suggest something to be included in monitor reviews. One not so well-known feature of monitors is called DDC/CI. It can best be thought of as a software interface for your monitor OSD. DDC/CI is directly accessible on Windows via WinAPI, on Linux via a tool called ddcutil and even on Macs, though I've heard support is flaky on M1.

Many applications such as Twinkle Tray or Dell's Display Manager rely on DDC/CI. Apart from changing monitor settings, one interesting use case for DDC/CI is to set the input source of your monitor through software. This basically amounts to a software KVM switch that doesn't suffer from the downsides of a hardware KVM switch:

  • Because the monitor is directly connected to the input source, your bandwidth is not limited by a KVM-switch and it works with current and future port standards
  • Features like G-Sync/FreeSync just work (if your monitor/GPU support them, of course)
  • No/much fewer compatibility issues/issues with display sleep/etc.

In my case, I have a StreamDeck with two buttons which I've programmed to tell my monitor to switch between my PC and PS5. No need to fiddle with the OSD.
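For the curious, the "switch input" command those buttons send is tiny. Below is a sketch of a DDC/CI "Set VCP Feature" packet as I understand the VESA DDC/CI spec (VCP code 0x60 is the MCCS input-source control; the 0x0F input value is just an example and varies by monitor):

```python
# Build a DDC/CI "Set VCP Feature" packet. On the wire this goes over the
# display's I2C bus; 0x6E is the monitor's write address (0x37 << 1).
DDC_ADDR_W = 0x6E
HOST_ADDR = 0x51          # host "source address" byte
SET_VCP = 0x03            # Set VCP Feature opcode
VCP_INPUT_SOURCE = 0x60   # MCCS input-source control

def set_vcp_packet(vcp_code: int, value: int) -> bytes:
    body = bytes([HOST_ADDR, 0x80 | 4, SET_VCP, vcp_code,
                  (value >> 8) & 0xFF, value & 0xFF])
    checksum = DDC_ADDR_W
    for b in body:
        checksum ^= b         # XOR checksum also covers the address byte
    return body + bytes([checksum])

# Hypothetical: tell the monitor to switch to input 0x0F (e.g. DisplayPort 1)
pkt = set_vcp_packet(VCP_INPUT_SOURCE, 0x0F)
print(pkt.hex())
```

On Linux the whole thing is one ddcutil command, e.g. `ddcutil setvcp 60 0x0f`; `ddcutil capabilities` lists which VCP codes (and input values) a given monitor actually supports.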

 

DDC/CI is incredibly valuable for power users who use a Windows machine for gaming and e.g. a Linux or Mac machine for work, or who have one or more consoles which they would also like to connect to their monitor. Unfortunately it is rarely even mentioned in display reviews so it is difficult to find any information on the level of support of a particular display. Therefore I would love to see the following information included in monitor reviews:

  • Does the monitor support DDC/CI?
  • Can all settings in the OSD be changed via DDC/CI? Which settings are/are not available? Are there any that can only be changed through DDC/CI, but not in the OSD? (Think brightness in sRGB mode)
  • Can inputs be selected via DDC/CI?
  • Are DDC/CI commands supported via a non-active input? (This is important for KVM use cases involving consoles, as a console cannot send a DDC/CI command to tell the monitor to switch back to the PC.)
  • Does the monitor fully disconnect the non-active inputs? (If it does, Windows rearranges all the windows such that they become visible, which is often undesirable and annoying. Hot-plug detection killers are needed to prevent this behavior.)
  • Bonus: Can I do all of this while using PiP/PbP? Can I change both input sources individually? Can I change the arrangement or size?

In my personal experience newer Dell monitors have great DDC/CI support, which is why I've been exclusively using them. It would be great to find out how well DDC/CI is supported across the industry so that I can buy from other manufacturers knowing that it will work with my setup.

 

Here is an example of where someone achieved a full software KVM switch in a VFIO setup with VMs.


Hi LMG, I recently got into a discussion about the strength of laptop hinges. It began with a video of a cat breaking a laptop's hinge (a MacBook). The discussion arose when I said that you don't need much force to break a laptop's hinge when it's open all the way. And while the video clearly shows a (not that big) cat breaking that hinge backwards, people argued that that's not normal and that any "normal" laptop wouldn't have that issue. Now, I don't have the resources to go buy broken laptops and test this myself. I was wondering whether hinges really have become worse in terms of maximum load over the years, and whether there's any merit to the claim that laptop hinges can easily take 4-5 kg of force applied over a lever (that lever being the screen). Since I couldn't find anything about the maximum load a laptop hinge can take, which of course is also brand-dependent, I was hoping you guys could help me and some others out. I think it would make for a pretty nerdy but interesting video where you can compare build quality across brands.
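As a sanity check on the physics of that claim: the load on a hinge is a torque, i.e. force times the lever arm of the screen. A quick sketch with assumed numbers (a 0.25 m lever, roughly the force applied at the top edge of a 13-inch screen):

```python
# Rough hinge torque when a force presses on the top edge of an open screen.
G = 9.81  # m/s^2

def hinge_torque_nm(mass_kg: float, lever_m: float) -> float:
    """Torque in N*m for a weight applied perpendicular to the screen."""
    return mass_kg * G * lever_m

# A 4 kg load (the upper end of the claim) on a 0.25 m lever:
t = hinge_torque_nm(4.0, 0.25)
print(f"{t:.2f} N*m, or ~{t/2:.2f} N*m per hinge on a two-hinge design")
```

Even a modest cat's weight concentrated at the top edge produces several newton-metres at the hinge, which is why the lever arm matters more than the raw force.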


I have been looking around on the internet for quite some time and haven't found a good video discussing it, so here's a suggestion from me:

Do a video on DLNA for 4K Blu-ray players.

I kind of stumbled onto this idea while looking for an external 4K Blu-ray player (I only found one LTT video on the topic, and I was especially surprised that the major tech stores in the Netherlands either don't sell them at all, or sell only one model).

Since I couldn't find any I liked, I began looking into ways to connect my home-media 4K Blu-ray player to my PC. At first I thought of the USB port (which of course doesn't work), then HDMI (but practically no PC has an HDMI input). Eventually I found out about DLNA, but I don't really understand how it works, because there are hardly any sources about it on the internet.

 

P.S.: it might be interesting to see how (or whether) the newest game consoles work with DLNA.

 

If I ever think of more ideas, I'll be sure to drop them here :D

 


It would be cool if you made prebuilt computers and sold them on lttstore.com, to show Dell and HP how to do it properly. For example: Linus and Jake each build a $1000, $1500, etc. budget PC, compare them in a Jake vs. Linus battle, and then sell each one as a prebuilt to a lucky person on lttstore.com.


Reef babe twerking challenge: the winner gets one of Linus's millions of desktops.


On 3/20/2019 at 5:16 PM, CPotter said:

Hit us with your best Linus Tech Tips video suggestions! This is to replace our old "What should we review next" thread. Linus or one of the writers will read these suggestions, but they may not reply to you in this thread directly.

 


This isn't so much about one particular product, but rather about the issue of modern standby that plagues virtually every contemporary Windows laptop. From what I understand, Microsoft aggressively pushed this new sleep mode to bring Windows machines up to par with MacBooks on features such as instant wake, so much so that the previously reliable S3 sleep mode is now absent from most new Windows laptops at a hardware level. I wouldn't propose this as a video idea if it functioned properly, or even with any semblance of consistency, but that is far from the case. Upon receiving my new laptop (a 2020 Dell XPS 13 2-in-1, for reference) earlier this March, I was smitten. What I believed to be essentially a dream laptop for school and productivity soon turned out to infuriate me more than Apple's decision to stick a notch on their otherwise immaculate new laptops, as closing the lid of the machine would prompt one of two responses, either:

 

A) The laptop would suspend my activity as expected and spring to life immediately upon request, or

 

B) It would appear to be in its dormant state, but soon inexplicably overheat while its display remained off, fans roaring, reaching a positively sizzling and literally hazardous temperature; I was legitimately burned by this thing once.

 

This behavior initially puzzled me, but I figured I had provoked it while in sleep mode by plugging in a USB cable to charge my phone, so I vowed to abstain from repeating that mistake and hoped it was a one-time occurrence. Oh, how I hoped. From then on, traveling with my laptop felt like a sadistic game of Russian roulette: either I'd open it up and it would function like normal, or, more often, I would hear a faint whine from across the room and attend to the source, whose fans were blowing as if I were playing Crysis 3 with maximum ray tracing on a machine without a dedicated GPU.

At the beginning of this behavioral pattern I had no clue how to stop my laptop from searing itself, but I soon realized I could just force-shut-down the machine instead of letting its battery drain to empty, acting akin to an emergency shutdown button at a nuclear facility (not Chernobyl, though). Even though I now had a panic button, I still wasn't satisfied with letting this pattern continue; I needed to find the issue at its core, and so I took to Dell's forums. Apparently this is a relatively widespread complaint across Dell's new XPS lines, with likely many more devices affected. Many of the supposed solutions involved adding new registry keys in an attempt to restore the previous S3 mode, but these were fruitless for newer models, as the functionality was stripped from the BIOS. Customer support was equally infuriating, as the overwhelming consensus amongst Dell representatives was something along the lines of "don't put your laptop in a backpack; shut it down when it's not in use." This would be more understandable for a different product category; however, the reason modern standby exists in the first place is that Microsoft is trying to make Windows laptops more spontaneously usable.
Asking users to adopt an obnoxious workaround such as shutting their machine down every time the lid closes seemed like a low-effort attempt by Dell support to gaslight their customers, and for something as fundamental as a functioning sleep state on machines costing upward of three grand, I was beyond enraged. At this point, I had confirmed my laptop's behavior to be tied to modern standby and accepted that letting my computer sleep was not an option given my normal, intended use of the machine. So, after a few registry edits, I had replaced every instance of "sleep" in my power plan with the significantly slower but tolerable "hibernate" setting. I recently found out this option is likely not enabled by default because it chews through the lifespan of the laptop's SSD when used consistently, but by this point my options were exhausted and I just wanted to be able to open my laptop without worrying about its explosive mood swings.

Now, under any other circumstances, this would've been the end of my story. A lengthy couple of paragraphs, to be sure, but I would be remiss if I didn't mention my last point: one of my closest friends needed a 2-in-1 to finish off high school and last well into college. So, me being the optimistic person I am, I strongly recommended my laptop to them. This had to just be a hardware bug on a small batch of laptops, right? Spoiler alert: it wasn't, and even my replacement device had the exact same problem. Despite the model they ultimately got being a year older than mine (7390 vs. 9310), I often witness them closing and opening the laptop, only for it to be powered off and take an eon (~15 seconds) to boot back up, as this is simply the only feasible option if the machine is to have a long-lasting SSD and not set a house on fire. This is not acceptable.
To emphasize the heightened stakes of this defect: my friend is not as tech-savvy as I am, and I can only imagine the thousands of other people unknowingly falling victim to this design are even less so. Imagining a large portion of the user base simply submitting to this hazardous behavior through no fault of their own is soul-crushing to me. So, to conclude this essay-bordering-on-manifesto, I would like to state that I have been a Windows/Android user my entire life. My loyalty to these operating systems is not partisan per se; rather, I enjoy the freedom they allow me and the confidence each respective dev team clearly places in its user base. That being said, this is the first time in my life that I'm not sure which aisle my next laptop will belong to.
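For anyone fighting the same battle: the hibernate-instead-of-sleep setup described above doesn't strictly need registry surgery. A sketch using standard powercfg commands from an elevated prompt (the aliases can be confirmed with `powercfg /aliases`, and behavior varies by OEM firmware):

```shell
# Enable hibernation (creates hiberfil.sys)
powercfg /hibernate on

# Hibernate (action value 2) instead of sleep on lid close, for AC and battery
powercfg /setacvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 2
powercfg /setdcvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 2
powercfg /setactive SCHEME_CURRENT

# See which sleep states the firmware actually exposes (S3 vs "S0 Low Power Idle")
powercfg /availablesleepstates
```

If `/availablesleepstates` shows only "S0 Low Power Idle", the machine has exactly the problem described above: S3 is gone at the firmware level, and hibernate is the only reliable fallback.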

 

Thank you for reading.


How about a video verifying manufacturers' VRR claims? I bought a Sony X85J with promised VRR. That promise finally materialized about a month ago, but there's a Reddit thread about it completely breaking for some users.

And another Sony, the X900H, apparently has VRR that "works" by forcing the refresh rate down to 60 Hz no matter what?

 

And I suspect the problem isn't limited to these two Sonys, or even to Sony. Testing TV VRR claims might be more useful than testing all those cables.


"The ITX desktop in a laptop" video idea

 

Most gaming laptops are mainly used in a desktop capacity; they're really difficult to upgrade, and the battery only lasts a short time under load. Why not make an ITX "laptop" that uses desktop components and has no battery? EDIT: I don't expect it to be thin, but think of the ability to upgrade in the future while still having a screen and keyboard in one device.


There is a tech piece I'd love to see someone tackle. Some tech news outlets, when doing performance reviews for games/game hardware, will list results as 1080/1440/2K, etc. With the range of widescreen & ultrawide formats out there, I would really appreciate an in-depth piece about the difference in operational/resource/hardware requirements between 2560x1440, 3440x1440 & beyond. I get & love that 2K and 4K monitors are becoming more prevalent, but for the games I play, ultrawides are really where it's at for me.
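One assumption-free piece of that comparison is raw pixel count, which already shows why "1440p" is ambiguous between 16:9 and ultrawide:

```python
# Raw pixel counts of common resolutions, relative to 1920x1080.
resolutions = {
    "1080p  (1920x1080)": 1920 * 1080,
    "1440p  (2560x1440)": 2560 * 1440,
    "UW-QHD (3440x1440)": 3440 * 1440,
    "4K UHD (3840x2160)": 3840 * 2160,
}
base = resolutions["1080p  (1920x1080)"]
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} px  ({px / base:.2f}x 1080p)")
```

So 3440x1440 pushes roughly 34% more pixels than 2560x1440, yet is still well short of 4K's pixel load; and, as noted below, the wider field of view can also change what the CPU has to simulate and draw.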

 

I also think it might be more complex than simply GPU load; running an ultrawide doesn't simply mean more polygons at the same aspect ratio, the way comparing 1920x1080 versus, say, 3840x2160 does.

With superwide/ultrawide formats, I think there are more mechanics (for lack of a better word) that need to be processed by the CPU, RAM & GPU, and some games would be more taxing than others.

 

I suspect I'm not pitching this perfectly, but hopefully someone at LTT can see where my thinking is at and turn it into a quality video topic.


Hey LTT, on the topic of cooling vs. power:

 

I'd love it if you could find well-cooled low-spec computers that can outperform badly cooled top-of-the-line rigs.

Just how much does overheating hurt a good processor?

And just how much does good cooling help low-spec devices?

Linus rants repeatedly about laptops and poorly cooled devices that are sold and bought based only on the graphics card on the sticker, and there's no chump experience more universal than seeing your games look beautiful for 20 minutes before everything gets too hot.

Cooling is frequently neglected by manufacturers when it should be a higher priority.

This is a hot subject on the show that never seems to get expounded on as much as I'd like.

If you guys proved that a 3070 on a hot day performs like a well-cooled 2060, it would be pretty cool.


Maybe not a full video bit at least a segment on WAN show or TechLinked:

 

https://discussions.ubisoft.com/topic/121755/proton-support-for-r6-siege-would-be-awesome?lang=en-US&page=1

 

Ubisoft is gauging interest in Proton support for Rainbow Six Siege's anti-cheat.

 

Would be awesome for Linux gaming imo 🙂

You've already done a whole bunch for new Linux users and gotten some serious bugs highlighted (looking at you, Linus 😏); getting R6 support would hopefully spark interest from other game publishers/developers as well.

 

https://news.itsfoss.com/ubisoft-rainbow-six-siege-proton/

 


Silly idea: see how many average GPUs you can run and game on at the same time off one CPU and motherboard (probably Threadripper, because it has a lot of cores and PCIe lanes, so you could get tons going) using mining GPU splitter cards and risers. With a 64-core Threadripper, could you run 64 (or, more realistically, 32) VMs, each with its own GPU, and game on all of them? This would sort of be "8 Gamers, 1 CPU", but way cooler! I wouldn't be surprised if other limitations stop this, but even 16 gamers would be very cool!

My primary system: Core I7 10700k, 32 gb Trident Z RGB ram@3200mhz, EVGA GTX 970 SSC (will upgrade), NZXT N7 Z490 motherboard (Black), Samsung 970 Evo plus 1TB SSD, NZXT C850 PSU, Hyper 212 EVO cooler (getting new water cooler soon), NZXT H510i case. 

 

My secondary system: Core I7 4820k, 16 gb quad channel 1600mhz ram, GTX 780 reference, Asus PX79LE, SK Hynix GOLD s31 500gb SSD, some 10 yr old Cooler Master 750w psu, Hyper 212, old Cooler Master case.

 

Laptop: Lenovo l380 yoga I5 8250u, 8gb ram, 256gb ssd storage)

 


Maybe a bit out there, but I think Alex or someone else could probably pull it off.

 

So I found this paper on how you would go about making micro-fins in a water block, and surprisingly, it wasn't that hard. You basically need a very thin wire, a high-current power supply, and a reasonably accurate wire spool mounted in the CNC mill. It would be cool to see if you can pull off making a DI-why scuffed water block of some sort.

 

Here is a publication of the paper:

https://asmedigitalcollection.asme.org/IMECE/proceedings-abstract/IMECE2000/155/1125363

And a Sci-Hub link 😉

https://sci-hubtw.hkvisa.net/10.1080/10426914.2019.1566959

 

Oh, and here is Applied Science's video on this type of machining process in general, although his version is much harder than the process for micro-fins:

https://youtu.be/rpHYBz7ToII

 

BTW, the idea came from watching your A/C cooler build, which has kind of inspired me to do my own take. The goal is to make it seamless in a case, but we'll see about that...


With all the recent controversy around YouTube removing the dislike count, wouldn't it be a cool idea to make an extension that gives that information back?

 

With so many people against the removal of the dislike count, you could make an app that creators use to give this information back to their viewers.

 

The basic idea is that a creator allows the application to retrieve the like/dislike count from the YouTube API. On the user side, the user installs an extension that lets them view that information, e.g. clicking the extension shows the likes/dislikes.
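The creator-side plumbing could be fairly small. A sketch of the parsing step, assuming the YouTube Data API v3 `videos.list` endpoint with `part=statistics` (since the change, `dislikeCount` is only returned to the video's owner via OAuth, which fits the opt-in model described; the sample response below is made up):

```python
# Parse like/dislike counts out of a videos.list?part=statistics response.
def extract_counts(api_response: dict) -> dict:
    counts = {}
    for item in api_response.get("items", []):
        stats = item.get("statistics", {})
        counts[item["id"]] = {
            "likes": int(stats.get("likeCount", 0)),
            # dislikeCount only appears for the authenticated video owner
            "dislikes": int(stats.get("dislikeCount", 0)),
        }
    return counts

sample = {"items": [{"id": "abc123",
                     "statistics": {"likeCount": "1200", "dislikeCount": "34"}}]}
print(extract_counts(sample))
```

The extension would then fetch these numbers from the project's own server, not from YouTube directly, so viewers never need API credentials.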

 

Some considerations:

1. Maybe spending this much dev time on an extension no one might use is stupid. In that case you can downsize the scope and limit it to a simple website: a page with a box where you paste a YouTube link and see the like/dislike count. LTT can lead by example, as Linus has said he doesn't like the removal of the dislike count; this could be an opportunity to back up those words. The video would announce the project and call on other creators to join in and make their dislike counts public, since many creators have spoken out against the change.

 

2. This wouldn't address one of the main benefits of the dislike count: helping weed out scams and bad advice. Well, if the project were successful and received widespread adoption it would, but I don't see widespread adoption as likely. If it did happen, creators who don't adopt it would be seen as suspicious, much like channels that disabled likes/dislikes in the old YouTube system.

 

3. The point of the project: while the point above covers a possible but unlikely outcome, the main goal is to protest the change by making that information available to your viewers, and to take action instead of just talking about it. Actions speak louder than words, especially if you explicitly reach out to other creators to join the program.

 

4. This was inspired by a recent Ludwig video about the dislike change, which showed the dislike count in the title, which in turn was inspired by Tom Scott's video from a while back called "This Video Has X Views". This project aims to do that on a much larger scale, instead of just one video.

If you really think this is too much work, you could just make a similar video showing the dislike count in the title and encourage other content creators to do the same. Maybe enough creators doing this might get YouTube to actually notice and care.

 

5. One more silly idea: a short 30-second spot on the WAN Show showing the dislike counts for the past week's videos. It could be like that meme:

Week 1 of showing my dislike counts to protest the removal of the dislike count.

Week 2 of showing my dislike counts to protest the removal of the dislike count.

Etc etc.

 

I hope you guys take this idea into consideration.

Thx,

An average LTT viewer who likes the dislike count.


Is the new GeForce NOW RTX 3080 subscription worth it? It doesn't appear to make any difference to me.


Find an older office computer that a company is selling, maybe from 2010-2015, and upgrade it to run games at 1080p.


My girlfriend is looking to purchase a tablet, mainly for web browsing, BUT a keyboard and mouse might be useful once in a while. Back in 2012 I remember getting the first Microsoft Surface and enjoying it for the most part, mainly the keyboard, which was pretty solid considering how thin it was. Now I'm looking at Chromebooks like the Lenovo Duet and an Acer convertible; Amazon links below.

Maybe an episode on good Xmas deals, comparing the lowest-cost options vs. $250-350+, and asking how much tablet is too much tablet? Also, compared to what we had in 2012, have these come a long way?

 

https://www.amazon.ca/Lenovo-Chromebook-MediaTek-Integrated-ZA6F0031US/dp/B0856QVM2F/ref=sr_1_5?keywords=Lenovo Chromebook Duet&qid=1638513063&sr=8-5

https://www.amazon.ca/Acer-Chromebook-Convertible-Bluetooth-CP311-2H-C679/dp/B086MBQKH2?ref_=ast_sto_dp&th=1&psc=1


I've got an old (2008) 80GB SLC enterprise cache card made by Fusion-io that I can't find drivers for (I've tried 8 different driver packages and spent over an hour on the phone with a very helpful lady from HPE support; I think it shipped in an old HP ProLiant server), and therefore can't get working. I think it's got 37.5PB (yes, 37,500,000GB) of write endurance (based on CraftComputing's 160GB version having 75PB of write endurance), so it's kind of a neat, niche product in that regard, plus it's just a cool bit of early enterprise SSD history. I think it'd certainly make for an interesting video if y'all were able to get it working (i.e. find drivers for it), and I'd be more than happy to ship it up to you to mess around with, even if it didn't end up in a video. If y'all got it working and felt inclined to send it back (along with the drivers), that'd be swell, but you'd also be more than welcome to just keep it.

Here's a link to CraftComputing's video about a bunch of similar cards (it's what inspired me to buy this one):

And here are some pictures of the specific card I've got (sorry in advance, I've only got a low-profile bracket for it; it didn't come with a standard-height bracket):

[Photos attached: card front and rear]

