
skywake

Member
Everything posted by skywake

  1. You know, that expansion card got me thinking. If someone put some effort into it, the Framework mainboard could be the base for a pretty solid "prosumer" grade NAS. I feel like that space is pretty neglected. You're either going with Synology or QNAP and getting a locked-down, buy-once-and-bin-it-when-you-outgrow-it hardware platform. Or you go DIY with a full desktop CPU and its fairly limited case options for 3.5" drives. Basically your only "small footprint" options on the desktop route are the Node 304 and the Jonsbo N2. Both of those limit you to mITX, which has basically the same IO restrictions as the Framework mainboard anyway, and both are, although relatively small, significantly larger than the off-the-shelf options.
Just have enough space for the mainboard, make a case with a backplane for like 5x 3.5" drives + multi-gig LAN that connects to the PCIe expansion in some way, and have some way to deliver power to the mainboard and the drives via an external power brick. Boom, small footprint, power efficient NAS. I mean, enthusiasts have been repurposing their old desktop hardware as servers for years. Why not do the same with old laptop hardware?
edit: Also I'm pretty sold now, going to go with Framework for my next laptop. It just might be a while given the 3 year old Dell with its Zen 2 Ryzen is still plenty for the things I actually use it for.
  2. As a developer and inadvertent dev-ops guy at work this very much pleases me. There's no better feeling than having an automation save you hours of time during the day. And I definitely feel the pain of those small tasks that are just frequent enough to demand your full attention.
You know, this is a novel and kinda cool idea in a way, points for that. But it's not really something that makes sense. I mean, if you were to do this you'd probably do something like set up a VM with multiple GPUs attached via passthrough, then have some way to automatically swap them around in software, possibly spinning up a new VM with the right drivers already pre-installed. Sure, you'd have some overhead with the VM, but if the only changing variable is the GPU it'd be good enough.
However, I think the main problem with the idea is that you kinda need to step back a bit. What's the actual goal here? Because I would imagine testing a variety of games across multiple GPUs would be only one of the goals. You'd also want to test multiple CPUs across a variety of games, or multiple RAM speeds, multiple OSes, different BIOS settings. And in any case, once you have run the benchmark for a particular config you have that data saved in a database. The main pain point they'd be trying to resolve would be the crunch before the NDA lifts, when they're benchmarking one or two products across a suite of games. And the swapping of the GPU would be a very, very small part of that process.
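Just to make that "swap them around in software" idea concrete, here's a very rough sketch of what it could look like with libvirt's virsh CLI driven from Python. The VM name, the per-GPU hostdev XML files and the in-guest benchmark command are all made up for illustration; treat it as a shape, not an implementation.
```python
# Rough sketch: cold-swap which GPU is passed through to a benchmarking VM, then run a suite.
# Assumes libvirt/virsh; the VM name, XML paths and benchmark command are hypothetical.
import subprocess

VM_NAME = "bench-vm"                    # hypothetical VM
GPUS = {
    "gpu-a": "hostdev-gpu-a.xml",       # libvirt <hostdev> snippets pointing at each
    "gpu-b": "hostdev-gpu-b.xml",       # card's PCI address (placeholders here)
}

def virsh(*args: str) -> None:
    subprocess.run(["virsh", *args], check=True)

def swap_gpu(old_gpu: str, new_gpu: str) -> None:
    virsh("shutdown", VM_NAME)          # in practice you'd wait for the guest to actually go down
    virsh("detach-device", VM_NAME, GPUS[old_gpu], "--config")
    virsh("attach-device", VM_NAME, GPUS[new_gpu], "--config")
    virsh("start", VM_NAME)

def run_suite(gpu: str) -> None:
    # stand-in for whatever benchmark runner lives inside the guest
    subprocess.run(["ssh", VM_NAME, "run-benchmarks", "--label", gpu], check=True)

if __name__ == "__main__":
    current = "gpu-a"
    for gpu in GPUS:
        if gpu != current:
            swap_gpu(current, gpu)
            current = gpu
        run_suite(gpu)
```
Even with all of that automated, the VM still has to reboot and validate drivers between runs, which is kind of the point: the GPU swap itself is the small part of the job.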
  3. Yeah, I'm not really going to be transcoding UHD files. For one thing, it kinda defeats the point a bit; the whole reason you have a UHD copy is so it can be at a stupidly high resolution. And secondly, when I get a UHD BluRay it usually comes with a regular BluRay which I'll also rip, so I'll always have a 1080p copy as well... well, almost always. At this point the only UHD movies I don't have a 1080p copy of are the LotR trilogy and Justice League.
  4. At the risk of opening a can of worms on this sub-forum, why Unraid over TrueNAS? Any particular reason? I was leaning towards TrueNAS purely because it's free and I can basically start from scratch here, but perhaps that's not the best approach given my random assortment of hardware. E.g. if I go Unraid I might be able to get a couple of 6 or 8TB drives now, "borrow" one of the 2TB drives from my existing NAS and get myself up to 8-10TB of storage with mismatched drives. Then down the road, if/when I use up that capacity, I can drop another 6/8TB drive into the 4th SATA port.
Now I'm curious what any of the TrueNAS evangelists here might suggest and why.
  5. I currently have a Synology DS418j with 4x 2TB HDDs and am starting to run low on space, so I'm taking this as an opportunity to get something a bit beefier as well. The Realtek CPU in the Synology is pretty anemic, especially if you want to do any transcoding, which would be nice given pretty much all of the storage on it is taken up by BluRay rips. While considering my options I realised I can probably just re-use an older machine, put TrueNAS on it and go that route. To set expectations: I'm not expecting to transcode 4K content, but it would be nice to have Plex downscale 1080p rips to 720p/480p for mobile use.
So here's the spare hardware I have to play with:
- Asus H97M-E (4x SATA, Gigabit LAN, 1x PCIe x16, 3x PCIe x1, NVMe)
- i5-4590 (quad core, no HT, HD 4600 iGPU)
- 16GB DDR3
- Multiple spare 120GB SATA SSDs
- Spare case with 4x 3.5" HDD bays, 3x 2.5" HDD bays, 3x 5.25" bays
- Nicer spare case with 3x 3.5" HDD bays, 3x 2.5" HDD bays
- 450W power supply
- R9 285 (I probably don't want to use this, but it is spare; if I was to go down this road I have more power efficient GPUs currently in use I could put in it later)
Currently I'm considering purchasing the following:
- 4x 4TB HDDs. RAIDZ1 giving me 12TB of storage, doubling my capacity
- 250GB M.2 SSD, basically the smallest I can get. Possibly partition some of it for L2ARC?
- 2.5Gbps NIC, not immediately, given I only have one other device with >1Gbps and no >1Gbps switches yet
This new server would be home to all of the files currently on my NAS in addition to any new rips going forward, including UHD rips. The old NAS I'll delete all the UHD vids from and put at my parents' place for off-site backup. Probably sneaker-net to start with, but I might consider some kind of automated backup over the internet down the line, at least for the smaller files (home videos, photos, that sort of stuff).
So my main questions are:
- Is using the existing SATA ports on the motherboard fine or should I get an expansion card?
- If I do get more SATA ports, would it be worth using one of my spare SATA SSDs as an L2ARC?
- Would using an SSD for L2ARC give me much of anything? Note: it will cost me nothing other than possibly a SATA port
- Any other particular suggestions/changes I should make here?
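For what it's worth, here's the rough capacity math behind that "4x 4TB in RAIDZ1 = 12TB" figure, ignoring ZFS metadata/slop overhead and the TB-vs-TiB difference (so real usable space will come in a bit under this), and assuming the old Synology is running single redundancy:
```python
# Rough usable-capacity math for RAIDZ vdevs (ignores ZFS overhead and TB vs TiB).
def raidz_usable_tb(drives: int, drive_tb: float, parity: int = 1) -> float:
    """RAIDZ1/2/3 give up `parity` drives' worth of space to parity."""
    return (drives - parity) * drive_tb

print(raidz_usable_tb(4, 4, parity=1))  # planned 4x 4TB RAIDZ1 -> 12.0 TB
print(raidz_usable_tb(4, 2, parity=1))  # old 4x 2TB box, assuming single redundancy -> 6.0 TB
```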
  6. I think you kinda missed my point a bit, and there's a huge middle ground here that I think is the best position. My point was quite literally that all of the energy spent talking about smart home devices is put into trivial, unreliable and flashy actions that don't have much practical value. Sure, some of those actions are fun, but to me that's not really where the appeal lies. The appeal lies in the more invisible automations.
You talk about offline thermostats and garage doors, and sure, those things are good automation ideas. So good that they became commercially viable products well before there was any kind of standard or central controller for this kind of stuff. And you know what? They're still good automations. But there are other ways to do the same thing, and also other related kinds of ideas nobody has built a commercial product for. Also yes, security is a concern IMO, but there are plenty of automations you can run that don't impact security. Every automation I've done for myself so far is either for temperature control, basically an extended thermostat, or "accent lighting" that tells me something. It's something I've enjoyed doing, so I'll probably extend it to automate powering devices on/off on schedules/triggers and so forth. I do have Sonos speakers in every room, which I guess you would argue makes me "lazy", but it's nice and it works, so why not? I draw the line at "security" related items: automating locks and indoor cameras.
It depends on what it is. At the home opens I've gone to, I'd honestly be pretty happy if there had been someone who was a bit of a techie in there before me, even if it meant having to work out what to do with some old in-ceiling speakers. Because at the places I've looked at, I'm certainly not seeing houses wired up with Cat6. No, most of the time people have done nothing. They have a WiFi modem combo unit on the kitchen counter and that's it. Tech is a selling point, not a negative. And working at a company tangentially related to the home building industry, I can tell you now: people want this stuff. Or at least, people are getting this stuff regardless of whether they want it.
  7. I've gotta go with the mini-DVDs on the GC, purely because there was basically not even a theoretical advantage to that choice. I know some speculate it was to "make it harder to pirate games", but as someone who was into that scene in the early 2000s, I would think bandwidth was the barrier to piracy, not physical disc size. I don't see what they gained out of it. It just gimped the size of their games and locked them entirely out of any chance to also be a DVD player, which mattered a lot in the early 2000s. Also, if they had used regular sized discs they could have continued to release GC compatible games through the Wii era, given how close the Wii was to the GC internally.
With the other things people are talking about, the Wii U GamePad and the N64's cartridges? Of course they were gambles that at the time did not pay off. But they were at least reasonable gambles. I'd argue that, looking back now, games on the N64 generally hold up better than PS1/Saturn games because it was a more capable machine. And load times weren't exactly stellar on those old 2x CD-ROM drives. Similar deal with the Wii U: the Switch has kinda proven that the concept itself was sound, even if it was just a little bit too far ahead of the curve. And the games released on the Wii U have been ported to the Switch and sold tens of millions of copies (Mario Kart 8, New SMB, 3D World, BotW).
So yeah, GC mini-DVDs -> dumb move.
  8. Honestly, I think this is kinda my point in a roundabout way. I mean, a thermostat IS an automation, and IMO probably the best kind of automation there is. You set it and it reads sensors (time/temp/humidity) to decide what it does (heat/cool). That's the gold standard of home automation right there. And there's a lot more you can do along those lines.
For example, here in Perth, Australia it gets pretty damn hot in the summer and "programmable thermostats" aren't really that common. And there are usually degrees of actions you'll take throughout a particularly hot day, like turning on ceiling fans well before you turn on the aircon, closing blinds during the peak of the heat, opening the doors when the temp outside drops below the inside temp late in the day, etc. Doing it that way saves energy compared to just blasting the aircon all day. So a reasonable set of automations might be:
- Fans go on when the temp goes above a certain level (I've done this, for some fans)
- Set the aircon to turn on when the doors are shut, people are home and the temp goes above some other threshold (I plan to do this)
- Turn the aircon off when the temp outside starts to drop, or when someone opens the door for more than a couple of minutes
- Have an RGB lamp that visually shows what the weather is like outside vs inside (I'm close to this but I don't have any inside temp sensors yet)
You know, extended thermostat kind of stuff. Do I need some automation on the washer/dryer/fridge/dishwasher/oven? Probably not, there's not a lot I would think you'd need to automate there. Not saying there couldn't be for some people, but I wouldn't see the need. For example, I have seen someone's dishwasher automation that changed the colour of a light strip next to it to indicate whether the dishes inside were clean or dirty, as a prompt to stop people from putting dirty dishes in without emptying it first. So you can do stuff with some of these "smart appliances", but I would agree most of those kinds of "smart" device features are not very useful. The useful stuff is often not the flashy stuff they put in the ads and on boxes. Really, this is quite literally the point I'm making.
I mean, of course it's a luxury. We're literally talking about creating a virtual butler for ourselves. Do I need my bedroom lights to slowly fade on in the morning before my alarm goes off? Of course not, I did fine without it. But damn, it's a hell of a lot nicer than being woken up by some rude device shouting in your ear to shock you awake. I'd also argue waking up that way makes me quite a bit more productive; a slowly fading light doesn't interrupt your sleep in the same way an alarm does.
In terms of the environmental impact? Again, I think it really depends on what you're doing. A lot of the more useful automations are built around using resources more efficiently. Really, the two biggest things you tend to have to manually juggle in a house are keeping the climate comfortable and setting appropriate lighting. Then there are tertiary things like scheduling watering, high-draw devices and exhaust fans. Pretty much all of that, to some degree, has an element of reducing consumption. So it's not necessarily a decision between the environment and "luxury"; it can be both.
Kinda thinking about this now, I'm not so keen on automating things like dryers at all, but there is something you could do along those lines.
We get charged more for power we take off the grid than we get for power we put back into the grid via solar. If you had your solar output tied into Home Assistant and a simple RGB strip, you could visually show when you were exporting power to the grid and to what degree. Maybe combine it with the chance of rain and the humidity. Red lamp? Use the clothes line. Green lamp? Dryer. Blue lamp? Wait.
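Something like this, as a bare-bones sketch of that lamp logic in plain Python. The thresholds and the export/rain numbers are invented for illustration, and in practice you'd feed it real sensor and forecast values from Home Assistant:
```python
# Rough sketch: pick a laundry-lamp colour from solar export and the chance of rain.
# Thresholds are made up; wire real sensor values in from your own setup.
def laundry_lamp(export_watts: float, rain_chance_pct: float) -> str:
    if export_watts > 1500:        # exporting plenty -> effectively free power
        return "green"             # green: run the dryer
    if rain_chance_pct < 20:       # little spare solar, but a dry day
        return "red"               # red: use the clothes line
    return "blue"                  # blue: wait

print(laundry_lamp(export_watts=2200, rain_chance_pct=60))  # green
print(laundry_lamp(export_watts=300, rain_chance_pct=5))    # red
```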
  9. I was looking at this tweet from @LinusTech this arvo and it kinda got me thinking a bit. Has the tech industry kinda missed the boat on what "smart" devices should be doing for us? The entire point of automation, and to a degree tech in general, should be to make things easier for us. As soon as you start using tech that's "flashy" while making the task more complex, you're kinda defeating the point (unless of course the point is to sell smarthome gear, which I guess IS the point, but still).
E.g. the example from the tweet. Saying a phrase to have a car door open, especially when car doors come with a remote already, is not something that makes the task easier. Doubly so when the voice assistant misunderstands you and does the wrong thing. I'd go further and say the same is true for most voice assistant commands. Unless you're issuing a command with some attribute (e.g. turn the lamp blue, play King Gizzard in the kitchen), you're just doing an action that would be easier if it was replaced with a button or a switch. I'm pretty into home automation, but the only smarthome command I ever issue to Google is "turn off the family room TV" when someone has left the TV on in the family room.
I think there's really only one kind of home automation people should be thinking about: some kind of event that I'm otherwise not directly aware of, followed by some appropriate action in response to that event. Everything else is just adding complexity to otherwise simple tasks. A few examples of good automations, some of which I've done:
- fade on bedroom lights early on days when I set the alarm on my phone for a work day
- turn lights on/off when I leave/return home
- turn on a fan if the temperature outside exceeds 27C (then off when it cools down)
- announce over the speakers when certain devices connect to the WiFi (e.g. relatives have come over)
- change light colours based on sensors (e.g. blue -> humidity, red -> temperature, green -> "cold and dry")
- turn on the exhaust fan when the bathroom gets humid, etc.
That sort of stuff, genuinely helpful little automations. That's the appeal of this kind of thing in my mind. The fact that we seem to only ever talk about it, and companies only ever advertise smart home stuff, as "hey google, turn on the lights"? I think it really does damage to the entire concept of the smart home in the minds of a lot of people. Because people just think that it's not a hassle to turn on the lights with a switch. Which is valid, because it's true, but it's not the appeal. /rant
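To put a shape on that "event I'm not directly aware of, followed by an appropriate action" idea, here's what the outside-temperature fan automation might look like as plain Python pseudologic. The thresholds and the sensor/switch functions are made up; in reality this would live in Home Assistant or whatever controller you use:
```python
# Minimal event -> action loop: outside temperature drives a fan, with a bit of
# hysteresis so it doesn't flap around the threshold. Sensor/switch calls are stand-ins.
import random
import time

FAN_ON_ABOVE = 27.0    # turn the fan on above this (deg C)
FAN_OFF_BELOW = 25.0   # ...and off again once it cools past this

def read_outside_temp() -> float:
    return random.uniform(20.0, 33.0)    # stand-in for a real temperature sensor

def set_fan(on: bool) -> None:
    print("fan", "ON" if on else "OFF")  # stand-in for a real smart switch/relay

fan_on = False
while True:
    temp = read_outside_temp()
    if not fan_on and temp > FAN_ON_ABOVE:
        set_fan(True)
        fan_on = True
    elif fan_on and temp < FAN_OFF_BELOW:
        set_fan(False)
        fan_on = False
    time.sleep(60)   # poll once a minute
```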
  10. In my mind the only advantages consoles have over PCs are the upfront cost, the controller-first UI and the exclusives. I have a Switch for Nintendo's exclusives, which obviously I'm into given my choice of avatar, and I probably do the majority of my gaming on Switch. But I also have a PC connected to my TV with a wireless KB/mouse, an Xbox controller and a huge backlog of games on Steam.
If I wasn't into Nintendo's exclusives I'd be PC only.
  11. Huh? The Wii U came after the DS, so that wouldn't have made any sense. Surely the complaint would be that the Wii never had a DS player.
Also, Nintendo did sell DS games on the Wii U eShop. So technically you could play DS games on the Wii U but not on the Wii.
In any case, OP is asking for the best ways to play Wii games on a modern TV. The Wii U is fully backwards compatible with the Wii and has HDMI. Failure or otherwise, it makes for a better Wii than an old Wii.
  12. The GBA player isn't compatible with the Wii either so not sure what you're getting at here
  13. A couple of additional suggestions. Find a Wii U; for all the failings of the Wii U it is still also a Wii with an HDMI port. Plus you get the added bonus of being able to stream your Wii games to a portable screen. The Wii U GamePad may only be 480p, but Wii games were 480p anyway so it's not much of an issue.
Second suggestion: emulation. If you have a Wii and games it's easy enough to dump your discs. These days Wii emulation is good enough; there are some titles that have some funky stuff going on, and scaling does odd things in some games. Also you'll want to figure out a way to get a sensor bar (the Wii is a bit of an old device now, and it's not as easy to find a USB or battery powered sensor bar as it once was). But once you get it going? It's pretty solid. You can get Mario Kart Wii running at 4K/60fps on just about anything.
  14. I had a look and thought I'd come back to answer this one. The screens do work, although they run at a lower framerate than the game and with a little bit of tearing. But I remember that always being the case, and looking at world record runs online it seems to be replicating how it rendered on original hardware, just at a significantly improved level of clarity.
Actually, don't take my word for it: this one seems like a fairly honest demo. It appears in multiplayer the screen doesn't render correctly (not sure if that was always the case?) but in single player it's fine. And they're talking about maybe a frame of lag in 2 player and 2 frames in 4 player, but still, in their words, better than other online implementations of N64 multiplayer. Which makes sense given you'd have to add latency to allow for the controller inputs to arrive (it's an emulated game, there is no such thing as predicting player positions).
  15. @Dragonwinged I'm a dude with a Super Mario Kart avatar, so take what I say with a bucket of salt I guess. But I would say that the criticism of the N64 titles on NSO has been pretty heavily overblown; I suspect people are mostly trying to justify their objections to the price. From the couple of hours I've played across most of the titles, the only issue I've had is the terrible framerate in Ocarina of Time, which is just the framerate that OoT ran at. There are reports of missing effects in OoT but I didn't pick those up, and they can probably patch that without much effort.
Really, in terms of official ways to play these games, of which I have all of the options? IMO the Switch is the best platform for most titles. The Wii was possibly the most accurate of the official services for N64 emulation, but the Wii's output resolution was 480p and, barring heavy modifications, it wasn't portable. The Wii U VC service had issues all over the place, and portable Wii U gaming was compressed 480p within line of sight of the console. The Switch renders the N64 games at a relatively high resolution and they look pretty great in portable mode. I'd just suggest finding a copy of the 3DS remakes for games where that's a thing, assuming you want to go the official route. And if you're into Goldeneye and are already into Wii emulation, maybe track down a copy of the Wii version. N64 games in general haven't aged well, so if the goal is "best versions" then that's the way to go. If you want a quick and easy service? NSO does the job.
And in any case I'm on the Family Plan with a bunch of people who are into Animal Crossing. It's an additional $55AU/year for the NSO Expansion Pack while the Animal Crossing DLC is $37AU. Three people on my plan want the Animal Crossing DLC, so at 3x $37 = $111AU in DLC versus a $55AU upgrade it made sense to go to the higher tier. The N64 games are really just a bonus extra for me.
  16. Definitely, although it's mostly because with access points, past a certain point you're paying for capacity and range rather than single user throughput. With the Unifi example, you can get a WiFi 6 Lite AP for ~$190AU which supports a connection speed of "1.2Gbps" (actually ~700Mbps in real world tests). Going up to the BaseStation XG you're spending $2300AU for connection speeds up to "1.7Gbps". Which on paper seems like a very small gap for the difference in price. But the XG has a significantly higher gain antenna, three 5GHz radios rather than one, and a 10Gbps backhaul. Which is useful, but isn't going to be of any value when you have at most half a dozen users.
  17. I mean of course, a wired connection is preferred for the kind of applications that demand "solid WiFi", and having it run locally is even better. But there's flexibility you get from having content stream over a network, and even over WiFi. With the UHD BluRay rips example, I could drop them on a HDD and plug that HDD into the device I want to play the video on. Or I could drop them on a NAS and stream them over a wired connection to my TV (shakes fist at TVs with 100Mbps Ethernet) or over WiFi to a laptop or something.
The point of having a network is that you're sharing resources. My laptop doesn't need a huge M.2 SSD because I have a multi-TB NAS, and I don't need a HTPC because I can just stream videos over my network. Hell, I don't even need a high end gaming laptop because in-home streaming is serviceable and I can just leverage the power of my desktop. It means I'm not buying gear I don't need... which to be fair is probably less of a concern to someone who's blowing $1000s on stadium tier WiFi.
  18. Of course, when there's a video about WiFi there are always the two inevitable commenters. First, the people who don't understand the distinction between "WiFi" and "internet", or why you may want solid WiFi even if you don't have a 100Mbps+ internet connection. Presumably they've never heard of a NAS, Steam in-home streaming, security cameras, whole house audio setups or smart home devices. There's also more to solid WiFi than just high speeds. Although I guess this misunderstanding is just amplified every time a video like this comes out where speedtest is the benchmark, rather than iPerf or, you know, UniFi's built-in speed tester. (Slight tangent: why do most new TVs still only have 100Mbps ports? There are use-cases for speeds above what streaming services provide. Are we just pretending UHD BluRay rips aren't a thing?)
And then there are the nuts who somehow think one, or even a dozen, ~10W access points pushing out non-ionising radiation are a concern. The same people who will happily go out every day without any protection underneath a fusion reaction that pumps out a few kWh/m^2 of radiation daily, a fair portion of which is skin-damaging ultraviolet.
Not sure about other options, but when I was setting up mine I just used the one built into the UniFi controller. Although it seems they've pushed it here now: https://design.ui.com/
  19. Firstly, can we not do this joke? Just don't. Secondly, what's not to be excited about? If you are into portables this is basically what you've been dreaming of for the last decade. Speaking as someone who's more of a portable gamer, this class of product is the way forward, and has been for a while now. And it's great that Valve is not only pushing it to a larger audience but also pricing it this aggressively. This is exactly what people who are into portable devices have been waiting for. I mean, OK, if you're not into portables it's not a product for you, and that's fine. Not everything has to be for you. And other people being excited about a thing that isn't for you does not diminish your enjoyment of other stuff. So just let people enjoy this.
I for one make no secret of being a Nintendo fanboy, mostly because they've been pretty much the only game in town for portables since forever. The Switch was and still is a pretty fantastic piece of hardware for that reason. But the prospect of a half-decent PC in the same form factor? That's a nice future. I'll gladly pay the early adopter tax for that. And because it's literally a PC, I can plug a large HDD full of movies into it and use it as a HTPC when it's docked, or grab my wireless KB/trackpad and use it as a kind of super-portable laptop. It's a nice concept.
  20. I've always been more of a portable gamer and I've always finished more games on GB/DS/3DS/Switch than on PC or home consoles. Just how I tend to like playing games. So for me this is a no-brainer, especially when I have a Steam Library containing 100s of unplayed games. And with a spec along those lines it could serve double duty as a half decent HTPC when combined with a dock.
  21. Adding an external GPU isn't really practical or cost effective. The dock as it currently stands isn't much more than a USB dongle with a DisplayPort to HDMI adapter, and I'm not even sure the PCIe lanes are there to make an eGPU possible. Even more to the point, the Switch isn't just lacking a decent GPU; there's also the issue of CPU performance and memory bandwidth.
It's also worth noting that there was a hardware revision in 2019 that improved the capabilities of the SoC. The current Switch SKUs, including the new OLED SKU, are on paper 30% faster than the original Switch. The problem is Nintendo underclocked the original Switch by about 30% when docked and by an additional 50% in portable mode, and the original Switch's clocks have carried over to every SKU released since. So on paper there is about a 60% gap between what the OLED & 2019 Switch SKUs can do when docked and what Nintendo actually allows. Nintendo could unlock that performance at pretty much any time if they wanted to.
Anyway, I don't think we'll get an "upgraded dock" to enable a performance boost. What we might get is an upgraded dock that supports HDMI 2.0; it's not entirely clear whether the OLED dock already supports HDMI 2.0, at least on paper. It may do. What I expect we'll get is a new SKU next year when the silicon shortage eases off a bit. Nvidia has a new Tegra SoC they're planning to have available next year that's Ampere based. The current Switch SoC is Maxwell based, so that should provide a bit of a bump.
  22. If nothing else, I now know which bits of useless tech trivia I'm pretty decent at and where I fall short. Apparently I'm good at archaic AV terminology and programming/Linux, but suck at Apple history and the random US cities/schools tech companies were founded in. Sounds about right.
  23. True, but I'd still file this under "lies we tell ourselves when we upgrade". Speaking for myself, I build a new machine every 5 years or so, and I definitely told myself this lie for the last two builds at the very least. In 2014 I was telling myself that "multi-gigabit LAN" or "NVMe storage" were things I might want in the next few years. But other than going from 8 to 16GB of RAM I didn't actually upgrade my 2014 PC at all. Most of the expansion I think I might want a couple of years into the future seems to come built into motherboards by the time I actually upgrade. 2.5Gbps LAN is starting to become common on motherboards, NVMe is pretty much the default for storage, and if you go ITX in a new build odds are you'll have WiFi 6 built in. Any other expansion can be done over USB with pretty much zero downsides. If I'm being honest with myself, the only reason not to go ITX is that mATX boards don't include WiFi, which I don't need, and are therefore cheaper. That's it.
  24. To be fair, the point of RFID card blockers is security, not "blocking the 5Gs", and they do a semi-reasonable job of it. Although the best protection against people reading your cards is probably just to stack them on top of each other, which is basically what you do in any wallet anyway. LPL did a good video on RFID lock hacking.
  25. Definitely what I was commenting on, which should've been pretty damn obvious. But hey, case in point I guess. Seems pretty fitting that someone takes issue after skim-reading my post, in which I pointed out that the title did a good job of showing that people take issue after a skim-read of a post. I've never really had much of an issue with the clickbaity LTT video titles/thumbnails because I'm subscribed and enjoy the content itself. That kind of stuff only impacts people who aren't subscribed.
On the second point, I don't think they really need to do that kind of A/B testing with a specific video. I'm sure LTT keeps track of all the video analytics as well as what the content was, what the title was, what kind of thumbnail they used, who wrote the scripts, who the hosts were, etc. They have thousands of videos on their channel, so they'd have more of an idea of what works and doesn't work for tech videos on the platform than probably anyone. But if you do want an A/B test: their Mac Mini video had a very positive and far less clickbaity title of "Apple Destroyed my Expectations" and has 2.2 million views. This video has done 1.4 million views so far. Certainly far from the worst performing video in their last month or so, but it remains to be seen whether it reaches 2.2 million.