Everything posted by porina

  1. What's the market? It isn't Nvidia's core business. Google have their own chip. What does Twitch use? Also, I wouldn't use the illustration to estimate sizes, even if it's somewhat indicative. Annotated die shots like the one found halfway through the link below are better, but even that doesn't split things down to fine enough detail. https://locuza.substack.com/p/nvidias-ada-lineup-configurations I was using their recommended upload bitrates earlier as a proxy to estimate the potential storage impact of multiple formats.
  2. Another "when" method I've used in the past is to basically firewall off MS servers. Windows can't update if it can't reach MS servers. MS publishes a list somewhere of the servers/ports needed for WU to work. You can simply use that list in reverse to block only MS servers. Note this also blocks the MS Store, in case that matters. Edit: I don't know if this is the list I used before, but it is a current reference: https://learn.microsoft.com/en-us/windows-server/administration/windows-server-update-services/deploy/2-configure-wsus#211-configure-your-firewall-to-allow-your-first-wsus-server-to-connect-to-microsoft-domains-on-the-internet
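
     A minimal sketch of the idea in Python: resolve WU hostnames and print netsh block rules to run from an admin prompt. The hostnames below are illustrative placeholders; take the authoritative list from the MS doc above, and note the resolved IPs change over time, so this needs re-running.

     ```python
     import socket

     # Hypothetical subset -- replace with the full list from the MS doc above.
     WU_HOSTS = [
         "windowsupdate.microsoft.com",
         "update.microsoft.com",
         "download.windowsupdate.com",
     ]

     for host in WU_HOSTS:
         try:
             _, _, ips = socket.gethostbyname_ex(host)
         except socket.gaierror:
             continue  # hostname didn't resolve; skip it
         for ip in ips:
             # One outbound block rule per resolved address.
             print(f'netsh advfirewall firewall add rule '
                   f'name="Block WU {host}" dir=out action=block remoteip={ip}')
     ```
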
  3. I've seen data corruption in lower-end branded SSDs before and wonder if this is the same. As a precaution I'd back up anything important. If Kingston offer SSD software, use that to check the SSD initially. Also, what does SMART report? Anything of interest? I've seen SSDs have problems even when not reporting any. Then I'd do a full surface read of the SSD. It may well crash again on that same part. Errors in general are not unexpected as they will eventually happen, but you'd hope the drive would fail more gracefully than this. It should detect and map out bad areas with spares on the next write. The problem might be that the corruption is so bad it can't silently do this. For redundant storage use cases, it is better to have a disk fail outright than to give bad data, and that might be what it's doing here.
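
     For the full surface read, a rough sketch assuming Python on Windows with admin rights. \\.\PhysicalDrive1 is a placeholder, so triple-check the drive number before pointing anything at a raw device:

     ```python
     CHUNK = 1024 * 1024  # 1 MiB; a multiple of the sector size

     def surface_read(device=r"\\.\PhysicalDrive1"):
         """Read the whole device sequentially, recording unreadable offsets."""
         bad = []
         offset = 0
         with open(device, "rb", buffering=0) as disk:
             while True:
                 try:
                     data = disk.read(CHUNK)
                     if not data:
                         break  # clean end of device
                 except OSError:
                     bad.append(offset)         # unreadable region: log it,
                     disk.seek(offset + CHUNK)  # skip past, keep going
                 offset += CHUNK
         return bad

     print(surface_read())
     ```
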
  4. I didn't think it would, but still worth a look. That means you can buy pretty much any standard ram and it'll work. I still recommend seeking out 2R modules as they can perform better than 1R ones where all else is equal. Laptop BIOSes do tend to be very limited/basic compared to desktop enthusiast mobos with overclocking options. You don't get the fancier options unless you have one of the rare "overclocking" enabled laptop CPUs. Edit: if you can't find 2R ram, or not at a good price, don't worry about it. See it as a "nice to have" and not a necessity.
  5. I missed this post earlier. If you still upload to YouTube you'll see this on the video detail page: these light up as they are done encoding. SD is the first block to light up, and the others follow later. When SD is done I get only 360p offered. When encoding is complete I get all the options from 144p to 4k. It may be possible, where it is cheap to do so, for them to transcode lower resolutions in real time. 360p to 144p is relatively low cost; you're not going to want to do 4k to 1440p in real time. But given that after the initial SD encoding is done I do not get a 144p option, that doesn't seem to be the case. You also got me interested in codec choice/offerings. I upload with YouTube's recommended H.264 settings. On my system I am always offered playback in VP9, all the way from 144p to 4k. I checked in case they might alter it for different resolutions. This is with Chrome, an Ampere GPU, and Win11; a different browser, GPU or OS may behave differently. We can look at YouTube's recommended upload bitrates for 60 fps SDR content. 4k is 53-68 Mbps, and the sum of 1440p, 1080p, 720p, 480p and 360p is 49 Mbps. They don't list 240p or 144p, but those are not going to be significant. Let's simplify and say that if you upload 4k content, the storage requirements for all 8 offered resolutions will be less than 2x that of the 4k stream alone. I know what you download may differ, but it should scale similarly.
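
     To make the "less than 2x" arithmetic explicit, a quick check using only the bitrate figures above:

     ```python
     # YouTube's recommended 60 fps SDR upload bitrates from above (Mbps).
     uhd_4k = 68      # top of the 53-68 Mbps range for 4k
     lower_sum = 49   # 1440p + 1080p + 720p + 480p + 360p combined

     total = uhd_4k + lower_sum
     print(total / uhd_4k)  # ~1.72, so all tiers together stay under 2x the 4k stream
     ```
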
  6. Go into the BIOS and see if it supports XMP. If not, you'll be limited to JEDEC 2133 speeds. In that case, pretty much any JEDEC-standard ram (anything without XMP) should work. You can buy higher speeds as they'll also run fine at the lower speeds. I'd suggest Kingston if they sell directly in your region. You can select 16GB modules that are 2R, which generally perform better than 1R ones. https://www.kingston.com/unitedkingdom/en/memory/client/ddr4-3200mts-non_ecc-unbuffered-sodimm Other companies like Crucial don't break out 2R options, so what you get will more likely be 1R.
  7. How it could work might be better compared to JPEG encoding, as that operates in the frequency domain. You separate out the different spatial frequency content and store each band separately. The coarsest band would be the lowest resolution; you can layer on the higher frequency information to build up the higher resolution images. I'm going to guess a problem with this approach is in the temporal domain: things like motion data and predictive frames. Do they still work well when broken apart in this way? At best I can imagine there will be an encoding overhead.
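
     A minimal sketch of that layering idea on a 1D signal, illustrative only and not how any real codec stores data: the base layer is a 2x downsample, and an enhancement layer holds the residual needed to rebuild full resolution.

     ```python
     # Base layer = 2x downsample; enhancement layer = residual detail.
     signal = [3, 5, 8, 6, 4, 9, 7, 2]

     base = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
     upsampled = [v for v in base for _ in range(2)]      # nearest-neighbour upscale
     detail = [s - u for s, u in zip(signal, upsampled)]  # what upscaling loses

     rebuilt = [u + d for u, d in zip(upsampled, detail)]
     assert rebuilt == signal  # base + enhancement restores the original exactly
     print(base)               # serve this alone as the "low resolution" stream
     ```
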
  8. For me it's usually the opposite problem: it often defaults to 480p, even on my 10" tablet. 480p is just about ok if you're viewing landscape content on a phone in portrait orientation, but rotate it and go full screen and it could still do with 720p. YouTube has been somewhat aggressive at serving lower resolutions, presumably to save bandwidth. I use an extension on desktop to force it to 1080p.
  9. My BX500 1TB for comparison. Your writes do seem unusually low. For starters I'd try a forced TRIM and see if that helps: run the Windows "Defragment and Optimise Drives" tool on the SSD, wait a bit, then run the bench again and see if it changes. Make sure there is nothing else going on at the same time as the benchmark. The BX series are/were DRAM-less, so writes can suffer if you write a lot and don't give the drive some time to recover. Otherwise, if appropriate, install/update chipset or storage drivers. Use Crucial's software to check if there is a firmware update for it.
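
     If you'd rather script the retrim than use the GUI tool, a small sketch: "defrag <volume> /L" is Windows' retrim switch, and it needs an elevated prompt. D: here is a placeholder for whatever drive letter the benchmarked SSD has.

     ```python
     import subprocess

     # Ask Windows to retrim the SSD (equivalent to the Optimise Drives tool).
     subprocess.run(["defrag", "D:", "/L"], check=True)
     ```
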
  10. I see the higher MSI Claw model has dropped to £600, making the lower model more expensive! Is this the cheapest way to get Meteor Lake right now? While the Claw has some specs I like, the Steam Deck OLED is probably still the best choice if a mobile gaming device is the main goal, and that's also cheaper.

    1. Poinkachu

       Saw the GamersNexus review of it and it looked like the Claw has some issues & quirks here and there.

  11. In reply to "Having not owned an android tablet for five yea…":

      Not sure I've seen a recent tablet that doesn't use end-firing sound, but the thing to watch out for is the speaker positioning. My old iPad mini only had speakers on one short edge, so in landscape mode sound was one-sided. After that I made sure speakers were distributed suitably for landscape operation. The BT volume steps could be a host/device interaction. I used to support BT devices where I used to work and we had all sorts of problems depending on how volume gets mapped. Dunno if there is any easy solution for this.
  12. In reply to "I massively want to be on the deck of USS Misso…":

      Any particular reason for that ship? My interest in WW2-era warships started with playing Azur Lane. I've only visited HMS Belfast so far, but will try for more when the chance presents itself.
  13. In reply to "So I kinda live in the middle of no where but I…":

      Lots of bigger companies work their warehouses late into the night. If you catch the right courier pickup within their distribution network it can happen. I've had several <12h non-local deliveries from similar circumstances. The best delivery speed (distance/time) I've had was ordering late on a Friday night, from the UK, with a west coast USA company. It arrived Monday 9am. Must have caught just the right flights.
  14. I'm not familiar with that offering, but if it is based on the 11900H models as claimed, it should perform well. It also isn't anywhere near as cheap as the CPUs listed in the OP plus a cheap X99 mobo; the cheapest one I've managed to find on Aliexpress is still over 2x the cost. So it becomes a value judgement. I think there is a lot of FUD. I've actually bought and used tens of various cheap PSUs in the past, either directly or indirectly, e.g. ones that came with a cheap case or in used systems of unknown origin. The failure mode is instability. I've never seen one explode. My house did not burn down. Maybe if you look hard enough you could find someone who had that happen, but it's certainly not common.
  15. Used VLC for a long time but it isn't satisfying due to its limitations. I'd like some or all of the following features in a video player. I'm going to guess getting all of them isn't likely, but let's see what we can get:

      1. Free
      2. Can apply LUTs (.cube)
      3. Can play multiple audio tracks simultaneously, ideally settable as the default
      4. Navigation similar to YouTube: click on the screen to pause/resume, arrow keys to skip a bit forwards/back, frame-by-frame possible in both directions

      2 and 3 are the main features I'm looking for. I don't mind using multiple players if needed. 4 is a nice to have. So, any suggestions?
  16. Probably YT will do the work for now. There is no change to the recommended upload settings for creators. Still H.264. https://support.google.com/youtube/answer/1722171
  17. I agree, but that doesn't matter. AV1 being default is a long way from AV1 being mandatory. What I wrote was expanding on the post I replied to, giving a bit more context as to which systems might support AV1 hardware decode: probably the vast majority of systems from the last two years, and a good proportion going back 3.5 years. There will always be older systems that don't support the latest thing. That doesn't negate the need to add new things, otherwise nothing would ever change. We had similar arguments with ray tracing in games, and we've already passed the 50% RT-capable mark on the Steam Hardware Survey. Ballpark 5 years from nothing to a majority of Steam gamers. We're probably going to have similar arguments about AI. It takes time, but it has to start somewhere. Edit: we also have the impending doom of Windows 10. Once that happens it could jump things up a bit.
  18. Nvidia Ampere:
      • Desktop: Sep 2020
      • Laptop: Jan 2021

      AMD RDNA2:
      • Desktop dGPU: Nov 2020
      • Laptop dGPU: Mar 2021
      • APU: Jan 2022

      Intel:
      • Desktop iGPU: Mar 2021 (11th gen Rocket Lake)
      • Laptop iGPU: Sep 2020 (11th gen Tiger Lake)
      • Desktop dGPU: Mar 2022 (Arc) - I'm ignoring the OEM-only card they made just before Arc.

      Basically, the earliest hardware AV1 decode support was late 2020, about 3.5 years ago. Amongst the early implementations are Intel iGPUs, which will make up a good chunk of the PCs out there. If you were to expand it to all product categories, we're pretty much covered by new releases for over 2 years, with a good proportion going back further.
  19. Where I am, a new 7700 XT is >30% more expensive than a store-offered used 3070 with warranty. Ebay is cheaper, and private sales go even lower. IMO a 3070 would still be sufficient for a great experience up to 1440p for the foreseeable future. Just don't expect the highest settings in newer, more demanding games.
  20. For gaming I'd go for 6 or 8 cores with as much clock as possible. Of the models listed, the 1650 v4 seems best: 6 cores is sufficient, it clocks best, and it is pretty cheap. The Chinese X99 boards are pretty cheap too. As for PSUs, I think this is something people worry far too much about. I used to buy the <$10 "550W" gold PSUs that all die in just over a year of 24/7 running. The system goes unstable, you drop in another one and away you go. Many higher-end games have scaled to benefit from 8c16t for years. In my own testing I've seen Watch Dogs: Legion show higher performance at 12 cores than 8 cores (HT on). We are in diminishing returns there, and since the OP is looking for a budget build, high-end gaming probably isn't a major consideration.
  21. I guess Final Fantasy is the biggest name I'm dropping. The move to action-based combat kills it for me; I have exactly zero interest in action-combat style games. The last one I bought was XV, and I don't think I got past the 1st chapter. I actually spent more time in the AC crossover they did, and I don't like AC much either. I didn't bother at all with XVI or VIIR, beyond the demo just to make sure I still dislike it. On a parallel note, I got Gran Turismo 7 when I got a PS5. The two have gone together since practically forever, and to me it is the best racing game series. But it felt old. Graphics were better than ever, but it was essentially the same game. I just wasn't interested in doing the same thing yet again. I forget exactly which older version it was, but they added a B-spec side mode where you're a manager instead of a driver. I kinda liked that and hope they flesh it out into its own game. Probably won't though. Oh, speaking of doing the same thing over and over again: Pokemon. I was a very late starter, my first was HeartGold on DS. I played the main games to... the Hawaii one? I think it was the last one on the 3DS. Again, it felt like basically every game was the same. You go out, collect pokemon, battle gyms, and eventually the Elite 4. Places might be different, pokemon might be different, and graphics got improved, but you're fundamentally doing the same thing. I wanted a high-res version of the game at the time; we got the Switch. Nintendo are almost dead to me now. I still play Pokemon Go, but that's more a side thing.
  22. What's probably happening when you see high temps but not high total power is that all the power is going into a few cores, and those are getting hot. I don't know of a good solution to this without adjusting turbo behaviour directly.
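
     One hypothetical way to confirm that diagnosis (my own sketch, assuming the third-party psutil package is installed) is to watch per-core load and see whether only a few cores are pinned while the rest idle:

     ```python
     import psutil

     # Sample per-core utilisation a few times; a couple of cores sitting
     # near 100% while the rest idle matches the "power piling into a few
     # cores" scenario described above.
     for _ in range(5):
         per_core = psutil.cpu_percent(interval=1, percpu=True)
         hot = [i for i, load in enumerate(per_core) if load > 80]
         print(per_core, "-> heavily loaded cores:", hot)
     ```
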
  23. If your complaint is bad subpixel rendering, the proper fix is with the subpixel rendering, not the subpixel layout. I just found this thread(?) discussing it and asking MS for a solution: https://github.com/microsoft/PowerToys/issues/25595 Edit: forgot to quote/reply to it, but I use the OLED TV primarily as a PC display.
  24. I don't claim to be an expert in display technology; there will likely be considerations and tradeoffs that aren't apparent. A polariser at best will reduce brightness levels, and may introduce more unwanted side effects. I've only owned two OLED devices and can't say I noticed any problem with text on either: one is the display on a Pixel 3a, the other a LG OLED TV. Is the apparent pixel density on a monitor low enough for it to be a problem? I've used the TV sitting close enough to give a similar FOV to using a monitor, and can't say I had any complaint about text quality. Probably something is hitting a limit so they don't have enough control left over to remain accurate. A bit like audio (loudness) compression. Pure guessing: heat density?
  25. I think the first link below might be the software I was thinking of. The second link I don't remember using, but I've used their other software in the distant past. https://learn.microsoft.com/en-us/sysinternals/downloads/procmon https://www.nirsoft.net/utils/registry_changes_view.html