
rattacko123

Member
  • Posts

    1,281
  • Joined

  • Last visited

Awards

This user doesn't have any awards

5 Followers

Contact Methods

  • Discord
    pls don't add me
  • Steam
    you know the name
  • Origin
    rattacko123
  • Battle.net
    rattacko123#something
  • PlayStation Network
    never used this
  • Xbox Live
    you mean windows live?
  • Twitch.tv
    rattacko123
  • Twitter
    @rattacko123
  • Heatware
    more like vaporware
  • Website URL

Profile Information

  • Gender
    Male
  • Location
    Tasmania, Australia, The world, space, milkyway, universe...
  • Interests
    Computer related stuff, youtube, anime trash, I need more hobbies, currently looking into graphics design
  • Biography
    Hoi, this is Rattacko.
    I'm probably the only person who actually gets decent speeds on the NBN.
    I currently have no idea what to do in my life, I guess I'll just go with the flow until I find a nice fishing village.
    I've grown bored of playing video games.
  • Occupation
    Student (as of 2018)
  • Member title
    Memes.

System

  • CPU
    Intel Core i7 5820k
  • Motherboard
    Asus X99 Deluxe
  • RAM
    32GB DDR4
  • GPU
    Inno3D GTX 1080
  • Case
    Phanteks Enthoo Luxe
  • Storage
1TB Samsung 850 EVO (OS), WD VelociRaptor 1TB, a bunch of external hard drives
  • PSU
    750W EVGA G2
  • Display(s)
    Dell P2415Q, ASUS 24 inch 1080p display
  • Cooling
    Be Quiet! Dark Rock Pro 3
  • Keyboard
    Ducky One 711 Special Edition
  • Mouse
    Logitech g502
  • Sound
    FiiO K3 w/ Sennheiser HD6XX headphones and Antlion Modmic USB
  • Operating System
Windows 10 Pro
  • Laptop
    Surface Go 128GB
  • PCPartPicker URL

Recent Profile Visitors

4,594 profile views
  1. I think they want to maximise the number of people playing their game to make matchmaking easier and faster. CS:GO (now CS2) and StarCraft II both adopted a free-to-play model quite successfully, with a big rise in player count, so I think they want to replicate that. It sounds like they have a good way of monetizing the game (via campaign missions, commanders, and probably skins), so I'm not too worried about it; hopefully they don't add any atrocious forms of microtransaction such as loot boxes.
  2. This YouTube video by Frost Giant Studios summarises what Stormgate is about nicely
  3. Frost Giant Studios just launched their Kickstarter today for their upcoming game Stormgate. The game has already far surpassed its pledge goal, nearing $1,000,000 as of writing this; they passed the $100,000 goal within 15 minutes of the Kickstarter going live. The Kickstarter runs until February 2nd.

    You may be wondering, what is Stormgate? Stormgate is an upcoming free-to-play RTS from developers of StarCraft II, Warcraft III, and Red Alert 2. It was first announced last year. If you watched or listened to WAN Show around that time, you might have distant memories of Luke being fairly excited about Stormgate’s potential to be a good game in the current landscape of not-so-great RTS games. Stormgate has already started closed beta testing, and it already had enough funding to release the game even before the Kickstarter launched.

    Why is Frost Giant Studios running a Kickstarter? Well, they want to offer extras to players, including a unique Collector’s Edition with a really sick-looking mech statue. They are also selling “Founder’s Packs” with exclusive perks and beta keys. Depending on the tier of Founder’s Pack, digital rewards are included such as “heroes, story chapters, army accent cosmetics, pets, fog of war shader”. They’re also offering stretch goals to add new features to the game, such as a $1.1 million USD goal for expanded bot personalities.

    Quotes from the Kickstarter page:

    In addition to a 1v1 head-to-head mode, Stormgate will have co-op modes, including a 3-player co-op vs. AI mode “played with powerful Heroes with unique abilities”. It will also include a 3v3 mode. Stormgate will also allow players to compete with others, with advanced matchmaking including a competitive 1v1 ladder. Don't be turned away by the free-to-play aspect of the game, because Frost Giant Studios says that Stormgate is never pay-to-win: “Stormgate is free-to-play so that everyone can enjoy the fun of RTS. Support the development of additional optional content that you enjoy, without ever paying to remove nuisances or gain an advantage”.

    My thoughts:

    One thing that really stands out to me about Stormgate is that it is marketed as “The first truly social RTS” on the Steam page: “Campaign missions can be played cooperatively, Stormgate’s open-ended co-op mode supports 3 players, and even 1v1 competitive players can make social connections through the built-in tournament system”. I wonder how they will improve the social experience over other RTS games such as StarCraft II; we'll find out when the game finally releases, I suppose. IMO, the fact that it’ll be on Steam already gives it an advantage; Steam is a lot better than Battle.net at social features.

    Based on the Kickstarter rewards, I think it's pretty safe to say that the campaigns won’t be free-to-play (similar to StarCraft II). This is further evidenced by their FAQ: "Going forward, we plan to sell new story missions in ‘chapter packs’ several times a year." Speaking of Kickstarter rewards, I am surprised to see they are locking a fog of war shader behind the Kickstarter (it’s part of the $60 USD “Ultimate Founder” tier); I am curious as to what exactly this shader is (an alternative look for the fog of war?).

    In their newsletter, Frost Giant Studios notes that “Valve is an important partner for us, and their policy for Early Access games only allows us to sell Founder’s Packs with a limited number of Steam beta keys as part of a crowdfunding campaign.” Does that mean this Kickstarter only has a limited number of beta keys? It looks like each Kickstarter reward has a limited quantity (most of the tiers have a maximum quantity of 10,000). Valve’s limitations on Steam keys, while good at limiting scams from sites like G2A, may be impacting developers here.

    Overall, I think Stormgate sounds very promising. It has a really impressive team behind it; their audio director has even worked on Unreal Tournament and Deus Ex. Given that they already have enough money to finish the game and eventually launch it, and that the closed beta has already started, this is likely a safe Kickstarter to back.

    Source: Stormgate newsletter (can’t find a URL to this, sorry)
    https://www.kickstarter.com/projects/stormgate/stormgate/description
    https://store.steampowered.com/app/2012510/Stormgate/
  4. Does anyone know what encoding settings RARBG used for their x265 videos? They struck the perfect balance between quality and file size for me (and the denoising/degraining was great). Now that RARBG no longer exists, I see a need to compress my own Blu-rays.
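    For what it's worth, RARBG never published their exact recipe as far as I know, so the following is only a hedged starting point for rolling your own Blu-ray re-encodes: libx265 at a moderate CRF plus ffmpeg's hqdn3d denoiser. The file names, CRF, preset, and denoise strengths below are assumptions to tune by eye, not RARBG's actual settings.

    ```python
    import subprocess

    src = "movie_remux.mkv"   # hypothetical input (a Blu-ray remux)
    out = "movie_x265.mkv"

    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", "hqdn3d=1.5:1.5:6:6",                         # light spatial + temporal denoise (assumed strengths)
        "-c:v", "libx265", "-preset", "slow", "-crf", "22",  # quality/size trade-off; lower CRF = bigger file, better quality
        "-c:a", "copy", "-c:s", "copy",                      # pass audio and subtitles through untouched
        out,
    ], check=True)
    ```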
  5. LG OLED TVs are nice. They have pretty good highlights and very strong colour reproduction - a very clear image. But they have a quality control problem. On my particular TV (LG CX 55"), the black reproduction is so bad it's worse than an LCD display - I used to have a Samsung LCD TV from 2014 (which has since broken) that had BETTER blacks than my LG OLED does. The gamma curve for the blacks is just completely off on my OLED, and as a result I get heavy black crush and posterization. It's kind of hilarious - OLEDs are supposed to have perfect blacks, yet my OLED has the worst dark colour and grey reproduction I've ever seen on any display unless I artificially raise the black level. I've made a bunch of posts about it on this website - just view my posting history and you'll find them not too far down.

    I think for my next TV I would most likely spend the extra cash on a Sony, since they have better quality control; being the company that buys panels from the panel manufacturers (LG, Samsung, etc.), they get the "good" panels. Plus, Sony offers DTS support, which is a rarity nowadays (if you're looking this up yourself, note that Sony only seems to list the full specs on their US website; it may not be listed in your region, but the feature is still there). While the smart features may be lacking on a Sony, I don't really care that much because I have an Apple TV for all the smart stuff I need, and if I really needed to I could always get an Nvidia Shield or something like that. IMO it's worth it: just as I spend a lot of money on a PC and a nice ergonomic desk setup, I can justify spending a lot of money on a TV because it's something I use a lot in my spare time. Whenever I have family around (which, right now, is always) or a friend visiting, the TV is the go-to device. Aside from renovating the kitchen, buying gym equipment, or something like that, I can't really think of a higher-yield investment for improving my quality of life, at least as far as material possessions go. Idk, maybe I should just spend more time outside instead...

  6. Rant:

    iOS 15 is the buggiest OS yet. Background refresh of apps is completely broken. I’ve had apps not resume and shortcuts break in the background across my iPad, iPhone, AND Watch. It’s not just a RAM limitation issue. I can’t believe that a company like Apple could screw up something so basic.

    Well, I can believe it, since they made the unforgivable Podcasts app “sync” feature which doesn’t work AT ALL (it’s ridiculous that they don’t let you delete the app from the Watch; I hate the app so much I want to delete it [edit: correct me if I’m wrong, but I think deleting Podcasts on the phone also removes it from the Watch?]), and countless macOS updates that brick computers.

    1. rattacko123

      iOS 17 and watchOS 10 seem to have improved the above issue (finally). But iOS 17 seems buggy in other ways. I've had many different bugs occur once (the most annoying being the Camera app not working), but I haven't been able to replicate them.

  7. Looking forward to the day when iPhones take pictures at more than 12 MP. 4032 x 3024 is not crispy enough lol. All the news and rumours of high-resolution cameras in upcoming iPhones sound exciting at first, but then the bombshell drops that it's going to use pixel binning and the final image is probably not going to be higher resolution. That makes me sad (sad face).

  8. I’ve been experimenting with my LG TV (I’m using the LG CX, which is the infamous downgrade from the C9 with removed HDMI 2.1 features and no DTS audio support, but hey, gotta be happy with what I’ve got rather than ponder what I’ve missed) and I’ve found the best settings for HDR to be as follows:

    Cinema Preset:

    Peak Brightness OFF in HDR, Low in Dolby Vision. (In my opinion, this setting is overrated. I see people often lowering their contrast because Peak Brightness makes everything brighter, which is a big no-no. In Cinema mode, I think you should first be comfortable with an OLED Backlight of 100 and Contrast of 100 BEFORE you raise the Peak Brightness setting. Lowering those settings just to enable Peak Brightness is not a good idea, as it only produces a beneficial effect in 1% of scenes and makes all other scenes look worse because contrast has to be lowered to compensate. By all means go for Peak Brightness High with backlight and contrast at 100, but if you find it's too bright, LOWER the Peak Brightness FIRST before lowering other settings. One thing to note is that Dynamic Tone Mapping in HDR increases the average brightness of images relative to Dolby Vision, so DV benefits more from a higher Peak Brightness. It also depends somewhat on user preference; some people may prefer the look of lowering the contrast while raising the Peak Brightness, so see what you like. I could be wrong about this - maybe there is some comparison that can show the differences between keeping contrast high vs. lowering it for Peak Brightness. All I know is I'd rather maximise my contrast first, as a high Peak Brightness affects the brightness of subtitles (which can be fixed on a computer or on Apple TV by changing the subtitle colour, but cannot be fixed in LG's built-in video player app, where subtitles cannot be set any darker than "grey"). Not only that, but there is increased brightness variation between scenes, which looks far too exaggerated at times. I prefer the more natural look of a reduced Peak Brightness, and I don't like being blinded too much.)

    OLED Backlight 100 (lower this to your taste. In standard mode, you may want to lower this further; even further still if you really want to enable the peak brightness setting)

    Contrast 100 (This setting adjusts average scene brightness. It's best to keep it at 100 in Cinema preset. If it's too bright, lower the backlight first before lowering contrast. In other presets you may want to double check if there's any noticeable crushing in the blacks or the whites first, and then find some balance between contrast and backlight.)

    Brightness 56 (my panel has really bad banding, so I've had to elevate this. With my face next to the screen, I raised it until the black bars started looking grainy/noisy. This gets rid of most of the banding, with only a very small amount noticeable in some scenes if you look for it. It's also interesting to note that in HDR the black bars themselves become more elevated, whereas in SDR the black bars don't seem to be elevated; instead the source content becomes elevated. It is IMPORTANT to note that this setting will vary depending on YOUR OWN TV, how colour- and gamma-accurate it is, and what Backlight and Contrast settings you use. If you increase the backlight and contrast you may want to readjust the brightness and lower it; if you lower the backlight and contrast you may wish to increase the brightness to compensate. See my notes below under the SDR settings for how to adjust brightness; the method is exactly the same.)

    White point set to "medium". This is equivalent to a white point of 0 on standard mode.
    Real Cinema enabled.

    ALL image enhancements turned off, except for Dynamic Tone Mapping and Real Cinema. And I mean ALL; sharpness turned to zero, and dynamic contrast turned off, etc etc.

     

    Comparing Cinema mode to Standard mode, I'm confused. Cinema mode has WAYYY better blacks. Not only that, but the scenes are overall DARKER despite the dark regions being brighter? And for some reason the flickering I had in LOTR is practically gone even at higher brightness settings! 

     

    SDR:

    Picture Mode: Standard

    OLED Backlight 32

    Contrast 80

    Brightness 61 (When it comes to raised black levels or banding, where scenes vary, there are two solutions: either crush the blacks to eliminate raised black levels entirely, or increase the black levels so that the variation in black levels is less noticeable scene to scene. I decided to go with the latter, which also has the added benefit of reducing banding/posterization and reducing the effect of ASBL if you haven’t disabled it. I’m thinking that even if you have flickering, it is worthwhile to just raise the brightness to a good level - the flickering may be annoying, but you just have to deal with the poor source footage unfortunately, and raising the blacks makes the flickering less jarring. Unfortunately this is the solution I have had to take; but I don’t mind it, because without the annoying banding I used to have, the LG OLED really shines in bringing out the detail in dark parts of the image. It doesn’t look as inky as I would like, but then again not very many HDR movies (particularly the ones shot on film) have black levels low enough to achieve a consistently inky appearance anyway, so I'm not losing a lot. It still looks better than my LCD computer screen anyway, haha. How do you figure out what to set this to on your TV? Well, if you see in any part of the image a grainy-looking sharp gradient from one shade straight to black, or glaringly obvious compression artifacts or macro-blocking within a black part of the image, then you should consider raising the brightness and seeing how it alters the image. For optimum results, get to the point where raising the brightness increases the black level but makes no difference to the banding or compression artifacts; at that point you should not raise the brightness further and should probably lower it by 1 point (in HDR) or 2-3 points (in SDR). It can be rather difficult, as it depends on the source; it’s really hard to find a good dark scene to compare. In Dune in SDR, on two scenes (the tent hallucination scene and the scene where Paul fights someone at the end), a brightness of 62 was sufficient; in The Beast of War a brightness of 63 was enough; but then I looked at the opening titles of another movie (the very first few frames) and it turned out a brightness of 64 worked well to remove banding. However, this seems too bright, so I found that a brightness of 61 struck a nice balance - any higher than 61 resulted in a noticeable jump in black levels, and a brightness of 60 made banding, compression artifacts, and/or black crush a little obvious in a few scenes. Also, a brightness of 61 in SDR roughly matches a brightness of 56 in HDR with the backlight and contrast settings that I use.)

     

    Just a note about dynamic tone mapping in HDR:

    There is a bit of controversy about this setting and whether to enable or disable it. It's a matter of opinion, but I'll put what I think makes the most sense here. HDR movies are mastered at different levels of brightness; some are mastered with a peak brightness of 1,000 nits, whilst others may be mastered with a much larger dynamic range and a peak brightness of 10,000 nits. As a result, if you don't have Dynamic Tone Mapping enabled, the TV tries to map the whole dynamic range of the movie onto the TV, which can result in significant brightness variation between movies depending on what brightness they were mastered at. From what I've read, Dynamic Tone Mapping aims to prevent this problem by essentially reducing or adjusting the dynamic range so that it fits the TV, which should, in theory, give all HDR movies a consistent brightness and a more consistent viewing experience, without reducing quality since the TV cannot replicate the full brightness of the source content anyway. So you should enable DTM... right? Well, if LG implemented it perfectly, 100% yes. But, unfortunately, DTM on LG TVs seems to exaggerate the brightness of source content, resulting in a brighter image than "should be" the case. I think it's still better to enable, as it makes the brightness more consistent between films; on top of that, I don't need to enable the Peak Brightness setting because the brightness is already high enough, which means I don't have to blind myself with bright subtitles. Also, with DTM enabled, the brightness feels closer to the Dolby Vision equivalent than with it turned off.
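    To make the mastering-peak point concrete, here is a toy Python sketch (my own illustration, definitely not LG's actual processing; the display peak and knee values are made up). It shows how a static tone map driven purely by the mastering metadata renders the same highlight at different on-screen brightness depending on whether the film was mastered at 1,000 or 4,000 nits - which is the between-movie variation DTM tries to smooth out:

    ```python
    def static_tonemap(nits, mastering_peak, display_peak=750.0, knee=0.75):
        """Toy static tone map: pass everything below a knee through unchanged,
        then linearly compress the rest so mastering_peak lands at display_peak.
        Illustrative only - knee and display_peak are made-up values."""
        knee_nits = knee * display_peak
        if nits <= knee_nits:
            return nits
        t = (nits - knee_nits) / (mastering_peak - knee_nits)
        return knee_nits + (display_peak - knee_nits) * t

    # The same 800-nit highlight ends up at different on-screen brightness
    # depending only on the container's mastering peak:
    print(static_tonemap(800, mastering_peak=1000))   # ~664 nits
    print(static_tonemap(800, mastering_peak=4000))   # ~575 nits
    ```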

     

    Note to self: I made some more notes in my 2021 folder on OneDrive.

     

    UPDATE:

    After a while with the Standard settings, I was wondering why everything looked kind of dark, and I figured out why. The reason Cinema mode looks better than Standard is that the gamma is more accurate, with a default gamma of 2.2 (which cannot be changed in HDR). I just made a comparison in SDR: the default "Standard" gamma is "High1", which seems very similar to a gamma of 2.4 on the Cinema preset, while a gamma of "Medium" on Standard is similar to a gamma of 2.2 on Cinema. So it seems that, for HDR, it makes far more sense to choose Cinema mode because the gamma is more accurate (I presume it's the same story as with SDR); although it is strange that you cannot change the gamma manually in HDR, because there definitely seem to be some gamma differences between Standard and Cinema. In SDR you can adjust the gamma in either mode, so it doesn't matter as much. Actually, it does seem to matter, because on Standard I get this strange concentric circular banding which I don't see on the other mode; so Cinema mode just makes more sense. Anyway, for the sake of putting everything on an even playing field, I made a Cinema setting for SDR as well, as follows:

    Cinema (user)

    OLED Light 32

    Contrast 85

    Brightness 60 (this seems to be about the same as a brightness of 61 on standard with high1 gamma.)

    sharpness 0

    colour 50

    Colour Gamut Wide (note that this is different to the setting I use for HDR)
    Gamma 2.2

    Colour temperature Medium (I don't know why the default is higher than this, because text looks YELLOW if it's set higher than Medium, which I believe is not accurate despite how much film purists may like it)

    Black Level Low (on an OLED this seems like how it should be)

    TruMotion Off

    All other image processing off.

    If you can't be bothered to change any settings on your TV, I would suggest, at a minimum, doing the following: turn OFF power saving mode on the TV, in SDR change the gamma from High1 to Medium, and in HDR change from Standard mode to Cinema mode, because Standard mode in HDR is garbage. Note that these settings may differ depending on what TV you have, but this seems to hold true for the LG CX model which I have.
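    As a rough worked example of the High1 vs. Medium gamma point above (treating the presets as plain power-law gammas of roughly 2.4 and 2.2, which is my own simplification):

    ```python
    # Light output for a given video signal level under a pure power-law gamma,
    # with signal and output both normalised to 0..1.
    def display_output(signal: float, gamma: float) -> float:
        return signal ** gamma

    for level in (0.05, 0.10, 0.20):
        print(f"signal {level:.2f}: gamma 2.2 -> {display_output(level, 2.2):.4f}, "
              f"gamma 2.4 -> {display_output(level, 2.4):.4f}")

    # A 10% signal comes out at ~0.63% of peak at gamma 2.2 but only ~0.40% at 2.4,
    # i.e. near-black detail renders noticeably darker at the higher gamma.
    ```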

     

    2. rattacko123

      https://arstechnica.com/civis//viewtopic.php?f=6&t=1476320 
      this is a pretty good thread. I disagree with the idea of setting different picture modes for different content; I prefer one setting to rule them all. That’s what I generally use, but I also have another setting with default settings which is more of a “backup” just in case. I know that for Apple TV I could probably make another setting with Peak Brightness enabled, as subtitles on Apple TV are far less bright so I'd find it more bearable to raise the Peak Brightness; but beyond that I am perfectly happy with my current settings - the brightness has been raised enough that it's suitable for daytime viewing, but the Peak Brightness isn't so high that it's blinding at night.
       

      Noted: on the rtings link shared in that thread there is a discussion about which picture modes to use. IMO, aside from Cinema mode, the best ones to use are the Standard and Game presets, as they both have good adjustability, with Filmmaker Mode also being a good option (not available for Dolby Vision content, however). Game might be off-putting for some as it disables Real Cinema. Note that they are MUCH brighter than the Cinema preset, so you may need to reduce the backlight, whereas the Cinema preset is ideal at a backlight of 100.

      I also had a brief look into motion interpolation, and if you really want to use it, apparently the best approach is just to set de-judder to 1 or 2 and leave it at that. I haven't tested it. From past experience with motion interpolation on other TVs and with SVP (SmoothVideo Project), I can tolerate it, but I generally prefer to have it off: with it off, judder only bothers me on rare occasions in panning shots on the OLED (maybe one or two camera shots per film), whereas I notice motion interpolation artifacts far more frequently. It's up to personal preference, really.

    3. rattacko123
    4. rattacko123

      I had a bit of a look at a few forum posts. Based on a few of the hyperlinked threads, it sounds like a brightness of 49 is the most accurate (assuming your panel handles the blacks properly, unlike mine), and while that appears true in dark scenes in HDR compared to SDR, there are some scenes, like in Dune (2021), where a higher brightness looks more correct. It's also interesting that forum posts note a difference between DV and HDR on the CX, but I don't really notice much of a difference? The blacks look the same to me (although, to be fair, I'm comparing two completely different movies), and DV maybe looks a little less bright, I suppose. Actually, scratch that - some of the notes I've written up above are old and I've made quite a few edits. Which brightness level is "correct" depends a LOT on how accurate the colour and gamma are on your TV, and it also depends somewhat on your OLED backlight level and what picture mode you use. On my particular TV, I've had to elevate the black levels noticeably to get a good image; sure, it would look best if the brightness were a few notches lower, but then banding and black crush become too obvious. For sure, a brightness of 50 is too dark on my display. The ultimate solution is to get your display professionally calibrated but, of course, most people aren't going to spend the money and time doing that.

       

      Also, I don’t think any of the settings mentioned in these links need to be changed, but if you have had issues with your TV they may be worth a look. Just putting these down for future reference:

      https://www.avsforum.com/threads/how-to-turn-off-asbl-on-lg-oled-tv.2440714/

      https://www.avsforum.com/threads/2020-lg-oled-cx-gx-owners-thread-faq-posts-1-6-no-price-talk.3119288/page-1189 

      I personally disabled ASBL, TPC, and GSR because I found those things to bother me, but I wouldn't recommend getting a service remote and disabling them unless they really bother you.

  9. iOS 15.1 (and watchOS 8.1) seems to have quite a few bugs with Shortcuts:

    • Wind Down (sleep mode) shortcuts very often do not load
    • Shortcuts Complication on Apple Watch sometimes doesn't load upon restart.
    • Shortcuts often hang for several minutes when opening another application while a shortcut is running on Apple Watch. The workaround is to open the Shortcuts app again after opening the other application. It's a bit annoying when I forget to do this; sometimes I use a shortcut before I go to bed, and the buzzing the Shortcuts app creates interferes with my sleep.
    • Shortcuts widgets in a stack (multiple shortcuts in one) have strange behaviour, where the selected shortcut is replaced with the one at the top of the list. The workaround is to move the selected shortcut to the top of the list in the Shortcuts app.
    • Unrelated to the Shortcuts app, but the Books app often loses progress on imported EPUB files. I think this may be an issue with RAM behaviour on my iPad 6th gen 32GB, where for some reason it cannot retain the data in RAM; I notice the issue doesn't occur when I force close the app, it only seems to happen when I return to the app. iPadOS 15 seems to be at fault here, as I did not have this issue previously on iPadOS 14. The same issue seems to happen in Bluefire Reader as well. I'm seriously considering filing a bug report on this, because there is no easy workaround aside from force closing or bookmarking every time you leave the app. In general, returning to an app after going to the home screen or switching apps results in strange behaviour where the apps I use seem to always refresh themselves, which I believe may be related to this Books app issue. Multitasking seems to be somewhat broken on my iPad in this way. This occurs despite me having 10GB of free storage and restarting the iPad multiple times.

    Also, I've noticed iOS 15 just has a few quirks here and there. Some apps don't behave as they should, and battery drain seems a little higher. Also, the QR code button in the Camera app is too small, and double-tapping to zoom no longer works in the Photos app if you're tapping on text, thanks to iOS's new OCR features (you have to use two fingers to zoom into text instead, which I don't mind). Aside from these issues, I'm actually pretty happy with iOS 15. I would consider it to be pretty much the definitive version of iOS and iPadOS.

     

    watchOS has largely improved as well, despite the issues. For instance, the Always On behaviour for certain apps (such as the Timer app) works much better now (also, just something I noticed: why does Apple Watch support multiple timers while iOS still doesn't??). Speaking of watchOS, I still really dislike the wrist-down behaviour on Apple Watch where, for example, I put my wrist down and the screen turns off right away rather than staying on for 15 seconds as I specified in the settings. Also, watchOS notifications, while serviceable, still aren't as intuitive as on Pebble or Garmin watches...

     

    Anyway, that's my ramble for today. I probably made this too soon, as I just realised there is a new version of iOS and watchOS.

     

    UPDATE: With the latest versions of iOS (15.2) and watchOS (8.3), STILL none of the issues have been fixed; in fact, I have discovered another issue:

    • I could not update labels on alarms in the Alarms app (I tap "Done" but it doesn't save). It works the first time you update a label, but a second time doesn't work - this is when using voice-to-text on Apple Watch, btw.

    This is a case in point for why updating software isn't always a good idea, as sometimes things can get worse (e.g. the Podcasts app getting "updated" with ads and removed features) or change into something completely different from what it used to be. I miss some of the older software from years back, but it's important to accept that the world is changing and that you can't hold on to everything (you can't hold onto much in the material world at all, really).

     

     

  10. Tried out the AirPods 3. They fit my particular ears much better than the AirPods 2 so far; your mileage may vary. FYI, I use the large silicone tips on my IEMs, so that's how big my ears are. I don't really get the hype behind the sound quality of the Pods 3 - it's literally just AirPods but with extra sub-bass lol. I've only used them for a short while, so maybe I haven't had enough time to really appreciate them, but so far they only seem slightly better in sound than the Pods 2. That is not to say that the sound is bad; rather, I actually quite liked the sound of the AirPods already, so this is just more of the same, but it's nothing exceptional.

    If you're wondering about how much bass there is, to put it simply: if you like the bass on the QC35 II, you'll like these. It's somewhere in the middle between regular AirPods and the QC35s in terms of bass.

  11. Not a fan of this change. Turns out Emplemon's prediction was right
  12. The Surface Pro 8 looks fantastic! The addition of Thunderbolt 4 and 120Hz makes this the biggest update to the Pro line in many years IMO. 120Hz is better for watching 24fps films than 60Hz is, and Thunderbolt 4 gives people the option to use an eGPU (and, thus, HDMI 2.1) if they need it. The new Signature Keyboard looks really cool, as it has built-in wireless charging for the Slim Pen 2. As for the pen itself, I have mixed opinions about it due to its shape and the lack of magnetic attachment to the side, but combined with the new keyboard I think it’s a reasonably compelling upgrade to the original pen for most people. So, in summary, not only do we have the most compelling upgrade to the Pro in a while, we also have the first proper-but-not-really upgrade to the original Surface Pen (albeit only for Pro users). I think Microsoft has done well here.

    As for what this means for me, I don’t think the Surface Laptop is a good enough upgrade over my current gaming laptop, so only the tablets look like something I might want to buy. If I do get a new tablet I’d want a small one (for ergonomics, mainly), so I’d probably end up getting the Surface Go. The new iPad with 256GB of storage looks tempting, but I need the features of Windows. In addition, I find that Drawboard PDF is better than most iPad annotation software and PotPlayer is the best touchscreen video player ever, so despite the mediocre specs I am content with getting a Surface Go. The touchscreen on my Surface Go 1 is beginning to fail (the top and bottom of the screen no longer register touch input, so I’ve had to change my aspect ratio to widescreen as a stopgap), so I’m quite tempted to get the Go 3 in the near-ish future. I’m mildly disappointed they haven’t upgraded the GPU much since the Go 1, but the better CPU makes up for that, as CPU speed was the biggest downside of the older generations of the Go.
  13. I just wanted to post a quick update to the Folding@home saga. I decided to skip making the Folding@home video. I thought it was too dry and bland, and didn't really add much beyond what LTT and Techquickie already mentioned in their videos about Folding@home and fighting coronavirus. Maybe I'll revisit this topic in the future - perhaps a "Top 10 Tips for Folding at Home" or something like that?

    So, here are the findings I wanted to mention in the video, and a bit of an update from before: I figured out why my GPU wasn't folding on idle. The problem was that I was using the Folding@home screensaver when, instead, I should have been using a regular screensaver. With the Folding@home screensaver, the GPU does work initially, but once the computer turns the screen off, Folding@home stops working. With a regular screensaver this does not happen. The Folding@home screensaver is buggy; I do not recommend using it.

    Also, if you want to maximise your points, it's probably best to just set and forget and have it run at Full only on idle. Running at Light all the time isn't worthwhile, unless you BOTH use your computer A LOT and ONLY do light tasks on your PC.

    And that's all of my findings from memory. Maybe there are a few things I didn't mention but I can't be bothered to open my video script and sift through it.

    1. rattacko123

      Check out my channel Rattacko, and my 2nd channel Mousetacko for new upcoming videos 😉 https://www.youtube.com/user/rattacko2nd 

  14. I thought I would do a quick post about optimal recording settings for OBS and the like. I kind of want to make a video about this at some point:
    Basically, the most important thing to consider is how readable your screen recording is. Most people will watch on either a phone (~5 inches) or a tablet/laptop (~13 inches), so you have to consider how much video fidelity you can actually squeeze into a 5-inch screen. Most people have screens that are around 1080p resolution, and most probably won't be able to see much more than 720p worth of detail on such a screen.
    So it seems that a "resolution" of 720p-1080p is about optimal for a screen recording. Of course, you can just scale the video in post, but that isn't always easy to do. I would say your goal for the end video should be something easily readable on a laptop screen and mostly readable on a phone screen, depending on the content you want to make, of course. So, how do you achieve this level of readability? There are 3 methods I have found:
    1. Use a 1080p display. If you record at native 1080p, the recording sizes will be smaller and you're less likely to have scaling issues. The only negative is that text won't be as crisp/smooth as at higher resolutions. In my case, I have a secondary 1080p display which is the main display I use for recordings, while my primary 4K monitor is used for non-recording purposes. This is the best resolution for gaming recordings IMO - although newer games have pretty good display scaling, so it's probably not as big of a deal as it once was.

    2. Scale your display so that the approximate fidelity of the screen is around 1080p or lower - if you use a laptop such as a MacBook, your display scaling should already be about optimal. A 13" laptop is actually probably the best platform to make a recording on, since you probably won't need to adjust scaling settings; the scaling will already be set to an appropriate level. If you use a 4K monitor, use 200% scaling or higher; if you use a 5K monitor, use 250% scaling, and so on. If you make 4K YouTube videos, this is the option I would recommend.

    3. Edit your video in post: you can manually select elements from your screen recording and resize them as you see fit, but this takes time. The easiest way around this is to make your screen recording easier to edit by using appropriate screen settings.
    Hopefully this helps. I am not an expert in this stuff and there are probably better settings you can use - find what works for you. But, simply put, if you use a resolution greater than 1080p, then using display scaling will likely improve the legibility of your recordings (see the little arithmetic sketch below).
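    Just to make the scaling arithmetic concrete, here is a tiny Python sketch (my own illustration; the panel resolutions are example values, and the "aim for roughly 1080p of effective fidelity" rule is simply the guideline from above):

    ```python
    def effective_resolution(native_w, native_h, scaling_percent):
        """Logical (UI) resolution after OS display scaling.
        e.g. a 4K panel at 200% scaling behaves like 1920x1080 for text/UI size."""
        factor = scaling_percent / 100
        return round(native_w / factor), round(native_h / factor)

    # Example panels (illustrative values):
    print(effective_resolution(3840, 2160, 200))  # 4K at 200%   -> (1920, 1080)
    print(effective_resolution(5120, 2880, 250))  # 5K at 250%   -> (2048, 1152)
    print(effective_resolution(2560, 1600, 200))  # 13" MacBook  -> (1280, 800)
    ```

    If the effective numbers come out well above 1920x1080, text in the recording will probably be hard to read on a phone or small laptop.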

  15. Running folding@home all the time (even while working) on 'Light' folding power actually works pretty well. For some reason Folding@home doesn't recognise my graphics card (when I try adding a new slot it doesn't work) and looking at the log I don't see any error codes (or maybe I am blind)? I use Nvidia Studio drivers for my GTX 1080 so maybe that's why.

    1. rattacko123

      Ok now it seems to work just fine. Updating my drivers fixed the issue (still running studio drivers of course). Or maybe it didn't work before because I had the web browser window closed while I was accessing advanced controls. Anyway, shutting down folding@home, updating drivers, restarting folding@home, and running both web browser window + advanced controls fixed it.

    2. rattacko123

      I have a new problem: my GPU doesn't always fold on idle. It's kind of weird - sometimes it does, sometimes it doesn't. What I discovered is that if I manually put my computer to sleep, my GPU starts folding. This is because Folding@home prevents the computer from sleeping and seems to detect the PC as idle when sleep mode kicks in... or maybe it doesn't; maybe it just detects my monitor turning off when I put the computer to sleep, and not the computer actually going into sleep mode.

       

      Seriously, how does Folding@home detect idle? It's really vague and unclear based on the forum posts I've read. And all those forum posts are 5+ years old anyway, so the info is outdated.
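      For what it's worth, I still don't know what F@H itself checks, but the usual way a Windows program measures user idle time is the GetLastInputInfo API (time since the last keyboard/mouse input), which is independent of whether the monitor is off or the PC is asleep. A minimal Python/ctypes sketch of that call, purely to illustrate the mechanism (not F@H's code):

      ```python
      import ctypes
      from ctypes import wintypes

      class LASTINPUTINFO(ctypes.Structure):
          _fields_ = [("cbSize", wintypes.UINT), ("dwTime", wintypes.DWORD)]

      def seconds_since_last_input() -> float:
          """Seconds since the last keyboard/mouse event (Windows only)."""
          info = LASTINPUTINFO()
          info.cbSize = ctypes.sizeof(info)
          ctypes.windll.user32.GetLastInputInfo(ctypes.byref(info))
          return (ctypes.windll.kernel32.GetTickCount() - info.dwTime) / 1000.0

      print(f"Idle for {seconds_since_last_input():.0f}s")
      ```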

       

      I had it set so that my two monitors turn off and sleep mode activates at the same time - this worked a few times, but I noticed that sometimes my GPU wasn't doing anything (I was able to check this by listening to my loud Inno3D GTX 1080, as well as by opening MSI Afterburner, which I always keep running in the background, and it detected no activity). It seems like Folding@home doesn't do a particularly good job of detecting when my PC is idle. Anyway, I changed it so that my computer goes to sleep 5 minutes after my screen turns off (25 minutes screen off, followed by sleep at 30 minutes) - hopefully this fixes it? If not, I'm not sure I can be bothered changing it again.

       

      Apparently the creators of F@H are making a new, more user-friendly version of F@H; hopefully that version solves the issue. At the moment I am using version 7.6.9 along with my (recently upgraded) Windows version 1909. I wouldn't be surprised if it's a Windows problem - I probably need to reinstall Windows at some point lol. My current install is bogged down by unnecessary driver software. The next time I install Windows, I think it would be better to let Windows Update handle the driver installation rather than installing the drivers manually - hopefully there will be less bloat if I do that.

    3. rattacko123

      I plan to make a video about optimal Folding@home settings and how to maximise points, sometime during the holidays. I finish my exams in June, so hopefully I'll have enough time to make a video then.
