Sniperfox47

Member
  • Content count: 2,325
  • Joined
  • Last visited

Awards


This user doesn't have any awards

3 Followers

About Sniperfox47

  1. Chrome tabs are laid out in separate processes, but cross-site documents still open in the same process. This is an additional isolation feature that puts cross-site documents into their own processes. Say you are on this site and there's a hotlink to a Facebook image: that image currently still opens in the same process, since it's in the same tab. A script could then abuse that to gain cross-site access between here and Facebook. Site isolation would put that Facebook image in its own process, hopefully preventing cross-site hooks from attacking anything in between. This was rolled out as an opt-in enterprise setting back in Chrome 63 but wasn't rolled out more widely because it has the potential to break a *LOT* of stuff on the web. More details can be found here: https://www.chromium.org/Home/chromium-security/site-isolation (there's a quick sketch of enabling it below).

     There was a Firefox OS... it freaking died because web-based OSes are awful. Even if they had managed to get developers on board, there are straight up a lot of things the OS simply could not do performantly. Even now with WebGL 2.0 there would be a lot of inefficiency in anything that needs highly performant code and shaders.

     Android is Linux. Librem 5 OS is GNU/Linux. The whole reason Google is pushing to develop Zircon is because Linux is a *terrible* kernel for mobile development due to its ABI instability.
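     If you want to try it yourself, here's a minimal sketch of opting a Chrome launch into strict site isolation from the command line; the --site-per-process switch is roughly what the Chrome 63 enterprise policy turns on, and the install path below is just an assumption for a typical Linux box.

       # Launch Chrome with strict site isolation (one process per site).
       # The binary path is an assumed example; point it at your own install.
       import subprocess

       chrome = "/usr/bin/google-chrome"  # assumed install location
       subprocess.Popen([chrome, "--site-per-process"])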
  2. Huh, that's odd, it's almost like making a resonant charging pad with custom circuitry and multiple coils is a really difficult thing! But if that's the case, why did all of the OEMs jump on Rezence and start offering wireless charging laptops? ...oh wait... But then why is Qi's high-power charging standard for appliances such a huge hit? ...oh wait... Other resonant Qi phone chargers? Umm, nope, no successes there either. Stick to what you're good at, Apple: taking proven and tested technologies and bringing them to market in a simplified, consumable form.
  3. I don't understand why everyone praises Discord. The PC version is decent but the mobile version is trash.
     • Bugs where, when your friends log off, it still shows them as online until you close and reopen the app.
     • The lack of many / commands, including /me, on the mobile client. Many of these commands, again including /me, wind up formatted weirdly for mobile users when used by desktop users.
     • Shortform emotes not completing on mobile. On mobile if you send ":" + "$" for blushing it stays as the ASCII for you, but turns into :unamused: for users on the other end, with no feedback to you. It wouldn't be so bad if they just used the industry standard shortforms... >.>
     • Notifications for friend chats occasionally getting dropped if you're simultaneously getting notifications from group chats.
     Sooooooooooooo many little QoL problems, similar to the kind of issues I had with Skype. Stuff I've never had even on Hangouts, as awful as that is.
  4. In Apple's defence, they were one of the first adopters of OpenGL, literally created OpenCL, pushed the adoption of various industry-standard ports such as FireWire and USB-C, and have generally been pretty good about pushing open standards forward, at least when it didn't directly financially benefit them to lock people into anti-competitive and anti-consumer accessory programs like MFi.
  5. Wait... so the issue here is that apps in your email client, that are there to help you with your emails, that you agreed to let read your emails on a clear permission screen, can in fact read your emails, like, for realsies? I think I've lost what little faith I still had in humanity. I mean, I don't know how you can be any less ambiguous than saying it can "View your email messages when the add-on is running".
  6. G-sync HDR Modules to add 500+ dollars to monitor price

    @Stefan Payne Just to be clear, if you're finding a 4K HDR 1000-nit monitor that consumes less than 180 watts, there's something funky there, since to reach that brightness that's a totally reasonable power consumption (rough numbers below). Do you have an example of even a 600-nit 4K 144Hz monitor with Freesync support that consumes less power?
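    A back-of-the-envelope sketch of why a 1000-nit 4K panel plausibly needs that kind of power. The panel size, LED efficacy, and LCD transmission figures are assumptions for illustration, not the specs of any particular monitor.

      # Rough backlight power estimate for a 27" 16:9 LCD at 1000 cd/m^2 full-screen.
      import math

      diagonal_m = 27 * 0.0254                 # assumed 27" diagonal, in metres
      width_m = diagonal_m * 16 / math.hypot(16, 9)
      height_m = diagonal_m * 9 / math.hypot(16, 9)
      area_m2 = width_m * height_m             # ~0.20 m^2

      luminance_nits = 1000                    # target full-screen brightness (cd/m^2)
      lcd_transmission = 0.06                  # assumed ~6% of backlight light survives the LCD stack
      led_efficacy_lm_per_w = 110              # assumed backlight LED efficacy (lm/W)

      # Treating the panel as a Lambertian emitter: flux = pi * luminance * area
      panel_flux_lm = math.pi * luminance_nits * area_m2
      backlight_power_w = (panel_flux_lm / lcd_transmission) / led_efficacy_lm_per_w

      print(f"Panel output: {panel_flux_lm:.0f} lm")
      print(f"Backlight power: {backlight_power_w:.0f} W")  # ~95 W with these assumptions

    With the FALD controller, the G-Sync HDR module, and power-supply losses on top of that, a wall draw approaching 180 W isn't surprising.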
  7. Gentoo Github Hacked

    How so? Nothing got compromised on the dev end. All of the signing keys are intact, so anyone validating signatures (which, if you're using Gentoo, you should *really* be doing; see the sketch below) is fine. Anyone using the official repositories is fine.

    This makes me chuckle a bit solely because of the coincidence of it. Microsoft buys GitHub and then a couple of weeks later the Gentoo account gets hacked. Probably a total coincidence, but still funny in my mind.

    Honestly it's most likely just a social engineering thing. Gentoo being a mostly community-driven project, it probably wouldn't be very hard for a potential attacker to convince GitHub that they're newly responsible for the GitHub page and nobody told them the password. "Can I get a password reset please?"
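    A minimal sketch of what "validating signatures" looks like in practice: check the detached GPG signature on a repo snapshot before trusting it. The filenames are placeholders, not the exact artifacts Gentoo ships, and this assumes the release keys are already in your keyring.

      # Verify a detached GPG signature on a snapshot before using it.
      import subprocess, sys

      snapshot = "portage-latest.tar.xz"          # placeholder filename
      signature = "portage-latest.tar.xz.gpgsig"  # placeholder filename

      result = subprocess.run(
          ["gpg", "--verify", signature, snapshot],
          capture_output=True, text=True,
      )
      if result.returncode != 0:
          sys.exit(f"Signature check FAILED:\n{result.stderr}")
      # gpg reports the "Good signature from ..." details on stderr
      print(result.stderr.strip())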
  8. I wouldn't jump the gun just yet. If they don't release a new smartwatch chip by September that's probably the case, but from what they've said in the past couple of months they have new wearable chips (plural) coming out soon. They've had Snapdragon Wear chips that don't support Wear OS before; the SDW1100 and SDW1200 were both like that. The leaks we've had about the upcoming smartwatches claim that the new chipsets will be an SDW3100 series, so this could very well just be Qualcomm adding a new intermediate level to their wearable chip platform and holding off the new high-end chip until closer to the product launches.
  9. G-sync HDR Modules to add 500+ dollars to monitor price

    That is *hardly* an apples-to-apples comparison. There's far more to a good monitor than resolution and refresh rate. What panel types do they have? What kind of colour accuracy? Do they have any microstuttering from an underpowered controller? Input lag? How about the minimum refresh rate of the VRR range (see the sketch below)? All of these things add cost to a monitor. Of course you're going to be able to find cheaper Freesync monitors than G-Sync monitors... The G-Sync module sets a performance baseline that the monitor has to meet to be economically viable. You can cut down and strip a monitor to all but the bare minimum and still support Freesync.
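    On the minimum refresh rate point, a quick sketch of the usual rule of thumb: frame doubling (low-framerate compensation) only works when the panel's maximum refresh is at least roughly twice its minimum, which is exactly the kind of spec a stripped-down panel skimps on. The example ranges below are made up for illustration.

      # Check whether a VRR range is wide enough for frame doubling (LFC-style).
      def supports_frame_doubling(vrr_min_hz: float, vrr_max_hz: float) -> bool:
          return vrr_max_hz >= 2 * vrr_min_hz

      for name, lo, hi in [("48-75 Hz budget panel", 48, 75),
                           ("30-144 Hz panel", 30, 144)]:
          verdict = "can" if supports_frame_doubling(lo, hi) else "cannot"
          print(f"{name}: {verdict} frame-double below its VRR floor")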
  10. G-sync HDR Modules to add 500+ dollars to monitor price

    For me it was simple. A) For a whole year after I bought my card the Vega 64 didn't exist; there was no equivalent to even the mid-high range Nvidia options at the time. B) I need CUDA for work anyway, so it was convenient. C) Because I use 1080p monitors and AMD's super resolution option isn't as good as Nvidia's. D) Because even almost two years later, now that Vega has launched, they have nothing even remotely close to my liquid-cooled Titan X Pascal in terms of performance or power efficiency. Also, let's be honest, if you're buying a $3000 monitor you're probably not pairing it up with a 1070/Vega 56 anyway.
  11. We have tons of reference to base our opinion on... A bunch of people, including loudeater, Mr. Moose, and myself, all work in the industry and deal with this on a regular basis. We have a US attorney, a neutral third-party trusted source who deals with these issues on a daily basis, telling us people are freaking out about nothing. You have those documents that Mr. Moose posted that, by your own admission, have the same type of wording at their root. And people keep repeating the same thing because you disregard any actual evidence to maintain an asinine point of view, then write off the discussion as "autistic", which doesn't mean what you seem to think it means. >.>

      You seriously seem to be trolling at this point. Most of the people telling you "this is fine" are the same people who jump down Nvidia's throat every chance they get. These are the same people who went on tirades about GPP and Gameworks. It's rather telling that you haven't taken the time to look into some of these people, because Mr. Moose and I are normally about as far apart on these topics as it gets.

      Umm, what? If they have plans to reveal information in the next few days, of course they'll need you to sign it in the next few days. There's nothing odd about having a short agreement window on an NDA. If someone tells you to come in for an interview and asks you to sign an NDA to attend, it's not going to be half a week you have to sign it, it's going to be "sign it here and now or bye".

      It's not a 5-year-long deal. Read the freaking contract. You, the signee, can opt out in writing at any time without any notice. You just write to Nvidia and say "I'm out" and it's over. At this point you're searching for reasons to take offence at this when there's literally nothing here to take offence at. This is about as mundane and straightforward a contract as they come.
  12. Wait, 19:9 aspect ratio? So it has a notch, and if you turn it off it turns into 16:9? Well, at least I'll have that... >.> It'll still look freaking weird having all the I/O right in the middle with the glossy-ass screen dark to either side, but whatever, it is what it is.
  13. G-sync HDR Modules to add 500+ dollars to monitor price

    Agree with everything else you said, but just to answer this question: their older laptop version of G-Sync used VESA AdaptiveSync. I believe it no longer does now that they have full-blown desktop GPUs in there, but it used to. Also, their older Tegra chipsets (X1 and earlier, not the X2) supported AdaptiveSync, because a full G-Sync module isn't feasible to include in a handheld form factor. Not that it did any good, when the Jetson boards weren't wired for it and nobody ever implemented it.
  14. G-sync HDR Modules to add 500+ dollars to monitor price

    A) It doesn't violate the standard because it doesn't even try to follow the standard. Violating a standard is when you try to be compliant but then do something the standard forbids, like offering Qualcomm Quick Charge over a USB-C port...

    B) Just want to point out that most receivers and TVs do not have DisplayPort, so I don't know why you're attributing anything to VESA. Yes, VESA AdaptiveSync can be implemented over HDMI, but very few people do this outside of some odd legacy Freesync-over-HDMI monitors. Most of the upcoming displays with Freesync support, and the support Samsung has rolled out on their TVs, use the HDMI 2.1 variable refresh rate feature (which can also be backported to some HDMI 2.0b devices). That is handled by the HDMI Forum, not VESA, which it competes with. These are two totally separate and independent standards and formats; AMD just happens to unify them both under the "Freesync" marketing umbrella.

    C) Because there are several different standards, and none of them have what Nvidia wants? I'm going to ignore the matter of platform lock-in, because while that's definitely a reason, it's not a constructive one. The biggest other reasons why Nvidia might want to roll their own standard are to set a minimum performance criterion and to make platform-specific improvements. See some of the earlier comments in this thread, but minimum performance is a big thing: when the Freesync program first started, many AdaptiveSync devices had serious quality problems. Even if AMD had set higher performance minimums for branding, because they're using the standard they still have to ensure compatibility with all the crappy low-end monitors that just barely support it. G-Sync allowed Nvidia to sidestep much of this by creating a scaler module with way more than enough performance to handle anything they threw at it. And as far as platform-specific tweaks go, it lets them make sure the protocol slots cleanly into their normal driver pipeline and makes the cleanest use of the GPU and monitor.

    Is G-Sync an anti-consumer, anti-competitive mess? Kinda, yeah. But there are other legitimate reasons behind it.
  15. G-sync HDR Modules to add 500+ dollars to monitor price

    Vsync is where you wait for a vertical blank to send the next frame to the monitor (sketch below). That's what vsync is. Everybody's implementation of it is a little different. Nvidia has... what... four different implementations now? Fast Sync, Adaptive Vsync, legacy Vsync, and "Vsync" are all just different implementations of vsync. Fast Sync is just a specific implementation that relies on a specific set of constraints to avoid some of the issues of other implementations while introducing its own. Saying Fast Sync is not vsync is like saying a circle is not an oval. That's straight up wrong. A circle is an oval, it's just a specific kind of oval.
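    For the "wait for a vertical blank" part, a minimal sketch of classic vsync using the GLFW bindings for Python (an assumption; any windowing API with a swap interval works the same way): setting the swap interval to 1 makes the buffer swap block until the next vblank, while 0 gives you uncapped, tearing-prone output.

      # Classic vsync: swap_buffers() waits for the vertical blank when
      # the swap interval is 1. Requires the `glfw` package and an
      # OpenGL-capable display; window size and title are arbitrary.
      import glfw

      if not glfw.init():
          raise RuntimeError("GLFW init failed")

      window = glfw.create_window(640, 480, "vsync demo", None, None)
      if not window:
          glfw.terminate()
          raise RuntimeError("window creation failed")

      glfw.make_context_current(window)
      glfw.swap_interval(1)  # 1 = wait for vblank (vsync on); 0 = uncapped

      while not glfw.window_should_close(window):
          # ... render the frame here ...
          glfw.swap_buffers(window)  # blocks until the vblank with interval 1
          glfw.poll_events()

      glfw.terminate()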