Anti-cheat ideas

Thomas A. Fine
1 hour ago, Mark Kaine said:

You didn't answer my questions the first time around, I don't know why you would now. You also said I put words in your mouth (I screenshotted it...). I don't think that's a good basis for a discussion at all, but for the sake of it... *again*... you said in the OP "get caught in one game, get banned in all games", then you continued to say it's "bulletproof", but somehow later in the same post it's "tamper resistant..."

"Get caught in one game, get banned in all games." That wasn't about the mouse specifically. That was a different idea: that anti-cheat might benefit from a more centralized system. I hadn't noticed anyone bringing this point up. (I'm also going to nitpick that you put this in quotes, but it doesn't exactly match what I said.)

 

The "bulletproof" comment was "That signed data is now bulletproof".  Once data has been signed, the payload data can't be changed or it won't match the signature.  Absolutely true.
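The tamper-evidence property being described can be sketched in a few lines. This is only an illustration: the key, packet format, and use of stdlib HMAC are stand-ins, since a real per-device design would use asymmetric signatures (e.g. Ed25519) with the private key in the mouse.

```python
import hashlib
import hmac

# Hypothetical embedded secret; a real mouse would hold an asymmetric
# private key instead. HMAC is used here only because it is in the
# standard library and shows the same tamper-evidence property.
DEVICE_KEY = b"example-embedded-key"

def sign(payload: bytes) -> bytes:
    """Produce a signature over the mouse-data payload."""
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, signature: bytes) -> bool:
    """Check that the payload still matches its signature."""
    return hmac.compare_digest(sign(payload), signature)

packet = b"dx=3,dy=-1,t=1718000000"   # made-up mouse packet
sig = sign(packet)

assert verify(packet, sig)                             # untouched data passes
assert not verify(b"dx=300,dy=-1,t=1718000000", sig)   # altered data fails
```

Any change to the payload after signing makes verification fail, which is all the "bulletproof" claim asserts; it says nothing about who held the key.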

 

And "tamper resistant" was about the packaging for the hardware and embedded private key.


So, you just put these three different things together as if they were connected when they weren't connected at all, and it is a perfect example of twisting my words and taking them out of context. Nice self-own there. Keep it up.

1 hour ago, Mark Kaine said:

So, simple question: how do you prevent false detections?

Since you are mixing contexts, I'm not sure which idea this refers to. False positives can be a serious problem with anti-cheat. With regard to the mouse idea, though, there can't really be a false positive unless there's bad software that's not transmitting the same mouse data in the live gameplay and in the signed anti-cheat packets. In which case every packet would be bad for every single player with the same game software, and it would be obvious that the anti-cheat wasn't working. Likewise, if it were a firmware issue on the mouse, no mouse of that model would ever work for anyone. It would be obvious.

 

If you're really worried about rare corner case bugs you could make an exception for positives in which the deviation between the generated mouse motions and the signed data was very small and/or very infrequent.  Anyone who was actually cheating would see wide, regular deviations in the data that would obviously look like cheating.
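That exception rule could be as simple as a thresholded comparison between the live input stream and the signed packets. A minimal sketch; the function name and both thresholds are made up for illustration:

```python
def is_suspicious(live, signed, max_delta=2, max_bad_fraction=0.01):
    """Flag only wide or frequent deviations between the live mouse
    samples and the corresponding signed samples; tiny, rare mismatches
    (driver jitter, firmware corner cases) are tolerated."""
    bad = sum(1 for a, b in zip(live, signed) if abs(a - b) > max_delta)
    return bad / len(live) > max_bad_fraction

live   = [10, 11, 12, 13, 14]
assert not is_suspicious(live, [10, 11, 12, 13, 14])   # identical: clean
assert is_suspicious(live, [10, 50, 90, 13, 14])       # wide deviations: flag
```

An aimbot injecting motions would produce large, regular deviations and trip the fraction threshold immediately, while a rare one-count mismatch would not.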

1 hour ago, Mark Kaine said:

And what kind of cheats would this "mouse" even prevent, exactly, in theory? I only saw you mentioning "aimbot". Is that really worth all the likely issues with this kind of solution (and for whom)?

As I have repeatedly said here, it would only be for aimbot issues, not for other information hacks.  And I do think it would be worth it, because I think "all the likely issues" people have raised are not actually issues.


2 hours ago, Thomas A. Fine said:

See?  You agree it is a mouse issue.  (LOL)

 

I agree with much of what you say though.  A mouse is one small component, and, as I said already, does not do anything to prevent the X-Ray, ESP, heads-up display information that is (perhaps) a larger problem.

 

I would note, however, that while everyone is correct to point out that server-side processing would solve this, and that it can create real lag issues, there is a reasonable possibility to still have client-side collision processing while doing a much better job of not telling the client where every player in the game is and what they're doing. You only need to reveal those players that are visible or nearly visible. HUDs would then see players only a second or two before they became visible, which is still a cheat, but nothing like seeing a map of everyone, or seeing through every wall at any distance. Coarse visibility determinations can be made very efficiently on the server because it's probably already using some sort of space-partitioning algorithm. I'm sure that some games already do this, but it is also clear that some games do not.

There are many problems with implementing too much on either the server or the client. If you put too much trust in the client, we end up with a Quake Arena-style anti-cheat where you can have godmode; if you put too much trust in the server, any ping higher than 5 will start to cause problems and the lag compensation won't be enough. It's a balancing game. There are a lot of games which partition the map, check which partitions will be visible, and only then show the character; a good example is CS:GO, where you can't use ESP more than two map partitions away.
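The partition-based reveal described here can be sketched with a uniform grid; a real engine would use its existing BSP/PVS structures, and the cell size and radius below are arbitrary:

```python
CELL = 32  # hypothetical partition size in world units

def cell(pos):
    """Map a 2D world position to its grid cell."""
    return (pos[0] // CELL, pos[1] // CELL)

def visible_players(me, others, radius=1):
    """Server-side coarse cull: only reveal players within `radius`
    cells of the viewer; everyone else is never sent to the client."""
    cx, cy = cell(me)
    return [p for p in others
            if abs(cell(p)[0] - cx) <= radius and abs(cell(p)[1] - cy) <= radius]

me = (100, 100)                        # cell (3, 3)
others = [(110, 90), (400, 400)]       # cells (3, 2) and (12, 12)
assert visible_players(me, others) == [(110, 90)]
```

Because the distant player is simply never transmitted, an ESP overlay on the client has nothing to draw until the target is within a couple of partitions.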

 

From personal experience, the Arma 3 anti-cheat, BattlEye, is one that works pretty well. It's a bit invasive, but it works wonders. I've never seen accounts that last more than a day at best, and ragers get banned in less than half an hour. I've personally tried to co-develop a cheat for Arma 3, and the anti-cheat has been the biggest headache we've ever seen.

 

The mouse is the least of the problems; there are humanizers being handed out left and right for more legit-looking play, and some of them are pretty good. Recoil control (RCS) can also be handled in a humanized way with random seeds, fine tuning, etc. This is one of the main ways to avoid triggering FACEIT when using small amounts of aim assistance and some RCS. I can name providers that offer such granularity, so you don't just have to take a random guy's word for it.


On 3/4/2023 at 9:49 PM, mononymous said:

How about using a government ID/passport for verification so it is much more difficult to create duplicate accounts?

Clearly doesn't work on Chinese servers

 

Also, RuneScape suffers from this:

 



8 minutes ago, williamcll said:

Clearly doesn't work on Chinese servers

The ID verification is not for cheating, though, just for game addiction, so even though the methods may be similar, the purpose is different.



20 hours ago, Thomas A. Fine said:

First off, I have said — REPEATEDLY — that physical attacks on such a mouse device would remain possible.  And that it is my belief that the mitigation gained in not having widely distributable cheats makes physical attacks fairly irrelevant.

 

Ok, so now you're inventing statements on the fly and trying to isolate things to change what you originally wrote. This ain't Twitter where people haven't read half the conversation.

 

20 hours ago, Thomas A. Fine said:

Second of all, what Yubikey is talking about is physically gaining access to keys.  But that's because the Yubikey use case is about authentication.  The mouse use case absolutely is not.  I can give the mouse to my mortal enemy and it's not a problem, because there is no trust issue of the kind you keep trying to bring up.  The digital signature is an assurance that the data coming out of the mouse has not been tampered with, not that it comes from any individual personally.

 

It is very relevant, because you're assuming you can make something magically tamper-proof in hardware. Let me be quite blunt about it: you're just a thick-skulled bloody idiot if you don't understand why your proposal to sign data with a key embedded in the mouse is completely meaningless. So, to repeat it one final time in a format that might make it into your head:

THE KEY IS NOT SECURE, YOU HAVE TO ASSUME ANYONE CAN SIGN ANYTHING WITH THAT KEY BECAUSE YOU DO NOT CONTROL THE ACCESS TO IT.

In other words, you might as well not have signed it, because you have no clue whether the mouse signed the data or something else did. How is that so difficult to grasp for you? You are literally magically handwaving away the main weakness of public-key cryptography.

 

20 hours ago, Thomas A. Fine said:

I'm casually ignoring your "chain of trust" complaints because it is mind-bogglingly obvious that this is FAR EASIER to deal with than, say, web server certificate chain of trust, which is something that is normal and commonplace (though, more expensive, because identities of individual entities are being verified).

It's a really different scenario than a web server, and you're either too stupid, too arrogant, or have no clue what the hell you're talking about if you don't see the issue here. You cannot establish a chain of trust here. You keep saying it's easy, but do tell me: how are you going to ensure that key was actually used by the mouse and not by a piece of software written specifically to bypass your glorious infallible mouse design? With actively maintained, internet-connected servers kept in a locked location with limited access, there are definitely ways to somewhat guarantee such things, but there is absolutely no such guarantee with the infallible mouse. This ain't you signing something with PGP before mailing it; this is the equivalent of putting the key on a piece of paper, handing it to someone, and telling them not to use it for anything malicious.

 

18 hours ago, Thomas A. Fine said:

"Get caught in one game, get banned in all games." That wasn't about the mouse specifically. That was a different idea: that anti-cheat might benefit from a more centralized system. I hadn't noticed anyone bringing this point up. (I'm also going to nitpick that you put this in quotes, but it doesn't exactly match what I said.)

Even Valve stepped away from that.

 

18 hours ago, Thomas A. Fine said:

The "bulletproof" comment was "That signed data is now bulletproof".  Once data has been signed, the payload data can't be changed or it won't match the signature.  Absolutely true.

 

And "tamper resistant" was about the packaging for the hardware and embedded private key.


So, you just put these three different things together as if they were connected when they weren't connected at all, and it is a perfect example of twisting my words and taking them out of context. Nice self-own there. Keep it up.

You can sign whatever the fuck you want; if the key with which you sign it is not kept in a trusted place, that signature is entirely meaningless. I can just discard your data and sign my own payload.

 

There is no tamper-resistant packaging; all it does is slow people down. You can hide things further, you can complicate things further (and make them significantly more expensive as a result), and all you'll do is slow them down a little. Ask Sony how that went with the PS5: it took less than two years, and that's significantly more complicated to get into than your supposed hardware security.

 

You're just moving the goalposts, and even taken separately those ideas are meaningless.


20 hours ago, Mark Kaine said:

you didn't answer my questions the first time around,  i don't know why you would now.

He has no clue what he's talking about, that's why. You cannot guarantee the security of hardware you do not control physical access to, and anyone who claims they can is probably smoking something they oughtn't. This is also why things like credit and debit cards still require some form of user interaction (e.g., a PIN from memory, a fingerprint, ...). But even without knowing that secondary information, the possible pool of solutions tends to be quite limited if there's only something like a four-digit PIN in use. That's why it is so important to report it if such a card gets stolen. Folks assume these things are a lot more secure than they really are. And there are plenty of physical attacks; a couple of the more invasive ones are listed here: http://www.infosecwriters.com/text_resources/pdf/Known_Attacks_Against_Smartcards.pdf

 

But often you don't need to go that far. For example, if they use the noise generated by a diode as a random-number-generator seed, you could try cooling the chip significantly to reduce the noise amplitude to the point where only a small set of numbers is being generated, making the entropy source significantly less random. You could then go "yeah, but we'll add a PTAT circuit to detect the temperature of the circuit", which could then be distorted by directing light at the correct area of the die, etc. It's basically an arms race, and even for most security-oriented products (like the aforementioned YubiKey), there's a serious limit to how far they can push it without turning it into some sort of multi-year development hell that will never be completed.

 

Oh well, this is a mostly pointless discussion at this point.


 

16 minutes ago, ImorallySourcedElectrons said:

Ok, so now you're inventing statements on the fly and trying to isolate things to change what you originally wrote. This ain't Twitter where people haven't read half the conversation.

Inventing statements?

I talked about the possibility of tampering with the mouse, and security never being perfect here:

and here:

 

and here:

 

and here:

So I guess you really haven't read the conversation.

 

Or, you're just trolling me.

27 minutes ago, ImorallySourcedElectrons said:

THE KEY IS NOT SECURE, YOU HAVE TO ASSUME ANYONE CAN SIGN ANYTHING WITH THAT KEY BECAUSE YOU DO NOT CONTROL THE ACCESS TO IT.

Seriously, are you just trolling me?  You can scream this claim in big red letters about literally any security measure, with similar validity.

31 minutes ago, ImorallySourcedElectrons said:

It's a really different scenario than a web server, and you're either too stupid, too arrogant, or have no clue what the hell you're talking about if you don't see the issue here.

So you quote me as contrasting these things as being very different from each other ("FAR EASIER" I wrote), and then you tell me I'm too stupid to tell that these are different?  I mean, you're the one that kept claiming chain of trust is so important here, and web certificates are the canonical example of chain of trust.

 

Or maybe you're just trolling me.

35 minutes ago, ImorallySourcedElectrons said:

You can sign whatever the fuck you want, if the key with which you sign it is not kept in a trusted place that signature is entirely meaningless, I can just discard your data and sign my own payload.

If I sign whatever I want because I tampered with the mouse, then only I and no one else can do this, and this system has mitigated the downloadable-aimbot problem, reducing aimbots to people with the time and ability to tamper with an individual mouse. If I sign whatever I want with some other key, it won't match any public key on record for any mouse, and it won't be accepted as coming from a secure mouse.
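The registry check being described might look like the sketch below. All names and key material are made up, and stdlib HMAC stands in for the asymmetric verification a real system would use:

```python
import hashlib
import hmac

# Hypothetical vendor-published registry of per-device keys; in the
# real design this would hold public keys for asymmetric verification.
REGISTERED_KEYS = {"mouse-001": b"factory-provisioned-key"}

def accepted(device_id: str, payload: bytes, signature: bytes) -> bool:
    """Accept data only if it verifies against a key on record."""
    key = REGISTERED_KEYS.get(device_id)
    if key is None:
        return False  # unknown device: reject outright
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

good = hmac.new(b"factory-provisioned-key", b"dx=1", hashlib.sha256).digest()
forged = hmac.new(b"attacker-made-up-key", b"dx=1", hashlib.sha256).digest()

assert accepted("mouse-001", b"dx=1", good)
assert not accepted("mouse-001", b"dx=1", forged)   # wrong key: rejected
assert not accepted("mouse-999", b"dx=1", good)     # not on record: rejected
```

A signature from a key not in the registry is simply dropped, which is the point being made: inventing your own key buys you nothing.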


This is so incredibly obvious that, maybe, I think, you might just be trolling me.


7 minutes ago, ImorallySourcedElectrons said:

He has no clue what he's talking about, that's why.

Why do you continue these personal attacks against me?

 

You keep giving examples that are about authentication.  This is not the only use of digital signatures.  You keep refusing to understand this.  It almost seems intentional.


Do you really think that Sony also doesn't understand this genius insight you claim to have about how private keys can't do this?  Because this is the exact same analogous application:

 

https://pro.sony/ue_US/solutions/forgery-detection


On 3/7/2023 at 6:23 AM, Thomas A. Fine said:

It would be a much smaller system than DNS.  With less frequent lookups since game servers could cache public keys they've seen basically forever.  That cache for a million players would be a few hundred meg.  The lookups would be fast (though speed doesn't really matter), and use extremely limited server resources.  A million lookups a day would be no problem even for something like a Raspberry Pi.  Vendors might pool their resources into a single verified service that handed out public keys.

Let me give you a sense of scale if this kind of system were implemented in CS:GO. There are currently about 650,000 players playing CS:GO, and almost all of them have a mouse and keyboard (both would need to be chipped because, as I said, if they weren't, a cheater would just use the one that isn't chipped to cheat); that's 1,300,000 devices. Say it performs a check every 10 minutes: that's 7,800,000 calls every hour and 187,200,000 calls a day. And that is only for a single game (albeit the most played game on Steam currently).

As for the size of the database: start with the peak number of Steam users online, which for the past 24 hours is a bit over 29,000,000; that's already a 58,000,000-device database at minimum. The Steam hardware survey sadly doesn't check for controllers, but it does tell us that around 2% of Steam users have VR headsets, so that's 1,740,000 more chipped devices (controllers and headsets). And these are just active numbers; add in devices sitting in stores and unused, plus consoles (because if cheating were hard on PC, I would just cheat on consoles, since that is also possible), and we can probably double that easily, ending up around 100,000,000 devices. But that's not really the problem: 100 million SHA256 keys would take up around 3.2GB, which is pretty insignificant.
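A back-of-envelope version of the arithmetic above, using the same hypothetical rates (one check per device every 10 minutes, 32 bytes per key):

```python
players = 650_000
devices = players * 2              # mouse + keyboard per player
checks_per_hour = 60 // 10         # one lookup per device every 10 minutes

calls_per_hour = devices * checks_per_hour
calls_per_day = calls_per_hour * 24
assert calls_per_hour == 7_800_000
assert calls_per_day == 187_200_000

key_bytes = 32                     # one 256-bit key per device
cache_gb = 100_000_000 * key_bytes / 1e9
assert cache_gb == 3.2             # raw key storage for 100M devices, in GB
```

Raw key storage really is tiny; it's the call volume and who operates the service that carry the cost.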

 

But the problem is who is going to uphold that database. Device manufacturers won't share anything; that much is clear from looking at, for example, Razer and Logitech, who have absolutely zero cross-supported devices (as in, Logitech software controlling a Razer device or vice versa). Hell, an even better example is motherboards. You would think those are pretty damn standardized; nope, not at all. Just buy something from Gigabyte and see how nothing else can get temperature readings or fan speeds from it except Gigabyte's own software, which is pretty damn garbage (at least RGB Fusion and EasyTune were complete asses while I still used my GA-AX370 Gaming K5 motherboard; anything, even a manual setup in something like SpeedFan, would have been way better than what Gigabyte had).

As I said, any more official standardization organization wouldn't be interested in this, because they have their hands full trying to keep the rest of the world running, and gaming isn't high on that priority scale. For them, this would need to be something more than just about gaming; like it or not, gaming just isn't that huge a deal in the wider world.

So, most likely, in the worst case not only would every game server need to make check calls to some external server, but they would need to either call every manufacturer's own servers for all of the players, or figure out from the encoded packages which are going where. That is going to be a lot of traffic. Or, equally bad, the game servers would need to run every manufacturer's own software to do the proofing, and that's going to cost a lot with companies like Razer around, who are just terrible at producing software.

 

Quote

And, again (!!!) this is coming, because mice or not, other devices like cameras are going to have a system like this in place, and will require a service to look up public keys.  (And in fact services for lookups like this already exist.)

Have you even looked at what the Sony thing is all about?

 

Let me summarize it with a single term: C2PA. Microsoft, Canon, Adobe, Nikon, Arm and some others (with Sony being one of the later joiners) are still mostly planning how they can combat misinformation, especially in photos. The notable part is that they do not even try to prove that the photo shown to someone is real; C2PA just adds signatures on top of signatures and presents them to people so they can decide for themselves whether the image is real. Even in the few places where C2PA is already used (not in Sony cameras, they haven't really even started; Nikon and Canon already have something), you can just extract the C2PA signature, and even its salt, from the EXIF data and do whatever you want with it. There's really not much to do with it unless you want to be really evil and start forging C2PA signatures onto "fake" images, but even then there's very little point, because C2PA needs support from the grassroots level up to be anything, and even then it is just an advisory system. And the price tag for that: a completely new chip, currently only in flagship models, with very little interest in bringing it to consumer cameras or cellphones, because it's mainly meant for professional journalists.

 

The reason they don't try to make it anything more serious than advisory is that making it anything more is impossible. You can point the camera at anything to get a signed photo, and that's already it. You can just wipe all of the EXIF data to get rid of the signature, paint around something invisibly in Photoshop, paste in the image you just wiped, and you've just got it signed by your Photoshop. While the C2PA standard makes it possible for a website to show the signature and even the version history of the image, most likely very few websites will do it, because hosting even the original-resolution images would multiply the amount of hosting they need compared to the compressed images they currently use. Most will only host the signature history of the image, and then it is all up to the viewers to decide whether or not it's real.


10 hours ago, Thaldor said:

Say it performs a check every 10 minutes: that's 7,800,000 calls every hour and 187,200,000 calls a day. And that is only for a single game (albeit the most played game on Steam currently).

Why on earth would you look up the same information every 10 minutes?  The public key for a given device never changes.  At worst, you look it up once per player session.  At best, devices are cached per player.  Game servers or Valve or whoever could cache these per player for weeks or months or even forever.

 

10 hours ago, Thaldor said:

But that's not really the problem: 100 million SHA256 keys would take up around 3.2GB, which is pretty insignificant.

We agree on this.  Or to put it another way, it is a small marginal cost, relative to all the other device information that hardware vendors are already storing about their products, or that game servers are storing about their players.

 

10 hours ago, Thaldor said:

But the problem is who is going to uphold that database

As I said, it's a marginal cost.  But, I think the most likely answer is that traditional certificate authority companies or DNS server companies or other infrastructure companies would enter this space and provide this service to vendors for a contracted fee.

 

11 hours ago, Thaldor said:

Have you even looked at what the Sony thing is all about?

Your C2PA discussion is interesting, but not really relevant to the points raised here. Sure, C2PA will have some interesting real-world issues in terms of how relevant it will actually be to consumers day-to-day. But that's a totally different application than this. There are no issues with tracking changes and history in my proposed application, nor any need to preserve the original signed data for any length of time.

 

The key thing about the Sony camera, the reason I brought it up, is that they stated that the digital signing happens in-camera. Meaning that all the complaints about how this technology is infeasible seem to already have been solved by Sony (and, I'm sure, by other companies too). Unique private keys per camera, at least some basic tamper protection, and public-key storage somewhere are the main issues people have raised for why the mouse idea is too hard, but the Sony effort demonstrates that Sony doesn't see these as roadblocks.


On 3/9/2023 at 8:46 PM, Thomas A. Fine said:

 

Inventing statements?

I talked about the possibility of tampering with the mouse, and security never being perfect here:

and here:

 

and here:

 

and here:

So I guess you really haven't read the conversation.

 

Or, you're just trolling me.

Seriously, are you just trolling me?  You can scream this claim in big red letters about literally any security measure, with similar validity.

So you quote me as contrasting these things as being very different from each other ("FAR EASIER" I wrote), and then you tell me I'm too stupid to tell that these are different?  I mean, you're the one that kept claiming chain of trust is so important here, and web certificates are the canonical example of chain of trust.

 

Or maybe you're just trolling me.

If I sign whatever I want because I tampered with the mouse, then only I and no one else can do this, and this system has mitigated the downloadable-aimbot problem, reducing aimbots to people with the time and ability to tamper with an individual mouse. If I sign whatever I want with some other key, it won't match any public key on record for any mouse, and it won't be accepted as coming from a secure mouse.


This is so incredibly obvious that, maybe, I think, you might just be trolling me.

I wasn't actually going to respond to any further garbage statements, but since you feel the need to make even more personal attacks on me (saying "you have no clue what you're talking about", accusing me of trolling, etc.): you're just repeating yourself, which is not a counter-argument; you're just magically handwaving away issues. And you claim folks are misunderstanding you, but you have not explained anything about your idea beyond the initial statement and indicating that you'd want an individual key per mouse.

 

But you do not seem to grasp the following set of fundamental issues:

  • You cannot prevent anyone from physically attacking your wunder-mouse.
  • Your mouse has to communicate with your software through a driver.
  • You cannot control access to the key stored in the mouse, it is accessible through hardware attacks.
  • Your signature has no means of determining if the data was signed by the mouse or by a piece of software.
  • Very few cheats actually use automatic mouse movements.
  • You just created a centralized DDoS target to take out multiplayer gaming.

 

On 3/9/2023 at 8:56 PM, Thomas A. Fine said:

Why do you continue these personal attacks against me?

 

You keep giving examples that are about authentication.  This is not the only use of digital signatures.  You keep refusing to understand this.  It almost seems intentional.


Do you really think that Sony also doesn't understand this genius insight you claim to have about how private keys can't do this?  Because this is the exact same analogous application:

 

https://pro.sony/ue_US/solutions/forgery-detection

Because you have no clue what the fuck you're saying and you're just handwaving away problems as if they don't exist? You're also casually forgetting to check things like... maybe, I don't know... the details behind the system you're touting as an example? All this system verifies is that someone who had access to a key claims this data hasn't been tampered with, and that claim is only as secure as the storage of the key. So if the user, or anyone else who has access to said key or the hardware it is on, has malicious intent, that signature is completely meaningless. This is exactly the same issue as with the YubiKey or smartcards: the security is only as good as the physical security of the device. Any tamper-proofing only buys time for the user to detect and report loss of or damage to the device. But with your mouse, the user has no incentive to keep it secure, so there is absolutely no guarantee that your system is secure. I don't get why this is so difficult to grasp for you.

 

22 hours ago, Thaldor said:

But that's not really the problem: 100 million SHA256 keys would take up around 3.2GB, which is pretty insignificant.

SHA256 is a hashing function; while it plays a role in cryptographic signatures, you'd be looking at storing significantly more data than that. And 100 million devices is nothing: if you want this to be usable, you'd have to roll it out on a massive scale, so you're looking at hundreds of millions of keys per year. Running a service like this would quite literally be the equivalent of running a large messaging application the likes of WhatsApp after a couple of years.


4 minutes ago, ImorallySourcedElectrons said:

But you do not seem to grasp the following set of fundamental issues:

  • You cannot prevent anyone from physically attacking your wunder-mouse.
  • Your mouse has to communicate with your software through a driver.
  • You cannot control access to the key stored in the mouse, it is accessible through hardware attacks.
  • Your signature has no means of determining if the data was signed by the mouse or by a piece of software.
  • Very few cheats actually use automatic mouse movements.
  • You just created a centralized DDoS target to take out multiplayer gaming.

 

  • I acknowledged that
  • It's irrelevant
  • I acknowledged that
  • It's irrelevant
  • Partly I acknowledge that, but partly, LOL, you are overstating this, what cheat software doesn't auto-aim?
  • Lordy we must protect this critical infrastructure or terrorists will target it.  (And also, no, not really.)
7 minutes ago, ImorallySourcedElectrons said:

All this system verifies is that someone who had access to a key claims this data hasn't been tampered with, and that claim is only as secure as the storage of the key.

Sure.  Where we seem to disagree is that I think this would be a significant (orders of magnitude) reduction in actual cheating, and you don't.  I have said over and over again this is about mitigation of risk.  Time and skills spent possibly or likely destroying hardware in an effort to cheat is vastly more costly than downloading some software from somewhere for the LULZ.


On 3/6/2023 at 4:55 PM, Erioch said:

So, just don't talk and you're unbannable?

Cheaters aren't that smart.


23 hours ago, Thomas A. Fine said:
  • I acknowledged that
  • It's irrelevant
  • I acknowledged that
  • It's irrelevant
  • Partly I acknowledge that, but partly, LOL, you are overstating this, what cheat software doesn't auto-aim?
  • Lordy we must protect this critical infrastructure or terrorists will target it.  (And also, no, not really.)

Sure.  Where we seem to disagree is that I think this would be a significant (orders of magnitude) reduction in actual cheating, and you don't.  I have said over and over again this is about mitigation of risk.  Time and skills spent possibly or likely destroying hardware in an effort to cheat is vastly more costly than downloading some software from somewhere for the LULZ.

Ok, so cheap troll it is. Consider this my last response, because I'm not going to waste more time responding to the ramblings of an idiot.

 

To set the record straight:

  • Auto-aim is actually incredibly rare. Even during the earlier days of gaming when it was relatively easy to implement with little risk of being caught, folks almost never used it because it completely removes the fun from playing, and it impedes your ability to even walk around in things like shooters. Automated input is far more often used for things like MMORPGs, where you want to farm and sell items/currency/... for real life money.
  • Your security system is worthless if it doesn't provide actual security: if you cannot guarantee that the data was signed by the mouse, then the signature is worthless. So yes, the ability to perform a man-in-the-middle attack, or to just straight-up virtualise your security device, is highly relevant. That driver issue is very much a real problem that you seem to casually hand-wave away with your wunder-mouse.
  • You're targeting an industry (writing and publishing cheats and trainers) that makes hundreds of millions in profit, and you claim it's irrelevant and that no one is going to spend time on breaking it. Spending a couple of days working on the disassembly and probing around to attack it is no significant barrier to entry.

I think you don't have a clue about the amount of money that goes around in gaming these days; cheating can be surprisingly profitable for both the developers and the users. Running that cheat might get you entry into tournaments with real money prizes, or advertisement revenue as a streamer, ...  So a signature system that competes with Swiss cheese in terms of the holes it has is pretty much the equivalent of taping some cellophane across the door, writing "don't pass" on it, and calling it a security door.


On 3/10/2023 at 5:12 AM, Thomas A. Fine said:

Why on earth would you look up the same information every 10 minutes?  The public key for a given device never changes.  At worst, you look it up once per player session.  At best, devices are cached per player.  Game servers or Valve or whoever could cache these per player for weeks or months or even forever.

Because that's literally what you do in computer security to check that the device hasn't been swapped out or been compromised.  Even video games do this now since they dump most of the game to the HDD/SSD.  The console just spins up the disc every once in a while to check that you aren't trying to pirate the game by not having the disc in the system anymore. 

It's a terrible idea though because it will only encumber the experience of the average gamer while hackers will find a fairly easy way to work around it and it'll probably make the game experience better for them.  That's the problem with literally all anticheat put on the player side of the game though.  The only real answer is to have a moderator watching matches.  Whether that be a real person or an AI like I suggested earlier comes down to price.  Though human labor is expensive and can be corrupted with bribes.


14 hours ago, IRMacGuyver said:

Because that's literally what you do in computer security to check that the device hasn't been swapped out or been compromised.  Even video games do this now since they dump most of the game to the HDD/SSD.  The console just spins up the disc every once in a while to check that you aren't trying to pirate the game by not having the disc in the system anymore. 

We were talking about public key lookups.  The public key doesn't change.  The mouse is constantly sending information.  And if it is not signed by the public key (which needs to be looked up only once per session), then the signature doesn't match.

14 hours ago, IRMacGuyver said:

hackers will find a fairly easy way to work around it

There is no way to work around it short of tampering with the device to extract the private key, and that would not be "fairly easy".


On 3/13/2023 at 4:49 PM, Thomas A. Fine said:

We were talking about public key lookups.  The public key doesn't change.  The mouse is constantly sending information.  And if it is not signed by the public key (which needs to be looked up only once per session), then the signature doesn't match.

There is no way to work around it, besides tampering with the device to extract the private key would not be "fairly easy".

Make up your mind: is it a public key or a private key?


On 3/15/2023 at 1:28 PM, IRMacGuyver said:

Make up your mind is it a public key or a private key. 

I really hope this was meant as a joke.
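To spell it out: the two keys are halves of one pair. The device signs with its private key; anyone with the matching public key can verify. A deliberately toy (and utterly insecure) RSA sketch of that split, using made-up textbook primes and a fake hash:

```python
# Toy RSA with tiny primes -- for illustration only, never for real use.
p, q = 61, 53
n = p * q                           # modulus, part of the public key
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, kept on the device

def toy_hash(msg: bytes) -> int:
    return sum(msg) % n             # stand-in for a real hash function

def sign(msg: bytes) -> int:
    # Only the holder of the private exponent d (the mouse) can do this.
    return pow(toy_hash(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    # Anyone with the public pair (e, n) can check the signature.
    return pow(sig, e, n) == toy_hash(msg)

sig = sign(b"dx=+3 dy=-1")
genuine = verify(b"dx=+3 dy=-1", sig)
tampered = verify(b"dx=+4 dy=-1", sig)
```

So "signed with the private key" and "verified against the public key" describe the same signature, not two different schemes.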


On 3/10/2023 at 5:56 PM, Thomas A. Fine said:
  • Partly I acknowledge that, but partly, LOL, you are overstating this, what cheat software doesn't auto-aim?

Having been around Counter-Strike for over 20 years, I can say many cheats were single-purpose back in the day: aimbot, wallhack, script/hotkey assistance. The majority of people were content with a wallhack only, as aimbots took several evolutions to reach a subtle level rather than being blatant in the majority of cases.

Of course, cheat "suites" did and still exist, but having/using an aimbot is far from a given. A wallhack alone is more than enough of an edge for a moderately skilled player to enhance their game.

I used to work at CEVO, who developed an anti-cheat. It was novel at the time, but as more and more cheat developers took it apart, they slowly found subtle ways to avoid detection, causing further and further iterations, as is normal with all cheats/anti-cheats.

 

There is no one-size-fits-all for cheat elimination. When dealing with an online game, there are way too many factors that cannot be accounted for. While most anti-cheats will deal with the lowest hanging fruit, there is too much variability and too many options to properly account for them all.

Even in a tournament setting, while the machines themselves can be locked down and vetted, we've seen plenty of peripheral cheaters that can utilize their keyboard/mouse to inject/run their software. Sure, some have been caught by attentive staff, but I can almost guarantee there have been others who have been able to use silent/invisible methods.

