Mouse hardware DPI vs Software DPI

CoolJosh3k

Question, so I know I am not crazy:

 

If you have a mouse that can do 16,000 DPI but set it to only 1,600 DPI, it still physically calculates at 16,000 DPI and then divides by 10, right?

 

Or, in other words: setting your mouse to 1,600 DPI would give exactly the same accuracy as setting it to its maximum of 16,000 DPI and changing your mouse sensitivity in Windows to 10%.

 

I know there are other factors at play, but purely from a DPI standpoint this is all true, right?
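The equivalence claim can be sanity-checked with a toy model (my own sketch; it assumes the sensor emits integer counts per report and that the software scale is a naive per-report divide, which real drivers may not do):

```python
# Toy model of "16,000 DPI scaled to 10% in software" vs native 1,600 DPI.
# Assumption (not from any datasheet): each report carries an integer count,
# and the scaler naively multiplies and truncates per report.

def scaled_counts(reports, scale):
    """Scale each integer report and truncate, with no remainder carry."""
    return sum(int(c * scale) for c in reports)

# One inch of motion at 16,000 DPI, delivered as 2,000 reports of 8 counts:
reports_16k = [8] * 2000          # 16,000 counts in total

# Naive 10% scaling truncates int(8 * 0.1) = 0 on every report,
# so the whole inch of motion is lost:
print(scaled_counts(reports_16k, 0.1))   # 0, not the expected 1,600
```

So the two setups can only be equivalent if the scaler carries the fractional remainder between reports; a per-report truncating divide is strictly worse than running the sensor at 1,600 DPI natively.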


No.

 

Windows controls mouse acceleration and maximum speed.  It does not affect DPI.

 

As for the mouse hardware, I imagine how that works will vary by mouse.  

 

DPI is independent of that, and it controls how accurate the sensor's readings are.


17 minutes ago, tkitch said:

No.

 

Windows controls mouse acceleration and maximum speed.  It does not affect DPI.

 

As for the mouse hardware, I imagine how that works will vary by mouse.  

 

DPI is independent of that, and it controls how accurate the sensor's readings are.

Mouse software tends to have a setting labeled DPI. My assumption is that this setting is software-only and is applied after the hardware does everything at its maximum DPI.


23 minutes ago, CoolJosh3k said:

Mouse software tends to have a setting labeled DPI. My assumption is that this setting is software-only and is applied after the hardware does everything at its maximum DPI.

Most good mice have hardware buttons to control DPI rather than relying on software for it, so it will vary by mouse.


13 hours ago, tkitch said:

Most good mice have hardware buttons to control DPI rather than relying on software for it, so it will vary by mouse.

Well yeah, but the point is the actual sensor. I expect the calculations on the DSP and other parts always run at max DPI, while the mouse software, buttons and so on just do some division to make the output more usable.


Sensors do not calculate everything at the maximum CPI, and there's a reason for that. The sensor is essentially a camera with a limited number of pixels. There's a point at which the count-to-pixel ratio is 1:1, and a range from that point within which you can manipulate the raw data (pixel subdivision) without losing any accuracy.

 

At the maximum CPI, the resolution increments are usually far smaller than the physical resolution of the sensor, which results in ripple artifacts (more commonly known as jitter). Basically, the sensor can't tell exactly where it is, so it jumps between virtual locations, creating that jittery effect in its output. Data downsampled from max CPI will still contain some degree of this inaccuracy.
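The downsampling argument can be illustrated with a toy simulation (entirely my own construction: the noise model and the 1,600 CPI "native" figure are assumptions for illustration, not taken from any real sensor datasheet):

```python
import random

# Toy simulation of the jitter argument: position noise is assumed small
# at or below the sensor's native resolution and larger above it
# (pixel-subdivision jitter). Numbers are illustrative only.

NATIVE_CPI = 1600  # assumed point where the count-to-pixel ratio is ~1:1

def measure(true_inches, cpi, rng):
    """Simulated displacement reading, in counts, at a given CPI setting."""
    noise = (0.5 if cpi <= NATIVE_CPI else 2.0) / NATIVE_CPI
    return round((true_inches + rng.uniform(-noise, noise)) * cpi)

def rms_error(readings_inches, true_inches):
    return (sum((r - true_inches) ** 2 for r in readings_inches)
            / len(readings_inches)) ** 0.5

rng = random.Random(0)
true = 0.0101  # inches moved per report

# Pipeline A: read natively at 1,600 CPI.
native = [measure(true, 1600, rng) / 1600 for _ in range(10000)]
# Pipeline B: read at 16,000 CPI, then downsample by 10 in software.
downsampled = [measure(true, 16000, rng) / 16000 for _ in range(10000)]

# Downsampling divides the counts, but the extra jitter picked up at max
# CPI survives the division:
print(rms_error(native, true) < rms_error(downsampled, true))  # True
```

Under this assumed error model, the downsampled-from-max pipeline ends up noisier than reading natively at the lower CPI, which is the point being made above.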


2 hours ago, TheChromaBristlenose said:

Sensors do not calculate everything at the maximum CPI, and there's a reason for that. The sensor is essentially a camera with a limited number of pixels. There's a point at which the count-to-pixel ratio is 1:1, and a range from that point within which you can manipulate the raw data (pixel subdivision) without losing any accuracy.

 

At the maximum CPI, the resolution increments are usually far smaller than the physical resolution of the sensor, which results in ripple artifacts (more commonly known as jitter). Basically, the sensor can't tell exactly where it is, so it jumps between virtual locations, creating that jittery effect in its output. Data downsampled from max CPI will still contain some degree of this inaccuracy.


Well, I do understand that the image sensor doesn't actually contain enough pixels for such a high DPI, and that instead something like bicubic interpolation is used.

 

I'm theorising that changing the DPI through an interface is just some simple math on the raw data from the mouse's hardware, so that using it is a reasonable experience.

 

In the simplest terms I can think of: I imagine the mouse takes 1 inch of physical movement and translates it into ~16,000 counts, but because that output would make the mouse cursor zip across the user's screen far too fast, an extra software layer on top converts those counts into sub-pixels.
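A sketch of what such a software layer could look like: a scale factor maps high-CPI counts to screen pixels, and the sub-pixel remainder is carried between reports. This is my own illustration of the idea, not how any particular driver or OS pointer pipeline is actually implemented.

```python
# Hypothetical "extra software layer": high-CPI counts come in, a scale
# factor maps them to screen pixels, and the sub-pixel remainder is
# carried so slow movements are not lost to truncation.

class CursorMapper:
    def __init__(self, pixels_per_count):
        self.pixels_per_count = pixels_per_count
        self.remainder = 0.0   # accumulated sub-pixel motion

    def move(self, counts):
        """Return the whole pixels to move; keep the sub-pixel rest."""
        self.remainder += counts * self.pixels_per_count
        pixels = int(self.remainder)
        self.remainder -= pixels
        return pixels

# 16,000-count/inch input scaled so one inch moves the cursor 1,600 px:
mapper = CursorMapper(pixels_per_count=0.1)
moves = [mapper.move(4) for _ in range(10)]   # 40 counts in small steps
print(sum(moves))   # 4 pixels in total; nothing lost to truncation
```

The remainder carry is what makes this usable at any division ratio: each individual report may move zero whole pixels, but the fractional motion accumulates instead of being thrown away.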

