Re: [AD] About Javier Gonzalez's patch
On Thu, Jan 04, 2001 at 12:14:37PM +0100, Eric Botcazou wrote:
> What I wonder is whether we shouldn't also take care of the consistency with
> the other ports of Allegro.
I'm not sure what you mean by this; I don't think it's
possible to be totally consistent. Don't the sizes of mickeys
vary according to the mouse? Some mice certainly implement
sensitivity setting in the hardware, and on Linux at least the
mouse device tends to be read-only to non-root applications, so
you can't always change or read the hardware sensitivity.
> The DirectDraw mouse interface is very basic:
> only raw hardware mickeys are returned without taking any speed or
> resolution factor into account, so we have to do the entire job.
> Now on other platforms (DOS and X11 at least, BeOS?), a significant part
> of this job is performed by the system mouse driver: how are these drivers
> dealing with the screen resolution? Are they using threshold values too, or
> a global scaling factor based on the resolution?
(DOS) Allegro used to use the mouse driver's idea of the mouse
position, sending it information about the size of the screen
and reading back real coordinates. But this was changed in
order to get faster mouse movement -- now the main DOS mouse
driver uses mickeys. So in DOS the screen resolution doesn't
affect the mouse sensitivity -- I don't think the low level
mouse drivers change their sensitivity when you change the
range, but if they do then this would affect the non-mickey DOS
mouse drivers.
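FWIW, the kind of threshold scaling Eric mentions could look
roughly like this in the Windows driver (just a sketch; the names
MICKEY_THRESHOLD, SPEED_FAST and update_position are made up, not
anything in the current code):

   #define MICKEY_THRESHOLD  4   /* mickeys per update above which we accelerate */
   #define SPEED_FAST        2   /* scale factor applied to fast movements */

   static int mouse_pos_x, mouse_pos_y;   /* current cursor position */

   static void update_position(int dx, int dy, int screen_w, int screen_h)
   {
      /* accelerate large movements, leave small ones alone */
      if (dx > MICKEY_THRESHOLD || dx < -MICKEY_THRESHOLD)
         dx *= SPEED_FAST;
      if (dy > MICKEY_THRESHOLD || dy < -MICKEY_THRESHOLD)
         dy *= SPEED_FAST;

      mouse_pos_x += dx;
      mouse_pos_y += dy;

      /* clamp to the screen */
      if (mouse_pos_x < 0)          mouse_pos_x = 0;
      if (mouse_pos_y < 0)          mouse_pos_y = 0;
      if (mouse_pos_x >= screen_w)  mouse_pos_x = screen_w - 1;
      if (mouse_pos_y >= screen_h)  mouse_pos_y = screen_h - 1;
   }

The alternative (a single scaling factor based on the resolution)
is the other approach Eric mentions; I argue against that further
down.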
Linux Allegro currently reads raw mouse data, in all variants of
the driver, so it's basically mickeys. It doesn't take any
notice of the screen resolution.
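For reference, the raw reading amounts to decoding three-byte
PS/2 packets into mickeys, something like the sketch below (the
function name is made up, and it's simplified: it assumes the
device, e.g. /dev/psaux, is already open and in stream mode, and
it ignores resyncing and the other protocols the driver supports):

   #include <unistd.h>

   /* decode one 3-byte PS/2 packet into mickeys and button bits */
   int read_ps2_packet(int fd, int *dx, int *dy, int *buttons)
   {
      unsigned char p[3];

      if (read(fd, p, 3) != 3)
         return -1;

      *buttons = p[0] & 0x07;                    /* L/R/M button bits */
      *dx = p[1] - ((p[0] & 0x10) ? 256 : 0);    /* apply X sign bit */
      *dy = p[2] - ((p[0] & 0x20) ? 256 : 0);    /* apply Y sign bit */
      *dy = -*dy;   /* PS/2 reports Y increasing upwards; we want it down */
      return 0;
   }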
The X mouse driver uses mouse positions provided the user
doesn't try to read mickey information. The coordinates it
returns are the coordinates within the Allegro window of the X
mouse pointer, which is hidden -- so when the X pointer moves
into the Allegro window, it changes to an Allegro pointer until
it moves out again. This makes GUI-type applications behave
nicely. Games will use mickeys instead of just mouse_x etc, and
when they do this the X mouse driver switches into a different
mode in which it clamps the cursor to the centre of the window
and calculates mickeys for each X mouse event by subtracting the
centre coordinates from the reported mouse coordinates, then
warping the pointer back to the centre. I think it stays in
this mode for a few seconds after each call to
get_mouse_mickeys, so if the application stops doing game-type
reading and starts doing GUI-type reading, the pointer will
switch back to the system that suits that better.
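In rough Xlib terms the mickey mode amounts to something like
this (a sketch only, not the actual driver code; win and
centre_x/centre_y stand for the Allegro window and its centre):

   #include <X11/Xlib.h>

   /* turn each motion event into a mickey offset from the window
    * centre, then warp the hidden pointer back to the centre */
   static void handle_motion(Display *dpy, Window win, XMotionEvent *ev,
                             int centre_x, int centre_y,
                             int *mickey_x, int *mickey_y)
   {
      int dx = ev->x - centre_x;
      int dy = ev->y - centre_y;

      /* the warp below generates a motion event of its own, which
       * lands exactly on the centre and so contributes nothing here */
      if (dx == 0 && dy == 0)
         return;

      *mickey_x += dx;
      *mickey_y += dy;

      XWarpPointer(dpy, None, win, 0, 0, 0, 0, centre_x, centre_y);
   }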
So in the X mouse driver the sensitivity in Allegro
automatically matches the sensitivity in X in general, and it's
possible that the screen resolution is taken into account by X.
The DGA2 X mouse driver (from what I remember), and maybe also
the DGA1 one, just receives its data from X as mickey offsets.
I don't know whether that is fairly raw data or already
processed by X's mouse sensitivity code, but this is one step
further away from depending upon the screen resolution.
Generally I think mickey systems shouldn't use the screen
resolution, because the idea of reading mickeys is that you're
not just moving a pointer around. If you're directly
controlling a character in a game (e.g. a 3D first person game)
you tend to want the same mouse motion to give the same result,
no matter what the screen resolution is. Also, the danger with
scaling by the screen resolution is that you'll end up with a
system which can't click on every pixel in high resolution modes.
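That's also why leaving the scaling to the game works fine with
the existing API -- e.g. something like the fragment below (the
SENSITIVITY value and function name are just arbitrary examples):

   #include <allegro.h>

   #define SENSITIVITY 0.15   /* the game's own choice, nothing to do
                                 with the screen resolution */

   void update_player_view(float *heading, float *pitch)
   {
      int mx, my;

      get_mouse_mickeys(&mx, &my);

      *heading += mx * SENSITIVITY;
      *pitch   += my * SENSITIVITY;
   }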
George
--
Random project update:
09/05/2000: Libnet 0.10.8 uploaded -- a few bugfixes
http://www.canvaslink.com/libnet/ (try changes-0.10.8.txt)