Re: [AD] GGI patch for 3.9.21
Peter Wang <tjaden@xxxxxxxxxx> writes:
> After you ./configure you will have to add `-lggi' to the LIBS= line in
> makefile, as I don't understand those .m4 things.
Eventually we need someone to do this properly and make it autodetect things
like GGI, SVGAlib, and fbcon by looking for the relevant libs/headers. I
need to sit down with the autoconf manual and learn it properly, I think.
> automatically emulates different colour modes if required, which makes it
> a bit of a wart, but I thought that was better than having GFX_SAFE
> failing under >8bpp colour depths (eg. under X).
This raises a more general question: does GFX_SAFE always have to be 8 bit,
or not? The same problem arises with fbcon, which currently just won't run
if it can't set an 8 bit mode.
Reasons to make GFX_SAFE allow truecolor displays: it will enable programs
to run even when nothing else is available.
Reasons not to allow them: if the program wasn't expecting it, that might
have some really weird effects!
In practice, I can easily set things up so that gfx_mode_select(), alert(),
etc, will all look sensible no matter what color depth was chosen by
GFX_SAFE. But if this sometimes sets a truecolor mode, a lot of the examples
and tests won't work properly, and I'm not keen to rewrite them all to be
depth independent (that's a big job, and the added complexity would get in
the way of their value as easily understood example programs).
What do people think? Perhaps we need two levels of GFX_SAFE, one which asks
for an ok mode in the current color depth, and one which is totally
desperate and will accept absolutely anything the driver can provide?
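For reference, here is roughly what the depth-independent style looks like
from the program's side: take whatever GFX_SAFE gives you, check the depth
with bitmap_color_depth(), and build colors with makecol() rather than
hardcoded palette indexes. (Just a minimal sketch against the current API;
whether GFX_SAFE can ever hand back a truecolor mode is exactly what's being
debated here.)

   #include <stdio.h>
   #include <allegro.h>

   int main(void)
   {
      allegro_init();
      install_keyboard();

      /* ask for the "always works" mode: today that means 8 bpp, but
         under this proposal it might come back as a truecolor mode */
      if (set_gfx_mode(GFX_SAFE, 320, 200, 0, 0) != 0) {
         printf("Unable to set any graphics mode\n");
         return 1;
      }

      /* makecol() maps an RGB triplet into whatever depth the screen
         ended up with, so the drawing code doesn't care whether it
         got 8, 15, 16, 24 or 32 bpp */
      clear_to_color(screen, makecol(0, 0, 64));
      textprintf(screen, font, 8, 8, makecol(255, 255, 255),
                 "GFX_SAFE gave us a %d bpp mode",
                 bitmap_color_depth(screen));

      readkey();
      return 0;
   }

   END_OF_MAIN()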
> In ggi.c there are a couple of functions which use inline assembly, but
> have bits of C stuck inbetween, which may or may not slow them down. In
> any case, it's ugly. Could some assembly guru clean that up, pleeease?
These are for the asm-interface bank switch routines, right? Or does the GGI
design require asm directly? Anyway, I think it is usually cleaner to do
such things as external .s files: that takes fewer lines of typing than
trying to inline them within a C source, and makes it easier to spot what
needs changing to get the code working on other processors.
> The problem with using the current console driver is that it assumes
> you're using the vga driver (correct me if I'm wrong), and sets up console
> switching, etc. This is not required as libggi automatically does this for
> you (and is conflicting). It also wants root privileges. The problem with
the X driver is that it pops up a window, which ggi also does.
Is there really any benefit to using GGI under X, given that we will soon
have our own native X code?
I think this is really a more general issue: to what extent do we overlap
with GGI, versus to what extent do we sit on top of it.
General policy for Allegro is to sit exactly one level above the very
bottom. ie. we don't talk to hardware directly unless there is absolutely no
other alternative, but we do talk directly to the lowest level through which
we can access the hardware drivers. On Linux there are a lot of different
possibilities for how graphics hardware can be accessed, and I think it is
important to support them all: the worst possible thing would be if there
were ever any machines that have working graphics drivers, but we don't
know how to use them.
The general approach taken by GGI seems to be a little bit more confused: they
are writing hardware drivers and, at the same time, libs that sit on top of
other hardware drivers. We certainly want to use their hardware drivers. We
may also want to use their support for other interfaces, but I'm less
convinced about that: unless there are strong reasons for it, supporting
things through intermediate layers just loses performance. If GGI was a
universal standard, I would say that we should use it exclusively, but it
isn't yet that, so we still have to support all the other options directly
(eg. we need native VGA drivers, SVGAlib driver, fbcon driver, native
support for X and DGA, etc). This being so, I'm inclined to think that we
should use GGI only as a way to access their hardware devices from the Linux
console, and use our own code for all the other possible driver scenarios.
Console switching is an interesting thing. The current code no longer calls
any VGA-specific functions, and works in non-root programs (eg. I can run
fbcon modes as any user), but it does require ownership of the console. This
has the advantage of being consistent, though (if we do it ourselves, we can
impose identical behaviour no matter what devices are providing the actual
driver support), and in any case we need to know about the switches in order
to implement callbacks, shut down the timers and input systems, etc, so it
would still need significant code even if we use GGI to do a lot of this
work. One big advantage to doing the switching ourselves is that we can
reliably disable switches when changing modes. That would be tricky if, for
example, you were going from a VGA mode selection dialog to a GGI in-game
display mode: if you had to transfer ownership of the console during that
change, there is a danger of a virtual console switch coming along at the
wrong time and ruining the whole thing.
I suppose the real question is whether there is any way to bypass the GGI
console switching mechanism. If so, I think we should do that and use the
existing system driver routines instead. If it can't be avoided, though, we
obviously have no choice but to find some way to work around it.
As a matter of historical precedent, when the GGI developers encountered
this identical problem in making GGI run on top of SVGAlib, they did the
same thing which I'm proposing here, fudging some signal handlers to
manually take back control and not let SVGAlib have any access to the switch
events. So the GGI people themselves obviously felt that it was best to
handle the switch in the higher level rather than leaving it up to whatever
other system they are running on top of, and if they managed to make this
work with SVGAlib, I suspect the same method will enable us to make it work
with their system...
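For concreteness, the kind of handler fudging I mean is the standard
VT_PROCESS dance: put the console into process-controlled switching so the
kernel signals us rather than switching on its own, and refuse the release
while a mode change is in progress. This is only a sketch with made-up names
(take_over_vt_switching() etc. are not the actual Allegro internals), but the
ioctls are the plain Linux ones:

   #include <fcntl.h>
   #include <signal.h>
   #include <sys/ioctl.h>
   #include <linux/vt.h>

   static int console_fd;                   /* fd of our virtual console */
   static volatile int changing_mode = 0;   /* set around set_gfx_mode() */

   static void release_vt_handler(int sig)
   {
      /* kernel asks us to switch away: refuse during a mode change,
         otherwise suspend timers/input and allow it */
      if (changing_mode)
         ioctl(console_fd, VT_RELDISP, 0);   /* deny the switch */
      else
         ioctl(console_fd, VT_RELDISP, 1);   /* allow the switch */
   }

   static void acquire_vt_handler(int sig)
   {
      /* we got the console back: acknowledge, then restore the mode */
      ioctl(console_fd, VT_RELDISP, VT_ACKACQ);
   }

   static int take_over_vt_switching(const char *tty)
   {
      struct vt_mode vtm;

      console_fd = open(tty, O_RDWR);
      if (console_fd < 0)
         return -1;

      signal(SIGUSR1, release_vt_handler);
      signal(SIGUSR2, acquire_vt_handler);

      vtm.mode = VT_PROCESS;     /* we decide when switches happen */
      vtm.relsig = SIGUSR1;      /* sent when someone wants to leave */
      vtm.acqsig = SIGUSR2;      /* sent when we get the console back */
      vtm.waitv = 0;
      vtm.frsig = 0;

      return ioctl(console_fd, VT_SETMODE, &vtm);
   }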
> Should I add another system driver, or just modify the current ones as
> required?
Good question: I hadn't thought of doing this as another system driver. On
the one hand, that's a potentially nice way to leave GGI more control over
things like the switch mechanism. But on the other hand it could end up with
a pointless duplication of code, as you'd have to write lots of GGI-specific
input drivers, background mode support, etc, rather than using existing code
for this. Suddenly it would become a much bigger project than just getting
the graphics driver working, as you'd have to write all the OS-specific
functions in a GGI manner, probably using GII for input, etc.
I guess the basic question is, to what extent do we want to adapt Allegro to
the assumptions made by GGI, versus make our GGI driver work alongside the
assumptions made by Allegro? I'm personally biased towards the latter, but
welcome any arguments for the former approach...
--
Shawn Hargreaves - shawn@xxxxxxxxxx - http://www.talula.demon.co.uk/
"A binary is barely software: it's more like hardware on a floppy disk."