Re: [AD] Color convertors
>> This means that an Allegro palette needs to fit into this system one, so a
>> conversion is still needed. So yes, the driver needs to update the LUT
>> every time the Allegro palette changes...
>
>That's very slow, isn't it?
>Why not use the following scheme:
>
>            Allegro palette             driver LUT
>8-bit index ---------------> 12-bit RGB ----------> 8-bit index
>
>In this way, you don't need to recalculate the LUT each time the palette
>changes. The drawback is of course you use one more lookup operation for the
>conversion.
I thought about the possibility of using such a method when all this started;
well, I came to the conclusion that the 256-byte LUT method is faster in most
cases. That's because you normally do just one lookup operation per pixel with
it, and the only case of possible slowdowns is if you do intense palette
updates (color cycling, for example), which are not that common.
On the other hand, with your method you -always- do two lookup ops per pixel,
and that's not that fast IMO...
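
To make the comparison concrete, here is a rough C sketch of the two schemes
(the table names and layouts are only illustrative, not the actual Allegro
code):

/* Scheme A: a single 256-byte LUT mapping Allegro palette indices straight
 * to system palette indices.  One lookup per pixel, but the table has to
 * be rebuilt whenever the Allegro palette changes. */
static unsigned char direct_lut[256];

static void blit_direct(const unsigned char *src, unsigned char *dst, int len)
{
   while (len--)
      *dst++ = direct_lut[*src++];                 /* one lookup per pixel */
}

/* Scheme B: go through 12-bit RGB.  The first table (rebuilt on palette
 * changes) maps the Allegro index to a packed 4:4:4 RGB value; the second,
 * 4096-entry table is fixed for the video mode and maps that value to a
 * system index. */
static unsigned short pal_to_rgb12[256];    /* rebuilt on palette change */
static unsigned char rgb12_to_sys[4096];    /* fixed for the video mode  */

static void blit_two_step(const unsigned char *src, unsigned char *dst, int len)
{
   while (len--)
      *dst++ = rgb12_to_sys[pal_to_rgb12[*src++]]; /* two lookups per pixel */
}

The only per-pixel difference is the extra indirection in blit_two_step();
the cost of scheme A shows up only when the LUT has to be rebuilt after a
palette change.
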
> I just re-added the support for color conversion to 8-bit with a slightly
> simplified interface: _get_colorconv_map() doesn't take the color depth as a
> parameter any more and _release_colorconv_map() doesn't exist any longer.
> The latter has been merged into _release_colorconv_blitter().
Ok, I've updated the BeOS code too to reflect the changes. Later, when all the
conversion routines are ready, I'll test everything to make sure there are no
problems. Then, finally and hopefully, we'll be ready to release WIP 3.9.38... =)
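
For reference, this is roughly how I expect a port driver to hook into the
revised interface. The signatures, the header location and the
find_nearest_system_color() helper below are my assumptions based on the
description above, not the actual code:

#include "allegro.h"
#include "allegro/aintern.h"    /* assumed location of the internal protos */

static COLORCONV_BLITTER_FUNC *blitter;

static int my_gfx_init(void)
{
   /* ask for an 8 -> 8 bit conversion blitter */
   blitter = _get_colorconv_blitter(8, 8);
   return (blitter) ? 0 : -1;
}

static void my_gfx_set_palette(const RGB *pal)
{
   /* the 256-byte LUT is now fetched without a depth argument */
   unsigned char *map = _get_colorconv_map();
   int i;

   for (i = 0; i < 256; i++)
      map[i] = find_nearest_system_color(&pal[i]);  /* hypothetical helper */
}

static void my_gfx_exit(void)
{
   /* no separate _release_colorconv_map() any more: releasing the blitter
    * releases the map as well */
   _release_colorconv_blitter(blitter);
   blitter = NULL;
}
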
PS: in colconv.c, at line 209 there is currently a "return;", but shouldn't it
be a "break;"?
--
Angelo Mottola
a.mottola@xxxxxxxxxx
http://www.ecplusplus.com