Re: [hatari-devel] Less dark doubled TV-monitor mode for 32-bit output



On 11/6/19 10:38 AM, Thomas Huth wrote:
On Wed, 6 Nov 2019 01:31:32 +0200,
Eero Tamminen <oak@xxxxxxxxxxxxxx> wrote:
and ST screen conversion macros in src/convert/macros.h rely
on them; they don't use SDL_PixelFormat.

No, they don't rely on these hard-coded values. They use STRGBPalette[]
which should contain the right values.

OK, good to know.

It seems that only AVI recording and screenshot functionality
use SDL_PixelFormat (through PixelConvert_* functions).

We've had endianness issues with the AVI code in the past; that's why it
uses SDL_PixelFormat. It's needed. Really.

So instead of discussing, why don't you simply try it?

Something like this (not tested, just scribbled here):

     uint32_t *next, *line;
     SDL_PixelFormat *fmt = surf->format;
     uint32_t mask = ((fmt->Rmask >> 1) & fmt->Rmask)
                   | ((fmt->Gmask >> 1) & fmt->Gmask)
                   | ((fmt->Bmask >> 1) & fmt->Bmask);

     ... loop ...
           *next++ = (*line++ >> 1) & mask;

That's certainly less code in the loop body than shifting each byte on
its own.

Thanks, that works great!

I pushed the changes, as they should be a clear improvement
to the current code (lower CPU usage for doubling, simpler code,
TV-mode that's not too dark).

	- Eero
