RE: [AD] TRUE and FALSE



I remember that when programming on the Atari ST (the platform where Allegro
was born), TRUE would often be defined as -1. In binary, -1 is the value with
all bits set, and 1 is the value with only bit 0 set. For a 1-bit value, you
get either -1 or 1 depending on whether you interpret the bit as the value
itself or as the two's complement sign bit.
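
For example (plain C, nothing Allegro-specific; assumes a two's complement
machine, which is what the ST and PCs use):

    #include <stdio.h>

    int main(void)
    {
        int all_bits_set = ~0; /* every bit set: reads back as -1 in two's complement */
        int bit0_only    = 1;  /* only bit 0 set: reads back as 1 */

        printf("%d %d\n", all_bits_set, bit0_only); /* prints: -1 1 */
        return 0;
    }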

Keeping the definition of TRUE as -1 makes it inconsistent with other libs
that use TRUE = 1. If the code is properly written (e.g.
"if (bool1 && bool2)" instead of "if (bool1 == bool2)"), that shouldn't be a
problem. However, if we make the change in Allegro, we could end up breaking
code such as "X += FunctionThatReturnsABool()". That code isn't written
properly either, but it would break, and the problem would be hard to find
when compiling an old program against the new Allegro.
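
To make the breakage concrete, here's a hypothetical snippet (the variables
and FunctionThatReturnsABool are made up, not Allegro code):

    #include <stdio.h>

    #define TRUE  (-1)   /* the old Allegro definition */
    #define FALSE (0)

    /* Hypothetical helper, purely for illustration. */
    static int FunctionThatReturnsABool(void) { return TRUE; }

    int main(void)
    {
        int bool1 = TRUE;   /* -1 */
        int bool2 = 1;      /* "true" from a lib where TRUE == 1 */
        int x = 0;

        if (bool1 && bool2)      /* safe: any non-zero value counts as true */
            printf("&& agrees they're both true\n");

        if (bool1 == bool2)      /* fragile: -1 != 1, so this branch is skipped */
            printf("== agrees they're both true\n");

        x += FunctionThatReturnsABool();
        printf("x = %d\n", x);   /* -1 with the old TRUE; would become 1 */
        return 0;
    }

Code that sticks to && keeps working whichever value TRUE has; the other two
patterns are the ones that silently change behaviour.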

I think that on the PC, TRUE is generally defined as 1. Does anyone know
what it's generally defined as on Macs? I'm sure that on the Atari ST, -1
was often used.


AE.



>-----Original Message-----
>From: alleg-developers-admin@xxxxxxxxxx
>[mailto:alleg-developers-admin@xxxxxxxxxx]On Behalf Of Ben
>Davis
>Sent: 22 April 2003 15:17
>To: alleg-developers@xxxxxxxxxx
>Subject: [AD] TRUE and FALSE
>
>
>Hi,
>
>This has bugged me for ages. Why is TRUE defined as -1? The
>canonical truth
>value in C is 1. Would it break too many programs to change it? Or
>could it
>be deprecated in favour of AL_TRUE (and AL_FALSE) perhaps?
>
>If someone makes this blunder in Allegro 5, I shall personally kill that
>person. >:E~~~
>
>:)
>
>Ben
>
>
>--
>https://lists.sourceforge.net/lists/listinfo/alleg-developers
>




