
Spot the mistake


7

unread,
Jul 10, 2011, 6:36:53 PM7/10/11
to
Spot the mistake
----------------

Spot the deliberate mistake:

#define SMALL 1;

main()
{
int i = SMALL;
printf("i=%i", SMALL);
}


Got any more good ones?
Please share.

High Plains Thumper

unread,
Jul 10, 2011, 7:17:38 PM7/10/11
to
7 wrote:

COLA Troll: "Linux has less than 1% marketshare."

and

COLA Troll: "Linux is good at motion picture industry video / sound
production and editing, good at high speed scientific and financial
computing, but as a desktop it sucks, as a small studio sound editor it
sucks."

--
HPT

Hadron

unread,
Jul 10, 2011, 8:00:58 PM7/10/11
to

7 <email_at_www_at_en...@enemygadgets.com> writes:

It's not a good one at all. It's not a mistake anyone with a clue about
C would make, that's for sure. The ";"...

You also don't use i, but that's typical sloppiness.

And we won't even discuss you not doing the necessary includes.


Chris Ahlstrom

unread,
Jul 10, 2011, 9:16:31 PM7/10/11
to
7 wrote this copyrighted missive and expects royalties:

> Spot the mistake
> ----------------
>
> Spot the deliberate mistake:
>
> #define SMALL 1;

The semi-colon will be expanded as part of the macro, causing the printf to
fail to compile.

Also, you need to include <stdio.h>. If you leave it out, gcc will still
recognize printf() as a built-in function, but it will issue a warning
about an incompatible implicit declaration.
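
To make that concrete, here is roughly what the translation unit looks like
once the preprocessor has substituted SMALL (a sketch; the line that would
not compile is left as a comment so the rest still builds and runs):

#include <stdio.h>

int main(void)
{
    int i = 1;;               /* "int i = SMALL;" after expansion: the extra ";"
                                 is just an empty statement, so this still compiles */
    /* printf("i=%i", 1;); */ /* "printf("i=%i", SMALL);" after expansion: the ";"
                                 lands inside the argument list, a syntax error */
    printf("i=%i\n", i);
    return 0;
}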

> main()
> {
> int i = SMALL;
> printf("i=%i", SMALL);
> }
>
> Got any more good ones?
> Please share.

Nah, hardly anyone here cares about coding puzzles; I'd rather concentrate
on working code.

--
This is for all ill-treated fellows
Unborn and unbegot,
For them to read when they're in trouble
And I am not.
-- A. E. Housman

Chris Ahlstrom

unread,
Jul 10, 2011, 9:34:24 PM7/10/11
to
Chris Ahlstrom wrote this copyrighted missive and expects royalties:

> 7 wrote this copyrighted missive and expects royalties:
>
>> Spot the mistake
>> ----------------
>>
>> Spot the deliberate mistake:
>>
>> #define SMALL 1;
>
> The semi-colon will be expanded as part of the macro, causing the printf to
> fail to compile.
>
> Also, you need to include <stdio.h>. Although gcc will recognize printf()
> anyway as a built-in function and issue a warning about an incompatible
> implicit declaration, if you leave it out.
>
>> main()
>> {
>> int i = SMALL;
>> printf("i=%i", SMALL);
>> }
>>
>> Got any more good ones?
>> Please share.
>
> Nah, hardly anyone here cares about coding puzzles; I'd rather concentrate
> on working code.

======= I should say, "more practical code"

Damn, I listened to that Badger Badger Badger tune too much, now it is stuck
in my head, an Ohrwurm.

--
I might have gone to West Point, but I was too proud to speak to a congressman.
-- Will Rogers

Gregory Shearman

unread,
Jul 10, 2011, 11:51:31 PM7/10/11
to
On 2011-07-11, Chris Ahlstrom <ahls...@xzoozy.com> wrote:
> 7 wrote this copyrighted missive and expects royalties:
>
>> Spot the mistake
>> ----------------
>>
>> Spot the deliberate mistake:
>>
>> #define SMALL 1;
>
> The semi-colon will be expanded as part of the macro, causing the printf to
> fail to compile.
>
> Also, you need to include <stdio.h>. Although gcc will recognize printf()
> anyway as a built-in function and issue a warning about an incompatible
> implicit declaration, if you leave it out.
>
>> main()
>> {
>> int i = SMALL;
>> printf("i=%i", SMALL);
>> }
>>
>> Got any more good ones?
>> Please share.
>
> Nah, hardly anyone here cares about coding puzzles; I'd rather concentrate
> on working code.
>

Isn't the variable "i" redundant?

--
Regards,
Gregory.
Gentoo Linux - Penguin Power

Chris Ahlstrom

unread,
Jul 11, 2011, 6:20:12 AM7/11/11
to
Gregory Shearman wrote this copyrighted missive and expects royalties:

Good catch, sort of. If you add the "-Wall" switch to gcc it will remind
you that 'i' is unused.

There are also two other issues that -Wall reminds us of...

1. The implicit return type of main is 'int' [and we don't declare
'int main()' explicitly].

2. We left out a 'return <int>;' statement.

So here is the version of the code that passes through "gcc -Wall" without
any warnings or worse:

#include <stdio.h>
#define SMALL 1

int
main()
{
printf("SMALL=%i", SMALL);
return 0;
}

Well hello world!

--
If this is a service economy, why is the service so bad?

Chris Ahlstrom

unread,
Jul 11, 2011, 6:32:20 AM7/11/11
to
Chris Ahlstrom wrote this copyrighted missive and expects royalties:

And damn, there's still a minor issue that even the compiler doesn't spot.

The return type of printf() is not 'void', it is 'int', and we're not using
it, so, conventionally, we should flag it with a cast:

(void) printf("SMALL=%i", SMALL);

Neither cppcheck nor splint flags the missing '(void)' cast, though,
since it really is just a reminder to other coders that you're
deliberately throwing away the return value.

Have we beat this one to death yet?

Would that we could apply the same methods to certain trolls!

--
Are you a parent? Do you sometimes find yourself unsure as to what to
say in those awkward situations? Worry no more...

Do as I say, not as I do.
Do me a favour and don't tell me about it. I don't want to know.
What did you do *this* time?
If it didn't taste bad, it wouldn't be good for you.
When I was your age...
I won't love you if you keep doing that.
Think of all the starving children in India.
If there's one thing I hate, it's a liar.
I'm going to kill you.
Way to go, clumsy.
If you don't like it, you can lump it.

Siegfried Vereneseneckockkrockoff

unread,
Jul 11, 2011, 1:30:51 PM7/11/11
to
"7" <email_at_www_at_en...@enemygadgets.com> schreef in
bericht news:9QpSp.27018$NX1....@newsfe18.ams2...

http://www.gowrikumar.com/c/
Fuck off twat and stuff your silly games up your ass.

Sven Grönberg

unread,
Jul 11, 2011, 1:35:52 PM7/11/11
to

"7" <email_at_www_at_en...@enemygadgets.com> wrote in message
news:9QpSp.27018$NX1....@newsfe18.ams2...

> Spot the mistake

yes. your momma's abortion didn't take and you managed to survive with but
a few coat-hanger scrapes across your skull.

Homer

unread,
Jul 11, 2011, 4:31:39 PM7/11/11
to
Verily I say unto thee, that Chris Ahlstrom spake thusly:

> So here is the version of the code that passes through "gcc -Wall"
> without any warnings or worse:
>
> #include <stdio.h>
> #define SMALL 1
>
> int
> main()
> {
> printf("SMALL=%i", SMALL);
> return 0;
> }
>
> Well hello world!

What small change needs to be made in the following to produce the
intended output, and what is that output?:

#include <stdio.h>

int hmm[ ] = {0x6c64210a, 0x20576f72, 0x656c6c6f, 0x00000048 };

void erm(char* umm) {
char* c = umm;
if (umm > (char *)&hmm[3])
erm(umm+1);
printf("%c", *c);
}

int main(int argc, char **argv) {
erm((char*)hmm);
return 0;
}

Good luck, and no cheating!


######
answer=ww0ECQMCjRMpoMJlmx9gyUs9U3R/WUWMKjJqPvAo5jLMoK43lmwnCh8scf5gZCoHKxL5k\
OlwLYE2VsfA1LEoY67t0SSzhHIvCtGIkV6kozfui4JEXHmGrm1KrAY=

Decrypt with:
echo -e "$answer" | base64 -d | mdecrypt

--
K. | Thy name
http://slated.org | Shalt not
Fedora 8 (Werewolf) on šky | Take the vein
kernel 2.6.31.5, up 49 days | Of my root

7

unread,
Jul 11, 2011, 4:49:30 PM7/11/11
to
Chris Ahlstrom wrote:

Nearly:


"The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile."

Correct - but in reality what I actually do is exploit that to
make it intentionally fail!

e.g. I could easily write

if(SMALL) { do something } else { do something else }

That is bad programming - for the most part, I know I would never write
if(SMALL) ...
because if I set SMALL to 2, 3 or 4, then everything is OK when configuring
the software, but if I accidentally set SMALL to 0, the behaviour of the
if() statement changes, and that would be an unintentional side effect.

If I accidentally wrote the code with if(SMALL), it would not fail, and the
mistake is especially hard to spot if it is buried in a complex formula.
And there is no warning of impending doom.

So by putting the semicolon in #define SMALL 1;
I've made sure that, on compiling, it is guaranteed to fail
when used out of context.
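
A minimal sketch of that trick, assuming the semicolon-terminated define;
do_something() and do_something_else() are just the placeholders from the
example above, not real functions, and the offending line is left commented
out so the file still compiles:

#define SMALL 1;    /* the deliberately semicolon-terminated define */

int main(void)
{
    /* The next line, if uncommented, would expand to "if (1;) { ... }",
       a syntax error, which is exactly the guaranteed compile-time
       failure described above. */
    /* if (SMALL) { do_something(); } else { do_something_else(); } */
    return 0;
}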

7

unread,
Jul 11, 2011, 4:50:58 PM7/11/11
to
Siegfried Vereneseneckockkrockoff wrote:

> "7" <email_at_www_at_en...@enemygadgets.com> schreef in
> bericht news:9QpSp.27018$NX1....@newsfe18.ams2...
>> Spot the mistake
>> ----------------
>>
>> Spot the deliberate mistake:
>>
>>
>>
>> #define SMALL 1;
>>
>> main()
>> {
>> int i = SMALL;
>> printf("i=%i", SMALL);
>> }
>>
>>
>> Got any more good ones?
>> Please share.
>>
>
> http://www.gowrikumar.com/c/


Why thank you troll, that was very thoughtful of you.


> Fuck off twat and stuff your silly games up your ass.

Charming.

Did you get out of bed the wrong way this morning?

cc

unread,
Jul 11, 2011, 4:53:15 PM7/11/11
to
On Jul 11, 4:49 pm, 7

<email_at_www_at_enemygadgets_dot_...@enemygadgets.com> wrote:
> Chris Ahlstrom wrote:
> > Chris Ahlstrom wrote this copyrighted missive and expects royalties:
>
> >> Gregory Shearman wrote this copyrighted missive and expects royalties:
>

That is incredibly poor coding practice.

cc

unread,
Jul 11, 2011, 5:04:04 PM7/11/11
to
On Jul 11, 6:32 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
> Chris Ahlstrom wrote this copyrighted missive and expects royalties:
>
>
>
>
>
> > Gregory Shearman wrote this copyrighted missive and expects royalties:
>

It's not necessary in conforming C code, which is why it's not
flagged. On certain older, non-conforming compilers though, you needed
a (void) cast to shut up the warning. Compilers know you're not using
the return value and will create the same code if you cast it to void
or not. Using it as a comment to let others know you're throwing away
the return value is perfectly acceptable though.

Chris Ahlstrom

unread,
Jul 11, 2011, 5:12:14 PM7/11/11
to
7 wrote this copyrighted missive and expects royalties:

> Siegfried Vereneseneckockkrockoff wrote:


>
>> "7" <email_at_www_at_en...@enemygadgets.com> schreef in
>> bericht news:9QpSp.27018$NX1....@newsfe18.ams2...
>>> Spot the mistake
>>> ----------------
>>>
>>> Spot the deliberate mistake:
>>>
>>> #define SMALL 1;
>>>
>>> main()
>>> {
>>> int i = SMALL;
>>> printf("i=%i", SMALL);
>>> }
>>>
>>> Got any more good ones?
>>> Please share.
>>
>> http://www.gowrikumar.com/c/
>
> Why thank you troll, that was very thoughtful of you.
>

>> <expletive deleted>


>
> Charming.
> Did you get out of bed the wrong way this morning?

Bed? I'm sure he sleeps in a coffin!

Anyway, Gimpel Software had an ongoing "bug of the month" advertisement in
C/C++ User's Journal, and it was fun to work them out. Usually pretty easy,
occasionally a bit of a stumper.

--
Example is not the main thing in influencing others. It is the only thing.
-- Albert Schweitzer

Chris Ahlstrom

unread,
Jul 11, 2011, 5:26:24 PM7/11/11
to
cc wrote this copyrighted missive and expects royalties:

> On Jul 11, 6:32 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>
>> Neither cppcheck or splint flag the lack of use of the '(void)' cast,
>> though, since it really is just a reminder to other coders that
>> you're deliberately throwing away the return value.
>
> It's not necessary in conforming C code, which is why it's not
> flagged. On certain older, non-conforming compilers though, you needed
> a (void) cast to shut up the warning. Compilers know you're not using
> the return value and will create the same code if you cast it to void
> or not. Using it as a comment to let others know you're throwing away
> the return value is perfectly acceptable though.

Ah, now I know why you call yourself, 'cc' <grin>.

--
Men occasionally stumble over the truth, but most of them pick themselves
up and hurry off as if nothing had happened.
-- Winston Churchill

7

unread,
Jul 11, 2011, 5:42:43 PM7/11/11
to
cc wrote:

And being a troll, you will now prove your claim to be true
by suggesting your own preferred solution.

Popcorn and LOLs at the ready.


Gregory Shearman

unread,
Jul 12, 2011, 2:08:24 AM7/12/11
to

Hello World!

What about endianness? We don't all work on intel CPUs...

Chris Ahlstrom

unread,
Jul 12, 2011, 6:27:33 AM7/12/11
to
Gregory Shearman wrote this copyrighted missive and expects royalties:

> On 2011-07-11, Homer <use...@slated.org> wrote:

Heh.

3| // l d ! * _ W o r e l l o H
4| int hmm[ ] = {0x6c64210a, 0x20576f72, 0x656c6c6f, 0x00000048 };

And under the debugger (at the first call, Intel CPU):

(gdb) display umm
1: umm = 0x600930 "\n!dlroW olleH"

The code exits after the first character (newline) is printed.
Change the '>' to a '<' (Intel CPU).
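
For completeness, a sketch of the puzzle with that single character changed;
on a little-endian machine it prints "Hello World!" followed by a newline:

#include <stdio.h>

int hmm[] = { 0x6c64210a, 0x20576f72, 0x656c6c6f, 0x00000048 };

void erm(char *umm)
{
    char *c = umm;
    if (umm < (char *)&hmm[3])   /* the original puzzle had '>' here */
        erm(umm + 1);            /* recurse to the far end first... */
    printf("%c", *c);            /* ...then print on the way back out */
}

int main(int argc, char **argv)
{
    erm((char *)hmm);
    return 0;
}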

--
Illiterate? Write today, for free help!

cc

unread,
Jul 12, 2011, 8:15:52 AM7/12/11
to
On Jul 11, 5:42 pm, 7

There is no way #define SMALL 1; is good coding practice, and I'd like
to hear someone with a grasp of the English language explain how it
is. First, lines like int i = SMALL (no semicolon) now become legal.
If someone else is reading your code, they will go, what the fuck.
Second, the fact that you are using it to prevent one case (by making
it fail to compile) is useless. You should know what your defines mean
and what their legal values are. For future maintenance, you
should have a comment explicitly stating that SMALL cannot be zero.
You said if SMALL is zero "execution of if() statement will change and
that would have been an unintentional side effect." In that case why
are you using an if statement? You obviously want it to always be true
and always execute. You should know that when you're writing your code.
It's bad coding practice because the problem it solves is not a
problem at all, and the problems it could introduce are large.
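
A short sketch of that readability trap, assuming SMALL is the
semicolon-terminated define from the original post:

#define SMALL 1;

int main(void)
{
    int i = SMALL     /* no ";" written here, yet it compiles: the macro supplies one */
    int j = SMALL     /* a reader sees two apparently unterminated statements... */
    return i + j;     /* ...but the preprocessor has already made them legal */
}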

Hadron

unread,
Jul 12, 2011, 8:34:24 AM7/12/11
to
cc <scat...@hotmail.com> writes:

>
> That is incredibly poor coding practice.

He's an idiot. A real bona fide idiot.

He's also probably the type of twat who when learning C puts

#define BEGIN

type things in so he can "write it like pascal"....

Wow. What a clueless moron he is.

nessuno

unread,
Jul 12, 2011, 8:22:55 AM7/12/11
to
On Jul 11, 12:32 pm, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
> Chris Ahlstrom wrote this copyrighted missive and expects royalties:
>
>
>
> > Gregory Shearman wrote this copyrighted missive and expects royalties:
>

I'd rather just pull the handle, watch 'em swirl around a few times,
and down they go! Very low tech.

Hadron

unread,
Jul 12, 2011, 8:49:13 AM7/12/11
to
cc <scat...@hotmail.com> writes:

> There is no way #define SMALL 1; is good coding practice and I'd like
> to hear someone with a grasp of the English language explain how it
> is. First, lines like int i = SMALL (no semicolon) now become legal.
> If someone else is reading your code, they will go, what the fuck.

And THAT is the main crux of the matter despite the other horrific
nasties.

This is yet another example of a clueless twat learning enough to be
dangerous.

"7" should stick to "Gambas" or whatever it his he, and 3 other people,
use.

Siegfried Vereneseneckockkrockoff

unread,
Jul 12, 2011, 10:44:45 AM7/12/11
to
"Chris Ahlstrom" <ahls...@xzoozy.com> schreef in bericht
news:ivfp5l$a2o$4...@dont-email.me...

>7 wrote this copyrighted missive and expects royalties:
>
>> Siegfried Vereneseneckockkrockoff wrote:
>>
>>> "7" <email_at_www_at_en...@enemygadgets.com> schreef in
>>> bericht news:9QpSp.27018$NX1....@newsfe18.ams2...
>>>> Spot the mistake
>>>> ----------------
>>>>
>>>> Spot the deliberate mistake:
>>>>
>>>> #define SMALL 1;
>>>>
>>>> main()
>>>> {
>>>> int i = SMALL;
>>>> printf("i=%i", SMALL);
>>>> }
>>>>
>>>> Got any more good ones?
>>>> Please share.
>>>
>>> http://www.gowrikumar.com/c/
>>
>> Why thank you troll, that was very thoughtful of you.
>>
>>> <expletive deleted>
>>
>> Charming.
>> Did you get out of bed the wrong way this morning?
>
> Bed? I'm sure he sleeps in a coffin!
>
Too stinky? Use this!
http://img.chan4chan.com/img/2009-04-02/1238674690529.jpg

Gary Stewart

unread,
Jul 12, 2011, 10:51:40 AM7/12/11
to

You'll need a case of the stuff to remove the stench left by that
gasbag Goblin.

--
7/12/2011 10:51:16 AM
Gary Stewart

Please visit our hall of Linux idiots.
http://linuxidiots.blogspot.com/

Watching Linux Fail:
http://limuxwatch.blogspot.com/

Come laugh at Linux "advocacy" with us!

http://www.youtube.com/social/blog/techrights-org

Linux's dismal desktop market share:

http://royal.pingdom.com/2011/05/12/the-top-20-strongholds-for-desktop-linux/

Desktop Linux: The Dream Is Dead
"By the time Microsoft released the Windows 7 beta
in January 2009, Linux had clearly lost its chance at desktop
glory."
http://www.pcworld.com/businesscenter/article/207999/desktop_linux_the_dream_is_dead.html

Desktop Linux on Life Support:

http://www.techradar.com/news/software/operating-systems/is-linux-on-the-desktop-dead--961508

When I use the term Linux I am speaking of desktop Linux unless
otherwise stated.

Homer

unread,
Jul 12, 2011, 12:17:02 PM7/12/11
to
Verily I say unto thee, that Gregory Shearman spake thusly:

> On 2011-07-11, Homer <use...@slated.org> wrote:

>> #include <stdio.h>
>>
>> int hmm[ ] = {0x6c64210a, 0x20576f72, 0x656c6c6f, 0x00000048 };
>>
>> void erm(char* umm) {
>> char* c = umm;

/* if (umm > (char *)&hmm[3]) */


if (umm < (char *)&hmm[3])
>> erm(umm+1);
>> printf("%c", *c);
>> }
>>
>> int main(int argc, char **argv) {
>> erm((char*)hmm);
>> return 0;
>> }

[...]


> Hello World!
>
> What about endianness? We don't all work on intel CPUs...

Yup, it's just an obfuscation hack. It could be made all the more
obfuscated if the data were taken from the results of other parts of a
larger program, rather than a nice, convenient array. :)

You could also test the endian order and data model then byteswap the
results accordingly, although you'd also need to define the input data
correctly so it matched, or even get the system itself to provide the
input data in its native format, using a less system-dependent
obfuscation technique.
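
For what it's worth, a minimal sketch of the kind of run-time byte-order
test alluded to above; the helper name is invented for illustration and
nothing here comes from the original puzzle:

#include <stdio.h>
#include <stdint.h>

/* Invented helper: report the host's byte order at run time by looking at
   the first byte of a known 16-bit value. */
static int host_is_little_endian(void)
{
    const uint16_t probe = 1;
    return *(const unsigned char *)&probe == 1;
}

int main(void)
{
    printf("this host is %s-endian\n",
           host_is_little_endian() ? "little" : "big");
    return 0;
}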

--
K. | Thy name
http://slated.org | Shalt not
Fedora 8 (Werewolf) on šky | Take the vein

kernel 2.6.31.5, up 50 days | Of my root

Siegfried Vereneseneckockkrockoff

unread,
Jul 12, 2011, 1:29:57 PM7/12/11
to
"Gary Stewart" <stewart...@oohay.com> schreef in bericht
news:1ec9g89vl2jv6$.17hdh8szxwao7$.dlg@40tude.net...
Yup, he just has to do something useful with all that gas!
http://www.fatback.net/Gas_Grill.jpg

Gary Stewart

unread,
Jul 12, 2011, 1:40:12 PM7/12/11
to

Hahahha!

I'll bet sMarti has one of those too!

--
7/12/2011 1:39:55 PM

7

unread,
Jul 12, 2011, 2:35:41 PM7/12/11
to
cc wrote:


Your ability to post without trolling is noted.

However, you are still wrong, because of your shallow experience
coding in C.

Not all #defines are used in the same way.

Some #defines are to define constants intended for general expressions.
Other #defines are intended to be used only as macros.

The two purposes are not distinguished by the compiler,
and mixing them can get you into a horrible mess.

If you want a #define to be used only as a macro,
and to prevent it from being used in an expression,
then that is a way to do it.

For example, a #define for turning an LED on and off should only be
used as a macro, and should never be used in an expression.


cc

unread,
Jul 12, 2011, 2:57:47 PM7/12/11
to
On Jul 12, 2:35 pm, 7

You should also know that just by looking at the define, and therefore not
use it in an expression. Which is also why it's common practice to
name your #defines and variables something meaningful, so there's less
chance of that happening. I have never seen a #define with a
semicolon on the end described as acceptable. Maybe ask in clc to
confirm. The closest thing I can think of for what you're talking
about is wrapping a function-like macro in do..while to make it behave
like a single statement (see the sketch below). You shouldn't be using a
semicolon in a #define to prevent a macro from being used in an
expression, though. That's bad coding practice. If you don't want your
macro used in an expression, then don't use it in an expression. That's
acceptable coding practice. Using sloppy code so the compiler prevents
you from doing something you don't want to do is unacceptable.
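
For readers unfamiliar with the idiom mentioned above, here is a hedged
sketch of the do { ... } while (0) wrapper, with an invented LED "register"
so the example is self-contained (none of these names come from the thread):

#include <stdio.h>

static unsigned led_port;   /* stand-in for a memory-mapped hardware register */

/* Statement-like macros: the do { ... } while (0) wrapper lets each call be
   terminated with an ordinary ";" and still nest safely in an unbraced if/else. */
#define LED_ON()   do { led_port |= 0x01u; } while (0)
#define LED_OFF()  do { led_port &= ~0x01u; } while (0)

int main(void)
{
    int enable = 1;

    if (enable)
        LED_ON();           /* expands cleanly; no dangling-else surprises */
    else
        LED_OFF();

    printf("led_port = 0x%02x\n", led_port);
    return 0;
}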

I know there are some other coders in here. Anyone else think
otherwise?

Chris

unread,
Jul 12, 2011, 4:24:38 PM7/12/11
to
On Tue, 12 Jul 2011 11:57:47 -0700, cc wrote:

> You should also know that by looking at the define and therefore not use
> it in an expression. Which is also why it's common practice to name your
> #defines and variables something meaningful so there's less likely of a
> chance for it to occur. I have never seen #define with a semicolon on
> the end being defined as acceptable.

...

> I know there are some other coders in here. Anyone else think otherwise?

It's not a matter of thinking for me as I'm not very good at C yet. But I
know where to find some example code. :)

~ % grep -R -E "#define.*;$" /usr/src/linux-3.0.0-rc7-mainline | grep -v "(.*)"
/usr/src/linux-3.0.0-rc7-mainline/scripts/genksyms/lex.c_shipped:#define YY_BREAK break;
/usr/src/linux-3.0.0-rc7-mainline/scripts/kconfig/lex.zconf.c_shipped:#define YY_BREAK break;
/usr/src/linux-3.0.0-rc7-mainline/scripts/kconfig/lex.zconf.c:#define YY_BREAK break;
/usr/src/linux-3.0.0-rc7-mainline/scripts/dtc/dtc-lexer.lex.c_shipped:#define YY_BREAK break;
/usr/src/linux-3.0.0-rc7-mainline/arch/x86/include/asm/xen/interface.h:#define XEN_EMULATE_PREFIX .byte 0x0f,0x0b,0x78,0x65,0x6e ;
/usr/src/linux-3.0.0-rc7-mainline/include/linux/kd.h:#define KDKBDREP 0x4B52 /* set keyboard delay/repeat rate;
/usr/src/linux-3.0.0-rc7-mainline/include/linux/rtnetlink.h:#define RTPROT_REDIRECT 1 /* Route installed by ICMP redirects;
/usr/src/linux-3.0.0-rc7-mainline/include/linux/ext3_fs.h:#define i_frag osd2.hurd2.h_i_frag;
/usr/src/linux-3.0.0-rc7-mainline/include/linux/ext3_fs.h:#define i_fsize osd2.hurd2.h_i_fsize;
/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:#define LDO_MAX_VOLT 3300;

So here we have an example:
#define LDO_MAX_VOLT 3300;

Hadron

unread,
Jul 12, 2011, 4:45:54 PM7/12/11
to
cc <scat...@hotmail.com> writes:

What he is doing is ridiculous. Simple. A bit like Ahsltrom recommending
that guy to cover his compiler warnings with casts. It smacks of
amaterism where he knows just enough to be dangerous. He thinks he's
being clever but is in fact coding in horrendous non standard gimicks
which will trip others up later (not that anyone will be touching his
code - he doesnt release any to the FOSS community) - a bit like dicks
who override integer "+" in C++ thinking they're being "all OO"...

cc

unread,
Jul 12, 2011, 4:48:10 PM7/12/11
to

All coding standards I've been involved with would not allow that. I'm
particularly confused about #define YY_BREAK break; First off, why
define it? break is a standard C statement and must be portable, so I
see no need to define it. Second, you could have a bare line in the
code of just

YY_BREAK

That's definitely not been acceptable for any C code I've ever been
involved in. I'll have to look through the Linux Kernel Code Standards
and see what their reasoning is. Interesting find, but I still think
it does more harm than good and should be changed.

7

unread,
Jul 12, 2011, 4:57:06 PM7/12/11
to
cc wrote:


I can't believe you are serious!!!!!!!!!!!!!!!!!!!!!!

Gary Stewart

unread,
Jul 12, 2011, 5:00:11 PM7/12/11
to
On Tue, 12 Jul 2011 21:57:06 +0100, 7 wrote:


> I can't believe you are serious!!!!!!!!!!!!!!!!!!!!!!

A poster who calls himself "The President of COLA"
in comp.os.linux.advocacy has been claiming that he has won
the European Inventor Of The Year award.

His real name is Joseph Michael and he goes by the moniker "7".

Here is a link to him making the claim:

http://tinyurl.com/3odhvwp


This claim is fully supported without challenge by the group's
Linux supporters including but not limited to:

Chris Ahlstrom
Gregory Shearman
Roy Schestowitz.
chrisv
Chris
Peter Kohlmann
Goblin
and several others.

I am suggesting that each person who reads this message contact the
EPO which is the organization that presents the award and ask them
about Mr. Joseph Michael's claims.

Here is their website.

http://www.epo.org/news-issues/european-inventor.html

http://www.epo.org/service-support/contact-us.html

Surely the President of comp.os.linux.advocacy should
have no problem with people verifying the claims he has
been making for years.

How about it Chrissy Ahlstrom, will you be the first to
step up and show that you aren't the spineless git you
seem to be?

Or how about you Goblin?
You try to portray yourself as a genteel person.
Let's see what kind of integrity you really have.


--
7/12/2011 4:59:19 PM

cc

unread,
Jul 12, 2011, 4:53:05 PM7/12/11
to
> see no need to use a define it. Second, you could have a bare line in the

> code of just
>
> YY_BREAK
>
> That's definitely not been acceptable for any C code I've ever been
> involved in. I'll have to look through the Linux Kernel Code Standards
> and see what their reasoning is. Interesting find, but I still think
> it does more harm than good and should be changed.

In Chapter 12: Macros, Enums and RTL of the Kernel Code Standards it
doesn't mention one way or another about semicolons on #defines, so I
don't know what to think.

Also, what I meant by "break is a standard C statement and must be
portable, so I see no need to use a define it" was that often times in
kernel code you'll see #defines for different function calls depending
on what OS you're on, because certain OSes have different system calls.
break is universal.

cc

unread,
Jul 12, 2011, 5:02:49 PM7/12/11
to
On Jul 12, 4:57 pm, 7

Besides the fact that you wrote the #define so you should know whether
or not it can be used in an expression, you should also comment it if
it can't be. If you're not commenting your code with things like /*
DON'T USE THIS DEFINE IN AN EXPRESSION */ then that's also
unacceptable.

Chris Ahlstrom

unread,
Jul 12, 2011, 5:18:08 PM7/12/11
to
cc wrote this copyrighted missive and expects royalties:

> On Jul 12, 4:48 pm, cc <scatnu...@hotmail.com> wrote:
>>
>> That's definitely not been acceptable for any C code I've ever been
>> involved in. I'll have to look through the Linux Kernel Code Standards
>> and see what their reasoning is. Interesting find, but I still think
>> it does more harm than good and should be changed.
>
> In Chapter 12: Macros, Enums and RTL of the Kernel Code Standards it
> doesn't mention one way or another about semicolons on #defines, so I
> don't know what to think.
>
> Also, what I meant by "break is a standard C statement and must be
> portable, so I see no need to use a define it" was that often times in
> kernel code you'll see #defines for different function calls depending
> on what OS your on because certain OSes have different system calls.
> break is universal.

I hate #defines and typedefs like UINT.

C'mon man! Just use "unsigned" or "unsigned int". UINT is for weasels.
Looks like Qt4 and Microsoft use it.

By the way, the LDO_MAX_VOLT macro doesn't occur in my current (on this
machine) kernel source, 2.6.35.

--
"Our reruns are better than theirs."
-- Nick at Nite

Gary Stewart

unread,
Jul 12, 2011, 5:24:38 PM7/12/11
to
On Tue, 12 Jul 2011 17:18:08 -0400, Chris Ahlstrom wrote:


>
> I hate #defines and typedefs like UINT.

I'm sure they hate you too Chris Ahlstrom.
You're not very likeable.


--
7/12/2011 5:23:46 PM

Gregory Shearman

unread,
Jul 12, 2011, 5:36:13 PM7/12/11
to
On 2011-07-12, Homer <use...@slated.org> wrote:
> Verily I say unto thee, that Gregory Shearman spake thusly:
>> On 2011-07-11, Homer <use...@slated.org> wrote:
>
>>> #include <stdio.h>
>>>
>>> int hmm[ ] = {0x6c64210a, 0x20576f72, 0x656c6c6f, 0x00000048 };
>>>
>>> void erm(char* umm) {
>>> char* c = umm;
> /* if (umm > (char *)&hmm[3]) */
> if (umm < (char *)&hmm[3])
>>> erm(umm+1);
>>> printf("%c", *c);
>>> }
>>>
>>> int main(int argc, char **argv) {
>>> erm((char*)hmm);
>>> return 0;
>>> }
> [...]
>> Hello World!
>>
>> What about endianness? We don't all work on intel CPUs...
>
> Yup, it's just an obfuscation hack. It could be made all the more
> obfuscated if the data were taken from the results of other parts of a
> larger program, rather than a nice, convenient array. :)

Ugh! That's how stack overflows happen... 8-)

Snit

unread,
Jul 12, 2011, 7:39:20 PM7/12/11
to
cc stated in post
4b0d8407-d9c8-4aa9...@z39g2000yqz.googlegroups.com on 7/12/11
1:53 PM:

>> That's definitely not been acceptable for any C code I've ever been
>> involved in. I'll have to look through the Linux Kernel Code Standards
>> and see what their reasoning is. Interesting find, but I still think
>> it does more harm than good and should be changed.
>
> In Chapter 12: Macros, Enums and RTL of the Kernel Code Standards it
> doesn't mention one way or another about semicolons on #defines, so I
> don't know what to think.

You should think what "we're supposed to". :)


--
[INSERT .SIG HERE]


Chris

unread,
Jul 12, 2011, 7:50:26 PM7/12/11
to
On Tue, 12 Jul 2011 17:00:11 -0400, Gary Stewart wrote:

> On Tue, 12 Jul 2011 21:57:06 +0100, 7 wrote:
>> I can't believe you are serious!!!!!!!!!!!!!!!!!!!!!!
> A poster who

Offtopic: -1
Repeatedly spammed: -1
Not even an attempt to answer seriously to replies to that spam: -1

> calls himself "The President of COLA"
> in comp.os.linux.advocacy has been claiming that he has won the European
> Inventor Of The Year award.

Not getting the humorous "title" of "President": -1
Ignoring logical impossibility: -1
Ignoring given explanation: -1

> His real name is Joseph Michael and he goes by the moniker "7".
> Here is a link to him making the claim:
> http://tinyurl.com/3odhvwp

Actually providing a link when asked to: +1
Had to be asked too many times: -1
The linked website not actually containing that claim: -1

> This claim is fully supported without challenge by the group's Linux
> supporters including but not limited to:

Still building on an unproven premise: -1

> Chris Ahlstrom Gregory Shearman Roy Schestowitz.
> chrisv Chris Peter Kohlmann Goblin and several others.

Ignoring my explanation and still lying: -1

> I am suggesting that each person who reads this message contact the EPO
> which is the organization that presents the award and ask them about Mr.
> Joseph Michael's claims.
> Here is their website.
> http://www.epo.org/news-issues/european-inventor.html
> http://www.epo.org/service-support/contact-us.html

Illogical suggestion based on unproven premise: -1

> Surely the President of comp.os.linux.advocacy should have no problem
> with people verifying the claims he has been making for years.

You should have no problems verifying your own claims. Yet you ignore
facts: -1

> How about it Chrissy Ahlstrom, will you be the first to step up and show
> that you aren't the spineless git you seem to be?
> Or how about you Goblin?
> You try to portray yourself as a genteel person.
> Let's see what kind of integrity you really have.

Useless ad hominem attack: -1

If I counted correctly I have given you -12 points.
But I'll give an extra -1 for
http://ompldr.org/vOWZlZg
As everyone interested can read: In "my" subtree in which you didn't
answer I explained that I do not fully support the claims of 7 having won
"the" European Inventor of the Year award because I see no evidence
whatsoever that he claimed that. You have failed to prove so, too.
I think that makes you a liar.

Homer

unread,
Jul 12, 2011, 9:35:22 PM7/12/11
to
Verily I say unto thee, that Gregory Shearman spake thusly:
> On 2011-07-12, Homer <use...@slated.org> wrote:
>> Verily I say unto thee, that Gregory Shearman spake thusly:

>>> What about endianness? We don't all work on intel CPUs...


>>
>> Yup, it's just an obfuscation hack. It could be made all the more
>> obfuscated if the data were taken from the results of other parts of
>> a larger program, rather than a nice, convenient array. :)
>
> Ugh! That's how stack overflows happen... 8-)

Not on my watch :p

Talking of stacks, I wonder if it's possible to create a series of steps
so obfuscated that they cannot be traced and interpreted by a human, but
can by a compiler/interpreter/debugger (in order to actually run)?

Even encrypted data would need to be decrypted at some point.

Sounds implausible, but interesting nonetheless.

chrisv

unread,
Jul 13, 2011, 6:29:02 AM7/13/11
to
Chris wrote:

>If I counted correctly I have given you -12 points.

I think that qualifies the troll for a free kick in the arse with a
steel-toed boot.

>But I'll give an extra -1 for
>http://ompldr.org/vOWZlZg
>As everyone interested can read: In "my" subtree in which you didn't
>answer I explained that I do not fully support the claims of 7 having won
>"the" European Inventor of the Year award because I see no evidence
>whatsoever that he claimed that. You have failed to prove so, too.
>I think that makes you a liar.

Oh, the troll is a liar, all right. And proud of it!

--
"Many Android users are far from happy." - "True Linux advocate"
Hadron Quark

Linux Lizard

unread,
Jul 13, 2011, 9:24:44 AM7/13/11
to

rat shit "chrisv" <chr...@nospam.invalid> wrote in message
news:gmsq17p7cg5nn7a34...@4ax.com...

> Chris wrote:
>
>>If I counted correctly I have given you -12 points.
>
> I think that qualifies the troll for a free kick in the arse with a
> steel-toed boot.
>
>>But I'll give an extra -1 for
>>http://ompldr.org/vOWZlZg
>>As everyone interested can read: In "my" subtree in which you didn't
>>answer I explained that I do not fully support the claims of 7 having won
>>"the" European Inventor of the Year award because I see no evidence
>>whatsoever that he claimed that. You have failed to prove so, too.
>>I think that makes you a liar.
>
> Oh, the troll is a liar, all right. And proud of it!
>

more fine advocacy from the retarded turd who's too stupid to do anything
but talk about trolls all day.

"chrisv" is a liar. "chrisv" is a piece of shit.


cc

unread,
Jul 13, 2011, 2:29:18 PM7/13/11
to
On Jul 12, 5:18 pm, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
> cc wrote this copyrighted missive and expects royalties:
>
>
>
>
>
> > On Jul 12, 4:48 pm, cc <scatnu...@hotmail.com> wrote:
>
> >> That's definitely not been acceptable for any C code I've ever been
> >> involved in. I'll have to look through the Linux Kernel Code Standards
> >> and see what their reasoning is. Interesting find, but I still think
> >> it does more harm than good and should be changed.
>
> > In Chapter 12: Macros, Enums and RTL of the Kernel Code Standards it
> > doesn't mention one way or another about semicolons on #defines, so I
> > don't know what to think.
>
> > Also, what I meant by "break is a standard C statement and must be
> > portable, so I see no need to use a define it" was that often times in
> > kernel code you'll see #defines for different function calls depending
> > on what OS your on because certain OSes have different system calls.
> > break is universal.
>
> I hate #defines and typedefs like UINT.

I agree, although that seems way more common.

> C'mon man!  Just use "unsigned" or "unsigned int".  UINT is for weasels.
> Looks like Qt4 and Microsoft use it.
>
> By the way, the LDO_MAX_VOLT macro doesn't occur in my current (on this
> machine) kernel source, 2.6.35.
>

I started a thread in clc. I still think it's bad practice, but I'm
interested in what the pedants there have to say.

--
"There are no general principles in HCI." - John M. Carroll

cc

unread,
Jul 13, 2011, 4:40:44 PM7/13/11
to

http://groups.google.com/group/comp.lang.c/browse_thread/thread/974604928f05f867?hl=en#

It seems like everyone is agreeing that it's not a good idea. So...

Chris

unread,
Jul 13, 2011, 5:34:34 PM7/13/11
to
On Wed, 13 Jul 2011 13:40:44 -0700, cc wrote:

> http://groups.google.com/group/comp.lang.c/browse_thread/thread/974604928f05f867?hl=en#
>
> It seems like everyone is agreeing that it's not a good idea. So...

By the way the example is still there in mainline:

http://git.kernel.org/?p=linux/kernel/git/torvalds/linux-2.6.git;a=blob;f=include/linux/mfd/tps65910.h;h=8bb85b930c0783a5fb697a1cfa3daf3b7cd49254;hb=HEAD#l272

And maybe one over there is right that it's simply an error considering
the line above:

#define LDO_MIN_VOLT 1000
#define LDO_MAX_VOLT 3300;

I'd consider that rather unexpected. But I haven't looked at much code so
maybe this is indeed useful for something. But I don't know and I'm not
_that_ interested. :)

Snit

unread,
Jul 13, 2011, 6:25:05 PM7/13/11
to
cc stated in post
4a7729c1-70a8-4983...@y13g2000yqy.googlegroups.com on 7/13/11
1:40 PM:

>> I started a thread in clc. I still think it's bad practice, but I'm
>> interested in what the pedants there have to say.
>>
>
> http://groups.google.com/group/comp.lang.c/browse_thread/thread/974604928f05f867?hl=en#
>
> It seems like everyone is agreeing that it's not a good idea. So...

A shame you were not honest in that thread.


--
[INSERT .SIG HERE]


cc

unread,
Jul 14, 2011, 7:59:50 AM7/14/11
to
On Jul 13, 6:25 pm, Snit <use...@gallopinginsanity.com> wrote:
> cc stated in post
> 4a7729c1-70a8-4983-9a35-5f03bedc4...@y13g2000yqy.googlegroups.com on 7/13/11

> 1:40 PM:
>
> >> I started a thread in clc. I still think it's bad practice, but I'm
> >> interested in what the pedants there have to say.
>
> >http://groups.google.com/group/comp.lang.c/browse_thread/thread/97460...

> > 67?hl=en#
>
> > It seems like everyone is agreeing that it's not a good idea. So...
>
> A shame you were not honest in that thread.
>

How was I dishonest?

Snit

unread,
Jul 14, 2011, 10:28:04 AM7/14/11
to
cc stated in post
0b5869cd-9ae9-49bc...@10g2000yqn.googlegroups.com on 7/14/11
4:59 AM:

> On Jul 13, 6:25 pm, Snit <use...@gallopinginsanity.com> wrote:
>> cc stated in post
>> 4a7729c1-70a8-4983-9a35-5f03bedc4...@y13g2000yqy.googlegroups.com on 7/13/11
>> 1:40 PM:
>>
>>>> I started a thread in clc. I still think it's bad practice, but I'm
>>>> interested in what the pedants there have to say.
>>
>>> http://groups.google.com/group/comp.lang.c/browse_thread/thread/97460...
>>> 67?hl=en#
>>
>>> It seems like everyone is agreeing that it's not a good idea. So...
>>
>> A shame you were not honest in that thread.
>
> How was I dishonest?

This is a great question from you: it shows dishonesty has become such a
habit you cannot even tell when you are being dishonest... or you are
dishonest in so many ways you cannot even tell where you were caught.

> --
> "There are no general principles in HCI." - John M. Carroll

Wow... you just cannot let that debate you lost go. By the way, doing a
search on that phrase, I find only *you* saying it. Only you.

Not Carroll. Care to point to a reference where he actually said that? Oh,
wait... was that from the alleged email to you where he denounced his public
stance?

Hey, in 5 seconds of searching:

<http://www.situatedgaming.com/CISHCIExam/carroll.html>

And what do you know! He gives a list of principles (what he calls
"dimensions"). And, of course, it includes "consistency":
-----
Consistency: similar semantics are expressed in similar syntactic forms
-----

And then there are classes such as "Principles of User Interface Design,
Implementation and Evaluation" where they have as suggested reading, "HCI
Models, Theories, and Frameworks: Toward a Multidisciplinary Science by John
Carroll" <http://social.cs.uiuc.edu/class/cs465/>

On and on and on... I mean, really, you lost a Usenet debate and made a bit
of a fool of yourself. Oh no!

Let it go.


--
[INSERT .SIG HERE]


cc

unread,
Jul 14, 2011, 10:50:29 AM7/14/11
to
On Jul 14, 10:28 am, Snit <use...@gallopinginsanity.com> wrote:
> cc stated in post
> 0b5869cd-9ae9-49bc-bb1b-d5b9651a0...@10g2000yqn.googlegroups.com on 7/14/11

> 4:59 AM:
>
>
>
>
>
> > On Jul 13, 6:25 pm, Snit <use...@gallopinginsanity.com> wrote:
> >> cc stated in post
> >> 4a7729c1-70a8-4983-9a35-5f03bedc4...@y13g2000yqy.googlegroups.com on 7/13/11
> >> 1:40 PM:
>
> >>>> I started a thread in clc. I still think it's bad practice, but I'm
> >>>> interested in what the pedants there have to say.
>
> >>>http://groups.google.com/group/comp.lang.c/browse_thread/thread/97460...
> >>> 67?hl=en#
>
> >>> It seems like everyone is agreeing that it's not a good idea. So...
>
> >> A shame you were not honest in that thread.
>
> > How was I dishonest?
>
> This is a great question from you: it shows dishonesty has become such a
> habit you cannot even tell when you are being dishonest... or you are
> dishonest in so many ways you cannot even tell where you were caught.


I feel like I accurately represented 7 and his statements in the clc
thread. If you feel otherwise, I would like to know why. The fact that
others have perused the thread and no one has said anything about me
misrepresenting the so called President of COLA in any way (which
would of course be the first thing they would jump on) makes me think
you have misunderstood something.

Snit

unread,
Jul 14, 2011, 11:00:58 AM7/14/11
to
cc stated in post
195fe581-a328-4713...@q1g2000vbj.googlegroups.com on 7/14/11
7:50 AM:

> On Jul 14, 10:28 am, Snit <use...@gallopinginsanity.com> wrote:
>> cc stated in post
>> 0b5869cd-9ae9-49bc-bb1b-d5b9651a0...@10g2000yqn.googlegroups.com on 7/14/11
>> 4:59 AM:
>>
>>
>>
>>
>>
>>> On Jul 13, 6:25 pm, Snit <use...@gallopinginsanity.com> wrote:
>>>> cc stated in post
>>>> 4a7729c1-70a8-4983-9a35-5f03bedc4...@y13g2000yqy.googlegroups.com on
>>>> 7/13/11
>>>> 1:40 PM:
>>
>>>>>> I started a thread in clc. I still think it's bad practice, but I'm
>>>>>> interested in what the pedants there have to say.
>>
>>>>> http://groups.google.com/group/comp.lang.c/browse_thread/thread/97460...
>
>>>>> 67?hl=en#
>>
>>>>> It seems like everyone is agreeing that it's not a good idea. So...
>>
>>>> A shame you were not honest in that thread.
>>
>>> How was I dishonest?
>>
>> This is a great question from you: it shows dishonesty has become such a
>> habit you cannot even tell when you are being dishonest... or you are
>> dishonest in so many ways you cannot even tell where you were caught.
>
> I feel like I accurately represented 7 and his statements in the clc
> thread.

He is a "very good friend"? Really? Even though in this very thread he
calls you a "troll".

He claimed it was "perfectly acceptable"? Where? Looking now I see where
he said:
-----


So by putting semicolon in #define SMALL 1;
I've made sure on compiling it it is guaranteed to fail
when used out of context.

-----


> If you feel otherwise, I would like to know why. The fact that
> others have perused the thread and no one has said anything about me
> misrepresenting the so called President of COLA in any way (which
> would of course be the first thing they would jump on) makes me think
> you have misunderstood something.


--
[INSERT .SIG HERE]


cc

unread,
Jul 14, 2011, 11:11:13 AM7/14/11
to
On Jul 14, 11:00 am, Snit <use...@gallopinginsanity.com> wrote:
> cc stated in post
> 195fe581-a328-4713-971d-b13876b39...@q1g2000vbj.googlegroups.com on 7/14/11

> 7:50 AM:
>
>
>
>
>
> > On Jul 14, 10:28 am, Snit <use...@gallopinginsanity.com> wrote:
> >> cc stated in post
> >> 0b5869cd-9ae9-49bc-bb1b-d5b9651a0...@10g2000yqn.googlegroups.com on 7/14/11
> >> 4:59 AM:
>
> >>> On Jul 13, 6:25 pm, Snit <use...@gallopinginsanity.com> wrote:
> >>>> cc stated in post
> >>>> 4a7729c1-70a8-4983-9a35-5f03bedc4...@y13g2000yqy.googlegroups.com on
> >>>> 7/13/11
> >>>> 1:40 PM:
>
> >>>>>> I started a thread in clc. I still think it's bad practice, but I'm
> >>>>>> interested in what the pedants there have to say.
>
> >>>>>http://groups.google.com/group/comp.lang.c/browse_thread/thread/97460...
>
> >>>>> 67?hl=en#
>
> >>>>> It seems like everyone is agreeing that it's not a good idea. So...
>
> >>>> A shame you were not honest in that thread.
>
> >>> How was I dishonest?
>
> >> This is a great question from you: it shows dishonesty has become such a
> >> habit you cannot even tell when you are being dishonest... or you are
> >> dishonest in so many ways you cannot even tell where you were caught.
>
> > I feel like I accurately represented 7 and his statements in the clc
> > thread.
>
> He is a "very good friend"?  Really?  Even though in this very thread he
> calls you a "troll".

Haha, really? That was an inside joke for those from COLA reading the
thread. Of course 7 isn't my very good friend. But I apologize for
jokingly calling 7 my very good friend. I don't know him at all.

> He claimed it was "perfectly acceptable"?   Where?  Looking now I see where
> he said:

http://groups.google.com/group/comp.os.linux.advocacy/msg/f66f2e6b68a0c1ec?hl=en&dmode=source

If he didn't think it was acceptable, why did he do it, and why did he
ask me to explain (and call me a troll) when I said it was poor coding
practice?

>     -----
>     So by putting semicolon in #define SMALL 1;
>     I've made sure on compiling it it is guaranteed to fail
>     when used out of context.
>     -----
>

So he did it (and apparently does it regularly) but he doesn't think
it's acceptable, but he decided to argue that it was okay anyway?
Really, you're stretching here. But if you feel I've been dishonest, I
suggest you refer the good people of clc to 7's original post so they
have the full story. I would not want clc to be unknowingly deceived.

Snit

unread,
Jul 14, 2011, 11:36:27 AM7/14/11
to
cc stated in post
501c9c42-34e5-4671...@gh5g2000vbb.googlegroups.com on 7/14/11
8:11 AM:

>>>> This is a great question from you: it shows dishonesty has become such a
>>>> habit you cannot even tell when you are being dishonest... or you are
>>>> dishonest in so many ways you cannot even tell where you were caught.
>>
>>> I feel like I accurately represented 7 and his statements in the clc
>>> thread.
>>
>> He is a "very good friend"?  Really?  Even though in this very thread he
>> calls you a "troll".
>
> Haha, really? That was an inside joke for those from COLA reading the
> thread. Of course 7 isn't my very good friend. But I apologize for
> jokingly calling 7 my very good friend. I don't know him at all.

You misrepresented the purpose of your post in the other forum.

>> He claimed it was "perfectly acceptable"?   Where?  Looking now I see where
>> he said:
>
> http://groups.google.com/group/comp.os.linux.advocacy/msg/f66f2e6b68a0c1ec?hl=
> en&dmode=source
>
> If he didn't think it was acceptable, why did he do it, and why did he
> ask me to explain (and call me a troll) when I said it was poor coding
> practice?

He did something in a thread *he* started and named "Spot the mistake"...
and you openly misrepresented this as him claiming it was "perfectly
acceptable".

This is your second misrepresentation.

>>     -----
>>     So by putting semicolon in #define SMALL 1;
>>     I've made sure on compiling it it is guaranteed to fail
>>     when used out of context.
>>     -----
>
> So he did it (and apparently does it regularly) but he doesn't think
> it's acceptable, but he decided to argue that it was okay anyway?


"guaranteed to fail" is not "okay".

> Really, you're stretching here. But if you feel I've been dishonest, I
> suggest you refer the good people of clc to 7's original post so they
> have the full story. I would not want clc to be unknowingly deceived.

You were openly dishonest about the reasons for the code and your
relationship with the coder.

--
[INSERT .SIG HERE]


cc

unread,
Jul 14, 2011, 1:59:42 PM7/14/11
to
On Jul 14, 11:36 am, Snit <use...@gallopinginsanity.com> wrote:
> cc stated in post
> 501c9c42-34e5-4671-8af7-70b54ebe2...@gh5g2000vbb.googlegroups.com on 7/14/11

> 8:11 AM:
>
> >>>> This is a great question from you: it shows dishonesty has become such a
> >>>> habit you cannot even tell when you are being dishonest... or you are
> >>>> dishonest in so many ways you cannot even tell where you were caught.
>
> >>> I feel like I accurately represented 7 and his statements in the clc
> >>> thread.
>
> >> He is a "very good friend"?  Really?  Even though in this very thread he
> >> calls you a "troll".
>
> > Haha, really? That was an inside joke for those from COLA reading the
> > thread. Of course 7 isn't my very good friend. But I apologize for
> > jokingly calling 7 my very good friend. I don't know him at all.
>
> You misrepresented the purpose of your post in the other forum.


No I didn't. I mentioned the disagreement 7 had with me (and Hadron
and anyone who has ever coded in C). That was the purpose of the
post. It was not my purpose to convince people that 7 and I are
friends.

> >> He claimed it was "perfectly acceptable"?   Where?  Looking now I see where
> >> he said:
>

> >http://groups.google.com/group/comp.os.linux.advocacy/msg/f66f2e6b68a...


> > en&dmode=source
>
> > If he didn't think it was acceptable, why did he do it, and why did he
> > ask me to explain (and call me a troll) when I said it was poor coding
> > practice?
>
> He did something in a thread  *he* started and named "Spot the mistake"...
> and you openly misrepresented this as him claiming it was "perfectly
> acceptable".
>
> This is your second misrepresentation.

Spot the intentional mistake, if you read his first post.

> >>     -----
> >>     So by putting semicolon in #define SMALL 1;
> >>     I've made sure on compiling it it is guaranteed to fail
> >>     when used out of context.
> >>     -----
>
> > So he did it (and apparently does it regularly) but he doesn't think
> > it's acceptable, but he decided to argue that it was okay anyway?
>
> "guaranteed to fail" is not "okay".

You obviously don't understand what he's trying to accomplish. In this
case, according to 7's reasoning, guaranteed to fail IS okay because
he didn't want it to succeed. That also makes it poor coding practice,
as I and others have mentioned. Really, you're the only one who seems
to think I'm misrepresenting 7's thoughts. Not even 7 feels that way.

> > Really, you're stretching here. But if you feel I've been dishonest, I
> > suggest you refer the good people of clc to 7's original post so they
> > have the full story. I would not want clc to be unknowingly deceived.
>
> You were openly dishonest about the reasons for the code and your
> relationship with the coder.
>

I specifically said in the other thread that he was using it to make
statements like if(SMALL) fail to compile. That is in that thread.
Look it up. Beyond that, I suggest you let everyone in clc know I was
dishonest, and for what reasons. The overwhelming yawns will be
interesting to see.

If anyone else thinks I was dishonest in that thread I would like to
hear it. I don't see 7 commenting any more, but I don't see him
complaining on how I characterized his thoughts on #defines with
semicolons either. I would welcome input from someone who at least
understands what 7 was trying to accomplish (however wrong it may be).

Snit

unread,
Jul 14, 2011, 6:27:13 PM7/14/11
to
cc stated in post
d5df4d9e-7f11-405f...@e21g2000vbz.googlegroups.com on 7/14/11
10:59 AM:

Huh? Why would I talk about you to other people? My goodness, you think the
world revolves around you.



> If anyone else thinks I was dishonest in that thread I would like to
> hear it. I don't see 7 commenting any more, but I don't see him
> complaining on how I characterized his thoughts on #defines with
> semicolons either. I would welcome input from someone who at least
> understands what 7 was trying to accomplish (however wrong it may be).

You lied and said he was a "very good friend".
You lied when you claimed that a mistake in the code in a post he labeled
"spot the mistake" was "perfectly acceptable" code.

But you have no problem with your actions. Not unless others call you out
on your lies... even just one person doing so does not matter.

For me: if I lied, even if nobody knew it - it would still matter to me.

Just another way we are different. Not judging.


--
[INSERT .SIG HERE]


cc

unread,
Jul 15, 2011, 7:04:00 AM7/15/11
to
On Jul 14, 6:27 pm, Snit <use...@gallopinginsanity.com> wrote:
> cc stated in post
> d5df4d9e-7f11-405f-bf28-98178b415...@e21g2000vbz.googlegroups.com on 7/14/11

Well if I was being lied to, I would like to know.

> > If anyone else thinks I was dishonest in that thread I would like to
> > hear it. I don't see 7 commenting any more, but I don't see him
> > complaining on how I characterized his thoughts on #defines with
> > semicolons either. I would welcome input from someone who at least
> > understands what 7 was trying to accomplish (however wrong it may be).
>
> You lied and said he as a "very good friend".

It was a joke, Snit. Do you not understand that?

> You lied when you claimed that a mistake in the code in a post he labeled
> "spot the mistake" was "perfectly acceptable" code.

He does think it's perfectly acceptable! You don't understand C and
what he was trying to show. He explicitly called it an intentional
mistake. He also mentioned why he did it, which is what he thought was
acceptable. You don't understand this. Ask Hadron to explain it to
you.


> But you have no problem with your actions.  Not unless others call you out
> on your lies... even just one person doing so does not matter.

I have no problems joking around and also telling the exact truth
about the situation. Seriously, have someone else explain to you what
7 was trying to accomplish, and why it is poor practice.


> For me: if I lied, even if nobody know it - it would still matter to me.

I didn't lie. You're just too dumb to realize it. Ask anyone with some
C knowledge if I was lying. I dare you. Then come back here and
apologize.

> Just another way we are different.  Not judging.
>

We certainly are different. I can answer questions and I'm not a
moron.

cc

unread,
Jul 15, 2011, 7:14:36 AM7/15/11
to

I went back and made another post clarifying things. I quoted 7's
entire post to them. If they change their answer based on that, I'd be
shocked, since it's exactly what I said before. But I'll keep you
posted.

Snit

unread,
Jul 15, 2011, 11:18:39 AM7/15/11
to
cc stated in post
f8026f7f-5fa7-4d3e...@gv8g2000vbb.googlegroups.com on 7/15/11
4:04 AM:

>>> I specifically said in the other thread that he was using it to make
>>> statements like if(SMALL) fail to compile. That is in that thread.
>>> Look it up. Beyond that, I suggest you let everyone in clc know I was
>>> dishonest, and for what reasons. The overwhelming yawns will be
>>> interesting to see.
>>
>> Huh?  Why would I talk about you to other people.  My goodness you think the
>> world revolves around you.
>
> Well if I was being lied to, I would like to know.

Who said anyone was lying to you? And, frankly, why would you care if you
were lied to about some people you know nothing about? But if you are
feeling bad about your lying then *you* should go confess. Why do you want
me to do your dirty work?

>>> If anyone else thinks I was dishonest in that thread I would like to
>>> hear it. I don't see 7 commenting any more, but I don't see him
>>> complaining on how I characterized his thoughts on #defines with
>>> semicolons either. I would welcome input from someone who at least
>>> understands what 7 was trying to accomplish (however wrong it may be).
>>
>> You lied and said he as a "very good friend".
>
> It was a joke, Snit. Do you not understand that?

Ah, of course. Nice retcon.

>> You lied when you claimed that a mistake in the code in a post he labeled
>> "spot the mistake" was "perfectly acceptable" code.
>
> He does think it's perfectly acceptable! You don't understand C and
> what he was trying to show. He explicitly called it an intentional
> mistake. He also mentioned why he did it, which is what he thought was
> acceptable. You don't understand this. Ask Hadron to explain it to
> you.

I am merely noting your dishonesty.

--
[INSERT .SIG HERE]


Snit

unread,
Jul 15, 2011, 11:20:42 AM7/15/11
to
cc stated in post
37d672b0-5f65-40bb...@j15g2000yqf.googlegroups.com on 7/15/11
4:14 AM:

>>> Huh? Why would I talk about you to other people. My goodness you think the
>>> world revolves around you.
>>
>> Well if I was being lied to, I would like to know.
>>
>
> I went back and made another post clarifying things. I quoted 7's
> entire post to them. If they change their answer based on that, I'd be
> shocked, since it's exactly what I said before. But I'll keep you
> posted.

Good to see you felt guilty enough to admit to your lies. I bet they don't
really care about you lying about people they do not even know... but we
shall see.


--
[INSERT .SIG HERE]


Snit

unread,
Jul 15, 2011, 11:29:45 AM7/15/11
to
cc stated in post
b69b0465-efd0-4193...@s2g2000vbw.googlegroups.com on 7/13/11
11:29 AM:

> "There are no general principles in HCI." - John M. Carroll

Please point to the reference where he allegedly said this.

Hint: there is none.

You lied.


--
[INSERT .SIG HERE]


cc

unread,
Jul 15, 2011, 11:30:44 AM7/15/11
to
On Jul 15, 11:20 am, Snit <use...@gallopinginsanity.com> wrote:
> cc stated in post
> 37d672b0-5f65-40bb-a45c-e009ffa92...@j15g2000yqf.googlegroups.com on 7/15/11

I didn't lie. I made a joke, and said exactly what 7 said. You
obviously have never written a line of code in your life, otherwise
you would see that. Notice how 7 calls me a troll, Chris A has me
killfiled from time to time, etc. etc., people consider me a
"wintroll", but no one but you has claimed I lied about what 7 said!
Hadron explicitly agreed with what I wrote, and the two Chris's made
comments on it. No one said I was lying about anything, except of
course you. Perhaps 7 doesn't understand what he wrote and you do.
Perhaps the others don't understand C or what 7 wrote, but you do.
This is a mindboggling situation where the people who are usually so
quick to jump all over any mistake I make had absolutely no problem
with how I characterized 7's statements. And yet you do somehow. So
either all the rest of us, who have written code and know about C and
one of which actually wrote the comments in question, are incorrect,
or you are. Hmm...

Snit

unread,
Jul 15, 2011, 12:24:48 PM7/15/11
to
cc stated in post
e8057b1e-28b5-46e2...@y16g2000yqk.googlegroups.com on 7/15/11
8:30 AM:

> On Jul 15, 11:20 am, Snit <use...@gallopinginsanity.com> wrote:
>> cc stated in post
>> 37d672b0-5f65-40bb-a45c-e009ffa92...@j15g2000yqf.googlegroups.com on 7/15/11
>> 4:14 AM:
>>
>>>>> Huh? Why would I talk about you to other people. My goodness you think the
>>>>> world revolves around you.
>>
>>>> Well if I was being lied to, I would like to know.
>>
>>> I went back and made another post clarifying things. I quoted 7's
>>> entire post to them. If they change their answer based on that, I'd be
>>> shocked, since it's exactly what I said before. But I'll keep you
>>> posted.
>>
>> Good to see you felt guilty enough to admit to your lies.  I bet they don't
>> really care about you lying about people they do not even know... but we
>> shall see.
>
> I didn't lie.

You admitted you lied and even claimed you had made someone - though you did
not say who - "very upset" with your lying.

You have also been lying about me - just making up bizarre "quotes" I never
said... just as you did with Carroll when you lost a debate about HCI stuff.

You cannot help but lie. You even bring your lies to other forums. It is
weird.

And now you are running around snipping in a Carrollesque fashion and just
making things up - these are the things Carroll did as he headed for his
mental breakdown. And you are showing all the signs of the same. Over
what? Here are the claims of mine that have freaked you out:

* There are, of course, well known principles / guidelines in
UI design - the one we focused on the most was the use of
consistency.

* I believe police abuse is "too common"... and abuse by other
government authorities.

Now in the discussion of these things there have been side issues, such as
when you insisted a paper in a peer reviewed journal was nothing more than a
"student worksheet"... but for the most part, the above covers our
disagreements. Well, in the case of the second one you will not even say
what you disagree with - you just say I am "paranoid". Then you claim to
agree with my conclusions of having lots of oversight and accountability.

And you make up stories about my life, make up false "quotes" and attribute
them to me (as you did with Carroll), etc.

In short: I have some completely non-offensive views and my expression of
them has led to you completely freaking out.

> I made a joke, and said exactly what 7 said. You
> obviously have never written a line of code in your life, otherwise
> you would see that. Notice how 7 calls me a troll, Chris A has me
> killfiled from time to time, etc. etc., people consider me a
> "wintroll", but no one but you has claimed I lied about what 7 said!
> Hadron explicitly agreed with what I wrote, and the two Chris's made
> comments on it. No one said I was lying about anything, except of
> course you. Perhaps 7 doesn't understand what he wrote and you do.
> Perhaps the others don't understand C or what 7 wrote, but you do.
> This is a mindboggling situation where the people who are usually so
> quick to jump all over any mistake I make had absolutely no problem
> with how I characterized 7's statements. And yet you do somehow. So
> either all the rest of us, who have written code and know about C and
> one of which actually wrote the comments in question, are incorrect,
> or you are. Hmm...
>
> --
> "There are no general principles in HCI." - John M. Carroll

See: you made up that quote - you lied. Just as you lied about your
"quotes" you claim I said.

--
[INSERT .SIG HERE]


Snit

unread,
Jul 15, 2011, 10:51:02 PM7/15/11
to
cc stated in post
e8057b1e-28b5-46e2...@y16g2000yqk.googlegroups.com on 7/15/11
8:30 AM:

> There are no general principles in HCI.

Sure there are - as has been discussed and beaten to death.


--
[INSERT .SIG HERE]


Kelsey Bjarnason

unread,
Jul 17, 2011, 6:51:27 PM7/17/11
to
[snips]

On Mon, 11 Jul 2011 06:32:20 -0400, Chris Ahlstrom wrote:

> And damn, there's still a minor issue that even the compiler doesn't
> spot.
>
> The return type of printf() is not 'void', it is 'int', and we're not
> using it, so, conventionally, we should flag it with a cast:
>
> (void) printf("SMALL=%i", SMALL);

The only reason to cast printf to void like that would be to cause
certain overly-fanatical lint checkers to shut up... but code should be
written for clarity and correctness, not to keep a lint-checker happy.

Kelsey Bjarnason

unread,
Jul 17, 2011, 6:54:13 PM7/17/11
to
[snips]

On Mon, 11 Jul 2011 21:31:39 +0100, Homer wrote:

> What small change needs to be made in the following to produce the
> intended output, and what is that output?:
>
> #include <stdio.h>
>
> int hmm[ ] = {0x6c64210a, 0x20576f72, 0x656c6c6f, 0x00000048 };
>
> void erm(char* umm) {
> char* c = umm;
> if (umm > (char *)&hmm[3])
> erm(umm+1);
> printf("%c", *c);
> }
>
> int main(int argc, char **argv) {
> erm((char*)hmm);
> return 0;
> }

The "small change" is a complete rewrite, to avoid the unwarranted
assumption that any particular character will have any particular numeric
value.
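
For readers who want to see roughly what such a rewrite could look like, here
is a minimal sketch (the string literal and the reuse of the erm() name are
illustrative choices, not Homer's code): it keeps the recursive shape but lets
the compiler encode the characters, so no assumption about character values or
byte order remains.

#include <stdio.h>

/* The string literal does the encoding work; nothing here depends on
 * ASCII values or on a particular endianness.                        */
static const char msg[] = "Hello World!\n";

/* Recursive helper kept only to mirror the shape of the original. */
static void erm(const char *p)
{
    if (*p != '\0') {
        putchar(*p);     /* print this character...     */
        erm(p + 1);      /* ...then recurse on the rest */
    }
}

int main(void)
{
    erm(msg);
    return 0;
}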

Kelsey Bjarnason

unread,
Jul 17, 2011, 6:39:22 PM7/17/11
to
On Sun, 10 Jul 2011 23:36:53 +0100, 7 wrote:

> Spot the mistake
> ----------------
>
> Spot the deliberate mistake:
>
>
>
> #define SMALL 1;
>
> main()
> {
> int i = SMALL;
> printf("i=%i", SMALL);
> }


"The" mistake?

main is defined as "int main(void)", "int main(int,char **)" or an
acceptable aliasing of the latter, not as "main()". At least, not since
about 1989.

There is no return value from main; in C99 this is acceptable, but in C90
it isn't.

Calling a variadic function without a prototype in scope is either
implementation-defined or undefined behaviour, don't recall which,
offhand.

The extra semicolon on the define line produces a syntax error in the
printf line.

And let us not overlook the issue which is _at least_ poor style:
neglecting to ensure the output is actually displayed, by ending the
printf with a \n, or calling fflush(stdout).

So which of these is "the mistake"?
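
Putting those points together, one cleaned-up reading of the snippet might look
like the following sketch (one way to address the list above, not the only
way):

#include <stdio.h>   /* prototype for printf(), so the variadic call is safe */

#define SMALL 1      /* no trailing semicolon */

int main(void)       /* explicit return type and parameter list */
{
    int i = SMALL;
    printf("i=%i\n", i);   /* use i, and end with \n so the output appears */
    return 0;              /* explicit return value (required before C99)  */
}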

Chris Ahlstrom

unread,
Jul 17, 2011, 9:16:04 PM7/17/11
to
Kelsey Bjarnason wrote this copyrighted missive and expects royalties:

I like to let people know that I'm ignoring a return value.

--
QFM:
Quelle fashion mistake. "It was really QFM. I mean painter
pants? That's 1979 beyond belief."
-- Douglas Coupland, "Generation X: Tales for an Accelerated
Culture"

Kelsey Bjarnason

unread,
Jul 17, 2011, 11:48:24 PM7/17/11
to
On Sun, 17 Jul 2011 21:16:04 -0400, Chris Ahlstrom wrote:

> Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>
>> [snips]
>>
>> On Mon, 11 Jul 2011 06:32:20 -0400, Chris Ahlstrom wrote:
>>
>>> And damn, there's still a minor issue that even the compiler doesn't
>>> spot.
>>>
>>> The return type of printf() is not 'void', it is 'int', and we're not
>>> using it, so, conventionally, we should flag it with a cast:
>>>
>>> (void) printf("SMALL=%i", SMALL);
>>
>> The only reason to cast printf to void like that would be to cause
>> certain overly-fanatical lint checkers to shut up... but code should be
>> written for clarity and correctness, not to keep a lint-checker happy.
>
> I like to let people know that I'm ignoring a return value.

If you're not assigning it or evaluating it, it's pretty obvious you're
ignoring it. :)

Personally, that's the sort of thing I'd prefer comments on, even if it's
just one comment at the head of the module: "Code does not check for
return values from certain functions, such as printf; this is by design".

Putting in casts where they really shouldn't be always makes me nervous
that they may also exist in riskier areas, such as, oh, casting the
return of *alloc and suchlike.
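
As a small illustration of the two styles being contrasted here (a sketch only,
not code from the thread):

#include <stdio.h>

/* NOTE: this module intentionally ignores the return value of printf(). */

int main(void)
{
    /* Style A: an explicit cast documents the ignored return value inline. */
    (void) printf("hello\n");

    /* Style B: a plain call; the intent is documented once, in the comment
     * at the top of the module.                                            */
    printf("world\n");

    return 0;
}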

Chris Ahlstrom

unread,
Jul 18, 2011, 6:36:20 AM7/18/11
to
Kelsey Bjarnason wrote this copyrighted missive and expects royalties:

> On Sun, 17 Jul 2011 21:16:04 -0400, Chris Ahlstrom wrote:
>
>>> The only reason to cast printf to void like that would be to cause
>>> certain overly-fanatical lint checkers to shut up... but code should be
>>> written for clarity and correctness, not to keep a lint-checker happy.
>>
>> I like to let people know that I'm ignoring a return value.
>
> If you're not assigning it or evaluating it, it's pretty obvious you're
> ignoring it. :)
>
> Personally, that's the sort of thing I'd prefer comments on, even if it's
> just one comment at the head of the module: "Code does not check for
> return values from certain functions, such as printf; this is by design".

Cool.

> Putting in casts where they really shouldn't be always makes me nervous
> that they may also exist in riskier areas, such as, oh, casting the
> return of *alloc and suchlike.

Now just how are you going to use the return value of malloc() without
casting?

<Cue the "Hadron" fsckwit to bring up, for the 30th time, his lies and
misunderstandings about casting.>

--
Watch it Tim. Ahlstrom has been showing off explaining how he and Peter
study C articles and "craft" C code of the highest standard. Which is
amazing for someone who didn't understand how throwing casts at compiler
warnings is not generally a good idea since the warnings are explaining
how implicit casting is not working and there is a potential data clash.
-- "Hadron" <h4aae9$4o1$3...@hadron.eternal-september.org>

Chris Ahlstrom

unread,
Jul 18, 2011, 6:42:36 AM7/18/11
to
Kelsey Bjarnason wrote this copyrighted missive and expects royalties:

> On Sun, 17 Jul 2011 21:16:04 -0400, Chris Ahlstrom wrote:
>
>>> The only reason to cast printf to void like that would be to cause
>>> certain overly-fanatical lint checkers to shut up... but code should be
>>> written for clarity and correctness, not to keep a lint-checker happy.
>>
>> I like to let people know that I'm ignoring a return value.
>
> If you're not assigning it or evaluating it, it's pretty obvious you're
> ignoring it. :)
>
> Personally, that's the sort of thing I'd prefer comments on, even if it's
> just one comment at the head of the module: "Code does not check for
> return values from certain functions, such as printf; this is by design".

Cool.

> Putting in casts where they really shouldn't be always makes me nervous
> that they may also exist in riskier areas, such as, oh, casting the
> return of *alloc and suchlike.

Now just how are you going to use the return value of malloc() without
casting? (Aside for bare copies/comparisons using memcpy/memcmp)

cc

unread,
Jul 18, 2011, 7:55:31 AM7/18/11
to
On Jul 18, 6:42 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
> Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>
> > Putting in casts where they really shouldn't be always makes me nervous
> > that they may also exist in riskier areas, such as, oh, casting the
> > return of *alloc and suchlike.
>
> Now just how are you going to use the return value of malloc() without
> casting?  (Aside for bare copies/comparisons using memcpy/memcmp)
>


It's cast implicitly in C. In C++ you would need the cast (but you
should probably be using new anyway). In C, if you don't include
stdlib.h and you cast the return of malloc, then you're in big
trouble. The compiler can't find the prototype, and the compiler
thinks malloc returns an int, and also doesn't generate a warning
since you're casting. So if sizeof(char*) is different from
sizeof(int), you'll have some fucked up code that is hard to debug.

--
"Good Lord I'm fat." - Snit

cc

unread,
Jul 18, 2011, 8:18:20 AM7/18/11
to
On Jul 18, 7:55 am, cc <scatnu...@hotmail.com> wrote:
> On Jul 18, 6:42 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>
> > Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>
> > > Putting in casts where they really shouldn't be always makes me nervous
> > > that they may also exist in riskier areas, such as, oh, casting the
> > > return of *alloc and suchlike.
>
> > Now just how are you going to use the return value of malloc() without
> > casting?  (Aside for bare copies/comparisons using memcpy/memcmp)
>
> It's cast implicitly in C.

Just for clarification, "it's" is void* which is cast implicitly in C.
Just in case someone thought I meant the return of malloc is special
in some way. It's all void pointers that are cast implicitly.

--
"There are no general principles in HCI." - John M. Carroll

Chris Ahlstrom

unread,
Jul 18, 2011, 8:47:05 AM7/18/11
to
cc wrote this copyrighted missive and expects royalties:

> On Jul 18, 6:42 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>> Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>>
>> > Putting in casts where they really shouldn't be always makes me nervous
>> > that they may also exist in riskier areas, such as, oh, casting the
>> > return of *alloc and suchlike.
>>
>> Now just how are you going to use the return value of malloc() without

>> casting?  (Aside for bare copies/comparisons using memcpy/memcmp)


>
> It's cast implicitly in C.

Well I'll be damned, you're right, at least for gcc and the -Wall, -Wextra,
and -pedantic options. To me, that sucks. I want those options to be very
very picky.

Nonetheless, I would make the cast explicit.

> In C++ you would need the cast (but you
> should probably be using new anyway). In C, if you don't include
> stdlib.h and you cast the return of malloc, then you're in big
> trouble. The compiler can't find the prototype, and the compiler
> thinks malloc returns an int, and also doesn't generate a warning
> since you're casting. So if sizeof(char*) is different from
> sizeof(int), you'll have some fucked up code that is hard to debug.

Thank you for the lesson.

--
Love is a grave mental disease.
-- Plato

cc

unread,
Jul 18, 2011, 9:03:36 AM7/18/11
to

No problem. Kelsey can probably give you more info, and I think some
of my terms may be off (I don't think it's a "cast" I think it's a
type coercion or something like that). Anyway, Kelsey is nothing if
not pedantic in his C knowledge and can (and will) correct anywhere I
was mistaken. But the gist of the idea is there. If you do a lot of C++
programming (like I do now) then it's a common "mistake" to make when
switching back to C. In reality I bet it's rarely a problem, but don't
try using that excuse in clc!

--
"I don't know the difference between opinion and fact." - Snit

Hadron

unread,
Jul 18, 2011, 9:33:04 AM7/18/11
to
Chris Ahlstrom <ahls...@xzoozy.com> writes:

> Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>
>> On Sun, 17 Jul 2011 21:16:04 -0400, Chris Ahlstrom wrote:
>>
>>>> The only reason to cast printf to void like that would be to cause
>>>> certain overly-fanatical lint checkers to shut up... but code should be
>>>> written for clarity and correctness, not to keep a lint-checker happy.
>>>
>>> I like to let people know that I'm ignoring a return value.
>>
>> If you're not assigning it or evaluating it, it's pretty obvious you're
>> ignoring it. :)
>>
>> Personally, that's the sort of thing I'd prefer comments on, even if it's
>> just one comment at the head of the module: "Code does not check for
>> return values from certain functions, such as printf; this is by design".
>
> Cool.
>
>> Putting in casts where they really shouldn't be always makes me nervous
>> that they may also exist in riskier areas, such as, oh, casting the
>> return of *alloc and suchlike.
>
> Now just how are you going to use the return value of malloc() without
> casting? (Aside for bare copies/comparisons using memcpy/memcmp)
>
> <Cue the "Hadron" fsckwit to bring up, for the 30th time, his lies and
> misunderstandings about casting.>

I understand casting, Ahlstrom. YOU do not. And I have explained numerous
times to you why YOUR comments on casting were those of a preening
dickhead who thinks he understands typing when he doesn't.

In C you do NOT cast the return value of malloc. ALL *GOOD* C
programmers know that.

But to stop you making a bigger dick of yourself once again:-

http://en.wikipedia.org/wiki/Malloc
http://en.wikipedia.org/wiki/Malloc#Disadvantages_to_casting

If people could bookmark this spanking I would be grateful.

But back to casting in general:-

Learn the difference between implicit and explicit casting and why
explicit casting frequently hides issues with down casting, you simpering
little weenie.
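
A tiny illustration of that general point (a hypothetical sketch, not Hadron's
code): an explicit cast tells the compiler to stop checking, so a conversion
that would otherwise draw a diagnostic passes silently.

#include <stdio.h>

int main(void)
{
    long big = 0;
    int *ip;

    /* ip = &big;   -- without a cast, a conforming compiler must issue a
     *                 diagnostic about incompatible pointer types          */
    ip = (int *) &big;   /* the explicit cast silences that diagnostic and
                            hides a conversion that deserves a second look  */

    printf("%p\n", (void *) ip);
    return 0;
}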


Snit

unread,
Jul 18, 2011, 9:33:33 AM7/18/11
to
cc stated in post
87046321-3a45-4dc1...@bl1g2000vbb.googlegroups.com on 7/18/11
5:18 AM:

> "There are no general principles in HCI." - John M. Carroll

You made this up. As you did with your "quotes" from me.

Please, cc, stop humiliating yourself. It is horrible to watch you crumble
like this.


--
[INSERT .SIG HERE]


Snit

unread,
Jul 18, 2011, 9:35:51 AM7/18/11
to
cc stated in post
5548792a-1834-48cf...@y16g2000yqk.googlegroups.com on 7/18/11
4:55 AM:

> Good Lord I'm fat.

Um, Ok. So go on a diet if it bothers you. Why post about it in COLA?


--
[INSERT .SIG HERE]


Snit

unread,
Jul 18, 2011, 9:36:22 AM7/18/11
to
cc stated in post
395bedf9-ad33-4cd8...@gh5g2000vbb.googlegroups.com on 7/18/11
6:03 AM:

> I don't know the difference between opinion and fact.

So?


--
[INSERT .SIG HERE]


cc

unread,
Jul 18, 2011, 9:42:18 AM7/18/11
to
On Jul 18, 9:33 am, Hadron<hadronqu...@gmail.com> wrote:

This is true, but many good C++ programmers think their knowledge is
directly transferable to C. And while this is often the case, in
situations like these a good C++ programmer can run in to trouble. I
don't know what the situation is here, but just pointing it out.

--
"My name is Snit, and I'm just real dumb." - Snit

Snit

unread,
Jul 18, 2011, 9:55:16 AM7/18/11
to
cc stated in post
3c94a077-3479-41fe...@s17g2000yqs.googlegroups.com on 7/18/11
6:42 AM:

> My name is Snit ...

Please stop fantasizing about being me. I mean, really, sure it is clear I
am a better person than you - but work on improving yourself and not just on
living in a fantasy life.


--
[INSERT .SIG HERE]


Chris Ahlstrom

unread,
Jul 18, 2011, 12:24:01 PM7/18/11
to
cc wrote this copyrighted missive and expects royalties:

> On Jul 18, 8:47 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>> cc wrote this copyrighted missive and expects royalties:
>>
>> > On Jul 18, 6:42 am, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>> >> Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>>
>> >> > Putting in casts where they really shouldn't be always makes me nervous
>> >> > that they may also exist in riskier areas, such as, oh, casting the
>> >> > return of *alloc and suchlike.
>>
>> >> Now just how are you going to use the return value of malloc() without
>> >> casting?  (Aside for bare copies/comparisons using memcpy/memcmp)
>>
>> > It's cast implicitly in C.
>>
>> Well I'll be damned, you're right, at least for gcc and the -Wall, -Wextra,

>> and -pedantic options. To me, that sucks. I want those options to be very


>> very picky.
>>
>> Nonetheless, I would make the cast explicit.
>>
>> > In C++ you would need the cast (but you
>> > should probably be using new anyway). In C, if you don't include
>> > stdlib.h and you cast the return of malloc, then you're in big
>> > trouble. The compiler can't find the prototype, and the compiler
>> > thinks malloc returns an int, and also doesn't generate a warning
>> > since you're casting. So if sizeof(char*) is different from
>> > sizeof(int), you'll have some fucked up code that is hard to debug.
>>
>> Thank you for the lesson.
>
> No problem. Kelsey can probably give you more info, and I think some
> of my terms may be off (I don't think it's a "cast" I think it's a
> type coercion or something like that). Anyway, Kelsey is nothing if
> not pedantic in his C knowledge and can (and will) correct anywhere I
> was mistaken. But the gist of the idea is there. If you do alot of C++
> programming (like I do now) then it's a common "mistake" to make when
> switching back to C. In reality I bet it's rarely a problem, but don't
> try using that excuse in clc!

I alternately write in C, then C++ a lot, and I guess some of my C++ habits
have gotten back-ported to my C code.

One time, for the hell of it, I wrote a set of "polymorphic" data structures
in C rather than C++, just to see what it was like.

It worked, but I won't ever try *that* again! :-D

--
A national debt, if it is not excessive, will be to us a national blessing.
-- Alexander Hamilton

Chris Ahlstrom

unread,
Jul 18, 2011, 12:37:34 PM7/18/11
to
cc wrote this copyrighted missive and expects royalties:

> On Jul 18, 9:33 am, Hadron<hadronqu...@gmail.com> wrote:
>> Chris Ahlstrom <ahlstr...@xzoozy.com> writes:
>> > Kelsey Bjarnason wrote this copyrighted missive and expects royalties:
>>
>> >> On Sun, 17 Jul 2011 21:16:04 -0400, Chris Ahlstrom wrote:
>>
>> >>>> The only reason to cast printf to void like that would be to cause
>> >>>> certain overly-fanatical lint checkers to shut up... but code should be
>> >>>> written for clarity and correctness, not to keep a lint-checker happy.
>>
>> >>> I like to let people know that I'm ignoring a return value.
>>
>> >> If you're not assigning it or evaluating it, it's pretty obvious you're
>> >> ignoring it. :)
>>
>> >> Personally, that's the sort of thing I'd prefer comments on, even if it's
>> >> just one comment at the head of the module: "Code does not check for
>> >> return values from certain functions, such as printf; this is by design".
>>
>> > Cool.
>>
>> >> Putting in casts where they really shouldn't be always makes me nervous
>> >> that they may also exist in riskier areas, such as, oh, casting the
>> >> return of *alloc and suchlike.
>>
>> > Now just how are you going to use the return value of malloc() without

>> > casting?  (Aside for bare copies/comparisons using memcpy/memcmp)


>>
>> > <Cue the "Hadron" fsckwit to bring up, for the 30th time, his lies and
>> > misunderstandings about casting.>

And right on cue...

>> I understand casting Ahlstrom. YOU do not. And I have explained numerous
>> times to you why YOUR comments on casting where those of a preening
>> dickhead who thinks he understands typing when he doesn't.
>>
>> In C you do NOT cast the return value of malloc. ALL *GOOD* C
>> programmers know that.

Sure, old chap, sure. Yeah, adding that cast changes the underlying
assembly code, sure it does. Idiot.

Do you think that Kernighan and Ritchie are *BAD* C programmers, "Hadron"?

Then take a gander at page 143 of the second edition of their book, "The C
Programming Language." At the top of the page is an implementation of the
strdup() function, and in it you will see this line of code:

char *p;
p = (char *) malloc(strlen(s)+1);

Page 167:

int *ip;
ip = (int *) calloc(n, sizeof(int));

Idiot.

> This is true, but many good C++ programmers think their knowledge is
> directly transferable to C. And while this is often the case, in
> situations like these a good C++ programmer can run in to trouble. I
> don't know what the situation is here, but just pointing it out.

cc, "Hadron" is full of shit.

He's the same fellow who claimed that it was an illegal C statement to
assign NULL to a pointer, and to dereference a null pointer.

He's also the same fellow who apparently never heard of the multitude of C++
casting statements, and why they can be useful.

Peter set "Hadron" straight, but I suspect it went right over its head.

So, as usual, our insane troll gloms onto some arbitrary dictum and
proclaims that it is THE dogma, instead of, like cc, being aware of important
nuances.

http://en.wikipedia.org/wiki/Malloc#Casting_and_type_safety

Jesus Christ, if someone is going to be a pompous blowhard about a subject,
the least they can do is make sure they are CORRECT.

--
So many men, so many opinions; every one his own way.
-- Publius Terentius Afer (Terence)

cc

unread,
Jul 18, 2011, 1:05:32 PM7/18/11
to

If I remember correctly K&R is before the ANSI C standard. ANSI C is
what made void* implicitly cast. But I'm not 100% sure on that.

--
"My name is Snit and I'm good friends with Homer and chrisv and none
of us understand the Paradox of Choice." - Snit

Chris Ahlstrom

unread,
Jul 18, 2011, 1:20:51 PM7/18/11
to
cc wrote this copyrighted missive and expects royalties:

> On Jul 18, 12:37 pm, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>>
>> Do you think that Kernighan and Ritchie are *BAD* C programmers, "Hadron"?
>>
>> Then take a gander at page 143 of the second edition of their book, "The C

>> Programming Language." ?At the top of the page is an implementation of the


>> strdup() function, and in it you will see this line of code:
>>

>>    char *p;
>>    p = (char *) malloc(strlen(s)+1);
>>
>> Page 167:
>>
>>    int *ip;
>>    ip = (int *) calloc(n, sizeof(int));


>
> If I remember correctly K&R is before the ANSI C standard. ANSI C is
> what made void* implicitly cast. But I'm not 100% sure on that.

The book I have is stamped "ANSI C". It's the second edition, not the first
edition.

In any case, there are nuances in the standard, and differences in
implementation, so that dogmatic statements are unwise. A certain amount of
"wive's tales" seems to get propagated on the web.

For the only significant downside of casting noted, where issues occur with
malloc() not properly declared, it turns out that my main compiler, gcc,
catches the issue with or without the casts.

--
I must Create a System, or be enslav'd by another Man's;
I will not Reason and Compare; my business is to Create.
-- William Blake, "Jerusalem"

chrisv

unread,
Jul 18, 2011, 1:33:01 PM7/18/11
to
Chris Ahlstrom wrote:

>> Hadron quacked:


>>>
>>> In C you do NOT cast the return value of malloc. ALL *GOOD* C
>>> programmers know that.
>
>Sure, old chap, sure. Yeah, adding that cast changes the underlying
>assembly code, sure it does. Idiot.
>
>Do you think that Kernighan and Ritchie are *BAD* C programmers, "Hadron"?

The clueless Quack Wintroll has some odd ideas of what makes a "GOOD"
programmer...

--
"How hard it is to find GOOD programmers who specialise in anything
other than Windows APIs?" - "True Linux advocate" Hadron Quark

Jacques Clouseau

unread,
Jul 18, 2011, 1:35:28 PM7/18/11
to

stupid turd "chrisv" <chr...@nospam.invalid> wrote in message
news:agr8271npa0i2pqpu...@4ax.com...

>
> The clueless Quack Wintroll has some odd ideas of what makes a "GOOD"
> programmer...

another fine "advocacy" post from the useless asshole.

"chrisv" is a liar. "chrisv" is a piece of shit.


cc

unread,
Jul 18, 2011, 1:36:44 PM7/18/11
to
On Jul 18, 1:20 pm, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
> cc wrote this copyrighted missive and expects royalties:
>
>
>
>
>
> > On Jul 18, 12:37 pm, Chris Ahlstrom <ahlstr...@xzoozy.com> wrote:
>
> >> Do you think that Kernighan and Ritchie are *BAD* C programmers, "Hadron"?
>
> >> Then take a gander at page 143 of the second edition of their book, "The C
> >> Programming Language." ?At the top of the page is an implementation of the
> >> strdup() function, and in it you will see this line of code:
>
> >>    char *p;
> >>    p = (char *) malloc(strlen(s)+1);
>
> >> Page 167:
>
> >>    int *ip;
> >>    ip = (int *) calloc(n, sizeof(int));
>
> > If I remember correctly K&R is before the ANSI C standard. ANSI C is
> > what made void* implicitly cast. But I'm not 100% sure on that.
>
> The book I have is stamped "ANSI C".  It's the second edition, not the first
> edition.

Ahh, I am incorrect it seems.


> In any case, there are nuances in the standard, and differences in
> implementation, so that dogmatic statements are unwise.  A certain amount of
> "wive's tales" seems to get propagated on the web.
>
> For the only significant downside of casting noted, where issues occur with
> malloc() not properly declared, it turns out that my main compiler, gcc,
> catches the issue with or without the casts.
>

--
"My name is Snit and I was born fat and stupid." - Snit

cc

unread,
Jul 18, 2011, 2:23:35 PM7/18/11
to

Just for some clarification:

http://plan9.bell-labs.com/cm/cs/cbook/2ediffs.html

"142(6.5, toward the end): The remark about casting the return value
of malloc ("the proper method is to declare ... then explicitly
coerce") needs to be rewritten. The example is correct and works, but
the advice is debatable in the context of the 1988-1989 ANSI/ISO
standards. It's not necessary (given that coercion of void * to
ALMOSTANYTYPE * is automatic), and possibly harmful if malloc, or a
proxy for it, fails to be declared as returning void *. The explicit
cast can cover up an unintended error. On the other hand, pre-ANSI,
the cast was necessary, and it is in C++ also."

So K&R acknowledge the debate over the cast on malloc. So although the
"example is correct and works", they acknowledge it is potentially bad
practice, while also acknowledging a few of the plus sides for the
cast. So...

--
"My name is Snit and I'm a liar." - Snit

Snit

unread,
Jul 18, 2011, 2:45:23 PM7/18/11
to
cc stated in post
f20956e6-ee70-4c96...@t7g2000vbv.googlegroups.com on 7/18/11
11:23 AM:

> I'm a liar.

Yes, you are.


--
[INSERT .SIG HERE]


Kelsey Bjarnason

unread,
Jul 18, 2011, 4:47:51 PM7/18/11
to
[snips]

On Mon, 18 Jul 2011 06:42:36 -0400, Chris Ahlstrom wrote:

>> Putting in casts where they really shouldn't be always makes me nervous
>> that they may also exist in riskier areas, such as, oh, casting the
>> return of *alloc and suchlike.
>
> Now just how are you going to use the return value of malloc() without
> casting? (Aside for bare copies/comparisons using memcpy/memcmp)

Okay, let's set a stage here. The initial code used printf rather than
cout and failed to include a required header; subsequent comments
involved *alloc rather than new. Between these we can be reasonably
certain we're discussing C, rather than C++.

In C, void pointers are compatible with other object pointers, meaning
there is no need to cast them. The following is perfectly valid C code:

#include <stdlib.h>

int main(void)
{
double *ptr;
ptr = malloc( 100 * sizeof(*ptr) );
if (ptr) free(ptr);
return 0;
}


Note that while ptr is a pointer-to-double and malloc returns pointer-to-
void, there is no need to cast the return value of malloc to double *.
There never has been in C. Not in K&R C, not in C89, not in C99.

Now consider what happens if you do cast:

#include <stdlib.h>

int main(void)
{
double *ptr;
ptr = (double *)malloc( 100 * sizeof(*ptr) );
if (ptr) free(ptr);
return 0;
}

So far no problem, but what if you make a boo-boo?


/* Note, no header */

int main(void)
{
double *ptr;
ptr = (double *)malloc( 100 * sizeof(*ptr) );
if (ptr) free(ptr);
return 0;
}

Depending upon version of C and implementation, that code may well break
horribly - and not give any warning at all at compile time.

It isn't required to warn because, while malloc, under the implicit-int
rule, returns an int - which *would be* incompatible with a pointer-to-
double, the fact you used a cast means "I know what I'm doing", so the
compiler is free to ignore the issue and carry on.

However, come runtime, you're trying to take the return from the library
function malloc and treat it as an int - which it may not fit in, or may
not have a compatible representation with, or even be returned by the same
mechanism - then stuff this value, which may already be mangled and
destroyed, into a pointer-to-double which the int may not be compatible
with.

If you're coding in C and you're casting the return of void * functions
such as malloc, stop. It is unnecessary, it hides potentially serious
bugs and it adds nothing to any aspect of the programming process, not
reliability, not readability, not maintainability, not correctness.

Chris Ahlstrom

unread,
Jul 18, 2011, 5:27:55 PM7/18/11
to
Kelsey Bjarnason wrote this copyrighted missive and expects royalties:

> [snips]

In the case of gcc, your premise (it hides potentially serious bugs)
is wrong:

m.c:

int main (void)
{
double * p = (double *) malloc(100*sizeof(double));
return p != 0 ? 0 : 1 ;
}

$ gcc m.c
m.c: In function 'main':
m.c:4: warning: incompatible implicit declaration of built-in function 'malloc'

And, since I am very strict about fixing all warnings the right way,
I'll never get bit by that issue.

And you're also wrong about readability. The rest you are correct about,
though.

And at least you're not simply echoing claims you found by Googling, like a
certain arrogant troll, and claiming you know more, based on it.

--
The test of a first-rate intelligence is the ability to hold two opposed
ideas in the mind at the same time and still retain the ability to function.
-- F. Scott Fitzgerald

Kelsey Bjarnason

unread,
Jul 19, 2011, 2:34:24 AM7/19/11
to
[snips]

On Mon, 18 Jul 2011 17:27:55 -0400, Chris Ahlstrom wrote:

>> Depending upon version of C and implementation

> In the case of gcc, your premise (it hides potentially serious bugs) is
> wrong:

See the statement above. gcc is but one of many implementations.

> m.c:
>
> int main (void)
> {
> double * p = (double *) malloc(100*sizeof(double)); return p != 0
> ? 0 : 1 ;
> }
>
> $ gcc m.c
> m.c: In function 'main':
> m.c:4: warning: incompatible implicit declaration of built-in function
> 'malloc'

Note this is a warning, a diagnostic. Some implementations are better
than others about what they warn on.

> And, since I am very strict about fixing all warnings the right way,
> I'll never get bit by that issue.

Assuming you never use an implementation which doesn't warn you on such
matters.


> And you're also wrong about readability. The rest you are correct
> about, though.

How does the inclusion of unnecessary, potentially dangerous casting
*improve* readability? If anything it means the maintainer now has to
scan the code much more intensively to look for potential "gotchas" which
might be hidden by other casts which shouldn't be there.

> And at least you're not simply echoing claims you found by Googling,
> like a certain arrogant troll, and claiming you know more, based on it.

I'm far from perfect, but I did code in C for many, many years, on
several different implementations on several different OSen. I was more
or less forced to learn the difference between what seems to work and
what actually works.

I've been known to be wrong, to be certain, but when it comes to coding
in C, I tend to be more often wrong on matters of opinion than on matters
of fact.

For example, I referred to a particular behaviour of C in regards to
"const" variables as "rather obscure". Others disagreed. Fine, I'm
willing to accept their view... but when someone tells me that seeing a
const pointer parameter does *not* imply the results won't be modified,
I'm going to have a very, very hard time ever trusting any code they
write.

I'll agree it may not be as obscure an issue as I would have thought, but
the implications of getting too chummy with rules such as that are not,
IMO, good implications when it comes to reliable code.

But, in any case, as to malloc and friends. In C, it is not necessary to
cast the return of malloc. Doing so tends to simply raise flags about
what other unnecessary, and potentially dangerous, constructs may be
lurking within.

Chris Ahlstrom

unread,
Jul 19, 2011, 6:30:32 AM7/19/11
to
Kelsey Bjarnason wrote this copyrighted missive and expects royalties:

> [snips]


>
> On Mon, 18 Jul 2011 17:27:55 -0400, Chris Ahlstrom wrote:
>
>> $ gcc m.c
>> m.c: In function 'main':
>> m.c:4: warning: incompatible implicit declaration of built-in function
>> 'malloc'
>
> Note this is a warning, a diagnostic. Some implementations are better
> than others about what they warn on.

Indeed.

None are perfect, which is why I like to pass the code through more than one
compiler, and turn on every warning I possibly can.

>> And, since I am very strict about fixing all warnings the right way,
>> I'll never get bit by that issue.
>
> Assuming you never use an implementation which doesn't warn you on such
> matters.

True. But this is 2011. I can't remember the last time I ever saw a
compiler miss a warning about implicit declaration.

>> And you're also wrong about readability. The rest you are correct
>> about, though.
>
> How does the inclusion of unnecessary, potentially dangerous casting
> *improve* readability? If anything it means the maintainer now has to
> scan the code much more intensively to look for potential "gotchas" which
> might be hidden by other casts which shouldn't be there.

The maintainer doesn't have to scan shit, Kels. He just has to pass it
through a decent compiler.

I'm telling the compiler I know what I am doing, and the reader that,
yes, there is a conversion going on.

And, actually, for the sake of a certain troll, I'd write it this way:

double * p = (double *) (double *) (double *) malloc(100*sizeof(double));

:-D

>> And at least you're not simply echoing claims you found by Googling,
>> like a certain arrogant troll, and claiming you know more, based on it.
>
> I'm far from perfect, but I did code in C for many, many years, on
> several different implementations on several different OSen. I was more
> or less forced to learn the difference between what seems to work and
> what actually works.
>
> I've been known to be wrong, to be certain, but when it comes to coding
> in C, I tend to be more often wrong on matters of opinion than on matters
> of fact.
>
> For example, I referred to a particular behaviour of C in regards to
> "const" variables as "rather obscure". Others disagreed. Fine, I'm
> willing to accept their view... but when someone tells me that seeing a
> const pointer parameter does *not* imply the results won't be modified,
> I'm going to have a very, very hard time ever trusting any code they
> write.

What's the difference between these two statements?

const char * p1 = "Hello";

const char * const p2 = "Hello";

> I'll agree it may not be as obscure an issue as I would have thought, but
> the implications of getting too chummy with rules such as that are not,
> IMO, good implications when it comes to reliable code.
>
> But, in any case, as to malloc and friends. In C, it is not necessary to
> cast the return of malloc. Doing so tends to simply raise flags about
> what other unnecessary, and potentially dangerous, constructs may be
> lurking within.

I agree, and apparently, like K&R (who also made sure to include stdlib.h),
I did not realize that, in the hands of less careful coders with less
rigorous compilers, explicitly informing the reader/compiler about
a conversion could be dangerous.

Anyway, I get more worried about coders who ignore warnings, saying,
"They're just warnings, not errors." And, when you ask them to get
rid of them, they either drop the warning level to /W3 or disable the
warning in the goddam project file (Microsoft's compiler).

Hopefully it all comes out in the unit-test rinse!

--
A rolling stone gathers no moss.
-- Publilius Syrus

Kelsey Bjarnason

unread,
Jul 19, 2011, 4:52:42 PM7/19/11
to
[snips]

Chris Ahlstrom wrote:

>> How does the inclusion of unnecessary, potentially dangerous casting
>> *improve* readability? If anything it means the maintainer now has to
>> scan the code much more intensively to look for potential "gotchas" which
>> might be hidden by other casts which shouldn't be there.
>
> The maintainer doesn't have to scan shit, Kels. He just has to pass it
> through a decent compiler.

This assumes he has access to what you consider a decent compiler. It also
assumes that said "decent compiler" doesn't do a whole host of brain-dead
things which would interfere.

Visual C++, for example, would spew pages of warnings if you included
certain of its *own* headers, and had it set to maximum warning levels. The
most common way to avoid endless repetitions of useless warnings was to drop
the warning level a little lower... which did stop it complaining about its
own headers, but *also* stopped it issuing warnings for the same problems in
other code.

Meanwhile, running it past a "decent compiler" will only generate warnings
on things the compiler _can_ warn on, or is required to - which does not
include, for example, instances of undefined behaviour.

Seeing unnecessary casts, which do not improve the code quality, but which
can mask potentially serious problems in the code is not a sign all is well,
it's a sign the code needs to be thoroughly reviewed, by a human, to ensure
that other errors aren't being hidden or masked, errors which a compiler
cannot necessarily detect on its own.

> I'm telling the compiler I know what I am doing, and the reader that,
> yes, there is a conversion going on.

You're assigning a pointer-to-void to a pointer-to-double; it's pretty
freaking obvious there's a conversion going on.

Here's a related example, see if you can see why this would tend to make one
worry when running across it in someone's code:

char *ptr = malloc( 100 * sizeof(char) );

> What's the difference between these two statements?
>
> const char * p1 = "Hello";
>
> const char * const p2 = "Hello";

One's a pointer to const char, the other's a pointer to const char which
itself is const.
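
A brief sketch to make the two declarations concrete (hypothetical lines, just
for illustration):

#include <stdio.h>

int main(void)
{
    const char *p1 = "Hello";          /* pointer to const char       */
    const char * const p2 = "Hello";   /* const pointer to const char */

    p1 = "World";     /* OK: p1 itself may be repointed                    */
    /* p2 = "World";     would not compile: p2 itself is const             */
    /* p1[0] = 'J';      would not compile: the chars are const either way */

    printf("%s %s\n", p1, p2);         /* prints: World Hello */
    return 0;
}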

> Anyway, I get more worried about coders who ignore warnings, saying,
> "They're just warnings, not errors."

Yes, but coders who rely on an implementation to provide warnings, rather
than actually understanding what their code is doing - especially how it may
be *masking* those warnings - also worry me.

> And, when you ask them to get
> rid of them, they either drop the warning level to /W3 or disable the
> warning in the goddam project file (Microsoft's compiler).

At least with old versions of Visual C++, your options were to lower warning
levels, or get buried in useless warnings caused by their own headers.
Neither's a good option if you're relying on the compiler's warning output
to point out issues in your code, and even less so if you're masking
potential warnings.


Chris Ahlstrom

unread,
Jul 19, 2011, 5:40:01 PM7/19/11
to
Kelsey Bjarnason wrote this copyrighted missive and expects royalties:

> [snips]


>
> Chris Ahlstrom wrote:
>
>>> How does the inclusion of unnecessary, potentially dangerous casting
>>> *improve* readability? If anything it means the maintainer now has to
>>> scan the code much more intensively to look for potential "gotchas" which
>>> might be hidden by other casts which shouldn't be there.
>>
>> The maintainer doesn't have to scan shit, Kels. He just has to pass it
>> through a decent compiler.
>
> This assumes he has access to what you consider a decent compiler.

http://gcc.gnu.org/

> It also assumes that said "decent compiler" doesn't do a whole host of
> brain-dead things which would interfere.
>
> Visual C++, for example, would spew pages of warnings if you included
> certain of its *own* headers, and had it set to maximum warning levels. The
> most common way to avoid endless repetitions of useless warnings was to drop
> the warning level a litle lower... which did stop it complaining about its
> own headers, but *also* stopped it issuing warnings for the same problems in
> other code.

Indeed. That is just one of the reasons I consider Visual Studio to be an
inferior development environment for C/C++ projects.

(That it is as slow as hell is another.)

> Meanwhile, running it past a "decent compiler" will only generate warnings
> on things the compiler _can_ warn on, or is required to - which does not
> include, for example, instances of undefined behaviour.
>
> Seeing unnecessary casts, which do not improve the code quality, but which
> can mask potentially serious problems in the code is not a sign all is well,
> it's a sign the code needs to be thoroughly reviewed, by a human, to ensure
> that other errors aren't being hidden or masked, errors which a compiler
> cannot necessarily detect on its own.

Sure.

However, the code should be thoroughly reviewed regardless.

And guess what, Kels? The humans will miss a lot of it. The only
way to be sure (and even then problems might be missed) is with a rigorous
body of unit tests that pass on a number of different platforms. (You know
this, of course.)

>> I'm telling the compiler I know what I am doing, and the reader that,
>> yes, there is a conversion going on.
>
> You're assigning a pointer-to-void to a pointer-to-double; it's pretty
> freaking obvious there's a conversion going on.
>
> Here's a related example, see if you can see why this would tend to make one
> worry when running across it in someone's code:
>
> char *ptr = malloc( 100 * sizeof(char) );

Why would it make someone worry?

However, tossing the cast in the will damn sure make the reader take note!

>> What's the difference between these two statements?
>>
>> const char * p1 = "Hello";
>>
>> const char * const p2 = "Hello";
>
> One's a pointer to const char, the other's a pointer to const char which
> itself is const.
>
>> Anyway, I get more worried about coders who ignore warnings, saying,
>> "They're just warnings, not errors."
>
> Yes, but coders who rely on an implementation to provide warnings, rather
> than actually understanding what their code is doing - especially how it may
> be *masking* those warnings - also worry me.

Unfortunately, Kelsey, the days when the compiler masks the warnings are
long over.

>> And, when you ask them to get rid of them, they either drop the warning
>> level to /W3 or disable the warning in the goddam project file
>> (Microsoft's compiler).
>
> At least with old versions of Visual C++, your options were to lower warning
> levels, or get buried in useless warnings caused by their own headers.
> Neither's a good option if you're relying on the compiler's warning output
> to point out issues in your code, and even less so if you're masking
> potential warnings.

// #include <stdlib.h>
int main (void)
{
int p = malloc(100*sizeof(double));
int q = (int) malloc(100*sizeof(double));
return (p != 0) && (q != 0) ? 0 : 1 ;
}

$ gcc m.c
m.c: In function 'main':

m.c:5: warning: incompatible implicit declaration of built-in function 'malloc'
m.c:5: warning: initialization makes integer from pointer without a cast
m.c:6: warning: cast from pointer to integer of different size

#include <stdlib.h>
int main (void)
{
int p = malloc(100*sizeof(double));
int q = (int) malloc(100*sizeof(double));
return (p != 0) && (q != 0) ? 0 : 1 ;
}

$ gcc m.c
m.c: In function 'main':

m.c:5: warning: initialization makes integer from pointer without a cast
m.c:6: warning: cast from pointer to integer of different size

// #include <stdlib.h>
int main (void)
{
int * p = (int *) malloc(100*sizeof(double));
int * q = (int *) malloc(100*sizeof(double));
return (p != 0) && (q != 0) ? 0 : 1 ;
}

$ gcc m.c
m.c: In function 'main':

m.c:5: warning: incompatible implicit declaration of built-in function 'malloc'

Obviously, though, one would be foolish to rely on the compiler.
You need unit tests.

--
Earn cash in your spare time -- blackmail your friends.

Kelsey Bjarnason

unread,
Jul 19, 2011, 10:36:14 PM7/19/11
to
[snips]

Chris Ahlstrom wrote:

> However, the code should be thoroughly reviewed regardless.
>
> And guess what, Kels? The humans will miss a lot of it.

Yes, they will... so why go out of one's way to add things which do not
actually improve the code, but make it *less* likely a potential problem
will be spotted?

>> Here's a related example, see if you can see why this would tend to make
>> one worry when running across it in someone's code:
>>
>> char *ptr = malloc( 100 * sizeof(char) );
>
> Why would it make someone worry?

Because the person writing that has an unclear grasp of how C works, as
demonstrated by their use of sizeof(char).
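
For what it's worth, the more defensive idiom usually recommended ties the
element size to the target pointer rather than to a named type (a sketch,
assuming <stdlib.h> is included so malloc has a prototype in scope):

#include <stdlib.h>

int main(void)
{
    char *ptr;

    /* sizeof(char) is 1 by definition, so spelling it out adds nothing;
     * tying the size to *ptr keeps the call right even if ptr's type
     * changes later.                                                    */
    ptr = malloc(100 * sizeof *ptr);

    free(ptr);   /* free(NULL) is safe, so no separate test is needed */
    return 0;
}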


> However, tossing the cast in the will damn sure make the reader take note!

How will including a pointless and potentially dangerous cast have any
effect whatsoever on the use of sizeof(char) there?

> Unfortunately, Kelsey, the days when the compiler masks the warnings are
> long over.

Really? How odd. And here I understood there were many platforms, many
OSen, where decent-quality compilers were still hard to come by. Maybe if
your world is limited to a few pet platforms, this might be true, but you're
asserting it's true everywhere, for all OSen, all implementations still in
use?

Meanwhile, it still doesn't explain why one would add a cast which does
nothing to improve the code, yet even potentially masks a warning condition.
Hell, it doesn't explain the use of pointless casting *at all*. Do you also
write statements such as:

int x = (int) 3;

or

double d = (double) 2.0; ?

If you're using pointless casting for malloc, why not everywhere?

> // #include <stdlib.h>
> int main (void)
> {
> int p = malloc(100*sizeof(double));
> int q = (int) malloc(100*sizeof(double));
> return (p != 0) && (q != 0) ? 0 : 1 ;
> }
>
> $ gcc m.c

And gcc, of course, defines what the C language requires an implementation
to do, and therefore all implementations work exactly the way gcc does,
right?

If not, then what was your point?


Gary Stewart

unread,
Jul 19, 2011, 11:18:57 PM7/19/11
to
On Tue, 19 Jul 2011 19:36:14 -0700, Kelsey Bjarnason wrote:

> [snips]
>
> Chris Ahlstrom wrote:
>
>> However, the code should be thoroughly reviewed regardless.
>>
>> And guess what, Kels? The humans will miss a lot of it.
>
> Yes, they will... so why go out of one's way to add things which do not
> actually improve the code, but make it *less* likely a potential problem
> will be spotted?

Kels?

Watch out Kelsey because it looks like Liarmutt (Chris Ahlstrom) is
once again looking for a new master.

Liarmutt doesn't like sharing Roy Schestowitz with Goblin.

You've been warned :)

--
7/19/2011 11:17:27 PM
Gary Stewart

Please visit our hall of Linux idiots.
http://linuxidiots.blogspot.com/

Watching Linux Fail:
http://limuxwatch.blogspot.com/

Come laugh at Linux "advocacy" with us!

http://www.youtube.com/social/blog/techrights-org

Linux's dismal desktop market share:

http://royal.pingdom.com/2011/05/12/the-top-20-strongholds-for-desktop-linux/

Desktop Linux: The Dream Is Dead
"By the time Microsoft released the Windows 7 beta
in January 2009, Linux had clearly lost its chance at desktop
glory."
http://www.pcworld.com/businesscenter/article/207999/desktop_linux_the_dream_is_dead.html

Desktop Linux on Life Support:

http://www.techradar.com/news/software/operating-systems/is-linux-on-the-desktop-dead--961508

When I use the term Linux I am speaking of desktop Linux unless
otherwise stated.
