Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!watmath!clyde!floyd!harpo!decvax!cca!ima!haddock!dan
From: dan@haddock.UUCP
Newsgroups: net.lang.c
Subject: Re: Re: Anyone on ANSI standard C - (nf)
Message-ID: <58@haddock.UUCP>
Date: Wed, 8-Feb-84 23:38:31 EST
Article-I.D.: haddock.58
Posted: Wed Feb  8 23:38:31 1984
Date-Received: Fri, 10-Feb-84 04:25:27 EST
Lines: 15

#R:rlgvax:-167200:haddock:12400003:000:853
haddock!dan    Feb  8 17:03:00 1984

I have a question about the effects of specifying parameter types in the
ANSI C standard.  What exactly will happen when the type of an actual
parameter differs from the type given in the declaration?  Seems to me
that while silently casting is usually the right thing, in some cases the
compiler should issue warnings because the odds are good the user is
making a mistake.  I think the compiler should cast silently if the
declared and actual types are both arithmetic, regardless of length or
signed/unsigned distinctions (leave warnings about lost accuracy to lint),
or if the declared type is any pointer type and the actual parameter is
the constant 0, but warn the user about everything else (e.g., using an
int or char* where a FILE* was expected).  In this way one avoids the
absurd implicit-conversion problems of PL/I.  Is this how it's done?
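
To make the cases concrete, here is a minimal sketch of what I mean; the
function names (scale, show) are made up for illustration and are not from
any standard or existing library:

	#include <stdio.h>

	/* Hypothetical declarations with parameter types given. */
	double scale(double x)  { return x * 2.0; }
	void   show(FILE *fp)   { fputs("hello\n", fp); }

	int main(void)
	{
		int n = 3;

		scale(n);      /* int -> double: both arithmetic, cast silently */
		show(stdout);  /* FILE* where FILE* expected: fine               */
		show(0);       /* constant 0 to a pointer parameter: null, fine  */
	/*	show(n);  */   /* int where FILE* expected: this should warn     */
		return 0;
	}

Whether a compiler treats the last call as a warning or an outright error
is exactly the sort of thing I'd like the standard to pin down.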