Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!purdue!ames!sun-barr!decwrl!shlump.nac.dec.com!hiatus.dec.com!grue.dec.com!daniels
From: daniels@grue.dec.com (Bradford R. Daniels)
Newsgroups: comp.std.c
Subject: %g format in printf
Message-ID: <1439@hiatus.dec.com>
Date: 5 Sep 89 23:19:47 GMT
Sender: news@hiatus.dec.com
Organization: Digital Equipment Corporation
Lines: 23

What should the default number of significant digits be for the
%g format specifier in printf?  The standard says that an explicit
precision of 0 should be treated as 1, but it doesn't say anything
about what to do if no precision is specified at all.

Right now, the VAX C RTL uses 6 as the default precision.  This
seems reasonable, since 6 is the default precision for the %e
and %f specifiers.  However, precision means something different
for %g (the maximum number of significant digits) than for those
other specifiers (the number of digits after the decimal point).
Is VAXCRTL's current behavior correct?
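
To make the question concrete, here is the sort of comparison I
have in mind; the comment on the %g line just assumes the current
VAXCRTL default of 6 significant digits, which is exactly the part
I'm unsure about:

    #include <stdio.h>

    int main(void)
    {
        double x = 1234.56789;

        printf("%e\n", x);  /* 1.234568e+03 - 6 digits after the point */
        printf("%f\n", x);  /* 1234.567890  - 6 digits after the point */
        printf("%g\n", x);  /* 1234.57 if the default really is 6
                               significant digits */
        return 0;
    }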

Also, the definition of significant digits I learned in my high
school science classes says that if I print out 1024 with 1
significant digit, I should get 1000.  Is that what %g should
produce?
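
Concretely, the case I'm asking about looks like this:

    #include <stdio.h>

    int main(void)
    {
        /* One significant digit: should this print 1000, as the
           high-school rule suggests, or 1e+03, since %g switches to
           style e when the exponent (here 3) is >= the precision (1)? */
        printf("%.1g\n", 1024.0);
        return 0;
    }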

Thanks,
- Brad

-----------------------------------------------------------------
Brad Daniels			|  Digital Equipment Corp. almost
DEC Software Devo		|  definitely wouldn't approve of
"VAX C RTL Whipping Boy"	|  anything I say here...