Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!uflorida!haven!adm!smoke!gwyn
From: gwyn@smoke.BRL.MIL (Doug Gwyn)
Newsgroups: comp.std.c
Subject: Re: %g format in printf
Message-ID: <10961@smoke.BRL.MIL>
Date: 7 Sep 89 04:24:34 GMT
References: <1439@hiatus.dec.com> <19426@mimsy.UUCP>
Reply-To: gwyn@brl.arpa (Doug Gwyn)
Organization: Ballistic Research Lab (BRL), APG, MD.
Lines: 16

In article <19426@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
-In article <1439@hiatus.dec.com> daniels@grue.dec.com (Bradford R. Daniels)
-writes:
->Also, the definition of significant digits I learned in my high
->school science classes says that if I am asked to print out 1024
->with 1 significant digit, I should get 1000 printed out.  Is that
->correct?
-No: `1000' has four significant digits (as does `1.000'); you need `1e3'.
-I have seen arguments on both sides of this, but only believe the one
-that says `the number of digits written is the number of significant
-digits'.

While 1.000 indeed unambiguously has 4 significant digits, the number
of significant digits in 1000 is not evident from inspection; it could
be anywhere from 1 to 4.  When you ask printf() to format 1024 with 1
significant digit in a fairly wide field, it's supposed to produce
1000, not 1e3.
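
To make the disputed case concrete, here is a minimal test program (a
sketch of mine, not code from the thread; what the first printf()
prints is exactly the question under discussion):

	#include <stdio.h>

	int main(void)
	{
		/* 1 significant digit in a fairly wide field */
		printf("%10.1g\n", 1024.0);	/* 1000 or 1e+03? */

		/* the two styles %g chooses between, spelled out */
		printf("%10.0e\n", 1024.0);	/* e style, 1 significant
						   digit: 1e+03 */
		printf("%10.0f\n", 1024.0);	/* %f rounds fractional,
						   not significant,
						   digits: 1024 */
		return 0;
	}

Under the significant-digits reading above, the %g line ought to come
out as 1000; an implementation that switches %g to e style whenever
the decimal exponent reaches the precision will print 1e+03 instead.
Note also that no %f conversion rounds to significant digits, which is
why %g is the conversion at issue.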