This may or may not be the correct place to ask this question. I have a
program written in C on my Digital UNIX 3.2C machine. It crashes with an
error while reading in a number, and I have narrowed it down to a message
saying that the number is a negative denormalized floating-point value.
Can someone tell me what "denormalized" means, how to check for it,
and/or how to normalize such a value? I know this is more of a
programming question, but the problem does not occur when I run the same
code on a PC or an SGI.
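
For what it's worth, something along these lines is the kind of check I
am after, if that helps clarify the question. This is only a rough sketch:
fpclassify() is a C99 facility and the 1e-310 test value is just for
illustration; I don't know what is actually available on Digital UNIX 3.2C,
so the fallback compares against DBL_MIN instead.

/* Minimal sketch: test whether a double is denormalized (subnormal).
 * Assumes either a C99-style fpclassify() or, failing that, a
 * comparison against DBL_MIN from <float.h>. */
#include <stdio.h>
#include <math.h>
#include <float.h>

static int is_denormal(double x)
{
#ifdef FP_SUBNORMAL
    return fpclassify(x) == FP_SUBNORMAL;
#else
    /* Fallback: nonzero but smaller in magnitude than the smallest
     * normalized double. */
    return x != 0.0 && fabs(x) < DBL_MIN;
#endif
}

int main(void)
{
    double v = -1e-310;   /* illustrative denormalized value */

    if (is_denormal(v))
        printf("%g is denormalized\n", v);
    else
        printf("%g is normal (or zero/inf/NaN)\n", v);
    return 0;
}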
Thanks
kmadison_at_logicon.com