That is exactly what I am saying. 1 - (0.999999...) is an infinitesimally small value, not quite equal to zero. When we talk about 0.99999999..., we are talking about the limit of x as x approaches 1, written lim x as x --> 1. This is understood to be infinitesimally close to 1, but not quite 1.



I don't think so. The limit is the limit. That is, the limit is taken as exactly equal to one - well, that's the way I learned it (but as Bean pointed out, we learn a lot of incorrect stuff).


lim x = 1 (identically)
x->1
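
You can make that point concrete with exact arithmetic. Here's a quick sketch (the function names are mine) using Python's fractions module: the gap between 1 and a truncation of 0.999... with n nines is exactly 1/10^n, which can be made smaller than any positive number - so the limit can't be anything other than exactly 1.

```python
from fractions import Fraction

def n_nines(n):
    # 0.999...9 with n nines, as an exact rational: 9/10 + 9/100 + ...
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 5, 10):
    gap = 1 - n_nines(n)
    # gap is exactly 1/10^n - it shrinks below any positive bound as n grows
    print(n, gap)
```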


For example, we don't say that integrals solved exactly are "approximately equal to blah blah."

The integral of cos(x) is sin(x) + C, not approximately sin(x) + C (and what is an integral, but a limit?).
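
And you can watch that limit happen numerically. A rough sketch (my own function names, left-endpoint rule for simplicity): the Riemann sums for cos on [0, pi/2] close in on sin(pi/2) - sin(0) = 1, exactly, as the partition gets finer.

```python
import math

def riemann_cos(a, b, n):
    # left-endpoint Riemann sum for cos(x) on [a, b] with n subintervals
    h = (b - a) / n
    return sum(math.cos(a + i * h) for i in range(n)) * h

exact = math.sin(math.pi / 2) - math.sin(0)  # fundamental theorem: exactly 1
for n in (10, 1000, 100000):
    print(n, riemann_cos(0, math.pi / 2, n))
```

The sums are only approximations for any finite n, but the integral itself - the limit - is exactly sin(x) + C, no "approximately" about it.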


OTOH, we don't say that differentials are identically equal to zero. It's a bit confusing. Back in HS, before actually taking calc, I read a book by a guy named Boyer called "The History of the Calculus and Its Conceptual Development." From what I recall (from more than 20 years ago), the concepts of differentials and limits that we take for granted today were a really big deal at one time - very controversial.

I'm a bit envious of Bean for actually having taken Real Analysis. I'm sure she's got a much better handle on the particulars because of it. (I think maybe Real Analysis is - ahem, approximately - to calculus as number theory is to arithmetic.)

k