There is a book just out - the title is either What You Were Never Taught in American History or What You Never Learned in American History. I caught a brief blurb about it on TV and can't recall the author's name, if I even heard it. It sounded interesting, what little I heard. Anyone read it?