Critics of American colleges typically attribute the failings of undergraduate education to a tendency on the part of professors to neglect their teaching to concentrate on research. In fact, the evidence does not support this thesis, except perhaps in major research universities.
We're constantly told that all cultures are equal, and that every belief system is as good as the next, and that, generally, America was to be known for its flaws rather than its virtues.
I've done quite a lot of growing up in public, which has been tricky at times.