Tuesday, 3 May 2011

Reasons Why Doctors Are More Important

Doctors work to improve and preserve life, whilst teachers give us the skills to function in life and, hopefully, educate and improve future generations. Without teachers, a doctor would never have learned the skills necessary to become a doctor. So yes, a doctor can help you when you are sick or injured, but their work is in the "here and now". Teachers are educating the minds of the future, and to me it is more important that the human race moves forward.