Tuesday, 3 May 2011

Reasons Why Doctors Are More Important

Doctors work to improve and preserve life, whilst teachers teach us the skills to function in life and, hopefully, educate and improve future generations. Without teachers, a doctor would never have learned the skills necessary to become a doctor. So yes, a doctor can help you when you are sick or injured, but their work is in the "here and now". Teachers are educating the minds of the future, and to me it is more important that the human race moves forward.