What Western Medicine Really Is


Definitions of Western Medicine, from:

  • National Cancer Institute (of NIH): a system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery.
  • Macmillan Dictionary: the type of medical treatment that is the most popular in North America and Western European countries, based on the use of drugs and surgery to treat symptoms (i.e., signs of illness).

Western medicine is largely the practice of diagnosing symptoms and then following pharmaceutical, radiological, or surgical protocols to offset those symptoms or prevent their spread. In short, it is a system for managing and treating symptoms with drugs, radiation, or surgery, while the root causes of disease remain unidentified and are neither treated nor cured.