Definitions of Western Medicine, from:
- National Cancer Institute (of NIH): a system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery.
- Macmillan Dictionary: the type of medical treatment that is the most popular in North America and Western European countries, based on the use of drugs and surgery to treat symptoms (=signs of illness).
Western medicine is largely the practice of diagnosing symptoms and then following pharmaceutical, radiological, or surgical protocols to relieve those symptoms or prevent their spread. In short, it is a system for managing and treating symptoms with drugs, radiation, or surgery. The root causes of disease, however, often remain unknown and go untreated and uncured. Continue reading “What Western Medicine Really Is”
In spring, residual cold air and incoming hot air cause whiplash weather, with frequent swings between cold and warm. Without proper protection, the body is easily susceptible to getting sick from these sudden changes. Continue reading “Springtime: Dressing Appropriately is Key to a Healthy and Happy Year”
Many people think that drinking water first thing in the morning cleanses the digestive tract and is good for the body. Under the wrong circumstances, however, drinking water in the morning may actually harm the body.
Continue reading “Could drinking water in the morning be bad for you?”
Chinese media have covered many cellphone-related deaths and injuries. Continue reading “Deaths and Burns from Cellphones”
Hi there! I’m a health nut, health consultant, energy master, and healer. I collect health and wellness information from historic and modern China and from all over the world. This is my blog for sharing that information, so we can practice self-care at home and take charge of our own health. I live in the greater Boston area in Massachusetts, have two wonderful sons, and enjoy cooking dishes from many cuisines and practicing qigong, Reiki, and energy healing.