Do Doctors Really Heal?

To answer the question “Do doctors really heal?” we must first define the term “health,” to understand what it is doctors are supposed to be restoring, and then define the word “doctor.” The concepts of “health” and “doctor” were understood very differently among the ancients in biblical times.

The Creation of Wealth

Why an article on “wealth” on a website about health? Because the way we think about wealth and finances has a tremendous impact on our health. Some of the wealthiest people on earth are also some of the sickest, probably due to the stress of managing wealth. Stress also arises when we are poor and facing an uncertain economic future. Fortunately, the Bible gives clear guidance on finances and wealth. This article summarizes the biblical view of wealth, which is essential to understand if you want to live a healthy life.