The role of a doctor is indeed to educate patients about how they can improve their health. However, I would disagree with the premise that it is solely the physician's responsibility to educate the public; instead, I believe this duty is shared by schools, homes, the media, governments, and even the workplace.