Informed Consent
Informed consent is a legal term and ethical principle in bioethics that was introduced after World War II. It replaced earlier medical attitudes founded on implicit trust in a doctor's decisions and put patients in charge of their own medical decisions.