Organicism

Definition:

  • (n.) The doctrine of the localization of disease; that is, the doctrine that always refers disease to a material lesion of an organ.