"The first step in developing such a test would require that a set of basic, standardized criteria be developed; if measures for emotionally or consciously motivated behavioral responses are standardized, then judging behavior against those criteria becomes possible. In other words, if a pig chews needlessly on cage bars and her bodily responses (cortisol levels and skin temperature, as well as other indicators of stress in the mammalian central nervous system) correspond, then that pig can be said to be experiencing anxiety.
Tufts University professor and eminent theorizer of consciousness Daniel Dennett has proposed something similar for use with humans, a practice he refers to as "heterophenomenology." This practice, he argues, is "the sound way to take the first-person account as seriously as it can be taken." In a nutshell, Dennett says that if the researcher listens to a subject's first-person account of a situation and then observes the environmental factors, an objective conclusion can be reached about the inner workings of the subject's conscious thought processes.
He writes in an article on the subject: "a more constructive approach recognizes the neutrality of heterophenomenology and accepts the challenge of demonstrating, empirically, in its terms, that there are marvels of consciousness that cannot be captured by conservative theories."[11] The conservative theories to which Dennett refers hold back the study of consciousness in general, in humans and nonhumans alike. And though his proposal is meant for human consciousness, the principles could easily be applied to nonhuman animal emotion (Dennett, who is also a cognitive scientist, sometimes turns to the discussion of nonhuman animal consciousness).
It's a bold proposal, one that many will emphatically argue requires too great a leap of faith. The prospect is arguably rife with anthropomorphism, the supposed bane of all nonhuman animal consciousness studies. But approaches based on sweeping subjective generalizations are common throughout many areas of human psychology. Consider a hypothetical (and oversimplified) therapy session between Jane Doe and Dr. M. A set of criteria for emotion (in disorder form) is already accepted in the psychiatric field—the DSM—and judging by these previously laid out, generalized criteria, Dr. M can determine from Jane Doe's verbal explanation of her experience whether or not his patient is experiencing anxiety. Dr. M makes a leap to believe his patient, and prescribes a medication that will alleviate her suffering. This trust in Jane and how well she knows her own emotion is subjective, but is nevertheless generally accepted. Take the language out of the equation—Jane's ability to tell Dr. M about her anxiety—and all we're left to go on is inference.
Is it so extreme an idea to suggest that nonhuman animals experience some form of emotion? It's ethically easier to assume, as did René Descartes, that nonhumans are simply mechanized automatons, more like Turing's machines than like us. I know my visit to Farm Sanctuary would have been much easier if I believed none of the residents had suffered any pain or anxiety." - More Here (hope we stop rationalizing at least by the end of this century)