Tuesday, September 15, 2009

Augmenting Human Compassion


“Augmenting human intellect” is one of the enduring goals of any well-designed technology. In the course of our readings, three major influences have come up again and again regarding how we, as a society, might use technology to evolve our cognitive capabilities. For Douglas Engelbart (1962), technology may do so by leveraging already existing perceptual mappings or by raising a person’s mental abilities to a level of more complex thought through various methods, including “streamlined terminology” and “powerful concepts.” Licklider (1960) describes a similar concept he calls “man-computer symbiosis,” a system whereby humans and computers work in conjunction to “think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” William Ross Ashby (1956) also wrote of “amplifying intelligence” in his work on cybernetics. However, intelligence needs to be balanced by basic moral and ethical considerations. I would argue that the foundation of these considerations lies squarely in an entity’s capacity to feel compassion. Moreover, given a certain level of compassion (and intelligence), ethics, understood as a list of rules or code of conduct, becomes secondary. So, instead of concentrating on the aspect of human cognitive evolution defined as intellect, I would like to find methods for augmenting human compassion using digital media.
Compassion, as a component of a healthy mental state and as a necessity for large-scale social organization, is a sometimes marginalized concept. That may be because developing compassion in children is seen as the responsibility of parents and families, and certain concepts are adopted into mainstream consideration only slowly. As it stands, research into human-computer interaction focuses mainly on functionality and usability, and even more human-centered designs are often driven by business considerations such as turnaround and click-through analytics.
For example, while reading Myers’ A Brief History of Human Computer Interaction Technology, I found that the introduction clearly states that his history only covers the “computer side of HCI” and that “a companion article on the history of the ‘human side,’ discussing the contributions from psychology, design, human factors and ergonomics would also be appropriate.” This “human side” approach would form the basis of my research project for determining how one might augment human compassion.
Discovering what makes one more compassionate would be the first topic for research. Within the context of digital media and within the constraints of one semester, it seems daunting to narrow compassion down to a measurable construct, but I hope that by making an open call for ideas, some epiphany will come about.
From the historical perspective, we can see that in developing his conceptual framework for augmenting human intellect, Engelbart defines the objectives for his study and lays out his basic perspective. He promotes leaving room for intuition, or a human’s “feel for a situation.” For augmenting compassion, I would say that one must leave room for epiphany as well.
My first avenue for exploration could include researching whether any current internet memes act to augment compassion. From cute LOLcats with funny captions to YouTube videos like Christian the Lion, does sharing these with others help to raise our society’s overall level of compassion? And, conversely, does sharing morbid imagery damage compassion? One caveat concerns the level of interaction that might be necessary for long-term effects. Is merely seeing something enough to produce a persisting effect, or must one also be involved somehow to ensure a stable change in mentality?
Given these possibilities for investigation, another avenue of exploration that might engender long-term increases in compassion would involve integrating a participatory component through interactive art or music. If a lack of compassion stems from a lack of empathy with others or from a disconnect from humanity or nature, then a key component of developing compassion in others would be creating a palpable connection to others and, thereby, to humanity in general. With interactive art, the person becomes a component of the creation, a powerful metaphor that might prove helpful for compassion development. However, a connection beyond the computer might also be necessary for augmentation. An association from human to art piece to creator of the art piece to humanity would be ideal.
Following Engelbart’s format, the objective of a study taking in the previous suppositions and conjectures would include the following goals: (1) to find the factors that determine a given individual’s level of compassion; and (2) to develop methods that would act to augment human compassion using digital media. Engelbart’s specifications for his framework still fit for this research direction.
Step one would be to find a test for compassion, so that quantitative results can verify any changes over time, comparing measurements taken before exposure to a stimulus with those taken afterward. Step two would involve testing non-participatory stimuli, such as the YouTube videos, for changes in levels of compassion. Step three could then cover participatory situations of varying complexity.
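To make step one concrete, the pre/post comparison might be sketched as follows. This is only an illustrative sketch: the compassion-scale scores, the 1–7 scale itself, and the function name are invented for this example; a real study would use a validated instrument and proper inferential statistics.

```python
# Hypothetical sketch of a pre/post comparison for a compassion measure.
# All scores below are invented for illustration only.
from statistics import mean, stdev

def paired_change(pre, post):
    """Return the mean change and a simple paired effect size
    (mean of the differences divided by their standard deviation)."""
    diffs = [after - before for before, after in zip(pre, post)]
    return mean(diffs), mean(diffs) / stdev(diffs)

# Invented example: six participants scored on a 1-7 scale
# before and after viewing a stimulus (e.g., a video).
pre  = [3.2, 4.1, 2.8, 3.9, 4.5, 3.1]
post = [3.8, 4.3, 3.5, 4.0, 4.9, 3.6]
change, effect = paired_change(pre, post)
print(f"mean change: {change:.2f}, effect size: {effect:.2f}")
```

The same paired-measurement structure would carry over to steps two and three; only the stimulus (non-participatory versus participatory) would differ.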
As this blog post/response essay is written in response to our second week’s readings on the subject of History in Perspective, any further reading suggestions along these lines of augmenting compassion, augmenting empathy, or developing emotional intelligence would be greatly appreciated. Any studies on the effects of interactive art would also be of great interest to me. Usually I am not one for trying to pin down the exact meaning or relevance of a piece of art, but in the context of a compassionate evolution I would concede the necessity of some formal investigation into the matter.
I believe that the nexus of intelligence and compassion would negate the need for overly strict rules that may be based on a narrow or subjective morality. The ultimate goal for technological society must include room for this augmented compassion.
--Christine Rosakranse
