The program was the product of extremely limited coding, utilizing keyword triggers, minimal context, simple transformation rules, and a procedure for responding "intelligently" when no keyword is detected. The psychotherapist interaction was chosen because it assumed little contextual knowledge on the part of the computer. By restricting turns, subject matter, and responses, it was possible to program human-machine communication.[5] In an era in which machine translation (MT) was stumbling, ELIZA demonstrated that conversation (or rather what Margaret Masterman might call "toy conversation") might be more easily modeled than language overall. Partly this is because ELIZA showed that a few simple rules could be deployed to produce the "illusion of understanding" for the user and the impression of an interactive conversation. This realization has spawned an entire field of AI dedicated to the development of chatbots, conversational agents, and human-computer interaction (HCI).

ELIZA, however, also prompted anxiety. From its earliest days, users wanted to have time alone to interact with it. Notably, ELIZA's creator expressed horror at what he saw as early users' too-easy rejection of human sociality in favor of intimacy with a computer program. Sherry Turkle has noted our propensity to ascribe human intention to programs and has coined the term "ELIZA effect" to describe the tendency of people to treat responsive programs as more intelligent than they are. For subsequent scholars, this finding has offered fertile ground for reflecting on our relations with technologies. For Weizenbaum, however, this ready substitution of the human with the computer was far more serious: it was narcissistic and ethically dubious at best and, at worst, recalled the rationalism that led to the Holocaust and the Vietnam War.[6]
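The mechanism described above (keyword triggers, simple transformation rules, and a stock reply when no keyword fires) can be sketched in a few lines of Python. This is a minimal illustration, not Weizenbaum's original script: the patterns and responses below are invented for the example.

```python
import random
import re

# Illustrative ELIZA-style rules: each keyword pattern maps to response
# templates that transform the matched fragment back into a question.
RULES = [
    (re.compile(r"\bI am (.*)", re.IGNORECASE),
     ["Why do you say you are {0}?", "How long have you been {0}?"]),
    (re.compile(r"\bI feel (.*)", re.IGNORECASE),
     ["Why do you feel {0}?", "Do you often feel {0}?"]),
    (re.compile(r"\bmother\b", re.IGNORECASE),
     ["Tell me more about your family."]),
]

# Fallback replies: the procedure for responding "intelligently"
# when no keyword is detected.
DEFAULTS = ["Please go on.", "Tell me more.", "I see."]

def respond(utterance: str) -> str:
    """Scan the input for a keyword pattern; transform the match
    into a reply, or fall back to a content-free prompt."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULTS)
```

The psychotherapist framing does the real work here: because the "therapist" may always answer a statement with a question or an invitation to continue, the fallback replies never expose the program's lack of understanding.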