Human-centered computing

From Wikipedia, the free encyclopedia

Human-centered computing (HCC) studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines concerned both with understanding human beings and with the design of computational artifacts.[1] Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction focuses more on the ergonomics and usability of computing artifacts, and information science on practices surrounding the collection, manipulation, and use of information.

Human-centered computing researchers and practitioners usually come from one or more disciplines such as computer science, human factors, sociology, psychology, cognitive science, anthropology, communication studies, graphic design, and industrial design. Some researchers focus on understanding humans, both as individuals and in social groups, by studying the ways that human beings adopt and organize their lives around computational technologies. Others focus on designing and developing new computational artifacts.

Overview

Scope

HCC aims to bridge the gaps between the various disciplines involved in the design and implementation of computing systems that support human activities.[1] It also encompasses a set of methodologies that apply to any field in which people directly interact with devices or systems that use computer technologies.

HCC facilitates the design of effective computer systems that take personal, social, and cultural aspects into account, and it addresses issues such as information design, human-information interaction, human-computer interaction, human-human interaction, and the relationships between computing technology and artistic, social, and cultural issues.[1]

HCC topics

The National Science Foundation (NSF) describes HCC research as occupying "a three dimensional space comprising human, computer, and environment."[2] According to the NSF, the human dimension ranges from research that supports individual needs, through teams as goal-oriented groups, to society as an unstructured collection of connected people. The computer dimension ranges from fixed computing devices, through mobile devices, to computational systems of visual/audio devices embedded in the surrounding physical environment. The environment dimension ranges from discrete physical computational devices, through mixed reality systems, to immersive virtual environments.[2] Some examples of topics in the field are listed below.
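As a loose illustration (not part of the NSF description), a project's position along these three dimensions can be recorded as a simple data structure. The following Python sketch is hypothetical; the dimension labels are paraphrased from the ranges above.

    from dataclasses import dataclass

    # Hypothetical labels for each NSF dimension, ordered roughly from the
    # individual/discrete end of the range to the collective/immersive end.
    HUMAN = ("individual", "goal-oriented group", "society")
    COMPUTER = ("fixed device", "mobile device", "embedded audio/visual system")
    ENVIRONMENT = ("discrete physical device", "mixed reality", "immersive virtual environment")

    @dataclass
    class HCCProject:
        """Position of a project in the human/computer/environment space."""
        name: str
        human: str
        computer: str
        environment: str

    # Example: a wearable assistant used by a team of field scientists.
    project = HCCProject(
        name="field-science wearable assistant",
        human=HUMAN[1],
        computer=COMPUTER[1],
        environment=ENVIRONMENT[1],
    )
    print(project)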

List of topics in the HCC field

  • Problem-solving in distributed environments, ranging across Internet-based information systems, grids, sensor-based information networks, and mobile and wearable information appliances.
  • Multimedia and multi-modal interfaces in which combinations of speech, text, graphics, gesture, movement, touch, sound, etc. are used by people and machines to communicate with one another.
  • Intelligent interfaces and user modeling, information visualization, and adaptation of content to accommodate different display capabilities, modalities, bandwidth, and latency.
  • Multi-agent systems that control and coordinate actions and solve complex problems in distributed environments in a wide variety of domains, such as disaster response teams, e-commerce, education, and successful aging.
  • Models for effective computer-mediated human-human interaction under a variety of constraints (e.g., video conferencing, collaboration across high- vs. low-bandwidth networks).
  • Definition of semantic structures for multimedia information to support cross-modal input and output.
  • Specific solutions to address the special needs of particular communities.
  • Collaborative systems that enable knowledge-intensive and dynamic interactions for innovation and knowledge generation across organizational boundaries, national borders, and professional fields.
  • Novel methods to support and enhance social interaction, including innovative ideas like social orthotics, affective computing, and experience capture.
  • Studies of how social organizations, such as government agencies or corporations, respond to and shape the introduction of new information technologies, especially with the goal of improving scientific understanding and technical design.
  • Knowledge-driven human-computer interaction that uses ontologies to resolve semantic ambiguities between human and computer understandings of each other's behavior.[3]
  • Human-centered semantic relatedness measures that employ human judgment to measure the semantic relatedness between two concepts.[4]
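As a loose illustration of the last item, a human-centered relatedness score can be built directly from human judgments, for example by averaging ratings collected from several people for a pair of concepts. The Python sketch below is a hypothetical minimal example and is not the method of the cited work:

    from statistics import mean

    def human_relatedness(ratings):
        """Aggregate human judgments (each on a 0-1 scale) into a single
        relatedness score for a pair of concepts."""
        if not ratings:
            raise ValueError("at least one human rating is required")
        return mean(ratings)

    # Example: five people rate how related "doctor" and "hospital" are.
    scores = [0.9, 0.85, 1.0, 0.8, 0.95]
    print(round(human_relatedness(scores), 2))  # 0.9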

Human-centered systems

Human-centered systems (HCS) are systems designed for human-centered computing. This approach was developed by Mike Cooley in his book Architect or Bee?,[5] drawing on his experience working with the Lucas Plan. HCS focuses on the design of interactive systems as they relate to human activities.[6] According to Kling et al., the Committee on Computing, Information, and Communication of the National Science and Technology Council identified human-centered systems, or HCS, as one of five components of a High Performance Computing Program.[7] Human-centered systems are sometimes discussed in terms of human-centered automation. According to Kling et al., HCS refers to "systems that are:

  1. based on the analysis of the human tasks the system is aiding
  2. monitored for performance in terms of human benefits
  3. built to take account of human skills and
  4. adaptable easily to changing human needs."[7]

In addition, Kling et al. define four dimensions of human-centeredness that should be taken into account when classifying a system:

  1. Systems that are human centered must analyze the complexity of the targeted social organization and the varied social units that structure work and information.
  2. Human centeredness is not an attribute of systems, but a process in which the stakeholder group of a particular system assists in evaluating the benefit of the system.
  3. The basic architecture of the system should reflect a realistic relationship between humans and machines.
  4. The purpose and audience the system is designed for should be an explicit part of the design, evaluation, and use of the system.[7]

Human-computer interaction

Within the field of human-computer interaction (HCI), the term "user-centered" is commonly used. The main focus of this approach is to thoroughly understand and address user needs to drive the design process. Human-centered computing (HCC), however, goes beyond conventional areas such as usability engineering, human-computer interaction, and human factors, which primarily deal with user interfaces and interactions. Experts define HCC as a field that integrates the learning sciences, social sciences, cognitive sciences, and intelligent systems more extensively than traditional HCI practice.

Human-centered computing is regarded as an essential aspect of computer-related research rather than merely a sub-discipline of computer science. The HCC perspective acknowledges that "computing" encompasses tangible technologies that enable diverse tasks while also serving as a significant social and economic influence.

In addition, Dertouzos elaborates on how HCC goes beyond the notion of interfaces that are easy for users to navigate by strategically incorporating five technologies: natural interaction, automation, personalized information retrieval, collaborative capabilities, and customization.

While the scope of HCC is extensive, three fundamental factors are proposed to constitute the core of HCC system and algorithm design processes:

  1. Socially and culturally aware considerations.
  2. Direct augmentation and/or consideration of human abilities.
  3. Adaptability.

Adherence to these factors in system and algorithm design for HCC applications is anticipated to yield qualities such as:

  1. Responsive actions aligned with the social and cultural context of deployment.
  2. Integration of input from various sensors, with communication through diverse media as output.
  3. Accessibility for a diverse range of individuals.

Human-centered activities in multimedia

Wikimania human-centered design visualization, created by Myriapoda.

Human-centered activities in multimedia, or HCM, comprise media production, annotation, organization, archival, retrieval, sharing, analysis, and communication, which can be clustered into three areas: production, analysis, and interaction.[8]

Multimedia production

Multimedia production is the human task of creating media,[9] for instance photographing, recording audio, or remixing. In HCM, all aspects of media production must directly involve humans. Two considerations are central. The first is cultural and social factors: HCM production systems should take cultural differences into account and be designed for the culture in which they will be deployed. The second is human abilities: participants in HCM production should be able to complete the activities of the production process, so systems must recognize and make effective use of human capabilities.

Multimedia analysis

Multimedia analysis is the application area of HCM concerned with the automatic analysis of human activities and social behavior in general. Its potential uses are broad, ranging from facilitating and enhancing human communication to improving information access and retrieval in professional, entertainment, and personal domains. Such analysis goes beyond simple categorization toward a nuanced understanding of human behavior, which can enhance system functionality and improve the user experience.
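As a small, concrete illustration of such analysis, the following Python sketch counts the faces visible in a photograph using OpenCV's bundled Haar cascade detector. It is a minimal example rather than a system from the cited literature, and it assumes an image file named group_photo.jpg exists.

    import cv2  # OpenCV; install with: pip install opencv-python

    # Load OpenCV's pre-trained frontal-face Haar cascade.
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    face_detector = cv2.CascadeClassifier(cascade_path)

    # Read the (hypothetical) image and convert it to grayscale,
    # which the cascade classifier expects.
    image = cv2.imread("group_photo.jpg")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Detect faces; each detection is an (x, y, width, height) rectangle.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    print(f"Detected {len(faces)} face(s) in the image.")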

Multimedia interaction

Multimedia interaction can be considered the interaction activity area of HCM. It is paramount to understand both how and why humans interact with each other, so that systems can be built to facilitate such communication and so that people can interact with computers in natural ways. To achieve natural interaction, cultural differences and social context are primary considerations, since participants may come from different cultural backgrounds. Examples include face-to-face communication, where the interaction is co-located and real-time; live computer-mediated communication, where the interaction is physically remote but remains real-time; and non-real-time computer-mediated communication such as SMS and email.

Human-centered design process

The human-centered design process is an approach to problem-solving used in design. The process begins with empathizing with the user to learn about the target audience of the product and understand their needs. Empathizing then leads to research, asking the target audience specific questions to further understand their goals for the product at hand. This research stage may also involve competitor analysis to find more design opportunities in the product's market. Once the designer has compiled data on the user and the market for the product, they move on to the ideation stage, in which they brainstorm design solutions through sketches and wireframes. A wireframe is a digital or physical illustration of a user interface that focuses on information architecture, space allocation, and content functionality. Consequently, a wireframe typically has no colors or graphics and concentrates only on the intended functionalities of the interface.[10]

To conclude the human-centered design process, there are two final steps. After wireframing or sketching, the designer usually turns their paper sketches or low-fidelity wireframes into high-fidelity prototypes. Prototyping allows the designer to explore their design ideas further and focus on the overall design concept.[10] High-fidelity means that the prototype is interactive or "clickable" and simulates a real application.[11] After creating this high-fidelity prototype, the designer can then conduct usability testing. This involves recruiting participants who represent the target audience of the product and having them walk through the prototype as if they were using the real product. The goal of usability testing is to identify any issues with the design that need to be improved and to analyze how real users will interact with the product.[12] To run an effective usability test, it is imperative to take notes on the user's behavior and decisions and to have the user think aloud while using the prototype.

Career

Academic programs

As human-centered computing has become increasingly popular, many universities have created special programs for HCC research and study for both graduate and undergraduate students.

User interface designer

A user interface designer is an individual who usually has a relevant degree or a high level of knowledge not only of technology, cognitive science, human–computer interaction, and the learning sciences, but also of psychology and sociology. A user interface designer develops and applies user-centered design methodologies and agile development processes that include consideration of the overall usability of interactive software applications, emphasizing interaction design and front-end development.

Information architect (IA)

Information architects mainly work to understand user and business needs in order to organize information to best satisfy these needs. Specifically, information architects often act as a key bridge between technical and creative development in a project team. Areas of interest in IA include search schemas, metadata, and taxonomy.[13]

Projects

NASA/Ames Computational Sciences Division

NASA Mars Project

The Human-Centered Computing (HCC) group at NASA/Ames Computational Sciences Division is conducting research at Haughton as members of the Haughton-Mars Project (HMP) to determine, via an analog study, how we will live and work on Mars.[14]

  1. HMP/Carnegie Mellon University (CMU) Field Robotics Experiments—HCC is collaborating with researchers on the HMP/CMU field robotics research program at Haughton to identify opportunities for robots to assist scientists. Researchers in this project have carried out a parallel investigation that documents work during traverses. A simulation module has been built, using a tool that represents people, their tools, and their work environment; it will serve as a partial controller for a robot that assists scientists in field work on Mars (a simplified sketch of such a people-tools-environment model follows this list). Theory and techniques from the HCC field serve as the guideline for taking the human, computing, and environment dimensions into consideration together.
  2. Ethnography of Human Exploration of Space—The HCC lab is carrying out an ethnographic study of scientific field work, covering all aspects of a scientist's life in the field. The study involves observing as participants at Haughton and writing about the lab's experiences. The lab then looks for patterns in how people organize their time, space, and objects, and how they relate to each other to accomplish their goals. In this study, the lab is focusing on learning and conceptual change.
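The people-tools-environment simulation mentioned in the first project above can be pictured, very loosely, as an agent-based model in which simulated scientists move between locations and use the tools available there. The Python sketch below is hypothetical and greatly simplified; it does not reproduce the actual NASA simulation tool.

    from dataclasses import dataclass, field

    @dataclass
    class Tool:
        name: str

    @dataclass
    class Location:
        name: str
        tools: list = field(default_factory=list)

    @dataclass
    class Scientist:
        name: str
        location: Location

        def traverse(self, destination):
            """Move to another site, as during a geology traverse."""
            print(f"{self.name} moves from {self.location.name} to {destination.name}")
            self.location = destination

        def work(self):
            """Use whatever tools are available at the current site."""
            tools = ", ".join(t.name for t in self.location.tools) or "no tools"
            print(f"{self.name} works at {self.location.name} using {tools}")

    # Hypothetical scenario: one scientist traverses between two field sites.
    camp = Location("base camp", [Tool("notebook")])
    crater = Location("crater rim", [Tool("rock hammer"), Tool("sample bag")])
    scientist = Scientist("field scientist", camp)

    scientist.work()
    scientist.traverse(crater)
    scientist.work()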

Center for Cognitive Ubiquitous Computing (CUbiC) at Arizona State University

Note-Taker device with initial inventor David Hayden

Based on the principles of human-centered computing, the Center for Cognitive Ubiquitous Computing (CUbiC)[15] at Arizona State University develops assistive, rehabilitative and healthcare applications. Founded by Sethuraman Panchanathan in 2001, CUbiC research spans three main areas of multimedia computing: sensing and processing, recognition and learning, and interaction and delivery. CUbiC places an emphasis on transdisciplinary research and positions individuals at the center of technology design and development. Examples of such technologies include the Note-Taker,[16] a device designed to aid students with low vision to follow classroom instruction and take notes, and VibroGlove,[17] which conveys facial expressions via haptic feedback to people with visual impairments.

In 2016, researchers at CUbiC introduced "Person-Centered Multimedia Computing",[18] a new paradigm adjacent to HCC, which aims to understand a user's needs, preferences, and mannerisms including cognitive abilities and skills to design ego-centric technologies. Person-centered multimedia computing stresses the multimedia analysis and interaction facets of HCC to create technologies that can adapt to new users despite being designed for an individual.

See also

References

  1. ^ a b c Alejandro Jaimes; Daniel Gatica-Perez; Nicu Sebe; Thomas S. Huang (November 20, 2007). "Human-centered computing: toward a human revolution". Computer. 40 (5): 30–34. doi:10.1109/MC.2007.169. S2CID 2180344.
  2. ^ a b "US NSF - CISE - IIS". www.nsf.gov. Retrieved April 17, 2015.
  3. ^ Dong, Hai; Hussain, Farookh; Chang, Elizabeth (2010). "A human-centered semantic service platform for the digital ecosystems environment". World Wide Web. 13 (1–2): 75–103. doi:10.1007/s11280-009-0081-5. hdl:20.500.11937/29660. S2CID 10746264.
  4. ^ Dong, Hai; Hussain, Farookh; Chang, Elizabeth (2013). "UCOSAIS: A Framework for User-Centered Online Service Advertising Information Search". Web Information Systems Engineering – WISE 2013. Lecture Notes in Computer Science. Vol. 8180. Springer-Verlag Berlin Heidelberg. pp. 267–276. doi:10.1007/978-3-642-41230-1_23. ISBN 978-3-642-41229-5.
  5. ^ "Architect or Bee? The human price of technology" (PDF). spokesmanbooks.com. http://www.spokesmanbooks.com/Spokesman/PDF/131OGrady.pdf
  6. ^ Communications, Texas. "Human-Centered Systems | Research Areas | Research | Computer Science & Engineering | College of Engineering". engineering.tamu.edu. Retrieved April 17, 2015.
  7. ^ a b c "Human Centered Systems in the Perspective of Organizational and Social Informatics" (PDF). philfeldman.com. Retrieved April 17, 2015.
  8. ^ Jaimes, A. (2006). "Human-centered multimedia: culture, deployment, and access". IEEE MultiMedia. 13 (1): 12–19. doi:10.1109/MMUL.2006.8. S2CID 8169985.
  9. ^ Jaimes, Alejandro; Sebe, Nicu; Gatica-Perez, Daniel (2006). "Human-centered computing". Proceedings of the 14th ACM international conference on Multimedia. pp. 855–864. doi:10.1145/1180639.1180829. ISBN 978-1595934475. S2CID 4412002.
  10. ^ a b Affairs, Assistant Secretary for Public (September 6, 2013). "Wireframing". www.usability.gov. Retrieved December 9, 2019.
  11. ^ "High-Fidelity Prototype | Usability.gov". www.usability.gov. June 10, 2013. Retrieved December 9, 2019.
  12. ^ Affairs, Assistant Secretary for Public (November 13, 2013). "Usability Testing". www.usability.gov. Retrieved December 11, 2019.
  13. ^ "Information Architecture Basics". www.usability.gov. October 8, 2013. Retrieved March 10, 2017.
  14. ^ "NASA - Human Centered Computing". www.nasa.gov. Retrieved March 10, 2017.
  15. ^ "Home | Center for Cognitive Ubiquitous Computing". Retrieved December 28, 2018.
  16. ^ Kullman, Joe (August 23, 2011). "Note-Taker device promises to help students overcome visual impairments". ASU Now. Retrieved December 28, 2018.
  17. ^ Panchanathan, Sethuraman; Krishna, Sreekar; Bala, Shantanu. "VibroGlove". CUbiC.asu.edu. Retrieved December 28, 2018.
  18. ^ Panchanathan, S.; Chakraborty, S.; McDaniel, T.; Tadayon, R. (July–September 2016). "Person-Centered Multimedia Computing: A New Paradigm Inspired by Assistive and Rehabilitative Applications". IEEE MultiMedia. 23 (3): 12–19. doi:10.1109/MMUL.2016.51.

Further reading