
Exploring Human-Computer Interaction

Megan Edwards | Reading time: 10 minutes
These days, most of us take for granted the extraordinary computational powers we have access to on a daily basis. The capabilities of smartphones, tablets, and PCs have skyrocketed in the past few decades and changed the landscape of technology forever.

What is HCI?

While it’s easy to focus on how we’ve advanced the scope of computers, it can often be more insightful to ask how computers have changed us. The field of Human-Computer Interaction (HCI) is both an academic discipline and a design practice that examines how people interact with computing technology.
Combining psychology, sociology, engineering, philosophy, and design, this growing field grapples with the complexities that arise as the gap between the organic and artificial worlds rapidly closes.
Prior to the 1970s, computers were only available to a small group of academics, scientific professionals, and incredibly wealthy hobbyists. But with the boom in personal computers, technology was forced to simultaneously simplify its user interface and increase the complexity of its capabilities. Thus, the discipline of HCI was born.
Today, HCI is employed by tech companies large and small to create the most seamless experiences possible between humans and machines. With breakthrough augmented reality (AR) and virtual reality (VR) applications just on the horizon, HCI has never been more important in determining how the future of technology will play out.

A (brief) history of human-computer interaction

Long before HCI existed as a field of study, the idea of an electronically powered computing machine seemed far-fetched. While designs for simple calculating tools had been circulating since the 1600s, English inventor Charles Babbage was the first to formulate a feasible plan for a mechanical computer in the 1800s [1].
Babbage received many grants from the British government to construct the device, which he called the Difference Engine. This machine could theoretically perform complex calculations and store data for later processing, just like a modern computer. However, he failed to complete the project after becoming fixated on a new idea: the Analytical Engine.
This device would be a general-purpose, fully programmable machine capable of carrying out any calculation. Its design laid the groundwork for many components of the modern computer, including the central processing unit (CPU). But just like his first invention, Babbage was unable to take his plans from concept to construction.
Progress toward a digital computer stalled until 1931, when MIT professor Vannevar Bush and his team of grad students developed the Differential Analyzer, which could solve specific types of complex equations [2]. Engineers kept pushing further: Howard Aiken developed the Harvard Mark I, a mostly mechanical computational device that weighed about 5 tons [3]. With this breakthrough device, Aiken is credited with creating the first fully automatic large-scale calculator.
The push toward an electronic digital computer came in the midst of World War II. The British government was desperate to defeat the Nazi German encryption device known as the Enigma machine, which produced nearly unbreakable ciphers to disguise enemy correspondence [4].
Alan Turing, one of the codebreakers at Britain’s Bletchley Park, was an early advocate for the possibility of a universal computing machine that could not only perform arithmetic but also represent abstract mental states in its calculations. In other words, he envisioned artificial intelligence (AI). During the war, Turing designed the electromechanical Bombe, the device that cracked the Enigma machine, while his Bletchley Park colleague Tommy Flowers built Colossus, the first programmable electronic digital computer, which laid the groundwork for modern general-purpose machines.
Over time, the technology progressed, and smaller computers were introduced into scientific laboratories to assist with complex calculations. Hewlett-Packard, or HP®, established itself as a key player in this market and worked to advance the accessibility of these so-called “minicomputers.”
Thanks to a huge leap forward in technology during the 1970s, minicomputers soon transformed into the modern desktop computer most people are familiar with today. Computer giants such as Microsoft and Apple were formed, and by 1983 more than 10 million PCs were being used in the United States [5].
Video games, the World Wide Web, search engines, email, video chat services, touch screens, and smartphones all blossomed from this boom in personal computing. The study of HCI began to take shape in the minds of scholars and designers alike. Computers were increasingly being integrated into everyday life, and the capabilities they offered were rapidly changing human interactions.
While the precise definition of human-computer interaction is evolving with the most recent technological breakthroughs, it comes down to a simple question: how can people and computer-based technology interact harmoniously?

What does human-computer interaction do for technology?

The first HCI conference was held in 1983 by the ACM Special Interest Group on Computer-Human Interaction [6]. Practitioners agreed that four main components factor into any HCI design problem: the user, the task, the interface, and the context. These four elements help evaluate the level of usability of a given computing device.
Usability is so central to HCI that the two terms are sometimes used almost interchangeably. HCI enables user experience (UX) and user interface (UI) designers to create more intuitive technology that better serves the needs of people around the world.
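To make that four-part framework concrete, here is a minimal Python sketch that models a usability evaluation as a simple data structure. The class name and the example values are hypothetical, invented for illustration rather than drawn from any real HCI tooling:

```python
from dataclasses import dataclass

@dataclass
class UsabilityEvaluation:
    """Toy model of the four classic HCI evaluation factors."""
    user: str       # who is using the system
    task: str       # what they are trying to accomplish
    interface: str  # the means of interaction (touchscreen, voice, gaze, ...)
    context: str    # where and under what conditions the interaction happens

    def summary(self) -> str:
        return (f"Evaluating how a {self.user} performs '{self.task}' "
                f"via a {self.interface} interface in {self.context}.")

# Hypothetical example: the same task can score very differently
# depending on the user and the context it is attempted in.
evaluation = UsabilityEvaluation(
    user="first-time smartphone owner",
    task="send a photo to a family member",
    interface="touchscreen",
    context="a noisy, brightly lit street",
)
print(evaluation.summary())
```

Thinking in these four dimensions is what keeps designers from optimizing an interface in isolation while forgetting who will use it and where.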
What used to be science fiction is now a reality. You can watch movies as you soar through the air on an international flight, control video games with your body movements, or earn a master’s degree online in a virtual classroom.
Here are just a few examples of how HCI has affected the expanding world of technology:

1. Voice-guided interfaces

Whether you use Cortana, Siri, or Alexa, voice-guided interfaces are an increasingly prevalent part of our daily interactions. Rather than making you worry about misspelling a search query or stopping an important task to research a question, these voice-activated assistants let users simply ask for what they want.
The goal of these interfaces is to make acquiring information as casual and convenient as chatting with a friend. HCI experimentation has refined this technology and helped designers gain valuable insight into how people speak to each other versus how they speak to technology.
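As a rough illustration of one step in that pipeline, here is a toy intent matcher in Python. Real assistants rely on statistical language models rather than keyword lists; the intent names and keywords below are invented purely for the example:

```python
# Hypothetical intents and trigger words -- real assistants learn these
# mappings from data instead of hand-written lists.
INTENT_KEYWORDS = {
    "weather": ["weather", "rain", "temperature", "forecast"],
    "timer":   ["timer", "remind", "alarm"],
    "music":   ["play", "song", "music"],
}

def classify_intent(utterance: str) -> str:
    """Return the first intent whose keywords appear in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in words for keyword in keywords):
            return intent
    return "unknown"

print(classify_intent("What's the weather like today?"))  # -> weather
```

Even this crude sketch shows why spoken queries are forgiving in a way typed ones are not: the user never has to spell anything correctly, only to mention what they want.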

2. Eye-tracking software

For people with disabilities such as full-body paralysis, using a keyboard to type out commands is impossible. But thanks to developments in eye-tracking software, thousands of people who were previously unable to access the benefits of a computer can simply look at a screen and use special software to carry out commands.
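One common gaze-interaction technique is dwell-time selection, where resting your eyes on a target for a set duration acts as a click. The Python sketch below is a minimal illustration of that idea; the threshold and the sample gaze log are hypothetical:

```python
DWELL_THRESHOLD = 1.0  # seconds the gaze must rest on a target to "click" it

def dwell_select(gaze_samples, threshold=DWELL_THRESHOLD):
    """Given (timestamp, target) gaze samples, return targets selected by dwell.

    A target is selected when the gaze rests on it continuously for at
    least `threshold` seconds -- the dwell-time technique commonly used
    in gaze-driven interfaces.
    """
    selections = []
    current_target, dwell_start = None, None
    for timestamp, target in gaze_samples:
        if target != current_target:
            # Gaze moved to a new target: restart the dwell timer.
            current_target, dwell_start = target, timestamp
        elif target is not None and timestamp - dwell_start >= threshold:
            selections.append(target)
            current_target, dwell_start = None, None  # reset after selecting

    return selections

# Simulated gaze log: the user fixates on the "send" button for over a second.
samples = [(0.0, "send"), (0.5, "send"), (1.1, "send"), (1.3, None)]
print(dwell_select(samples))  # -> ['send']
```

Tuning that dwell threshold is itself a classic HCI problem: too short and users trigger accidental clicks just by reading the screen; too long and every action becomes exhausting.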
Without the field of HCI constantly assessing how to make technology more accessible to a larger swath of people, improvement in user experience for those with disabilities would not be where it is today.

3. The laser printer

What may seem like a simple piece of technology nestled away in your home office is actually a monument to decades of careful testing and tweaks. With the rise of touchscreen technology, HCI designers realized that touch interfaces could easily be incorporated into different types of printers to create a more intuitive printing process.
Additionally, remote printing capabilities have been added to many new printers to accommodate the increasing number of remote employees, a prime example of tech responding to human behavior.

4. Health trackers

Instead of requiring you to monitor your health through a healthcare provider, HCI has put the power of fitness tracking directly into the hands of the individual user. From wearable fitness trackers to smartphone apps that calculate your sleep cycle, your own personal well-being can be monitored conveniently from home. This technology has already massively impacted the healthcare industry and will continue to change how people monitor their own health and report their findings to doctors.

5. Virtual reality

From video gamers to architects, virtual reality is changing what’s possible in terms of HCI. This technology will continue to develop as researchers and designers gain more insight into how people respond to virtual environments. Not only does it have the possibility to revolutionize the entertainment industry, but it could also shift how employees across industries interact with their daily tasks.
From the hospital emergency room to your kitchen counter, HCI examines how humans interact with computer-based machines in order to improve usability for all.

How is HP® designing immersive experiences through human-computer interaction?

One of the pioneering organizations in the field of HCI is HP Labs. Its mission is to put people first in order to create the technology of the future. Through qualitative and quantitative research methods, the HP Labs team tests new techniques to delight, inspire, and make life easier for all tech users.

HCI in VR

A 2016 project from the Immersive Experiences Lab experimented with ways to allow virtual reality users to keep track of time in the real world. Instead of taking up visual space with a virtual clock, the lab team assembled different iterations of a haptic feedback backpack that delivered physical sensations to the wearer.
From a simulated shoulder tap to a “hugging” sensation, the researchers gained valuable insight into which types of haptic feedback could be used to alert virtual reality users of the passage of time. If VR is to be successfully integrated into the workplace, it’s questions like these that need to be examined in order to create a seamless experience between humans and computers.
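As a back-of-the-envelope illustration of that idea, the Python sketch below maps elapsed real-world time to distinct haptic cues so a VR user can track time without an on-screen clock. The cue names echo the sensations described above, but the schedule and function are purely hypothetical, not the lab's actual design:

```python
# Hypothetical schedule: which haptic cue fires at which interval.
HAPTIC_SCHEDULE = [
    (15 * 60, "single shoulder tap"),   # every quarter hour
    (30 * 60, "double shoulder tap"),   # every half hour
    (60 * 60, "hugging squeeze"),       # every full hour
]

def cue_for_elapsed(seconds_elapsed):
    """Return the haptic cue to fire at this moment, if any.

    Checks the longest interval first so the hour mark triggers the
    "hugging squeeze" rather than the weaker quarter-hour tap.
    """
    for interval, cue in sorted(HAPTIC_SCHEDULE, key=lambda item: -item[0]):
        if seconds_elapsed > 0 and seconds_elapsed % interval == 0:
            return cue
    return None

print(cue_for_elapsed(30 * 60))  # -> double shoulder tap
print(cue_for_elapsed(60 * 60))  # -> hugging squeeze
```

The design question the researchers faced is exactly the one this sketch glosses over: which physical sensations people can reliably distinguish, and notice, while immersed in a virtual world.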
HCI is often associated with big-name projects such as VR and AR, but the field is also constantly working to design less glamorous computing experiences that are just as revolutionary.

HCI, healthcare, and security

HP Labs is also pioneering a writable, energy-free display technology for identification cards. These cards feature an embedded screen that can display text, images, and QR codes, and they can be rewritten thousands of times, a capability that could drastically change the security, health, and business sectors, among others. Because the cards have the same size and flexibility as standard credit cards, the ability to quickly update the information they display could streamline countless processes.

HCI in ongoing tech innovation

Apart from major developments in new technology, it’s important to remember that HCI is constantly used to make improvements to old technology. Every time a new laptop is released, there are minor tweaks that make it easier to operate than the last version even though the core technology isn’t reinvented with each iteration.
Each PC that HP® releases is the product of thousands of hours of testing, calibrations, and design changes. How can the startup menu be adjusted to be more intuitive? How can the physical construction be reimagined to provide multiple modes of use? What would make it easier for users to take advantage of the webcam? Without human-computer interaction questions like these, the computer industry would stagnate.
Take the HP Sprout Pro AiO desktop PC as one of the best examples of this process. This workstation offers a fully immersive experience with two multi-touch displays and software that can capture 2D and 3D objects. By offering the ability to interact with and manipulate content in real time, the HP Sprout Pro could expand the horizons of education, manufacturing, and marketing.

What is the future of human-computer interaction?

Wondering what the future holds? The field of HCI is practically limitless. Here are some common ideas about where technology is headed over the next decade.

1. Thought control

Many researchers and academics believe it could be possible to create a system where you can control objects with your mind. Whether this entails placing small sensors on your head, wearing a specially designed hat, or some other less-obvious mechanism, it could be possible to scroll through email, turn on the TV, and play music all without lifting a finger.

2. Trauma recovery

Augmented reality has already proven to be a successful technique for helping amputees cope with phantom limb syndrome, but patients with all sorts of ailments could benefit from advancements in this arena. Whether it be psychological trauma, such as PTSD, or a chronic mental health problem, such as severe anxiety, AR and VR might be able to offer treatment programs that help to “rewire” the brain through interactive stimulation.

3. Eldercare

Instead of moving an elderly person who is experiencing memory loss directly into a nursing facility, advanced headsets or virtual assistants could remember everything they cannot. Medicine schedules, driving directions, and daily chores could all be accessed through the device, so seniors can retain the freedom of independent living for a longer period of time.

4. Personalized marketing

While you’re probably already used to receiving targeted advertisements based on your preferences, HCI could take marketing to a whole new level. Many designers are aiming for smart devices that can communicate with each other while you’re on the move.
Imagine walking down the street at noon and having lunch special deals from restaurants you’re likely to eat at pop up on your phone. Your personal device would share information about you with other nearby devices in an effort to tailor every venture out into the world to your own exact tastes.

The bottom line on HCI

While smarter computer-based devices certainly raise legitimate privacy concerns, human-computer interaction holds enormous promise for solving complex problems.
Immersive technologies will increase the scope of what’s possible in both professional and personal fields, changing the norms of tech forever. But one thing is certain: we’ve only scratched the surface of how humans and machines can coexist.
[1] Encyclopedia Britannica; Computer Technology
[2] MIT Museum; Differential Analyzer
[5] Computer Hope; Computer History
[6] Direction Bordeaux; Looking Back, A Very Brief History of HCI

About the Author

Megan Edwards is a contributing writer for HP® Tech Takes. Megan is a digital content creator based in Southern California who specializes in creating multimedia content for various industries, including technology.

Disclosure: Our site may get a share of revenue from the sale of the products featured on this page.