Case Study

An all-in-one desktop management interface for Nokia's major U.S. national telecommunications clients.


Project Summary

Problem Domain: Telecommunications Hardware & Software Management

Notice: I am currently under an NDA for this project, so much of the design process and material is confidential. I will, however, share information and process pieces that are in the public domain. If you have more questions about this project, please feel free to email me.

Timeline: 3 months, Jun. 2018 - Aug. 2018
Role & Team: Human Factors Engineering Intern - Interactive Gaming and Visual Programming, Team of 4 (3 HFE Interns and Head of Nokia HFE Studio)
Tools: InVision, Sketch, PowerPoint, Keynote
Methodologies: secondary research, wireframing, task flows, in-house testing, digital prototyping, hi-fi prototyping, client interviews
Final Product: a hi-fi, proof-of-concept desktop management interface for Nokia's major telecommunications clients + interface evaluation framework


Design Process

While I am under NDA for product-related content, I can share our project timeline, stripped of any product details, as well as my secondary research on interactive gaming, which informed our team's interface design. This research on gaming consoles, hardware, and software is all publicly available.

NOKIA HFE PROJECT-page-001.jpg

Secondary Research: Interactive Gaming

Interactive Gaming: What is it?

Interactive Gaming is an umbrella term for video games that use the player's physical actions to engage them in virtual gameplay.

Interactive gaming takes many forms, each built around a different mode of interaction: full-body movement, text/command input, button presses, joystick input, mouse-and-click input, etc. Gaming experiences are tailored to the type of interaction demanded of the player.

Here are some different examples of user-interaction used in gaming today.

Body-tracking: Nintendo Wii, Xbox Kinect

These systems use infrared-based tracking to capture player movements and render them in real time in a virtual environment. They can be used to create virtual avatars of players that interact with that environment.

For example, Wii Sports shows avatars interacting with virtual agents through sports games like tennis, bowling, and boxing. These activities engage the user on a full-body level. The experiences are further enhanced by haptic feedback from the Wii Remote, which makes in-game actions feel more real.

This level of full-body engagement is great for simulating sports, slaying dragons, and the like, but it can only engage the user for so long before fatigue sets in. That fatigue likely hurt sales of such devices: for the most part, when people want to play video games, they want to relax, which means minimizing full-body movement.


Controller-based: Xbox, PlayStation

Controller-based systems use a video game controller connected to the gaming system through a wired or wireless connection. These controllers largely consist of a simple interface of buttons, triggers, and joysticks.

For example, the Xbox One S controller features four (XYAB) buttons on the right-hand side, two joysticks (one on each side), Xbox, View, and Menu buttons, a d-pad (directional pad), and two spring-loaded triggers.

This controller is designed to give the user multiple tools to interact with virtual gameplay. Each part has its own capabilities to leverage:

4 (XYAB) buttons: These four buttons give the user quick, one-touch access to commands: they allow easy navigation through menus and facilitate action sequences in games. If a game requires the user to follow patterns or serial events with precise button execution, these buttons allow them to complete those actions.

Examples of this functionality in gameplay include the fighting sequences in the Naruto Shippuden: Ultimate Ninja Storm series for Xbox, as well as the action-fighting sequences of the Call of Duty series.
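The "patterns or serial events with precise button execution" idea can be sketched as a small combo matcher. This is a hypothetical illustration, not any real game's input code; the combo names and sequences are made up.

```python
# Hypothetical combo table: a recent sequence of button presses
# (newest last) maps to an in-game action.
COMBOS = {
    ("X", "X", "Y"): "light-combo",
    ("B", "A"): "counter",
}

def match_combo(recent_presses):
    """Return the action for the longest combo ending the press history,
    or None if no combo matches."""
    for length in range(len(recent_presses), 0, -1):
        key = tuple(recent_presses[-length:])
        if key in COMBOS:
            return COMBOS[key]
    return None
```

Checking from the longest suffix down ensures a three-button combo is preferred over a shorter one that happens to share its tail.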

Two Joysticks: The joysticks of the Xbox controller add a level of versatility that is not possible with one-touch interaction alone. Both joysticks are omnidirectional, meaning they can be moved or rotated in any direction within their plane of motion. In most gameplay, the left joystick controls player movement, while the right controls the direction of the player's view: pushing it horizontally rotates the view in the x-direction, and vertically in the y-direction. The joysticks also feature rubber gripping on the edges so that players keep a firm hold on the controls; this is critical when a player is trying to maneuver guerrilla war tactics, navigate a base of dark elves, and so on.

These controls can also be used to rotate in-game graphic models and items. Examples of this functionality in gameplay include navigating your character across the maps of Fortnite, as well as rotating the world map in The Elder Scrolls V: Skyrim.
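The stick-to-view mapping described above can be sketched as a small function. This is a simplified illustration under assumed conventions (axes in the range -1..1, a made-up deadzone value and sensitivity), not any console's actual input pipeline.

```python
import math

DEADZONE = 0.15  # assumed: ignore small stick drift near the center

def camera_delta(stick_x, stick_y, sensitivity=90.0, dt=1 / 60):
    """Map right-stick axes (each -1..1) to a (yaw, pitch) change in degrees
    for one frame of duration dt.

    A radial deadzone filters out resting-position noise; outside it,
    rotation speed scales smoothly with how far the stick is deflected.
    """
    magnitude = math.hypot(stick_x, stick_y)
    if magnitude < DEADZONE:
        return 0.0, 0.0
    # Rescale so rotation starts from zero at the deadzone edge
    # instead of jumping, then normalize by the deflection magnitude.
    scale = (magnitude - DEADZONE) / (1 - DEADZONE) / magnitude
    yaw = stick_x * scale * sensitivity * dt
    pitch = stick_y * scale * sensitivity * dt
    return yaw, pitch
```

With these numbers, holding the stick fully right turns the view 90 degrees per second, i.e. 1.5 degrees per 60 Hz frame.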

Xbox, View and Menu buttons: These buttons are used less often in Xbox gameplay, but still hold an important role in navigating users to important console menus.

Two Triggers: The triggers of the Xbox controller let the user act in games with ease. Whether shooting bullets or placing square blocks, users can interact with their virtual space at the pull of a finger. Additionally, the Xbox triggers provide vibration haptic feedback to give the user extra information, such as the rate of bullets being released from their gun: minimal vibration can mean fewer bullets being fired, and more vibration can mean more.
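The "more bullets, more vibration" pairing amounts to scaling both an output and its feedback from the same analog input. A hypothetical sketch, assuming a trigger value in 0..1 and a made-up maximum fire rate:

```python
MAX_FIRE_RATE = 10.0  # assumed: bullets per second at full trigger pull

def trigger_response(pull):
    """Return (bullets_per_second, vibration_intensity) for an analog
    trigger pull in [0, 1]; out-of-range input is clamped."""
    pull = max(0.0, min(1.0, pull))
    fire_rate = pull * MAX_FIRE_RATE
    vibration = pull  # feedback intensity mirrors the output rate
    return fire_rate, vibration
```

Because both values derive from the same pull, the vibration the player feels always agrees with what the game is doing, which is exactly the informational role the feedback plays.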


Keyboard + Mouse-based: PC

Users utilize a keyboard and mouse to play games on their PC.


Pros:

  • Games can utilize many keyboard shortcuts.

  • The UI can be more detailed, since the player sits close to the screen.

  • Graphics can be displayed at a higher quality than on most consoles.


Cons:

  • Players cannot play with multiple people in person.

  • Haptic feedback may not be as convincing as with traditional controllers and their vibration motors.

Virtual Reality-based:

A tracking system of some kind captures the user's movements, which are then rendered in a virtual gaming environment.


Pros:

  • Heightened/full sense of presence within the virtual environment.

  • Feedback to enhance presence.

  • 360-degree visuals.

  • Many other awesome things :)


Cons:

  • Fatigues the user (20-30 minute sessions are recommended).

  • Physical limitations: fatigue, tracking, and safety issues.

  • Oculomotor disconnect can cause nausea, dizziness, and ocular discomfort.

  • Potentially traumatic content; a need for regulation of material.

  • Can be ‘too’ real. Real for bad reasons.

Augmented Reality-based:

Using cameras, often on mobile devices, developers are able to render virtual objects over a live view of the real world.


Pros:

  • Portability: you can go anywhere as long as you have a camera.

  • No ocular stress or oculomotor disconnect.

  • A wide variety of applications.


Cons:

  • Hard to position virtual objects in the right place in the real world.


Insights from Interactive Gaming:

Article: “Game UI By Example: A Crash Course in the Good and the Bad”


Guiding Principle: “A good UI tells you what you need to know, and then gets out of the way.” 

6 fundamental questions: 

  • Does this interface tell me what I need to know right now? 

  • Is it easy to find the information I'm looking for, or do I have to look around for it? (Are the menus nested so deep that they hide information from the player?) 

  • Can I use this interface without having to read instructions elsewhere? 

  • Are the things I can do on this screen obvious? 

  • Do I ever need to wait for the interface to load or play an animation? 

  • Are there any tedious or repetitive tasks that I can shorten (with a shortcut key, for example) or remove entirely? 

Functional aspects of UI: 

  • How big is it? 

  • Does it (or should it) scroll? 

  • What information is displayed and where? 

  • How does the player navigate through it? 

Visual design tips:

  • Make mock-ups with two or three colors

  • Never rely on color changes alone to convey information 

  • Keep in mind colorblindness too 

Features to include: 

  • Shortcuts 

  • Screen space-efficiency 

  • Maximize task efficiency; give the player what they want and when they need it 

  • Fewer interactions, so the player can stay engaged in playing the game

  • You can drag these windows anywhere on the screen, and you can resize them or minimize them to tailor the UI to your needs

  • If you see an unfamiliar icon there you can hover over it in Menu mode, and a tooltip will tell you what it is

  • Almost everything is literally one click away

Key Takeaways: 

  • Predict what the user wants to know, and give them that information. 

  • Information must be easy to find 

  • Your UI should be easy to use and navigate

  • Use established patterns where you can. Everyone knows that Ctrl-Click adds items to selection, so don’t make it swap items instead

  • Make the user’s location in the menu system obvious, and make it obvious where the user can go and what they can do from there

  • Minimize load times and avoid animations in your menus

  • Eliminate or simplify repetitive tasks