
StockPylot
AR glasses to speed up workflow of delivery drivers and stocking workers
Overview
Elevate retail stocking efficiency with our cutting-edge AR glasses system! From clocking in to completing routes, our AR glasses provide seamless guidance. Receive route information, scan items, and confirm placements visually, all hands-free. This transformative technology improves accuracy and productivity throughout the stocking process, helping retail workers perform their tasks with ease and precision.


What was the problem?
How can we design an AR/XR and IoT solution to enhance awareness, communication, and information retrieval for employees?
For this project, we were tasked with utilizing AR/VR/XR to improve awareness, communication, and information retrieval in a chosen industry scenario.
We decided to design an AR/XR and IoT application and device to support employees of large businesses and warehouses that handle high volumes of product, such as Amazon warehouse order pickers and stockers, and in-store shoppers fulfilling online and pick-up orders.
Here is the problem statement that we reached:
THE PROBLEM
• Stores have different layouts (for workers that travel)
• Lack of information flow
• Lack of team coordination
POSSIBLE SOLUTIONS
• Augmented reality to guide employee efficiency
• IoT to improve data flow, increase team coordination, and reduce downtime
EXPECTED OUTCOMES
• Improved efficiency and accuracy of employee workflows
• Improved team communication and coordination
• Decreased downtime
• Improved customer service
Research
Following our problem statement, we analyzed similar existing products for inspiration and identified features we could improve upon. We then empathized with potential users by conducting user research.
Google Maps Live View
Google's website provides information and resources related to its AR and VR technologies.
The image shown is Google Maps' Live View, which uses a smartphone's camera to display virtual overlays, such as arrows and directional indicators, on top of the live camera view to help users navigate their surroundings.
DoorDash Shop and Deliver
DoorDash, a popular online food delivery platform, offers Dashers (its delivery workers) a feature called Shop and Deliver, a system for picking up grocery items and delivering them to customers.
Images show the process of adding an item to the cart
Uses aisle location to reduce downtime and improve efficiency
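The aisle-ordering idea above can be sketched in code. The snippet below is a minimal illustration, not DoorDash's actual algorithm: it assumes each item carries hypothetical `aisle` and `position` fields, groups the pick list by aisle, and walks alternate aisles in reverse to produce a serpentine, low-backtracking route.

```python
# Hypothetical sketch: ordering a pick list by aisle so a shopper
# walks the store in one pass instead of backtracking between aisles.

def order_pick_list(items):
    """Sort items by aisle, then by shelf position within each aisle.

    `items` is a list of dicts with illustrative keys:
    'name', 'aisle' (int), 'position' (int, distance into the aisle).
    Alternating the within-aisle direction yields a serpentine path.
    """
    by_aisle = {}
    for item in items:
        by_aisle.setdefault(item["aisle"], []).append(item)

    route = []
    for i, aisle in enumerate(sorted(by_aisle)):
        # Walk every other aisle in reverse to avoid doubling back.
        reverse = i % 2 == 1
        route.extend(sorted(by_aisle[aisle],
                            key=lambda it: it["position"],
                            reverse=reverse))
    return route

picks = [
    {"name": "milk",  "aisle": 12, "position": 3},
    {"name": "bread", "aisle": 2,  "position": 5},
    {"name": "eggs",  "aisle": 12, "position": 9},
    {"name": "rice",  "aisle": 2,  "position": 1},
]
print([p["name"] for p in order_pick_list(picks)])
# → ['rice', 'bread', 'eggs', 'milk']
```

Even this simple grouping removes most cross-store travel, which is the downtime reduction the feature is after.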
Storyboards
Online-order/Pick-up In-Store Scenario
Rather than a clunky handheld scanner that can be misplaced, AR glasses are used
The system is connected to all order-picking employees and uses AI to distribute work based on currently assigned orders, specific order details, and each employee's work rate
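The dispatch logic described above could take many forms; one simple version is a greedy assignment that routes each incoming order to the picker expected to finish it soonest. The sketch below is purely illustrative, and all field names (`queued_items`, `items_per_minute`) are assumptions standing in for the order details and work-rate data the real system would track.

```python
# Hypothetical sketch of the order-dispatch idea: each new order goes
# to the picker with the lowest estimated completion time, based on
# their current queue and historical work rate.

def assign_order(pickers, order_items):
    """Assign an order of `order_items` items; return the chosen picker's name.

    `pickers` is a list of dicts with illustrative keys:
    'name', 'queued_items' (items already assigned),
    'items_per_minute' (historical work rate).
    """
    def eta_minutes(p):
        # Minutes to clear the existing queue plus this new order.
        return (p["queued_items"] + order_items) / p["items_per_minute"]

    best = min(pickers, key=eta_minutes)
    best["queued_items"] += order_items  # record the new assignment
    return best["name"]

pickers = [
    {"name": "Ana", "queued_items": 10, "items_per_minute": 2.0},
    {"name": "Ben", "queued_items": 4,  "items_per_minute": 1.0},
]
# Ana: (10 + 6) / 2.0 = 8 min; Ben: (4 + 6) / 1.0 = 10 min
print(assign_order(pickers, 6))
# → Ana
```

A production system would add the factors the storyboard mentions, such as order priority and item locations, but the balancing principle is the same.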
Inter-store merchandiser/stocker scenario
Uses AR glasses to identify items on a pallet
Uses AR glasses to identify individual items and their locations within the store
User Personas
User Flow
User Journey Map
Initial Prototypes
Paper Prototype
To conceptualize our vision for the product, we decided to create a paper prototype of the user journey of the AR glasses.
Testing the Paper Prototype
I showed the prototype to potential users and asked for their thoughts. This helped us understand how users perceive and interact with the design and guided our improvements.
The general feedback we received was that users really liked the simplicity and functionality of the glasses, but felt there should be an option to change the opacity of the maps feature while driving, or to disable it altogether, to avoid distractions.
Physical Prototype
We were also tasked with creating a physical prototype of our product. Using our paper prototype and integrating user feedback, we used a pair of glasses and cardboard to simulate an augmented reality experience. The prototype's screens were tailored to fit the glasses, maintaining the user flow from our paper prototype.



Findings
Recommendations
Video Prototype
I created a video showcasing how a worker would use the product and what their overall user experience with it would be.
Final Prototypes
Final Physical Prototype
In response to user testing feedback, we improved the physical prototype by introducing a performance summary screen for users upon completing a store and a full-day route. Additionally, we designed a clip-on version for glasses to offer an alternative for users who need prescription glasses, potentially making it a more cost-effective option compared to a complete AR glasses system.



The screens above (8.1 and 8.2) are screens that were created based on the user feedback from the previous physical prototype. They show the user finishing their store, followed by viewing their performance statistics for that store (speed and accuracy).
This is similar to a process Kroger (pictured above) uses with their online order employees, which we found through our research.
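The per-store summary on screens 8.1 and 8.2 boils down to two numbers. The sketch below shows one plausible way to compute them; the metric definitions (items per minute for speed, correct placements over total for accuracy) are our assumptions, not a documented spec.

```python
# Hypothetical sketch of the per-store performance summary:
# speed = items stocked per minute, accuracy = correct placements / total.

def store_summary(items_stocked, minutes, correct_placements):
    """Return the two stats shown after a worker finishes a store."""
    speed = items_stocked / minutes
    accuracy = correct_placements / items_stocked
    return {
        "speed_per_min": round(speed, 1),
        "accuracy_pct": round(accuracy * 100, 1),
    }

print(store_summary(items_stocked=120, minutes=60, correct_placements=114))
# → {'speed_per_min': 2.0, 'accuracy_pct': 95.0}
```

A full-day route summary would simply aggregate these per-store figures.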
Interactive Prototype
This prototype uses the scenarios generated in our initial storyboards and research.



Testing the Interactive Prototype
The scenario:
Imagine you are an employee using these AR glasses for the first time. Your task is to complete a route at a store using the glasses to locate and stock items. Please complete the route as you would in a real-world scenario and let us know if you encounter any issues or have any feedback along the way.
Who participated?
32-year-old male "Adam"
HCI Master's Student
Recommendations
Reflection & Next Steps
What Worked?
Following design thinking methods gave us a system and an end goal, which made the process clearer and more systematic
Researching existing products provided inspiration for what to do, and what not to do
Having two team members with personal experience with grocery stocking helped us empathize with the research and development of the product.
What Didn't Work?
For such a small form factor, it was difficult to prototype some of the elements we wanted to convey. Presenting details in slideware would make elements like fine print much easier to read.
Because we worked through meetings on Zoom, it was difficult to keep the product consistent across team members developing different aspects of it.
What would we do for the next iteration?

