Queen Victoria Markets: Find what you’re looking for using Augmented Reality

Akanksha Hirokawa
4 min read · Feb 13, 2021

I’m new to Melbourne (from Brisbane) and have recently discovered the joys of the Queen Victoria Markets. The prices, the abundance of produce and the esoteric nature of some of the items leave me wanting to come back again and again.

I am quite time-poor, however, and I wake up late on weekends, which leaves me scrambling to get to the markets before they shut. Being new to the markets, I don’t know where some of their better-kept secrets are located, and I always run out of time before I can discover them.

I decided to come up with an augmented reality mobile app design that could help people find the items they are looking for in as little time as possible.

Goal of this experience: To help users find vendors who stock the specific items they are after.

Using Augmented Reality to search for items at QVM

Who is the audience?

Grocery shoppers in a hurry, who are only after specific items. Hannah’s persona echoes the woes of time-poor grocery shoppers.

How to look for items at Queen Vic markets

Users will have to download the app and select the items they need. They’ll then be asked to scan the QR code placed at the market’s entrance with their mobile phones.

Note: The different entrances will have different QR codes. I thought this was best, given that there are so many entrances.
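As a rough sketch of how the per-entrance QR codes could work (the entrance IDs, street names and coordinates below are invented for illustration), scanning a code would simply look its payload up in a table of known entrance positions and facing directions:

```python
# Sketch: resolving a scanned QR payload to a known entrance pose.
# All IDs and coordinates are hypothetical.

ENTRANCES = {
    "QVM-ENTRANCE-A": {"x": 0.0, "y": 0.0, "heading_deg": 90.0},     # e.g. Elizabeth St
    "QVM-ENTRANCE-B": {"x": 120.0, "y": 0.0, "heading_deg": 270.0},  # e.g. Queen St
    "QVM-ENTRANCE-C": {"x": 60.0, "y": 80.0, "heading_deg": 180.0},  # e.g. Victoria St
}

def locate_from_qr(payload: str):
    """Return the starting pose for the AR session, or None if the code is unknown."""
    return ENTRANCES.get(payload)

pose = locate_from_qr("QVM-ENTRANCE-B")
print(pose)  # starting position and facing direction for path mapping
```

The point of distinct codes per entrance is exactly this lookup: one scan pins down both where the user is and which way they’re roughly facing before any camera-based calibration happens.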

The designs in blue in the wireframes are the only added overlays that will be shown on the mobile screen. The rest of the markets will be exactly as seen through the phone camera.

Users will then be asked to tilt their phone up and scan the markets from left to right.

Once the phone has calibrated the user’s location, it will map the shortest path through all their grocery items. Users will be prompted to follow the path to their first item.
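Under the hood, this could be as simple as a greedy nearest-neighbour route over the stalls that stock the user’s items. A rough Python sketch, with made-up item names and stall coordinates (a production app would use proper indoor path planning around aisles and obstacles):

```python
import math

def nearest_neighbour_route(start, stalls):
    """Greedy multi-stop route: repeatedly visit the closest remaining stall.
    stalls: dict mapping item name -> (x, y) of the first vendor stocking it."""
    route, pos = [], start
    remaining = dict(stalls)
    while remaining:
        item = min(remaining, key=lambda k: math.dist(pos, remaining[k]))
        pos = remaining.pop(item)
        route.append(item)
    return route

# Hypothetical stall coordinates (metres from the entrance)
stalls = {"saffron": (40, 10), "burrata": (5, 20), "kimchi": (60, 35)}
print(nearest_neighbour_route((0, 0), stalls))  # ['burrata', 'saffron', 'kimchi']
```

Greedy nearest-neighbour isn’t optimal for many stops (it’s a travelling-salesman-style problem), but for a handful of grocery items it gives a sensible order almost instantly.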

The path will lead to the first vendor who stocks the item the user is after. To make the vendor’s location obvious, their stall (and wares, depending on the spatial mapping capability of the phone) will be highlighted, and there will be a sign floating above (and then in front of) the vendor.

Once there, the user will be asked to confirm they’ve located the vendor.

They’ll then be asked to confirm whether they have bought the item, or whether it was out of stock.

If they select ‘Out of stock’, the app will calculate a new path that still includes the item they have missed.
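That recalculation step could be sketched like this: fall back to the next vendor who stocks the missed item, or drop the item if no other vendor carries it (vendor positions below are invented):

```python
def replan(item, remaining, alternatives):
    """After 'item' was out of stock at its vendor, point it at the next
    vendor stocking it, or drop it if no other vendor does."""
    remaining = dict(remaining)            # don't mutate the caller's map
    if alternatives:
        remaining[item] = alternatives[0]  # next vendor's stall position
    else:
        remaining.pop(item, None)
    return remaining

# Hypothetical stall positions (metres from the entrance)
print(replan("saffron", {"saffron": (40, 10)}, [(75, 22)]))  # {'saffron': (75, 22)}
```

The updated item-to-stall map would then be fed back into the same route calculation as before, starting from wherever the user is standing.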

If they select ‘Bought it’, they’ll be directed to the next item. In the wireframe below, I’ve tried to depict the left arrow as slightly pulsating. If the user is wearing headphones, diegetic sound will draw their attention to the left (multimodal cues are the best way to direct users).
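For the headphone case, the left/right cue could be driven by a simple constant-power stereo pan based on the bearing to the next waypoint. A sketch (a real app would use the platform’s spatial audio APIs rather than hand-rolled gains):

```python
import math

def stereo_gains(bearing_deg):
    """Constant-power pan: bearing 0 = straight ahead, -90 = hard left,
    +90 = hard right. Returns (left_gain, right_gain), each in [0, 1]."""
    pan = max(-1.0, min(1.0, bearing_deg / 90.0))  # clamp to [-1, 1]
    angle = (pan + 1) * math.pi / 4                # map to [0, pi/2]
    return math.cos(angle), math.sin(angle)

left, right = stereo_gains(-90)  # next waypoint is to the user's left
print(round(left, 3), round(right, 3))  # 1.0 0.0
```

Constant-power (equal-power) panning keeps the perceived loudness steady as the cue sweeps from one ear to the other, which is why it’s the usual choice over a plain linear fade.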

Turns in the pathway will be clearly shown in the app, along with when users can expect them.

After the user has bought all their items, they’ll be prompted to select ‘Great!’ (which will close the app) or ‘Buy more’, which will take them back to the initial screen where they can select more items from the list.

If using this with a headset

Most of the interactions will remain the same. The only change I can think of is that tapping on the mobile screen will be replaced with air tapping or the specific selection method of that headset.

Reflection

I really wanted to prototype this app in order to test it out, but I couldn’t find a prototyping tool that handles AR well. I’d really appreciate it if anyone could suggest a decent AR prototyping tool.

It would have been cool to prototype this with some of the existing AR indoor navigation apps out there, like ViewAR, but unfortunately they only work on iPhones (I have an Android).

Any thoughts/feedback on the design would be greatly appreciated!
