Vacay

Information Architecture Design

[Vacay cover image]

Overview

For an Information Architecture course, my partner and I designed a mobile site that helps travelers plan their trips. The project focused on designing the site's structure and organization in order to create an effective navigation system.

 

Team

Samara Miu

Strategies

Personas, Freelisting, Card Sorting, Task-Based Testing (IA Testing and First-Click Testing), Sitemaps

Timeline

June-August 2020 (10 weeks)

Tools

Figma, Optimal Workshop

 

The Problem

While traveling, there are a number of important documents, events, and lists to keep track of, and they can be difficult to maintain when they are scattered across multiple channels or get lost among other emails. Our goal was to create a new site that solves this problem and assists travelers in planning upcoming trips.

The Solution

We created a mobile site that gives users a comprehensive view of all travel-related items for a trip. The site lets users easily import and organize this information, whether it is a hotel confirmation in their email or a list of restaurants to visit from their Yelp account. This helps users prepare for an upcoming trip by gathering all of their plans and research in one place, and it assists them during the trip since they can access critical documents more quickly.

Process

[Process diagram]

Personas

[Persona: Maria Ruiz]
[Persona: Leo Kaufman]

Freelisting

We recruited participants to come up with as many terms as they could that apply to the travel domain, which helped us gain familiarity with user vocabulary. We surveyed 9 participants and included screener questions to determine whether they travel and their level of expertise. Our freelisting process consisted of analyzing common terms, reviewing how frequently terms appeared, and categorizing them.

Participants came up with terms related to transportation, things to do, lodging, reservations, and packing.

Card Sort

After gaining a better understanding of the content in the domain, we conducted two iterations of card sorts to test our assumptions and to better understand how users categorize the terms.

What we learned:

  • There was overlap between the "things to do" and "in the area" categories

  • All categories made sense to participants with the exception of “My Trips”

[Card sort results]

Tree Test

We conducted a tree test in order to identify any issues within our navigation. We were particularly interested in the results of the tasks that required users to select “My Trips,” as we were unsure whether the name was clear. We created five tasks, used in both rounds, related to finding sporting events, reviewing a schedule, checking car rentals, uploading pictures, and uploading receipts.

Main Takeaway

The overall success rate increased from 88% in round one to 96% in round two. This was mostly due to removing the “My Trips” category, as the feedback we received indicated it was unclear what actions were associated with it. Creating “My Itinerary” and “My Pictures” as two new categories simplified the paths and allowed users to complete the tasks successfully and more quickly.

Site Map

Below is a high-level snapshot of where we landed for our final global header navigation, with six categories in total. The freelisting exercise and two rounds of tree testing helped our team arrive at this structure.

[Final global header navigation]

Full Site Map

Wireframes

We created wireframes that display the full path for completing two tasks. The wireframes were used to test first impressions of the site through first-click tests in Optimal Workshop, which helped us understand where users clicked first when given a task and how they navigated through the app.

[Wireframes]

First Click Test

Using the wireframes our team created, we conducted a first-click test to determine whether our design was intuitive and clear. We tested 10 participants on 4 tasks. Overall, there was a high success rate, and the results helped us determine exactly which design elements needed adjusting and retesting.

[First-click test results]

What we learned:

Need for clearer distinction between headlines and call-to-action buttons

  • The first two tasks showed there was some confusion between what was a headline and what was a clickable call-to-action button

  • This could be resolved by ensuring call-to-action buttons also have an icon beside the text to more clearly indicate that they are buttons

Participants tended to click on the icon

  • In task 4, it was interesting to see that the majority of participants clicked specifically on the icon

  • In terms of design revisions, we might want to apply arrow icons to all call-to-action buttons and potentially add icons to other clickable elements as needed

Reflection

This project was valuable in showing how vital it is to incorporate user input throughout the design process. Ultimately, evaluating and analyzing users’ successes and failures helped improve my UX research skills.

Key Takeaways

  • Devoting time to improving a site’s IA provides a competitive advantage.

    • Users will be able to perform their tasks more efficiently and with less frustration. 

  • Card sorting is vital when organizing information because it helps the research team uncover users’ mental models.

    • Our card sort results made it easy for us to organize our information so that users knew exactly where to look when navigating our site.

Changes to my approach

  • Diversify participant pool

    • In future studies, I aim to recruit a more diverse pool of participants in order to ensure the site fits the needs of a broader group of people.
