
6.  VISUAL SYSTEM

Exploring concepts is part of my visual design process. In this phase, I explored three distinct visual concepts before going on to style and design the pages of the app. This phase developed over the course of about three weeks and involved ironing out the mood and tone of the app and iterating artboards in XD.

6.1. MOODBOARDS

The mood boards were used to guide the mood and tone of the visual system. These boards also helped me gauge what would and would not translate well into app components.

After exploring the mood and tone, I searched the world of apps to find examples of components that aligned with the concepts and communicated AR. Translating the concepts from the mood boards above into components that communicated AR was a challenge! Because AR might be relatively unfamiliar to people, I analyzed the example components I found to determine whether they could live cohesively in one design, or whether they could be adapted to communicate AR without being too sci-fi.

Moodboards

6.2. MORE DISCOVERY

The Prototyping Begins! (and there's a problem)

I moved forward with the "Surreal Mystery Tech" direction.

 

As I began applying the color, type, and styles to buttons, it soon became clear that something was lacking in the design. For such a visual product, the app itself did not have much to look at. There were lists, maps, and icons, but not really any images or videos, because that content would live in "the real world" (so to speak) and would not be visible in the app. Although clear and quite functional, the interface was a little boring, and although the tone and colors from the mood board were there, it was not really communicating the AR features of the app.


Next, I experimented with adding some icons I had sketched out, wondering if this would help communicate the visual aspect of the AR post. 


Eventually, I landed on using our emoticon language as a system for indicating the content of an AR post. Introducing this language would give users more ability to communicate with their network about the content of posts, and help them make more informed decisions about which media to visit.

Yet a challenge remained: how to communicate the AR nature of this product. On the far-left artboard below, I experimented with superimposing the list feed over an open camera view; that is, the viewer's camera takes in the surrounding environment and creates the background for the page.

The Visual Solution

The updated information architecture looked like this: the list of posts that was previously the home feed is integrated into the map section of the app, and the home feed is now a street view with AR indicators of posts.

Home Feed: No Longer a List, Now Integrating AR

Recalling the Double Diamond model, this design change is a good example of a non-linear process. Looking back, I wonder whether I could have produced this concept in a paper prototype. As much as paper prototyping helped me discover some of the user features and UI patterns, this page's UI solution came to fruition while developing the application's visual system.


In the end, I found a favorable solution that required revisiting the information architecture and redesigning the home-feed and map sections. The outcome of this redesign was a home feed that would play out in AR.


The app components include the logo, buttons, text, color palettes, forms, icons, and maps. These components were designed to work as a cohesive system while remaining functional.

Finalizing components