oneShot

oneShot is an experimental app that aims to encourage people to explore and embrace the process behind taking a photo by adding limitations inspired by analog cameras and removing the automatic presets of mobile cameras.

The Problem

oneShot started from a reflection on mobile photography and on how the growing number of automation features, together with the urge to capture everything, can sometimes matter more than the picture or the moment itself.

In that reflection, analog photography and its processes appeared as a way to recover the lost experience. oneShot is an attempt to combine analog characteristics with mobile cameras to reconnect people with the experience of taking a photograph.

My role in the project

oneShot is a personal project created by my developer husband and me, the designer wife.

We both acted as product owners, defining the concept, goals, and features of the app. I then worked on it as the UX and UI designer.

The Approach to the solution

We started with a contradiction when defining who our users are. On the one hand, we saw ourselves as the typical oneShot users; after all, we wanted to create this app to use it ourselves. On the other hand, we are very aware of the "You Are Not the User" mantra, which warns against projecting your own behaviors and reactions onto the users.

To resolve this, we abstracted the characteristics we thought most users would share with us and, from there, expanded to similar and photography-related profiles. This way, we acted as users to design the app while keeping other profiles in view and acknowledging a clear need to validate these assumptions.

Other profiles to explore:

  • Experienced photographers.
  • Amateur photographers.
  • Enthusiasts wanting to learn more about photography.
  • Photography teachers.
  • Visual artists.

We intended to validate our assumptions about the users by launching an MVP.

Concept, goals, and MVP definition

We started by exploring mobile photography and analyzing the current situation and context. What can we learn from the evolution of Instagram and photo-editing apps? What improvements do today's mobile cameras bring? How are users behaving in this context?

We noticed that the ability to take an unlimited number of shots, combined with the advanced features of editing apps like Snapseed, which even allow removing unwanted elements from a photo with a cloning tool, could make people unaware of the action they are performing and, as a result, stop experiencing the moment they are trying to capture.

Apps like Instagram and the improvements in mobile cameras add to this situation by promoting a single look that everyone tries to get in their photos.

Our goal is to give people the opportunity to experience this instant and embrace the process behind taking a photo.

Notebooks with ideas for the oneShot app

Ideation

We rounded off this phase with an ideation session to translate the concept into an experience that could overcome this situation.

Instead of offering limitless possibilities, we started by putting a timer on the camera, so it would only be possible to take one photo every 3 to 5 minutes. We think the pressure of having only one shot would enhance creativity.

We also wanted to remove all automatic camera presets and replace them with analog controls like shutter speed, and to require selecting a film and a lens type before taking the photo.
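To make these two mechanics concrete, here is a minimal sketch of how the pre-shot analog setup and the cooldown could be modeled, assuming an iOS build in Swift; every type name, the lens options, and the exact random interval are illustrative assumptions, not the app's actual implementation.

    import Foundation

    // Illustrative only: names and values are assumptions, not oneShot's real code.
    enum FilmStock { case colorNegative, blackAndWhite, slide }
    enum LensType { case wide, standard, portrait }

    struct CameraSetup {
        var film: FilmStock      // chosen before shooting, like loading a roll
        var lens: LensType       // chosen before shooting, like mounting a lens
        var shutterSpeed: Double // in seconds, e.g. 1.0 / 125.0
        var aperture: Double     // f-number, e.g. 5.6
    }

    final class ShutterCooldown {
        private var nextShotAllowedAt = Date.distantPast

        // True once the random 3-5 minute wait has elapsed.
        var canShoot: Bool { Date() >= nextShotAllowedAt }

        // Remaining wait, useful for a countdown shown in place of the camera.
        var remainingSeconds: TimeInterval { max(0, nextShotAllowedAt.timeIntervalSinceNow) }

        // Lock the shutter for a random interval between 3 and 5 minutes after each shot.
        func registerShot() {
            nextShotAllowedAt = Date().addingTimeInterval(.random(in: 180...300))
        }
    }

A countdown view could simply read remainingSeconds to drive the timer that takes the camera's place after each shot.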

We also knew we didn't want users thinking about likes and followers, so we decided to make photos anonymous and leave out comments. We want users to focus on the pictures without being influenced by any other parameter.

We realized that all of these features could be divided into two parts: the analog camera itself and the "instant" experimentation part.

Both aspects complement each other and are crucial for the experience we were trying to achieve. To define the MVP, we weighed each feature against the app's goals and voted on which ones were essential to reach them.

With this information, I could start working on the wireframes to define the user flow.

The Process

Wireframing

The goal of this stage was to validate the concept and features and define a general user flow, checking implementation feasibility with development.

I opted for hand-drawn wireframes because I find they are the quickest way to get things going without being too detail-oriented. It is common at this first stage to discover many things that weren't considered, so the faster we can materialize ideas, the better.

I started with the first screen every user sees when they open the app: the feed. I sketched two different layouts: a masonry-style gallery that showcases several photos before scrolling down, and a one-column irregular layout that shows full-size pictures directly in the feed. Tapping a photo flips it to show the details on the back.

From here, I started adding the options available to each user profile: log in and sign up buttons, the registration flow, and finally the most complex module: the camera.

Notebooks with ideas for the oneShot app

I listed every camera feature and grouped them by function, similarity, or moment of use. The aim was to present the features without overwhelming the user. I also took into account the full feature set as well as the MVP, making sure both user flows made sense.

I used an analog camera as a reference and separated the features that aren't easily changed, like the film reel or the camera lens. The experience starts with the user choosing these two, with an option to learn more about each parameter.

Once both are selected, the user accesses the camera itself, where they can see the controls to manipulate it. The traditional analog controls, like shutter speed, focus, lens, and aperture, are grouped together. At the top are the options to enable flash and sound.

At the bottom, I included a dock with all the shooting actions: the shutter button, the film reel, and the option to switch the shot mode between normal and multiple exposure.

Once the user takes a shot, the camera isn't available again for the next 3 to 5 minutes, so it is replaced with a timer, along with access to the feed, to material about photography concepts, and to the camera presets to play with until the camera becomes available again.

Notebooks with ideas for the oneShot app

First Feedback

With this user flow materialized, we discussed the ins and outs.

About the user registration and access:

  • The registration flow wasn't clear enough.
  • Who can do what?
  • Should we ask for a login to see photo details?

From this feedback, it was clear we needed to define user levels better.

About the feed, we evaluated both layouts and saw that the main difference was that the masonry layout gave us unlimited space to show the photo details. We also needed to confirm whether there was enough space in portrait mode to show the details behind the photo.

We were happy with the camera flow and the analogy with analog cameras, but we also had some new ideas to make it better:

  • Include learning tutorials about photography relevant to the different features in the flow.
  • Change some wording in the titles.
  • The main thing we realized about the camera analogy is that on many cameras the lens can be changed.
  • We saw it was necessary to add the camera info selected in previous steps to the dock.
  • We thought about using random values for the default camera settings as a way to get people playing with the presets.
  • Use an entire film reel per camera before switching.

We debated the idea that users wouldn't get to see the photo right after they take it; instead, they would have to wait until the reel is developed. We also thought about removing the option to play with the camera while they wait.
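As a rough illustration of the develop-the-reel idea, here is a small Swift sketch under the same caveats: the type names and the reel size are assumptions, not the app's real model. Photos surface only once the whole reel has been shot.

    import Foundation

    // Illustrative sketch only; names and reel size are assumptions.
    struct Exposure {
        let id = UUID()
        let takenAt: Date
    }

    struct FilmReel {
        let capacity: Int             // e.g. 24 or 36, as on an analog roll
        private(set) var exposures: [Exposure] = []

        var isFull: Bool { exposures.count >= capacity }

        // Record a shot without offering a preview.
        mutating func shoot() {
            guard !isFull else { return }
            exposures.append(Exposure(takenAt: Date()))
        }

        // Photos become visible only once the whole reel has been used ("developed"),
        // so there is no instant result right after pressing the shutter.
        var developedPhotos: [Exposure] { isFull ? exposures : [] }
    }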

I incorporated the feedback into the wireframes and moved to Figma to start prototyping.

Prototype

I moved directly to Figma because it lets me manage interactive prototyping, design, real-time collaboration, and developer hand-off within the same tool.

To begin, I clustered the feedback into:

  • Open concerns: I evaluated these, cut those that didn't apply, resolved the ones that were easy to settle, and included the rest in one of the other groups.
  • Functional: feedback about the order of the user flow and the features to include.
  • UI / Information: feedback regarding naming conventions and information to add at some stages.

Prototype

I took the relevant feedback and started turning the hand-drawn wireframes into a prototype.

At this stage, I also benchmarked other camera apps to see what we could build on and whether we were leaving something important out.

We wanted to make sure the MVP could expand and remain coherent with future features, so I included every feature with basic functionality, without going too deep into those that would not make the MVP.

The prototype's goal was to design the complete user flow, create the primary use cases, and test the ideas from the wireframes.

Feedback from Prototype

We discussed the complete flow and gathered relevant feedback to work on before moving on to designing the UI.


Prototype feedback

Visual Design

Once again, I clustered the feedback.

This time I grouped it into feedback about specific features and feedback about the general flow.

The goal was to start designing as soon as possible and apply the feedback directly in the visual design (the high-fidelity prototype). So I evaluated the modifications to the general flow first and am dealing with the feature-specific input as I move ahead with the design.

Right now, I am working on the visual design for the app while applying the feedback.
More updates coming soon. Stay tuned!