- Invite your cat to wherever you are, virtually -
AR app for cat enthusiasts
Lead UX / UI, Product Ideation
Yuki Hirayama and Dr. Sorachai Kornkasem
Working people have significantly less time to spend with their pets.
flis enables pet owners to interact with their pets virtually, wherever they are, using AR technology. Users can 3D-scan their own cat or browse a cat library contributed by other users, then either virtually pet the cats or invite them into their current environment, an interaction designed around Interactive Affordances.
The app targets cat enthusiasts because cats move less than, say, dogs: the goal is engagement without vigorous activity. A user at their desk can glance over at their cat sleeping or roaming while they work, for instance.
Our exploratory findings told us that most pet products available on the market are aimed at kids and offer limited ways to interact with the virtual pets. Most rely on animation features and do not engage AR technology at all. The one or two AR pet products we did find only superimpose the pet onto the background environment.
Research AR technology
Our findings show that most available products provide limited interaction, such as overlaying an image onto the environment or transforming the background (e.g. Snapchat). PokemonGO is the exception: users can interact with objects, as seen against the background.
Interactive Affordances (Gesture Recognition)
In addition, we reviewed the range of possible interactions within AR, such as gestures. We focused on interaction because we believe AR allows users to be actively involved rather than purely visual observers, bringing them closer to the target (i.e. their pets) and increasing personal satisfaction. This idea is supported by human-computer interaction research grounded in the embodied cognition framework, in which personal experience is shaped and understood through all aspects of the body, including the motor and perceptual systems (how we move, react, and perceive). We surveyed various types of gestures related to AR. Among 60+ gestures (mostly related to communication), we narrowed down to two main categories that would foster our product:
1. Iconic Gestures
Big and readable gestures: pointing, symbols, or sign language.
2. Draw in the Air
Simple sketches: heart, geometry shapes, etc.
Fidget spinner toys come close to this idea: they are assumed to help people with ADHD focus, with some theoretical support but no scientifically validated data.
Fidgeting gestures can be activities such as:
touching your hair, biting your nails, playing with your clothing, or spinning a pen in your hand.
A Popular Science article (linked here) illustrates a few theoretical assumptions about what spinners do to the human brain: fidgeting (gestures) may occupy parts of your brain that would otherwise distract the rest of your brain with random thoughts, and body movements are actually part of the thinking and expression process.
Fidgeting also serves as a ritual, and rituals can offer the comforting predictability, familiarity, and structure that may be relatively absent in everyday life. After exploring the body of research on gestures and action, we concluded that TWO actions should be integrated into the AR pet application:
1. Draw in the Air as “Cat” (as in sign language)
2. Fidgeting as petting a cat under the chin (two or three fingers)
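The “Draw in the Air” interaction could be prototyped with a simple stroke-template matcher in the spirit of the $1 Unistroke Recognizer: resample the drawn stroke to a fixed number of points, normalize its position and size, and pick the closest stored shape. This is an illustrative sketch, not the app’s actual implementation; the template shapes and point counts here are assumptions.

```python
import math

def resample(points, n=32):
    """Resample a stroke to n roughly evenly spaced points along its path."""
    total = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    interval = total / (n - 1)
    out, pts, d, i = [points[0]], list(points), 0.0, 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if seg > 0 and d + seg >= interval:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # q becomes the new previous point
            d = 0.0
        else:
            d += seg
        i += 1
    while len(out) < n:  # floating-point rounding can drop the last point
        out.append(points[-1])
    return out

def normalize(points):
    """Translate the centroid to the origin and scale to a unit bounding box."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def distance(a, b):
    """Average point-to-point distance between two equal-length strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the template shape closest to the drawn stroke."""
    s = normalize(resample(stroke))
    return min(templates,
               key=lambda name: distance(s, normalize(resample(templates[name]))))
```

For example, a wobbly horizontal stroke would match a stored “line” template rather than a “vee” template; in the app, a matched “cat” template could then trigger the summon-the-cat interaction.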
IA and User Flow
Quantitative A/B testing / Prototype
The quantitative A/B testing was done in person with this prototype, accompanied by a PDF.
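Results from an A/B comparison like this can be checked for statistical significance with a standard two-proportion z-test. The sketch below is illustrative only; the counts in the usage example are hypothetical, not data from our study.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: does variant B's success rate differ from A's?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    # pooled rate under the null hypothesis that both variants are equal
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 30/50 participants preferred version A's task flow,
# 42/50 preferred version B's.
z, p = two_proportion_z(30, 50, 42, 50)
```

A p-value below the usual 0.05 threshold would suggest the preference difference between the two prototype versions is unlikely to be chance.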
- WIP - This page will be further updated -