Experiential Design - Task 1: Trending Experiences

 23.09.2024 - 19.10.2024 / Week 1 - Week 4

Teo Mei Hui / 0358315

Experiential Design / Bachelor of Design (Honours) in Creative Media / Taylor's University

Task 1: Trending Experiences


TASKS

Week 1: Introduction to Experiential Design

Imagine a scenario in two of the three places (kitchen, shopping mall, gym). What would the AR experience be, and what extended visualization could be useful? What do you want the user to feel?

Shopping Mall
Users will be able to navigate their way around the mall easily. They can also see store ratings and offers before deciding to enter. Users will feel more satisfied and more efficient with their time and energy.


Gym
Panels of the user's fitness information will appear, so users can monitor and track their progress in the middle of a workout. Users may feel more motivated and in control of their bodies.



Week 2: Experience Design & Marker-Based AR Experience

Designing Experiences (Lecturer & Class Group Activity)

  • Experience Design is the practice of designing products, processes, services, events, omnichannel journeys, and environments with a focus placed on the quality of the user experience and culturally relevant solutions.
  • A user journey map is a visual representation that outlines the steps a user takes while interacting with a product, service, or experience.

During class, we were split into groups to create a user journey map for an experience. We decided to create one for the Genting SkyWorlds Theme Park experience, as most of us have been to Genting or other theme parks. We figured out all the stages of a journey through the theme park, then identified the gain points, pain points and solutions for each stage. 


We also created a future user journey map showing the gain points after implementing the suggested solutions. It was a great experience to do this as a group, as I could hear so many diverse perspectives and opinions from my groupmates.


Creating Marker-Based AR Experience using Unity & Vuforia

Unity - the platform where you create and manage the entire AR experience (including 3D models, user interface, and interactions)

Vuforia - a plugin that extends Unity’s capabilities, adding the image recognition and tracking needed for marker-based AR.

1. Set up Unity and Vuforia 
I installed Unity and the Vuforia package on my laptop. After setting up a 3D project in Unity, I imported the Vuforia package. Once that was done, I created an Image Target game object using the Vuforia Engine dropdown. I also deleted the Main Camera in the hierarchy panel and added an AR Camera. Then, I copied the app license key from Vuforia and pasted it into Unity. 





2. Create image target
Moving on, I created a database in the Target Manager. I uploaded an image to the database to act as the marker. After that, I was able to select the database and image target in Unity.





3. Attach an object to the marker
I created a 3D cube as a child object of the image target. 


Final Outcome

Using play mode in Unity, I tested out the first augmented reality experience I have ever created. I completed the entire process by following a YouTube tutorial. When I first opened the app, I felt quite lost and overwhelmed since it was my first time and there were so many unfamiliar features. However, by following the step-by-step instructions I was able to finish everything with ease. It is quite astonishing to see the final outcome, and I am excited for the work I will come up with in the days to come.




Week 3: User Controls and UI

Buttons that lead to a scene

1. Create Canvas and Button
In the hierarchy panel, go to UI, then create a Canvas (the panel for the background), then create a Button.


2. Create Script
In the Assets window, go to Create, then create a C# script. Two types of scripts were taught: one is gotoScene(), where you hard-code the scene that you want to load. The other is gotoCustomScene(string sceneName), which can go to any scene that you choose later on.
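The two approaches might look something like this (a minimal sketch using Unity's SceneManager API; the class and scene names here are placeholders, only the two method names follow the lesson):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneLoader : MonoBehaviour
{
    // Hard-coded version: always loads the same scene.
    public void gotoScene()
    {
        SceneManager.LoadScene("MainScene"); // "MainScene" is a placeholder name
    }

    // Flexible version: the scene name is typed into the
    // button's On Click () field in the inspector.
    public void gotoCustomScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```

Note that any scene loaded this way has to be added to the Build Settings scene list first.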



Drag the script onto the Canvas in the inspector panel.


In the button's inspector panel, drag the Canvas into the box under "On Click ()", then select the script and the function. 






Buttons for different animation

1. Create Canvas and Button
In the hierarchy panel, go to UI and create a Canvas, then create 3 buttons for the different animations.


2. Create Animation
Go to Window > Animation > Animation, then drag the panel to the bottom. Select the child object of the image target and create an animation. Press the red record button to begin recording the animation.


Create a folder for all 3 animations.


In the Animator panel (Window > Animation > Animator), right-click on the idle animation and set it as the layer default state. This makes it the default animation of the child object when the image target is detected.


In the inspector panel of each button, drag the cube into the box under "On Click ()". Choose Animator from the dropdown, then type the name of the animation you want for that button. 
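That inspector dropdown is effectively calling Animator.Play(string). An equivalent script version, if we wanted to wire the buttons through code instead, might look like this (a sketch; the class name and the animation state names are placeholders):

```csharp
using UnityEngine;

public class AnimationSwitcher : MonoBehaviour
{
    [SerializeField] private Animator animator; // the cube's Animator component

    // Hooked up to each button's On Click () event,
    // passing the name of the desired animation state.
    public void PlayAnimation(string stateName)
    {
        animator.Play(stateName); // e.g. "Idle", "Spin", "Bounce"
    }
}
```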


Final Outcome



Week 4: Markerless AR Experience

Markerless AR

1. Create Plane Finder and Ground Plane Stage
In the hierarchy panel, go to Vuforia Engine, then Ground Plane, to create a Plane Finder and a Ground Plane Stage. The Plane Finder helps detect flat surfaces in the physical environment using the device's camera. The Ground Plane Stage is an object used to define where virtual content should appear once a surface has been detected by the Plane Finder.


Then, drag the Ground Plane Stage into the Anchor Stage field in the Plane Finder's inspector panel. Tick Duplicate Stage so that the virtual content spawns each time the user taps.


Before experiencing it on a mobile device, we can first test it using the emulator ground plane image, which acts as a stand-in for the ground.




Final Outcome



Exporting AR Experience to Mobile Device

1. Set Player Settings
Before exporting the AR experience application to a mobile device, we have to set the Player Settings under Build Settings. 

First, ensure Auto Graphics API is turned off. Then, remove Vulkan from the Graphics APIs list.


Change the Minimum API Level according to your mobile device.


Change the Scripting Backend option from Mono to IL2CPP.


Lastly, change the Target Architectures from ARMv7 to ARM64.



2. Select and Run Device
Connect the mobile device to the laptop using a USB cable. On the mobile device, go to About Phone in Settings, then tap Build Number 7 times, which enables developer mode.


Then, go to developer options and enable USB debugging.


Look for the connected device. Once done, we can Build and Run, and the AR experience application will be installed on the mobile device.


Final Outcome





Video as Virtual Content

Create a Quad through Hierarchy > 3D Object. Then, in the inspector panel, add a component and search for Video Player. Drag the desired video into the Video Clip box.



In the image target's inspector panel, drag the Quad into the boxes under "On Target Found ()" and "On Target Lost ()". When the image target is detected, the video plays automatically; when it is lost, the video pauses.
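The same behaviour can also be scripted rather than wired in the inspector. A minimal sketch, assuming the Quad has a VideoPlayer component and that these two methods are hooked to the target's On Target Found / On Target Lost events (the class name is a placeholder):

```csharp
using UnityEngine;
using UnityEngine.Video;

public class VideoTargetHandler : MonoBehaviour
{
    [SerializeField] private VideoPlayer videoPlayer; // VideoPlayer on the Quad

    // Wired to the image target's "On Target Found ()" event.
    public void OnTargetFound()
    {
        videoPlayer.Play();
    }

    // Wired to the image target's "On Target Lost ()" event.
    public void OnTargetLost()
    {
        videoPlayer.Pause();
    }
}
```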


Final Outcome




AR Idea Proposal



REFLECTIONS

Through Task 1, I was able to learn the basics of AR, such as marker-based and markerless AR. Being able to create simple AR experiences of our own was really fun and fulfilling. Although it felt a little overwhelming to learn how to use Unity, I was fortunate to have a great lecturer who was very helpful and patient with any difficulties we faced, so I was quickly able to familiarize myself with Unity's basic tools and features. On the other hand, I also did individual research on how AR technologies can significantly improve the quality of our lives and help multiple industries improve their professional processes and experiences. It is really inspiring, and I hope to be able to create a meaningful AR experience of my own too.

