
2040
LA Micro Future
Arya Kani, Dongrun He, Jiayu Liu, Kiran Babu, Saman Ghasemloo

Our 2040 micro-mobility vision explores UX mobility solutions that can diversify how people move in the future, in contrast to today's dependence on cars. It examines how micro-mobility can integrate better with public transit and ride-share vehicles in 2040 to create a mobility system that is connected and accessible to all.
The US bike-sharing industry has grown robustly, with over 35 million trips taken in 2017. As trends shift toward sustainable modes of transportation, many leading companies are also focusing on e-bikes. In the US alone, the micromobility market is predicted to be worth $200B-$300B by 2030, per McKinsey.
The major barriers to micro-mobility adoption have been lack of city infrastructure, low profitability, almost no regulation of where devices can travel, and the number of micro-mobility devices that are never disposed of or recycled. Adoption is also lower in regions with harsh weather and limited access to the devices.

Key Future Technologies
AI
High-speed everything (internet, telecom, etc.)
Biotech
Genetically Modified Organisms (GMOs)
Immersive technology (AR/VR)
Tech industry shifting from Silicon Valley to China as a global superpower
Environment + Climate
The world faces the consequences of climate change, resulting in increased migration and stress on West Coast cities.
Increased migration to developed countries for job opportunities and better life.
Large wildfires will continue, recurring in warm and dry conditions.
Special greenhouse techniques will be used to provide stable climates.
Risks to marine life and life on land due to global warming.
Jobs + Employment
UX designer roles
Robot sherpas
Health data analysts/consultants
Mixed reality builders
Elderly Gen-X

Living + Housing
Majority of the U.S. population (87%) will live in urban areas.
The nation’s cities will likely continue to accumulate all the power, technology and wealth, while rural areas fall behind.
There will be a shift of the tech industry from the West Coast to the central US, driven by the rise of remote working and lower taxes used to attract people.
Fashion Trends
AR/VR will become a significant part of fashion.
Sustainability will redefine the future of fashion goods.
AR/MR will be integrated with wearables.
Interfaces will appear on clothing/wearables.
Subscription based programs will emerge in Fashion.
Food & Agriculture
The food delivery industry will continue to expand.
Food delivery firms will no longer depend on third-party delivery services as they explore novel delivery methods to reduce delivery time and cost, such as distribution by robots, drones, and even parachutes.
New technologies will bring new forms of food production and increase efficiency in the food chain (incl. urban/vertical farming, GMO, cultured meat, 3D printing).

Cultural Trends
Hyper-individuality and localization trends will counteract each other. In the future, showing off will be about micro-mobility and health.
Society has about another decade to change course and avoid collapse by investing in sustainable technologies and equitable human development. Increased climate-change awareness will make people more responsible about how resources are used.
Transportation + Mobility
Private ownership of micro-mobility will grow.
Autonomous vehicles will appear by 2045 at the earliest and will not be widely affordable.
Electric vehicles will by then be the majority around the world.
Micro-mobility and other forms of individual mobility will be autonomous and controlled by AI.
Infrastructure will also be controlled by AI, resulting in fewer signs, traffic lights, etc.






Matt P.
26 years old
College student
Location: Downtown LA
Context
Matt is a graduate student living in downtown LA. He spends most of his time at school and prefers to spend his free time with friends.
Behavior
Loves to express himself, wants to save money wherever possible, wants to explore Los Angeles to the fullest and learn from its cultural diversity.
Pain-points
Has difficulty finding micromobility in dense, busy parts of town, and cannot easily take his micromobility device on public transit or ride-shares.
Aspirations
Quick, reliable, cost-efficient commutes
“Being a student on a tight budget and schedule, I always need something cheap, accessible, yet dependable for exploring and commuting”



Low fidelity interfaces
Get an automatic schedule and a personalized trip plan. Your personal assistant will help you handle messages and extract possible schedules from your commitments.
Your journey will be highly personalized: beyond just arriving somewhere, you can choose different modes for entertainment, exercise, etc.
Get easy access to multi-modal transportation via IoT and NFC.

High fidelity interfaces

Get rid of the mobile phone
You can access information without carrying extra devices such as a mobile phone. With a multi-functional wristband, virtual glasses, and earphones, users can interact both visually and physically. Stop staring at your mobile phone to get information: feel it!

3D maps in AR glasses
Mixed-reality experience
Limited but important information comes from physical interfaces, while supporting information and gamified user experiences come from virtual interfaces. The mixed-reality experience is designed around a hierarchy of information.
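The information hierarchy described above could be sketched as a simple routing rule; the type names and the handlebar-screen/AR-layer split below are illustrative assumptions, not part of the project spec:

```typescript
// Hypothetical sketch of the mixed-reality information hierarchy:
// essential items are routed to the physical interface (e.g. a handlebar
// screen), while supporting and gamified items go to the virtual AR layer.
type Priority = "essential" | "supporting" | "gamified";
type Surface = "physical" | "virtual";

interface InfoItem {
  label: string;
  priority: Priority;
}

function routeToSurface(item: InfoItem): Surface {
  // Only limited-but-important info appears on the physical screen.
  return item.priority === "essential" ? "physical" : "virtual";
}
```

Keeping the physical surface limited to essentials is what gives the interface its "clean mode" feel, while everything else stays in the AR layer.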

Connect with your mobility devices
IoT and future portable ID
Your portable devices can also serve as your personal identification, helping you unlock shared facilities and access multiple services. It can be anything: your watch, your ring, a necklace, or even a tattoo.
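The unlock check behind this portable-ID idea can be sketched as follows; the profile shape and device-ID matching are assumptions for illustration only:

```typescript
// Hypothetical sketch: any registered wearable (watch, ring, tattoo chip)
// presents a device ID, which the sharing service checks against the
// user's registered identifiers before unlocking a shared facility.
interface UserProfile {
  userId: string;
  registeredDeviceIds: string[];
}

function canUnlock(profile: UserProfile, presentedDeviceId: string): boolean {
  return profile.registeredDeviceIds.includes(presentedDeviceId);
}
```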


Micro-Functional hub
A mini-center that collects, distributes, sanitizes, and charges micro-mobility devices.
There is an interactive screen on the surface of the hub, and users only have to go through three easy steps to rent a device.

01
Matt goes directly to the hub to get a scooter

02
He taps on the screen

03
The gate slides up so that he can enter

04
XR holographic interface on the scooter

05
Nano tech allows him to change the color of scooter

06
AR+MR info on the screen

07
Joystick handle
+
Thumb control
Key Scenes

01
Micro-functional hub by the pavement

02
Starting page of Hub Interface.

03
This page requires the user to register their ID by scanning biometrics or tapping their wearables.

04
After a successful sign-in, the system asks whether the user would like to sanitize the device again; they can either sanitize or skip this step.

05
After completing the previous two steps, the user can open the gate and enjoy the ride.

06
Micro-functional hub opens up
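The hub's rental flow above (sign in, optionally sanitize, gate opens) can be sketched as a small state machine; the state and event names are illustrative assumptions:

```typescript
// Hypothetical sketch of the hub's three-step rental flow:
// sign in -> sanitize prompt (sanitize or skip) -> gate open.
type HubState = "signIn" | "sanitizePrompt" | "gateOpen";
type HubEvent = "idVerified" | "sanitize" | "skip";

function nextState(state: HubState, event: HubEvent): HubState {
  if (state === "signIn" && event === "idVerified") return "sanitizePrompt";
  if (state === "sanitizePrompt" && (event === "sanitize" || event === "skip"))
    return "gateOpen";
  return state; // ignore events that do not apply to the current state
}
```

Modeling the flow this way makes the optional sanitize step explicit: both "sanitize" and "skip" lead to the open gate, but nothing bypasses sign-in.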
Figma Prototype
Key Scenes

01 Welcome Page

02 Color Change
03 XR Interface Illustration

Exploration Mode Shown Above

04 Clean Mode
Clean mode displays only the essential information, giving the user a cleaner interface.

05 Exploration Mode
The user taps or uses the thumb control to click the Navi icon and enter the page, then uses voice control to enter the address.

06 Navigation Page
The navigation page displays the AR route, and the dynamic info area switches to navigation info including travel time, remaining distance, and ETA.

07 Turn on the Carbon Points Game Mode

08 Game Mode - Notice Page
The user has to read the notice first and learn the rules for riding in game mode.

09 Game Mode Page
Carbon points game mode allows the user to collect points by riding the vehicle manually (without using electric power).
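The carbon-points rule can be sketched as below; the per-kilometer rate and the ride-segment shape are assumptions, since the brief only states that points come from manual (non-electric) riding:

```typescript
// Hypothetical sketch of the carbon-points rule: points accrue only
// for distance covered while the electric motor is off.
interface RideSegment {
  distanceMeters: number;
  motorOn: boolean;
}

const POINTS_PER_KM = 10; // assumed rate, not specified in the brief

function carbonPoints(segments: RideSegment[]): number {
  const manualMeters = segments
    .filter((s) => !s.motorOn)
    .reduce((sum, s) => sum + s.distanceMeters, 0);
  return Math.floor((manualMeters / 1000) * POINTS_PER_KM);
}
```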

Thanks for watching and interacting with it!














Vehicular Integration
By Arya Kani
To maximize the flexibility of the micromobility system across the LA Metro area, citizens will have the option to easily integrate their micromobility devices with autonomous ride-share services.
This interaction will be achieved in 3 overall steps:

Integration with AR glass for wayfinding and booking the vehicle.
Smart Curbside for flexible autonomous mobility stops.
In-vehicle infotainment system.

01
Using the AR glasses, the pickup location will be indicated.

02
The curbside screen will also indicate the journey information and exact pickup spot.

03
Passengers get on board, and the micromobility device also boards autonomously.

04
Micromobility will be automatically docked and charged.

05
Upon entering, passengers can review the travel journey on AR-powered windows.

06
The interface will indicate the order of drop-off/pickup, ETA, and more.

Integration with AR glass for wayfinding and booking the vehicle
AR glasses can also be used to navigate through town and, in particular, to help find the autonomous ride-share pickup zones. Meanwhile, there is always the option to change the reservation, pickup time, etc. through the same interface. The following pages show some options for the AR glasses interface.


01
Showing direction, time of pickup, and other essential information

02
Option to interact with virtual assistant at any point and integration with voice assistant.

03
Ability to spot the exact pickup location (featuring the Smart Curbside) even from a long distance

Smart Curbside for flexible autonomous mobility stops
The Smart Curbside allows passengers of autonomous mobility to easily get on/off the vehicle anywhere along the curb. By coordinating the vehicle's location with the passengers' location, the system locates the closest pickup zone and directs both toward it.


01
Early indication of an incoming ride: this not only helps the passengers find the location, but also lets everyone nearby know that boarding will happen shortly.

02
Essential information such as ETA, the direction from which the vehicle is approaching, and the passenger's username/avatar will be shown.

03
The ETA is coordinated with the arrow indicating the location: the less time remains, the closer the arrow moves toward the center.

04
As the vehicle approaches, its identity (alias), license number, and color will be indicated. Additionally, the Smart Curbside will show a welcome message.
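The ETA-to-arrow coordination in scene 03 could be sketched as a linear mapping; the 120-second lead time and pixel offsets are assumed values for illustration:

```typescript
// Hypothetical sketch: map remaining ETA (seconds) to the arrow's
// offset from the center of the curbside screen. At the full lead
// time the arrow sits at the screen edge; at ETA = 0 it reaches
// the center (offset 0).
const MAX_LEAD_SECONDS = 120; // assumed maximum lead time

function arrowOffset(etaSeconds: number, edgeOffsetPx: number): number {
  // Clamp ETA into [0, MAX_LEAD_SECONDS] so the arrow never overshoots.
  const eta = Math.min(Math.max(etaSeconds, 0), MAX_LEAD_SECONDS);
  return (eta / MAX_LEAD_SECONDS) * edgeOffsetPx;
}
```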

In-vehicle infotainment system
The in-vehicle infotainment system is designed to enhance the passenger's experience by showing their itinerary on the AR-powered windows. This journey map shows the order of drop-offs/pickups. The following scenes also show some other key possibilities of this interface.


01
Upon boarding, the order of stops will be shown next to the passenger names on the map. A universal voice assistant option sits at the bottom of the screen, ready to help at any time.

02
When approaching the next stop, the interface will indicate the passenger's name and exact ETA, as well as upcoming stops.

03
After dropping off a passenger, the itinerary will briefly be shown on the map again. The universal voice assistant is also available on request.

04
The voice assistant can help with many functions, such as showing essential information about the status of the micromobility device (charge, energy score used/earned, etc.).

A description of the AR technology: a micro projector inside the arm of the glasses projects the desired content onto the reflective surface, so that the user can understand the content relative to the environment.
Micro Delivery AR Experience
By Saman Ghasemloo
See Larger with Smart Watch and Smart Glasses.
Due to the limitations of small watch screens, we explored a smart way to view screen details at a larger size, for a better UX.
The eyeglasses act as a display for the future smartwatch: a small projector projects onto the glass in front of the retina.
The AI can also recognize gestures made in the air, opening up new UX possibilities.

In this scenario, a fast drone delivery is portrayed from the user's POV, and AR mapping technology shows the drone's path and landing area in real time.
Better Tracking
With AR and GPS mapping technologies, there is an opportunity to observe objects in action.
In this case, a parcel delivery for an emergency medical situation is done with a drone.
The receiver of the parcel can track the drone live by observing the drone’s path mapped on the real environment.

From Left to Right: Jiayu Liu, Dongrun He, Arya Kani, Saman Ghasemloo, Kiran Babu