AudiResponse


Abstract/Contents

Abstract

In 2025, we expect an “Internet of Things” to exist, in which shoes might talk to gym bags and smart watches push status updates. A decade from now, fully automated cars will exist, but less automated driver-assistance technologies will be far more common. Despite the availability of automated driving, responsibility will still rest with the driver.

Wearable technologies containing a myriad of biometric sensors are just being introduced to the market today, and will likely be ubiquitous in several years. These sensors track a variety of information, from heart rate to activity level. After exploring many different future needs and technologies, we realized that these sensors could be used to monitor health and safety in this future world.

New technological features in automobiles are plentiful, but these features often come at the cost of safety by creating distractions. According to the World Health Organization, an estimated 1.2 million people are killed in road crashes each year and as many as 50 million are injured. These figures will likely increase by about 65% over the next 20 years unless there is a new commitment to prevention. As the number of vehicles on the road increases and drivers become less experienced due to the adoption of automated or semi-automated cars, we need to ensure that advances in accident response and prevention keep pace with the other technologies newly available in the car.

As it stands today, Emergency Medical Services (EMS) is notified of auto accidents when a bystander calls 911 and speaks with a dispatcher. The dispatcher sends EMS an “accident code” which relays the basics of the accident without direct communication. The information EMS receives typically lacks an exact GPS location, the number of people involved, and the severity of their injuries. This leads to wasted time searching for the crash and a failure to direct the right resources to the scene. Approximately 50% of the information currently received is incorrect.

During the last several months, we have refined our scenario: In 2025, you have just bought a new self-driving Audi. Although you have the latest safety features in your Audi, you cannot avoid all accidents. After a sudden encounter with a large animal, you are knocked unconscious. It is dark and there are no other cars on the highway. Your car then jumps into action to save your life: through its system of sensors, it gathers information about you and the vehicle, packages that information, and sends it to first responders.

Accident Response:
We wanted to build a system that could be completely independent of the vehicle itself, such that in the case of an accident we would not have to depend on the vehicle’s power source and electronics.
A GPS continuously tracks the car’s position, and records latitude and longitude measurements on an Arduino.
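As an illustrative sketch only (the report includes no code), the GPS logging step might parse standard NMEA GGA sentences into decimal latitude and longitude before recording them. The GGA field layout below is standard NMEA 0183; the function names and the example coordinates are our assumptions:

```python
def nmea_to_decimal(value: str, hemisphere: str) -> float:
    """Convert an NMEA ddmm.mmmm (or dddmm.mmmm) field to decimal degrees."""
    dot = value.index(".")
    degrees = float(value[: dot - 2])   # digits before the minutes
    minutes = float(value[dot - 2 :])   # mm.mmmm
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str):
    """Extract (lat, lon) from a $GPGGA sentence; None if there is no fix."""
    fields = sentence.split(",")
    if not fields[2] or not fields[4]:
        return None
    lat = nmea_to_decimal(fields[2], fields[3])
    lon = nmea_to_decimal(fields[4], fields[5])
    return lat, lon

# Illustrative GGA sentence with made-up coordinates:
print(parse_gga("$GPGGA,123519,3725.7500,N,12210.2500,W,1,08,0.9,30.0,M,,M,,*47"))
```

An Arduino sketch would do the same conversion before logging each fix alongside a timestamp.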
Each passenger wears a Basis watch, which collects data about each passenger’s heart rate and movement, as well as the current air temperature. The Basis watch syncs once per minute via Bluetooth, triggered by a square wave produced by the TinyLily (a small Arduino board).
We outfitted an Audi with four seat belt covers containing accelerometers, in order to measure the accelerations experienced by all passengers. Each accelerometer is attached to an Arduino Mini Pro, which can wirelessly send information to our computer.
The computer, an Intel NUC, uses Processing to compile all the data from the GPS, accelerometers, and Basis Bands. Prior to an accident, the user interface for the driver displays relevant data about the users in the car on an iPad mini on the vehicle’s dashboard. With this platform, further applications can be developed to track health and prevent accidents caused by stress and other medical incidents.
An accident is triggered when the seat belt accelerometers log above-threshold values. At this point, the latest data is sent to EMS and then continuously updated until responders arrive on the scene. The GUI for EMS displays the car’s location on a map and the current injury status of each passenger. This interface also includes a webcam for two-way communication with the passengers, so that EMS can gather further information about the people in the car and reassure them that help is on the way.
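The trigger logic can be sketched as a simple threshold check on the acceleration magnitude of each seat-belt sample. This is a minimal sketch: the 8 g threshold and the function names are illustrative assumptions, not values from the project:

```python
import math

# Illustrative assumption; the project's actual threshold is not stated.
CRASH_THRESHOLD_G = 8.0

def acceleration_magnitude(ax: float, ay: float, az: float) -> float:
    """Magnitude of a 3-axis accelerometer sample, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def is_crash(samples) -> bool:
    """Flag an accident when any seat-belt sample exceeds the threshold."""
    return any(acceleration_magnitude(*s) > CRASH_THRESHOLD_G for s in samples)

# Normal driving (~1 g of gravity plus small maneuvers) vs. a hard impact:
print(is_crash([(0.1, 0.2, 1.0)]))                     # False
print(is_crash([(0.1, 0.2, 1.0), (14.0, 3.0, 1.0)]))   # True
```

In practice the check would run on each Arduino Mini Pro as samples arrive, with only the flag and the latest readings forwarded to the NUC.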
The world of 2025 could bring some major changes in our lifestyles, but given current trends we expect people to want data collection and analysis on all aspects of their lives. This formidable bank of information could translate into more efficient emergency response as well as machine learning to better customize the car during everyday driving. This platform, which incorporates biometric sensors into the car, will streamline and integrate this data into the driving experience.

Accident Prevention:
Looking beyond accident response, we wanted to expand AudiResponse as a platform for further applications using the myriad of sensor information available in 2025. One example of a future application is accident prevention during a major health event: the wearable would detect a heart attack, a hypoglycemic event, or a seizure and alert the car, which would inform medical services and safely maneuver to the shoulder or to the hospital. Additional applications, such as autonomous-driving preferences or wearable-driven temperature control, could be implemented by adding more input sensors for new user needs.
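As a toy illustration only, such an alert might begin with a simple rule on heart-rate readings. The thresholds and names below are our assumptions; real detection of heart attacks, hypoglycemia, or seizures would require clinically validated models and more signals than heart rate alone:

```python
def heart_rate_alert(bpm: float, low: float = 40.0, high: float = 150.0) -> str:
    """Toy rule-of-thumb check on a single heart-rate reading, in bpm.
    Thresholds are illustrative assumptions, not medical guidance."""
    if bpm < low:
        return "low_heart_rate_alert"
    if bpm > high:
        return "high_heart_rate_alert"
    return "ok"

print(heart_rate_alert(72))    # "ok"
print(heart_rate_alert(180))   # "high_heart_rate_alert"
```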

Description

Type of resource text
Date created 2014

Creators/Contributors

Author Chau, Harrison
Author Deng, Michelle
Author Elovainio, Lari
Author Jin, Tiffany
Author Lu, Hu
Author Roine, Otso
Author Suchoski, Jacob
Sponsor The Volkswagen Electronics Research Laboratory

Subjects

Subject Product Design
Subject Mechanical Engineering
Subject Audi
Subject Volkswagen
Subject AudiResponse
Genre Student project report

Bibliographic information

Access conditions

Use and reproduction
User agrees that, where applicable, content will not be used to identify or to otherwise infringe the privacy or confidentiality rights of individuals. Content distributed via the Stanford Digital Repository may be subject to additional license and use restrictions applied by the depositor.
License
This work is licensed under a Creative Commons Attribution Non Commercial 3.0 Unported license (CC BY-NC).

Preferred citation

Preferred Citation
Chau, Harrison; Deng, Michelle; Elovainio, Lari; Jin, Tiffany; Lu, Hu; Roine, Otso; and Suchoski, Jacob. (2014). AudiResponse. Stanford Digital Repository. Available at: http://purl.stanford.edu/vr835kx1498

Collection

ME310 Project Based Engineering Design
