From pocket to passenger seat: Smartphones can help identify who’s driving
New ID system could lead to method for warning drivers when it’s unsafe to drive
Smartphones have already changed how we connect and communicate with others, how we pay for purchases and even how we eat and sleep. Now, research from Binghamton University’s Thomas J. Watson School of Engineering and Applied Science could end up changing how we drive.
Researchers have developed a system that can determine a driver’s identity from his or her driving behavior, using the sensors in the driver’s phone.
Smartphones contain accelerometers, which measure motion and acceleration; gyroscopes, which measure rotation and establish the direction the phone is facing; and GPS units, which allow phones to communicate with satellites to determine location and speed. These sensors can provide information about the driver, such as speed, acceleration, braking and turning behavior.
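The article does not describe how these sensor streams are turned into driving features; a minimal sketch of the general idea, with hypothetical sample data and feature names, might summarize one turn like this:

```python
import math

def turn_features(accel, gyro, dt=0.1):
    """Summarize one turn from smartphone motion-sensor samples.

    accel: list of (ax, ay, az) accelerometer readings in m/s^2,
           with ax along the direction of travel and ay lateral
    gyro:  list of yaw-rate readings in rad/s
    dt:    sampling interval in seconds
    """
    # Peak longitudinal deceleration approximates braking intensity.
    max_braking = max(-ax for ax, _, _ in accel)
    # Peak lateral acceleration reflects how sharply the turn is taken.
    max_lateral = max(abs(ay) for _, ay, _ in accel)
    # Integrating the yaw rate over time gives the total turn angle.
    turn_angle = sum(gyro) * dt
    return {
        "max_braking": max_braking,
        "max_lateral": max_lateral,
        "turn_angle_deg": math.degrees(turn_angle),
    }
```

A real system would also need to align the phone’s axes with the car’s and filter out sensor noise; this sketch assumes already-aligned, clean samples.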
At the outset of the study, Assistant Professor Yan Wang and PhD student Fatemeh Tahmasbi from the Watson School’s Computer Science Department collaborated with professors Yingying Chen and Marco Gruteser from Rutgers University to determine whether they could identify who is driving based on behavior captured by motion sensors over a short period, such as a single turn. They found that the sensors embedded in smart devices, such as smartphones, can capture drivers’ behaviors on the road, which contain the drivers’ unique behavioral patterns. Such driving behaviors could potentially offer clues as to who is driving or help law enforcement determine whether the driver is driving dangerously.
Existing systems can already identify driving behaviors by exploiting different sensing modalities on smartphones, such as cameras, motion sensors and microphones. While these systems can distinguish driving behaviors, they can’t differentiate drivers based on those behaviors. Recent work shows that smartwatches and smartphones could be used to identify drivers from their driving behaviors, but those approaches require a large amount of training data from each driver, which isn’t convenient for users.
“To differentiate drivers based on their driving behaviors within just a turn, we need fine-grained information from sensors, which is very challenging,” Wang said.
“If I want to get real data from real drivers’ experiences, I don’t know what’s going on there, like what’s the traffic? Are people driving slow or fast? Is the weather good?” Tahmasbi added.
Building on that initial work, the Binghamton researchers collected data from four volunteer drivers in a residential area with less traffic, fewer traffic signs and fewer things to interfere with typical driving habits. This allowed them to collect simple data and explore which driving behaviors could be quantified and differentiated. When trained on 90 percent of the dataset, their classifier identified the correct driver 92 percent of the time from only one turn. When they shrank the training set to 50 percent, the classifier still identified the driver 83 percent of the time.
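The article does not specify which classifier the team used; the train/test procedure it describes can be illustrated with a hypothetical nearest-centroid classifier on made-up per-turn features (peak braking, turn angle), with synthetic data standing in for the volunteers’ turns:

```python
import random

def nearest_centroid_fit(samples):
    """samples: list of (feature_vector, driver_id). Returns per-driver centroids."""
    sums, counts = {}, {}
    for feats, driver in samples:
        acc = sums.setdefault(driver, [0.0] * len(feats))
        for i, v in enumerate(feats):
            acc[i] += v
        counts[driver] = counts.get(driver, 0) + 1
    return {d: [v / counts[d] for v in acc] for d, acc in sums.items()}

def predict(centroids, feats):
    """Assign the turn to the driver whose centroid is closest."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(feats, c))
    return min(centroids, key=lambda d: dist(centroids[d]))

# Synthetic turns for two hypothetical drivers: (mean braking, mean turn angle).
random.seed(0)
profiles = {"A": (2.0, 30.0), "B": (4.0, 55.0)}
data = [([random.gauss(b, 0.3), random.gauss(t, 3.0)], d)
        for d, (b, t) in profiles.items() for _ in range(50)]
random.shuffle(data)

# Train on 90 percent of the turns, test on the rest.
split = int(0.9 * len(data))
centroids = nearest_centroid_fit(data[:split])
accuracy = sum(predict(centroids, f) == d for f, d in data[split:]) / len(data[split:])
```

Because the synthetic drivers here are well separated, this toy classifier scores near-perfectly; the study’s 92 percent figure reflects the much messier real-world case.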
The team presented their findings, “Poster: Your Phone Tells Us the Truth: Driver Identification Using Smartphone on One Turn,” last fall at MobiCom 2018 in New Delhi, India.
Now, the research team intends to work with Binghamton University’s Transportation and Parking Services and Off Campus College (OCC) bus drivers to take this project to the next level. However, there will be some complications: They will have no control over the external environment, such as traffic, pedestrians and weather; they won’t be able to control the interior environment, such as rowdy passengers and other driver distractions; and there are behavioral differences between driving a bus and driving a car.
Despite the obstacles, the research holds great promise. A person’s driving can contain unique characteristics, such as how they turn or brake at certain times, and can possibly indicate whether a driver is tired or potentially unsafe. Not only do Wang and Tahmasbi hope to identify drivers based on these traits, but they also hope to do so with a minimum amount of data.
“We want to avoid accidents, so we want to identify this tired mode or any abnormal status when you’re driving,” Wang said. “But, the identity part is the most challenging and amazing part that we feel excited about.”
Long term, the work could make driving safer.
“If you can recognize different driving behaviors this could also extend to training driving behavior for new drivers,” Wang said. “A device could tell you that you are driving well or can give you suggestions [such as], ‘After a turn you should take it slower or brake earlier.’”