Transforming Everyday Life With Phone Apps


A decade ago, Professor of Computer Science Andrew Campbell could see that smartphone sensing applications were going to transform everyday life. Today, much of that transformation has taken place, and there is more to come, Campbell says.


“There is definitely the momentum toward devices that we wear and use that can track our physical state and I think in the future we’ll be able to very accurately track our psychological state as well,” says Professor of Computer Science Andrew Campbell. (Photo by Eli Burakian ’00)

“My group was really at the forefront of creating the field of smartphone sensing,” says Campbell. “We were the first to implement a smartphone sensing app called CenceMe in 2007 and released it on the app store in 2008. It was one of a few hundred apps on the app store when it opened for business in 2008. Today there are 1.4 million apps on the app store.”

Read more:

How Are You Feeling—and Doing? Ask Your Smart Phone

Harnessing the Power of Smartphones to Prevent Psychosis

Dartmouth Smartphone App Targets Driver Safety

Students and Apps

App development may once have been the province of serious computer geeks, but these days, the requisite programming skills are pervasive.

These days, far more students are developing their own applications, says Campbell. “Five years ago it was just my group, and now there is the DEN, the DALI Lab, and there are Tuck students and Thayer students. Many of them have learned enough programming to put something rudimentary together. So it’s really spreading like wildfire across campus.”

An experimental computer scientist, Campbell says he is “focused on turning the everyday smartphone into a cognitive phone by pushing intelligence to the phone and cloud to make inferences about people’s behavior, surroundings, and their life patterns; using smartphones to sense, inform and nudge people in a better direction in terms of their physical and mental health.”

Campbell’s priority in application development is student health, particularly mental health.

“There has been much discussion about the apparent rise of anxiety and stress among college students,” he says. “These phones have been used effectively to measure physical activity, walking, running, exercising. But if we can use them to monitor and understand the stresses in people’s lives, we can help put people in a better place—to be reflective and maybe make decisions that can help them deal with their anxiety.”

Applicable Apps

Campbell taught a mobile health seminar during winter semester, with about 50 students in the class—half undergraduates and half graduate students. “We had great local speakers—all sorts of people who had different viewpoints of health.” These included computer scientists, engineers, clinicians, psychologists, and mental health workers.

The class generated more than a dozen apps, including a therapy app for those with Parkinson’s disease and an app that uses a microphone on the phone to accurately detect an asthma sufferer’s coughing and wheezing.

SmartGPA is an outgrowth of the StudentLife project—a study using passive and automatic sensing data from the phones of a class of 48 students over a 10-week term. The app, which has generated significant media attention, can predict college students’ grade point average based on their studying, sleeping, exercising, and other behavior as recorded on smartphone sensors.
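The core idea behind such a predictor can be sketched as a regression from passively sensed behavior to GPA. The sketch below is purely illustrative: the features (sleep, study, and activity hours), the toy data, and the plain least-squares model are assumptions for demonstration, not the StudentLife team’s actual features or modeling pipeline.

```python
import numpy as np

# Hypothetical behavioral features per student, as a phone might passively
# sense them: [avg nightly sleep hours, daily study hours, daily activity hours].
X = np.array([
    [7.5, 4.0, 1.0],
    [6.0, 2.5, 0.5],
    [8.0, 5.0, 1.5],
    [5.5, 1.5, 0.3],
    [7.0, 3.5, 1.2],
])
y = np.array([3.6, 3.0, 3.8, 2.6, 3.4])  # made-up GPAs for illustration only

# Append an intercept column and fit ordinary least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_gpa(sleep, study, activity):
    """Predict a GPA from sensed behavior using the fitted linear model."""
    return float(np.dot([sleep, study, activity, 1.0], coef))
```

A real system would replace the toy inputs with continuous sensor streams and a far richer model, but the shape of the problem—behavioral features in, academic outcome out—is the same.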

Into the Future

Campbell imagines a future where a phone’s functions will continue to expand as more sophisticated apps are developed. To some degree, the phone is already a mobile laboratory, a mobile doctor, and a mobile psychiatrist. “I think that’s where everything is going,” says Campbell.

He is working with the Geisel School of Medicine’s Dror Ben-Zeev and Cornell’s Tanzeem Choudhury, using the mobile phone to help people who are afflicted with schizophrenia.

“The whole idea there is to use the device to try to predict relapse,” says Campbell. “If we can sense that their health is failing, then we can route them to the hospital and people can intervene.”

Campbell and Choudhury developed the sensing and machine learning platform. “Ben-Zeev is like the intellectual powerhouse behind the creation of the project,” says Campbell.

“There is definitely the momentum toward devices that we wear and use that can track our physical state and I think in the future we’ll be able to very accurately track our psychological state as well,” Campbell says.

He predicts that 10 years from now there won’t be a thing called a smartphone. There will be something else. “Wearables are definitely happening now,” he observes. “The smartphone revolution was 2006 to 2015, now 2015 to 2020 is the wearables revolution. We are just seeing consumer wearable devices popping up. Google Glass failed, but not for technological reasons—perhaps because there were no really great applications driving it.”

While acknowledging that these predictions are not his alone, he is confident they will hold. “With technology disappearing into things we wear, we won’t have to be taking out the phone and tapping something or pulling out the computer and typing,” Campbell says. “The Apple Watch is a step in that direction.”

Joseph Blumberg