Multi-Sensor Big Data
To date, a methodology for tracking and identifying vertical movement within large-scale unstructured data sets has been lacking. Here, we design and develop such a framework to accurately and systematically identify the sparse human vertical displacement activity that is typically buried within predominantly horizontal mobility. Our framework combines sensor data from a barometer, an accelerometer, and a Wi-Fi scanner with an extraction step comprising feature engineering and data segmentation. This methodology is then integrated into a machine-learning-based classifier that automatically distinguishes vertical displacement activity from its horizontal counterpart, with 98% overall accuracy and a 92% F1-score. We illustrate the potential of this framework by applying it to an unstructured large-scale data set covering over 16,000 participants going about their daily activities in the city-state of Singapore. With the vertical movements of this large group uncovered, we analyze the specific features of this activity class through its statistical distribution. This new knowledge would have significant ramifications for the architectural design of vertical cities.
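To make the pipeline concrete, the sketch below illustrates the general idea of pressure-derived altitude features, fixed-window segmentation, and a supervised classifier. It is a minimal sketch under stated assumptions: the synthetic barometer and accelerometer traces, the 60-sample window, the feature set, and the random-forest model are illustrative choices, not the feature engineering, segmentation scheme, or classifier used in the study, and the Wi-Fi scanner input is omitted for brevity.

```python
# Minimal, illustrative sketch: barometric altitude features + windowed
# segmentation + a supervised classifier. All parameters and data here are
# hypothetical, not the study's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

def pressure_to_altitude(pressure_hpa, p0_hpa=1013.25):
    """Convert barometric pressure (hPa) to altitude (m) via the standard barometric formula."""
    return 44330.0 * (1.0 - (np.asarray(pressure_hpa) / p0_hpa) ** (1.0 / 5.255))

def window_features(pressure_hpa, accel_mag, window=60):
    """Segment the streams into fixed windows and extract simple per-window features."""
    alt = pressure_to_altitude(pressure_hpa)
    acc = np.asarray(accel_mag)
    feats = []
    for i in range(len(alt) // window):
        a = alt[i * window:(i + 1) * window]
        g = acc[i * window:(i + 1) * window]
        feats.append([
            a[-1] - a[0],        # net altitude change over the window
            np.ptp(a),           # altitude range (sensitive to floor transitions)
            np.std(np.diff(a)),  # variability of the vertical rate
            np.mean(g),          # mean acceleration magnitude (activity level)
            np.std(g),           # acceleration variability (walking vs. riding a lift)
        ])
    return np.array(feats)

# Toy example with synthetic one-hour traces sampled at 1 Hz.
rng = np.random.default_rng(0)
t = np.arange(3600)
# Horizontal walking: near-constant pressure, noisy accelerometer.
p_walk = 1008.0 + rng.normal(0, 0.02, t.size)
a_walk = 1.0 + 0.3 * np.abs(rng.normal(0, 1, t.size))
# Ascent (e.g., lift ride): steady pressure drop, smooth accelerometer.
p_lift = 1008.0 - 0.004 * t + rng.normal(0, 0.02, t.size)
a_lift = 1.0 + 0.05 * np.abs(rng.normal(0, 1, t.size))

X = np.vstack([window_features(p_walk, a_walk), window_features(p_lift, a_lift)])
y = np.array([0] * 60 + [1] * 60)  # 0 = horizontal, 1 = vertical

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("F1-score on held-out windows:", f1_score(y_te, clf.predict(X_te)))
```

In this sketch, the net altitude change and vertical-rate variability within each window are the features most directly tied to vertical displacement, while the accelerometer statistics help separate walking-induced pressure noise from genuine floor transitions; the study's actual feature set and evaluation protocol may differ.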