Ben Steers and 5 others

As part of the urban metabolism, city buildings consume resources and energy, and they affect the surrounding air by emitting plumes of pollution. Plumes observed in Manhattan range from water vapor released through the steam vents of heating and cooling systems to CO2 and hazardous chemical compounds (e.g. ammonia and methane). City agencies are interested in detecting and tracking these plumes: they provide evidence of urban activity and the cultivation of living and working spaces, and they can support the provision of services while monitoring environmental impacts. The Urban Observatory at New York University's Center for Urban Science and Progress (CUSP-UO) continuously images the Manhattan skyline at 0.1 Hz, and its daytime images can be used to detect and characterize plumes from buildings in the scene.

This project built and trained a deep convolutional neural network to detect and track these plumes in near real-time. A training set of over 1,100 labeled plumes was assembled, along with examples of contaminating sources such as clouds, shadows, and lights, and a suitable network architecture was selected and trained on it. The trained network was then applied to archival Urban Observatory data from two periods, 26 October to 31 December 2013 and 1 January to 13 March 2015, to generate detections of building plume activity. Buildings with high plume ejection rates were identified, and detected plumes were classified by color (i.e. carbon-rich emissions versus water vapor), yielding a catalog of plume detections spanning the two observation periods.
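To make the pipeline more concrete, the sketch below shows a minimal convolutional patch classifier of the general kind described above, written in PyTorch. It is an illustration only, not the project's actual code: the layer sizes, the 64x64 patch size, and the label set (plume, cloud, shadow, light) are assumptions made for the example.

```python
# Minimal sketch (assumptions only): a small CNN that labels image patches
# cropped from skyline frames as "plume" or as a contaminating source
# (cloud, shadow, light). The architecture and classes are illustrative,
# not the network actually trained in this project.
import torch
import torch.nn as nn

CLASSES = ["plume", "cloud", "shadow", "light"]  # hypothetical label set


class PlumePatchCNN(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        # Three conv blocks downsample a 3x64x64 RGB patch to 64x8x8.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 16 x 32 x 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32 x 16 x 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64 x 8 x 8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = PlumePatchCNN()
    batch = torch.randn(8, 3, 64, 64)   # 8 random RGB patches as a stand-in
    logits = model(batch)               # shape: (8, 4)
    print(logits.argmax(dim=1))         # predicted class index per patch
```

In a deployment along these lines, patches would presumably be cropped around candidate regions of each skyline frame (captured at 0.1 Hz) and the per-patch predictions associated across consecutive frames to track plumes over time; those steps are not shown here.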