Has anyone thought about, or would anyone be up for, creating a lens/filter based on real-time data? Imagine a weather-focused lens that locates the user and displays relevant weather visuals.
Example 1: User opens Snapchat > opens Lenses > taps the Sun/Cloud icon > weather API pulls in data (Wind 5 MPH from NNW) > "virtual" wind is applied from the NNW direction in the augmented world for users to visualize.
Example 2: User opens Snapchat > opens Lenses > taps the Sun/Cloud icon > weather API pulls in data (Storm Cell 1.5 mi. South) > user now sees a 3D animated cloud with rain and lightning approaching from the south.
These snaps could then be shared with friends and family to show what's currently happening and what's on the way.
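For the wind example, the mapping from an API reading like "5 MPH from NNW" to a direction the lens could feed into a particle effect might be sketched roughly like this. To be clear, this is just my own illustration: the compass helper, function names, and vector convention are assumptions, not any real Snapchat or weather-API interface.

```typescript
// Hypothetical sketch: turn a weather-API wind reading (speed + the
// compass point the wind blows FROM) into a 2D vector a lens could
// use to drive an AR wind/particle effect.
const COMPASS = ["N","NNE","NE","ENE","E","ESE","SE","SSE",
                 "S","SSW","SW","WSW","W","WNW","NW","NNW"];

// 16-point compass: points are 360/16 = 22.5 degrees apart.
function bearingDegrees(point: string): number {
  const i = COMPASS.indexOf(point.toUpperCase());
  if (i < 0) throw new Error(`unknown compass point: ${point}`);
  return i * 22.5; // e.g. "NNW" -> 337.5
}

// Wind reported "from NNW" actually blows TOWARD the opposite
// bearing (SSE), so flip by 180 degrees before building the vector.
function windVector(fromPoint: string, speedMph: number): { x: number; y: number } {
  const toward = (bearingDegrees(fromPoint) + 180) % 360;
  const rad = (toward * Math.PI) / 180;
  // x = east component, y = north component, scaled by wind speed
  return { x: speedMph * Math.sin(rad), y: speedMph * Math.cos(rad) };
}
```

A lens script could then scale and orient leaf or rain particles by this vector, so the virtual wind in the scene matches the real wind outside.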