Now that I've finished and published a few lenses, I figured I'd put together a list of features I'd like to see:
* Delete key clears a field in the inspector
* A volume slider on the Audio component
* Animation times specified in frames instead of seconds
* Ability to use one FBX file per animation instead of a single FBX with all the animations in one long sequence. It makes it much harder to add or remove animations when you have to re-bake everything into one file
* Have Lens Studio compress the textures instead of using JPG files directly. It would be cool if you could check off "compress" on an individual PNG file in the inspector to have it converted to JPEG (which is weird) or native texture formats (what should be happening) when you publish or send to your development device. It would also be cool to specify the output size, so you could put a very large source texture in the project and tell the inspector to resize it to 1024x1024, etc. This would make it much easier to play with different detail levels
* Ability to detect the performance level of the phone so you can add or reduce detail based on the device. Perhaps we could specify coarse detail settings (no shadows, no specular, etc.) for various tiers, say Level 1, 2, and 3 devices, and Snapchat could have an API that tells us which kind of device we are running on (high end, middle, low end, etc.). Then we could switch the detail level at startup.
* Ambient light in the shader. Right now I'm using emission to simulate an ambient light, because it seems like you can't have ambient and directional lights in the same scene (I might be wrong about this).
* Have the option to let ambient and light sources use the native SDKs' light-estimation features. Both ARKit and ARCore can detect the lighting conditions and return lighting data based on an environmental lighting estimate. It would be great if we had an option to let lights use this information so the Lens roughly matches the lighting of the environment.
* Generate environmental reflections from video. This example is from Unity, but the same principle applies: https://github.com/johnsietsma/ARCameraLighting It splats the video feed into a spherical texture and uses it as the environment map for IBL and reflections. I think this could look better than choosing from the built-in reflection maps in many cases.
* Seamless sound looping. When I try to loop an MP3 there is an audible hiccup at the loop point. Maybe you need to decompress it in memory and play it back uncompressed to remove the stutter?
* Ability to cancel a submission in review if it hasn't been reviewed yet
* HTTP class to access web resources (JSON, REST APIs, etc.) from inside the Lens
* A user class to access user info in the Lens (some kind of unique ID, perhaps unique to the app, so we could store persistent info about the user in our own database accessed via a REST API)
* Particle system! I guess I could write one with billboarded sprites, but a basic scriptable particle system would be great.
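To make the detail-level idea above concrete, here's a tiny sketch in the JavaScript Lens Studio scripts use. Everything here is hypothetical: the preset table is mine, and `getDeviceTier()` is a stand-in for the API I'm asking Snapchat to provide:

```javascript
// Hypothetical coarse detail presets, picked once at Lens startup.
var DETAIL_PRESETS = {
    1: { shadows: false, specular: false, textureSize: 512  }, // low end
    2: { shadows: false, specular: true,  textureSize: 1024 }, // middle
    3: { shadows: true,  specular: true,  textureSize: 2048 }  // high end
};

// Stand-in for the device-tier API I'm wishing for; hard-coded here.
function getDeviceTier() {
    return 2;
}

// The chosen preset would then drive material and scene setup.
var detail = DETAIL_PRESETS[getDeviceTier()];
```

With something like this, adding a fourth tier or tweaking what "low end" means is a one-line change instead of scattered if-checks.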
I'm sure I'll have more; this is just my first list! :)
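P.S. To be concrete about the video-reflections item: the splatting in that repo boils down to mapping a view direction onto equirectangular (lat-long) UVs. A minimal version of that math (helper name is mine, not from the repo):

```javascript
// Map a normalized direction vector to equirectangular (lat-long) UVs,
// the mapping used when splatting the camera feed into a spherical texture.
function dirToEquirectUV(x, y, z) {
    var u = 0.5 + Math.atan2(z, x) / (2 * Math.PI); // longitude -> u
    var v = 0.5 - Math.asin(y) / Math.PI;           // latitude  -> v
    return { u: u, v: v };
}

// Looking straight down +X lands in the center of the texture:
// dirToEquirectUV(1, 0, 0) → { u: 0.5, v: 0.5 }
```

The same mapping, run in reverse per pixel, is how the video frame would get written into the sphere map before it's used for IBL.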