Soundshare is a multiplayer, colocated music creation experience in which each user can add notes to, or remove notes from, a shared music player. Each user is randomly assigned an instrument but can switch to any instrument available in the scene. Each user needs to look at a marker to initialize the experience.
In this example, we will walk through how we made Soundshare, a shared AR experience for the new generation of Spectacles, using the Connected Lens Module.
Connected Lenses for Spectacles are currently only available in a co-located space with other Spectacles (a Spectacles-to-Mobile experience is coming soon!). When using Connected Lenses, some APIs are restricted in order to protect the user's privacy. Take a look at the Connected Lenses Overview guide for more info.
When testing the experience in Lens Studio, make sure you select the "Launch Soundshare" button in the Preview panel. Also note that this experience can only be played with the "Colorful Home" environment in Interactive Environment mode.
Let's dive into the script structure of the Lens project!
Experience Manager
The ExperienceManager.js manages the UI dialogs users see as they first navigate through the experience.
To switch between these UI dialogs, we need a basic state machine that closes the old UI dialog and opens the new one.
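The states the machine switches between can be represented as a simple enum-like object. The exact values used in the project may differ; this is a minimal sketch matching the states referenced by changeDialog below:

//Sketch of the dialog state "enum" the state machine switches on.
//The project also exposes these globally (global.dialogState) so other scripts can reference them.
var dialogState = {
    OVERALLINSTRUCTION: 0, //overall how-to-play instructions
    SHAREDINSTRUCTION: 1,  //prompt to look at the marker
    SHAREDLOADER: 2,       //marker found, waiting for it to stabilize
    SHAREDCREATED: 3,      //shared space established
    SPECTACLESONLY: 4      //shown when the Lens runs outside of Spectacles
};
var currDialogState = -1; //no dialog shown yet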
function changeDialog(toDialog){
    //Do nothing if the dialog to change to is the same as the current one
    if(currDialogState == toDialog){
        return;
    }
    //Turns off the old UI dialog
    switch(currDialogState){
        case dialogState.OVERALLINSTRUCTION:
            activateDialogOverallInstruction(false);
            break;
        case dialogState.SHAREDINSTRUCTION:
            activateDialogSharedSpaceInstruction(false);
            break;
        case dialogState.SHAREDLOADER:
            activateDialogSharedSpaceLoader(false);
            break;
        case dialogState.SHAREDCREATED:
            activateDialogSharedSpaceCreated(false);
            break;
        case dialogState.SPECTACLESONLY:
            activateDialogSpectaclesOnly(false);
            break;
        default:
            break;
    }
    currDialogState = toDialog;
    //Turns on the new UI dialog
    switch(currDialogState){
        case dialogState.OVERALLINSTRUCTION:
            activateDialogOverallInstruction(true);
            break;
        case dialogState.SHAREDINSTRUCTION:
            global.worldScene.findSharedSpace();
            activateDialogSharedSpaceInstruction(true);
            break;
        case dialogState.SHAREDLOADER:
            activateDialogSharedSpaceLoader(true);
            break;
        case dialogState.SHAREDCREATED:
            activateDialogSharedSpaceCreated(true);
            break;
        case dialogState.SPECTACLESONLY:
            activateDialogSpectaclesOnly(true);
            break;
        default:
            break;
    }
}
For other scripts to be able to change the dialog, we need to expose it. One method is to turn the Experience Manager into a global instance so that it is accessible from any script. Generally, you should use the global keyword cautiously, as it can lead to tightly coupled code. However, a global instance helps manage objects that need to be exposed to multiple scripts in the scene. When you use the global keyword, make sure there is only ever one instance of it at a time or you may get unwanted results!
//Creates an instance of the Experience Manager
global.experienceManager = {};
global.experienceManager.changeDialog = changeDialog;
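With the global instance in place, any other script in the scene can switch dialogs through it. For example, using the globally exposed dialog states (the same global.dialogState values used later to show the Spectacles-only dialog):

//From any other script in the scene: jump to the "shared space created" dialog.
global.experienceManager.changeDialog(global.dialogState.SHAREDCREATED);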
Multiplayer Controller
Setting Up Multiplayer Session
In the MultiplayerController.js, we subscribe to the ConnectedLensEnteredEvent, which fires when the Connected Lens is launched by creating a new session or joining an existing one. We use this event to create the session with its callbacks and to start our experience.
script.createEvent("ConnectedLensEnteredEvent").bind(function(){
    var options = ConnectedLensSessionOptions.create();
    options.onConnected = onConnected;
    options.onDisconnected = onDisconnected;
    options.onMessageReceived = onMessageReceived;
    options.onUserJoinedSession = onUserJoinedSession;
    options.onUserLeftSession = onUserLeftSession;
    options.onError = onError;
    connectedLensModule.createSession(options);
    global.experienceManager.startDialog();
})
Note that because we are building this experience in Lens Studio, it can also technically run on Snapchat on your phone. We can tackle that extension in the future; for now, we'll make the experience available only on Spectacles by wrapping the session setup in a device check.
if(global.deviceInfoSystem.isSpectacles() || global.deviceInfoSystem.isEditor()){
    //Subscribe to the event "ConnectedLensEnteredEvent" which will be fired when connected lens is launched by "Creating a room" or joining an existing room.
    script.createEvent("ConnectedLensEnteredEvent").bind(function(){
        var options = ConnectedLensSessionOptions.create();
        options.onConnected = onConnected;
        options.onDisconnected = onDisconnected;
        options.onMessageReceived = onMessageReceived;
        options.onUserJoinedSession = onUserJoinedSession;
        options.onUserLeftSession = onUserLeftSession;
        options.onError = onError;
        connectedLensModule.createSession(options);
        global.experienceManager.startDialog();
    })
} else{
    global.experienceManager.changeDialog(global.dialogState.SPECTACLESONLY);
}
Setting Events For Multiplayer
Let’s create a basic event system to share the following server events with other objects in the scene. The user:
- Joins a session
- Leaves a session
- Sends a message
- Sends a message with timeout
- Receives a message
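Before the fire functions below can do anything, the MultiplayerController needs somewhere to store the subscribed callbacks. A minimal sketch of that state, with names taken from the code that follows:

//Holds the callbacks registered by other scripts for each server event.
var subscribeOnUserJoinedEvents = [];
var subscribeOnUserLeftEvents = [];
//Message handlers are grouped by message type (see the serverMessage enum later on).
var subscribedOnMessageReceivedEvents = {};
//The controller itself is exposed globally, like the Experience Manager.
global.multiplayerController = {};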
First, create fire event functions to broadcast these events to whatever objects are subscribing to them.
/** * Executes all the subscribed function in the subscribeOnUserJoinedEvents. * * @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience. * @param {string} userId: The user that joined the session. * @method fireOnUserJoinedEvents(session, userId) * @return {void} */ function fireOnUserJoinedEvents(session, userId){ for(var i = 0; i < subscribeOnUserJoinedEvents.length; i++){ subscribeOnUserJoinedEvents[i](session, userId); } } /** * Executes all the subscribed function in the subscribeOnUserLeftEvents. * * @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience. * @param {string} userId: The user that left the session. * @method fireOnUserLeftEvents(session, userId) * @return {void} */ function fireOnUserLeftEvents(session, userId){ for(var i = 0; i < subscribeOnUserLeftEvents.length; i++){ subscribeOnUserLeftEvents[i](session, userId); } } /** * Sends message to the connected Lens server. * * @param {object} data: The object that contains messageType and data(dependent on which messageType). * messageType is determined by the global.serverMessage. * @method global.multiplayerController.send(data) * @return {void} */ global.multiplayerController.send = function(data){ global.multiplayerController.session.sendMessage(JSON.stringify(data)); } /** * Sends message to the connected Lens server with timeout. Connected Lens server will drop message that doesn't arrive within timeout. * * @param {object} data: The object that contains messageType and data(dependent on which messageType). * messageType is determined by the global.serverMessage. * @param {float} timeOutMs: The time duration before server drops message. Timeout is in milliseconds. * @method global.multiplayerController.sendWithTimeOut(data, timeOutMs) * @return {void} */ global.multiplayerController.sendWithTimeOut = function(data, timeOutMs){ global.multiplayerController.session.sendMessageWithTimeout(JSON.stringify(data), timeOutMs); } /** * Subscribes to an event that will get fired when a user other than you joins a session. * * @param {function} func: The reference variable that maps to a function with a 'session' and 'userId' parameter. * @method global.multiplayerController.subscribeOnUserJoinedEvents(func(session,userId)) * @return {void} */ global.multiplayerController.subscribeOnUserJoinedEvents = function(func){ subscribeOnUserJoinedEvents.push(func); }
Next, create event subscriber functions for scene objects to subscribe to. By subscribing, the scene object stores a function reference in the event's array. This function is then called when the event fires.
/** * Subscribes to an event that will get fired when a user other than you leaves a session. * * @param {function} func: The reference variable that maps to a function with a 'session' and 'userId' parameter. * @method global.multiplayerController.subscribeOnUserLeftEvents(func(session,userId)) * @return {void} */ global.multiplayerController.subscribeOnUserLeftEvents = function(func){ subscribeOnUserLeftEvents.push(func); } /** * Subscribes to an event that will get fired when a message is received from the connected Lens server. * * @parm {string} messageType: MessageType is an enum defined by the global.serverMessage. * @param {function} func: The reference variable that maps to a function with a 'session', 'userId, and 'data' parameter. * @method global.multiplayerController.subscribeOnMessageReceive(func(messageType,func(session, userId, data))) * @return {void} */ global.multiplayerController.subscribeOnMessageReceive = function(messageType, func){ if(!subscribedOnMessageReceivedEvents.hasOwnProperty(messageType)){ subscribedOnMessageReceivedEvents[messageType] = []; } subscribedOnMessageReceivedEvents[messageType].push(func); }
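Any script can now react to these events. For example, a simple subscriber that prints whenever another user joins the session (the callback body is only illustrative):

//Example subscriber: greet users who join the session.
global.multiplayerController.subscribeOnUserJoinedEvents(function(session, userId){
    print("User joined the session: " + userId);
});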
World Scene
For any Lens experience that uses the Device Tracking Component set to World, each device has its own unique virtual coordinate system based on the physical location of the device when the user opens the Lens. When multiple Spectacles users join the same session, their virtual coordinate systems will be misaligned. In this project, we use the Marker Tracking Component to align the virtual coordinate systems of all users. The Marker Tracking Component for Spectacles works within a 1 meter distance for a 20x20cm image. When a marker is first detected by Spectacles, there is some initial noise in its movement, so the marker needs to stabilize before the common virtual coordinate system is established. Therefore, when we detect a marker, we check whether there has been significant movement in position or rotation. If there has not, the coordinate system is valid and the Lens adopts the newly created coordinate system.
//Assign a function to be called every frame.
var markerInitializationUpdate = script.createEvent("UpdateEvent");
markerInitializationUpdate.enabled = false;
markerInitializationUpdate.bind(function(e){
    if(!markerTracker.isTracking()){
        currStablizedTrackerTimer = 0.0;
        prevMarkerPos = vec3.zero();
        global.experienceManager.changeDialog(dialogState.SHAREDINSTRUCTION);
        return;
    } else{
        global.experienceManager.changeDialog(dialogState.SHAREDLOADER);
    }
    var currPos = markerTransform.getWorldPosition();
    var currRot = markerTransform.getWorldRotation();
    //Check to see if marker is not moving or rotating
    if(currPos.distance(prevMarkerPos) > MAX_MARKER_STABLIZED_DIST || quat.angleBetween(currRot, prevMarkerRot) >= MAX_MARKER_STABLIZE_ROT){
        currStablizedTrackerTimer = 0.0;
    } else{
        currStablizedTrackerTimer += getDeltaTime();
    }
    prevMarkerPos = currPos;
    prevMarkerRot = currRot;
    //After marker has not moved or rotated for a specific duration, initialize the world scene to begin the experience.
    if(currStablizedTrackerTimer >= STABLIZED_TRACKER_TIME){
        markerTracker.enabled = false;
        var currentWorldPos = markerTransform.getWorldPosition();
        var transform = script.getTransform();
        var toLook = markerTransform.getWorldRotation().toEulerAngles();
        toLook.x = 0;
        toLook.z = 0;
        var currentWorldRot = markerTransform.getWorldRotation();
        worldObj.setParentPreserveWorldTransform(null);
        worldTransform.setWorldPosition(currentWorldPos);
        worldTransform.setWorldRotation(quat.fromEulerVec(toLook));
        markerInitializationUpdate.enabled = false;
        FireInitializationEvents();
    }
})

markerTracker.onMarkerFound = function(){
    if(!initialized && !markerInitializationUpdate.enabled){
        markerInitializationUpdate.enabled = true;
        initialized = true;
    }
}
When the common virtual coordinate system has been established, we can begin the experience. There are many SceneObjects in the project that need to take an action at that moment. Similar to the MultiplayerController.js, let's create a basic event system for when the virtual coordinate system, or World Scene, has been established.
global.worldScene = {};

/**
 * Executes all the delegates subscribed to initializationEvents.
 *
 * @method FireInitializationEvents()
 * @return {void}
 */
function FireInitializationEvents(){
    global.experienceManager.changeDialog(dialogState.SHAREDCREATED);
    for(var i = 0; i < initializationEvents.length; i++){
        initializationEvents[i]();
    }
}

/**
 * Subscribes to an event that will get fired when the world scene is initialized.
 *
 * @param {function} func: The reference variable that maps to a function.
 * @method global.worldScene.subscribeInitialization(func)
 * @return {void}
 */
global.worldScene.subscribeInitialization = function(func){
    initializationEvents.push(func);
}

/**
 * Unsubscribes to an event that will get fired when the world scene is initialized.
 *
 * @param {function} func: The reference variable that maps to a function.
 * @method global.worldScene.unsubscribeInitialization(func)
 * @return {void}
 */
global.worldScene.unsubscribeInitialization = function(func){
    var index = initializationEvents.indexOf(func);
    initializationEvents.splice(index, 1);
}
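Any script that needs to wait for the shared coordinate system can now subscribe; the Instruments Manager does exactly this later in the walkthrough. A minimal illustrative subscriber:

//Run once the marker-based coordinate system has been established.
global.worldScene.subscribeInitialization(function(){
    print("World scene initialized - safe to place shared content.");
});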
Input Controller
The InputController.js handles input from the user's touchpad. For this experience, we will create logic for when a user performs a tap or a tap and hold. A tap is when a user touches and releases the touchpad with a single finger. A tap and hold is when a user touches and keeps their finger on the touchpad. The time between when a touch starts and ends varies from person to person; some users tap much quicker or slower than others. We found that approximately 400ms is a good threshold: if the time between touch start and touch end is under this value, we treat it as a tap.
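The touch handlers below rely on a small amount of state: the id of the first finger placed, the time the touch started, the subscriber arrays, and a delayed callback that fires the tap-and-hold event. A sketch of that setup, assuming MAX_TAP_DURATION is expressed in seconds (the actual declarations and values in the project may differ):

var MAX_TAP_DURATION = 0.4; //~400ms; anything longer counts as tap and hold
var currentTouchId = 0;     //only the first finger placed on the touchpad is tracked
var touchStartTime = 0;
var delayEventCanFire = false;
var subscribedTaps = [];
var subscribedTapHold = [];
//Delayed callback used to fire the tap-and-hold event once MAX_TAP_DURATION has elapsed.
var delayTouchEvent = script.createEvent("DelayedCallbackEvent");
delayTouchEvent.enabled = false;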
//global.touchSystem is a Lens Studio native API: https://lensstudio.snapchat.com/api/classes/TouchDataProvider/
global.touchSystem.touchBlocking = true;
global.touchSystem.enableTouchBlockingException("TouchTypeDoubleTap", false);

script.createEvent("TouchStartEvent").bind(function(e){
    //Ignores other fingers that is not the initial one placed
    if(currentTouchId != e.getTouchId()){
        return;
    }
    var touchPosition = e.getTouchPosition();
    touchStartTime = getTime();
    delayEventCanFire = true;
    delayTouchEvent.enabled = true;
    delayTouchEvent.reset(MAX_TAP_DURATION);
})

script.createEvent("TouchEndEvent").bind(function(e){
    //Ignores other fingers that is not the initial one placed
    if(currentTouchId != e.getTouchId()){
        return;
    }
    var touchPosition = e.getTouchPosition();
    var deltaTime = getTime() - touchStartTime;
    if(deltaTime <= MAX_TAP_DURATION){
        fireSubscribeTapEvents();
    }
    delayEventCanFire = false;
    delayTouchEvent.enabled = false;
})

//Event that will be called after a certain amount of time has passed.
delayTouchEvent.bind(function(e){
    if(!delayEventCanFire){
        return;
    }
    fireSubscribeTapHoldEvents();
})
There are many SceneObjects in the project that need to take an action when a user taps or taps and holds. Similar to the MultiplayerController.js, let's create a basic event system for taps and tap-and-holds.
global.inputController = {};
/** * Calls all functions that is subscribed to subscribedTaps. * * @method fireSubscribeTapEvents() * @return {void} */ function fireSubscribeTapEvents(){ for(var i = 0; i < subscribedTaps.length; i++){ subscribedTaps[i](); } } /** * Calls all functions that is subscribed to subscribedTapHold. * * @method fireSubscribeTapHoldEvents() * @return {void} */ function fireSubscribeTapHoldEvents(){ for(var i = 0; i < subscribedTapHold.length; i++){ subscribedTapHold[i](); } } /** * Subscribes to an event that will get fired when a user taps on the touchpad. * * @param {function} func: The reference variable that maps to a function. * @method global.inputController.subscribeTap(func) * @return {void} */ global.inputController.subscribeTap = function(func){ subscribedTaps.push(func); } /** * Unsubscribes to an event that will get fired when a user taps on the touchpad. * * @param {function} func: The reference variable that maps to a function. * @method global.inputController.unsubscribeTap(func); * @return {void} */ global.inputController.unsubscribeTap = function(func){ const index = subscribedTaps.indexOf(func); if (index > -1) { subscribedTaps.splice(index, 1); } } /** * Subscribes to an event that will get fired when a user tap and hold on the touchpad. * * @param {function} func: The reference variable that maps to a function. * @method script.api.getRenderMeshVisual(); * @return {void} */ global.inputController.subscribeTapHold = function(func){ subscribedTapHold.push(func); } /** * Unsubscribes to an event that will get fired when a user tap and hold on the touchpad. * * @param {function} func: The reference variable that maps to a function. * @method script.api.getRenderMeshVisual(); * @return {void} */ global.inputController.unsubscribeTapHold = function(func){ const index = subscribedTapHold.indexOf(func); if (index > -1) { subscribedTapHold.splice(index, 1); } }
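Other scripts can then react to taps without knowing anything about the touchpad itself; the SFX Manager does this later to place notes. A minimal illustrative subscriber:

//Example: log every confirmed tap on the touchpad.
global.inputController.subscribeTap(function(){
    print("User tapped the touchpad");
});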
Server Sync Manager
global.serverSyncManager = {};

/**
 * Subscribes to an event that will get fired every update event with the time stamp of the connected Lenses server.
 *
 * @param {function} func: The reference variable that maps to a function with a timeStamp parameter.
 * @method global.serverSyncManager.subscribeToServerTimeUpdateEvent(func(timeStamp))
 * @return {void}
 */
global.serverSyncManager.subscribeToServerTimeUpdateEvent = function(func){
    serverTimeUpdateEvents.push(func);
}

/**
 * Unsubscribes to an event that will get fired every update event with the time stamp of the connected Lens server.
 *
 * @param {function} func: The reference variable that maps to a function with a timeStamp parameter.
 * @method global.serverSyncManager.unsubscribeToServerTimeUpdateEvent(func(timeStamp))
 * @return {void}
 */
global.serverSyncManager.unsubscribeToServerTimeUpdateEvent = function(func){
    var index = serverTimeUpdateEvents.indexOf(func);
    if(index > -1){
        serverTimeUpdateEvents.splice(index, 1);
    }
}

/**
 * Executes all the delegates subscribed to serverTimeUpdateEvents with the connected Lens' server time stamp.
 *
 * @param {float} timeStamp: The time in seconds + milliseconds of the current session. The value never goes above 60 seconds.
 * @method FireServerTimeUpdateEvents(timeStamp)
 * @return {void}
 */
function FireServerTimeUpdateEvents(timeStamp){
    for(var i = 0 ; i < serverTimeUpdateEvents.length; i++){
        serverTimeUpdateEvents[i](timeStamp);
    }
}

script.createEvent("UpdateEvent").bind(function(e){
    //Lens running on Lens Studio will use the built-in system's time instead of server time.
    if(global.deviceInfoSystem.isEditor()){
        var date = new Date();
        var time = date.getSeconds() + date.getMilliseconds()/1000;
        FireServerTimeUpdateEvents(time);
    } else{
        if(!global.multiplayerController.sessionConnected){
            return;
        }
        global.date.setTime(global.multiplayerController.session.getServerTimestamp());
        var time = global.date.getSeconds() + global.date.getMilliseconds()/1000;
        FireServerTimeUpdateEvents(time);
    }
})
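The ServerSyncManager.js above broadcasts the Connected Lens server time every frame so that every participant drives the music loop from the same clock (falling back to the system clock while previewing in Lens Studio). A script can listen to it like this (the callback body is only illustrative):

//Example subscriber: print the shared time stamp (0-60 seconds) every frame.
global.serverSyncManager.subscribeToServerTimeUpdateEvent(function(timeStamp){
    print("Server time: " + timeStamp.toFixed(2));
});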
Instruments
There are four types of instruments in this project: Snare, High Hat, Kick Bass, and Vibey Bass. However, the project is set up in a modular way so more can be added. The instruments are represented as abstract shapes.
Instrument Object
The Instrument.js contains the visual and audio information of a single instrument object. When a user shows, or stops showing, the intent to select the instrument, this script is responsible for driving what the visuals look like.
/** * Plays the highlight animation when an instrument has the intention of being selected. * * @method script.api.activateHighlightAnimation() * @return {void} */ script.api.activateHighlightAnimation = function(){ if(script.renderMeshVisual.mainMaterial == null){ return; } var material = script.renderMeshVisual.mainMaterial; var initTransitionAmt = material.mainPass.transition; global.LSTween.rawTween(HIGHLIGHT_ANIM_SPEED) .onUpdate(function (object) { material.mainPass.transition = global.Mathf.map_range(object.t, 0, 1, initTransitionAmt, TRANSITION_TO); }) .start(); } /** * Stops the highlight animation when an instrument has the intention of not being selected. * * @method script.api.stopHighlightAnimation() * @return {void} */ script.api.stopHighlightAnimation = function(){ if(script.renderMeshVisual.mainMaterial == null){ return; } var material = script.renderMeshVisual.mainMaterial; var initTransitionAmt = material.mainPass.transition; global.LSTween.rawTween(HIGHLIGHT_ANIM_SPEED) .onUpdate(function (object) { material.mainPass.transition = global.Mathf.map_range(object.t, 0, 1, initTransitionAmt, 0); }) .start(); }
E.g. Highlight and Unhighlight Animation
The highlight and unhighlight animation is driven by the materials InstrumentMaterial_NotPlayed, InstrumentMaterial_0, InstrumentMaterial_1, InstrumentMaterial_2, and InstrumentMaterial_3. Every instrument is assigned a copy of InstrumentMaterial_NotPlayed when the instrument is not selected by the user, and a unique material that represents when the instrument is selected. The unique material is distinguished by its BaseColor and Emissive color.
/**
 * Instrument note the user has selected. This instrument will be the note that plays when the user selects a beat.
 *
 * @param {bool} selected: Selection type.
 * @method script.api.selectedInstrument(selected)
 * @return {void}
 */
script.api.selectedInstrument = function(selected){
    var transitionValue = script.renderMeshVisual.mainMaterial.mainPass.transition;
    if(selected){
        script.renderMeshVisual.mainMaterial = script.selectedMat;
    } else{
        script.renderMeshVisual.mainMaterial = unselectedMatClone;
    }
    script.renderMeshVisual.mainMaterial.mainPass.transition = transitionValue;
}
E.g. Selected and Unselected Instruments
Instruments Manager
The InstrumentsManager.js initializes the transforms of all the instruments in the scene and holds references to all the instruments and their scripts. Instrument initialization starts when the world initializes, after a valid marker has been established. Instruments are placed with a fixed offset in the X direction, centered around their parent; for example, with four instruments and an offset of 30, the local X positions would be -45, -15, 15, and 45.
/** * Sets the initial placement of the instruments. * * @method initInstrumentsPlacement() * @return {void} */ function initInstrumentsPlacement(){ instrumentParentObj = global.scene.createSceneObject("InstrumentsParent"); instrumentParentTransform = instrumentParentObj.getTransform(); var randomInstrumentIndex = global.Mathf.getRandomIntInclusive(0, allInstruments.length - 1); script.api.setCurrentInstrument(randomInstrumentIndex); for(var i = 0; i < allInstruments.length; i++){ var instrumentTransform = instrumentsTransform[i]; var instrumentObj = instrumentTransform.getSceneObject(); instrumentObj.setParent(instrumentParentObj); var spawnPos = new vec3((i*PLACEMENTOFFSET) - (PLACEMENTOFFSET*(allInstruments.length-1)/2), 0,0); instrumentTransform.setLocalPosition(spawnPos); } instrumentParentTransform.setWorldScale(vec3.zero()); } /** * Set the anchor of the instruments to be relative to the newly created coordinate system and animate to initialization. * * @method worldSceneInit() * @return {void} */ var worldSceneInit = function(){ var worldScene = global.worldScene.worldObj; instrumentParentObj.setParent(worldScene);
//Scale in when first getting initialized. global.LSTween.scaleFromToWorld(instrumentParentTransform, vec3.zero(), vec3.one(), INITALIZED_ANIMATION_MS) .easing(TWEEN.Easing.Quadratic.Out) .start(); } //Subscribes to world scene initialization, which will get called when a marker is stable enough to become the new coordinate system anchor.
global.worldScene.subscribeInitialization(worldSceneInit);
Other objects and scripts will use the Instruments Manager to change the contents of the instruments (i.e., the highlight animation or selection of the instruments).
/** * Returns the parent object in which all the instruments are anchored to. * * @method script.api.getInstrumentsTransform(); * @return {sceneObject} */ script.api.getInstrumentParentObj = function(){ return instrumentParentObj; } /** * Returns the parent transform in which all the instruments are anchored to. * * @method script.api.getInstrumentParentTransform(); * @return {transform} */ script.api.getInstrumentParentTransform = function(){ return instrumentParentTransform; } /** * Get transform of a single instrument with the passed index. * * @param {integer} instrumentIndex: The index of the instrument you want to get the transform of. * @method script.api.getInstrumentTransformAt(instrumentIndex) * @return {transform} */ script.api.getInstrumentTransformAt = function(instrumentIndex){ return instrumentsTransform[instrumentIndex]; } /** * Get transform for all the instruments in the scene. * * @method script.api.getAllInstrumentsTransform(); * @return {transform} */ script.api.getAllInstrumentsTransform = function(){ return instrumentsTransform; } /** * Get the active instrument. * * @method script.api.getCurrentInstrument(); * @return {Instrument.js} */ script.api.getCurrentInstrument = function(){ return currInstrument; } /** * Get the total amount of instruments avaliable. * * @method script.api.getTotalInstrumentCount(); * @return {integer} */ script.api.getTotalInstrumentCount = function(){ return instrumentsTransform.length; } /** * Get the index of the currently played instruments. * * @method script.api.getCurrentInstrumentIndex(); * @return {integer} */ script.api.getCurrentInstrumentIndex = function(){ return currInstrumentIndex >= 0 && currInstrumentIndex < script.instrumentScripts.length ? currInstrumentIndex : -1; } /** * Setting instrument based on new index. * * @param {integer} instrumentIndex: The index of the instrument you want to set the Instruments.js api of. * @method script.api.setCurrentInstrument(instrumentIndex); * @return {Instrument.js} */ script.api.setCurrentInstrument = function(instrumentIndex){ if(currInstrument != null){ currInstrument.selectedInstrument(false); } currInstrument = script.instrumentScripts[instrumentIndex].api; currInstrumentIndex = instrumentIndex; currInstrument.selectedInstrument(true); } /** * Get the array of all the instrument in the scene. Instruments are objects with api calls from Instrument.js. * * @method script.api.getAllInstruments() * @return {Instrument.js[]} */ script.api.getAllInstruments = function(){ return allInstruments; } /** * Get the number of instruments in the scene. * * @method script.api.getAllInstrumentsLength(); * @return {integer} */ script.api.getAllInstrumentsLength = function(){ return allInstruments.length; } /** * Helper function that gets the render mesh visual component of a given instrument. * * @param {integer} instrumentIndex: The index of the instrument you want to get the RenderMeshVisual Component from. * @method script.api.getRenderMeshVisualInstrumentAt(instrumentIndex); * @return {Component.RenderMeshVisual} */ script.api.getRenderMeshVisualInstrumentAt = function(instrumentIndex){ return script.instrumentScripts[instrumentIndex].api.getRenderMeshVisual(); } /** * Helper function that gets the material of a given instrument. * * @param {integer} instrumentIndex: The index of the instrument you want to get the Material from. 
* @method script.api.getInstrumentSelectedMaterialAt(instrumentIndex); * @return {Asset.Material} */ script.api.getInstrumentSelectedMaterialAt = function(instrumentIndex){ return script.instrumentScripts[instrumentIndex].api.getSelectedMaterial(); } /** * Helper function that plays the visual animation when the user has the intent to take action to select the instrument. * * @param {integer} instrumentIndex: The index of the instrument you want to begin the highlight animation. * @method script.api.activateHighlightAnimationInstrumentAt(instrumentIndex); * @return {void} */ script.api.activateHighlightAnimationInstrumentAt = function(instrumentIndex){ script.instrumentScripts[instrumentIndex].api.activateHighlightAnimation(); } /** * Helper function that stop the visual animation when the user has no more intent to select the instrument. * * @param {integer} instrumentIndex: The index of the instrument you want to stop the highlight animation. * @method script.api.stopHighlightAnimationInstrumentAt(instrumentIndex); * @return {void} */ script.api.stopHighlightAnimationInstrumentAt = function(instrumentIndex){ script.instrumentScripts[instrumentIndex].api.stopHighlightAnimation(); } /** * The radius for which the instrument will be detected by the visual reticle. * * @method script.api.instrumentPlacementRadius * @return {float} */ script.api.instrumentPlacementRadius = PLACEMENTRADIUS; /** * The offset between each of the instrument when initialized. * * @method script.api.instrumentPlacementOffset * @return {float} */ script.api.instrumentPlacementOffset = PLACEMENTOFFSET;
Visual Beats and Notes
Visual Beats
Beats are the objects to which a user can add or remove an instrument note. When a beat is played, it plays the instrument note assigned to it; no audio plays if no instrument note is assigned. The VisualBeats.js is only responsible for driving the animation for the visual representation of the beats in the scene. The beat's highlight or active state is determined by the SFX Manager.
/**
 * Plays an animation to help indicate when the user is looking at the beat.
 *
 * @method script.api.playBeatActiveAnimation()
 * @return {void}
 */
script.api.playBeatActiveAnimation = function(){
    if(beatMeshMat == null){
        return;
    }
    var initTransitionAmt = beatMeshMat.mainPass.transition;
    global.LSTween.rawTween(BEAT_ACTIVATE_ANIM_SPEED_MS)
        .onUpdate(function (object) {
            beatMeshMat.mainPass.transition = global.Mathf.map_range(object.t, 0, 1, initTransitionAmt, 1);
        })
        .start();
}

/**
 * Stops the animation when the beat is no longer being looked at.
 *
 * @method script.api.stopBeatActiveAnimation()
 * @return {void}
 */
script.api.stopBeatActiveAnimation = function(){
    if(beatMeshMat == null){
        return;
    }
    var initTransitionAmt = beatMeshMat.mainPass.transition;
    global.LSTween.rawTween(BEAT_ACTIVATE_ANIM_SPEED_MS)
        .onUpdate(function (object) {
            beatMeshMat.mainPass.transition = global.Mathf.map_range(object.t, 0, 1, initTransitionAmt, 0);
        })
        .start();
}
Visual Notes
/** * Gets the Render Mesh Visual component of this SceneObject. * * @method script.api.getRenderMeshVisual(); * @return {Component.RenderMeshVisual} */ script.api.getRenderMeshVisual = function(){ return script.renderMeshVisual; } /** * Sets the material of the Visual Note. * * @param {Asset.Material} material: The reference material to update to. * @method script.api.setToMaterial(material) * @return {void} */ script.api.setToMaterial = function(material){ toMaterial = material; script.renderMeshVisual.mainPass.baseColor = neutralMaterial.mainPass.baseColor; firstTimeActive = true; } /** * Starts the animation when the beat is played. If instrument hasn't been played once then it will start from grey to the assigned color. * * @method script.api.playActivatedAnimation() * @return {void} */ script.api.playActivatedAnimation = function(){ if(firstTimeActive && toMaterial != null){ firstTimeActive = false; global.LSTween.colorFromTo(script.renderMeshVisual.mainMaterial, neutralMaterial.mainPass.baseColor, toMaterial.mainPass.baseColor, NOTE_FIRST_ACTIVATE_TIME_MS) .start(); } if(firstTimeActive){ return; } global.LSTween.rawTween(NOTE_PLAY_ANIMATION_TIME_MS) .onUpdate(function(o){ script.renderMeshVisual.mainMaterial.mainPass.transition = o.t; }) .start(); }
SFX Visualizer
The SFXVisualizer.js manages the visual representation of the notes, beats, and progress bar for a single musical loop player. Please take a look at the SFX Controller section for more details on the musical loop player. When the World Scene has been initialized, the SFX Visualizer creates the notes and beats for the music loop player based on the parameters passed in by the SFX Controller.
/** * Initializes the SFXVisualizer with the radius of where to spawn beats, amount of beats, how fast visually the beats should be played, and parent to anchor to. * * @param {float} inRadius2spawn: The spawn radius of the visual representation of the beats. * @param {integer} inBeatsCount: The number of beats that will be instantiated. * @param {float} inMusicBPM: The speed in which the beats will play. * @param {SceneObject} inToParent: The object to anchor the SFXVisualizer's content to. * @method script.api.initialize(inRadius2spawn, inBeatsCount, inMusicBPM, inToParent) * @return {void} */ script.api.initialize = function(inRadius2spawn, inBeatsCount, inMusicBPM, inToParent){ radius2spawn = inRadius2spawn; beatsCount = inBeatsCount; musicBPM = inMusicBPM; visualizer.obj = inToParent; spawnBeatsVisualsAround(inToParent); } /** * Initialization where the visual beats will spawn around. * * @param {SceneObject} parentObj: The parent SceneObject to have beats spawn around. * @method spawnBeatsVisualsAround(parentObj) * @return {void} */ function spawnBeatsVisualsAround(parentObj){ var angleToSpawnIncrements = 360 / beatsCount; var instrumentsManager = script.instrumentsManager.api; //Calculating and Spawning Beats for(var i = 0; i < beatsCount; i++){ var visualBeatsObj = script.visualBeatsPrefab.instantiate(parentObj); var visualBeatsScript = visualBeatsObj.getComponent("Component.ScriptComponent"); var visualBeatsT = visualBeatsObj.getTransform(); visualizer.beatsTransforms.push(visualBeatsT); visualizer.beatsScripts.push(visualBeatsScript); //Calculate Beats Spawn Position var spawnPos = quat.angleAxis(angleToSpawnIncrements * i * global.Mathf.DEG2RAD, vec3.up()); spawnPos = spawnPos.multiplyVec3(vec3.right()); spawnPos = spawnPos.uniformScale(radius2spawn); //Calculate Beats Look Rotation var lookQuat = spawnPos.sub(vec3.zero()); lookQuat = quat.lookAt(lookQuat, vec3.up()); //Set Beats Spawn Position and Look Rotation visualBeatsT.setLocalPosition(spawnPos); visualBeatsT.setLocalRotation(lookQuat); //Insantiate visual notes that will be represented when the beat is active var playNoteVisualsObj = script.visualNotePrefab.instantiate(visualBeatsObj); var playNoteVisualsTransform = playNoteVisualsObj.getTransform(); var playNoteVisualsScripts = playNoteVisualsObj.getComponent("Component.ScriptComponent").api; var playNoteMeshVisual = playNoteVisualsScripts.getRenderMeshVisual(); playNoteMeshVisual.mainMaterial = playNoteMeshVisual.mainMaterial.clone(); playNoteVisualsObj.enabled = false; playNoteVisualsTransform.setLocalPosition(INITIAL_NOTE_OFFSET); visualizer.playNoteVisuals.push(playNoteMeshVisual); visualizer.playNoteVisualsObj.push(playNoteVisualsObj); visualizer.playNoteVisualsScript.push(playNoteVisualsScripts); } }
/** * Changing the instrument and note the beat will play. If there is a pre-existing instrument, it will override the previous. * * @param {integer} playBeatIndex: The index in which beats the instrument will play from. * @param {integer} instrumentType: The index in which type of instrument to play. * @param {integer} note: The index in which note to play of the instrument. * @method script.api.changePlayNote(playBeatIndex, instrumentType, note) * @return {void} */ script.api.changePlayNote = function(playBeatIndex, instrumentType, note){ var noteVisuals = visualizer.playNoteVisuals[playBeatIndex]; var noteVisualsObj = visualizer.playNoteVisualsObj[playBeatIndex]; var noteVisualsScript = visualizer.playNoteVisualsScript[playBeatIndex]; var instrumentVisuals = script.instrumentsManager.api.getRenderMeshVisualInstrumentAt(instrumentType); var toInstrumentMaterial = script.instrumentsManager.api.getInstrumentSelectedMaterialAt(instrumentType); noteVisualsObj.enabled = true; noteVisuals.mesh = instrumentVisuals.mesh; noteVisualsScript.setToMaterial(toInstrumentMaterial); var noteVisualsTransform = noteVisualsObj.getTransform(); noteVisualsTransform.setLocalScale(vec3.zero()); //Animate in the instrument visual global.LSTween.scaleToLocal(noteVisualsTransform, new vec3(NOTE_SIZE,NOTE_SIZE,NOTE_SIZE), SPAWN_OR_HIDE_ANIMATION_SPEED_MS) .easing(TWEEN.Easing.Cubic.Out) .start(); } /** * Visually hides the current instrument that is playing from the specific beat. * * @param {integer} playBeatIndex: The index in which beats the instrument will stop playing. * @method script.api.hidePlayNote(playBeatIndex) * @return {void} */ script.api.hidePlayNote = function(playBeatIndex){ var noteVisualsObj = visualizer.playNoteVisualsObj[playBeatIndex]; var noteVisualsTransform = noteVisualsObj.getTransform(); global.LSTween.scaleToLocal(noteVisualsTransform, new vec3(0,0,0), SPAWN_OR_HIDE_ANIMATION_SPEED_MS) .easing(TWEEN.Easing.Cubic.Out) .start(); }
/** * Visuals that shows the highlight of the progress bar. The progress bar will show an indication of when the instrument might play. * * @param {float} angle: The angular value which will translate between 0 through 1 * @method updateProgressBarVisual(angle) * @return {void} */ function updateProgressBarVisual(angle){ if(clonedProgressBarMat == null){ return; } clonedProgressBarMat.mainPass.scrollOffset = global.Mathf.map_range(angle, 0, 360, 0, 1); } /** * Updates the visual progress bar based on the current timestamp of the connected Lens session and music speed time. * * @param {float} timeStamp: The time in seconds + milliseconds of the current session. The value never goes above 60 seconds. * @method script.api.updateEvent(timeStamp) * @return {void} */ script.api.updateEvent = function(timeStamp){ var ticks = (timeStamp * musicBPM) / 60; var moddedTicks = ticks % beatsCount; var toAngleDeg = global.Mathf.map_range(moddedTicks, 0, beatsCount, 0, 360); var toRot = quat.angleAxis(toAngleDeg * global.Mathf.DEG2RAD, vec3.up()); updateProgressBarVisual(toAngleDeg); var toBeatIndex = parseInt(moddedTicks); if(currBeatIndex != toBeatIndex){ visualizer.playNoteVisualsScript[toBeatIndex].playActivatedAnimation(); currBeatIndex = toBeatIndex; } }
E.g. Progress Bar Indicator
SFX Controller
The SFXController.js controls the logic of which instrument is played at a specific beat, as well as the behavior when an instrument is added to or removed from a beat.
/**
* Assigns an audio track associated with the instrument to the beat.
*
* @param {integer} playBeatsIndex: The index of the beat you want to assign the audio.
* @param {Asset.AudioTrack} audioClip: The audio track that will play when the sfx is assigned to a beat.
* @param {integer} instrumentType: The type of instrument the audio is.
* @param {integer} note: The specific note of the instrument of the audio.
* @method script.api.addSfxAtBeat(playBeatsIndex, audioClip, instrumentType, note)
* @return {void}
*/
script.api.addSfxAtBeat = function(playBeatsIndex, audioClip, instrumentType, note){
if(playBeatsIndex < 0 || playBeatsIndex >= playBeatAudio.length){
print("ERROR: " + "addSfxAtBeat - playBeatsIndex in SFXController.js length is out of bounds. Passed Value: " + playBeatsIndex);
return;
}
playBeatAudio[playBeatsIndex].audioTrack = audioClip;
sfxVisualizer.changePlayNote(playBeatsIndex, instrumentType, note);
}
/**
* Removes the audio track assigned to the beat.
*
* @param {integer} playBeatsIndex: The index of the beat you want to delete audio from.
* @method script.api.deleteSfxAtBeat(playBeatsIndex)
* @return {void}
*/
script.api.deleteSfxAtBeat = function(playBeatsIndex){
if(playBeatsIndex < 0 || playBeatsIndex >= playBeatAudio.length){
print("ERROR: " + "deleteSfxAtBeat - playBeatsIndex in SFXController.js length is out of bounds. Passed Value: " + playBeatsIndex);
return;
}
playBeatAudio[playBeatsIndex].audioTrack = null;
sfxVisualizer.hidePlayNote(playBeatsIndex);
}
The SFX Controller also controls the settings of the visual musical loop players in the scene. The number of beats, the radius at which the beats are positioned, and how often the beats are played (BPM, or Beats Per Minute) are all adjustable in the Inspector panel of the SFX Controller.
E.g. SFX Controller Settings
At 60 BPM there will be a total of 60 beats, or ticks, in the span of 60 seconds. At 100 BPM there will be a total of 100 beats, or ticks, in the span of 60 seconds. To determine which beat to play at a given time stamp ranging from 0 to 60 seconds:
Beat To Play = Time Stamp x BPM / 60;
To determine which of the beats we created in the scene to activate, we can use modular arithmetic (if you haven't worked with modular arithmetic, take a look at this article from Khan Academy).
Beat Index In Scene To Play = Beat To Play % Total Beats in Scene;
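For example, at 120 BPM with 8 beats in the scene and a server time stamp of 30.25 seconds, Beat To Play is 60.5 and the beat index to activate is 60.5 mod 8 = 4.5, which truncates to beat 4. The same math in code (the values are only for illustration; in the project they come from the Inspector):

//Worked example of the beat math used by SFXController.js below.
var musicBPM = 120;       //beats per minute for this loop
var playBeatCount = 8;    //number of beats placed in the scene
var timeStamp = 30.25;    //shared server time in seconds (0-60)

var ticks = (timeStamp * musicBPM) / 60;                 //60.5 beats into the minute
var toPlayBeatsIndex = parseInt(ticks % playBeatCount);  //60.5 % 8 = 4.5 -> beat 4
print("Beat index to activate: " + toPlayBeatsIndex);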
When a beat is activated, the SFXController plays the note assigned to it.
/**
 * Every frame, checks to see if it should play an instrument assigned to a beat based on the timeStamp and music BPM.
 *
 * @param {float} timeStamp: The time in seconds + milliseconds of the current session. The value never goes above 60 seconds.
 * @method updateEvent(timeStamp)
 * @return {void}
 */
var updateEvent = function(timeStamp){
    var ticks = (timeStamp * script.musicBPM) / 60;
    var toPlayBeatsIndex = parseInt(ticks % script.playBeatCount);
    changePlayBeat(toPlayBeatsIndex);
    sfxVisualizer.updateEvent(timeStamp);
}

/**
 * Plays the instrument that the current beat holds. Will play nothing if no instrument is set to the beat.
 *
 * @param {integer} toBeatIndex: The index to play the instrument in the beat.
 * @method changePlayBeat(toBeatIndex)
 * @return {void}
 */
function changePlayBeat(toBeatIndex){
    if(currPlayBeatIndex == toBeatIndex){
        return;
    }
    currPlayBeatIndex = toBeatIndex;
    var audio = playBeatAudio[currPlayBeatIndex];
    if(audio.audioTrack != null){
        audio.play(1);
    }
}
SFX Manager
The SFXManager.js manages all the SFX Controllers in the scene. It is responsible for sending play data to and receiving it from the Connected Lens server, and it identifies which instrument or beat is nearest to the user's gaze.
Whenever a message is sent to or received from the server, the SFX Manager needs to know what type of message the data is. In Enum.js, we declare a global enum value for all the message types this Lens can use:
- REMOVE_SFX_WITH_INDEX - Message will be related to a player's action to remove a note from a beat.
- SPAWN_SFX_WITH_INDEX - Message will be related to a player's action to add a note to a beat.
- PLAYER_TRANSFORM - Message will be related to a player's position and rotation in the scene.
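In Enum.js these message types are exposed as a global object. The exact values are not important as long as every participant agrees on them; a minimal sketch could look like:

//Sketch of the message type enum shared by every sender and receiver.
global.serverMessage = {
    REMOVE_SFX_WITH_INDEX: 0,
    SPAWN_SFX_WITH_INDEX: 1,
    PLAYER_TRANSFORM: 2
};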
The SFX Manager will subscribe to the event that will be fired whenever a message is received with a type REMOVE_SFX_WITH_INDEX or SPAWN_SFX_WITH_INDEX.
global.multiplayerController.subscribeOnMessageReceive(global.serverMessage.SPAWN_SFX_WITH_INDEX, playBeatIndexInstantiateReceive);
global.multiplayerController.subscribeOnMessageReceive(global.serverMessage.REMOVE_SFX_WITH_INDEX, playBeatIndexRemoveReceive);
/**
* Remove a note to a beat based on the information given from connected Lens server. Message Type is global.serverMessage.REMOVE_SFX_WITH_INDEX.
*
* @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience.
* @param {string} userId: The user that sent the removal of an instrument from a beat.
* @param {object} data: Contains the playBeatIndex and visualizerIndex.
* @method playBeatIndexRemoveReceive(session, userId, data)
* @return {void}
*/
var playBeatIndexRemoveReceive = function(session, userId, data){
var playBeatIndex = data.playBeatIndex;
var visualizerIndex = data.visualizerIndex;
var sfxController = script.allSfxControllers[visualizerIndex].api;
sfxController.deleteSfxAtBeat(playBeatIndex);
}
/**
* Add a note to a beat based on the information given from connected Lens server. The type of note of the instrument being sent is randomized. Message Type is global.serverMessage.SPAWN_SFX_WITH_INDEX.
*
* @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience.
* @param {string} userId: The user that sent the addition of an instrument to a beat.
* @param {object} data: Contains the type of instrument, note of the instrument, which beat instrument should be spawned, and which loop rings it should be placed on.
* @method playBeatIndexInstantiateReceive(session, userId, data)
* @return {void}
*/
var playBeatIndexInstantiateReceive = function(session, userId, data){
var instrumentType = data.instrumentType;
var instrumentNote = data.instrumentNote;
var playBeatIndex = data.playBeatIndex;
var visualizerIndex = data.visualizerIndex;
var allInstruments = instrumentsManager.getAllInstruments();
var instrument = allInstruments[instrumentType];
var instrumentAudioClip = instrument.getAudioClip(instrumentNote);
var sfxController = script.allSfxControllers[visualizerIndex].api;
sfxController.addSfxAtBeat(playBeatIndex, instrumentAudioClip, instrumentType, instrumentNote);
}
Before discussing how we send messages to the Connected Lens server, we need to discuss when. There are two types of interactable objects in the scene: Beats and Instruments.
Interaction with these objects is based on the user's gaze and on tapping or tapping and holding the touchpad. It is important to give visual feedback on these objects when a user intends to interact with them; it helps the user verify which object is being interacted with at any given time.
E.g. Beat's Inactive and Active Animation
E.g. Highlight and Unhighlight Animation
To programmatically determine whether a user is trying to interact with an object, we can find which object is closest along a ray originating at the camera's position. Since Lens Studio had no built-in raycasting method at the time of writing, we will create our own simple ray-to-sphere collision method: each interactable object is treated as a sphere, and we test whether the ray from the camera, pointing in the direction the user is looking, intersects that sphere, returning the closest intersection point.
/** * Helper function to rayToSphereCollision(center, radius, ray) * * @param {vec3} center: The center of the sphere you need to test collision with. * @param {float} radius: The radius of the sphere you need to test collision with. * @method rayToSphereCollisionHelper(center, radius) * @return {vec3} */ function rayToSphereCollisionHelper(center, radius){ var ray = {}; ray.direction = camTransform.back; ray.origin = camTransform.getWorldPosition(); return rayToSphereCollision(center, radius, ray); } /** * Calculates the the collision between a given ray and sphere. * * @param {vec3} center: The center of the sphere you need to test collision with. * @param {float} radius: The radius of the sphere you need to test collision with. * @param {object} ray: The ray contains the origin and direction vector of the ray. * @method rayToSphereCollision(center, radius, ray) * @return {vec3} */ function rayToSphereCollision(center, radius, ray){ var m = ray.origin.sub(center); var b = m.dot(ray.direction); var c = m.dot(m) - (radius * radius); if(c > 0 && b > 0){ return -1; } var discr = b*b - c; if(discr < 0){ return - 1; } var t = -b - Math.sqrt(discr); if(t < 0.0){ t = 0.0; } return ray.origin.add(ray.direction.uniformScale(t)); } /** * Finds the closest objects bazed on the user's gaze. * * @param {vec3} center: The center of the sphere you need to test collision with. * @param {float} radius: The radius of the sphere you need to test collision with. * @param {object} ray: The ray contains the origin and direction vector of the ray. * @method checkClosestObjects() * @return {closestObjectsInfo} * closestObjectsInfo.primaryHitDistance//float * closestObjectsInfo.secondaryHitDistance//float * closestObjectsInfo.visualizerIndex//integer * closestObjectsInfo.playBeatIndex//integer * closestObjectsInfo.instrumentIndex//integer * closestObjectsInfo.type//integer, 0 is collision with beats, 1 is collision with instruments */ function checkClosestObjects(){ var closestObjectsInfo = {}; closestObjectsInfo.primaryHitDistance = null; closestObjectsInfo.secondaryHitDistance = null; closestObjectsInfo.visualizerIndex = null; closestObjectsInfo.playBeatIndex = null; closestObjectsInfo.instrumentIndex = null; closestObjectsInfo.type = null; var toCheck = []; var primaryShortestDistance = Infinity; //Scan through all the SFX Controllers in the scene and add to for(var i = 0; i < allSfxControllersTransform.length; i++){ var collidedPos = rayToSphereCollisionHelper(allSfxControllersTransform[i].getWorldPosition(), script.allSfxControllers[i].api.getLoopRadius() * 1.2); //Collision was Detected if(collidedPos !== -1){ var sfxVisualizer = script.allSfxControllers[i].api.getSfxVisualizer(); var visualizerInfo = sfxVisualizer.getInfo(); var beatsTransforms = visualizerInfo.beatsTransforms; //Check all the beats in individual controller to review for a secondary hit detection for(var j = 0; j < beatsTransforms.length; j++){ //Collision Info type 0 - beats var collisionInfo = {}; collisionInfo.type = 0; collisionInfo.position = beatsTransforms[j].getWorldPosition(); collisionInfo.visualizerIndex = i; collisionInfo.playBeatIndex = j; toCheck.push(collisionInfo); } var camPos = camTransform.getWorldPosition(); var dist = collidedPos.distance(camPos); if(dist < primaryShortestDistance ){ closestObjectsInfo.primaryHitDistance = dist; primaryShortestDistance = dist; } } } //Check for Collision in the general vicinity of the Instruments if(instrumentsManager.getInstrumentParentTransform() == null){ return closestObjectsInfo; 
} var instrumentParentPos = instrumentsManager.getInstrumentParentTransform().getWorldPosition(); var instrumentCollidedPos = rayToSphereCollisionHelper(instrumentParentPos, instrumentsManager.instrumentPlacementRadius*1.2); if(instrumentCollidedPos !== -1){ var instrumentsTransform = instrumentsManager.getAllInstrumentsTransform(); for(var i = 0; i < instrumentsTransform.length; i++){ //Collision Info type 1 - instruments var collisionInfo = {}; collisionInfo.type = 1; collisionInfo.position = instrumentsTransform[i].getWorldPosition(); collisionInfo.instrumentIndex = i; toCheck.push(collisionInfo); } var camPos = camTransform.getWorldPosition(); var dist = instrumentCollidedPos.distance(camPos); if(dist < primaryShortestDistance ){ closestObjectsInfo.primaryHitDistance = dist; primaryShortestDistance = dist; } } var shortestDist = Infinity; var shortestIndex = -1 for(var i = 0; i < toCheck.length; i++){ var collisionInfo = toCheck[i]; var collidedPos = rayToSphereCollisionHelper(collisionInfo.position, 3); if(collidedPos !== -1){ var camPos = camTransform.getWorldPosition(); var dist = collidedPos.distance(camPos); if(dist < shortestDist){ shortestDist = dist; shortestIndex = i; closestObjectsInfo.secondaryHitDistance = dist; } } } if(shortestIndex !== -1){ var collisionInfo = toCheck[shortestIndex]; //Collision Type is Beats or playBeatIndexs if(collisionInfo.type == 0){ closestObjectsInfo.visualizerIndex = collisionInfo.visualizerIndex; closestObjectsInfo.playBeatIndex = collisionInfo.playBeatIndex; closestObjectsInfo.type = 0; } //Collision Type is Instruments else if(collisionInfo.type == 1){ closestObjectsInfo.instrumentIndex = collisionInfo.instrumentIndex; closestObjectsInfo.type = 1; } } //If primary target is too close or far then treat it as if it didn't hit if(closestObjectsInfo.primaryHitDistance != null){ if(closestObjectsInfo.primaryHitDistance < MIN_INTERACTION_DIST){ closestObjectsInfo.primaryHitDistance = null; } //If primary target is too then secondary target is invalid as well if(closestObjectsInfo.primaryHitDistance > MAX_INTERACTION_DIST){ closestObjectsInfo.secondaryHitDistance = null; closestObjectsInfo.primaryHitDistance = null; } } return closestObjectsInfo; }
The closest interactable object should be detected every frame. Other scripts may use this behavior as well, so let's create a simple event system they can subscribe to.
var closestObjectsEvents = [];
/**
 * Subscribes to closestObjectsEvents(func) which will output the closest object detected by gaze.
 *
 * @method subscribeClosestObjectEvent()
 * @return {void}
 */
var subscribeClosestObjectEvent = function(func){
    closestObjectsEvents.push(func);
}

/**
 * Executes all the functions subscribed to closestObjectsEvents(func) which will output the closest object detected by gaze.
 *
 * @method fireClosestObjectEvents()
 * @return {void}
 */
function fireClosestObjectEvents(){
    var closestObjectsInfo = checkClosestObjects();
    for(var i = 0; i < closestObjectsEvents.length; i++){
        closestObjectsEvents[i](closestObjectsInfo);
    }
}

var updateEvent = function(){
    fireClosestObjectEvents();
}
global.serverSyncManager.subscribeToServerTimeUpdateEvent(updateEvent);
The highlighting and unhighlighting of an instrument or a beat based on the user's gaze is driven by the closest-interactable-object function created above.
/** * Updates any beat highlight animation when the user is gazing at the beat object. * * @param {vec3} center: The center of the sphere you need to test collision with. * closestObjectsInfo.primaryHitDistance//float * closestObjectsInfo.secondaryHitDistance//float * closestObjectsInfo.visualizerIndex//integer * closestObjectsInfo.playBeatIndex//integer * closestObjectsInfo.instrumentIndex//integer * closestObjectsInfo.type//integer, 0 is collision with beats, 1 is collision with instruments * @method updateBeatsAnimation(closestObjectsInfo) * @return {void} */ var updateBeatsAnimation = function (closestObjectsInfo){ if(closestObjectsInfo.type != 0 || closestObjectsInfo.secondaryHitDistance == null){ if(selectedBeats != null){ selectedBeats.api.stopBeatActiveAnimation(); } selectedBeats = null; return; } var visualizerIndex = closestObjectsInfo.visualizerIndex; var playBeatIndex = closestObjectsInfo.playBeatIndex; var sfxController = script.allSfxControllers[visualizerIndex].api; var sfxVisualizer = sfxController.getSfxVisualizer(); var beatsScripts = sfxVisualizer.getInfo().beatsScripts; if(selectedBeats != beatsScripts[playBeatIndex]){ if(selectedBeats != null){ selectedBeats.api.stopBeatActiveAnimation(); } selectedBeats = beatsScripts[playBeatIndex]; selectedBeats.api.playBeatActiveAnimation(); } } /** * Updates any instrument highlight animation when the user is gazing at the beat object. * * @param {vec3} center: The center of the sphere you need to test collision with. * closestObjectsInfo.primaryHitDistance//float * closestObjectsInfo.secondaryHitDistance//float * closestObjectsInfo.visualizerIndex//integer * closestObjectsInfo.playBeatIndex//integer * closestObjectsInfo.instrumentIndex//integer * closestObjectsInfo.type//integer, 0 is collision with beats, 1 is collision with instruments * @method updateInstrumentHighlightAnimation(closestObjectsInfo) * @return {void} */ var updateInstrumentHighlightAnimation = function (closestObjectsInfo){ if(closestObjectsInfo.type != 1 || closestObjectsInfo.secondaryHitDistance == null){ if(selectedInstrument != null){ //stop anim instrumentsManager.stopHighlightAnimationInstrumentAt(selectedInstrument); } selectedInstrument = null; return; } if(selectedInstrument != closestObjectsInfo.instrumentIndex){ if(selectedInstrument != null){ //stop anim instrumentsManager.stopHighlightAnimationInstrumentAt(selectedInstrument); } selectedInstrument = closestObjectsInfo.instrumentIndex; //play animation instrumentsManager.activateHighlightAnimationInstrumentAt(selectedInstrument); } } global.sfxManager.subscribeClosestObjectEvent(updateBeatsAnimation); global.sfxManager.subscribeClosestObjectEvent(updateInstrumentHighlightAnimation);
When a user gazes at an interactable object and taps on the touchpad, the Lens either updates the user's current instrument (if the object is an instrument) or places a note (if the object is a beat). The SFX Manager also lets the server know that a user has added a note to a musical loop player by sending a message of type SPAWN_SFX_WITH_INDEX.
/** * Add a note to a beat and let the connected Lens server know it has been sent. The type of note of the instrument being sent is randomized. Message Type is global.serverMessage.SPAWN_SFX_WITH_INDEX. * * @param {object} toPlayInfo: The angular value which will translate between 0 through 1 * playInfo.visualizerIndex //integer * playInfo.playBeatIndex //integer * playInfo.note = //integer * playInfo.instrument //integer * playInfo.instrumentType //integer * @method playBeatIndexInstantiateSend(toPlayInfo) * @return {void} */ function playBeatIndexInstantiateSend(toPlayInfo){ var instrument = toPlayInfo.instrument; var instrumentType = toPlayInfo.instrumentType; toPlayInfo.note = global.Mathf.getRandomIntInclusive(0, instrument.getAudioClipsLength() - 1); var instrumentAudioClip = instrument.getAudioClip(toPlayInfo.note); var sfxController = script.allSfxControllers[toPlayInfo.visualizerIndex].api; var data = {}; data.messageType = global.serverMessage.SPAWN_SFX_WITH_INDEX; data.visualizerIndex = toPlayInfo.visualizerIndex; data.instrumentType = toPlayInfo.instrumentType; data.instrumentNote = toPlayInfo.note; data.playBeatIndex = toPlayInfo.playBeatIndex; if(global.multiplayerController.sessionConnected){ global.multiplayerController.send(data); } sfxController.addSfxAtBeat(toPlayInfo.playBeatIndex, instrumentAudioClip, toPlayInfo.instrumentType, toPlayInfo.note); } /** * Helper function to playBeatIndexInstantiateSend(toPlayInfo) * * @param {integer} visualizerIndex: The index that reprents the visualizer loop objects. * @param {integer} playBeatIndex: The index that reprents the index of the beats. * @param {integer} note: The index that reprents the note of an instrument. * @method playBeatIndexInstantiateSendHelper(visualizerIndex, playBeatIndex, note) * @return {void} */ function playBeatIndexInstantiateSendHelper(visualizerIndex, playBeatIndex, note){ var toPlayInfo = playInfo; toPlayInfo.visualizerIndex = visualizerIndex; toPlayInfo.playBeatIndex = playBeatIndex; toPlayInfo.note = note; toPlayInfo.instrument = instrumentsManager.getCurrentInstrument(); toPlayInfo.instrumentType = instrumentsManager.getCurrentInstrumentIndex(); playBeatIndexInstantiateSend(toPlayInfo); } /** * Action which excutes when user taps on the touch pad. Touching on the touchpad will spawn a beat or instrument based on user's gaze. * * @method tapEvent() * @return {void} */ var tapEvent = function(){ var closestObjectsInfo = checkClosestObjects(); if(closestObjectsInfo.type == 0 && closestObjectsInfo.secondaryHitDistance != null){ playBeatIndexInstantiateSendHelper(closestObjectsInfo.visualizerIndex,closestObjectsInfo.playBeatIndex, 0); } else if(closestObjectsInfo.type == 1 && closestObjectsInfo.secondaryHitDistance != null){ instrumentsManager.setCurrentInstrument(closestObjectsInfo.instrumentIndex); } }
global.inputController.subscribeTap(tapEvent);
/**
 * Removes a note from a beat and lets the Connected Lens server know it has been removed. Message type is global.serverMessage.REMOVE_SFX_WITH_INDEX.
 *
 * @param {integer} visualizerIndex: The index that represents the visualizer loop object.
 * @param {integer} playBeatIndex: The index that represents the beat to remove the note from.
 * @method playBeatIndexRemoveSend(visualizerIndex, playBeatIndex)
 * @return {void}
 */
function playBeatIndexRemoveSend(visualizerIndex, playBeatIndex){
var sfxController = script.allSfxControllers[visualizerIndex].api;
var data = {};
data.messageType = global.serverMessage.REMOVE_SFX_WITH_INDEX;
data.visualizerIndex = visualizerIndex;
data.playBeatIndex = playBeatIndex;
if(global.multiplayerController.sessionConnected){
global.multiplayerController.send(data);
}
sfxController.deleteSfxAtBeat(playBeatIndex);
}
/**
 * Action which executes when the user taps and holds on the touchpad. Removes the object the user's gaze is upon.
 *
 * @method tapHoldEvent()
 * @return {void}
 */
var tapHoldEvent = function(){
var closestObjectsInfo = checkClosestObjects();
if(closestObjectsInfo.type == 0 && closestObjectsInfo.secondaryHitDistance != null){
playBeatIndexRemoveSend(closestObjectsInfo.visualizerIndex, closestObjectsInfo.playBeatIndex);
}
}
global.inputController.subscribeTapHold(tapHoldEvent);
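The snippets above only show the sending side. On the receiving side, every other Lens in the session gets these messages through the Multiplayer Controller and applies them to its own loop players. The project's actual handlers live elsewhere in the script, but a minimal sketch, assuming a handler shape analogous to receiveAllPlayersPosition in the Players Manager below and a hypothetical getInstrumentAt accessor on the instruments manager, might look like this:
/**
 * Hypothetical sketch: apply SPAWN_SFX_WITH_INDEX and REMOVE_SFX_WITH_INDEX
 * messages received from other users. Handler names are illustrative only.
 */
function receiveSpawnSfxWithIndex(session, userId, data){
    var sfxController = script.allSfxControllers[data.visualizerIndex].api;
    // Resolve the same audio clip the sender picked, using the instrument type and note index.
    var instrument = instrumentsManager.getInstrumentAt(data.instrumentType); // assumed accessor
    var instrumentAudioClip = instrument.getAudioClip(data.instrumentNote);
    sfxController.addSfxAtBeat(data.playBeatIndex, instrumentAudioClip, data.instrumentType, data.instrumentNote);
}
function receiveRemoveSfxWithIndex(session, userId, data){
    var sfxController = script.allSfxControllers[data.visualizerIndex].api;
    sfxController.deleteSfxAtBeat(data.playBeatIndex);
}
global.multiplayerController.subscribeOnMessageReceive(global.serverMessage.SPAWN_SFX_WITH_INDEX, receiveSpawnSfxWithIndex);
global.multiplayerController.subscribeOnMessageReceive(global.serverMessage.REMOVE_SFX_WITH_INDEX, receiveRemoveSfxWithIndex);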
Reticle Animation
The PlayReticleAnimation.js controls the animation of the reticle. The reticle helps indicate which object you are selecting and dynamically changes its Z-depth.
The reticle is another piece of visual feedback that lets users in the Lens know when they are interfacing with an interactable object. The reticle is shown when the user is near an interactable object and hides itself when there is none nearby. The reticle also transitions from an idle frame to a hover frame when the user's gaze lands directly on an interactable object.
/**
 * Scales the reticle in or out based on the given state. Scales in if reticleActive.ACTIVE, scales out if reticleActive.INACTIVE.
 *
 * @param {reticleActive} toActiveState: The reticle active state you want the reticle to go to.
 * @method changeReticleActiveState(toActiveState)
 * @return {void}
 */
function changeReticleActiveState(toActiveState){
if(!reticleActiveAnimReady || currReticleActive == toActiveState){
return;
}
currReticleActive = toActiveState;
switch(toActiveState){
case reticleActive.INACTIVE:
ChangeScale(0);
break;
case reticleActive.ACTIVE:
ChangeScale(1);
break;
}
}
/**
 * Changes the reticle animation state to either reticleState.IDLE or reticleState.HOVER.
 *
 * @param {reticleState} toChangeReticleState: The reticle animation state you want the reticle to go to.
 * @method changeReticleState(toChangeReticleState)
 * @return {void}
 */
function changeReticleState(toChangeReticleState){
if(!reticleStateAnimReady || currReticleState == toChangeReticleState){
return;
}
var prevReticleState = currReticleState;
currReticleState = toChangeReticleState;
if(prevReticleState == reticleState.IDLE){
switch(toChangeReticleState){
case reticleState.HOVER:
IdleToHover();
break;
default:
break;
}
}
else if(prevReticleState == reticleState.HOVER){
switch(toChangeReticleState){
case reticleState.IDLE:
HoverToIdle();
break;
default:
break;
}
}
}
/**
 * Changes the animation frame based on the animNum given.
 *
 * @param {integer} animNum: The frame number of the animation that should be played.
 * @method ToAnimSheet(animNum)
 * @return {void}
 */
function ToAnimSheet(animNum){
var changeRange = global.Mathf.map_range(TEXTURE_SIZE_HALF + animNum * TEXTURE_SIZE, 0, FULL_TEXTURE_SIZE, -0.5, 0.5);
script.offsetMaterial.mainPass.offset = new vec2(changeRange,0);
}
/**
 * Plays the full animation from a given animation sheet.
 *
 * @param {float} duration: How long the animation will take to finish.
 * @param {integer} frameStart: The frame number at which the animation will start.
 * @param {integer} frameEnd: The frame number at which the animation will end.
 * @param {integer} finalFrame: The final frame number the animation will go to after the animation is finished.
 * @method PlayAnimation(duration, frameStart, frameEnd, finalFrame)
 * @return {void}
 */
function PlayAnimation(duration, frameStart, frameEnd, finalFrame){
reticleStateAnimReady = false;
global.LSTween.rawTween(duration)
.onUpdate(function(o){
var toAnim = parseInt(global.Mathf.map_range(o.t, 0, 1, frameStart, frameEnd));
ToAnimSheet(toAnim);
})
.onComplete(function(o){
if(finalFrame != null){
ToAnimSheet(finalFrame);
}
reticleStateAnimReady = true;
})
.start();
}
/**
 * Changes the scale of the reticle.
 *
 * @param {float} toVal: The scale the animation will go to.
 * @method ChangeScale(toVal)
 * @return {void}
 */
function ChangeScale(toVal){
reticleStateAnimReady = false;
var initScale = reticleTransform.getWorldScale().x;
var toScale = toVal;
global.LSTween.rawTween(SCALE_ANIMATION_DURATION)
.onUpdate(function(o){
var scale = global.Mathf.map_range(o.t, 0, 1, initScale, toScale);
reticleTransform.setWorldScale(new vec3(scale,scale,scale));
})
.onComplete(function(o){
reticleTransform.setWorldScale(new vec3(toScale,toScale,toScale));
reticleStateAnimReady = true;
})
.start();
}
/**
 * Plays the animation from idle to hover.
 *
 * @method IdleToHover()
 * @return {void}
 */
function IdleToHover(){
PlayAnimation(ANIMATION_IDLE_DURATION, ANIMATION_IDLE_FRAME_START, ANIMATION_IDLE_FRAME_END);
}
/**
 * Plays the animation from hover to idle.
 *
 * @method HoverToIdle()
 * @return {void}
 */
function HoverToIdle(){
PlayAnimation(ANIMATION_HOVER_DURATION, ANIMATION_IDLE_FRAME_END, ANIMATION_IDLE_FRAME_START);
}
/**
 * Updates the reticle animation and position depending on the closest valid object.
 *
 * @param {object} closestObjectsInfo: The ray-cast to sphere collision information based on the user's gaze.
 * closestObjectsInfo.primaryHitDistance //float or null
 * closestObjectsInfo.secondaryHitDistance //float or null
 * closestObjectsInfo.visualizerIndex //integer
 * closestObjectsInfo.playBeatIndex //integer
 * closestObjectsInfo.instrumentIndex //integer
 * closestObjectsInfo.type //integer
 * @method updateVisualReticle(closestObjectsInfo)
 * @return {void}
 */
var updateVisualReticle = function(closestObjectsInfo){
var reticleDist = DEFAULT_RETICLE_DIST;
if((closestObjectsInfo.primaryHitDistance == null && closestObjectsInfo.secondaryHitDistance == null)){
//nothing is hit
if(prevReticleDist !== -1){
reticleDist = prevReticleDist;
}
changeReticleState(reticleState.IDLE);
changeReticleActiveState(reticleActive.INACTIVE);
}
else if(closestObjectsInfo.secondaryHitDistance == null){
//primary hit but not secondary
reticleDist = closestObjectsInfo.primaryHitDistance;
changeReticleState(reticleState.IDLE);
changeReticleActiveState(reticleActive.ACTIVE);
}
else{
//secondary hit
reticleDist = closestObjectsInfo.secondaryHitDistance;
changeReticleState(reticleState.HOVER);
changeReticleActiveState(reticleActive.ACTIVE);
}
var toPos = camTransform.getWorldPosition().add(camTransform.back.uniformScale(reticleDist));
reticleTransform.setWorldPosition(toPos);
prevReticleDist = reticleDist;
}
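Like the highlight callbacks in the SFX Manager, updateVisualReticle is driven by the closest-object gaze event. Assuming the same subscription API shown earlier (the exact wiring in the project may differ), hooking it up would look like this:
// Sketch: drive the reticle from the same gaze/collision event used by the SFX Manager.
global.sfxManager.subscribeClosestObjectEvent(updateVisualReticle);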
Players Manager
The PlayersManager.js manages all the players active in the Connected Lenses experience. When a user joins or leaves the same Connected Lens session as you, the server broadcasts that message to everyone in the session.
/**
* Instantiates a player prefab in the scene.
*
* @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience.
* @param {string} userId: The user that joined the session.
* @method addPlayerToScene(session, userId)
* @return {void}
*/
var addPlayerToScene = function(session, userId){
var playerObj = script.playerPrefab.instantiate(global.worldScene.worldObj);
playerObj.enabled = false;
var player = {};
var playerTransform = playerObj.getTransform();
player.transform = playerTransform;
player.sceneObj = playerObj;
player.initialized = false;
player.position = new vec3(0,0,0);
player.rotation = quat.quatIdentity();
playersInScene[userId] = player;
}
/**
* Removes the player with the passed userId from the scene.
*
* @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience.
* @param {string} userId: The user that left the session.
* @method removePlayerFromScene(session, userId)
* @return {void}
*/
var removePlayerFromScene = function(session, userId){
playersInScene[userId].sceneObj.destroy();
delete playersInScene[userId];
}
global.multiplayerController.subscribeOnUserJoinedEvents(addPlayerToScene);
global.multiplayerController.subscribeOnUserLeftEvents(removePlayerFromScene);
Example: visual representation of players in the scene.
When the current user moves in the scene, their position and rotation need to be sent to the server so they can be reflected to the other users in the same session. The Players Manager lets the server know that the current user's movement has been updated by sending a message with the type PLAYER_TRANSFORM. The position is sent with a timeout because it is acceptable not to know the user's exact position in the scene at every moment.
/**
* Sends the current Lens user's position and rotation to the server with a timeout (in milliseconds).
*
* @param {Transform} transform: The transform of the current Lens user.
* @method sendCurrentPlayerTransformToServer(transform)
* @return {void}
*/
function sendCurrentPlayerTransformToServer(transform){
var pos = transform.getLocalPosition();
var rot = transform.getLocalRotation();
const data = {};
data.messageType = global.serverMessage.PLAYER_TRANSFORM;
data.posX = pos.x;
data.posY = pos.y;
data.posZ = pos.z;
data.quatW = rot.w;
data.quatX = rot.x;
data.quatY = rot.y;
data.quatZ = rot.z;
if(global.multiplayerController.sessionConnected){
global.multiplayerController.sendWithTimeOut(data, SEND_PLAYER_INFO_TIMEOUT_MS_DURATION);
}
}
/**
* Calculates the current Lens user's position and rotation and sends the information to the server.
*
* @method updateCurrentPlayerMovement()
* @return {void}
*/
function updateCurrentPlayerMovement(){
if(playerTransform == null || Object.keys(playersInScene).length <= 0){
return;
}
timerToSendToServer += getDeltaTime();
if(timerToSendToServer > SEND_TO_SERVER_RATE){
playerTransform.setWorldPosition(cameraTransform.getWorldPosition());
playerTransform.setWorldRotation(cameraTransform.getWorldRotation());
var diffPos = prevWorldPos.distance(playerTransform.getWorldPosition());
var diffRot = quat.angleBetween(prevWorldRot, playerTransform.getWorldRotation());
timerToSendToServer = 0;
sendCurrentPlayerTransformToServer(playerTransform);
}
prevWorldPos = playerTransform.getWorldPosition();
prevWorldRot = playerTransform.getWorldRotation();
}
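The sendWithTimeOut wrapper on the Multiplayer Controller is not shown in this section. As a rough idea of what it could do, here is a minimal sketch, assuming it lives inside MultiplayerController.js where the connected session is stored in a local session variable, and assuming the Connected Lens session exposes a send-with-timeout call; the project's actual implementation may differ:
// Hypothetical sketch of sendWithTimeOut inside MultiplayerController.js.
// Assumes `session` is the connected MultiplayerSession and that messages are
// serialized to JSON strings before sending, as with the regular send().
global.multiplayerController.sendWithTimeOut = function(data, timeoutMs){
    if(session == null){
        return;
    }
    // Letting the message be dropped if it cannot be delivered within timeoutMs
    // is acceptable for frequent, non-critical transform updates.
    session.sendMessageWithTimeout(JSON.stringify(data), timeoutMs);
};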
Each Lens experience is responsible for updating the other users' positions and rotations in the scene. When a Lens receives the message PLAYER_TRANSFORM, it updates the position and rotation of the specific player that sent that message.
/**
 * Updates the position and rotation of other players in the scene.
 *
 * @param {session} session: An instance of a Connected Lens session among a group of participants who were successfully invited into the experience.
 * @param {string} userId: The user that sent their player position.
 * @param {object} data: Contains information on position and rotation.
 * @method receiveAllPlayersPosition(session, userId, data)
 * @return {void}
 */
function receiveAllPlayersPosition(session, userId, data){
var player = playersInScene[userId];
if(player == null || player.transform == null){
return;
}
var toPos = new vec3(data.posX, data.posY, data.posZ);
var toRot = new quat(data.quatW, data.quatX, data.quatY, data.quatZ);
if(!player.initialized){
player.transform.setLocalPosition(toPos);
player.transform.setLocalRotation(toRot);
player.initialized = true;
player.sceneObj.enabled = true;
}
player.position = toPos;
player.rotation = toRot;
}
global.multiplayerController.subscribeOnMessageReceive(global.serverMessage.PLAYER_TRANSFORM, receiveAllPlayersPosition);
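Note that receiveAllPlayersPosition only snaps the transform the first time a message arrives; after that it just stores the latest position and rotation on the player record. A per-frame update elsewhere in the script can then ease each remote player toward those stored values so movement looks smooth instead of teleporting. Here is a minimal sketch of such an update; the lerp factor and the event wiring are illustrative assumptions, not the project's exact code:
// Sketch: smooth remote players toward their last received transform each frame.
// REMOTE_PLAYER_LERP is an assumed smoothing factor; the project may interpolate differently.
var REMOTE_PLAYER_LERP = 0.2;
script.createEvent("UpdateEvent").bind(function(){
    for(var userId in playersInScene){
        var player = playersInScene[userId];
        if(!player.initialized){
            continue;
        }
        var currPos = player.transform.getLocalPosition();
        var currRot = player.transform.getLocalRotation();
        player.transform.setLocalPosition(vec3.lerp(currPos, player.position, REMOTE_PLAYER_LERP));
        player.transform.setLocalRotation(quat.slerp(currRot, player.rotation, REMOTE_PLAYER_LERP));
    }
});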
This concludes the article on Soundshare. You can use this project as a starting point for your own Connected Lens experience or expand it to make it uniquely yours. You could add different audio, add more musical loop players, make the loops play faster, change the 3D meshes in the scene, and so forth. Have fun experimenting with the project and create the world you want to see!