I'm making a system in Unity using Mirror Networking that only requires one host/server and one client. It's a WebGL build, so the client connects in a browser. I want to send commands from the server to play audio clips on the client without the client being able to control anything. The audio is a set of questions an avatar asks the user. Due to another constraint (the Lip Sync setup) I have one AudioSource
and one long audio clip that contains all of the questions and is started by pressing the space bar. So rather than playing each question as a separate clip, the single clip is paused and unpaused: it pauses itself after a set time when the current question line finishes, and then I press the enter key to unpause the AudioSource
for the next question (and again, each line pauses itself after a set time equal to the length of that individual question).
Before I added Mirror or any networking component to the project this worked fine in WebGL, but now the audio no longer behaves the same. I tried it in Google Chrome and Microsoft Edge. It will play/unpause when I press the space/enter keys, and it will pause, but after the first two lines (which play correctly) it starts to skip ahead a few seconds every time I unpause it. I used a WebGL development build to verify that AudioSource.time
is correct before it is unpaused, yet it still plays the wrong part of the audio clip. It almost seems like the clip keeps running even after it has been paused, because the longer I wait the further into the clip it plays. It isn't a direct second-for-second relationship, though; even when I play lines closely after one another, it seems to skip ahead further and further each time.
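For reference, this is roughly how I checked the time in the development build (a minimal sketch; in the actual script the log sits right next to the UnPause call, and soundFX is the same AudioSource field as in the code below):

```
// Logged just before unpausing, to compare the value shown in the editor/host
// with what the browser console prints for the client.
Debug.Log($"AudioSource.time before UnPause: {soundFX.time:F2}s");
soundFX.UnPause();
Debug.Log($"AudioSource.time after UnPause: {soundFX.time:F2}s");
```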
In this networking case, are there two different AudioSources (one for the server and one for the client)? Even then, how could the debug log show the correct AudioSource.time
(in the Unity editor and in the browser) yet the clip be unpaused at a different position? Is the client's copy of the AudioSource running in the background, so its time keeps increasing even though the audio has been paused? That doesn't make sense to me, but I'm fairly new to Unity, and especially to Mirror and network programming in general. Or could this be some issue with WebGL? I don't think it's purely a WebGL issue, since this part worked in WebGL before I added any networking component to the project.
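If the client's AudioSource really is drifting, I suppose I could confirm it by also logging the time on the client side inside the RPCs, something like this (just an idea, not something I've wired up yet):

```
[ClientRpc]
void pauseSound()
{
    soundFX.Pause();
    // If the clip were still advancing behind the scenes, this value would keep
    // growing between this pause and the next unpause, even though playback stopped.
    Debug.Log($"[Client] time at pause: {soundFX.time:F2}s");
}
```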
When I build it as a Windows project instead of WebGL, it works perfectly, so I'm not sure whether the problem is in my networking code. Could it be an issue specifically caused by using Mirror (or networking in general) with a WebGL build?
```
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Mirror;

public class SoundTracker : NetworkBehaviour
{
    public AudioSource soundFX;
    public int count = 0;

    // Update is called once per frame
    void Update()
    {
        // Only the server reads key presses; playback on the client is driven by the ClientRpcs below.
        if (!isServer) return;

        if (Input.GetKeyDown(KeyCode.Space))
        {
            count = 0;
            startSound();
            StartCoroutine(SoundWaiter(33.5f));
        }

        if (Input.GetKeyDown(KeyCode.Return))
        {
            // Length (in seconds) of each question line; count indexes into this array.
            float[] lineDurations = { 3.2f, 4.75f, 4.5f, 5f, 4f, 5.5f, 5.5f, 3.8f, 8.6f, 2.9f };
            playSound();
            StartCoroutine(SoundWaiter(lineDurations[count]));
            count = count + 1;
        }
    }

    [Server]
    IEnumerator SoundWaiter(float length)
    {
        // Wait until the current question line has finished, then pause the clip on the client.
        yield return new WaitForSeconds(length);
        pauseSound();
    }

    [ClientRpc]
    void startSound()
    {
        soundFX.Play();
    }

    [ClientRpc]
    void playSound()
    {
        soundFX.UnPause();
    }

    [ClientRpc]
    void pauseSound()
    {
        soundFX.Pause();
    }
}
```