
AudioContext.close() in Web APIs

Last Updated : 07 Sep, 2021

The Web Audio API lets a website or web application generate and process sound. Audio makes a site more engaging, and many music and sound applications can be built on top of this API. In this article, we will learn about AudioContext.close() along with some basic information about the Web Audio API.

AudioContext is an object used to perform various manipulations on the audio of a website or application. It is a built-in interface supported by all major browsers, such as Google Chrome and Firefox. Before any audio can be processed, its source has to be obtained. There are three major types of audio sources; a short sketch of how each one is created follows the list below:

  • Oscillator: produces a mathematically computed sound, such as a sine wave
  • Audio samples: audio decoded from files, for example MP3 or WAV
  • Audio stream: audio captured from a webcam or microphone
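The following snippet is only a rough sketch of how each source type is typically obtained; the variable names and the "sound.mp3" URL are illustrative placeholders rather than part of any particular application:

const audioCtx = new (window.AudioContext || window.webkitAudioContext)();

// 1. Oscillator: a mathematically generated tone
const osc = audioCtx.createOscillator();
osc.frequency.value = 440; // hertz

// 2. Audio sample: decode an audio file fetched over the network
fetch("sound.mp3")
    .then((response) => response.arrayBuffer())
    .then((data) => audioCtx.decodeAudioData(data))
    .then((buffer) => {
        const sampleSource = audioCtx.createBufferSource();
        sampleSource.buffer = buffer;
        // ...connect sampleSource to further nodes here
    });

// 3. Audio stream: capture audio from the microphone
navigator.mediaDevices.getUserMedia({ audio: true })
    .then((stream) => {
        const micSource = audioCtx.createMediaStreamSource(stream);
        // ...connect micSource to further nodes here
    });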

Once audio is obtained from one of these sources, parts of it can be emphasized or attenuated by passing it through different nodes. After the node processing is done, the chain is connected to the destination and the sound is played. Some of the available nodes are BiquadFilterNode, ChannelSplitterNode, AudioWorkletNode, etc.
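For instance, a minimal illustrative chain that attenuates an oscillator with a GainNode before sending it to the speakers could look like the following (the 0.5 gain value is an arbitrary choice):

const audioCtx = new AudioContext();
const oscillator = audioCtx.createOscillator();
const gainNode = audioCtx.createGain();

// Source -> gain (attenuation) -> destination (speakers)
oscillator.connect(gainNode);
gainNode.connect(audioCtx.destination);

gainNode.gain.value = 0.5; // halve the volume
oscillator.start();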

AudioContext.close(): This method closes the AudioContext and releases any hardware resources associated with it, i.e. the context no longer sends audio to or receives audio from the sound device. Audio data that was already created before close() was called (such as decoded AudioBuffers) can still be used. Releasing the context matters most on low-power devices such as mobile phones: as long as an AudioContext stays open, the device's audio hardware stays active and drains the battery.

Syntax:

// Declaring an AudioContext constructor
var audioContext = new AudioContext();

// Promise-based form
audioContext.close().then(function () {
    // ...
});

// Or, inside an async function
await audioContext.close();
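As an example, assuming a context has been created as above, closing it from inside an async function and then checking its state could look like this (the shutDown() helper name is just for illustration):

const audioCtx = new AudioContext();

async function shutDown() {
    // Release the sound hardware associated with this context
    await audioCtx.close();
    console.log(audioCtx.state); // "closed"
}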

Example: In the following code, a small HTML page is created with three buttons. Clicking each button performs the corresponding action on the audio. The audio can be started, suspended, and resumed; note that suspending is different from stopping, because a suspended context is only paused temporarily and can be resumed from where it left off. Stopping the audio with close(), on the other hand, completely releases the sound device and resets the page to its initial state.

HTML

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8" />
 
        <title>states</title>
 
        <link rel="stylesheet" href="" />
        <style>
            body {
                background: lightcoral;
                color: #323232;
                font-weight: 300;
                height: 100vh;
                margin: 0;
                display: flex;
                align-items: center;
                justify-content: center;
                text-align: center;
                font-family: Helvetica neue, roboto;
            }
            .btn-group .button {
                background-color: bisque;
                border: 1px solid black;
                color: black;
                padding: 15px 32px;
                text-align: center;
                text-decoration: none;
                font-size: 16px;
                cursor: pointer;
                width: 150px;
                display: block;
                margin: 4px 2px;
            }
            .button:hover {
                background-color: whitesmoke;
            }
 
            h1 {
                font-weight: 200;
                font-size: 26px;
                margin: 10px;
            }
        </style>
    </head>
 
    <body>
        <div class="btn-group">
            <button id="start" class="button">
              Start Audio
            </button>
            <button id="sus" class="button">
              Suspend Audio
            </button>
            <button id="stop" class="button">
              Stop Audio
            </button>

            <p>Current context time: No context exists.</p>
        </div>
 
        <script>
            let audioCtx;
 
            const start = document.getElementById("start");
            const susres = document.getElementById("sus");
            const stop = document.getElementById("stop");
 
            const timeDisplay = document.querySelector("p");
 
            susres.setAttribute("disabled", "disabled");
            stop.setAttribute("disabled", "disabled");
 
            start.onclick = function () {
                start.setAttribute("disabled", "disabled");
                susres.removeAttribute("disabled");
                stop.removeAttribute("disabled");
 
                // Create the Web Audio API context
                // (webkitAudioContext covers older Safari versions)
                const AudioCtx = window.AudioContext
                      || window.webkitAudioContext;
                audioCtx = new AudioCtx();

                // Create an oscillator and a filter
                const oscillator = audioCtx.createOscillator();
                const filter = audioCtx.createBiquadFilter();

                // Connect oscillator to filter to speakers
                oscillator.connect(filter);
                filter.connect(audioCtx.destination);

                // Produce a sine tone
                oscillator.type = "sine";

                // Frequency in hertz
                oscillator.frequency.value = 100;
                oscillator.start(0);
            };
 
            // Suspend/resume the AudioContext, i.e. the audio
            // is paused and can later be resumed from the
            // same point
            susres.onclick = function () {
                if (audioCtx.state === "running") {
                    audioCtx.suspend().then(function () {
                        susres.textContent = "Resume Audio";
                    });
                } else if (audioCtx.state === "suspended") {
                    audioCtx.resume().then(function () {
                        susres.textContent = "Suspend Audio";
                    });
                }
            };
 
            // Close the AudioContext, i.e. the audio is
            // completely stopped when the stop button is
            // clicked; once the returned promise resolves,
            // the buttons reset to the beginning state
            stop.onclick = function () {
                audioCtx.close().then(function () {
                    start.removeAttribute("disabled");
                    susres.setAttribute("disabled", "disabled");
                    stop.setAttribute("disabled", "disabled");
                });
            };
 
            // Continuously display the context's current time
            function displayTime() {
                if (audioCtx && audioCtx.state !== "closed") {
                    timeDisplay.textContent = "Audio time: "
                        + audioCtx.currentTime.toFixed(3);
                } else {
                    timeDisplay.textContent = "Context not started";
                }
                requestAnimationFrame(displayTime);
            }
 
            displayTime();
        </script>
    </body>
</html>


Output: The audio starts when 'Start Audio' is clicked. It can be temporarily paused with 'Suspend Audio' and continued from the same point with 'Resume Audio'. Clicking 'Stop Audio' stops the audio completely using AudioContext.close().
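Note that a closed context cannot be restarted: calling resume() on it should reject with an InvalidStateError, so a brand-new AudioContext has to be created instead. A minimal sketch of that behaviour:

const audioCtx = new AudioContext();

audioCtx.close().then(function () {
    // The context is now closed; trying to resume it fails
    return audioCtx.resume();
}).catch(function (err) {
    console.log(err.name); // "InvalidStateError"
    // Start over with a fresh context instead
    const freshCtx = new AudioContext();
});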


