webrtc-webaudio-hooks v1.0.0
Why this exists
While working on a project, we realized that parts of the WebRTC and Web Audio APIs could be abstracted into hooks to make them easier to reuse.
Table of Contents
Installation
Usage
Other Solutions
Issues
Contributors
LICENSE
Installation
This module is distributed via npm which is bundled with node and should be installed as one of your project's dependencies:
npm install --save webrtc-webaudio-hooks
yarn add webrtc-webaudio-hooks
Usage
use-media-stream
import React from 'react'
import ReactDOM from 'react-dom'
import { useMediaStream } from 'webrtc-webaudio-hooks'
function ExampleComponent() {
  const {stream, isLoading, muted, toggleVideo} = useMediaStream()
  const videoRef = React.useRef(null)

  // srcObject must be set as a DOM property rather than a JSX attribute, so attach the stream via a ref
  React.useEffect(() => {
    if (videoRef.current) videoRef.current.srcObject = stream
  }, [stream])

  if (isLoading) return <span>Getting your stream ready...</span>

  return (
    <>
      <video ref={videoRef} autoPlay />
      <ControlPanel muted={muted} toggleVideo={toggleVideo} />
    </>
  )
}
// API
return {
  // MediaStream representing the stream of media content
  stream: MediaStream,
  // Boolean value representing whether the current stream is muted
  muted: boolean,
  // Boolean value representing whether the current stream is visible
  visible: boolean,
  // Function that toggles the "muted" state
  toggleAudio: () => void,
  // Function that toggles the "visible" state (including the webcam light indicator)
  toggleVideo: (onTurnCamOn?: (track: MediaStreamTrack) => void) => void,
  // Boolean status representing whether the MediaStream is being created
  isLoading: boolean,
  // Boolean status representing whether creating the MediaStream failed
  isError: boolean,
  // Boolean status representing whether the MediaStream was created successfully
  isSuccess: boolean
}
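For illustration, here is a minimal sketch of a control panel built on the API above: it toggles audio directly and, via the onTurnCamOn callback, hands the fresh video track to a peer connection when the camera is turned back on. The CallControls component and the peerConnection prop are assumptions for this example, not part of the library.
import React from 'react'
import { useMediaStream } from 'webrtc-webaudio-hooks'

// Hypothetical wrapper: `peerConnection` is an RTCPeerConnection owned by the app,
// not something provided by the hook
function CallControls({ peerConnection }) {
  const { stream, muted, visible, toggleAudio, toggleVideo } = useMediaStream()

  return (
    <>
      <button onClick={() => toggleAudio()}>
        {muted ? 'Unmute' : 'Mute'}
      </button>
      <button
        onClick={() =>
          toggleVideo((track) => {
            // onTurnCamOn: pass the re-enabled video track to the peer connection
            // so the remote side receives it again
            peerConnection.addTrack(track, stream)
          })
        }
      >
        {visible ? 'Turn camera off' : 'Turn camera on'}
      </button>
    </>
  )
}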
use-screen
import React from 'react'
import ReactDOM from 'react-dom'
import { useScreen } from 'webrtc-webaudio-hooks'
function ExampleComponent({stream}: {stream: MediaStream}) {
  const {startShare, stopShare} = useScreen(stream)
  return <ControlPanel startShareMyScreen={startShare} stopShareMyScreen={stopShare} />
}
// API
return {
  // MediaStreamTrack representing the stream of display (screen) content
  screenTrack: MediaStreamTrack,
  // Function that creates the display media and takes two callbacks as arguments:
  // @param onstarted - an optional function that is called when screen sharing starts
  // @param onended - an optional function that is called when screen sharing stops
  startShare: (
    onstarted?: () => void,
    onended?: () => void
  ) => Promise<void>,
  // Function that stops sharing the given screen track
  stopShare: (screenTrack: MediaStreamTrack) => void
}
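A minimal usage sketch, assuming the API above: the optional onstarted/onended callbacks keep a local "sharing" flag in sync, and stopShare receives the screenTrack returned by the hook. The ScreenShareButton component and its stream prop are assumptions for this example.
import React from 'react'
import { useScreen } from 'webrtc-webaudio-hooks'

// Sketch only: `stream` is the local camera MediaStream obtained elsewhere in the app
function ScreenShareButton({ stream }) {
  const { screenTrack, startShare, stopShare } = useScreen(stream)
  const [sharing, setSharing] = React.useState(false)

  const start = () =>
    startShare(
      () => setSharing(true),  // onstarted
      () => setSharing(false)  // onended
    )

  const stop = () => {
    stopShare(screenTrack)
    setSharing(false)
  }

  return sharing
    ? <button onClick={stop}>Stop sharing</button>
    : <button onClick={start}>Share my screen</button>
}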
use-is-audio-active
import React from 'react'
import ReactDOM from 'react-dom'
import { useIsAudioActive } from 'webrtc-webaudio-hooks'
function ExampleComponent() {
  const [stream, setStream] = React.useState(null)
  const isActive = useIsAudioActive({ source: stream });

  React.useEffect(() => {
    (async function createStream() {
      const stream = await navigator.mediaDevices.getUserMedia({
        audio: true,
        video: true,
      });
      setStream(stream)
    })()
  }, [])

  return (
    <p>
      Am I speaking:{' '}
      {isActive ? 'yes, you are' : "seems like you ain't"}
    </p>
  )
}
// API
// Boolean value representing whether the audio stream is active (checked every second)
return isActive
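Because the hook accepts any MediaStream as its source, the same pattern can serve as a "who is speaking" indicator for remote participants. A sketch; the ParticipantTile component and its props are assumptions for this example, not part of the library.
import React from 'react'
import { useIsAudioActive } from 'webrtc-webaudio-hooks'

// Hypothetical participant tile: highlights while the participant's stream has audio activity
function ParticipantTile({ stream, name }) {
  const isActive = useIsAudioActive({ source: stream })

  return (
    <div style={{ outline: isActive ? '2px solid limegreen' : 'none' }}>
      {name}
    </div>
  )
}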
Other Solutions
Issues
Looking to contribute? Look for the Good First Issue label.
🐛 Bugs
Please file an issue for bugs, missing documentation, or unexpected behavior.
💡 Feature Requests
Please file an issue to suggest new features. Vote on feature requests by adding a 👍. This helps maintainers prioritize what to work on.
Contributors ✨
This project follows the all-contributors specification. Contributions of any kind welcome!
LICENSE
MIT