Play audio data using computer's audio device
The AudioPlayer object plays audio data using the computer's audio device.
To play audio data using the computer's audio device:

1. Define and set up your audio player object. See Construction.
2. Call step to play audio data according to the properties of dsp.AudioPlayer. The behavior of step is specific to each object in the toolbox.
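As a hedged sketch of this two-step pattern, the following plays a short sine tone. The dsp.SineWave source and all parameter values here are illustrative choices, not part of this reference page:

% Step 1: define and set up the audio player object.
ap = dsp.AudioPlayer('SampleRate',44100);    % sample rate is illustrative

% A test source (illustrative): 440 Hz tone in 1024-sample frames.
tone = dsp.SineWave('Frequency',440,'SampleRate',44100, ...
                    'SamplesPerFrame',1024);

% Step 2: call step repeatedly to play each frame of audio data.
for k = 1:100
    step(ap, step(tone));
end
release(ap);                                 % close the audio output device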
This System object™ buffers the data it sends to the audio device using the process illustrated by the following figure.
H = dsp.AudioPlayer returns
an audio player object,
H, that plays audio samples
using an audio output device in real time.
H = dsp.AudioPlayer('PropertyName',PropertyValue, ...) returns an audio player object, H, with each property set to the specified value.
H = dsp.AudioPlayer(SAMPLERATE,'PropertyName',PropertyValue, ...) returns an audio player object, H, with the SampleRate property set to SAMPLERATE and other specified properties set to the specified values.

This System object supports variable-size input. If you use variable-size signals with this System object, you may experience sound dropouts when the size of the input frame increases. To avoid this behavior, use a signal of the maximum expected size when you first call step on this System object.
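One way to follow this advice is to prime the object with a frame of the maximum expected size on the first call to step; the sample rate and frame sizes below are illustrative:

ap = dsp.AudioPlayer('SampleRate',48000);    % values are illustrative
maxFrameSize = 2048;                         % maximum expected frame size
step(ap, zeros(maxFrameSize,1));             % first call at maximum size
step(ap, zeros(512,1));                      % later frames may be smaller
release(ap);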
Device to which to send audio data

Specify the device to which to send the audio data. The default is Default, the computer's standard audio output device.
Number of samples per second sent to audio device

Specify the number of samples per second in the signal as an integer. The default is 44100.
Data type used by device

Specify the data type used by the audio device to play the audio data.
Source of buffer size

Specify how to determine the buffer size, as Auto or Property. The default is Auto.
Buffer size

Specify the size of the buffer that the audio player object uses to communicate with the audio device as an integer. This property applies when you set the BufferSizeSource property to Property. The default is 4096.
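For example, a sketch that sets the buffer size explicitly (the value 512 is illustrative):

ap = dsp.AudioPlayer('BufferSizeSource','Property', ...
                     'BufferSize',512);      % illustrative buffer size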
Size of queue in seconds

Specify the length of the audio queue, in seconds. The default is 1 second.

To minimize latency, lower the QueueDuration property value.
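For instance (the value is illustrative; very small queues increase the risk of underruns):

ap = dsp.AudioPlayer('QueueDuration',0.1);   % shorter queue, lower latency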
Enable output of underrun count

When you set this property to true, the step method outputs the number of samples by which the audio queue underran while playing the current frame. The default is false.
Source of device channel mapping

Specify whether to determine the channel mapping automatically (Auto) or from the ChannelMapping property (Property). The default is Auto.
Data-to-device channel mapping

Vector of valid channel indices representing the mapping between data channels and device output channels. The term channel mapping refers to a 1-to-1 mapping that associates channels of the data with channels on the selected audio device. When you play audio, channel mapping allows you to specify which channel of the audio data goes to a specific output channel of the device. By default, the object uses a 1-to-1 mapping between the data channels and the device output channels.
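A hedged sketch of an explicit mapping, swapping the left and right channels of stereo data (the device and data layout are assumptions):

% Send data channel 1 to device channel 2 and vice versa (illustrative).
ap = dsp.AudioPlayer('ChannelMappingSource','Property', ...
                     'ChannelMapping',[2 1]);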
| clone         | Create audio player object with same property values          |
| getNumInputs  | Number of expected inputs to step method                      |
| getNumOutputs | Number of outputs of step method                              |
| isLocked      | Locked status for input attributes and nontunable properties  |
| release       | Allow property value and input characteristics changes        |
| step          | Write audio to audio output device                            |
Read in an audio file, and play it back using the standard audio output device:
AFR = dsp.AudioFileReader;  % points to a default audio file
AP = dsp.AudioPlayer('SampleRate',AFR.SampleRate, ...
                     'QueueDuration',2, ...
                     'OutputNumUnderrunSamples',true);
while ~isDone(AFR)
    audio = step(AFR);
    nUnderrun = step(AP,audio);
    if nUnderrun > 0
        fprintf('Audio player queue underrun by %d samples.\n', ...
            nUnderrun);
    end
end
pause(AP.QueueDuration);  % wait until audio is played to the end
release(AFR);             % close the input file
release(AP);              % close the audio output device
To learn how to measure and tune audio throughput using this object, see the Measuring Audio Latency example.
To run your generated standalone executable application in Shell, you need to set your environment to the following:
This object implements the algorithm, inputs, and outputs described on the To Audio Device block reference page. The object properties correspond to the block parameters.