[doc] daily update 2024-11-6 #2076

Open · wants to merge 1 commit into base: main
2 changes: 1 addition & 1 deletion lib/src/agora_base.dart
@@ -692,7 +692,7 @@ enum QualityType {
@JsonValue(7)
qualityUnsupported,

- /// 8: Detecting the network quality.
+ /// 8: The last-mile network probe test is in progress.
@JsonValue(8)
qualityDetecting,
}
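For context, here is a minimal Dart sketch (not part of this PR) of how the reworded qualityDetecting (8) value surfaces at runtime: register onLastmileQuality, treat qualityDetecting as "probe still running", then start the probe test. Engine setup is elided and the bitrate expectations are placeholder values.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> runLastmileCheck(RtcEngine engine) async {
  engine.registerEventHandler(RtcEngineEventHandler(
    onLastmileQuality: (QualityType quality) {
      if (quality == QualityType.qualityDetecting) {
        // The last-mile probe test is still in progress; wait for a later report.
        return;
      }
      print('Last-mile quality: $quality');
    },
  ));
  // Probe both directions with example bitrate expectations (bps).
  await engine.startLastmileProbeTest(LastmileProbeConfig(
    probeUplink: true,
    probeDownlink: true,
    expectedUplinkBitrate: 100000,
    expectedDownlinkBitrate: 100000,
  ));
}
```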
8 changes: 5 additions & 3 deletions lib/src/agora_media_base.dart
@@ -970,15 +970,17 @@ class VideoFrame {
@JsonKey(name: 'matrix')
final List<double>? matrix;

- /// The alpha channel data output by using portrait segmentation algorithm. This data matches the size of the video frame, with each pixel value ranging from [0,255], where 0 represents the background and 255 represents the foreground (portrait). By setting this parameter, you can render the video background into various effects, such as transparent, solid color, image, video, etc. In custom video rendering scenarios, ensure that both the video frame and alphaBuffer are of the Full Range type; other types may cause abnormal alpha data rendering.
+ /// The alpha channel data output by using portrait segmentation algorithm. This data matches the size of the video frame, with each pixel value ranging from [0,255], where 0 represents the background and 255 represents the foreground (portrait). By setting this parameter, you can render the video background into various effects, such as transparent, solid color, image, video, etc.
+ /// In custom video rendering scenarios, ensure that both the video frame and alphaBuffer are of the Full Range type; other types may cause abnormal alpha data rendering.
+ /// Make sure that alphaBuffer is exactly the same size as the video frame (width × height), otherwise it may cause the app to crash.
@JsonKey(name: 'alphaBuffer', ignore: true)
final Uint8List? alphaBuffer;

/// @nodoc
@JsonKey(name: 'pixelBuffer', ignore: true)
final Uint8List? pixelBuffer;

- /// The meta information in the video frame. To use this parameter, please contact.
+ /// The meta information in the video frame. To use this parameter, contact.
@VideoFrameMetaInfoConverter()
@JsonKey(name: 'metaInfo')
final VideoFrameMetaInfo? metaInfo;
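To illustrate the size requirement added to the alphaBuffer doc above, here is a minimal Dart sketch (not part of this PR) of a custom-rendering callback that validates alphaBuffer against width × height before using it. The onRenderVideoFrame callback signature is assumed from the 6.x VideoFrameObserver API; treat it as illustrative.

```dart
import 'dart:typed_data';

import 'package:agora_rtc_engine/agora_rtc_engine.dart';

final videoFrameObserver = VideoFrameObserver(
  onRenderVideoFrame: (String channelId, int remoteUid, VideoFrame frame) {
    final Uint8List? alpha = frame.alphaBuffer;
    final int? width = frame.width;
    final int? height = frame.height;
    if (alpha == null || width == null || height == null) return;
    if (alpha.length != width * height) {
      // Mask and frame sizes disagree; skip alpha blending for this frame
      // instead of risking a crash.
      return;
    }
    // Safe to blend: alpha[i] == 0 is background, 255 is foreground (portrait).
  },
);
```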
@@ -1393,7 +1395,7 @@ class AudioSpectrumObserver {
///
/// After successfully calling registerAudioSpectrumObserver to implement the onRemoteAudioSpectrum callback in the AudioSpectrumObserver and calling enableAudioSpectrumMonitor to enable audio spectrum monitoring, the SDK will trigger the callback as the time interval you set to report the received remote audio data spectrum.
///
- /// * [spectrums] The audio spectrum information of the remote user, see UserAudioSpectrumInfo. The number of arrays is the number of remote users monitored by the SDK. If the array is null, it means that no audio spectrum of remote users is detected.
+ /// * [spectrums] The audio spectrum information of the remote user. See UserAudioSpectrumInfo. The number of arrays is the number of remote users monitored by the SDK. If the array is null, it means that no audio spectrum of remote users is detected.
/// * [spectrumNumber] The number of remote users.
final void Function(
List<UserAudioSpectrumInfo> spectrums, int spectrumNumber)?
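A minimal Dart sketch (not part of this PR) of the flow described in the onRemoteAudioSpectrum doc above: register the observer, then enable spectrum monitoring. The intervalInMS parameter name and the 100 ms value are assumptions about the 6.x API, not something this PR documents.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> watchRemoteSpectrum(RtcEngine engine) async {
  engine.registerAudioSpectrumObserver(AudioSpectrumObserver(
    onRemoteAudioSpectrum:
        (List<UserAudioSpectrumInfo> spectrums, int spectrumNumber) {
      // One entry per remote user monitored by the SDK; an empty list means
      // no remote audio spectrum was detected.
      print('Spectrum reported for $spectrumNumber remote user(s).');
    },
  ));
  await engine.enableAudioSpectrumMonitor(intervalInMS: 100);
}
```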
17 changes: 2 additions & 15 deletions lib/src/agora_media_engine.dart
@@ -48,13 +48,7 @@ abstract class MediaEngine {

/// Registers a raw video frame observer object.
///
- /// If you want to obtain the original video data of some remote users (referred to as group A) and the encoded video data of other remote users (referred to as group B), you can refer to the following steps:
- /// Call registerVideoFrameObserver to register the raw video frame observer before joining the channel.
- /// Call registerVideoEncodedFrameObserver to register the encoded video frame observer before joining the channel.
- /// After joining the channel, get the user IDs of group B users through onUserJoined, and then call setRemoteVideoSubscriptionOptions to set the encodedFrameOnly of this group of users to true.
- /// Call muteAllRemoteVideoStreams (false) to start receiving the video streams of all remote users. Then:
- /// The raw video data of group A users can be obtained through the callback in VideoFrameObserver, and the SDK renders the data by default.
- /// The encoded video data of group B users can be obtained through the callback in VideoEncodedFrameObserver. If you want to observe raw video frames (such as YUV or RGBA format), Agora recommends that you implement one VideoFrameObserver class with this method. When calling this method to register a video observer, you can register callbacks in the VideoFrameObserver class as needed. After you successfully register the video frame observer, the SDK triggers the registered callbacks each time a video frame is received.
+ /// If you want to observe raw video frames (such as YUV or RGBA format), Agora recommends that you implement one VideoFrameObserver class with this method. When calling this method to register a video observer, you can register callbacks in the VideoFrameObserver class as needed. After you successfully register the video frame observer, the SDK triggers the registered callbacks each time a video frame is received.
///
/// * [observer] The observer instance. See VideoFrameObserver.
///
@@ -65,14 +59,7 @@

/// Registers a receiver object for the encoded video image.
///
- /// If you only want to observe encoded video frames (such as h.264 format) without decoding and rendering the video, Agora recommends that you implement one VideoEncodedFrameObserver class through this method. If you want to obtain the original video data of some remote users (referred to as group A) and the encoded video data of other remote users (referred to as group B), you can refer to the following steps:
- /// Call registerVideoFrameObserver to register the raw video frame observer before joining the channel.
- /// Call registerVideoEncodedFrameObserver to register the encoded video frame observer before joining the channel.
- /// After joining the channel, get the user IDs of group B users through onUserJoined, and then call setRemoteVideoSubscriptionOptions to set the encodedFrameOnly of this group of users to true.
- /// Call muteAllRemoteVideoStreams (false) to start receiving the video streams of all remote users. Then:
- /// The raw video data of group A users can be obtained through the callback in VideoFrameObserver, and the SDK renders the data by default.
- /// The encoded video data of group B users can be obtained through the callback in VideoEncodedFrameObserver.
- /// Call this method before joining a channel.
+ /// If you only want to observe encoded video frames (such as H.264 format) without decoding and rendering the video, Agora recommends that you implement one VideoEncodedFrameObserver class through this method. Call this method before joining a channel.
///
/// * [observer] The video frame observer object. See VideoEncodedFrameObserver.
///
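A minimal Dart sketch (not part of this PR) of the simplified guidance above: register the raw and/or encoded video frame observers before joining the channel. The getMediaEngine() accessor and the callback signatures are assumptions about the 6.x Dart API and may differ slightly between versions.

```dart
import 'dart:typed_data';

import 'package:agora_rtc_engine/agora_rtc_engine.dart';

void registerObserversBeforeJoin(RtcEngine engine) {
  final MediaEngine mediaEngine = engine.getMediaEngine();

  // Raw frames (e.g. YUV or RGBA): implement one VideoFrameObserver.
  mediaEngine.registerVideoFrameObserver(VideoFrameObserver(
    onRenderVideoFrame: (String channelId, int remoteUid, VideoFrame frame) {
      // Inspect or post-process the decoded remote frame here.
    },
  ));

  // Encoded frames (e.g. H.264): implement one VideoEncodedFrameObserver.
  mediaEngine.registerVideoEncodedFrameObserver(VideoEncodedFrameObserver(
    onEncodedVideoFrameReceived: (int uid, Uint8List imageBuffer, int length,
        EncodedVideoFrameInfo videoEncodedFrameInfo) {
      // Forward the encoded bitstream without decoding or rendering it.
    },
  ));
}
```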
4 changes: 2 additions & 2 deletions lib/src/agora_media_player_source.dart
@@ -58,8 +58,8 @@ class MediaPlayerSourceObserver {
/// Reports the playback duration that the buffered data can support.
///
/// When playing online media resources, the SDK triggers this callback every two seconds to report the playback duration that the currently buffered data can support.
- /// When the playback duration supported by the buffered data is less than the threshold (0 by default), the SDK returns playerEventBufferLow.
- /// When the playback duration supported by the buffered data is greater than the threshold (0 by default), the SDK returns playerEventBufferRecover.
+ /// When the playback duration supported by the buffered data is less than the threshold (0 by default), the SDK returns playerEventBufferLow (6).
+ /// When the playback duration supported by the buffered data is greater than the threshold (0 by default), the SDK returns playerEventBufferRecover (7).
///
/// * [playCachedBuffer] The playback duration (ms) that the buffered data can support.
final void Function(int playCachedBuffer)? onPlayBufferUpdated;
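A minimal Dart sketch (not part of this PR) of the buffering events documented above: watch onPlayBufferUpdated alongside onPlayerEvent and react to playerEventBufferLow (6) / playerEventBufferRecover (7). Player creation and source opening are elided; registerPlayerSourceObserver and the onPlayerEvent signature are assumptions about the MediaPlayer API.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

void watchBuffering(MediaPlayer player) {
  player.registerPlayerSourceObserver(MediaPlayerSourceObserver(
    onPlayBufferUpdated: (int playCachedBuffer) {
      // Reported every two seconds for online media resources, in milliseconds.
      print('Buffered playback time: $playCachedBuffer ms');
    },
    onPlayerEvent:
        (MediaPlayerEvent eventCode, int elapsedTime, String message) {
      if (eventCode == MediaPlayerEvent.playerEventBufferLow) {
        // Buffer fell below the threshold: e.g. show a loading indicator.
      } else if (eventCode == MediaPlayerEvent.playerEventBufferRecover) {
        // Buffer recovered: hide the indicator.
      }
    },
  ));
}
```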
35 changes: 15 additions & 20 deletions lib/src/agora_rtc_engine.dart
@@ -1390,7 +1390,7 @@ class ChannelMediaOptions {
@JsonKey(name: 'publishCustomAudioTrack')
final bool? publishCustomAudioTrack;

- /// The ID of the custom audio source to publish. The default value is 0. If you have set sourceNumber in setExternalAudioSource to a value greater than 1, the SDK creates the corresponding number of custom audio tracks and assigns an ID to each audio track, starting from 0.
+ /// The ID of the custom audio track to be published. The default value is 0. You can obtain the custom audio track ID through the createCustomAudioTrack method.
@JsonKey(name: 'publishCustomAudioTrackId')
final int? publishCustomAudioTrackId;
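A minimal Dart sketch (not part of this PR) of the updated description above: obtain a track ID from createCustomAudioTrack and publish it through publishCustomAudioTrackId. Placing createCustomAudioTrack on MediaEngine and the trackType/config parameters are assumptions about the 6.x API; the token and channel name are placeholders.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> publishCustomAudio(RtcEngine engine) async {
  // Assumed API: create a mixable custom audio track and keep its ID.
  final int trackId = await engine.getMediaEngine().createCustomAudioTrack(
        trackType: AudioTrackType.audioTrackMixable,
        config: AudioTrackConfig(enableLocalPlayback: false),
      );

  await engine.joinChannel(
    token: '<your token>',
    channelId: '<your channel>',
    uid: 0,
    options: ChannelMediaOptions(
      publishCustomAudioTrack: true,
      publishCustomAudioTrackId: trackId,
      publishMicrophoneTrack: false,
    ),
  );
}
```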

@@ -1851,7 +1851,7 @@ class RtcEngineEventHandler {
///
/// This callback reports the last-mile network conditions of the local user before the user joins the channel. Last mile refers to the connection between the local device and Agora's edge server. Before the user joins the channel, this callback is triggered by the SDK once startLastmileProbeTest is called and reports the last-mile network conditions of the local user.
///
- /// * [quality] The last-mile network quality. qualityUnknown (0): The quality is unknown. qualityExcellent (1): The quality is excellent. qualityGood (2): The network quality seems excellent, but the bitrate can be slightly lower than excellent. qualityPoor (3): Users can feel the communication is slightly impaired. qualityBad (4): Users cannot communicate smoothly. qualityVbad (5): The quality is so bad that users can barely communicate. qualityDown (6): The network is down, and users cannot communicate at all. See QualityType.
+ /// * [quality] The last-mile network quality. qualityUnknown (0): The quality is unknown. qualityExcellent (1): The quality is excellent. qualityGood (2): The network quality seems excellent, but the bitrate can be slightly lower than excellent. qualityPoor (3): Users can feel the communication is slightly impaired. qualityBad (4): Users cannot communicate smoothly. qualityVbad (5): The quality is so bad that users can barely communicate. qualityDown (6): The network is down, and users cannot communicate at all. qualityDetecting (8): The last-mile probe test is in progress. See QualityType.
final void Function(QualityType quality)? onLastmileQuality;

/// Occurs when the first local video frame is displayed on the local video view.
@@ -2143,7 +2143,7 @@ class RtcEngineEventHandler {
/// The SDK triggers this callback when the local user receives the stream message that the remote user sends by calling the sendStreamMessage method.
///
/// * [connection] The connection information. See RtcConnection.
- /// * [uid] The ID of the remote user sending the message.
+ /// * [remoteUid] The ID of the remote user sending the message.
/// * [streamId] The stream ID of the received message.
/// * [data] The data received.
/// * [length] The data length (byte).
@@ -2158,7 +2158,7 @@
/// * [connection] The connection information. See RtcConnection.
/// * [remoteUid] The ID of the remote user sending the message.
/// * [streamId] The stream ID of the received message.
- /// * [code] The error code. See ErrorCodeType.
+ /// * [code] Error code. See ErrorCodeType.
/// * [missed] The number of lost messages.
/// * [cached] Number of incoming cached messages when the data stream is interrupted.
final void Function(RtcConnection connection, int remoteUid, int streamId,
@@ -3079,10 +3079,10 @@ abstract class RtcEngine {

/// Gets the warning or error description.
///
- /// * [code] The error code or warning code reported by the SDK.
+ /// * [code] The error code reported by the SDK.
///
/// Returns
- /// The specific error or warning description.
+ /// The specific error description.
Future<String> getErrorDescription(int code);
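A minimal Dart sketch (not part of this PR) of the signature shown above: resolve a numeric error code reported by the SDK into its description.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> describeError(RtcEngine engine, int code) async {
  final String description = await engine.getErrorDescription(code);
  print('SDK error $code: $description');
}
```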

/// Queries the video codec capabilities of the SDK.
@@ -3678,14 +3678,10 @@ abstract class RtcEngine {

/// Options for subscribing to remote video streams.
///
- /// When a remote user has enabled dual-stream mode, you can call this method to choose the option for subscribing to the video streams sent by the remote user.
- /// If you only register one VideoFrameObserver object, the SDK subscribes to the raw video data and encoded video data by default (the effect is equivalent to setting encodedFrameOnly to false).
- /// If you only register one VideoEncodedFrameObserver object, the SDK only subscribes to the encoded video data by default (the effect is equivalent to setting encodedFrameOnly to true).
- /// If you register one VideoFrameObserver object and one VideoEncodedFrameObserver object successively, the SDK subscribes to the encoded video data by default (the effect is equivalent to setting encodedFrameOnly to false).
- /// If you call this method first with the options parameter set, and then register one VideoFrameObserver or VideoEncodedFrameObserver object, you need to call this method again and set the options parameter as described in the above two items to get the desired results. Agora recommends the following steps:
- /// Set autoSubscribeVideo to false when calling joinChannel to join a channel.
- /// Call this method after receiving the onUserJoined callback to set the subscription options for the specified remote user's video stream.
- /// Call the muteRemoteVideoStream method to resume subscribing to the video stream of the specified remote user. If you set encodedFrameOnly to true in the previous step, the SDK triggers the onEncodedVideoFrameReceived callback locally to report the received encoded video frame information.
+ /// When a remote user has enabled dual-stream mode, you can call this method to choose the option for subscribing to the video streams sent by the remote user. The default subscription behavior of the SDK for remote video streams depends on the type of registered video observer:
+ /// If the VideoFrameObserver observer is registered, the default is to subscribe to both raw data and encoded data.
+ /// If the VideoEncodedFrameObserver observer is registered, the default is to subscribe only to the encoded data.
+ /// If both types of observers are registered, the default behavior follows the last registered video observer. For example, if the last registered observer is the VideoFrameObserver observer, the default is to subscribe to both raw data and encoded data. If you want to modify the default behavior, or set different subscription options for different uids, you can call this method to set it.
///
/// * [uid] The user ID of the remote user.
/// * [options] The video subscription options. See VideoSubscriptionOptions.
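A minimal Dart sketch (not part of this PR) of the rewritten guidance above: after a remote user joins, subscribe to only the encoded stream for that uid. The VideoSubscriptionOptions field names are assumed from the 6.x Dart API.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> subscribeEncodedOnly(RtcEngine engine, int remoteUid) async {
  await engine.setRemoteVideoSubscriptionOptions(
    uid: remoteUid,
    options: VideoSubscriptionOptions(
      type: VideoStreamType.videoStreamHigh,
      encodedFrameOnly: true, // receive encoded frames only for this uid
    ),
  );
}
```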
@@ -3897,7 +3893,7 @@ abstract class RtcEngine {

/// Adjusts the volume during audio mixing.
///
- /// This method adjusts the audio mixing volume on both the local client and remote clients.
+ /// This method adjusts the audio mixing volume on both the local client and remote clients. This method does not affect the volume of the audio file set in the playEffect method.
///
/// * [volume] Audio mixing volume. The value ranges between 0 and 100. The default value is 100, which means the original volume.
///
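A minimal Dart sketch (not part of this PR): halve the mixing volume for both local playback and what remote users hear. The startAudioMixing parameter names are assumed from the 6.x API, and the file path is a placeholder.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> playBackingTrackAtHalfVolume(RtcEngine engine) async {
  await engine.startAudioMixing(
    filePath: '/assets/backing_track.mp3', // placeholder path
    loopback: false,
    cycle: 1,
  );
  await engine.adjustAudioMixingVolume(50); // 0-100, default 100 (original volume)
}
```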
@@ -5597,9 +5593,8 @@ abstract class RtcEngine {
/// Sends data stream messages.
///
/// After calling createDataStream, you can call this method to send data stream messages to all users in the channel. The SDK has the following restrictions on this method:
- /// Each user can have up to five data streams simultaneously.
- /// Up to 60 packets can be sent per second in a data stream with each packet having a maximum size of 1 KB.
- /// Up to 30 KB of data can be sent per second in a data stream. A successful method call triggers the onStreamMessage callback on the remote client, from which the remote user gets the stream message. A failed method call triggers the onStreamMessageError callback on the remote client.
+ /// Each client within the channel can have up to 5 data channels simultaneously, with a total shared packet bitrate limit of 30 KB/s for all data channels.
+ /// Each data channel can send up to 60 packets per second, with each packet being a maximum of 1 KB. A successful method call triggers the onStreamMessage callback on the remote client, from which the remote user gets the stream message. A failed method call triggers the onStreamMessageError callback on the remote client.
/// This method needs to be called after createDataStream and joining the channel.
/// In live streaming scenarios, this method only applies to hosts.
///
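A minimal Dart sketch (not part of this PR) of the limits described above: create one data stream, send a small payload, and handle onStreamMessage on the receiving side. Whether createDataStream takes the config positionally or as a named `config:` argument may vary by SDK version; adjust if the analyzer complains.

```dart
import 'dart:convert';
import 'dart:typed_data';

import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> sendSmallMessage(RtcEngine engine) async {
  final int streamId = await engine.createDataStream(
    DataStreamConfig(syncWithAudio: false, ordered: true),
  );
  // Keep each packet well under the documented 1 KB limit.
  final Uint8List payload = Uint8List.fromList(utf8.encode('hello'));
  await engine.sendStreamMessage(
    streamId: streamId,
    data: payload,
    length: payload.length,
  );
}

void listenForMessages(RtcEngine engine) {
  engine.registerEventHandler(RtcEngineEventHandler(
    onStreamMessage: (RtcConnection connection, int remoteUid, int streamId,
        Uint8List data, int length, int sentTs) {
      print('Stream $streamId from $remoteUid: ${utf8.decode(data)}');
    },
  ));
}
```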
@@ -5990,7 +5985,7 @@ abstract class RtcEngine {
/// When video screenshot and upload function is enabled, the SDK takes screenshots and uploads videos sent by local users based on the type and frequency of the module you set in ContentInspectConfig. After video screenshot and upload, the Agora server sends the callback notification to your app server in HTTPS requests and sends all screenshots to the third-party cloud storage service.
///
/// * [enabled] Whether to enable video screenshot and upload: true : Enables video screenshot and upload. false : Disables video screenshot and upload.
- /// * [config] Screenshot and upload configuration. See ContentInspectConfig. When the video moderation module is set to video moderation via Agora self-developed extension(contentInspectSupervision), the video screenshot and upload dynamic library libagora_content_inspect_extension.dll is required. Deleting this library disables the screenshot and upload feature.
+ /// * [config] Screenshot and upload configuration. See ContentInspectConfig.
///
/// Returns
/// When the method call succeeds, there is no return value; when fails, the AgoraRtcException exception is thrown. You need to catch the exception and handle it accordingly.
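A minimal Dart sketch (not part of this PR) of enabling screenshot and upload with a single moderation module. The ContentInspectModule/ContentInspectType names and the 10-second interval are assumptions about the 6.x API rather than values taken from this PR.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> enableScreenshotUpload(RtcEngine engine) async {
  await engine.enableContentInspect(
    enabled: true,
    config: ContentInspectConfig(
      modules: [
        ContentInspectModule(
          type: ContentInspectType.contentInspectModeration,
          interval: 10, // seconds between screenshots (example value)
        ),
      ],
      moduleCount: 1,
    ),
  );
}
```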
@@ -6026,7 +6021,7 @@ abstract class RtcEngine {
/// Sets up cloud proxy service.
///
/// When users' network access is restricted by a firewall, configure the firewall to allow specific IP addresses and ports provided by Agora; then, call this method to enable the cloud proxyType and set the cloud proxy type with the proxyType parameter. After successfully connecting to the cloud proxy, the SDK triggers the onConnectionStateChanged (connectionStateConnecting, connectionChangedSettingProxyServer) callback. To disable the cloud proxy that has been set, call the setCloudProxy (noneProxy). To change the cloud proxy type that has been set, call the setCloudProxy (noneProxy) first, and then call the setCloudProxy to set the proxyType you want.
- /// Agora recommends that you call this method after joining a channel.
+ /// Agora recommends that you call this method before joining a channel.
/// When a user is behind a firewall and uses the Force UDP cloud proxy, the services for Media Push and cohosting across channels are not available.
/// When you use the Force TCP cloud proxy, note that an error would occur when calling the startAudioMixing method to play online music files in the HTTP protocol. The services for Media Push and cohosting across channels use the cloud proxy with the TCP protocol.
///
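A minimal Dart sketch (not part of this PR) following the corrected recommendation above: enable the Force TCP cloud proxy before joining a channel, and disable it later by passing noneProxy.

```dart
import 'package:agora_rtc_engine/agora_rtc_engine.dart';

Future<void> useTcpCloudProxy(RtcEngine engine) async {
  await engine.setCloudProxy(CloudProxyType.tcpProxy);
  // ... join the channel after the proxy is set ...
}

Future<void> disableCloudProxy(RtcEngine engine) async {
  await engine.setCloudProxy(CloudProxyType.noneProxy);
}
```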