# AudioToVideoSync

**Repository Path**: harmonyos_samples/AudioToVideoSync

## Basic Information

- **Project Name**: AudioToVideoSync
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 7
- **Forks**: 18
- **Created**: 2024-11-07
- **Last Updated**: 2026-01-17

## Categories & Tags

- **Categories**: Uncategorized
- **Tags**: None

## README

# Achieving Audio-Video Synchronization Playback Effect

## Introduction

This sample builds on video decoding and achieves audio-video synchronization by calculating the delay between audio and video frames. It addresses audio-video desynchronization in scenarios such as local video playback, network video playback, and recorded video playback. Developers who implement video decoding and playback can integrate the synchronization module to keep the video image and the audio aligned, giving users a better viewing experience in high-latency scenarios.

## Effect Preview

The preview screenshots show the application interface, the network video playback page, the recording page, and the audio-video synchronization playback page.

## Usage Instructions

1. Open the application. Three scenarios are available: local video, network video, and recorded video audio-video synchronization.
2. Click to enter the local video scenario and select a video from the gallery.
3. The audio-video synchronization playback page opens and provides x1, x2, and x3 fast-forward options.
4. Click to enter the network video scenario. A fixed network video is provided; click it to play.
5. The audio-video synchronization playback page opens; operate it as in step 3.
6. Click to enter the recorded video scenario.
7. The recording page opens. After recording, the video is saved to the gallery, which opens automatically; select the recorded video and operate the playback page as in step 3.
## Project Directory

```
├──entry/src/main/cpp                  // Native Layer
│  ├──capabilities                     // Capability Interfaces and Implementations
│  │  ├──include                       // Capability Interfaces
│  │  ├──AudioDecoder.cpp              // Audio Decoding Implementation
│  │  ├──AudioEncoder.cpp              // Audio Encoding Implementation
│  │  ├──Demuxer.cpp                   // Demuxing Implementation
│  │  ├──Muxer.cpp                     // Muxing Implementation
│  │  ├──VideoDecoder.cpp              // Video Decoding Implementation
│  │  └──VideoEncoder.cpp              // Video Encoding Implementation
│  ├──common                           // Common Module
│  │  ├──dfx                           // Log Implementation
│  │  ├──SampleCallback.cpp            // Codec Callback Implementation
│  │  ├──SampleCallback.h              // Codec Callback Definition
│  │  └──SampleInfo.h                  // Common Class for Function Implementation
│  ├──player                           // Native Layer Player
│  │  ├──Player.cpp                    // Implementation of Native Layer Playback Call Logic
│  │  ├──Player.h                      // Interface of Native Layer Playback Call Logic
│  │  ├──PlayerNative.cpp              // Entry Implementation of Native Layer Playback
│  │  └──PlayerNative.h                // Interface of Native Layer Playback
│  ├──recorder                         // Recording Interface
│  │  ├──Recorder.cpp                  // Recording Function Interface Implementation
│  │  ├──Recorder.h                    // Recording Function Interface Definition
│  │  ├──RecorderNative.cpp            // Recording Interface Call Entry
│  │  └──RecorderNative.h              // Call Entry Definition
│  ├──render                           // Rendering Module Interfaces and Implementations
│  │  ├──include                       // Rendering Module Interfaces
│  │  ├──PluginManager.cpp             // Rendering Module Management Implementation
│  │  └──PluginRender.cpp              // Rendering Logic Implementation
│  ├──types                            // Interfaces Exposed by the Native Layer
│  │  ├──libplayer                     // Interfaces Exposed by the Playback Module to the UI Layer
│  │  └──librecorder                   // Interfaces Exposed by the Recording Module to the UI Layer
│  └──CMakeLists.txt                   // Compilation Entry
├──ets                                 // UI Layer
│  ├──common                           // Common Module
│  │  ├──utils                         // Common Utility Classes
│  │  │  ├──CameraCheck.ets            // Camera Utility Class
│  │  │  ├──DateTimeUtils.ets          // Date and Time Utility Class
│  │  │  ├──FileUtil.ets               // File Utility Class
│  │  │  ├──Logger                     // Log Utility Class
│  │  │  ├──PermissionUtil             // Permission Utility Class
│  │  │  ├──RecorderUtil               // Video Recording Utility Class
│  │  │  ├──RouterUtil                 // Routing Utility Class
│  │  │  └──TimeUtils                  // Time-Related Utility Class
│  │  └──CommonConstants.ets           // Parameter Constants
│  ├──entryability                     // Application Entry
│  │  └──EntryAbility.ets              // Entry Function Class
│  ├──entrybackupability               // Application Data Backup
│  │  └──EntryBackupAbility.ets        // Backup Management Class
│  ├──model                            // Data Interaction Class
│  │  └──CameraDateModel.ets           // Camera Parameter Data Class
│  └──pages                            // Pages
│     ├──Index.ets                     // Home Page/Scene Selection Page
│     ├──NetworkVideo.ets              // Network Video Playback Page
│     ├──PlayerSync.ets                // Audio-Video Synchronization Playback Page
│     └──Recorder.ets                  // Camera Recording Page
├──resources                           // Application Resource Files
│  ├──base                             // Resources in this directory are assigned unique IDs
│  │  ├──element                       // Fonts and Colors
│  │  ├──media                         // Images
│  │  └──profile                       // Application Entry Home Page
│  ├──en_US                            // Matched first when the device language is American English
│  └──zh_CN                            // Matched first when the device language is Simplified Chinese
└──module.json5                        // Module Configuration Information
```

## Specific Implementation

### Video Decoding Part

1. When the user clicks the play button, the click event calls the PhotoViewPicker() interface, which opens the gallery's file picker and returns the path of the video selected by the user.
2. After a file is selected, the playNative() interface calls PlayerNative::Play(), which initializes the decoding module and starts decoding.
3. Once the decoder starts, the input callback is triggered. The data to be decoded is written into the OH_AVBuffer, and the PushInputBuffer interface is called to send it to the decoder. At least one XPS frame (codec parameter sets such as SPS/PPS) must be pushed after each start.
4. Each time the decoder finishes decoding a frame, the output callback is triggered. The caller must promptly call the render or release interface to return the buffer to the decoder; because the decoder owns a limited number of buffers, it stops working once the limit is reached and resumes only after a buffer is returned. A minimal sketch of these two callbacks follows this list.
5. When playback ends, the napi_call_function() interface is invoked in Callback() to execute the corresponding callback event.
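The buffer rotation in steps 3 and 4 can be sketched as follows. This is a minimal illustration based on the AVBuffer-style codec callbacks; the DecoderContext struct, the callback and variable names, and the exact header paths are assumptions made for this sketch rather than the sample's actual code, and error handling is omitted.

```cpp
// Sketch only: header paths may differ between SDK versions.
#include <multimedia/player_framework/native_avcodec_videodecoder.h>
#include <multimedia/player_framework/native_avdemuxer.h>
#include <multimedia/player_framework/native_avbuffer.h>

// Hypothetical user data handed to the callbacks at registration time.
struct DecoderContext {
    OH_AVDemuxer *demuxer = nullptr;
    uint32_t videoTrackIndex = 0;
};

// Input callback: fill the OH_AVBuffer with one encoded sample and push it to the decoder.
static void OnNeedInputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
{
    auto *ctx = static_cast<DecoderContext *>(userData);
    // The demuxer writes the sample data plus its pts/size/flags into the buffer.
    OH_AVDemuxer_ReadSampleBuffer(ctx->demuxer, ctx->videoTrackIndex, buffer);
    // Hand the filled buffer back to the decoder for decoding.
    OH_VideoDecoder_PushInputBuffer(codec, index);
}

// Output callback: return the buffer promptly, either rendering it or releasing it.
static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
{
    (void)userData;                                        // unused in this sketch
    OH_AVCodecBufferAttr attr;
    OH_AVBuffer_GetBufferAttr(buffer, &attr);              // attr.pts feeds the sync decision
    bool shouldRender = true;                              // placeholder: decided by the sync module
    if (shouldRender) {
        OH_VideoDecoder_RenderOutputBuffer(codec, index);  // render to the bound surface
    } else {
        OH_VideoDecoder_FreeOutputBuffer(codec, index);    // drop the frame but return the buffer
    }
}
```

Callbacks of this shape are registered through OH_VideoDecoder_RegisterCallback() before the decoder is started, together with the error and stream-changed callbacks that the OH_AVCodecCallback structure also requires (not shown here).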
### Video Recording Part

1. Create a camera input (cameraInput) to obtain the data captured by the camera.
2. Create previewOutput to obtain the preview output stream, connect it through the surfaceId of the XComponent, and render the preview to the XComponent.
3. Create the video recording output stream VideoOutput through a surfaceId to output to a file.
4. Implementations based on the Recorder approach are encapsulated in RecordController.ets, and those based on the AVCodec approach are encapsulated in AVCodecController.ets.

### Audio-Video Synchronization Part

1. When a video frame is received, call the OH_AudioRenderer_GetTimestamp() interface to obtain the audio rendering position and related information.
2. Before the audio starts, synchronization is temporarily disabled and video frames are rendered directly, to avoid stutters and similar issues.
3. After the audio starts, calculate the delay from the video frame PTS and the audio rendering position, then choose the synchronization strategy according to the delay (a minimal sketch of this decision logic is provided at the end of this README):
   - If the video frame is more than 40 ms behind the audio, discard the video frame.
   - If the video frame is less than 40 ms behind the audio, render it directly.
   - If the video frame is ahead of the audio, synchronize gradually by waiting for a period of time before rendering.

## Related Permissions

- Allow the app to use the camera: ohos.permission.CAMERA.
- Allow the app to use the microphone: ohos.permission.MICROPHONE.

## Dependencies

- None.

## Constraints and Limitations

1. This sample runs only on standard-system devices. Compatible devices: Huawei phones.
2. HarmonyOS version: HarmonyOS 6.0.0 Release or later.
3. DevEco Studio version: DevEco Studio 6.0.0 Release or later.
4. HarmonyOS SDK version: HarmonyOS 6.0.0 Release SDK or later.
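## Appendix: Synchronization Decision Sketch

As a reference for the strategy described in the Audio-Video Synchronization Part, the following is a minimal sketch of the delay calculation and the drop/render/wait decision. It assumes the audio position is read with OH_AudioRenderer_GetTimestamp(); the helper names (GetAudioClockUs, DecideSync), the SyncAction enum, the sampleRate parameter, the per-frame wait cap, and the header path are assumptions made for this sketch, not the sample's actual implementation.

```cpp
// Sketch only: header path and constants are assumptions.
#include <ohaudio/native_audiorenderer.h>
#include <algorithm>
#include <cstdint>
#include <ctime>

constexpr int64_t LAG_DROP_THRESHOLD_US = 40 * 1000;   // 40 ms threshold described above
constexpr int64_t MAX_WAIT_PER_FRAME_US = 40 * 1000;   // assumed cap so catch-up happens gradually

enum class SyncAction { Drop, RenderNow, WaitThenRender };

// Current audio playback position in microseconds, derived from the renderer timestamp.
// Returns -1 if the renderer has no valid timestamp yet (audio not started).
static int64_t GetAudioClockUs(OH_AudioRenderer *renderer, int32_t sampleRate)
{
    int64_t framePosition = 0;
    int64_t timestampNs = 0;
    OH_AudioStream_Result ret =
        OH_AudioRenderer_GetTimestamp(renderer, CLOCK_MONOTONIC, &framePosition, &timestampNs);
    if (ret != AUDIOSTREAM_SUCCESS || framePosition <= 0) {
        return -1;
    }
    return framePosition * 1000 * 1000 / sampleRate;    // frames -> microseconds
}

// Decide what to do with a decoded video frame whose presentation time is videoPtsUs.
static SyncAction DecideSync(int64_t videoPtsUs, int64_t audioClockUs, int64_t &waitUs)
{
    waitUs = 0;
    if (audioClockUs < 0) {
        return SyncAction::RenderNow;                   // before audio starts, render directly
    }
    int64_t delayUs = videoPtsUs - audioClockUs;        // > 0: video ahead; < 0: video behind
    if (delayUs < -LAG_DROP_THRESHOLD_US) {
        return SyncAction::Drop;                        // more than 40 ms behind: discard the frame
    }
    if (delayUs <= 0) {
        return SyncAction::RenderNow;                   // less than 40 ms behind: render immediately
    }
    // Video is ahead of audio: wait a bounded time before rendering so it catches up gradually.
    waitUs = std::min<int64_t>(delayUs, MAX_WAIT_PER_FRAME_US);
    return SyncAction::WaitThenRender;
}
```

In this sketch, the caller applies the returned action in the decoder's output callback: Drop maps to freeing the output buffer, RenderNow to rendering it immediately, and WaitThenRender to waiting for waitUs before rendering.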