
Analysis of the video streaming scheme for screen transmission on a cloud control platform

2020-12-07 19:18:48 Aliyun yunqi

Introduction: This article summarizes the video streaming scheme used by our cloud control platform for screen transmission.

Background
ARC (Gaode's in-vehicle cloud control platform) is a cloud control platform deeply customized for in-vehicle device scenarios. Through it, users can remotely operate different types of in-vehicle devices. For a remote user to use an in-vehicle device as if it were local, the device's screen must be sent back to the user promptly. Screen transmission is therefore a core capability of the ARC platform.

Initially we adopted minicap, an open-source screen transmission solution widely used in the industry. It grabs the screen data, compresses each frame into a JPG image, and sends the frames one by one to the Web end for display. Because in-vehicle hardware is far less powerful than a phone, compressing the images is very CPU-intensive: on some low-end head units it consumed around 80% of the CPU, easily causing the device to stutter. In addition, the image compression ratio is not high, so transmission consumes a lot of bandwidth; on low-bandwidth links the latency perceived by the user becomes excessive.

We therefore needed a solution that balances the quality of the transmitted picture against CPU consumption on the head unit. This article summarizes the video streaming scheme the platform arrived at.
[Figure: dongtao1.png]

Ideas and methods
[Figure: dongtao2.png]

Starting from the basic image-based transmission link, the first idea for sparing device-side CPU was to skip compression on the device entirely and transfer the raw frames to the server for processing. Investigation showed, however, that the head unit's USB bandwidth simply cannot carry uncompressed HD frames: the raw data is so large that only about three frames per second can be transmitted.
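A back-of-envelope calculation makes this concrete. The resolution, pixel format, and link throughput below are illustrative assumptions (720p RGBA and roughly 10 MB/s effective USB/adb throughput), not figures from the original investigation:

```java
// Rough estimate of why uncompressed HD frames over the head unit's USB
// link cannot reach a usable frame rate. All numbers are assumptions.
class BandwidthEstimate {

    /** Size of one uncompressed frame in bytes. */
    static long rawFrameBytes(int width, int height, int bytesPerPixel) {
        return (long) width * height * bytesPerPixel;
    }

    /** Upper bound on frames per second the link can carry. */
    static double maxFps(long linkBytesPerSecond, long frameBytes) {
        return (double) linkBytesPerSecond / frameBytes;
    }
}
```

A single raw 720p RGBA frame is about 3.7 MB, so even an optimistic 10 MB/s of effective throughput yields under 3 fps, which matches the result observed above.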

Another idea was to use the device's hardware encoder to reduce CPU consumption. Investigation showed that devices from Android 4.1 onward basically ship with a hardware H.264 video encoder. We therefore decided to try a video streaming solution: the device side encodes the screen with the hardware encoder, and the server forwards the stream to the Web end for decoding.

Implementation scheme
[Figure: dongtao3.png]

The overall implementation can be divided into three parts:

Device end: captures and encodes the screen.
Server end: transports and flow-controls the video stream.
Web end: decodes and displays the video stream.

Screen capture and encoding
The screen is captured directly with Android's VirtualDisplay. There are several ways to implement the encoding:
[Table image: dongtaobiaoge1.png]

Because the Java approach only supports Android 5.0 and above, while Android 4.x still holds a sizeable share of the current in-vehicle market that cannot be ignored, we had to use the C++ approach, which is compatible down to Android 4.3.

Transmission and control of the video stream
The most common Web live-streaming schemes are RTMP, HLS, flv.js, and so on, but even at best they all carry a delay of 1-3 s. That has no impact on an ordinary live-streaming platform, but a cloud control platform with real-time interaction scenarios requires millisecond-level latency. We therefore decided to transmit the raw H.264 stream over a socket: the H.264 video stream encoded on the device side goes directly to the Web end to be played.
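Relaying a raw H.264 stream means the server must recognize NAL unit boundaries, which in an Annex B byte stream are marked by 0x000001 / 0x00000001 start codes. The following is a minimal sketch of that splitting step, not the platform's actual code; the class name `NalSplitter` is hypothetical:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: split an H.264 Annex B byte stream into NAL unit
// payloads by scanning for 00 00 01 (and 00 00 00 01) start codes, so a
// relay can frame messages before forwarding them to the Web end.
class NalSplitter {

    /** Splits an Annex B stream into NAL unit payloads (start codes removed). */
    static List<byte[]> split(byte[] stream) {
        List<Integer> codePos = new ArrayList<>();
        for (int i = 0; i + 2 < stream.length; ) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                codePos.add(i);        // position of the 3-byte 00 00 01 pattern
                i += 3;
            } else {
                i++;
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int n = 0; n < codePos.size(); n++) {
            int from = codePos.get(n) + 3;   // payload starts after 00 00 01
            int to = (n + 1 < codePos.size()) ? codePos.get(n + 1) : stream.length;
            if (to > from && n + 1 < codePos.size() && stream[to - 1] == 0) {
                to--;                        // next start code was the 4-byte form
            }
            nals.add(Arrays.copyOfRange(stream, from, to));
        }
        return nals;
    }
}
```

In practice the device-side encoder emits Annex B data, so the same splitting logic applies whether framing happens on the server or on the Web end before remuxing.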

At the same time, to improve the user experience, the video stream transmission is flexibly controlled. The server maintains a cache queue that reflects the front-end's bandwidth load, and the frame rate and bitrate are adjusted automatically according to bandwidth conditions, giving priority to smoothness for the user.
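A minimal sketch of that flow-control idea, assuming a stepped frame-rate ladder and a simple backlog threshold (both the thresholds and the `FrameRateController` class are illustrative assumptions, not the platform's actual values):

```java
// Hypothetical sketch of server-side flow control: when the per-client
// send queue backs up (the Web end cannot drain it fast enough), step the
// target frame rate down; once the queue empties, probe a higher rate.
class FrameRateController {
    static final int[] FPS_LADDER = {30, 20, 10, 5};
    private int level = 0;   // index into FPS_LADDER, 0 = full rate

    /** Called periodically with the current send-queue depth; returns target fps. */
    int adjust(int queuedFrames) {
        if (queuedFrames > 10 && level < FPS_LADDER.length - 1) {
            level++;         // backlog growing: client bandwidth is the bottleneck
        } else if (queuedFrames == 0 && level > 0) {
            level--;         // queue drained: try a higher rate again
        }
        return FPS_LADDER[level];
    }
}
```

The returned target rate would be fed back to the device end, which enforces it through the frame-dropping mechanism described below.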

Web-end decoding and display
The Web end uses Media Source Extensions (MSE) + fragmented MP4: the raw H.264 stream is repackaged as fragmented MP4 and then decoded and played through the MSE API. The concrete implementation follows the open-source JMuxer project.
Frame dropping and frame filling
By default, Android's VirtualDisplay produces at most 60 fps, while the human eye already perceives 30 fps as smooth. To save bandwidth, we cap the video stream's output frame rate at 30 fps, and when network bandwidth is poor we can reduce it further. However, Android's MediaCodec does not support capping the frame rate directly; the output rate is simply determined by how many frames are fed in per second. We therefore control the frame rate by dropping frames ourselves.
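The dropping rule can be sketched as follows: only feed a captured frame to the encoder when at least 1000 / targetFps milliseconds have passed since the last frame that was kept. The `FrameDropper` class is an illustrative assumption, not the platform's actual code:

```java
// Hypothetical sketch of frame dropping: since MediaCodec's output rate
// is just "frames fed in per second", we enforce the target rate by
// discarding frames that arrive too soon after the last kept frame
// (~33 ms apart for 30 fps).
class FrameDropper {
    private final long minIntervalMs;
    private long lastKeptMs;

    FrameDropper(int targetFps) {
        this.minIntervalMs = 1000L / targetFps;
        this.lastKeptMs = -minIntervalMs;   // so the very first frame is kept
    }

    /** True if this captured frame should be fed to the encoder, false to drop it. */
    boolean shouldKeep(long nowMs) {
        if (nowMs - lastKeptMs >= minIntervalMs) {
            lastKeptMs = nowMs;
            return true;
        }
        return false;   // too soon after the previous kept frame
    }
}
```

At a 60 fps capture rate and a 30 fps target, this keeps roughly every other frame.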

The hardware decoder on Win7 has no low-latency mode and needs roughly 10 frames of data before playback starts, while VirtualDisplay only produces a frame when the picture changes. We therefore also need frame filling to eliminate this decoding delay.
[Figure: dongtao4.png]

We create an EGL surface to handle both frame dropping and frame filling: the time interval at which the EGL texture is drawn onto the surface controls frame dropping, and redrawing the last frame's data implements frame filling.
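Stripped of the EGL details, the filling side reduces to the following sketch: the encoder is driven by a fixed-rate tick, and on each tick the previous frame is re-submitted whenever the capture produced nothing new. The `FrameFiller` class is an illustrative assumption:

```java
// Hypothetical sketch of frame filling: on every output tick (e.g. every
// 33 ms), re-submit the most recent frame when VirtualDisplay produced
// nothing new, so the decoder's multi-frame buffer keeps draining even
// while the on-screen picture is static.
class FrameFiller {
    private byte[] lastFrame;

    /** Called on every output tick; pass null when no new frame arrived. */
    byte[] nextOutput(byte[] freshFrameOrNull) {
        if (freshFrameOrNull != null) {
            lastFrame = freshFrameOrNull;   // picture changed: use the new frame
        }
        return lastFrame;                   // otherwise repeat the last frame
    }
}
```

In the real pipeline the "repeat" is a redraw of the cached EGL texture rather than a byte-array copy, but the control flow is the same.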

Summary
This scheme is now in use on the ARC platform; it guarantees transmission quality while effectively improving the smoothness of user operation. In principle the scheme can also be applied to other similar cloud control platforms. If Android 4.x devices do not need to be supported, the Java-layer API can be used to obtain the video stream data, which reduces development and adaptation costs.

 

Link to the original text
This article is original content from Alibaba Cloud and may not be reproduced without permission.

Copyright notice
This article was created by [Aliyun yunqi]; please include a link to the original when reproducing it. Thanks.
https://chowdera.com/2020/11/20201112221016718w.html