Introduction

When we watch a mobile game live stream, what the audience sees is the content of the player's screen. How does this work? In this blog we will hand-write a screen-recording live-streaming demo that achieves an effect similar to a mobile game live stream.

The difficulty lies in transferring the data to the live-streaming server. We use RtmpDump to transfer the RTMP data. Since RtmpDump is implemented in C, we also need NDK development; this cannot be done in Java alone. Of course, you could instead take the trouble to compile FFmpeg and push the RTMP stream with it; Bilibili's open-source ijkPlayer, for example, is developed on top of FFmpeg.

The result

In the end we streamed to a Bilibili live room, where the picture from our phone screen could be seen in real time.

The basic flow

  • Obtain the screen recording data

  • Encode the data to H.264

  • Package the encoded stream into RTMP packets

  • Upload the packets to the push address of the live-streaming server

Obtain the screen recording data

We obtain the MediaProjectionManager system service and fire its screen-capture Intent; from the resulting MediaProjection a VirtualDisplay is created, into which the raw screen data is recorded.

```java
private void initLive() {
    mediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    Intent screenRecordIntent = mediaProjectionManager.createScreenCaptureIntent();
    startActivityForResult(screenRecordIntent, 100);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == 100 && resultCode == Activity.RESULT_OK) {
        // MediaProjection --> generates the screen recording data
        mediaProjection = mediaProjectionManager.getMediaProjection(resultCode, data);
    }
}
```

Encode the data to H.264

The bare YUV data obtained from MediaProjection must first be H.264-encoded; for this we use the platform's MediaCodec, which encodes in hardware.

```java
public void start(MediaProjection mediaProjection) {
    this.mediaProjection = mediaProjection;
    // Configure MediaCodec
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    // Color format: the encoder reads its input from a Surface
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 400_000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    // Set the keyframe interval to 2 s
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);
    // Create the encoder
    try {
        mediaCodec = MediaCodec.createEncoderByType("video/avc");
        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface surface = mediaCodec.createInputSurface();
        // Mirror the screen into the encoder's input Surface (dpi of 1 as in the original)
        mediaProjection.createVirtualDisplay(TAG, width, height, 1,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, null);
    } catch (IOException e) {
        e.printStackTrace();
    }
    // Start the encoding thread (this class extends Thread)
    start();
}

@Override
public void run() {
    isLiving = true;
    mediaCodec.start();
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    while (isLiving) {
        // If more than 2 s have passed, request another keyframe
        if (System.currentTimeMillis() - timeStamp >= 2000) {
            Bundle msgBundle = new Bundle();
            msgBundle.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
            mediaCodec.setParameters(msgBundle);
            timeStamp = System.currentTimeMillis();
        }
        // With Surface input there is no input buffer to queue; just drain the output
        int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 100_000);
        if (outputBufferIndex >= 0) {
            // Fetch the encoded data
            ByteBuffer byteBuffer = mediaCodec.getOutputBuffer(outputBufferIndex);
            byte[] outData = new byte[bufferInfo.size];
            byteBuffer.get(outData);
            // ... hand outData to the RTMP packaging layer, then release the buffer
            mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        }
    }
}
```

Package into RTMP packets

After the two steps above we have the encoded H.264 data; packaging it into RTMP is where things get to be a bit of a headache.

First we import the RtmpDump source code into the project's cpp directory; RtmpDump is what connects to the server and transfers the RTMP data. Keep in mind that the data in hand is still a raw H.264 stream, which cannot be transmitted directly.

The third-party library RtmpDump handles pushing the stream to the live server. Because RtmpDump is not a large codebase, we copy its source directly into the Android project's cpp directory; this is unlike FFmpeg, where we would have to compile the .so library files in advance.
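As a sketch of what that setup can look like, here is a minimal CMakeLists.txt, assuming the librtmp sources from RtmpDump were copied to src/main/cpp/librtmp and the JNI code lives in rtmp_pack.cpp (the paths and the rtmplive target name are illustrative, not from this project):

```cmake
cmake_minimum_required(VERSION 3.10)

# librtmp normally pulls in OpenSSL; defining NO_CRYPTO builds it without
add_definitions(-DNO_CRYPTO)

# The librtmp .c files copied out of the RtmpDump source tree
file(GLOB RTMP_SOURCES ${CMAKE_SOURCE_DIR}/src/main/cpp/librtmp/*.c)

# Target and file names here are assumptions for illustration
add_library(rtmplive SHARED
        src/main/cpp/rtmp_pack.cpp
        ${RTMP_SOURCES})

target_include_directories(rtmplive PRIVATE src/main/cpp/librtmp)

# liblog provides __android_log_print behind the LOGI macro
target_link_libraries(rtmplive log)
```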

We do not need a deep understanding of RTMP for now. What matters for carrying H.264 over RTMP is that the protocol already defines where the SPS, the PPS, and the keyframes go; our job is to fill the RTMP packets accordingly in NDK code, as sketched below.
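For orientation, here is a schematic of the two RTMP video message bodies the packaging code will build. This layout is the standard FLV/RTMP video tag format, summarized here for reference rather than taken from the code itself:

```cpp
// Body 1: the AVC sequence header, sent once before the first keyframe
// [0]    0x17      FrameType=1 (keyframe) | CodecID=7 (AVC)
// [1]    0x00      AVCPacketType=0 (sequence header)
// [2-4]  0x000000  CompositionTime
// [5..]  AVCDecoderConfigurationRecord: version, profile, compatibility,
//        level, NALU length size, then SPS count/length/data,
//        then PPS count/length/data

// Body 2: an ordinary frame
// [0]    0x17 or 0x27  keyframe / inter frame, both CodecID=7 (AVC)
// [1]    0x01          AVCPacketType=1 (NALU)
// [2-4]  0x000000      CompositionTime
// [5-8]  NALU length, 4-byte big endian (replaces the 00 00 00 01 start code)
// [9..]  NALU data
```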

Using RtmpDump

  • Connecting to the server
  1. RTMP_Init(RTMP *r) initializes the session structure

  2. RTMP_EnableWrite(RTMP *r) enables data writing (we publish rather than play)

  3. RTMP_Connect(RTMP *r, RTMPPacket *cp) establishes the connection and performs the RTMP handshake

  4. RTMP_ConnectStream(RTMP *r, int seekTime) creates the stream

  • Sending data (a call-order sketch follows this list)
  1. RTMPPacket_Alloc(RTMPPacket *p, int nSize) allocates the packet body

  2. RTMP_SendPacket(RTMP *r, RTMPPacket *packet, int queue) sends the packet

  3. RTMPPacket_Free(RTMPPacket *p) frees the packet body
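Putting those calls in order, a minimal publish skeleton might look like the sketch below. Error handling is omitted, the function name is illustrative, and the include path depends on where the sources were copied:

```cpp
#include "librtmp/rtmp.h"

// Call-order sketch for publishing with RtmpDump's librtmp (not article code)
bool pushOnePacket(const char *url, RTMPPacket *packet) {
    RTMP *rtmp = RTMP_Alloc();
    RTMP_Init(rtmp);                    // 1. initialize the session
    RTMP_SetupURL(rtmp, (char *) url);  //    parse the push address
    RTMP_EnableWrite(rtmp);             // 2. publish mode, not playback
    RTMP_Connect(rtmp, nullptr);        // 3. TCP connect + RTMP handshake
    RTMP_ConnectStream(rtmp, 0);        // 4. create the stream

    int ok = RTMP_SendPacket(rtmp, packet, 1);  // queue = 1
    RTMPPacket_Free(packet);            // frees the packet body

    RTMP_Close(rtmp);
    RTMP_Free(rtmp);
    return ok;
}
```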

Connect to the live-streaming server

For this step you need to prepare the live push-stream address in advance, then implement the native method.

extern "C" JNIEXPORT jboolean JNICALL Java_com_bailun_kai_rtmplivedemo_RtmpPack2Remote_connectLiveServer(JNIEnv *env, Const char * live_URL = env->GetStringUTFChars(url,0); int result; LivePack = (livePack *)(malloc(sizeof(livePack))); // Empty the memory of dirty data memset(livePack,0,sizeof(livePack)); LivePack -> Rtmp = RTMP_Alloc(); RTMP_Init(livePack->rtmp); // Set the RTMP initialization parameters, such as the timeout time, url livePack-> RTMP -> link. timeout = 10; LOGI("connect %s", url); if (! (result = RTMP_SetupURL(livePack->rtmp,(char *)live_url))){ break; } // enable Rtmp write RTMP_EnableWrite(livePack-> Rtmp); LOGI("RTMP_Connect"); if (! (result = RTMP_Connect(livePack->rtmp,0))){ break; } LOGI("RTMP_ConnectStream "); if (! (result = RTMP_ConnectStream(livePack->rtmp, 0))) break; LOGI("connect success"); }while (0); if (! result && livePack){ free(livePack); livePack = nullptr; } env->ReleaseStringUTFChars(url,live_url); return result; }Copy the code

Send data to the live server

Interestingly, RTMP does not carry the H.264 start codes (the delimiter 00 00 00 01); instead, the first RTMP packet pushed to the stream carries the SPS and PPS.

// Extern "C" JNIEXPORT jboolean JNICALL Java_com_bailun_kai_rtmplivedemo_RtmpPack2Remote_sendData2Server(JNIEnv *env, jobject thiz, jbyteArray buffer, jint length, jlong tms) { int result; Jbyte *bufferArray = env->GetByteArrayElements(buffer, 0); result = sendDataInner(bufferArray,length,tms); Env ->ReleaseByteArrayElements(buffer,bufferArray,0); return result; } int sendDataInner(jbyte *array, jint length, jlong tms) { int result = 0; If (array[4] == 0x67){readSpsPps(array, Length,livePack); return result; } if(array[4] == 0x65){RTMPPacket * spsPpsPacket = createRtmpSteramPack(livePack); sendPack(spsPpsPacket); } RTMPPacket* rtmpPacket = createRtmpPack(array,length,tms,livePack); result = sendPack(rtmpPacket); return result; } int sendPack(RTMPPacket *pPacket) { int result = RTMP_SendPacket(livePack->rtmp,pPacket,1); RTMPPacket_Free(pPacket); free(pPacket); return result; RTMPPacket *createRtmpSteramPack(LivePack *pack) {// Create Rtmp package, Int body_size = 16 + pack->sps_len + pack->pps_len; RTMPPacket *rtmpPacket = static_cast<RTMPPacket *>(malloc(sizeof(RTMPPacket))); RTMPPacket_Alloc(rtmpPacket,body_size); int index = 0; rtmpPacket->m_body[index++] = 0x17; //AVC sequence header set to 0x00 rtmpPacket->m_body[index++] = 0x00; //CompositionTime rtmpPacket->m_body[index++] = 0x00; rtmpPacket->m_body[index++] = 0x00; rtmpPacket->m_body[index++] = 0x00; //AVC sequence header rtmpPacket->m_body[index++] = 0x01; RtmpPacket ->m_body[index++] = pack-> SPS [1]; RtmpPacket ->m_body[index++] = pack-> SPS [2]; RtmpPacket ->m_body[index++] = pack-> SPS [3]; //profile level rtmpPacket->m_body[index++] = 0xFF; RtmpPacket ->m_body[index++] = 0xE1; RtmpPacket ->m_body[index++] = (pack->sps_len >> 8) &0xff; RtmpPacket ->m_body[index++] = pack->sps_len & 0xff; Memcpy (&rtmpPacket->m_body[index], pack->sps_len); index +=pack->sps_len; // pps rtmpPacket->m_body[index++] = 0x01; RtmpPacket ->m_body[index++] = (pack->pps_len >> 8) &0xff; rtmpPacket->m_body[index++] = pack->pps_len & 0xff; Memcpy (&rtmpPacket->m_body[index], pack-> PPS, pack->pps_len) RtmpPacket ->m_packetType = RTMP_PACKET_TYPE_VIDEO; rtmpPacket->m_packetType = RTMP_PACKET_TYPE_VIDEO; // rtmpPacket->m_nBodySize = body_size; RtmpPacket ->m_nChannel = 0x04; rtmpPacket->m_nTimeStamp = 0; rtmpPacket->m_hasAbsTimestamp = 0; rtmpPacket->m_headerType = RTMP_PACKET_SIZE_LARGE; rtmpPacket->m_nInfoField2 = livePack->rtmp->m_stream_id; return rtmpPacket; } RTMPPacket *createRtmpPack(jbyte *array, jint length, jlong tms, LivePack *pack) { array += 4; RTMPPacket *packet = (RTMPPacket *) malloc(sizeof(RTMPPacket)); int body_size = length + 9; RTMPPacket_Alloc(packet, body_size); if (array[0] == 0x65) { packet->m_body[0] = 0x17; LOGI(" send keyframe data"); } else{ packet->m_body[0] = 0x27; LOGI(" send non-keyframe data"); } // fixed size packet->m_body[1] = 0x01; packet->m_body[2] = 0x00; packet->m_body[3] = 0x00; packet->m_body[4] = 0x00; Packet ->m_body[5] = (length >> 24) &0xff; packet->m_body[6] = (length >> 16) & 0xff; packet->m_body[7] = (length >> 8) & 0xff; packet->m_body[8] = (length) & 0xff; // memcpy(&packet->m_body[9], array, length); packet->m_packetType = RTMP_PACKET_TYPE_VIDEO; packet->m_nBodySize = body_size; packet->m_nChannel = 0x04; packet->m_nTimeStamp = tms; packet->m_hasAbsTimestamp = 0; packet->m_headerType = RTMP_PACKET_SIZE_LARGE; packet->m_nInfoField2 = pack->rtmp->m_stream_id; return packet; } void readSpsPps(jbyte *array, jint length, LivePack *pack) { for (int i = 0; i < length; If (array[I] == 0x00&& array[I +1] == 
0x00&& array[I +2] == 0x00&& array[I +3] == 0x00&& array[I +3] == 0x01 && array[I +4] == 0x68){// Save the SPS livePack->sps_len = i-4; livePack->sps = static_cast<int8_t *>(malloc(livePack->sps_len)); memcpy(livePack->sps,array + 4,livePack->sps_len); // save PPS livePack->pps_len = length -(livePack->sps_len+4) -4; livePack->pps = static_cast<int8_t *>(malloc(livePack->pps_len)); memcpy(livePack->pps,array+4+livePack->sps_len+4,livePack->pps_len); LOGI("sps:%d pps:%d", livePack->sps_len, livePack->pps_len); }}}Copy the code
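One note on the code above: it refers to a LivePack struct and a global livePack that are never shown. A minimal definition consistent with how they are used (inferred, not from the original) would be:

```cpp
#include "librtmp/rtmp.h"

// Inferred from usage above: the RTMP session plus the latest cached SPS/PPS
typedef struct {
    RTMP *rtmp;
    int8_t *sps;   // SPS NAL unit, including the 0x67 header byte
    int sps_len;
    int8_t *pps;   // PPS NAL unit, including the 0x68 header byte
    int pps_len;
} LivePack;

LivePack *livePack = nullptr;
```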

Use of Pointers

  • malloc allocates memory without initializing it, so a freshly allocated block holds garbage values; the allocated block is contiguous

  • Its return type is void*, a pointer with no associated type, which can be converted to a pointer of any other type. In C++, however, a void* cannot be assigned directly to an int* variable, so p = malloc(sizeof(int)) must be written with an explicit cast: p = (int *) malloc(sizeof(int))
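A small self-contained example of both points, in the same C++ style as the native code above:

```cpp
#include <cstdlib>
#include <cstring>

int main() {
    // malloc returns void*; in C++ it must be cast to the target pointer type
    int *p = (int *) malloc(4 * sizeof(int));

    // The block is contiguous but uninitialized (garbage values), so clear it
    memset(p, 0, 4 * sizeof(int));

    p[2] = 42;  // contiguous: p[0]..p[3] are adjacent ints

    free(p);    // every malloc needs a matching free
    return 0;
}
```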

Conclusion

To sum up: we obtained the picture of the phone screen through the system service. That raw data cannot be transmitted over the network as-is, so we encoded it to H.264, encapsulated it into RTMP packets, and transmitted them in the way the RTMP protocol stipulates.