At first I really wanted to write the whole stack myself, but I didn't have enough time. So after a brief analysis of the scrcpy source code, I kept the existing scrcpy server side. The official scrcpy client uses FFmpeg + SDL2 for cross-platform playback; I planned to rebuild the client part with Flutter.
Scrcpy and Vysor are excellent screen mirroring projects. They are non-invasive (no app needs to be installed on the device), offer low-latency control, high FPS, and so on. Scrcpy has around 26k stars on GitHub.
Scrcpy startup phase
How does scrcpy get the Android device's screen so quickly after a single command, with no screen-capture permission prompt on the device, and control it with such low latency? Developers with ADB experience will know that when debugging an Android device from the PC, you can type:
adb shell /system/bin/screencap -p PATH
This captures the phone screen directly. Remove the -p flag and redirect with > to write the screenshot straight to a local file on the computer; likewise, the screenrecord command records the phone screen.
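The same screenshot can be triggered from the PC side programmatically. A minimal sketch, assuming `adb` is on the PATH and using `exec-out` so the binary PNG output reaches stdout unmangled (the output file name `screen.png` is arbitrary):

```java
import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class AdbScreenshot {
    // Build the adb command whose stdout is the raw PNG screenshot.
    // `adb exec-out` streams binary output directly, avoiding the
    // line-ending mangling that plain `adb shell` can introduce.
    public static List<String> buildCommand() {
        return Arrays.asList("adb", "exec-out", "screencap", "-p");
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(buildCommand());
        pb.redirectOutput(new File("screen.png")); // local file on the PC
        Process p = pb.start();
        p.waitFor();
    }
}
```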
Both operations clearly require screen-capture permission, so why is the user never asked to grant access to the screen?
The reason is that the adb shell runs as the shell user, whose permissions are far higher than an ordinary app's.
So when scrcpy starts, it uploads a JAR from its distribution to the Android device. This JAR does not contain Java .class files: the Java bytecode has been converted into a dex file by the dx tool, so when the JAR is unpacked there is a dex inside, which is Android bytecode and can run directly on the device.
Push the JAR to the phone
adb push $sdk/scrcpy-server.jar /data/local/tmp
Use Android's app_process to start the JAR directly. It is not limited to app_process; in theory dalvikvm can also start it. The command:
CLASSPATH=/data/local/tmp/scrcpy-server.jar app_process / com.genymobile.scrcpy.Server 1.12.1 0 8000000 0 true - true true
The trailing arguments are the defaults I pulled out of the source code; scrcpy passes them the same way. They are received by the main function of the com.genymobile.scrcpy.Server class, which then opens two sockets and waits for the client to connect: one socket streams video, the other carries device control.
Why can this socket be connected to PC?
Because adb provides port forwarding: a local port on the PC can be mapped to a socket on the device, and the PC then connects to that forwarded port to send and receive data (requires Android 5.0 or later). To forward a port:
adb forward tcp:5005 localabstract:scrcpy
# All traffic sent to port 5005 on the PC is forwarded to the device's UNIX abstract socket named scrcpy
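Once the port is forwarded, the PC opens an ordinary TCP socket to localhost:5005. In tunnel-forward mode the server first writes one dummy byte (so the client can detect connection errors), then the device name and video size. The sketch below assumes a null-padded 64-byte name field followed by two big-endian 16-bit sizes, matching scrcpy 1.12's header; verify the field length against your server version:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.net.Socket;

public class ScrcpyHandshake {
    // Extract the device name from its null-padded fixed-size field.
    public static String parseDeviceName(byte[] nameField) {
        int end = 0;
        while (end < nameField.length && nameField[end] != 0) {
            end++;
        }
        return new String(nameField, 0, end);
    }

    // Combine two bytes into an unsigned big-endian 16-bit value.
    public static int parseUnsignedShort(byte[] b, int offset) {
        return ((b[offset] & 0xff) << 8) | (b[offset + 1] & 0xff);
    }

    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("localhost", 5005)) {
            DataInputStream in = new DataInputStream(socket.getInputStream());
            in.readByte(); // dummy byte written by the server in tunnel-forward mode
            byte[] name = new byte[64]; // assumed name field length
            in.readFully(name);
            byte[] size = new byte[4];
            in.readFully(size);
            System.out.println(parseDeviceName(name) + " "
                    + parseUnsignedShort(size, 0) + "x" + parseUnsignedShort(size, 2));
            // everything after this header is the encoded video stream
        }
    }
}
```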
This is also the working principle behind all screen mirroring software of this kind, including Vysor.
Scrcpy run phase
As mentioned above, the scrcpy JAR acts as a server on the device, creating a local socket for the video stream and waiting for the client to connect, as shown below:
public static DesktopConnection open(Device device, boolean tunnelForward) throws IOException {
    LocalSocket videoSocket;
    LocalSocket controlSocket;
    if (tunnelForward) {
        LocalServerSocket localServerSocket = new LocalServerSocket(SOCKET_NAME);
        try {
            System.out.println("Waiting for video socket connection...");
            videoSocket = localServerSocket.accept();
            System.out.println("video socket is connected.");
            // send one byte so the client may read() to detect a connection error
            videoSocket.getOutputStream().write(0);
            try {
                System.out.println("Waiting for input socket connection...");
                controlSocket = localServerSocket.accept();
                System.out.println("input socket is connected.");
            } catch (IOException | RuntimeException e) {
                videoSocket.close();
                throw e;
            }
        } finally {
            localServerSocket.close();
        }
    } else {
        videoSocket = connect(SOCKET_NAME);
        try {
            controlSocket = connect(SOCKET_NAME);
        } catch (IOException | RuntimeException e) {
            videoSocket.close();
            throw e;
        }
    }
    DesktopConnection connection = new DesktopConnection(videoSocket, controlSocket);
    Size videoSize = device.getScreenInfo().getVideoSize();
    connection.send(Device.getDeviceName(), videoSize.getWidth(), videoSize.getHeight());
    return connection;
}
(I added the print statements.)
Once a client connects to the video socket, the server immediately starts a new thread that records the screen and streams the encoded frames to the client through that socket. It then waits for a second socket connection, which is used to receive control messages from the client.
How do I control the device based on client messages?
Scrcpy defines its own control protocol, but the code has very few comments and the function calls jump around, so it takes a long time to follow. Even the person who rewrote the scrcpy client with a Qt GUI had to start from this source code. I'll share my parsing of it directly.
Socket knowledge:
A socket transmits byte[]. 1 byte contains 8 bits; unsigned, it covers 0 to 255, and signed, one bit is spent on the sign, so the range is -128 to 127.
Byte widths of the data types:
1 short = 2 bytes = 16 bits
1 int = 4 bytes = 32 bits
1 long = 8 bytes = 64 bits
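These sizes can be checked directly in Java:

```java
public class TypeSizes {
    public static void main(String[] args) {
        // Java exposes the byte and bit width of each primitive as constants
        System.out.println("short: " + Short.BYTES + " bytes, " + Short.SIZE + " bits");
        System.out.println("int:   " + Integer.BYTES + " bytes, " + Integer.SIZE + " bits");
        System.out.println("long:  " + Long.BYTES + " bytes, " + Long.SIZE + " bits");
        // a signed byte spans -128..127; unsigned it would span 0..255
        System.out.println("byte range: " + Byte.MIN_VALUE + ".." + Byte.MAX_VALUE);
    }
}
```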
In the click and swipe control message for the device:
- 1 byte marks the control type
- 1 byte for the action, indicating whether the event is a press, a lift, or a move
- 8 bytes for the pointerId, which identifies the finger in multi-touch
- 12 bytes for the position: 4 bytes each for the x and y coordinates of the touch, and 2 bytes each for the screen width and height
- 2 bytes for pressureInt, the force of the press
- 4 bytes for buttons
Code section
private ControlMessage parseInjectTouchEvent() {
    // System.out.println("Remaining length ===> " + buffer.remaining());
    if (buffer.remaining() < INJECT_TOUCH_EVENT_PAYLOAD_LENGTH) {
        return null;
    }
    int action = toUnsigned(buffer.get());
    long pointerId = buffer.getLong();
    Position position = readPosition(buffer);
    // 16 bits fixed-point
    int pressureInt = toUnsigned(buffer.getShort());
    // convert it to a float between 0 and 1 (0x1p16f is 2^16 as float)
    float pressure = pressureInt == 0xffff ? 1f : (pressureInt / 0x1p16f);
    int buttons = buffer.getInt();
    return ControlMessage.createInjectTouchEvent(action, pointerId, position, pressure, buttons);
}
private static Position readPosition(ByteBuffer buffer) {
    int x = buffer.getInt();
    int y = buffer.getInt();
    int screenWidth = toUnsigned(buffer.getShort());
    int screenHeight = toUnsigned(buffer.getShort());
    return new Position(x, y, screenWidth, screenHeight);
}
Here buffer is the receive cache of the socket, a class from the Java NIO package. It has get(), getLong(), and getShort() methods, each of which consumes the specified number of bytes from the buffer and assembles them into an int, long, or short. So when we want to trigger a press event on the device (take a 1080×2280 screen as an example):
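For example, wrapping raw bytes and reading them back with these methods (ByteBuffer defaults to big-endian byte order):

```java
import java.nio.ByteBuffer;

public class BufferDemo {
    // read 4 bytes as a big-endian int (ByteBuffer's default byte order)
    public static int readInt(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getInt();
    }

    // read 2 bytes as a big-endian short
    public static short readShort(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getShort();
    }

    public static void main(String[] args) {
        // the bytes 0x00 0x00 0x04 0x38 assemble into the int 1080
        System.out.println(readInt(new byte[]{0, 0, 4, 56}));  // 1080
        // the same two low bytes read as a short
        System.out.println(readShort(new byte[]{4, 56}));      // 1080
    }
}
```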
Here is the Dart code; other languages work the same way.
Press:
[
  2, 0, 0, 0, 0, 0, 0, 0, 0, 0,
  x >> 24, x << 8 >> 24, x << 16 >> 24, x << 24 >> 24,
  y >> 24, y << 8 >> 24, y << 16 >> 24, y << 24 >> 24,
  1080 >> 8, 1080 << 8 >> 8, 2280 >> 8, 2280 << 8 >> 8,
  0, 0, 0, 0, 0, 0
]
Lift:
[
  2, 1, 0, 0, 0, 0, 0, 0, 0, 0,
  x >> 24, x << 8 >> 24, x << 16 >> 24, x << 24 >> 24,
  y >> 24, y << 8 >> 24, y << 16 >> 24, y << 24 >> 24,
  1080 >> 8, 1080 << 8 >> 8, 2280 >> 8, 2280 << 8 >> 8,
  0, 0, 0, 0, 0, 0
]
Move:
[
  2, 2, 0, 0, 0, 0, 0, 0, 0, 0,
  x >> 24, x << 8 >> 24, x << 16 >> 24, x << 24 >> 24,
  y >> 24, y << 8 >> 24, y << 16 >> 24, y << 24 >> 24,
  1080 >> 8, 1080 << 8 >> 8, 2280 >> 8, 2280 << 8 >> 8,
  0, 0, 0, 0, 0, 0
]
Here x and y are the screen coordinates you want to tap. The scrcpy server receives this data and injects a press at position (x, y) on the screen. There is a lot of bit shifting involved.
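Instead of shifting by hand as in the Dart arrays, the same 28-byte message can be packed with ByteBuffer. This is a sketch, not scrcpy's own client code; the field order mirrors parseInjectTouchEvent, and the action values 0/1/2 for press/lift/move match the second byte of the arrays above:

```java
import java.nio.ByteBuffer;

public class TouchMessage {
    private static final byte TYPE_INJECT_TOUCH_EVENT = 2;

    // Pack one touch event the way parseInjectTouchEvent() unpacks it:
    // 1 type byte + 1 action byte + 8-byte pointerId + 4-byte x + 4-byte y
    // + 2-byte screen width + 2-byte screen height + 2-byte pressure + 4-byte buttons
    public static byte[] encode(int action, long pointerId, int x, int y,
                                int screenWidth, int screenHeight,
                                int pressure, int buttons) {
        ByteBuffer buffer = ByteBuffer.allocate(28); // big-endian by default
        buffer.put(TYPE_INJECT_TOUCH_EVENT);
        buffer.put((byte) action);
        buffer.putLong(pointerId);
        buffer.putInt(x);
        buffer.putInt(y);
        buffer.putShort((short) screenWidth);
        buffer.putShort((short) screenHeight);
        buffer.putShort((short) pressure);
        buffer.putInt(buttons);
        return buffer.array();
    }
}
```

Sending encode(0, 0, x, y, 1080, 2280, 0, 0) followed by encode(1, ...) over the control socket produces the same bytes as the press and lift arrays above.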
For example, Java NIO calls getShort to get the width of the screen
int screenWidth = toUnsigned(buffer.getShort());

private static int toUnsigned(short value) {
    return value & 0xffff;
}
As mentioned above, 1 short is 2 bytes, i.e. 16 bits, so the largest unsigned value is
1111111111111111
that is, an upper limit of 2^16 - 1 = 65535, which is more than enough for transmitting a resolution.
If the screen width is 1080, that is larger than 1 byte can hold, so we have to split it into a byte[] before sending it to the socket. How do we transfer the screen width in bytes over the socket? An example:
Convert the number 1080 to binary:
10000111000
We need to store it in 2 bytes, so we just have to find each byte (each 8-bit value) of it. Padding with leading zeros to 16 bits:
00000100 00111000
So the two bytes are
00000100 and 00111000
By passing these two bytes into the socket, the buffer’s getShort method yields the value of 1080.
So how do we get the value of each byte in our program?
By shift operation:
First byte: shift 1080 right by 8 bits:
00000100 00111000 >> 8 = 00000000 00000100
Second byte: shift 1080 left by 8 bits (within 16 bits, clearing the high part), then right by 8 bits:
00000100 00111000 << 8 = 00111000 00000000, then >> 8 = 00000000 00111000
The second byte can also be obtained with a bitwise AND:
00000100 00111000 & 00000000 11111111 = 00000000 00111000
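The shift and mask operations above, checked in Java:

```java
public class SplitShort {
    // top 8 bits of a 16-bit value
    public static int highByte(int value) {
        return (value >> 8) & 0xff;
    }

    // bottom 8 bits, the bitwise-AND variant from the text
    public static int lowByte(int value) {
        return value & 0xff;
    }

    // reassemble the two bytes, as getShort does on the other end
    public static int combine(int high, int low) {
        return (high << 8) | low;
    }

    public static void main(String[] args) {
        System.out.println(highByte(1080)); // 4  (00000100)
        System.out.println(lowByte(1080));  // 56 (00111000)
        System.out.println(combine(4, 56)); // 1080
    }
}
```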
How is NIO's buffer.getShort implemented in Java? ByteBuffer is an abstract class and the actual implementation lives in its concrete subclasses, but we can implement the idea ourselves:
short getShort(ByteBuffer buffer) {
    // mask each byte with 0xff to avoid sign extension before combining
    return (short) (((buffer.get() & 0xff) << 8) | (buffer.get() & 0xff));
}
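The & 0xff masks matter: get() returns a signed byte, and without masking, a low byte with its high bit set is sign-extended to a negative int. For example, the value 400 is the bytes 0x01 and 0x90:

```java
public class SignExtension {
    public static int naive(byte high, byte low) {
        return high << 8 | low; // wrong: low is sign-extended to a negative int
    }

    public static int masked(byte high, byte low) {
        return ((high & 0xff) << 8) | (low & 0xff); // correct
    }

    public static void main(String[] args) {
        byte high = 0x01;
        byte low = (byte) 0x90; // -112 as a signed byte
        System.out.println(naive(high, low));  // -112, not 400
        System.out.println(masked(high, low)); // 400
    }
}
```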
Of course, in practice we just use ByteBuffer directly. Everything above has been verified: the client developed with Flutter can already control devices on the LAN with low latency.