Results

By modifying SpringBoard.app in a jailbroken environment, an iOS desktop with an unlimited-screen mode is realized. The actual effect, captured on a device, is as follows:

Background

A few days ago, Smartisan held its summer press conference, which the author watched with the mindset of listening to a crosstalk show. On seeing the infinite-screen segment, he could not help admiring Lao Luo's imagination. Practicality aside, the author studied the principle and implementation of the infinite screen, and faithfully reproduced the feature on a jailbroken device.

Principle

Achieving an unlimited screen mainly requires two things. The first is a stable inertial navigation algorithm to obtain the relative displacement of the phone. The second is rendering a virtual space far larger than the phone's screen, so that when the viewport is displaced, it produces the effect of roaming across an infinite screen. This article explains the concrete implementation of both, and the infinite-desktop source is open-sourced at the end of the article.

Get the relative displacement of the phone

ARKit can implement a relatively stable visual odometer by combining dual cameras, or a single camera with the gyroscope, detecting the phone's pose and displacement in the real world and mapping them into the virtual world. To obtain the phone's relative displacement, we can start an ARSession in the app and read the camera's position in the virtual world from the ARFrame update callback, from which the relative displacement is calculated.

ARKit's virtual world uses a right-handed coordinate system, as shown below.

In Lao Luo's presentation, the infinite-screen feature mainly consists of two parts: moving the viewport left and right along the X axis, and moving it up and down along the Y axis. Therefore, we need to obtain the relative displacement along the X and Y axes from ARFrame.

After the ARSession starts, the ARFrame update callback fires continuously. In the callback we can get the camera's transform matrix, which is 4×4. The matrix is column-major, and the first three elements of its fourth column are the coordinates along the X, Y and Z axes relative to the AR origin; this column is also known as the camera's translation vector.

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    // camera.transform is column-major; column 3 holds the translation
    matrix_float4x4 transform = frame.camera.transform;
    simd_float4 pos = transform.columns[3];
    float x = pos[0];
    float y = pos[1];
    float z = pos[2];
}

It should be noted that these three coordinates are relative to the origin determined by ARKit, whereas we want the phone's movement relative to its current position, so the origin needs to be recalibrated. A simple method is to record the current X, Y and Z values as a reference point A(x0, y0, z0) once ARFrame has initialized, and to compute all subsequent positions relative to point A.
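As a minimal illustration of that recalibration (the Vec3 type and function below are the author's own sketch for this article, not ARKit API), each subsequent sample P is simply subtracted from the reference point A, axis by axis:

```c
// A plain 3-component vector standing in for ARKit's translation column
// (hypothetical helper type, not an ARKit type).
typedef struct { float x, y, z; } Vec3;

// Relative displacement of the current sample p with respect to the
// calibrated reference point a: p - a, per axis.
Vec3 relative_to(Vec3 p, Vec3 a) {
    return (Vec3){p.x - a.x, p.y - a.y, p.z - a.z};
}
```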

During ARKit's initialization phase, the translation vector is all zeros. Therefore the first non-zero translation is taken as a sign that initialization is complete: point A is recorded and relative-position output begins, as shown in the following code.

// Variables used to track the three axes
@property (nonatomic, assign) float x_pre;
@property (nonatomic, assign) float x_base;
@property (nonatomic, assign) BOOL hasInitX;
@property (nonatomic, assign) BOOL findXBase;

@property (nonatomic, assign) float y_pre;
@property (nonatomic, assign) float y_base;
@property (nonatomic, assign) BOOL hasInitY;
@property (nonatomic, assign) BOOL findYBase;

@property (nonatomic, assign) float z_pre;
@property (nonatomic, assign) float z_base;
@property (nonatomic, assign) BOOL hasInitZ;
@property (nonatomic, assign) BOOL findZBase;

// val: the camera's actual coordinate on an axis
// pre: the previous camera coordinate on that axis
// base: the calibrated origin
// hasInit: whether ARKit initialization is complete for the axis
// findBase: whether origin calibration is complete for the axis
float calculateOffset(float val, float *pre, float *base, BOOL *hasInit, BOOL *findBase) {
    // ARKit outputs an all-zero translation while initializing;
    // the first non-zero value marks the end of initialization
    if (!(*hasInit)) {
        if (fabsf(val) < 0.0000001f) {
            NSLog(@"init");
            return 0;
        }
        *hasInit = YES;
    }
    // When two consecutive outputs differ by very little, the axis is
    // considered stable and its current value becomes the calibrated origin
    if (!(*findBase)) {
        if (fabsf(val - *pre) < 0.01f) {
            NSLog(@"value is stable at %f", val);
            *base = val;
            *findBase = YES;
        }
        *pre = val;
        return 0;
    }
    // Distance between the current translation and the calibrated origin
    float offset = val - *base;
    *pre = val;
    return offset;
}

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    matrix_float4x4 transform = frame.camera.transform;
    simd_float4 pos = transform.columns[3];
    // The ARCamera translation
    float x = pos[0];
    float y = pos[1];
    float z = pos[2];
    // Calculate the offset relative to the phone's starting position
    float offsetX = calculateOffset(x, &_x_pre, &_x_base, &_hasInitX, &_findXBase);
    float offsetY = calculateOffset(y, &_y_pre, &_y_base, &_hasInitY, &_findYBase);
    float offsetZ = calculateOffset(z, &_z_pre, &_z_base, &_hasInitZ, &_findZBase);
    // Output the stable offset (offsetX, offsetY, offsetZ)
}

The above code is a bit convoluted because it must modify external state from inside a C function, passing basic types back and forth through pointers, which is not very elegant. In any case, each axis has three key pieces of state: hasInit for ARKit initialization, findBase for calibration, and pre, which records the last output so that the moment ARKit's output becomes stable can be detected. Together these complete the origin calibration, after which the three-axis offset relative to the phone's starting position can be obtained.
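One way to tidy this up is to fold the per-axis state into one struct and pass a single pointer. The sketch below is plain C with a hypothetical AxisTracker type (the names are this article's invention, not from the plug-in source; NSLog calls are dropped):

```c
#include <math.h>
#include <stdbool.h>

// State for one axis: the calibrated origin plus the flags described above.
typedef struct {
    float pre;      // previous output, used to detect stability
    float base;     // calibrated origin
    bool hasInit;   // ARKit initialization finished for this axis
    bool findBase;  // origin calibration finished for this axis
} AxisTracker;

// Equivalent calibration logic to calculateOffset, with the per-axis
// globals folded into a struct. Returns 0 until the origin is calibrated,
// then the offset relative to that origin.
float axis_tracker_update(AxisTracker *t, float val) {
    // ARKit outputs an all-zero translation while initializing
    if (!t->hasInit) {
        if (fabsf(val) < 0.0000001f) return 0.0f;
        t->hasInit = true;
    }
    // First stable value becomes the calibrated origin
    if (!t->findBase) {
        if (fabsf(val - t->pre) < 0.01f) {
            t->base = val;
            t->findBase = true;
        }
        t->pre = val;
        return 0.0f;
    }
    t->pre = val;
    return val - t->base;
}
```

One tracker per axis then replaces the twelve properties above.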

Render the virtual space

The implementation of the infinite desktop is similar to viewing the desktop version of a web page in a mobile browser: the phone's screen acts as a viewport browsing within an area larger than the screen itself. What actually changes is the position of the viewport, which can be pictured as a camera pointing straight down at a huge picture and traveling across it.
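As a sketch of that viewport mapping (the scale constant and function below are assumptions for illustration, not taken from the plug-in source), the phone's physical displacement in meters can be converted to a scroll offset in points and clamped so the viewport stays inside the virtual canvas:

```c
// Hypothetical tuning constant: screen points of viewport travel
// per meter of physical displacement.
static const float kPointsPerMeter = 4000.0f;

static float clampf(float v, float lo, float hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

// Map a physical offset (meters) on one axis to a content offset (points)
// inside a virtual canvas larger than the screen.
// start: content offset when unlimited-screen mode was entered.
float map_axis(float start, float offsetMeters, float contentLen, float viewportLen) {
    float target = start + offsetMeters * kPointsPerMeter;
    // Keep the viewport inside the canvas
    return clampf(target, 0.0f, contentLen - viewportLen);
}
```

The same function is applied independently to the X and Y offsets produced by the ARKit callback.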

As for SpringBoard.app, its desktop is actually a huge UIScrollView, so it is already a virtual space larger than the screen, containing the -1 screen and the multi-page desktop. However, to achieve some 3D effects, the author chose to take a screenshot of SpringBoard's scroll view: while roaming, the real desktop is hidden and a "fake desktop" is displayed, which for convenience we call FakeScrollView. What is added to FakeScrollView is a processed screenshot of the real desktop.

Capture the full view of a UIScrollView

The entire contentSize range of a UIScrollView can be drawn into a graphics context by rendering its layer, as shown below.

// scrollView is SBIconScrollView, the desktop of SpringBoard.app
CGRect rect = (CGRect){CGPointZero, scrollView.contentSize};
UIGraphicsBeginImageContextWithOptions(rect.size, NO, [UIScreen mainScreen].scale);
[scrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *desktopImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Add the camera and map areas above and below the desktop image

At the conference, Lao Luo demonstrated moving the phone up to take a selfie and moving it down to open the map. To reproduce this, the author post-processed the desktopImage obtained above, using CoreGraphics to draw a topImage above the picture and a bottomImage below it. The topImage contains a row of camera icons and the bottomImage a row of Earth icons. To stitch the images together, a larger graphics context is opened and the images are rendered at their designated positions in turn. The complete code is as follows.

// Capture the desktop as the middle part of the larger image
CGRect rect = (CGRect){CGPointZero, scrollView.contentSize};
UIGraphicsBeginImageContextWithOptions(rect.size, NO, [UIScreen mainScreen].scale);
[scrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *middleImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Read the camera and earth icons from a resource file; USBResource is a resource helper class
UIImage *topImage = [USBResource imageNamed:@"camera.png"];
UIImage *bottomImage = [USBResource imageNamed:@"earth.png"];
// Vertical spacing of the upper and lower areas
CGFloat imageMargin = 320;
// Horizontal spacing between tiled camera and earth icons
CGFloat marginH = 80;
// Exact icon sizes
CGFloat topImageW = 120;
CGFloat topImageH = 89;
CGFloat bottomImageW = 120;
CGFloat bottomImageH = 120;
// The context used to render the full image
CGSize ctxSize = CGSizeMake(middleImage.size.width, middleImage.size.height + topImageH + bottomImageH + imageMargin * 4);
UIGraphicsBeginImageContextWithOptions(ctxSize, NO, [UIScreen mainScreen].scale);
// Draw the top row: cameras
CGFloat topImageX = marginH;
CGFloat topImageY = topImageH + imageMargin;
NSInteger count = (ctxSize.width - marginH) / (topImageW + marginH);
for (NSInteger i = 0; i < count; i++) {
    [topImage drawInRect:CGRectMake(topImageX, topImageY, topImageW, topImageH)];
    topImageX += topImageW + marginH;
}
// Draw the middle: the desktop
[middleImage drawInRect:CGRectMake(0, topImageH + imageMargin * 2, middleImage.size.width, middleImage.size.height)];
// Draw the bottom row: earths
CGFloat bottomImageX = marginH;
CGFloat bottomImageY = ctxSize.height - imageMargin - bottomImageH;
count = (ctxSize.width - marginH) / (bottomImageW + marginH);
for (NSInteger i = 0; i < count; i++) {
    [bottomImage drawInRect:CGRectMake(bottomImageX, bottomImageY, bottomImageW, bottomImageH)];
    bottomImageX += bottomImageW + marginH;
}
// Get the "fake desktop" image
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Then add the snapshot image to FakeScrollView; when unlimited-screen mode is enabled, hide the real desktop SBIconScrollView and display FakeScrollView. For a better effect, some 3D affine transforms are applied to FakeScrollView and the snapshot image; the final result is shown below. That part of the code can be found in the source at the end of the article and is not described here.
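The transform code itself lives in the source, but the heart of such a 3D effect is typically a CATransform3D whose m34 entry adds perspective. As a rough, self-contained illustration (the function below is this article's sketch, not code from the plug-in):

```c
// Scale factor a point at depth z receives under a perspective transform
// with m34 = -1/d (the usual CATransform3D setup): the transform leaves
// the homogeneous coordinate at w = 1 - z/d, and projection divides by w.
float perspective_scale(float z, float d) {
    float w = 1.0f - z / d;
    return 1.0f / w;
}
```

Content lifted toward the viewer (positive z) is magnified, content pushed away is shrunk, which is what gives the tilted desktop its depth.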

Implementation

Since SpringBoard.app needs to be modified, this article assumes a jailbroken environment. Readers without one can instead apply the modification to an app of their own that scrolls both horizontally and vertically, such as a map or a PDF reader. The implementation section of this article focuses on how to modify SpringBoard.app to achieve this effect.

Knowledge base and environment

  • Basic knowledge of jailbreak development: SSH, SCP, dynamic library loading, and hook implementation
  • An iPhone or iPad with ARKit support
  • A jailbroken iPhone or iPad (Electra jailbreak)
  • The Theos development environment: theos.github.io
  • The MonkeyDev development environment: github.com/AloneMonkey…

MonkeyDev is designed to simplify Theos's compile, link, and deployment process. It is not strictly required, but without it the Xcode project at the end of this article will not run as-is, and you will need to compile and install the deb manually; MonkeyDev automates the whole process.

Hook SpringBoard

The author hooks SBIconScrollView, SpringBoard's desktop view, using the Logos language provided by Theos. Since the desktop pages, UIScrollView's - (void)setPagingEnabled:(BOOL)enabled method must be called during startup, so we use this method as the entry point of the hook. Note that the following code is written in Logos.

%hook SBIconScrollView

- (void)setPagingEnabled:(BOOL)enabled {
    static const void *key;
    // Use an associated object to prevent the initialization from running repeatedly
    if (objc_getAssociatedObject(self, &key) != nil) {
        %orig(enabled);
        return;
    }
    // Complete the initialization here
    // ...
    objc_setAssociatedObject(self, &key, @"", OBJC_ASSOCIATION_RETAIN);
    %orig(enabled);
}

%end

Besides hooking SBIconScrollView, we also need the window that hosts the desktop, SBHomeScreenWindow, so that FakeScrollView and other views can be added to it. The following code demonstrates how to find this window and record it for subsequent operations.

for (UIWindow *window in [UIApplication sharedApplication].windows) {
    if ([window isKindOfClass:NSClassFromString(@"SBHomeScreenWindow")]) {
        // Record the key window and its root view controller
        UIWindow *mainWindow = window;
        UIViewController *mainVc = window.rootViewController;
        break;
    }
}

Since a dynamic library cannot add instance variables to the hooked classes, this key information can only be recorded through the runtime's associated objects. A large number of associated objects makes the code inelegant; a better solution is to maintain this information in a global singleton.
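A minimal sketch of that singleton idea, written here as plain C for illustration (USBContext and usb_context are hypothetical names; in the plug-in this would be an Objective-C class with a +sharedInstance method):

```c
#include <stdbool.h>

// Shared state the hook needs across call sites; the void pointers stand in
// for UIWindow * and UIViewController * in the real Objective-C code.
typedef struct {
    void *mainWindow;
    void *mainViewController;
    bool unlimitedModeOn;
} USBContext;

// Every call site reads and writes the same zero-initialized instance.
USBContext *usb_context(void) {
    static USBContext ctx;
    return &ctx;
}
```

With a single accessor like this, none of the per-view associated objects are needed for bookkeeping.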

Enter and exit the unlimited screen mode

To enter unlimited-screen mode, the hooked class is hidden, a FakeScrollView is added to the window, and an ARSession is started for position tracking; conversely, to exit unlimited-screen mode, the ARSession is stopped and the original scene is restored.

Resource access for dynamic libraries

Since the dynamic library is inserted directly into the load commands of the Mach-O file as a dylib, it cannot carry resources when loaded. An elegant approach is to place the resources as a bundle in the dylib's installation directory and access them via an absolute path. In a jailbroken environment the dylib installation directory is /Library/MobileSubstrate/DynamicLibraries; a resource bundle is placed there and a resource-access class is encapsulated, as follows.

#import "USBResource.h"

#define BundlePath @"/Library/MobileSubstrate/DynamicLibraries/UltimateSpringBoard.bundle"

@implementation USBResource

+ (UIImage *)imageNamed:(NSString *)name {
    return [UIImage imageWithContentsOfFile:[BundlePath stringByAppendingPathComponent:name]];
}

@end

Add permissions for SpringBoard

Since ARKit needs the camera, a permission must be added for SpringBoard, which requires modifying SpringBoard's Info.plist directly. Don't worry: the Info.plist of system apps and self-developed apps is not covered by the code signature, so it can be modified directly. To be safe, backing up Info.plist first is recommended in case of emergency.

First, SSH into the device and find SpringBoard.app's path with ps -ef | grep SpringBoard. Then enter that path and transfer Info.plist to the computer using the scp command or an SFTP client, add the NSCameraUsageDescription entry in Xcode, and overwrite the original using scp.

Safe mode

Because SpringBoard.app is modified directly, a serious bug that does not actually crash SpringBoard will not trigger the jailbroken system's SpringBoard safe mode, making it impossible to restart SpringBoard while away from the computer. If SpringBoard cannot be operated normally at that point, the phone becomes unusable. It is therefore necessary to design a "self-destruct" feature so the plug-in can restart SpringBoard on its own. The author's solution is to add a button to SpringBoard that calls exit(0) when tapped; the system then relaunches SpringBoard automatically.

// Add a respring button
UIButton *closeBtn = [UIButton new];
// ... configuration omitted
[closeBtn addTarget:self action:@selector(closeBtnClick) forControlEvents:UIControlEventTouchUpInside];
[window addSubview:closeBtn];

// The callback method, added to the hooked class via %new
%new
- (void)closeBtnClick {
    exit(0);
}

Source code and operation

Download the source code

Github.com/Soulghost/I…

Configuration

  1. Open the Xcode project
  2. Open Build Settings of the UltimateSpringBoard target and configure MonkeyDevDeviceIP and Port under User-Defined settings. This information is used to automatically transfer and install the deb onto the phone after the Theos build
  3. Transfer arch/UltimateSpringBoard.bundle from the project root directory to /Library/MobileSubstrate/DynamicLibraries/ using the scp command; these are the resources the plug-in needs to access
  4. Add the NSCameraUsageDescription permission to SpringBoard.app's Info.plist
  5. Build the project to complete the installation

Manually compile and install

  • The project's Packages directory contains compiled deb packages that you can install directly to try it out
  • UltimateSpringBoard.xm is the Logos main file and can be compiled manually with Theos

Closing thoughts

Perhaps the infinite screen does not bring much practical value, but the exploration process was fun. Hopefully this article helps readers who are curious about how the infinite screen works and who want to practice jailbreak plug-in development.