What is APP performance testing

I looked it up on the Internet and there does not seem to be an established definition, so I will give one based on my own experience; any resemblance to other definitions is purely coincidental.

Client performance testing means, from the perspective of the business and the user: designing reasonable and effective performance test scenarios; defining the performance indicators for each scenario (memory, CPU, jank, frame rate, power, load, duration, etc.); formulating a standardized execution process; collecting performance data with performance testing tools while executing the scenarios according to that standard; analyzing the data; if there is a performance problem, locating it and working with development to fix, verify, and release; and finally producing a complete performance report.

From this definition, we can see that APP performance testing needs to pay attention to the following aspects: performance test scenario design, definition of performance indicators, a standardized execution process, performance data collection, performance data analysis, performance problem location, and the performance test report.

Performance testing doesn’t mean we pick up a tool, run a scenario, get the data, output a report, and call it done. Each step should be targeted and should demonstrate the tester’s professionalism.

How to do APP performance test

Let’s take a look at each:

Design of performance test scenarios

A scenario may be a repetition of one operation or a combination of several operations. A performance test scenario must contain repeated or continuous operations; the purpose is to magnify performance problems through repetition so that they can actually be detected.

For example, to test the performance of feed scrolling, you can design a scenario that scrolls the feed 50 times, with a 2-second interval between swipes.
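
As a concrete sketch, a scenario like this can be scripted with adb; the swipe coordinates and timings below are illustrative placeholders, not values from a real project:

```python
import subprocess
import time

SWIPES = 50
INTERVAL_S = 2
# x1 y1 x2 y2 duration(ms); coordinates are assumptions for a typical screen
SWIPE_ARGS = ["500", "1600", "500", "400", "300"]

def run_feed_scroll_scenario() -> None:
    for _ in range(SWIPES):
        # `adb shell input swipe` is a standard Android command
        subprocess.run(["adb", "shell", "input", "swipe", *SWIPE_ARGS], check=True)
        time.sleep(INTERVAL_S)  # a fixed interval keeps runs comparable across versions

if __name__ == "__main__":
    run_feed_scroll_scenario()
```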

Definition of performance indicators

Common mobile performance metrics include memory, CPU, frame rate, jank count, wake-up count, display duration, and so on. Which metrics to focus on depends on the performance test scenario.

For example, take the recommendation tab of Bilibili: when we cold start the APP and enter the recommendation tab, we care more about the data display duration, while in the scrolling scenario we care more about the jank count. We need to think carefully about designing reasonable performance indicators for each scene.
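
One way to keep this explicit is to declare the metric set per scenario in the test configuration, so that the runner and the report both know what to collect. A minimal sketch; the names are illustrative, not a real tool's schema:

```python
# Metric sets per scenario; scenario and metric names are assumptions.
SCENARIO_METRICS = {
    "cold_start_recommend_tab": ["display_duration_ms", "memory_mb", "cpu_pct"],
    "feed_scroll": ["jank_count", "frame_rate_fps", "cpu_pct"],
}
```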

Standardized execution process

Once the scenario and indicators are defined, it is time to start execution, and execution must be standardized. Standardized execution does not simply mean executing according to the scenario definition; there are many points to pay attention to.

Some specifications that can be defined:

  • How long do you need to wait before the scenario is executed?
  • How long do you need to wait after the scenario is executed?
  • Does each test need a cold start, or a reinstall?
  • How long do you need to wait after the APP starts before the test begins?

Each of these points can affect the accuracy of the performance data, so specifications must be defined and followed on every run. Moreover, the specifications are dynamic: as testing continues, we will find more factors that affect the data, and the specifications must be extended to avoid them. Good specifications also lay the foundation for the performance data analysis later.
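
As a sketch of how these specification points can be enforced in code rather than by memory, here is a minimal runner built on standard adb commands; the package name and wait times are assumptions:

```python
import subprocess
import time

PACKAGE = "com.example.app"  # assumed package name
PRE_WAIT_S = 30    # settle time after launch, before the scenario starts
POST_WAIT_S = 30   # settle time after the scenario ends

def cold_start() -> None:
    # Force-stop the app, then relaunch it via the launcher intent;
    # both are standard adb shell commands.
    subprocess.run(["adb", "shell", "am", "force-stop", PACKAGE], check=True)
    subprocess.run(
        ["adb", "shell", "monkey", "-p", PACKAGE,
         "-c", "android.intent.category.LAUNCHER", "1"],
        check=True,
    )

def run_standardized(scenario) -> None:
    cold_start()              # the spec here says: cold start for every run
    time.sleep(PRE_WAIT_S)    # wait before the scenario is executed
    scenario()                # e.g. run_feed_scroll_scenario from above
    time.sleep(POST_WAIT_S)   # wait after the scenario is executed
```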

Performance data collection

Performance data collection is probably the easiest part of the entire client performance test. The mature tool PerfDog is convenient and simple to use, or you can pay for the commercial PerfDog Service to automate performance data collection.
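
If neither tool is at hand, a crude collection loop can be built on Android's standard dumpsys command. This is only a rough sketch (the package name is an assumption); real tools such as PerfDog sample far more accurately:

```python
import subprocess
import time

PACKAGE = "com.example.app"  # assumed package name

def sample_meminfo() -> str:
    # The output of `dumpsys meminfo <package>` includes a TOTAL PSS line
    # that can be parsed into a memory figure for the trend chart.
    return subprocess.run(
        ["adb", "shell", "dumpsys", "meminfo", PACKAGE],
        capture_output=True, text=True, check=True,
    ).stdout

def collect(duration_s: int = 100, interval_s: int = 2) -> list[str]:
    samples = []
    for _ in range(duration_s // interval_s):
        samples.append(sample_meminfo())
        time.sleep(interval_s)
    return samples
```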

Performance data analysis

After the performance data is collected, it needs to be analyzed. How do we analyze it? I will only touch on this briefly here; a dedicated article on performance data analysis will follow. A small computation sketch comes after the list:

  • The trend chart shows how the scenario performs in the current version, from which the following conclusions can be drawn:

    • How the indicator's fluctuation compares with the trend charts of previous versions
    • Changes in the indicator's peak, the scenario mean, and the growth over the run
  • How the value at the start of the scenario changed from the previous version

  • How the value after the end of the scenario changed from the previous version
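
The comparisons above boil down to a few summary statistics per run; a minimal sketch, assuming each run is a list of numeric samples for one indicator (e.g. memory in MB over the scenario):

```python
from statistics import mean

def summarize(samples: list[float]) -> dict:
    return {
        "start": samples[0],                 # value at the start of the scenario
        "end": samples[-1],                  # value after the scenario ends
        "peak": max(samples),                # performance indicator peak
        "mean": mean(samples),               # scenario mean
        "growth": samples[-1] - samples[0],  # increase over the run
    }

def compare_versions(current: list[float], previous: list[float]) -> dict:
    cur, prev = summarize(current), summarize(previous)
    # Positive deltas mean the current version is worse for cost-type metrics.
    return {key: cur[key] - prev[key] for key in cur}
```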

Locating performance problems

After analyzing the performance data, if there is a problem, we need to locate which part of the business, or which MR (merge request), introduced it, and that requires backtracking:

  • First contact the developers and discuss whether the problem can be identified from its symptoms; if not, we need to test further to locate which integrated MR caused it
  • List all the MRs integrated in this version and filter down to those touching the business where the performance problem appears
  • Run the builds from before and after each candidate MR again and confirm whether that MR has an impact (see the sketch below)
  • Once the offending MR is determined, communicate with the developers again
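
The third step is essentially a linear search over the candidate MRs; a sketch, where build_for and run_scenario_and_score are hypothetical helpers for fetching an installable build and scoring a scenario run:

```python
# build_for(mr, side) returns a build just before ("before") or after
# ("after") the given MR; run_scenario_and_score runs the standardized
# scenario on that build and returns the indicator value. Both are assumed.
def find_offending_mr(candidate_mrs, build_for, run_scenario_and_score, threshold):
    for mr in candidate_mrs:
        before = run_scenario_and_score(build_for(mr, "before"))
        after = run_scenario_and_score(build_for(mr, "after"))
        if after - before > threshold:
            # The regression appears across this MR: report it to the developers.
            return mr
    return None  # no single MR explains the regression; dig deeper
```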

Performance test report

The purpose of the performance test report is to show the performance of the current version. It needs to include a few core modules (a rendering sketch follows the list):

  • The test results
  • Performance problem attribution
  • Performance indicator data for each scenario
  • Test environment and test plan
  • Performance indicator charts for each scenario
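
Once these modules are prepared, assembling the report itself can be automated. A small sketch that renders the modules above into Markdown; the section content (text, tables, chart image links) is assumed to be produced elsewhere:

```python
SECTION_ORDER = [
    "Test results",
    "Performance problem attribution",
    "Performance indicator data for each scenario",
    "Test environment and test plan",
    "Performance indicator charts for each scenario",
]

def render_report(version: str, sections: dict[str, str]) -> str:
    lines = [f"# Performance Test Report - {version}", ""]
    for title in SECTION_ORDER:
        lines += [f"## {title}", "", sections.get(title, "TBD"), ""]
    return "\n".join(lines)
```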

The above is some of my shallow understanding and experience of APP performance testing. If you have any questions, please leave a message and we can discuss them together.

You are welcome to visit my blog for more on client-side performance testing and automated testing: Blog address