In recent testing work I ran into some new problems that put new requirements on my testing framework. One of them is a soft start for performance tests; another is the challenge of high QPS.

Take the fixed-thread model as an example. I used to launch N concurrent threads at once to drive the load. This is crude, but for services with moderate throughput it works fine, because the pressure does not spike suddenly: each thread's initial request takes a relatively long and stable time to complete, which is itself one reason the QPS stays low. For a high-QPS service, however, the initial requests complete very quickly, so even in the short window before the load stabilizes a very large number of requests gets fired. Compared with the steady phase, the sudden burst is very noticeable.

So I took some time to implement a soft-start feature, and here I share the experience.

Soft start concept

Here is the encyclopedia definition:

The voltage is gradually raised from zero to the rated voltage, so the motor starts smoothly, with no impact torque during the whole starting process. This is a soft start.

It is still physics, a concept I remember from middle school.

Train of thought

Fixed thread model

This one is relatively simple. The implementation can borrow from JMeter: insert a start interval between thread launches, so the threads ramp up gradually at the beginning of the run.

Here I set a constant:

    /**
     * Performance test ramp-up time, in seconds
     */
    public static double RUNUP_TIME = 30.0;

Then in the com.funtester.frame.execute.Concurrent#start method I added one line to the startup code:

        startTime = Time.getTimeStamp();
        for (int i = 0; i < threadNum; i++) {
            ThreadBase thread = threads.get(i);
            if (StringUtils.isBlank(thread.threadName)) thread.threadName = StatisticsUtil.getTrueName(desc) + i;
            thread.setCountDownLatch(countDownLatch);
            sleep(RUNUP_TIME / threadNum); // spread thread startup evenly across the ramp-up window
            executorService.execute(thread);
        }
        shutdownService(executorService, countDownLatch);
        endTime = Time.getTimeStamp();

Then, in the later test cases, you can flexibly assign values:

    Constant.RUNUP_TIME = util.getIntOrdefault(2, 30)

Fixed QPS model

I found no existing solution for this model, so I came up with my own idea: take a fixed budget of requests (the default 10 s worth, PREFIX_RUN) and stretch it linearly over a longer window (default 30 s, RUNUP_TIME), so the request rate climbs up to the target QPS. For example, with the target set to 5000 QPS, the default warm-up starts at roughly 1000 QPS (5000 / (2 × 30 s / 10 s − 1)) and increases linearly to 5000 QPS over 30 s, although this is only a rough estimate.

        interval = 1_000_000_000 / qps; // interval between requests in ns (1 s = 1000 ms, 1 ms = 1_000_000 ns)
        int runupTotal = qps * PREFIX_RUN; // total number of warm-up requests
        double diffTime = 2 * (Constant.RUNUP_TIME / PREFIX_RUN * interval - interval); // difference between the maximum and minimum intervals
        double piece = diffTime / runupTotal; // interval decrement per request
        for (int i = runupTotal; i > 0; i--) {
            executorService.execute(threads.get(limit-- % queueLength).clone());
            sleep((long) (interval + i * piece)); // the sleep shrinks linearly toward `interval`
        }
        logger.info("Warm up complete, begin testing!");
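The arithmetic behind the ramp can be checked in isolation. This is a hypothetical standalone sketch, not FunTester code; the class and method names are illustrative. It computes the starting rate of the linear ramp: the first request sleeps `interval * (2 * RUNUP_TIME / PREFIX_RUN - 1)` nanoseconds, so the initial rate is the target QPS divided by that factor.

```java
public class RampUpMath {

    /**
     * Initial QPS of the linear ramp described above: the first request sleeps
     * interval + runupTotal * piece = interval * (2 * runupTime / prefixRun - 1),
     * so the starting rate is qps divided by that factor.
     */
    public static double initialQps(int qps, double runupTime, double prefixRun) {
        return qps / (2 * runupTime / prefixRun - 1);
    }

    public static void main(String[] args) {
        // With qps = 5000, RUNUP_TIME = 30 s and PREFIX_RUN = 10 s,
        // the ramp starts at 5000 / (2 * 3 - 1) = 1000 QPS.
        System.out.println(initialQps(5000, 30.0, 10.0)); // prints 1000.0
    }
}
```

With RUNUP_TIME equal to PREFIX_RUN the factor becomes 1, i.e. no ramp at all, which matches the intuition that stretching 10 s of requests over 10 s is just the steady rate.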

Impact on error calculation

I have recently been studying error calculation in performance testing and have written several articles on it. Naturally, a soft start also affects some of the locally calculated performance metrics. First, the earlier posts:

  • Performance test error analysis text version – on
  • Performance test error analysis text version – next
  • Performance test error statistical practice
  • Comparative Study of Performance Test Errors (I)

Comparative Study of Performance Test Errors (II) is still only in my head……

The idea for eliminating this error is simple: once the system warm-up completes, reset all the data counters.

PS: In my practice, the actual QPS is closer to the QPS calculated from the average response time than to QPS2.
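For context, the average-RT-based estimate referred to above works like this under a fixed-thread model: each thread issues requests back to back, so the implied throughput is the thread count divided by the average response time. This is an illustrative sketch with hypothetical names, not FunTester code; QPS2 is the alternative metric defined in the earlier error articles.

```java
public class QpsFromAvgRt {

    /**
     * Estimated QPS for `threads` concurrent workers issuing requests
     * back to back, given the average response time in milliseconds.
     */
    public static double estimate(int threads, double avgRtMs) {
        return threads * 1000.0 / avgRtMs;
    }

    public static void main(String[] args) {
        // 20 threads with a 10 ms average response time imply about 2000 QPS.
        System.out.println(estimate(20, 10.0)); // prints 2000.0
    }
}
```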

Fixed thread model

The idea here is to run the soft-start threads, stop them, clear the various counters, and then continue with the fully concurrent test.

        for (int i = 0; i < threadNum; i++) {
            ThreadBase thread = threads.get(i);
            if (StringUtils.isBlank(thread.threadName)) thread.threadName = StatisticsUtil.getTrueName(desc) + i;
            thread.setCountDownLatch(countDownLatch);
            sleep(RUNUP_TIME / threadNum); // ramp the warm-up threads up gradually
            executorService.execute(thread);
        }
        sleep(1.0);
        ThreadBase.stop(); // signal the warm-up threads to stop
        try {
            countDownLatch.await(); // wait for all warm-up threads to finish
        } catch (InterruptedException e) {
            FailException.fail("Soft start performance test failed!");
        }
        threads.forEach(f -> f.initBase()); // clear each thread's counters
        logger.info("Warm up complete, begin testing!");
        countDownLatch = new CountDownLatch(threadNum); // fresh latch for the formal test

One pitfall: ThreadBase.stop() works through a static flag declared as private static boolean ABORT = false. After the warm-up threads exit, the flag must be reset so that it is false again during the formal test.

The com.funtester.base.constaint.ThreadBase#initBase code is as follows:

    /**
     * Clears the accumulated statistics after the object is cloned
     */
    public void initBase() {
        this.executeNum = 0;
        this.errorNum = 0;
        this.costs = new ArrayList<>();
        this.marks = new ArrayList<>();
    }

Fixed QPS model

This one is relatively simple, because the task generator is controlled directly. Once all the warm-up tasks have been dispatched, the counters can be cleared. A few stragglers may escape the reset, but that does not matter much.

        int runupTotal = qps * PREFIX_RUN; // total number of warm-up requests
        double diffTime = 2 * (Constant.RUNUP_TIME / PREFIX_RUN * interval - interval); // difference between the maximum and minimum intervals
        double piece = diffTime / runupTotal; // interval decrement per request
        for (int i = runupTotal; i > 0; i--) {
            executorService.execute(threads.get(limit-- % queueLength).clone());
            sleep((long) (interval + i * piece));
        }
        sleep(1.0);
        allTimes = new Vector<>(); // reset the response-time records
        marks = new Vector<>(); // reset the request marks
        executeTimes.getAndSet(0); // reset the execution counter
        errorTimes.getAndSet(0); // reset the error counter
        logger.info("Warm up complete, begin testing!");

PS: I did not use CyclicBarrier or Phaser here. Unlike the rendezvous and multi-phase synchronization problems discussed in my earlier article on collection points in performance testing, the two starts here are adjacent in time but not strongly coupled under the fixed-thread model, and using those two classes might cause other problems. Honestly, I use them rarely and do not know them deeply.
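The alternative I did use, recreating the CountDownLatch between phases, can be sketched as follows. This is a minimal hypothetical example, not FunTester code: one fresh latch per phase keeps the warm-up and the formal run loosely coupled, instead of sharing a reusable CyclicBarrier.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class TwoPhaseLatch {

    /** Runs one phase with `workers` threads; returns true once all of them finish. */
    static boolean runPhase(String name, int workers) throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(workers); // fresh latch per phase
        for (int i = 0; i < workers; i++) {
            new Thread(() -> {
                // ... one phase of request work would go here ...
                latch.countDown(); // this worker has finished the phase
            }).start();
        }
        // the coordinating thread blocks until the whole phase has ended
        return latch.await(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        runPhase("warm-up", 4);     // soft-start phase
        // counters would be reset here, as in the article
        runPhase("formal test", 4); // formal test phase with a brand-new latch
    }
}
```

Since a CountDownLatch cannot be reused once it reaches zero, creating a new one per phase is the simplest way to get two independent synchronization points.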

