
1. Foreword

Those of you who do performance work have probably run into this scenario: application-level performance testing finds that an operation responds slowly, a lot of time is spent tracing the problem, and the culprit turns out to be an inefficiently implemented algorithm deep in the code. This top-down approach is usually inefficient and costly. We should therefore run code-level performance tests on key algorithms early in the project, so that problems detectable at the code level are not left to be discovered in the final system performance testing phase. In practice, however, there is no dedicated tool for code-level performance testing; the usual approach is to adapt an existing unit testing framework.

The most common adaptations are:

  • Run a unit test case n times in a row instead of just once. The value of n usually ranges from 2,000 to 5,000.
  • Compute the average time of the n runs. If the average is long (that is, a single call to the function under test takes a long time), say on the order of seconds, the implementation logic of the function generally needs to be optimized.

The reason for executing n times is that a single function call usually completes in milliseconds, so the error of a single measurement is large; taking the average over many executions smooths it out, as sketched below.
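A minimal sketch of this manual approach, using plain JUnit 4 (targetFunction is a hypothetical stand-in for the algorithm under test):

import org.junit.Test;

public class ManualPerfTest {

	// Hypothetical function under test; replace with the real algorithm
	private String targetFunction(String msg) {
		return msg;
	}

	@Test
	public void averageLatency() {
		int n = 5000; // repeat the call n times
		long start = System.nanoTime();
		for (int i = 0; i < n; i++) {
			targetFunction("this is a test");
		}
		long elapsed = System.nanoTime() - start;
		// report the average time of a single call, in milliseconds
		System.out.printf("average: %.4f ms%n", elapsed / 1_000_000.0 / n);
	}

}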

Are there ready-made tools that do this for us? Of course there are, such as ContiPerf, the protagonist of today's article.

2. Introduction to ContiPerf

ContiPerf is a lightweight performance testing tool built on JUnit 4. It lets you specify the number of threads and the number of executions, and set limits on the maximum and average execution times, which makes it well suited to code-level performance testing.

Website address: sourceforge.net/p/contiperf…

3. Use of ContiPerf

Let's walk through an example.

First, add the dependencies to the POM:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
</dependency>

<!-- Introduce the ContiPerf testing tool -->
<dependency>
    <groupId>org.databene</groupId>
    <artifactId>contiperf</artifactId>
    <version>2.3.4</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>org.junit.vintage</groupId>
            <artifactId>junit-vintage-engine</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>

<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <scope>test</scope>
</dependency>

For demonstration purposes, write a simple test interface: UnitTestService.java

/**
 * Test interface class
 *
 * @author zuozewei
 */
public interface UnitTestService {

	public String process(String msg);

}

Implementation class: UnitTestServiceImpl.java

@Service
public class UnitTestServiceImpl implements UnitTestService {

	/**
	 * For testing purposes, return the value passed in directly
	 */
	@Override
	public String process(String msg) {
		return msg;
	}

}

Next, write the test class UnitTestServiceTest and introduce ContiPerfRule:

/**
 * Interface performance test class
 *
 * @author zuozewei
 */
@RunWith(SpringRunner.class)
@SpringBootTest // @SpringBootTest is the annotation Spring Boot provides for testing; it can specify the boot class, test environment, etc.
public class UnitTestServiceTest {

	@Autowired
	UnitTestService testService;

	// Introduce ContiPerf for performance testing
	@Rule
	public ContiPerfRule contiPerfRule = new ContiPerfRule();

	@Test
	@PerfTest(invocations = 10000, threads = 100) // 10000 invocations in total, across 100 threads
	public void test() {
		String msg = "this is a test";
		String result = testService.process(msg);
		// Assert that the result matches expectations
		Assert.assertEquals(msg, result);
	}

}

Note: @Rule is a JUnit extension point; its underlying interface is org.junit.rules.MethodRule (note that in JUnit 5 it has been replaced by TestRule). You can also specify default settings for all methods in a class by placing @PerfTest and @Required on the class itself, as sketched below.
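A minimal sketch of class-level defaults (the threshold values here are illustrative, not recommendations):

import org.databene.contiperf.PerfTest;
import org.databene.contiperf.Required;
import org.databene.contiperf.junit.ContiPerfRule;
import org.junit.Rule;
import org.junit.Test;

// Class-level defaults: every @Test method inherits these settings
// unless it declares its own @PerfTest/@Required
@PerfTest(invocations = 1000, threads = 10)
@Required(average = 50, max = 2000)
public class DefaultSettingsTest {

	@Rule
	public ContiPerfRule contiPerfRule = new ContiPerfRule();

	@Test
	public void usesClassDefaults() {
		// runs 1000 times on 10 threads; the average must stay within 50 ms, the worst case within 2000 ms
	}

}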

@PerfTest attributes:

  • invocations: number of executions n, independent of the number of threads; the default is 1
  • threads: number of threads n; the invocations are distributed across the n threads and executed concurrently
  • duration: repeated execution time n; the test runs for at least n milliseconds (see the sketch after this list)
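For a time-based run, a sketch that reuses the test class above (testService and contiPerfRule as already declared) might look like this:

	@Test
	@PerfTest(duration = 5000, threads = 20) // run for at least 5000 ms on 20 threads
	public void timeBasedTest() {
		String msg = "this is a test";
		Assert.assertEquals(msg, testService.process(msg));
	}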

@Required attributes (a combined example follows the list):

  • @Required(throughput = 20): at least 20 executions per second are required;
  • @Required(average = 50): the average execution time must not exceed 50 ms;
  • @Required(median = 45): 50% of all executions must take no more than 45 ms;
  • @Required(max = 2000): no execution may exceed 2 s;
  • @Required(totalTime = 5000): the total execution time must not exceed 5 s;
  • @Required(percentile90 = 3000): 90% of executions must take no more than 3 s;
  • @Required(percentile95 = 5000): 95% of executions must take no more than 5 s;
  • @Required(percentile99 = 10000): 99% of executions must take no more than 10 s;
  • @Required(percentiles = "66:200,96:500"): 66% of executions must take no more than 200 ms and 96% no more than 500 ms.
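Putting the two annotations together, a sketch for the same test class (the thresholds are illustrative); if any requirement is violated, ContiPerf fails the test:

	@Test
	@PerfTest(invocations = 10000, threads = 100)
	@Required(average = 50, percentile95 = 200, max = 2000)
	public void testWithRequirements() {
		String msg = "this is a test";
		Assert.assertEquals(msg, testService.process(msg));
	}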

Run the test, and the console prints a summary like this:

com.zuozewei.springbootcontiperfdemo.service.UnitTestServiceTest.test samples: 10000 max: 331 average: 33.3522 median: 30

You can also open target/contiperf-report/index.html to view the results as a chart:

Note: the chart is rendered using external web resources, so displaying it may require unrestricted internet access (in some regions, a proxy).

Indicators in the chart:

  • Execution time: total execution time
  • Throughput: transactions per second (TPS)
  • Min. latency: minimum response time
  • Average latency: average response time
  • Median: median response time
  • 90%: 90th percentile response time
  • Max latency: maximum response time

4. Summary

This article walked through some simple examples of using JUnit and ContiPerf together. Bringing this kind of code-level performance testing into the unit testing phase costs very little and clearly improves ROI. I hope it gives you some inspiration.

Sample code:

  • Github.com/zuozewei/bl…
