preface

The previous section introduced how to get started with github.com/houbb/junit… .

In this section, we’ll look at the implementation from the source perspective.

How to test performance?

Junit Rules


The core of implementing a performance testing framework based on Junit 4 is to understand Junit Rules.

Official document: github.com/junit-team/…

The role of Rules

Rules allow for great flexibility in adding or redefining the behavior of each test method in a test class.

Testers can reuse or extend one of the rules provided below, or write their own.

Custom rules

Ps: The following is from an official example.

Most custom rules can be implemented as extensions of the ExternalResource rule.

However, if you need more information about the test class or method in question, you need to implement the TestRule interface.

import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;

public class IdentityRule implements TestRule {
  @Override
  public Statement apply(final Statement base, final Description description) {
    return base;
  }
}

Of course, the power of implementing TestRule comes from using a combination of custom constructors, adding methods to the class for testing, and wrapping the supplied Statement in a new Statement.

For example, consider the following test rule that provides a named logger for each test:

package org.example.junit;

import java.util.logging.Logger;

import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;

public class TestLogger implements TestRule {
  private Logger logger;

  public Logger getLogger() {
    return this.logger;
  }

  @Override
  public Statement apply(final Statement base, final Description description) {
    return new Statement() {
      @Override
      public void evaluate() throws Throwable {
        logger = Logger.getLogger(description.getTestClass().getName() + "." + description.getDisplayName());
        base.evaluate();
      }
    };
  }
}

This rule can then be used as follows:

import java.util.logging.Logger;

import org.example.junit.TestLogger;
import org.junit.Rule;
import org.junit.Test;

public class MyLoggerTest {

  @Rule
  public final TestLogger logger = new TestLogger();

  @Test
  public void checkOutMyLogger() {
    final Logger log = logger.getLogger();
    log.warning("Your test is showing!");
  }
}

Definition and Use

Looking at the example above, we can see that custom rules in junit4 are relatively simple.

Definition: Implement the TestRule interface

Usage: annotate the corresponding public field with @Rule.

Isn’t that easy?

So now that you’ve learned that 1 plus 1 is 2, let’s learn about Taylor’s expansion.

Performance test algorithm flow

How do you count the execution time of a method?

Record a timestamp before the method starts and another after it ends; the difference is the elapsed time.
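As a minimal sketch in plain Java (the class name TimingSketch is made up for illustration, not part of junitperf), the timing logic looks like this:

```java
public class TimingSketch {

    /** Measure the elapsed time of a task in nanoseconds. */
    public static long measureNs(Runnable task) {
        long startTimeNs = System.nanoTime(); // timestamp before the method starts
        task.run();                           // run the method under test
        return System.nanoTime() - startTimeNs; // difference = elapsed time
    }

    public static void main(String[] args) {
        long costNs = measureNs(() -> {
            try {
                Thread.sleep(20); // simulate 20ms of work
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        System.out.println("costNs = " + costNs);
    }
}
```

System.nanoTime() is used rather than System.currentTimeMillis() because it is a monotonic clock, which is what you want for measuring intervals.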

How do you simulate multiple thread calls?

Simulate concurrent calls by running the method on multiple Java threads.
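A rough sketch of that simulation, under assumed names (ConcurrencySketch is illustrative only): spawn N worker threads that call the target in a loop, let the main thread sleep for the configured duration, then flip a shared flag to stop the workers.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

public class ConcurrencySketch {

    private static volatile boolean running = true;
    private static final AtomicLong INVOCATIONS = new AtomicLong();

    /** Run a trivial "statement" on the given number of threads for durationMs. */
    public static long simulate(int threads, long durationMs) throws InterruptedException {
        running = true;
        INVOCATIONS.set(0);

        List<Thread> workers = new ArrayList<>();
        for (int i = 0; i < threads; i++) {
            Thread t = new Thread(() -> {
                while (running) {
                    INVOCATIONS.incrementAndGet(); // stand-in for statement.evaluate()
                }
            });
            t.start();
            workers.add(t);
        }

        Thread.sleep(durationMs); // main thread waits for the configured duration
        running = false;          // signal workers to stop
        for (Thread t : workers) {
            t.join();
        }
        return INVOCATIONS.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("invocations: " + simulate(2, 100));
    }
}
```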

How do you generate a report file?

Combine the statistics collected above with an HTML template to generate the report file.
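The report generation itself can be as simple as string templating. A hypothetical sketch (ReportSketch and its fields are assumptions for illustration, not the library's actual Reporter):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReportSketch {

    /** Render a tiny HTML report from a few statistics. */
    public static String render(String testName, long invocations, long errors, double avgMs) {
        return "<html><body><h1>" + testName + "</h1>"
                + "<p>invocations: " + invocations + "</p>"
                + "<p>errors: " + errors + "</p>"
                + "<p>avg latency: " + avgMs + " ms</p>"
                + "</body></html>";
    }

    public static void main(String[] args) throws IOException {
        Path out = Files.createTempFile("perf-report", ".html");
        Files.write(out, render("helloWorldTest", 42, 0, 21.5).getBytes());
        System.out.println("report written to " + out);
    }
}
```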

What we’re going to do is combine the points above and implement them with Junit4 Rules.

Sounds easy, doesn’t it?

Now, let’s take a look at the implementation source code.

Rule entry

Introductory example

Let’s start with an example of getting started with junit4:

public class HelloWorldTest {

    @Rule
    public JunitPerfRule junitPerfRule = new JunitPerfRule();

    /**
     * Single thread, run for 1000ms, output test results as HTML by default.
     * @throws InterruptedException if any
     */
    @Test
    @JunitPerfConfig(duration = 1000)
    public void helloWorldTest() throws InterruptedException {
        System.out.println("hello world");
        Thread.sleep(20);
    }
}

JunitPerfRule is the custom rule we mentioned earlier.

JunitPerfRule

The implementation is as follows:

public class JunitPerfRule implements TestRule {

    //region private fields
    // Omit the internal variables
    //endregion

    @Override
    public Statement apply(Statement statement, Description description) {
        Statement activeStatement = statement;
        JunitPerfConfig junitPerfConfig = description.getAnnotation(JunitPerfConfig.class);
        JunitPerfRequire junitPerfRequire = description.getAnnotation(JunitPerfRequire.class);

        if (ObjectUtil.isNotNull(junitPerfConfig)) {
            // Group test contexts by test class
            ACTIVE_CONTEXTS.putIfAbsent(description.getTestClass(), new HashSet<EvaluationContext>());

            EvaluationContext evaluationContext = new EvaluationContext(description.getMethodName(), DateUtil.getSimpleDateStr());
            evaluationContext.loadConfig(junitPerfConfig);
            evaluationContext.loadRequire(junitPerfRequire);
            ACTIVE_CONTEXTS.get(description.getTestClass()).add(evaluationContext);

            activeStatement = new PerformanceEvaluationStatement(evaluationContext,
                    statement,
                    statisticsCalculator,
                    reporterSet,
                    ACTIVE_CONTEXTS.get(description.getTestClass()),
                    description.getTestClass()
            );
        }

        return activeStatement;
    }
}

The main process is to obtain the @JunitPerfConfig and @JunitPerfRequire annotations on the method, and then perform the corresponding execution and statistics.
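That annotation lookup relies on plain Java reflection. A self-contained sketch of the mechanism (the PerfConfig annotation here is a hypothetical stand-in for @JunitPerfConfig):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class AnnotationSketch {

    /** Hypothetical stand-in for @JunitPerfConfig. */
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface PerfConfig {
        long duration() default 1000;
        int threads() default 1;
    }

    @PerfConfig(duration = 2000)
    public void demoTest() {
    }

    /** Read the configured duration off the annotated method, or -1 if absent. */
    public static long readDuration() throws Exception {
        Method m = AnnotationSketch.class.getMethod("demoTest");
        PerfConfig config = m.getAnnotation(PerfConfig.class);
        return config == null ? -1 : config.duration();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("duration = " + readDuration());
    }
}
```

JUnit's Description.getAnnotation(...) is essentially a convenience wrapper over this kind of reflective lookup.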

Statement

A Statement is one of the most central execution objects in junit4.

As we can see, based on the annotation information, the original statement is wrapped into a PerformanceEvaluationStatement.

The core of PerformanceEvaluationStatement implementation is as follows:

/**
 * Performance test statement
 * @author an old horse whistling against the west wind
 * @see com.github.houbb.junitperf.core.rule.JunitPerfRule the rule this statement is used for
 */
public class PerformanceEvaluationStatement extends Statement {

    // Omit the internal variables

    @Override
    public void evaluate() throws Throwable {
        List<PerformanceEvaluationTask> taskList = new LinkedList<>();

        try {
            EvaluationConfig evaluationConfig = evaluationContext.getEvaluationConfig();
            
            // Create the corresponding number of execution threads according to the annotation configuration
            for(int i = 0; i < evaluationConfig.getConfigThreads(); i++) {
                // Initialize the execution task
                PerformanceEvaluationTask task = new PerformanceEvaluationTask(evaluationConfig.getConfigWarmUp(),
                        statement, statisticsCalculator);
                Thread t = FACTORY.newThread(task);
                taskList.add(task);
                // The child thread executes the task
                t.start();
            }

            // The main thread is waiting
            Thread.sleep(evaluationConfig.getConfigDuration());
        } finally {
            // When execution is interrupted, the interrupted task may have already started executing (and not yet finished), causing the main thread to move down and the interrupted thread to continue
            for(PerformanceEvaluationTask task : taskList) {
                task.setContinue(false);    // Terminate the running task
            }
        }

        // Update statistics
        evaluationContext.setStatisticsCalculator(statisticsCalculator);
        evaluationContext.runValidation();

        generateReportor();
    }

    /** Generate the report */
    private synchronized void generateReportor() {
        for (Reporter reporter : reporterSet) {
            reporter.report(testClass, evaluationContextSet);
        }
    }
}

Here is the core implementation part. The main process is as follows:

(1) According to the configuration, create the corresponding task subthread

(2) According to the configuration, the subtask is initialized and executed

(3) The main thread waits for sleep

(4) After the main thread wakes up, signal the child threads to stop and update the statistics

(5) Generate corresponding test report files according to statistical information

PerformanceEvaluationTask

The implementation of subtasks is also noteworthy. The core implementation is as follows:

public class PerformanceEvaluationTask implements Runnable {

    /** Warm-up time (ns) */
    private long warmUpNs;

    /** junit statement */
    private final Statement statement;

    /** Statistics calculator */
    private StatisticsCalculator statisticsCalculator;

    /** Whether to continue (flag bit) */
    private volatile boolean isContinue;

    public PerformanceEvaluationTask(long warmUpNs, Statement statement, StatisticsCalculator statisticsCalculator) {
        this.warmUpNs = warmUpNs;
        this.statement = statement;
        this.statisticsCalculator = statisticsCalculator;
        this.isContinue = true; // By default, execution continues when created
    }

    @Override
    public void run() {
        long startTimeNs = System.nanoTime();
        long startMeasurements = startTimeNs + warmUpNs;
        while (isContinue) {
            evaluateStatement(startMeasurements);
        }
    }

    /**
     * Perform the evaluation
     * @param startMeasurements start time of measurement
     */
    private void evaluateStatement(long startMeasurements) {
        //0. If the flag is false, exit the execution.
        if (!isContinue) {
            return;
        }

        //1. The preparation stage
        if (nanoTime() < startMeasurements) {
            try {
                statement.evaluate();
            } catch (Throwable throwable) {
                // IGNORE
            }
        } else {
            long startTimeNs = nanoTime();
            try {
                statement.evaluate();
                statisticsCalculator.addLatencyMeasurement(getCostTimeNs(startTimeNs));
                statisticsCalculator.incrementEvaluationCount();
            } catch (InterruptedException e) { // NOSONAR
                // IGNORE - no metrics
            } catch (Throwable throwable) {
                statisticsCalculator.incrementEvaluationCount();
                statisticsCalculator.incrementErrorCount();
                statisticsCalculator.addLatencyMeasurement(getCostTimeNs(startTimeNs));
            }
        }
    }

    /**
     * Get the elapsed time (nanoseconds)
     * @param startTimeNs start time
     * @return elapsed time
     */
    private long getCostTimeNs(long startTimeNs) {
        long currentTimeNs = System.nanoTime();
        return currentTimeNs - startTimeNs;
    }

    //region getter & setter
    public boolean isContinue() {
        return isContinue;
    }

    public void setContinue(boolean aContinue) {
        isContinue = aContinue;
    }
    //endregion
}

This task is mainly responsible for timing each execution of the statement.

Count the number of successes and exceptions.
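One straightforward way to keep such counters thread-safe is AtomicLong. A sketch under assumed names (StatisticsSketch is illustrative; the library's actual StatisticsCalculator may be implemented differently):

```java
import java.util.concurrent.atomic.AtomicLong;

public class StatisticsSketch {

    private final AtomicLong evaluationCount = new AtomicLong();
    private final AtomicLong errorCount = new AtomicLong();
    private final AtomicLong totalLatencyNs = new AtomicLong();

    /** Record one execution's latency in nanoseconds. */
    public void addLatencyMeasurement(long latencyNs) {
        totalLatencyNs.addAndGet(latencyNs);
    }

    public void incrementEvaluationCount() {
        evaluationCount.incrementAndGet();
    }

    public void incrementErrorCount() {
        errorCount.incrementAndGet();
    }

    /** Mean latency (ns) over all recorded evaluations. */
    public double meanLatencyNs() {
        long count = evaluationCount.get();
        return count == 0 ? 0.0 : (double) totalLatencyNs.get() / count;
    }

    public long getEvaluationCount() {
        return evaluationCount.get();
    }

    public long getErrorCount() {
        return errorCount.get();
    }
}
```

Because many worker threads record measurements concurrently, plain long fields would lose updates; atomic counters avoid that without explicit locking.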

The isContinue flag is declared volatile so that, once the main thread finishes sleeping and sets it to false, the worker threads promptly see the change and exit their loops.

Ps: there is still a known issue: if statement.evaluate() has already started executing, it cannot be interrupted mid-call, because the flag is only checked between iterations. This is an area for improvement.
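One possible mitigation (a sketch of the general technique, not junitperf's actual behavior) is to additionally call Thread.interrupt() on the workers, which does break out of blocking calls such as Thread.sleep():

```java
public class InterruptSketch {

    /** Start a worker blocked in a long sleep, interrupt it, and report whether it noticed. */
    public static boolean runAndInterrupt() throws InterruptedException {
        final boolean[] interrupted = {false};
        Thread worker = new Thread(() -> {
            try {
                Thread.sleep(60_000); // stand-in for a long-running statement
            } catch (InterruptedException e) {
                interrupted[0] = true; // blocking call was broken by the interrupt
            }
        });
        worker.start();
        Thread.sleep(100);    // let the worker enter its sleep
        worker.interrupt();   // unlike a volatile flag, this breaks the blocking call
        worker.join();
        return interrupted[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("interrupted: " + runAndInterrupt());
    }
}
```

Note that interruption only helps when the statement itself is blocked in an interruptible call; a CPU-bound loop that never checks Thread.interrupted() still cannot be stopped this way.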

summary

This article starts with junit Rules and analyzes the implementation of the entire performance testing tool.

In general, the implementation idea is not very difficult; every complex application is built from simple parts.

To make it easier to follow, the source code shown here has been heavily simplified.

For the full source code, go to github.com/houbb/junit… .

I am an old horse, looking forward to the next reunion with you.

Of course, you may find that this approach is not elegant enough, but junit5 provides us with more powerful capabilities, which we will explore in the next section.

References

Github.com/houbb/junit…

Github.com/junit-team/…