Background
In daily development, project interfaces are usually defined and provided by the service provider, and in the microservice model it is normal for an interface to be invoked by multiple consumers. How, then, can changes to a provider's interface be communicated to its consumers quickly, efficiently, and without omission? And when a service is invoked by several consumers at the same time, how can we ensure that every consumer becomes aware of changes to that service? These are the questions contract testing answers. Interface testing is likewise a very important testing method in the microservice model: it verifies the collaboration and interaction between microservices in real projects, greatly reduces testing cost, and improves testing efficiency. It can be said that interface testing is a booster for business function testing. This article therefore introduces both testing methods.
Phases of interface testing and contract testing
In practice, we adapted the principles of the automated test pyramid to the realities of our payment business and added contract testing as a layer, forming the new version of the automated test pyramid shown below.
As can be seen from the figure, viewed along the project timeline, a project's testing process runs first through unit testing, then automated interface testing and contract testing, and finally automated UI testing and manual testing.
How to carry out interface testing in microservice mode
Interface testing belongs to the category of integration testing and is the extension and continuation of unit testing. Its main concern is whether an internal interface functions completely, for example whether its internal logic is correct and whether exceptions are handled properly. It is a transitional phase between unit testing and contract testing, and it is the bridge that connects a project's individual pieces of code logic into valuable business logic, so it plays an important role. Our approach to interface testing in the payment business was: standards and methods first, then tool selection and staff training, then implementation and process optimization, and finally normalization into a routine of continuous improvement and quality assurance.
Interface test standardization requirements
Both the quality assurance of interface testing and the testing process itself need to be guided and constrained by norms and methods. We defined the following requirements (excerpt):
- When an interface is added or changed, new interface test cases must be written or the existing cases must be maintained;
- Existing interfaces involved in a requirement must be regression tested;
- Interface test coverage must reach 100%;
- At least one round of interface regression testing must be completed before the requirement's testing ends, and the regression pass rate must reach 100%.
The test process specification covers the entire flow from requirements definition, through scripting and execution, to test reporting.
- Interface documentation. Interface documents are the basis of interface test case design; their comprehensiveness and accuracy determine how complete the test scope is and how correct and valid the test results are. Swagger is used for interface document management.
- Interface case design. Interface test cases are designed according to the interface documents and written on the interface test platform, and they must leave no scenario uncovered.
- Interface case review. Depending on the project, after the interface test cases are written, the relevant stakeholders should review them, and minutes of the review meeting should be recorded and distributed.
- Interface case execution. The interface test cases are run through at least one regression in the test environment before the requirement's testing is completed, and the case pass rate must reach 100%.
- Defect management and test report.
- The scripts are incorporated into the regression system and run regularly, continuously guaranteeing interface quality and providing continuous, timely feedback on it.
The script naming and writing specifications are as follows (excerpt):
- Interface naming: use Interface Name_Interface Description to identify an interface uniquely.
- Method naming: use Method Name_Description to identify a method uniquely.
- Case naming: use Sequence Number_Scenario Operation_Expected Result to identify a case uniquely.
- [Mandatory] Each interface test case must contain at least one assertion;
- [Mandatory] For JSON-formatted packets, the expected values of interface input parameters and assertion responses must be in strict JSON format.
- [Mandatory] When importing a Swagger script into the interface test platform, the imported .json file must be encoded in UTF-8 without BOM.
- [Mandatory] SQL used for data initialization and assertions must have a WHERE condition that uniquely locates the target data;
- [Mandatory] SQL used for database rollback must contain a WHERE condition that uniquely locates the data to be rolled back.
- [Mandatory] SQL that touches public tables (such as T_BAP_CDE_BNK) or tables owned by other database groups (such as the fund group) must be strictly reviewed, whether for data initialization, rollback, rollback of data affected by the interface, or assertion rollback.
- [Mandatory] In WHERE conditions, the primary key combination must be placed first so that the problem data can be located quickly when an assertion fails.
Interface test case design requirements
To ensure interface quality, interface testing must be comprehensive, so interface test case design needs to follow a method. We have summarized the design requirements for interface test cases, as shown in the figure below.
Interface test tool
Efficiency and automation of the interface testing process depend on automated testing tools; without sharp tools it is hard to do the job well. After investigation, none of the interface automation test tools on the market could meet all of our testing requirements, so we built our own interface automation test platform. The platform has the following capabilities:
- Automatic case generation. HTTP/HTTPS interface cases can be generated and imported automatically.
- Centralized, visual management of the test process. By moving the automated testing process to the web, test plans, case writing, case execution, case management, and test report management are all visualized.
- Simulation of performance scenarios. Interface cases stored on the platform can be executed in parallel to simulate performance test scenarios.
- Multi-protocol and multi-packet-type support. Supports automated testing over HTTP/HTTPS, Dubbo, Socket, RabbitMQ, and other protocols, with protocol extension; supports XML, JSON, SOAP, ISO 8583, and other packet types, also extensible.
- Effective accumulation of test assets.
- Scheduled execution and email delivery. Cases for a specified build can be executed automatically in batches on a schedule, and test reports are sent out by email.
- Visual feedback on system quality. Statistics on automated case results are used to analyze the system's quality trend and provide continuous quality feedback. Root cause analysis counts the proportion of each root cause of system problems so that quality issues can be addressed in a more targeted way.
After more than a year of continuous operation of interface testing, the core business interfaces of our payment platform are essentially fully covered by interface test cases, which are included in the regular regression process and continue to safeguard interface quality.
How to conduct contract testing in microservice mode
Value of contract tests
There are two types of contract testing: consumer-driven and provider-driven. The most commonly used is the consumer-driven contract test (CDC). Its core idea is to start from the consumer's business implementation: the consumer defines the data format and interaction details it needs and generates a contract file; the provider then implements its logic against that contract file and continuously verifies the implementation in a continuous integration environment. For a microservice based on a RESTful API, the contract refers to the rules of the API's request and response, as shown below:
- For the request: the request URL and parameters, request headers, request content, and so on;
- For the response: the status code, response headers, response content, and so on;
- For the metadata: a description of one collaboration between the consumer and the provider, for example the consumer/provider names, the context, and the scenario description.
So what value does contract testing bring to microservices? Part of it was mentioned at the beginning of this article: interface changes are communicated quickly, and changes to a service are noticed quickly. In addition, it brings the following value:
- Easier service integration. The service integration process is broken down into finer-grained unit tests and interface tests: starting from the consumer's needs, a contract driven by those needs is implemented as test cases, which then verify the provider's functionality.
- Parallel development and higher efficiency. The contract isolates the consumer from the provider, so both sides can work in parallel. During development, the contract is used for pre-integration testing, with no need to wait for joint debugging of the interfaces. Once the practice matures, and provided quality is assured, the cost of joint debugging can be reduced to nearly zero.
- Safe, accurate changes. Whenever something changes, contract tests detect it immediately, ensuring that changes are safe and that the two sides stay correctly connected.
- A mock server for consumers. The stubs provide mock services to consumers, and integration tests are provided to the server side.
How to conduct contract testing in microservices
Our payment platform uses Spring Cloud Contract for contract testing. Its core process consists of two steps:
- When verifying the consumer's business logic, the expected provider responses are mocked, and the request/response collaboration between the consumer and the mocked provider is recorded as a contract;
- The contract is then played back against the provider to ensure that what the provider delivers meets the consumer's expectations.
Here is a simple example of how contract tests are designed. A microservice provides a resource with three fields (ip, name, and password) that is consumed by three other microservices, each using a different part of the resource. Consumer A uses the ip and name fields, so its test script only verifies that these two fields are correctly included in the provider's resource and ignores password. Consumer B uses ip and password and does not need to validate name. Consumer C needs to confirm that all three fields are included. Now suppose the provider needs to split name into a first name and a last name, removing the old name field and adding new firstName and lastName fields. When the contract tests are executed, the test cases for consumers A and C fail, while consumer B's case is unaffected. This means the code of consumer services A and C must be modified to stay compatible with the updated provider; after the modification, the contract content must be updated as well.
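To make this concrete, here is a minimal sketch of what consumer A's contract might look like in the Groovy DSL used below. The URL, example values, and comments are assumptions for illustration only, not code from the original project.
package contracts

// Illustrative contract for consumer A: only the fields it actually consumes (ip, name)
// are asserted, so the provider is free to keep or add other fields such as password.
org.springframework.cloud.contract.spec.Contract.make {
    request {
        method 'GET'
        url '/resource/1'          // hypothetical resource endpoint
    }
    response {
        status OK()
        body([
            ip: "10.0.0.1",        // example value
            name: "demo-host"      // example value; password is deliberately not listed
        ])
        headers {
            contentType('application/json')
        }
    }
}
If the provider later renames name, the test generated from this contract fails on the provider side, which is exactly the early signal described above.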
Here is an example of how to conduct contract testing with Spring Cloud Contract.
- A Spring Cloud Contract contract is defined with a Groovy-based DSL. The following code snippet is a contract example:
package contracts

org.springframework.cloud.contract.spec.Contract.make {
    request {
        method 'PUT'
        url '/fraudcheck'
        body([
            "client.id": $(regex('[0-9]{10}')),
            loanAmount: 99999
        ])
        headers {
            contentType('application/vnd.fraud.v1+json')
        }
    }
    response {
        status OK()
        body([
            fraudCheckStatus: "FRAUD",
            "rejection.reason": "Amount too high"
        ])
        headers {
            contentType('application/vnd.fraud.v1+json')
        }
    }
}
- Add the Spring Cloud Contract Verifier Gradle plugin on the server (HTTP) / producer (messaging) side; it parses the contract files and generates tests. The command to generate the tests is:
./gradlew generateContractTests
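Before this command can run, the plugin itself has to be wired into the provider's build. A minimal build.gradle sketch follows; the plugin version, repository, and base-class name are assumptions and not taken from the original project.
buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        // Spring Cloud Contract Gradle plugin; this version is only an assumption
        classpath "org.springframework.cloud:spring-cloud-contract-gradle-plugin:2.0.0.RELEASE"
    }
}

apply plugin: 'spring-cloud-contract'

dependencies {
    // The generated tests need the verifier on the test classpath
    testCompile "org.springframework.cloud:spring-cloud-starter-contract-verifier"
}

contracts {
    // Hypothetical base class that the generated tests extend (see the setup() example below)
    baseClassForTests = 'cn.vbill.service.BaseContractTest'
}
With this in place, ./gradlew generateContractTests turns each contract file into a test that extends the configured base class.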
The following test script is generated automatically:
@Test
public void validate_shouldMarkClientAsFraud() throws Exception {
    // given:
    MockMvcRequestSpecification request = given()
            .header("Content-Type", "application/vnd.fraud.v1+json")
            .body("{\"client.id\":\"1234567890\",\"loanAmount\":99999}");

    // when:
    ResponseOptions response = given().spec(request)
            .put("/fraudcheck");

    // then:
    assertThat(response.statusCode()).isEqualTo(200);
    assertThat(response.header("Content-Type")).matches("application/vnd\\.fraud\\.v1\\+json.*");
    // and:
    DocumentContext parsedJson = JsonPath.parse(response.getBody().asString());
    assertThatJson(parsedJson).field("['fraudCheckStatus']").matches("[A-Z]{5}");
    assertThatJson(parsedJson).field("['rejection.reason']").isEqualTo("Amount too high");
}
This is a standard JUnit test that uses Rest Assured against Spring's WebApplicationContext; the base class for the generated tests sets this up:
@Before
public void setup() {
    RestAssuredMockMvc.webAppContextSetup(webApplicationContext);
}
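For completeness, a fuller sketch of such a base class might look like the following. The package and class name (matching the baseClassForTests assumed in the Gradle sketch above) are hypothetical, and the import path assumes Rest Assured 3.x.
package cn.vbill.service;   // hypothetical package

import io.restassured.module.mockmvc.RestAssuredMockMvc;
import org.junit.Before;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.web.context.WebApplicationContext;

@RunWith(SpringRunner.class)
@SpringBootTest
public abstract class BaseContractTest {

    @Autowired
    private WebApplicationContext webApplicationContext;

    @Before
    public void setup() {
        // Point Rest Assured's MockMvc support at the real application context,
        // so the generated contract tests exercise the provider's controllers.
        RestAssuredMockMvc.webAppContextSetup(webApplicationContext);
    }
}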
- Caller (consumer) side. The provider generates the stub service Jar package with the following command:
./gradlew verifierStubsJar
Spring Cloud Contract Stub Runner simulates the real service in integration tests by running WireMock instances or messaging routes. Before running, the following dependency therefore needs to be added to Gradle; the stub Jar can of course also be published to a private repository.
spring-cloud-starter-contract-stub-runner
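A minimal sketch of how this might be declared in the consumer's build.gradle; the repository URL is illustrative, and version management through the Spring Cloud BOM is assumed.
repositories {
    mavenCentral()
    // The stub Jar is typically resolved from the team's private repository; this URL is illustrative
    maven { url "https://nexus.example.com/repository/maven-public/" }
}

dependencies {
    // Stub Runner downloads stub Jars and starts WireMock instances from them during tests
    testCompile "org.springframework.cloud:spring-cloud-starter-contract-stub-runner"
}
With stubsMode set to LOCAL, as in the example below, the stub Jar is resolved from the local Maven repository instead.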
For callers, Spring Cloud Contract provides Stub Runner to simplify the use of stubs. The test class is annotated with @AutoConfigureStubRunner, whose ids attribute contains the group ID, artifact ID, classifier, and port of the stubs to run, as shown in the following example:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.NONE)
@AutoConfigureStubRunner(ids = {"cn.vbill.service:test-client-stubs:1.5.0-SNAPSHOT:stubs:6565"},
        stubsMode = StubRunnerProperties.StubsMode.LOCAL)
public class LoanApplicationServiceTests {
    // ...
}
The @AutoConfigureStubRunner annotation sets the address of the repository from which the stub Jar is downloaded and the full package coordinates; 6565 is the local port on which the stub Jar runs. When the stub's port is accessed during the test, responses are returned according to the contract.
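As a hedged illustration of that last point, a consumer-side test might call the stub directly on port 6565. The class name, endpoint, and payload below follow the fraudcheck example in this article and are assumptions, not code from the original project.
import static org.assertj.core.api.Assertions.assertThat;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.cloud.contract.stubrunner.spring.AutoConfigureStubRunner;
import org.springframework.cloud.contract.stubrunner.spring.StubRunnerProperties;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.web.client.RestTemplate;

// Hypothetical consumer-side test against the stub started by Stub Runner.
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.NONE)
@AutoConfigureStubRunner(ids = {"cn.vbill.service:test-client-stubs:1.5.0-SNAPSHOT:stubs:6565"},
        stubsMode = StubRunnerProperties.StubsMode.LOCAL)
public class FraudCheckClientStubTest {

    @Test
    public void stubShouldMarkLargeLoanAsFraud() {
        RestTemplate restTemplate = new RestTemplate();

        HttpHeaders headers = new HttpHeaders();
        headers.set("Content-Type", "application/vnd.fraud.v1+json");
        String body = "{\"client.id\":\"1234567890\",\"loanAmount\":99999}";

        // The request is answered by the WireMock stub that Stub Runner started on
        // port 6565 from the provider's stub Jar, not by a real provider instance.
        ResponseEntity<String> response = restTemplate.exchange(
                "http://localhost:6565/fraudcheck",
                HttpMethod.PUT,
                new HttpEntity<>(body, headers),
                String.class);

        // The stub responds according to the contract defined earlier in this article.
        assertThat(response.getBody()).contains("FRAUD");
    }
}
Because the stub is generated from the contract, such a test starts failing as soon as a published contract change breaks the consumer's expectation.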
Since our payment platform's microservices are based on the Spring Cloud technology stack, adopting Spring Cloud Contract for contract testing under microservices makes the testing process much smoother, and Spring Cloud Contract integrates easily and conveniently with Spring Boot and JUnit. We believe that as Spring Cloud Contract continues to improve, contract testing can be done even better.
Conclusion
This article has introduced how to carry out automated interface testing, the value of contract testing, and how to conduct contract testing in the microservice model. In the microservice model the invocation relationships between services are complex, and interface testing and contract testing are important means of ensuring and improving service quality, so we should make full use of them.
This article is published in sync on the WeChat official account "Entourage Payment Research Institute". To receive new posts as soon as they are pushed, please follow the "Entourage Payment Research Institute" official account.