Introduction

Once developed, interfaces are changed or refactored far less frequently, and less drastically, than UIs. It is therefore more cost-effective to automate interface tests, often as regression tests run before each iteration goes live.

In manual interface testing, test data and parameters can be filled in and updated by the tester by hand.

Therefore, when we consider automating interface test cases, the main question is how to solve the following problems, assuming test cases for single interface requests are already in place:

  1. Business test scenarios invoke more than one interface, and the request of the next interface depends on data returned by the previous one, so interface dependencies need to be resolved
  2. Authentication data such as a token has an expiration time and is used by multiple interfaces, so a change to it must take effect everywhere at once
  3. A single interface needs to be exercised with multiple sets of test data
  4. In batch runs, we need to know whether the parameters and data returned by each interface meet expectations

The automated interface testing tool used in this article is Apifox, which can be downloaded from the Apifox website, then registered and installed. Let's take a look at how to solve these problems with Apifox.

Main body

1. Interface parameter passing

Take a common scenario: when a query interface requests data, it must carry an access_token parameter, and that token is obtained from a separate authentication interface. The authentication interface therefore has to pass the token it obtains to the query interface before the query interface can send its request.

Another common scenario is that users must log in before they can add selected items to their shopping cart; the add-to-cart interface depends on data returned by the login interface. In manual testing, the tester has to copy this data over by hand.

Solution: identify the data returned by the previous interface, extract the target parameter, save it as a global variable, and have the next interface reference it directly.

Procedure: 1) On the interface tab in Apifox, open the post actions and select Extract Variable. 2) Fill in the variable name, variable type, and extraction expression. Extraction expressions follow JSONPath syntax. The data returned by this interface has only one level, so the form `$.targetParameter` is enough to extract it. If you have more complex parameters, click the question mark next to the extraction expression to see the full JSONPath syntax.

The extracted parameters are stored as variables. Click the settings icon in the upper right corner of the interface tab to view the values of the environment variables. They can then be referenced as parameters in the next interface:
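A minimal sketch of what the Extract Variable step does behind the scenes: pull a field out of a JSON response with a single-level JSONPath-style expression and store it in a variable pool that later requests can reference. The response shape, variable store, and URL below are invented for illustration; Apifox handles all of this internally.

```javascript
const env = {}; // stands in for Apifox's environment-variable store

// Supports only the single-level "$.field" form described above.
function extractVariable(name, jsonPath, responseBody) {
  const field = jsonPath.replace(/^\$\./, "");
  env[name] = responseBody[field];
  return env[name];
}

// Simulated auth response; in practice it comes from the auth interface.
const authResponse = { access_token: "abc123", expires_in: 7200 };
extractVariable("access_token", "$.access_token", authResponse);

// The next (query) interface can now reference the stored variable.
const queryUrl = `https://example.com/query?access_token=${env.access_token}`;
console.log(queryUrl);
```

This is exactly the dependency chain from the scenario above: the authentication interface runs first, the token it returns is parked in a shared variable, and the query interface reads it rather than hard-coding it.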

2. External data sources

Some interfaces that post data for backend processing need to be fed different data to test both normal returns and error handling: a single interface parameter must take different values across multiple runs. In manual testing we can simply fill the data into the parameters and change it by hand each time.

But once automated, tests like the above are harder to implement. A common solution is to keep the test data in a CSV file, one record per row, and feed it into the interface request parameters. Apifox offers the following options: for a small amount of test data, fill test data sets in directly on the interface, or write the data into global variables; use CSV files only when the data volume is large.

Importing global variables works like the interface parameter passing in the previous section, except that the test data is not fetched from a previous interface but filled in by ourselves. For an external test data set, there is a test data switch on the right side of the Test Management tab > Use Cases screen; turn it on to import test data. Of course, the use cases must first be imported into the test steps.

As shown in the figure, I have imported the OCRtest interface into the use case steps and enabled external test data.

Then click Manage Test Data to jump to the test data tab, and create or import test data on that screen. The data set name is only an identifier for the tester and is not passed to the interface. One data set (one row) represents all the test data passed in for one run, and the column names are used as interface parameter names.

At run time, each row of test data is run as a separate test case.
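The row-per-case idea can be sketched in a few lines: parse a CSV, treat the header row as parameter names, and run the interface once per data row. The column names and values here are hypothetical (loosely modeled on the OCR example above); Apifox does this parsing and looping for you.

```javascript
// Inline CSV stands in for an imported test-data file.
const csv = `image_url,expected_code
https://example.com/a.png,200
https://example.com/missing.png,404`;

// Each row becomes an object keyed by the header names,
// mirroring "column name is used as the interface parameter".
function parseCsv(text) {
  const [headerLine, ...rows] = text.trim().split("\n");
  const headers = headerLine.split(",");
  return rows.map((row) => {
    const values = row.split(",");
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

const dataSets = parseCsv(csv);
// Each data set (one row) is one run of the interface at test time.
for (const data of dataSets) {
  console.log(`would POST image_url=${data.image_url}, expect ${data.expected_code}`);
}
```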

The "interface parameter passing" and "imported test data" described above share the same idea and rely on the parameterization feature Apifox provides: data parameters are separated from the interface as external data sets, so that key fields and frequently changing data are extracted out of the individual interface.

After configuration, the interface can generate, pass, and import key data on every run. If the data needs to change, modifying it in one file, in one place, takes effect globally. This reflects the software-engineering ideas of abstraction and encapsulation; assertions, discussed next, embody another idea.

3. Test assertions

A tester running tests manually can see for themselves whether an interface request succeeded and the data looks healthy, but in automated practice we need code to decide whether the actual response matches the expected one.

HTTP responses are highly structured, so what we check is little more than the status code, the headers, and the key fields and values in the body. We just need to decide whether these fields are what we expect.

Assertions exist precisely to verify that output matches expected values. In testing practice we usually compare the actual output against the expected value; that is, we want to check whether the returned data "exists", "contains", "equals" a value, or "equals" a text.
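The four judgment types just named can be sketched as plain functions over a sample response body. The body and field names are invented; in Apifox these checks are built-in options in the assertion panel rather than code you write.

```javascript
// Hypothetical response body for illustration.
const body = { errcode: 0, msg: "success", data: ["a", "b"] };

const exists = (obj, key) => key in obj;                      // "exists"
const contains = (arr, item) => arr.includes(item);           // "contains"
const dataEquals = (actual, expected) => actual === expected; // "data equals"
const textEquals = (actual, expected) =>
  String(actual) === expected;                                // "text equals"

console.log(exists(body, "errcode"));       // true
console.log(contains(body.data, "a"));      // true
console.log(dataEquals(body.errcode, 0));   // true
console.log(textEquals(body.errcode, "0")); // true
```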

Therefore, the scheme for judging a use case's result can be broken into three elements: the object being checked, the verification method, and the expected value.

With that in mind, let's see how to implement it with Apifox's scripts and features. Apifox's assertion panel (path: interface tab > Run > Post Actions > Assertion) supports assertions on JSON, HTML, and XML response data, as well as headers and cookies, which basically meets our needs.

Validation is done by asserting that the value of an object falls within the range specified by the tester.

The value to verify can be extracted with a JSONPath expression.

Checks such as status codes and specific return values can be done directly with the function panel Apifox provides. Testers who want more flexible assertions need to choose a custom script in the post actions.

For testers less familiar with scripting, the code templates on the right side of Apifox can be clicked to add into the script editing panel on the left; usually only the expected value of the assertion needs changing, which is not difficult.

When testing a single interface, the assertion result is returned directly in the response tab.

If it is a batch test, the test results will display the assertion result:

This solves the "result judgment" problem in building automated interface use cases.

4. Environment switching

After an interface passes testing on the test server, another round of online validation is required to complete the test task.

Usually the only difference between the test server and the production server is the leading URL. To make online validation less repetitive, we can use a capability Apifox provides when the automation project is first set up: configure the HTTP protocol and domain name shared by all interfaces in the project as the leading URL, so that each interface address contains only the resource path and parameters.
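The idea reduces to a small lookup: interfaces store only resource paths, and the leading URL comes from whichever environment is currently selected. The hostnames and path below are placeholders; Apifox manages this mapping in its environment settings.

```javascript
// Hypothetical environment table; each entry supplies the leading URL.
const environments = {
  test: { baseUrl: "https://test.example.com" },
  production: { baseUrl: "https://api.example.com" },
};

// Interfaces keep only the resource path; the environment supplies the rest.
function buildUrl(envName, path) {
  return environments[envName].baseUrl + path;
}

// Switching environments changes every request's leading URL at once.
console.log(buildUrl("test", "/v1/ocr"));
console.log(buildUrl("production", "/v1/ocr"));
```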

During online validation, synchronize the parameter and data configuration (or switch to the online data configuration), switch the runtime environment, and online verification can proceed.

5. Batch testing

1. Use case organization. In Apifox, use cases are organized as test cases — use case groups — test suites. A test case can contain multiple test steps, with one interface request per step. Test steps can be imported directly from interface use cases; if set to stay synchronized with the interface, the test case changes whenever the interface changes.

A typical set of use case steps looks like this: it involves multiple interfaces, parameter passing between them, and uses several interfaces together to test one business scenario.

After the interface use cases are imported and test parameters configured, click Run and they execute automatically.

2. Order of use case execution

Within a test case, interface requests execute from top to bottom; to change the order of the steps, simply drag an interface to a new position.

3. Test suites. A single use case tests one business scenario or process; a test suite contains multiple use cases belonging to the same module, which can be executed together. This organization is essentially the same as that of TestLink, the use case management software testers commonly use. With it, a single click on Run completes the interface testing for an entire business module.

After the run completes, the test results of the use cases are displayed: the upper panel shows overall execution, and the lower panel lists the results of individual use cases. To export a test report, click the button to generate an HTML file in one click.

Conclusion

The construction and execution of this interface automation project are based almost entirely on the features Apifox provides. Compared with Postman, it feels particularly handy: the way use cases are organized and the testing mindset match what many large and mid-sized companies already use, and they fit the workflow of domestic test teams. The tool adapts to the people rather than the people adapting to the tool, which greatly lowers the comprehension threshold and the cost of switching mental models.

Throughout, the project consists almost entirely of operations in the graphical interface, with almost no scripting required, so testers unfamiliar with scripts can use it to complete testing tasks quickly.

For those not fluent in English testing terminology, this home-grown interface testing tool costs less to understand and may well be more efficient.

Three basic ideas run through the whole interface automation project: a. Parameterization of a single interface's test data and variables, plus assertion of interface test results. b. Single interface use cases are built into the framework of business test scenarios; interface dependencies are solved through parameter passing and interface execution order. c. Use cases are organized into test groups and test suites framed by business modules, processes, and logic, making later iteration and updates easier.

This concludes the walkthrough of the specific process and ideas for doing automated interface testing with Apifox. I hope it helps.