I. The history of interface testing and management
1.1 A “live” interface
The “Growth History” of interfaces
A “typical” interface lifecycle
1.2 A snapshot of historical interface management and testing
Interface testing and management in 2014
Java+TestNG interface testing framework
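For readers unfamiliar with that era's tooling, here is a minimal sketch of what a Java+TestNG interface test typically looked like. The endpoint, expected fields, and the use of the JDK HTTP client (rather than whatever HTTP library the original framework wrapped) are illustrative assumptions, not the framework's actual code.

```java
// A minimal sketch of a Java+TestNG interface test (illustrative only).
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.testng.Assert;
import org.testng.annotations.Test;

public class UserInterfaceTest {

    private final HttpClient client = HttpClient.newHttpClient();

    @Test
    public void getUserReturnsOk() throws Exception {
        // Hypothetical test-environment endpoint
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://test.example.com/api/user/1001"))
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Assert on the status code and a fragment of the response body
        Assert.assertEquals(response.statusCode(), 200);
        Assert.assertTrue(response.body().contains("\"id\":1001"));
    }
}
```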
The additive effect of “change”
Interface testing and management ROI analysis
- Reduce the cost of writing test cases (1 interface / 10 test cases / 2 hours)
- Reduce the cost of optimizing test cases (case descriptions are unclear and hard to layer)
- Reduce the cost of maintaining test cases (each interface change costs hours of maintenance on average)
- Reduce tool development costs (shared test services are not actually shared, so wheels get reinvented)
- Can be run manually
- Usable by everyone
- Tools available whenever needed
- Usable at every stage
Problems and Challenges
II. The evolution of interface testing and management solutions
2.1 Six elements of interface test writing
Six elements of interface test writing
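The figure listing the six elements is not reproduced in the text, so the sketch below assumes a common breakdown: request URL, HTTP method, headers, parameters/body, test data preparation, and result assertions. All endpoints and field names are hypothetical.

```java
// Illustrative only: the assumed elements of an interface test case are marked in comments.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.testng.Assert;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class CreateOrderInterfaceTest {

    private final HttpClient client = HttpClient.newHttpClient();
    private String token;

    @BeforeMethod
    public void prepareData() {
        // Element: test data preparation (hypothetical token used in the request header)
        token = "test-token-123";
    }

    @Test
    public void createOrder() throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                // Element: request URL (hypothetical endpoint)
                .uri(URI.create("http://test.example.com/api/order"))
                // Element: request headers
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + token)
                // Element: HTTP method + Element: request parameters/body
                .POST(HttpRequest.BodyPublishers.ofString("{\"skuId\": 42, \"count\": 1}"))
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Element: assertions on the expected result
        Assert.assertEquals(response.statusCode(), 200);
        Assert.assertTrue(response.body().contains("\"orderId\""));
    }
}
```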
2.2 The five “multiples” of interface test execution
- Multi-granularity: single interface, interface scenario, and execution set
- Multi-environment: test environment, pre-release environment, production environment
- Multi-role: development, testing, operations and maintenance
- Multi-data: global data, local data, and temporary data
- Multi-stage: development and debugging, self-test smoke, functional testing, release verification, online regression, continuous integration, online monitoring
Five “multiples” of interface test execution
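As the "multi-environment" item above suggests, the same case often has to run against several environments. Below is a minimal TestNG sketch of that idea using a data provider; the base URLs are placeholders, not real environments.

```java
// Sketch: running one interface case against multiple environments via a TestNG data provider.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class MultiEnvironmentTest {

    private final HttpClient client = HttpClient.newHttpClient();

    @DataProvider(name = "environments")
    public Object[][] environments() {
        return new Object[][] {
                {"test", "http://test.example.com"},
                {"pre-release", "http://pre.example.com"},
                {"production", "http://api.example.com"},
        };
    }

    @Test(dataProvider = "environments")
    public void healthCheck(String envName, String baseUrl) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/health"))
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        Assert.assertEquals(response.statusCode(), 200, "health check failed in " + envName);
    }
}
```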
- Execution environments are constrained
- Usage stages are limited
- Usage roles are limited
- Online monitoring is not systematic
2.2.1 Limited execution environments
- Multi-node: stable and reliable
- Multi-environment: supports daily functional testing, regression testing, and pre-release verification
- Multi-datacenter: quickly verifies the high-availability architecture across remote data centers
- Controllable: online-monitoring executors are privately deployed so that production test requests stay under security control
- Schedulable: integrates with automated releases, scheduling executors to trigger automated verification immediately after a release
- Shareable: executor information can be shared so that developers can self-test quickly and collaborate with QA
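To make the executor attributes above concrete, here is an illustrative and deliberately simplified data model of an executor registry. The field names and the selection logic are assumptions for the sketch, not the platform's real schema.

```java
// Sketch: a simplified executor registry covering node, environment, and datacenter attributes.
import java.util.List;
import java.util.Optional;

public class ExecutorRegistry {

    // One executor node with its environment and datacenter (machine room)
    record Executor(String host, String environment, String datacenter, boolean privateDeployment) {}

    private final List<Executor> executors = List.of(
            new Executor("exec-01.test", "test", "dc-hangzhou", false),
            new Executor("exec-02.pre", "pre-release", "dc-hangzhou", false),
            new Executor("exec-03.prod", "production", "dc-shanghai", true)
    );

    // Pick an executor matching the requested environment and datacenter
    public Optional<Executor> select(String environment, String datacenter) {
        return executors.stream()
                .filter(e -> e.environment().equals(environment))
                .filter(e -> e.datacenter().equals(datacenter))
                .findFirst();
    }

    public static void main(String[] args) {
        ExecutorRegistry registry = new ExecutorRegistry();
        registry.select("production", "dc-shanghai")
                .ifPresent(e -> System.out.println("Dispatching run to " + e.host()));
    }
}
```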
2.2.2 Limited interface test stages
- Implement development self-test verification with continuous integration
- Implement PE release-process verification with the release platform
- Implement complete online interface monitoring with GoAPI
Implement development self-test verification with continuous integration
- A developer's code submission automatically triggers a build via webhook, and the build must succeed;
- Run the unit-test gate: all tests pass and the coverage gate passes;
- Run static code analysis, using Sonar for the gate decision;
- Package images, upload them to a private repository, and deploy the services;
- Call the GoAPI interface test platform's OpenAPI to run the corresponding interface automation (see the sketch after this list);
- Call the Smartauto intelligent UI automation platform's OpenAPI to run the corresponding UI automation;
- Call the NPT performance/load testing platform's OpenAPI to run the corresponding automated performance regression;
- Call the CR changed-code coverage platform to obtain the coverage of this round's code changes;
- Output an overall quality report covering multi-dimensional quality metrics.
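From the pipeline's side, the GoAPI step might look roughly like the following. The OpenAPI path, request fields, and token handling are hypothetical placeholders; the real contract is not documented in this article.

```java
// Sketch of a CI step that triggers interface automation via a platform OpenAPI (hypothetical endpoint).
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TriggerInterfaceAutomation {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Trigger an execution set for the service that was just deployed (fields are assumptions)
        String payload = "{\"executionSetId\": \"order-service-smoke\", \"environment\": \"test\"}";
        HttpRequest trigger = HttpRequest.newBuilder()
                .uri(URI.create("https://goapi.example.com/openapi/v1/executions"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + System.getenv("GOAPI_TOKEN"))
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = client.send(trigger, HttpResponse.BodyHandlers.ofString());
        System.out.println("Trigger response: " + response.body());

        // A real pipeline step would then poll the execution result and fail the build
        // (the quality gate) if the run does not pass.
        if (response.statusCode() != 200) {
            System.exit(1); // fail the CI gate
        }
    }
}
```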
An online application cluster contains hundreds of machines, and online regression runs cannot fully verify the availability of the application instance on every machine. An instance may therefore go live with problems caused by unknown factors and trigger an online failure. For PE, the requirement is that every released application instance be verified by automated regression. Based on this, the implementation plan combined with the internal release platform is as follows:
Automated regression scheme for releases
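A minimal sketch of the per-instance idea: after each instance in a release batch is deployed, a smoke interface check runs directly against that instance's address before the rollout continues. The hostnames and health path are placeholders, not the release platform's actual integration.

```java
// Sketch: verify each released application instance with a smoke interface check.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class PerInstanceReleaseCheck {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // In practice this list would come from the release platform for the current batch
        List<String> instances = List.of("10.0.0.11:8080", "10.0.0.12:8080", "10.0.0.13:8080");

        for (String instance : instances) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://" + instance + "/api/health"))
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            if (response.statusCode() != 200) {
                // Block the rollout of this instance and alert PE
                System.err.println("Instance " + instance + " failed verification, blocking rollout");
                System.exit(1);
            }
            System.out.println("Instance " + instance + " verified");
        }
    }
}
```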
2.3 Closed-loop online interface monitoring
- Many factors affect execution stability (environmental factors, use-case dependency issues, etc.)
- The alerting mechanism is incomplete (only a single alert notification; no alert policies can be configured)
- Failure analysis takes too long (log analysis based on TestNG, plus tracing the data dependencies across use-case scenarios, is time-consuming)
- The closed loop is not well formed (low alert precision and low recall of online interface problems)
- Continuous, stable 24/7 monitoring
- A complete alert notification mechanism
- Interface return code monitoring
- Interface exception monitoring
- Interface performance monitoring
- Interface business logic monitoring
The six elements of interface monitoring and their solutions
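The monitoring dimensions above can be expressed as assertions in a scheduled interface case. The sketch below folds return code, performance, exception (timeout), and business logic checks into one TestNG method; the endpoint, SLO threshold, and response field are assumptions.

```java
// Sketch: one monitoring case combining return code, latency, timeout, and business-logic checks.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

import org.testng.Assert;
import org.testng.annotations.Test;

public class OrderQueryMonitor {

    private final HttpClient client = HttpClient.newHttpClient();

    @Test
    public void monitorOrderQuery() throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/api/order/1001"))
                .timeout(Duration.ofSeconds(3))   // exception monitoring: timeouts surface as failures
                .GET()
                .build();

        long start = System.currentTimeMillis();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        long elapsedMs = System.currentTimeMillis() - start;

        // Return code monitoring
        Assert.assertEquals(response.statusCode(), 200);
        // Performance monitoring (threshold is an assumed SLO)
        Assert.assertTrue(elapsedMs < 1000, "latency " + elapsedMs + "ms exceeds 1000ms");
        // Business logic monitoring: the payload contains the expected field (assumed schema)
        Assert.assertTrue(response.body().contains("\"status\""));
    }
}
```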
2.4 Benefits of interface testing and management
ROI of interface testing and management
An industry-leading interface testing platform
An interface-based team collaboration platform
Product overview
Interface testing
Scenario testing
Execution sets
【Key takeaways】
- A platform is born out of the continuous evolution of business problems and challenges;
- The core of improving ROI is reducing investment cost and increasing usage.