Modern software engineering and agile development emphasize embracing change. Beyond the demands this places on system design and implementation, continuous integration has become an essential part of high-quality delivery, and it is extremely challenging for the whole development team, test engineers included. This article shares how test engineers can embrace the future, keep pace with the times, and continuously improve their skills.

PART 01

New challenges for test engineers

The concept of agile development is now about twenty years old, and its core is “rapid response, continuous delivery of value”. The process emphasizes people: development teams need to work closely together, communication should be face-to-face rather than mediated by detailed documentation, and development and testing should adapt to change rather than follow a rigid predetermined plan. Change is the only constant in agile development.

Fast-moving development requires tests that can quickly verify the availability and stability of committed code, and simple manual testing is no longer sufficient. In the early days of agile, maintaining and deploying environments during development and testing was largely done by hand, by QA or by developers themselves. At the same time, each Scrum team focuses on its own concerns, so it is easy to neglect other services, leaving the test environment slightly different from the production environment, which can lead to compatibility or integration issues after launch. Throughout development and testing we repeat the same work over and over: fixing bugs, deploying new packages, verifying fixes, filing new bugs, and keeping every environment in sync with each change. Deploying environments manually consumes a great deal of time, and human error can leave the whole team blocked and jeopardize the project schedule.

The questions we face are:

  • How to collaborate across domains and departments, beyond face-to-face communication;
  • How to improve efficiency and achieve fast delivery;
  • How to reduce human intervention and the likelihood of mistakes.

PART 02

The application of DevOps

We believe DevOps can solve these problems.

What is DevOps? According to Wikipedia, DevOps (a portmanteau of Development and Operations) is a culture, movement, or practice that values communication and cooperation between “software developers (Dev)” and “IT operations technicians (Ops)”. By automating the software delivery and infrastructure change processes, it aims to build, test, and release software faster, more frequently, and more reliably. On the surface, DevOps is a combination of development and operations, but in reality it is a collective term for processes, methods, and systems.

DevOps process cycle:

The ideal DevOps cycle runs as follows (a minimal sketch appears after the list):

  • Developers write and commit code;
  • Packages are built automatically and deployed automatically to the test environment;
  • Automated test cases are executed and the build is released;
  • The release is deployed automatically to the production environment.
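
To make this concrete, here is a minimal sketch of such a gated pipeline in Python. The stage commands (`make package`, `./deploy.sh`) are invented placeholders, not a description of any particular CI system:

```python
import subprocess

# Illustrative gated pipeline: each stage must succeed before the next
# runs, so a broken build or a failing test never reaches production.
# The commands below are invented placeholders.
STAGES = [
    ("build", ["make", "package"]),             # automated compilation
    ("deploy-test", ["./deploy.sh", "test"]),   # auto-deploy to the test env
    ("test", ["pytest", "tests/"]),             # run automated test cases
    ("deploy-prod", ["./deploy.sh", "prod"]),   # auto-deploy to production
]

def run_pipeline():
    for name, cmd in STAGES:
        print(f"--- stage: {name} ---")
        if subprocess.run(cmd).returncode != 0:
            raise SystemExit(f"stage '{name}' failed; pipeline stopped")

if __name__ == "__main__":
    run_pipeline()
```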

Clearly, the DevOps process places a strong emphasis on automating build, deployment, and test, using continuous integration tools and automated testing tools throughout the cycle. The role of testing also changes. In the traditional model, testers mainly execute functional and regression tests, and developers typically sit with them for a while before the test package is formally accepted. In DevOps this changes:

  • Manual tests become automated tests, aiming to automate all use cases and push code coverage toward 100%;
  • The test environment is kept consistent with the production environment and can be deployed automatically;
  • Pre-test setup, cleanup, and post-test tasks are automated and can be delegated to the continuous integration process (a sketch follows this list).
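
For example, with pytest (one common choice; the article does not name a specific tool), pre- and post-test tasks can be attached to every case as a fixture, so the CI job needs no manual steps. The helper names here are placeholders:

```python
import pytest

# Hypothetical sketch (helper names are placeholders, not a real API):
# pre-test and post-test tasks become a pytest fixture, so the CI job
# can run the suite end to end without manual intervention.

def deploy_latest_build():
    pass  # placeholder: deploy the package under test to the environment

def cleanup_environment():
    pass  # placeholder: archive logs and restore the env for the next run

@pytest.fixture
def prepared_env():
    deploy_latest_build()     # pre-test task, automated
    yield
    cleanup_environment()     # post-test task / cleanup, automated

def test_service_smoke(prepared_env):
    # the test body itself stays focused on functionality
    assert True
```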

DevOps demands a high degree of collaboration and blurs the boundaries of responsibility, encouraging everyone to contribute. Developers can configure deployment jobs, operations engineers can add cases to QA’s test library, and testers can configure and modify use cases anywhere in the DevOps pipeline. The demands on testers, now and in the future, are so high that simple black-box testing will no longer live up to expectations.

DevOps is the future, and automation is at the heart of its success. Testers who take automation as a key direction of their career development, combining domain knowledge with automation skills, can better serve the rapid iteration of products.

PART 03

Automated testing practices

From the analysis above, it is clear that automated testing plays a role across the entire product cycle. So how should we proceed?

Apart from unit tests, API-level automated tests can achieve relatively high test coverage. So we started with API test automation: first, with it we can build scenario-level use cases that are close to how users actually work; second, it is more stable than UI test automation.

The test engineers at Paileyun developed an API automation test framework, which we use for the following kinds of automated testing (a rough sketch of a case follows the list):

  • Automating functional use cases
  • Simulating users’ API-invocation habits
  • Performance testing
  • Quality testing
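
The framework itself is not shown in this article. As a rough sketch of what an API-level functional case might look like, using Python’s requests library, with an invented endpoint and fields:

```python
import requests

BASE_URL = "https://api.example.com"  # placeholder endpoint, not Paileyun's

def call(method, path, **kwargs):
    """Thin wrapper so every case shares the base URL and timeout."""
    return requests.request(method, BASE_URL + path, timeout=10, **kwargs)

def test_create_room():
    # Functional use case: create a conference room, verify the response.
    resp = call("POST", "/rooms", json={"name": "demo"})
    assert resp.status_code == 200
    assert "room_id" in resp.json()  # "room_id" is an invented field
```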

Paileyun API automation framework

Paileyun provides users with real-time audio and video services, and we aim to deliver easy-to-use, high-quality products. We do not restrict how users call our interfaces; they can use the service according to their own habits and understanding, which demands greater robustness from the product.

Our test engineers run automated tests against the external API layer, designing interface test cases that simulate user habits based on an understanding of the product’s interfaces, mainly along the following lines (see the sketch after the list):

  • Changing interface parameters
  • Changing the interface call order
  • Changing the interface call conditions
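
A hypothetical sketch of the first two variations (endpoints and values are invented; the point is that a robust service handles any input and any sequence without crashing):

```python
import itertools

import pytest
import requests

BASE_URL = "https://api.example.com"  # placeholder, as in the sketch above

def call(method, path, **kwargs):
    return requests.request(method, BASE_URL + path, timeout=10, **kwargs)

# Changing interface parameters: valid, empty, and out-of-range values.
@pytest.mark.parametrize("resolution", ["180p", "720p", "", "4320p"])
def test_varied_parameters(resolution):
    resp = call("POST", "/publish", json={"resolution": resolution})
    # A robust service accepts or rejects cleanly; it never crashes.
    assert resp.status_code in (200, 400)

# Changing the interface call order: users may call in any sequence.
@pytest.mark.parametrize(
    "order", list(itertools.permutations(["join", "publish", "leave"])))
def test_varied_call_order(order):
    for action in order:
        call("POST", f"/{action}")  # server state must stay consistent
```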

One goal of automation is to save labor costs, and automating functional use cases through interface tests is no exception. Here we focus primarily on regression test cases: as more features are added, regression becomes more expensive, so the regression suite keeps growing with the product. We split the regression use case set into two parts (a sketch of how such a split might be tagged follows the list):

  • A basic use case set, guarding basic functionality
  • A detailed use case set, covering more functional logic
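
A minimal sketch of how such a split might be expressed with pytest markers (assuming pytest; the marker names are our own illustrative labels):

```python
import pytest

# Illustrative split of the regression suite with pytest markers
# ("basic" / "detailed" would be registered in pytest.ini).

@pytest.mark.basic
def test_join_room():
    ...  # guards core functionality; cheap enough for every pipeline run

@pytest.mark.detailed
def test_rejoin_after_network_switch():
    ...  # richer functional logic, run in full regressions
```

The pipeline can then run `pytest -m basic` on every build, while `pytest -m detailed` runs as a scheduled full regression in the test environment.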

These two parts serve different purposes. The basic set is used in the pipeline to verify that new code does not break basic functionality, keeping core features safe while keeping the pipeline load low. The detailed set verifies more complex user scenarios; it is mainly used for functional testing in test environments, achieving broader coverage and further guaranteeing functional availability.

Our regular releases also include performance tests, such as media server concurrency and audio/video performance on specific devices. Performance data reflects product stability. The focus of performance testing is to define scenarios and find the performance bottlenecks within them. For example: how many large conferences can a media server support? How many six-person conferences can a media server of a given configuration host at the same time, where one participant sends 720p and receives five 180p streams, and each of the other five sends 180p and receives one 720p plus four 180p streams? We take the performance data from such scenarios as benchmarks according to our own requirements, and we define performance scenarios based on the actual usage scenarios and needs of most users.

Multiple concurrent media streams:

In a six-person conference, one participant sends 720p and receives five 180p streams, while the other five each send 180p and receive one 720p stream plus four 180p streams.
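
As a back-of-the-envelope check of the stream load this scenario puts on the media server (stream counts only; actual capacity figures have to come from measurement):

```python
# Stream counts for the six-person scenario above; purely illustrative
# arithmetic, not measured capacity data.

PARTICIPANTS = 6

# Upstream: every participant publishes exactly one stream to the server.
upstream = {"720p": 1, "180p": 5}

# Downstream: each participant subscribes to the other five streams.
downstream_per_person = PARTICIPANTS - 1                  # 5 streams each
downstream_total = PARTICIPANTS * downstream_per_person   # 30 streams

print(f"server ingests {sum(upstream.values())} streams "
      f"and forwards {downstream_total} per conference")
```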

For media server performance we focus on CPU, memory, and network throughput. Device-side performance matters too, and this data depends not only on the scenario but also on the device itself: in the same scenario an iPhone X and an iPhone 13 Pro will certainly perform differently. An iPhone 13 Pro can stably encode 720p at 30 fps and decode one 720p/30 fps stream for a long time, while in the same scenario an iPhone X may last less than an hour before the encoder drops frame rate or resolution. So for devices we care not only about CPU and memory usage, but also how long the device can sustain the given video specification, as well as power consumption.
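
On a test host, a small sampler is often enough to collect these metrics while a scenario runs. A minimal sketch using Python’s psutil library (the tooling is our assumption; the article does not name what Paileyun uses):

```python
import time

import psutil  # third-party: pip install psutil

# Sample CPU, memory, and network throughput at a fixed interval while a
# performance scenario is running on this host.
def sample_metrics(duration_s=60, interval_s=5):
    last = psutil.net_io_counters()
    for _ in range(duration_s // interval_s):
        time.sleep(interval_s)
        now = psutil.net_io_counters()
        print(
            f"cpu={psutil.cpu_percent():5.1f}% "
            f"mem={psutil.virtual_memory().percent:5.1f}% "
            f"tx={(now.bytes_sent - last.bytes_sent) / interval_s / 1024:.0f} KiB/s "
            f"rx={(now.bytes_recv - last.bytes_recv) / interval_s / 1024:.0f} KiB/s"
        )
        last = now

if __name__ == "__main__":
    sample_metrics()
```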

With functional use cases, simulated user invocation scenarios, and automated performance and quality testing, we achieve roughly 50% coverage. Beyond interface automation we also apply UI automation, mainly to our external app. API-level automated testing will remain the focus of our automation work, and we will continue to expand its role. In the future we will strengthen the following:

  • Adding interface use cases promptly, moving toward test-driven development;
  • Tackling hard-to-automate features such as the whiteboard, whose interactive operations we cannot yet fully automate;
  • Building on interface automation to automate quality testing.