Last year, I wrote two blog posts about UI automated testing and interface (API) automated testing, both covering basic methodology. This year, I moved into a role doing automated testing full-time, made a lot of new discoveries, and of course met a lot of challenges.

So I am writing another post to talk about some challenges I have encountered recently in automated testing, along with my solutions and some thoughts…

 

I. Test scope

Whether it’s functional testing, automation or performance testing, the first step is to define the scope and requirements. For automated testing, especially UI automation, not all function points are suitable for UI automation.

Based on the specific business situation and the project's stability, the best plan is to combine UI automation with API automation, selecting appropriate business points for a targeted automated-test design.

As for which projects are suitable for UI automation, my previous blog post on UI automated testing already analyzed this; here are a few function points that I personally think are suitable for UI automation:

(1) Functions that are used frequently, have many exception branches, and are the most basic, such as user registration and login;

(2) The core functions of the client, such as the order center and member center;

(3) Promotional pages, information display pages, etc.

The concept of layered testing, introduced earlier, forms a pyramid: unit tests at the base, interface (API) tests in the middle, and UI tests at the top.

Unit-level automation offers the highest return on investment, but it is too difficult for most automation-testing beginners.

Moreover, in most domestic small and medium-sized enterprises, development standards and engineering management are not yet mature, so the probability of realizing unit-level automation is not optimistic in the short term.

Interface automation, by far, is the most cost-effective option.

UI automation has the lowest return on investment of the three levels, but for larger, more complex systems it can to some extent reduce the amount of post-release smoke testing.

 

II. System architecture

For most functional testers, the system architecture is not much of a concern, but for automation or performance testers, it is a focus of early requirements analysis and technical-solution design.

For example, the programming language the system is developed in, the type of database used, the communication/service framework (if API automation is planned, the communication protocol the system uses is an unavoidable consideration), and the deployment of the application servers all need to be considered.

PS: The company I work for now uses SQL Server from the Microsoft stack as its database, while Python is the language of my test scripts; Python's ecosystem support for SQL Server and Oracle is not as smooth as for other databases, which led to many pitfalls in database configuration.

Of course, I finally solved the problem. Database support is only a small part of the problem I have encountered.

Another reason to be familiar with the system architecture: if, for example, the system is developed in Java and the automation testers also write their scripts in Java, compatibility is very good, and when technical problems arise you can ask development colleagues for help.

This saves a lot of time, and is no small help to your own technical growth either.

Only after getting familiar with the system architecture is it best to consider the design of the automated test scheme and the choice of technical approach. Rather than applying one fixed approach to everything, use different technical solutions for different problems.

Python: use the pymssql module to operate a SQL Server database.
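As a sketch of what such a database check looks like: pymssql implements Python's standard DB-API 2.0 interface, so test-script database verification follows the familiar connect/cursor/execute pattern. The snippet below uses the stdlib `sqlite3` module (same DB-API shape) so it is self-contained and runs anywhere; with pymssql you would swap the connect call for `pymssql.connect(server=..., user=..., password=..., database=...)`. Table and column names here are invented for illustration.

```python
import sqlite3

# With pymssql this would instead be something like:
#   conn = pymssql.connect(server="db-host", user="tester",
#                          password="***", database="orders_db")
# sqlite3 follows the same DB-API 2.0 pattern, so the rest is identical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Tiny fixture: a table the test can assert against.
cur.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
cur.execute("INSERT INTO orders VALUES (1, 'PAID')")
conn.commit()

# The kind of check an automated test makes after driving the UI/API:
# did the operation leave the expected row in the database?
cur.execute("SELECT status FROM orders WHERE id = ?", (1,))
row = cur.fetchone()
assert row is not None and row[0] == "PAID"
conn.close()
```

The same pattern (fetch a row, assert on its fields) is what lets an API or UI test confirm the interface's real effect on the data, not just its response.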

 

III. Project status

In this regard, I personally believe that analytical communication should be used to confirm whether the system is suitable for automated testing. So, what kind of system is suitable for automated testing?

As shown in the figure above, I have outlined some characteristics of a project suitable for automated testing. Not all of them are necessary; meeting a few of them (the items marked in red are the baseline criteria) is enough to consider automated testing.

Of course, these are not the only aspects of the project situation; other points also need consideration, such as:

① Document management

Documents include requirements documents, test schemes, test cases, test specifications, development specifications, database table design documents, interface documents, etc.

If the documentation is incomplete or missing, the preparation work for automated testing requires much more time and effort.

PS: For example, if API automation is to be carried out but there is no interface document and no database table design document, then sorting out the API descriptions and input/output parameter descriptions brings a lot of extra work, and compiling those document details is a very troublesome thing.

Sometimes you need to query the database to confirm an interface's effect on the data. If there is no documentation for the relevant table fields, sorting them out and communicating with the DBA takes a lot of time and energy.

But many enterprises have a one-sided understanding of automation: they think it can replace people in finding lots of bugs, and they pursue obvious short-term results. If document management is not reasonably complete, the extra time automation testers spend in the early preparation stage can then lead leadership to doubt their ability and efficiency. This, it must be said, is a common drawback of many domestic enterprises.

Problems always need to be solved. If you encounter similar issues, factor them into task splitting and work-hour estimation: count them in the individual estimates, state the reasons, and raise them in reviews with your leader or even higher levels.

Driving problems to resolution is itself part of an automation tester's value.

② Process management

Are the requirements iterations of the project stable? Is there a standard process from requirement review to development, testing and launch?

If requirements iterate quickly or are unstable, the script-maintenance burden for automated tests multiplies. Without good process management, automation progress will keep hitting indescribable pitfalls, and communication costs and change costs will also affect test-scheme design and planning.

In automated testing, familiarize yourself with these problems and be prepared to deal with them. Change is the biggest challenge in automated testing!

 

IV. A shared "goal"

The “goal” here refers to the individual’s goal setting for work and the leader’s expectation for automated work. Different perspectives and understandings result in different expectations. I’m not trying to reach complete agreement, but I’m trying to convey the importance of communication.

Your understanding of the system situation, technical architecture, test scope, and the corresponding automated-test design and goal setting should be communicated and adjusted with your leader continually, to reach a degree of agreement; this avoids many subsequent troubles.

Of course, goal setting should also be grounded in the specific automated-testing needs, task splitting, and reviews; it is a process of continuous adjustment that needs patience and persistence!

 

V. Operating environment

Automated test script development also needs to be based on different environments, and the selection of environments needs to consider different situations. Here are my personal ideas:

1. SIT Environment:

The SIT environment, also known as the system integration environment, is often referred to as the test environment. If the test scripts are developed and debugged based on the SIT environment, the SIT environment needs to be relatively stable.

However, much of the time the SIT environment hosts multiple concurrent test submissions with heavy cross-interference, and there is still a gap between SIT and production (as for why, I believe fellow testers understand the pain).

2. UAT Environment:

The UAT environment is what we call the acceptance environment; similarly, there are also gray (staging) environments and so on. Compared with SIT, the UAT environment is more stable and closer to production, making it suitable for automated test script development and debugging, as well as daily regression testing.

But automated testing can do more than that: it can also handle post-release production smoke tests, timed polling checks, and so on.
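A timed polling check reduces to running a set of named health checks on a schedule and flagging failures. Here is a minimal, self-contained sketch of that idea; the check names are invented, and in practice each check would hit a real endpoint (login, order, payment) while a scheduler invokes the function periodically.

```python
from typing import Callable, Dict

def smoke_poll(checks: Dict[str, Callable[[], bool]]) -> Dict[str, bool]:
    """Run each named health check once and collect pass/fail results.

    In a real timed-polling job, a scheduler (cron, a CI timer, etc.)
    would invoke this periodically and alert on any failure.
    """
    results = {}
    for name, check in checks.items():
        try:
            results[name] = bool(check())
        except Exception:  # a crashing check counts as a failure
            results[name] = False
    return results

# Stand-in checks; real ones would call production endpoints.
results = smoke_poll({
    "login_page": lambda: True,   # pretend this endpoint is healthy
    "order_api": lambda: 1 / 0,   # pretend this endpoint is broken
})
print(results)  # {'login_page': True, 'order_api': False}
```

Keeping the checks as plain callables makes the same suite reusable for both post-release smoke runs and scheduled polling.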

3. Production environment:

If you do test script development and debugging, as well as test execution, in a production environment, the following issues need to be addressed:

① Network problems

Generally, the SIT and UAT environments are deployed inside the enterprise on an intranet, so request resolution differs from the public Internet. When running scripts in the production environment for regression smoke tests, you need to consider the real users' operating environment and the impact of different networks on test results.

Also, in test scripts, use sleep as little as possible (especially in UI automation, where forced waits introduce many uncontrollable factors).
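The common alternative to a fixed sleep is a conditional (explicit) wait: poll for the condition you actually care about, up to a timeout. This is the pattern behind helpers like Selenium's `WebDriverWait`; below is a minimal stdlib sketch of the same idea, with a stand-in condition in place of a real "element is visible" check.

```python
import time

def wait_until(condition, timeout=5.0, interval=0.1):
    """Poll `condition` until it returns truthy or `timeout` seconds pass.

    Returns the condition's truthy value, or raises TimeoutError.
    Unlike a fixed sleep, this waits only as long as needed, so it is
    neither flaky (too short) nor needlessly slow (too long).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        value = condition()
        if value:
            return value
        time.sleep(interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)

# Example: wait for a flag that becomes true after a short delay,
# standing in for "the page element has appeared".
t0 = time.monotonic()
def page_loaded():
    return time.monotonic() - t0 > 0.3

assert wait_until(page_loaded, timeout=2.0) is True
```
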

② Data pollution

When automated tests run in the production environment, the data they generate must be isolated; otherwise services will be affected and production data polluted. Common data isolation methods are as follows:

Whitelisted test accounts, network tags, thread tags, and routing test data into a dedicated test library; for some businesses, mock objects and service stubs can also be used.

Data isolation is also an important consideration in performance testing, where these same methods are often used to avoid polluting production data.
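One of the simplest of these isolation methods, an account whitelist, can be sketched as follows. The account IDs and datastore names are invented for illustration; in practice this switch usually lives in a data-access layer or middleware, keyed off a whitelist, request header, or trace tag.

```python
# Hypothetical whitelist of accounts reserved for automated tests.
TEST_ACCOUNT_WHITELIST = {"qa_user_001", "qa_user_002"}

def pick_datastore(account_id: str) -> str:
    """Route writes from whitelisted test accounts to a shadow test
    database so production data is never polluted.

    Datastore names are placeholders for this sketch.
    """
    if account_id in TEST_ACCOUNT_WHITELIST:
        return "orders_db_test"
    return "orders_db"

# Test traffic lands in the test library; real traffic is untouched.
assert pick_datastore("qa_user_001") == "orders_db_test"
assert pick_datastore("real_customer_42") == "orders_db"
```
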

 

VI. Service deployment

Previous blog posts have covered the skills needed for interface testing, but the core of automated testing is continuous integration, so a separate, unaffected continuous-integration environment is a must.

A continuous integration environment is also known as a CI environment; similarly, there are CD (continuous delivery) environments.

These are widely used in agile testing. As far as I know, more and more enterprises in the industry are starting to build continuous integration environments, which brings many benefits, such as:

(1) Version-iteration tests can be submitted with one click through the CI service, saving time and improving efficiency;

(2) If a large amount of test data needs to be created, a small script can do it: configure once, then click the start button;

(3) When releasing to production, the CI service can package and deploy automatically, avoiding mistakes from manual packaging and configuration;

(4) Automated test scripts deployed in the CI environment only need a click of start, or trigger conditions set as scheduled tasks, to carry out production timed polling, post-release smoke tests, and other work.

How is the CI environment deployed? You can refer to my previous blog: Linux Environments: Continuous Integration Service Deployment series

 

VII. Test strategy

The strategy here can be described from the following perspectives:

1. Split tasks

After confirming the test scope and requirement indicators, split the automated-testing tasks into finer granularity, estimate each work item accurately to the person/day range, complete tasks on time, and continuously adjust as problems arise;

2. Priority

After task splitting, prioritize the tasks, and confirm through review and communication which to complete first.

In the test-scheme design, tasks can be divided into phase one and phase two of automation. For example, phase one achieves a degree of automated coverage on the PC side, while phase two can consider automated coverage on mobile.

Phase three might consider developing an automated testing platform, and so on. Have a goal-oriented plan, and adjust it according to project changes and schedule.

To prioritize automated tasks, consider these points:

(1) The more frequently a function is used, the higher its priority;

(2) The system's basic functions should be at the front of the priority list;

(3) Core functions of the system (such as payment and ordering) should be covered earlier;

(4) Function points with many historical production problems should be covered as much as possible.
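If you want to make this prioritization repeatable rather than ad hoc, the four criteria above can be turned into a simple weighted score. The weights and feature flags below are purely illustrative, not from any specific project.

```python
# Hypothetical weights reflecting the four criteria above; tune per project.
WEIGHTS = {"usage_frequency": 3, "basic_function": 2,
           "core_function": 3, "historical_defects": 2}

def priority_score(feature: dict) -> int:
    """Score a feature for automation priority.

    `feature` maps each criterion name to 0 or 1; a higher score means
    the feature should be automated earlier. Purely illustrative.
    """
    return sum(WEIGHTS[k] * int(feature.get(k, 0)) for k in WEIGHTS)

login = {"usage_frequency": 1, "basic_function": 1,
         "core_function": 1, "historical_defects": 0}
promo_page = {"usage_frequency": 0, "basic_function": 0,
              "core_function": 0, "historical_defects": 1}

# Login (frequent, basic, core) outranks a promo page in the backlog.
assert priority_score(login) > priority_score(promo_page)
```
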

3. Estimated working hours

After the preliminary preparation (scope confirmation, requirements analysis, CI environment deployment, test-scheme design, technical framework selection, document sorting, etc.) is complete, estimate the work hours for each split task to confirm the phased progress of the work and allow adjustment at any time. The following points need to be considered when estimating:

(1) The technical ability and strengths of the automation testers;

(2) The degree of change in the project situation, process management, and system architecture;

(3) The technical difficulty of implementing the test scheme.

 

VIII. Solving problems

The road of automated testing is full of potholes!

We will encounter a lot of problems in our work, including communication problems, management problems, technical difficulties, and some indescribable problems. But only by solving these problems can we continuously improve in our work, complete our work, and obtain the harvest we should have.

Different enterprises have different kinds of problems, and no company is perfect. So at work, learning to adapt to the environment and communicate harmoniously is also a necessity.

The essence of work is that you are paid to make problems go away: solve problems and create value!

Therefore, a good state of mind, a persistent attitude, and a problem-solving mindset are what automated-testing folks need…

 

The above are some of the problems and challenges I have personally encountered in automated testing, along with some of my own thinking and solutions. I hope they provide readers with some ideas and help; they are for reference only…

Of course, many of these points, taken individually, are worth further discussion; limited by space, I will not analyze them one by one here, and will continue to update…