This is the 31st day of my participation in the August Text Challenge. More challenges in August.
Create a project scaffold with `httprunner --startproject your_project_name` (or the shorthand `hrun --startproject your_project_name`). This generates the following structure:
- api: directory of generated API-layer test cases
- reports: default directory for reports produced when test cases are executed
- testcases: directory of test cases
- testsuites: directory of test suites
- debugtalk.py: hot-loaded helper (hook) function file; the name must be debugtalk.py
- .env: environment variable management file
- .gitignore: tells git to ignore the specified folders and files
1. Environment management configuration file (.env)
Key or sensitive information can be kept here for central management. How is it referenced in a test case?
Example: ${ENV(PARAM)} reads the PARAM variable from the .env file.
Tips: do not leave blank lines between the variable definitions in the .env file.
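A minimal sketch of the pattern, assuming a .env file containing the lines `USERNAME=admin` and `PASSWORD=123456` (both variable names are made up for illustration) and a YAML test step that reads them with ${ENV()}:

```yaml
# YAML test step referencing values from the .env file
- test:
    name: login with credentials from .env
    request:
        url: /api/login
        method: POST
        json:
            username: ${ENV(USERNAME)}
            password: ${ENV(PASSWORD)}
```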
2. Set variables that are global to the current file through the variables keyword in a JSON/YAML file:

```yaml
variables:
    key1: value1
    key2: value2
```

Reference them as $key1.
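For context, a sketch of how variables sits inside a YAML test case (the URL and field names are invented for illustration):

```yaml
- config:
    name: variables demo
    variables:
        mobile: "10086"
        passwd: "111111"

- test:
    name: login
    request:
        url: /api/login
        method: POST
        json:
            mobile: $mobile    # resolved from variables above
            passwd: $passwd
```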
3. In the HttpRunner project directory, a debugtalk.py file is created automatically.
Scenario: the register/login interface needs a password encryption algorithm, or an HTTP request header must be constructed dynamically.
Usage: write the function we need in debugtalk.py.
Reference: ${function_name()}
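A minimal sketch of such a helper, assuming the interface expects an MD5-hashed password (the function name encrypt_password is made up for illustration):

```python
# debugtalk.py
import hashlib

def encrypt_password(passwd):
    # Hypothetical helper: MD5-hash the plaintext password before sending it.
    return hashlib.md5(passwd.encode("utf-8")).hexdigest()
```

In a YAML test case it would then be referenced as `password: ${encrypt_password($passwd)}`.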
4. At the same level as variables sits the base_url keyword.
Usage scenario: writing the full URL for every request in our JSON/YAML test cases is cumbersome, even with variable substitution.
When hrun executes a case, it checks whether the URL starts with http; if not, it looks up base_url and concatenates it with the URL.
Tips: variables defined at the api, testcase, and testsuite layers follow the override principle, i.e. the definition closest to the executing case wins.
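The concatenation rule described above can be sketched as follows (build_url is a made-up name for illustration, not HttpRunner's internal API):

```python
def build_url(base_url, url):
    # Absolute URLs (starting with http) are used as-is;
    # relative paths are joined onto base_url.
    if url.startswith("http"):
        return url
    return base_url.rstrip("/") + "/" + url.lstrip("/")

print(build_url("https://api.example.com", "/login"))        # https://api.example.com/login
print(build_url("https://api.example.com", "http://other"))  # http://other
```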
5. Types of validate assertions:
- eq: ["message", "success"]
- {"check": "status", "comparator": "eq", "expect": 200}
- {"check": "content.status", "comparator": "eq", "expect": 200}
Tips: content is the body of the request response; if it is JSON, it is automatically converted to a dict. Another useful comparator is contains.
Note that many interfaces also wrap their payload in a content field, so during extraction you may see content.content.token:
the first content is the entire response body, the second content is the field inside the response data, and token is the value we actually need to extract.
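Put together, a hedged example of a validate block in a YAML test step (the field names are illustrative):

```yaml
validate:
    - eq: ["status_code", 200]            # shorthand style
    - eq: ["content.status", 200]         # dot notation digs into the JSON body
    - contains: ["content.msg", "success"]
    - {"check": "content.status", "comparator": "eq", "expect": 200}  # long style
```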
Three parameterization methods in HttpRunner
A. parameters is used in the YAML file of the test suite, at the same level as name in the testcase block: join the variables to be parameterized with dashes, then list the matching value groups:

```yaml
parameters:
    title-mobile-passwd-status-msg:
        - ["Login successful", "10086", "111111", "200", "success"]
        - ["Password error", "10086", "123456", "200", "success"]
        - ["Account cannot be empty", "", "123456", "200", "failure"]
```
B. File parameterization, e.g. CSV. Design the use-case field names: the header fields are separated by commas, and each data row follows the same order:

```
title,mobile,passwd,status,msg
Login successful,10086,111111,200,success
Password error,10086,123456,200,success
```
Referencing the file uses the same parameters keyword:

```yaml
parameters:
    title-mobile-passwd-status-msg: ${P(csv_path)}
```
One drawback of CSV parameterization is that every value is read back as str, even integers.
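A quick sketch demonstrating that drawback with Python's standard csv module, using the sample data above:

```python
import csv
import io

# Simulate reading the CSV parameterization file shown above.
csv_text = "title,mobile,passwd,status,msg\nLogin successful,10086,111111,200,success\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

print(type(rows[0]["status"]).__name__)  # str: CSV gives back strings, not ints
print(int(rows[0]["status"]) == 200)     # True: convert explicitly where an int is needed
```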
C. debugtalk.py: a list of nested dictionaries:

```python
account = [
    {"title": "Login successful", "mobile": "10086", "passwd": "111111", "status": "200", "msg": "success"},
    {"title": "Password error", "mobile": "10086", "passwd": "111111", "status": "200", "msg": "failure"},
    {"title": "Account cannot be empty", "mobile": "", "passwd": "111111", "status": "201", "msg": "failure"},
]
```
It is referenced with the same parameters keyword, via a debugtalk.py function that returns the list (get_account is a hypothetical function name wrapping the account list above):

```yaml
parameters:
    title-mobile-passwd-status-msg: ${get_account()}
```
Tips: this approach is the most flexible, e.g. reading a large amount of Excel use-case data for parameterization.
Running test cases from Python code via the HttpRunner API:

```python
from httprunner.api import HttpRunner

runner = HttpRunner()
runner.run("yaml_path")
print(runner.summary)  # summary of the request results
```
9. hrun can take some parameters when executing test cases:
- --log-level LOG_LEVEL: specify the logging level during execution; default is INFO
- --log-file LOG_FILE: write logs to the specified file path
- --dot-env-path DOT_ENV_PATH: specify the .env file path, which is useful for keeping sensitive data
- --report-template REPORT_TEMPLATE: specify the report template path
- --report-dir REPORT_DIR: specify the report save directory
- --report-file REPORT_FILE: specify the report file name
- --failfast: stop the test run on the first error or failure (by default, all use cases are executed)
- --save-tests: save loaded tests and parsed tests to JSON files
- --startproject STARTPROJECT: specify a new project name
- --validate [VALIDATE ...]: validate JSON testcase format; multiple files are supported
- --prettify [PRETTIFY ...]: prettify JSON testcase format
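For illustration only, a couple of hedged command lines combining these flags (the file paths are placeholders):

```shell
# run a testcase with debug logging, a custom .env, and a named report file
hrun testcases/login.yml --log-level DEBUG --dot-env-path .env --report-file login_report.html

# stop on the first failure and save parsed tests for inspection
hrun testsuites/smoke.yml --failfast --save-tests
```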