This is the second part of the statistics series. It picks up the statistical tasks from part one, where I shared how the daily summary amounts are produced, and extends them; this part is mainly about how to display the data “correctly”.
A note up front: this article reads more like a daily or weekly work log. I simply describe things in my own words, so it will be rather wordy; I wanted to give everyone a heads-up.
Many readers may think this is nothing more than a query interface with no technical content, but having fallen squarely into these pits myself, I felt a strong urge to share. That is really the original intention of this article: to keep even experienced people from falling into the same pits again.
Bug 1: Start time and end time are not distinguished correctly
Part of the query conditions uses a start time and an end time. I suggest computing these parameters in code rather than deriving them with database functions inside the query: applying a database function to the time column can prevent the index from being used, and, among other costs, computing the time with a database function on every query is genuinely expensive. The SQL statement used in the previous part is posted below:
select trade_type, sum(`settle_amount`) as `sumSettleAmount` from finance_record where uid = ? and trade_time between ? and ? GROUP BY `trade_type`
The code uses between ? and ?, so for the specified day a condition in a time format similar to the following is generated:
trade_time between '2021-01-12 00:00:00' and '2021-01-12 23:59:59'
This ensures that the daily statistics contain no duplicate data, because if the second parameter were filled with ‘2021-01-13 00:00:00’, a record at exactly that point in time would be counted on more than one day.
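To make this concrete, here is a minimal sketch of computing the day boundaries in application code and binding them as plain parameters, assuming plain JDBC and the finance_record query shown above (the class and method names are made up for illustration):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;
import java.time.LocalDate;
import java.time.LocalDateTime;

public class DailySettleStats {

    // Sums settle_amount per trade_type for one day. The day boundaries are
    // computed in code and passed as ordinary parameters, so no date function
    // is applied to trade_time and an index covering it (if present) can
    // still be used.
    public static void printDailySummary(Connection conn, long uid, LocalDate day) throws Exception {
        LocalDateTime start = day.atStartOfDay();      // e.g. 2021-01-12 00:00:00
        LocalDateTime end = day.atTime(23, 59, 59);    // e.g. 2021-01-12 23:59:59

        String sql = "select trade_type, sum(settle_amount) as sumSettleAmount "
                + "from finance_record "
                + "where uid = ? and trade_time between ? and ? "
                + "group by trade_type";

        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, uid);
            ps.setTimestamp(2, Timestamp.valueOf(start));
            ps.setTimestamp(3, Timestamp.valueOf(end));
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("trade_type")
                            + " -> " + rs.getBigDecimal("sumSettleAmount"));
                }
            }
        }
    }
}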
Bug 2: The query criteria of statistics are incorrect
As the name suggests, I wrote the wrong query conditions. Looking back, I made two mistakes here, both tied to the business. In the statistics SQL there was one query condition that I got wrong: the job was supposed to aggregate data for each uid, yet every uid ended up with the same value (the condition was actually not on uid at all). As a result, some users' statistics were wildly inflated. I could not find the problem at the time, so I simply commented out the code, only to have to troubleshoot it online again the next day: that day's data was correct, but the previous day's data was wrong. After digging in again, it turned out the job that re-ran the data had the same problem. I was really confused that day; tripped up twice by the same pit!
Next, let me share two small optimizations.
Optimization point 1: When computing statistics and updating data, print the statistics so they can be checked in the logs.
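As an illustration, here is a minimal sketch of this idea, assuming java.util.logging and a hypothetical DailySummary value object; the actual persistence call is omitted:

import java.math.BigDecimal;
import java.util.List;
import java.util.logging.Logger;

public class DailySummaryJob {

    private static final Logger LOG = Logger.getLogger(DailySummaryJob.class.getName());

    // Hypothetical value object holding one aggregated row (uid, trade_type, sum).
    public static class DailySummary {
        public final long uid;
        public final String tradeType;
        public final BigDecimal sumSettleAmount;

        public DailySummary(long uid, String tradeType, BigDecimal sumSettleAmount) {
            this.uid = uid;
            this.tradeType = tradeType;
            this.sumSettleAmount = sumSettleAmount;
        }
    }

    // Log every aggregated row before writing it back, so a wrong figure can be
    // traced from the logs instead of having to re-run the whole job.
    public void persist(List<DailySummary> rows) {
        for (DailySummary row : rows) {
            LOG.info("daily summary before update: uid=" + row.uid
                    + ", tradeType=" + row.tradeType
                    + ", sumSettleAmount=" + row.sumSettleAmount);
            // ... the actual update/upsert call goes here (omitted) ...
        }
    }
}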
Optimization point 2: Do not use select * together with group by; select only the grouped columns and the aggregates you need, otherwise the values of the non-aggregated columns are indeterminate (or the query is rejected outright when ONLY_FULL_GROUP_BY is enabled). An error case like the following:
select * from finance_record where puid = ? group by puid;
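A possible corrected form of the query above, written as a Java constant to match the sketches earlier: it selects only the grouped column plus explicit aggregates instead of *. The particular aggregates chosen (a row count and the settle_amount sum) are my assumption about what the statistics need:

public class GroupByFix {
    // Corrected version (sketch): only the grouped column and explicit
    // aggregates are selected, never "*". Which aggregates are actually
    // required is an assumption here.
    static final String CORRECTED_SQL =
            "select puid, count(*) as recordCount, "
            + "sum(settle_amount) as sumSettleAmount "
            + "from finance_record "
            + "where puid = ? "
            + "group by puid";
}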
That's all for today. Thank you for reading!