In addition to data export, a project often also needs to import data in batches: for example, opening cards in batches from an Excel spreadsheet, or importing user or department information in bulk. Today we'll look at common data import schemes.
0x01: Synchronous import
If the batch contains only dozens to one or two hundred records, and the per-record business logic is not time-consuming, each record can be imported synchronously. This solution requires only one interface. The interface's main job is to accept the uploaded Excel spreadsheet, parse its data, perform the business operations, and write the results into the database.
If necessary, while parsing the Excel data you can save a validation record for each row of the batch to the database, return a batch number batchNo to the client, and then provide another interface that exports the import result of each row for that batch.
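A minimal sketch of what this single synchronous interface might look like, assuming a Spring Boot controller and Apache POI for parsing; the names `SyncImportController`, `UserImportService`, and `importRow` are hypothetical placeholders for the actual business code:

```java
import java.io.InputStream;
import java.util.UUID;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class SyncImportController {

    private final UserImportService userImportService; // hypothetical business service

    public SyncImportController(UserImportService userImportService) {
        this.userImportService = userImportService;
    }

    /**
     * Single interface: upload the Excel file, parse it row by row,
     * run the business logic for each row, and save a per-row validation
     * record under one batch number that is returned to the client.
     */
    @PostMapping("/import/users/sync")
    public String importUsers(@RequestParam("file") MultipartFile file) throws Exception {
        String batchNo = UUID.randomUUID().toString();
        try (InputStream in = file.getInputStream();
             Workbook workbook = WorkbookFactory.create(in)) {
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                if (row.getRowNum() == 0) {
                    continue; // skip the header row
                }
                // Validate and import one row; the service records the row's
                // validation result against batchNo for later export.
                userImportService.importRow(batchNo, row);
            }
        }
        return batchNo; // the client can later use batchNo to fetch row-level results
    }
}
```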
0x02: Asynchronous import
If the amount of data is large, synchronous import is no longer appropriate; use asynchronous import instead. This design mainly involves two interfaces.
First interface: responsible for uploading the Excel spreadsheet. It sets an "in progress" flag in Redis and starts a thread to parse the Excel data. The thread then performs the business operations to import the data into the database and finally switches the flag in Redis to "finished".
Second interface: used for polling, so the client can find out whether the import has completed.
Similarly, if you need row-level records, you can write a validation record for each row to the database and then design an interface to export the import status of each row.
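Here is a rough sketch of the two asynchronous interfaces, assuming Spring's StringRedisTemplate for the flag and a plain thread pool for the background work; the key prefix, endpoint paths, and `UserImportService.importAll` are illustrative assumptions, not a fixed API:

```java
import java.time.Duration;
import java.util.UUID;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class AsyncImportController {

    private static final String IMPORT_FLAG_KEY = "import:status:"; // hypothetical Redis key prefix

    private final StringRedisTemplate redisTemplate;
    private final UserImportService userImportService;          // hypothetical business service
    private final ExecutorService executor = Executors.newFixedThreadPool(2);

    public AsyncImportController(StringRedisTemplate redisTemplate,
                                 UserImportService userImportService) {
        this.redisTemplate = redisTemplate;
        this.userImportService = userImportService;
    }

    /** First interface: upload the file, mark the batch as "processing" in Redis,
     *  hand the parsing and import off to a worker thread, and return immediately. */
    @PostMapping("/import/users/async")
    public String upload(@RequestParam("file") MultipartFile file) throws Exception {
        String batchNo = UUID.randomUUID().toString();
        byte[] content = file.getBytes(); // copy the upload before the request ends

        redisTemplate.opsForValue().set(IMPORT_FLAG_KEY + batchNo, "PROCESSING", Duration.ofHours(1));
        executor.submit(() -> {
            try {
                userImportService.importAll(batchNo, content); // parse the Excel data and import each row
                redisTemplate.opsForValue().set(IMPORT_FLAG_KEY + batchNo, "FINISHED", Duration.ofHours(1));
            } catch (Exception e) {
                redisTemplate.opsForValue().set(IMPORT_FLAG_KEY + batchNo, "FAILED", Duration.ofHours(1));
            }
        });
        return batchNo;
    }

    /** Second interface: the client polls this endpoint with the batch number
     *  until the flag in Redis switches away from "PROCESSING". */
    @GetMapping("/import/users/status/{batchNo}")
    public String status(@PathVariable String batchNo) {
        String state = redisTemplate.opsForValue().get(IMPORT_FLAG_KEY + batchNo);
        return state == null ? "UNKNOWN" : state;
    }
}
```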
0x03: Import as a service
The first two solutions record nothing about the uploaded Excel file itself, the time of import, or the person who imported it. If something unexpected goes wrong, this makes troubleshooting very inconvenient. It is therefore sometimes necessary to design a database table that records this information, so that problems can be traced later. The fields of the record table can be designed as follows:
- id: primary key ID
- file_path: path of the uploaded file
- file_type: file type
- module: owning service module
- opr_state: overall import status (0 = successful, 1 = failed)
- upload_time: upload time
- upload_user: uploader
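As an illustration only, this record table could map to a JPA entity along the following lines; the table name, column types, and the use of the `jakarta.persistence` package are assumptions:

```java
import java.time.LocalDateTime;

import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.Table;

/** One row per uploaded Excel file, used to trace who imported what and when. */
@Entity
@Table(name = "import_record")
public class ImportRecord {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;                  // primary key ID

    @Column(name = "file_path")
    private String filePath;          // path of the uploaded file

    @Column(name = "file_type")
    private String fileType;          // file type

    @Column(name = "module")
    private String module;            // owning service module

    @Column(name = "opr_state")
    private Integer oprState;         // overall import status: 0 successful, 1 failed

    @Column(name = "upload_time")
    private LocalDateTime uploadTime; // upload time

    @Column(name = "upload_user")
    private String uploadUser;        // uploader

    // getters and setters omitted for brevity
}
```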
This solution, combined with the first or second one, completes the import service.
Overall, a well-designed import feature needs two tables and three interfaces:

- Two tables: the import information record table, and a table that records each Excel row's validation status and import result.
- Three interfaces: one for uploading the Excel spreadsheet and importing the data asynchronously, one for polling whether the import has finished, and one for downloading an Excel file of each row's validation status and import result (see the sketch below).