Some time ago I had a requirement to store a large amount of data into Redis in batches. Because Redis executes commands on a single thread, looping over the data and calling the set command once per entry wastes a lot of time and performance, and can even block the Redis thread. The original code looked like this:

Store Map data in batches

Map<String, List<T>> map = new HashMap<>();
// Process the data
// ...
Set<String> keySet = map.keySet();
// Store the data in Redis, one key at a time
for (String key : keySet) {
    redisUtil.set(key, JSON.toJSONString(map.get(key)), 86400); // valid for 24 hours
}

Store List data in batches

List<ScenerySpot> spotList = new ArrayList<>();
// ... populate spotList ...
for (ScenerySpot scenerySpot : spotList) {
    redisUtil.set(redisKey + scenerySpot.getStationNum(), JSON.toJSONString(scenerySpot), 86400);
}

The code above is simple: it traverses the collection and writes each entry into Redis. But it is inefficient, because every set has to obtain a connection and wait for its reply, so the interaction is "request-response, request-response, request-response". Most of the time goes into creating and destroying connections and waiting on round trips, and only a small fraction into actually executing the commands.

We can optimize this with Redis pipelining: all the commands are sent over a single connection in the pattern "request, request, request, ..., response, response, response", so the writes are batched instead of each one waiting for its own reply.
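To make the idea concrete, here is a minimal sketch of pipelining with the plain Jedis client (my own illustration, not from the original project; the host, port, the demo: key prefix and the 1000-key loop are placeholder assumptions):

import redis.clients.jedis.Jedis;
import redis.clients.jedis.Pipeline;

try (Jedis jedis = new Jedis("localhost", 6379)) {
    Pipeline pipeline = jedis.pipelined();
    for (int i = 0; i < 1000; i++) {
        // The command is only queued locally; nothing is sent yet
        pipeline.setex("demo:key:" + i, 86400, "value-" + i);
    }
    // Flush all queued commands in one batch and read back all replies
    pipeline.sync();
}

All 1000 SETEX commands travel together, and the replies are read back in one go when sync() is called, instead of paying one round trip per key.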

In my project I use Spring Data Redis, and the Value serializer is Jackson2JsonRedisSerializer. The code is as follows:

// Get the configured serializers
RedisSerializer keySerializer = redisTemplate.getKeySerializer();
RedisSerializer valueSerializer = redisTemplate.getValueSerializer();
// Open the pipeline
redisTemplate.executePipelined((RedisCallback<String>) redisConnection -> {
    spotList.stream().forEach(s -> {
        // Serialize the Key
        byte[] keyByte = keySerializer.serialize(redisKey + s.getStationNum());
        // Serialize the Value and store it
        redisConnection.set(keyByte, valueSerializer.serialize(JSON.toJSONString(s)));
        // Expire after 24 hours
        redisConnection.expire(keyByte, 86400);
    });
    return null;
});
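As a small aside (a sketch on my part, assuming the same redisConnection, keyByte and serializers as above): the set + expire pair inside the pipeline can also be collapsed into a single setEx call, which writes the value and its TTL in one command:

// Alternative: value and 24-hour TTL in one pipelined command
redisConnection.setEx(keyByte, 86400, valueSerializer.serialize(JSON.toJSONString(s)));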

Storing Map data. When I used this pipeline in the application, Redis sometimes timed out because the batch took too long to store, and I later had to increase the timeout.

if (map.size() > 0) {
    // Store the data in Redis
    Set<String> set = map.keySet();
    RedisSerializer keySerializer = redisTemplate.getKeySerializer();
    RedisSerializer valueSerializer = redisTemplate.getValueSerializer();
    try {
        // Open the pipeline
        redisTemplate.executePipelined((RedisCallback<Object>) connection -> {
            for (String redisKey : set) {
                // Serialize the Key
                byte[] keyByte = keySerializer.serialize(tourismRedisKey + ":" + redisKey);
                // Serialize the Value and store it
                connection.set(keyByte, valueSerializer.serialize(JSON.toJSONString(map.get(redisKey))));
                // Set the expiration time to 24 hours
                connection.expire(keyByte, 86400);
            }
            return null;
        });
    } catch (Exception ex) {
        logger.error("redis: " + ex.getMessage(), ex);
    }
}
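If the batch is so large that the client still times out, the command timeout can be raised. In a Spring Boot project the spring.redis.timeout property does this; as a sketch in Java configuration (assuming the Lettuce driver, with the host, port and the 10-second value being example placeholders), it would look roughly like this:

@Bean
public LettuceConnectionFactory redisConnectionFactory() {
    RedisStandaloneConfiguration server = new RedisStandaloneConfiguration("localhost", 6379);
    LettuceClientConfiguration clientConfig = LettuceClientConfiguration.builder()
            .commandTimeout(Duration.ofSeconds(10)) // raised so a long pipeline can finish
            .build();
    return new LettuceConnectionFactory(server, clientConfig);
}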

RedisTemplate's multiGet can be used to fetch data in batches, but the key names have to be designed in advance, for example as [prefix:parameter]. We can then query for all keys in that series with a pattern and read the matching values out in one call. The code is attached below:

// Batch read all values whose keys match the given pattern
public <T> List<T> getList(String keys, Class<T> tClass) {
    if (keys == null) {
        return null;
    }
    try {
        // Find all keys matching the pattern
        Set<String> keySet = redisTemplate.keys(keys);
        if (keySet == null) {
            return null;
        }
        // Fetch all values in one batch
        List<Object> objects = redisTemplate.opsForValue().multiGet(keySet);
        List<T> result = new ArrayList<>();
        if (objects != null) {
            for (Object object : objects) {
                result.add(JSONObject.parseObject(object.toString(), tClass));
            }
        }
        return result;
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}

// Call it like this
List<ScenerySpot> data = redisUtil.getList(redisKey + ":*", ScenerySpot.class);
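One caveat worth adding (my own note, not part of the original code): keys() scans the whole keyspace in a single blocking command, which can stall Redis when the database is large. A sketch of a non-blocking alternative using SCAN through the same redisTemplate (ScanOptions and Cursor come from Spring Data Redis; the count of 1000 is an arbitrary example) might look like this:

// Scan-based replacement for redisTemplate.keys(pattern)
private Set<String> scanKeys(String pattern) {
    Set<String> keySet = new HashSet<>();
    redisTemplate.execute((RedisCallback<Void>) connection -> {
        ScanOptions options = ScanOptions.scanOptions().match(pattern).count(1000).build();
        // Iterate the keyspace in small chunks instead of one blocking KEYS call
        try (Cursor<byte[]> cursor = connection.scan(options)) {
            while (cursor.hasNext()) {
                keySet.add(new String(cursor.next()));
            }
        }
        return null;
    });
    return keySet;
}

getList could then call scanKeys(keys) instead of redisTemplate.keys(keys) without changing anything else.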