Background

In Java, we typically store monetary amounts as BigDecimal, whereas MongoDB has no BigDecimal type; the corresponding type is Decimal128.

In projects that use Mongo, in order to convert BigDecimal to Decimal128 on insert and Decimal128 back to BigDecimal on query, we can take advantage of Spring's org.springframework.core.convert.converter.Converter, as follows.

BigDecimalToDecimal128Converter.class

import java.math.BigDecimal;
import org.bson.types.Decimal128;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

@WritingConverter
public class BigDecimalToDecimal128Converter implements Converter<BigDecimal, Decimal128> {

    @Override
    public Decimal128 convert(BigDecimal bigDecimal) {
        return new Decimal128(bigDecimal);
    }
}

Decimal128ToBigDecimalConverter.class

@ReadingConverter
public class Decimal128ToBigDecimalConverter implements Converter<Decimal128, BigDecimal> {

    @Override
    public BigDecimal convert(Decimal128 decimal128) {
        return decimal128.bigDecimalValue();
    }
}

Register the custom Converters when injecting the MongoTemplate into the container:


@Bean
public CustomConversions customConversions() {
    List<Converter<?, ?>> converters = new ArrayList<>(2);
    converters.add(new BigDecimalToDecimal128Converter());
    converters.add(new Decimal128ToBigDecimalConverter());
    return new CustomConversions(converters);
}

@Bean("customMongoTemplate")
public MongoTemplate mongoTemplate() {
    MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory());
    MappingMongoConverter mongoMapping = (MappingMongoConverter) mongoTemplate.getConverter();
    mongoMapping.setCustomConversions(customConversions());
    mongoMapping.afterPropertiesSet();
    return mongoTemplate;
}

This allows us to freely store and read BigDecimal values in Mongo.

The problem

Spring's flexibility is a strength, but using it without a solid understanding can cause a lot of debugging work.

The approach above is fine under normal circumstances, but in real projects it can be used incorrectly.

Example:

Entity class:

@Data
@Document(collection = "BookTestXyh")
public class Book {
    private String bookNo;
    private BigDecimal amount;
    private Map<String, JSONObject> extend;
}

The test class:

public String test() {
    Book book = new Book();
    book.setBookNo("book1");
    book.setAmount(new BigDecimal("10.11"));
    Map<String, JSONObject> extend = new HashMap<>();
    JSONObject data = new JSONObject();
    data.put("amountTax", new BigDecimal("0.88"));
    extend.put("param", data);
    book.setExtend(extend);
    mongoTemplate.insert(book);
 
    Query query = Query.query(Criteria.where("bookNo").is("book1"));
    List<Book> books = mongoTemplate.find(query, Book.class);
 
    return "success";
}

The test class inserts first and then executes the query.

Insert result:
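The stored document looks roughly like this (mongo shell representation; exact output depends on the driver and Spring Data version, and Spring Data also adds a _class type hint to the root document):

{
    "_id" : ObjectId("..."),
    "bookNo" : "book1",
    "amount" : NumberDecimal("10.11"),
    "extend" : {
        "param" : {
            "amountTax" : NumberDecimal("0.88")
        }
    }
}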

No problem: the BigDecimal fields are converted to Decimal128 as expected.

What about queries?

As you can see from the query result, amount is correctly converted back to BigDecimal, while the amountTax inside extend is still a Decimal128 that was never converted. If we use amountTax as a BigDecimal, things will go wrong.
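A minimal illustration of the failure mode (variable names here are mine, not from the original test):

Book loaded = books.get(0);
Object amountTax = loaded.getExtend().get("param").get("amountTax");
System.out.println(amountTax.getClass().getName()); // org.bson.types.Decimal128
BigDecimal tax = (BigDecimal) amountTax;            // ClassCastException at runtime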

Analysis

Why?

You can imagine that reading data from Mongo into memory must go through deserialization and type conversion. Since the data itself comes back intact, deserialization is not the problem; the problem must be in type conversion.

The amount field is converted normally, but the amountTax in extend is not, so something must go wrong when parsing nested structures.

What type is JSONObject, essentially? It implements Map<String, Object>. With that in mind, step into MappingMongoConverter#readMap:


protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, ObjectPath path) {

    Assert.notNull(dbObject, "DBObject must not be null!");
    Assert.notNull(path, "Object path must not be null!");

    Class<?> mapType = typeMapper.readType(dbObject, type).getType();

    TypeInformation<?> keyType = type.getComponentType();
    Class<?> rawKeyType = keyType == null ? null : keyType.getType();

    // The declared type of the map's values.
    // JSONObject is essentially Map<String, Object>, so when the entries of the
    // nested "param" object (including amountTax) are converted, valueType is Object.
    TypeInformation<?> valueType = type.getMapValueType();
    Class<?> rawValueType = valueType == null ? null : valueType.getType();

    Map<Object, Object> map = CollectionFactory.createMap(mapType, rawKeyType, dbObject.keySet().size());
    Map<String, Object> sourceMap = dbObject.toMap();

    if (!DBRef.class.equals(rawValueType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceMap.values())) {
        bulkReadAndConvertDBRefMapIntoTarget(valueType, rawValueType, sourceMap, map);
        return map;
    }

    for (Entry<String, Object> entry : sourceMap.entrySet()) {

        if (typeMapper.isTypeKey(entry.getKey())) {
            continue;
        }

        Object key = potentiallyUnescapeMapKey(entry.getKey());

        if (rawKeyType != null) {
            key = conversionService.convert(key, rawKeyType);
        }

        Object value = entry.getValue();

        if (value instanceof DBObject) {
            // Nested structure: keep parsing recursively.
            map.put(key, read(valueType, (DBObject) value, path));
        } else if (value instanceof DBRef) {
            map.put(key, DBRef.class.equals(rawValueType) ? value
                    : readAndConvertDBRef((DBRef) value, valueType, ObjectPath.ROOT, rawValueType));
        } else {
            Class<?> valueClass = valueType == null ? null : valueType.getType();
            // Because JSONObject is a Map<String, Object>, valueClass is Object here:
            // Spring thinks we want to convert amountTax to Object. Decimal128 already
            // is an Object, so no conversion is applied and amountTax stays a Decimal128.
            map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
        }
    }

    return map;
}

The key point is the map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass)) call; see the comment above it.
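For reference, getPotentiallyConvertedSimpleRead in this generation of Spring Data MongoDB looks roughly like the following (a simplified paraphrase of the library source, not a verbatim copy; check the exact code of your version):

private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) {

    // target is Object for amountTax, and Decimal128 is assignable to Object,
    // so the raw Decimal128 is returned before any custom converter is consulted.
    if (value == null || target == null || target.isAssignableFrom(value.getClass())) {
        return value;
    }

    // Only when the value is not already assignable does the ConversionService
    // (and with it Decimal128ToBigDecimalConverter) get a chance to run.
    return conversionService.convert(value, target);
}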

Conclusion

To summarize why we fell into this pit:

1. Our understanding of Mongo-to-Java type conversion was not deep enough, and we used the Converters incorrectly: Decimal128ToBigDecimalConverter is simply not applied in some cases.

2. For type-sensitive fields, Map<String, JSONObject> is not a suitable structure to store in Mongo; other data structures need to be considered.

Solution

How to solve it?

1. Copy MappingMongoConverter and customize the type-conversion logic.

2. Change the data structure (see the sketch below).
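As a sketch of option 2 (ExtendParam is a hypothetical name, and this assumes the extension payload has a known shape), replacing the type-opaque JSONObject with a strongly typed class gives readMap a concrete target type, so the converter is applied:

@Data
public class ExtendParam {
    // Declared as BigDecimal, so on read Spring targets BigDecimal and
    // Decimal128ToBigDecimalConverter is applied.
    private BigDecimal amountTax;
}

@Data
@Document(collection = "BookTestXyh")
public class Book {
    private String bookNo;
    private BigDecimal amount;
    private Map<String, ExtendParam> extend;
}

The trade-off is that extend is no longer free-form; if the payload truly must stay schemaless, option 1 is the only way to keep BigDecimal semantics on read.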

Reflection

1. For classes that need to be persisted, the design of the data structure is important; once complex scenarios go wrong, changing the data structure can be costly.

2. Mongo's strength is unstructured data storage. Spring can recursively convert field types when writing to the database, but cannot intelligently convert them when reading. Is Spring's support for Mongo incomplete, or is this simply a case of incorrect use?

For this example, why doesn't Spring convert based on the type it reads (see a Decimal128, apply the Decimal128-to-BigDecimal Converter), rather than based on the type to be returned (see that the target is Object, treat the Decimal128 as already being an Object, and return it unconverted)?