This article is included in my GitHub collection — stars welcome: github.com/yehongzhi/l…
Preface
The last article gave an overview of dynamic-datasource, and it really is easy to use: with a single @DS annotation and some simple configuration you can switch between multiple data sources. So how does that work? What happens under the hood? With that question in mind, let's study the source code.
The framework itself has quite a few feature points, and the source code for the smaller features — SpEL support, regular-expression matching, and so on — will not be covered in detail. We only care about the core feature: switching between multiple data sources.
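As a quick recap of what that simple configuration looks like, here is a rough sketch (the data source names, URLs and credentials below are illustrative, not taken from a real project):

```yaml
spring:
  datasource:
    dynamic:
      primary: master        # default data source
      strict: false
      datasource:
        master:
          url: jdbc:mysql://localhost:3306/db_master
          username: root
          password: root
          driver-class-name: com.mysql.cj.jdbc.Driver
        slave_1:
          url: jdbc:mysql://localhost:3306/db_slave
          username: root
          password: root
          driver-class-name: com.mysql.cj.jdbc.Driver
```

Keep this shape in mind — the rest of the article traces how these keys are read and turned into switchable data sources.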
Source code analysis
First, recall that we had to introduce the starter at the beginning:
```xml
<dependency>
    <groupId>com.baomidou</groupId>
    <artifactId>dynamic-datasource-spring-boot-starter</artifactId>
    <version>3.3.0</version>
</dependency>
```
The starter's auto-configuration class is declared in its META-INF/spring.factories file:
```properties
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
com.baomidou.dynamic.datasource.spring.boot.autoconfigure.DynamicDataSourceAutoConfiguration
```
Then open the class:
```java
/**
 * Core auto-configuration class for the dynamic data source.
 *
 * @author TaoYu Kanyuxia
 * @see DynamicDataSourceProvider
 * @see DynamicDataSourceStrategy
 * @see DynamicRoutingDataSource
 * @since 1.0.0
 */
@Slf4j
@Configuration
@AllArgsConstructor
// Read configuration under the spring.datasource.dynamic prefix
@EnableConfigurationProperties(DynamicDataSourceProperties.class)
// Load before Spring Boot's DataSourceAutoConfiguration registers its DataSource bean,
// so this class's bean goes into the container first
@AutoConfigureBefore(DataSourceAutoConfiguration.class)
// Import Druid's auto-config and the creators for the various connection pools
@Import(value = {DruidDynamicDataSourceConfiguration.class, DynamicDataSourceCreatorAutoConfiguration.class})
// Loading condition: enable this auto-config when spring.datasource.dynamic.enabled is true (the default)
@ConditionalOnProperty(prefix = DynamicDataSourceProperties.PREFIX, name = "enabled", havingValue = "true", matchIfMissing = true)
public class DynamicDataSourceAutoConfiguration {

    private final DynamicDataSourceProperties properties;

    // Read the multi-data-source configuration and register the provider in the Spring container
    @Bean
    @ConditionalOnMissingBean
    public DynamicDataSourceProvider dynamicDataSourceProvider() {
        Map<String, DataSourceProperty> datasourceMap = properties.getDatasource();
        return new YmlDynamicDataSourceProvider(datasourceMap);
    }

    // Register the framework's own dynamic multi-source DataSource
    @Bean
    @ConditionalOnMissingBean
    public DataSource dataSource(DynamicDataSourceProvider dynamicDataSourceProvider) {
        DynamicRoutingDataSource dataSource = new DynamicRoutingDataSource();
        dataSource.setPrimary(properties.getPrimary());
        dataSource.setStrict(properties.getStrict());
        dataSource.setStrategy(properties.getStrategy());
        dataSource.setProvider(dynamicDataSourceProvider);
        dataSource.setP6spy(properties.getP6spy());
        dataSource.setSeata(properties.getSeata());
        return dataSource;
    }

    // AOP advisor: enhances methods annotated with @DS
    @Role(value = BeanDefinition.ROLE_INFRASTRUCTURE)
    @Bean
    @ConditionalOnMissingBean
    public DynamicDataSourceAnnotationAdvisor dynamicDatasourceAnnotationAdvisor(DsProcessor dsProcessor) {
        DynamicDataSourceAnnotationInterceptor interceptor =
                new DynamicDataSourceAnnotationInterceptor(properties.isAllowedPublicOnly(), dsProcessor);
        DynamicDataSourceAnnotationAdvisor advisor = new DynamicDataSourceAnnotationAdvisor(interceptor);
        advisor.setOrder(properties.getOrder());
        return advisor;
    }

    @Role(value = BeanDefinition.ROLE_INFRASTRUCTURE)
    @ConditionalOnProperty(prefix = DynamicDataSourceProperties.PREFIX, name = "seata", havingValue = "false", matchIfMissing = true)
    @Bean
    public Advisor dynamicTransactionAdvisor() {
        AspectJExpressionPointcut pointcut = new AspectJExpressionPointcut();
        pointcut.setExpression("@annotation(com.baomidou.dynamic.datasource.annotation.DSTransactional)");
        return new DefaultPointcutAdvisor(pointcut, new DynamicTransactionAdvisor());
    }

    // Build the DsProcessor chain: header -> session -> SpEL
    @Bean
    @ConditionalOnMissingBean
    public DsProcessor dsProcessor() {
        DsHeaderProcessor headerProcessor = new DsHeaderProcessor();
        DsSessionProcessor sessionProcessor = new DsSessionProcessor();
        DsSpelExpressionProcessor spelExpressionProcessor = new DsSpelExpressionProcessor();
        headerProcessor.setNextProcessor(sessionProcessor);
        sessionProcessor.setNextProcessor(spelExpressionProcessor);
        return headerProcessor;
    }
}
```
We can see that, on startup, the configuration under the spring.datasource.dynamic prefix is read into the DynamicDataSourceProperties class and registered as a bean in the Spring container. This way of reading configuration-file information is also common in day-to-day development.
```java
@Slf4j
@Getter
@Setter
@ConfigurationProperties(prefix = DynamicDataSourceProperties.PREFIX)
public class DynamicDataSourceProperties {

    public static final String PREFIX = "spring.datasource.dynamic";
    public static final String HEALTH = PREFIX + ".health";

    /** The default data source when none is specified. */
    private String primary = "master";
    /** Whether to enable strict mode: fail when the requested data source does not exist. */
    private Boolean strict = false;
    /** Whether to enable p6spy SQL logging. */
    private Boolean p6spy = false;
    /** Whether to enable Seata distributed transactions. */
    private Boolean seata = false;
    /** The Seata transaction mode, AT by default. */
    private SeataMode seataMode = SeataMode.AT;
    /** Whether to use Spring Actuator health monitoring. */
    private Boolean health = false;
    /** Every data source's configuration. */
    private Map<String, DataSourceProperty> datasource = new LinkedHashMap<>();
    /** The switching strategy; defaults to the load-balancing strategy. */
    private Class<? extends DynamicDataSourceStrategy> strategy = LoadBalanceDynamicDataSourceStrategy.class;
    /** The order of the AOP advisor. */
    private Integer order = Ordered.HIGHEST_PRECEDENCE;
    /** Druid global parameter configuration. */
    @NestedConfigurationProperty
    private DruidConfig druid = new DruidConfig();
    /** HikariCP global parameter configuration. */
    @NestedConfigurationProperty
    private HikariCpConfig hikari = new HikariCpConfig();
    /** Global default publicKey for encrypted passwords. */
    private String publicKey = CryptoUtils.DEFAULT_PUBLIC_KEY_STRING;
    /** Whether only public methods may be intercepted. */
    private Boolean allowedPublicOnly = true;
}
```
But how does this configuration information get combined with Spring's DataSource? The framework provides its own dynamic routing DataSource that implements the standard DataSource interface, so Spring can use it like any other data source.
```java
/**
 * Abstract dynamic routing of data sources.
 *
 * @author TaoYu
 * @since 2.2.0
 */
public abstract class AbstractRoutingDataSource extends AbstractDataSource {

    /** Deciding which data source to use is left to subclasses. */
    protected abstract DataSource determineDataSource();

    // Override getConnection() so that every connection request is routed
    @Override
    public Connection getConnection() throws SQLException {
        // The xid is only involved when handling Seata distributed transactions
        String xid = TransactionContext.getXID();
        if (StringUtils.isEmpty(xid)) {
            return determineDataSource().getConnection();
        } else {
            String ds = DynamicDataSourceContextHolder.peek();
            ConnectionProxy connection = ConnectionFactory.getConnection(ds);
            return connection == null
                    ? getConnectionProxy(ds, determineDataSource().getConnection())
                    : connection;
        }
    }
}
```
If you have studied source code before, the template method pattern here should look familiar: the behavior of choosing a DataSource is deferred to subclasses, so the key is the subclass implementation:
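The template-method idea can be boiled down to a toy sketch (these are illustrative types, not the library's real API): the parent class fixes the flow of obtaining a connection, while the child decides which source to route to.

```java
// Toy sketch of the template method pattern used by AbstractRoutingDataSource.
abstract class Router {
    // Template method: the algorithm skeleton is fixed in the parent...
    final String getConnection() {
        return "connection to " + determineSource();
    }

    // ...and this one step is deferred to subclasses.
    protected abstract String determineSource();
}

class FixedRouter extends Router {
    @Override
    protected String determineSource() {
        return "master";
    }
}

public class TemplateSketch {
    public static void main(String[] args) {
        System.out.println(new FixedRouter().getConnection()); // connection to master
    }
}
```

In the real framework, `getConnection()` plays the role of the fixed skeleton and `determineDataSource()` is the deferred step.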
```java
@Slf4j
public class DynamicRoutingDataSource extends AbstractRoutingDataSource implements InitializingBean, DisposableBean {

    private static final String UNDERLINE = "_";
    /** All data sources. */
    private final Map<String, DataSource> dataSourceMap = new ConcurrentHashMap<>();
    /** Grouped data sources. */
    private final Map<String, GroupDataSource> groupDataSources = new ConcurrentHashMap<>();
    @Setter
    private DynamicDataSourceProvider provider;
    @Setter
    private Class<? extends DynamicDataSourceStrategy> strategy = LoadBalanceDynamicDataSourceStrategy.class;
    @Setter
    private String primary = "master";
    @Setter
    private Boolean strict = false;
    @Setter
    private Boolean p6spy = false;
    @Setter
    private Boolean seata = false;

    @Override
    public DataSource determineDataSource() {
        return getDataSource(DynamicDataSourceContextHolder.peek());
    }

    private DataSource determinePrimaryDataSource() {
        log.debug("dynamic-datasource switch to the primary datasource");
        return groupDataSources.containsKey(primary)
                ? groupDataSources.get(primary).determineDataSource()
                : dataSourceMap.get(primary);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        // Check whether features are enabled without their dependencies on the classpath
        checkEnv();
        // Add and group the data sources
        Map<String, DataSource> dataSources = provider.loadDataSources();
        for (Map.Entry<String, DataSource> dsItem : dataSources.entrySet()) {
            addDataSource(dsItem.getKey(), dsItem.getValue());
        }
        // Check the primary data source setting
        if (groupDataSources.containsKey(primary)) {
            log.info("dynamic-datasource initial loaded [{}] datasource,primary group datasource named [{}]",
                    dataSources.size(), primary);
        } else if (dataSourceMap.containsKey(primary)) {
            log.info("dynamic-datasource initial loaded [{}] datasource,primary datasource named [{}]",
                    dataSources.size(), primary);
        } else {
            throw new RuntimeException("dynamic-datasource Please check the setting of primary");
        }
    }
}
```
It implements the InitializingBean interface, whose afterPropertiesSet() method is a bean lifecycle callback: it runs extra logic once the bean's properties have been set.
What it does here is check the configuration and obtain a Map of data sources by calling provider.loadDataSources(), where the key is the data source's name and the value is the DataSource itself.
```java
@Slf4j
@AllArgsConstructor
public class YmlDynamicDataSourceProvider extends AbstractDataSourceProvider {

    /** All data source configurations. */
    private final Map<String, DataSourceProperty> dataSourcePropertiesMap;

    @Override
    public Map<String, DataSource> loadDataSources() {
        // Delegate to AbstractDataSourceProvider's createDataSourceMap() method
        return createDataSourceMap(dataSourcePropertiesMap);
    }
}

@Slf4j
public abstract class AbstractDataSourceProvider implements DynamicDataSourceProvider {

    @Autowired
    private DefaultDataSourceCreator defaultDataSourceCreator;

    protected Map<String, DataSource> createDataSourceMap(Map<String, DataSourceProperty> dataSourcePropertiesMap) {
        Map<String, DataSource> dataSourceMap = new HashMap<>(dataSourcePropertiesMap.size() * 2);
        for (Map.Entry<String, DataSourceProperty> item : dataSourcePropertiesMap.entrySet()) {
            DataSourceProperty dataSourceProperty = item.getValue();
            // Fall back to the configuration key when no pool name is set
            String poolName = dataSourceProperty.getPoolName();
            if (poolName == null || "".equals(poolName)) {
                poolName = item.getKey();
            }
            dataSourceProperty.setPoolName(poolName);
            // Create the concrete DataSource for each configuration entry
            dataSourceMap.put(poolName, defaultDataSourceCreator.createDataSource(dataSourceProperty));
        }
        return dataSourceMap;
    }
}
```
The defaultDataSourceCreator.createDataSource() method here uses the adapter pattern.
Because each configured data source may use a different DataSource implementation class, the concrete DataSource has to be created according to the configured type.
```java
@Override
public DataSource createDataSource(DataSourceProperty dataSourceProperty, String publicKey) {
    DataSourceCreator dataSourceCreator = null;
    // this.creators holds every available DataSourceCreator implementation
    for (DataSourceCreator creator : this.creators) {
        // Find the first creator that supports this configuration
        if (creator.support(dataSourceProperty)) {
            dataSourceCreator = creator;
            break;
        }
    }
    if (dataSourceCreator == null) {
        throw new IllegalStateException("creator must not be null,please check the DataSourceCreator");
    }
    // Let the matched creator build the concrete DataSource
    DataSource dataSource = dataSourceCreator.createDataSource(dataSourceProperty, publicKey);
    this.runScrip(dataSource, dataSourceProperty);
    return wrapDataSource(dataSource, dataSourceProperty);
}
```
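The matching loop above — ask each creator whether it supports the configuration, take the first that does — can be sketched with toy types (the type names and return values here are purely illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class CreatorSketch {
    // Toy stand-in for DataSourceCreator: each creator declares which type it supports.
    interface Creator {
        boolean support(String type);
        String create();
    }

    static final List<Creator> CREATORS = Arrays.asList(
            new Creator() {
                public boolean support(String type) { return "hikari".equals(type); }
                public String create() { return "HikariDataSource"; }
            },
            new Creator() {
                public boolean support(String type) { return "druid".equals(type); }
                public String create() { return "DruidDataSource"; }
            });

    // Same shape as DefaultDataSourceCreator.createDataSource: first match wins.
    static String createDataSource(String type) {
        for (Creator creator : CREATORS) {
            if (creator.support(type)) {
                return creator.create();
            }
        }
        throw new IllegalStateException("creator must not be null");
    }

    public static void main(String[] args) {
        System.out.println(createDataSource("druid")); // DruidDataSource
    }
}
```

The design choice is that the caller never names a concrete pool; adding support for a new pool only means registering another creator.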
All the corresponding implementation classes live under the creator package:
Let’s look at one of the implementation classes:
```java
@Data
@AllArgsConstructor
public class HikariDataSourceCreator extends AbstractDataSourceCreator implements DataSourceCreator {

    private static Boolean hikariExists = false;

    static {
        // Probe the classpath for HikariCP
        try {
            Class.forName(HIKARI_DATASOURCE);
            hikariExists = true;
        } catch (ClassNotFoundException ignored) {
        }
    }

    private HikariCpConfig hikariCpConfig;

    @Override
    public DataSource createDataSource(DataSourceProperty dataSourceProperty, String publicKey) {
        if (StringUtils.isEmpty(dataSourceProperty.getPublicKey())) {
            dataSourceProperty.setPublicKey(publicKey);
        }
        HikariConfig config = dataSourceProperty.getHikari().toHikariConfig(hikariCpConfig);
        config.setUsername(dataSourceProperty.getUsername());
        config.setPassword(dataSourceProperty.getPassword());
        config.setJdbcUrl(dataSourceProperty.getUrl());
        config.setPoolName(dataSourceProperty.getPoolName());
        String driverClassName = dataSourceProperty.getDriverClassName();
        if (!StringUtils.isEmpty(driverClassName)) {
            config.setDriverClassName(driverClassName);
        }
        return new HikariDataSource(config);
    }

    @Override
    public boolean support(DataSourceProperty dataSourceProperty) {
        Class<? extends DataSource> type = dataSourceProperty.getType();
        // Match when no type is configured but Hikari is on the classpath,
        // or when the configured type is HikariDataSource
        return (type == null && hikariExists)
                || (type != null && HIKARI_DATASOURCE.equals(type.getName()));
    }
}
```
Now, back to the Map of data sources we obtained — what happens with it?
The addDataSource() method is then called for each entry; it groups data sources by the underscore "_" in their names and puts them into the groupDataSources member variable.
```java
/**
 * Add a new data source to its group.
 *
 * @param ds         the new data source's name
 * @param dataSource the new data source
 */
private void addGroupDataSource(String ds, DataSource dataSource) {
    if (ds.contains(UNDERLINE)) {
        // The part before the first underscore is the group name
        String group = ds.split(UNDERLINE)[0];
        GroupDataSource groupDataSource = groupDataSources.get(group);
        if (groupDataSource == null) {
            try {
                // strategy defaults to LoadBalanceDynamicDataSourceStrategy
                groupDataSource = new GroupDataSource(group, strategy.getDeclaredConstructor().newInstance());
                groupDataSources.put(group, groupDataSource);
            } catch (Exception e) {
                throw new RuntimeException("dynamic-datasource - add the datasource named " + ds + " error", e);
            }
        }
        groupDataSource.addDatasource(ds, dataSource);
    }
}
```
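The grouping rule above — everything before the first underscore names the group — can be sketched in isolation (illustrative names, not the library's code):

```java
import java.util.*;

public class GroupingSketch {
    // Minimal sketch of the grouping rule in addGroupDataSource:
    // a name containing "_" joins the group named by its prefix.
    static Map<String, List<String>> group(Collection<String> names) {
        Map<String, List<String>> groups = new LinkedHashMap<>();
        for (String ds : names) {
            if (ds.contains("_")) {
                String group = ds.split("_")[0];
                groups.computeIfAbsent(group, k -> new ArrayList<>()).add(ds);
            }
        }
        return groups;
    }

    public static void main(String[] args) {
        Map<String, List<String>> g = group(Arrays.asList("master", "slave_1", "slave_2"));
        // "master" has no underscore, so it only lives in dataSourceMap;
        // slave_1 and slave_2 fall into the "slave" group.
        System.out.println(g); // {slave=[slave_1, slave_2]}
    }
}
```

So with the configuration named master, slave_1 and slave_2, @DS("slave") addresses the whole group while @DS("slave_1") addresses one member.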
The load-balancing strategy is also set when grouping. What is this load balancing for?
For example, if group master contains three data sources (A, B and C) and usage should be spread across them sensibly rather than always hitting just one, a load-balancing strategy is needed. The default is polling (round-robin), and the corresponding class is:
```java
public class LoadBalanceDynamicDataSourceStrategy implements DynamicDataSourceStrategy {

    /** Load-balancing counter. */
    private final AtomicInteger index = new AtomicInteger(0);

    @Override
    public DataSource determineDataSource(List<DataSource> dataSources) {
        // Round-robin: advance the counter and wrap around the list
        return dataSources.get(Math.abs(index.getAndAdd(1) % dataSources.size()));
    }
}
```
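A runnable sketch of the same round-robin idea, using a toy list of names instead of real DataSources, makes the wrap-around easy to see:

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class RoundRobinSketch {
    // Each call advances the counter; Math.abs guards against the int
    // counter eventually overflowing into negative values.
    private final AtomicInteger index = new AtomicInteger(0);

    <T> T pick(List<T> candidates) {
        return candidates.get(Math.abs(index.getAndAdd(1) % candidates.size()));
    }

    public static void main(String[] args) {
        RoundRobinSketch s = new RoundRobinSketch();
        List<String> group = Arrays.asList("slave_1", "slave_2", "slave_3");
        for (int i = 0; i < 6; i++) {
            System.out.print(s.pick(group) + " ");
        }
        // prints: slave_1 slave_2 slave_3 slave_1 slave_2 slave_3
    }
}
```

getAndAdd(1) returns the value before the increment, so the sequence of indices is 0, 1, 2, 0, 1, 2, … for a three-member group.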
The data source is then obtained through:
```java
@Override
public DataSource determineDataSource() {
    return getDataSource(DynamicDataSourceContextHolder.peek());
}

public DataSource getDataSource(String ds) {
    // An empty name falls back to the primary data source
    if (StringUtils.isEmpty(ds)) {
        return determinePrimaryDataSource();
    } else if (!groupDataSources.isEmpty() && groupDataSources.containsKey(ds)) {
        // The name matches a group: let the group's strategy pick a member
        log.debug("dynamic-datasource switch to the datasource named [{}]", ds);
        return groupDataSources.get(ds).determineDataSource();
    } else if (dataSourceMap.containsKey(ds)) {
        // The name matches an ordinary data source: return it directly
        log.debug("dynamic-datasource switch to the datasource named [{}]", ds);
        return dataSourceMap.get(ds);
    }
    if (strict) {
        throw new RuntimeException("dynamic-datasource could not find a datasource named" + ds);
    }
    return determinePrimaryDataSource();
}
```
So what does this DynamicDataSourceContextHolder class do? And how does the value of @DS get into it?
Back in the auto-configuration class, one of the beans configured is DynamicDataSourceAnnotationAdvisor, and an interceptor is set up for it:
```java
@Role(value = BeanDefinition.ROLE_INFRASTRUCTURE)
@Bean
@ConditionalOnMissingBean
public DynamicDataSourceAnnotationAdvisor dynamicDatasourceAnnotationAdvisor(DsProcessor dsProcessor) {
    // Create the interceptor that actually performs the switch
    DynamicDataSourceAnnotationInterceptor interceptor =
            new DynamicDataSourceAnnotationInterceptor(properties.isAllowedPublicOnly(), dsProcessor);
    DynamicDataSourceAnnotationAdvisor advisor = new DynamicDataSourceAnnotationAdvisor(interceptor);
    advisor.setOrder(properties.getOrder());
    return advisor;
}
```
DynamicDataSourceAnnotationAdvisor is used for AOP aspect programming; the aspect handles the @DS annotation:
```java
public class DynamicDataSourceAnnotationAdvisor extends AbstractPointcutAdvisor implements BeanFactoryAware {

    /** The advice (the interceptor passed in). */
    private final Advice advice;
    /** The pointcut. */
    private final Pointcut pointcut;

    public DynamicDataSourceAnnotationAdvisor(@NonNull DynamicDataSourceAnnotationInterceptor dynamicDataSourceAnnotationInterceptor) {
        this.advice = dynamicDataSourceAnnotationInterceptor;
        this.pointcut = buildPointcut();
    }

    @Override
    public Pointcut getPointcut() {
        return this.pointcut;
    }

    @Override
    public Advice getAdvice() {
        return this.advice;
    }

    @Override
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        if (this.advice instanceof BeanFactoryAware) {
            ((BeanFactoryAware) this.advice).setBeanFactory(beanFactory);
        }
    }

    private Pointcut buildPointcut() {
        // Matches @DS placed on a class
        Pointcut cpc = new AnnotationMatchingPointcut(DS.class, true);
        // Matches @DS placed on a method
        Pointcut mpc = new AnnotationMethodPoint(DS.class);
        // Either one triggers the advice
        return new ComposablePointcut(cpc).union(mpc);
    }
}
```
Now we know where it starts: the @DS annotation. What it actually does lives mostly in the advice, which is the interceptor passed in — DynamicDataSourceAnnotationInterceptor.
```java
public class DynamicDataSourceAnnotationInterceptor implements MethodInterceptor {

    /** The identification of SpEL. */
    private static final String DYNAMIC_PREFIX = "#";

    private final DataSourceClassResolver dataSourceClassResolver;
    private final DsProcessor dsProcessor;

    public DynamicDataSourceAnnotationInterceptor(Boolean allowedPublicOnly, DsProcessor dsProcessor) {
        dataSourceClassResolver = new DataSourceClassResolver(allowedPublicOnly);
        this.dsProcessor = dsProcessor;
    }

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        // Resolve the data source name from the @DS annotation
        String dsKey = determineDatasourceKey(invocation);
        // Push the name onto the current thread's stack
        DynamicDataSourceContextHolder.push(dsKey);
        try {
            // Execute the intercepted method
            return invocation.proceed();
        } finally {
            // Pop the name off the stack when the method returns
            DynamicDataSourceContextHolder.poll();
        }
    }

    // Uses the chain-of-responsibility pattern to resolve dynamic ("#"-prefixed) keys
    private String determineDatasourceKey(MethodInvocation invocation) {
        String key = dataSourceClassResolver.findDSKey(invocation.getMethod(), invocation.getThis());
        return (!key.isEmpty() && key.startsWith(DYNAMIC_PREFIX))
                ? dsProcessor.determineDatasource(invocation, key)
                : key;
    }
}
```
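The chain-of-responsibility resolution mentioned in the comment — header processor, then session processor, then SpEL — can be sketched with toy processors. The prefixes and return values below are illustrative; the real processors inspect the Spring MethodInvocation.

```java
// Toy sketch of the DsProcessor chain: each processor either resolves
// the key itself or passes it to the next processor in the chain.
abstract class Processor {
    private Processor next;

    Processor setNext(Processor next) {
        this.next = next;
        return next;
    }

    String determine(String key) {
        String result = resolve(key);
        if (result != null) {
            return result;
        }
        return next == null ? null : next.determine(key);
    }

    protected abstract String resolve(String key);
}

class HeaderProcessor extends Processor {
    protected String resolve(String key) {
        return key.startsWith("#header") ? "from-header" : null;
    }
}

class SessionProcessor extends Processor {
    protected String resolve(String key) {
        return key.startsWith("#session") ? "from-session" : null;
    }
}

public class ChainSketch {
    public static void main(String[] args) {
        Processor chain = new HeaderProcessor();
        chain.setNext(new SessionProcessor());
        System.out.println(chain.determine("#session.tenant")); // from-session
    }
}
```

The real chain is assembled the same way in the dsProcessor() bean of the auto-configuration class, with SpEL as the final link.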
There is also the DynamicDataSourceContextHolder we saw earlier when obtaining a connection. Finally, let's look at this class's source:
```java
/**
 * @author TaoYu Kanyuxia
 * @since 1.0.0
 */
public final class DynamicDataSourceContextHolder {

    /**
     * Why a linked structure (really a stack)?
     * <pre>
     * To support nested switching: suppose services A, B and C use different
     * data sources, A calls a method of B, and B's method calls a method of C.
     * The switches happen level by level and form a chain. Storing a single
     * value per thread cannot satisfy this; a last-in-first-out stack is needed.
     * </pre>
     */
    private static final ThreadLocal<Deque<String>> LOOKUP_KEY_HOLDER =
            new NamedThreadLocal<Deque<String>>("dynamic-datasource") {
                @Override
                protected Deque<String> initialValue() {
                    return new ArrayDeque<>();
                }
            };

    private DynamicDataSourceContextHolder() {
    }

    /**
     * Get the current thread's data source.
     *
     * @return the data source name
     */
    public static String peek() {
        return LOOKUP_KEY_HOLDER.get().peek();
    }

    /**
     * Set the current thread's data source.
     * <p>
     * Do not call manually unless necessary.
     * </p>
     *
     * @param ds the data source name
     */
    public static void push(String ds) {
        LOOKUP_KEY_HOLDER.get().push(StringUtils.isEmpty(ds) ? "" : ds);
    }

    /** Pop the current data source and clean up once the stack is empty. */
    public static void poll() {
        Deque<String> deque = LOOKUP_KEY_HOLDER.get();
        deque.poll();
        if (deque.isEmpty()) {
            LOOKUP_KEY_HOLDER.remove();
        }
    }

    /**
     * Force-clear the thread-local.
     * <p>
     * Prevents memory leaks.
     * </p>
     */
    public static void clear() {
        LOOKUP_KEY_HOLDER.remove();
    }
}
```
The main reason a stack is used here is nested data source switching: the innermost data source must be released first and the outermost last, which is exactly the last-in-first-out behavior of a stack.
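The nesting behavior can be simulated with a plain ArrayDeque (the service and data source names below are illustrative):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class NestedSwitchSketch {
    // Minimal sketch of why a stack is needed: service A (master) calls
    // service B (slave); when B returns, A must see "master" again.
    public static void main(String[] args) {
        Deque<String> holder = new ArrayDeque<>();

        holder.push("master");             // enter A's @DS("master") method
        holder.push("slave");              // A calls B's @DS("slave") method
        System.out.println(holder.peek()); // slave  -> B runs against slave
        holder.poll();                     // B finishes, pop "slave"
        System.out.println(holder.peek()); // master -> A continues on master
        holder.poll();                     // A finishes, stack is empty again
    }
}
```

This is exactly the push/peek/poll sequence the interceptor performs around invocation.proceed(), one level per nested @DS call.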
The overall flow
Maybe you are still a little dizzy — it is a bit convoluted, and that's normal. If you want to study it thoroughly, I suggest opening your IDE and walking through the source alongside this article. Here is an overall flow chart to give you the general idea:
Conclusion
Reading source code improves your ability to read code in general, which I think is very important: when you join a new company you are unfamiliar with its projects, so you have to learn about them from documentation and code — and you have to read the code in order to modify and extend it.
This article's source code analysis only touches the framework's core code, so it is not very difficult; interested readers can go through it a few more times. Multiple data sources are also a common scenario in everyday projects.
Thank you very much for reading and I hope this article has been helpful and enlightening to you.
Please give this article a thumbs-up if you found it useful — your likes are my biggest motivation to keep writing.
I'm a programmer trying to be remembered. See you next time!
My ability is limited; if anything here is wrong or inappropriate, please point it out so we can learn together!