Summary:

For the technology stack, Spring Boot 2.0 is adopted as the underlying framework, mainly so that Spring Cloud can be brought in later for further learning and expansion. Spring Boot 2.0 is also built on Spring 5, which gives us a preview of some of Spring 5's new features. The individual techniques will be covered in their own blog posts.

Project GitHub address: Spring-blog

Introduce the directory structure:

  • Spring-blog (the parent project)
  • Spring-blog-common (Util module)
  • Spring-blog-business (Repository module)
  • Spring-blog-api (Web module)
  • Spring-blog-webflux (Web module based on Spring Boot 2.0)

To make the contents of this module easier to follow, the demo code is kept in a separate Spring Boot project:

Github address: Sample code

1. DataSource

Before we begin, we need to set up our runtime environment. For integrating MyBatis with Spring Boot, refer to the earlier tutorial (see the portal link). Without going into detail here, let's first look at our directory structure:

Spring Boot loads a DataSource automatically once we configure the database connection information in application.properties. But if we want read/write separation, we have to know how to configure our own data sources.

First let’s look at the information in the configuration file:

spring.datasource.url=jdbc:mysql://localhost:3306/charles_blog2
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.jdbc.Driver

# Alias scan directory
mybatis.type-aliases-package=com.jaycekon.demo.model
# mapper.xml scan directory
mybatis.mapper-locations=classpath:mybatis-mappers/*.xml

# tkMapper help tool
mapper.mappers=com.jaycekon.demo.MyMapper
mapper.not-empty=false
mapper.identity=MYSQL

1.1 DataSourceBuilder

Let’s first look at using the DataSourceBuilder to build a DataSource:

@Configuration
@MapperScan("com.jaycekon.demo.mapper")
@EnableTransactionManagement
public class SpringJDBCDataSource {

    /**
     * Quickly create a DataSource via Spring JDBC.
     *
     * Expected property format:
     * spring.datasource.master.jdbcurl=jdbc:mysql://localhost:3306/charles_blog
     * spring.datasource.master.username=root
     * spring.datasource.master.password=root
     * spring.datasource.master.driver-class-name=com.mysql.jdbc.Driver
     *
     * @return DataSource
     */
    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }
}

A few things to note when building a DataSource with DataSourceBuilder:

  • DataSourceBuilder only recognizes the properties jdbcurl, username, password, and driver-class-name in the configuration file, so we need the @ConfigurationProperties annotation on the method.

  • The database connection URL property must be named jdbcurl.

  • The connection pool used is com.zaxxer.hikari.HikariDataSource.

When we run the unit test, we can see the DataSource being created and then shut down; a minimal test sketch follows.
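
For example, a minimal test sketch (assuming spring-boot-starter-test with JUnit 4; the test class name is illustrative) that checks the injected DataSource is indeed a HikariDataSource:

import static org.junit.Assert.assertTrue;

import javax.sql.DataSource;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

import com.zaxxer.hikari.HikariDataSource;

@RunWith(SpringRunner.class)
@SpringBootTest
public class DataSourceTest {

    @Autowired
    private DataSource dataSource;

    @Test
    public void dataSourceIsHikari() {
        // DataSourceBuilder defaults to HikariCP on the Spring Boot 2.0 classpath
        assertTrue(dataSource instanceof HikariDataSource);
    }
}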

1.2 DruidDataSource

In addition to the builder approach above, we can also create a DataSource with the Druid connection pool open-sourced by Alibaba:

@Configuration
@EnableTransactionManagement
public class DruidDataSourceConfig {

    @Autowired
    private DataSourceProperties properties;

    @Bean
    public DataSource dataSource() throws Exception {
        DruidDataSource dataSource = new DruidDataSource();
        dataSource.setUrl(properties.getUrl());
        dataSource.setDriverClassName(properties.getDriverClassName());
        dataSource.setUsername(properties.getUsername());
        dataSource.setPassword(properties.getPassword());
        dataSource.setInitialSize(5);
        dataSource.setMinIdle(5);
        dataSource.setMaxActive(100);
        dataSource.setMaxWait(60000);
        dataSource.setTimeBetweenEvictionRunsMillis(60000);
        dataSource.setMinEvictableIdleTimeMillis(300000);
        dataSource.setValidationQuery("SELECT 'x'");
        dataSource.setTestWhileIdle(true);
        dataSource.setTestOnBorrow(false);
        dataSource.setTestOnReturn(false);
        dataSource.setPoolPreparedStatements(true);
        dataSource.setMaxPoolPreparedStatementPerConnectionSize(20);
        dataSource.setFilters("stat,wall");
        return dataSource;
    }
}

Using DruidDataSource as the connection pool looks more cumbersome, but in return it is more configurable. We can read the values from application.properties through DataSourceProperties:

spring.datasource.url=jdbc:mysql://localhost:3306/charles_blog2
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.jdbc.Driver

Let's take a look at the DataSourceProperties source code:

@ConfigurationProperties(prefix = "spring.datasource")
public class DataSourceProperties
        implements BeanClassLoaderAware, EnvironmentAware, InitializingBean

As you can see, the spring.datasource prefix is already declared by the annotation in the source code.

Besides DataSourceProperties, we can also read the configuration with the familiar Environment class:

@Autowired
private Environment env;

// for example:
env.getProperty("spring.datasource.write")

2. Configure multiple data sources

The following steps are required to configure multiple data sources:

2.1 DatabaseType (data source names)

An enum is used to distinguish the read and write data sources:

public enum DatabaseType {
    master("write"), slave("read");


    DatabaseType(String name) {
        this.name = name;
    }

    private String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    @Override
    public String toString() {
        return "DatabaseType{" +
                "name='" + name + '\''+'}'; }}Copy the code

2.2 DatabaseContextHolder

This class records the data source used by the current thread, using a ThreadLocal:

public class DatabaseContextHolder {

    private static final ThreadLocal<DatabaseType> contextHolder = new ThreadLocal<>();

    public static void setDatabaseType(DatabaseType type) {
        contextHolder.set(type);
    }

    public static DatabaseType getDatabaseType() {
        return contextHolder.get();
    }
}

2.3 DynamicDataSource

This class extends AbstractRoutingDataSource and is used to manage our data sources; the key is implementing the determineCurrentLookupKey method. We'll detail how this class manages multiple data sources later.

public class DynamicDataSource extends AbstractRoutingDataSource {


    @Nullable
    @Override
    protected Object determineCurrentLookupKey() {
        DatabaseType type = DatabaseContextHolder.getDatabaseType();
        logger.info("====================dataSource ==========" + type);
        return type;
    }
}

2.4 DataSourceConfig

The final step is to configure our data source and place it in a DynamicDataSource:

@Configuration
@MapperScan("com.jaycekon.demo.mapper")
@EnableTransactionManagement
public class DataSourceConfig {

    @Autowired
    private DataSourceProperties properties;

    // Environment is needed below to read the MyBatis properties
    @Autowired
    private Environment env;

    /**
     * Quickly create a DataSource via Spring JDBC.
     *
     * Expected property format:
     * spring.datasource.master.jdbcurl=jdbc:mysql://localhost:3306/charles_blog
     * spring.datasource.master.username=root
     * spring.datasource.master.password=root
     * spring.datasource.master.driver-class-name=com.mysql.jdbc.Driver
     *
     * @return DataSource
     */
    @Bean(name = "masterDataSource")
    @Qualifier("masterDataSource")
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public DataSource masterDataSource() {
        return DataSourceBuilder.create().build();
    }

    /**
     * Create a DruidDataSource manually, reading from DataSourceProperties:
     * spring.datasource.url=jdbc:mysql://localhost:3306/charles_blog
     * spring.datasource.username=root
     * spring.datasource.password=root
     * spring.datasource.driver-class-name=com.mysql.jdbc.Driver
     *
     * @return DataSource
     * @throws SQLException
     */
    @Bean(name = "slaveDataSource")
    @Qualifier("slaveDataSource")
    public DataSource slaveDataSource() throws SQLException {
        DruidDataSource dataSource = new DruidDataSource();
        dataSource.setUrl(properties.getUrl());
        dataSource.setDriverClassName(properties.getDriverClassName());
        dataSource.setUsername(properties.getUsername());
        dataSource.setPassword(properties.getPassword());
        dataSource.setInitialSize(5);
        dataSource.setMinIdle(5);
        dataSource.setMaxActive(100);
        dataSource.setMaxWait(60000);
        dataSource.setTimeBetweenEvictionRunsMillis(60000);
        dataSource.setMinEvictableIdleTimeMillis(300000);
        dataSource.setValidationQuery("SELECT 'x'");
        dataSource.setTestWhileIdle(true);
        dataSource.setTestOnBorrow(false);
        dataSource.setTestOnReturn(false);
        dataSource.setPoolPreparedStatements(true);
        dataSource.setMaxPoolPreparedStatementPerConnectionSize(20);
        dataSource.setFilters("stat,wall");
        return dataSource;
    }

    /**
     * Put both data sources into the DynamicDataSource:
     * master uses HikariDataSource, slave uses DruidDataSource.
     *
     * @param master the master (write) data source
     * @param slave  the slave (read) data source
     * @return DynamicDataSource
     */
    @Bean
    @Primary
    public DynamicDataSource dataSource(@Qualifier("masterDataSource") DataSource master,
                                        @Qualifier("slaveDataSource") DataSource slave) {
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DatabaseType.master, master);
        targetDataSources.put(DatabaseType.slave, slave);

        DynamicDataSource dataSource = new DynamicDataSource();
        // setTargetDataSources and setDefaultTargetDataSource come from AbstractRoutingDataSource
        dataSource.setTargetDataSources(targetDataSources);
        // The slave is registered as the default data source
        dataSource.setDefaultTargetDataSource(slave);
        return dataSource;
    }

    @Bean
    public SqlSessionFactory sqlSessionFactory(@Qualifier("masterDataSource") DataSource myTestDbDataSource,
                                               @Qualifier("slaveDataSource") DataSource myTestDb2DataSource) throws Exception {
        SqlSessionFactoryBean fb = new SqlSessionFactoryBean();
        fb.setDataSource(this.dataSource(myTestDbDataSource, myTestDb2DataSource));
        fb.setTypeAliasesPackage(env.getProperty("mybatis.type-aliases-package"));
        fb.setMapperLocations(new PathMatchingResourcePatternResolver()
                .getResources(env.getProperty("mybatis.mapper-locations")));
        return fb.getObject();
    }
}

The code block above is long, so let's break it down:

  • The masterDataSource and slaveDataSource methods create the two data sources, backed by HikariDataSource and DruidDataSource respectively
  • The dataSource method puts both data sources into a DynamicDataSource so they can be managed together (a usage sketch follows this list)
  • The sqlSessionFactory method builds the MyBatis SqlSessionFactory on top of the DynamicDataSource, so every mapper call goes through the routing data source
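
Before the aspect in section 3 automates the choice, the routing can also be driven by hand through DatabaseContextHolder. A minimal sketch, assuming a hypothetical UserMapper with a selectAll() method and a User model (neither is shown in this post):

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class UserService {

    @Autowired
    private UserMapper userMapper;   // hypothetical mapper interface, not shown in this post

    public List<User> listUsers() {
        // Record the desired route for the current thread...
        DatabaseContextHolder.setDatabaseType(DatabaseType.slave);
        // ...so DynamicDataSource.determineCurrentLookupKey() returns DatabaseType.slave for this call
        return userMapper.selectAll();
    }
}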

2.5 UserMapperTest

Let's take a quick look at how the DataSource is created:

First we can see that both data sources are built, a HikariDataSource and a DruidDataSource, and that both are put into targetDataSources, with the slave registered as the default (defaultTargetDataSource).

Then we reach the data source routing:

The decision is made in AbstractRoutingDataSource's determineTargetDataSource() method, which calls the determineCurrentLookupKey() we implemented in DynamicDataSource to decide which data source to use. If no type has been set for the current thread, the default data source is used, which is the DruidDataSource we just registered.
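
For reference, the routing inside AbstractRoutingDataSource boils down to roughly the following (a paraphrased sketch, not the actual Spring source):

// Paraphrased sketch of AbstractRoutingDataSource's routing, not the real Spring code
protected DataSource determineTargetDataSource() {
    // Our DynamicDataSource returns the DatabaseType recorded in DatabaseContextHolder
    Object lookupKey = determineCurrentLookupKey();
    DataSource dataSource = this.resolvedDataSources.get(lookupKey);
    if (dataSource == null) {
        // No type set for the current thread: fall back to the default (the slave / Druid pool here)
        dataSource = this.resolvedDefaultDataSource;
    }
    if (dataSource == null) {
        throw new IllegalStateException("Cannot determine target DataSource for lookup key [" + lookupKey + "]");
    }
    return dataSource;
}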

In the final result of the code run:

We can see that we did use the default data source that we set.

3. Read/write separation

After this long journey we finally reach the read/write separation module. First we need to add some configuration:

spring.datasource.read = get,select,count,list,query
spring.datasource.write = add,create,update,delete,remove,insert

These two properties are used in the aspect to decide, by method-name prefix, which calls go to the read data source and which go to the write data source. For example, a mapper method named getUserById would hit the read data source, while insertUser would hit the write one.

3.1 Modifying DynamicDataSource

public class DynamicDataSource extends AbstractRoutingDataSource {

    static final Map<DatabaseType, List<String>> METHOD_TYPE_MAP = new HashMap<>();


    @Nullable
    @Override
    protected Object determineCurrentLookupKey() {
        DatabaseType type = DatabaseContextHolder.getDatabaseType();
        logger.info("====================dataSource ==========" + type);
        return type;
    }

    void setMethodType(DatabaseType type, String content) {
        List<String> list = Arrays.asList(content.split(","));
        METHOD_TYPE_MAP.put(type, list);
    }
}

Here we add a Map that records the method-name prefixes for reading and writing.

3.2 Modifying DataSourceConfig

In DataSourceConfig, when we build the DynamicDataSource, we also register the prefixes:

    @Bean
    @Primary
    public DynamicDataSource dataSource(@Qualifier("masterDataSource") DataSource master,
                                        @Qualifier("slaveDataSource") DataSource slave) {
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DatabaseType.master, master);
        targetDataSources.put(DatabaseType.slave, slave);

        DynamicDataSource dataSource = new DynamicDataSource();
        // setTargetDataSources and setDefaultTargetDataSource come from AbstractRoutingDataSource
        dataSource.setTargetDataSources(targetDataSources);
        // The slave is registered as the default data source
        dataSource.setDefaultTargetDataSource(slave);

        String read = env.getProperty("spring.datasource.read");
        dataSource.setMethodType(DatabaseType.slave, read);

        String write = env.getProperty("spring.datasource.write");
        dataSource.setMethodType(DatabaseType.master, write);

        return dataSource;
    }

3.3 DataSourceAspect

After configuring the read and write method prefixes, we need an aspect that sets the data source before a Mapper method is entered.

The key operation is DatabaseContextHolder.setDatabaseType(type); combined with the multi-data-source routing we built earlier, this is what selects the read or write data source.

@Aspect
@Component
@EnableAspectJAutoProxy(proxyTargetClass = true)
public class DataSourceAspect {
    private static Logger logger = LoggerFactory.getLogger(DataSourceAspect.class);

    @Pointcut("execution(* com.jaycekon.demo.mapper.*.*(..) )")
    public void aspect() {

    }


    @Before("aspect()")
    public void before(JoinPoint point) {
        String className = point.getTarget().getClass().getName();
        String method = point.getSignature().getName();
        String args = StringUtils.join(point.getArgs(), ",");
        logger.info("className:{}, method:{}, args:{} ", className, method, args);
        try {
            for (DatabaseType type : DatabaseType.values()) {
                List<String> values = DynamicDataSource.METHOD_TYPE_MAP.get(type);
                for (String key : values) {
                    if (method.startsWith(key)) {
                        logger.info(">>{} method uses data source :{}<<", method, key);
                        DatabaseContextHolder.setDatabaseType(type);
                        DatabaseType types = DatabaseContextHolder.getDatabaseType();
                        logger.info(">>{} method uses data source :{}<<", method, types); } } } } catch (Exception e) { logger.error(e.getMessage(), e); }}}Copy the code

3.4 UserMapperTest

After the test method starts, execution enters the aspect, which sets the data source type based on the method name; a minimal test sketch follows.
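
A minimal sketch of such a test (UserMapper, its getUser/insertUser methods, and the User model are assumed here and not shown in this post; with the prefixes configured above, getUser routes to the slave and insertUser to the master):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class UserMapperTest {

    @Autowired
    private UserMapper userMapper;   // hypothetical mapper in com.jaycekon.demo.mapper, matched by the aspect's pointcut

    @Test
    public void getUsesReadDataSource() {
        // "get" is listed in spring.datasource.read, so the aspect selects DatabaseType.slave
        userMapper.getUser(1L);
    }

    @Test
    public void insertUsesWriteDataSource() {
        // "insert" is listed in spring.datasource.write, so the aspect selects DatabaseType.master
        userMapper.insertUser(new User());
    }
}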

Then determineTargetDataSource() is called to pick the actual data source:

Running result:

4. Final words

If you found this helpful, please give the blogger a Star or fork on GitHub.

Spring-blog project GitHub address: Spring-blog

Example code Github address: Example code

Finally, I'd like to share a fellow student's public account (Java tutorial); you are welcome to follow it. It mainly shares interview material (see the blogger's earlier articles), Alibaba's open source technology, and life at Alibaba. If you'd like to exchange interview experience, add my personal WeChat (JayCE-K) to join the study group.