Summary:
In 2018, I started looking for something new to do in my spare time. This time I'm going to start a personal blog and refine my skills along the way. This series of posts will only cover the technical ideas that are worth sharing, rather than documenting the development process like a running tally.
In terms of the technology stack, Spring Boot 2.0 is used as the underlying framework, mainly so that Spring Cloud can be added later for further learning and expansion. Spring Boot 2.0 is also based on Spring 5, so we get a preview of some of the new Spring 5 features. Follow-up topics will be discussed in their corresponding posts.
Project GitHub address: Spring-blog
Here is the directory structure:
- Spring-blog (the parent project)
- Spring-blog-common (Util module)
- Spring-blog-business (Repository module)
- Spring-blog-api (Web module)
- Spring-blog-webflux (Web module based on Spring Boot 2.0)
To make it easier to follow the content of this module, the demo code is kept in a separate Spring Boot project:
Github address: sample code
1. DataSource
Before we start, we need to set up the runtime environment. For an introduction to using MyBatis with Spring Boot, refer to the portal link. We won't go into the details here; let's look at the directory structure first:
Anyone who has used Spring Boot knows that once the database connection information is configured in application.properties, Spring Boot automatically creates a DataSource for us. However, if we need read/write separation, we must know how to configure our own data sources.
First let’s look at the information in the configuration file:
spring.datasource.url=jdbc:mysql://localhost:3306/charles_blog2
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
# Alias scan directory
mybatis.type-aliases-package=com.jaycekon.demo.model
# Mapper.xml scan directory
mybatis.mapper-locations=classpath:mybatis-mappers/*.xml
# TkMapper help tool
mapper.mappers=com.jaycekon.demo.MyMapper
mapper.not-empty=false
mapper.identity=MYSQL
1.1 DataSourceBuilder
Let’s first look at building a DataSource using DataSourceBuilder:
@Configuration
@MapperScan("com.jaycekon.demo.mapper")
@EnableTransactionManagement
public class SpringJDBCDataSource {

    /**
     * Quickly create a DataSource through Spring JDBC (DataSourceBuilder).
     * Parameter format:
     *   spring.datasource.master.jdbc-url=jdbc:mysql://localhost:3306/charles_blog
     *   spring.datasource.master.username=root
     *   spring.datasource.master.password=root
     *   spring.datasource.master.driver-class-name=com.mysql.jdbc.Driver
     * @return DataSource
     */
    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }
}
As you can see from the code, building a DataSource using DataSourceBuilder is very simple, but it’s important to note:
- DataSourceBuilder only recognizes a handful of property names in the configuration file (jdbc-url, username, password, driver-class-name), so we need to annotate the bean method with @ConfigurationProperties to map our custom prefix.
- The database connection URL property must be named jdbc-url (bound to jdbcUrl) rather than url.
- The connection pool used is com.zaxxer.hikari.HikariDataSource, the Spring Boot 2.0 default; see the sketch below for making the pool type explicit.
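For reference, here is a minimal sketch (not from the original project) that pins the pool type explicitly instead of relying on classpath detection; it uses the same spring.datasource.master.* keys as the comment above:

import com.zaxxer.hikari.HikariDataSource;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MasterDataSourceConfig {

    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public HikariDataSource masterDataSource() {
        // jdbc-url, username, password and driver-class-name under the prefix
        // are bound onto the returned HikariDataSource by @ConfigurationProperties.
        return DataSourceBuilder.create()
                .type(HikariDataSource.class)
                .build();
    }
}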
When we run the unit test, we can see the DataSource being created and closed.
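For illustration, a minimal sketch of such a test; the class name and assertion are my own and not from the original project:

import static org.junit.Assert.assertNotNull;

import java.sql.Connection;
import javax.sql.DataSource;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class DataSourceTest {

    @Autowired
    private DataSource dataSource;

    @Test
    public void testConnection() throws Exception {
        // Borrowing a connection is enough to see the pool being created
        // (and closed again when the context shuts down) in the log output.
        try (Connection connection = dataSource.getConnection()) {
            assertNotNull(connection);
        }
    }
}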
1.2 DruidDataSource
In addition to the build method above, we can create our DataSource with the Druid database connection pool provided by Alibaba:
@Configuration
@EnableTransactionManagement
public class DruidDataSourceConfig {
    @Autowired
    private DataSourceProperties properties;

    @Bean
    public DataSource dataSource() throws Exception {
        DruidDataSource dataSource = new DruidDataSource();
        dataSource.setUrl(properties.getUrl());
        dataSource.setDriverClassName(properties.getDriverClassName());
        dataSource.setUsername(properties.getUsername());
        dataSource.setPassword(properties.getPassword());
        dataSource.setInitialSize(5);
        dataSource.setMinIdle(5);
        dataSource.setMaxActive(100);
        dataSource.setMaxWait(60000);
        dataSource.setTimeBetweenEvictionRunsMillis(60000);
        dataSource.setMinEvictableIdleTimeMillis(300000);
        dataSource.setValidationQuery("SELECT 'x'");
        dataSource.setTestWhileIdle(true);
        dataSource.setTestOnBorrow(false);
        dataSource.setTestOnReturn(false);
        dataSource.setPoolPreparedStatements(true);
        dataSource.setMaxPoolPreparedStatementPerConnectionSize(20);
        dataSource.setFilters("stat,wall");
        return dataSource;
    }
}
Using DruidDataSource as the connection pool looks like more work, but it gives us much finer control over the pool. We can use DataSourceProperties to read the configuration in application.properties:
spring.datasource.url=jdbc:mysql://localhost:3306/charles_blog2
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
How does DataSourceProperties know where to find these values? A look at its source code answers that:
@ConfigurationProperties(prefix = "spring.datasource")
public class DataSourceProperties
implements BeanClassLoaderAware, EnvironmentAware, InitializingBean
As you can see, the prefix format is already marked by default in the source code.
In addition to using DataSourceProperties, we can also read configuration values through the generic Environment class:
@Autowired
private Environment env;
env.getProperty("spring.datasource.write")
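As a small sketch of that approach (the class name is hypothetical), the Environment can be injected into a configuration class and queried directly:

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;

@Configuration
public class EnvDataSourceConfig {

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() {
        // Read the connection settings straight from the Environment
        // instead of going through DataSourceProperties.
        return DataSourceBuilder.create()
                .url(env.getProperty("spring.datasource.url"))
                .username(env.getProperty("spring.datasource.username"))
                .password(env.getProperty("spring.datasource.password"))
                .driverClassName(env.getProperty("spring.datasource.driver-class-name"))
                .build();
    }
}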
2. Configure multiple data sources
To configure multiple data sources, perform the following steps:
2.1 DatabaseType Data source name
Enumeration types are used directly to distinguish between read and write data sources
public enum DatabaseType {
    master("write"), slave("read");

    DatabaseType(String name) {
        this.name = name;
    }

    private String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    @Override
    public String toString() {
        return "DatabaseType{" +
                "name='" + name + '\'' +
                '}';
    }
}
2.2 DatabaseContextHolder
This class records which data source the current thread should use, storing the value in a ThreadLocal:
public class DatabaseContextHolder {
    private static final ThreadLocal<DatabaseType> contextHolder = new ThreadLocal<>();

    public static void setDatabaseType(DatabaseType type) {
        contextHolder.set(type);
    }

    public static DatabaseType getDatabaseType() {
        return contextHolder.get();
    }
}
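For illustration, here is a minimal sketch of the holder's contract (the demo class is hypothetical; in this article the value is actually set by the aspect in section 3.3). One design note: since the value lives in a ThreadLocal and server threads are pooled, it is worth considering a small clear method on the holder that calls ThreadLocal.remove() once a request is finished.

public class HolderDemo {
    // Hypothetical demo, shown only to illustrate how the holder is consumed.
    public static void main(String[] args) {
        DatabaseContextHolder.setDatabaseType(DatabaseType.slave);       // before a query: pick the read source
        DatabaseType current = DatabaseContextHolder.getDatabaseType();  // what DynamicDataSource will look up
        System.out.println(current);                                     // prints DatabaseType{name='read'}
    }
}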
2.3 DynamicDataSource
This class extends AbstractRoutingDataSource and manages our data sources; the main method to implement is determineCurrentLookupKey(). How this class manages multiple data sources is covered below.
public class DynamicDataSource extends AbstractRoutingDataSource {

    @Nullable
    @Override
    protected Object determineCurrentLookupKey() {
        DatabaseType type = DatabaseContextHolder.getDatabaseType();
        logger.info("====================dataSource ==========" + type);
        return type;
    }
}
2.4 DataSourceConfig
The last step is to configure our data source and place it in the DynamicDataSource:
@Configuration
@MapperScan("com.jaycekon.demo.mapper")
@EnableTransactionManagement
public class DataSourceConfig {
    @Autowired
    private DataSourceProperties properties;

    @Autowired
    private Environment env;

    /**
     * Quickly create a DataSource through Spring JDBC (DataSourceBuilder).
     * Parameter format:
     *   spring.datasource.master.jdbc-url=jdbc:mysql://localhost:3306/charles_blog
     *   spring.datasource.master.username=root
     *   spring.datasource.master.password=root
     *   spring.datasource.master.driver-class-name=com.mysql.jdbc.Driver
     */
    @Bean(name = "masterDataSource")
    @Qualifier("masterDataSource")
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public DataSource masterDataSource() {
        return DataSourceBuilder.create().build();
    }

    /**
     * Manually create a DruidDataSource, using DataSourceProperties to read the configuration.
     * Parameter format:
     *   spring.datasource.url=jdbc:mysql://localhost:3306/charles_blog
     *   spring.datasource.username=root
     *   spring.datasource.password=root
     *   spring.datasource.driver-class-name=com.mysql.jdbc.Driver
     */
    @Bean(name = "slaveDataSource")
    @Qualifier("slaveDataSource")
    public DataSource slaveDataSource() throws SQLException {
        DruidDataSource dataSource = new DruidDataSource();
        dataSource.setUrl(properties.getUrl());
        dataSource.setDriverClassName(properties.getDriverClassName());
        dataSource.setUsername(properties.getUsername());
        dataSource.setPassword(properties.getPassword());
        dataSource.setInitialSize(5);
        dataSource.setMinIdle(5);
        dataSource.setMaxActive(100);
        dataSource.setMaxWait(60000);
        dataSource.setTimeBetweenEvictionRunsMillis(60000);
        dataSource.setMinEvictableIdleTimeMillis(300000);
        dataSource.setValidationQuery("SELECT 'x'");
        dataSource.setTestWhileIdle(true);
        dataSource.setTestOnBorrow(false);
        dataSource.setTestOnReturn(false);
        dataSource.setPoolPreparedStatements(true);
        dataSource.setMaxPoolPreparedStatementPerConnectionSize(20);
        dataSource.setFilters("stat,wall");
        return dataSource;
    }

    /**
     * Build the DynamicDataSource from the two data sources.
     * @param master the masterDataSource bean
     * @param slave  the slaveDataSource bean
     * @return DynamicDataSource
     */
    @Bean
    @Primary
    public DynamicDataSource dataSource(@Qualifier("masterDataSource") DataSource master,
                                        @Qualifier("slaveDataSource") DataSource slave) {
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DatabaseType.master, master);
        targetDataSources.put(DatabaseType.slave, slave);
        DynamicDataSource dataSource = new DynamicDataSource();
        // setTargetDataSources / setDefaultTargetDataSource are methods of AbstractRoutingDataSource
        dataSource.setTargetDataSources(targetDataSources);
        dataSource.setDefaultTargetDataSource(slave); // slave is the default data source
        return dataSource;
    }

    @Bean
    public SqlSessionFactory sqlSessionFactory(@Qualifier("masterDataSource") DataSource myTestDbDataSource,
                                               @Qualifier("slaveDataSource") DataSource myTestDb2DataSource) throws Exception {
        SqlSessionFactoryBean fb = new SqlSessionFactoryBean();
        fb.setDataSource(this.dataSource(myTestDbDataSource, myTestDb2DataSource));
        fb.setTypeAliasesPackage(env.getProperty("mybatis.type-aliases-package"));
        fb.setMapperLocations(new PathMatchingResourcePatternResolver()
                .getResources(env.getProperty("mybatis.mapper-locations")));
        return fb.getObject();
    }
}
The above code block is relatively long, let’s parse it:
- masterDataSource and slaveDataSource create the two data sources, using HikariDataSource and DruidDataSource respectively.
- The DynamicDataSource bean puts the two data sources into DynamicDataSource so they can be managed in one place.
- The sqlSessionFactory method hands the DynamicDataSource to MyBatis, so every mapper call goes through the routing data source.
2.5 UserMapperTest
Let’s take a quick look at the DataSource creation process:
First, we can see that our two data sources are built as a HikariDataSource and a DruidDataSource, and both are put into targetDataSources; slave is the default data source (defaultTargetDataSource).
Next, let's look at how the data source is selected:
The decision is made in AbstractRoutingDataSource's determineTargetDataSource() method, which calls our DynamicDataSource's determineCurrentLookupKey() to decide which data source to use. If no data source type has been set, the default data source is used, which is the DruidDataSource (slave) we just configured.
Finally, in the run results:
We can see that the default data source we set up is indeed used.
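The post walks through this with a unit test; below is a rough sketch of what it might look like (UserMapper's selectAll method is an assumption based on the tk.mybatis MyMapper base interface, and the imports match the earlier test sketch):

@RunWith(SpringRunner.class)
@SpringBootTest
public class UserMapperTest {

    @Autowired
    private UserMapper userMapper;

    @Test
    public void testSelect() {
        // No DatabaseType has been set on this thread yet, so DynamicDataSource
        // falls back to the default target data source (the Druid slave).
        System.out.println(userMapper.selectAll());
    }
}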
3. Read/write separation
Now we come to the read/write separation module. First, we need to add some configuration:
spring.datasource.read = get,select,count,list,query
spring.datasource.write = add,create,update,delete,remove,insert
These two properties are used in the aspect to decide which method-name prefixes should use the read data source and which should use the write data source.
3.1 Modifying DynamicDataSource
public class DynamicDataSource extends AbstractRoutingDataSource {

    static final Map<DatabaseType, List<String>> METHOD_TYPE_MAP = new HashMap<>();

    @Nullable
    @Override
    protected Object determineCurrentLookupKey() {
        DatabaseType type = DatabaseContextHolder.getDatabaseType();
        logger.info("====================dataSource ==========" + type);
        return type;
    }

    void setMethodType(DatabaseType type, String content) {
        List<String> list = Arrays.asList(content.split(","));
        METHOD_TYPE_MAP.put(type, list);
    }
}
Here we need to add a Map to record some read and write prefix information.
3.2 Modifying DataSourceConfig
In DataSourceConfig, we hand these prefixes to the DynamicDataSource:
@Bean
@Primary
public DynamicDataSource dataSource(@Qualifier("masterDataSource") DataSource master,
                                    @Qualifier("slaveDataSource") DataSource slave) {
    Map<Object, Object> targetDataSources = new HashMap<>();
    targetDataSources.put(DatabaseType.master, master);
    targetDataSources.put(DatabaseType.slave, slave);
    DynamicDataSource dataSource = new DynamicDataSource();
    // setTargetDataSources / setDefaultTargetDataSource are methods of AbstractRoutingDataSource
    dataSource.setTargetDataSources(targetDataSources);
    dataSource.setDefaultTargetDataSource(slave); // slave is the default data source
    // register the read/write method prefixes
    String read = env.getProperty("spring.datasource.read");
    dataSource.setMethodType(DatabaseType.slave, read);
    String write = env.getProperty("spring.datasource.write");
    dataSource.setMethodType(DatabaseType.master, write);
    return dataSource;
}
3.3 DataSourceAspect
After configuring the read and write method prefixes, we need an aspect that sets the data source before a Mapper method is entered:
The key call is DatabaseContextHolder.setDatabaseType(type); combined with the multi-data-source setup above, this is what switches between the read and write data sources.
@Aspect
@Component
@EnableAspectJAutoProxy(proxyTargetClass = true)
public class DataSourceAspect {
    private static Logger logger = LoggerFactory.getLogger(DataSourceAspect.class);

    @Pointcut("execution(* com.jaycekon.demo.mapper.*.*(..))")
    public void aspect() {
    }

    @Before("aspect()")
    public void before(JoinPoint point) {
        String className = point.getTarget().getClass().getName();
        String method = point.getSignature().getName();
        String args = StringUtils.join(point.getArgs(), ",");
        logger.info("className:{}, method:{}, args:{} ", className, method, args);
        try {
            for (DatabaseType type : DatabaseType.values()) {
                List<String> values = DynamicDataSource.METHOD_TYPE_MAP.get(type);
                for (String key : values) {
                    if (method.startsWith(key)) {
                        logger.info(">>{} method uses data source :{}<<", method, key);
                        DatabaseContextHolder.setDatabaseType(type);
                        DatabaseType types = DatabaseContextHolder.getDatabaseType();
                        logger.info(">>{} method uses data source :{}<<", method, types);
                    }
                }
            }
        } catch (Exception e) {
            logger.error(e.getMessage(), e);
        }
    }
}
3.4 UserMapperTest
When a mapper method is invoked, we enter the aspect and set the data source type according to the method name.
Then determineTargetDataSource() picks the corresponding data source:
Running results:
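Again, here is a rough sketch of a test that exercises both routes (getUser and insertUser are hypothetical mapper methods, named so their prefixes match the read and write lists above; the User entity and the imports follow the earlier test sketches):

@RunWith(SpringRunner.class)
@SpringBootTest
public class ReadWriteRoutingTest {

    @Autowired
    private UserMapper userMapper;

    @Test
    public void testRouting() {
        // "get..." matches spring.datasource.read, so the aspect sets the
        // slave (read) data source before this call.
        userMapper.getUser(1L);

        // "insert..." matches spring.datasource.write, so this call goes
        // to the master (write) data source.
        userMapper.insertUser(new User());
    }
}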
4. Write at the end
If you find this helpful, please give the project a Star or Fork on GitHub.
Spring-blog project GitHub address: Spring-blog
Github address: example code