Elasticsearch 7 installation
Steps
- Download Elasticsearch from the official website, then simply decompress the archive
Note:
1. The version used here is 7.5.1; the plugins installed later must match the ES version
2. Installing ES in Docker is recommended; for ease of demonstration, the Windows version is used directly here
- Enter the elasticsearch-7.5.1/bin directory and double-click elasticsearch.bat to start Elasticsearch
- To verify the startup, access port 9200 in a browser or with any other HTTP tool
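A quick command-line check (a sketch; the exact fields in the response vary by build):

curl http://127.0.0.1:9200
# a successful start returns cluster metadata as JSON, e.g.:
# {
#   "name" : "...",
#   "cluster_name" : "elasticsearch",
#   "version" : { "number" : "7.5.1", ... },
#   "tagline" : "You Know, for Search"
# }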
Commonly used commands
After ES starts successfully, you can issue operation commands to it using Postman (or curl)
1. Add or update indexes and their documents
Method 1: PUT /{index}/{document}/{id}, inserts the data if the id does not exist yet, updates it if it does (passing only the index creates the index)
Method 2: POST /{index}/{document}/{id}, the id can be omitted, in which case ES generates one
2. Obtain all documents
GET /{index}/{document}/_search
Such as: http://127.0.0.1:9200/newindex/newdoc/_search
3. Obtain the document with the specified ID
GET /{index}/{document}/{id}
Such as: http://127.0.0.1:9200/newindex/newdoc/1
4. Fuzzy query
GET /{index}/{document}/_search?q=*keyword*
Such as: http://127.0.0.1:9200/newindex/newdoc/_search?q=*Wang*
5. Delete the document
DELETE /{index}/{document}/{id}
Such as: http://127.0.0.1:9200/newindex/newdoc/1
More statements can be found on the official website
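For reference, a minimal curl walkthrough of the commands above (a sketch reusing the newindex/newdoc names; the document body is illustrative):

# 1. create or update document 1
curl -X PUT "http://127.0.0.1:9200/newindex/newdoc/1" -H "Content-Type: application/json" -d '{"name": "Wang er Gou", "info": "test data"}'
# 2. list all documents
curl "http://127.0.0.1:9200/newindex/newdoc/_search"
# 3. fetch document 1 by id
curl "http://127.0.0.1:9200/newindex/newdoc/1"
# 4. fuzzy query
curl "http://127.0.0.1:9200/newindex/newdoc/_search?q=*Wang*"
# 5. delete document 1
curl -X DELETE "http://127.0.0.1:9200/newindex/newdoc/1"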
Visualization tool
- To install elasticsearch-head, first download the source code
git clone https://github.com/mobz/elasticsearch-head.git
- Install the Grunt build tool globally
npm install -g grunt-cli
- Install dependencies
cd elasticsearch-head/
npm install
- Modify the Elasticsearch configuration file
vim ../elasticsearch-7.5.1/config/elasticsearch.yml
- Add the cross-origin (CORS) configuration at the end
http.cors.enabled: true
http.cors.allow-origin: "*"
- Start elasticsearch-head
cd -    # return to the elasticsearch-head root directory
grunt server
- Open localhost:9100 in the browser to view it
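If head shows no cluster data, you can check that CORS is actually enabled (a sketch; ES only sends the header when the request carries an Origin):

curl -s -i -H "Origin: http://localhost:9100" "http://localhost:9200" | grep -i "access-control"
# expected: Access-Control-Allow-Origin: *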
IK analyzer
Download
- Download the source from GitHub and build it, or directly download the zip; I chose the second option
- After decompressing, copy the folder into elasticsearch-7.5.1\plugins and name it ik
- Test IK Chinese word segmentation, as sketched below
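A sketch of such a test using the standard _analyze API (the sample text is arbitrary; ik_smart is the coarser-grained sibling of ik_max_word):

curl -X POST "http://127.0.0.1:9200/_analyze" -H "Content-Type: application/json" -d '{"analyzer": "ik_max_word", "text": "中华人民共和国"}'
# returns the text split into fine-grained Chinese terms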
Extending the segmentation with custom vocabulary
- Create custom.dic in \elasticsearch-7.5.1\plugins\ik\config;
- Add your own custom vocabulary;
- Modify the IKAnalyzer.cfg.xml file in the same directory and point the <entry key="ext_dict"> attribute at the custom dictionary (see the sketch after this list).
- Restart Elasticsearch for the custom dictionary to take effect.
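For reference, a sketch of what IKAnalyzer.cfg.xml looks like after the change (the plugin ships this template; only the ext_dict entry needs filling in):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>IK Analyzer extension configuration</comment>
    <!-- custom word dictionary, relative to this config directory -->
    <entry key="ext_dict">custom.dic</entry>
    <!-- custom stopword dictionary, left empty here -->
    <entry key="ext_stopwords"></entry>
</properties>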
Integrating with Spring Boot
Preparation
- Add the dependency
Spring Data Elasticsearch
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
- Add the configuration
spring:
  data:
    elasticsearch:
      cluster-nodes: 127.0.0.1:9300  # 9300 is the TCP transport port (9200 is the HTTP port)
Code
- New entity class
@Data
@Accessors(chain = true)
@Document(indexName = "school", type = "student") // indexName is the ES index name, type is the document type
public class Student implements Serializable {
    // document id
    // index = true: whether the field is indexed
    // type: the field type
    // analyzer = "ik_max_word": analyzer used at index time
    // searchAnalyzer = "ik_max_word": analyzer used at search time
    @Id
    private String id;
    // a Keyword field is not analyzed, so Text is used here to apply the IK analyzer
    @Field(type = FieldType.Text, analyzer = "ik_max_word", searchAnalyzer = "ik_max_word")
    private String name;
    private Integer age;
    @Field(type = FieldType.Double)
    private Double score;
    @Field(type = FieldType.Text, analyzer = "ik_max_word")
    private String info;
}
- Paging entity
@Data
@Accessors(chain = true)
public class QueryPage {
    /** Current page */
    private Integer current;
    /** Number of records per page */
    private Integer size;
}
- Data persistence layer
public interface EsRepository extends ElasticsearchRepository<Student, String> {
    /** Fuzzy query on student name or info */
    Page<Student> findByNameOrInfoLike(String name, String info, Pageable pageable);
}
- Business layer interface and its implementation
public interface EsService {
    /** Insert */
    void add(Student student);
    /** Batch insert */
    void addAll(List<Student> student);
    /** Fuzzy query */
    Page<Student> search(String keyword, QueryPage queryPage);
}
@Service
public class EsServiceImpl implements EsService {
    @Autowired
    private EsRepository esRepository;

    @Override
    public void add(Student student) {
        esRepository.save(student);
    }

    @Override
    public void addAll(List<Student> student) {
        esRepository.saveAll(student);
    }

    @Override
    public Page<Student> search(String keyword, QueryPage queryPage) {
        // ES page numbers start from 0, while the incoming page number starts from 1 (MyBatis-Plus convention)
        PageRequest pageRequest = PageRequest.of(queryPage.getCurrent() - 1, queryPage.getSize());
        return esRepository.findByNameOrInfoLike(keyword, keyword, pageRequest);
    }
}
- Write a test class
@SpringBootTest
public class EsServiceImplTest {
    @Autowired
    private EsService esService;

    @Test
    public void insert() {
        List<Student> students = new ArrayList<>();
        for (int i = 10; i <= 12; i++) {
            Student student = new Student();
            student.setId(i + "").setAge(10 + i).setName("Wang er Gou" + i).setScore(72.5 + i).setInfo("Your Majesty sent me to patrol the mountains." + i);
            students.add(student);
        }
        esService.addAll(students);
    }

    @Test
    public void fuzzySearch() {
        QueryPage queryPage = new QueryPage();
        queryPage.setCurrent(1).setSize(5);
        Page<Student> list = esService.search("er Gou 2", queryPage);
        list.forEach(System.out::println);
    }
}
Importing MySQL data into Elasticsearch
Installation
- Download Logstash from the official website; make sure the version matches ES
Configuration
- Decompress the package
- Copy \logstash-7.5.1\config\logstash-sample.conf within the same directory and rename the copy to logstash.conf
- Modify the configuration as follows
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  jdbc {
    # MySQL connection configuration
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/springboot_es?characterEncoding=UTF8"
    jdbc_user => "root"
    jdbc_password => "1234"
    jdbc_driver_library => "D:\Develop_Tools_Others\logstash-7.5.1\mysql-connector-java-5.1.26.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_page_size => "50000"
    # SQL statement selecting the data to be imported into Elasticsearch
    statement => "select id,name,age,score,info from t_student"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    # index name
    index => "school"
    # use the id column as the document id
    document_id => "%{id}"
  }
  stdout {
    # JSON output
    codec => json_lines
  }
}
- Create the springboot_es database and import the following script
SET NAMES utf8mb4;
SET FOREIGN_KEY_CHECKS = 0;
-- ----------------------------
-- Table structure for t_student
-- ----------------------------
DROP TABLE IF EXISTS `t_student`;
CREATE TABLE `t_student` (
  `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'primary key',
  `name` varchar(50) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL COMMENT 'student name',
  `age` int(11) NULL DEFAULT NULL COMMENT 'age',
  `score` double(255, 0) NULL DEFAULT NULL COMMENT 'score',
  `info` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL COMMENT 'info',
  PRIMARY KEY (`id`) USING BTREE
) ENGINE = InnoDB AUTO_INCREMENT = 4 CHARACTER SET = utf8mb4 COLLATE = utf8mb4_general_ci ROW_FORMAT = Dynamic;
-- ----------------------------
-- Records of t_student
-- ----------------------------
INSERT INTO `t_student` VALUES (1, 'Xiao Ming', 18, 88, 'Study hard');
INSERT INTO `t_student` VALUES (2, 'Xiao Hong', 17, 85, 'Day day up');
INSERT INTO `t_student` VALUES (3, 'Wang Er Gou', 30, 59, 'proletariat');
SET FOREIGN_KEY_CHECKS = 1;
Run
- Start ES first, then es-head
- Finally, start logstash with the following command:
D:\Develop_Tools_Others\logstash-7.5.1> .\bin\logstash.bat -f .\config\logstash.conf
- Access localhost:9600 to confirm that Logstash started successfully
- Check the console to see if data is synchronized
- Finally, open es-head to view the synchronized data
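A quick way to confirm the import from the command line (a sketch; the school index name comes from the Logstash config above):

curl "http://localhost:9200/school/_search?pretty"
# should return the three rows of t_student as ES documents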