In May 2021, the Meituan NLP Center open-sourced ASAP, to date the largest Chinese aspect-level sentiment analysis dataset built from real-world scenarios. The paper describing the dataset was accepted at NAACL 2021, and the dataset has been added to the Chinese open-source data plan, where it will join other open-source datasets in advancing Chinese information processing technology. This article has reviewed the evolution of Meituan's sentiment analysis technology and its application in typical business scenarios, covering document/sentence-level sentiment analysis, aspect-level sentiment analysis, and opinion triplet extraction. On the application side, both an online real-time prediction service and an offline batch prediction service have been built on top of these sentiment analysis capabilities. To date, the sentiment analysis service supports more than ten business scenarios within Meituan.
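The three granularities reviewed above can be illustrated with a minimal sketch. This uses plain Python data structures, not Meituan's actual service API; the sample review, aspect-category names, and labels are all hypothetical:

```python
# Illustrative only: the three granularities of sentiment analysis
# described above, shown for one hypothetical review sentence.

review = "The noodles were delicious but the service was slow."

# 1. Document/sentence-level: a single polarity for the whole text.
sentence_level = {"text": review, "polarity": "mixed"}

# 2. Aspect-level: one polarity per predefined aspect category
#    (category names here are made up for illustration).
aspect_level = {
    "food#quality": "positive",
    "service#attitude": "negative",
}

# 3. Opinion triplets: (aspect term, opinion term, polarity) spans
#    extracted directly from the review text.
triplets = [
    ("noodles", "delicious", "positive"),
    ("service", "slow", "negative"),
]

for aspect, opinion, polarity in triplets:
    print(f"{aspect} -> {opinion} [{polarity}]")
```

Each level trades coverage for precision: the sentence-level label is cheapest to produce but loses the per-aspect detail that the triplet representation preserves.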
References
- [1] github.com/Meituan-Dia…
- [2] Bu J, Ren L, Zheng S, et al. "ASAP: A Chinese Review Dataset Towards Aspect Category Sentiment Analysis and Rating Prediction." In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021.
- [3] www.luge.ai/
- [4] Zhang, L., S. Wang, and B. Liu. "Deep Learning for Sentiment Analysis: A Survey." Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery (2018): e1253.
- [5] Liu, Bing. "Sentiment Analysis and Opinion Mining." Synthesis Lectures on Human Language Technologies 5.1 (2012): 1-167.
- [6] Peng, Haiyun, et al. "Knowing What, How and Why: A Near Complete Solution for Aspect-Based Sentiment Analysis." In Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 34. No. 05. 2020.
- [7] Zhang, Chen, et al. "A Multi-task Learning Framework for Opinion Triplet Extraction." In Findings of the 2020 Conference on Empirical Methods in Natural Language Processing. 2020.
- [8] Yoon Kim. 2014. "Convolutional Neural Networks for Sentence Classification." arXiv preprint arXiv:1408.5882.
- [9] Peng Zhou, Wei Shi, Jun Tian, Zhenyu Qi, Bingchen Li, Hongwei Hao, and Bo Xu. 2016. "Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification." In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 207-212.
- [10] Devlin, Jacob, et al. "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." arXiv preprint arXiv:1810.04805 (2018).
- [11] Yang Yang, Jia Hao, et al. "Exploration and Practice of Meituan BERT."
- [12] Pontiki, Maria, et al. "SemEval-2016 Task 5: Aspect Based Sentiment Analysis." International Workshop on Semantic Evaluation. 2016.
- [13] Pontiki, M., et al. "SemEval-2014 Task 4: Aspect Based Sentiment Analysis." In Proceedings of the International Workshop on Semantic Evaluation. 2014.
- [14] Yequan Wang, Minlie Huang, and Li Zhao. 2016. "Attention-Based LSTM for Aspect-Level Sentiment Classification." In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pages 606-615.
- [15] Sara Sabour, Nicholas Frosst, and Geoffrey E. Hinton. 2017. "Dynamic Routing Between Capsules." In Advances in Neural Information Processing Systems, pages 3856-3866.
- [16] Chi Sun, Luyao Huang, and Xipeng Qiu. 2019. "Utilizing BERT for Aspect-Based Sentiment Analysis via Constructing Auxiliary Sentence." arXiv preprint arXiv:1903.09588.
- [17] Qingnan Jiang, Lei Chen, Ruifeng Xu, Xiang Ao, and Min Yang. 2019. "A Challenge Dataset and Effective Models for Aspect-Based Sentiment Analysis." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 6281-6286.
- [18] Wu, Zhen, et al. "Grid Tagging Scheme for End-to-End Fine-Grained Opinion Extraction." In Findings of the 2020 Conference on Empirical Methods in Natural Language Processing. 2020.
- [19] Liu, Yinhan, et al. "RoBERTa: A Robustly Optimized BERT Pretraining Approach." arXiv preprint arXiv:1907.11692 (2019).
- [20] Clark, Kevin, et al. "ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators." arXiv preprint arXiv:2003.10555 (2020).
- [21] Timothy Dozat and Christopher D. Manning. 2017. "Deep Biaffine Attention for Neural Dependency Parsing." In 5th International Conference on Learning Representations, ICLR 2017.
About the authors
Ren Lei, Jia Hao, Zhang Chen, Yang Yang, Meng Xue, Ma Fang, Jingang, Wu Wei, and others are all from the NLP Center of Meituan's Platform Search & NLP Department.
Recruitment information
The Meituan Search & NLP Department / NLP Center is the core team responsible for the research and development of Meituan's artificial intelligence technology, with the mission of building world-class natural language processing core technology and service capabilities.
The NLP Center is recruiting experts in natural language processing and machine learning algorithms. Interested candidates are welcome to send their resumes to [email protected]. The specific requirements are as follows.
Responsibilities
- Forward-looking exploration of pre-trained language models, including but not limited to knowledge-driven pre-training, task-based pre-training, multimodal pre-training, and cross-lingual pre-training;
- Training and performance optimization of very large models with more than 10 billion parameters;
- Model fine-tuning and exploration of emerging techniques, including but not limited to Prompt Tuning, Adapter Tuning, and other parameter-efficient transfer learning directions;
- Model compression techniques, including but not limited to quantization, pruning, tensor decomposition, knowledge distillation (KD), and neural architecture search (NAS);
- Applying pre-trained models in search, recommendation, advertising, and other business scenarios to achieve business objectives;
- Participating in the construction and promotion of the NLP platform within Meituan.
Requirements
- At least 2 years of relevant work experience, having participated in algorithm development in at least one of search, recommendation, or advertising, and keeping up with progress in industry and academia;
- Solid algorithm fundamentals, familiarity with natural language processing, knowledge graphs, and machine learning, and enthusiasm for technology development and application;
- Familiarity with Python, Java, or other programming languages, with solid engineering ability;
- Familiarity with deep learning frameworks such as TensorFlow and PyTorch, with practical project experience;
- Familiarity with NLP models such as RNN, CNN, Transformer, BERT, and GPT, with practical project experience;
- Strong goal orientation; good at analyzing and identifying problems, breaking them down and simplifying them, and able to find new opportunities in daily work;
- Organized and self-driven; able to structure complex work and establish effective mechanisms to drive upstream and downstream collaboration toward shared goals.
Preferred qualifications
- Familiarity with the basic principles of training optimizers and with the basic methods and frameworks of distributed training;
- Familiarity with the latest training acceleration methods, such as mixed-precision training, low-bit training, and distributed gradient compression.
This article was produced by the Meituan technical team, and the copyright belongs to Meituan. You are welcome to reprint or use the content of this article for non-commercial purposes such as sharing and communication, provided you note "Content reprinted from the Meituan technical team." This article may not be reproduced or used commercially without permission. For any commercial use, please email [email protected] to request authorization.