Journal of Jilin University (Engineering and Technology Edition) ›› 2022, Vol. 52 ›› Issue (8): 1872-1880. doi: 10.13229/j.cnki.jdxbgxb20210961


Dispute focus identification of pleading text based on deep neural network

Tian BAI 1,2, Ming-wei XU 3, Si-ming LIU 4, Ji-an ZHANG 3, Zhe WANG 1,2

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China
    2. Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
    3. College of Software, Jilin University, Changchun 130012, China
    4. School of Law, Jilin University, Changchun 130012, China
  • Received: 2021-09-23  Online: 2022-08-01  Published: 2022-08-12
  • Contact: Zhe WANG  E-mail: baitian@jlu.edu.cn; wz2000@jlu.edu.cn

Abstract:

The dispute focus is the point of disagreement between the plaintiff and the defendant; it is the main thread and hub that guides the trial and the settlement of the dispute. Summarizing dispute focuses accurately and rapidly helps improve the quality and efficiency of trials and supports the construction of "intelligent justice". To address this problem, an end-to-end model based on deep neural networks was proposed. The model captures the semantic information of the statements made by both parties and, by combining word-level and sentence-level information, performs sentence-level contradiction detection and classification as well as paragraph-level contradiction classification. The results of the two parts are then merged by a set of rules to identify all dispute focuses in the pleading text. Experiments on real datasets show that the proposed model identifies the dispute focuses between the plaintiff and the defendant accurately and quickly, improving recognition accuracy over existing methods and offering an effective new approach to the intelligent identification of dispute focuses in pleading texts.
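The architecture name in Fig. 2 (BERT-CBGA) suggests a contradiction detector built from BERT, a convolutional layer, a bidirectional GRU, and an attention pooling layer. The following is a minimal sketch of such a sentence-pair contradiction classifier, assuming a PyTorch/HuggingFace setup; the class name BertCBGA, the hidden sizes, and all hyperparameters are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of a BERT-CBGA-style contradiction detector:
# BERT encodes a plaintiff/defendant sentence pair, a 1-D CNN extracts
# local n-gram features, a BiGRU adds sequential context, and an
# attention layer pools the sequence before binary classification.
import torch
import torch.nn as nn
from transformers import BertModel


class BertCBGA(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", hidden=128, num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        dim = self.bert.config.hidden_size
        self.conv = nn.Conv1d(dim, hidden, kernel_size=3, padding=1)
        self.gru = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Token-level contextual embeddings from BERT: (batch, tokens, dim)
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        # CNN over the token dimension for local features
        c = torch.relu(self.conv(h.transpose(1, 2))).transpose(1, 2)
        # BiGRU for sequential context
        g, _ = self.gru(c)
        # Additive attention pooling over tokens
        # (padding positions are not masked here, for brevity)
        w = torch.softmax(self.attn(g).squeeze(-1), dim=-1)
        pooled = torch.bmm(w.unsqueeze(1), g).squeeze(1)
        return self.fc(pooled)
```

In this sketch, a plaintiff statement and the corresponding defendant statement would be tokenized as a single BERT sentence pair (separated by [SEP]) before being passed to forward; the paragraph-level classifier and the rules that merge the two outputs described in the abstract are not shown.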

Key words: computer application technology, legal artificial intelligence, dispute focus, judgement documents, natural language processing, text matching

CLC Number: TP391.1

Fig.1  Examples of statements by both parties

Fig.2  Contradiction detection model based on BERT-CBGA

Fig.3  Dispute focus identification model

Table 1  Dataset information

Dataset                          Judgment documents    Paired statements    Dispute focuses
Traffic accident                 1 600                 78 565               8 202
Equity transfer                  1 579                 97 814               10 681
Private lending                  1 218                 69 216               7 399
Remuneration claims              1 200                 61 859               4 327
Public security administration   1 600                 87 256               8 696
Fraud                            1 594                 70 474               4 209
Intentional injury               1 194                 52 181               2 263
Argmine                          684                   16 320               3 264

Table 2  Experimental results of contradiction detection (Argmine dataset)

Model        Accuracy    Loss
CNN          0.8405      0.4291
BiLSTM       0.8053      0.4359
BiGRU        0.8110      0.4302
BLA          0.8282      0.4069
BGA          0.8390      0.4284
CBLA         0.8282      0.4317
CBGA         0.8344      0.4829
BERT-CNN     0.8497      0.3712
BERT-BLA     0.8343      0.3241
BERT-BGA     0.8420      0.3252
BERT-CBLA    0.8528      0.3711
BERT-CBGA    0.8620      0.3107

Table 3  Accuracy results of the dispute focus identification experiment

Dataset                          HAN       CNN       Proposed structure-CNN    Proposed model
Traffic accident                 0.8240    0.8580    0.8604                    0.8698
Equity transfer                  0.7755    0.7845    0.8035                    0.8344
Private lending                  0.8687    0.8714    0.8851                    0.8971
Remuneration claims              0.8777    0.8896    0.8964                    0.9075
Public security administration   0.8208    0.8354    0.8417                    0.8521
Fraud                            0.9154    0.9188    0.9195                    0.9246
Intentional injury               0.9329    0.9387    0.9410                    0.9456

Table 4  Loss results of the dispute focus identification experiment

Dataset                          HAN       CNN       Proposed structure-CNN    Proposed model
Traffic accident                 0.3790    0.3239    0.3232                    0.3210
Equity transfer                  0.4330    0.4108    0.3557                    0.3074
Private lending                  0.3202    0.3177    0.2739                    0.2316
Remuneration claims              0.2837    0.2428    0.2217                    0.1998
Public security administration   0.3785    0.3490    0.3225                    0.3029
Fraud                            0.2133    0.2135    0.1873                    0.1656
Intentional injury               0.1801    0.1851    0.1649                    0.1258