Sun Wei
M.Sc. Student @ IIP Group
Email: weisun_@outlook.com
Supervisor
- Professor Jun-Yuan Xie
Biography
- I received my B.Sc. degree from Soochow University in June 2017. In the same year, I was admitted to the Master's program at Nanjing University without entrance examination. Currently I am a second-year M.Sc. student in the Department of Computer Science and Technology at Nanjing University and a member of the IIP Group, led by Professors Jun-Yuan Xie and Chong-Jun Wang.
Research Interest
Machine Learning & Multi-label Learning.
- In Multi-label Text Classification (MLTC), the text features can be regarded as a detailed description of a document, while its label set can be regarded as a summarization of the document. Hybrid topics mined jointly from text features and label sets by LDA (a topic-model method) can effectively capture global label correlations and deeper features. Meanwhile, pairing topics with labels can mitigate the label-imbalance problem (a minimal sketch follows after this list).
- Deep learning for multi-label text classification. We utilize dilated convolution to obtain a semantic understanding of the text and design a hybrid attention mechanism for different labels (specifically, each label should attend to its most relevant textual content). First, we initialize trainable label embeddings. Then, after obtaining word-level information with a Bi-LSTM, we derive the semantic understanding of the text from that word-level information via dilated convolution. Finally, we design a hybrid attention over the text for each label based on the label embeddings. Besides, we add a label co-occurrence matrix to the loss function to guide the whole network during training, which achieves good results (see the model sketch below).
- A GCN (Graph Convolutional Network) can be used to exploit more complex label correlations in multi-label learning (see the GCN sketch below).
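The hybrid-topic idea in the first bullet can be illustrated with a minimal sketch: each label is appended to its document as a pseudo-word, so LDA learns topics jointly over words and labels, and labels that land in the same topic hint at global label correlations. This assumes gensim; the tiny corpus, the LABEL_ prefix, and the topic count are illustrative, not the actual experimental setup.

```python
# Minimal sketch: mine "hybrid topics" with LDA by treating each label
# as an extra pseudo-word appended to the document's tokens.
from gensim import corpora
from gensim.models import LdaModel

# Toy corpus: (text tokens, label set) pairs -- illustrative only.
docs = [
    (["stock", "market", "rises", "trade"], ["finance", "economy"]),
    (["team", "wins", "league", "match"], ["sports"]),
    (["election", "vote", "policy"], ["politics"]),
]

# Hybrid documents: text tokens plus label tokens, so LDA topics are
# learned jointly over words and labels.
hybrid_docs = [tokens + ["LABEL_" + l for l in labels] for tokens, labels in docs]

dictionary = corpora.Dictionary(hybrid_docs)
bow = [dictionary.doc2bow(d) for d in hybrid_docs]
lda = LdaModel(bow, num_topics=2, id2word=dictionary, passes=10, random_state=0)

# Each topic now mixes words and labels; labels that co-occur in one
# topic suggest a global correlation between them.
for tid in range(lda.num_topics):
    print(tid, lda.print_topic(tid, topn=6))
```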
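Below is a minimal PyTorch sketch of the pipeline described in the second bullet (Bi-LSTM for word-level information, dilated convolutions for semantic understanding, per-label attention driven by trainable label embeddings, plus a co-occurrence term in the loss). All layer sizes, the dilation rates, and the exact form of the co-occurrence penalty are assumptions for illustration, not the published model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelAttentionMLTC(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100, hid=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hid, bidirectional=True, batch_first=True)
        # Stacked dilated convolutions widen the receptive field while
        # preserving sequence length (padding = dilation for kernel 3).
        self.convs = nn.ModuleList([
            nn.Conv1d(2 * hid, 2 * hid, kernel_size=3, dilation=d, padding=d)
            for d in (1, 2, 4)
        ])
        # Trainable label embeddings, one vector per label.
        self.label_emb = nn.Parameter(torch.randn(num_labels, 2 * hid))
        self.out = nn.Linear(2 * hid, 1)

    def forward(self, x):                       # x: (B, T) token ids
        h, _ = self.bilstm(self.emb(x))         # word-level info: (B, T, 2H)
        c = h.transpose(1, 2)                   # (B, 2H, T) for Conv1d
        for conv in self.convs:
            c = F.relu(conv(c))
        c = c.transpose(1, 2)                   # semantic features: (B, T, 2H)
        # Each label attends to its most relevant positions in the text.
        scores = torch.einsum("btd,ld->blt", c, self.label_emb)
        attn = scores.softmax(dim=-1)           # (B, L, T)
        label_ctx = attn @ c                    # per-label context: (B, L, 2H)
        return self.out(label_ctx).squeeze(-1)  # (B, L) logits

def cooccurrence_loss(logits, targets, cooc, lam=0.1):
    """BCE plus a soft penalty (one assumed variant) pushing predicted
    label correlations toward the empirical co-occurrence matrix `cooc`."""
    bce = F.binary_cross_entropy_with_logits(logits, targets)
    p = torch.sigmoid(logits)                   # (B, L)
    pred_cooc = (p.t() @ p) / p.size(0)         # batch-level (L, L) estimate
    return bce + lam * F.mse_loss(pred_cooc, cooc)
```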
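For the third bullet, one common way to wire a GCN into multi-label learning (in the spirit of ML-GCN) is to propagate label embeddings over a label co-occurrence graph and use the propagated label representations as classifiers over document features. Everything below, including the two-layer propagation and the dimensions, is an assumed sketch rather than a specific method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelGCN(nn.Module):
    def __init__(self, num_labels, label_dim, doc_dim, adj):
        super().__init__()
        # adj: (L, L) row-normalized label co-occurrence adjacency matrix.
        self.register_buffer("adj", adj)
        self.label_emb = nn.Parameter(torch.randn(num_labels, label_dim))
        self.w1 = nn.Linear(label_dim, doc_dim)
        self.w2 = nn.Linear(doc_dim, doc_dim)

    def forward(self, doc_feat):                # doc_feat: (B, doc_dim)
        # Two rounds of graph propagation: H' = ReLU(A H W).
        h = F.relu(self.adj @ self.w1(self.label_emb))
        h = self.adj @ self.w2(h)               # label classifiers: (L, doc_dim)
        return doc_feat @ h.t()                 # (B, L) logits

# Usage idea: build adj by thresholding and normalizing label
# co-occurrence counts, obtain doc_feat from any text encoder,
# then train with a standard BCE loss.
```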
Publications
- Ran X., Pan Y., Sun W. and Wang C. Learn to Select via Hierarchical Gate Mechanism for Aspect-Based Sentiment Analysis. In Proceedings of IJCAI 2019.
- Sun W. and Wang C. Multi-label Classification: Select Distinct Semantic Understanding for Different Labels. In Proceedings of APWeb-WAIM 2019.
- Sun W., Ran X., Luo X. and Wang C. An Efficient Framework by Topic Model for Multi-label Text Classification. In Proceedings of IJCNN 2019.
- Xu Y., Ran X., Sun W., Luo X. and Wang C. Gated Neural Network with Regularized Loss for Multi-label Text Classification. In Proceedings of IJCNN 2019.
- Ran X., Pan Y., Sun W. and Wang C. Modeling More Globally: A Hierarchical Attention Network via Multi-Task Learning for Aspect-Based Sentiment Analysis. In Proceedings of DASFAA 2019, Chiang Mai, Thailand, Apr. 22-25, 2019: 505-509.
Resources
- Extreme Classification Repository: for large-scale multi-label datasets and off-the-shelf eXtreme Multi-Label Learning (XML) solvers.
- Mulan Multi-Label Learning Datasets: regular/traditional multi-label learning datasets.
- Related Work: This page categorizes a list of works of my interest, mainly in Multi-Label Learning.
Awards and Honors
- Second-Class Academic Scholarship, 2018-2019
- First-Class Academic Scholarship, 2017-2018
- Outstanding Graduate Student, 2017.06
- CCF Excellent University Student, 2016.10
- National Scholarship, 2015.11