25th International Conference on Database Systems for Advanced Applications

Sep. 24-27, 2020, Jeju, South Korea

Click the following URL

http://dasfaa2020.sigongji.com

to visit the DASFAA 2020 Online Event Site

Paper details

Title: SAEA: Self-Attentive Heterogeneous Sequence Learning Model for Entity Alignment

Authors: Jia Chen, Binbin Gu, Zhixu Li, Pengpeng Zhao, An Liu and Lei Zhao

Abstract: We consider the problem of entity alignment in knowledge graphs. Previous works mainly focus on two aspects. One is to improve TransE-based models, which mostly consider only triple-level structural information, i.e., relation triples, or to make use of graph convolutional networks under the assumption that equivalent entities are usually neighbored by other equivalent entities. The other is to incorporate external features, such as attribute types, attribute values, entity names, and descriptions, to enhance the original relational model. However, the long-term structural dependencies between entities have not been exploited well enough, and external resources are sometimes incomplete or unavailable. These issues impair the accuracy and robustness of combinational models that use relations together with other types of information, especially when iteration is performed. To better explore structural information between entities, we propose a novel Self-Attentive heterogeneous sequence learning model for Entity Alignment (SAEA) that allows us to capture long-term structural dependencies between entities. Furthermore, considering that low-degree entities and relations appear much less frequently in sequences produced by traditional random walk methods, we design a degree-aware random walk to generate heterogeneous sequential data for self-attentive learning. To evaluate the proposed model, we conduct extensive experiments on real-world datasets. The experimental results show that our method outperforms various state-of-the-art entity alignment models that use relation triples only.
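For readers curious about the degree-aware random walk mentioned in the abstract, the following is a minimal illustrative sketch, not the authors' implementation. It assumes an inverse-degree transition bias so that low-degree neighbors are sampled more often, and it interleaves relations and entities to form a heterogeneous sequence; the function names and weighting scheme are hypothetical, as the abstract does not specify them.

```python
import random
from collections import defaultdict

def build_adjacency(triples):
    """triples: iterable of (head, relation, tail) strings."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((r, t))
        adj[t].append((r, h))  # walk the graph in both directions
    return adj

def degree_aware_walk(adj, start, walk_len, rng=random):
    """Return a heterogeneous sequence of entities and relations,
    biasing each step toward low-degree neighbors (assumed 1/degree weights)."""
    seq = [start]
    node = start
    for _ in range(walk_len):
        nbrs = adj.get(node)
        if not nbrs:
            break
        weights = [1.0 / max(len(adj[t]), 1) for _, t in nbrs]
        r, node = rng.choices(nbrs, weights=weights, k=1)[0]
        seq.extend([r, node])  # interleave relations and entities
    return seq

if __name__ == "__main__":
    triples = [("e1", "r1", "e2"), ("e2", "r2", "e3"), ("e1", "r3", "e4")]
    adj = build_adjacency(triples)
    print(degree_aware_walk(adj, "e1", walk_len=4))
```

The resulting sequences could then be fed to a self-attentive sequence encoder; that part of the model is beyond the scope of this sketch.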
