Embodying the Number of an Entity’s Relations for Knowledge Representation Learning

https://doi.org/10.1142/S0218194021500509

Knowledge representation learning (knowledge graph embedding) plays a critical role in the construction and application of knowledge graphs. Multi-source knowledge representation learning, currently one of the most promising classes of such methods, focuses on encoding additional information about entities and relations into their embeddings, such as textual descriptions, entity types, visual information, and graph structure. However, a simple yet very common piece of information has been overlooked: the number of an entity's relations, which reflects the number of an entity's semantic types. This work proposes a multi-source knowledge representation learning model, KRL-NER, which embodies the number of an entity's relations in the entity embeddings through an attention mechanism. Specifically, we first design a submodel of KRL-NER, named LearnNER, which learns an embedding that encodes the number of an entity's relations; we then obtain a new embedding by applying attention between this embedding and the embedding learned by a base model such as TransE; finally, the translation is performed on the new embedding. Experiments on standard knowledge graph tasks, namely entity prediction, entity prediction under different relation types, and triple classification, are carried out to verify our model. The results show that our model is effective on large-scale knowledge graphs, e.g. FB15K.
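
To make the fusion step concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of how an attention mechanism could combine a TransE-style entity embedding with an embedding indexed by the (bucketed) number of an entity's relations, and then score a triple with the usual translation distance. All names and design choices here (KRLNERSketch, num_rel_buckets, the single linear attention scorer) are illustrative assumptions, since the abstract does not specify the exact architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class KRLNERSketch(nn.Module):
        """Illustrative sketch: attention-weighted fusion of a structural
        (TransE-style) entity embedding with an embedding of the entity's
        relation count."""
        def __init__(self, n_entities, n_relations, dim=100, num_rel_buckets=50):
            super().__init__()
            self.ent = nn.Embedding(n_entities, dim)       # TransE-style entity embeddings
            self.rel = nn.Embedding(n_relations, dim)      # relation embeddings
            self.ner = nn.Embedding(num_rel_buckets, dim)  # embedding of the relation-count bucket
            self.att = nn.Linear(2 * dim, 1)               # scores each information source

        def fuse(self, ent_idx, count_bucket):
            # Attention over the two sources: structural embedding and count embedding.
            e = self.ent(ent_idx)                          # (batch, dim)
            c = self.ner(count_bucket)                     # (batch, dim)
            scores = torch.stack([self.att(torch.cat([e, e], dim=-1)),
                                  self.att(torch.cat([e, c], dim=-1))], dim=1)  # (batch, 2, 1)
            w = F.softmax(scores, dim=1)
            return w[:, 0] * e + w[:, 1] * c               # (batch, dim)

        def score(self, h, r, t, h_count, t_count):
            # TransE-style translation score ||h + r - t||_1 on the fused embeddings.
            h_e = self.fuse(h, h_count)
            t_e = self.fuse(t, t_count)
            return torch.norm(h_e + self.rel(r) - t_e, p=1, dim=-1)

As in TransE, lower scores would indicate more plausible triples; the count embedding here stands in for whatever encoding of the number of an entity's relations LearnNER actually produces.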
