Knowledge Reasoning: History

A knowledge graph (KG) organizes knowledge as a set of interlinked triples, where a triple (head entity, relation, tail entity), abbreviated as (h, r, t), states that two entities are connected by a specific relation. Such formalized, richly structured knowledge has become a valuable resource for downstream tasks such as question answering and recommender systems. However, even large KGs such as DBpedia, Freebase, and NELL are far from complete, which hinders their broad application. Knowledge graph completion (KGC), which has garnered increasing interest, addresses this problem by using knowledge reasoning techniques to automatically discover new facts from the facts already present in a KG.

  • distributed representation
  • knowledge graph
  • link prediction
  • logical rule

1. Introduction

A knowledge graph (KG) organizes knowledge as a set of interlinked triples, where a triple (head entity, relation, tail entity), abbreviated as (h, r, t), indicates the fact that two entities are connected by a certain relation. The availability of formalized and richly structured knowledge has emerged as a valuable resource facilitating various downstream tasks, such as question answering [1][2] and recommender systems [3][4].
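For illustration, the following is a minimal sketch of a KG stored as a set of (h, r, t) triples, together with a link-prediction query (h, r, ?); all entities, relations, and facts here are hypothetical examples rather than excerpts from any real KG.
```python
# A toy knowledge graph stored as a set of (head, relation, tail) triples.
# All entities and relations here are hypothetical examples.
kg = {
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
}

# A link-prediction query (h, r, ?) asks which tail entity completes the triple.
def predict_tails(kg, head, relation):
    """Return all known tails for the given head and relation."""
    return {t for (h, r, t) in kg if h == head and r == relation}

print(predict_tails(kg, "Paris", "capital_of"))  # {'France'}
```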
Although KGs such as DBpedia [5], Freebase [6], and NELL [7] contain large numbers of entities, relations, and triples, they are far from complete, which is an urgent obstacle to their broad application. To address this, researchers have introduced knowledge graph completion (KGC), which has garnered increasing interest: it uses knowledge reasoning techniques to automatically discover new facts based on the facts already present in a KG [8].
Currently, KGC methods fall into two major categories: (1) rule-based methods, which obtain explicit reasoning rules through inductive learning and then deduce new facts from them; and (2) representation-learning methods, which, instead of modeling rules directly, learn distributed embeddings for entities and relations and generalize in a numerical space.
Rule-based reasoning is accurate and provides interpretable inference results. The rules can be hand-crafted by domain experts [9] or mined from KGs with an inductive algorithm such as AMIE [10]. Traditional methods such as expert systems [11][12] use hard logical rules to make predictions.
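As a sketch of how such a rule deduces new facts, the fragment below forward-chains a single Horn rule over a toy triple set; both the rule and the facts are illustrative assumptions, not the output of AMIE or any expert system.
```python
# Toy facts; entities and relations are hypothetical.
facts = {
    ("Alice", "born_in", "Lyon"),
    ("Lyon", "located_in", "France"),
}

# Horn rule: born_in(x, y) AND located_in(y, z) => lives_in(x, z)
def apply_rule(facts):
    """One forward-chaining pass: derive lives_in facts from the rule body."""
    derived = set()
    for (x, r1, y) in facts:
        if r1 != "born_in":
            continue
        for (y2, r2, z) in facts:
            if r2 == "located_in" and y2 == y:
                derived.add((x, "lives_in", z))
    return derived - facts  # keep only genuinely new conclusions

print(apply_rule(facts))  # {('Alice', 'lives_in', 'France')}
```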
Knowledge graph embedding (KGE) methods learn to embed entities and relations into a continuous low-dimensional space [13][14]. These embeddings preserve the semantic meaning of entities and relations, facilitating the prediction of missing triples, and they can be trained efficiently using stochastic gradient descent (SGD). However, this approach does not fully capitalize on logical rules, which compactly encode domain knowledge and have practical value in various applications. Moreover, high-quality embeddings rely heavily on abundant data, so these methods struggle to generate meaningful representations for sparse entities [15][16].
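A minimal sketch of how such embeddings might be trained with SGD, using a TransE-style translational score [20] and a margin ranking loss over a (positive, corrupted) triple pair; the toy vocabulary, dimensionality, margin, and learning rate are all illustrative assumptions.
```python
import numpy as np

rng = np.random.default_rng(0)
dim, margin, lr = 50, 1.0, 0.01

# Randomly initialized embeddings for a hypothetical toy vocabulary.
entity_emb = {e: rng.normal(size=dim) for e in ["Paris", "France", "Berlin", "Germany"]}
relation_emb = {r: rng.normal(size=dim) for r in ["capital_of"]}

def score(h, r, t):
    """TransE plausibility: smaller ||h + r - t|| means more plausible."""
    return np.linalg.norm(entity_emb[h] + relation_emb[r] - entity_emb[t])

def sgd_step(pos, neg):
    """One SGD step on the margin ranking loss for a (positive, corrupted) pair."""
    h, r, t = pos
    h2, _, t2 = neg  # corrupted triple shares the relation
    loss = margin + score(h, r, t) - score(h2, r, t2)
    if loss <= 0:
        return  # the margin is already satisfied; no update needed
    # Gradients of the L2 scores with respect to the embeddings.
    g_pos = entity_emb[h] + relation_emb[r] - entity_emb[t]
    g_pos /= np.linalg.norm(g_pos) + 1e-9
    g_neg = entity_emb[h2] + relation_emb[r] - entity_emb[t2]
    g_neg /= np.linalg.norm(g_neg) + 1e-9
    # Pull the positive triple together, push the corrupted one apart.
    entity_emb[h] -= lr * g_pos
    entity_emb[t] += lr * g_pos
    relation_emb[r] -= lr * (g_pos - g_neg)
    entity_emb[h2] += lr * g_neg
    entity_emb[t2] -= lr * g_neg

sgd_step(("Paris", "capital_of", "France"), ("Paris", "capital_of", "Germany"))
```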
In fact, both rule-based and embedding-based methods have advantages and disadvantages for the KGC task: logical rules are accurate and interpretable, while embeddings are flexible and computationally efficient. Recent research has therefore sought to combine the strengths of logical rules and KGEs to achieve more precise knowledge completion. Such hybrid techniques can infer missing triples effectively by exploiting and modeling uncertain logical rules. Some existing methods learn KGEs and rules iteratively [16]; others use soft rules or rule groundings to regularize the learning of KGEs [17][18].

2. Rule-Based Reasoning

Logical rules can encode human knowledge compactly, and early knowledge reasoning was primarily based on first-order logical rules. Existing rule-based reasoning approaches mainly rely on search-based inductive logic programming (ILP), typically searching for candidate rules and then pruning them. Based on the partial completeness assumption, AMIE [10] introduces a revised confidence metric well suited to modeling KGs. AMIE+ [19] scales to larger KGs through query rewriting and pruning, and further improves the precision of its predictions by using joint reasoning and type information. Another line of work, the Markov logic network (MLN), builds a probabilistic graphical model from pre-provided rules and then learns the weights of those rules. However, due to the complicated graph structure among triples, inference in an MLN is time-consuming and challenging, and the incompleteness of KGs also degrades the inference results. In contrast, Iterlogic-E uses rules to enhance KGEs with more effective inference.
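To make the revised confidence concrete, the sketch below contrasts the standard confidence of a rule with an AMIE-style PCA confidence on hypothetical counts; under the partial completeness assumption, a body match counts as a counterexample only if the KG already knows some tail for that head entity and relation.
```python
def rule_confidences(body_matches, kg, head_relation):
    """Standard vs. PCA confidence for a rule 'body => head_relation(x, z)'.

    body_matches: set of (x, z) pairs for which the rule body holds.
    Returns (standard_confidence, pca_confidence).
    """
    support = sum((x, head_relation, z) in kg for (x, z) in body_matches)
    # Standard confidence: every body match missing from the KG counts against the rule.
    std_conf = support / len(body_matches)
    # PCA confidence: a missing body match (x, z) counts against the rule only if
    # the KG already knows *some* tail for (x, head_relation); otherwise the fact
    # may simply be absent rather than false (partial completeness assumption).
    pca_body = [
        (x, z) for (x, z) in body_matches
        if any(h == x and r == head_relation for (h, r, t) in kg)
    ]
    pca_conf = support / len(pca_body) if pca_body else 0.0
    return std_conf, pca_conf

# Hypothetical toy data: the rule predicts lives_in facts for Alice and Bob.
kg = {("Alice", "lives_in", "France"), ("Alice", "born_in", "Lyon")}
body_matches = {("Alice", "France"), ("Bob", "Germany")}
print(rule_confidences(body_matches, kg, "lives_in"))  # (0.5, 1.0)
```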

3. Embedding-Based Reasoning

Recently, embedding-based methods have attracted much attention. These methods aim to learn distributed embeddings for entities and relations in KGs. Generally, current KGE methods fall into three classes: (1) translation-based models, which learn embeddings by translating one entity into another through a specific relation [20][21]; (2) compositional models, which use simple mathematical operations to model facts, including linear mapping [22], bilinear mapping [23][24][25], and circular correlation [26]; and (3) neural-network-based models, which use a multilayer neural structure to learn embeddings and estimate the plausibility of triples with nonlinear features, for example, R-GCN [27], ConvE [28], and others [29][30][31]. The above methods learn representations only from the triples existing in KGs and are therefore limited by data sparsity. To address this issue and acquire semantically rich representations, recent studies have incorporated additional information beyond triples, including contextual information [32], entity type information [33][34], ontological information [35], taxonomic information [36], textual descriptions [37], commonsense knowledge [38], and hierarchical information [39]. In contrast, the proposed Iterlogic-E uses embeddings to eliminate erroneous conclusions derived from rules, combining the advantages of soft rules and embeddings.
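As a sketch of the compositional family's scoring functions, the fragment below computes a diagonal bilinear score in the style of DistMult [23] and a circular-correlation score in the style of HolE [26]; the embeddings are random placeholders, and a higher score indicates a more plausible triple under both models.
```python
import numpy as np

rng = np.random.default_rng(42)
dim = 8
h, r, t = (rng.normal(size=dim) for _ in range(3))  # placeholder embeddings

# Bilinear mapping (DistMult-style): f(h, r, t) = h^T diag(r) t.
distmult_score = float(np.sum(h * r * t))

# Circular correlation (HolE-style): f(h, r, t) = r . (h star t), where
# the circular correlation (star) is computed via the FFT identity.
def circular_correlation(a, b):
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

hole_score = float(np.dot(r, circular_correlation(h, t)))

print(distmult_score, hole_score)
```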

4. Hybrid Reasoning

Both rule-based and embedding-based methods have their own advantages and disadvantages, and recent works have integrated the two types of reasoning. Guo et al. [40] learned a KGE jointly from rule groundings and triples: they used TransE to obtain entity and relation embeddings, computed a truth value for each rule using t-norm fuzzy logic, ranked the rules by their truth values, and manually filtered those ranked at the top. Nayyeri et al. [41] enhanced KGEs by injecting rules. Although these works also learn embeddings jointly, they cannot handle the uncertainty of soft rules. Wang et al. [42] approximately ordered relations by maximizing the margin between negative and positive logical rules to capture the transitivity and asymmetry of rules; however, their method cannot directly benefit from the forward reasoning of rules and does not use the confidence associated with each rule. Zhang et al. [17], Guo et al. [18], and Guo et al. [43] learned KGEs under supervision from soft rules, demonstrating the effectiveness of logical rules; however, they did not consider using KGEs to filter out erroneous facts introduced by these rules during iteration. Qu et al. [44] used an MLN to model logical rules and inferred new triples to enhance KGEs; one main difficulty here is the large search space involved in determining rule structures and finding supporting groundings. Zhang et al. [16] aimed to improve the representation of sparse entities through iterative learning and updated rule confidences via embeddings. Nevertheless, their method measures sparsity by how frequently an entity participates in triples, and only conclusions involving highly sparse entities are injected into the original knowledge graph; this design choice targets sparse data in the test set but ignores the potentially erroneous conclusions introduced by the rules. In contrast, Iterlogic-E models conclusion labels as 0–1 variables and uses a confidence regularization loss to eliminate uncertain conclusions.
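To illustrate how a grounded rule can receive a soft truth value in such joint models, the sketch below follows the t-norm fuzzy logic composition reported for KALE-style approaches [40], scoring a conjunction as I(a AND b) = I(a)*I(b) and an implication as I(a => b) = I(a)*I(b) - I(a) + 1; the triple plausibilities are hypothetical stand-ins for scores produced by a KGE model.
```python
def conjunction_truth(a, b):
    """Fuzzy truth of 'a AND b' under the product t-norm."""
    return a * b

def implication_truth(body, head):
    """Fuzzy truth of 'body => head' under the composition
    I(a => b) = I(a) * I(b) - I(a) + 1 (stays in [0, 1] for inputs in [0, 1])."""
    return body * head - body + 1.0

# Hypothetical triple plausibilities in [0, 1], e.g., from a KGE scorer.
born_in = 0.9      # (Alice, born_in, Lyon)
located_in = 0.8   # (Lyon, located_in, France)
lives_in = 0.4     # (Alice, lives_in, France) -- the rule's conclusion

body = conjunction_truth(born_in, located_in)  # 0.72
truth = implication_truth(body, lives_in)      # 0.72*0.4 - 0.72 + 1 = 0.568
print(body, truth)
```
A truth value near 1 means the grounding is well satisfied, so joint training can push embeddings to raise the plausibility of the rule's conclusion.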

This entry is adapted from the peer-reviewed paper 10.3390/app131910660

References

  1. Berant, J.; Chou, A.; Frostig, R.; Liang, P. Semantic parsing on freebase from question-answer pairs. In Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, Seattle, WA, USA, 18–21 October 2013; pp. 1533–1544.
  2. Huang, X.; Zhang, J.; Li, D.; Li, P. Knowledge graph embedding based question answering. In Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining, Melbourne, VIC, Australia, 11–15 February 2019; pp. 105–113.
  3. Wang, X.; Wang, D.; Xu, C.; He, X.; Cao, Y.; Chua, T.S. Explainable reasoning over knowledge graphs for recommendation. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 5329–5336.
  4. Cao, Y.; Wang, X.; He, X.; Hu, Z.; Chua, T.S. Unifying knowledge graph learning and recommendation: Towards a better understanding of user preferences. In Proceedings of the World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019; pp. 151–161.
  5. Auer, S.; Bizer, C.; Kobilarov, G.; Lehmann, J.; Cyganiak, R.; Ives, Z. Dbpedia: A nucleus for a web of open data. In Proceedings of the International Semantic Web Conference, Busan, Republic of Korea, 11–15 November 2007; pp. 722–735.
  6. Bollacker, K.; Evans, C.; Paritosh, P.; Sturge, T.; Taylor, J. Freebase: A collaboratively created graph database for structuring human knowledge. In Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data, Vancouver, BC, Canada, 10–12 June 2008; pp. 1247–1250.
  7. Carlson, A.; Betteridge, J.; Kisiel, B.; Settles, B.; Hruschka, E.; Mitchell, T. Toward an architecture for never-ending language learning. In Proceedings of the AAAI Conference on Artificial Intelligence, Atlanta, GA, USA, 11–15 July 2010.
  8. Ji, S.; Pan, S.; Cambria, E.; Marttinen, P.; Yu, P.S. A survey on knowledge graphs: Representation, acquisition and applications. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 494–514.
  9. Taskar, B.; Abbeel, P.; Wong, M.F.; Koller, D. Relational Markov networks. Introd. Stat. Relational Learn. 2007, 175, 200.
  10. Galárraga, L.A.; Teflioudi, C.; Hose, K.; Suchanek, F. Amie: Association rule mining under incomplete evidence in ontological knowledge bases. In Proceedings of the 22nd International Conference on World Wide Web, Rio de Janeiro, Brazil, 13–17 May 2013; pp. 413–422.
  11. Giarratano, J.C.; Riley, G. Expert Systems; PWS Publishing: Boston, MA, USA, 1998.
  12. Jackson, P. Introduction to Expert Systems; Addison-Wesley Longman Publishing: Boston, MA, USA, 1986.
  13. Nickel, M.; Murphy, K.; Tresp, V.; Gabrilovich, E. A review of relational machine learning for knowledge graphs. Proc. IEEE 2016, 104, 11–33.
  14. Wang, Q.; Mao, Z.; Wang, B.; Guo, L. Knowledge graph embedding: A survey of approaches and applications. IEEE Trans. Knowl. Data Eng. 2017, 29, 2724–2743.
  15. Pujara, J.; Augustine, E.; Getoor, L. Sparsity and noise: Where knowledge graph embeddings fall short. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Denmark, 7–11 September 2017; pp. 1751–1756.
  16. Zhang, W.; Paudel, B.; Wang, L.; Chen, J.; Zhu, H.; Zhang, W.; Bernstein, A.; Chen, H. Iteratively learning embeddings and rules for knowledge graph reasoning. In Proceedings of the World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019; pp. 2366–2377.
  17. Zhang, J.; Li, J. Enhanced knowledge graph embedding by jointly learning soft rules and facts. Algorithms 2019, 12, 265.
  18. Guo, S.; Li, L.; Hui, Z.; Meng, L.; Ma, B.; Liu, W.; Wang, L.; Zhai, H.; Zhang, H. Knowledge graph embedding preserving soft logical regularity. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management, Virtual Event, Ireland, 19–23 October 2020; pp. 425–434.
  19. Galárraga, L.; Teflioudi, C.; Hose, K.; Suchanek, F.M. Fast rule mining in ontological knowledge bases with AMIE+. VLDB J. 2015, 24, 707–730.
  20. Bordes, A.; Usunier, N.; Garcia-Duran, A.; Weston, J.; Yakhnenko, O. Translating embeddings for modeling multi-relational data. In Proceedings of the 26th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 5–8 December 2013; pp. 1–9.
  21. Yang, S.; Tian, J.; Zhang, H.; Yan, J.; He, H.; Jin, Y. Transms: Knowledge graph embedding for complex relations by multidirectional semantics. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19), Macao, China, 10–16 August 2019; pp. 1935–1942.
  22. Sun, Z.; Deng, Z.H.; Nie, J.Y.; Tang, J. Rotate: Knowledge graph embedding by relational rotation in complex space. arXiv 2019, arXiv:1902.10197.
  23. Yang, B.; Yih, W.; He, X.; Gao, J.; Deng, L. Embedding entities and relations for learning and inference in knowledge bases. arXiv 2014, arXiv:1412.6575.
  24. Trouillon, T.; Welbl, J.; Riedel, S.; Gaussier, É.; Bouchard, G. Complex embeddings for simple link prediction. In Proceedings of the International Conference on Machine Learning, New York, NY, USA, 19–24 June 2016; pp. 2071–2080.
  25. Liu, H.; Wu, Y.; Yang, Y. Analogical inference for multi-relational embeddings. In Proceedings of the International Conference on Machine Learning, Sydney, Australia, 6–11 August 2017; pp. 2168–2178.
  26. Nickel, M.; Rosasco, L.; Poggio, T. Holographic embeddings of knowledge graphs. In Proceedings of the AAAI Conference on Artificial Intelligence, Phoenix, AZ, USA, 12–17 February 2016.
  27. Schlichtkrull, M.; Kipf, T.N.; Bloem, P.; van den Berg, R.; Titov, I.; Welling, M. Modeling relational data with graph convolutional networks. In Proceedings of the Semantic Web: 15th International Conference, ESWC 2018, Heraklion, Crete, Greece, 3–7 June 2018; Springer International Publishing: Cham, Switzerland, 2018; pp. 593–607.
  28. Dettmers, T.; Minervini, P.; Stenetorp, P.; Riedel, S. Convolutional 2d knowledge graph embeddings. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018.
  29. Guo, L.; Sun, Z.; Hu, W. Learning to exploit long-term relational dependencies in knowledge graphs. In Proceedings of the International Conference on Machine Learning, Long Beach, CA, USA, 10–15 June 2019; pp. 2505–2514.
  30. Shang, C.; Tang, Y.; Huang, J.; Bi, J.; He, X.; Zhou, B. End-to-end structure-aware convolutional networks for knowledge base completion. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 3060–3067.
  31. Shi, B.; Weninger, T. Proje: Embedding projection for knowledge graph completion. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017.
  32. Wang, Q.; Huang, P.; Wang, H.; Dai, S.; Jiang, W.; Liu, J.; Lyu, Y.; Zhu, Y.; Wu, H. CoKE: Contextualized knowledge graph embedding. arXiv 2019, arXiv:1911.02168.
  33. Guo, S.; Wang, Q.; Wang, B.; Wang, L.; Guo, L. Semantically smooth knowledge graph embedding. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, Beijing, China, 26–31 July 2015; pp. 84–94.
  34. Xie, R.; Liu, Z.; Sun, M. Representation learning of knowledge graphs with hierarchical types. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI-16), Palo Alto, CA, USA, 9–15 July 2016; pp. 2965–2971.
  35. Hao, J.; Chen, M.; Yu, W.; Sun, Y.; Wang, W. Universal representation learning of knowledge bases by jointly embedding instances and ontological concepts. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 1709–1719.
  36. Fatemi, B.; Ravanbakhsh, S.; Poole, D. Improved knowledge graph embedding using background taxonomic information. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 3526–3533.
  37. Veira, N.; Keng, B.; Padmanabhan, K.; Veneris, A.G. Unsupervised embedding enhancements of knowledge graphs using textual associations. In Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI-19), Macao, China, 10–16 August 2019; pp. 5218–5225.
  38. Niu, G.; Li, B.; Zhang, Y.; Pu, S. CAKE: A Scalable Commonsense-Aware Framework For Multi-View Knowledge Graph Completion; ACL: Stroudsburg, PA, USA, 2022; pp. 2867–2877.
  39. Zhang, Z.; Cai, J.; Zhang, Y.; Wang, J. Learning hierarchy-aware knowledge graph embeddings for link prediction. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; pp. 3065–3072.
  40. Guo, S.; Wang, Q.; Wang, L.; Wang, B.; Guo, L. Jointly embedding knowledge graphs and logical rules. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, TX, USA, 1–5 November 2016; pp. 192–202.
  41. Nayyeri, M.; Xu, C.; Alam, M.M.; Yazdi, H.S. LogicENN: A Neural Based Knowledge Graphs Embedding Model with Logical Rules. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 7050–7062.
  42. Wang, M.; Rong, E.; Zhuo, H.; Zhu, H. Embedding knowledge graphs based on transitivity and asymmetry of rules. In Proceedings of the Advances in Knowledge Discovery and Data Mining: 22nd Pacific-Asia Conference, PAKDD 2018, Melbourne, VIC, Australia, 3–6 June 2018; Springer International Publishing: Cham, Switzerland, 2018; pp. 141–153.
  43. Guo, S.; Wang, Q.; Wang, L.; Wang, B.; Guo, L. Knowledge graph embedding with iterative guidance from soft rules. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018.
  44. Qu, M.; Tang, J. Probabilistic logic neural networks for reasoning. In Proceedings of the Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, Vancouver, BC, Canada, 8–14 December 2019.