1. Symbolic or Neural | Term | Definition | Synonyms | Source or Scopus key
2. symbolic | abduction
Abductive reasoning is the process of deriving a sufficient explanation of the known facts. An abductive logic cannot be monotonic, because the likely explanations are not necessarily correct. Inference to the best explanation (Walton).
3. implementation | Abductive Learning
Abductive Learning is a framework that combines machine learning with first-order logical reasoning. It allows machine learning models to exploit complex symbolic domain knowledge represented by first-order logic rules
https://daiwz.net/org/pdf/gabl_ijcai21_cr.pdf
4. symbolic | argumentation framework
In artificial intelligence and related fields, an argumentation framework is a way to deal with contentious information and draw conclusions from it using formalized arguments.
A directed graph such that the nodes are the arguments, and the arrows represent the attack relation. https://en.wikipedia.org/wiki/Argumentation_framework
5. symbolic | association based | context: language morphology
6. n/a | association rule mining | data mining for frequently co-occurring item sets
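As a sketch (the transactions and items below are invented toy data): support is the fraction of transactions containing an itemset, and the confidence of a rule A → B is support(A ∪ B) / support(A).

```python
# Hypothetical market-basket transactions.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset):
    """Fraction of transactions that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    return support(antecedent | consequent) / support(antecedent)
```

Algorithms like Apriori avoid enumerating all itemsets by pruning those below a minimum support threshold.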
7. neural | attention network
The attention mechanism emerged as an improvement over the encoder-decoder-based neural machine translation system in natural language processing (NLP). The encoder LSTM is used to process the entire input sentence and encode it into a context vector, which is the last hidden state of the LSTM/RNN. But Bahdanau et al. put emphasis on the embeddings of all the words in the input (represented by hidden states) while creating the context vector. They did this by simply taking a weighted sum of the hidden states. https://www.analyticsvidhya.com/blog/2019/11/comprehensive-guide-attention-mechanism-deep-learning/
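The weighted-sum step can be sketched in a few lines; the hidden states and alignment scores below are toy values, not the output of a trained model.

```python
import numpy as np

def attention_context(hidden_states, scores):
    """Softmax the alignment scores, then take the weighted sum of the
    encoder hidden states to form the context vector."""
    weights = np.exp(scores - scores.max())   # stable softmax
    weights = weights / weights.sum()
    return weights @ hidden_states, weights

# Toy example: 3 encoder hidden states of dimension 2, equal scores.
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
context, w = attention_context(H, np.array([0.1, 0.1, 0.1]))
# Equal scores -> uniform weights -> context is the mean hidden state.
```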
8. neural | attractor
In the mathematical field of dynamical systems, an attractor is a set of states toward which a system tends to evolve.
We present a recurrent neural network that encodes structured representations as systems of contextually-gated dynamical attractors called attractor graphs. This network implements a functionally compositional working memory that is manipulated using top-down gating and fast local learning.
https://www.researchgate.net/publication/349222386_Compositional_memory_in_attractor_neural_networks_with_one-step_learning
9. symbolic | axiomatic fuzzy set (AFS) theory
A means of semantic interpretation related to the features, which can form chains satisfying the Zadeh algebra axioms [21]. The AFS theory [27] provides a way to reflect the semantic content of data, which depends upon the data distribution and the underlying semantics. The AFS framework [20] concentrates on how to transform the information of data into semantic knowledge by combining vagueness with randomness.
44KCPGAV
10. neural | Bayesian deep neural network (BDNN)
BNNs are the neural networks that place prior distributions over the weights of neurons. Given the observed data, BNNs are required to find the distributions of weights that produce the most reasonable outputs. https://www.sciencedirect.com/science/article/pii/S0925231220310481
11. implementation | bayesnet (TAN)
Differentiable TAN Structure Learning for Bayesian Network Classifiers: the proposed method learns a distribution over graph structures using gradient descent
https://arxiv.org/abs/2008.09566
12. task | belief propagation | message passing in a graph
13. belief-rule-base (BRB) | TBJ6EUJC
14. n/a | bilevel programming
Bilevel optimization is a special kind of optimization where one problem is embedded (nested) within another. The outer optimization task is commonly referred to as the upper-level optimization task, and the inner optimization task is commonly referred to as the lower-level optimization task. These problems involve two kinds of variables, referred to as the upper-level variables and the lower-level variables. https://en.wikipedia.org/wiki/Bilevel_optimization
7Q5JRVK2
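The nesting can be illustrated with a brute-force sketch under assumed toy objectives (not from the source): the lower-level best response y*(x) is computed first, then the upper level optimizes over x taking that response into account.

```python
import numpy as np

xs = np.linspace(-2.0, 2.0, 81)
ys = np.linspace(-2.0, 2.0, 81)

def lower_level(x):
    """Lower-level (follower) problem: y*(x) = argmin_y (y - x)^2."""
    return ys[np.argmin((ys - x) ** 2)]

def upper_objective(x):
    """Upper-level (leader) objective, evaluated at the follower's response."""
    y_star = lower_level(x)
    return (x - 1.0) ** 2 + y_star ** 2

best_x = min(xs, key=upper_objective)
# On this grid y*(x) = x, so the leader minimizes 2x^2 - 2x + 1: x = 0.5.
```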
15. task | binding problem
identification of a general neural mechanism to represent which individual assemblies of active neurons are associated and which are not (temporal synchrony, conjunctive coding - the ability to put together parts never before seen, convolution-based approaches) (pg. 15, Besold et al., 2017)
items which are encoded in distinct circuits of a massively parallel computing device can be combined in complex ways for various cognitive tasks. pg. 22
16. symbolic | case based reasoning (CBR)
an experience-based approach to solving new problems by adapting previously successful solutions to similar problems
17. implementation | CILP | knowledge extraction as learning task
18. symbolic | classical propositional logic
Classical (or "bivalent") truth-functional propositional logic is that branch of truth-functional propositional logic that assumes that there are only two possible truth-values a statement (whether simple or complex) can have: (1) truth, and (2) falsity, and that every statement is either true or false but not both. https://iep.utm.edu/prop-log/
19. n/a | cognitive linguistics
Cognitive linguistics is an interdisciplinary approach to the study of language, mind, and sociocultural experience that first emerged in the 1970s. ... It also takes the view that language reflects general aspects of cognition rather than adopting a modular view of mind. https://www.oxfordbibliographies.com/view/document/obo-9780199772810/obo-9780199772810-0059.xml#:~:text=Cognitive%20linguistics%20is%20an%20interdisciplinary,first%20emerged%20in%20the%201970s.&text=It%20also%20takes%20the%20view,a%20modular%20view%20of%20mind.
20. symbolic | Cognitive Model (CM)
list of Image Schema relations (Johnson (2013), Lakoff (1989), and Langacker (1999) established the theory of IS and CM)
NB39QA35
21. symbolic | commonsense knowledge | nonmonotonic
22. task | Commonsense Machine Comprehension (CMC)
Commonsense Machine Comprehension (CMC) is a popular natural language understanding task. CMC enables computers to learn about causal and temporal reasoning by exploiting implicit commonsense knowledge and can be applied to Question Answering, Search Engine and Dialogue System. https://www.researchgate.net/publication/336746043_DTC_Transfer_Learning_for_Commonsense_Machine_Comprehension
23. symbolic | commonsense reasoning
24. implementation | ComplEx
semantic matching models, such as ComplEx and DistMult
25. either | compositionality
capacity to produce new combinations from known components. capacity to generalize, transferring knowledge from one context to others. https://arxiv.org/pdf/1909.05885.pdf
26. neural | concept acquisition | unsupervised, statistical
27. symbolic | concept manipulation | supervised, symbolic
28. symbolic | Concepts of Neighbours | 2-s2.0-85111431238
29. neural | connectionism
Connectionism is an approach in the fields of cognitive science that hopes to explain mental phenomena using artificial neural networks (ANN). Connectionism presents a cognitive theory based on simultaneously occurring, distributed signal activity via connections that can be represented numerically, where learning occurs by modifying connection strengths based on experience.
https://en.wikipedia.org/wiki/Connectionism
30. neural | connectionist modal logic
Connectionist Modal Logics (CML). CML belongs to the domain of neural-symbolic integration, which concerns the application of problem-specific symbolic knowledge within the neurocomputing paradigm. In CML, one may represent, reason or learn modal logics using a neural network.
https://www.sciencedirect.com/science/article/pii/S030439750600750X
31. symbolic | constituency parsing
32. task | constraint reasoning
Constraint-based reasoning has connections to a wide variety of fields, including formal logic, graph theory, relational databases, combinatorial algorithms, operations research, neural networks, truth maintenance, and logic programming. https://mitpress.mit.edu/books/constraint-based-reasoning
T3BYK5QU
33. either | constraint satisfaction programming
In artificial intelligence and operations research, constraint satisfaction is the process of finding a solution to a set of constraints that impose conditions that the variables must satisfy. A solution is therefore a set of values for the variables that satisfies all constraints—that is, a point in the feasible region.
Constraint programming (CP) is a paradigm for solving combinatorial problems
7Q5JRVK2
34. symbolic | continuous logic
Continuous logic is a logic whose truth values can take continuous values in [0,1]
35. neural | convolutional neural network (CNN)
36. neural | curriculum learning
Providing increasingly more difficult samples to learn from when training a NN.
2-s2.0-85083954234
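A minimal sketch, assuming each sample carries a precomputed difficulty score (real curricula derive difficulty from, e.g., sentence length or a weaker model's loss): successive training stages see growing, easiest-first portions of the data.

```python
# Hypothetical (sample, difficulty) pairs.
samples = [("hard", 0.9), ("easy", 0.1), ("medium", 0.5)]

def curriculum(samples, n_stages=3):
    """Yield growing prefixes of the data, sorted easiest-first."""
    ordered = sorted(samples, key=lambda s: s[1])
    for stage in range(1, n_stages + 1):
        cutoff = round(len(ordered) * stage / n_stages)
        yield [name for name, _ in ordered[:cutoff]]

stages = list(curriculum(samples))
# stages[0] holds only the easiest sample; the last stage holds them all.
```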
37. task | decision making
38. symbolic | deduction
The inference of particular instances by reference to a general law or principle; reasoning from the general to the specific. Conclusions are certain.
39. neural | deep belief network (DBN) | stack many RBMs in a hierarchical manner | B47SSE6P
40. symbolic | Deontic Logic
The study of logical systems for formalizing normative statements. Normative knowledge can be used to model knowledge about met and unmet expectations.
Deontic logic is the field of philosophical logic that is concerned with obligation, permission, and related concepts.
7MMJY5BM,
https://link.springer.com/book/10.1007%2F978-3-319-08615-6
41. symbolic | dependency parsing
42. symbolic | description logic
Description logics (DL) are a family of formal knowledge representation languages. Many DLs are more expressive than propositional logic but less expressive than first-order logic. In contrast to the latter, the core reasoning problems for DLs are (usually) decidable.
https://en.wikipedia.org/wiki/Description_logic
43. neural | Differentiable Neural Computer | type of Memory Network | NB39QA35
44. implementation | DistMult
semantic matching models, such as ComplEx and DistMult
45. neural | distributed representation
46. n/a | distributional semantics
The distributional hypothesis suggests that the more semantically similar two words are, the more distributionally similar they will be in turn, and thus the more that they will tend to occur in similar linguistic contexts. (You are the company you keep), word2vec et al.
distributional hypothesis of language
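A toy illustration of the hypothesis (the co-occurrence counts are invented): words that appear in similar contexts get similar count vectors, so cosine similarity tracks semantic similarity.

```python
import numpy as np

# Hypothetical co-occurrence counts with the context words
# ("drink", "bark", "leaf").
vectors = {
    "coffee": np.array([8.0, 0.0, 1.0]),
    "tea":    np.array([7.0, 0.0, 2.0]),
    "dog":    np.array([0.0, 9.0, 3.0]),
}

def cosine(u, v):
    """Cosine similarity between two context-count vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
```

Here "coffee" and "tea" share drinking contexts, so their cosine similarity is higher than that of "coffee" and "dog"; word2vec and friends learn dense vectors with the same property.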
47. domain specific language (DSL)
Domain-specific languages are languages (or often, declared syntaxes or grammars) with very specific goals in design and implementation. https://en.wikipedia.org/wiki/Domain-specific_language
48. DrLIM
Dimensionality reduction by learning an invariant mapping
Hadsell R, Chopra S, LeCun Y (2006) Dimensionality reduction by learning an invariant mapping. In: 2006 IEEE computer society conference on computer vision and pattern recognition, vol 2. IEEE, pp 1735–1742
49. neural | Dynamic Memory Networks | type of Memory Network | NB39QA35
50. neural | Dynamic Neural Turing Machines | type of Memory Network | NB39QA35
51. n/a | dynamical system
52. implementation | Ent-Net | NB39QA35
53. symbolic | epistemic logic
Epistemic logic is a subfield of epistemology concerned with logical approaches to knowledge, belief and related notions. Though any logic with an epistemic interpretation may be called an epistemic logic, the most widespread type of epistemic logics in use at present are modal logics. https://plato.stanford.edu/entries/logic-epistemic/
54. either | evidential based reasoning
55. n/a | evolutionary learning
In computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence and soft computing studying these algorithms. In technical terms, they are a family of population-based trial and error problem solvers with a metaheuristic or stochastic optimization character. In evolutionary computation, an initial set of candidate solutions is generated and iteratively updated. Each new generation is produced by stochastically removing less desired solutions, and introducing small random changes. https://en.wikipedia.org/wiki/Evolutionary_computation
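The generate/select/mutate loop described above can be sketched as below (one-dimensional toy objective; population size, mutation scale, and seed are all invented):

```python
import random

random.seed(0)  # make the toy run deterministic

def evolve(fitness, n_gen=200, pop_size=20, sigma=0.3):
    """Keep the fitter half of the population each generation and refill
    it with Gaussian mutations of the survivors (elitist selection)."""
    pop = [random.uniform(-5, 5) for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [x + random.gauss(0, sigma) for x in survivors]
    return max(pop, key=fitness)

# Maximize -(x - 2)^2; the optimum is x = 2.
best = evolve(lambda x: -(x - 2) ** 2)
```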
56. implementation | extended belief rule-based system (EBRBS)
57. neural | extreme learning machine
Extreme learning machines are feedforward neural networks for classification, regression, clustering, sparse approximation, compression and feature learning with a single layer or multiple layers of hidden nodes, where the parameters of hidden nodes need not be tuned. https://en.wikipedia.org/wiki/Extreme_learning_machine
single-hidden-layer feedforward networks (SLFNs) with randomly generated additive or radial basis function (RBF) hidden nodes (according to any continuous sampling distribution) can work as universal approximators and the resulting incremental extreme learning machine (I-ELM) https://www.sciencedirect.com/science/article/abs/pii/S0925231207000677
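The recipe is short enough to sketch (the regression task, layer sizes, and seed are arbitrary choices for illustration): the hidden weights stay random, and only the output weights are solved for in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: fit y = sin(x) on [-3, 3].
X = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y = np.sin(X).ravel()

W = rng.normal(size=(1, 50))   # random input weights, never tuned
b = rng.normal(size=50)        # random biases, never tuned
H = np.tanh(X @ W + b)         # hidden-layer activations

# Output weights via least squares -- the only "training" step.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = float(np.mean((H @ beta - y) ** 2))
```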
58. symbolic | factor graph
A factor graph is a type of probabilistic graphical model. A factor graph has two types of nodes: Variables, which can be either evidence variables when their value is known, or query variables when their value should be predicted. Factors, which define the relationships between variables in the graph.
59. symbolic | Feature Description Logic (FDL)
https://www.sciencedirect.com/topics/computer-science/description-logics
Description Logic
60. symbolic | first order logic (FOL)
First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable.[1] This distinguishes it from propositional logic, which does not use quantifiers or relations;[2] in this sense, propositional logic is the foundation of first-order logic. https://en.wikipedia.org/wiki/First-order_logic
predicate logic, quantificational logic, first-order predicate calculus
61. symbolic | first-order inference | FOL inference?
62. Formal Concept Analysis (FCA)
63. n/a | formalism
a description of something in formal mathematical or logical terms. "there is a formalism which expresses the idea of superposition"
64. symbolic | FROG preprocessing [43], Tuffy [35] or LazySAT [44]
algorithms to detect and discard non-informative groundings (algorithms for propositionalization)
65. n/a | functional programming
Functions are first-class citizens. Functions have no side effects. There is no state, so the same input always produces the same output.
66. symbolic | fuzzy cognitive maps (FCMs)
A fuzzy cognitive map (FCM) is a cognitive map within which the relations between the elements (e.g. concepts, events, project resources) of a "mental landscape" can be used to compute the "strength of impact" of these elements. Fuzzy cognitive maps were introduced by Bart Kosko.[1][2] Robert Axelrod introduced cognitive maps as a formal way of representing social scientific knowledge and modeling decision making in social and political systems; Kosko then brought in fuzzy logic computation. https://en.wikipedia.org/wiki/Fuzzy_cognitive_map, http://sipi.usc.edu/~kosko/FCM.pdf
---
Fuzzy Cognitive Maps (FCMs) [8] are a type of cognitive semantic models, in which knowledge is represented with a weighted directed graph, whose vertexes are concepts while arcs correspond to relationships between them. FCMs are employed to visualize, model, and simulate the behavior of dynamic systems. (JA5HHBV2)
CGLBI7VX,
JA5HHBV2
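A minimal simulation sketch (the three concepts and causal weights are invented): at each step, every concept's activation becomes the squashed weighted sum of the activations of the concepts that influence it.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# W[i, j]: causal weight of concept j on concept i
# (positive = reinforcing, negative = inhibiting).
W = np.array([
    [0.0,  0.6,  0.0],
    [0.4,  0.0, -0.5],
    [0.0,  0.7,  0.0],
])

def simulate(state, steps=50):
    """Iterate the FCM update until (for these weights) it settles."""
    for _ in range(steps):
        state = sigmoid(W @ state)
    return state

final = simulate(np.array([1.0, 0.0, 0.0]))
# With these small weights the map converges to a fixed point in (0, 1)^3.
```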
67. symbolic | fuzzy logic
In logic, fuzzy logic is a form of many-valued logic in which the truth value of variables may be any real number between 0 and 1. It is employed to handle the concept of partial truth, where the truth value may range between completely true and completely false.
https://en.wikipedia.org/wiki/Fuzzy_logic
Probabilistic logic
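The standard Zadeh operators are a one-line sketch each; note that with partial truth the law of the excluded middle no longer holds (the degrees 0.7 and 0.4 below are arbitrary examples):

```python
def f_and(a, b):   # fuzzy conjunction: minimum of the truth degrees
    return min(a, b)

def f_or(a, b):    # fuzzy disjunction: maximum of the truth degrees
    return max(a, b)

def f_not(a):      # fuzzy negation: complement
    return 1.0 - a

tall, fast = 0.7, 0.4
tall_and_fast = f_and(tall, fast)          # 0.4
excluded_middle = f_or(tall, f_not(tall))  # 0.7, not 1.0
```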
68. neural | fuzzy min-max (FMM)
The fuzzy min–max (FMM) network is a supervised neural network classifier that forms hyperboxes for classification and prediction. https://www.sciencedirect.com/science/article/pii/S1568494607000865
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=159066
69. neural | fuzzy neural network (FNN)
A fuzzy neural network or neuro-fuzzy system is a learning machine that finds the parameters of a fuzzy system (i.e., fuzzy sets, fuzzy rules) by exploiting approximation techniques from neural networks. http://www.scholarpedia.org/article/Fuzzy_neural_network#:~:text=A%20fuzzy%20neural%20network%20or,approximation%20techniques%20from%20neural%20networks.
70. symbolic | fuzzy rule
71. implementation | Gated Graph Transformer Neural Network
(Johnson, 2017) converts a textual story into a graphical structure via differentiable operations.
NB39QA35
72. neural | Gated recurrent units
Gated recurrent units are a gating mechanism in recurrent neural networks
73. neural | generative adversarial network (GAN)
74. symbolic | genetic programming (GP)
In artificial intelligence, genetic programming (GP) is a technique of evolving programs, starting from a population of unfit (usually random) programs, fit for a particular task by applying operations analogous to natural genetic processes to the population of programs. It is essentially a heuristic search technique often described as 'hill climbing' i.e. searching for an optimal or at least suitable program among the space of all programs. (iterative search)
genetic algorithms (GA)
75. neural | gradient boosted trees (GBT) | classifying as neural due to gradient descent.
76. neural | gradient descent
Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function
GD, SGD
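The update rule x ← x − η·∇f(x) in a few lines, applied to an assumed toy objective f(x) = (x − 3)²:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient of a differentiable function."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2 * (x - 3) and its minimum at x = 3.
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

SGD replaces the exact gradient with an estimate computed on a random mini-batch of the data.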
77. n/a | granular computing (GrC)
Granular computing (GrC) is an emerging computing paradigm of information processing that concerns the processing of complex information entities called "information granules", which arise in the process of data abstraction and derivation of knowledge from information or data. Generally speaking, information granules are collections of entities that usually originate at the numeric level and are arranged together due to their similarity, functional or physical adjacency, indistinguishability, coherency, or the like. At present, granular computing is more a theoretical perspective than a coherent set of methods or principles. https://en.wikipedia.org/wiki/Granular_computing
78. implementation | granular logic-oriented autoencoders | RKWLKKSY
79. neural | graph neural network (GNN)
80. task | graph reasoning
the task of predicting a missing entity, given one existing entity and the relation.
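One common approach scores candidate completions with an embedding model; the sketch below uses a DistMult-style trilinear score (the entities, relation, and embedding values are invented, not learned):

```python
import numpy as np

# Hypothetical 2-dimensional embeddings.
entities = {
    "paris":  np.array([1.0, 0.0]),
    "france": np.array([0.9, 0.1]),
    "tokyo":  np.array([0.0, 1.0]),
}
relations = {"capital_of": np.array([1.0, 0.2])}

def score(h, r, t):
    """DistMult-style score: sum of elementwise products."""
    return float(np.sum(entities[h] * relations[r] * entities[t]))

def predict_tail(h, r):
    """Rank every other entity as the missing tail and return the best."""
    candidates = {t: score(h, r, t) for t in entities if t != h}
    return max(candidates, key=candidates.get)
```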
81. symbolic | graph representation
82. symbolic | Graph-FCA | 2-s2.0-85111431238
83. symbolic | Grounding
Grounding is the task of reducing a first-order theory and finite domain to an equivalent propositional theory. It is used as preprocessing phase in many logic-based reasoning systems. Such systems provide a rich first-order input language to a user and can rely on efficient propositional solvers to perform the actual reasoning. https://arxiv.org/pdf/1401.3840.pdf#:~:text=Grounding%20is%20the%20task%20of,many%20logic%2Dbased%20reasoning%20systems.
In mathematical logic, a ground term of a formal system is a term that does not contain any variables. Similarly, a ground formula is a formula that does not contain any variables. https://en.wikipedia.org/wiki/Ground_expression.
The action of representing elements of the logic language as elements in the vector space is referred to as grounding 10.1007/978-3-030-31095-0_11
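A sketch of propositional grounding over a finite domain (the predicates and constants are invented): every variable is replaced by every constant, so the number of ground clauses grows as |domain| to the power of the rule's arity.

```python
from itertools import product

domain = ["socrates", "plato"]

# One-variable rule: mortal(X) :- human(X)
unary = [(f"mortal({c})", f"human({c})") for c in domain]

# Two-variable rule: ancestor(X, Y) :- parent(X, Y); grounding takes the
# Cartesian product of constants, hence |domain|**2 clauses.
binary = [(f"ancestor({x},{y})", f"parent({x},{y})")
          for x, y in product(domain, repeat=2)]
```

This blow-up is why systems like Tuffy or LazySAT work hard to avoid materializing non-informative groundings.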
84. implementation | heterogeneous graph reasoning (HGR)
https://www.sciencedirect.com/science/article/pii/S0925231221002678
85. n/a | heuristic rule
86. neural | hidden Markov model (HMM)
A hidden Markov model is a statistical Markov model in which the system being modeled is assumed to be a Markov process. (I'm tagging this as neural because it's a statistical process which employs optimization. However, it's not strictly neural, but more probabilistic.)
87. hierarchical NNLM (HNLM)
88. symbolic | higher order logic
In mathematics and logic, a higher-order logic is a form of predicate logic that is distinguished from first-order logic by additional quantifiers and, sometimes, stronger semantics. Higher-order logics with their standard semantics are more expressive, but their model-theoretic properties are less well-behaved than those of first-order logic. https://en.wikipedia.org/wiki/Higher-order_logic
89. implementation | Huffman tree | task/implementation | JGU2SECC
90. symbolic | human reasoning
91. symbolic | Hybrid Markov Logic Networks (HMLNs) | extend MLNs to deal with continuous variables.
92. symbolic | Image Schema (IS)
IS is a group of common cognitive patterns that appear repeatedly in the human brain while interacting with the environment. It is an abstraction of the physical world.
(Johnson (2013), Lakoff (1989), and Langacker (1999) established the theory of IS and CM)
Johnson, Mark. 1987. The body in the mind: The bodily basis of meaning, imagination, and reason. Chicago: Univ. of Chicago Press. This work develops the theoretical construct of the image schema, one of the most important ideas in cognitive linguistics.
NB39QA35
93. symbolic | inconsistency-tolerant reasoning | Lembo et al., 2010
94. n/a | induction
inducing the universal from the particular. The process of reasoning in which the premises of an argument support the conclusion, but do not ensure it. "All observed swans are white, therefore all swans are white." But there could be black swans that we just haven't observed.
inductive reasoning, inductive logic
95. inductive bias
inductive bias can be formalized as the set of assumptions which determine the choice of a particular class of functions for supporting the learning process. Embeds prior knowledge.
inductive bias: assumptions about the target variable. It differs between algorithms. For example, in Nearest Neighbors, the assumption is that target variables belonging to the same class are close together. In SVM, the assumption is that distinct classes tend to be separated by wide boundaries.
GS3TRUYZ
96. task | Inductive concept learning | (like defeasible reasoning)
97. symbolic | inductive logic programming (ILP)
Inductive logic programming is a subfield of symbolic artificial intelligence which uses logic programming as a uniform representation for examples, background knowledge and hypotheses. Given an encoding of the known background knowledge and a set of examples represented as a logical database of facts, an ILP system will derive a hypothesised logic program which entails all the positive and none of the negative examples. https://en.wikipedia.org/wiki/Inductive_logic_programming
98. n/a | intension
In linguistics, logic, philosophy, and other fields, an intension is any property or quality connoted by a word, phrase, or another symbol. In the case of a word, the word's definition often implies an intension.
99. symbolic | intuitionistic logic
Intuitionism is based on the idea that mathematics is a creation of the mind; “classical logic without the principle of the excluded middle”. It is denoted by IQC, which stands for Intuitionistic Quantifier Logic.
https://plato.stanford.edu/entries/intuitionism/#IntLog
Intuitionistic logic, sometimes more generally called constructive logic, refers to systems of symbolic logic that differ from the systems used for classical logic by more closely mirroring the notion of constructive proof. In particular, systems of intuitionistic logic do not include the law of the excluded middle and double negation elimination, which are fundamental inference rules in classical logic.
https://en.wikipedia.org/wiki/Intuitionistic_logic
100. symbolic | knowledge graph (KG)