The OpenNARS implementation of the Non-Axiomatic Reasoning System

Created: March 6, 2016 / Updated: November 2, 2024 / Status: finished / 6 min read (~1147 words)
Artificial General Intelligence

  • NARS has a memory, a logic component and a control component
  • The logic component consists of inference rules that work on statements, where the statements are goals, questions and beliefs
    • A statement can be eternal (time-independent) or an event (time-dependent)
    • Beliefs are statements that the system believes to be true to a certain extent
    • An inference task is a statement to be processed, with additional control relevant information
  • NAL: Non-Axiomatic Logic
  • Narsese: Language for representing statements
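
For concreteness, here is roughly what Narsese statements look like (a sketch only; the exact grammar differs between OpenNARS versions, and the terms are made up):

```java
public class NarseseExamples {
    public static void main(String[] args) {
        String[] lines = {
            "<robin --> bird>. %1.0;0.9%",  // belief (judgment) with frequency 1.0 and confidence 0.9
            "<robin --> bird>?",            // question
            "<{door1} --> [open]>!",        // goal
            "<robin --> bird>. :|:"         // event: a belief with an occurrence time (present-tense marker)
        };
        for (String line : lines) {
            System.out.println(line);
        }
    }
}
```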

  • 3 primary operations:
    • Return the best ranked belief or goal for inference within concepts (local inference)
    • Provide a pair of contextually relevant and semantically related statements for inference between concepts (general inference)
    • Add statements to memory whilst maintaining constraints on the system
  • The main loop:
    • Get a concept from memory
    • Get a task and belief related to the selected concept
    • Derive new tasks from the selected task and belief
    • Put the involved items back into the corresponding bags
    • Put the new tasks into the corresponding bags
  • System of metadata (budget and stamp)
    • Used to prevent certain forms of invalid inference such as double counting evidence and cyclic reasoning
    • Abstracts temporal requirements away from the Narsese grammar
    • Provides certain implementation efficiencies
  • A budget determines the allocation of system resources (time and space) and is defined as $(p, d, q) \in [0, 1] \times (0, 1) \times [0, 1]$
    • p: priority
    • d: durability
    • q: quality
  • A stamp is defined as $(id, t_{cr}, t_{oc}, C, E) \in \mathbb{N} \times \mathbb{N} \times \mathbb{N} \times \mathbb{N} \times \mathcal{P}(\mathbb{N})$
    • $id$: unique id
    • $t_{cr}$: creation time
    • $t_{oc}$: occurrence time
    • C: syntactic complexity (the number of subterms in the associated term)
    • E: an evidential set
  • Curve bag is a data structure that supports a probabilistic selection according to the item priority distribution
  • The priority value p of the items in the bag maps to their access frequency by a predefined monotonically increasing function
  • Called a curve bag because it allows the user to define a custom curve, which is highly flexible and allows emotional parameters and introspective operators to influence this selection (see the sketch after this list)
  • The memory consists of a curve bag of concepts, where a concept is a container for: a concept term, a tasklink curve bag, a termlink curve bag, belief tables and goal tables
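
As a rough illustration of the curve bag and of budgets, here is a minimal sketch in which the probability of taking an item is proportional to a user-supplied monotonically increasing function of its priority; the class names and the quadratic curve are assumptions for illustration, not the actual OpenNARS code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.function.DoubleUnaryOperator;

/** Minimal curve-bag sketch: access frequency is a monotonically increasing
 *  function (the curve) of item priority. Illustrative only. */
public class CurveBagSketch<E> {
    /** Budget (p, d, q): priority, durability, quality. */
    public static final class Budget {
        final double priority, durability, quality;
        Budget(double p, double d, double q) { priority = p; durability = d; quality = q; }
    }

    private static final class Item<T> {
        final T value; final Budget budget;
        Item(T value, Budget budget) { this.value = value; this.budget = budget; }
    }

    private final List<Item<E>> items = new ArrayList<>();
    private final DoubleUnaryOperator curve;  // user-defined monotone curve over priority
    private final Random rng = new Random(42);

    public CurveBagSketch(DoubleUnaryOperator curve) { this.curve = curve; }

    public void put(E value, Budget budget) { items.add(new Item<>(value, budget)); }

    /** Probabilistic take: the chance of selecting an item is proportional to curve(priority). */
    public E take() {
        if (items.isEmpty()) return null;
        double total = items.stream().mapToDouble(i -> curve.applyAsDouble(i.budget.priority)).sum();
        double r = rng.nextDouble() * total;
        for (int k = 0; k < items.size(); k++) {
            r -= curve.applyAsDouble(items.get(k).budget.priority);
            if (r <= 0) return items.remove(k).value;
        }
        return items.remove(items.size() - 1).value;  // numerical fallback
    }

    public static void main(String[] args) {
        // A quadratic curve makes high-priority items disproportionately likely to be selected.
        CurveBagSketch<String> bag = new CurveBagSketch<>(p -> p * p);
        bag.put("<robin --> bird>", new Budget(0.9, 0.8, 0.5));
        bag.put("<tweety --> canary>", new Budget(0.3, 0.8, 0.5));
        System.out.println("selected: " + bag.take());
    }
}
```

The point of making the curve a parameter is that other mechanisms (for example, emotional parameters or introspective operators) can reshape it and thereby bias which items get attention.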

  • Composed of two components: an inference rule domain specific language (Meta Rule DSL) and an inference rule execution unit
  • The Meta Rule DSL is not the same thing as the NAL grammar rules
  • Meta Rule DSL: provides a flexible methodology to quickly experiment with alternate inference rules, to support the goal of creating a literate program, and to substantially improve the quality of the software implementation
  • Meta inference rules take the following form (an illustrative encoding is sketched after this list):
    $T, B, P_1, \ldots, P_n \vdash (C_1, \ldots, C_n)$
    • $T$: the task to be processed
    • $B$: the belief retrieved for the task
    • $P_1, \ldots, P_n$: logical predicates dependent on $T, B, C_1, \ldots, C_n$
    • $C_1, \ldots, C_n$: conclusions in the form $(D_i, M_i)$, where $D_i$ is the term of the derived task the conclusion $C_i$ defines, and $M_i$ provides additional meta-information, such as which truth function will be used to decide the truth or desire of the conclusion, how the temporal information will be processed, or whether backwards inference is allowed
  • The inference rule execution unit roles are:
    • Parse the Meta Rule DSL into an efficient and executable representation
    • Select and execute the relevant inference rules
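
The concrete syntax of the Meta Rule DSL is not reproduced here; purely as an illustration of the rule shape $T, B, P_1, \ldots, P_n \vdash (C_1, \ldots, C_n)$ described above, a rule could be encoded as data roughly as follows (the record names and the deduction example are assumptions, not the actual DSL):

```java
import java.util.List;

/** Illustrative-only encoding of a meta inference rule T, B, P1..Pn |- (C1..Cn);
 *  these names are assumptions, not the actual OpenNARS Meta Rule DSL. */
public class MetaRuleSketch {
    record Conclusion(String derivedTerm, String truthFunction, boolean allowBackward) {}
    record MetaRule(String taskPattern, String beliefPattern,
                    List<String> preconditions, List<Conclusion> conclusions) {}

    public static void main(String[] args) {
        // NAL deduction: from task <A --> B> and belief <B --> C>, derive <A --> C>.
        MetaRule deduction = new MetaRule(
            "<A --> B>",                       // T: task pattern
            "<B --> C>",                       // B: belief pattern
            List.of("not_equal(A, C)"),        // P1..Pn: logical preconditions
            List.of(new Conclusion("<A --> C>", "Deduction", true))  // (D_i, M_i)
        );
        System.out.println(deduction);
    }
}
```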

  • In order to support temporal reasoning, the non-temporal NAL inference rules are extended by adding temporal variants (the temporal window and chaining are sketched after this list):
    • Temporal window: Events occurring within a specified temporal window will be deemed to have occurred at the same time
    • Temporal chaining: Semantically unrelated events are linked together if one temporally follows the other
    • Interval handling: Event patterns that occur at a given interval from one another can be detected/observed
    • Projection
    • Eternalization
    • Anticipation
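
A small sketch of the temporal window and temporal chaining ideas; the window size and the Narsese-like comments are illustrative assumptions:

```java
/** Sketch of the temporal window and temporal chaining described above; the
 *  window size (in cycles) and the event times are made-up values. */
public class TemporalWindowSketch {
    static final long WINDOW = 5;

    /** Events occurring closer together than the window are deemed simultaneous. */
    static boolean simultaneous(long t1, long t2) {
        return Math.abs(t1 - t2) < WINDOW;
    }

    public static void main(String[] args) {
        long lightningAt = 100, thunderAt = 103, rainAt = 160;

        // Within the window: treated as concurrent, e.g. (&|, lightning, thunder).
        System.out.println("concurrent? " + simultaneous(lightningAt, thunderAt));

        // Outside the window: temporal chaining links the events in order of occurrence,
        // e.g. a predictive implication <(&/, lightning, interval) =/> rain>.
        if (!simultaneous(lightningAt, rainAt)) {
            System.out.println("chained with interval " + (rainAt - lightningAt) + " cycles");
        }
    }
}
```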

  • When a truth value for a statement is projected in time, its value decreases according to the following function:
  • $c_{new} = (1 - k_c) \times c_{old}$
    • $c_{new}$: new confidence value of the belief
    • $c_{old}$: old confidence value of the belief
    • $k_c$: confidence decay
  • $k_c = \frac{|t_B - t_T|}{|t_B - t_C| + |t_T - t_C|}$
    • $k_c$: confidence decay
    • $t_B$: original occurrence time of the belief
    • $t_T$: occurrence time the belief is projected to
    • $t_C$: current time
  • In eternalization, the occurrence time is dropped, so the conclusion is about the general situation
  • $c_{new} = \frac{c_{old}}{k + c_{old}}$
    • $c_{new}$: new confidence value of the belief
    • $c_{old}$: old confidence value of the belief
    • $k$: a global evidential horizon personality parameter
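
The two formulas above transcribe directly into code; the numbers below are an arbitrary worked example:

```java
/** Direct transcription of the projection and eternalization formulas above. */
public class ProjectionSketch {
    static final double K = 1.0;  // evidential-horizon personality parameter k

    /** Confidence after projecting a belief from occurrence time tB to target time tT, at current time tC. */
    static double project(double cOld, long tB, long tT, long tC) {
        double kc = (double) Math.abs(tB - tT) / (Math.abs(tB - tC) + Math.abs(tT - tC));
        return (1.0 - kc) * cOld;
    }

    /** Confidence after dropping the occurrence time (eternalization). */
    static double eternalize(double cOld) {
        return cOld / (K + cOld);
    }

    public static void main(String[] args) {
        double c = 0.9;
        // Project a belief that occurred at t=100 to t=110, with the current time at t=120.
        double projected = project(c, 100, 110, 120);  // kc = 10 / (20 + 10) = 1/3, so c becomes 0.6
        System.out.printf("projected: %.3f, eternalized: %.3f%n", projected, eternalize(c));
    }
}
```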

  • Based on the observation of an antecedent and a behavior, a consequent is expected (anticipated)
  • If the antecedent and behavior are observed, and the consequent is also observed, nothing special needs to be done
  • In the opposite case, the system needs to recognize that the prediction was not entirely appropriate. Such an event is given a high budget and significantly influences the attention of the system (in order to rectify the situation)
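
A sketch of the anticipation bookkeeping described above (class and method names are illustrative, not the OpenNARS API):

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the anticipation mechanism: confirmed anticipations are dropped silently,
 *  expired ones become high-priority negative events. Illustrative only. */
public class AnticipationSketch {
    record Anticipation(String expectedEvent, long deadline) {}

    private final List<Anticipation> pending = new ArrayList<>();

    void anticipate(String expectedEvent, long deadline) {
        pending.add(new Anticipation(expectedEvent, deadline));
    }

    /** Called when an input event arrives: a confirmed anticipation needs no special handling. */
    void onEvent(String event) {
        pending.removeIf(a -> a.expectedEvent().equals(event));
    }

    /** Called every cycle: expired anticipations become high-budget negative events,
     *  shifting the system's attention toward revising the failed prediction. */
    void onCycle(long now) {
        pending.removeIf(a -> {
            if (now > a.deadline()) {
                System.out.println("disappointment: (--, " + a.expectedEvent() + ") with high priority");
                return true;
            }
            return false;
        });
    }

    public static void main(String[] args) {
        AnticipationSketch s = new AnticipationSketch();
        s.anticipate("<door --> [open]>", 50);  // after pressing the button, expect the door to open
        s.onCycle(60);                           // deadline passed without the event: negative evidence
    }
}
```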

  • The truth value of a statement is a $(w^+, w^-)$ pair, where $w^+$ represents positive evidence and $w^-$ represents negative evidence
  • Can alternatively be represented as a confidence and frequency pair (a worked example follows this list), where
    • $c = \frac{w^+ + w^-}{k + w^+ + w^-}$
    • $f = \frac{w^+}{w^+ + w^-}$
    • k: global personality parameter that indicates a global evidential horizon
  • Evidence follows the following principles:
    • Evidence can only be used once for each statement
    • A record of evidence used in each derivation must be maintained, although given AIKR (assumption of insufficient knowledge and resources), this is only a partial record, which is not an issue in practice
    • There can be positive and negative evidence for the same statement
    • Evidence is not only the key factor in determining truth, but also the key to judging the independence of the premises in an inference step
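
A worked example of the $(w^+, w^-)$ to $(f, c)$ conversion above, with $k = 1$:

```java
/** The evidence-to-truth conversion above: f = w+/(w+ + w-), c = (w+ + w-)/(k + w+ + w-). */
public class TruthFromEvidence {
    static final double K = 1.0;  // evidential horizon personality parameter

    public static void main(String[] args) {
        double wPlus = 4, wMinus = 1;      // 4 positive and 1 negative observations
        double w = wPlus + wMinus;
        double frequency = wPlus / w;      // 0.8
        double confidence = w / (K + w);   // 5/6 ~= 0.833
        System.out.printf("f = %.3f, c = %.3f%n", frequency, confidence);
    }
}
```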

  • Temporal chaining
  • Ranking
  • Adding to belief/desire table
  • Selecting belief for inference
  • Revision (combining evidence for the same statement; see the sketch after this list)
  • Decision
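
Of these operations, revision has a particularly simple form in the evidence representation above: when the stamps show that the two premises share no evidence, their evidence counts are simply added. A sketch:

```java
/** Sketch of revision in the (w+, w-) evidence representation above:
 *  premises with disjoint evidential sets (checked via their stamps) add their evidence. */
public class RevisionSketch {
    static final double K = 1.0;

    record Evidence(double wPlus, double wMinus) {
        double frequency()  { return wPlus / (wPlus + wMinus); }
        double confidence() { return (wPlus + wMinus) / (K + wPlus + wMinus); }
    }

    static Evidence revise(Evidence a, Evidence b) {
        return new Evidence(a.wPlus() + b.wPlus(), a.wMinus() + b.wMinus());
    }

    public static void main(String[] args) {
        Evidence fromCamera = new Evidence(3, 1);
        Evidence fromSonar  = new Evidence(2, 0);
        Evidence revised = revise(fromCamera, fromSonar);  // (5, 1)
        System.out.printf("f = %.3f, c = %.3f%n", revised.frequency(), revised.confidence());
    }
}
```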

  • 3-phase process:
    • Select contextually relevant and semantically related tasks for inference
    • Create or update budget values based on user requirements and/or inference results
    • Update memory with results of the updated task and concepts
  • Phase 1: Premises for inference are selected according to the following scheme
    • Select a concept from memory
    • Select a tasklink from this concept
    • Select a termlink from this concept
    • Select a belief, ranked by the task, from the concept the termlink points to
  • Phase 2: Formation of new statements (tasks), with new metadata, from the derivations.
  • Phase 3: Process the new tasks and insert them into memory.
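
Putting the three phases together, the control flow of one cycle looks roughly like this (the data structures are trivial stand-ins for the real concept/bag/table machinery):

```java
import java.util.List;
import java.util.Map;
import java.util.Random;

/** Control-flow sketch of one inference cycle (phases 1-3 above); illustrative only. */
public class InferenceCycleSketch {
    record Concept(String term, List<String> taskLinks, List<String> termLinks, List<String> beliefs) {}

    static final Random RNG = new Random(0);

    /** Stand-in for a curve-bag take(); a real implementation samples by priority. */
    static <T> T pick(List<T> items) { return items.get(RNG.nextInt(items.size())); }

    public static void main(String[] args) {
        Map<String, Concept> memory = Map.of(
            "bird",   new Concept("bird", List.of("<robin --> bird>."), List.of("animal"), List.of()),
            "animal", new Concept("animal", List.of(), List.of(), List.of("<bird --> animal>.")));

        // Phase 1: select contextually relevant, semantically related premises.
        Concept concept = memory.get("bird");                    // select a concept from memory
        String task     = pick(concept.taskLinks());             // select a tasklink from this concept
        String termLink = pick(concept.termLinks());             // select a termlink from this concept
        String belief   = pick(memory.get(termLink).beliefs());  // select a belief from the linked concept

        // Phase 2: run the inference rules on (task, belief), producing new tasks with budgets.
        String derived = "<robin --> animal>.";                  // e.g. deduction

        // Phase 3: insert the derived tasks and updated items back into memory.
        System.out.println(task + " + " + belief + " |- " + derived);
    }
}
```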