|
Learning as Search
Name |
Learning as Search |
Description |
To generalize and specialize in a stepwise manner, it is necessary
to find hypotheses that are minimally more general or more specific
than a given hypothesis. If the hypothesis space is ordered
by generality, learning can be viewed as a search, either
from the most general to the most specific hypotheses (top-down) or
from the most specific to the most general (bottom-up).
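The generality ordering that underlies this search can be made concrete for an attribute-value representation. Below is a minimal Python sketch; the function name and the '?'-wildcard encoding of hypotheses are assumptions of this sketch, not from the text:

```python
def more_general_or_equal(h1, h2):
    """True if hypothesis h1 is at least as general as h2.

    Hypotheses are tuples of attribute values, where '?' is a
    wildcard matching any value. h1 subsumes h2 if each of its
    constraints is a '?' or agrees with h2's value at that position.
    """
    return all(a == '?' or a == b for a, b in zip(h1, h2))

# A '?' in the first position makes the hypothesis more general:
print(more_general_or_equal(('?', 'round'), ('red', 'round')))   # True
print(more_general_or_equal(('red', 'round'), ('?', 'round')))   # False
```

This relation is only a partial order: two hypotheses that constrain different attributes are incomparable.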
Most specific generalization
For LH being a propositional logic (attribute-value representation),
the most specific generalization can be defined
with reference to the partial ordering of the hypothesis space.
Given a set of possible hypotheses LH, a concept c' in LH, and a
positive example e not covered by c'. The concept g is the most
specific generalization of c' with respect to e if and only if
- g is strictly more general than c'.
- the concept g covers the example e.
- there is no other concept g' in LH that is strictly more
general than c', covers e, and is strictly more specific than g.
So g is generated by generalizing one step: there is no room left for
a further generalization between g and the original concept c'.
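In a flat attribute-value representation, this one-step generalization can be computed directly: relax exactly the attributes that conflict with the positive example. A minimal sketch, with illustrative names not taken from the text:

```python
def covers(hypothesis, example):
    """A hypothesis covers an example if every attribute is '?' or
    equals the example's value."""
    return all(h == '?' or h == e for h, e in zip(hypothesis, example))

def most_specific_generalization(hypothesis, example):
    """One-step generalization: replace only the attributes that
    conflict with the positive example by '?', so the result covers
    the example while staying as specific as possible."""
    return tuple(h if h in ('?', e) else '?'
                 for h, e in zip(hypothesis, example))

c = ('red', 'round', 'small')
e = ('red', 'square', 'small')      # positive example not covered by c
g = most_specific_generalization(c, e)
print(g)                            # ('red', '?', 'small')
```

Only the conflicting attribute (the shape) is relaxed; relaxing any fewer attributes would leave e uncovered, relaxing more would overshoot the minimal step.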
Most general specialization
For LH being a propositional logic (attribute-value representation),
the most general specialization can be defined
with reference to the partial ordering of the hypothesis space.
Given a set of possible hypotheses LH, a concept c in LH, and an
example e covered by c.
The concept s is the most general specialization of c with respect to
e if and only if
- s is strictly more specific than c.
- the concept s does not cover the example e.
- there is no other concept s' in LH that is strictly more specific
than c, does not cover e, and is strictly more general than s.
So s is generated by specializing one step. This is especially
useful when e is a negative example that should not be
covered.
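Dually, the one-step specializations against a negative example can be enumerated: each '?' attribute is fixed to a value the negative example does not have. A sketch assuming hypotheses are tuples with '?' wildcards and that each attribute's value domain is given explicitly (both assumptions of this sketch):

```python
def most_general_specializations(hypothesis, neg_example, domains):
    """One-step specializations of `hypothesis` that exclude the
    negative example: for every '?' attribute, substitute each
    domain value that differs from the negative example's value.
    `domains` maps attribute index to its list of possible values."""
    result = []
    for i, h in enumerate(hypothesis):
        if h != '?':
            continue                      # already constrained
        for value in domains[i]:
            if value != neg_example[i]:   # must exclude the example
                s = list(hypothesis)
                s[i] = value
                result.append(tuple(s))
    return result

c = ('?', 'round')
e = ('red', 'round')                      # negative example covered by c
domains = {0: ['red', 'green'], 1: ['round', 'square']}
print(most_general_specializations(c, e, domains))  # [('green', 'round')]
```

Unlike the most specific generalization, this step is generally not unique: every '?' attribute and every admissible substitute value yields a candidate specialization.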
|
DM Step |
Concept Learning
|
Methods |
CANDIDATE-ELIMINATION
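The two stepwise operators above are the core of CANDIDATE-ELIMINATION, which maintains the boundaries of the version space. A simplified sketch under the same tuple-with-'?' encoding (S is kept as a single conjunctive hypothesis and pruning of redundant boundary members is omitted, so this is an illustration rather than the full algorithm):

```python
def covers(h, x):
    """'?' matches anything; otherwise attributes must agree."""
    return all(a == '?' or a == v for a, v in zip(h, x))

def candidate_elimination(examples, domains):
    """Simplified CANDIDATE-ELIMINATION for conjunctive hypotheses.

    S: most specific hypothesis consistent so far ('0' marks
       "covers nothing" before the first positive example).
    G: set of most general consistent hypotheses.
    """
    n = len(domains)
    S = ('0',) * n                 # bottom: covers no example
    G = [('?',) * n]               # top: covers every example
    for x, positive in examples:
        if positive:
            # Drop general hypotheses that miss the positive example,
            # then generalize S minimally to cover it.
            G = [g for g in G if covers(g, x)]
            if S == ('0',) * n:
                S = tuple(x)
            else:
                S = tuple(s if s == v else '?' for s, v in zip(S, x))
        else:
            # Specialize every member of G that wrongly covers the
            # negative example, one attribute at a time.
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                for i, a in enumerate(g):
                    if a != '?':
                        continue
                    for v in domains[i]:
                        # Keep only specializations that still lie
                        # above the specific boundary S.
                        if v != x[i] and S[i] in (v, '?', '0'):
                            spec = list(g)
                            spec[i] = v
                            new_G.append(tuple(spec))
            G = new_G
    return S, G

domains = {0: ['sunny', 'rainy'], 1: ['warm', 'cold']}
examples = [(('sunny', 'warm'), True), (('rainy', 'cold'), False)]
S, G = candidate_elimination(examples, domains)
print(S)          # ('sunny', 'warm')
print(sorted(G))  # [('?', 'warm'), ('sunny', '?')]
```

Positive examples push the specific boundary S upward via most specific generalization; negative examples push the general boundary G downward via most general specialization, exactly the two stepwise moves defined in the Description.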
|
|
|