The experimental results above show that the AIS-BN algorithm can improve sampling performance significantly. Our next series of tests focused on the role of the two AIS-BN initialization heuristics. The first is initializing the ICPT tables of the parents of evidence nodes to uniform distributions, denoted by U. The second is adjusting small probabilities, denoted by S. We denote AIS-BN without any heuristic initialization as the AIS algorithm; AIS+U+S is thus equivalent to AIS-BN. We compared the following versions of the algorithms: SIS, AIS, SIS+U, AIS+U, SIS+S, AIS+S, SIS+U+S, and AIS+U+S. All SIS-based algorithms used the same number of samples as SIS, and all AIS-based algorithms used the same number of samples as AIS-BN. We tested these algorithms on the same 75 test cases used in the previous experiment. Figure 11 shows the MSE for each of the sampling algorithms, with summary statistics in Table 3. Although the AIS algorithm outperforms the SIS algorithm, the difference is not as large as in the case of the AIS+U, AIS+S, and AIS-BN algorithms, which suggests that the heuristic initialization methods help considerably. The results for the SIS+S, SIS+U, and SIS+U+S algorithms indicate that while the heuristic initialization methods can improve performance, their effect on their own is limited. It is fair to say that the significant performance improvement of the AIS-BN algorithm comes from the combination of adaptive importance sampling with the heuristic methods, not from either component alone. This is not difficult to understand: only with good heuristic initialization can the learning process quickly escape oscillation regions. Similarly, although the S and U heuristics each improve performance individually, the improvement is moderate compared to that of the two combined.
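The two heuristics can be sketched as simple transformations of a conditional probability table column. This is a minimal illustration, not the paper's implementation: the CPT column is represented as a plain list of probabilities, and the threshold value in heuristic S is an assumed parameter chosen for illustration only.

```python
def init_uniform(cpt_column):
    """Heuristic U (sketch): replace the ICPT column of a parent of an
    evidence node with a uniform distribution over its states."""
    n = len(cpt_column)
    return [1.0 / n] * n

def adjust_small_probabilities(cpt_column, threshold=0.05):
    """Heuristic S (sketch): raise probabilities below `threshold` up to
    the threshold, then renormalize so the column sums to one.
    The threshold value is an assumption for illustration."""
    adjusted = [max(p, threshold) for p in cpt_column]
    total = sum(adjusted)
    return [p / total for p in adjusted]
```

For example, `init_uniform([0.9, 0.05, 0.05])` yields a uniform column, while `adjust_small_probabilities([0.99, 0.01])` lifts the extreme small probability so that the importance function does not nearly ignore that state during sampling.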