
Inserting background knowledge in perceptrons through modification of the learning algorithm

By: Bode, J.; Shouju Ren; Xiping Zhang; Xun Liang

1995 / IEEE / 0-7803-2768-3

Description

This item was taken from the IEEE Periodical 'Inserting background knowledge in perceptrons through modification of the learning algorithm'.

Usually, the knowledge to be learned by a neural network is represented implicitly in its training samples. The ability to insert knowledge beyond these implicit representations ("background knowledge") raises the hope that the learning and operating behavior of neural networks can be improved. In this paper, we develop a method for inserting expert knowledge into the error function during training. We modify the backpropagation learning algorithm so that the network is trained not only to minimize the output error but also to account for further information provided by the expert users who train a multilayer perceptron with one hidden layer. The approach is tested on an artificial example from design cost estimation using very small training sets of 10 samples. The results show significant improvement over approaches that do not consider background knowledge.
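The core idea, training a one-hidden-layer perceptron on a combined error function (output error plus a penalty encoding expert knowledge), can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's method: the concrete piece of background knowledge (that the output is non-decreasing in one input), the penalty form, the data, and all sizes and names are hypothetical, and numerical gradients stand in for the analytic backpropagation updates the paper derives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy cost-estimation data: 10 samples, 2 features, echoing the paper's
# very small training sets. Target is an increasing function of feature 0.
X = rng.uniform(0, 1, size=(10, 2))
y = (2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2)[:, None]

# One-hidden-layer perceptron (hypothetical sizes).
n_hidden = 8
W1 = rng.normal(0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def forward(X, W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)       # hidden layer
    return h @ W2 + b2             # linear output

# Assumed background knowledge: the expert asserts the output is
# non-decreasing in feature 0. Negative finite-difference slopes along
# that feature are penalized.
def knowledge_penalty(eps=1e-2):
    y0 = forward(X, W1, b1, W2, b2)
    Xp = X.copy(); Xp[:, 0] += eps
    y1 = forward(Xp, W1, b1, W2, b2)
    slope = (y1 - y0) / eps
    return np.sum(np.minimum(slope, 0.0) ** 2)

# Combined error: output error plus weighted knowledge term.
def total_error(lam=0.1):
    pred = forward(X, W1, b1, W2, b2)
    return np.mean((pred - y) ** 2) + lam * knowledge_penalty()

# Central-difference gradient of the combined error w.r.t. one array.
def num_grad(p, h=1e-5):
    g = np.zeros_like(p)
    it = np.nditer(p, flags=['multi_index'])
    for _ in it:
        i = it.multi_index
        old = p[i]
        p[i] = old + h; fp = total_error()
        p[i] = old - h; fm = total_error()
        p[i] = old
        g[i] = (fp - fm) / (2 * h)
    return g

params = [W1, b1, W2, b2]
lr = 0.1
e_start = total_error()
for _ in range(200):
    grads = [num_grad(p) for p in params]
    for p, g in zip(params, grads):
        p -= lr * g
e_end = total_error()
```

Because the penalty is added directly to the error being minimized, gradient descent trades output accuracy against consistency with the expert's assertion; the weight `lam` (an assumed hyperparameter) controls that trade-off.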