Neural Information Processing - Letters and Reviews, Vol. 12, Nos. 1-3, January-March 2008, pp. 31-42

RNN with a Recurrent Output Layer for Learning of Naturalness


Ján Dolinský and Hideyuki Takagi

Kyushu University, 4-9-1 Shiobaru, Minami-ku, Fukuoka, 815-8540 JAPAN

E-mail: jan@plazma.sk, takagi@design.kyushu-u.ac.jp


Abstract

The behavior of recurrent neural networks with a recurrent output layer (ROL) is described mathematically, and it is shown that using an ROL is not only advantageous but in fact crucial to obtaining satisfactory performance for the proposed naturalness learning. Conventional wisdom holds that an ROL often substantially degrades a network's performance or renders it unstable, so ROLs are rarely used. The objective of this paper is to demonstrate that there are cases where an ROL is necessary. As a concrete example, naturalness in handwritten letters is modeled.
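To make the architecture concrete: in an RNN with a recurrent output layer, the previous output is fed back into the state update alongside the external input. The following is a minimal, illustrative sketch in the echo-state-network style (matching the ESN keyword below); all names, sizes, and scaling constants are assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res, n_out = 2, 50, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # reservoir (hidden) weights
W /= 1.25 * max(abs(np.linalg.eigvals(W)))     # scale spectral radius below 1
W_fb = rng.uniform(-0.5, 0.5, (n_res, n_out))  # output-feedback weights (the ROL)
W_out = rng.uniform(-0.5, 0.5, (n_out, n_res)) # readout (trained in practice)

x = np.zeros(n_res)
y = np.zeros(n_out)
for t in range(10):
    u = rng.uniform(-1, 1, n_in)               # dummy input sequence
    # The state update uses the previous output y: this feedback path
    # is what makes the output layer "recurrent".
    x = np.tanh(W_in @ u + W @ x + W_fb @ y)
    y = W_out @ x

print(y.shape)  # (1,)
```

Without the `W_fb @ y` term this reduces to a conventional reservoir network with a purely feed-forward readout; the paper's claim concerns cases where that feedback term is essential.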


Keywords – recurrent output layer, RNN, ESN, naturalness learning, handwritten letters