Academic Article

Measuring Human Perception to Improve Handwritten Document Transcription
Document Type
Periodical
Source
IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(10):6594-6601, Oct. 2022
Subject
Computing and Processing
Bioengineering
Loss measurement
Training
Tools
Task analysis
Annotations
Visualization
Computer vision
Document transcription
visual recognition
visual psychophysics
deep learning
digital humanities
Language
English
ISSN
0162-8828
2160-9292
1939-3539
Abstract
In this paper, we consider how to incorporate psychophysical measurements of human visual perception into the loss function of a deep neural network being trained for a recognition task, under the assumption that such information can reduce errors. As a case study to assess the viability of this approach, we look at the problem of handwritten document transcription. While good progress has been made toward automatically transcribing modern handwriting, significant challenges remain in transcribing historical documents. Here we describe a general enhancement strategy, underpinned by the new loss formulation, which can be applied to the training regime of any deep learning-based document transcription system. Through experimentation, reliable performance improvement is demonstrated on the standard IAM and RIMES datasets for three different network architectures. Further, we show the feasibility of our approach on a new dataset of digitized Latin manuscripts, originally produced by scribes in the Cloister of St. Gall in the 9th century.
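The record does not reproduce the paper's exact loss formulation, but the core idea it describes, folding psychophysical measurements of human perception into a recognition loss, can be sketched as a per-sample weighting scheme. In the minimal sketch below, `difficulty` is a hypothetical per-sample score in [0, 1] (e.g., derived from normalized human reaction times on each character image); samples humans find harder receive proportionally larger weight in an otherwise standard negative log-likelihood. The function name and the `1 + difficulty` weighting form are illustrative assumptions, not the authors' method.

```python
import math

def psychophysically_weighted_loss(probs, targets, difficulty):
    """Negative log-likelihood weighted by human-derived difficulty.

    probs:      list of per-class probability lists, one per sample
    targets:    list of true class indices
    difficulty: hypothetical per-sample scores in [0, 1], assumed to come
                from psychophysical measurements (e.g., reaction times)

    Harder samples get a larger weight, nudging training toward inputs
    that humans also struggle to read. The weighting form is an
    assumption for illustration only.
    """
    total = 0.0
    for p, t, d in zip(probs, targets, difficulty):
        nll = -math.log(p[t])        # standard per-sample loss
        total += (1.0 + d) * nll     # psychophysical weighting (assumed form)
    return total / len(targets)
```

With all difficulty scores at zero this reduces to the ordinary mean negative log-likelihood, which is the property a drop-in enhancement to an existing transcription training regime would want.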