Abstracts
Deep learning is a field of research that currently attracts much attention, mainly because deep architectures yield outstanding results on many vision, speech, and natural language processing tasks. To make deep learning effective, an unsupervised pretraining phase is very often applied. In this article, we present an experimental study evaluating the usefulness of this approach: testing on several benchmarks and with different percentages of labeled data, we examine how Contrastive Divergence (CD), one of the most popular pretraining methods, influences network generalization.
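For context, Contrastive Divergence pretraining typically trains each layer of a deep network as a Restricted Boltzmann Machine before supervised fine-tuning. Below is a minimal illustrative sketch of a single CD-1 update for a binary RBM in NumPy; all names and hyperparameters (cd1_update, lr, the layer sizes) are assumptions for illustration only and do not reproduce the paper's actual experimental setup.

```python
# Illustrative sketch only: one CD-1 update for a binary RBM.
# Names and hyperparameters are hypothetical, not from the article.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_v, b_h, v0, lr=0.01):
    """One CD-1 step on a batch of visible vectors v0 (shape: batch x n_visible)."""
    # Positive phase: hidden probabilities given the data, then a hidden sample.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step back to visible, then to hidden again.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    p_h1 = sigmoid(p_v1 @ W + b_h)
    # CD-1 gradient approximation: data statistics minus reconstruction statistics.
    batch = v0.shape[0]
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / batch
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_v, b_h

# Example usage on random binary data (hypothetical sizes):
W = 0.01 * rng.standard_normal((784, 256))
b_v, b_h = np.zeros(784), np.zeros(256)
v_batch = (rng.random((32, 784)) < 0.5).astype(float)
W, b_v, b_h = cd1_update(W, b_v, b_h, v_batch)
```

In a greedy layer-wise scheme, the hidden activations of one trained RBM would serve as the visible data for the next layer, after which the stacked weights initialize a feed-forward network for supervised fine-tuning.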
Dates
published: 2015
online: 06-07-2016
Identifiers
YADDA identifier
bwmeta1.element.ojs-issn-2083-8476-year-2015-volume-24-article-6333