

Article title

Effectiveness of Unsupervised Training in Deep Learning Neural Networks

Languages of publication

PL

Abstracts

PL
Deep learning is a field of research that currently attracts much attention, mainly because deep architectures help in obtaining outstanding results on many tasks related to vision, speech, and natural language processing. To make deep learning effective, an unsupervised pretraining phase is very often applied. In this article, we present an experimental study evaluating the usefulness of this approach: testing on several benchmarks and with different percentages of labeled data, we examine how Contrastive Divergence (CD), one of the most popular pretraining methods, influences network generalization.
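The abstract refers to Contrastive Divergence pretraining of deep networks. A minimal sketch of one CD-1 update for a Restricted Boltzmann Machine is shown below; the layer sizes, learning rate, and toy data are illustrative assumptions, not the setup used in the article.

```python
# One Contrastive Divergence (CD-1) update for a Restricted Boltzmann
# Machine. Hyperparameters and data here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 6, 3, 0.1
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

# Toy batch of 4 binary visible vectors.
v0 = rng.integers(0, 2, size=(4, n_visible)).astype(float)

# Positive phase: hidden activation probabilities given the data.
p_h0 = sigmoid(v0 @ W + b_h)
h0 = (rng.random(p_h0.shape) < p_h0).astype(float)  # sample hidden units

# Negative phase: one Gibbs step (reconstruction), the "1" in CD-1.
p_v1 = sigmoid(h0 @ W.T + b_v)
p_h1 = sigmoid(p_v1 @ W + b_h)

# Gradient estimate: data statistics minus reconstruction statistics.
W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / v0.shape[0]
b_v += lr * (v0 - p_v1).mean(axis=0)
b_h += lr * (p_h0 - p_h1).mean(axis=0)
```

In layer-wise pretraining, such updates are run to convergence for one RBM, whose hidden activations then serve as input data for training the next RBM in the stack.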

Year

2015
Volume

24

Dates

published
2015
online
06-07-2016

Identifiers

YADDA identifier

bwmeta1.element.ojs-issn-2083-8476-year-2015-volume-24-article-6333