Neural Networks Series

Last updated: October 21, 2024
Written by: baeldung

1. Introduction to Neural Networks

Introduction to Convolutional Neural Networks
How Do Self-Organizing Maps Work?
Neural Networks: Difference Between Conv and FC Layers
What's a Non-trainable Parameter?
Bias in Neural Networks
Random Initialization of Weights in a Neural Network
Epoch in Neural Networks
What Is and Why Use Temperature in Softmax?
Neural Networks: What Is Weight Decay Loss?
Differences Between Backpropagation and Feedforward Networks
Differences Between Bidirectional and Unidirectional LSTM
Hidden Layers in a Neural Network
How to Use K-Fold Cross-Validation in a Neural Network?

2. Statistics

How to Calculate the VC-Dimension of a Classifier?
Differences Between a Parametric and Non-parametric Model
Machine Learning: Flexible and Inflexible Models
Linearly Separable Data in Neural Networks
Differences Between Hinge Loss and Logistic Loss

3. Deep Learning

What Does Backbone Mean in Neural Networks?
What Does Pre-training a Neural Network Mean?
Latent Space in Deep Learning
The Reparameterization Trick in Variational Autoencoders
Neural Networks: Pooling Layers
Differences Between Luong Attention and Bahdanau Attention
Graph Attention Networks
Introduction to Spiking Neural Networks
An Introduction to Generative Adversarial Networks
Using GANs for Data Augmentation
Encoder-Decoder Models for Natural Language Processing