Alex's adventures with neural networks
Table of Contents
Posts
- 2024-06-06: Speed reading with h5py
- 2024-05-16: Universal approximation theorem
- 2024-05-02: Hyperparameter optimisation with Optuna
- 2024-03-28: Orthonormal certificates for epistemic uncertainty
- 2024-03-14: Another look at SQR
- 2024-02-22: Early stopping rule with a ValidationManager
- 2024-02-08: Gated multimodal units with synthetic data
- 2024-01-25: Simultaneous quantile regression (SQR) with the pinball loss
- 2024-01-11: Neural Bayes estimators using a permutation invariant architecture
- 2024-01-09: Fine-tuning for MNIST classification
- 2024-01-07: CNN with dropout for MNIST classification
- 2024-01-04: CNN using PyTorch for MNIST classification
- 2024-01-03: Basic model using PyTorch for MNIST classification
References
Software
- PyTorch Documentation
- Papa (2021) PyTorch Pocket Reference
- Akiba et al (2019) the reference for Optuna.
Repositories
- Tensor puzzles to help learn broadcasting in PyTorch.
Theory
- CS231n: Deep Learning for Computer Vision
- Hinton et al (2012) the reference for dropout regularisation.
- Sainsbury-Dale et al (2023) for a review of neural Bayes estimators.
- Prechelt (1998) the reference for three types of early-stopping.
- Tagasovska and Lopez-Paz (2019) the reference for simultaneous quantile regression.
- Romano et al (2019) the reference for conformalized quantile regression.
- Arevalo et al (2017) the reference for gated multimodal units.
- Hendrycks and Gimpel (2016) the reference for Gaussian error linear units.