SO Basics of Neural Networks 2025 school at the IAA-CSIC
For the widespread adoption of these techniques, researchers should be able to design and use their own DL models. Image classification is one of the main applications of DL in astrophysics and offers a convenient way to learn about neural networks. The de facto standard for this problem is the Convolutional Neural Network (CNN), a specific deep learning architecture. This course is an introduction to Deep Learning, organized in four sessions with the following objectives: understanding the basics of neural networks, getting to know the fundamental libraries and architectures, learning how to train a CNN, gaining confidence in using CNNs for image classification tasks, and learning how to evaluate their performance.
The tutor of this school is Dr Francisco Eduardo Sanchez Karhunen (Universidad de Sevilla).
Summary of Contents
- Session 1: Deep Learning fundamentals
- Session 2: Convolutional Neural Networks fundamentals
- Session 3: Practical considerations in real-world CNNs
- Session 4: Evaluation
Roots of deep learning techniques. Reasons for layer stacking. Role of weights and activation functions. Layer as a map between representation spaces. Model parameters. Basic structure for classification tasks. Network training as an optimization problem. Weight initialization techniques. Typical loss functions and optimizers. Learning-rate scheduling.
Hands-on lab: Build from scratch a basic multilayer neural network using the TensorFlow 2 library. Sequential mode of layer stacking. Model training to tackle a classic image classification problem.
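As a taste of Session 1, the sketch below shows the core idea of a single dense layer as a map between representation spaces: an affine transformation (weights and bias) followed by an activation function. This is a minimal NumPy illustration, not the TensorFlow 2 code used in the lab; all names and sizes are made up for the example.

```python
import numpy as np

# Illustrative sketch: one dense layer mapping a 4-D input
# representation to a 3-D one via weights W, bias b, and ReLU.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # weight matrix: 4 inputs -> 3 outputs
b = np.zeros(3)               # bias vector

def relu(z):
    # activation function: introduces the non-linearity
    return np.maximum(0.0, z)

def dense_layer(x):
    # affine map followed by the activation
    return relu(W @ x + b)

x = rng.normal(size=4)        # a point in the input representation space
y = dense_layer(x)
print(y.shape)                # (3,): the new representation space
```

Stacking several such layers, each with its own weights, is what gives a deep network its expressive power; training then adjusts all the weights at once by minimizing a loss function.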
Drawbacks of classical multilayer networks in image classification tasks. Human brain image handling. Convolutional layers: padding and stride. Types of pooling layer. Kernels for feature map extraction. Kernel stacking. CNNs as an extension of classical stacked layer models. Top layers in CNNs.
Hands-on lab: Build from scratch a basic CNN for image classification using the Galaxy10 dataset.
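To make the convolution/padding/stride vocabulary of Session 2 concrete, here is a toy NumPy implementation of a "valid" 2-D convolution (no padding, stride 1) of one kernel over a grayscale image. It is only a sketch of what a convolutional layer computes internally; the kernel and image values are arbitrary examples.

```python
import numpy as np

# Illustrative sketch: "valid" 2-D convolution, stride 1, no padding.
def conv2d_valid(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1   # output height shrinks by kh - 1
    ow = image.shape[1] - kw + 1   # output width shrinks by kw - 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # sliding-window dot product between kernel and image patch
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
edge_kernel = np.array([[1.0, -1.0]])   # crude horizontal-difference kernel
fmap = conv2d_valid(image, edge_kernel)
print(fmap.shape)   # (5, 4): no padding, so the feature map is smaller
```

Padding restores the lost border so the feature map keeps the input size, while a stride larger than 1 skips positions and downsamples the output; a real convolutional layer applies many such kernels in parallel, producing a stack of feature maps.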
Overfitting in CNNs. Techniques for overfitting reduction: data augmentation and drop-out. Types of data augmentation. Drop-out rates. Transfer learning: concept and usage. Top pre-trained models for image classification. Handling large image datasets: ImageDataGenerators.
Hands-on lab: Use of ImageDataGenerators combined with realistic folder structures in image classification problems. Inclusion of data augmentation in our preprocessing pipelines. Addition of drop-out layers to the CNN designed in session 2.
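The drop-out idea from Session 3 can be sketched in a few lines of NumPy: at training time each activation is zeroed with probability equal to the drop-out rate, and the survivors are rescaled so the expected activation is unchanged (so-called "inverted" dropout). This is an assumption-laden toy version, not the Keras layer used in the lab.

```python
import numpy as np

# Illustrative sketch of inverted dropout at training time.
# With rate p, each activation is dropped with probability p and the
# survivors are scaled by 1/(1-p) to preserve the expected value.
def dropout(activations, rate, rng):
    keep = rng.random(activations.shape) >= rate   # random keep mask
    return activations * keep / (1.0 - rate)

rng = np.random.default_rng(42)
a = np.ones(10_000)                 # toy layer of constant activations
dropped = dropout(a, rate=0.5, rng=rng)
print(dropped.mean())               # close to 1.0: expectation preserved
```

Because a different random subset of units is silenced at every training step, the network cannot rely on any single co-adapted pathway, which is what reduces overfitting; at inference time dropout is switched off.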
There will be coffee breaks available for all participants during the sessions.
The challenge of imbalanced datasets and the misleading nature of the accuracy metric. Advanced evaluation metrics for classification: confusion matrix, precision, recall, and F1-score. Techniques for handling class imbalance: augmented upsampling, class weights and focal loss.
Hands-on lab: Applying class weights and augmented upsampling to the Galaxy10 classification problem. Evaluating the model's performance using precision, recall, and a confusion matrix to analyze per-class results.
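The evaluation metrics of Session 4 are easy to compute by hand on a toy imbalanced problem, which also shows why accuracy is misleading. The sketch below builds a confusion matrix, derives per-class precision and recall from it, and computes "balanced" class weights (total samples divided by classes times class count, a common convention). All labels and counts are invented for illustration.

```python
import numpy as np

# Toy imbalanced problem: 90% class 0, 10% class 1.
y_true = np.array([0] * 90 + [1] * 10)
# A model biased toward the majority class: misses half of class 1.
y_pred = np.array([0] * 95 + [1] * 5)

n_classes = 2
cm = np.zeros((n_classes, n_classes), dtype=int)
for t, p in zip(y_true, y_pred):
    cm[t, p] += 1                       # rows: true class, cols: predicted

precision = np.diag(cm) / cm.sum(axis=0)   # correct / predicted, per class
recall = np.diag(cm) / cm.sum(axis=1)      # correct / actual, per class
accuracy = np.diag(cm).sum() / cm.sum()

# Balanced class weights: n_samples / (n_classes * class_count)
counts = np.bincount(y_true, minlength=n_classes)
class_weights = len(y_true) / (n_classes * counts)

print(cm)             # [[90  0] [ 5  5]]
print(accuracy)       # 0.95 -- looks great...
print(recall)         # ...but recall for class 1 is only 0.5
print(class_weights)  # minority class gets a much larger weight
```

Here 95% accuracy hides the fact that half of the minority class is misclassified; weighting the loss by `class_weights` (or upsampling the minority class with augmentation) pushes the model to pay attention to the rare class.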
We recommend using Miniforge (conda-forge) or conda with mamba for fast environment management.
Miniforge is a conda distribution that includes mamba by default and uses conda-forge as the primary channel:
# Download installer (Linux/macOS)
wget "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
# Install (follow the interactive prompts)
bash Miniforge3-$(uname)-$(uname -m).sh
rm Miniforge3-$(uname)-$(uname -m).sh
# Restart your terminal, or initialize conda manually with the two commands below.
# (If you didn't install to the default location, adjust the path accordingly)
$HOME/miniforge3/bin/conda init
source ~/.bashrc # or source ~/.zshrc for zsh users
# Verify installation
mamba --version
If you already have conda installed, you can add mamba:
conda install conda-forge::mamba
- In a terminal, go to your working directory and clone this repository:
git clone https://github.com/iaa-so-training/basic-neural-network-2025.git
cd basic-neural-network-2025
- Install the dependencies for the tutorials (replace mamba with conda if you don't have mamba installed):
mamba env create -f environment.yml
- Execute the tutorials
You need to activate the conda environment and initialize a Jupyter Lab session:
conda activate iaa_nn
jupyter lab
Once everything is installed, you only need to repeat the previous step (activate the environment and launch Jupyter Lab) to run the tutorials.
You can also launch the tutorials without installation on the free myBinder service. Note that this is a free service with limited resources: it is useful for executing and modifying the tutorials live, but computationally expensive steps may not be possible.
- Rainer Schödel (Chair), IAA-CSIC, Spain
- Laura Darriba, IAA-CSIC, Spain
- Javier Moldón, IAA-CSIC, Spain
This event is supported by the "Center of Excellence Severo Ochoa" award to the Instituto de Astrofísica de Andalucía. We acknowledge financial support from the Severo Ochoa grant CEX2021-001131-S funded by MCIN/AEI/10.13039/501100011033, awarded to the Instituto de Astrofísica de Andalucía.