ACIDS
Creative Machine Learning

Creative Machine Learning course and notebooks in JAX, PyTorch and Numpy.

Course given both at the University of Tokyo (Japan) and Sorbonne Université (Paris, France)
Professor: Philippe Esling

This repository contains courses in machine learning applied to music and other creative mediums. The course is currently given at the University of Tokyo (Japan), and along the ATIAM Masters at IRCAM, Sorbonne Université (Paris, France). The course slides, along with a set of interactive Jupyter notebooks, will be updated throughout the year to cover the full ML program. This course is proudly provided by the ACIDS group, part of the Analysis / Synthesis team at IRCAM. The course can be followed entirely online through the Google Slides and Colab notebook links openly provided with each lesson. However, we recommend forking the entire environment and following the interactive notebooks through JupyterLab to develop your own coding environment.

As the development of this course is always ongoing, please pull this repo regularly to stay up to date. Also, please do not hesitate to post issues or PRs if you spot any mistakes (see the contribution section).

Discord channel

Join the Discord channel of this course to become part of the community and to share experiences and problems.

Important update on grading

Starting from the Spring 2025 (Tokyo) session, all assignments and grading will be handled through GitHub Classroom. Since this is a new system, some features might still be in development, but it should provide auto-grading and immediate feedback to enhance your learning experience. Please report any potential improvements on the Discord.

Refer to the evaluation section at the end of the lessons details for more information.

Table of Contents
  1. Lessons
  2. Administrative
  3. Detailed lessons
  4. Contribution
  5. About

Lessons

Quick explanation. For each of the following lessons, you will find a set of badges containing links to different parts of the course, which allow you to follow either the online or offline version.

  • Online: Slides Colab
  • Offline: Powerpoint Notebook

Simply click on the corresponding badge to follow the lesson. Note that if a badge is displayed in red (as follows: Slides), the content is not available yet and will be uploaded later. Also note that some courses contain multiple notebooks and extra information, which will be clearly indicated in that case.


00 - Introduction

Slides Powerpoint Colab Notebook

This course provides a brief history of the development of artificial intelligence and introduces the general concepts of machine learning through a series of recent applications in the creative fields. It also presents the prerequisites, course specificities, toolboxes and tutorials that will be covered, and how to set up the overall environment.

GitHub assignment


01 - Machine learning

Slides Powerpoint Colab Notebook

This course introduces the formal notions required to understand machine learning, along with the classic problems of linear models for regression and classification. We discuss the mathematical derivation of their optimization, and the issues of overfitting, cross-validation, model properties and complexity that remain quintessential in modern machine learning.

GitHub assignment
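To make this concrete, here is a minimal NumPy sketch of linear regression solved in closed form through least squares (toy data of our own making, not the course's reference notebooks):

```python
import numpy as np

# Toy data from y = 2x + 1 with a little noise (illustrative values only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.01 * rng.normal(size=100)

# Append a bias column and solve the least-squares problem min_w ||Aw - y||^2.
A = np.hstack([X, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w)  # close to [2.0, 1.0]
```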


02 - Neural networks

Slides Powerpoint Colab Notebook

This course provides a brief history of the development of neural networks along with all mathematical and implementation details. We discuss geometric perspectives on neurons and gradient descent, and how these interpretations naturally extend to the case of multi-layer perceptrons. Finally, we discuss a complete implementation of backpropagation through micrograd.

GitHub assignment
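As a complement, the core gradient-descent loop can be sketched in a few lines of NumPy for a single linear neuron (a minimal illustration, not the micrograd-based implementation covered in the notebook):

```python
import numpy as np

# Gradient descent on a single linear neuron y_hat = w * x with MSE loss.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])  # generated with the true weight w = 2

w, lr = 0.0, 0.1
for _ in range(100):
    y_hat = w * x
    grad = 2.0 * np.mean((y_hat - y) * x)  # d/dw of the mean squared error
    w -= lr * grad
print(w)  # converges close to 2.0
```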


03 - Advanced networks

Slides Powerpoint Colab Notebook

In this course we introduce more advanced types of neural networks such as convolutional and recurrent architectures, along with more advanced models (LSTM, GRU) and recent developments such as residual architectures. We further discuss issues of regularization and initialization in networks.
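To illustrate the core building block behind convolutional architectures, here is a "valid" 1-D convolution (cross-correlation, as implemented in deep learning libraries) computed by hand; the signal and kernel are arbitrary toy values:

```python
import numpy as np

# Slide the kernel over the signal and take dot products at each position.
signal = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
kernel = np.array([1.0, 0.0, -1.0])  # a simple edge-detecting filter

out = np.array([
    np.dot(signal[i:i + len(kernel)], kernel)
    for i in range(len(signal) - len(kernel) + 1)
])
print(out)  # [-2. -2. -2.]: a constant slope is detected everywhere
```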


04 - Deep learning

Slides Powerpoint Colab Notebook

We introduce here the fundamental shift towards deep learning, notably through the development of layerwise training and auto-encoders. We discuss how these are now less relevant through novel regularization methods and data availability. We finish this course by discussing the recent attention mechanism and transformer architectures and provide a set of modern applications.
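The attention mechanism mentioned above reduces to a few matrix operations; the following NumPy sketch of scaled dot-product attention uses arbitrary random matrices, not trained weights:

```python
import numpy as np

# Scaled dot-product attention: out = softmax(Q K^T / sqrt(d)) V.
def softmax(a):
    e = np.exp(a - a.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(6)
Q = rng.normal(size=(3, 4))  # 3 queries of dimension d = 4
K = rng.normal(size=(5, 4))  # 5 keys
V = rng.normal(size=(5, 4))  # 5 values

weights = softmax(Q @ K.T / np.sqrt(4))  # each row sums to 1
out = weights @ V                        # each output is a mixture of values
print(out.shape)  # (3, 4)
```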


05 - Bayesian inference

Slides Powerpoint Colab Notebook

To operate the shift towards generative models, we introduce here the fundamentals of probability, distributions and inference. We discuss several of their properties and introduce Bayesian inference by developing the mathematical foundations of Maximum A Posteriori (MAP) and Maximum Likelihood (ML) techniques.
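For the Gaussian-mean case, both estimators have closed forms; the sketch below contrasts them on synthetic data (a toy setup of our own making, assuming known unit variance and a zero-mean Gaussian prior):

```python
import numpy as np

# ML vs MAP estimation of a Gaussian mean with known variance sigma^2 = 1
# and a N(0, tau^2) prior on the mean.
rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=1.0, size=20)

mu_ml = data.mean()  # maximum likelihood estimate: the sample mean

# The MAP estimate has a closed form that shrinks mu_ml toward the prior mean 0.
tau2, sigma2, n = 1.0, 1.0, len(data)
mu_map = (n * tau2) / (n * tau2 + sigma2) * mu_ml
print(mu_ml, mu_map)  # mu_map lies between 0 and mu_ml
```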


06 - Latent models

Slides Powerpoint Colab Notebook

We discuss the distinction between supervised and unsupervised learning through the first context of clustering. This allows us to introduce the notion of latent variables and show how to solve for them using Expectation-Maximization (EM). We provide the full derivation through variational inference and discuss the implementation of Gaussian Mixture Models (GMM).
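A hand-rolled sketch of EM for a 1-D two-component GMM is shown below (unit variances are kept fixed for brevity; this is an illustration for intuition, not the notebook's implementation):

```python
import numpy as np

# Synthetic data from two well-separated Gaussians.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])

mu = np.array([-1.0, 1.0])  # initial component means
pi = np.array([0.5, 0.5])   # initial mixing weights
for _ in range(30):
    # E-step: responsibilities r[n, k] proportional to pi_k * N(x_n | mu_k, 1)
    dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate the means and weights from the responsibilities
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    pi = r.mean(axis=0)
print(mu)  # close to [-3, 3]
```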


07 - Approximate inference

Slides Powerpoint Colab Notebook

We introduce here the notion of approximate inference, through historical applications of sampling with Monte-Carlo and Metropolis-Hastings. We further show how variational inference can provide an elegant solution to our estimation problems and discuss its implications.
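The Metropolis-Hastings loop itself is only a few lines; here is a toy sampler targeting a standard Gaussian with a random-walk proposal (all parameters are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    return -0.5 * x ** 2  # unnormalized log-density of N(0, 1)

x, chain = 0.0, []
for _ in range(20000):
    prop = x + rng.normal(scale=1.0)  # symmetric random-walk proposal
    # Accept with probability min(1, p(prop) / p(x)), computed in log-space.
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    chain.append(x)

samples = np.array(chain[2000:])  # discard burn-in
print(samples.mean(), samples.std())  # close to 0 and 1
```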


08 - VAEs and flows

Slides Powerpoint Colab Notebook

We show how fusing auto-encoders and variational inference leads to the Variational Auto-Encoder (VAE), a powerful generative model. We discuss issues of disentanglement and posterior collapse, introduce the recent Normalizing Flows, and show how these can be used in generative contexts.

Additional notebook on Normalizing Flows

Colab Notebook
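Two VAE ingredients can be shown in isolation: the reparameterization trick and the closed-form KL term of the ELBO against a standard Gaussian prior. The values below are arbitrary toy numbers, and this is a sketch of the math rather than a working VAE:

```python
import numpy as np

# Reparameterization: z = mu + sigma * eps, with eps ~ N(0, I), keeps the
# sampling step differentiable with respect to mu and log_var.
rng = np.random.default_rng(4)

mu = np.array([0.5, -0.2])
log_var = np.array([0.1, -0.3])

eps = rng.normal(size=2)
z = mu + np.exp(0.5 * log_var) * eps

# Closed-form KL divergence to the standard Gaussian prior, summed over dims.
kl = -0.5 * np.sum(1.0 + log_var - mu ** 2 - np.exp(log_var))
print(z, kl)  # kl is always non-negative
```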


09 - Adversarial networks

Slides Powerpoint Colab Notebook

We introduce the notion of estimating by comparing and derive the complete adversarial objective naturally from this observation. After discussing adversarial attacks, we introduce Generative Adversarial Networks (GANs), which are still competitive generative models to this day. We discuss flaws and limitations of GANs and introduce their modern applications.
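To make the adversarial objective explicit, the sketch below evaluates the discriminator objective and the non-saturating generator objective on hand-picked discriminator outputs (toy numbers, not trained networks):

```python
import numpy as np

d_real = np.array([0.9, 0.8])  # D's probability that real data is real
d_fake = np.array([0.3, 0.1])  # D's probability that fakes are real

# Discriminator maximizes E[log D(x)] + E[log(1 - D(G(z)))].
d_objective = np.mean(np.log(d_real)) + np.mean(np.log(1.0 - d_fake))

# Generator (non-saturating form) maximizes E[log D(G(z))] instead of
# minimizing E[log(1 - D(G(z)))], which gives stronger early gradients.
g_objective = np.mean(np.log(d_fake))
print(d_objective, g_objective)
```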


10 - Diffusion models

Slides Powerpoint Colab Notebook

This course explores the new class of generative models based on diffusion probabilistic models. This class of models is inspired by thermodynamics, but also by denoising score matching, Langevin dynamics and autoregressive decoding. We will also discuss the more recent development of denoising diffusion implicit models, and the WaveGrad model, which is based on the same core principles but applies this class of models to audio data.
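The forward (noising) process underlying these models has a closed form, q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, (1 - alpha_bar_t) I); the sketch below implements it with the standard DDPM linear beta schedule on a toy sample (hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

T = 1000
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)  # cumulative signal retention

x0 = np.ones(4)                      # toy "clean" sample
t = T - 1
eps = rng.normal(size=4)
xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
print(alpha_bar[t])  # near 0: x_T is almost pure noise
```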


11 - Guest lecture (University of Tokyo)

This year we are very happy to host two special guest lectures, aimed at providing an advanced view of both the academic and industrial research currently underway in creative machine learning.

Setup

Along the tutorials, we provide reference code for each section. This code contains helper functions that relieve you of the burden of data import and other sideline implementations. You will find designated spaces in each file to develop your solutions. The code is in Python (notebooks pending) and relies on the concept of code sections, which allow you to evaluate only part of the code (to avoid running long import tasks multiple times and to concentrate on the question at hand).
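Code sections are typically delimited with `# %%` markers, which editors such as VS Code or Spyder treat as independently runnable cells. A hypothetical exercise file might look like the following (the marker style and placeholder comments are illustrative, not the repository's exact template):

```python
# %% Imports and data loading (run this section once, skip it afterwards)
import numpy as np

data = np.arange(10)

# %% Question 1: develop your solution in the designated space
######################
# YOUR CODE GOES HERE
######################
result = data.sum()
```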

Please refer to the setup notebook to check that your configuration is correct.

Dependencies

Python installation

In order to get the baseline scripts and notebooks to work, you need a working distribution of Python 3.7 at a minimum (we also recommend updating your version to Python 3.9). We will also be using a large set of libraries, the following being the most prominent:

We highly recommend that you install Pip or Anaconda that will manage the automatic installation of those Python libraries (along with their dependencies). If you are using Pip, you can use the following commands

pip install -r requirements.txt

If you prefer to install all the libraries by hand to check their version, you can use individual commands

pip install numpy
pip install scikit-learn
pip install torch
pip install jax
pip install librosa
pip install matplotlib

For those of you who have never coded in Python, here are a few interesting resources to get started.

Jupyter notebooks and lab

In order to ease following the exercises along with the course, we will be relying on Jupyter notebooks. If you have never used a notebook before, we recommend that you look at their website to understand the concept. Here we also provide the instructions to install JupyterLab, which is a more integrated version of notebooks. You can install it on your computer as follows (if you use pip):

pip install jupyterlab

Then, once installed, you can go to the folder where you cloned this repository, and type in

jupyter lab

Evaluation

The notebooks will be evaluated through rolling deadlines spaced three sessions apart. Hence, the notebook for Course 01 (Machine learning) must be submitted before the beginning of Course 04 (Deep learning), that of Course 02 before Course 05, and so forth.

List of assignments

  • 00 - Setup - GitHub assignment
  • 01 - Machine learning - GitHub assignment
  • 02 - Neural networks - GitHub assignment
  • 03 - Advanced networks - GitHub assignment (released: 25.05.01)
  • 04 - Deep learning - GitHub assignment (released: 25.05.15)
  • 05 - Bayesian inference - GitHub assignment (released: 25.05.22)
  • 06 - Latent models - GitHub assignment (released: 25.05.29)
  • 07 - Variational inference - GitHub assignment (released: 25.06.05)
  • 08 - Normalizing flows - GitHub assignment (released: 25.06.19)
  • 09 - Adversarial networks - GitHub assignment (released: 25.06.09)
  • 10 - Diffusion models - GitHub assignment (released: 25.07.03)

Submission instructions:

For each course, a separate assignment is generated through the provided GitHub Classroom link. Once you click on "Accept assignment", your individual submission repository will be automatically created (in the form https://github.com/ACIDS-Paris/cml-XX-[assignment_name]-[your_name]). This link will be available on the page. You should then clone your repo locally as follows:

git clone https://github.com/ACIDS-Paris/cml-XX-[assignment_name]-[your_name]

You can then open the notebook locally or via Google Colab (by linking GitHub directly). Please follow the instructions and correctly fill the empty notebook cells with the correct function names and other indications as required. You can then save and push your code directly as

git commit -m "Finished Assignment 1"
git push

Auto-grading

Auto-grading will be triggered automatically on push, which allows you to see how well your answer performs on automated tests in GitHub immediately.

⚠️ Note that the auto-grading does not represent your final grade for the course! Your submissions will also be manually reviewed to ensure the quality of your code and its content. Hence, you can still gain points with answers that do not pass the tests... but you can also lose points if the solution is inadequate or poorly coded.

Assignments will be updated on a weekly basis.

Solutions

Once the deadline has passed, you will lose write access to your repository and the last pushed version will be manually evaluated. This will also allow you to obtain a private link to the complete solution of the corresponding notebook. Please do not share these solutions with anyone.

Administrative (University of Tokyo / Spring 2025)

These administrative details concern only the current (spring) physical session attached to the University of Tokyo (Spring 2025 session).

Course details

| Type | Information |
| --- | --- |
| Type | 2-credit graduate school course |
| Period | April - July 2025 |
| Span | 13 classes of 105 minutes |
| Date | Thursdays, 2:55 - 4:40 pm (JST) |
| Onsite | Seminar Room C, Engineer. 6 Building |
| Online | https://u-tokyo-ac-jp.zoom.us/j/81363691008?pwd=cmxiRURMdm9udXBKbTNjQkZvblNFQT09 |

Full calendar

| Date | Course |
| --- | --- |
| April 10 | 00 - Introduction |
| April 17 | 01 - Machine learning |
| April 24 | 02 - Neural networks |
| May 1 | 03 - Advanced networks |
| May 8 | No course |
| May 15 | 04 - Deep learning |
| May 22 | 05 - Bayesian inference |
| May 29 | 06 - Latent models |
| June 5 | GX - Guest Lecture - Nao Tokui |
| June 12 | No course |
| June 19 | 07 - Approximate inference |
| June 26 | 08 - VAEs and flows |
| July 3 | 09 - Adversarial networks |
| July 10 | 10 - Diffusion models |

Administrative (ATIAM - IRCAM / Autumn 2024)

These administrative details concern only the current (autumn) physical session attached to the ATIAM Masters (IRCAM).

Course details

| Type | Information |
| --- | --- |
| Type | Masters (3-ECTS) course |
| Period | September - December 2024 |
| Span | 7 classes of 200 minutes (sessions of 55 minutes) |
| Date | Variable slots (check calendar) |
| Onsite | Room Stravinsky, IRCAM, -2 |
| Online | N/A |

Full calendar

| Date | Course |
| --- | --- |
| November 10 | 00 - Introduction |
| November 10 | 01 - Machine learning |
| November 13 | 02 - Neural networks |
| November 13 | 03 - Advanced networks |
| November 20 | 04 - Deep learning |
| November 30 | 05 - Bayesian inference |
| November 30 | 06 - Latent models |
| December 08 | 07 - Approximate inference |
| December 08 | 08 - VAEs and flows |
| December 11 | 09 - Adversarial networks |
| December 18 | 10 - Diffusion models |

Past sessions

The course has been given for the past 3 years on two different sides of the world:

  • [2022 - Autumn] : IRCAM, Paris
  • [2023 - Spring] : The University of Tokyo, Japan
  • [2023 - Autumn] : IRCAM, Paris
  • [2024 - Spring] : The University of Tokyo, Japan
  • [2024 - Autumn] : IRCAM, Paris
  • [2025 - Spring] : The University of Tokyo, Japan

Contribution

Please take a look at our contributing guidelines if you're interested in helping!

About

Code and documentation copyright 2012-2042 by all members of ACIDS.

Code released under the CC-BY-NC-SA 4.0 licence.
