01:198:461 Machine Learning Principles (TA)

Recitation, Rutgers University, 2025

Machine Learning Principles (01:198:461, Sec. 06) surveys core ML foundations: linear/logistic regression, regularization, decision trees/forests, probabilistic models, kernels/SVMs, and an introduction to deep learning (CNNs, RNNs/LSTMs, Transformers).


Welcome! Slides and materials for the recitations will appear here.

Recitation 01 — Distributions & Hypothesis Testing (Sep 15, 2025)

What we covered
  • Distributions
    • Continuous (Uniform, Gaussian, Student’s t, Laplace)
    • Discrete (Bernoulli, Binomial)
  • Hypothesis Testing
    • P-values
    • χ² tests
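
A χ² test like the ones covered above can be run in a few lines with SciPy; the counts below are toy numbers for illustration, not recitation data:

```python
# Chi-squared goodness-of-fit test: are these die-roll counts consistent
# with a fair die? (Toy numbers, not recitation data.)
from scipy import stats

observed = [16, 18, 16, 14, 12, 24]      # counts for faces 1-6
expected = [sum(observed) / 6] * 6       # fair-die expectation

chi2, p_value = stats.chisquare(observed, f_exp=expected)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# If p falls below the chosen significance level (e.g. 0.05), reject the
# null hypothesis that the die is fair; here p > 0.05, so we do not.
```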

Recitation 02 — Decision Trees & Data Generation (Sep 22, 2025)

What we covered
  • Recursive tree building
    • Using a Node class to represent each node in the tree
    • Storing necessary information such as column names, threshold, left/right children, parent, and class labels
    • Using a recursive function to split the data on the best feature at each node
  • Synthetic data generation
    • Traversing the tree from a random leaf node up to the root
    • Estimating feature values along the way
    • Handling edge cases, such as when a leaf node has too few samples or there are missing values
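
The Node bookkeeping and recursive splitting described above might be sketched as follows; this is a minimal Gini-based version, and all names (`Node`, `build_tree`) are illustrative rather than the recitation's exact code:

```python
# Minimal sketch of the recursive tree building described above; the
# Gini criterion and all names are illustrative, not the recitation's code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    column: Optional[str] = None       # feature this node splits on
    threshold: Optional[float] = None  # split threshold (go left if <=)
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    parent: Optional["Node"] = None
    label: Optional[int] = None        # majority class at this node

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def build_tree(rows, labels, columns, min_samples=2, parent=None):
    """Recursively split on the (column, threshold) pair that minimizes
    the weighted Gini impurity of the two children."""
    node = Node(parent=parent, label=max(set(labels), key=labels.count))
    if len(set(labels)) == 1 or len(rows) < min_samples:
        return node                              # pure or too small: leaf
    best = None
    for j, col in enumerate(columns):
        for t in sorted({r[j] for r in rows}):
            left = [i for i, r in enumerate(rows) if r[j] <= t]
            right = [i for i, r in enumerate(rows) if r[j] > t]
            if not left or not right:
                continue
            score = (len(left) * gini([labels[i] for i in left]) +
                     len(right) * gini([labels[i] for i in right])) / len(rows)
            if best is None or score < best[0]:
                best = (score, col, t, left, right)
    if best is None or best[0] >= gini(labels):
        return node                              # no split helps: leaf
    _, node.column, node.threshold, left, right = best
    node.left = build_tree([rows[i] for i in left], [labels[i] for i in left],
                           columns, min_samples, parent=node)
    node.right = build_tree([rows[i] for i in right], [labels[i] for i in right],
                            columns, min_samples, parent=node)
    return node
```

With this structure, the synthetic-data walk described above amounts to starting at a sampled leaf and following `parent` links to the root, drawing each feature value consistently with the thresholds passed along the way.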

Recitation 03 — K-means & Quiz-1 Review (Sep 29, 2025)

What we covered
  • K-Means clustering
    • Step-by-step demonstration
    • Code for practice and understanding
  • Quiz-1 review
    • Explanations for each question
    • Decision tree demonstration
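
A step-by-step K-Means iteration like the one demonstrated can be traced on toy data (illustrative numbers, not the recitation notebook):

```python
# One K-Means iteration traced step by step on toy data (illustrative
# numbers, not the recitation notebook).
import numpy as np

X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
centroids = np.array([[0.0, 0.5], [5.5, 5.0]])

# Assignment step: squared distance from every point to every centroid,
# then each point joins its nearest centroid.
d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
assign = d2.argmin(axis=1)
print(assign)  # → [0 0 1 1]

# Update step: each centroid moves to the mean of its assigned points.
centroids = np.array([X[assign == j].mean(axis=0) for j in range(2)])
# The centroids come out unchanged, so this toy run has already converged.
```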

Recitation 04 — K-means & GMM (Oct 6, 2025)

What we covered
  • Implementing K-Means
    • Random and K-Means++ initialization
    • Assignment step and centroid-update step
  • Implementing a Gaussian Mixture Model (GMM)
    • Random initialization
    • E-step and M-step of the Expectation-Maximization (EM) algorithm
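
Of the implementations above, the GMM side can be sketched compactly. Below is a minimal EM loop with random initialization and spherical covariances; the names are illustrative, this is not the recitation notebook, and K-Means++ initialization is not shown:

```python
# Minimal EM sketch for a Gaussian Mixture Model with spherical
# covariances: random initialization, then alternating E- and M-steps.
import numpy as np

def gmm_em(X, k, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random initialization: means drawn from the data, equal mixture
    # weights, unit variances.
    mu = X[rng.choice(n, size=k, replace=False)]
    var = np.ones(k)
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        logp = np.log(pi) - 0.5 * (d2 / var + d * np.log(2 * np.pi * var))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from r.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        var = (r * d2).sum(axis=0) / (nk * d) + 1e-8
    return pi, mu, var
```

The structure mirrors K-Means: the E-step is a soft version of the assignment step, and the M-step generalizes the centroid update by weighting each point with its responsibility.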

Office Hours

  • Time: Not yet scheduled

  • Location: Not yet scheduled

  • Contact: daize.dong@rutgers.edu