1. What is the difference between overfitting and underfitting?
Overfitting: the model learns the training data too well, including its noise, so it performs poorly on new data (high variance, low bias). Underfitting: the model is too simple to capture the underlying pattern, so it performs poorly even on the training data (high bias, low variance). The bias-variance tradeoff: adding model complexity reduces bias but increases variance; the goal is the complexity that minimizes total error on unseen data. Fixes for overfitting: regularization (L1/L2), dropout, early stopping, and more training data; cross-validation helps detect it. Fixes for underfitting: a more expressive model, more or better features, or less regularization.
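A minimal sketch of both failure modes, using polynomial regression on synthetic quadratic data (the degrees, sample sizes, and noise level here are illustrative choices, not from the original answer): degree 1 underfits (high error on both splits), degree 2 matches the true function, and degree 15 overfits (low train error, larger gap to test error).

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy quadratic data: y = x^2 + Gaussian noise
x = rng.uniform(-1, 1, 60)
y = x**2 + rng.normal(0, 0.1, 60)
x_train, y_train = x[:40], y[:40]
x_test, y_test = x[40:], y[40:]

def train_test_mse(degree):
    # Fit a polynomial of the given degree on the training split only
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 2, 15):
    tr, te = train_test_mse(degree)
    print(f"degree={degree:2d}  train MSE={tr:.4f}  test MSE={te:.4f}")
```

Because the degree-15 model is a superset of the degree-2 model, least squares guarantees its training error is at least as low, yet its test error tends to be worse: that widening train/test gap is the practical signature of overfitting.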