📈
Gradient Descent
★★★☆☆High School
📖Definition
Gradient descent is an optimization algorithm that iteratively steps in the direction opposite the gradient (the direction of steepest descent) to find a local minimum of a function.
📐Formulas
xₙ₊₁ = xₙ − α∇f(xₙ)
Gradient descent update rule
α
Learning rate (step size)
✏️Examples
Example 1
Apply gradient descent to f(x) = x² with x₀ = 4, α = 0.5.
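Worked solution: since f(x) = x², the gradient is f′(x) = 2x, so the first update is x₁ = 4 − 0.5 · 2 · 4 = 0, which is already the minimum. A minimal Python sketch of this iteration (the function name `gradient_descent` and the `steps` parameter are illustrative, not from the original):

```python
# Sketch of the update rule xₙ₊₁ = xₙ − α f'(xₙ),
# applied to f(x) = x², whose derivative is f'(x) = 2x.

def gradient_descent(grad, x0, alpha, steps):
    """Iterate x ← x − α·grad(x) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] - alpha * grad(xs[-1]))
    return xs

trajectory = gradient_descent(lambda x: 2 * x, x0=4.0, alpha=0.5, steps=3)
print(trajectory)  # [4.0, 0.0, 0.0, 0.0]
```

With α = 0.5 the step exactly cancels the curvature of x², so the iterate lands on the minimum in a single update; a smaller α would converge geometrically instead.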
⚡Applications
Deep Learning
Neural network training
Machine Learning
Model parameter optimization
Data Science
Regression analysis
🔗Related Documents
→Prerequisites
←Next Topics
↔Related
#gradient descent#gradient#optimization