Derivatives Comparison Table
In multivariable calculus and functional analysis, various types of derivatives are used depending on the function's domain, codomain and application. This article summarizes and explains key derivative concepts such as the ordinary derivative, total derivative, gradient, Jacobian, Hessian, Gateaux derivative, and Fréchet derivative.
Derivative (Single-variable)
- Target Function: $f: \mathbb{R} \to \mathbb{R}$
- Formula: $f'(x) = \lim_{h \to 0} \dfrac{f(x+h) - f(x)}{h}$
- Meaning: Measures the instantaneous rate of change (slope) of a function with respect to a single variable.
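As a quick check of the limit definition, the sketch below (plain Python, with an arbitrary example function $f(x) = x^3$) approximates $f'(x)$ by a central difference:

```python
def f(x):
    return x**3  # example function; analytically f'(x) = 3x^2

def derivative(f, x, h=1e-6):
    # central-difference approximation of the limit definition
    return (f(x + h) - f(x - h)) / (2 * h)

print(derivative(f, 2.0))  # ~12.0, matching f'(2) = 3 * 2**2
```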
Total Derivative
- Target Function: $f: \mathbb{R}^n \to \mathbb{R}$
- Formula: $df = \sum_{i=1}^{n} \dfrac{\partial f}{\partial x_i}\, dx_i = \nabla f \cdot d\mathbf{x}$
- Meaning: Gives the best linear approximation of the change in a multivariable function. It corresponds to the dot product of the gradient and a small displacement.
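A minimal NumPy sketch of this idea, using a hypothetical function $f(x, y) = x^2 y + \sin y$: the exact change over a small displacement is nearly the dot product of the gradient with that displacement.

```python
import numpy as np

def f(v):
    x, y = v
    return x**2 * y + np.sin(y)

def grad_f(v):
    x, y = v
    return np.array([2 * x * y, x**2 + np.cos(y)])

x0 = np.array([1.0, 2.0])
dx = np.array([1e-3, -2e-3])        # small displacement

exact_change  = f(x0 + dx) - f(x0)
linear_change = grad_f(x0) @ dx     # total derivative: df = grad(f) . dx

print(exact_change, linear_change)  # nearly identical for small dx
```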
Gradient
- Target Function: $f: \mathbb{R}^n \to \mathbb{R}$
- Formula: $\nabla f = \left( \dfrac{\partial f}{\partial x_1}, \dots, \dfrac{\partial f}{\partial x_n} \right)$
- Meaning: Vector of first-order partial derivatives that points in the direction of steepest ascent.
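The sketch below (NumPy, with a made-up function whose maximum sits at $(1, -2)$) estimates the gradient by central differences; the resulting vector points from the origin toward that maximum, i.e. in the direction of steepest ascent.

```python
import numpy as np

def f(v):
    x, y = v
    return -(x - 1)**2 - (y + 2)**2  # peak at (1, -2)

def numerical_gradient(f, v, h=1e-6):
    g = np.zeros_like(v)
    for i in range(len(v)):
        e = np.zeros_like(v)
        e[i] = h
        g[i] = (f(v + e) - f(v - e)) / (2 * h)  # partial derivative w.r.t. v[i]
    return g

print(numerical_gradient(f, np.array([0.0, 0.0])))  # ~[2, -4], parallel to (1, -2)
```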
Jacobian
- Target Function: $\mathbf{f}: \mathbb{R}^n \to \mathbb{R}^m$
- Formula: $J \in \mathbb{R}^{m \times n}$ with $J_{ij} = \dfrac{\partial f_i}{\partial x_j}$
- Meaning: Generalization of the derivative to vector-valued functions. It collects all first-order partial derivatives into a matrix.
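As an illustration (a NumPy sketch, using the polar-to-Cartesian map as an example), each column of the Jacobian is the partial derivative of the output vector with respect to one input variable:

```python
import numpy as np

def f(v):
    r, theta = v  # polar -> Cartesian: (r, theta) -> (x, y)
    return np.array([r * np.cos(theta), r * np.sin(theta)])

def numerical_jacobian(f, v, h=1e-6):
    m, n = len(f(v)), len(v)
    J = np.zeros((m, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (f(v + e) - f(v - e)) / (2 * h)  # column j = d f / d v[j]
    return J

print(numerical_jacobian(f, np.array([2.0, np.pi / 4])))
# ~[[cos(t), -r sin(t)], [sin(t), r cos(t)]]
```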
Hessian
- Target Function: $f: \mathbb{R}^n \to \mathbb{R}$
- Formula: $H \in \mathbb{R}^{n \times n}$ with $H_{ij} = \dfrac{\partial^2 f}{\partial x_i \partial x_j}$
- Meaning: Matrix of second-order partial derivatives. It encodes the curvature of the function and is used in optimization.
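A small NumPy sketch, assuming the quadratic example $f(x, y) = x^2 + 3xy + 2y^2$: the second-difference estimate recovers the symmetric curvature matrix.

```python
import numpy as np

def f(v):
    x, y = v
    return x**2 + 3 * x * y + 2 * y**2

def numerical_hessian(f, v, h=1e-4):
    n = len(v)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # mixed second-order central difference
            H[i, j] = (f(v + ei + ej) - f(v + ei - ej)
                       - f(v - ei + ej) + f(v - ei - ej)) / (4 * h**2)
    return H

print(numerical_hessian(f, np.array([1.0, 1.0])))  # ~[[2, 3], [3, 4]], symmetric
```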
Gateaux Derivative
- Target Function: $F: X \to Y$, where $X$ is a normed space
- Formula: $dF(x; h) = \lim_{t \to 0} \dfrac{F(x + t h) - F(x)}{t}$
- Meaning: Directional derivative in the direction of a given vector $h$. It generalizes the ordinary derivative to infinite-dimensional spaces.
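A minimal sketch of the limit definition in NumPy, assuming the simple functional $F(v) = \sum_i v_i^2$ as a stand-in: the Gateaux derivative at $x$ in direction $h$ matches $\nabla F(x) \cdot h$.

```python
import numpy as np

def F(v):
    return np.sum(v**2)  # simple stand-in functional F: R^n -> R

def gateaux(F, x, h, t=1e-6):
    # difference quotient along the fixed direction h
    return (F(x + t * h) - F(x)) / t

x = np.array([1.0, 2.0, 3.0])
h = np.array([0.0, 1.0, 0.0])
print(gateaux(F, x, h))  # ~4.0, equal to grad F(x) . h = 2 * x[1]
```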
Fréchet Derivative
- Target Function: $F: X \to Y$, where $X$ and $Y$ are Banach spaces
- Formula: $\lim_{\|h\| \to 0} \dfrac{\|F(x + h) - F(x) - A h\|}{\|h\|} = 0$, where the bounded linear operator $A = DF(x)$ is the Fréchet derivative
- Meaning: Provides the best linear approximation, valid uniformly in all directions. It is a stronger notion than the Gateaux derivative.
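To illustrate the definition numerically (a sketch with a hypothetical map $F(x, y) = (xy,\ x + y^2)$ and its Jacobian as the candidate linear operator $A$), the remainder ratio shrinks to zero as $\|h\| \to 0$:

```python
import numpy as np

def F(v):
    x, y = v
    return np.array([x * y, x + y**2])

def A(v):
    x, y = v
    return np.array([[y, x], [1.0, 2 * y]])  # candidate Fréchet derivative (Jacobian)

x0 = np.array([1.0, 2.0])
for eps in [1e-1, 1e-2, 1e-3, 1e-4]:
    h = eps * np.array([1.0, -1.0])
    ratio = np.linalg.norm(F(x0 + h) - F(x0) - A(x0) @ h) / np.linalg.norm(h)
    print(eps, ratio)  # ratio -> 0 as ||h|| -> 0
```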
Summary Table
Concept | Target Function | Meaning | Mathematical Expression |
---|---|---|---|
Derivative | $f: \mathbb{R} \to \mathbb{R}$ | Measures the rate of change or slope of a single-variable function | $f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}$ |
Total Derivative | $f: \mathbb{R}^n \to \mathbb{R}$ | Linear approximation of multivariable functions; expressed via the dot product with the gradient | $df = \nabla f \cdot d\mathbf{x}$ |
Gradient | $f: \mathbb{R}^n \to \mathbb{R}$ | Coefficient vector of the total derivative; indicates the direction of the steepest ascent | $\nabla f = \left( \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right)$ |
Jacobian | $\mathbf{f}: \mathbb{R}^n \to \mathbb{R}^m$ | Organizes all first-order partial derivatives into a matrix | $J_{ij} = \frac{\partial f_i}{\partial x_j}$ |
Hessian | $f: \mathbb{R}^n \to \mathbb{R}$ | Organizes all second-order partial derivatives into a symmetric matrix; encodes curvature | $H_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j}$ |
Gateaux Derivative | $F: X \to Y$ (normed spaces) | Derivative in a specified direction | $dF(x; h) = \lim_{t \to 0} \frac{F(x + t h) - F(x)}{t}$ |
Fréchet Derivative | $F: X \to Y$ (Banach spaces) | Best linear approximation in all directions | $\lim_{\|h\| \to 0} \frac{\|F(x+h) - F(x) - A h\|}{\|h\|} = 0$ |
Applications
In Optimization
- Gradient: Used in gradient descent algorithms
- Hessian: Used in Newton's method and second-order optimization
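The sketch below contrasts the two on a made-up quadratic objective: plain gradient descent takes many small steps, while a Newton step using the Hessian lands on the minimizer at once (exactly so only because the example is quadratic).

```python
import numpy as np

# Hypothetical objective: f(x, y) = (x - 3)^2 + 10 * (y + 1)^2, minimized at (3, -1)
def grad(v):
    x, y = v
    return np.array([2 * (x - 3), 20 * (y + 1)])

def hess(v):
    return np.array([[2.0, 0.0], [0.0, 20.0]])

v_gd = np.array([0.0, 0.0])
v_newton = np.array([0.0, 0.0])
for _ in range(25):
    v_gd = v_gd - 0.05 * grad(v_gd)                                        # gradient descent step
    v_newton = v_newton - np.linalg.solve(hess(v_newton), grad(v_newton))  # Newton step

print(v_gd)      # approaches (3, -1)
print(v_newton)  # already at (3, -1) after the first iteration
```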
In Machine Learning
- Jacobian: Used in backpropagation for neural networks
- Gradient: Used in gradient-based optimization
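As a toy illustration of how backpropagation chains Jacobians (a NumPy sketch with made-up weights `W1`, `w2` for a two-stage composition, not any particular framework's API):

```python
import numpy as np

W1 = np.array([[1.0, 2.0], [0.5, -1.0], [3.0, 0.0]])  # h(x) = tanh(W1 x): R^2 -> R^3
w2 = np.array([1.0, -2.0, 0.5])                        # g(u) = w2 . u:    R^3 -> R

def forward(x):
    return w2 @ np.tanh(W1 @ x)

def grad_by_chain_rule(x):
    u = W1 @ x
    J_h = np.diag(1 - np.tanh(u)**2) @ W1  # Jacobian of h at x, shape (3, 2)
    J_g = w2                               # Jacobian of g, treated as shape (1, 3)
    return J_g @ J_h                       # chain rule: d(g o h)/dx, shape (2,)

print(grad_by_chain_rule(np.array([0.3, -0.7])))
```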
In Functional Analysis
- Gateaux Derivative: Used in calculus of variations
- Fréchet Derivative: Used in optimization over function spaces
Key Relationships
- Gradient and Total Derivative: The gradient is the coefficient vector in the total derivative expression
- Jacobian Generalization: When $m = 1$, the Jacobian reduces to the gradient (as a row vector)
- Hessian as Jacobian: The Hessian is the Jacobian of the gradient
- Fréchet ⊆ Gateaux: If the Fréchet derivative exists, then the Gateaux derivative exists and they are equal
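The "Hessian as Jacobian of the gradient" relationship above can be checked numerically; the sketch below (NumPy, with the example $f(x, y) = x^3 + x y^2$) differentiates an analytic gradient column by column.

```python
import numpy as np

def grad_f(v):
    x, y = v
    # gradient of f(x, y) = x^3 + x * y^2
    return np.array([3 * x**2 + y**2, 2 * x * y])

def numerical_jacobian(g, v, h=1e-6):
    m, n = len(g(v)), len(v)
    J = np.zeros((m, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (g(v + e) - g(v - e)) / (2 * h)
    return J

# Jacobian of the gradient = Hessian of f
print(numerical_jacobian(grad_f, np.array([1.0, 2.0])))  # ~[[6, 4], [4, 2]]
```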
Understanding these concepts requires familiarity with:
- Multivariable calculus
- Linear algebra
- Basic functional analysis (for Gateaux and Fréchet derivatives)
In most practical applications in engineering and machine learning, you'll primarily work with gradients, Jacobians, and Hessians. The Gateaux and Fréchet derivatives are more relevant in advanced mathematical analysis and optimization theory.