Let f(x, y) = x^2 + y^2 and g(x, y) = x^3 + y^3. For every point (x, y), let
d(x, y) be the dimension of the linear span of the gradient vectors of f, and
of g, at (x, y). Compute d(x, y) for all (x, y).
The problem I'm having here is that the gradient vectors of f and g both live in R^2, but f is a second-degree polynomial and g is third-degree. How do I set this problem up?
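Not a full solution, but one way to set the problem up: the degrees of f and g don't matter directly; what matters is that each gradient is a vector in R^2, so d(x, y) is just the rank of the 2x2 matrix whose rows are the two gradients. A hedged sketch of that setup with sympy (the sample points are illustrative, not exhaustive):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2
g = x**3 + y**3

# Each gradient is a row vector of partial derivatives.
grad_f = sp.Matrix([[sp.diff(f, x), sp.diff(f, y)]])   # (2x, 2y)
grad_g = sp.Matrix([[sp.diff(g, x), sp.diff(g, y)]])   # (3x^2, 3y^2)

# d(x, y) = rank of the 2x2 matrix stacking the two gradients.
M = sp.Matrix.vstack(grad_f, grad_g)

# The determinant shows where the two gradients are linearly independent:
# 2x*3y^2 - 2y*3x^2 = 6xy(y - x), so d = 2 exactly when x, y, and y - x
# are all nonzero.
det = sp.factor(M.det())
print(det)

# Rank (= d) at sample points: the origin, a point on y = x, a generic point.
ranks = {pt: M.subs({x: pt[0], y: pt[1]}).rank() for pt in [(0, 0), (1, 1), (1, 2)]}
print(ranks)  # d = 0 at the origin, 1 on y = x away from 0, 2 generically
```

From there the remaining work is classifying the degenerate cases (the axes and the line y = x) by checking when one or both gradients vanish.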
Do your own homework!
AnandTech Moderator