The gradient descent algorithm is:

$$\begin{align*}& \text{repeat until convergence:} \; \lbrace \newline \; & \phantom{0000} b := b - \alpha \frac{\partial J(w,b)}{\partial b} \newline \; & \phantom{0000} w := w - \alpha \frac{\partial J(w,b)}{\partial w} \tag{1} \; & \newline & \rbrace\end{align*}$$

where the parameters $w, b$ are both updated simultaneously and where

$$\frac{\partial J(w,b)}{\partial b} = \frac{1}{m} \sum\limits_{i = 0}^{m-1} (f_{w,b}(x^{(i)}) - y^{(i)}) \tag{2}$$

$$\frac{\partial J(w,b)}{\partial w} = \frac{1}{m} \sum\limits_{i = 0}^{m-1} (f_{w,b}(x^{(i)}) - y^{(i)})x^{(i)} \tag{3}$$

* $m$ is the number of training examples in the dataset
* $f_{w,b}(x^{(i)})$ is the model's prediction, while $y^{(i)}$ is the target value

```python
import numpy as np

def compute_gradient(x, y, w, b):
    """
    Computes the gradient for linear regression.

    Args:
      x (ndarray): Shape (m,) Input to the model (Population of cities)
      y (ndarray): Shape (m,) Label (Actual profits for the cities)
      w, b (scalar): Parameters of the model

    Returns:
      dj_dw (scalar): The gradient of the cost w.r.t. the parameter w
      dj_db (scalar): The gradient of the cost w.r.t. the parameter b
    """
    # Number of training examples
    m = x.shape[0]

    ### START CODE HERE ###
    # Model prediction f_{w,b}(x^(i)) = w * x^(i) + b, vectorized over all examples
    predicted_y = w * x + b
    # Per-example error (f_{w,b}(x^(i)) - y^(i))
    error = predicted_y - y
    # Equations (3) and (2): average the (input-weighted) errors over all examples
    dj_dw = (1 / m) * np.sum(error * x)
    dj_db = (1 / m) * np.sum(error)
    ### END CODE HERE ###

    return dj_dw, dj_db
```
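As a quick sanity check, here is a minimal sketch calling `compute_gradient` on a hypothetical two-example dataset (the values below are invented for illustration, chosen so the parameters already fit the data exactly and both gradients come out zero):

```python
import numpy as np

# Hypothetical toy data: two training examples (values invented for illustration)
x_train = np.array([1.0, 2.0])
y_train = np.array([3.0, 5.0])

# With w = 2, b = 1, predictions are exactly [3, 5], so both gradients should be 0
dj_dw, dj_db = compute_gradient(x_train, y_train, w=2.0, b=1.0)
print(dj_dw, dj_db)  # expected: 0.0 0.0
```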
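Equation (1) itself is the repeated simultaneous update that consumes these gradients. A minimal sketch of that loop, assuming a fixed learning rate `alpha` and iteration count `num_iters` (both names are illustrative hyperparameters, not values specified in this section), might look like:

```python
def gradient_descent_sketch(x, y, w_init, b_init, alpha, num_iters):
    """Sketch of equation (1): repeat the simultaneous update of w and b.

    alpha (learning rate) and num_iters are assumed hyperparameters,
    not part of the original section.
    """
    w, b = w_init, b_init
    for _ in range(num_iters):
        # Compute both gradients first so w and b are updated simultaneously
        dj_dw, dj_db = compute_gradient(x, y, w, b)
        w = w - alpha * dj_dw
        b = b - alpha * dj_db
    return w, b
```

Computing both gradients before touching either parameter is what makes the update simultaneous; updating `w` first and then recomputing the gradient for `b` would implement a slightly different algorithm.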