Exercise 3

Part of the course Computational Chemistry.

Geometry optimization is a core part of identifying particularly relevant points on the potential energy surface. Usually, this is a high-dimensional problem; for this exercise, we will consider only the one-dimensional case. Please do not use external libraries (such as scipy.optimize) to solve the tasks, since the focus here is a manual, low-level implementation that builds understanding of the problem.

Task 3.1: Step by step

One way to obtain the gradient of an arbitrary function \(f\) is the method of finite differences. It evaluates \(f\) at two points displaced from \(x_0\) by a small step \(h\):

\[\left.\frac{\partial f}{\partial x}\right|_{x=x_0} \approx \frac{f(x_0+h)-f(x_0-h)}{2h}\]
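For illustration, this formula translates almost directly into Python. A minimal sketch (the function name and the default value of \(h\) are illustrative choices, not prescribed by the exercise):

def gradient(f, x0, h=0.001):
    """Approximate df/dx at x0 with a central finite difference."""
    return (f(x0 + h) - f(x0 - h)) / (2 * h)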

Implement a function gradient_descent(h, step_scaling) which optimizes the function \(f(x)=x^2\) using gradient descent with gradients from finite differences. The starting point should be \(x=2\); reasonable settings are \(h=0.001\) and a step scaling of \(s=0.1\).

Optional: Extend the function to accept an arbitrary function \(f\) to optimize and a starting point \(x_\textrm{i}\), i.e. gradient_descent(f, x_i), where \(h\) and step_scaling are chosen automatically. A sketch of this generalized variant is given below.
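One possible sketch for the generalized variant (the default values, the convergence criterion, and the iteration limit are assumptions; the exercise only fixes the signature gradient_descent(f, x_i)):

def gradient_descent(f, x_i, h=0.001, step_scaling=0.1,
                     tol=1e-8, max_steps=10_000):
    """Minimize f starting from x_i using finite-difference gradients.

    The update x -> x - step_scaling * gradient is repeated until the
    gradient magnitude falls below tol (an assumed stopping criterion)
    or max_steps iterations have been performed.
    """
    x = x_i
    for _ in range(max_steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)  # central difference
        if abs(grad) < tol:
            break
        x -= step_scaling * grad
    return x


# Example: the fixed task, f(x) = x**2 started at x = 2.
x_min = gradient_descent(lambda x: x**2, 2.0)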

Task 3.2: Line rider

Use your code from Task 3.1 and add a check of whether the position found is an actual minimum. This is done by testing whether the second derivative at that point is positive. You can obtain the second derivative of a function with finite differences as follows:

\[\left.\frac{\partial^2 f}{\partial x^2}\right|_{x=x_0} \approx \frac{f(x_0+h)-2f(x_0)+f(x_0-h)}{h^2}\]
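A matching curvature check might look like this (function names and the strict "greater than zero" test are assumptions):

def second_derivative(f, x0, h=0.001):
    """Approximate d^2f/dx^2 at x0 with a central finite difference."""
    return (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h**2


def is_minimum(f, x0, h=0.001):
    """Return True if the curvature at x0 is positive, i.e. x0 looks like a minimum."""
    return second_derivative(f, x0, h) > 0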

Test your code on the function \(f\) from Task 3.1 and on \(g(x)=x^3\), each time starting from \(x_\textrm{i}=1\). What do you observe?
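For example, the check could be applied to the optimized positions like this (a usage sketch assuming the gradient_descent and is_minimum functions sketched above):

for func in (lambda x: x**2, lambda x: x**3):
    x_min = gradient_descent(func, 1.0)
    print(x_min, is_minimum(func, x_min))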

Task 3.3: If you know, you know

Minimize the function \(\sin(x)/x\) for different starting points. What do you observe? What consequences do you see for the geometry optimization of molecules?
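To explore this systematically, one could loop over a few starting points and record where the optimizer ends up (a sketch assuming the gradient_descent function from Task 3.1; the chosen starting points are arbitrary):

import math

def sinc(x):
    """sin(x)/x, with the removable singularity at x = 0 handled explicitly."""
    return 1.0 if x == 0 else math.sin(x) / x

for x_i in (1.0, 3.0, 5.0, 8.0, 12.0):
    x_min = gradient_descent(sinc, x_i)
    print(f"start {x_i:5.1f} -> ends near {x_min:8.4f}, f = {sinc(x_min):.4f}")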