amu2002

GradientDescent

Nov 20th, 2023 (edited)
import matplotlib.pyplot as plt

x = 2                        # Initial value (starting point)
lr = 0.01                    # Learning rate
precision = 0.000001         # Stop when the change in x is smaller than this
previous_step_size = 1       # Initialize to a value larger than precision
max_iter = 10000             # Maximum number of iterations to prevent an infinite loop
iters = 0                    # Iteration counter
gf = lambda x: 2 * (x + 3)   # Gradient of the function f(x) = (x + 3)**2

gd = []  # List to store the value of x at each iteration

while previous_step_size > precision and iters < max_iter:
    prev = x
    x = x - lr * gf(prev)               # Gradient descent update: x <- x - lr * f'(x)
    previous_step_size = abs(x - prev)  # Size of the last step
    iters += 1
    print('Iteration', iters, 'Value:', x)
    gd.append(x)                        # Record the current value of x

print('Local minimum:', x)

plt.plot(gd)
plt.show()
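As a follow-up, here is a minimal sketch that wraps the same update rule in a reusable function and checks the result against the analytic minimum of f(x) = (x + 3)**2 at x = -3. The gradient_descent name, its keyword defaults, and the axis labels are illustrative choices, not part of the original paste.

import matplotlib.pyplot as plt

def gradient_descent(grad, start, lr=0.01, precision=1e-6, max_iter=10_000):
    """Return the trajectory of x values produced by plain gradient descent."""
    x = start
    history = [x]
    for _ in range(max_iter):
        step = lr * grad(x)          # step = learning rate times gradient
        x -= step
        history.append(x)
        if abs(step) < precision:    # stop once updates become negligible
            break
    return history

history = gradient_descent(lambda x: 2 * (x + 3), start=2)
print('Local minimum approx.:', history[-1])   # should be close to -3

plt.plot(history)
plt.xlabel('Iteration')
plt.ylabel('x')
plt.show()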