Conversation

@arunpandianj

Summary

  • Added NAdam optimizer (Nesterov-accelerated Adam) to Qiskit Machine Learning.
  • Implemented the minimize method, snapshot saving/loading, and callback support.

Details

  • Added nadam.py to optimizers.
  • Updated __init__.py to include NAdam.
  • Added unit tests in test/optimizers/test_nadam.py.
  • Tests verify: basic optimization, noisy optimization, callback, settings, snapshot save/load.

Checks

  • Unit tests pass
  • Follows Qiskit code style
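
For context, the update rule behind NAdam (Nesterov-accelerated Adam, Dozat 2016) can be sketched in plain NumPy. This is a standalone illustration of the standard algorithm, not the PR's code; the function name, state layout, and hyperparameter defaults are illustrative.

```python
import numpy as np

def nadam_step(theta, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One NAdam update: Adam with a Nesterov look-ahead applied to the
    bias-corrected first moment. `t` is the 1-based iteration counter."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (EMA of gradients)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment (EMA of squared gradients)
    m_hat = m / (1 - beta1**t)               # bias-corrected first moment
    v_hat = v / (1 - beta2**t)               # bias-corrected second moment
    # Nesterov look-ahead: blend the corrected moment with the current gradient
    m_nesterov = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1**t)
    theta = theta - lr * m_nesterov / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

Iterating this step on a convex test function (e.g. a quadratic) drives the parameter toward the minimum, which is roughly what the unit tests' "basic optimization" check would exercise.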

@CLAassistant

CLAassistant commented Oct 11, 2025

CLA assistant check
All committers have signed the CLA.


@arunpandianj
Author

This is a Python implementation of the NAdam optimizer for Qiskit, combining Adam’s adaptive moment estimation with Nesterov momentum for faster convergence. It computes gradients numerically, supports optional gradient noise, and applies bias-corrected moment updates. The optimizer also provides iteration snapshots, callback functions, and convergence checking against a tolerance threshold. It currently ignores analytical gradients, but it can save and reload its internal state (_m, _v, _t) from snapshots, making it suitable for long-running optimization tasks.
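
The numerical gradients and snapshot persistence described above can be sketched as follows. This is a minimal, self-contained illustration under stated assumptions: the central-difference scheme, the function names, and the JSON snapshot schema are all hypothetical, not the PR's actual implementation.

```python
import json
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference gradient of scalar function `f` at point `x`.
    One illustrative way an optimizer can differentiate numerically."""
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

def save_snapshot(path, m, v, t):
    """Persist the optimizer's internal state (first moment, second moment,
    iteration counter) so a long-running job can resume later.
    Field names here are illustrative, not the PR's actual schema."""
    state = {"m": np.asarray(m).tolist(), "v": np.asarray(v).tolist(), "t": int(t)}
    with open(path, "w") as fh:
        json.dump(state, fh)

def load_snapshot(path):
    """Restore the state written by save_snapshot."""
    with open(path) as fh:
        snap = json.load(fh)
    return np.array(snap["m"]), np.array(snap["v"]), snap["t"]
```

Restoring (_m, _v, _t) and continuing the moment updates from the saved iteration counter keeps the bias-correction terms consistent across a restart, which is why all three values belong in the snapshot.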

@edoaltamura edoaltamura added the on hold 🛑 Can not fix yet label Oct 27, 2025