Neptune vs Comet



Neptune
- Standalone component: an ML metadata store that focuses on experiment tracking and model registry
- Delivered as a managed cloud service; check here for infrastructure requirements for on-prem deployment
- No special setup requirements other than having the neptune-client installed and internet access if using managed hosting
- Minimal code changes: just a few lines of code needed for tracking. Read more
- Python client: yes, through the neptune-client library

Comet
- Stand-alone tool with community, self-serve, and managed deployment options
- Delivered as a managed cloud service; can be deployed on any cloud environment or on-premise
- Minimal code changes: a few lines of code needed for tracking
- Python client: yes, through the comet_ml library
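To make the "few lines of code" point concrete on the Comet side, a minimal tracking sketch with comet_ml might look like the following (the API key, workspace, and project name are placeholders); the equivalent Neptune snippet is shown further down this page.

from comet_ml import Experiment

# Placeholders below: substitute your own API key, workspace, and project name
experiment = Experiment(api_key="YOUR_API_KEY",
                        project_name="my-project",
                        workspace="my-workspace")
experiment.log_parameters({"lr": 0.1, "dropout": 0.4})  # hyperparameters
experiment.log_metric("test_accuracy", 0.84)            # a single metric value
experiment.end()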
This page was updated on 13 August 2021. Some information may be outdated.
Report outdated information here.
What are the key advantages of Neptune, then?
- Responsive UI that stays fast with thousands of runs
- Integrations with the major visualization libraries, so you can log interactive visualizations to Neptune (see the sketch after this list)
- Usage-based pricing that scales with model-building needs, not team size
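As an example of logging an interactive visualization, a Plotly figure can be uploaded to a run as HTML. This is a minimal sketch with a placeholder project name; the same File.as_html pattern also applies to Bokeh and Altair figures.

import plotly.express as px
import neptune.new as neptune
from neptune.new.types import File

run = neptune.init(project="my-workspace/my-project")  # placeholder project name

# Build an interactive chart with Plotly
fig = px.scatter(x=[0, 1, 2, 3], y=[0.2, 0.5, 0.4, 0.9])

# Upload it as an interactive HTML chart attached to the run
run["charts/interactive_scatter"].upload(File.as_html(fig))

run.stop()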

See these features in action
1. Create a free account
Sign up
2. Install Neptune client library
pip install neptune-client
3. Add logging to your script
import neptune.new as neptune

# Connect to your project and create a run (replace 'Me/MyProject' with your own)
run = neptune.init('Me/MyProject')
# Log hyperparameters and metrics by assigning them to fields on the run
run['params'] = {'lr': 0.1, 'dropout': 0.4}
run['test_accuracy'] = 0.84
4. Or see how it works in a notebook (no registration)
Try live notebook

Thousands of ML people have already chosen their tool

“(…) thanks for the great tool, has been really useful for keeping track of the experiments for my Master’s thesis. Way better than the other tools I’ve tried (comet / wandb).
I guess the main reason I prefer neptune is the interface, it is the cleanest and most intuitive in my opinion, the table in the center view just makes a great deal of sense. I like that it’s possible to set up and save the different view configurations as well. Also, the comparison is not as clunky as for instance with wandb. Another plus is the integration with ignite, as that’s what I’m using as the high-level framework for model training.”
“Previously used tensorboard and azureml but Neptune is hugely better. In particular, getting started is really easy; documentation is excellent, and the layout of charts and parameters is much clearer.”
“While logging experiments is great, what sets Neptune apart for us at the lab is the ease of sharing those logs. The ability to just send a Neptune link in slack and letting my coworkers see the results for themselves is awesome. Previously, we used Tensorboard + locally saved CSVs and would have to send screenshots and CSV files back and forth which would easily get lost. So I’d say Neptune’s ability to facilitate collaboration is the biggest plus.”
“For me the most important thing about Neptune is its flexibility. Even if I’m training with Keras or Tensorflow on my local laptop, and my colleagues are using fast.ai on a virtual machine, we can share our results in a common environment.”