Neptune for Research
Free experiment tracker for academics working on foundation models
Join thousands of researchers, professors, students, and Kagglers using Neptune to make monitoring experiments, comparing runs, and sharing results far easier than with open-source tools.
Who can use Neptune for free?
You can be granted a free license if you are doing research on foundation models and you belong to one of these groups:
Research groups
Neptune’s responsive UI makes it possible to monitor your experiments and compare metrics at speed – even at scale. So you can make the most of your limited GPUs by reacting to failed runs and divergence in real time.
And since all your data lives in one place, accessing your experiment history, quickly reproducing results, and collaborating with your team is no longer a hassle.
Professors and students
Neptune doesn’t require backend setup or maintenance, so getting started is smooth even if you’re not a DevOps engineer. You’re good to go with only a few lines of code. It’s a great way to teach and learn best practices for tracking real-life projects.
Within a single dashboard, your team can easily monitor multiple experiments, and compare and share results. And organizing data across separate groups is simple – just create a project for each one.
Kagglers
ML competitions require accurate tracking of massive numbers of experiments and quick iteration. The visibility of performance metrics, advanced experiment comparison options, and lightning-fast UI you get with Neptune make it the Kaggle Grandmasters’ not-so-secret weapon.
Use Neptune for free — and give yourself a better chance of winning!
What’s included in your research plan?
Unlimited members
Unlimited tracked hours
1 TB of storage
Easier collaboration. Better organization. Faster comparison.
The problem with training models on remote clusters is that every time you want to see what is going on, you need to get your FTP client up, download the logs to a machine with a graphical interface, and plot them.
I tried using TensorBoard but it was painful to set up in my situation. With Neptune, seeing training progress was as simple as hitting refresh. The feedback loop between changing the code and seeing whether anything changed is just so much shorter.
Kaare Mikkelsen
Assistant Professor at Aarhus University

While logging experiments is great, what sets Neptune apart for us at the lab is the ease of sharing those logs. The ability to just send a Neptune link in Slack and let my coworkers see the results for themselves is awesome.
Previously, we used TensorBoard + locally saved CSVs and would have to send screenshots and CSV files back and forth, which would easily get lost. So I’d say Neptune’s ability to facilitate collaboration is the biggest plus.
Greg Rolwes
Computer Science Undergraduate at Saint Louis University
I used to keep track of my models with folders on my machine and use naming conventions to save the parameters and model architecture. Whenever I wanted to track something new about the model, I would have to update the naming structure. It was painful. There was a lot of manual work involved.
Now everything happens automatically. I can compare models in the online interface that looks great. It saves me a lot of time, and I can focus on my research instead of keeping track of everything manually.
Abdalrheem Ijjeh
Researcher at IFFM, Polish Academy of Sciences

I tested multiple loggers with pytorch-lightning integrations and found Neptune to be the best fit for my needs: friendly UI, ease of use, and great documentation.
Itsik Adiv
Machine Learning Scientist
5 ways you can support us
Cite Neptune in papers
@software{neptune,
  author = {{neptune.ai}},
  title = {neptune.ai: experiment tracker},
  url = {https://neptune.ai},
  year = {2024},
}
Use Neptune in your public projects and repos

Share in Kaggle kernels or winning write-ups

Sing our praises on social
(We’ll like, share, and ♥️ you for it!)
Star our GitHub repo
(But only if you like Neptune!)
Other resources you may find interesting
Our Blog
Learn from AI/ML researchers and engineers: best practices, tool reviews, and real-world examples.
Our video series
We challenged AI/ML researchers to summarize their ICML 2024 papers in less than 100 seconds.
Neptune vs TensorBoard
Check how Neptune compares feature by feature to TensorBoard, and why people switch to Neptune.
More questions?
Get in touch! We’d love to chat.