How to Log and Analyze Model Training Metadata
What will you learn?
How you can log and inspect all the metadata generated during your model training sessions in Neptune, including:
- How to use Neptune integrations and what you get by default (sketch below)
- Analyzing the logged metrics
- What else can I log to Neptune using the Python client? (sketch below)
- You can log metadata in a hierarchical structure using namespaces (sketch below)
- You can also log images and files to Neptune (sketch below)
- Exploring all the logged metadata in the Neptune UI
- Visualizing the model parameters, summary, and checkpoints in the Neptune UI
- You can optionally associate the source code with the Neptune run (sketch below)
- Neptune automatically tracks your Git info
- How can I share a particular view inside my run with a team member?
- Can I log tabular data (e.g., pandas DataFrames)? (sketch below)
- You can associate additional information with your images (e.g., class probabilities) (sketch below)
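To make the logging items above concrete, here is a minimal sketch of logging parameters and metric series with the Neptune Python client, using hierarchical namespaces. It assumes the 1.x `neptune` package; the project name, API token placeholder, and metric values are made up.

```python
import neptune

# Start a run; project and api_token are placeholders (use your own workspace).
run = neptune.init_run(
    project="my-workspace/my-project",
    api_token="YOUR_API_TOKEN",
)

# Assigning a dict creates a nested namespace: parameters/lr, parameters/batch_size, ...
run["parameters"] = {"lr": 1e-3, "batch_size": 64, "optimizer": "Adam"}

# append() builds a series under the given path, e.g. train/loss and val/accuracy.
for epoch in range(5):
    run["train/loss"].append(1.0 / (epoch + 1))      # dummy values for illustration
    run["val/accuracy"].append(0.70 + 0.05 * epoch)

run.stop()
```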
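As an illustration of what an integration gives you by default, below is a sketch using the Keras callback. It assumes the `neptune-tensorflow-keras` integration package is installed; the model and data are throwaway placeholders. The callback logs losses and metrics for you, so there are no manual `run[...]` calls in the training loop.

```python
import numpy as np
import tensorflow as tf

import neptune
from neptune.integrations.tensorflow_keras import NeptuneCallback  # needs neptune-tensorflow-keras

run = neptune.init_run(project="my-workspace/my-project")  # placeholder project

# Tiny model and random data, only to make the example self-contained.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")

# Per-epoch and per-batch metrics land under the "training" namespace automatically.
model.fit(x, y, epochs=3, callbacks=[NeptuneCallback(run=run, base_namespace="training")])

run.stop()
```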
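Files and images follow the same pattern. The sketch below uploads a single file and appends to an image series, attaching a short description (e.g., a class probability) to each image. The paths and labels are hypothetical, and the `name`/`description` arguments reflect my reading of the 1.x client, so treat them as assumptions to verify against the docs.

```python
import neptune
from neptune.types import File

run = neptune.init_run(project="my-workspace/my-project")  # placeholder project

# Upload a single file, e.g. a model checkpoint (hypothetical path).
run["model/checkpoint"].upload("checkpoints/best.ckpt")

# Append images to a series; name/description let you attach extra information
# to each image, such as the predicted class probability.
run["train/predictions"].append(
    File("plots/sample_0.png"),               # hypothetical image file
    name="sample_0",
    description="predicted: cat (p=0.93)",    # made-up probability
)

run.stop()
```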
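Tabular data such as a pandas DataFrame can be uploaded as a file. One common pattern, sketched here with made-up numbers, is to convert the frame to HTML for an interactive table, or to CSV for the raw values.

```python
import pandas as pd

import neptune
from neptune.types import File

run = neptune.init_run(project="my-workspace/my-project")  # placeholder project

# Small DataFrame with made-up validation results, just for illustration.
df = pd.DataFrame({"epoch": [1, 2, 3], "val_acc": [0.71, 0.78, 0.82]})

# As an interactive HTML table, and as a plain CSV file.
run["evaluation/metrics_table"].upload(File.as_html(df))
run["evaluation/metrics_csv"].upload(File.from_content(df.to_csv(index=False), extension="csv"))

run.stop()
```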
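Source code is attached when the run is created; the `source_files` patterns below describe a hypothetical project layout. Git information is picked up automatically when the script runs inside a Git repository, so no extra call is needed for it.

```python
import neptune

# source_files accepts paths or glob patterns; the matched files are snapshotted
# with the run so you can later inspect the exact code that produced it.
run = neptune.init_run(
    project="my-workspace/my-project",        # placeholder project
    source_files=["train.py", "model/*.py"],  # hypothetical file layout
)

# ... training code ...

run.stop()
```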
Read the documentation
Couldn’t find the use case you were looking for?
Just get in touch, and our ML team will create a custom demo for you.