Transformer Models for Textual Data Prediction
Transformer models such as Google's BERT and OpenAI's GPT-3 continue to change how we think about Machine Learning (ML) and Natural Language Processing (NLP). Look no further than GitHub's recent ...
