Kaggle’s global community of practitioners, researchers, and enthusiasts collaborates to shape the frontier of AI. Through AI competitions, benchmarks, and agentic evaluation, Kaggle serves as both the engine and the proving ground for community-led innovation.
Kaggle
Check out @robmulla's live data analysis of our 2022 annual user survey results!
2 years ago | [YT] | 26
Kaggle
Did you miss this episode of Chai Time ft. the Kaggle Staff? We're a little biased, but it's pretty good! Thanks to @ChaiTimeDataScience
4 years ago | [YT] | 9
Kaggle
We love this livestream by Kaggle GM @abhishekkrthakur
5 years ago | [YT] | 11
Kaggle
No streaming this Wednesday due to the holiday, but we'll be back on Friday at the usual time. Talk to you then and warm wishes for the holidays. :)
5 years ago | [YT] | 35
Kaggle
No live streaming this morning; Rachael is still sick. 😷😞
6 years ago | [YT] | 40
Kaggle
No stream today, Kagglers. Rachael is out sick. 😞
6 years ago | [YT] | 36
Kaggle
No livestreaming this week, Kagglers! Rachael is traveling all week.
We'll see you next Wednesday, Nov 20, at the usual time. 😊
6 years ago | [YT] | 52
Kaggle
Time to pick the next paper for the Kaggle reading group! Here are your choices:
> "Learning from Dialogue after Deployment: Feed Yourself, Chatbot!" (Hancock et al., 2019) was published at ACL in 2019 & proposes a method for a dialog agent to continue to learn and update with user feedback. aclweb.org/anthology/P19-1358.pdf
> "AutoML using Metadata Language Embeddings" (Drori et al., unpublished) proposes a method to augment existing AutoML systems by incorporating information from the text description of the dataset & descriptions of possible algorithms to use. arxiv.org/pdf/1910.03698.pdf
> "Machine Learning for Scent: Learning Generalizable Perceptual Representations of Small Molecules" (Sanchez-Lengeling et al., unpublished) outlines an approach to predicting the scent of a molecule given its structure using graph neural networks. arxiv.org/pdf/1910.10685.pdf
6 years ago | [YT] | 32
Kaggle
Alright folks, time to vote on the next paper for the Kaggle reading group! 📑🤓 Your options are:
* EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (Tan & Le, 2019) is from ICML 2019 and proposes a new method for scaling up CNN models & improving accuracy. proceedings.mlr.press/v97/tan19a/tan19a.pdf
* ALBERT (anonymous, under review) is "a new pretraining method that establishes new state-of-the-art results on the GLUE, RACE, and SQuAD benchmarks while having fewer parameters compared to BERT-large." openreview.net/forum?id=H1eA7AEtvS
* Deep Learning for Symbolic Mathematics (anonymous, under review) "train[s] a neural network to compute function integrals, and to solve complex differential equations". openreview.net/forum?id=S1eZYeHFDS
6 years ago | [YT] | 22
Kaggle
No live coding on Friday because Rachael is out of office. She'll be back next Wednesday for the Kaggle Reading Group, though; keep your eyes peeled for the poll for the next paper!
6 years ago (edited) | [YT] | 66