Alright folks, time to vote on the next paper for the Kaggle reading group! 📑🤓 Your options are:
* EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (Tan & Le, ICML 2019) proposes a new method for scaling up CNN models and improving accuracy. proceedings.mlr.press/v97/tan19a/tan19a.pdf
* ALBERT (anonymous, under review) is "a new pretraining method that establishes new state-of-the-art results on the GLUE, RACE, and SQuAD benchmarks while having fewer parameters compared to BERT-large." openreview.net/forum?id=H1eA7AEtvS
* Deep Learning for Symbolic Mathematics (anonymous, under review) "train[s] a neural network to compute function integrals, and to solve complex differential equations". openreview.net/forum?id=S1eZYeHFDS