bycloud

🚨This week’s top AI/ML research papers:
- Sparse Crosscoders
- Rethinking Softmax
- Mechanistic Unlearning
- Decomposing The Dark Matter of Sparse Autoencoders
- ZIP-FIT
- Automatically Interpreting Millions of Features in Large Language Models
- Breaking the Memory Barrier
- Can Knowledge Editing Really Correct Hallucinations?
- Framer: Interactive Frame Interpolation
- Beyond position
- A Hitchhiker's Guide to Scaling Law Estimation
- Scaling up Masked Diffusion Models on Text
- Why Does the Effective Context Length of LLMs Fall Short?
- Scaling Diffusion Language Models via Adaptation from Autoregressive Models
- Improve Vision Language Model Chain-of-thought Reasoning
- PyramidDrop
- FrugalNeRF
- SAM2Long
- SeerAttention
- FiTv2

overview for each + authors' explanations
x.com/TheAITimeline/thread/1850237734381834447

read it on a website instead
mail.bycloud.ai/p/this-week-s-top-ai-ml-research-p…

Hope you like it, and have a great week!

join patreon to support me:
www.patreon.com/bycloud

6 months ago (edited) | 389



@deltamico

> be told it slowed down
> check the list
> didn't shorten

yeah that's fair

6 months ago | 15

@aakashsensharma7675

I've been researching machine unlearning for a while now, glad to see it's gaining more traction.

6 months ago | 5

@rawallon

Bro I swear if more ppl leave OpenAI

6 months ago | 12

@mmmm768

Talk about Pixtral 12B already

6 months ago | 10

@astronemir

Hey, I know it may be too much to ask, but can you do monthly or quarterly recaps/highlights? You could even turn those highlights into a literature review paper or publication.

6 months ago | 0

@gvteja4908

bruh tone down the list, this feels like an attempt to get people to read your newsletter rather than actually giving a good summary of which papers are worth people's time.

6 months ago | 0

@gerdaleta

You focus too much on AI research; there's a whole other thing going on with brain organoid computers. Eventually the two will combine with each other, and that will be the end of the human race.

6 months ago | 1