<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Jeet Ganatra — lab</title>
    <description>work-in-progress experiments and questions</description>
    <link>https://jeetganatra.me/</link>
    <item>
      <title>fine-tuning nanochat on math</title>
      <link>https://jeetganatra.me/lab#finetune-nanochat-math/</link>
      <guid isPermaLink="true">https://jeetganatra.me/lab#finetune-nanochat-math/</guid>
      <description>does GRPO actually help on AIME problems?</description>
      <pubDate>Wed, 22 Apr 2026 00:00:00 GMT</pubDate>
      <category>running</category>
      <category>rl</category>
      <category>math</category>
      <category>agents</category>
    </item>
    <item>
      <title>agentic eval harness</title>
      <link>https://jeetganatra.me/lab#agentic-eval-harness/</link>
      <guid isPermaLink="true">https://jeetganatra.me/lab#agentic-eval-harness/</guid>
      <description>LLM-as-judge, but with rubrics.</description>
      <pubDate>Mon, 20 Apr 2026 00:00:00 GMT</pubDate>
      <category>running</category>
      <category>evals</category>
      <category>agents</category>
    </item>
    <item>
      <title>speculative decoding from scratch</title>
      <link>https://jeetganatra.me/lab#speculative-decoding/</link>
      <guid isPermaLink="true">https://jeetganatra.me/lab#speculative-decoding/</guid>
      <description>how much latency can a draft model save?</description>
      <pubDate>Sat, 18 Apr 2026 00:00:00 GMT</pubDate>
      <category>running</category>
      <category>inference</category>
      <category>latency</category>
    </item>
    <item>
      <title>reading: attention is all you need (5th time)</title>
      <link>https://jeetganatra.me/lab#reading-attention-is-all-you-need/</link>
      <guid isPermaLink="true">https://jeetganatra.me/lab#reading-attention-is-all-you-need/</guid>
      <description>still finding new things in the same paper.</description>
      <pubDate>Mon, 30 Mar 2026 00:00:00 GMT</pubDate>
      <category>paused</category>
      <category>transformers</category>
      <category>papers</category>
    </item>
    <item>
      <title>long-context experiments</title>
      <link>https://jeetganatra.me/lab#long-context-experiments/</link>
      <guid isPermaLink="true">https://jeetganatra.me/lab#long-context-experiments/</guid>
      <description>what breaks past 100k tokens?</description>
      <pubDate>Tue, 10 Mar 2026 00:00:00 GMT</pubDate>
      <category>paused</category>
      <category>long-context</category>
      <category>inference</category>
    </item>
    <item>
      <title>loss curve forensics</title>
      <link>https://jeetganatra.me/lab#loss-curve-forensics/</link>
      <guid isPermaLink="true">https://jeetganatra.me/lab#loss-curve-forensics/</guid>
      <description>why did training spike at step 11k?</description>
      <pubDate>Sat, 28 Feb 2026 00:00:00 GMT</pubDate>
      <category>shipped</category>
      <category>training</category>
      <category>debugging</category>
    </item>
  </channel>
</rss>