WHAT I'M DOING NOW
Last updated: January 26, 2026
THIS_WEEK.log
$ tail -f progress.log
[2026-01-24 09:00] Started the Building LLM Reasoners course. Greg Durrett is incredible.
[2026-01-25 14:30] Reading "Attention Is All You Need" for the 5th time. Finally understanding why layer normalization placement matters.
[2026-01-26 20:00] Implementing FlashAttention from scratch. 6 hours debugging positional encoding. Finally works. ✓
█
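The layer-norm placement point from the Jan 25 entry is easiest to see in code. A rough NumPy sketch (`attn` and `ffn` are hypothetical stand-ins for the real sublayers, not code from any of my projects):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def post_ln_block(x, attn, ffn):
    # Original Transformer (Vaswani et al.): normalize AFTER each residual add.
    x = layer_norm(x + attn(x))
    return layer_norm(x + ffn(x))

def pre_ln_block(x, attn, ffn):
    # GPT-style: normalize the sublayer INPUT and keep the residual path
    # clean, so gradients flow through the skip connection unscaled.
    # This is a big part of why pre-LN models train stably at depth.
    x = x + attn(layer_norm(x))
    return x + ffn(layer_norm(x))
```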
CURRENT FOCUS
COURSE.md
Building LLM Reasoners
Spring 2026 · Professor Greg Durrett · NYU
- Learning transformer architecture from first principles
- Implementing RLHF and chain-of-thought reasoning (see the sketch after this list)
- Understanding how frontier models actually think
- Not just API calls, but the real mechanics
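The RLHF piece starts with a reward model trained on human preference pairs, and the core objective is small enough to write out. A hedged sketch of the standard Bradley-Terry loss (names are illustrative, not course code):

```python
import numpy as np

def preference_loss(r_chosen, r_rejected):
    """Bradley-Terry reward-model loss over preference pairs.

    r_chosen / r_rejected: arrays of scalar reward-model scores for the
    human-preferred and rejected responses. Minimizing this pushes
    P(chosen > rejected) = sigmoid(r_chosen - r_rejected) toward 1.
    """
    margin = r_chosen - r_rejected
    # -log(sigmoid(margin)), written stably as log(1 + exp(-margin))
    return np.mean(np.logaddexp(0.0, -margin))
```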
PROJECTS.json
Active Projects
{
  "h1b_rag": {
    "status": "ongoing",
    "desc": "Deterministic eval for legal Q&A"
  },
  "pico_interp": {
    "status": "ongoing",
    "desc": "10M param model interpretability"
  },
  "transformers": {
    "status": "ongoing",
    "desc": "Building from scratch, no libs"
  }
}
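"Deterministic eval" for h1b_rag means the same cases and the same pipeline always produce the same score, byte for byte. A sketch of the shape of it (every name here is hypothetical, not the project's actual API, and it assumes the pipeline runs at temperature 0):

```python
import hashlib
import json

def eval_run(cases, answer_fn):
    """cases: list of {"question": str, "gold": str} dicts.
    answer_fn: the RAG pipeline under test."""
    results = []
    for case in sorted(cases, key=lambda c: c["question"]):  # stable order
        pred = answer_fn(case["question"])
        results.append({
            "question": case["question"],
            "exact_match": pred.strip().lower() == case["gold"].strip().lower(),
        })
    score = sum(r["exact_match"] for r in results) / len(results)
    # hash the full result set so any nondeterminism is immediately visible
    digest = hashlib.sha256(
        json.dumps(results, sort_keys=True).encode()
    ).hexdigest()
    return score, digest
```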
READING.txt
Currently Reading
- "Attention Is All You Need" - Vaswani et al.
- "Language Models are Few-Shot Learners" - GPT-3 paper
- "Constitutional AI" - Anthropic
INTERNSHIP SEARCH
TARGET_ROLES.txt
→ Forward Deployment Engineer (Palantir-style)
→ AI Product Engineer (0→1 products)
→ Technical roles at AI startups
REQUIREMENTS.txt
What I'm Looking For
- ⚡ Build products that ship to real users
- 🚜 Work in messy 0→1 problem spaces
- 🤖 Move fast
STATUS: ACTIVELY LOOKING FOR SUMMER 2026
GOALS THIS MONTH
GOALS.md
- [ ] Complete FlashAttention implementation (tiling sketch below)
- [ ] Finish H-1B RAG evaluation framework
- [ ] Submit 10 internship applications
- [ ] Write technical blog post on transformer internals
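On the FlashAttention goal: the trick that makes it work is never materializing the full n×n attention matrix. You stream over key/value blocks and carry a running max plus a running softmax denominator per query row (the "online softmax"). A minimal NumPy illustration of that idea, with my own names and block size rather than my actual implementation:

```python
import numpy as np

def tiled_attention(Q, K, V, block=64):
    """Attention without the full score matrix: process K/V in blocks,
    carrying running softmax statistics per query row."""
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(Q)
    row_max = np.full(n, -np.inf)   # running max of scores per query row
    row_sum = np.zeros(n)           # running softmax denominator per row

    for start in range(0, K.shape[0], block):
        Kb, Vb = K[start:start + block], V[start:start + block]
        scores = (Q @ Kb.T) * scale                 # (n, block)
        new_max = np.maximum(row_max, scores.max(axis=1))
        # rescale everything accumulated so far to the new running max
        corr = np.exp(row_max - new_max)
        p = np.exp(scores - new_max[:, None])
        row_sum = row_sum * corr + p.sum(axis=1)
        out = out * corr[:, None] + p @ Vb
        row_max = new_max

    return out / row_sum[:, None]
```

Comparing the output against a naive softmax(QKᵀ/√d)V on random inputs is the fastest way to catch the rescaling bugs.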
This page updates every Sunday.