
Very nicely written article. Personally, I find RAG (and, more abstractly, vector search) the only mildly interesting development in the latest LLM fad, and I have always felt that LLMs sit too far down the diminishing-returns curve to be interesting. That said, I can't believe tokenization, and embeddings in general, aren't broadly considered the single most important aspect of all deep learning. The latent space your model captures is the heart of the whole pipeline; otherwise, what is any deep learning model even doing?
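
To make the vector-search point concrete, here's a minimal sketch of the retrieval step at the core of RAG: embed your documents and your query, then rank documents by cosine similarity in the latent space. The embed() function below is a made-up stand-in so the example runs on its own; a real system would call an actual embedding model.

    # Minimal sketch of RAG-style vector search: rank documents by the
    # cosine similarity between their embeddings and a query embedding.
    import hashlib
    import numpy as np

    def embed(texts: list[str]) -> np.ndarray:
        # Stand-in for a real embedding model: deterministic fake
        # vectors keyed on the text content, so the demo is standalone.
        vecs = []
        for t in texts:
            seed = int(hashlib.md5(t.encode()).hexdigest()[:8], 16)
            vecs.append(np.random.default_rng(seed).standard_normal(8))
        return np.array(vecs)

    def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
        d = embed(docs)                # shape (n_docs, dim)
        q = embed([query])[0]          # shape (dim,)
        # Cosine similarity = dot product of L2-normalized vectors.
        d = d / np.linalg.norm(d, axis=1, keepdims=True)
        q = q / np.linalg.norm(q)
        scores = d @ q
        return [docs[i] for i in np.argsort(scores)[::-1][:k]]

    docs = ["LLM scaling laws", "vector databases", "tokenization schemes"]
    print(top_k("how do embeddings power retrieval?", docs))

Whether retrieval actually works then comes down entirely to how good that latent space is, which is the point above: the embedding is the whole game.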

