It's local-first, and it ties many different AIs into one text editor (any arbitrary text editor, in fact).
It does speech recognition, which isn't useful for writing code, but is useful for generating natural language LLM prompts and comments.
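For the dictation part, a speech-to-text step might look roughly like this with the open-source whisper package (purely illustrative; I'm not claiming this is the exact library or model the project uses):

    # Sketch of transcribing a recorded prompt into text for an LLM.
    # Assumes the openai-whisper package; model size and file name are made up.
    import whisper

    model = whisper.load_model("base")
    result = model.transcribe("dictated_prompt.wav")
    print(result["text"])  # insert this text into the editor as the LLM prompt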
It does CodeLlama (and any HuggingFace-based language model).
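For the local-model side, loading a code model from the Hugging Face hub has roughly this shape with the transformers pipeline (a sketch only; the model ID and generation settings are illustrative, not the project's defaults):

    # Run a HuggingFace-hosted code model locally via the transformers pipeline.
    # Model ID and settings are illustrative, not this project's configuration.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="codellama/CodeLlama-7b-hf",
        device_map="auto",  # use a GPU if one is available
    )

    completion = generator("def fibonacci(n):", max_new_tokens=64, do_sample=False)
    print(completion[0]["generated_text"])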
It does ChatGPT.
It does Retrieval-Augmented Generation (RAG), which is where you have a query that searches through e.g. PDFs, YouTube transcripts, code bases, HTML, local or online files, arXiv papers, etc. It then surfaces passages relevant to your query, which you can then feed to an LLM as context.
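To make that retrieval step concrete, here's a minimal sketch assuming sentence-transformers embeddings and a plain semantic search (the project's actual pipeline may well differ):

    # Embed passages, embed the query, surface the most similar passages,
    # then hand them to an LLM as prompt context.
    # Assumes the sentence-transformers package; the passages are placeholders.
    from sentence_transformers import SentenceTransformer, util

    passages = [
        "A paragraph extracted from a PDF...",
        "A segment of a YouTube transcript...",
        "A docstring pulled from a local code base...",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")
    passage_embeddings = model.encode(passages, convert_to_tensor=True)

    query = "How does tokenization work?"
    query_embedding = model.encode(query, convert_to_tensor=True)

    # Rank passages by cosine similarity and keep the top hits.
    hits = util.semantic_search(query_embedding, passage_embeddings, top_k=2)[0]
    context = "\n".join(passages[hit["corpus_id"]] for hit in hits)

    # The retrieved context gets prepended to the LLM prompt.
    prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"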
I don't know what mainstream LLM-powered software looks like, but as a dev, I love this format of tying the best models, as they're released, into one central repo where they can all play off each other's strengths.