DeepSeek engineers are pure genius
To use DeepSeek's API, you `npm install openai`. Yes, you read that right: you can use DeepSeek through OpenAI's client libraries, because DeepSeek's REST API is fully compatible with OpenAI's.
This is hilarious and yet genius:
1. DeepSeek saved weeks of engineering on Node.js and Python client libs by simply piggybacking on OpenAI's library code.
2. Developers using OpenAI can easily try out or migrate to DeepSeek by changing a few lines of code – just swap the base URL and API key (see the sketch after this list).
3. If DeepSeek ever needs to deviate, they can fork and `s/openai/deepseek/`.
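Here's a minimal sketch of what that looks like in Node.js. I'm assuming your key lives in a `DEEPSEEK_API_KEY` environment variable; the base URL and model name are the ones DeepSeek documents, but double-check their docs for current values:

```ts
import OpenAI from "openai";

// Point the stock OpenAI client at DeepSeek's API instead of OpenAI's.
const client = new OpenAI({
  baseURL: "https://api.deepseek.com",
  apiKey: process.env.DEEPSEEK_API_KEY, // assumption: key stored in this env var
});

// Same chat.completions API shape as OpenAI – only the model name differs.
const completion = await client.chat.completions.create({
  model: "deepseek-chat",
  messages: [{ role: "user", content: "Say hello in one sentence." }],
});

console.log(completion.choices[0].message.content);
```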
That said, my favorite way to use LLMs is still Vercel's `ai` npm library, which abstracts over many models, including Claude and Llama. As the sudden emergence of DeepSeek shows, you never know when another strong competitor will appear, so it's best not to couple your app to any specific LLM.
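For reference, here's roughly what that looks like with the AI SDK – a sketch assuming a recent version of the `ai` package plus the `@ai-sdk/openai` provider, which can be pointed at any OpenAI-compatible endpoint, DeepSeek included:

```ts
import { generateText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

// Reusing the OpenAI-compatible provider against DeepSeek's endpoint.
const deepseek = createOpenAI({
  baseURL: "https://api.deepseek.com",
  apiKey: process.env.DEEPSEEK_API_KEY, // assumption: key stored in this env var
});

const { text } = await generateText({
  model: deepseek("deepseek-chat"),
  prompt: "Say hello in one sentence.",
});

console.log(text);
```

Swapping to Claude or Llama is then just a matter of changing the provider passed to `generateText`, which is exactly the kind of decoupling I'm arguing for.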
On a side note, DeepSeek's docs website uses Docusaurus!