# Nuxt Chatgpt

## About the module

## Features
- 💪 Easy implementation into any Nuxt 3 project.
- 👉 Type-safe integration of Chatgpt into your Nuxt 3 project.
- 🕹️ Provides a `useChatgpt()` composable that grants easy access to the `chat` and `chatCompletion` methods.
- 🔥 Ensures security by routing requests through a Nitro server, preventing the API key from being exposed.
- 🧱 Lightweight and performant.
## Getting Started
1. Add the `nuxt-chatgpt` dependency to your project

```bash
# npm
npm install --save-dev nuxt-chatgpt

# pnpm
pnpm add -D nuxt-chatgpt

# yarn
yarn add --dev nuxt-chatgpt
```
2. Add `nuxt-chatgpt` to the `modules` section of `nuxt.config.ts`

```js
export default defineNuxtConfig({
  modules: ["nuxt-chatgpt"],
  // entirely optional
  chatgpt: {
    apiKey: 'Your apiKey goes here'
  },
})
```
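To avoid committing the key to version control, you can read it from an environment variable in `nuxt.config.ts` instead of hard-coding it. This is a minimal sketch using standard Nuxt config; the variable name `CHATGPT_API_KEY` is just an example, not something the module requires:

```js
// nuxt.config.ts
// The `apiKey` option comes from this module's config; the environment
// variable name (CHATGPT_API_KEY) is an arbitrary choice for this example.
export default defineNuxtConfig({
  modules: ["nuxt-chatgpt"],
  chatgpt: {
    apiKey: process.env.CHATGPT_API_KEY,
  },
})
```

Set the variable in a local `.env` file (which Nuxt loads during development) and keep that file out of git.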
That's it! You can now use Nuxt Chatgpt in your Nuxt app 🔥
## Usage & Examples
To access the `chat` and `chatCompletion` methods in the nuxt-chatgpt module, use the `useChatgpt()` composable, which provides easy access to them. The `chat` and `chatCompletion` methods require three parameters.
Available models for `chat`:
- text-davinci-003
- text-davinci-002
Available models for `chatCompletion`:
- gpt-3.5-turbo
- gpt-3.5-turbo-0301

You need to join the waitlist to use gpt-4 models within the `chatCompletion` method:
- gpt-4
- gpt-4-0314
- gpt-4-32k
- gpt-4-32k-0314
### Simple `chat` usage
In the following example, the model is unspecified, and the text-davinci-003 model will be used by default.
```js
const { chat } = useChatgpt()

const data = ref('')
const message = ref('')

async function sendMessage() {
  const response = await chat(message.value)
  data.value = response
}
```

```html
<template>
  <div>
    <input v-model="message">
    <button
      @click="sendMessage"
      v-text="'Send'"
    />
    <div>{{ data }}</div>
  </div>
</template>
```
### Usage of `chat` with a different model
```js
const { chat } = useChatgpt()

const data = ref('')
const message = ref('')

async function sendMessage() {
  const response = await chat(message.value, 'text-davinci-002')
  data.value = response
}
```

```html
<template>
  <div>
    <input v-model="message">
    <button
      @click="sendMessage"
      v-text="'Send'"
    />
    <div>{{ data }}</div>
  </div>
</template>
```
### Simple `chatCompletion` usage
In the following example, the model is unspecified, and the gpt-3.5-turbo model will be used by default.
```js
const { chatCompletion } = useChatgpt()

const data = ref('')
const message = ref('')

async function sendMessage() {
  const response = await chatCompletion(message.value)
  data.value = response
}
```

```html
<template>
  <div>
    <input v-model="message">
    <button
      @click="sendMessage"
      v-text="'Send'"
    />
    <div>{{ data }}</div>
  </div>
</template>
```
### Usage of `chatCompletion` with a different model
```js
const { chatCompletion } = useChatgpt()

const data = ref('')
const message = ref('')

async function sendMessage() {
  const response = await chatCompletion(message.value, 'gpt-3.5-turbo-0301')
  data.value = response
}
```

```html
<template>
  <div>
    <input v-model="message">
    <button
      @click="sendMessage"
      v-text="'Send'"
    />
    <div>{{ data }}</div>
  </div>
</template>
```
## chat vs chatCompletion
The `chat` method allows the user to send a prompt to the OpenAI API and receive a response. You can use this endpoint to build conversational interfaces that interact with users in a natural way. For example, you could use the `chat` method to build a chatbot that answers customer service questions or provides information about a product or service.

The `chatCompletion` method is similar to the `chat` method, but it provides additional functionality for generating longer, more complex responses. Specifically, the `chatCompletion` method allows you to provide a conversation history as input, which the API can use to generate a response that is consistent with the context of the conversation. This makes it possible to build chatbots that can engage in longer, more natural conversations with users.
## Module Options
## Contributing
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
1. Fork the Project
2. Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
3. Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the Branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## License

Distributed under the MIT License. See `LICENSE.txt` for more information.
## Contact
Oliver Trajceski - LinkedIn - oliver@akrinum.com
Project Link: https://github.com/schnapsterdog/nuxt-chatgpt
## Development
```bash
# Install dependencies
npm install

# Generate type stubs
npm run dev:prepare

# Develop with the playground
npm run dev

# Build the playground
npm run dev:build

# Run ESLint
npm run lint

# Run Vitest
npm run test
npm run test:watch

# Release a new version
npm run release
```