GPTI

This package simplifies your interaction with various GPT models, eliminating the need for tokens or other credentials to access them. It also lets you generate images with several artificial intelligence models, including DALL·E and Prodia; some are premium while others are free, all without restrictions or limits.

Installation

You can install the package via npm:

npm i gpti

Available Models

GPTI provides access to a variety of artificial intelligence models to meet different needs. The currently available models are described in the sections below.

API Key

To access the premium (PRO) models, enter your credentials. You can obtain them by clicking here.

// import { nexra } from "gpti";
const { nexra } = require("gpti");

nexra("user-xxxxxxxx", "nx-xxxxxxx-xxxxx-xxxxx");

Usage GPT

// import { gpt } from "gpti";
const { gpt } = require("gpti");

gpt.v1({
    messages: [
        {
            role: "assistant",
            content: "Hello! How are you today?"
        },
        {
            role: "user",
            content: "Hello, my name is Yandri."
        },
        {
            role: "assistant",
            content: "Hello, Yandri! How are you today?"
        }
    ],
    prompt: "Can you repeat my name?",
    model: "GPT-4",
    markdown: false
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "model": "gpt-4",
    "gpt": "Hello, Yandri! How can I assist you today?",
    "original": null
}
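
Every call in this package uses the Node-style callback shown above. If you prefer async/await, you can wrap a non-streaming call in a Promise yourself; this wrapper is not part of the package API and assumes the callback fires exactly once with (err, data):

// import { gpt } from "gpti";
const { gpt } = require("gpti");

// Hypothetical helper: promisifies gpt.v1 for async/await usage (non-streaming only).
function gptV1(options) {
    return new Promise((resolve, reject) => {
        gpt.v1(options, (err, data) => {
            if(err != null){
                reject(err);
            } else {
                resolve(data);
            }
        });
    });
}

(async () => {
    const data = await gptV1({
        messages: [
            { role: "user", content: "Hello, my name is Yandri." }
        ],
        prompt: "Can you repeat my name?",
        model: "GPT-4",
        markdown: false
    });
    console.log(data.gpt);
})();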

Models

Select one of the following models available in the API.

  • gpt-4
  • gpt-4-0613
  • gpt-4-32k
  • gpt-4-0314
  • gpt-4-32k-0314
  • gpt-3.5-turbo
  • gpt-3.5-turbo-16k
  • gpt-3.5-turbo-0613
  • gpt-3.5-turbo-16k-0613
  • gpt-3.5-turbo-0301
  • text-davinci-003
  • text-davinci-002
  • code-davinci-002
  • gpt-3
  • text-curie-001
  • text-babbage-001
  • text-ada-001
  • davinci
  • curie
  • babbage
  • ada
  • babbage-002
  • davinci-002

Usage GPT v2

It works much like the previous version, with the difference that it can generate real-time responses via streaming, using gpt-3.5-turbo.

// import { gpt } from "gpti";
const { gpt } = require("gpti");

gpt.v2({
    messages: [
        {
            "role": "assistant",
            "content": "Hello! How are you today?"
        },
        {
            "role": "user",
            "content": "Hello, my name is Yandri."
        },
        {
            "role": "assistant",
            "content": "Hello, Yandri! How are you today?"
        },
        {
            "role": "user",
            "content": "Can you repeat my name?"
        }
    ],
    markdown: false,
    stream: false
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "model": "ChatGPT",
    "message": "Of course, Yandri! How can I assist you today?",
    "original": null
}

JSON Streaming

{"message":"","original":null,"finish":false,"error":false}
{"message":"Of","original":null,"finish":false,"error":false}
{"message":"Of course","original":null,"finish":false,"error":false}
{"message":"Of course,","original":null,"finish":false,"error":false}
{"message":"Of course, Yand","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri!","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I assist","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I assist you","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I assist you today","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I assist you today?","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I assist you today?","original":null,"finish":false,"error":false}
{"message":"Of course, Yandri! How can I assist you today?","original":null,"finish":false,"error":false}
{"message":null,"original":null,"finish":true,"error":false}

Usage GPT Web

GPT-4 has been enhanced here with web access, but errors may arise due to the underlying complexity, so exercise caution before relying entirely on its accuracy for online queries.

// import { gpt } from "gpti";
const { gpt } = require("gpti");

gpt.web({
    prompt: "Are you familiar with the movie Wonka released in 2023?",
    markdown: false
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "gpt": "Yes, I am familiar with the movie Wonka released in 2023. Wonka is a musical fantasy film directed by Paul King, adapted from the character at the center of Roald Dahl's iconic children's book, \"Charlie and the Chocolate Factory.\" The film follows the story of a young and poor Willy Wonka as he dreams of opening a shop in a chocolate-renowned city and discovers that the industry is controlled by a greedy cartel. The film has a rating of 7.1/10 and has received positive reviews with a score of 83% on Rotten Tomatoes. It was released on December 15, 2023, and has earned $552.1 million at the box office. The cast includes actors such as Timothée Chalamet. Unfortunately, I couldn't find information on whether the movie is available on Netflix.",
    "original": null
}

Usage Bing

// import { bing } from "gpti";
const { bing } = require("gpti");

bing({
    messages: [
        {
            role: "assistant",
            content: "Hello! How can I help you today? 😊"
        },
        {
            role: "user",
            content: "Hi, tell me the names of the movies released in 2023."
        },
        {
            role: "assistant",
            content: "Certainly! Here are some movies that were released in 2023:\n\n1.  **About My Father** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n2.  **The Little Mermaid** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n3.  **Fast X** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n4.  **Spider-Man: Across the Spider-Verse** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n5.  **The Machine** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n6.  **Book Club: The Next Chapter** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n7.  **Guardians of the Galaxy Vol. 3** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n8.  **John Wick: Chapter 4** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n9.  **Are You There God? It's Me, Margaret** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n10.  **Evil Dead Rise** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n11.  **The Super Mario Bros. Movie** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n12.  **Love Again** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n13.  **Kandahar** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n14.  **Dungeons & Dragons: Honor Among Thieves** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n15.  **Shin Kamen Rider** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n16.  **Knights of the Zodiac** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n17.  **The Pope's Exorcist** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n18.  **Shazam! Fury of the Gods** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n19.  **All That Breathes** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n20.  **Sailor Moon Cosmos** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n21.  **Hypnotic** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n22.  **Sound of Freedom** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n23.  **The Boogeyman** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n24.  **Chicken Run: Dawn of the Nugget** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n25.  **A Lot of Nothing** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n26.  **Followers** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n27.  **Big George Foreman** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n28.  **Asterix & Obelix: The Middle Kingdom** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n29.  **Ant-Man and the Wasp: Quantumania** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n30.  **Transformers: Rise of the Beasts** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n31.  **Follow Her** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n32.  **Prom Pact** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n33.  **God Is a Bullet** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n34.  **Still: A Michael J. Fox Movie** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n35.  
**Nefarious** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n36.  **Nanny Dearest** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n37.  **Monica** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n38.  **Wild Life** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n39.  **Palm Trees and Power Lines** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n40.  **What's Love Got to Do with It?** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n41.  **Creed III** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n42.  **One True Loves** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n43.  **BlackBerry** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n44.  **Suzume** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n45.  **Rock Dog 3: Battle the Beat** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n46.  **Gridman Universe** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n47.  **Digimon Adventure 02: The Beginning** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n48.  **Woman of the Photographs** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n49.  **El Tonto** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n50.  **Seriously Red** [^1^](https://editorial.rottentomatoes.com/guide/best-movies-of-2023/)\n\nI hope this helps! Let me know if you have any other questions."
        },
        {
            role: "user",
            content: "Can you tell me how many movies you've told me about?"
        }
    ],
    conversation_style: "Balanced",
    markdown: false,
    stream: false,
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "model": "Bing",
    "message": "I have told you about **50 movies** that were released in 2023. Is there anything else I can help you with?"
    "original:": null
}

JSON Streaming

{"message":"I","original":null,"finish":false,"error":false}
{"message":"I have","original":null,"finish":false,"error":false}
{"message":"I have told","original":null,"finish":false,"error":false}
{"message":"I have told you","original":null,"finish":false,"error":false}
{"message":"I have told you about","original":null,"finish":false,"error":false}
{"message":"I have told you about \\*\\*","original":null,"finish":false,"error":false}
{"message":"I have told you about \\*\\*50","original":null,"finish":false,"error":false}
{"message":"I have told you about \\*\\*50 movies","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies**","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 202","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023.","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help you","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help you with","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help you with?","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help you with?","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help you with?","original":null,"finish":false,"error":false}
{"message":"I have told you about **50 movies** that were released in 2023. Is there anything else I can help you with?","original":null,"finish":false,"error":false}
{"message":null,"original":null,"finish":true,"error":false}

Parameters

Parameter | Default | Description
conversation_style | Balanced | Choose between "Balanced", "Creative", and "Precise"
markdown | false | Whether the response is returned formatted as Markdown
stream | false | Whether the response is streamed in real time

Usage LLaMA-2

// import { llama2 } from "gpti";
const { llama2 } = require("gpti");

llama2({
    messages:  [
        {
            "role": "assistant",
            "content": "Hello! How are you?"
        },
        {
            "role": "user",
            "content": "Hello! How are you? Could you tell me your name?"
        }
    ],
    system_message: "",
    temperature: 0.9,
    max_tokens: 4096,
    top_p: 0.6,
    repetition_penalty: 1.2,
    markdown: false,
    stream: false
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "model": "LLaMA2",
    "message": "Sure, my name is LLaMA. I'm doing well, thanks for asking! Is there anything else you would like to chat about or ask me?",
    "original": null
}

JSON Streaming

{"message":"Hello","original":null,"finish":false,"error":false}
{"message":"Hello!","original":null,"finish":false,"error":false}
{"message":"Hello! I","original":null,"finish":false,"error":false}
{"message":"Hello! I'","original":null,"finish":false,"error":false}
{"message":"Hello! I'm","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well,","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking.","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is L","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLa","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA,","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of research","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta A","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI.","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you?","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you? What","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you? What brings","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you? What brings you","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you? What brings you here","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you? What brings you here today","original":null,"finish":false,"error":false}
{"message":"Hello! I'm doing well, thanks for asking. My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. How about you? What brings you here today?","original":null,"finish":false,"error":false}
{"message":null,"original":null,"finish":true,"error":false}

Parameters

Parameter | Default | Description
system_message | | Describes the specific task you want the model to perform
max_tokens | 4096 | Min: 0, Max: 4096
temperature | 0.9 | Min: 0, Max: 1
top_p | 0.6 | Min: 0, Max: 1
repetition_penalty | 1.2 | Min: 1, Max: 2
markdown | false | Whether the response is returned formatted as Markdown
stream | false | Whether the response is streamed in real time

Usage DALL·E

// import { dalle } from "gpti";
const { dalle } = require("gpti");

dalle.v1({
    prompt: "starry sky over the city"
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "starry sky over the city",
    "model": "DALL·E",
    "images": [
        "data:image/jpeg;base64,..."
    ]
}
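
The images array contains base64 data URIs, so you can decode and save them with Node's built-in fs module. A minimal sketch, assuming the "data:image/jpeg;base64,..." format shown above:

// import { dalle } from "gpti";
const fs = require("fs");
const { dalle } = require("gpti");

dalle.v1({
    prompt: "starry sky over the city"
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        // Strip the "data:image/jpeg;base64," prefix and decode the rest.
        const base64 = data.images[0].split(",")[1];
        fs.writeFileSync("starry-sky.jpg", Buffer.from(base64, "base64"));
        console.log("Saved starry-sky.jpg");
    }
});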

Usage DALL·E 2 (PRO)

// import { dalle } from "gpti";
const { dalle } = require("gpti");

dalle.v2({
    prompt: "An extensive green valley stretches toward imposing mountains, adorned with meadows and a winding stream. The morning sun paints the sky with warm tones, illuminating the landscape with a serenity that invites contemplation and peace.",
    data: {
        prompt_negative: "",
        width: 1024,
        height: 1024,
        guidance_scale: 6
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An extensive green valley stretches toward imposing mountains, adorned with meadows and a winding stream. The morning sun paints the sky with warm tones, illuminating the landscape with a serenity that invites contemplation and peace.",
    "model": "DALL·E-2",
    "data": {
        "prompt_negative": "(deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, (mutated hands and fingers:1.4), disconnected limbs, mutation, mutated, ugly, disgusting, blurry, amputation, (NSFW:1.25)",
        "width": 1024,
        "height": 1024,
        "guidance_scale": 6
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, (mutated hands and fingers:1.4), disconnected limbs, mutation, mutated, ugly, disgusting, blurry, amputation, (NSFW:1.25) | Indicates what the AI should not do
width | 1024 | Min: 512, Max: 2048
height | 1024 | Min: 512, Max: 2048
guidance_scale | 6 | Min: 0.1, Max: 20

Usage DALL·E Mini

// import { dalle } from "gpti";
const { dalle } = require("gpti");

dalle.mini({
    prompt: "An extensive green valley stretches toward imposing mountains, adorned with meadows and a winding stream. The morning sun paints the sky with warm tones, illuminating the landscape with a serenity that invites contemplation and peace."
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An extensive green valley stretches toward imposing mountains, adorned with meadows and a winding stream. The morning sun paints the sky with warm tones, illuminating the landscape with a serenity that invites contemplation and peace.",
    "model": "DALL·E-mini",
    "images": [
        "data:image/jpeg;base64,...",
        "..."
    ]
}

Usage Prodia

// import { prodia } from "gpti";
const { prodia } = require("gpti");

prodia.v1({
    prompt: "Friends gathered around a bonfire in an ancient forest. Laughter, stories, and a starry sky paint an unforgettable moment of connection beneath the shadows of the mountains.",
    data: {
        model: "absolutereality_V16.safetensors [37db0fc3]",
        steps: 25,
        cfg_scale: 7,
        sampler: "DPM++ 2M Karras",
        negative_prompt: ""
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "Friends gathered around a bonfire in an ancient forest. Laughter, stories, and a starry sky paint an unforgettable moment of connection beneath the shadows of the mountains.",
    "model": "Prodia",
    "data": {
        "model": "absolutereality_V16.safetensors [37db0fc3]",
        "steps": 25,
        "cfg_scale": 7,
        "sampler": "DPM++ 2M Karras",
        "negative_prompt": ""
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Models

List of models

Parameters

Parameter | Default | Description
negative_prompt | | Indicates what the AI should not do
model | absolutereality_V16.safetensors [37db0fc3] | Select from the list of models
cfg_scale | 7 | Min: 0, Max: 20
steps | 25 | Min: 1, Max: 30
sampler | DPM++ 2M Karras | Select from these: "Euler", "Euler a", "Heun", "DPM++ 2M Karras", "DPM++ SDE Karras", "DDIM"

Usage Prodia Stable-Diffusion

// import { prodia } from "gpti";
const { prodia } = require("gpti");

prodia.stablediffusion({
    prompt: "Friends gathered around a bonfire in an ancient forest. Laughter, stories, and a starry sky paint an unforgettable moment of connection beneath the shadows of the mountains.",
    data: {
        prompt_negative: "",
        model: "absolutereality_v181.safetensors [3d9d4d2b]",
        sampling_method: "DPM++ 2M Karras",
        sampling_steps: 25,
        width: 512,
        height: 512,
        cfg_scale: 7
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "Friends gathered around a bonfire in an ancient forest. Laughter, stories, and a starry sky paint an unforgettable moment of connection beneath the shadows of the mountains.",
    "model": "Prodia-StableDiffusion",
    "data": {
        "prompt_negative": "",
        "model": "absolutereality_v181.safetensors [3d9d4d2b]",
        "sampling_method": "DPM++ 2M Karras",
        "sampling_steps": 25,
        "width": 512,
        "height": 512,
        "cfg_scale": 7
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Models

List of models

Methods

List of methods:

  • DPM++ 2M Karras
  • Euler
  • Euler a
  • LMS
  • Heun
  • DPM2
  • DPM2 a
  • DPM++ 2S a
  • DPM++ 2M
  • DPM++ SDE
  • DPM fast
  • DPM adaptive
  • LMS Karras
  • DPM2 Karras
  • DPM2 a Karras
  • DPM++ 2S a Karras
  • DPM++ 2M Karras
  • DPM++ SDE Karras
  • DDIM
  • PLMS

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
model | absolutereality_v181.safetensors [3d9d4d2b] | Select from the list of models
sampling_method | DPM++ 2M Karras | Select from the list of methods
sampling_steps | 25 | Min: 1, Max: 30
width | 512 | Min: 50, Max: 1024
height | 512 | Min: 50, Max: 1024
cfg_scale | 7 | Min: 1, Max: 20

Usage Prodia Stable-Diffusion XL (PRO)

// import { prodia } from "gpti";
const { prodia } = require("gpti");

prodia.stablediffusion_xl({
    prompt: "Friends gathered around a bonfire in an ancient forest. Laughter, stories, and a starry sky paint an unforgettable moment of connection beneath the shadows of the mountains.",
    data: {
        prompt_negative: "",
        model: "sd_xl_base_1.0.safetensors [be9edd61]",
        sampling_method: "DPM++ 2M Karras",
        sampling_steps: 25,
        width: 1024,
        height: 1024,
        cfg_scale: 7
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "Friends gathered around a bonfire in an ancient forest. Laughter, stories, and a starry sky paint an unforgettable moment of connection beneath the shadows of the mountains.",
    "model": "Prodia-StableDiffusion-xl",
    "data": {
        "prompt_negative": "",
        "model": "sd_xl_base_1.0.safetensors [be9edd61]",
        "sampling_method": "DPM++ 2M Karras",
        "sampling_steps": 25,
        "width": 1024,
        "height": 1024,
        "cfg_scale": 7
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Models

List of models:

  • dreamshaperXL10_alpha2.safetensors [c8afe2ef]
  • dynavisionXL_0411.safetensors [c39cc051]
  • juggernautXL_v45.safetensors [e75f5471]
  • realismEngineSDXL_v10.safetensors [af771c3f]
  • sd_xl_base_1.0.safetensors [be9edd61]
  • sd_xl_base_1.0_inpainting_0.1.safetensors [5679a81a]
  • turbovisionXL_v431.safetensors [78890989]

Methods

List of methods:

  • DPM++ 2M Karras
  • Euler
  • Euler a
  • LMS
  • Heun
  • DPM2
  • DPM2 a
  • DPM++ 2S a
  • DPM++ 2M
  • DPM++ SDE
  • DPM fast
  • DPM adaptive
  • LMS Karras
  • DPM2 Karras
  • DPM2 a Karras
  • DPM++ 2S a Karras
  • DPM++ SDE Karras

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
model | sd_xl_base_1.0.safetensors [be9edd61] | Select from the list of models
sampling_method | DPM++ 2M Karras | Select from the list of methods
sampling_steps | 25 | Min: 1, Max: 30
width | 1024 | Min: 512, Max: 1536
height | 1024 | Min: 512, Max: 1536
cfg_scale | 7 | Min: 1, Max: 20

Usage Pixart-A (PRO)

// import { pixart } from "gpti";
const { pixart } = require("gpti");

pixart.a({
    prompt: "An urban landscape bathed in the sunset, where the warm tones of the sun reflect on modern buildings and the orange and purple sky. In the foreground, there's a group of friends gathered on a rooftop, laughing and enjoying the moment. Their expressions radiate joy and camaraderie as they embrace and point towards something on the horizon. The scene is enveloped in a nostalgic and emotional aura that conveys the beauty of friendship and the warmth of the sunset in a futuristic city with touches of anime style.",
    data: {
        prompt_negative: "",
        sampler: "DPM-Solver",
        image_style: "Anime",
        width: 1024,
        height: 1024,
        dpm_guidance_scale: 4.5,
        dpm_inference_steps: 14,
        sa_guidance_scale: 3,
        sa_inference_steps: 25
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An urban landscape bathed in the sunset, where the warm tones of the sun reflect on modern buildings and the orange and purple sky. In the foreground, there's a group of friends gathered on a rooftop, laughing and enjoying the moment. Their expressions radiate joy and camaraderie as they embrace and point towards something on the horizon. The scene is enveloped in a nostalgic and emotional aura that conveys the beauty of friendship and the warmth of the sunset in a futuristic city with touches of anime style.",
    "model": "PixArt-a",
    "data": {
        "prompt_negative": "",
        "sampler": "DPM-Solver",
        "image_style": "Anime",
        "width": 1024,
        "height": 1024,
        "dpm_guidance_scale": 4.5,
        "dpm_inference_steps": 14,
        "sa_guidance_scale": 3,
        "sa_inference_steps": 25
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
sampler | DPM-Solver | Choose among these: "DPM-Solver", "SA-Solver"
image_style | (No style) | Choose from various available image types: "(No style)", "Cinematic", "Photographic", "Anime", "Manga", "Digital Art", "Pixel art", "Fantasy art", "Neonpunk", "3D Model"
width | 1024 | Min: 256, Max: 2048
height | 1024 | Min: 256, Max: 2048
dpm_guidance_scale | 4.5 | Min: 1, Max: 10
dpm_inference_steps | 14 | Min: 5, Max: 40
sa_guidance_scale | 3 | Min: 1, Max: 10
sa_inference_steps | 25 | Min: 10, Max: 40

Usage Pixart-LCM (PRO)

// import { pixart } from "gpti";
const { pixart } = require("gpti");

pixart.lcm({
    prompt: "An enchanted forest with twisted trees, a waterfall cascading into a pond of bright water lilies, and in the background, a magical tower surrounded by mythical creatures like unicorns, fairies, and dragons, under a starry sky and a giant moon.",
    data: {
        prompt_negative: "",
        image_style: "Fantasy art",
        width: 1024,
        height: 1024,
        lcm_inference_steps: 9
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An enchanted forest with twisted trees, a waterfall cascading into a pond of bright water lilies, and in the background, a magical tower surrounded by mythical creatures like unicorns, fairies, and dragons, under a starry sky and a giant moon.",
    "model": "PixArt-LCM",
    "data": {
        "prompt_negative": "",
        "image_style": "Fantasy art",
        "width": 1024,
        "height": 1024,
        "lcm_inference_steps": 9
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
image_style | (No style) | Choose from various available image types: "(No style)", "Cinematic", "Photographic", "Anime", "Manga", "Digital Art", "Pixel art", "Fantasy art", "Neonpunk", "3D Model"
width | 1024 | Min: 256, Max: 2048
height | 1024 | Min: 256, Max: 2048
lcm_inference_steps | 9 | Min: 1, Max: 30

Usage Stable-Diffusion 1.5

// import { stablediffusion } from "gpti";
const { stablediffusion } = require("gpti");

stablediffusion.v1({
    prompt: "An serene sunset landscape where a river winds through gentle hills covered in trees. The sky is tinged with warm and soft tones, with scattered clouds reflecting the last glimmers of the sun."
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An serene sunset landscape where a river winds through gentle hills covered in trees. The sky is tinged with warm and soft tones, with scattered clouds reflecting the last glimmers of the sun.",
    "model": "StableDiffusion-1.5",
    "images": [
        "data:image/jpeg;base64,...",
        "..."
    ]
}

Usage Stable-Diffusion 2.1

// import { stablediffusion } from "gpti";
const { stablediffusion } = require("gpti");

stablediffusion.v2({
    prompt: "An serene sunset landscape where a river winds through gentle hills covered in trees. The sky is tinged with warm and soft tones, with scattered clouds reflecting the last glimmers of the sun.",
    data: {
        prompt_negative: "",
        guidance_scale: 9
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An serene sunset landscape where a river winds through gentle hills covered in trees. The sky is tinged with warm and soft tones, with scattered clouds reflecting the last glimmers of the sun.",
    "model": "StableDiffusion-2.1",
    "data": {
        "prompt_negative": "",
        "guidance_scale": 9
    },
    "images": [
        "data:image/jpeg;base64,...",
        "..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
guidance_scale | 9 | Min: 0, Max: 50

Usage Stable-Diffusion XL (PRO)

// import { stablediffusion } from "gpti";
const { stablediffusion } = require("gpti");

stablediffusion.xl({
    prompt: "An serene sunset landscape where a river winds through gentle hills covered in trees. The sky is tinged with warm and soft tones, with scattered clouds reflecting the last glimmers of the sun.",
    data: {
        prompt_negative: "",
        image_style: "(No style)",
        guidance_scale: 7.5
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An serene sunset landscape where a river winds through gentle hills covered in trees. The sky is tinged with warm and soft tones, with scattered clouds reflecting the last glimmers of the sun.",
    "model": "StableDiffusion-XL",
    "data": {
        "prompt_negative": "",
        "image_style": "(No style)",
        "guidance_scale": 7.5
    },
    "images": [
        "data:image/jpeg;base64,...",
        "..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
image_style | (No style) | Choose from various available image types: "(No style)", "Cinematic", "Photographic", "Anime", "Manga", "Digital Art", "Pixel art", "Fantasy art", "Neonpunk", "3D Model"
guidance_scale | 7.5 | Min: 0, Max: 50

Usage EMI

// import { emi } from "gpti";
const { emi } = require("gpti");

emi({
    prompt: "A beautiful girl in a garden full of bright flowers. Her long, silky hair is adorned with flowers, and her large eyes reflect serenity. She wears a traditional kimono, smiling as she holds a delicate butterfly in her hand.",
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "A beautiful girl in a garden full of bright flowers. Her long, silky hair is adorned with flowers, and her large eyes reflect serenity. She wears a traditional kimono, smiling as she holds a delicate butterfly in her hand.",
    "model": "Emi",
    "scene": "a young woman stands in a beautiful garden, full of vibrant flowers. Her long, flowing silk kimono is adorned with the same flowers, and her large, expressive eyes seem to reflect a sense of peaceful serenity. In her hand, she clutches a delicate butterfly, which seems to be caught up in the beauty of the moment. She is surrounded",
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Usage Render3D (PRO)

// import { render3d } from "gpti";
const { render3d } = require("gpti");

render3d({
    prompt: "In a remote corner of the galaxy, a star agonizes in its final stage of life. Its brightness, once dazzling, now fades slowly into the void of space, while a bright nebula forms around it.",
    data: {
        prompt_negative: ""
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "In a remote corner of the galaxy, a star agonizes in its final stage of life. Its brightness, once dazzling, now fades slowly into the void of space, while a bright nebula forms around it.",
    "model": "Render3D",
    "data": {
        "prompt_negative": ""
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Usage PixelArt

// import { pixelart } from "gpti";
const { pixelart } = require("gpti");

pixelart({
    prompt: "A coastal city in the golden hour of the sunset. The sun slowly slips toward the horizon, tinting the sky with golden and pink hues. Skyscrapers stand out against this heavenly backdrop, reflecting the light in their glass windows. In the streets, lights flicker timidly, getting ready to illuminate the night.",
    data: {
        prompt_negative: ""
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "A coastal city in the golden hour of the sunset. The sun slowly slips toward the horizon, tinting the sky with golden and pink hues. Skyscrapers stand out against this heavenly backdrop, reflecting the light in their glass windows. In the streets, lights flicker timidly, getting ready to illuminate the night.",
    "model": "PixelArt",
    "data": {
        "prompt_negative": ""
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Usage Animagine XL (PRO)

// import { animagine } from "gpti";
const { animagine } = require("gpti");

animagine({
    prompt: "An anime girl surrounded by cherry blossoms",
    data: {
        prompt_negative: "",
        quality_tags: "Standard",
        style_present: "(None)",
        width: 1024,
        height: 1024,
        strength: 0.5,
        upscale: 1.5,
        sampler: "Euler a",
        guidance_scale: 7,
        inference_steps: 28
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An anime girl surrounded by cherry blossoms",
    "model": "Animagine-XL",
    "data": {
        "prompt_negative": "",
        "quality_tags": "Standard",
        "style_present": "(None)",
        "width": 1024,
        "height": 1024,
        "strength": 0.5,
        "upscale": 1.5,
        "sampler": "Euler a",
        "guidance_scale": 7,
        "inference_steps": 28
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
width | 1024 | Min: 512, Max: 2048
height | 1024 | Min: 512, Max: 2048
guidance_scale | 7 | Min: 1, Max: 12
quality_tags | Standard | Select from these: "Standard", "Light", "Heavy", "(None)"
style_present | (None) | Choose from various available image types: "(None)", "Cinematic", "Photographic", "Anime", "Manga", "Digital Art", "Pixel art", "Fantasy art", "Neonpunk", "3D Model"
strength | 0.5 | Min: 0, Max: 1
upscale | 1.5 | Min: 1, Max: 1.5
sampler | Euler a | Select from these: "Euler a", "DPM++ 2M Karras", "DPM++ SDE Karras", "DPM++ 2M SDE Karras", "Euler", "DDIM"
inference_steps | 28 | Min: 1, Max: 50

Usage Playground (PRO)

// import { playground } from "gpti";
const { playground } = require("gpti");

playground({
    prompt: "An illustration of a red owl with bright blue eye",
    model: "playground",
    data: {
        prompt_negative: "",
        width: 1024,
        height: 1024,
        guidance_scale: 3
    }
}, (err, data) => {
    if(err != null){
        console.log(err);
    } else {
        console.log(data);
    }
});

JSON

{
    "code": 200,
    "status": true,
    "prompt": "An illustration of a red owl with bright blue eyes.",
    "model": "Playground",
    "data": {
        "prompt_negative": "",
        "width": 1024,
        "height": 1024,
        "guidance_scale": 3
    },
    "images": [
        "data:image/jpeg;base64,..."
    ]
}

Parameters

Parameter | Default | Description
prompt_negative | | Indicates what the AI should not do
width | 1024 | Min: 256, Max: 1536
height | 1024 | Min: 256, Max: 1536
guidance_scale | 3 | Min: 0.1, Max: 20

API Reference

Currently, some models require your credentials to access them, while others are free. For more details and examples, please refer to the complete documentation.

Error Codes

These are the status codes the API can return; any code other than 200 indicates a failure.

Code | Error | Description
400 | BAD_REQUEST | Not all parameters have been entered correctly
500 | INTERNAL_SERVER_ERROR | The server has experienced failures
200 | | The API worked without issues
403 | FORBIDDEN | The API credentials are not valid
401 | UNAUTHORIZED | API credentials are required
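
Depending on how a failure surfaces, the error may arrive in the callback's err argument or in the response body itself. A defensive sketch that checks both, using the code and status fields shown in the examples above:

// import { gpt } from "gpti";
const { gpt } = require("gpti");

gpt.v1({
    messages: [], // no prior conversation history
    prompt: "Hello!",
    model: "GPT-4",
    markdown: false
}, (err, data) => {
    if(err != null){
        // Transport failure or an error object returned by the API.
        console.log("Request failed:", err);
    } else if(data.code !== 200 || data.status !== true){
        console.log("API returned code " + data.code + ":", data);
    } else {
        console.log(data.gpt);
    }
});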

☕ Do you want to support this project?

If this package has helped you save time or solve a problem, consider buying me a coffee through Ko-fi. Your support helps me maintain and improve this project for you and other users, and every donation contributes to the creation and free availability of more AI models in the future. Every small donation counts and is greatly appreciated!

Support on Ko-fi
