
NPM Chatbot OpenRouter

A reusable AI Chat Widget for Next.js applications, powered by OpenRouter. Easily integrate a configurable chat interface into restaurant or real estate broker websites.

Features

  • React Component: <AIChatWidget /> for easy integration.
  • OpenRouter Integration: Connects to various LLMs via OpenRouter.
  • Configurable Context: Tailor the chatbot's knowledge and persona (e.g., restaurant assistant, real estate broker).
  • Model Selection: Supports multiple free models from OpenRouter, configurable via props.
  • Rate Limiting: Basic in-memory rate limiting to prevent abuse (15 seconds per client ID).
  • Markdown Rendering: Supports Markdown in chat responses (requires react-markdown).
  • TypeScript Support: Built with TypeScript for type safety.
  • Private NPM Publishing: Instructions included for publishing as a private package.

Folder Structure

npm-chatbot-openrouter/
├── src/
│   ├── components/AIChatWidget.tsx
│   ├── lib/useChat.ts
│   └── api/handler.ts
├── index.ts
├── package.json
├── tsconfig.json
└── README.md

Installation

npm install @buun/ai-restaurant-chat # Replace with your actual package name after publishing

Or if you have a private registry setup:

npm install @your-scope/npm-chatbot-openrouter

Usage in a Next.js Site

  1. Add the Component: Import and use the AIChatWidget in your Next.js page or layout.

    // Example: app/some-page/page.tsx
    import { AIChatWidget } from '@buun/ai-restaurant-chat'; // Adjust import path to your package name
    
    export default function MyPage() {
      return (
        <div>
          <h1>Welcome to My Site</h1>
          <AIChatWidget
            type="restaurant" // or "broker"
            apiPath="/api/ai-chat" // Your backend endpoint
            context={{
              // Restaurant example
              restaurantName: "Buun Sushi",
              menu: [
                { id: 1, name: "Sushi Platter", price: "$25", description: "Assorted sushi" },
                { id: 2, name: "Ramen", price: "$15", description: "Pork broth ramen" }
              ],
              hours: "Mon-Sat 12pm-10pm",
              // model: "mistralai/mistral-7b-instruct" // Optional: specify a model
            }}
          />
        </div>
      );
    }
  2. Set up API Route: In your Next.js application (the one consuming this package), create an API route to handle chat requests. This route will import and use the handler from this package.

    Create a file app/api/ai-chat/route.ts (or your chosen apiPath):

    // app/api/ai-chat/route.ts
    export { POST } from '@buun/ai-restaurant-chat/api/handler'; // Adjust import path

    Note: Ensure the path to handler.ts is correct based on how your package exposes its modules. Typically, this might be @your-scope/npm-chatbot-openrouter/dist/api/handler after building, or you might re-export it from index.ts (a minimal wrapper sketch follows after this list).

  3. Environment Variables: You must set your OpenRouter API key as an environment variable in your Next.js application.

    Create a .env.local file in the root of your Next.js project:

    OPENROUTER_API_KEY=sk-or-v1-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

    Remember to add this to your hosting provider's environment variables (e.g., Vercel, Netlify).
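
Putting steps 2 and 3 together, you can also wrap the handler instead of re-exporting it directly. The sketch below assumes the handler accepts a standard Request; the import path may differ depending on how your build exposes src/api/handler.ts.

// app/api/ai-chat/route.ts (sketch)
import { POST as chatHandler } from '@buun/ai-restaurant-chat/api/handler';

export async function POST(req: Request) {
  // Fail fast if the OpenRouter key from step 3 is missing.
  if (!process.env.OPENROUTER_API_KEY) {
    return Response.json(
      { error: 'OPENROUTER_API_KEY is not configured' },
      { status: 500 }
    );
  }
  return chatHandler(req);
}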

Configuration

AIChatWidget Props

  • apiPath (string, required): The API endpoint in your Next.js app that proxies requests to the chatbot handler.
  • type ('restaurant' | 'broker', default 'restaurant'): Determines the system prompt and persona of the chatbot.
  • context (object, default {}): An object containing contextual information for the chatbot (e.g., menu, listings, hours, model override).
  • initialMessages (Array<{role: string, content: string}>, default []): Optional initial messages for the chat.
  • defaultModel (string, default 'huggingfaceh4/zephyr-7b-beta'): Default model to use if not specified in context.
  • rateLimitConfig ({ windowMs?: number, max?: number }, default { windowMs: 15000, max: 1 }): Configuration for client-side request throttling (not the server-side X-Chat-Client-Id based one).
  • placeholder (string, default 'Type your message...'): Placeholder text for the chat input.
  • title (string, default 'AI Assistant'): Title displayed at the top of the chat widget.
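
For example, a broker-style page that exercises most of these props could look like the sketch below. The context keys (agencyName, listings) are illustrative; the handler simply stringifies whatever object you pass.

// Example: app/listings/page.tsx (sketch)
import { AIChatWidget } from '@buun/ai-restaurant-chat';

export default function ListingsPage() {
  return (
    <AIChatWidget
      type="broker"
      apiPath="/api/ai-chat"
      title="Property Assistant"
      placeholder="Ask about a listing..."
      defaultModel="mistralai/mistral-7b-instruct"
      rateLimitConfig={{ windowMs: 15000, max: 1 }}
      initialMessages={[
        { role: 'assistant', content: 'Hi! Ask me about our current listings.' }
      ]}
      context={{
        agencyName: 'Buun Realty',
        listings: [
          { id: 1, address: '12 Ocean Ave', price: '$850,000', bedrooms: 3 }
        ],
        hours: 'Mon-Fri 9am-6pm'
      }}
    />
  );
}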

API Handler (src/api/handler.ts)

The handler uses the type and context passed in the request body:

  • type: Can be 'restaurant' or 'broker'. This selects the appropriate system prompt.
  • context: This object is stringified and injected into the system prompt. You can pass details like menus, business hours, property information, etc.
  • model: You can override the default model by passing a model string in the context object sent from the client, or directly in the POST request body.

    Example request body to API:

    {
      "messages": [{"role": "user", "content": "Hello!"}],
      "type": "restaurant",
      "context": { "restaurantName": "The Great Eatery", "model": "mistralai/mistral-7b-instruct" },
      "model": "mistralai/mistral-7b-instruct" // Alternative way to specify model
    }
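
If you want to call the endpoint without the widget (for example from a custom UI), a minimal sketch looks like this. The X-Chat-Client-Id header is the identifier the in-memory rate limiter keys on; 'demo-client-123' is just a placeholder, and the exact response shape depends on the handler.

// Sketch: calling the chat route directly from client-side code.
async function sendMessage() {
  const res = await fetch('/api/ai-chat', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Chat-Client-Id': 'demo-client-123', // any stable per-client ID
    },
    body: JSON.stringify({
      messages: [{ role: 'user', content: 'Hello!' }],
      type: 'restaurant',
      context: { restaurantName: 'The Great Eatery' },
      model: 'mistralai/mistral-7b-instruct',
    }),
  });
  return res.json(); // response shape depends on the handler
}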

Supported Models

You can specify a model to use by passing it in the context prop of the AIChatWidget or directly in the API request. Some suggested free models include:

  • huggingfaceh4/zephyr-7b-beta (default)
  • mistralai/mistral-7b-instruct
  • openchat/openchat-7b
  • undi95/toppy-m-7b
  • gryphe/mythomax-l2-13b

Check OpenRouter.ai for a full list of available models.

Publishing to NPM

1. Update package.json

Ensure your package.json is correctly configured:

  • name: Should be @your-scope/your-package-name (e.g., @buun/npm-chatbot-openrouter).
  • version: Set your initial version (e.g., 0.1.0).
  • main: Should point to your main entry file in the dist folder (e.g., dist/index.js).
  • module: For ES module support (e.g., dist/index.mjs).
  • types: Points to your type definitions (e.g., dist/index.d.ts).
  • files: An array of files/folders to include in the package (e.g., ["dist", "src/api/handler.ts"]). Adjust if you re-export handler.ts differently.
  • scripts: Include a build script: "build": "tsup index.ts --format cjs,esm --dts" (using tsup for bundling is recommended, or tsc).
  • peerDependencies: List react and react-dom.
    {
      "name": "@your-scope/npm-chatbot-openrouter",
      "version": "0.1.0",
      "main": "dist/index.js",
      "module": "dist/index.mjs",
      "types": "dist/index.d.ts",
      "files": [
        "dist",
        "src/api/handler.ts" 
      ],
      "scripts": {
        "build": "tsup src/index.ts --format cjs,esm --dts --external react", // Ensure tsup is a devDependency
        "lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0",
        "test": "echo \\"Error: no test specified\\" && exit 1"
      },
      "peerDependencies": {
        "react": ">=17.0.0",
        "react-dom": ">=17.0.0"
      },
      "devDependencies": {
        "@types/react": "^18.2.0",
        "tsup": "^8.0.0", // Or your preferred bundler
        "typescript": "^5.0.0",
        "eslint": "...",
         // ... other dev dependencies
      },
      "dependencies": {
        "openai": "^4.0.0" // Or the version you intend to use
      }
      // ... other fields like license, repository, keywords
    }

2. Build Your Package

Run your build script:

npm run build

This should generate the dist folder.

3. Login to NPM

If you haven't already, log in to your NPM account:

npm login

(You might need to configure it for your private registry if you're using one like GitHub Packages or Verdaccio.)

4. Publish

For a public package:

npm publish

For a private package: if your package is scoped (e.g., @username/my-package), it defaults to restricted access when published, provided your user or org has private package capabilities (unscoped packages cannot be private). To be explicit:

npm publish --access=restricted

If publishing to a private registry like GitHub Packages, you might need to configure your .npmrc file.
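
For GitHub Packages, for example, an .npmrc along these lines is typically enough (replace @your-scope and supply a token with the appropriate packages permission):

# .npmrc (GitHub Packages example)
@your-scope:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}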

Optional Add-ons & Future Enhancements

  • Streaming Responses: Implement ReadableStream for a more interactive experience.
  • Persistent Chat History: Use localStorage or a backend service like Supabase (see the sketch after this list).
  • Multilingual Auto-Detect: Integrate libraries like franc-min or use navigator.language.
  • Language Toggle: Allow users to switch languages.
  • Advanced Rate Limiting: Extend server-side rate limiting with Redis or a durable cache.
  • Analytics: Implement tracking using Vercel Edge Middleware or other analytics tools.
  • UI for Model Selection: Add a dropdown in the AIChatWidget to let users choose a model.
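
As a starting point for the persistent chat history idea above, here is a small hook sketch using localStorage. The hook name and message type are illustrative and not part of the package; the stored messages could be fed back into the widget via initialMessages.

// Sketch: keep chat messages in localStorage so history survives reloads.
import { useEffect, useState } from 'react';

type ChatMessage = { role: string; content: string };

export function usePersistentMessages(storageKey = 'ai-chat-history') {
  const [messages, setMessages] = useState<ChatMessage[]>(() => {
    if (typeof window === 'undefined') return []; // avoid touching localStorage during SSR
    try {
      return JSON.parse(window.localStorage.getItem(storageKey) ?? '[]');
    } catch {
      return [];
    }
  });

  useEffect(() => {
    window.localStorage.setItem(storageKey, JSON.stringify(messages));
  }, [messages, storageKey]);

  return [messages, setMessages] as const;
}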

Markdown Rendering (Frontend)

To enable Markdown rendering for chat messages, install react-markdown:

npm install react-markdown

Then, update the message display in AIChatWidget.tsx:

import ReactMarkdown from 'react-markdown';

// Inside your component, when rendering messages:
<div className={`inline-block px-3 py-2 rounded-xl ${msg.role === 'user' ? 'bg-blue-100' : 'bg-gray-100'}`}>
  <ReactMarkdown>{msg.content}</ReactMarkdown>
</div>

This README provides a comprehensive guide to get started. You'll need to fill in the actual implementation for AIChatWidget.tsx, useChat.ts, and potentially index.ts to export your components and functions.
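
A minimal index.ts, assuming the folder structure shown earlier, might simply re-export the public surface; the export names inside useChat.ts and handler.ts are assumptions here (the handler export matches the POST used in the API route example):

// index.ts (sketch): re-export the package's public surface.
export { AIChatWidget } from './src/components/AIChatWidget';
export { useChat } from './src/lib/useChat';
export { POST } from './src/api/handler';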
