
AI-Powered-User-Query-Backend

A streamlined backend for integrating OpenAI's language models into applications.

Developed with the software and tools below:

  • Framework: FastAPI
  • Backend: Python
  • Database: PostgreSQL
  • LLMs: OpenAI

📑 Table of Contents

  • 📝 Overview
  • 📦 Features
  • 📂 Structure
  • 💻 Installation
  • 🏗️ Usage
  • 🌐 Hosting
  • 📄 License
  • 👏 Authors

πŸ“ Overview

This repository contains the AI-Powered User Query Backend, a Minimum Viable Product (MVP) built with Python and FastAPI to simplify AI integration. The MVP provides a lightweight, efficient way to access OpenAI's language models through a user-friendly API.

📦 Features

  • ⚙️ Architecture: The MVP utilizes a microservice architecture for modularity and scalability, with independent services for query processing, API integration, and database interaction.
  • 📄 Documentation: This README provides a comprehensive overview of the MVP's functionality, installation instructions, and usage examples.
  • 🔗 Dependencies: The MVP relies on libraries like FastAPI, SQLAlchemy, openai, and requests for API development, database management, and OpenAI integration.
  • 🧩 Modularity: The codebase is organized with separate modules for core services, database interaction, and utility functions.
  • 🧪 Testing: The MVP includes unit tests for core functions and integration tests for verifying end-to-end functionality.
  • ⚡️ Performance: Optimized for efficient query processing and response handling, utilizing caching mechanisms where appropriate.
  • 🔐 Security: Incorporates best practices for secure development, including data validation, input sanitization, and API key management.
  • 🔀 Version Control: Uses Git for version control and includes a CI/CD pipeline for automated testing and deployment.
  • 🔌 Integrations: Seamlessly integrates with OpenAI's API for generating responses and utilizes a PostgreSQL database for storing data.
  • 📶 Scalability: Designed for horizontal scalability by utilizing containerization with Docker and deployment on Kubernetes.

📂 Structure

ai-query-backend
β”œβ”€β”€ api
β”‚   └── main.py
β”œβ”€β”€ config
β”‚   └── settings.py
β”œβ”€β”€ core
β”‚   β”œβ”€β”€ services
β”‚   β”‚   └── openai_service.py
β”‚   β”œβ”€β”€ database
β”‚   β”‚   β”œβ”€β”€ models.py
β”‚   β”‚   └── database.py
β”‚   └── utils
β”‚       β”œβ”€β”€ utils.py
β”‚       └── logger.py
└── tests
    └── unit
        β”œβ”€β”€ test_openai_service.py
        └── test_database.py
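
The OpenAI integration lives in core/services/openai_service.py. Below is a minimal sketch of what such a service might look like, assuming the legacy openai Python client (pre-1.0) and the completions-style model referenced later in this README; the actual implementation may differ.

    # Hypothetical sketch of core/services/openai_service.py; not the repository's actual code.
    import os

    import openai

    # The API key is read from the OPENAI_API_KEY environment variable (see Environment Variables below).
    openai.api_key = os.environ["OPENAI_API_KEY"]

    def generate_response(query: str, model: str = "text-davinci-003",
                          temperature: float = 0.7, max_length: int = 256) -> str:
        """Send the user query to OpenAI's completions endpoint and return the generated text."""
        completion = openai.Completion.create(
            model=model,
            prompt=query,
            temperature=temperature,
            max_tokens=max_length,
        )
        return completion.choices[0].text.strip()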

💻 Installation

🔧 Prerequisites

  • Python 3.9+
  • Docker
  • Kubernetes (kubectl)
  • PostgreSQL

🚀 Setup Instructions

  1. Clone the repository:

    git clone https://github.com/coslynx/AI-Powered-User-Query-Backend.git
    cd AI-Powered-User-Query-Backend
  2. Install dependencies:

    pip install -r requirements.txt
  3. Create .env File:

    cp .env.example .env
  4. Configure environment variables: Update the .env file (a sample is shown after these steps) with your:

    • DATABASE_URL (PostgreSQL connection string)
    • OPENAI_API_KEY (OpenAI API key)
    • JWT_SECRET (secret key for JWT authentication)
  5. Build and Run the Application (Docker):

    docker-compose up -d
  6. Deploy to Kubernetes (kubectl):

    kubectl apply -f deployment.yaml
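
For reference, a minimal .env might look like the following; the values are placeholders mirroring the examples in the Environment Variables section:

    DATABASE_URL=postgresql://user:password@host:port/database
    OPENAI_API_KEY=sk-your-openai-api-key
    JWT_SECRET=your-256-bit-secret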

πŸ—οΈ Usage

πŸƒβ€β™‚οΈ Running the MVP

  1. Start the development server:

    uvicorn api.main:app --reload
  2. Access the API: once the server is running, Uvicorn serves it at http://localhost:8000 by default, and FastAPI's interactive docs are available at http://localhost:8000/docs.

βš™οΈ Configuration

  • config/settings.py: Contains configuration settings for database connections, OpenAI API keys, and other essential variables. You can modify these settings based on your specific environment.
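
A minimal sketch of what config/settings.py might look like, assuming pydantic's BaseSettings (pydantic v1, or the pydantic-settings package for v2) is used to load the .env file; the actual module may differ:

    # Hypothetical sketch of config/settings.py; field names mirror the documented environment variables.
    from pydantic import BaseSettings

    class Settings(BaseSettings):
        database_url: str
        openai_api_key: str
        jwt_secret: str

        class Config:
            # Load values from the project's .env file.
            env_file = ".env"

    settings = Settings()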

📚 Examples

API Endpoints:

  • POST /query:

    • Request Body:
      {
        "query": "What is the meaning of life?",
        "model": "text-davinci-003", 
        "temperature": 0.7, 
        "max_length": 256
      }
    • Response Body:
      {
        "query_id": "generated_id"
      }
  • GET /response/{query_id}:

    • Response Body:
      {
        "response": "The meaning of life is a question that has been pondered by philosophers and thinkers for centuries..."
      }
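
For example, using the requests library (listed above as a dependency) against a server assumed to be running locally on port 8000, a query could be submitted and its response retrieved like this:

    # Hypothetical client usage; endpoint paths and bodies follow the examples above.
    import requests

    payload = {
        "query": "What is the meaning of life?",
        "model": "text-davinci-003",
        "temperature": 0.7,
        "max_length": 256,
    }

    # Submit the query and capture the generated query ID.
    query_id = requests.post("http://localhost:8000/query", json=payload).json()["query_id"]

    # Retrieve the stored response using the query ID.
    print(requests.get(f"http://localhost:8000/response/{query_id}").json()["response"])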

🌐 Hosting

🚀 Deployment Instructions

  1. Build a Docker image, tagged for the registry it will be pushed to:

    docker build -t coslynx/ai-query-backend:latest .
  2. Push the image to a Docker registry (e.g., Docker Hub):

    docker push coslynx/ai-query-backend:latest
  3. Deploy the image to Kubernetes:

    kubectl apply -f deployment.yaml

🔑 Environment Variables

  • DATABASE_URL: Connection string for the PostgreSQL database (example: postgresql://user:password@host:port/database)
  • OPENAI_API_KEY: OpenAI API key (example: sk-your-openai-api-key)
  • JWT_SECRET: Secret key for JWT authentication (example: your-256-bit-secret)

📜 API Documentation

πŸ” Endpoints

  • POST /query:

    • Description: Processes a user query using OpenAI's API.
    • Request Body:
      {
        "query": "What is the meaning of life?",
        "model": "text-davinci-003", 
        "temperature": 0.7, 
        "max_length": 256
      }
    • Response Body:
      {
        "query_id": "generated_id"
      }
  • GET /response/{query_id}:

    • Description: Retrieves the response for a given query ID.
    • Response Body:
      {
        "response": "The meaning of life is a question that has been pondered by philosophers and thinkers for centuries..."
      }
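
A minimal sketch of how these two endpoints might be declared in api/main.py with FastAPI (field names follow the bodies above; persistence, the OpenAI call, and authentication are omitted, and the actual implementation may differ):

    # Hypothetical route declarations; not the repository's actual code.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class QueryRequest(BaseModel):
        query: str
        model: str = "text-davinci-003"
        temperature: float = 0.7
        max_length: int = 256

    @app.post("/query")
    async def create_query(request: QueryRequest) -> dict:
        # Hand the query to the OpenAI service, persist it, and return its ID.
        return {"query_id": "generated_id"}

    @app.get("/response/{query_id}")
    async def get_response(query_id: str) -> dict:
        # Look up and return the stored response for the given query ID.
        return {"response": "..."}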

🔒 Authentication

The MVP uses JWT for authentication. To access protected endpoints:

  1. Generate a JWT token by registering a new user or logging in.

  2. Include the token in the Authorization header of requests:

    Authorization: Bearer YOUR_JWT_TOKEN
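
For instance, with the requests library, the header could be attached like this (YOUR_JWT_TOKEN is a placeholder for a real token):

    # Hypothetical authenticated request against a locally running server.
    import requests

    headers = {"Authorization": "Bearer YOUR_JWT_TOKEN"}
    requests.post("http://localhost:8000/query",
                  json={"query": "What is the meaning of life?"},
                  headers=headers)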
    

📜 License & Attribution

📄 License

This Minimum Viable Product (MVP) is licensed under the GNU AGPLv3 license.

🤖 AI-Generated MVP

This MVP was entirely generated using artificial intelligence through CosLynx.com.

No human was directly involved in the coding process of the repository: AI-Powered-User-Query-Backend

📞 Contact

For any questions or concerns regarding this AI-generated MVP, please contact CosLynx at:

🌐 CosLynx.com

Create Your Custom MVP in Minutes With CosLynxAI!
