- Overview
- Features
- Structure
- Installation
- Usage
- Hosting
- License
- Authors
This repository contains the AI Powered User Queries Backend, a Minimum Viable Product (MVP) built with Python and FastAPI that simplifies AI integration. The MVP provides a lightweight, efficient API for accessing OpenAI's language models.
| Feature | Description |
|---------|-------------|
| Architecture | The MVP utilizes a microservice architecture for modularity and scalability, with independent services for query processing, API integration, and database interaction. |
| Documentation | This README provides a comprehensive overview of the MVP's functionality, installation instructions, and usage examples. |
| Dependencies | The MVP relies on libraries like FastAPI, SQLAlchemy, openai, and requests for API development, database management, and OpenAI integration. |
| Modularity | The codebase is organized with separate modules for core services, database interaction, and utility functions. |
| Testing | The MVP includes unit tests for core functions and integration tests for verifying end-to-end functionality. |
| Performance | Optimized for efficient query processing and response handling, utilizing caching mechanisms where appropriate. |
| Security | Incorporates best practices for secure development, including data validation, input sanitization, and API key management. |
| Version Control | Uses Git for version control and includes a CI/CD pipeline for automated testing and deployment. |
| Integrations | Seamlessly integrates with OpenAI's API for generating responses and utilizes a PostgreSQL database for storing data. |
| Scalability | Designed for horizontal scalability by utilizing containerization with Docker and deployment on Kubernetes. |
```
ai-query-backend
├── api
│   └── main.py
├── config
│   └── settings.py
├── core
│   ├── services
│   │   └── openai_service.py
│   ├── database
│   │   ├── models.py
│   │   └── database.py
│   └── utils
│       ├── utils.py
│       └── logger.py
└── tests
    └── unit
        ├── test_openai_service.py
        └── test_database.py
```
- Python 3.9+
- Docker
- Kubernetes (kubectl)
- PostgreSQL
1. Clone the repository:

   ```bash
   git clone https://github.com/coslynx/AI-Powered-User-Query-Backend.git
   cd AI-Powered-User-Query-Backend
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Create a `.env` file:

   ```bash
   cp .env.example .env
   ```

4. Configure environment variables: update the `.env` file with your:
   - `DATABASE_URL` (PostgreSQL connection string)
   - `OPENAI_API_KEY` (OpenAI API key)
   - `JWT_SECRET` (secret key for JWT authentication)
1. Build and run the application (Docker):

   ```bash
   docker-compose up -d
   ```

2. Deploy to Kubernetes (kubectl):

   ```bash
   kubectl apply -f deployment.yaml
   ```

3. Start the development server:

   ```bash
   uvicorn api.main:app --reload
   ```

4. Access the API:
   - http://localhost:8000/docs for API documentation
   - http://localhost:8000/redoc for an alternative API documentation view
`config/settings.py`: Contains configuration settings for database connections, OpenAI API keys, and other essential variables. Modify these settings to match your environment.
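A minimal sketch of what such a settings module could look like, assuming plain environment-variable lookup; the actual repository may use pydantic or another configuration mechanism, and the defaults below are placeholders.

```python
import os

class Settings:
    """Loads configuration from environment variables (the .env keys
    described in the Installation section)."""

    def __init__(self, env=None):
        env = os.environ if env is None else env
        self.database_url = env.get("DATABASE_URL", "postgresql://localhost:5432/ai_query")
        self.openai_api_key = env.get("OPENAI_API_KEY", "")
        self.jwt_secret = env.get("JWT_SECRET", "change-me")

# Example with an explicit mapping instead of the real environment:
settings = Settings({"DATABASE_URL": "postgresql://user:pw@db:5432/queries"})
```

Accepting an optional mapping makes the class easy to unit-test without touching the process environment.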
API Endpoints:

- `POST /query`:
  - Request body:

    ```json
    {
      "query": "What is the meaning of life?",
      "model": "text-davinci-003",
      "temperature": 0.7,
      "max_length": 256
    }
    ```

  - Response body:

    ```json
    { "query_id": "generated_id" }
    ```

- `GET /response/{query_id}`:
  - Response body:

    ```json
    { "response": "The meaning of life is a question that has been pondered by philosophers and thinkers for centuries..." }
    ```
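The two endpoints above can be exercised with a small stdlib client. This is a sketch under assumptions: the base URL presumes a local deployment, and `fetch_response` requires the server to actually be running, so only the request construction is exercised here.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed local deployment

def build_query_request(query: str, model: str = "text-davinci-003",
                        temperature: float = 0.7, max_length: int = 256):
    """Build the POST /query request object without sending it."""
    payload = {"query": query, "model": model,
               "temperature": temperature, "max_length": max_length}
    return urllib.request.Request(
        f"{BASE_URL}/query",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def fetch_response(query_id: str) -> dict:
    """GET /response/{query_id} — requires a running server."""
    with urllib.request.urlopen(f"{BASE_URL}/response/{query_id}") as resp:
        return json.load(resp)

req = build_query_request("What is the meaning of life?")
```

Sending `req` with `urllib.request.urlopen(req)` would return the `query_id` JSON, which can then be passed to `fetch_response`.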
1. Build a Docker image:

   ```bash
   docker build -t ai-query-backend .
   ```

2. Push the image to a Docker registry (e.g., Docker Hub):

   ```bash
   docker push coslynx/ai-query-backend:latest
   ```

3. Deploy the image to Kubernetes:

   ```bash
   kubectl apply -f deployment.yaml
   ```
- `DATABASE_URL`: Connection string for the PostgreSQL database. Example: `postgresql://user:password@host:port/database`
- `OPENAI_API_KEY`: OpenAI API key. Example: `sk-your-openai-api-key`
- `JWT_SECRET`: Secret key for JWT authentication. Example: `your-256-bit-secret`
- `POST /query`:
  - Description: Processes a user query using OpenAI's API.
  - Request body:

    ```json
    {
      "query": "What is the meaning of life?",
      "model": "text-davinci-003",
      "temperature": 0.7,
      "max_length": 256
    }
    ```

  - Response body:

    ```json
    { "query_id": "generated_id" }
    ```

- `GET /response/{query_id}`:
  - Description: Retrieves the response for a given query ID.
  - Response body:

    ```json
    { "response": "The meaning of life is a question that has been pondered by philosophers and thinkers for centuries..." }
    ```
The MVP uses JWT for authentication. To access protected endpoints:
1. Generate a JWT token by registering a new user or logging in.
2. Include the token in the `Authorization` header of requests:

   ```
   Authorization: Bearer YOUR_JWT_TOKEN
   ```
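A small sketch of attaching that header from Python, using only the standard library; the URL is an assumed local deployment, and the token itself must come from your registration/login flow.

```python
import urllib.request

def authorized_request(url: str, token: str) -> urllib.request.Request:
    """Build a request carrying the Bearer token the API expects."""
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )

req = authorized_request("http://localhost:8000/response/generated_id",
                         "YOUR_JWT_TOKEN")
```

Sending `req` with `urllib.request.urlopen(req)` hits the protected endpoint with the token attached.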
This Minimum Viable Product (MVP) is licensed under the GNU AGPLv3.
This MVP was entirely generated using artificial intelligence through CosLynx.com.
No human was directly involved in the coding process of the repository: AI-Powered-User-Query-Backend
For any questions or concerns regarding this AI-generated MVP, please contact CosLynx at:
- Website: CosLynx.com
- Twitter: @CosLynxAI
Create Your Custom MVP in Minutes With CosLynxAI!