A modern CLI tool to scaffold high-performance AI chatbot apps.
- 🎯 Quick Setup - Generate a full-featured AI chatbot in seconds
- 🤖 Multiple AI Providers - Support for Groq, HuggingFace, and TogetherAI
- ⚛️ Modern Frameworks - Next.js 15 (App Router) and Vanilla options
- 🎨 Beautiful UI - Production-ready, ChatGPT-style interface
- ⚡ Performance Optimized - Streaming responses, edge runtime, lazy loading
- 🌙 Dark Mode - Modern dark theme with glassmorphism
- 📱 Responsive - Mobile-first design
- 🧩 TypeScript - Full type safety
- 🚀 Deploy-Ready - Vercel and other platforms supported
Run it in one step with `npx`:

```bash
npx aigenx create my-app
```

Or install globally:

```bash
npm install -g aigenx
aigenx create my-app
```

- Run the create command:

```bash
npx aigenx create my-chat-app
```

- Follow the interactive prompts:

```
? Select your AI Provider: 🚀 Groq (fastest, recommended)
? Select Framework: ⚛️ Next.js (App Router) - Production Ready
  OR
  🎯 Vanilla - Lightweight (Zero framework dependencies)
```
- Navigate to your project:

```bash
cd my-chat-app
```

- Install dependencies:

```bash
npm install
```

- Set up your environment variables:

```bash
# Copy the example file
cp .env.example .env

# Edit .env and add your API key
# For Groq (recommended):
GROQ_API_KEY=your_groq_api_key_here
```

- Get your API key:
- Groq: https://console.groq.com/keys (Fastest, recommended)
- HuggingFace: https://huggingface.co/settings/tokens
- TogetherAI: https://api.together.xyz/settings/api-keys
- Start the development server:

```bash
npm run dev
```

- Open http://localhost:3000 in your browser
- Fastest inference speeds
- OpenAI-compatible API
- Models: Llama 3, Mixtral, and more
- Get API key: https://console.groq.com/keys
- Wide range of open-source models
- Community-driven
- Get API key: https://huggingface.co/settings/tokens
- High-performance open-source models
- Great for production
- Get API key: https://api.together.xyz/settings/api-keys
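Because Groq (and TogetherAI) expose OpenAI-compatible APIs, the chat request a client sends can be sketched in a few lines. This is an illustrative helper, not code from the templates, and the default model name `llama3-8b-8192` is only an example:

```typescript
// Sketch of an OpenAI-compatible chat request body.
// Works unchanged against Groq's and TogetherAI's chat endpoints.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(messages: ChatMessage[], model = "llama3-8b-8192") {
  return {
    model,
    messages,
    stream: true, // ask the server for a streamed (SSE) response
  };
}

// Example payload for a single user message:
const body = buildChatRequest([{ role: "user", content: "Hello!" }]);
```

The same shape is what you would `JSON.stringify` into the body of a `fetch` call to the provider's chat-completions endpoint.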
- Modern dark mode design
- Glassmorphism effects
- Smooth animations
- Typing indicators
- Auto-resize textarea
- Copy message button
- Clear chat functionality
- Mobile-responsive
- Next.js 15 with App Router
- Edge runtime for API routes
- Streaming responses
- React Server Components
- Lazy loading
- Optimized bundle size
- TypeScript
- Tailwind CSS
- Zustand for state management
- Framer Motion for animations
- Clean architecture
- Easy to customize
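Streaming responses from OpenAI-compatible providers arrive as Server-Sent Events, one `data:` line per token chunk. As a rough illustration of what an API client has to do on the receiving end (a sketch, not the template's actual code):

```typescript
// Minimal parser for OpenAI-style SSE chunks.
// Each event line looks like: data: {"choices":[{"delta":{"content":"Hi"}}]}
// The stream is terminated by:  data: [DONE]
function extractDeltas(sseText: string): string[] {
  const deltas: string[] = [];
  for (const line of sseText.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;
    const json = JSON.parse(payload);
    const content = json.choices?.[0]?.delta?.content;
    if (typeof content === "string") deltas.push(content);
  }
  return deltas;
}
```

In a real client the chunks arrive incrementally from a `ReadableStream`, so the deltas are appended to the UI as they are parsed rather than collected into an array.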
The generated Next.js app includes:
```
my-chat-app/
├── app/
│   ├── api/
│   │   └── chat/
│   │       └── route.ts      # Streaming chat API
│   ├── layout.tsx            # Root layout
│   ├── page.tsx              # Home page
│   └── globals.css           # Global styles
├── components/
│   ├── ChatInterface.tsx     # Main chat container
│   ├── ChatHeader.tsx        # Header
│   ├── ChatMessage.tsx       # Message component
│   └── ChatInput.tsx         # Input component
├── lib/
│   ├── api.ts                # API client
│   ├── store.ts              # State management
│   └── utils.ts              # Utilities
├── styles/
│   └── globals.css           # Custom styles
├── .env.example              # Environment template
└── package.json
```
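The `route.ts` entry above is the streaming chat API. As a sketch of the underlying idea (the template's real handler will differ, and would forward the provider's SSE stream rather than a fixed array), a streaming route returns a `Response` wrapping a `ReadableStream`:

```typescript
// Hypothetical sketch: stream tokens to the client as they become available.
// Not the template's actual route.ts.
function streamTokens(tokens: string[]): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      // In a real handler these would be enqueued as the provider emits them.
      for (const t of tokens) controller.enqueue(encoder.encode(t));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

In Next.js, adding `export const runtime = "edge"` to the route file is what opts a handler into the Edge runtime.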
- Clone the repository:

```bash
git clone https://github.com/ramykatour/aigenx.git
cd aigenx
```

- Install dependencies:

```bash
npm install
```

- Link the package:

```bash
npm link
```

- Test locally:

```bash
aigenx create test-app
```

The CLI repository layout:

```
aigenx/
├── bin/
│   └── index.js              # CLI entry point
├── lib/
│   └── create-project.js     # Project creation logic
├── templates/
│   ├── nextjs-template/      # Next.js template
│   └── vanilla-template/     # Vanilla JS template
├── package.json
└── README.md
```
Best for production applications with:
- Full-featured framework with App Router
- Server-side rendering
- Edge runtime support
- TypeScript
- Tailwind CSS
- React Server Components
Best for lightweight projects with:
- Zero framework dependencies
- Pure HTML, CSS, JavaScript
- Minimal bundle size (~8KB)
- Instant page load
- Easy to understand and modify
- Built-in Express server for API proxy
Advantages:
- ✅ No build step required
- ✅ Works in any browser
- ✅ Easy to host anywhere
- ✅ Simple to debug
- ✅ Fastest performance
- ✅ Perfect for learning
- Update the version in `package.json`
- Build and test:

```bash
npm test
```

- Publish:

```bash
npm publish
```

The generated app uses these environment variables:
```bash
# Groq (recommended)
GROQ_API_KEY=your_key

# HuggingFace
HUGGINGFACE_API_KEY=your_key
MODEL=mistralai/Mistral-7B-Instruct-v0.2

# TogetherAI
TOGETHERAI_API_KEY=your_key
MODEL=mistralai/Mixtral-8x7B-Instruct-v0.1

# Optional
SYSTEM_PROMPT=You are a helpful AI assistant.
```

You can easily customize:
- AI models via environment variables
- System prompts
- UI colors and styles
- API endpoint logic
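One way to see how these variables fit together: a hypothetical helper (not part of the generated app) that picks whichever provider has an API key set, using the exact variable names above. The fallback model names are illustrative; `llama3-8b-8192` is only an example Groq default:

```typescript
// Hypothetical: resolve the active provider from environment variables.
type ProviderConfig = { provider: string; apiKey: string; model: string };

function resolveProvider(env: Record<string, string | undefined>): ProviderConfig {
  if (env.GROQ_API_KEY) {
    return {
      provider: "groq",
      apiKey: env.GROQ_API_KEY,
      model: env.MODEL ?? "llama3-8b-8192", // example default
    };
  }
  if (env.HUGGINGFACE_API_KEY) {
    return {
      provider: "huggingface",
      apiKey: env.HUGGINGFACE_API_KEY,
      model: env.MODEL ?? "mistralai/Mistral-7B-Instruct-v0.2",
    };
  }
  if (env.TOGETHERAI_API_KEY) {
    return {
      provider: "togetherai",
      apiKey: env.TOGETHERAI_API_KEY,
      model: env.MODEL ?? "mistralai/Mixtral-8x7B-Instruct-v0.1",
    };
  }
  throw new Error(
    "No API key found: set GROQ_API_KEY, HUGGINGFACE_API_KEY, or TOGETHERAI_API_KEY"
  );
}
```

Setting `MODEL` in `.env` overrides the per-provider default, which is how the "AI models via environment variables" customization works.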
```bash
npx aigenx create my-app
# Select: Groq
# Select: Next.js
```

```bash
npx aigenx create my-app
# Select: HuggingFace
# Select: Next.js
```

```bash
npx aigenx create my-app
# Select: Groq
# Select: Vanilla
```

Then run:

```bash
cd my-app
npm install
npm run dev
```

The app will be available at http://localhost:3001
```bash
npx aigenx create my-app --force
```

Roadmap:

- Vanilla JS template ✅
- More AI providers (OpenAI, Anthropic)
- Custom template support
- Plugin system
- Multi-language support
- Authentication templates
- Database integration
Contributions are welcome! Please feel free to submit a Pull Request.
MIT © ramykatour
- Next.js team for the amazing framework
- Groq for the fast inference
- The open-source community
- GitHub: https://github.com/ramykatour/aigenx
- Email: ramymouner@hotmail.com
- Issues: https://github.com/ramykatour/aigenx/issues
Made by ramykatour