Create production-ready React Native Web applications with AI-powered generation and real-time modifications
Quick Start • Documentation • Features • Architecture
AppGen Studio is a comprehensive full-stack application that revolutionizes React Native Web development through AI-powered code generation. Built with modern technologies, it provides developers with an intelligent assistant that can generate, modify, and optimize React applications in real-time.
- AI-Powered Generation: Generate complete React Native Web applications from natural language descriptions
- Conversational Modifications: Modify your projects through natural language chat with specialized AI agents
- Quality Assurance: Automated evaluation system with MLflow tracking for consistent quality
- Modern UI: Beautiful Orange Boosted design system with real-time code preview
- File Intelligence: PDF processing and context-aware file uploads for enhanced generation
- Multilanguage Support: Full internationalization with French, English, and Spanish support
- Live Code Editor: Integrated Sandpack editor with real-time preview
- Chat Interface: Conversational AI assistant with auto-rotating suggestions
- File Management: Drag-and-drop uploads with PDF text extraction
- Project Export: Multiple formats (JSON, ZIP) with proper file structure
- Responsive Design: Mobile-first Orange Boosted UI with accessibility support
- Language Switching: Dynamic language switching with persistent preferences (FR, EN, ES)
- Multi-Agent System: Specialized AI agents for generation and modification tasks
- LLM Support: Compatible with Claude, OpenAI GPT, and Azure GitHub Models
- Quality Evaluation: Automated scoring with weighted criteria (Requirements: 50%, Code Quality: 25%, RNW Compliance: 25%)
- Experiment Tracking: Comprehensive MLflow integration for performance monitoring
- PDF Processing: Extract text from PDFs for enhanced context understanding
- Node.js 16+
- Python 3.11+
- Docker (easiest option)
- Your LLM API Key (Claude, OpenAI, etc.)
```bash
git clone <repository-url>
cd AppGen-Studio

# Configure the backend
cd backend
echo "API_KEY=your_actual_api_key_here" > .env
echo "BASE_URL=your_llm_base_url" >> .env

# Go back to the main folder
cd ..

# Start everything with Docker (easiest!)
docker-compose up --build
```

That's it!
Open your browser:
- App: http://localhost:3000
- API: http://localhost:8000/docs
- Analytics: http://localhost:5000
If you don't want to use Docker:

With Poetry (recommended):

```bash
cd backend
# Install Poetry if you don't have it
curl -sSL https://install.python-poetry.org | python3 -
poetry install
poetry run uvicorn app.main:app --reload
```

Or with pip:

```bash
cd backend
pip install -r requirements.txt
uvicorn app.main:app --reload
```

Frontend:

```bash
cd frontend
npm install
npm start
```

MLflow dashboard (optional):

```bash
# With Poetry
cd backend && poetry run mlflow server --port 5000

# Or with pip
pip install mlflow && mlflow server --port 5000
```

```mermaid
graph TB
    subgraph "Frontend (React)"
        A[React Native Web<br/>Code Generator]
        B[Live Editor<br/>Preview]
        C[Chatbot]
        D[Project Management<br/>Features]
    end
    subgraph "Backend (FastAPI)"
        E[Routes API]
        F[Services Logic]
        G[CrewAI Agents]
        H[Evaluation]
        subgraph "CrewAI Agents"
            I[Frontend<br/>Generator]
            J[Frontend<br/>Optimizer]
        end
    end
    subgraph "LLM Proxy Orange"
        K[Vertex AI<br/>claude-sonnet-3.7]
        L[OpenAI<br/>GPT-4]
    end
    subgraph "MLflow"
        M[Metrics and LLM feedback<br/>visualization]
    end
    A --> E
    B --> E
    C --> E
    D --> E
    E --> F
    F --> G
    G --> K
    G --> L
    H --> M
    style A fill:#FFF2E6
    style B fill:#FFF2E6
    style C fill:#FFF2E6
    style D fill:#FFF2E6
    style I fill:#FFE6E6
    style J fill:#FFE6E6
    style K fill:#FFE6FF
    style L fill:#FFE6FF
    style M fill:#E6F3FF
```
| Technology | Purpose | Version |
|---|---|---|
| React | UI Framework | 18+ |
| Orange Boosted | Design System | Latest |
| Sandpack | Code Editor | Latest |
| Lucide React | Icons | Latest |
| Technology | Purpose | Version |
|---|---|---|
| FastAPI | Web Framework | Latest |
| CrewAI | AI Orchestration | Latest |
| MLflow | Experiment Tracking | Latest |
| PyPDF2 | PDF Processing | Latest |
| Poetry | Dependency Management | Latest |
| Provider | Model | Usage |
|---|---|---|
| Vertex AI | Claude Sonnet 3.7 | Primary Generation |
| OpenAI | GPT-4 | Alternative LLM |
| Azure | GitHub Models | Enterprise Option |
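The provider table above implies the backend picks a model per provider; a minimal sketch of such a routing table (the dictionary keys, model identifiers, and helper name here are illustrative assumptions, not the project's actual configuration):

```python
# Hypothetical provider-to-model routing; names are illustrative
# and do not come from the project's source code.
PROVIDER_MODELS = {
    "vertex_ai": "claude-sonnet-3.7",  # primary generation
    "openai": "gpt-4",                 # alternative LLM
    "azure": "github-models",          # enterprise option
}

def resolve_model(provider: str) -> str:
    """Return the model identifier for a provider, failing on unknown ones."""
    try:
        return PROVIDER_MODELS[provider]
    except KeyError:
        raise ValueError(f"Unsupported provider: {provider!r}")

print(resolve_model("vertex_ai"))  # claude-sonnet-3.7
```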
- Navigate to Generator
  - Describe your project: "Todo app with dark mode"
  - Add features: "Drag & drop, local storage, categories"
  - Upload files: drag PDFs or code files for context
  - Click "Generate Project"
- Real-time Preview
  - Switch to the "Preview" tab
  - See the live code editor with your generated project
  - Make manual edits or use the AI chat
- AI-Powered Modifications
  - Chat: "Change the primary color to orange"
  - Chat: "Add a hamburger menu to the header"
  - Chat: "Implement user authentication"
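Chat modifications like these go through the backend's chat endpoint (POST /api/chat/chat/{project_id}, shown in the API examples below); a minimal sketch of building such a request (this helper is hypothetical, not part of the project's code):

```python
import json

def build_chat_request(project_id: str, message: str) -> tuple[str, str]:
    """Return the endpoint path and JSON body for a chat modification.

    The path and payload shape follow this README's API examples;
    the helper itself is an illustrative assumption.
    """
    path = f"/api/chat/chat/{project_id}"
    body = json.dumps({"message": message})
    return path, body

path, body = build_chat_request("demo-123", "Change the primary color to orange")
print(path)  # /api/chat/chat/demo-123
```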
AppGen Studio automatically evaluates every generated project:
- Requirements Fulfillment (50%): How well the project matches your description
- Code Quality (25%): Clean architecture, best practices, error handling
- React Native Web Compliance (25%): Proper RNW usage and responsiveness
View detailed metrics in the MLflow dashboard at http://localhost:5000
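The weighting above can be expressed directly; a minimal sketch of the aggregate score (the 0.0-1.0 scale and helper name are assumptions; the actual evaluator lives in the backend):

```python
# Weighted quality score per the criteria above:
# requirements 50%, code quality 25%, RNW compliance 25%.
# The 0.0-1.0 score scale is an illustrative assumption.
WEIGHTS = {"requirements": 0.50, "code_quality": 0.25, "rnw_compliance": 0.25}

def overall_score(scores: dict) -> float:
    """Combine per-criterion scores (0.0-1.0) into one weighted score."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

print(round(overall_score(
    {"requirements": 0.9, "code_quality": 0.8, "rnw_compliance": 0.7}
), 3))  # 0.825
```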
```http
POST /generate-project
Content-Type: application/json

{
  "description": "E-commerce shopping cart with filters",
  "features": "User auth, payment integration, mobile responsive"
}
```

```http
POST /api/chat/chat/{project_id}
Content-Type: application/json

{
  "message": "Add a sidebar navigation with collapsible menu"
}
```

```http
POST /api/pdf/extract-pdf-text
Content-Type: multipart/form-data

file: specification.pdf
```

```http
POST /api/evaluation/evaluate
Content-Type: application/json

{
  "use_default_cases": true,
  "test_cases": [...]
}
```

```env
# Required
API_KEY=your_llm_api_key
BASE_URL=your_llm_base_url

# Optional
GITHUB_TOKEN=github_token_for_models
MLFLOW_TRACKING_URI=http://localhost:5000
```
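The backend presumably reads these variables at startup; a minimal sketch of loading and validating them (a hypothetical helper, not the project's actual startup code):

```python
import os

# Variable names taken from the configuration section of this README;
# the loader itself is an illustrative assumption.
REQUIRED = ("API_KEY", "BASE_URL")
OPTIONAL = ("GITHUB_TOKEN", "MLFLOW_TRACKING_URI")

def load_settings(env=os.environ) -> dict:
    """Collect settings, failing fast if a required variable is missing."""
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {missing}")
    return {name: env.get(name) for name in REQUIRED + OPTIONAL}

settings = load_settings({"API_KEY": "sk-demo", "BASE_URL": "https://llm.example"})
print(settings["API_KEY"])  # sk-demo
```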
```bash
# Production with custom environment
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d

# Scale services
docker-compose up --scale backend=3

# View resource usage
docker stats
```

- Experiment Tracking: All project generations with metadata
- Performance Metrics: Success rates, response times, quality scores
- Artifact Storage: Generated projects and evaluation reports
- Token Usage Tracking: Monitor LLM costs in `token_usage.log`
- User Interaction Metrics: Chat frequency, modification success rates
- File Upload Analytics: Success rates by file type and size
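The `token_usage.log` mentioned above could be summarized with a few lines of Python; this sketch assumes one JSON object per line with a `total_tokens` field, which may not match the file's actual format:

```python
import io
import json

def total_tokens(log_file) -> int:
    """Sum token counts from a JSON-lines log.

    The 'total_tokens' field name is an assumption about the log
    format, not a documented schema.
    """
    total = 0
    for line in log_file:
        line = line.strip()
        if not line:
            continue
        total += json.loads(line).get("total_tokens", 0)
    return total

sample = io.StringIO('{"total_tokens": 120}\n{"total_tokens": 80}\n')
print(total_tokens(sample))  # 200
```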
App won't start?
- Check that your API key is correct in `backend/.env`
- Make sure ports 3000, 8000, and 5000 aren't in use by other apps
Docker problems?
```bash
docker-compose down
docker-compose up --build
```

Problems without Docker?

- Press `Ctrl+C` to stop all terminals
- Try the commands again
Still stuck?
- Check if Node.js and Python are installed correctly
- Make sure your API key works with your LLM provider
We welcome contributions! Please follow these guidelines:
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
```bash
# Install pre-commit hooks (using Poetry)
cd backend
poetry install --with dev
poetry run pre-commit install

# Run tests
poetry run python -m pytest

# Frontend tests
cd frontend && npm test
```

This project is licensed under the MIT License - see the LICENSE file for details.
- Orange for the beautiful Boosted design system
- CrewAI for powerful AI agent orchestration
- MLflow for comprehensive experiment tracking
- Sandpack for the amazing code editor experience
- Poetry for elegant Python dependency management
Made with ❤️ by me

⭐ Star this repository if you found it helpful!
